Sample records for distributed object environment

  1. Model of Distributed Learning Objects Repository for a Heterogenic Internet Environment

    ERIC Educational Resources Information Center

    Kaczmarek, Jerzy; Landowska, Agnieszka

    2006-01-01

    In this article, an extension of the existing structure of learning objects is described. The solution addresses the problem of access and discovery of educational resources in a distributed Internet environment. An overview of e-learning standards, reference models, and problems with educational resource delivery is presented. The paper…

  2. WWWinda Orchestrator: a mechanism for coordinating distributed flocks of Java Applets

    NASA Astrophysics Data System (ADS)

    Gutfreund, Yechezkal-Shimon; Nicol, John R.

    1997-01-01

    The WWWinda Orchestrator is a simple but powerful tool for coordinating distributed Java applets. Loosely derived from the Linda programming language developed by David Gelernter and Nicholas Carriero of Yale, WWWinda implements a distributed shared object space called TupleSpace, where applets can post, read, or permanently store arbitrary Java objects. In this manner, applets can easily share information without being aware of the underlying communication mechanisms. WWWinda is a very useful tool for orchestrating flocks of distributed Java applets. Coordination events can be posted to the WWWinda TupleSpace and used to orchestrate the actions of remote applets. Applets can easily share information via the TupleSpace. The technology combines several functions in one simple metaphor: distributed web objects, remote messaging between applets, distributed synchronization mechanisms, an object-oriented database, and a distributed event signaling mechanism. WWWinda can be used as a platform for implementing shared VRML environments, shared groupware environments, controlling remote devices such as cameras, distributed karaoke, distributed gaming, and shared audio and video experiences.
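    The post/read/take cycle of a Linda-style tuple space can be sketched in a few lines. The following is a minimal, local-only illustration, not WWWinda's actual API: the `out`/`rd`/`inp` names follow Linda convention, and matching with `None` as a wildcard is an assumption for this sketch.

    ```python
    import threading

    class TupleSpace:
        """Toy in-process tuple space; the real WWWinda TupleSpace is
        distributed across the network."""
        def __init__(self):
            self._tuples = []
            self._cond = threading.Condition()

        def out(self, tup):
            # Post a tuple so any client can later read or take it.
            with self._cond:
                self._tuples.append(tup)
                self._cond.notify_all()

        def _match(self, pattern, tup):
            # None acts as a wildcard field (an assumption of this sketch).
            return len(pattern) == len(tup) and all(
                p is None or p == t for p, t in zip(pattern, tup))

        def rd(self, pattern):
            # Read a matching tuple without removing it; block until one exists.
            with self._cond:
                while True:
                    for t in self._tuples:
                        if self._match(pattern, t):
                            return t
                    self._cond.wait()

        def inp(self, pattern):
            # Take (remove) a matching tuple, or return None if absent.
            with self._cond:
                for t in self._tuples:
                    if self._match(pattern, t):
                        self._tuples.remove(t)
                        return t
                return None

    space = TupleSpace()
    space.out(("camera", "pan", 30))         # e.g. a remote-camera command
    assert space.rd(("camera", None, None)) == ("camera", "pan", 30)
    ```

    In this style, the posting applet and the reading applet never address each other directly; the shared space is the only coupling between them.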

  3. Distributed service-based approach for sensor data fusion in IoT environments.

    PubMed

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L

    2014-10-15

    The Internet of Things (IoT) enables communication among smart objects, promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information by applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication, and support scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving data treatment in IoT pervasive environments.
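    The service-composition idea can be made concrete with a toy pipeline in which each sensor node exposes a read service and a composition layer invokes and fuses them. The service names, readings, and the averaging fusion operator below are illustrative assumptions, not the paper's algorithm.

    ```python
    from statistics import mean

    # Each node exposes its readings through a simple service callable
    # (names and values invented for illustration).
    def temperature_service():
        return [21.0, 21.4, 20.8]

    def humidity_service():
        return [0.40, 0.42]

    def compose(services):
        # Composition layer: invoke each distributed service and fuse its
        # readings with a simple operator (here, the arithmetic mean).
        return {name: mean(svc()) for name, svc in services.items()}

    fused = compose({"temperature": temperature_service,
                     "humidity": humidity_service})
    assert abs(fused["temperature"] - 21.0667) < 1e-3
    ```

    A real IoT deployment would discover services dynamically and tolerate nodes joining and leaving; the composition dictionary here stands in for that discovery step.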

  4. Distributed Service-Based Approach for Sensor Data Fusion in IoT Environments

    PubMed Central

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A.; Gutiérrez-Guerrero, José M.; Muros-Cobos, Jesús L.

    2014-01-01

    The Internet of Things (IoT) enables the communication among smart objects promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes, decentralized communication and support scalability and nodes dynamicity, among others restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments. PMID:25320907

  5. Dome: Distributed Object Migration Environment

    DTIC Science & Technology

    1994-05-01

    Best Available Copy. AD-A281 134. Dome: Distributed object migration environment. Adam Beguelin, Erik Seligman, Michael Starkey. May 1994. CMU-CS-94-153, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213. Abstract: Dome... Linda [4], Isis [2], and Express [6] allow a programmer to treat a heterogeneous network of computers as a parallel machine. These tools allow the

  6. Project Integration Architecture: Distributed Lock Management, Deadlock Detection, and Set Iteration

    NASA Technical Reports Server (NTRS)

    Jones, William Henry

    2005-01-01

    The migration of the Project Integration Architecture (PIA) to the distributed object environment of the Common Object Request Broker Architecture (CORBA) brings with it the nearly unavoidable requirement of multiaccessor, asynchronous operations. In order to maintain the integrity of data structures in such an environment, it is necessary to provide a locking mechanism capable of protecting the complex operations typical of the PIA architecture. This paper reports on the implementation of a locking mechanism to address that need. Additionally, the ancillary features necessary to make the distributed lock mechanism work are discussed.
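    The deadlock-detection side of such a locking mechanism is often built on a wait-for graph. The sketch below is a centralized toy of that general technique, not PIA's CORBA-based implementation; the class and method names are invented.

    ```python
    class LockManager:
        """Toy lock table: an acquire that would block records a wait-for
        edge and refuses the request if the edge closes a cycle."""
        def __init__(self):
            self.holder = {}     # resource -> current owner
            self.waits_for = {}  # blocked owner -> owner it waits on

        def acquire(self, owner, resource):
            current = self.holder.get(resource)
            if current is None or current == owner:
                self.holder[resource] = owner
                self.waits_for.pop(owner, None)
                return True
            # owner must wait on current; record the edge, check for a cycle
            self.waits_for[owner] = current
            if self._cycle_from(owner):
                del self.waits_for[owner]
                raise RuntimeError("deadlock detected")
            return False  # caller retries once the resource is released

        def release(self, owner, resource):
            if self.holder.get(resource) == owner:
                del self.holder[resource]
            # owners blocked on this one may now retry
            self.waits_for = {w: h for w, h in self.waits_for.items()
                              if h != owner}

        def _cycle_from(self, start):
            # Follow wait-for edges; a path back to start is a deadlock.
            node, seen = start, set()
            while node in self.waits_for:
                node = self.waits_for[node]
                if node == start:
                    return True
                if node in seen:
                    return False
                seen.add(node)
            return False

    lm = LockManager()
    assert lm.acquire("A", "r1")
    assert lm.acquire("B", "r2")
    assert not lm.acquire("A", "r2")   # A now waits on B
    ```

    Attempting `lm.acquire("B", "r1")` at this point would close the cycle A → B → A and raise, which is the event a distributed lock service must detect before granting the wait.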

  7. Advancements in Distributed Learning (ADL) Environment in Support of Transformation

    DTIC Science & Technology

    2017-01-01

    REPORT TR-HFM-212. Advancements in Distributed Learning (ADL) Environment in Support of Transformation (Progrès en apprentissage distribué (ADL) à l'appui de la transformation). This report documents the findings of Task Group 212. The primary objective of this Task Group was to explore an agile...

  8. Distributed Architecture for the Object-Oriented Method for Interoperability

    DTIC Science & Technology

    2003-03-01

    Collaborative Environment. ... Figure V-2. Distributed OOMI And The Collaboration Centric Paradigm. ... of systems are formed into a system federation to resolve differences in modeling. An OOMI Integrated Development Environment (OOMI IDE) lends ... space for the creation of possible distributed systems is partitioned into User Centric systems, Processing/Storage Centric systems, Implementation

  9. Distributed and collaborative synthetic environments

    NASA Technical Reports Server (NTRS)

    Bajaj, Chandrajit L.; Bernardini, Fausto

    1995-01-01

    Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.

  10. Data analysis environment (DASH2000) for the Subaru telescope

    NASA Astrophysics Data System (ADS)

    Mizumoto, Yoshihiko; Yagi, Masafumi; Chikada, Yoshihiro; Ogasawara, Ryusuke; Kosugi, George; Takata, Tadafumi; Yoshida, Michitoshi; Ishihara, Yasuhide; Yanaka, Hiroshi; Yamamoto, Tadahiro; Morita, Yasuhiro; Nakamoto, Hiroyuki

    2000-06-01

    A new framework for a data analysis system (DASH) has been developed for the SUBARU Telescope. It is designed using object-oriented methodology and adopts a restaurant model. DASH shares the CPU and I/O load among distributed heterogeneous computers. The distributed object environment of the system is implemented with Java and CORBA. DASH has been evaluated through several prototypes. DASH2000 is the latest version, which will be released as the beta version of the data analysis system for the SUBARU Telescope.

  11. The structure of the Clouds distributed operating system

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1989-01-01

    A novel system architecture, based on the object model, is the central structuring concept used in the Clouds distributed operating system. This architecture makes Clouds attractive over a wide class of machines and environments. Clouds is a native operating system, designed and implemented at Georgia Tech, and runs on a set of general-purpose computers connected via a local area network. The system architecture of Clouds is composed of a system-wide global set of persistent (long-lived) virtual address spaces, called objects, that contain persistent data and code. The object concept is implemented at the operating system level, thus presenting a single-level storage view to the user. Lightweight threads carry computational activity through the code stored in the objects. The persistent objects and threads give rise to a programming environment composed of shared permanent memory, dispensing with the need for hardware-derived concepts such as file systems and message systems. Though the hardware may be distributed and may have disks and networks, Clouds provides applications with a logically centralized system, based on a shared, structured, single-level store. The current design of Clouds uses a minimalist philosophy with respect to both the kernel and the operating system; that is, the kernel and the operating system support a bare minimum of functionality. Clouds also adheres to the concept of separation of policy and mechanism. Most low-level operating system services are implemented above the kernel, and most high-level services are implemented at the user level. From the measured performance of the kernel mechanisms, we are able to demonstrate that efficient implementations of the object model are feasible on commercially available hardware. Clouds provides a rich environment for conducting research in distributed systems. Some of the topics addressed in this paper include distributed programming environments, consistency of persistent data, and fault tolerance.

  12. Emerald: an object-based language for distributed programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, N.C.

    1987-01-01

    Distributed systems have become more common; however, constructing distributed applications remains a very difficult task. Numerous operating systems and programming languages have been proposed that attempt to simplify the programming of distributed applications. Here a programming language called Emerald is presented that simplifies distributed programming by extending the concepts of object-based languages to the distributed environment. Emerald supports a single model of computation: the object. Emerald objects include private entities such as integers and Booleans, as well as shared, distributed entities such as compilers, directories, and entire file systems. Emerald objects may move between machines in the system, but object invocation is location independent. The uniform semantic model used for describing all Emerald objects makes the construction of distributed applications in Emerald much simpler than in systems where the differences in implementation between local and remote entities are visible in the language semantics. Emerald incorporates a type system that deals only with the specification of objects, ignoring differences in implementation. Thus, two different implementations of the same abstraction may be freely mixed.

  13. Accessing and distributing EMBL data using CORBA (common object request broker architecture).

    PubMed

    Wang, L; Rodriguez-Tomé, P; Redaschi, N; McNeil, P; Robinson, A; Lijnzaad, P

    2000-01-01

    The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by PersistenceTM, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems.

  14. Accessing and distributing EMBL data using CORBA (common object request broker architecture)

    PubMed Central

    Wang, Lichun; Rodriguez-Tomé, Patricia; Redaschi, Nicole; McNeil, Phil; Robinson, Alan; Lijnzaad, Philip

    2000-01-01

    Background: The EMBL Nucleotide Sequence Database is a comprehensive database of DNA and RNA sequences and related information traditionally made available in flat-file format. Queries through tools such as SRS (Sequence Retrieval System) also return data in flat-file format. Flat files have a number of shortcomings, however, and the resources therefore currently lack a flexible environment to meet individual researchers' needs. The Object Management Group's common object request broker architecture (CORBA) is an industry standard that provides platform-independent programming interfaces and models for portable distributed object-oriented computing applications. Its independence from programming languages, computing platforms and network protocols makes it attractive for developing new applications for querying and distributing biological data. Results: A CORBA infrastructure developed by EMBL-EBI provides an efficient means of accessing and distributing EMBL data. The EMBL object model is defined such that it provides a basis for specifying interfaces in interface definition language (IDL) and thus for developing the CORBA servers. The mapping from the object model to the relational schema in the underlying Oracle database uses the facilities provided by PersistenceTM, an object/relational tool. The techniques of developing loaders and 'live object caching' with persistent objects achieve a smart live object cache where objects are created on demand. The objects are managed by an evictor pattern mechanism. Conclusions: The CORBA interfaces to the EMBL database address some of the problems of traditional flat-file formats and provide an efficient means for accessing and distributing EMBL data. CORBA also provides a flexible environment for users to develop their applications by building clients to our CORBA servers, which can be integrated into existing systems. PMID:11178259
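    The "live object caching" with an evictor described in the two records above can be sketched generically: objects are created on demand by a loader and an eviction policy bounds the cache. The loader callback, the LRU policy, and the keys below are assumptions for illustration; the EBI servers use the Persistence object/relational tool and an evictor pattern, not this code.

    ```python
    from collections import OrderedDict

    class LiveObjectCache:
        """Toy on-demand object cache with an LRU evictor."""
        def __init__(self, loader, capacity=2):
            self.loader = loader        # creates an object when first needed
            self.capacity = capacity
            self.cache = OrderedDict()  # insertion order doubles as LRU order

        def get(self, key):
            if key in self.cache:
                self.cache.move_to_end(key)     # mark as recently used
                return self.cache[key]
            obj = self.loader(key)              # create the object on demand
            self.cache[key] = obj
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)  # evictor: drop least recent
            return obj

    loads = []  # record which keys actually hit the loader
    cache = LiveObjectCache(lambda k: loads.append(k) or f"entry:{k}")
    cache.get("X56734")   # hypothetical accession-like key: loaded
    cache.get("X56734")   # served from the cache, no second load
    cache.get("U00096")   # loaded
    assert loads == ["X56734", "U00096"]
    ```

    Under this scheme a CORBA server can hand out object references freely while keeping only the working set of database-backed objects in memory.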

  15. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. 
Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.

  16. A Framework for Distributed Mixed Language Scientific Applications

    NASA Astrophysics Data System (ADS)

    Quarrie, D. R.

    The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL.

  17. Fishing for Novel Approaches to Ecosystem Service Forecasts

    EPA Science Inventory

    The ecosystem service concept provides a powerful framework for conserving species and the environments they depend upon. Describing current distributions of ecosystem services and forecasting their future distributions have therefore become central objectives in many conservati...

  18. A Rendering System Independent High Level Architecture Implementation for Networked Virtual Environments

    DTIC Science & Technology

    2002-09-01

    ...Management; 5. Time Management; 6. Data Distribution Management; ... b. Ownership Management; c. Data Distribution Management; 2. Additional Objects and Interactions; Figure 6. Data Distribution Management (From: ref. 2); Figure 7. RTI and Federate Code Responsibilities (From: ref. 2

  19. Technology, Learning and Instruction: Distributed Cognition in the Secondary English Classroom

    ERIC Educational Resources Information Center

    Gomez, Mary Louise; Schieble, Melissa; Curwood, Jen Scott; Hassett, Dawnene

    2010-01-01

    In this paper, we analyse interactions between secondary students and pre-service teachers in an online environment in order to understand how their meaning-making processes embody distributed cognition. We begin by providing a theoretical review of the ways in which literacy learning is distributed across learners, objects, tools, symbols,…

  20. Distribution and Supply Chain Management: Educating the Army Officer

    DTIC Science & Technology

    2005-05-26

    knowledge a logistics officer must have to function effectively in a supply chain and distribution management environment. It analyzes how officers...Educational Objectives. It discusses how the Army/DoD currently teaches supply chain and distribution management concepts in various programs, such as the...its educational curriculum, and that logisticians continue to gain operational experience in distribution management operations. The paper recommends

  1. Object-oriented Tools for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1993-01-01

    Distributed computing systems are proliferating, owing to the availability of powerful, affordable microcomputers and inexpensive communication networks. A critical problem in developing such systems is getting application programs to interact with one another across a computer network. Remote interprogram connectivity is particularly challenging across heterogeneous environments, where applications run on different kinds of computers and operating systems. NetWorks! (trademark) is an innovative software product that provides an object-oriented messaging solution to these problems. This paper describes the design and functionality of NetWorks! and illustrates how it is being used to build complex distributed applications for NASA and in the commercial sector.

  2. Creating an Organic Knowledge-Building Environment within an Asynchronous Distributed Learning Context.

    ERIC Educational Resources Information Center

    Moller, Leslie; Prestera, Gustavo E.; Harvey, Douglas; Downs-Keller, Margaret; McCausland, Jo-Ann

    2002-01-01

    Discusses organic architecture and suggests that learning environments should be designed and constructed using an organic approach, so that learning is not viewed as a distinct human activity but incorporated into everyday performance. Highlights include an organic knowledge-building model; information objects; scaffolding; discourse action…

  3. Virtual memory support for distributed computing environments using a shared data object model

    NASA Astrophysics Data System (ADS)

    Huang, F.; Bacon, J.; Mapp, G.

    1995-12-01

    Conventional storage management systems provide one interface for accessing memory segments and another for accessing secondary storage objects. This hinders application programming and affects overall system performance due to mandatory data copying and user/kernel boundary crossings, which in the microkernel case may involve context switches. Memory-mapping techniques may be used to provide programmers with a unified view of the storage system. This paper extends such techniques to support a shared data object model for distributed computing environments in which good support for coherence and synchronization is essential. The approach is based on a microkernel, typed memory objects, and integrated coherence control. A microkernel architecture is used to support multiple coherence protocols and the addition of new protocols. Memory objects are typed, and applications can choose the most suitable protocols for different types of object to avoid protocol mismatch. Low-level coherence control is integrated with high-level concurrency control so that the number of messages required to maintain memory coherence is reduced and system-wide synchronization is realized without severely impacting system performance. These features together constitute a novel approach to supporting flexible coherence under application control.
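    The single-interface idea the paper starts from, accessing a storage object through the virtual memory system instead of separate read/write calls, can be illustrated with a plain memory-map. The file name and record layout below are invented; this shows only the unified view, not the coherence or synchronization machinery.

    ```python
    import mmap
    import os
    import struct
    import tempfile

    # Back a 4 KiB "segment" with a file on secondary storage.
    path = os.path.join(tempfile.mkdtemp(), "segment.bin")
    with open(path, "wb") as f:
        f.write(b"\x00" * 4096)

    # Map the segment into the address space and update state in place:
    # the same bytes are both memory and persistent object.
    with open(path, "r+b") as f:
        mem = mmap.mmap(f.fileno(), 4096)
        struct.pack_into("<q", mem, 0, 42)  # write a counter at offset 0
        mem.flush()                         # propagate to secondary storage
        mem.close()

    # Reading through the ordinary file interface sees the mapped write.
    with open(path, "rb") as f:
        assert struct.unpack("<q", f.read(8))[0] == 42
    ```

    The paper's contribution sits above this primitive: typing such objects and attaching per-type coherence protocols so the mapped view stays consistent across distributed nodes.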

  4. Garbage Collection in a Distributed Object-Oriented System

    NASA Technical Reports Server (NTRS)

    Gupta, Aloke; Fuchs, W. Kent

    1993-01-01

    An algorithm is described in this paper for garbage collection in distributed systems with object sharing across processor boundaries. The algorithm allows local garbage collection at each node in the system to proceed independently of local collection at the other nodes. It requires no global synchronization or knowledge of the global state of the system and exhibits the capability of graceful degradation. The concept of a specialized dump node is proposed to facilitate the collection of inaccessible circular structures. An experimental evaluation of the algorithm is also described. The algorithm is compared with a corresponding scheme that requires global synchronization. The results show that the algorithm works well in distributed processing environments even when the locality of object references is low.
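    The flavor of independent local collection can be sketched with per-node reference bookkeeping: each node reclaims objects using only its local referrer sets plus a count of references held by other nodes. This toy cannot reclaim the inaccessible circular structures the paper's dump node handles, and all names are invented.

    ```python
    class GCNode:
        """One node's view of distributed garbage collection: local
        collection proceeds with no global synchronization."""
        def __init__(self):
            self.local_refs = {}    # object id -> set of local referrers
            self.remote_refs = {}   # object id -> count of remote references

        def new_object(self, oid):
            self.local_refs[oid] = set()
            self.remote_refs[oid] = 0

        def add_local_ref(self, oid, referrer):
            self.local_refs[oid].add(referrer)

        def drop_local_ref(self, oid, referrer):
            self.local_refs[oid].discard(referrer)

        def add_remote_ref(self, oid):
            # arrives as a message from another node, asynchronously
            self.remote_refs[oid] += 1

        def drop_remote_ref(self, oid):
            self.remote_refs[oid] -= 1

        def collect(self):
            # reclaim objects with neither local nor remote references
            dead = [oid for oid in self.local_refs
                    if not self.local_refs[oid] and self.remote_refs[oid] == 0]
            for oid in dead:
                del self.local_refs[oid]
                del self.remote_refs[oid]
            return sorted(dead)

    node = GCNode()
    node.new_object("a")
    node.new_object("b")
    node.add_remote_ref("a")          # another node still points at 'a'
    assert node.collect() == ["b"]    # 'b' is unreferenced: reclaimed locally
    assert "a" in node.local_refs     # 'a' survives until the remote ref drops
    ```

    The point of the design is visible even in the toy: `collect` consults no other node, so each node's collector can run on its own schedule.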

  5. Statistics of high-level scene context.

    PubMed

    Greene, Michelle R

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level, where scenes are described by the list of objects contained within them; and the structural level, where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition.
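    The "bag of words" level of description can be made concrete with a toy nearest-centroid classifier over object-count vectors. The scene categories and object lists below are invented examples, and the paper used linear classifiers on LabelMe data, not this code.

    ```python
    from collections import Counter

    # A scene at the bag-of-words level is just its list of labeled objects.
    scenes = [
        (["bed", "lamp", "pillow"], "bedroom"),
        (["bed", "dresser", "lamp"], "bedroom"),
        (["stove", "sink", "pan"], "kitchen"),
        (["sink", "fridge", "stove"], "kitchen"),
    ]

    vocab = sorted({obj for objs, _ in scenes for obj in objs})

    def to_vector(objects):
        # Object-count vector over the fixed vocabulary.
        counts = Counter(objects)
        return [counts[w] for w in vocab]

    # One centroid per category: the mean object-count vector.
    centroids = {}
    for label in {lab for _, lab in scenes}:
        vecs = [to_vector(objs) for objs, lab in scenes if lab == label]
        centroids[label] = [sum(col) / len(vecs) for col in zip(*vecs)]

    def classify(objects):
        v = to_vector(objects)
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(v, c))
        return min(centroids, key=lambda lab: dist(centroids[lab]))

    assert classify(["bed", "pillow"]) == "bedroom"
    ```

    Note what the representation discards: object positions and pairwise relations, which is exactly why the structural level exists as a separate description in the paper.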

  6. Object Lesson: Discovering and Learning to Recognize Objects

    DTIC Science & Technology

    2002-01-01

    4 x 4 grid represents the possible appearance of an edge, quantized to just two luminance levels. The dark line centered in the grid is the average...11):33-38, 1995. [16] Maja J. Mataric. A distributed model for mobile robot environment-learning and navigation. Technical Report AITR-1228

  7. Reducing acquisition risk through integrated systems of systems engineering

    NASA Astrophysics Data System (ADS)

    Gross, Andrew; Hobson, Brian; Bouwens, Christina

    2016-05-01

    In the fall of 2015, the Joint Staff J7 (JS J7) sponsored the Bold Quest (BQ) 15.2 event and conducted planning and coordination to combine this event into a joint event with the Army Warfighting Assessment (AWA) 16.1 sponsored by the U.S. Army. This multipurpose event combined a Joint/Coalition exercise (JS J7) with components of testing, training, and experimentation required by the Army. In support of Assistant Secretary of the Army for Acquisition, Logistics, and Technology (ASA(ALT)) System of Systems Engineering and Integration (SoSE&I), Always On-On Demand (AO-OD) used a system of systems (SoS) engineering approach to develop a live, virtual, constructive distributed environment (LVC-DE) to support risk mitigation utilizing this complex and challenging exercise environment for a system preparing to enter limited user test (LUT). AO-OD executed a requirements-based SoS engineering process starting with user needs and objectives from Army Integrated Air and Missile Defense (AIAMD), Patriot units, Coalition Intelligence, Surveillance and Reconnaissance (CISR), Focused End State 4 (FES4) Mission Command (MC) Interoperability with Unified Action Partners (UAP), and Mission Partner Environment (MPE) Integration and Training, Tactics and Procedures (TTP) assessment. The SoS engineering process decomposed the common operational, analytical, and technical requirements, while utilizing the Institute of Electrical and Electronics Engineers (IEEE) Distributed Simulation Engineering and Execution Process (DSEEP) to provide structured accountability for the integration and execution of the AO-OD LVC-DE. As a result of this process implementation, AO-OD successfully planned for, prepared, and executed a distributed simulation support environment that responsively satisfied user needs and objectives, demonstrating the viability of an LVC-DE environment to support multiple user objectives and support risk mitigation activities for systems in the acquisition process.

  8. Contribution of explosion and future collision fragments to the orbital debris environment

    NASA Technical Reports Server (NTRS)

    Su, S.-Y.; Kessler, D. J.

    1985-01-01

    The time evolution of the near-earth man-made orbital debris environment modeled by numerical simulation is presented in this paper. The model starts with a data base of orbital debris objects which are tracked by the NORAD ground radar system. The current untrackable small objects are assumed to result from explosions and are predicted from data collected from a ground explosion experiment. Future collisions between earth orbiting objects are handled by the Monte Carlo method to simulate the range of collision possibilities that may occur in the real world. The collision fragmentation process between debris objects is calculated using an empirical formula derived from a laboratory spacecraft impact experiment to obtain the number versus size distribution of the newly generated debris population. The evolution of the future space debris environment is compared with the natural meteoroid background for the relative spacecraft penetration hazard.

  9. Eight nonnative plants in western Oregon forests: associations with environment and management.

    Treesearch

    Andrew. Gray

    2005-01-01

    Nonnative plants have tremendous ecological and economic impacts on plant communities globally, but comprehensive data on the distribution and ecological relationships of individual species are often scarce or nonexistent. The objective of this study was to assess the influence of vegetation type, climate, topography, and management history on the distribution and...

  10. The Importance of Distributed Broadband Networks to Academic Biomedical Research and Education Programs

    ERIC Educational Resources Information Center

    Yellowlees, Peter M.; Hogarth, Michael; Hilty, Donald M.

    2006-01-01

    Objective: This article highlights the importance of distributed broadband networks as part of the core infrastructure necessary to deliver academic research and education programs. Method: The authors review recent developments in the field and present the University of California, Davis, environment as a case study of a future virtual regional…

  11. WaveJava: Wavelet-based network computing

    NASA Astrophysics Data System (ADS)

    Ma, Kun; Jiao, Licheng; Shi, Zhuoer

    1997-04-01

    Wavelet theory is powerful, but its successful application still requires suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multithreaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets; at distributed sites around the net, these packets undergo matching or recognition processing in parallel, and the results are fed back to determine the next operation, so more robust results can be obtained quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system; it also fits other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communications.

  12. A unified framework for building high performance DVEs

    NASA Astrophysics Data System (ADS)

    Lei, Kaibin; Ma, Zhixia; Xiong, Hua

    2011-10-01

    A unified framework for integrating PC cluster based parallel rendering with distributed virtual environments (DVEs) is presented in this paper. While various scene graphs have been proposed in DVEs, it is difficult to enable collaboration of different scene graphs. This paper proposes a technique for non-distributed scene graphs with the capability of object and event distribution. With the increase of graphics data, DVEs require more powerful rendering ability. But general scene graphs are inefficient in parallel rendering. The paper also proposes a technique to connect a DVE and a PC cluster based parallel rendering environment. A distributed multi-player video game is developed to show the interaction of different scene graphs and the parallel rendering performance on a large tiled display wall.

  13. Stars associated with extrasolar planets vs. β Pictoris-type stars

    NASA Astrophysics Data System (ADS)

    Chavero, C.; Gómez, M.

    In this contribution we initially compare the physical properties of two groups of stars: the Planet Host Stars and the Vega-like objects. The Planet Host Star group has one or more planetary-mass objects associated with it, and the Vega-like stars have circumstellar disks. We have compiled magnitudes, colors, parallaxes, spectral types, etc. for these objects from the literature and analyzed the distribution of both groups. We find that the samples are very similar in metallicities, ages, and spatial distributions. Our analysis suggests that the circumstellar environments are probably different while the central objects have similar physical properties. This difference may explain, at least in part, why the Planet Host Stars form extrasolar planetary objects such as those detected by the Doppler effect while the Vega-like objects are not commonly associated with these planet-mass bodies.

  14. Performance Evaluation of Communication Software Systems for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Fatoohi, Rod

    1996-01-01

    In recent years there has been increasing interest in object-oriented distributed computing, since it is better equipped to deal with complex systems while providing extensibility, maintainability, and reusability. At the same time, several new high-speed network technologies have emerged for local and wide area networks. However, the performance of networking software is not improving as fast as the networking hardware and the workstation microprocessors. This paper gives an overview and evaluates the performance of the Common Object Request Broker Architecture (CORBA) standard in a distributed computing environment at NASA Ames Research Center. The environment consists of two testbeds of SGI workstations connected by four networks: Ethernet, FDDI, HiPPI, and ATM. The performance results for three communication software systems are presented, analyzed, and compared. These systems are the BSD socket programming interface; IONA's Orbix, an implementation of the CORBA specification; and the PVM message-passing library. The results show that high-level communication interfaces, such as CORBA and PVM, can achieve reasonable performance under certain conditions.

  15. Effect of cessation of beef cattle pasture-feedlot type backgrounding operation on the persistence of antibiotic resistance genes in the environment

    USDA-ARS?s Scientific Manuscript database

    Introduction It is not known how removal of cattle from a backgrounding operation will affect the persistence of antibiotic resistance genes (ARGs) in the environment. Our objective was to investigate the effect of destocking on the persistence and distribution of ARGs in the backgrounding environm...

  16. A development framework for distributed artificial intelligence

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1989-01-01

    The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.

  17. Strawman Distributed Interactive Simulation Architecture Description Document. Volume 2. Supporting Rationale. Book 2. DIS Architecture Issues

    DTIC Science & Technology

    1992-03-31

    the-loop, interactive training environment. Its primary advantage is that it has a long history of use and a number of experienced users. However... programmer teams. ... Object-oriented behavioral decomposition is

  18. International Ultraviolet Explorer (IUE)

    NASA Technical Reports Server (NTRS)

    Boehm, Karl-Heinz

    1992-01-01

    The observation, data reduction, and interpretation of ultraviolet spectra (obtained with the International Ultraviolet Explorer) of Herbig-Haro objects, stellar jets, and (in a few cases) reflection nebulae in star-forming regions are discussed. Intermediate results have been reported in the required semi-annual reports. The observations for this research were obtained in 23 (US1) IUE shifts. The spectra were taken in the low-resolution mode with the large aperture. The following topics were investigated: (1) detection of UV spectra of high-excitation Herbig-Haro (HH) objects, identification of emission lines, and a preliminary study of the energy distribution of the ultraviolet continuum; (2) details of the continuum energy distribution of these spectra and their possible interpretation; (3) the properties of the reddening (extinction) of HH objects; (4) the possible time variation of strong emission lines in high-excitation HH objects; (5) the ultraviolet emission of low-excitation HH objects, especially in the fluorescent lines of the H2 molecule; (6) the ultraviolet emission in the peculiar object HH24; (7) the spatial emission distribution of different lines and different parts of the continuum in different HH objects; and (8) some properties of reflection nebulae in the environment of Herbig-Haro objects. Each topic is discussed.

  19. A Simple Model for the Orbital Debris Environment in GEO

    NASA Astrophysics Data System (ADS)

    Anilkumar, A. K.; Ananthasayanam, M. R.; Subba Rao, P. V.

    The increase of space debris and its threat to commercial space activities in the Geosynchronous Earth Orbit (GEO) predictably causes concern about the environment over the long term. A variety of space debris studies, covering detection, modeling, protection, and mitigation measures, has been pursued over the past couple of decades. Owing to the absence of atmospheric drag to remove debris in GEO and the increasing number of utility satellites there, the number of objects in GEO will continue to increase. Characterization of the GEO environment is critical for risk assessment and protection of future satellites, and for incorporating effective debris mitigation measures into design and operations. Debris measurements in GEO have been limited to objects larger than about 60 cm. This paper provides an engineering model of the GEO environment by utilizing the philosophy and approach laid out in the SIMPLE model recently proposed for LEO by the authors. The present study analyzes the statistical characteristics of the GEO catalogued objects in order to arrive at a model for the GEO space debris environment. It is noted that the objects catalogued by USSPACECOM across the years 1998 to 2004, currently numbering around 800, have the same semi-major-axis mode (highest number density) around 35750 km above the earth. After removing the objects in the small bin around the mode, (35700, 35800) km, which contains around 40 percent of the objects (a value that is nearly constant across the years), the number density of the remaining objects follows a single Laplace distribution with two parameters, namely location and scale. Across the years the location parameter of this distribution does not vary significantly, but the scale parameter shows a definite trend. These observations are utilized in proposing a simple model for the GEO debris environment. References: Ananthasayanam, M. R., Anil Kumar, A. K., and Subba Rao, P. V., "A New Stochastic Impressionistic Low Earth (SIMPLE) Model of the Space Debris Scenario", Conference Abstract COSPAR 02-A-01772, 2002. Ananthasayanam, M. R., Anilkumar, A. K., Subba Rao, P. V., and Adimurthy, V., "Characterization of Eccentricity and Ballistic Coefficients of Space Debris in Altitude and Perigee Bins", IAC-03-IAA5.p.04, presented at the IAF Conference, Bremen, October 2003; also in the Proceedings of the IAF Conference, Science and Technology Series, 2003.
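    The two-parameter Laplace model described above can be fitted in closed form: the maximum-likelihood location is the sample median and the scale is the mean absolute deviation about that median. A sketch on synthetic data (the catalogue values themselves are not reproduced here):

    ```python
    import numpy as np

    def fit_laplace(samples):
        """Maximum-likelihood fit of a Laplace distribution: location is the
        sample median, scale is the mean absolute deviation about it."""
        x = np.asarray(samples, dtype=float)
        loc = np.median(x)
        scale = np.mean(np.abs(x - loc))
        return loc, scale

    rng = np.random.default_rng(1)
    # synthetic semi-major-axis offsets (km) about the GEO mode, for illustration
    offsets = rng.laplace(loc=0.0, scale=150.0, size=5000)
    loc, scale = fit_laplace(offsets)
    ```

    Repeating the fit year by year would expose the trend in the scale parameter that the abstract reports.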

  20. Statistics of high-level scene context

    PubMed Central

    Greene, Michelle R.

    2013-01-01

    Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed “things” in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information. 
Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition. PMID:24194723
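    The "bag of words" level of description above can be sketched concretely: a scene becomes a vector of object counts, and a simple linear rule (here nearest-centroid, chosen for brevity; the study itself used linear classifiers whose exact form is not given) assigns a category. The vocabulary and scenes below are invented for illustration:

    ```python
    import numpy as np

    def bag_of_words(scene_objects, vocabulary):
        """Represent a scene as a vector of counts of labeled objects."""
        return np.array([scene_objects.count(w) for w in vocabulary], dtype=float)

    vocab = ["bed", "pillow", "stove", "pot", "sink"]
    scenes = {
        "bedroom": [["bed", "pillow", "pillow"], ["bed", "pillow"]],
        "kitchen": [["stove", "pot", "sink"], ["stove", "sink"]],
    }
    # one centroid per category, averaged over that category's training scenes
    centroids = {c: np.mean([bag_of_words(s, vocab) for s in ss], axis=0)
                 for c, ss in scenes.items()}

    def classify(scene_objects):
        """Assign the category whose centroid is nearest in count space."""
        v = bag_of_words(scene_objects, vocab)
        return min(centroids, key=lambda c: np.linalg.norm(v - centroids[c]))
    ```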

  1. CAD/CAE Integration Enhanced by New CAD Services Standard

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2002-01-01

    A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.

  2. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a highspeed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  3. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  4. Dynamic shared state maintenance in distributed virtual environments

    NASA Astrophysics Data System (ADS)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. 
In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for sensor-based distributed VE that has the potential to improve the system real-time behavior and scalability. (Abstract shortened by UMI.)
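    The "objective method using quaternion calculus" mentioned above suggests comparing orientations of shared virtual entities across sites. One plausible metric (the abstract does not give the exact formulation used) is the angular distance between unit quaternions, theta = 2·arccos(|⟨q1, q2⟩|):

    ```python
    import numpy as np

    def quaternion_angle(q1, q2):
        """Angular distance in radians between two unit quaternions.
        A plausible consistency metric for shared-state drift between
        sites; illustrative, not the dissertation's exact method."""
        dot = abs(np.dot(q1, q2))          # |cos(theta/2)|, sign-invariant
        return 2.0 * np.arccos(np.clip(dot, -1.0, 1.0))

    identity = np.array([1.0, 0.0, 0.0, 0.0])
    # 90-degree rotation about the z axis, in (w, x, y, z) convention
    q90 = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
    angle = quaternion_angle(identity, q90)
    ```

    Averaging such angles over entities and time would yield a single drift score per remote site.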

  5. Integrated Design Engineering Analysis (IDEA) Environment - Aerodynamics, Aerothermodynamics, and Thermal Protection System Integration Module

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2011-01-01

    This report documents the work performed from March 2010 to October 2011. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative, object-oriented, multidisciplinary, distributed environment using the Adaptive Modeling Language (AML) as the underlying framework. This report focuses on the work done to extend the aerodynamics and aerothermodynamics module using S/HABP, CBAERO, PREMIN and LANMIN. It also details the work done integrating EXITS as the TPS sizing tool.

  6. Coordinated control of micro-grid based on distributed moving horizon control.

    PubMed

    Ma, Miaomiao; Shao, Liyang; Liu, Xiangjie

    2018-05-01

    This paper proposes a distributed moving horizon coordinated control scheme for the power balance and economic dispatch problems of a micro-grid based on distributed generation. We design a power coordinated controller for each subsystem via moving horizon control by minimizing a suitable objective function. The objective function of the distributed moving horizon coordinated controller is chosen on the principle that the wind power subsystem has priority to generate electricity, the photovoltaic subsystem coordinates with the wind power subsystem, and the battery is activated only when necessary to meet the load demand. The simulation results illustrate that the proposed controller can allocate the output power of the two generation subsystems reasonably under varying environmental conditions, which not only satisfies the load demand but also limits excessive fluctuations of output power to protect the power generation equipment. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Adaptation in pronoun resolution: Evidence from Brazilian and European Portuguese.

    PubMed

    Fernandes, Eunice G; Luegi, Paula; Correa Soares, Eduardo; de la Fuente, Israel; Hemforth, Barbara

    2018-04-26

    Previous research accounting for pronoun resolution as a problem of probabilistic inference has not explored the phenomenon of adaptation, whereby the processor constantly tracks and adapts, rationally, to changes in a statistical environment. We investigate whether Brazilian (BP) and European Portuguese (EP) speakers adapt to variations in the probability of occurrence of ambiguous overt and null pronouns, in two experiments assessing resolution toward subject and object referents. For each variety (BP, EP), participants were faced with either the same number of null and overt pronouns (equal distribution), or with an environment with fewer overt (than null) pronouns (unequal distribution). We find that the preference for interpreting overt pronouns as referring back to an object referent (object-biased interpretation) is higher when there are fewer overt pronouns (i.e., in the unequal, relative to the equal distribution condition). This is especially the case for BP, a variety with higher prior frequency and smaller object-biased interpretation of overt pronouns, suggesting that participants adapted incrementally and integrated prior statistical knowledge with the knowledge obtained in the experiment. We hypothesize that comprehenders adapted rationally, with the goal of maintaining, across variations in pronoun probability, the likelihood of subject and object referents. Our findings unify insights from research in pronoun resolution and in adaptation, and add to previous studies in both topics: They provide evidence for the influence of pronoun probability in pronoun resolution, and for an adaptation process whereby the language processor not only tracks statistical information, but uses it to make interpretational inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
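    Rational adaptation of the kind described above is often modeled as incremental Bayesian updating of an estimated probability. A toy beta-binomial sketch, not the authors' model: a strong prior that overt pronouns are frequent is gradually revised downward by exposure to an unequal distribution with few overt pronouns.

    ```python
    def adapt(prior_a, prior_b, observations):
        """Beta-binomial updating of the estimated probability of an overt
        pronoun: each observation (1 = overt, 0 = null) updates the Beta
        prior, integrating prior knowledge with in-experiment exposure.
        An illustrative sketch only."""
        a, b = prior_a, prior_b
        estimates = []
        for is_overt in observations:
            a += is_overt
            b += 1 - is_overt
            estimates.append(a / (a + b))  # posterior mean after each trial
        return estimates

    # prior favoring overt pronouns, then mostly null-pronoun exposure
    est = adapt(prior_a=8, prior_b=2, observations=[0, 0, 1, 0, 0, 0])
    ```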

  8. Design of Magnetic Charged Particle Lens Using Analytical Potential Formula

    NASA Astrophysics Data System (ADS)

    Al-Batat, A. H.; Yaseen, M. J.; Abbas, S. R.; Al-Amshani, M. S.; Hasan, H. S.

    2018-05-01

    The aim of the current research was to use the potential of two cylindrical electric lenses to produce a mathematical model from which one can determine the magnetic field distribution of the charged-particle objective lens. With the aid of Simulink in the MATLAB environment, Simulink models were built to determine the distribution of the target function and its related axial functions along the optical axis of the charged-particle lens. The present study showed that the physical parameters (i.e., the maximum value Bmax and the half-width W of the field distribution) and the objective properties of the charged-particle lens are affected by varying the main geometrical parameter of the lens, namely the bore radius R.
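    The parameters Bmax and the half-width W can be illustrated with Glaser's bell-shaped model, a common analytical form for the axial field of a magnetic lens, B(z) = Bmax / (1 + (z/a)²), whose full width at half maximum is 2a. The model choice and numbers below are illustrative assumptions, not taken from the paper:

    ```python
    import numpy as np

    def glaser_field(z, b_max, a):
        """Glaser bell-shaped axial field model for a magnetic lens.
        b_max is the peak field and the half-width of the bell is 2*a."""
        return b_max / (1.0 + (z / a) ** 2)

    z = np.linspace(-20.0, 20.0, 4001)           # axial position, mm
    b = glaser_field(z, b_max=0.5, a=5.0)        # field in tesla (illustrative)
    # full width at half maximum, measured numerically; should equal 2*a
    half_width = np.ptp(z[b >= 0.5 * b.max()])
    ```

    Varying the geometry (the bore radius R in the paper) would change b_max and a, and hence the lens's objective properties.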

  9. Virtual environment and computer-aided technologies used for system prototyping and requirements development

    NASA Technical Reports Server (NTRS)

    Logan, Cory; Maida, James; Goldsby, Michael; Clark, Jim; Wu, Liew; Prenger, Henk

    1993-01-01

    The Space Station Freedom (SSF) Data Management System (DMS) consists of distributed hardware and software which monitor and control the many onboard systems. Virtual environment and off-the-shelf computer technologies can be used at critical points in project development to aid in objectives and requirements development. Geometric models (images) coupled with off-the-shelf hardware and software technologies were used in The Space Station Mockup and Trainer Facility (SSMTF) Crew Operational Assessment Project. Rapid prototyping is shown to be a valuable tool for operational procedure and system hardware and software requirements development. The project objectives, hardware and software technologies used, data gained, current activities, future development and training objectives shall be discussed. The importance of defining prototyping objectives and staying focused while maintaining schedules are discussed along with project pitfalls.

  10. Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1992-01-01

    Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)

  11. A Distributed Operating System Design and Dictionary/Directory for the Stock Point Logistics Integrated Communications Environment.

    DTIC Science & Technology

    1983-11-01

    transmission, FM(R) will only have to hold one message. 3. Program Control Block (PCB): The PCB [Deitel 82] will be maintained by the Executive in... examples of fully distributed systems in operation. An objective of the NPS research program for SPLICE is to advance our knowledge of distributed

  12. [Environmental Hazards Assessment Program annual report, June 1992--June 1993]. Use of diatom distributions to monitor environmental health

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, R.H.

    1993-12-01

    A variety of approaches has been used in the past to assess the environmental impact of anthropogenic contaminants. One reliable index for aquatic environments is the analysis of diatom species distribution; the focus in this case being on the Savannah River. The completed objectives of this study were: (A) the development and use of procedures for measuring diatom distribution in the water column and (B) the development and evaluation of sediment sampling methods for retrospective analysis.

  13. A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Afjeh, Abdollah A.

    1997-01-01

    This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical, and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.

  14. Smart sensing surveillance system

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Chu, Kai-Dee; O'Looney, James; Blake, Michael; Rutar, Colleen

    2010-04-01

    An effective public safety sensor system for heavily-populated applications requires sophisticated and geographically-distributed infrastructures, centralized supervision, and deployment of large-scale security and surveillance networks. Artificial intelligence in sensor systems is a critical design element to raise awareness levels, improve the performance of the system, and adapt to a changing scenario and environment. In this paper, a highly-distributed, fault-tolerant, and energy-efficient Smart Sensing Surveillance System (S4) is presented to efficiently provide 24/7, all-weather security operation in crowded environments or restricted areas. Technically, the S4 consists of a number of distributed sensor nodes integrated with specific passive sensors to rapidly collect, process, and disseminate heterogeneous sensor data from near omni-directions. These distributed sensor nodes can work cooperatively to send immediate security information when new objects appear. When new objects are detected, the S4 smartly selects the available node with a Pan-Tilt-Zoom (PTZ) electro-optical/infrared (EO/IR) camera to track the objects and capture associated imagery. The S4 provides advanced on-board digital image processing capabilities to detect and track specific objects. The imaging detection operations include unattended object detection, human feature and behavior detection, and configurable alert triggers; other imaging processes can be updated to meet specific requirements and operations. In the S4, all the sensor nodes are connected with a robust, reconfigurable, LPI/LPD (Low Probability of Intercept/Low Probability of Detection) wireless mesh network using ultra-wideband (UWB) RF technology. This UWB RF technology can provide an ad-hoc, secure mesh network and the capability to relay network information, communicate, and pass situational awareness and messages. 
The Service Oriented Architecture of the S4 enables remote applications to interact with the S4 network and use specific presentation methods. In addition, the S4 is compliant with Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards to efficiently discover, access, use, and control heterogeneous sensors and their metadata. These S4 capabilities and technologies have great potential for both military and civilian applications, enabling highly effective security support tools for improving surveillance activities in densely crowded environments. The S4 system is directly applicable to solutions for emergency response personnel, law enforcement, and other homeland security missions, as well as in applications requiring the interoperation of sensor networks with handheld or body-worn interface devices.

  15. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS)

    PubMed Central

    Rigi, Amin

    2018-01-01

    In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects, using a neuromorphic event-based vision sensor (DAVIS), is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The approach is validated using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that significantly affected the results. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show the sensor's high potential for manipulation applications, especially in dynamic environments. PMID:29364190
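
    The contact-area idea can be illustrated with a toy sketch; the windowing scheme, drop threshold, and function name below are illustrative assumptions, not the paper's event-based algorithm.

```python
# Toy sketch of contact-area-based slip detection: count active ("on") events
# inside the contact region per time window; a sudden drop in active area
# relative to the previous window suggests incipient slip.
# The threshold and window sizes are illustrative assumptions.

def detect_slip(area_per_window, drop_ratio=0.2):
    """Return the index of the first window whose contact area drops by more
    than drop_ratio relative to the previous window, or None if no slip."""
    for i in range(1, len(area_per_window)):
        prev, cur = area_per_window[i - 1], area_per_window[i]
        if prev > 0 and (prev - cur) / prev > drop_ratio:
            return i
    return None

# Contact area shrinks abruptly as the object begins to slip.
areas = [100, 102, 101, 70, 40]
print(detect_slip(areas))  # 3
```

    In a real event-based pipeline the per-window area would come from accumulating DAVIS events over short intervals, which is what gives the low detection latency reported above.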

  16. INTERACTIONS OF INTRODUCED BACTERIA AND AQUATIC INVERTEBRATES

    EPA Science Inventory

    Bacteria enter into stream environments from a variety of sources and interact in varying ways with other biota. There were three basic objectives for this project: 1) to examine the effect of different types of macroinvertebrates on bacterial survival and distribution, 2) to com...

  17. Distributed Computing Framework for Synthetic Radar Application

    NASA Technical Reports Server (NTRS)

    Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael

    2006-01-01

    We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the middleware framework elements that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech), and now being used as the basis for next-generation radar processing at JPL, is such a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
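
    The "interchangeable objects wired into a data flow graph" style can be sketched minimally in Python; the class and method names below are illustrative assumptions, not the actual Pyre interfaces.

```python
# Minimal data-flow sketch: each component wraps a processing function and
# pulls its inputs from upstream components when run. Names are illustrative.

class Component:
    def __init__(self, name, func):
        self.name = name
        self.func = func
        self.inputs = []          # upstream components feeding this one

    def connect(self, upstream):
        """Wire an upstream component's output into this one."""
        self.inputs.append(upstream)
        return self               # allow chained wiring

    def run(self):
        args = [c.run() for c in self.inputs]
        return self.func(*args)

# Build a tiny graph: source -> scale -> total
source = Component("source", lambda: [1, 2, 3, 4])
scale = Component("scale", lambda xs: [2 * x for x in xs]).connect(source)
total = Component("total", lambda xs: sum(xs)).connect(scale)

print(total.run())  # 20
```

    Because components only know their upstream neighbors, they remain interchangeable: a different filter can be swapped in without touching the rest of the graph, which is the property the framework description above relies on.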

  18. Flight dynamics software in a distributed network environment

    NASA Technical Reports Server (NTRS)

    Jeletic, J.; Weidow, D.; Boland, D.

    1995-01-01

    As with all NASA facilities, the announcement of reduced budgets, reduced staffing, and the desire to implement smaller/quicker/cheaper missions has required the Agency's organizations to become more efficient in what they do. To accomplish these objectives, the FDD has initiated the development of the Flight Dynamics Distributed System (FDDS). The underlying philosophy of FDDS is to build an integrated system that breaks down the traditional barriers of attitude, mission planning, and navigation support software to provide a uniform approach to flight dynamics applications. Through the application of open systems concepts and state-of-the-art technologies, including object-oriented specification concepts, object-oriented software, and common user interface, communications, data management, and executive services, the FDD will reengineer most of its six million lines of code.

  19. PID temperature controller in pig nursery: spatial characterization of thermal environment

    NASA Astrophysics Data System (ADS)

    de Souza Granja Barros, Juliana; Rossi, Luiz Antonio; Menezes de Souza, Zigomar

    2018-05-01

    The use of enhanced temperature control technologies can improve the thermal conditions in livestock facilities. The objective of this study was to evaluate, based on geostatistical analysis, the spatial distribution of thermal environment variables in a pig nursery whose heating system used two temperature control technologies. The following systems were evaluated: overhead electrical resistance with a Proportional, Integral, and Derivative (PID) controller and overhead electrical resistance with a thermostat. We evaluated the climatic variables dry bulb temperature (Tbs), air relative humidity (RH), temperature and humidity index (THI), and enthalpy in the winter, at 7:00, 12:00, and 18:00 h. The spatial distribution of these variables was mapped by kriging. The results showed that the resistance heating system with PID controllers improved the thermal comfort conditions in the pig nursery during the coldest hours, keeping the spatial distribution of the air temperature more homogeneous in the pen. During the hottest weather, neither system provided comfort.

  1. UBioLab: a web-LABoratory for Ubiquitous in-silico experiments.

    PubMed

    Bartocci, E; Di Berardini, M R; Merelli, E; Vito, L

    2012-03-01

    The huge and dynamic amount of bioinformatic resources (e.g., data and tools) available nowadays on the Internet represents a big challenge for biologists (as concerns their management and visualization) and for bioinformaticians (as concerns the possibility of rapidly creating and executing in-silico experiments involving resources and activities spread over the WWW hyperspace). Any framework aiming at integrating such resources as in a physical laboratory must tackle, and possibly handle in a transparent and uniform way, aspects concerning physical distribution, semantic heterogeneity, and the co-existence of different computational paradigms and, as a consequence, of different invocation interfaces (i.e., OGSA for Grid nodes, SOAP for Web Services, Java RMI for Java objects, etc.). The UBioLab framework has been designed and developed as a prototype following the above objective. Several architectural features, such as being fully Web-based and combining domain ontologies, Semantic Web, and workflow techniques, give evidence of an effort in this direction. The integration of a semantic knowledge management system for distributed (bioinformatic) resources, a semantic-driven graphic environment for defining and monitoring ubiquitous workflows, and an intelligent agent-based technology for their distributed execution allows UBioLab to be a semantic guide for bioinformaticians and biologists, providing (i) a flexible environment for visualizing, organizing and inferring any (semantic and computational) "type" of domain knowledge (e.g., resources and activities, expressed in a declarative form), (ii) a powerful engine for defining and storing semantic-driven ubiquitous in-silico experiments on the domain hyperspace, as well as (iii) a transparent, automatic and distributed environment for correct experiment executions.

  2. DAVE: A plug and play model for distributed multimedia application development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mines, R.F.; Friesen, J.A.; Yang, C.L.

    1994-07-01

    This paper presents a model being used for the development of distributed multimedia applications. The Distributed Audio Video Environment (DAVE) was designed to support the development of a wide range of distributed applications. The implementation of this model is described. DAVE is unique in that it combines a simple ``plug and play`` programming interface, supports both centralized and fully distributed applications, provides device and media extensibility, promotes object reusability, and supports interoperability and network independence. This model enables application developers to easily develop distributed multimedia applications and create reusable multimedia toolkits. DAVE was designed for developing applications such as video conferencing, media archival, remote process control, and distance learning.
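
    The "plug and play" device and media extensibility described above can be sketched as a registry that discovers handlers by media type; all class and method names here are illustrative assumptions, not DAVE's actual interface.

```python
# Sketch of a plug-and-play media-device registry: new device classes
# register themselves under a media type and are instantiated by name,
# so applications need no compile-time knowledge of specific devices.

class DeviceRegistry:
    _devices = {}

    @classmethod
    def register(cls, media_type):
        def decorator(device_cls):
            cls._devices[media_type] = device_cls
            return device_cls
        return decorator

    @classmethod
    def open(cls, media_type, **kwargs):
        """Instantiate whatever handler is registered for this media type."""
        return cls._devices[media_type](**kwargs)

@DeviceRegistry.register("audio")
class AudioDevice:
    def __init__(self, rate=44100):
        self.rate = rate
    def describe(self):
        return f"audio@{self.rate}Hz"

@DeviceRegistry.register("video")
class VideoDevice:
    def __init__(self, fps=30):
        self.fps = fps
    def describe(self):
        return f"video@{self.fps}fps"

dev = DeviceRegistry.open("video", fps=25)
print(dev.describe())  # video@25fps
```

    Adding a new media type is then just registering one more class, which is the extensibility property the abstract highlights.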

  3. A Software Architecture for Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Norvig, Peter (Technical Monitor)

    2001-01-01

    NASA's Intelligent Synthesis Environment (ISE) program is a grand attempt to develop a system to transform the way complex artifacts are engineered. This paper discusses a "middleware" architecture for enabling the development of ISE. Desirable elements of such an Intelligent Synthesis Architecture (ISA) include remote invocation; plug-and-play applications; scripting of applications; management of design artifacts, tools, and artifact and tool attributes; common system services; system management; and systematic enforcement of policies. This paper argues that the ISA should extend conventional distributed object technology (DOT), such as CORBA and Product Data Managers, with flexible repositories of product and tool annotations and "plug-and-play" mechanisms for inserting "ility" or orthogonal concerns into the system. I describe the Object Infrastructure Framework, an Aspect-Oriented Programming (AOP) environment for developing distributed systems that provides utility insertion and enables consistent annotation maintenance. This technology can be used to enforce policies such as maintaining the annotations of artifacts, particularly the provenance and access control rules of artifacts; performing automatic datatype transformations between representations; supplying alternative servers of the same service; reporting on the status of jobs and the system; conveying privileges throughout an application; supporting long-lived transactions; maintaining version consistency; and providing software redundancy and mobility.
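
    The idea of injecting an orthogonal "ility" around ordinary calls can be sketched with a Python wrapper; the decorator name, log format, and sample function are illustrative assumptions, not the Object Infrastructure Framework's API.

```python
# Aspect-style sketch: a wrapper transparently records provenance for every
# call to a wrapped operation, without the operation knowing about it.

import functools

provenance_log = []

def with_provenance(func):
    """Record which operation produced each result, as a crosscutting concern."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        provenance_log.append({"op": func.__name__, "args": args})
        return result
    return wrapper

@with_provenance
def transform_datatype(value):
    # Stand-in for an automatic datatype transformation between representations.
    return str(value)

out = transform_datatype(42)
print(out, provenance_log[-1]["op"])  # 42 transform_datatype
```

    The same wrapping point could host the other policies listed above (access control, status reporting, privilege conveyance) without modifying the wrapped components, which is the core AOP argument of the paper.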

  4. Orbital debris and meteoroids: Results from retrieved spacecraft surfaces

    NASA Astrophysics Data System (ADS)

    Mandeville, J. C.

    1993-08-01

    Near-Earth space contains natural and man-made particles, whose size distribution ranges from submicron-sized particles to cm-sized objects. This environment poses a serious threat to space missions, especially future manned or long-duration missions. Several experiments devoted to the study of this environment have recently been retrieved from space. Among them, several were located on the NASA Long Duration Exposure Facility (LDEF) and on the Russian MIR Space Station. Evaluation of hypervelocity impact features gives valuable information on the size distribution of small dust particles present in low Earth orbit. Chemical identification of projectile remnants is possible in many instances, thus allowing a discrimination between extraterrestrial particles and man-made orbital debris. A preliminary comparison of flight data with current modeling of meteoroids and space debris shows fair agreement. However, impacts of particles identified as space debris on the trailing side of LDEF, not predicted by the models, could be the result of space debris in highly eccentric orbits, probably associated with GTO objects.

  5. High-Surety Telemedicine in a Distributed, 'Plug-and-Play' Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.

    1999-05-17

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures that allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. This win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment.

  6. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources, and to each other, while not violating their security.
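
    The shared-blackboard-with-demons pattern described above can be sketched compactly; the class names and entry format are illustrative assumptions, not the AI Bus C++ interface.

```python
# Minimal blackboard sketch: knowledge sources post entries to a shared board,
# and registered "demons" (event-driven watchers) fire when an entry matches
# their predicate, mirroring the agent description above.

class Blackboard:
    def __init__(self):
        self.entries = []
        self.watchers = []        # (predicate, action) pairs, i.e. demons

    def watch(self, predicate, action):
        self.watchers.append((predicate, action))

    def post(self, entry):
        self.entries.append(entry)
        for predicate, action in self.watchers:
            if predicate(entry):
                action(entry)

board = Blackboard()
alerts = []
# A knowledge source that reacts to high-temperature sensor readings.
board.watch(lambda e: e.get("temp", 0) > 100,
            lambda e: alerts.append(e["sensor"]))

board.post({"sensor": "s1", "temp": 80})
board.post({"sensor": "s2", "temp": 120})
print(alerts)  # ['s2']
```

    A distributed version would replace the in-process list with a networked tuple space, but the post/watch contract between knowledge sources stays the same.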

  7. Data management in an object-oriented distributed aircraft conceptual design environment

    NASA Astrophysics Data System (ADS)

    Lu, Zhijie

    In the competitive global marketplace, aerospace companies are forced to deliver the right products to the right market, with the right cost, and at the right time. However, the rapid development of technologies and new business opportunities, such as mergers, acquisitions, supply chain management, etc., have dramatically increased the complexity of designing an aircraft. Therefore, the pressure to reduce design cycle time and cost is enormous. One way to solve such a dilemma is to develop and apply advanced engineering environments (AEEs), which are distributed collaborative virtual design environments linking researchers, technologists, designers, etc., together by incorporating application tools and advanced computational, communications, and networking facilities. Aircraft conceptual design, as the first design stage, provides a major opportunity to compress design cycle time and is the cheapest place for making design changes. However, traditional aircraft conceptual design programs, which are monolithic programs, cannot provide satisfactory functionality to meet new design requirements due to the lack of domain flexibility and analysis scalability. Therefore, a next-generation aircraft conceptual design environment (NextADE) is needed. To build the NextADE, the framework and the data management problem are the two major problems that need to be addressed at the forefront. Solving these two problems, particularly the data management problem, is the focus of this research. In this dissertation, in light of AEEs, a distributed object-oriented framework is first formulated and tested for the NextADE. In order to improve interoperability and simplify the integration of heterogeneous application tools, data management is one of the major problems that need to be tackled.
To solve this problem, taking into account the characteristics of aircraft conceptual design data, a robust, extensible object-oriented data model is then proposed according to the distributed object-oriented framework. By overcoming the shortcomings of the traditional approach of modeling aircraft conceptual design data, this data model makes it possible to capture specific detailed information of aircraft conceptual design without sacrificing generality, which is one of the most desired features of a data model for aircraft conceptual design. Based upon this data model, a prototype of the data management system, which is one of the fundamental building blocks of the NextADE, is implemented utilizing state-of-the-art information technologies. Using a general-purpose integration software package to demonstrate the efficacy of the proposed framework and the data management system, the NextADE is initially implemented by integrating the prototype of the data management system with other building blocks of the design environment, such as disciplinary analysis programs and mission analysis programs. As experiments, two case studies are conducted in the integrated design environments. One is based upon a simplified conceptual design of a notional conventional aircraft; the other is a simplified conceptual design of an unconventional aircraft. As a result of the experiments, the proposed framework and the data management approach are shown to be feasible solutions to the research problems.
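
    An extensible object-oriented design data model of the kind described can be sketched with generic components carrying typed attributes; the class, attribute names, and values below are hypothetical, chosen only to illustrate "specific detail without sacrificing generality."

```python
# Sketch of an extensible design data model: a generic component holds
# arbitrary named attributes (with units) and child components, so
# discipline-specific detail can be added without changing the schema.

from dataclasses import dataclass, field

@dataclass
class DesignComponent:
    name: str
    attributes: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def set(self, key, value, units=None):
        self.attributes[key] = (value, units)
        return self              # allow chained attribute setting

    def get(self, key):
        return self.attributes[key][0]

# Hypothetical values for illustration only.
wing = DesignComponent("wing").set("span", 34.3, "m").set("area", 122.6, "m^2")
aircraft = DesignComponent("aircraft", children=[wing])
print(aircraft.children[0].get("span"))  # 34.3
```

    Because attributes are data rather than fixed class fields, a new discipline's parameters slot in without schema changes, at the cost of weaker compile-time checking; a production data model would layer validation on top.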

  8. Air traffic control by distributed management in a MLS environment

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.; Parkin, L.; Hart, S.

    1977-01-01

    The microwave landing system (MLS) is a technically feasible means for increasing runway capacity, since it could support curved approaches to a short final. The shorter the final segment of the approach, the wider the variety of speed mixes possible, so that, theoretically, capacity would ultimately be limited by runway occupancy time only. An experiment contrasted air traffic control in an MLS environment under a centralized form of management and under distributed management, which was supported by a traffic situation display in each of the 3 piloted simulators. Objective flight data, verbal communication, and subjective responses were recorded on 18 trial runs lasting about 20 minutes each. The results were in general agreement with previous distributed management research. In particular, distributed management permitted a smaller spread of intercrossing times, and both pilots and controllers perceived distributed management as the more 'ideal' system in this task. It is concluded from this and previous research that distributed management offers a viable alternative to centralized management, with definite potential for dealing with dense traffic in a safe, orderly and expeditious manner.

  9. The Impact of Meteoroid Streams on the Lunar Atmosphere and Dust Environment During the LADEE Mission

    NASA Technical Reports Server (NTRS)

    Stubbs, T. J.; Glenar, D. A.; Wang, Y.; Hermalyn, B.; Sarantos, M.; Colaprete, A.; Elphic, R. C.

    2015-01-01

    The scientific objectives of the Lunar Atmosphere and Dust Environment Explorer (LADEE) mission are: (1) determine the composition of the lunar atmosphere and investigate the processes controlling its distribution and variability - sources, sinks, and surface interactions; and (2) characterize the lunar exospheric dust environment, measure its spatial and temporal variability, and its influences on the lunar atmosphere. Impacts on the lunar surface from meteoroid streams encountered by the Earth-Moon system are anticipated to result in enhancements in both the lunar atmosphere and the dust environment. Here we describe the annual meteoroid streams expected to be incident at the Moon during the LADEE mission, and their anticipated effects on the lunar environment.

  10. Tracking multiple objects is limited only by object spacing, not by speed, time, or capacity.

    PubMed

    Franconeri, S L; Jonathan, S V; Scimeca, J M

    2010-07-01

    In dealing with a dynamic world, people have the ability to maintain selective attention on a subset of moving objects in the environment. Performance in such multiple-object tracking is limited by three primary factors: the number of objects that one can track, the speed at which one can track them, and how close together they can be. We argue that this last limit, of object spacing, is the root cause of all performance constraints in multiple-object tracking. In two experiments, we found that as long as the distribution of object spacing is held constant, tracking performance is unaffected by large changes in object speed and tracking time. These results suggest that barring object-spacing constraints, people could reliably track an unlimited number of objects as fast as they could track a single object.

  11. Guest Editor's introduction: Selected papers from the 4th USENIX Conference on Object-Oriented Technologies and Systems

    NASA Astrophysics Data System (ADS)

    Sventek, Joe

    1998-12-01

    Hewlett-Packard Laboratories, 1501 Page Mill Road, Palo Alto, CA 94304, USA

    The USENIX Conference on Object-Oriented Technologies and Systems (COOTS) is held annually in the late spring. The conference evolved from a set of C++ workshops that were held under the auspices of USENIX, the first of which met in 1989. Given the growing diverse interest in object-oriented technologies, the C++ focus of the workshop eventually became too narrow, with the result that the scope was widened in 1995 to include object-oriented technologies and systems. COOTS is intended to showcase advanced R&D efforts in object-oriented technologies and software systems. The conference emphasizes experimental research and experience gained by using object-oriented techniques and languages to build complex software systems that meet real-world needs. COOTS solicits papers in the following general areas: application of, and experiences with, object-oriented technologies in particular domains (e.g. financial, medical, telecommunication); the architecture and implementation of distributed object systems (e.g. CORBA, DCOM, RMI); object-oriented programming and specification languages; object-oriented design and analysis. The 4th meeting of COOTS was held 27-30 April 1998 at the El Dorado Hotel, Santa Fe, New Mexico, USA. Several tutorials were given. The technical program proper consisted of a single track of six sessions, with three paper presentations per session. A keynote address and a provocative panel session rounded out the technical program. The program committee reviewed 56 papers, selecting the best 18 for presentation in the technical sessions. While we solicit papers across the spectrum of applications of object-oriented technologies, this year there was a predominance of distributed, object-oriented papers. The accepted papers reflected this asymmetry, with 15 papers on distributed objects and 3 papers on object-oriented languages.
    The papers in this special issue are the six best distributed object papers (in the opinion of the program committee). They represent the diversity of research in this particular area, and should give the reader a good idea of the types of papers presented at COOTS as well as the calibre of the work so presented.

    The paper by Jain, Widoff and Schmidt explores the suitability of Java for writing performance-sensitive distributed applications. Despite the popularity of Java, there are many concerns about its efficiency; in particular, networking and computation performance are key concerns when considering the use of Java to develop performance-sensitive distributed applications. This paper makes three contributions to the study of Java for these applications: it describes an architecture using Java and the Web to develop MedJava, which is a distributed electronic medical imaging system with stringent networking and computation requirements; it presents benchmarks of MedJava image processing and compares the results to the performance of xv, which is an equivalent image processing application written in C; it presents performance benchmarks using Java as a transport interface to exchange large medical images over high-speed ATM networks. The paper by Little and Shrivastava covers the integration of several important topics: transactions, distributed systems, Java, the Internet and security. The usefulness of this paper lies in the synthesis of an effective solution applying work in different areas of computing to the Java environment. Securing applications constructed from distributed objects is important if these applications are to be used in mission-critical situations. Delegation is one aspect of distributed system security that is necessary for such applications. The paper by Nagaratnam and Lea describes a secure delegation model for Java-based, distributed object environments.
    The paper by Frølund and Koistinen addresses the topical issue of providing a common way for describing Quality-of-Service (QoS) features in distributed, object-oriented systems. They present a general QoS language, QML, that can be used to capture QoS properties as part of a design. They also show how to extend UML to support QML concepts. The paper by Szymaszek, Uszok and Zielinski discusses the important issue of efficient implementation and usage of fine-grained objects in CORBA-based applications. Fine-grained objects can have serious ramifications on overall application performance and scalability, and the paper suggests that such objects should not be treated as first-class CORBA objects, proposing instead the use of collections and smart proxies for efficient implementation. The paper by Milojicic, LaForge and Chauhan describes a mobile objects and agents infrastructure. Their particular research has focused on communication support across agent migration and extensive resource control. The paper also discusses issues regarding interoperation between agent systems.

    Acknowledgments: The editor wishes to thank all of the authors, reviewers and publishers. Without their excellent work, and the contribution of their valuable time, this special issue would not have been possible.

  12. Debris mapping sensor technology project summary: Technology flight experiments program area of the space platforms technology program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The topics presented are covered in viewgraph form. Programmatic objectives are: (1) to improve characterization of the orbital debris environment; and (2) to provide a passive sensor test bed for debris collision detection systems. Technical objectives are: (1) to study LEO debris altitude, size and temperature distribution down to 1 mm particles; (2) to quantify ground based radar and optical data ambiguities; and (3) to optimize debris detection strategies.

  13. UBioLab: a web-laboratory for ubiquitous in-silico experiments.

    PubMed

    Bartocci, Ezio; Cacciagrano, Diletta; Di Berardini, Maria Rita; Merelli, Emanuela; Vito, Leonardo

    2012-07-09

    The huge and dynamic amount of bioinformatic resources (e.g., data and tools) available nowadays on the Internet represents a big challenge for biologists (as concerns their management and visualization) and for bioinformaticians (as concerns the possibility of rapidly creating and executing in-silico experiments involving resources and activities spread over the WWW hyperspace). Any framework aiming at integrating such resources as in a physical laboratory must tackle, and possibly handle in a transparent and uniform way, aspects concerning physical distribution, semantic heterogeneity, and the co-existence of different computational paradigms and, as a consequence, of different invocation interfaces (i.e., OGSA for Grid nodes, SOAP for Web Services, Java RMI for Java objects, etc.). The UBioLab framework has been designed and developed as a prototype following the above objective. Several architectural features, such as being fully Web-based and combining domain ontologies, Semantic Web, and workflow techniques, give evidence of an effort in this direction. The integration of a semantic knowledge management system for distributed (bioinformatic) resources, a semantic-driven graphic environment for defining and monitoring ubiquitous workflows, and an intelligent agent-based technology for their distributed execution allows UBioLab to be a semantic guide for bioinformaticians and biologists, providing (i) a flexible environment for visualizing, organizing and inferring any (semantic and computational) "type" of domain knowledge (e.g., resources and activities, expressed in a declarative form), (ii) a powerful engine for defining and storing semantic-driven ubiquitous in-silico experiments on the domain hyperspace, as well as (iii) a transparent, automatic and distributed environment for correct experiment executions.

  14. Robot soccer anywhere: achieving persistent autonomous navigation, mapping, and object vision tracking in dynamic environments

    NASA Astrophysics Data System (ADS)

    Dragone, Mauro; O'Donoghue, Ruadhan; Leonard, John J.; O'Hare, Gregory; Duffy, Brian; Patrikalakis, Andrew; Leederkerken, Jacques

    2005-06-01

    The paper describes an ongoing effort to enable autonomous mobile robots to play soccer in unstructured, everyday environments. Unlike conventional robot soccer competitions that are usually held on purpose-built robot soccer "fields", in our work we seek to develop the capability for robots to demonstrate aspects of soccer-playing in more diverse environments, such as schools, hospitals, or shopping malls, with static obstacles (furniture) and dynamic natural obstacles (people). This problem of "Soccer Anywhere" presents numerous research challenges including: (1) Simultaneous Localization and Mapping (SLAM) in dynamic, unstructured environments, (2) software control architectures for decentralized, distributed control of mobile agents, (3) integration of vision-based object tracking with dynamic control, and (4) social interaction with human participants. In addition to the intrinsic research merit of these topics, we believe that this capability would prove useful for outreach activities, in demonstrating robotics technology to primary and secondary school students, to motivate them to pursue careers in science and engineering.

  15. NoSOCS in SDSS - VI. The environmental dependence of AGN in clusters and field in the local Universe

    NASA Astrophysics Data System (ADS)

    Lopes, P. A. A.; Ribeiro, A. L. B.; Rembold, S. B.

    2017-11-01

    We investigated the variation in the fraction of optical active galactic nuclei (AGN) hosts with stellar mass, as well as their local and global environments. Our sample is composed of cluster members and field galaxies at z ≤ 0.1 and we consider only strong AGN. We find a strong variation in the AGN fraction (FAGN) with stellar mass. The field population shows a higher AGN fraction than the global cluster population, especially for objects with log M* > 10.6. Hence, we restricted our analysis to more massive objects. We detected a smooth variation in FAGN with local stellar mass density for cluster objects, reaching a plateau in the field environment. As a function of cluster-centric distance we verify that FAGN is roughly constant for R > R200, but shows a steep decline inwards. We have also verified the dependence of the AGN population on cluster velocity dispersion, finding a constant behaviour for low-mass systems (σP ≲ 650-700 km s-1). However, there is a strong decline in FAGN for higher mass clusters (>700 km s-1). When comparing FAGN in clusters with or without substructure, we only find different results for objects at large radii (R > R200), in the sense that clusters with substructure present some excess in the AGN fraction. Finally, we have found that the phase-space distribution of AGN cluster members is significantly different from that of other populations. Given the environmental dependence of FAGN and their phase-space distribution, we interpret AGN activity as the result of galaxy interactions, favoured in environments where the relative velocities are low, typical of the field, low-mass groups or cluster outskirts.

  16. The component-based architecture of the HELIOS medical software engineering environment.

    PubMed

    Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C

    1994-12-01

    The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment, devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made up of a set of software components federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components: the Analysis-Design Environment, the Object Information System and the Interface Manager. HELIOS services consist of a collection of toolkits providing the necessary facilities to medical application developers. They include Image Related services, a Natural Language Processor, a Decision Support System and Connection services. The project gives special attention to both object-oriented approaches and software re-usability, which are considered crucial steps towards the development of more reliable, coherent and integrated applications.
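The idea of components federated by a common communication channel, as in the HELIOS Unification Bus, follows a publish/subscribe pattern. The toy bus below is an illustrative sketch of that pattern only; the class and topic names are assumptions, not the HELIOS implementation:

```python
class UnificationBus:
    """Minimal publish/subscribe bus in the spirit of a component
    federation channel (illustrative only, not the HELIOS software)."""

    def __init__(self):
        # topic name -> list of handler callables
        self.subscribers = {}

    def subscribe(self, topic, handler):
        """Register a component's handler for messages on a topic."""
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        """Deliver a message to every component subscribed to the topic."""
        for handler in self.subscribers.get(topic, []):
            handler(message)
```

A component such as an image service would subscribe to the topics it understands and receive only those messages, keeping the components decoupled from one another.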

  17. Systems Issues In Terrestrial Fiber Optic Link Reliability

    NASA Astrophysics Data System (ADS)

    Spencer, James L.; Lewin, Barry R.; Lee, T. Frank S.

    1990-01-01

    This paper reviews fiber optic system reliability issues from three different viewpoints - availability, operating environment, and evolving technologies. Present availability objectives for interoffice links and for the distribution loop must be re-examined for applications such as the Synchronous Optical Network (SONET), Fiber-to-the-Home (FTTH), and analog services. The hostile operating environments of emerging applications (such as FTTH) must be carefully considered in system design as well as reliability assessments. Finally, evolving technologies might require the development of new reliability testing strategies.

  18. Characterization of Magnetospheric Spacecraft Charging Environments Using the LANL Magnetospheric Plasma Analyzer Data Set

    NASA Technical Reports Server (NTRS)

    Hardage, Donna (Technical Monitor); Davis, V. A.; Mandell, M. J.; Thomsen, M. F.

    2003-01-01

    An improved specification of the plasma environment has been developed for use in modeling spacecraft charging. It was developed by statistically analyzing a large part of the LANL Magnetospheric Plasma Analyzer (MPA) data set for ion and electron spectral signature correlation with spacecraft charging, including anisotropies. The objective is to identify a relatively simple characterization of the full particle distributions that yields an accurate prediction of the observed charging under a wide variety of conditions.

  19. Electric Propulsion Test & Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST) (Briefing Charts)

    DTIC Science & Technology

    2015-04-01

    Electric Propulsion Test & Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST), AFOSR T&E Program Review (briefing charts), 13-17 April 2015. Dr. Daniel L. Brown, In-Space Propulsion Branch (RQRS), Aerospace Systems Directorate. Distribution Statement A: Approved for public release; distribution is unlimited. EP TEMPEST (Lab Task, FY14-FY16): program goals and objectives.

  20. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  1. A Virtual Bioinformatics Knowledge Environment for Early Cancer Detection

    NASA Technical Reports Server (NTRS)

    Crichton, Daniel; Srivastava, Sudhir; Johnsey, Donald

    2003-01-01

    Discovery of disease biomarkers for cancer is a leading focus of early detection. The National Cancer Institute created a network of collaborating institutions focused on the discovery and validation of cancer biomarkers, called the Early Detection Research Network (EDRN). Informatics plays a key role in enabling a virtual knowledge environment that provides scientists real-time access to distributed data sets located at research institutions across the nation. The distributed and heterogeneous nature of the collaboration makes data sharing across institutions very difficult. EDRN has developed a comprehensive informatics effort focused on developing a national infrastructure enabling seamless access, sharing and discovery of science data resources across all EDRN sites. This paper will discuss the EDRN knowledge system architecture, its objectives and its accomplishments.

  2. Graph Partitioning for Parallel Applications in Heterogeneous Grid Environments

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Kumar, Shailendra; Das, Sajal K.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The problem of partitioning irregular graphs and meshes for parallel computations on homogeneous systems has been extensively studied. However, these partitioning schemes fail when the target system architecture exhibits heterogeneity in resource characteristics. With the emergence of technologies such as the Grid, it is imperative to study the partitioning problem taking into consideration the differing capabilities of such distributed heterogeneous systems. In our model, the heterogeneous system consists of processors with varying processing power and an underlying non-uniform communication network. We present in this paper a novel multilevel partitioning scheme for irregular graphs and meshes that takes into account issues pertinent to Grid computing environments. Our partitioning algorithm, called MiniMax, generates and maps partitions onto a heterogeneous system with the objective of minimizing the maximum execution time of the parallel distributed application. For the experimental performance study, we have considered both a realistic mesh problem from NASA and synthetic workloads. Simulation results demonstrate that MiniMax generates high-quality partitions for various classes of applications targeted for parallel execution in a distributed heterogeneous environment.
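The min-max objective described above, mapping partitions so that the slowest processor finishes as early as possible, can be illustrated with a simple greedy heuristic. This sketch is not the published multilevel MiniMax algorithm (it ignores communication costs and refinement); partition loads and processor speeds are assumed inputs:

```python
def map_partitions(partition_loads, proc_speeds):
    """Greedy makespan heuristic: assign each partition (heaviest first)
    to the processor that minimizes the resulting maximum finish time.
    Illustrative only -- not the multilevel MiniMax scheme of the paper."""
    finish = [0.0] * len(proc_speeds)   # accumulated time per processor
    assignment = {}
    for pid, load in sorted(enumerate(partition_loads), key=lambda x: -x[1]):
        # execution time of this partition on processor p is load / speed[p]
        best = min(range(len(proc_speeds)),
                   key=lambda p: finish[p] + load / proc_speeds[p])
        finish[best] += load / proc_speeds[best]
        assignment[pid] = best
    return assignment, max(finish)
```

For example, loads [4, 3, 2, 1] on processors with speeds [2.0, 1.0] yield a makespan of 3.5: the fast processor absorbs most of the work, exactly the behaviour a heterogeneity-aware mapper should show.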

  3. A Comparison of Alternative Distributed Dynamic Cluster Formation Techniques for Industrial Wireless Sensor Networks.

    PubMed

    Gholami, Mohammad; Brennan, Robert W

    2016-01-06

    In this paper, we investigate alternative distributed clustering techniques for wireless sensor node tracking in an industrial environment. The research builds on extant work on wireless sensor node clustering by reporting on: (1) the development of a novel distributed management approach for tracking mobile nodes in an industrial wireless sensor network; and (2) an objective comparison of alternative cluster management approaches for wireless sensor networks. To perform this comparison, we focus on two main clustering approaches proposed in the literature: pre-defined clusters and ad hoc clusters. These approaches are compared in the context of their reconfigurability: more specifically, we investigate the trade-off between the cost and the effectiveness of competing strategies aimed at adapting to changes in the sensing environment. To support this work, we introduce three new metrics: a cost/efficiency measure, a performance measure, and a resource consumption measure. The results of our experiments show that ad hoc clusters adapt more readily to changes in the sensing environment, but this higher level of adaptability is at the cost of overall efficiency.

  5. Informational model verification of ZVS Buck quasi-resonant DC-DC converter

    NASA Astrophysics Data System (ADS)

    Vakovsky, Dimiter; Hinov, Nikolay

    2016-12-01

    The aim of this paper is to create a polymorphic informational model of a ZVS Buck quasi-resonant DC-DC converter for modeling purposes. The model is built on flexible open standards for setting, storing, publishing and exchanging data in a distributed information environment. The resulting model supports the creation of many variants of different types, with different configurations of the constituent elements and different inner models of the examined object.
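The abstract's idea of an open, exchangeable data model can be sketched as a small serializable record. The field names and example values below are illustrative assumptions, not parameters taken from the paper:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class QuasiResonantBuck:
    """Hypothetical data model for exchanging converter configurations
    in an open, distributed format (field names are assumed)."""
    v_in: float   # input voltage [V]
    v_out: float  # output voltage [V]
    l_r: float    # resonant inductance [H]
    c_r: float    # resonant capacitance [F]

    def to_json(self):
        """Serialize to JSON, an open standard suitable for publishing
        and exchanging the model in a distributed environment."""
        return json.dumps(asdict(self))
```

Because the record is plain structured data, many variants with different element configurations can be created, stored, and exchanged without changing the consuming tools.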

  6. Distributed interactive virtual environments for collaborative experiential learning and training independent of distance over Internet2.

    PubMed

    Alverson, Dale C; Saiki, Stanley M; Jacobs, Joshua; Saland, Linda; Keep, Marcus F; Norenberg, Jeffrey; Baker, Rex; Nakatsu, Curtis; Kalishman, Summers; Lindberg, Marlene; Wax, Diane; Mowafi, Moad; Summers, Kenneth L; Holten, James R; Greenfield, John A; Aalseth, Edward; Nickles, David; Sherstyuk, Andrei; Haines, Karen; Caudell, Thomas P

    2004-01-01

    Medical knowledge and skills essential for tomorrow's healthcare professionals continue to change faster than ever before, creating new demands in medical education. Project TOUCH (Telehealth Outreach for Unified Community Health) has been developing methods to enhance learning by coupling innovations in medical education with advanced technology in high performance computing and next generation Internet2 embedded in virtual reality environments (VRE), artificial intelligence and experiential active learning. Simulations have been used in education and training to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes and ultimately improve performance by subsequent avoidance of those mistakes. Distributed virtual interactive environments are used over distance to enable learning and participation in dynamic, problem-based, clinical, artificial intelligence rules-based, virtual simulations. The virtual reality patient is programmed to dynamically change over time and respond to the manipulations by the learner. Participants are fully immersed within the VRE platform using a head-mounted display and tracker system. Navigation, locomotion and handling of objects are accomplished using a joy-wand. Distribution is managed via the Internet2 Access Grid using point-to-point or multi-casting connectivity through which the participants can interact. Medical students in Hawaii and New Mexico (NM) participated collaboratively in problem solving and management of a simulated patient with a closed head injury in the VRE; dividing tasks, handing off objects, and functioning as a team. Students stated that opportunities to make mistakes and repeat actions in the VRE were extremely helpful in learning specific principles. The VRE created higher performance expectations and some anxiety among VRE users. VRE orientation was adequate but students needed time to adapt and practice in order to improve efficiency.
This was also demonstrated successfully between Western Australia and UNM. We successfully demonstrated the ability to fully immerse participants in a distributed virtual environment, independent of distance, for collaborative team interaction in medical simulation designed for education and training. The ability to make mistakes in a safe environment is well received by students and has a positive impact on their understanding, as well as their memory of the principles involved in correcting those mistakes. Bringing people together as virtual teams for interactive experiential learning and collaborative training, independent of distance, provides a platform for distributed "just-in-time" training, performance assessment and credentialing. Further validation is necessary to determine the potential value of the distributed VRE in knowledge transfer and improved future performance, and should entail training participants to competence in using these tools.

  7. Adopting SCORM 1.2 Standards in a Courseware Production Environment

    ERIC Educational Resources Information Center

    Barker, Bradley

    2004-01-01

    The Sharable Content Object Reference Model (SCORM) is a technology framework for Web-based learning technology. Originated by the Department of Defense and accelerated by the Advanced Distributed Learning initiative, SCORM was released in January 2000 (ADL, 2003). The goals of SCORM are to decrease the cost of training, while increasing the…

  8. Learning Different Light Prior Distributions for Different Contexts

    ERIC Educational Resources Information Center

    Kerrigan, Iona S.; Adams, Wendy J.

    2013-01-01

    The pattern of shading across an image can provide a rich sense of object shape. Our ability to use shading information is remarkable given the infinite possible combinations of illumination, shape and reflectance that could have produced any given image. Illumination can change dramatically across environments (e.g. indoor vs. outdoor) and times…

  9. Synoptic and Mesoscale Climatologies of Severe Local Storms for the American Midwest.

    NASA Astrophysics Data System (ADS)

    Arnold, David Leslie

    This study investigates the synoptic and mesoscale environments associated with severe local storms (SELS) in the heart of the American Midwest. This region includes west-central Illinois, most of Indiana, the extreme western counties of Ohio, and a small part of northeastern Kentucky. The primary objectives of this study are to determine the surface and middle-tropospheric synoptic circulation patterns and thermodynamic and kinematic environments associated with SELS event types (tornadoes, hail, severe straight-line winds), and to assess the degree to which the synoptic circulation patterns and meso-beta scale kinematic and thermodynamic climatology of the Midwest differ from that of the Great Plains. A secondary objective is to investigate the possible role that land-surface-atmosphere interactions play in the spatial distribution of SELS. A new subjective synoptic typing scheme is developed and applied to determine the synoptic-scale circulation patterns associated with the occurrence of SELS event types. This scheme is based on a combination of surface and middle-tropospheric patterns. Thermodynamic and kinematic parameters are analyzed to determine meso-scale environments favorable for the development of SELS. Results indicate that key synoptic-scale circulation patterns, and specific ranges of thermodynamic and kinematic parameters are related to specific SELS event types. These circulation types and ranges of thermodynamic and kinematic parameters may be used to help improve the medium-range forecasting of severe local storms. Results of the secondary objective reveal that the spatial distribution of SELS events is clustered within the study region, and most occur under a negative climate division-level soil moisture gradient; that is, a drier upwind division than the division in which the event occurs. Moreover, the spatial distribution of SELS events is compared against a map of soil types and vegetation. 
The resulting distribution depicts a visual correlation between the primary soil and vegetative boundaries and clusters of SELS. This supports the likely role of meso-scale land-surface-atmosphere interactions in severe weather development for humid lowlands of the Midwest United States.

  10. Network-based Modeling of Mesoscale Catchments - The Hydrology Perspective of Glowa-danube

    NASA Astrophysics Data System (ADS)

    Ludwig, R.; Escher-Vetter, H.; Hennicker, R.; Mauser, W.; Niemeyer, S.; Reichstein, M.; Tenhunen, J.

    Within the GLOWA initiative of the German Ministry for Research and Education (BMBF), the project GLOWA-Danube is funded to establish a transdisciplinary network-based decision support tool for water-related issues in the Upper Danube watershed. It aims to develop and validate integration techniques, integrated models and integrated monitoring procedures and to implement them in the network-based Decision Support System DANUBIA. An accurate description of processes involved in energy, water and matter fluxes and turnovers requires an intense collaboration and exchange of water-related expertise across different scientific disciplines. DANUBIA is conceived as a distributed expert network and is developed on the basis of re-usable, refinable, and documented sub-models. In order to synthesize a common understanding between the project partners, a standardized notation of parameters and functions and a platform-independent structure of computational methods and interfaces has been established using the Unified Modeling Language UML. DANUBIA is object-oriented, spatially distributed and raster-based at its core. It applies the concept of "proxels" (Process Pixels) as its basic object: a proxel has different dimensions depending on the viewing scale and connects to its environment through fluxes. The presented study excerpts the hydrological viewpoint of GLOWA-Danube, its approach of model coupling and network-based communication (using the Remote Method Invocation, RMI), the object-oriented technology to simulate physical processes and interactions at the land surface and the methodology to treat the issue of spatial and temporal scaling in large, heterogeneous catchments. 
The mechanisms applied to communicate data and model parameters across the typical discipline borders will be demonstrated from the perspective of a land-surface object, which comprises the capabilities of interdependent expert models for snowmelt, soil water movement, runoff formation, plant growth and radiation balance in a distributed Java-based modeling environment. The coupling to the adjacent physical objects of atmosphere, groundwater and river network will also be addressed.
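The "proxel" concept above, a raster cell object that exchanges fluxes with its neighbours, can be sketched as a small class. This is an illustrative toy (the attribute names and the linear exchange rule are assumptions, not the DANUBIA/UML interfaces, which are Java-based):

```python
class Proxel:
    """'Process pixel' sketch: a raster cell that connects to its
    environment through fluxes (illustrative, not the DANUBIA model)."""

    def __init__(self, row, col, storage=0.0):
        self.row, self.col = row, col
        self.storage = storage      # e.g. soil water storage [mm], assumed
        self.neighbours = []        # adjacent proxels

    def exchange(self, dt):
        """Apply a simple gradient-driven flux to each neighbour;
        the exchange coefficient is purely illustrative."""
        rate = 0.1
        for nb in self.neighbours:
            flux = rate * (self.storage - nb.storage) * dt
            self.storage -= flux
            nb.storage += flux
```

A real proxel would couple many interdependent process models (snowmelt, soil water, runoff, plant growth, radiation), but the pattern is the same: local state plus flux-based coupling to adjacent objects. Note that the exchange conserves the total stored quantity across the cells involved.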

  11. Harnessing Orbital Debris to Sense the Space Environment

    NASA Astrophysics Data System (ADS)

    Mutschler, S.; Axelrad, P.; Matsuo, T.

    A key requirement for accurate space situational awareness (SSA) is knowledge of the non-conservative forces that act on space objects. These effects vary temporally and spatially, driven by the dynamical behavior of space weather. Existing SSA algorithms adjust space weather models based on observations of calibration satellites. However, lack of sufficient data and mismodeling of non-conservative forces cause inaccuracies in space object motion prediction. The uncontrolled nature of debris makes it particularly sensitive to the variations in space weather. Our research takes advantage of this behavior by inverting observations of debris objects to infer the space environment parameters causing their motion. In addition, this research will produce more accurate predictions of the motion of debris objects. The hypothesis of this research is that it is possible to utilize a "cluster" of debris objects, objects within relatively close proximity of each other, to sense their local environment. We focus on deriving parameters of an atmospheric density model to more precisely predict the drag force on LEO objects. An Ensemble Kalman Filter (EnKF) is used for assimilation; during the measurement update, the prior ensemble is transformed into the posterior ensemble in a manner that does not require inversion of large matrices. A prior ensemble is utilized to empirically determine the nonlinear relationship between measurements and density parameters. The filter estimates an extended state that includes position and velocity of the debris object, and atmospheric density parameters. The density is parameterized as a grid of values, distributed by latitude and local sidereal time over a spherical shell encompassing Earth. This research focuses on LEO object motion, but it can also be extended to additional orbital regimes for observation and refinement of magnetic field and solar radiation models. 
An observability analysis of the proposed approach is presented in terms of the measurement cadence necessary to estimate the local space environment.
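The EnKF measurement update described above can be sketched for the scalar-measurement case, where the Kalman gain follows directly from ensemble statistics and no large matrix inversion is needed. This is a generic stochastic EnKF illustration, not the extended-state filter of the abstract; the variable names and the scalar simplification are assumptions:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_var, rng):
    """Stochastic EnKF measurement update for one scalar observation.
    `ensemble` is (n_members, n_state); `obs_op` maps a state vector to
    a predicted measurement and may be nonlinear. Illustrative sketch."""
    n, _ = ensemble.shape
    Hx = np.array([obs_op(x) for x in ensemble])   # predicted measurements
    X = ensemble - ensemble.mean(axis=0)           # state anomalies
    Y = Hx - Hx.mean(axis=0)                       # measurement anomalies
    # Kalman gain K = Pxy / (Pyy + R); the (n-1) factors cancel into one term,
    # so only a scalar division is required (no matrix inversion).
    K = (X.T @ Y) / ((Y @ Y) + (n - 1) * obs_var)
    # Perturbed observations give the posterior the correct spread.
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=n)
    return ensemble + np.outer(perturbed - Hx, K)
```

Because the gain is computed from the ensemble itself, the nonlinear relationship between measurements and state (here, density parameters) is captured empirically, which is the property the abstract relies on.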

  12. Object-oriented fault tree models applied to system diagnosis

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, F. A.

    1990-01-01

    When a diagnosis system is used in a dynamic environment, such as the distributed computer system planned for use on Space Station Freedom, it must execute quickly and its knowledge base must be easily updated. Representing system knowledge as object-oriented augmented fault trees provides both features. The diagnosis system described here is based on the failure cause identification process of the diagnostic system described by Narayanan and Viswanadham. Their system has been enhanced in this implementation by replacing the knowledge base of if-then rules with an object-oriented fault tree representation. This allows the system to perform its task much faster and facilitates dynamic updating of the knowledge base in a changing diagnosis environment. Accessing the information contained in the objects is more efficient than performing a lookup operation on an indexed rule base. Additionally, the object-oriented fault trees can be easily updated to represent current system status. This paper describes the fault tree representation, the diagnosis algorithm extensions, and an example application of this system. Comparisons are made between the object-oriented fault tree knowledge structure solution and one implementation of a rule-based solution. Plans for future work on this system are also discussed.
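The object-oriented fault-tree idea above, where failure knowledge lives in tree-node objects rather than an indexed rule base, can be sketched with a minimal node class. This is an illustration of the representation only, not the Narayanan-Viswanadham system or the enhanced implementation described in the abstract; the class and gate names are assumptions:

```python
class FaultNode:
    """Minimal object-oriented fault-tree node. Leaf nodes are basic
    events; internal nodes combine children with an AND/OR gate."""

    def __init__(self, name, gate=None, children=()):
        self.name, self.gate, self.children = name, gate, list(children)

    def failed(self, observations):
        """Evaluate this node against observed component states
        (a dict of basic-event name -> bool)."""
        if not self.children:                        # basic event: look it up
            return observations.get(self.name, False)
        states = [c.failed(observations) for c in self.children]
        return all(states) if self.gate == "AND" else any(states)

    def causes(self, observations):
        """Failure cause identification: collect the failed basic
        events that explain this node's failure."""
        if not self.children:
            return [self.name] if observations.get(self.name, False) else []
        if not self.failed(observations):
            return []
        return [cause for c in self.children for cause in c.causes(observations)]
```

Because diagnosis walks the object structure directly, looking up causes is a tree traversal rather than a scan of an if-then rule base, and updating the knowledge base amounts to attaching or detaching node objects.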

  13. Providing the Persistent Data Storage in a Software Engineering Environment Using Java/COBRA and a DBMS

    NASA Technical Reports Server (NTRS)

    Dhaliwal, Swarn S.

    1997-01-01

    An investigation was undertaken to build the software foundation for the WHERE (Web-based Hyper-text Environment for Requirements Engineering) project. The TCM (Toolkit for Conceptual Modeling) was chosen as the foundation software for the WHERE project, which aims to provide an environment for facilitating collaboration among geographically distributed people involved in the Requirements Engineering process. The TCM is a collection of diagram and table editors and has been implemented in the C++ programming language. The C++ implementation of the TCM was translated into Java in order to allow the editors to be used for building various functionality of the WHERE project; the WHERE project intends to use the Web as its communication backbone. One of the limitations of the translated software (TcmJava), which militated against its use in the WHERE project, was the persistent data management mechanism it inherited from the original TCM, which was designed to be used in standalone applications. Before TcmJava editors could be used as part of the multi-user, geographically distributed applications of the WHERE project, a persistent storage mechanism had to be built which would allow data communication over the Internet, using the capabilities of the Web. An approach involving features of Java, CORBA (Common Object Request Broker Architecture), the Web, a middleware layer (Java Relational Binding, JRB), and a database server was used to build the persistent data management infrastructure for the WHERE project. The developed infrastructure allows a TcmJava editor to be downloaded and run from a network host by using a JDK 1.1 (Java Developer's Kit) compatible Web browser. The aforementioned editor establishes a connection with a server by using the ORB (Object Request Broker) software and stores/retrieves data in/from the server. The server consists of one or more CORBA objects, depending on whether the data is to be made persistent on a single server or on multiple servers. 
The CORBA object providing the persistent data server is implemented using the Java programming language. It uses the JRB to store/retrieve data in/from a relational database server. The persistent data management system provides transaction and user management facilities which allow multi-user, distributed access to the stored data in a secure manner.

  14. OXC management and control system architecture with scalability, maintenance, and distributed managing environment

    NASA Astrophysics Data System (ADS)

    Park, Soomyung; Joo, Seong-Soon; Yae, Byung-Ho; Lee, Jong-Hyun

    2002-07-01

    In this paper, we present an Optical Cross-Connect (OXC) management and control system architecture that is scalable, robustly maintainable, and provides a distributed managing environment in the optical transport network. The OXC system we are developing, which is divided into hardware and internal and external software, is made up of the OXC subsystem, with the Optical Transport Network (OTN) sub-layer hardware and the optical switch control system; the signaling control protocol subsystem, performing User-to-Network Interface (UNI) and Network-to-Network Interface (NNI) signaling control; the Operation, Administration, Maintenance & Provisioning (OAM&P) subsystem; and the network management subsystem. The OXC management control system can support the flexible expansion of the optical transport network, provide connectivity to heterogeneous external network elements, allow components to be added or deleted without interrupting OAM&P services, be remotely operated, provide a global view and detailed information for network planners and operators, and offer a Common Object Request Broker Architecture (CORBA) based open system architecture to which intelligent service networking functions can easily be added in the future. To meet these considerations, we adopt an object-oriented development method throughout system analysis, design, and implementation to build an OXC management control system with scalability, maintainability, and a distributed managing environment. As a consequence, the componentification of the OXC operation management functions of each subsystem makes maintenance robust and increases code reusability. The component-based OXC management control system architecture will thus be flexible and scalable in nature.

  15. Distributed Motor Controller (DMC) for Operation in Extreme Environments

    NASA Technical Reports Server (NTRS)

    McKinney, Colin M.; Yager, Jeremy A.; Mojarradi, Mohammad M.; Some, Rafi; Sirota, Allen; Kopf, Ted; Stern, Ryan; Hunter, Don

    2012-01-01

This paper presents an extreme-environment-capable Distributed Motor Controller (DMC) module suitable for operation within the distributed architectures of future spacecraft systems. This motor controller is designed to be a bus-based electronics module capable of operating a single brushless DC motor in extreme space environments: temperature (-120 C to +85 C required, -180 C to +100 C stretch goal); radiation (>20 krad required, >100 krad stretch goal); and >360 cycles of operation. Achieving this objective will result in a scalable, modular configuration for motor control with enhanced reliability that will greatly lower cost during the design, fabrication, and ATLO phases of future missions. At the heart of the DMC lies a pair of cold-capable Application Specific Integrated Circuits (ASICs) and a Field Programmable Gate Array (FPGA) that enable its miniaturization and operation in extreme environments. The ASICs are fabricated in the IBM 0.5 micron Silicon Germanium (SiGe) BiCMOS process and comprise analog circuitry that provides telemetry, sensor interfaces, and DMC health and status. The FPGA contains logic for motor control, status monitoring, and the spacecraft interface. Testing and characterization of these ASICs have demonstrated excellent functionality at cold temperatures (-135 C). The DMC module has demonstrated successful operation of a motor at temperature.

  16. Biogeography of anaerobic ammonia-oxidizing (anammox) bacteria

    PubMed Central

    Sonthiphand, Puntipar; Hall, Michael W.; Neufeld, Josh D.

    2014-01-01

Anaerobic ammonia-oxidizing (anammox) bacteria are able to oxidize ammonia and reduce nitrite to produce N2 gas. After being discovered in a wastewater treatment plant (WWTP), anammox bacteria were subsequently characterized in natural environments, including marine, estuarine, freshwater, and terrestrial habitats. Although anammox bacteria play an important role in removing fixed N from both engineered and natural ecosystems, broad-scale anammox bacterial distributions have not yet been summarized. The objectives of this study were to explore the global distribution and diversity of anammox bacteria and to identify factors that influence their biogeography. Over 6000 anammox 16S rRNA gene sequences from the public database were analyzed in this study. Data ordinations indicated that salinity was an important factor governing anammox bacterial distributions, with distinct populations inhabiting natural and engineered ecosystems. Gene phylogenies and rarefaction analysis demonstrated that freshwater environments and the marine water column harbored the highest and the lowest diversity of anammox bacteria, respectively. Co-occurrence network analysis indicated that Ca. Scalindua taxa connected strongly with other Ca. Scalindua taxa, whereas Ca. Brocadia co-occurred with taxa from both known and unknown anammox genera. Our survey provides a better understanding of the ecological factors affecting anammox bacterial distributions and a comprehensive baseline for understanding the relationships among anammox communities in global environments. PMID:25147546

  18. Multi-objects recognition for distributed intelligent sensor networks

    NASA Astrophysics Data System (ADS)

    He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.

    2008-04-01

This paper proposes an innovative approach to multi-object recognition for homeland security and defense based on intelligent sensor networks. Unlike conventional information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality, and real-time constraints. Furthermore, since a typical military network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat aircraft, and other resources, it is critical to develop intelligent data mining approaches that fuse different information resources to understand dynamic environments, to support decision-making processes, and ultimately to achieve mission goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the interesting regions in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects come with different feature sizes, we propose a feature scaling method to represent each object with the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and this knowledge is adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.
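The feature-scaling step, mapping segmented regions of different sizes onto a fixed number of dimensions before SVM training, might look like the following minimal sketch. The linear-interpolation resampling here is an assumption for illustration; the abstract only says "linear/nonlinear scaling and sampling techniques" are used.

```python
def rescale_features(features, target_len):
    """Linearly resample a variable-length feature vector to a fixed
    length, so objects of different sizes share one input dimensionality.
    (Hypothetical helper; the paper's exact scaling scheme may differ.)"""
    n = len(features)
    if n == target_len:
        return list(features)
    out = []
    for i in range(target_len):
        # map position i in the target vector to a fractional source index
        pos = i * (n - 1) / (target_len - 1) if target_len > 1 else 0
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(features[lo] * (1 - frac) + features[hi] * frac)
    return out

# Regions of different sizes map onto the same dimensionality,
# ready to be fed to a common classifier such as an SVM:
small_region = [0.1, 0.9, 0.4]
large_region = [0.1, 0.3, 0.5, 0.7, 0.9, 0.2, 0.4]
assert len(rescale_features(small_region, 5)) == len(rescale_features(large_region, 5)) == 5
```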

  19. The USL NASA PC R and D project: Detailed specifications of objects

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.; Hall, Philip P.; Moreau, Dennis R.; Triantafyllopoulos, Spiros

    1984-01-01

    The specifications for a number of projects which are to be implemented within the University of Southwestern Louisiana NASA PC R and D Project are discussed. The goals and objectives of the PC development project and the interrelationships of the various components are discussed. Six projects are described. They are a NASA/RECON simulator, a user interface to multiple remote information systems, evaluation of various personal computer systems, statistical analysis software development, interactive presentation system development, and the development of a distributed processing environment. The relationships of these projects to one another and to the goals and objectives of the overall project are discussed.

  20. Memory-Based Multiagent Coevolution Modeling for Robust Moving Object Tracking

    PubMed Central

    Wang, Yanjiang; Qi, Yujuan; Li, Yongping

    2013-01-01

The three-stage human brain memory model is incorporated into a multiagent coevolutionary process for finding the best match to the appearance of an object, and a memory-based multiagent coevolution algorithm for robustly tracking moving objects is presented in this paper. Each agent can remember, retrieve, or forget the appearance of the object through its own memory system, based on its own experience. A number of such memory-based agents are randomly distributed near the located object region and then mapped onto a 2D lattice-like environment for predicting the new location of the object through their coevolutionary behaviors, such as competition, recombination, and migration. Experimental results show that the proposed method can deal with large appearance changes and heavy occlusions when tracking a moving object. It can relocate the correct object after an appearance change or occlusion and outperforms traditional particle filter-based tracking methods. PMID:23843739
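The remember/retrieve/forget cycle of each agent's memory system could be sketched as follows. The capacities and the promotion rule from short-term to long-term memory are invented for illustration; the paper's actual three-stage model is not specified in the abstract.

```python
from collections import OrderedDict

class MemoryAgent:
    """Toy sketch of a remember/retrieve/forget cycle for object
    appearances (stage capacities and thresholds are invented)."""
    def __init__(self, short_capacity=3, promote_after=2):
        self.short = OrderedDict()   # appearance -> retrieval count
        self.long = set()            # permanently remembered appearances
        self.short_capacity = short_capacity
        self.promote_after = promote_after

    def remember(self, appearance):
        if appearance in self.long:
            return
        self.short[appearance] = self.short.get(appearance, 0)
        self.short.move_to_end(appearance)
        if len(self.short) > self.short_capacity:
            self.short.popitem(last=False)   # forget the oldest appearance

    def retrieve(self, appearance):
        """Return True if the appearance is known; repeated retrievals
        promote it from short-term to long-term memory."""
        if appearance in self.long:
            return True
        if appearance in self.short:
            self.short[appearance] += 1
            if self.short[appearance] >= self.promote_after:
                del self.short[appearance]
                self.long.add(appearance)
            return True
        return False

agent = MemoryAgent()
agent.remember("pose-A")
agent.retrieve("pose-A")
agent.retrieve("pose-A")          # second retrieval promotes to long-term
assert "pose-A" in agent.long
```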

  2. Distributed virtual environment for emergency medical training

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.; Garcia, Brian W.; Godsell-Stytz, Gayl M.

    1997-07-01

In many professions where individuals must work as a team in a high-stress environment to accomplish a time-critical task, individual and team performance can benefit from joint training using distributed virtual environments (DVEs). One professional field that lacks but needs a high-fidelity team training environment is emergency medicine. Currently, emergency department (ED) medical personnel train by using words to create a mental picture of a situation for the physician and staff, who then cooperate to solve the problems portrayed by the word picture. The need in emergency medicine for realistic virtual team training is critical because ED staff typically encounter rarely occurring but life-threatening situations only once in their careers and because ED teams currently have no realistic environment in which to practice their team skills. The resulting lack of experience and teamwork makes diagnosis and treatment more difficult. Virtual environment based training has the potential to redress these shortfalls. The objective of our research is to develop a state-of-the-art virtual environment for emergency medicine team training. The virtual emergency room (VER) allows ED physicians and medical staff to realistically prepare for emergency medical situations by performing triage, diagnosis, and treatment on virtual patients within an environment that provides them with the tools they require and the team environment they need to realistically perform these three tasks. Several issues must be addressed before this vision is realized. The key issues concern the distribution of computations; the doctor and staff interface to the virtual patient and ED equipment; the accurate simulation of individual patient organs' responses to injury, medication, and treatment; and accurate modeling of the symptoms and appearance of the patient while maintaining a real-time interaction capability. Our ongoing work addresses all of these issues.
In this paper we report on our prototype VER system and its distributed system architecture for an emergency department distributed virtual environment for emergency medical staff training. The virtual environment enables emergency department physicians and staff to develop their diagnostic and treatment skills using the virtual tools they need to perform diagnostic and treatment tasks. Virtual human imagery, and real-time virtual human response are used to create the virtual patient and present a scenario. Patient vital signs are available to the emergency department team as they manage the virtual case. The work reported here consists of the system architectures we developed for the distributed components of the virtual emergency room. The architectures we describe consist of the network level architecture as well as the software architecture for each actor within the virtual emergency room. We describe the role of distributed interactive simulation and other enabling technologies within the virtual emergency room project.

  3. Are Learning Style Preferences of Health Science Students Predictive of Their Attitudes towards E-Learning?

    ERIC Educational Resources Information Center

    Brown, Ted; Zoghi, Maryam; Williams, Brett; Jaberzadeh, Shapour; Roller, Louis; Palermo, Claire; McKenna, Lisa; Wright, Caroline; Baird, Marilyn; Schneider-Kolsky, Michal; Hewitt, Lesley; Sim, Jenny; Holt, Tangerine-Ann

    2009-01-01

    The objective for this study was to determine whether learning style preferences of health science students could predict their attitudes to e-learning. A survey comprising the "Index of Learning Styles" (ILS) and the "Online Learning Environment Survey" (OLES) was distributed to 2885 students enrolled in 10 different health…

  4. CHLORPYRIFOS ACCUMULATION PATTERNS FOR CHILD ACCESSIBLE SURFACES AND OBJECTIVES AND URINARY METABOLITE EXCRETION BY CHILDREN FOR TWO-WEEKS AFTER CRACK-AND-CREVICE APPLICATION

    EPA Science Inventory

    The Children's-Post-Pesticide-Application-Exposure-Study (CPPAES) was conducted to look at the distribution of chlorpyrifos within a home environment for a 2-week period following a routine professional crack-and-crevice application, and to determine the amount of the chlorpyrifo...

  5. Apollo 17 ultraviolet spectrometer experiment (S-169)

    NASA Technical Reports Server (NTRS)

    Fastie, W. G.

    1974-01-01

    The scientific objectives of the ultraviolet spectrometer experiment are discussed, along with design and operational details, instrument preparation and performance, and scientific results. Information gained from the experiment is given concerning the lunar atmosphere and albedo, zodiacal light, astronomical observations, spacecraft environment, and the distribution of atomic hydrogen in the solar system and in the earth's atmosphere.

  6. Forest cover changes in the Oregon Coast Range from 1939 to 1993.

    Treesearch

    Rebecca S.H. Kennedy; Thomas A. Spies

    2004-01-01

Understanding the shifts over time in the distribution and amount of forest vegetation types in relation to forest management and environmental conditions is critical for many policy and ecological questions. Our objective was to assess the influences of ownership and environment on changes in forest vegetation from post-settlement historical to recent times in the...

  7. The Evolution of the Multiplicity of Embedded Protostars. II. Binary Separation Distribution and Analysis

    NASA Astrophysics Data System (ADS)

    Connelley, Michael S.; Reipurth, Bo; Tokunaga, Alan T.

    2008-06-01

    We present the Class I protostellar binary separation distribution based on the data tabulated in a companion paper. We verify the excess of Class I binary stars over solar-type main-sequence stars in the separation range from 500 AU to 4500 AU. Although our sources are in nearby star-forming regions distributed across the entire sky (including Orion), none of our objects are in a high stellar density environment. A log-normal function, used by previous authors to fit the main-sequence and T Tauri binary separation distributions, poorly fits our data, and we determine that a log-uniform function is a better fit. Our observations show that the binary separation distribution changes significantly during the Class I phase, and that the binary frequency at separations greater than 1000 AU declines steadily with respect to spectral index. Despite these changes, the binary frequency remains constant until the end of the Class I phase, when it drops sharply. We propose a scenario to account for the changes in the Class I binary separation distribution. This scenario postulates that a large number of companions with a separation greater than ~1000 AU were ejected during the Class 0 phase, but remain gravitationally bound due to the significant mass of the Class I envelope. As the envelope dissipates, these companions become unbound and the binary frequency at wide separations declines. Circumstellar and circumbinary disks are expected to play an important role in the orbital evolution at closer separations. This scenario predicts that a large number of Class 0 objects should be non-hierarchical multiple systems, and that many Class I young stellar objects (YSOs) with a widely separated companion should also have a very close companion. We also find that Class I protostars are not dynamically pristine, but have experienced dynamical evolution before they are visible as Class I objects. 
Our analysis shows that the Class I binary frequency and the binary separation distribution strongly depend on the star-forming environment. The Infrared Telescope Facility is operated by the University of Hawaii under Cooperative Agreement no. NCC 5-538 with the National Aeronautics and Space Administration, Science Mission Directorate, Planetary Astronomy Program. The United Kingdom Infrared Telescope is operated by the Joint Astronomy Centre on behalf of the Science and Technology Facilities Council of the U.K. Based in part on data collected at Subaru Telescope, which is operated by the National Astronomical Observatory of Japan.
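The model comparison described above, a log-uniform versus a log-normal law for binary separations, can be illustrated by comparing mean log-likelihoods on synthetic separations. The bounds, sample size, and fitting-by-moments below are arbitrary choices for the sketch, not the authors' procedure.

```python
import math, random

def loglik_loguniform(seps, lo, hi):
    """Mean log-likelihood under a log-uniform law on [lo, hi]:
    p(s) = 1 / (s * ln(hi / lo))."""
    norm = math.log(hi / lo)
    return sum(-math.log(s * norm) for s in seps) / len(seps)

def loglik_lognormal(seps, mu, sigma):
    """Mean log-likelihood under a log-normal law with parameters mu, sigma."""
    c = -0.5 * math.log(2 * math.pi) - math.log(sigma)
    return sum(c - math.log(s) - (math.log(s) - mu) ** 2 / (2 * sigma ** 2)
               for s in seps) / len(seps)

# Synthetic separations drawn log-uniformly between 100 and 4500 AU
rng = random.Random(0)
seps = [math.exp(rng.uniform(math.log(100), math.log(4500))) for _ in range(2000)]

# Fit the log-normal by moments of log(s), then compare the two laws;
# for log-uniformly distributed data the log-uniform law should win.
mu = sum(math.log(s) for s in seps) / len(seps)
sigma = (sum((math.log(s) - mu) ** 2 for s in seps) / len(seps)) ** 0.5
assert loglik_loguniform(seps, 100, 4500) > loglik_lognormal(seps, mu, sigma)
```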

  8. Optimal coordinated voltage control in active distribution networks using backtracking search algorithm

    PubMed Central

    Tengku Hashim, Tengku Juhana; Mohamed, Azah

    2017-01-01

The growing interest in distributed generation (DG) in recent years has led to a number of generators being connected to distribution systems. The integration of DGs in a distribution system results in a network known as an active distribution network, owing to the existence of bidirectional power flow in the system. The voltage rise issue is one of the most important technical issues to be addressed when DGs exist in an active distribution network. This paper presents the application of the backtracking search algorithm (BSA), a relatively new optimisation technique, to determine the optimal settings of coordinated voltage control in a distribution system. The coordinated voltage control considers power factor, on-load tap-changer and generation curtailment control to manage the voltage rise issue. A multi-objective function is formulated to minimise total losses and voltage deviation in a distribution system. The proposed BSA is compared with particle swarm optimisation (PSO) so as to evaluate its effectiveness in determining the optimal settings of power factor, tap-changer and percentage of active power generation to be curtailed. The load flow algorithm from MATPOWER is integrated in the MATLAB environment to solve the multi-objective optimisation problem. Both the BSA and PSO optimisation techniques have been tested on a radial 13-bus distribution system, and the results show that the BSA performs better than PSO by providing better fitness values and convergence rates. PMID:28991919
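The core BSA idea, a historical population, an amplitude-scaled search direction toward it, and greedy selection, can be sketched on a toy weighted two-term objective. All parameters, the single-dimension crossover, and the stand-in loss/deviation terms below are simplifying assumptions, not the paper's settings or its MATPOWER-based formulation.

```python
import random

def bsa_minimize(fitness, dim, bounds, pop=20, iters=200, seed=1):
    """Minimal Backtracking Search Algorithm sketch (simplified crossover;
    parameter choices are illustrative)."""
    rng = random.Random(seed)
    lo, hi = bounds
    P = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    old = [row[:] for row in P]          # historical population
    best = min(P, key=fitness)
    for _ in range(iters):
        if rng.random() < 0.5:           # selection-I: refresh historical pop
            old = [row[:] for row in P]
        rng.shuffle(old)
        F = 3 * rng.gauss(0, 1)          # amplitude of the search direction
        for i in range(pop):
            trial = P[i][:]
            j = rng.randrange(dim)       # mutate one dimension (simplified map)
            trial[j] = min(hi, max(lo, P[i][j] + F * (old[i][j] - P[i][j])))
            if fitness(trial) < fitness(P[i]):   # greedy selection-II
                P[i] = trial
                if fitness(trial) < fitness(best):
                    best = trial[:]
    return best

# Hypothetical weighted two-term objective standing in for
# "total losses plus voltage deviation":
def objective(x):
    losses = sum(v * v for v in x)
    deviation = sum(abs(v) for v in x)
    return losses + 0.5 * deviation

best = bsa_minimize(objective, dim=3, bounds=(-2.0, 2.0))
assert objective(best) < objective([1.0, 1.0, 1.0])
```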

  10. An approach for access differentiation design in medical distributed applications built on databases.

    PubMed

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application poses new and essential problems for software. In particular, protection tools that are sufficient separately become deficient during integration, owing to additional links and relationships not considered formerly. For example, it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools if the object is stored as a record in database tables. The solution to the problem can be found only within the more general application framework, and appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications that use databases. The appropriate formal model, as well as tools for its mapping to a DBMS, are suggested. Remote users connected via global networks are considered as well.
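The application-layer access differentiation argued for above, checks performed above the DBMS level, might be sketched as a simple access matrix. The subjects, objects, and operations here are hypothetical, and the paper's formal model is certainly richer than this.

```python
class AccessMatrix:
    """Minimal access-differentiation sketch: subjects get per-object
    permission sets, checked at the application layer rather than by
    DBMS protection tools alone (roles and permissions are invented)."""
    def __init__(self):
        self.grants = {}           # (subject, object) -> set of operations

    def grant(self, subject, obj, *ops):
        self.grants.setdefault((subject, obj), set()).update(ops)

    def allowed(self, subject, obj, op):
        return op in self.grants.get((subject, obj), set())

acl = AccessMatrix()
acl.grant("surgeon", "virtual-OR/shared-object", "read", "write")
acl.grant("student", "virtual-OR/shared-object", "read")
assert acl.allowed("surgeon", "virtual-OR/shared-object", "write")
assert not acl.allowed("student", "virtual-OR/shared-object", "write")
```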

  11. Contact-force distribution optimization and control for quadruped robots using both gradient and adaptive neural networks.

    PubMed

    Li, Zhijun; Ge, Shuzhi Sam; Liu, Sibang

    2014-08-01

This paper investigates optimal feet-force distribution and control of quadruped robots under external disturbance forces. First, we formulate the constrained dynamics of quadruped robots and derive a reduced-order dynamical model of motion/force. Considering an external wrench on the quadruped robot, the distribution of required forces and moments among the supporting legs is handled as a tip-point force distribution used to equilibrate the external wrench. Then, a gradient neural network is adopted to minimize a quadratic objective function subject to linear equality and inequality constraints. For the obtained optimized tip-point forces and the motion of the legs, we propose hybrid motion/force control based on an adaptive neural network to compensate for perturbations in the environment and to approximate the feedforward force and impedance of the leg joints. The proposed control can confront uncertainties including approximation error and external perturbation. The proposed control is verified in simulation.
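For the equality-constrained core of such a force-distribution problem, the minimum-norm solution has a closed form, f = A^T (A A^T)^{-1} w. The toy 2-D sketch below uses that closed form in place of the paper's gradient neural network and ignores inequality (friction) constraints.

```python
def distribute_forces(foot_x, weight):
    """Minimum-norm vertical foot forces balancing total weight and the
    pitch moment (a toy 2-D stand-in for the paper's quadratic program;
    closed-form f = A^T (A A^T)^{-1} w instead of a gradient network)."""
    n = len(foot_x)
    # Constraints A f = w: sum(f) = weight, sum(x_i * f_i) = 0 (moment about origin)
    A = [[1.0] * n, list(foot_x)]
    w = [weight, 0.0]
    # M = A A^T is 2x2, so it can be inverted analytically
    M = [[sum(A[r][k] * A[c][k] for k in range(n)) for c in range(2)]
         for r in range(2)]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    Minv = [[M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det, M[0][0] / det]]
    y = [sum(Minv[r][c] * w[c] for c in range(2)) for r in range(2)]
    return [A[0][k] * y[0] + A[1][k] * y[1] for k in range(n)]

f = distribute_forces([-0.3, 0.0, 0.3], weight=60.0)
assert abs(sum(f) - 60.0) < 1e-9                                      # force balance
assert abs(sum(x * fi for x, fi in zip([-0.3, 0.0, 0.3], f))) < 1e-9  # moment balance
```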

  12. How a visual surveillance system hypothesizes how you behave.

    PubMed

    Micheloni, C; Piciarelli, C; Foresti, G L

    2006-08-01

In the last few years, the installation of large numbers of cameras has led to a need for increased capabilities in video surveillance systems. It has become more and more necessary for human operators to be helped in understanding ongoing activities in real environments. Nowadays, technology and research in the machine vision and artificial intelligence fields allow one to expect a new generation of completely autonomous systems able to recognize the behaviors of entities such as pedestrians, vehicles, and so forth. Hence, whereas the sensing aspect of these systems has received the most attention so far, research now focuses mainly on the problem of understanding. In this article, we present a novel method for hypothesizing the evolution of behavior. For such purposes, the system is required to extract useful information by means of low-level techniques for detecting and tracking moving objects. The subsequent estimation of the performed trajectories, together with object classification, makes it possible to compute the probability distribution of the normal activities (e.g., trajectories). Such a distribution is defined by means of a novel clustering technique. The resulting clusters are used to estimate the evolution of objects' behaviors and to speculate about any intention to act dangerously. The proposed solution for hypothesizing behaviors occurring in real environments was tested in the context of an outdoor parking lot.
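The clustering of normal trajectories might be sketched greedily as follows, with unusual paths surfacing as their own sparse clusters. This is a deliberately simple stand-in; the paper's novel clustering technique is not described in the abstract.

```python
def traj_dist(a, b):
    """Mean point-wise Euclidean distance between equal-length trajectories."""
    return sum(((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
               for (xa, ya), (xb, yb) in zip(a, b)) / len(a)

def cluster_trajectories(trajectories, threshold):
    """Greedy single-pass clustering: a trajectory joins the nearest
    existing cluster prototype if within `threshold`, else it founds a
    new cluster (a simple stand-in for the paper's technique)."""
    clusters = []          # list of (prototype, members)
    for t in trajectories:
        best = None
        for c in clusters:
            d = traj_dist(t, c[0])
            if best is None or d < best[1]:
                best = (c, d)
        if best is not None and best[1] <= threshold:
            best[0][1].append(t)
        else:
            clusters.append((t, [t]))
    return clusters

straight = [(i, 0.0) for i in range(5)]
nearby = [(i, 0.2) for i in range(5)]
crossing = [(i, float(i)) for i in range(5)]
clusters = cluster_trajectories([straight, nearby, crossing], threshold=0.5)
assert len(clusters) == 2   # the crossing path ends up in its own cluster
```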

  13. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client, multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with variable temporal and spatial radio coverage, high latency, limited throughput, and transmission costs. A unified network management paradigm for both wireless and wireline nodes facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems, and infrastructures while maintaining application portability.

  14. Stellar Classification Online - Public Exploration

    NASA Astrophysics Data System (ADS)

    Castelaz, Michael W.; Bedell, W.; Barker, T.; Cline, J.; Owen, L.

    2009-01-01

    The Michigan Objective Prism Blue Survey (e.g. Sowell et al 2007, AJ, 134, 1089) photographic plates located in the Astronomical Photographic Data Archive at the Pisgah Astronomical Research Institute hold hundreds of thousands of stellar spectra, many of which have not been classified before. The public is invited to participate in a distributed computing online environment to classify the stars on the objective prism plates. The online environment is called Stellar Classification Online - Public Exploration (SCOPE). Through a website, SCOPE participants are given a tutorial on stellar spectra and their classification, and given the chance to practice their skills at classification. After practice, participants register, login, and select stars for classification from scans of the objective prism plates. Their classifications are recorded in a database where the accumulation of classifications of the same star by many users will be statistically analyzed. The project includes stars with known spectral types to help test the reliability of classifications. The SCOPE webpage and the use of results will be described.
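The statistical aggregation of many volunteers' classifications of the same star could be as simple as a thresholded majority vote; the thresholds below are illustrative assumptions, since SCOPE's actual analysis is not described in the abstract.

```python
from collections import Counter

def consensus_class(votes, min_votes=5, min_agreement=0.6):
    """Aggregate independent volunteer classifications of one star into a
    consensus spectral type, or None if the evidence is insufficient.
    (Thresholds are invented for illustration.)"""
    if len(votes) < min_votes:
        return None                      # not enough classifications yet
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= min_agreement else None

assert consensus_class(["G2", "G2", "G0", "G2", "G2"]) == "G2"
assert consensus_class(["G2", "K0"]) is None         # too few votes so far
```

Stars with known spectral types, as mentioned in the abstract, would then serve as a check on whether such a consensus rule is reliable.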

  15. Shared virtual environments for aerospace training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Voss, Mark

    1994-01-01

    Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.
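The state-sharing scheme described, in which every change of object position and state is communicated to all users, can be sketched as a toy replication loop; transport, ordering, and conflict handling are all omitted, and the class names are invented.

```python
class SharedWorld:
    """Toy state-replication sketch: each site applies an update it
    originates locally and forwards it to its peers, so every copy of
    the virtual environment maintains its 'currency'."""
    def __init__(self):
        self.objects = {}    # object id -> latest known state
        self.peers = []      # other sites sharing this environment

    def apply(self, obj_id, state):
        self.objects[obj_id] = state

    def update(self, obj_id, state):
        self.apply(obj_id, state)
        for peer in self.peers:
            peer.apply(obj_id, state)    # broadcast the change to all users

a, b = SharedWorld(), SharedWorld()
a.peers, b.peers = [b], [a]
a.update("hatch", {"open": True})
assert b.objects["hatch"] == {"open": True}
```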

  16. An Autonomous Distributed Fault-Tolerant Local Positioning System

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2017-01-01

We describe a fault-tolerant, GPS-independent (Global Positioning System) distributed autonomous positioning system for static and mobile objects and present solutions for providing highly accurate geo-location data for these objects in dynamic environments. The reliability and accuracy of a positioning system fundamentally depend on two factors: its timeliness in broadcasting signals and the knowledge of its geometry, i.e., the locations of and distances between the beacons. Existing distributed positioning systems either synchronize to a common external source such as GPS or establish their own time synchrony using a master-slave scheme, designating a particular beacon as the master to which the other beacons synchronize, resulting in a single point of failure. Another drawback of existing positioning systems is their lack of attention to various fault manifestations, in particular communication link failures, which, as in wireless networks, increasingly dominate process failures and are typically transient and mobile, in the sense that they affect different messages to/from different processes over time.
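Independent of the fault-tolerance protocol, the underlying geometry step, recovering a position from beacon locations and measured distances, can be sketched as 2-D trilateration by linearizing the range equations. This is a textbook method, not the paper's specific solution.

```python
def locate(beacons, dists):
    """2-D trilateration from three non-collinear beacon positions and
    measured distances, by subtracting range equations to get a linear
    system (a minimal sketch; the paper's protocol concerns when and how
    beacons broadcast, which is not reproduced here)."""
    (x0, y0), (x1, y1), (x2, y2) = beacons
    d0, d1, d2 = dists
    # Subtract beacon-0's range equation from the other two -> 2x2 system
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21
    # Cramer's rule for the 2x2 system
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - a21 * b1) / det)

beacons = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
true = (1.0, 2.0)
dists = [((true[0] - bx) ** 2 + (true[1] - by) ** 2) ** 0.5 for bx, by in beacons]
x, y = locate(beacons, dists)
assert abs(x - 1.0) < 1e-9 and abs(y - 2.0) < 1e-9
```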

  17. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  18. What do we gain from simplicity versus complexity in species distribution models?

    USGS Publications Warehouse

    Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane

    2014-01-01

    Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
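The under-fitting versus over-fitting trade-off described above can be made concrete with a toy example that is not tied to any particular SDM method: a simple model matched to the data-generating process generalizes well, while a maximally flexible model (an interpolating polynomial) fits the training noise and predicts poorly. All data here are synthetic.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares line y = a*x + b (the 'simple' model)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def lagrange(xs, ys, x):
    """Interpolating polynomial through every training point (the 'over fit'
    model: zero training error, excessive flexibility)."""
    total = 0.0
    for j, yj in enumerate(ys):
        w = 1.0
        for m, xm in enumerate(xs):
            if m != j:
                w *= (x - xm) / (xs[j] - xm)
        total += yj * w
    return total

# Truth is linear (y = 2x + 1); "noise" is a fixed perturbation for reproducibility.
xs = [0, 1, 2, 3, 4, 5, 6]
noise = [0.3, -0.4, 0.5, -0.2, 0.4, -0.5, 0.2]
ys = [2 * x + 1 + e for x, e in zip(xs, noise)]

a, b = linear_fit(xs, ys)
x_new, y_true = 5.5, 2 * 5.5 + 1                     # held-out location
err_simple = abs((a * x_new + b) - y_true)           # small: model matches process
err_flexible = abs(lagrange(xs, ys, x_new) - y_true) # large: noise fit as signal
print(err_simple, err_flexible)
```

The interpolant's training error is exactly zero, yet its held-out error is far worse than the line's, which is the sense in which "ascribing pattern to noise" costs predictive skill.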

  19. Meteoroids and Orbital Debris: Effects on Spacecraft

    NASA Technical Reports Server (NTRS)

    Belk, Cynthia A.; Robinson, Jennifer H.; Alexander, Margaret B.; Cooke, William J.; Pavelitz, Steven D.

    1997-01-01

    The natural space environment is characterized by many complex and subtle phenomena hostile to spacecraft. The effects of these phenomena impact spacecraft design, development, and operations. Space systems become increasingly susceptible to the space environment as use of composite materials and smaller, faster electronics increases. This trend makes an understanding of the natural space environment essential to accomplish overall mission objectives, especially in the current climate of better/cheaper/faster. Meteoroids are naturally occurring phenomena in the natural space environment. Orbital debris is manmade space litter accumulated in Earth orbit from the exploration of space. Descriptions are presented of orbital debris source, distribution, size, lifetime, and mitigation measures. This primer is one in a series of NASA Reference Publications currently being developed by the Electromagnetics and Aerospace Environments Branch, Systems Analysis and Integration Laboratory, Marshall Space Flight Center, National Aeronautics and Space Administration.

  20. Integrated Design and Engineering Analysis (IDEA) Environment - Propulsion Related Module Development and Vehicle Integration

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2013-01-01

    This report documents the work performed during the period from May 2011 - October 2012 on the Integrated Design and Engineering Analysis (IDEA) environment. IDEA is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML). This report will focus on describing the work done in the areas of: (1) Integrating propulsion data (turbines, rockets, and scramjets) in the system, and using the data to perform trajectory analysis; (2) Developing a parametric packaging strategy for hypersonic air-breathing vehicles, allowing for tank resizing when multiple fuels and/or oxidizers are part of the configuration; and (3) Vehicle scaling and closure strategies.

  1. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework built on the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation, using novel concepts for defining the interaction between the mesh topology and the geometry so as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  2. Identifying rural food deserts: Methodological considerations for food environment interventions.

    PubMed

    Lebel, Alexandre; Noreau, David; Tremblay, Lucie; Oberlé, Céline; Girard-Gadreau, Maurie; Duguay, Mathieu; Block, Jason P

    2016-06-09

    Food insecurity is an important public health issue and affects 13% of Canadian households. It is associated with poor accessibility to fresh, diverse and affordable food products. However, measurement of the food environment is challenging in rural settings since the proximity of food supply sources is unevenly distributed. The objective of this study was to develop a methodology to identify food deserts in rural environments. In-store evaluations of 25 food products were performed for all food stores located in four contiguous rural counties in Quebec. The quality of food products was estimated using four indices: freshness, affordability, diversity and relative availability. Road network distance from every residence to the closest food store with a favourable score on all four dimensions was mapped to identify residential clusters located in deprived communities without reasonable access to a "good" food source. The result was compared with the food desert parameters proposed by the US Department of Agriculture (USDA), as well as with the perceptions of a group of regional stakeholders. When food quality was considered, food deserts appeared more prevalent than when only the USDA definition was used. Objective measurements of the food environment matched stakeholders' perceptions. Food stores' characteristics are different in rural areas and require in-store evaluation to identify potential rural food deserts. Objective measurements of the food environment, combined with the field knowledge of stakeholders, may help to shape stronger arguments to gain the support of decision-makers in developing relevant interventions.
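The road-network-distance step described above amounts to a shortest-path search from each residence to the nearest store that passed the four-dimension quality screen. A minimal sketch with Dijkstra's algorithm follows; the graph, node names, and distances are invented for illustration.

```python
import heapq

def network_distance_to_good_store(graph, residence, good_stores):
    """Dijkstra from a residence node; returns road distance to the nearest
    store that scored favourably on all four quality dimensions."""
    dist = {residence: 0.0}
    heap = [(0.0, residence)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        if node in good_stores:
            return d
        for nbr, w in graph[node]:
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return float("inf")  # no qualifying store reachable: a candidate food desert

# Toy road network; edge weights in kilometres (illustrative data only).
graph = {
    "home": [("junction", 2.0)],
    "junction": [("home", 2.0), ("depanneur", 1.0), ("supermarket", 7.5)],
    "depanneur": [("junction", 1.0)],
    "supermarket": [("junction", 7.5)],
}
# Suppose only the supermarket passes the freshness/affordability/diversity/availability screen.
print(network_distance_to_good_store(graph, "home", {"supermarket"}))  # 9.5
```

Residences whose returned distance exceeds a chosen threshold would then be clustered and cross-checked against deprivation indices, as in the study.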

  3. The StarLite Project Prototyping Real-Time Software

    DTIC Science & Technology

    1991-10-01

    multiversion data objects using the prototyping environment. Section 5 concludes the paper. 2. Message-Based Simulation When prototyping distributed...phase locking and priority-based synchronization algorithms, and between a multiversion database and its corresponding single-version database, through...its deadline, since the transaction is only aborted in the validation phase. 4.5. A Multiversion Database System To illustrate the effectiveness of the

  4. Astrobiology: Life on Earth (and Elsewhere?)

    NASA Technical Reports Server (NTRS)

    Des Marais, David J.

    2016-01-01

    Astrobiology investigates the origins, evolution and distribution of life in the universe. Scientists study how stellar systems and their planets can create planetary environments that sustain biospheres. They search for biosignatures, which are objects, substances and/or patterns that indicate the presence of life. Studies of Earth's early biosphere enhance these search strategies and also provide key insights about our own origins.

  5. Curricula and Syllabi in Hydrology. A Contribution to the International Hydrological Programme. UNESCO Technical Papers in Hydrology No. 22. Second Edition.

    ERIC Educational Resources Information Center

    Chandra, Satish, Ed.; Mostertman, L. J., Ed.

    Hydrology is the science dealing with the earth's waters, their occurrence, circulation, and distribution, their chemical and physical properties, and their reaction with the environment. As such, hydrology is an indispensable requirement for planning in the field of water resources. Objectives for, spectrum of, and topics for education in…

  6. Internet-based distributed collaborative environment for engineering education and design

    NASA Astrophysics Data System (ADS)

    Sun, Qiuli

    2001-07-01

    This research investigates the use of the Internet for engineering education, design, and analysis through the presentation of a Virtual City environment. The main focus of this research was to provide an infrastructure for engineering education, test the concept of distributed collaborative design and analysis, develop and implement the Virtual City environment, and assess the environment's effectiveness in the real world. A three-tier architecture was adopted in the development of the prototype, which contains an online database server, a Web server and multi-user servers, and client browsers. The environment is composed of five components: a 3D virtual world, multiple Internet-based multimedia modules, an online database, a collaborative geometric modeling module, and a collaborative analysis module. The environment was designed using multiple Internet-based technologies, such as Shockwave, Java, Java 3D, VRML, Perl, ASP, SQL, and a database. These various technologies together formed the basis of the environment and were programmed to communicate smoothly with each other. Three assessments were conducted over a period of three semesters. The Virtual City is open to the public at www.vcity.ou.edu. The online database was designed to manage the changeable data related to the environment. The virtual world was used to implement 3D visualization and tie the multimedia modules together. Students are allowed to build segments of the 3D virtual world upon completion of appropriate undergraduate courses in civil engineering. The end result is a complete virtual world that contains designs from all of their coursework and is viewable on the Internet. The environment is a content-rich educational system, which can be used to teach multiple engineering topics with the help of 3D visualization, animations, and simulations. The concept of collaborative design and analysis using the Internet was investigated and implemented.
Geographically dispersed users can build the same geometric model simultaneously over the Internet and communicate with each other through a chat room. They can also conduct finite element analysis collaboratively on the same object over the Internet. They can mesh the same object, apply and edit the same boundary conditions and forces, obtain the same analysis results, and then discuss the results through the Internet.

  7. Magnetoacoustic tomography with magnetic induction for high-resolution bioimpedance imaging through vector source reconstruction under the static field of MRI magnet.

    PubMed

    Mariappan, Leo; Hu, Gang; He, Bin

    2014-02-01

    Magnetoacoustic tomography with magnetic induction (MAT-MI) is an imaging modality to reconstruct the electrical conductivity of biological tissue based on acoustic measurements of Lorentz force induced tissue vibration. This study presents the feasibility of the authors' new MAT-MI system and vector source imaging algorithm to perform a complete reconstruction of the conductivity distribution of real biological tissues with ultrasound spatial resolution. In the present study, using ultrasound beamformation, imaging point spread functions are designed to reconstruct the induced vector source in the object, which is used to estimate the object's conductivity distribution. Both numerical studies and phantom experiments are performed to demonstrate the merits of the proposed method. Also, through the numerical simulations, the full width at half maximum of the imaging point spread function is calculated to estimate the spatial resolution. The tissue phantom experiments are performed with a MAT-MI imaging system in the static field of a 9.4 T magnetic resonance imaging magnet. The image reconstruction through vector beamformation in the numerical and experimental studies gives a reliable estimate of the conductivity distribution in the object with ∼1.5 mm spatial resolution, corresponding to the imaging system's 500 kHz ultrasound frequency. In addition, the experiment results suggest that MAT-MI under a high static magnetic field environment is able to reconstruct images of tissue-mimicking gel phantoms and real tissue samples with reliable conductivity contrast. The results demonstrate that MAT-MI is able to image the electrical conductivity properties of biological tissues with better than 2 mm spatial resolution at 500 kHz, and that MAT-MI under a high static magnetic field environment provides improved imaging contrast for biological tissue conductivity reconstruction.

  8. The portable UNIX programming system (PUPS) and CANTOR: a computational environment for dynamical representation and analysis of complex neurobiological data.

    PubMed

    O'Neill, M A; Hilgetag, C C

    2001-08-29

    Many problems in analytical biology, such as the classification of organisms, the modelling of macromolecules, or the structural analysis of metabolic or neural networks, involve complex relational data. Here, we describe a software environment, the portable UNIX programming system (PUPS), which has been developed to allow efficient computational representation and analysis of such data. The system can also be used as a general development tool for database and classification applications. As the complexity of analytical biology problems may lead to computation times of several days or weeks even on powerful computer hardware, the PUPS environment gives support for persistent computations by providing mechanisms for dynamic interaction and homeostatic protection of processes. Biological objects and their interrelations are also represented in a homeostatic way in PUPS. Object relationships are maintained and updated by the objects themselves, thus providing a flexible, scalable and current data representation. Based on the PUPS environment, we have developed an optimization package, CANTOR, which can be applied to a wide range of relational data and which has been employed in different analyses of neuroanatomical connectivity. The CANTOR package makes use of the PUPS system features by modifying candidate arrangements of objects within the system's database. This restructuring is carried out via optimization algorithms that are based on user-defined cost functions, thus providing flexible and powerful tools for the structural analysis of the database content. The use of stochastic optimization also enables the CANTOR system to deal effectively with incomplete and inconsistent data. Prototypical forms of PUPS and CANTOR have been coded and used successfully in the analysis of anatomical and functional mammalian brain connectivity, involving complex and inconsistent experimental data. 
In addition, PUPS has been used for solving multivariate engineering optimization problems and to implement the digital identification system (DAISY), a system for the automated classification of biological objects. PUPS is implemented in ANSI-C under the POSIX.1 standard and is to a great extent architecture- and operating-system independent. The software is supported by systems libraries that allow multi-threading (the concurrent processing of several database operations), as well as the distribution of the dynamic data objects and library operations over clusters of computers. These attributes make the system easily scalable, and in principle allow the representation and analysis of arbitrarily large sets of relational data. PUPS and CANTOR are freely distributed (http://www.pups.org.uk) as open-source software under the GNU license agreement.
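The abstract describes CANTOR's optimizers only as stochastic algorithms driven by user-defined cost functions over candidate arrangements of objects. The sketch below shows one representative stochastic optimizer of that kind, simulated annealing over permutations; it is a generic illustration, not CANTOR's actual code, and the cost function and data are invented.

```python
import math
import random

def anneal(order, cost, steps=20000, t0=2.0):
    """Generic simulated annealing over permutations: propose a swap of two
    elements, accept uphill moves with Boltzmann probability, cool geometrically."""
    rng = random.Random(42)  # fixed seed for reproducibility
    best = cur = list(order)
    best_c = cur_c = cost(cur)
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(len(cur)), 2)
        cand = list(cur)
        cand[i], cand[j] = cand[j], cand[i]
        c = cost(cand)
        if c < cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        t *= 0.9995
    return best, best_c

# Illustrative user-defined cost: total dissimilarity between neighbours
# in the arrangement (a sorted order is optimal for this particular cost).
values = [5, 1, 9, 3, 7, 2, 8, 4, 6, 0]
cost = lambda order: sum(abs(order[k] - order[k + 1]) for k in range(len(order) - 1))
best, c = anneal(values, cost)
print(best, c)
```

Because acceptance is probabilistic, such optimizers tolerate incomplete or inconsistent cost information, which is the property the abstract highlights for CANTOR.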

  9. The portable UNIX programming system (PUPS) and CANTOR: a computational environment for dynamical representation and analysis of complex neurobiological data.

    PubMed Central

    O'Neill, M A; Hilgetag, C C

    2001-01-01

    Many problems in analytical biology, such as the classification of organisms, the modelling of macromolecules, or the structural analysis of metabolic or neural networks, involve complex relational data. Here, we describe a software environment, the portable UNIX programming system (PUPS), which has been developed to allow efficient computational representation and analysis of such data. The system can also be used as a general development tool for database and classification applications. As the complexity of analytical biology problems may lead to computation times of several days or weeks even on powerful computer hardware, the PUPS environment gives support for persistent computations by providing mechanisms for dynamic interaction and homeostatic protection of processes. Biological objects and their interrelations are also represented in a homeostatic way in PUPS. Object relationships are maintained and updated by the objects themselves, thus providing a flexible, scalable and current data representation. Based on the PUPS environment, we have developed an optimization package, CANTOR, which can be applied to a wide range of relational data and which has been employed in different analyses of neuroanatomical connectivity. The CANTOR package makes use of the PUPS system features by modifying candidate arrangements of objects within the system's database. This restructuring is carried out via optimization algorithms that are based on user-defined cost functions, thus providing flexible and powerful tools for the structural analysis of the database content. The use of stochastic optimization also enables the CANTOR system to deal effectively with incomplete and inconsistent data. Prototypical forms of PUPS and CANTOR have been coded and used successfully in the analysis of anatomical and functional mammalian brain connectivity, involving complex and inconsistent experimental data. 
In addition, PUPS has been used for solving multivariate engineering optimization problems and to implement the digital identification system (DAISY), a system for the automated classification of biological objects. PUPS is implemented in ANSI-C under the POSIX.1 standard and is to a great extent architecture- and operating-system independent. The software is supported by systems libraries that allow multi-threading (the concurrent processing of several database operations), as well as the distribution of the dynamic data objects and library operations over clusters of computers. These attributes make the system easily scalable, and in principle allow the representation and analysis of arbitrarily large sets of relational data. PUPS and CANTOR are freely distributed (http://www.pups.org.uk) as open-source software under the GNU license agreement. PMID:11545702

  10. Temporal and spatial PM10 concentration distribution using an inverse distance weighted method in Klang Valley, Malaysia

    NASA Astrophysics Data System (ADS)

    Tarmizi, S. N. M.; Asmat, A.; Sumari, S. M.

    2014-02-01

    PM10 is one of the air contaminants that can be harmful to human health. Meteorological factors and changes of monsoon season may affect the distribution of these particles. The objective of this study is to determine the temporal and spatial particulate matter (PM10) concentration distribution in Klang Valley, Malaysia by using the Inverse Distance Weighted (IDW) method under different monsoon seasons and meteorological conditions. PM10 and meteorological data were obtained from the Malaysian Department of Environment (DOE). Particle distribution data were added to the geographic database on a seasonal basis. Temporal and spatial patterns of PM10 concentration distribution were determined by using ArcGIS 9.3. Higher PM10 concentrations are observed during the Southwest monsoon season; values are lower during the Northeast monsoon season. Different monsoon seasons show different meteorological conditions that affect PM10 distribution.
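The IDW method used in this study has a simple closed form: the estimate at a point is the weighted mean of station values with weights 1/d^p. A minimal sketch follows; the station coordinates and PM10 values are hypothetical, not the study's data.

```python
def idw(stations, values, x, y, power=2.0):
    """Inverse distance weighted estimate at (x, y) from monitoring stations.
    Each station's weight is 1/d**power; a station at the exact query point
    returns its own value (avoiding division by zero)."""
    num = den = 0.0
    for (sx, sy), v in zip(stations, values):
        d2 = (sx - x) ** 2 + (sy - y) ** 2
        if d2 == 0.0:
            return v
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical station coordinates (km) and seasonal mean PM10 (ug/m3).
stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
pm10 = [48.0, 52.0, 61.0, 55.0]
print(idw(stations, pm10, 5.0, 5.0))  # equidistant point: plain average, 54.0
```

Evaluating this on a regular grid of query points, one season at a time, produces the interpolated concentration surfaces that a GIS such as ArcGIS renders as seasonal maps.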

  11. Grist : grid-based data mining for astronomy

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden; hide

    2004-01-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  12. Grist: Grid-based Data Mining for Astronomy

    NASA Astrophysics Data System (ADS)

    Jacob, J. C.; Katz, D. S.; Miller, C. D.; Walia, H.; Williams, R. D.; Djorgovski, S. G.; Graham, M. J.; Mahabal, A. A.; Babu, G. J.; vanden Berk, D. E.; Nichol, R.

    2005-12-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the ``hyperatlas'' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  13. Mapping Sub-Antarctic Cushion Plants Using Random Forests to Combine Very High Resolution Satellite Imagery and Terrain Modelling

    PubMed Central

    Bricher, Phillippa K.; Lucieer, Arko; Shaw, Justine; Terauds, Aleks; Bergstrom, Dana M.

    2013-01-01

    Monitoring changes in the distribution and density of plant species often requires accurate and high-resolution baseline maps of those species. Detecting such change at the landscape scale is often problematic, particularly in remote areas. We examine a new technique to improve accuracy and objectivity in mapping vegetation, combining species distribution modelling and satellite image classification on a remote sub-Antarctic island. In this study, we combine spectral data from very high resolution WorldView-2 satellite imagery and terrain variables from a high resolution digital elevation model to improve mapping accuracy, in both pixel- and object-based classifications. Random forest classification was used to explore the effectiveness of these approaches on mapping the distribution of the critically endangered cushion plant Azorella macquariensis Orchard (Apiaceae) on sub-Antarctic Macquarie Island. Both pixel- and object-based classifications of the distribution of Azorella achieved very high overall validation accuracies (91.6–96.3%, κ = 0.849–0.924). Both two-class and three-class classifications were able to accurately and consistently identify the areas where Azorella was absent, indicating that these maps provide a suitable baseline for monitoring expected change in the distribution of the cushion plants. Detecting such change is critical given the threats this species is currently facing under altering environmental conditions. The method presented here has applications to monitoring a range of species, particularly in remote and isolated environments. PMID:23940805
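The validation metrics reported above (overall accuracy and Cohen's kappa) are computed from a confusion matrix of reference versus mapped classes. A minimal sketch, with hypothetical two-class counts rather than the study's actual validation data:

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    # Chance agreement: sum over classes of (row marginal * column marginal) / n^2
    expected = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion)
        for i in range(len(confusion))
    ) / n**2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Hypothetical present/absent validation counts for a cushion-plant map.
acc, kappa = accuracy_and_kappa([[45, 5], [5, 45]])
print(acc, kappa)  # 0.9 0.8
```

Kappa discounts the agreement expected by chance from the class marginals, which is why it is reported alongside raw accuracy for map validation.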

  14. Michigan Orbital DEbris Survey Telescope Observations of the Geosynchronous Orbital Debris Environment. Observing Years: 2007-2009

    NASA Technical Reports Server (NTRS)

    Abercromby, K. J.; Seitzer, P.; Cowardin, H. M.; Barker, E. S.; Matney, M. J.

    2011-01-01

    NASA uses the Michigan Orbital DEbris Survey Telescope (MODEST), the University of Michigan's 0.61-m aperture Curtis-Schmidt telescope at the Cerro Tololo Inter-American Observatory in Chile, to help characterize the debris environment in geosynchronous orbit; this began in February 2001 and continues to the present day. Detected objects that are found to be on the U.S. Space Surveillance Network cataloged objects list are termed correlated targets (CTs), while those not found on the list are called uncorrelated targets (UCTs). This Johnson Space Center report provides details of observational and data-reduction processes for the entire MODEST dataset acquired in calendar years (CYs) 2007, 2008, and 2009. Specifically, this report describes the collection and analysis of 36 nights of data collected in CY 2007, 43 nights of data collected in CY 2008, and 43 nights of data collected in CY 2009. MODEST is equipped with a 2048 x 2048-pixel charge-coupled device camera with a 1.3 by 1.3 deg field of view. This system is capable of detecting objects fainter than 18th magnitude (R filter) using a 5-s integration. This corresponds to a 20-cm diameter, 0.175-albedo object at 36,000 km altitude assuming a diffuse Lambertian phase function. The average number of detections each night over all 3 years was 26. The percentage of this number that represented the UCT population ranged from 34% to 18%, depending on the observing strategy and the field center location. Due to the short orbital arc over which observations are made, the eccentricity of the object's orbit is extremely difficult to measure accurately. Therefore, a circular orbit was assumed when calculating the orbital elements. A comparison of the measured inclination (INC), right ascension of ascending node (RAAN), and mean motion to the quantities for CTs from the U.S. Space Surveillance Network shows acceptable errors. This analysis lends credibility to the determination of the UCT orbital distributions.
Figure 1 shows the size distribution of 3,143 objects detected in the data processed for CYs 2007, 2008, and 2009. The actual peak of the absolute magnitude distribution for the functional correlated targets is 10th magnitude, whereas the peak was 11th magnitude in 2002-2003 and 10th magnitude for 2004-2006. An absolute magnitude of 10.5 corresponds to objects with average diameters of 6.3 m, assuming an albedo of 0.175 and a diffuse Lambertian phase function. This result generally agrees with the known sizes of intact satellites. The absolute magnitude distribution for the UCTs is broad, but starts to roll off near 25 cm diameter or 17.5 magnitude. This roll-off in the distribution reflects the detection capability of MODEST, not the true nature of the population. The true population is believed to continue at the same slope through fainter magnitudes based on comparisons with the LEO break-up law.
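The magnitude-to-diameter conversion used above follows from reflected flux scaling with cross-section (diameter squared) and magnitude being -2.5 log10(flux), so diameter scales as 10**(-mag/5) at fixed albedo and phase function. A sketch anchored to the report's own reference pair (absolute magnitude 10.5 ↔ 6.3 m at albedo 0.175):

```python
def debris_diameter_m(abs_mag, ref_mag=10.5, ref_diameter=6.3):
    """Diameter implied by an absolute magnitude, assuming the same albedo
    (0.175) and diffuse Lambertian phase function as the reference object.
    Flux ~ diameter**2 and mag = -2.5*log10(flux), hence d ~ 10**(-mag/5)."""
    return ref_diameter * 10 ** ((ref_mag - abs_mag) / 5.0)

print(round(debris_diameter_m(17.5), 2))  # ~0.25 m, matching the reported roll-off
```

The consistency check is built in: the 17.5-magnitude roll-off quoted in the report comes out at about 25 cm under the same albedo assumption.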

  15. Information Foraging and Change Detection for Automated Science Exploration

    NASA Technical Reports Server (NTRS)

    Furlong, P. Michael; Dille, Michael

    2016-01-01

    This paper presents a new algorithm for autonomous on-line exploration in unknown environments. The objective is to free remote scientists from possibly-infeasible extensive preliminary site investigation prior to sending robotic agents. We simulate a common exploration task for an autonomous robot sampling the environment at various locations and compare performance against simpler control strategies. An extension is proposed and evaluated that further permits operation in the presence of environmental variability in which the robot encounters a change in the distribution underlying sampling targets. Experimental results indicate a strong improvement in performance across varied parameter choices for the scenario.
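The abstract does not specify the change-detection algorithm, so as a hedged illustration of detecting a shift in the distribution underlying sampling targets, here is a textbook one-sided CUSUM detector on a synthetic stream; the slack and threshold parameters are arbitrary choices, not the paper's.

```python
import math

def cusum_detect(stream, target_mean, slack, threshold):
    """One-sided CUSUM: accumulate evidence that the stream mean has risen
    above target_mean + slack; report the first index crossing the threshold."""
    s = 0.0
    for t, x in enumerate(stream):
        s = max(0.0, s + (x - target_mean - slack))
        if s > threshold:
            return t
    return None

# Synthetic samples: mean 0 for 50 steps, then the distribution shifts to mean 1.
# A deterministic sinusoid stands in for noise so the run is reproducible.
stream = [0.2 * math.sin(t) for t in range(50)]
stream += [1.0 + 0.2 * math.sin(t) for t in range(50, 100)]
print(cusum_detect(stream, target_mean=0.0, slack=0.5, threshold=3.0))  # shortly after t=50
```

An exploring robot could reset its foraging model when such a detector fires, which is the spirit of the extension evaluated in the paper.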

  16. Small vs. Large Convective Cloud Objects from CERES Aqua Observations: Where are the Intraseasonal Variation Signals?

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2016-01-01

    During inactive phases of the Madden-Julian oscillation (MJO), there are plenty of deep but small convective systems and far fewer deep and large ones. During active phases of the MJO, the increased occurrence of large and deep cloud clusters results from the amplification of large-scale motions by stronger convective heating. This study is designed to quantitatively examine the roles of small and large cloud clusters during the MJO life cycle. We analyze the cloud object data from Aqua CERES observations for tropical deep convective (DC) and cirrostratus (CS) cloud object types according to the real-time multivariate MJO index. A cloud object is a contiguous region of the earth with a single dominant cloud-system type. The size distributions, defined as the footprint numbers as a function of cloud object diameter, for particular MJO phases depart greatly from the combined (8-phase) distribution at large cloud-object diameters due to the reduced/increased numbers of cloud objects related to changes in the large-scale environments. The median diameter corresponding to the combined distribution is determined and used to partition all cloud objects into "small" and "large" groups of a particular phase. The two groups corresponding to the combined distribution have nearly equal numbers of footprints. The median diameters are 502 km for DC and 310 km for cirrostratus. The range of the variation between two extreme phases (typically, the most active and depressed phases) for the small group is 6-11% in terms of the numbers of cloud objects and the total footprint numbers. The corresponding range for the large group is 19-44%. In terms of the probability density functions of radiative and cloud physical properties, there are virtually no differences between the MJO phases for the small group, but there are significant differences for the large groups for both DC and CS types.
    These results suggest that the intraseasonal variation signals reside in the large cloud clusters, while the small cloud clusters represent background noise resulting from various types of tropical waves with different wavenumbers and propagation directions/speeds.
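    The footprint-based partitioning described above can be sketched in a few lines: the median diameter is the diameter at which the cumulative footprint count of the combined distribution reaches half the total, and it splits the cloud objects into "small" and "large" groups with nearly equal total footprint numbers. The diameters and counts below are hypothetical, not CERES data.

```python
def median_diameter(diameters, footprints):
    """Diameter at which the cumulative footprint count reaches half the total."""
    total = sum(footprints)
    cum = 0
    for d, n in sorted(zip(diameters, footprints)):
        cum += n
        if cum >= total / 2:
            return d

def partition(diameters, footprints, d_med):
    """Split cloud objects into 'small' and 'large' groups at the median diameter."""
    small = [(d, n) for d, n in zip(diameters, footprints) if d <= d_med]
    large = [(d, n) for d, n in zip(diameters, footprints) if d > d_med]
    return small, large

# Hypothetical size distribution: diameters (km) and footprint counts
diam = [100, 200, 300, 500, 800, 1200]
foot = [100, 150, 250, 200, 180, 120]
d_med = median_diameter(diam, foot)
small, large = partition(diam, foot, d_med)
```

With this toy distribution the two groups each hold exactly half of the total footprints, mirroring the "nearly equal" split described in the abstract.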

  17. Natural Environment Characterization Using Hybrid Tomographic Approaches

    NASA Astrophysics Data System (ADS)

    Huang, Yue; Ferro-Famil, Laurent; Reigber, Andreas

    2011-03-01

    SAR tomography (SARTOM) extends the conventional two-dimensional SAR imaging principle to three dimensions [1]. True 3D imaging of a scene is achieved by forming an additional synthetic aperture in elevation and coherently combining images acquired from several parallel flight tracks. This imaging technique allows a direct localization of multiple scattering contributions in the same resolution cell, leading to a refined analysis of volume structures, such as forests or dense urban areas. In order to improve the vertical resolution with respect to classical Fourier-based methods, High-Resolution (HR) approaches are used in this paper to perform SAR tomography. Both nonparametric spectral estimators, such as Beamforming and Capon, and parametric ones, such as MUSIC and Maximum Likelihood, are applied to real data sets and compared in terms of scatterer location accuracy and resolution. It is known that nonparametric approaches are in general more robust to focusing artefacts, whereas parametric approaches are characterized by a better vertical resolution. It has been shown [2], [3] that the performance of these spectral analysis approaches is conditioned by the nature of the scattering response of the observed objects. In hybrid environments, where objects with a deterministic response are embedded in a speckle-affected environment, parameter estimation for such scatterers becomes a problem of mixed-spectrum estimation. An impenetrable medium, such as the ground or a man-made object, possesses an isolated, localized phase center in the vertical direction, leading to a discrete (line) spectrum. Scatterers of this type can be considered 'h-localized' and are named 'Isolated Scatterers' (IS). Natural environments, in contrast, consist of a large number of elementary scatterers successively distributed in the vertical direction; scatterers of this type can be described as 'h-distributed' and are characterized by a continuous spectrum. 
    Therefore, the usual spectral estimators may reach their limits due to their lack of adaptation to both the statistical features of the backscattered information and the type of spectrum of the considered media. In order to overcome this problem, a tomographic focusing approach based on hybrid spectral estimators is introduced and extended to the polarimetric case. It contains two parallel procedures: one detects and localizes isolated scatterers, and the other characterizes the natural environment by estimating the heights of the ground and the tree top. These two decoupled procedures permit a more precise characterization of hybrid environments.
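    As a rough illustration of the nonparametric estimators named above, the sketch below simulates a multi-baseline data stack with two point scatterers at different heights and computes the Beamforming and Capon vertical spectra from the sample covariance matrix. The baselines, heights, and noise level are invented for the demo and do not reflect the paper's data or processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 10                       # number of parallel tracks (synthetic aperture in elevation)
xi = np.arange(M) * 0.02     # height-wavenumber samples of the baselines (assumed), 1/m
true_heights = [10.0, 30.0]  # two point scatterers (assumed), in meters
L = 200                      # number of looks

def steer(z):
    """Steering vector for a scatterer at height z."""
    return np.exp(2j * np.pi * xi * z)

# Simulate L looks: two uncorrelated unit-power scatterers plus white noise
A = np.stack([steer(z) for z in true_heights], axis=1)
S = (rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))) / np.sqrt(2)
N = 0.1 * (rng.standard_normal((M, L)) + 1j * rng.standard_normal((M, L)))
Y = A @ S + N
R = Y @ Y.conj().T / L       # sample covariance matrix
Rinv = np.linalg.inv(R)

# Vertical power spectra on a height grid
z_grid = np.linspace(0.0, 49.9, 500)
bf = np.array([np.real(steer(z).conj() @ R @ steer(z)) / M**2 for z in z_grid])
capon = np.array([1.0 / np.real(steer(z).conj() @ Rinv @ steer(z)) for z in z_grid])
```

The Capon spectrum shows two much narrower peaks near the true heights than the Beamforming spectrum, which is the resolution advantage the abstract refers to.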

  18. JACK - ANTHROPOMETRIC MODELING SYSTEM FOR SILICON GRAPHICS WORKSTATIONS

    NASA Technical Reports Server (NTRS)

    Smith, B.

    1994-01-01

    JACK is an interactive graphics program developed at the University of Pennsylvania that displays and manipulates articulated geometric figures. JACK is typically used to observe how a human mannequin interacts with its environment and what effects body types will have upon the performance of a task in a simulated environment. Any environment can be created, and any number of mannequins can be placed anywhere in that environment. JACK includes facilities to construct limited geometric objects, position figures, perform a variety of analyses on the figures, describe the motion of the figures and specify lighting and surface property information for rendering high quality images. JACK is supplied with a variety of body types pre-defined and known to the system. There are both male and female bodies, ranging from the 5th to the 95th percentile, based on NASA Standard 3000. Each mannequin is fully articulated and reflects the joint limitations of a normal human. JACK is an editor for manipulating previously defined objects known as "Peabody" objects. Used to describe the figures as well as the internal data structure for representing them, Peabody is a language with a powerful and flexible mechanism for representing connectivity between objects, both the joints between individual segments within a figure and arbitrary connections between different figures. Peabody objects are generally comprised of several individual figures, each one a collection of segments. Each segment has a geometry represented by PSURF files that consist of polygons or curved surface patches. Although JACK does not have the capability to create new objects, objects may be created by other geometric modeling programs and then translated into the PSURF format. Environment files are a collection of figures and attributes that may be dynamically moved under the control of an animation file. 
The animation facilities allow the user to create a sequence of commands that duplicate the movements of a human figure in an environment. Integrated into JACK is a set of vision tools that allow predictions about visibility and legibility. The program is capable of displaying environment perspectives corresponding to what the mannequin would see while in the environment, indicating potential problems with occlusion and visibility. It is also possible to display view cones emanating from the figure's eyes, indicating field of view. Another feature projects the environment onto retina coordinates which gives clues regarding visual angles, acuity and occlusion by the biological blind spots. A retina editor makes it possible to draw onto the retina and project that into 3-dimensional space. Another facility, Reach, causes the mannequin to move a specific portion of its anatomy to a chosen point in space. The Reach facility helps in analyzing problems associated with operator size and other constraints. The 17-segment torso makes it possible to set a figure into realistic postures, simulating human postures closely. The JACK application software is written in C-language for Silicon Graphics workstations running IRIX versions 4.0.5 or higher and is available only in executable form. Since JACK is a copyrighted program (copyright 1991 University of Pennsylvania), this executable may not be redistributed. The recommended minimum hardware configuration for running the executable includes a floating-point accelerator, an 8-megabyte program memory, a high resolution (1280 x 1024) graphics card, and at least 50Mb of free disk space. JACK's data files take up millions of bytes of storage space, so additional disk space is highly recommended. The standard distribution medium for JACK is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. JACK was originally developed in 1988. Jack v4.8 was released for distribution through COSMIC in 1993.

  19. Handbook on surficial uranium deposits. Chapter 3. World distribution relative to climate and physical setting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlisle, D

    1983-01-01

    This chapter discusses regional controls which affect the world distribution of surficial chemogenic uranium deposits. The most important of these are (1) climate, (2) geomorphology, including physiographic and climatic stability, and (3) provenance, i.e., the weathering terrain from which uranium and associated substances are derived. The three economically important environments are the calcrete environment, simple evaporative environments and paludal environments. Of these three categories, the calcrete uranium environment is probably the most uniquely constrained in terms of regional climate, geomorphic setting, provenance (vanadium as well as uranium) and especially the need for long term stability of both climate and physiography. Purely evaporative deposits, though subject to some of the same kinds of constraints, can also reflect local circumstances and a wider range of climates, physiographic settings, and source terrains. The third category, encompassing bogs, marshes and organic-rich playas, can form under an even wider range of climates and settings provided only that organic materials accumulate in abundance and are contacted by uranium-bearing waters. For all of these reasons, and also because of the great economic importance of the calcrete environment as well as its relative novelty and complexity, the discussion in this chapter is focused on calcrete, dolocrete and gypcrete uranium deposits. Objective data are reviewed first, followed by inferences and suggestions. 13 figures.

  20. Protocols for distributive scheduling

    NASA Technical Reports Server (NTRS)

    Richards, Stephen F.; Fox, Barry

    1993-01-01

    The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of space shuttle mission planning.

  1. Distributed project scheduling at NASA: Requirements for manual protocols and computer-based support

    NASA Technical Reports Server (NTRS)

    Richards, Stephen F.

    1992-01-01

    The increasing complexity of space operations and the inclusion of interorganizational and international groups in the planning and control of space missions lead to requirements for greater communication, coordination, and cooperation among mission schedulers. These schedulers must jointly allocate scarce shared resources among the various operational and mission oriented activities while adhering to all constraints. This scheduling environment is complicated by such factors as the presence of varying perspectives and conflicting objectives among the schedulers, the need for different schedulers to work in parallel, and limited communication among schedulers. Smooth interaction among schedulers requires the use of protocols that govern such issues as resource sharing, authority to update the schedule, and communication of updates. This paper addresses the development and characteristics of such protocols and their use in a distributed scheduling environment that incorporates computer-aided scheduling tools. An example problem is drawn from the domain of Space Shuttle mission planning.

  2. A vector-product information retrieval system adapted to heterogeneous, distributed computing environments

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.

    1991-01-01

    Vector-product information retrieval (IR) systems produce retrieval results superior to all other searching methods but presently have no commercial implementations beyond the personal computer environment. The NASA Electronic Library System (NELS) provides a ranked list of the most likely relevant objects in collections in response to a natural language query. Additionally, the system is constructed using standards and tools (Unix, X-Windows, Motif, and TCP/IP) that permit its operation in organizations that possess many different hosts, workstations, and platforms. There are no known commercial equivalents to this product at this time. The product has applications in all corporate management environments, particularly those that are information intensive, such as finance, manufacturing, biotechnology, and research and development.

  3. Hemispherical reflectance model for passive images in an outdoor environment.

    PubMed

    Kim, Charles C; Thai, Bea; Yamaoka, Neil; Aboutalib, Omar

    2015-05-01

    We present a hemispherical reflectance model for simulating passive images in an outdoor environment where illumination is provided by natural sources such as the sun and the clouds. While the bidirectional reflectance distribution function (BRDF) accurately produces the radiance of any object under this illumination, using the BRDF to calculate radiance requires a double integration. Replacing the BRDF with the hemispherical reflectance under natural sources transforms the double integration into a multiplication, which reduces both storage space and computation time. We present the formalism for the radiance of the scene using hemispherical reflectance instead of the BRDF. This enables us to generate passive images in an outdoor environment while taking advantage of the computational and storage efficiencies. We show some examples for illustration.
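    The simplification described above can be checked numerically in the simplest case: for a Lambertian surface under a uniform sky, the double integral of the BRDF over the incident hemisphere equals a single multiplication by the hemispherical reflectance. The values below are assumptions for illustration, not the paper's model.

```python
import math

rho = 0.4            # hemispherical reflectance (albedo), assumed
L_sky = 100.0        # uniform sky radiance, assumed (W m^-2 sr^-1)
f_r = rho / math.pi  # Lambertian BRDF is constant: rho / pi

# Double integration over the incident hemisphere:
# L_out = integral of f_r * L_sky * cos(theta) * sin(theta) dtheta dphi
n = 400
dtheta = (math.pi / 2) / n
dphi = (2 * math.pi) / n
L_integrated = 0.0
for i in range(n):
    theta = (i + 0.5) * dtheta
    for j in range(n):
        L_integrated += f_r * L_sky * math.cos(theta) * math.sin(theta) * dtheta * dphi

# The same radiance as a single multiplication with the hemispherical reflectance
L_multiplied = rho * L_sky
```

The two results agree to quadrature accuracy, which is why precomputing the hemispherical reflectance removes the per-pixel double integral.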

  4. Nondeterministic data base for computerized visual perception

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1976-01-01

    A description is given of the knowledge representation data base in the perception subsystem of the Mars robot vehicle prototype. Two types of information are stored. The first is generic information that represents general rules that are conformed to by structures in the expected environments. The second kind of information is a specific description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge is represented so that it can be applied to extract and infer the description of specific structures. The generic model of the rules is substantially a Bayesian representation of the statistics of the environment, which means it is geared to representation of nondeterministic rules relating properties of, and relations between, objects. The description of a specific structure is also nondeterministic in the sense that all properties and relations may take a range of values with an associated probability distribution.
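    A toy sketch of this nondeterministic representation: generic knowledge stores the likelihood of an observed feature given each object class, and the specific scene description holds a probability distribution over classes that is refined by Bayes' rule. The classes and numbers below are hypothetical, not taken from the Mars vehicle system.

```python
# Specific scene knowledge: current belief over object classes (hypothetical)
prior = {"rock": 0.6, "soil": 0.3, "shadow": 0.1}
# Generic knowledge: P(observing "dark" | class), also hypothetical
likelihood_dark = {"rock": 0.2, "soil": 0.1, "shadow": 0.9}

def bayes_update(prior, likelihood):
    """Fold one nondeterministic rule into the specific description."""
    unnorm = {c: prior[c] * likelihood[c] for c in prior}
    z = sum(unnorm.values())
    return {c: p / z for c, p in unnorm.items()}

posterior = bayes_update(prior, likelihood_dark)
```

After observing a dark region, probability mass shifts toward the classes that make the observation likely, while every property still carries a full distribution rather than a single value.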

  5. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
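    A network striped disk array can be illustrated with the simplest possible layout, round-robin striping of logical blocks across servers. Note that the DPSS deliberately leaves the layout to the application, so this is only one possible policy, and the server names are hypothetical.

```python
def block_location(block_id, servers):
    """Map a logical block to (server, local block index) under round-robin striping."""
    return servers[block_id % len(servers)], block_id // len(servers)

servers = ["dpss-a", "dpss-b", "dpss-c", "dpss-d"]   # hypothetical storage servers
placement = [block_location(b, servers) for b in range(8)]
```

Consecutive blocks land on different servers, so a sequential read of a large data-object keeps all servers busy in parallel.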

  6. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

    Global companies that have experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they comprise project management as part of their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation, and greater team interaction over exhaustive documentation, focusing instead on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  7. A Distributed Simulation Software System for Multi-Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Burns, Richard; Davis, George; Cary, Everett

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, and addressed functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution.

  8. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated a decrease in computing time of approximately 96.6%. With a single multicore compute node (the bottom result), computing time decreased by 81.8% relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models. 
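    The embarrassingly parallel structure the study exploits (independent Monte Carlo replicates farmed out to workers) can be sketched as below; a thread pool stands in for the distributed compute nodes used in the study, and the replicate function is a trivial placeholder rather than a state-and-transition model.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_replicate(seed):
    """Placeholder for one simulation replicate: the fraction of 1000
    cells that happen to transition in this run (toy 30% probability)."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.3 for _ in range(1000)) / 1000

def run_all(n_replicates, max_workers=4):
    # Replicates are independent, so they map cleanly onto a worker pool;
    # in the study the workers were high-throughput/high-performance nodes.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_replicate, range(n_replicates)))

results = run_all(20)
```

Because each replicate gets its own seed and shares no state, adding workers (or nodes) scales the run with essentially no coordination overhead, which is what drives the reported 96.6% reduction in wall-clock time.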

  9. A method for optimizing multi-objective reservoir operation upon human and riverine ecosystem demands

    NASA Astrophysics Data System (ADS)

    Ai, Xueshan; Dong, Zuo; Mo, Mingzhu

    2017-04-01

    Optimal reservoir operation is in general a multi-objective problem. In real life, most reservoir operation optimization problems involve conflicting objectives, for which there is no single optimal solution that can simultaneously achieve an optimal result for all purposes; rather, a set of well-distributed non-inferior solutions, or Pareto frontier, exists. On the other hand, most reservoir operation rules aim to gain greater social and economic benefits at the expense of the ecological environment, resulting in the destruction of riverine ecology and the reduction of aquatic biodiversity. To overcome these drawbacks, this study developed a multi-objective model for reservoir operation with the conflicting functions of hydroelectric energy generation, irrigation and ecological protection. To solve the model, with the objectives of maximizing energy production and maximizing the water demand satisfaction rate of irrigation and ecology, we proposed a multi-objective optimization method of variable penalty coefficients (VPC), which integrates dynamic programming (DP) with discrete differential dynamic programming (DDDP), to generate well-distributed non-inferior solutions along the Pareto front by changing the penalty coefficients of the different objectives. This method was applied over the course of a year to an existing Chinese reservoir named Donggu, a multi-annual storage reservoir with multiple purposes. The case study results showed a good relationship between any two of the objectives and a good set of Pareto optimal solutions, which provide a reference for reservoir decision makers.
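    The idea of tracing the Pareto front by varying penalty (weight) coefficients can be sketched with two toy conflicting objectives; this is a generic scalarization sweep under assumed objective shapes, not the paper's DP/DDDP formulation.

```python
def f_energy(x):       # toy objective 1: energy production (maximize)
    return x

def f_ecology(x):      # toy objective 2: ecological water satisfaction (maximize)
    return 1.0 - x * x

xs = [i / 100 for i in range(101)]        # discretized decision variable
pareto = []
for i in range(11):                       # sweep the penalty/weight coefficient
    w = i / 10
    best = max(xs, key=lambda x: w * f_energy(x) + (1 - w) * f_ecology(x))
    pareto.append((f_energy(best), f_ecology(best)))
```

Each coefficient setting yields one non-inferior solution; sweeping the coefficient spreads the solutions along the front, from the ecology-favoring extreme (0, 1) to the energy-favoring extreme (1, 0).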

  10. The color-magnitude distribution of small Kuiper Belt objects

    NASA Astrophysics Data System (ADS)

    Wong, Ian; Brown, Michael E.

    2015-11-01

    Occupying a vast region beyond the ice giants is an extensive swarm of minor bodies known as the Kuiper Belt. Enigmatic in their formation, composition, and evolution, these Kuiper Belt objects (KBOs) lie at the intersection of many of the most important topics in planetary science. Improved instruments and large-scale surveys have revealed a complex dynamical picture of the Kuiper Belt. Meanwhile, photometric studies have indicated that small KBOs display a wide range of colors, which may reflect a chemically diverse initial accretion environment and provide important clues to constraining the surface compositions of these objects. Notably, some recent work has shown evidence for bimodality in the colors of non-cold classical KBOs, which would have major implications for the formation and subsequent evolution of the entire KBO population. However, these previous color measurements are few and mostly come from targeted observations of known objects. As a consequence, the effect of observational biases cannot be readily removed, preventing one from obtaining an accurate picture of the true color distribution of the KBOs as a whole.

    We carried out a survey of KBOs using the Hyper Suprime-Cam instrument on the 8.2-meter Subaru telescope. Our observing fields targeted regions away from the ecliptic plane so as to avoid contamination from cold classical KBOs. Each field was imaged in both the g’ and i’ filters, which allowed us to calculate the g’-i’ color of each detected object. We detected more than 500 KBOs over two nights of observation, with absolute magnitudes from H=6 to H=11. Our survey increases the number of KBOs fainter than H=8 with known colors by more than an order of magnitude. We find that the distribution of colors demonstrates a robust bimodality across the entire observed range of KBO sizes, from which we can categorize individual objects into two color sub-populations: the red and the very-red KBOs. 
    We present the first analysis of the magnitude distributions of the two color sub-populations.

  11. CORBASec Used to Secure Distributed Aerospace Propulsion Simulations

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    The NASA Glenn Research Center and its industry partners are developing a Common Object Request Broker Architecture (CORBA) Security (CORBASec) test bed to secure their distributed aerospace propulsion simulations. Glenn has been working with its aerospace propulsion industry partners to deploy the Numerical Propulsion System Simulation (NPSS) object-based technology. NPSS is a program focused on reducing the cost and time in developing aerospace propulsion engines. It was developed by Glenn and is being managed by the NASA Ames Research Center as the lead center reporting directly to NASA Headquarters' Aerospace Technology Enterprise. Glenn is an active domain member of the Object Management Group: an open membership, not-for-profit consortium that produces and manages computer industry specifications (i.e., CORBA) for interoperable enterprise applications. When NPSS is deployed, it will assemble a distributed aerospace propulsion simulation scenario from proprietary analytical CORBA servers and execute them with security afforded by the CORBASec implementation. The NPSS CORBASec test bed was initially developed with the TPBroker Security Service product (Hitachi Computer Products (America), Inc., Waltham, MA) using the Object Request Broker (ORB), which is based on the TPBroker Basic Object Adaptor, and using NPSS software across different firewall products. The test bed has been migrated to the Portable Object Adaptor architecture using the Hitachi Security Service product based on the VisiBroker 4.x ORB (Borland, Scotts Valley, CA) and on the Orbix 2000 ORB (IONA Technologies, Dublin, Ireland, with U.S. headquarters in Waltham, MA). Glenn, GE Aircraft Engines, and Pratt & Whitney Aircraft are the initial industry partners contributing to the NPSS CORBASec test bed. The test bed uses RSA SecurID (RSA Security Inc., Bedford, MA) two-factor token-based authentication together with Hitachi Security Service digital-certificate-based authentication to validate the various NPSS users. 
The test bed is expected to demonstrate NPSS CORBASec-specific policy functionality, confirm adequate performance, and validate the required Internet configuration in a distributed collaborative aerospace propulsion environment.

  12. The 11.2 μm emission of PAHs in astrophysical objects

    NASA Astrophysics Data System (ADS)

    Candian, A.; Sarre, P. J.

    2015-04-01

    The 11.2-μm emission band belongs to the family of the `unidentified' infrared emission bands seen in many astronomical environments. In this work, we present a theoretical interpretation of the band characteristics and profile variation for a number of astrophysical sources in which the carriers are subject to a range of physical conditions. The results of Density Functional Theory calculations for the solo out-of-plane vibrational bending modes of large polycyclic aromatic hydrocarbon (PAH) molecules are used as input for a detailed emission model which includes the temperature and mass dependence of PAH band wavelength, and a PAH mass distribution that varies with object. Comparison of the model with astronomical spectra indicates that the 11.2-μm band asymmetry and profile variation can be explained principally in terms of the mass distribution of neutral PAHs with a small contribution from anharmonic effects.

  13. Distributed run of a one-dimensional model in a regional application using SOAP-based web services

    NASA Astrophysics Data System (ADS)

    Smiatek, Gerhard

    This article describes the setup of a distributed computing system in Perl. It facilitates the parallel run of a one-dimensional environmental model on a number of simple networked PC hosts. The system uses Simple Object Access Protocol (SOAP) driven web services that offer the model run on remote hosts, and a multi-threaded environment that distributes the work and accesses the web services. Its application is demonstrated in a regional run of a process-oriented biogenic emission model for the area of Germany. Within a network consisting of up to seven web services implemented on Linux and MS-Windows hosts, a performance increase of approximately 400% was reached compared to a model run on the fastest single host.
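    The dispatch pattern described (a multi-threaded client distributing grid cells across several model services) can be sketched as follows. The SOAP request is replaced here by a local stand-in function, and the host names and return values are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
import itertools

HOSTS = ["host1", "host2", "host3"]   # hypothetical model-service endpoints
host_cycle = itertools.cycle(HOSTS)

def call_model_service(host, cell):
    """Stand-in for a SOAP request that runs the 1-D model for one grid cell."""
    return {"host": host, "cell": cell, "emission": 0.1 * cell}

def run_cells(cells, workers=3):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Hosts are assigned round-robin at submission time
        futures = [pool.submit(call_model_service, next(host_cycle), c) for c in cells]
        return [f.result() for f in futures]

results = run_cells(range(6))
```

Because each grid cell is independent, the client threads keep all remote services busy concurrently, which is the source of the roughly fourfold speedup reported above.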

  14. Advances in Modal Analysis Using a Robust and Multiscale Method

    NASA Astrophysics Data System (ADS)

    Picard, Cécile; Frisson, Christian; Faure, François; Drettakis, George; Kry, Paul G.

    2010-12-01

    This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.
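    The sparse-grid voxelization step can be sketched as binning surface points into a hash map keyed by cell index, where a coarser cell size yields the lower-resolution approximations mentioned above. The points are toy values, not the paper's pipeline.

```python
def voxelize(points, cell_size):
    """Bin 3-D points into a sparse regular grid (only occupied cells are stored)."""
    cells = {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        cells.setdefault(key, []).append((x, y, z))
    return cells

# Toy surface samples; a coarser cell size gives a lower-resolution approximation
points = [(0.1, 0.2, 0.0), (0.4, 0.1, 0.2), (1.2, 0.1, 0.3), (2.6, 2.7, 2.9)]
fine = voxelize(points, 0.5)
coarse = voxelize(points, 3.0)
```

Storing only occupied cells keeps memory proportional to the surface, not the bounding box, and each cell's point list is the kind of per-cell material distribution used to tune the hexahedral element parameters.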

  15. The NASA Astrobiology Roadmap

    NASA Technical Reports Server (NTRS)

    Des Marais, David J.; Allamandola, Louis J.; Benner, Steven A.; Boss, Alan P.; Deamer, David; Falkowski, Paul G.; Farmer, Jack D.; Hedges, S. Blair; Jakosky, Bruce M.; Knoll, Andrew H.; hide

    2003-01-01

    The NASA Astrobiology Roadmap provides guidance for research and technology development across the NASA enterprises that encompass the space, Earth, and biological sciences. The ongoing development of astrobiology roadmaps embodies the contributions of diverse scientists and technologists from government, universities, and private institutions. The Roadmap addresses three basic questions: How does life begin and evolve, does life exist elsewhere in the universe, and what is the future of life on Earth and beyond? Seven Science Goals outline the following key domains of investigation: understanding the nature and distribution of habitable environments in the universe, exploring for habitable environments and life in our own solar system, understanding the emergence of life, determining how early life on Earth interacted and evolved with its changing environment, understanding the evolutionary mechanisms and environmental limits of life, determining the principles that will shape life in the future, and recognizing signatures of life on other worlds and on early Earth. For each of these goals, Science Objectives outline more specific high-priority efforts for the next 3-5 years. These 18 objectives are being integrated with NASA strategic planning.

  16. The NASA Astrobiology Roadmap.

    PubMed

    Des Marais, David J; Allamandola, Louis J; Benner, Steven A; Boss, Alan P; Deamer, David; Falkowski, Paul G; Farmer, Jack D; Hedges, S Blair; Jakosky, Bruce M; Knoll, Andrew H; Liskowsky, David R; Meadows, Victoria S; Meyer, Michael A; Pilcher, Carl B; Nealson, Kenneth H; Spormann, Alfred M; Trent, Jonathan D; Turner, William W; Woolf, Neville J; Yorke, Harold W

    2003-01-01

    The NASA Astrobiology Roadmap provides guidance for research and technology development across the NASA enterprises that encompass the space, Earth, and biological sciences. The ongoing development of astrobiology roadmaps embodies the contributions of diverse scientists and technologists from government, universities, and private institutions. The Roadmap addresses three basic questions: How does life begin and evolve, does life exist elsewhere in the universe, and what is the future of life on Earth and beyond? Seven Science Goals outline the following key domains of investigation: understanding the nature and distribution of habitable environments in the universe, exploring for habitable environments and life in our own solar system, understanding the emergence of life, determining how early life on Earth interacted and evolved with its changing environment, understanding the evolutionary mechanisms and environmental limits of life, determining the principles that will shape life in the future, and recognizing signatures of life on other worlds and on early Earth. For each of these goals, Science Objectives outline more specific high-priority efforts for the next 3-5 years. These 18 objectives are being integrated with NASA strategic planning.

  17. The NASA Astrobiology Roadmap.

    PubMed

    Des Marais, David J; Nuth, Joseph A; Allamandola, Louis J; Boss, Alan P; Farmer, Jack D; Hoehler, Tori M; Jakosky, Bruce M; Meadows, Victoria S; Pohorille, Andrew; Runnegar, Bruce; Spormann, Alfred M

    2008-08-01

    The NASA Astrobiology Roadmap provides guidance for research and technology development across the NASA enterprises that encompass the space, Earth, and biological sciences. The ongoing development of astrobiology roadmaps embodies the contributions of diverse scientists and technologists from government, universities, and private institutions. The Roadmap addresses three basic questions: how does life begin and evolve, does life exist elsewhere in the universe, and what is the future of life on Earth and beyond? Seven Science Goals outline the following key domains of investigation: understanding the nature and distribution of habitable environments in the universe, exploring for habitable environments and life in our own Solar System, understanding the emergence of life, determining how early life on Earth interacted and evolved with its changing environment, understanding the evolutionary mechanisms and environmental limits of life, determining the principles that will shape life in the future, and recognizing signatures of life on other worlds and on early Earth. For each of these goals, Science Objectives outline more specific high priority efforts for the next three to five years. These eighteen objectives are being integrated with NASA strategic planning.

  18. Search strategy in a complex and dynamic environment (the Indian Ocean case)

    NASA Astrophysics Data System (ADS)

    Loire, Sophie; Arbabi, Hassan; Clary, Patrick; Ivic, Stefan; Crnjaric-Zic, Nelida; Macesic, Senka; Crnkovic, Bojan; Mezic, Igor; UCSB Team; Rijeka Team

    2014-11-01

    The disappearance of Malaysia Airlines Flight 370 (MH370) in the early morning hours of 8 March 2014 exposed a disconcerting lack of efficient methods for deciding where and how to look for missing objects in a complex and dynamic environment. The search area for plane debris is a remote part of the Indian Ocean, and lawnmower-type searches have so far been unsuccessful. Lagrangian kinematics of mesoscale features are visible in hypergraph maps of Indian Ocean surface currents. Without precise knowledge of the crash site, these maps give an estimate of the time evolution of any initial distribution of plane debris and permit the design of a search strategy. The Dynamic Spectral Multiscale Coverage (DSMC) search algorithm is modified to search for a spatial distribution of targets that evolves in time following the dynamics of ocean surface currents. Trajectories are generated for multiple search agents such that their spatial coverage converges to the target distribution. Central to the DSMC algorithm is an ergodicity metric.
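
As a toy illustration only (the actual DSMC algorithm uses a spectral ergodicity metric over continuous trajectories; the 1-D grid, cyclic "current", and greedy deficit-chasing rule below are simplifying assumptions), the idea of steering coverage toward a time-evolving target distribution can be sketched as:

```python
def advect(dist, shift=1):
    """Cyclically shift the target distribution: a crude stand-in for
    advection of debris probability by ocean surface currents."""
    n = len(dist)
    return [dist[(i - shift) % n] for i in range(n)]

def search(dist, pos, steps):
    """A single agent moves one cell per step toward the largest coverage
    deficit (target probability minus accumulated coverage)."""
    coverage = [0.0] * len(dist)
    for _ in range(steps):
        dist = advect(dist)  # the target moves while we search
        deficit = [t - c for t, c in zip(dist, coverage)]
        goal = deficit.index(max(deficit))
        pos += (goal > pos) - (goal < pos)
        coverage[pos] += 1.0
    return dist, pos, coverage
```

The essential point survives the simplification: the agent chases a distribution that keeps moving, so coverage must be replanned against the advected target rather than the initial one.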

  19. Derivation and Application of a Global Albedo yielding an Optical Brightness To Physical Size Transformation Free of Systematic Errors

    NASA Technical Reports Server (NTRS)

    Mulrooney, Dr. Mark K.; Matney, Dr. Mark J.

    2007-01-01

    Orbital object data acquired via optical telescopes can play a crucial role in accurately defining the space environment. Radar systems probe the characteristics of small debris by measuring the reflected electromagnetic energy from an object of the same order of size as the wavelength of the radiation. This signal is affected by electrical conductivity of the bulk of the debris object, as well as its shape and orientation. Optical measurements use reflected solar radiation with wavelengths much smaller than the size of the objects. Just as with radar, the shape and orientation of an object are important, but we only need to consider the surface electrical properties of the debris material (i.e., the surface albedo), not the bulk electromagnetic properties. As a result, these two methods are complementary in that they measure somewhat independent physical properties to estimate the same quantity: debris size. Short-arc optical observations, such as those typical of NASA's Liquid Mirror Telescope (LMT), give enough information to estimate an Assumed Circular Orbit (ACO) and an associated range. This information, combined with the apparent magnitude, can be used to estimate an "absolute" brightness (scaled to a fixed range and phase angle). This absolute magnitude is what is used to estimate debris size. However, the shape and surface albedo effects make the size estimates subject to systematic and random errors, such that it is impossible to ascertain the size of an individual object with any certainty. Nevertheless, as has been shown with radar debris measurements, that does not preclude the ability to estimate the size distribution of a number of objects statistically. After systematic errors have been eliminated (range errors, phase function assumptions, photometry), there remains a random geometric albedo distribution that relates object size to absolute magnitude. 
Measurements by the LMT of a subset of tracked debris objects with sizes estimated from their radar cross sections indicate that the random variations in the albedo follow a log-normal distribution quite well. In addition, this distribution appears to be independent of object size over a considerable range in size. Note that this relation appears to hold for debris only, where the shapes and other properties are not primarily the result of human manufacture, but of random processes. With this information in hand, it now becomes possible to estimate the actual size distribution we are sampling from. We have identified two characteristics of the space debris population that make this process tractable and by extension have developed a methodology for performing the transformation.
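
A hedged sketch of that statistical inversion follows (the log-normal parameters, scale constant, and magnitude-to-size relation are placeholders for illustration, not the LMT-fitted values):

```python
import math
import random
import statistics

def size_estimates(abs_mag, mu=-1.0, sigma=0.5, scale=1.0, n=2000, seed=0):
    """Monte Carlo inversion: draw geometric albedos from a log-normal
    distribution and convert one absolute magnitude into candidate
    diameters via diameter ~ scale * 10**(-0.2 * M) / sqrt(albedo).
    Returns the median candidate diameter."""
    rng = random.Random(seed)
    flux_term = 10.0 ** (-0.2 * abs_mag)
    samples = [scale * flux_term / math.sqrt(rng.lognormvariate(mu, sigma))
               for _ in range(n)]
    return statistics.median(samples)
```

Applied across a population of measured magnitudes, it is the spread of such samples, not any single estimate, that constrains the underlying size distribution.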

  20. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    PubMed Central

    Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L.; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M.; Newell, Richard G.

    2016-01-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment. PMID:27922592

  1. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification

    NASA Astrophysics Data System (ADS)

    Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L.; Malof, Jordan M.; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M.; Newell, Richard G.

    2016-12-01

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.

  2. Distributed solar photovoltaic array location and extent dataset for remote sensing object identification.

    PubMed

    Bradbury, Kyle; Saboo, Raghav; Johnson, Timothy L; Malof, Jordan M; Devarajan, Arjun; Zhang, Wuming; Collins, Leslie M; Newell, Richard G

    2016-12-06

    Earth-observing remote sensing data, including aerial photography and satellite imagery, offer a snapshot of the world from which we can learn about the state of natural resources and the built environment. The components of energy systems that are visible from above can be automatically assessed with these remote sensing data when processed with machine learning methods. Here, we focus on the information gap in distributed solar photovoltaic (PV) arrays, of which there is limited public data on solar PV deployments at small geographic scales. We created a dataset of solar PV arrays to initiate and develop the process of automatically identifying solar PV locations using remote sensing imagery. This dataset contains the geospatial coordinates and border vertices for over 19,000 solar panels across 601 high-resolution images from four cities in California. Dataset applications include training object detection and other machine learning algorithms that use remote sensing imagery, developing specific algorithms for predictive detection of distributed PV systems, estimating installed PV capacity, and analysis of the socioeconomic correlates of PV deployment.
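
One of the listed applications, estimating installed capacity from the annotated border vertices, can be sketched minimally (the vertex format, projected-metre coordinates, and the 0.15 kW/m^2 capacity factor are illustrative assumptions, not values from the dataset):

```python
def polygon_area(vertices):
    """Shoelace formula for a simple polygon given ordered (x, y) vertices,
    e.g. a PV array footprint in projected metre coordinates (assumed)."""
    total = 0.0
    for i, (x0, y0) in enumerate(vertices):
        x1, y1 = vertices[(i + 1) % len(vertices)]
        total += x0 * y1 - x1 * y0
    return abs(total) / 2.0

def estimated_capacity_kw(area_m2, kw_per_m2=0.15):
    """Very rough installed-capacity estimate from footprint area alone."""
    return area_m2 * kw_per_m2

# A 2 m x 2 m panel footprint as a worked example.
square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
```

Summing such per-array estimates over a region would give the kind of small-geographic-scale deployment figures the abstract says are missing from public data.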

  3. An Integrated XRF/XRD Instrument for Mars Exobiology and Geology Experiments

    NASA Technical Reports Server (NTRS)

    Koppel, L. N.; Franco, E. D.; Kerner, J. A.; Fonda, M. L.; Schwartz, D. E.; Marshall, J. R.

    1993-01-01

    By employing an integrated x-ray instrument on a future Mars mission, data obtained will greatly augment those returned by Viking; details characterizing the past and present environment on Mars and those relevant to the possibility of the origin and evolution of life will be acquired. A combined x-ray fluorescence/x-ray diffraction (XRF/XRD) instrument was breadboarded and demonstrated to accommodate important exobiology and geology experiment objectives outlined for MESUR and future Mars missions. Among others, primary objectives for the exploration of Mars include the intense study of local areas on Mars to establish the chemical, mineralogical, and petrological character of different components of the surface material; to determine the distribution, abundance, and sources and sinks of volatile materials, including an assessment of the biologic potential, now and during past epoches; and to establish the global chemical and physical characteristics of the Martian surface. The XRF/XRD breadboard instrument identifies and quantifies soil surface elemental, mineralogical, and petrological characteristics and acquires data necessary to address questions on volatile abundance and distribution. Additionally, the breadboard is able to characterize the biogenic element constituents of soil samples providing information on the biologic potential of the Mars environment. Preliminary breadboard experiments confirmed the fundamental instrument design approach and measurement performance.

  4. THE YOUNG STELLAR POPULATION OF LYNDS 1340. AN INFRARED VIEW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kun, M.; Moór, A.; Wolf-Chase, G.

    We present results of an infrared study of the molecular cloud Lynds 1340, forming three groups of low- and intermediate-mass stars. Our goals are to identify and characterize the young stellar population of the cloud, study the relationships between the properties of the cloud and the emergent stellar groups, and integrate L1340 into the picture of the star-forming activity of our Galactic environment. We selected candidate young stellar objects (YSOs) from the Spitzer and WISE databases using various published color criteria and classified them based on the slope of the spectral energy distribution (SED). We identified 170 Class II, 27 flat SED, and 45 Class 0/I sources. High angular resolution near-infrared observations of the RNO 7 cluster, embedded in L1340, revealed eight new young stars with near-infrared excess. The surface density distribution of YSOs shows three groups, associated with the three major molecular clumps of L1340, each consisting of ≲100 members, including both pre-main-sequence stars and embedded protostars. New Herbig–Haro objects were identified in the Spitzer images. Our results demonstrate that L1340 is a prolific star-forming region of our Galactic environment in which several specific properties of the intermediate-mass mode of star formation can be studied in detail.
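
SED-slope classification of the kind used here can be sketched as follows (the class boundaries are the commonly used values after Greene et al.; exact cuts and wavelength ranges vary between studies, and the fit below is a plain least-squares slope, not the authors' pipeline):

```python
import math

def sed_slope(wavelengths_um, lam_f_lam):
    """Least-squares slope alpha of log(lambda * F_lambda) vs log(lambda)."""
    xs = [math.log10(w) for w in wavelengths_um]
    ys = [math.log10(f) for f in lam_f_lam]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def classify(alpha):
    """Commonly quoted boundaries (assumption; studies differ in detail)."""
    if alpha >= 0.3:
        return "Class 0/I"
    if alpha >= -0.3:
        return "flat"
    if alpha >= -1.6:
        return "Class II"
    return "Class III"
```

A rising SED (alpha > 0.3) flags an embedded protostar, while progressively falling SEDs indicate disk-bearing and then diskless pre-main-sequence stars.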

  5. Space Debris Symposium (A6.) Measurements and Space Surveillance (1.): Measurements of the Small Particle Debris Cloud from the 11 January, 2007 Chinese Anti-satellite Test

    NASA Technical Reports Server (NTRS)

    Matney, Mark J.; Stansbery, Eugene; J.-C Liou; Stokely, Christopher; Horstman, Matthew; Whitlock, David

    2008-01-01

    On January 11, 2007, the Chinese military conducted a test of an anti-satellite (ASAT) system, destroying their own Fengyun-1C spacecraft with an interceptor missile. The resulting hypervelocity collision created an unprecedented number of tracked debris - more than 2500 objects. These objects represent only those large enough for the US Space Surveillance Network (SSN) to track - typically objects larger than about 5-10 cm in diameter. There are expected to be even more debris objects at sizes too small to be seen and tracked by the SSN. Because of the altitude of the target satellite (865 x 845 km orbit), many of the debris are expected to have long orbital lifetimes and contribute to the orbital debris environment for decades to come. In the days and weeks following the ASAT test, NASA was able to use Lincoln Laboratory's Haystack radar on several occasions to observe portions of the ASAT debris cloud. Haystack has the capability of detecting objects down to less than one centimeter in diameter, and a large number of centimeter-sized particles corresponding to the ASAT cloud were clearly seen in the data. While Haystack cannot track these objects, the statistical sampling procedures NASA uses can give an accurate statistical picture of the characteristics of the debris from a breakup event. For years computer models based on data from ground hypervelocity collision tests (e.g., the SOCIT test) and orbital collision experiments (e.g., the P-78 and Delta-180 on-orbit collisions) have been used to predict the extent and characteristics of such hypervelocity collision debris clouds, but until now there have not been good ways to verify these models in the centimeter size regime. It is believed that unplanned collisions of objects in space similar to ASAT tests will drive the long-term future evolution of the debris environment in near-Earth space. 
Therefore, the Chinese ASAT test provides an excellent opportunity to test the models used to predict the future debris environment. For this study, Haystack detection events are compared to model predictions to test the model assumptions, including debris size distribution, velocity distribution, and assumptions about momentum transfer between the target and interceptor. In this paper we will present the results of these and other measurements on the size and extent of collisional breakup debris clouds.
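
For context on the models being tested, the cumulative fragment count in the NASA standard breakup model is often quoted, for catastrophic collisions, as N(>Lc) = 0.1 * M^0.75 * Lc^-1.71. The sketch below treats that published form as an assumption rather than restating this paper's analysis:

```python
def cumulative_fragments(lc_m, ref_mass_kg):
    """Cumulative number of fragments larger than characteristic length
    lc_m (metres) for a catastrophic collision with reference mass
    ref_mass_kg (kg), using the commonly quoted NASA standard breakup
    model power law (taken here as an assumption)."""
    return 0.1 * ref_mass_kg ** 0.75 * lc_m ** -1.71
```

Comparing such predicted counts in the centimeter regime against Haystack's statistically sampled detections is exactly the kind of model check the abstract describes.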

  6. Chytridiomycosis: a global threat to amphibians.

    PubMed

    Pereira, P L L; Torres, A M C; Soares, D F M; Hijosa-Valsero, M; Bécares, E

    2013-12-01

    Chytridiomycosis, which is caused by Batrachochytrium dendrobatidis, is an emerging infectious disease of amphibians. The disease is one of the main causes of the global decline in amphibians. The aetiological agent is ubiquitous, with worldwide distribution, and affects a large number of amphibian species in several biomes. In the last decade, scientific research has substantially increased knowledge of the aetiological agent and the associated infection. However, important epidemiological aspects of the environment-mediated interactions between the aetiological agent and the host are not yet clear. The objective of the present review is to describe chytridiomycosis with regard to the major features of the aetiological agent, the host and the environment.

  7. Design and implementation of a CORBA-based genome mapping system prototype.

    PubMed

    Hu, J; Mungall, C; Nicholson, D; Archibald, A L

    1998-01-01

    CORBA (Common Object Request Broker Architecture), as an open standard, is considered a good solution for the development and deployment of applications in distributed heterogeneous environments. This technology can be applied in the bioinformatics area to enhance utilization, management, and interoperation of biological resources. This paper investigates issues in developing CORBA applications for genome mapping information systems in the Internet environment, with emphasis on database connectivity and graphical user interfaces. The design and implementation of a CORBA prototype for an animal genome mapping database are described. The prototype demonstration is available via: http://www.ri.bbsrc.ac.uk/ark_corba/. jian.hu@bbsrc.ac.uk
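
CORBA itself requires an ORB and IDL-defined interfaces. Purely as a loose stdlib analogy (this is not CORBA, and the marker name and data below are invented), the remote-object pattern such a system builds on can be sketched with Python's xmlrpc:

```python
# Analogy only: a typed lookup method exposed as a remote object, so a
# client can query map data without knowing where or how it is stored.
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

class GenomeMapService:
    """Toy remote object with one lookup method (marker data invented)."""
    def get_marker(self, name):
        markers = {"SW240": {"chromosome": "2", "position_cM": 45.1}}
        return markers.get(name, {})

# Bind to an ephemeral loopback port so the sketch is self-contained.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_instance(GenomeMapService())
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

client = ServerProxy(f"http://127.0.0.1:{port}")
marker = client.get_marker("SW240")  # remote call over the wire
server.shutdown()
```

In CORBA the equivalent contract would be declared in IDL and served through an ORB, giving language-neutral stubs instead of XML-RPC's dynamic proxies.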

  8. Body posture differentially impacts on visual attention towards tool, graspable, and non-graspable objects.

    PubMed

    Ambrosini, Ettore; Costantini, Marcello

    2017-02-01

    Viewed objects have been shown to afford suitable actions, even in the absence of any intention to act. However, little is known as to whether gaze behavior (i.e., the way we simply look at objects) is sensitive to action afforded by the seen object and how our actual motor possibilities affect this behavior. We recorded participants' eye movements during the observation of tools, graspable and ungraspable objects, while their hands were either freely resting on the table or tied behind their back. The effects of the observed object and hand posture on gaze behavior were measured by comparing the actual fixation distribution with that predicted by 2 widely supported models of visual attention, namely the Graph-Based Visual Saliency and the Adaptive Whitening Salience models. Results showed that saliency models did not accurately predict participants' fixation distributions for tools. Indeed, participants mostly fixated the action-related, functional part of the tools, regardless of its visual saliency. Critically, the restriction of the participants' action possibility led to a significant reduction of this effect and significantly improved the model prediction of the participants' gaze behavior. We suggest, first, that action-relevant object information at least in part guides gaze behavior. Second, postural information interacts with visual information to the generation of priority maps of fixation behavior. We support the view that the kind of information we access from the environment is constrained by our readiness to act. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Advanced sensors and instrumentation

    NASA Technical Reports Server (NTRS)

    Calloway, Raymond S.; Zimmerman, Joe E.; Douglas, Kevin R.; Morrison, Rusty

    1990-01-01

    NASA is currently investigating the readiness of Advanced Sensors and Instrumentation to meet the requirements of new initiatives in space. The following technical objectives and technologies are briefly discussed: smart and nonintrusive sensors; onboard signal and data processing; high capacity and rate adaptive data acquisition systems; onboard computing; high capacity and rate onboard storage; efficient onboard data distribution; high capacity telemetry; ground and flight test support instrumentation; power distribution; and workstations, video/lighting. The requirements for high fidelity data (accuracy, frequency, quantity, spatial resolution) in hostile environments will continue to push the technology developers and users to extend the performance of their products and to develop new generations.

  10. Person-job and person-organization fits: Co-op fits in an aerospace engineering environment

    NASA Astrophysics Data System (ADS)

    Urban, Anthony John, Jr.

    This dissertation research was a replication of a quantitative study completed by Dr. Cynthia Shantz at Wayne State University during 2003. The intent of the research was to investigate the fits of college students who participated in cooperative academic-work programs (co-ops) to employment positions within aerospace engineering. The objective of investigating person-job (P-J) and person-organization (P-O) fits was to determine if variables could be identified that indicated an individual's aptitude to complete successfully aerospace engineering standard work. Research participants were co-op employees who were surveyed during their employment to identify indications of their fits into their organization and job assignments. Dr. Shantz's research led to the idea that employment success might increase as P-J and P-O fits improve. For example, reduced initial training investments and increased employee retention might result with improved P-O and P-J fits. Research data were gathered from surveys of co-ops who worked at a Connecticut aerospace engineering company. Data were collected by distributing invitations to co-ops to participate in three online surveys over a 9-11 week period. Distribution of survey invitations was accomplished through the Human Resources Department to ensure that respondent identities were kept private. To protect anonymity and privacy further, no identifying information about individuals or the company is published. However, some demographic information was collected to ensure that correlations were based on valid and reliable data and research and analysis methods. One objective of this research was to determine if co-op characteristics could be correlated with successful employment in an aerospace engineering environment. A second objective was to determine if P-J and P-O fits vary over time as co-ops become increasingly familiar with their assignments, organization, and environment. 
Understanding and incorporating P-J and P-O fit characteristics in the employment preparation and screening process may benefit aerospace engineering companies, co-ops, and academia through gains realized in reduced recruitment time and retention cost, improved student preparation to fit into aerospace engineering environments, and increases in pre- and post-graduation job placement rates.

  11. EVOLUTION IN THE H I GAS CONTENT OF GALAXY GROUPS: PRE-PROCESSING AND MASS ASSEMBLY IN THE CURRENT EPOCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Kelley M.; Wilcots, Eric M., E-mail: hess@ast.uct.ac.za, E-mail: ewilcots@astro.wisc.edu

    We present an analysis of the neutral hydrogen (H I) content and distribution of galaxies in groups as a function of their parent dark matter halo mass. The Arecibo Legacy Fast ALFA survey α.40 data release allows us, for the first time, to study the H I properties of over 740 galaxy groups in the volume of sky common to the Sloan Digital Sky Survey (SDSS) and ALFALFA surveys. We assigned ALFALFA H I detections a group membership based on an existing magnitude/volume-limited SDSS Data Release 7 group/cluster catalog. Additionally, we assigned group 'proximity' membership to H I detected objects whose optical counterpart falls below the limiting optical magnitude, thereby not contributing substantially to the estimate of the group stellar mass, but significantly to the total group H I mass. We find that only 25% of the H I detected galaxies reside in groups or clusters, in contrast to approximately half of all optically detected galaxies. Further, we plot the relative positions of optical and H I detections in groups as a function of parent dark matter halo mass to reveal strong evidence that H I is being processed in galaxies as a result of the group environment: as optical membership increases, groups become increasingly deficient in H I-rich galaxies at their center, and the H I distribution of galaxies in the most massive groups starts to resemble the distribution observed in comparatively more extreme cluster environments. We find that the lowest H I mass objects lose their gas first as they are processed in the group environment, and it is evident that the infall of gas-rich objects is important to the continuing growth of large-scale structure at the present epoch, replenishing the neutral gas supply of groups. Finally, we compare our results to those of cosmological simulations and find that current models cannot simultaneously predict the H I selected halo occupation distribution for both low and high mass halos.

  12. Is the objective food environment associated with perceptions of the food environment?

    PubMed

    Williams, Lauren K; Thornton, Lukar; Ball, Kylie; Crawford, David

    2012-02-01

    The present study examined whether objective measures of the food environment are associated with perceptions of the food environment and whether this relationship varies by socio-economic disadvantage. The study is a cross-sectional analysis of self-report surveys and objective environment data. Women reported their perceptions of the nutrition environment. Participants' homes and food stores were geocoded to measure the objective community nutrition environment. Data on the average price and variety of fruit and vegetables were used to measure the objective consumer nutrition environment. The study was conducted in Melbourne, Australia, in 2003-2004. Data presented are from a sample of 1393 women aged 18-65 years. Overall the match between the perceived and objective environment was poor, underscoring the limitations in using perceptions of the environment as a proxy for the objective environment. Socio-economic disadvantage had limited impact on the relationship between perceived and objective nutrition environment. Further research is needed to understand the determinants of perceptions of the nutrition environment to enhance our understanding of the role of perceptions in nutrition choices and drivers of socio-economic inequalities in nutrition.

  13. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    NASA Technical Reports Server (NTRS)

    Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis

    2003-01-01

    The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.

  14. Virtual Research Environments for Natural Hazard Modelling

    NASA Astrophysics Data System (ADS)

    Napier, Hazel; Aldridge, Tim

    2017-04-01

    The Natural Hazards Partnership (NHP) is a group of 17 collaborating public sector organisations providing a mechanism for co-ordinated advice to government and agencies responsible for civil contingency and emergency response during natural hazard events. The NHP has set up a Hazard Impact Model (HIM) group tasked with modelling the impact of a range of UK hazards with the aim of delivery of consistent hazard and impact information. The HIM group consists of 7 partners initially concentrating on modelling the socio-economic impact of 3 key hazards - surface water flooding, land instability and high winds. HIM group partners share scientific expertise and data within their specific areas of interest including hydrological modelling, meteorology, engineering geology, GIS, data delivery, and modelling of socio-economic impacts. Activity within the NHP relies on effective collaboration between partners distributed across the UK. The NHP are acting as a use case study for a new Virtual Research Environment (VRE) being developed by the EVER-EST project (European Virtual Environment for Research - Earth Science Themes: a solution). The VRE is allowing the NHP to explore novel ways of cooperation including improved capabilities for e-collaboration, e-research, automation of processes and e-learning. Collaboration tools are complemented by the adoption of Research Objects, semantically rich aggregations of resources enabling the creation of uniquely identified digital artefacts resulting in reusable science and research. Application of the Research Object concept to HIM development facilitates collaboration, by encapsulating scientific knowledge in a shareable format that can be easily shared and used by partners working on the same model but within their areas of expertise. This paper describes the application of the VRE to the NHP use case study. It outlines the challenges associated with distributed partnership working and how they are being addressed in the VRE. 
A case study is included focussing on the application of Research Objects to development work for the surface water flooding hazard impact model, a key achievement for the HIM group.

  15. Collaborative Workspaces within Distributed Virtual Environments.

    DTIC Science & Technology

    1996-12-01

    such as a text document, a 3D model, or a captured image using a collaborative workspace called the InPerson Whiteboard. The Whiteboard contains a...commands for editing objects drawn on the screen. Finally, when the call is completed, the Whiteboard can be saved to a file for future use. IRIS Annotator...use, and a shared whiteboard that includes a number of multimedia annotation tools. Both systems are also mindful of bandwidth limitations and can

  16. Cardiological database management system as a mediator to clinical decision support.

    PubMed

    Pappas, C; Mavromatis, A; Maglaveras, N; Tsikotis, A; Pangalos, G; Ambrosiadou, V

    1996-03-01

    An object-oriented medical database management system is presented for a typical cardiologic center, facilitating epidemiological trials. Object-oriented analysis and design were used for the system design, offering advantages for the integrity and extendibility of medical information systems. The system was developed using object-oriented design and programming methodology, the C++ language, and the Borland Paradox Relational Database Management System in an MS-Windows NT environment. Particular attention was paid to system compatibility, portability, ease of use, and the suitable design of the patient record so as to support the decisions of medical personnel in cardiovascular centers. The system was designed to accept complex, heterogeneous, distributed data in various formats and from different kinds of examinations such as Holter, Doppler and electrocardiography.

  17. EOS: A project to investigate the design and construction of real-time distributed embedded operating systems

    NASA Technical Reports Server (NTRS)

    Campbell, R. H.; Essick, R. B.; Grass, J.; Johnston, G.; Kenny, K.; Russo, V.

    1986-01-01

    The EOS project is investigating the design and construction of a family of real-time distributed embedded operating systems for reliable, distributed aerospace applications. Using the real-time programming techniques developed in co-operation with NASA in earlier research, the project staff is building a kernel for a multiple-processor networked system. The first six months of the grant included a study of scheduling in an object-oriented system, the design philosophy of the kernel, and the architectural overview of the operating system. In this report, the operating system and kernel concepts are described. An environment for the experiments has been built and several of the key concepts of the system have been prototyped. The kernel and operating system are intended to support future experimental studies in multiprocessing, load balancing, routing, software fault tolerance, distributed database design, and real-time processing.

  18. Analysis of space radiation exposure levels at different shielding configurations by ray-tracing dose estimation method

    NASA Astrophysics Data System (ADS)

    Kartashov, Dmitry; Shurshakov, Vyacheslav

    2018-03-01

    A ray-tracing method to calculate radiation exposure levels of astronauts at different spacecraft shielding configurations has been developed. The method uses simplified shielding geometry models of the spacecraft compartments together with depth-dose curves. The depth-dose curves can be obtained with different space radiation environment models and radiation transport codes. The spacecraft shielding configurations are described by a set of geometry objects. To calculate the shielding probability functions for each object, its surface is composed of a set of disjoint adjacent triangles that fully cover it. Such a description can be applied to objects of any complex shape. The method is applied to the modelling conditions of the MATROSHKA-R space experiment. The experiment has been carried out onboard the ISS from 2004 to 2016. Dose measurements were performed in the ISS compartments with anthropomorphic and spherical phantoms, and with the protective curtain facility that provides additional shielding on the crew cabin wall. The space ionizing radiation dose distributions in tissue-equivalent spherical and anthropomorphic phantoms and for additional shielding installed in the compartment are calculated. The data obtained in the experiment agree with the calculated values to within about 15%. Thus the calculation method has been successfully verified against the MATROSHKA-R experiment data. The ray-tracing radiation dose calculation method can be recommended for estimating the dose distribution in an astronaut's body in different space station compartments and for estimating the efficiency of additional shielding, especially when the exact compartment shielding geometry and the radiation environment for the planned mission are not known.
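
    The core of the method - sampling isotropic ray directions from a dose point, evaluating the shielding thickness along each ray, and averaging the depth-dose curve over all rays - can be sketched as follows. The depth-dose curve and the shielding function here are illustrative placeholders, not values from the experiment:

```python
import math
import random

def depth_dose(t, curve):
    """Linearly interpolate a depth-dose curve given as (shielding g/cm^2, dose) pairs."""
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= t <= t1:
            return d0 + (d1 - d0) * (t - t0) / (t1 - t0)
    return curve[-1][1]  # beyond the last point, clamp to the deepest value

def ray_dose(shield_along_ray, curve, n_rays=10000, rng=None):
    """Average dose at a point: sample isotropic directions, evaluate the
    shielding thickness along each ray, and look up the depth-dose curve."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(n_rays):
        # uniform direction on the unit sphere
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        s = math.sqrt(1.0 - z * z)
        direction = (s * math.cos(phi), s * math.sin(phi), z)
        total += depth_dose(shield_along_ray(direction), curve)
    return total / n_rays
```

    In a full implementation, `shield_along_ray` would sum the thicknesses of the triangle-meshed geometry objects intersected by the ray; here any callable returning an areal density can be plugged in.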

  19. Magnetoacoustic tomography with magnetic induction for high-resolution bioimpedance imaging through vector source reconstruction under the static field of MRI magnet

    PubMed Central

    Mariappan, Leo; Hu, Gang; He, Bin

    2014-01-01

    Purpose: Magnetoacoustic tomography with magnetic induction (MAT-MI) is an imaging modality to reconstruct the electrical conductivity of biological tissue based on the acoustic measurements of Lorentz force induced tissue vibration. This study presents the feasibility of the authors' new MAT-MI system and vector source imaging algorithm to perform a complete reconstruction of the conductivity distribution of real biological tissues with ultrasound spatial resolution. Methods: In the present study, using ultrasound beamformation, imaging point spread functions are designed to reconstruct the induced vector source in the object, which is used to estimate the object conductivity distribution. Both numerical studies and phantom experiments are performed to demonstrate the merits of the proposed method. Also, through the numerical simulations, the full width at half maximum of the imaging point spread function is calculated to estimate the spatial resolution. The tissue phantom experiments are performed with a MAT-MI imaging system in the static field of a 9.4 T magnetic resonance imaging magnet. Results: The image reconstruction through vector beamformation in the numerical and experimental studies gives a reliable estimate of the conductivity distribution in the object with a ∼1.5 mm spatial resolution corresponding to the imaging system's ultrasound frequency of 500 kHz. In addition, the experimental results suggest that MAT-MI under a high static magnetic field environment is able to reconstruct images of tissue-mimicking gel phantoms and real tissue samples with reliable conductivity contrast. Conclusions: The results demonstrate that MAT-MI is able to image the electrical conductivity properties of biological tissues with better than 2 mm spatial resolution at 500 kHz, and that imaging with MAT-MI under a high static magnetic field environment is able to provide improved imaging contrast for biological tissue conductivity reconstruction. PMID:24506649
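
    The spatial-resolution estimate quoted above comes from the full width at half maximum (FWHM) of the imaging point spread function. A generic numerical FWHM measurement on a sampled, single-peaked PSF might look like this (the sample values are illustrative):

```python
def fwhm(xs, ys):
    """Full width at half maximum of a sampled point spread function,
    using linear interpolation to locate the two half-maximum crossings."""
    half = max(ys) / 2.0
    left = right = None
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        # rising crossing of the half-maximum level (take the first)
        if y0 < half <= y1 and left is None:
            left = x0 + (x1 - x0) * (half - y0) / (y1 - y0)
        # falling crossing (keep the last)
        if y0 >= half > y1:
            right = x0 + (x1 - x0) * (half - y0) / (y1 - y0)
    return right - left
```

    For a triangular PSF sampled at x = 0, 1, 2 with values 0, 1, 0, the half-maximum crossings fall at 0.5 and 1.5, giving an FWHM of 1.0.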

  20. 45 CFR 153.700 - Distributed data environment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 Distributed data environment. 153.700 Section 153... Distributed Data Collection for HHS-Operated Programs § 153.700 Distributed data environment. (a) Dedicated distributed data environments. For each benefit year in which HHS operates the risk adjustment or reinsurance...

  1. 45 CFR 153.700 - Distributed data environment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 Distributed data environment. 153.700 Section 153... Distributed Data Collection for HHS-Operated Programs § 153.700 Distributed data environment. (a) Dedicated distributed data environments. For each benefit year in which HHS operates the risk adjustment or reinsurance...

  2. Radiological tele-immersion for next generation networks.

    PubMed

    Ai, Z; Dech, F; Rasmussen, M; Silverstein, J C

    2000-01-01

    Since the acquisition of high-resolution three-dimensional patient images has become widespread, medical volumetric datasets (CT or MR) larger than 100 MB and encompassing more than 250 slices are common. It is important to make this patient-specific data quickly available and usable to many specialists at different geographical sites. Web-based systems have been developed to provide volume or surface rendering of medical data over networks with low fidelity, but these cannot adequately handle stereoscopic visualization or huge datasets. State-of-the-art virtual reality techniques and high-speed networks have made it possible to create an environment in which geographically distributed clinicians can immersively share these massive datasets in real time. An object-oriented method for instantaneously importing medical volumetric data into Tele-Immersive environments has been developed at the Virtual Reality in Medicine Laboratory (VRMedLab) at the University of Illinois at Chicago (UIC). This networked-VR setup is based on LIMBO, an application framework or template that provides the basic capabilities of Tele-Immersion. We have developed a modular general-purpose Tele-Immersion program that automatically combines 3D medical data with the methods for handling the data. For this purpose a DICOM loader for IRIS Performer has been developed. The loader was designed for SGI machines as a shared object, which is executed at LIMBO's runtime. The loader loads not only the selected DICOM dataset, but also methods for rendering, handling, and interacting with the data, bringing networked, real-time, stereoscopic interaction with radiological data to reality. Collaborative, interactive methods currently implemented in the loader include cutting planes and windowing. The Tele-Immersive environment has been tested on the UIC campus over an ATM network.
We tested the environment with three nodes: one ImmersaDesk at the VRMedLab, one CAVE at the Electronic Visualization Laboratory (EVL) on east campus, and a CT scanner in UIC Hospital. CT data was pulled directly from the scanner to the Tele-Immersion server in our laboratory, and then synchronously distributed by our Onyx2 Rack server to all the VR setups. Rather than confining medical volume visualization to a single VR device, the Tele-Immersive environment combines teleconferencing, tele-presence, and virtual reality to enable geographically distributed clinicians to intuitively interact with the same medical volumetric models, point, gesture, converse, and see each other. This environment will bring together clinicians at different geographic locations to participate in Tele-Immersive consultation and collaboration.
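
    Of the collaborative methods the loader implements, windowing is the standard radiological center/width mapping of CT Hounsfield units onto a display range. A minimal sketch of that transform (not the loader's actual code; the values are illustrative):

```python
def apply_window(hu_values, center, width):
    """Map CT Hounsfield values into the 0-255 display range using a
    window center and width: values below the window clamp to black,
    values above clamp to white, values inside scale linearly."""
    lo = center - width / 2.0
    hi = center + width / 2.0
    out = []
    for v in hu_values:
        if v <= lo:
            out.append(0)
        elif v >= hi:
            out.append(255)
        else:
            out.append(int(round((v - lo) / (hi - lo) * 255)))
    return out
```

    With a soft-tissue-style window (center 40, width 400), air at -1000 HU maps to 0 and dense bone at 3000 HU maps to 255.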

  3. Building distributed rule-based systems using the AI Bus

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain C.

    1990-01-01

    The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running in heterogeneous distributed environments and utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured in layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix and wrapped an embedded CLIPS with methods for the knowledge source class. This involved designing standard protocols for communication and functions that use these protocols in rules. Embedding several CLIPS instances within a single process posed an unexpected problem because of CLIPS's global variables; the solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations, and other components as true object-oriented modules.
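
    The blackboard-style inter-agent communication described above can be illustrated with a minimal sketch: agents post facts to a shared blackboard, and other agents react to the facts that match their interest. The agents, predicates, and fact tuples below are hypothetical, not AI Bus APIs:

```python
class Blackboard:
    """Minimal blackboard: agents post facts; subscribed agents react to
    facts matching their predicate, possibly posting new facts in turn."""
    def __init__(self):
        self.facts = []
        self.subscribers = []

    def subscribe(self, predicate, handler):
        self.subscribers.append((predicate, handler))

    def post(self, fact):
        self.facts.append(fact)
        for predicate, handler in self.subscribers:
            if predicate(fact):
                handler(fact)

# two hypothetical agents cooperating through the blackboard:
# a monitor turns raw sensor facts into alarms; a logger records alarms
bb = Blackboard()
log = []
bb.subscribe(lambda f: f[0] == "sensor", lambda f: bb.post(("alarm", f[1] > 100)))
bb.subscribe(lambda f: f[0] == "alarm", lambda f: log.append(f))
bb.post(("sensor", 150))
```

    Neither agent knows about the other; the blackboard decouples them, which is the property that lets contributions from separate teams cooperate.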

  4. Calculating Statistical Orbit Distributions Using GEO Optical Observations with the Michigan Orbital Debris Survey Telescope (MODEST)

    NASA Technical Reports Server (NTRS)

    Matney, M.; Barker, E.; Seitzer, P.; Abercromby, K. J.; Rodriquez, H. M.

    2006-01-01

    NASA's Orbital Debris measurements program has a goal to characterize the small debris environment in the geosynchronous Earth-orbit (GEO) region using optical telescopes ("small" refers to objects too small to catalog and track with current systems). Traditionally, observations of GEO and near-GEO objects involve following the object with the telescope long enough to obtain an orbit suitable for tracking purposes. Telescopes operating in survey mode, however, randomly observe objects that pass through their field of view. Typically, these short-arc observations are inadequate to obtain detailed orbits, but can be used to estimate approximate circular orbit elements (semimajor axis, inclination, and ascending node). From this information, it should be possible to make statistical inferences about the orbital distributions of the GEO population bright enough to be observed by the system. The Michigan Orbital Debris Survey Telescope (MODEST) has been making such statistical surveys of the GEO region for four years. During that time, the telescope has made enough observations in enough areas of the GEO belt to have had nearly complete coverage. That means that almost all objects in all possible orbits in the GEO and near-GEO region had a non-zero chance of being observed. Some regions (such as those near zero inclination) have had good coverage, while others are poorly covered. Nevertheless, it is possible to remove these statistical biases and reconstruct the orbit populations within the limits of sampling error. In this paper, these statistical techniques and assumptions are described, and the techniques are applied to the current MODEST data set to arrive at our best estimate of the GEO orbit population distribution.
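
    In its simplest form, the bias-removal step divides the observed count in each orbit bin by that bin's detection probability (coverage), excluding bins that were never observable. The bin labels and numbers below are illustrative, not MODEST results:

```python
def debias_counts(observed, detection_prob):
    """Estimate the true population per orbit bin by dividing observed
    counts by each bin's detection probability; bins with zero coverage
    cannot be estimated and are excluded."""
    estimate = {}
    for bin_id, n in observed.items():
        p = detection_prob[bin_id]
        if p > 0:
            estimate[bin_id] = n / p
    return estimate
```

    A well-covered bin (p = 0.5) with 10 detections and a poorly covered bin (p = 0.1) with 3 detections yield population estimates of 20 and 30 objects, so the poorly covered bin may actually hold the larger population.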

  5. Influence of semantic consistency and perceptual features on visual attention during scene viewing in toddlers.

    PubMed

    Helo, Andrea; van Ommen, Sandrien; Pannasch, Sebastian; Danteny-Dordoigne, Lucile; Rämä, Pia

    2017-11-01

    Conceptual representations of everyday scenes are built in interaction with the visual environment, and these representations guide our visual attention. Perceptual features and object-scene semantic consistency have been found to attract our attention during scene exploration. The present study examined how visual attention in 24-month-old toddlers is attracted by semantic violations and how perceptual features (i.e., saliency, centre distance, clutter, and object size) and linguistic properties (i.e., object label frequency and label length) affect gaze distribution. We compared eye movements of 24-month-old toddlers and adults while exploring everyday scenes which contained either an inconsistent (e.g., soap on a breakfast table) or consistent (e.g., soap in a bathroom) object. Perceptual features such as saliency, centre distance and clutter of the scene affected looking times in the toddler group during the whole viewing time, whereas looking times in adults were affected only by centre distance during the early viewing time. Adults looked longer at inconsistent than consistent objects whether the objects had high or low saliency. In contrast, toddlers showed a semantic consistency effect only when objects were highly salient. Additionally, toddlers with lower vocabulary skills looked longer at inconsistent objects, while toddlers with higher vocabulary skills looked equally long at both consistent and inconsistent objects. Our results indicate that 24-month-old children use scene context to guide visual attention when exploring the visual environment. However, perceptual features have a stronger influence on eye movement guidance in toddlers than in adults. Our results also indicate that language skills influence cognitive but not perceptual guidance of eye movements during scene perception in toddlers.

  6. The effect of solar radiation on the thermal environment inside the air-conditioned automobile chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tong, L.; Yang, K.; Chen, Z.

    1999-07-01

    The distribution of solar radiant energy inside the specific air-conditioned automobile chamber is studied on the basis of the unique wavelength spectrum. Some important optical parameters of the internal materials are mostly determined by experiments with a monochromator, electron-multiplier phototube, etc. Some optical parameters of the thin transparent object are analyzed theoretically. Based on a random model, the Monte Carlo method is adopted to obtain the detailed distribution of solar radiant energy. The processes of absorption, reflection and transmission of each ray are simulated and traced during the calculation. The universal software calculates two cases with different kinds of glass. The relevant results show the importance of solar radiant energy to the thermal environment inside the air-conditioned automobile chamber. Furthermore, the necessity of good shielding quality in the automobile glass is also evident. This study is also the basis of subsequent research on fluid and temperature fields. The results are also useful for further thermal comfort design.
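
    The Monte Carlo step - sampling the fate of each ray from a surface's absorptivity, reflectivity, and transmissivity - can be sketched as follows. The optical parameters here are illustrative placeholders, not the measured values from the study:

```python
import random

def trace_rays(alpha, rho, tau, n=100000, seed=1):
    """Monte Carlo fate of solar rays at a surface with absorptivity alpha,
    reflectivity rho, and transmissivity tau (alpha + rho + tau = 1):
    each ray's fate is drawn from those probabilities."""
    assert abs(alpha + rho + tau - 1.0) < 1e-9
    rng = random.Random(seed)
    counts = {"absorbed": 0, "reflected": 0, "transmitted": 0}
    for _ in range(n):
        u = rng.random()
        if u < alpha:
            counts["absorbed"] += 1
        elif u < alpha + rho:
            counts["reflected"] += 1
        else:
            counts["transmitted"] += 1
    return counts
```

    With enough rays the simulated fractions converge to the input optical parameters; a full simulation would also trace reflected and transmitted rays onward to the next surface they strike.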

  7. Standardization of quantum key distribution and the ETSI standardization initiative ISG-QKD

    NASA Astrophysics Data System (ADS)

    Länger, Thomas; Lenhart, Gaby

    2009-05-01

    In recent years, quantum key distribution (QKD) has been the object of intensive research activities and of rapid progress, and it is now developing into a competitive industry with commercial products. Once QKD systems are transferred from the controlled environment of physical laboratories into a real-world environment for practical use, a number of practical security, compatibility and connectivity issues need to be resolved. In particular, comprehensive security evaluation and watertight security proofs need to be addressed to increase trust in QKD. System interoperability with existing infrastructures and applications as well as conformance with specific user requirements have to be assured. Finding common solutions to these problems involving all actors can provide an advantage for the commercialization of QKD as well as for further technological development. The ETSI industry specification group for QKD (ISG-QKD) offers a forum for creating such universally accepted standards and will promote significant leverage effects on coordination, cooperation and convergence in research, technical development and business application of QKD.

  8. A Model-Based Expert System for Space Power Distribution Diagnostics

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-01-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.
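
    Constraint suspension, the technique Marple provides, can be illustrated in miniature: suspend each component's behavioral constraint in turn, and keep as fault candidates the components whose suspension makes the remaining models consistent with the observations. The two-component power-bus example below is hypothetical, not the Space Station test bed model:

```python
def diagnose(components, observations):
    """Constraint suspension: for each suspect component, suspend its
    constraint and check whether all remaining component models are
    consistent with the observations. components maps name -> check(obs)."""
    candidates = []
    for suspect in components:
        consistent = all(check(observations)
                         for name, check in components.items()
                         if name != suspect)
        if consistent:
            candidates.append(suspect)
    return candidates

# hypothetical two-component model: a 28 V source feeding a closed switch
components = {
    "source": lambda obs: obs["v_in"] == 28,    # source should supply 28 V
    "switch": lambda obs: obs["v_out"] == obs["v_in"],  # closed switch passes voltage
}
observations = {"v_in": 28, "v_out": 0}  # output voltage is missing
candidates = diagnose(components, observations)
```

    Suspending the source still leaves the switch constraint violated (0 != 28), so only the switch survives as a fault candidate.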

  9. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.

  10. Efficient Software Systems for Cardio Surgical Departments

    NASA Astrophysics Data System (ADS)

    Fountoukis, S. G.; Diomidous, M. J.

    2009-08-01

    Herein, the design, implementation, and deployment of an object-oriented software system suitable for the monitoring of cardio surgical departments are investigated. Distributed design architectures are applied, and the implemented software system can be deployed on distributed infrastructures. The software is flexible and adaptable to any cardio surgical environment regardless of the department resources used. The system exploits the relations and interdependency of the successive bed positions that patients occupy at the different health care units during their stay in a cardio surgical department to determine bed availability and to perform patient scheduling and instant rescheduling whenever necessary. It also aims at efficient monitoring of the workings of cardio surgical departments.

  11. The Lunar Atmosphere and Dust Environment Explorer (LADEE): Initial Science Results

    NASA Technical Reports Server (NTRS)

    Elphic, R. C.; Hine, B.; Delory, G. T.; Salute, J. S.; Noble, S.; Colaprete, A.; Horanyi, M.; Mahaffy, P.

    2014-01-01

    On September 6, 2013, a near-perfect launch of the first Minotaur V rocket successfully carried NASA's Lunar Atmosphere and Dust Environment Explorer (LADEE) into a high-eccentricity geocentric orbit. The launch, from NASA's Wallops Flight Facility in Virginia, was visible from much of the eastern seaboard. Over the next 30 days, LADEE performed three phasing orbits, with near-perfect maneuvers that placed apogee at ever higher altitudes in preparation for rendezvous with the Moon. LADEE arrived at the Moon on October 6, 2013, during the government shutdown. LADEE's science objectives are twofold: (1) Determine the composition of the lunar atmosphere, and investigate processes controlling its distribution and variability, including sources, sinks, and surface interactions; (2) Characterize the lunar exospheric dust environment, and measure its spatial and temporal variability, and its effects on the lunar atmosphere, if any.

  12. NELS 2.0 - A general system for enterprise wide information management

    NASA Technical Reports Server (NTRS)

    Smith, Stephanie L.

    1993-01-01

    NELS, the NASA Electronic Library System, is an information management tool for creating distributed repositories of documents, drawings, and code for use and reuse by the aerospace community. The NELS retrieval engine can load metadata and source files of full-text objects, perform natural language queries to retrieve ranked objects, and create links to connect user interfaces. For flexibility, the NELS architecture has layered interfaces between the application program and the stored library information. The session manager provides the interface functions for development of NELS applications. The data manager is an interface between the session manager and the structured data system. The center of the structured data system is the Wide Area Information Server. This system architecture provides access to information across heterogeneous platforms in a distributed environment. There are presently three user interfaces that connect to the NELS engine: an X-Windows interface, an ASCII interface, and the Spatial Data Management System. This paper describes the design and operation of NELS as an information management tool and repository.
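
    Ranked retrieval for natural-language queries, as the NELS engine performs, is commonly built on TF-IDF scoring: terms that are frequent in a document but rare across the collection score highest. The sketch below is a generic illustration of that idea, not NELS's actual ranking function:

```python
import math
from collections import Counter

def rank(query, documents):
    """Return document indices ordered by a TF-IDF relevance score
    against a whitespace-tokenized natural-language query."""
    docs_tokens = [doc.lower().split() for doc in documents]
    n = len(documents)
    # document frequency: how many documents contain each term
    df = Counter()
    for tokens in docs_tokens:
        for term in set(tokens):
            df[term] += 1
    scores = []
    for i, tokens in enumerate(docs_tokens):
        tf = Counter(tokens)
        score = sum(tf[t] * math.log(n / df[t])
                    for t in query.lower().split() if t in tf)
        scores.append((score, i))
    return [i for score, i in sorted(scores, reverse=True)]
```

    For the query "code reuse" over a tiny collection, the document mentioning both terms most often ranks first and the unrelated document last.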

  13. EPICS as a MARTe Configuration Environment

    NASA Astrophysics Data System (ADS)

    Valcarcel, Daniel F.; Barbalace, Antonio; Neto, André; Duarte, André S.; Alves, Diogo; Carvalho, Bernardo B.; Carvalho, Pedro J.; Sousa, Jorge; Fernandes, Horácio; Goncalves, Bruno; Sartori, Filippo; Manduchi, Gabriele

    2011-08-01

    The Multithreaded Application Real-Time executor (MARTe) software provides an environment for the hard real-time execution of codes while leveraging a standardized algorithm development process. The Experimental Physics and Industrial Control System (EPICS) software allows the deployment and remote monitoring of networked control systems. Channel Access (CA) is the protocol that enables communication between distributed EPICS components. It allows process variables belonging to different systems to be set and monitored across the network. The COntrol and Data Acquisition and Communication (CODAC) system for the ITER Tokamak will be EPICS-based and will be used to monitor and live-configure the plant controllers. The reconfiguration capability in a hard real-time system requires strict latencies from request to actuation and is a key element in the design of the distributed control algorithm. Presently, MARTe and its objects are configured using a well-defined structured language. After each configuration, all objects are destroyed and the system rebuilt, following the strong hard real-time rule that a real-time system in online mode must behave in a strictly deterministic fashion. This paper presents the design and considerations required to use MARTe as a plant controller and enable it to be monitored and configured from EPICS without disturbing execution at any time, in particular during a plasma discharge. The solutions designed for this will be presented and discussed.
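
    The requirement highlighted above - reconfiguring without disturbing hard real-time execution - is commonly met with double buffering: the new configuration is built off-line and swapped in with a single atomic reference assignment, so the real-time loop always reads a complete snapshot. The sketch below illustrates that general pattern, not MARTe's actual mechanism:

```python
import threading

class LiveConfig:
    """Double-buffered configuration: readers take one reference read
    (no lock) and always see a complete, immutable snapshot; writers
    build a new configuration off-line and swap it in atomically."""
    def __init__(self, initial):
        self._active = dict(initial)
        self._lock = threading.Lock()   # serializes writers only

    def snapshot(self):
        return self._active             # single reference read for the RT loop

    def reconfigure(self, updates):
        with self._lock:
            staged = dict(self._active)  # copy, then modify off-line
            staged.update(updates)
            self._active = staged        # atomic swap; no partial state visible
```

    A reader holding an old snapshot keeps seeing consistent values even while a reconfiguration lands, which is the property a real-time loop needs.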

  14. Muon radiography in Russia with emulsion technique. First experiments future perspectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aleksandrov, A. B.; Bagulya, A. V.; Chernyavsky, M. M.

    Cosmic ray muon radiography is a novel technique for imaging the internal structures of massive objects. It exploits the penetrating capability of high-energy cosmic-ray muons to obtain a density map of the investigated object and to infer information on variations in its density distribution. Nuclear emulsions are tracking detectors well suited to this context since they have excellent angular resolution (a few mrad), are cheap, compact and robust, easily transportable, able to work in harsh environments, and do not require a power supply. This work presents the first successful results in the field of muon radiography in Russia with nuclear emulsions.

  15. SAVA 3: A testbed for integration and control of visual processes

    NASA Technical Reports Server (NTRS)

    Crowley, James L.; Christensen, Henrik

    1994-01-01

    The development of an experimental testbed to investigate the integration and control of perception in a continuously operating vision system is described. The testbed integrates a 12-axis robotic stereo camera head mounted on a mobile robot, dedicated computer boards for real-time image acquisition and processing, and a distributed system for image description. The architecture was designed to: (1) operate continuously, (2) integrate software contributions from geographically dispersed laboratories, (3) integrate description of the environment with 2D measurements, 3D models, and recognition of objects, (4) support diverse experiments in gaze control, visual servoing, navigation, and object surveillance, and (5) be dynamically reconfigurable.

  16. Design, implementation, and extension of thermal invisibility cloaks

    NASA Astrophysics Data System (ADS)

    Zhang, Youming; Xu, Hongyi; Zhang, Baile

    2015-05-01

    A thermal invisibility cloak, as inspired by optical invisibility cloaks, is a device which can steer the conductive heat flux around an isolated object without changing the ambient temperature distribution, so that the object can be "invisible" to the external thermal environment. While designs of thermal invisibility cloaks inherit earlier theories from optical cloaks, the uniqueness of heat diffusion leads to more achievable implementations. Thermal invisibility cloaks, as well as variations including thermal concentrators, rotators, and illusion devices, have the potential to be applied in thermal management, sensing, and imaging. Here, we review the current knowledge of thermal invisibility cloaks in terms of their design and implementation in cloaking studies, and their extension to other functional devices.

  17. Collaborative mining and transfer learning for relational data

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Eslami, Mohammed

    2015-06-01

    Many real-world problems - including analysis of human knowledge, communication, biological, and cyber networks - deal with data entities for which the essential information is contained in the relations among those entities. Such data must be modeled and analyzed as graphs, with attributes on both objects and relations encoding and differentiating their semantics. Traditional data mining algorithms were originally designed for analyzing discrete objects for which a set of features can be defined, and thus cannot be easily adapted to deal with graph data. This gave rise to the relational data mining field of research, of which graph pattern learning is a key sub-domain [11]. In this paper, we describe a model for learning graph patterns in a collaborative, distributed manner. Distributed pattern learning is challenging due to dependencies between the nodes and relations in the graph, and variability across graph instances. We present three algorithms that trade off the benefits of parallelization and data aggregation, compare their performance to centralized graph learning, and discuss the individual benefits and weaknesses of each model. The presented algorithms are designed for linear speedup in distributed computing environments, and learn graph patterns that are both closer to ground truth and provide higher detection rates than the centralized mining algorithm.
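
    The parallelization/aggregation trade-off can be seen in miniature by counting a simple graph pattern (directed two-edge paths) within each partition of the edge list and then merging the per-worker counts. Patterns that cross partition boundaries are missed by this naive scheme, which is exactly the dependency problem the paper's algorithms address. The pattern and data are illustrative:

```python
from collections import Counter

def count_patterns(edges):
    """Count directed two-edge paths (a -> b -> c), a simple graph
    pattern, within one partition of the edge list."""
    successors = {}
    for a, b in edges:
        successors.setdefault(a, set()).add(b)
    counts = Counter()
    for a, succs in successors.items():
        for b in succs:
            # every successor of b extends a -> b into a two-edge path
            counts["two_path"] += len(successors.get(b, ()))
    return counts

def aggregate(partials):
    """Merge per-worker pattern counts (the data-aggregation step)."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total
```

    Workers run `count_patterns` on their own edge partitions in parallel; `aggregate` then sums their counts, trading some accuracy at partition boundaries for speedup.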

  18. Design and Implementation of Replicated Object Layer

    NASA Technical Reports Server (NTRS)

    Koka, Sudhir

    1996-01-01

    One of the widely used techniques for constructing fault-tolerant applications is the replication of resources, so that if one copy fails, sufficient copies may still remain operational to allow the application to continue to function. This thesis involves the design and implementation of an object-oriented framework for replicating data on multiple sites and across different platforms. Our approach, called the Replicated Object Layer (ROL), provides a mechanism for consistent replication of data over dynamic networks. ROL uses the Reliable Multicast Protocol (RMP) as a communication protocol that provides reliable delivery, serialization, and fault tolerance. Besides providing type registration, this layer facilitates distributed atomic transactions on replicated data. A novel algorithm called the RMP Commit Protocol, which commits transactions efficiently in a reliable multicast environment, is presented. ROL provides recovery procedures to ensure that site and communication failures do not corrupt persistent data, and makes the system fault tolerant to network partitions. ROL facilitates building distributed fault-tolerant applications by handling the burdensome details of replica consistency and making them completely transparent to the application. Replicated databases are a major class of applications that could be built on top of ROL.
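    The core idea, replicas staying consistent by applying the same totally ordered update stream that a reliable multicast layer such as RMP would deliver, can be sketched minimally. The `Replica` class and update format here are hypothetical illustrations, not ROL's actual API:

```python
class Replica:
    """A replicated key-value object that applies totally ordered updates."""
    def __init__(self):
        self.state = {}
        self.applied = 0  # sequence number of the last applied update

    def deliver(self, seq, key, value):
        # A reliable multicast layer guarantees gap-free, ordered delivery.
        assert seq == self.applied + 1, "out-of-order delivery"
        self.state[key] = value
        self.applied = seq

def multicast(replicas, updates):
    """Deliver the same ordered update stream to every replica."""
    for seq, (key, value) in enumerate(updates, start=1):
        for r in replicas:
            r.deliver(seq, key, value)

replicas = [Replica() for _ in range(3)]
multicast(replicas, [("x", 1), ("y", 2), ("x", 3)])
# Every replica converges to the same state.
assert all(r.state == {"x": 3, "y": 2} for r in replicas)
```

    The hard parts ROL addresses, atomic multi-update transactions, recovery after partitions, and persistence, sit on top of this ordered-delivery foundation.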

  19. Design notes for the next generation persistent object manager for CAP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isely, M.; Fischler, M.; Galli, M.

    1995-05-01

    The CAP query system software at Fermilab has several major components, including SQS (for managing the query), the retrieval system (for fetching auxiliary data), and the query software itself. The central query software is essentially a modified version of the `ptool` product created at UIC (University of Illinois at Chicago) as part of the PASS project under Bob Grossman. The original UIC version was designed for use in a single-user, non-distributed Unix environment. The Fermi modifications were an attempt to permit multi-user access to a data set distributed over a set of storage nodes. (The hardware is an IBM SP-x system - a cluster of AIX POWER2 nodes with an IBM-proprietary high-speed switch interconnect.) Since the implementation of the Fermi-ized ptool, the CAP members have learned quite a bit about the nature of queries and where the current performance bottlenecks exist. This has led them to design a persistent object manager that will overcome these problems. For backwards compatibility with ptool, the ptool persistent object API will largely be retained, but the implementation will be entirely different.

  20. Semiautomatic mapping of permafrost in the Yukon Flats, Alaska

    NASA Astrophysics Data System (ADS)

    Gulbrandsen, Mats Lundh; Minsley, Burke J.; Ball, Lyndsay B.; Hansen, Thomas Mejer

    2016-12-01

    Thawing of permafrost due to global warming can have major impacts on hydrogeological processes, climate feedback, arctic ecology, and local environments. To understand these effects and processes, it is crucial to know the distribution of permafrost. In this study we exploit the fact that airborne electromagnetic (AEM) data are sensitive to the distribution of permafrost and demonstrate how the distribution of permafrost in the Yukon Flats, Alaska, is mapped in an efficient (semiautomatic) way, using a combination of supervised and unsupervised (machine) learning algorithms, i.e., Smart Interpretation and K-means clustering. Clustering is used to sort unfrozen and frozen regions, and Smart Interpretation is used to predict the depth of permafrost based on expert interpretations. This workflow allows, for the first time, a quantitative and objective approach to efficiently map permafrost based on large amounts of AEM data.
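    The unsupervised step, separating frozen from unfrozen ground by clustering an electrical property inferred from the AEM data, can be illustrated with a tiny 1-D k-means. The resistivity values and the two-cluster setup below are purely illustrative assumptions, not the study's actual workflow or data:

```python
import numpy as np

def kmeans_1d(x, k=2, iters=50, seed=0):
    """Tiny 1-D k-means: returns sorted centroids and a label per sample."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        centroids = np.array([x[labels == j].mean() for j in range(k)])
    order = np.argsort(centroids)
    return centroids[order], order.argsort()[labels]  # relabel so 0 = lowest

# Synthetic log10-resistivity samples: unfrozen sediments are conductive (low),
# permafrost is resistive (high); the numbers are invented for illustration.
rho = np.concatenate([np.random.default_rng(1).normal(1.5, 0.1, 50),   # unfrozen
                      np.random.default_rng(2).normal(2.5, 0.1, 50)])  # frozen
centroids, labels = kmeans_1d(rho, k=2)
frozen = labels == 1  # the cluster with the higher centroid
print("fraction flagged as permafrost:", frozen.mean())
```

    In the actual workflow, the clustering output is combined with Smart Interpretation, a supervised model trained on expert picks, to predict permafrost depth rather than just a frozen/unfrozen flag.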

  1. Semiautomatic mapping of permafrost in the Yukon Flats, Alaska

    USGS Publications Warehouse

    Gulbrandsen, Mats Lundh; Minsley, Burke J.; Ball, Lyndsay B.; Hansen, Thomas Mejer

    2016-01-01

    Thawing of permafrost due to global warming can have major impacts on hydrogeological processes, climate feedback, arctic ecology, and local environments. To understand these effects and processes, it is crucial to know the distribution of permafrost. In this study we exploit the fact that airborne electromagnetic (AEM) data are sensitive to the distribution of permafrost and demonstrate how the distribution of permafrost in the Yukon Flats, Alaska, is mapped in an efficient (semiautomatic) way, using a combination of supervised and unsupervised (machine) learning algorithms, i.e., Smart Interpretation and K-means clustering. Clustering is used to sort unfrozen and frozen regions, and Smart Interpretation is used to predict the depth of permafrost based on expert interpretations. This workflow allows, for the first time, a quantitative and objective approach to efficiently map permafrost based on large amounts of AEM data.

  2. Incorporating client-server database architecture and graphical user interface into outpatient medical records.

    PubMed Central

    Fiacco, P. A.; Rice, W. H.

    1991-01-01

    Computerized medical record systems require structured database architectures for information processing. However, the data must be transferable across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732

  3. Habitat preference of freshwater snails in relation to environmental factors and the presence of the competitor snail Melanoides tuberculatus (Müller, 1774).

    PubMed

    Giovanelli, Alexandre; da Silva, Cesar Luiz Pinto Ayres Coelho; Leal, Geórgia Borges Eccard; Baptista, Darcílio Fernandes

    2005-04-01

    Our objective is to evaluate the habitat preference of freshwater snails in relation to environmental factors and the presence of the competitor snail Melanoides tuberculatus. In the first phase, snails were collected at 12 sites. These sampling sites presented a degree of organic input. In the second phase, 33 sampling sites were chosen, covering a variety of lotic and lentic environments. The snail species found at Guapimirim, state of Rio de Janeiro, displayed a marked habitat preference, especially in relation to the physical characteristics of each environment. Other limiting factors for snail distribution in the studied lotic environments were the water current velocity and the amount of organic matter, mainly for Physa marmorata, M. tuberculatus, and Biomphalaria tenagophila. The absence of interactions between M. tuberculatus and other snails could be associated with the distinct spatial distribution of those species and the instability of habitats. This latter factor may favor the coexistence of M. tuberculatus with B. glabrata by reducing population density. In areas of schistosomiasis transmission, some habitat modification may add to the instability of the environment, which would make room for the coexistence of M. tuberculatus and Biomphalaria spp. In this way, some of the usual measures for the control of snail hosts would prevent the extinction of populations of Biomphalaria spp. by M. tuberculatus in particular habitats.

  4. Multiagent Systems Based Modeling and Implementation of Dynamic Energy Management of Smart Microgrid Using MACSimJX.

    PubMed

    Raju, Leo; Milton, R S; Mahadevan, Senthilkumaran

    The objective of this paper is the implementation of a multiagent system (MAS) for the advanced distributed energy management and demand side management of a solar microgrid. Initially, the Java Agent Development Environment (JADE) framework is used to implement MAS-based dynamic energy management of the solar microgrid. Due to the unstable nature of MATLAB when dealing with a multithreading environment, the MAS operating in JADE is linked with MATLAB using a middleware called Multiagent Control Using Simulink with Jade Extension (MACSimJX). MACSimJX allows the solar microgrid components designed with MATLAB to be controlled by the corresponding agents of the MAS. The microgrid environment variables are captured through sensors and given to agents through MATLAB/Simulink, and after the agent operations in JADE, the results are given to the actuators through MATLAB for the implementation of dynamic operation in the solar microgrid. The MAS operating in JADE maximizes the operational efficiency of the solar microgrid through a decentralized approach and increases runtime efficiency. Autonomous demand side management is implemented to optimize the power exchange between the main grid and the microgrid, given the intermittent nature of solar power, the randomness of load, and the variation of noncritical load and grid price. These dynamics are considered at every time step, and a complex environment simulation is designed to emulate the distributed microgrid operations and evaluate the impact of agent operations.

  5. Multiagent Systems Based Modeling and Implementation of Dynamic Energy Management of Smart Microgrid Using MACSimJX

    PubMed Central

    Raju, Leo; Milton, R. S.; Mahadevan, Senthilkumaran

    2016-01-01

    The objective of this paper is the implementation of a multiagent system (MAS) for the advanced distributed energy management and demand side management of a solar microgrid. Initially, the Java Agent Development Environment (JADE) framework is used to implement MAS-based dynamic energy management of the solar microgrid. Due to the unstable nature of MATLAB when dealing with a multithreading environment, the MAS operating in JADE is linked with MATLAB using a middleware called Multiagent Control Using Simulink with Jade Extension (MACSimJX). MACSimJX allows the solar microgrid components designed with MATLAB to be controlled by the corresponding agents of the MAS. The microgrid environment variables are captured through sensors and given to agents through MATLAB/Simulink, and after the agent operations in JADE, the results are given to the actuators through MATLAB for the implementation of dynamic operation in the solar microgrid. The MAS operating in JADE maximizes the operational efficiency of the solar microgrid through a decentralized approach and increases runtime efficiency. Autonomous demand side management is implemented to optimize the power exchange between the main grid and the microgrid, given the intermittent nature of solar power, the randomness of load, and the variation of noncritical load and grid price. These dynamics are considered at every time step, and a complex environment simulation is designed to emulate the distributed microgrid operations and evaluate the impact of agent operations. PMID:27127802

  6. Tori, Discs, and Winds: The First Ten Years of AGN Interferometry

    NASA Astrophysics Data System (ADS)

    Hönig, Sebastian F.

    Infrared (IR) interferometry has made significant progress over the last 10 years, to the point that active galactic nuclei (AGN) are now routine targets for long-baseline interferometers. Almost 50 different objects have been studied to date in the near-IR and mid-IR. This has allowed for detailed characterisation of the dusty environment of actively growing black holes. It was possible to show directly that the dust must be arranged in clumps, as had been indirectly inferred from theory and unresolved observations. The dust composition seems to undergo significant evolution from galactic scales to the AGN environment, with the hottest dust close to the sublimation front being dominated by large graphite grains. While the overall distribution of the dusty mass is quite diverse from object to object, indications have been found that the dust distribution may depend on AGN luminosity, with more powerful AGN potentially showing more compact dust structures. Arguably the most exciting discovery was the fact that the bulk of the mid-IR emission in Seyfert galaxies emerges from the polar region of the AGN, which is difficult to reconcile with classical torus models. An alternative model is currently being debated that consists of a dusty disc plus a dusty wind driven by radiation pressure from the central source. This finding has major implications for our understanding of AGN unification and will become a focus of the upcoming generation of instruments at the VLTI. More recently, an application of interferometry to cosmology was proposed to measure precise geometric distances to AGN in the Hubble flow. Further exploration of this method may open up interferometry to a new scientific community.

  7. The Planck Catalogue of Galactic Cold Clumps : Looking at the early stages of star-formation

    NASA Astrophysics Data System (ADS)

    Montier, Ludovic

    2015-08-01

    The Planck satellite has provided an unprecedented view of the submm sky, allowing us to search for the dust emission of Galactic cold sources. Combining Planck-HFI all-sky maps in the high frequency channels with the IRAS map at 100um, we built the Planck catalogue of Galactic Cold Clumps (PGCC, Planck 2015 results XXVIII 2015), comprising 13188 sources distributed over the whole sky and mainly following the Galactic structures at low and intermediate latitudes. This is the first all-sky catalogue of Galactic cold sources obtained with a single instrument at this resolution and sensitivity, which opens a new window on star-formation processes in our Galaxy. I will briefly describe the colour detection method used to extract the Galactic cold sources, i.e., the Cold Core Colour Detection Tool (CoCoCoDeT, Montier et al. 2010), and its application to the Planck data. I will discuss the statistical distribution of the properties of the PGCC sources (in terms of dust temperature, distance, mass, density and luminosity), which illustrates that the PGCC catalogue spans a large variety of environments and objects, from molecular clouds to cold cores, and covers various stages of evolution. The Planck catalogue is a very powerful tool to study the formation and evolution of prestellar objects and star-forming regions. I will finally present an overview of the Herschel Key Program Galactic Cold Cores (PI M. Juvela), which allowed us to follow up about 350 Planck Galactic Cold Clumps in various stages of evolution and environments. With this program, the nature and composition of the 5' Planck sources have been revealed at sub-arcmin resolution, showing very different configurations, such as starless cold cores or multiple Young Stellar Objects still embedded in their cold envelope.

  8. Modeling and Simulation of the Transient Response of Temperature and Relative Humidity Sensors with and without Protective Housing

    PubMed Central

    Rocha, Keller Sullivan Oliveira; Martins, José Helvecio; Martins, Marcio Arêdes; Ferreira Tinôco, Ilda de Fátima; Saraz, Jairo Alexander Osorio; Filho, Adílio Flauzino Lacerda; Fernandes, Luiz Henrique Martins

    2014-01-01

    Based on the necessity for enclosure protection of temperature and relative humidity sensors installed in a hostile environment, a wind tunnel was used to quantify the time that the sensors take to reach equilibrium in the environmental conditions to which they are exposed. Two treatments were used: (1) sensors with polyvinyl chloride (PVC) enclosure protection, and (2) sensors with no enclosure protection. The primary objective of this study was to develop and validate a 3-D computational fluid dynamics (CFD) model for analyzing the temperature and relative humidity distribution in a wind tunnel using sensors with PVC enclosure protection and sensors with no enclosure protection. A CFD simulation model was developed to describe the temperature distribution and the physics of mass transfer related to the airflow relative humidity. The first results demonstrate the applicability of the simulation. For verification, a sensor device was successfully assembled and tested in an environment that was optimized to ensure fast change conditions. The quantification setup presented in this paper is thus considered to be adequate for testing different materials and morphologies for enclosure protection. The results show that the boundary layer flow regime has a significant impact on the heat flux distribution. The results indicate that the CFD technique is a powerful tool which provides a detailed description of the flow and temperature fields as well as the time that the relative humidity takes to reach equilibrium with the environment in which the sensors are inserted. PMID:24851994

  9. New frontier, new power: the retail environment in Australia's dark market

    PubMed Central

    Carter, S

    2003-01-01

    Objective: To investigate the role of the retail environment in cigarette marketing in Australia, one of the "darkest" markets in the world. Design: Analysis of 172 tobacco industry documents; and articles and advertisements found by hand searching Australia's three leading retail trade journals. Results: As Australian cigarette marketing was increasingly restricted, the retail environment became the primary communication vehicle for building cigarette brands. When retail marketing was restricted, the industry conceded only incrementally and under duress, and at times continues to break the law. The tobacco industry targets retailers via trade promotional expenditure, financial and practical assistance with point of sale marketing, alliance building, brand advertising, and distribution. Cigarette brand advertising in retail magazines is designed to build brand identities. Philip Morris and British American Tobacco are now competing to control distribution of all products to retailers, placing themselves at the heart of retail business. Conclusions: Cigarette companies prize retail marketing in Australia's dark market. Stringent point of sale marketing restrictions should be included in any comprehensive tobacco control measures. Relationships between retailers and the industry will be more difficult to regulate. Retail press advertising and trade promotional expenditure could be banned. In-store marketing assistance, retail–tobacco industry alliance building, and new electronic retail distribution systems may be less amenable to regulation. Alliances between the health and retail sectors and financial support for a move away from retail dependence on tobacco may be necessary to effect cultural change. PMID:14645954

  10. Modeling the Solar Dust Environment at 9.5 Solar Radii: Revealing Radiance Trends with MESSENGER Star Tracker Data

    NASA Astrophysics Data System (ADS)

    Strong, S. B.; Strikwerda, T.; Lario, D.; Raouafi, N.; Decker, R.

    2010-12-01

    The main components of interplanetary dust are created through destruction, erosion, and collision of asteroids and comets (e.g. Mann et al. 2006). Solar radiation forces distribute these interplanetary dust particles throughout the solar system. The percent contribution of these source particulates to the net interplanetary dust distribution can reveal information about solar nebula conditions, within which these objects are formed. In the absence of observational data (e.g. Helios, Pioneer), specifically at distances less than 0.3 AU, the precise dust distributions remain unknown and limited to 1 AU extrapolative models (e.g. Mann et al. 2003). We have developed a model suitable for the investigation of scattered dust and electron irradiance incident on a sensor for distances inward of 1 AU. The model utilizes the Grün et al. (1985) and Mann et al. (2004) dust distribution theory combined with Mie theory and Thomson electron scattering to determine the magnitude of solar irradiance scattered towards an optical sensor as a function of helio-ecliptic latitude and longitude. MESSENGER star tracker observations (launch to 2010) of the ambient celestial background combined with Helios data (Lienert et al. 1982) reveal trends in support of the model predictions. This analysis further emphasizes the need to characterize the inner solar system dust environment in anticipation of near-Solar missions.

  11. Object Segmentation Methods for Online Model Acquisition to Guide Robotic Grasping

    NASA Astrophysics Data System (ADS)

    Ignakov, Dmitri

    A vision system is an integral component of many autonomous robots. It enables the robot to perform essential tasks such as mapping, localization, or path planning. A vision system also assists with guiding the robot's grasping and manipulation tasks. As an increased demand is placed on service robots to operate in uncontrolled environments, advanced vision systems must be created that can function effectively in visually complex and cluttered settings. This thesis presents the development of segmentation algorithms to assist in online model acquisition for guiding robotic manipulation tasks. Specifically, the focus is placed on localizing door handles to assist in robotic door opening, and on acquiring partial object models to guide robotic grasping. First, a method for localizing a door handle of unknown geometry based on a proposed 3D segmentation method is presented. Following segmentation, localization is performed by fitting a simple box model to the segmented handle. The proposed method functions without requiring assumptions about the appearance of the handle or the door, and without a geometric model of the handle. Next, an object segmentation algorithm is developed, which combines multiple appearance (intensity and texture) and geometric (depth and curvature) cues. The algorithm is able to segment objects without utilizing any a priori appearance or geometric information in visually complex and cluttered environments. The segmentation method is based on the Conditional Random Fields (CRF) framework, and the graph cuts energy minimization technique. A simple and efficient method for initializing the proposed algorithm which overcomes graph cuts' reliance on user interaction is also developed. Finally, an improved segmentation algorithm is developed which incorporates a distance metric learning (DML) step as a means of weighing various appearance and geometric segmentation cues, allowing the method to better adapt to the available data. 
The improved method also models the distribution of 3D points in space as a distribution of algebraic distances from an ellipsoid fitted to the object, improving the method's ability to predict which points are likely to belong to the object or the background. Experimental validation of all methods is performed. Each method is evaluated in a realistic setting, utilizing scenarios of various complexities. Experimental results have demonstrated the effectiveness of the handle localization method, and the object segmentation methods.
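    The ellipsoid-distance idea from the improved method, scoring how far each 3D point lies from a quadric surface fitted to the candidate object, can be sketched with a least-squares fit. The axis-aligned quadric form and the synthetic sphere data below are simplifying assumptions for illustration, not the thesis's exact formulation:

```python
import numpy as np

def fit_quadric(points):
    """Least-squares fit of an axis-aligned quadric
    A x^2 + B y^2 + C z^2 + D x + E y + F z = 1 to candidate object points."""
    x, y, z = points.T
    M = np.column_stack([x * x, y * y, z * z, x, y, z])
    coef, *_ = np.linalg.lstsq(M, np.ones(len(points)), rcond=None)
    return coef

def algebraic_distance(points, coef):
    """|Q(p) - 1|: zero on the fitted surface, growing away from it."""
    x, y, z = points.T
    M = np.column_stack([x * x, y * y, z * z, x, y, z])
    return np.abs(M @ coef - 1.0)

# Synthetic scene: points on a unit sphere (the "object") plus a far-away point.
rng = np.random.default_rng(0)
p = rng.normal(size=(200, 3))
obj = p / np.linalg.norm(p, axis=1, keepdims=True)  # projected onto the sphere
coef = fit_quadric(obj)
d_obj = algebraic_distance(obj, coef)
d_bg = algebraic_distance(np.array([[3.0, 0.0, 0.0]]), coef)
assert d_obj.max() < 1e-6 < d_bg[0]  # background point scores far from the surface
```

    In the segmentation method, such distances serve as one geometric cue among several, weighted against appearance cues by the learned distance metric.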

  12. Impact of Spatial Pumping Patterns on Groundwater Management

    NASA Astrophysics Data System (ADS)

    Yin, J.; Tsai, F. T. C.

    2017-12-01

    Challenges exist in managing groundwater resources while maintaining a balance between groundwater quantity and quality, because of anthropogenic pumping activities as well as the complex subsurface environment. In this study, to address the impact of spatial pumping patterns on groundwater management, a mixed integer nonlinear multi-objective model is formulated by integrating three objectives within a management framework: (i) maximize total groundwater withdrawal from potential wells; (ii) minimize total electricity cost for well pumps; and (iii) attain groundwater levels at selected monitoring locations as close as possible to the target level. Binary variables are used in the groundwater management model to control the operative status of pumping wells. The NSGA-II is linked with MODFLOW to solve the multi-objective problem. The proposed method is applied to a groundwater management problem in the complex Baton Rouge aquifer system, southeastern Louisiana. Results show that (a) non-dominated trade-off solutions under various spatial distributions of active pumping wells can be achieved, each optimal with regard to its corresponding objectives; (b) the operative status, locations, and pumping rates of pumping wells significantly influence the distribution of hydraulic head, which in turn influences the optimization results; and (c) a wide range of optimal solutions is obtained, such that decision makers can select the most appropriate solution through negotiation with different stakeholders. This technique is beneficial for finding the optimal extent to which the three objectives, covering water supply, energy, and subsidence concerns, can be balanced.
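    The non-dominated ("trade-off") solutions that NSGA-II returns are those not beaten on every objective by some other candidate. A minimal Pareto-front filter illustrates the concept; the (cost, head-deviation) pairs are hypothetical stand-ins for the management model's outputs:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better
    in at least one (all objectives minimized here)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset, i.e. NSGA-II's first front."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical (electricity cost, head deviation) pairs for pumping schedules;
# maximizing withdrawal would enter as a negated third objective.
candidates = [(3.0, 1.0), (2.0, 2.0), (4.0, 0.5), (3.5, 1.5)]
front = pareto_front(candidates)
assert (3.5, 1.5) not in front  # dominated by (3.0, 1.0) on both objectives
assert set(front) == {(3.0, 1.0), (2.0, 2.0), (4.0, 0.5)}
```

    NSGA-II adds fast non-dominated sorting and crowding-distance selection on top of this dominance test, with MODFLOW evaluating the head response for each candidate pumping pattern.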

  13. Heavy metal distributions in Peru Basin surface sediments in relation to historic, present and disturbed redox environments

    NASA Astrophysics Data System (ADS)

    Koschinsky, Andrea

    Heavy metal distributions in deep-sea surface sediments and pore water profiles from five areas in the Peru Basin were investigated with respect to the redox environment and diagenetic processes in these areas. The 10-20-cm-thick Mn oxide-rich and minor metal-rich top layer is underlain by an increase in dissolved Mn and Ni concentrations resulting from the reduction of the MnO2 phase below the oxic zone. The mobilised associated metals like Co, Zn and Cu are partly immobilised by sorption on clay, organic or Fe compounds in the post-oxic environment. Enrichment of dissolved Cu, Zn, Ni, Co, Pb, Cd, Fe and V within the upper 1-5 cm of the oxic zone can be attributed to the degradation of organic matter. In a core from one area at around 22-25 cm depth, striking enrichments of these metals in dissolved and solid forms were observed. Offset distributions between oxygen penetration and Mn reduction and the thickness of the Mn oxide-rich layer indicate fluctuations of the Mn redox boundary on a short-term time scale. Within the objectives of the German ATESEPP research programme, the effect of an industrial impact such as manganese nodule mining on the heavy metal cycle in the surface sediment was considered. If the oxic surface were to be removed or disturbed, oxygen would penetrate deep into the formerly suboxic sediment and precipitate Mn2+ and metals like Ni and Co which are preferably scavenged by MnO2. The solid enrichments of Cd, V, and other metals formed in post-oxic environments would move downward with the new redox boundary until a new equilibrium between oxygen diffusion and consumption is reached.

  14. Early Results from the Lunar Atmosphere and Dust Environment Explorer (LADEE)

    NASA Technical Reports Server (NTRS)

    Elphic, R. C.; Hine, B.; Delory, G. T.; Mahaffy, Paul; Benna, Mehdi; Horanyi, Mihaly; Colaprete, Anthony; Noble, Sarah

    2014-01-01

    On 6 September, 2013, a near-perfect launch of the first Minotaur V rocket successfully carried NASA's Lunar Atmosphere and Dust Environment Explorer (LADEE) into a high-eccentricity geocentric orbit. After 30 days of phasing, LADEE arrived at the Moon on 6 October, 2013. LADEE's science objectives are twofold: (1) Determine the composition of the lunar atmosphere, investigate processes controlling its distribution and variability, including sources, sinks, and surface interactions; (2) Characterize the lunar exospheric dust environment, measure its spatial and temporal variability, and effects on the lunar atmosphere, if any. After a successful commissioning phase, the three science instruments have made systematic observations of the lunar dust and exospheric environment. These include initial observations of argon, neon and helium exospheres, and their diurnal variations; the lunar micrometeoroid impact ejecta cloud and its variations; spatial and temporal variations of the sodium exosphere; and the search for sunlight extinction caused by dust. LADEE also made observations of the effects of the Chang'e 3 landing on 14 December 2013.

  15. Background Noises Versus Intraseasonal Variation Signals: Small vs. Large Convective Cloud Objects From CERES Aqua Observations

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2015-01-01

    During inactive phases of the Madden-Julian Oscillation (MJO), there are plenty of deep but small convective systems and far fewer deep and large ones. During active phases of the MJO, an increase in the occurrence of large and deep cloud clusters results from an amplification of large-scale motions by stronger convective heating. This study is designed to quantitatively examine the roles of small and large cloud clusters during the MJO life cycle. We analyze the cloud object data from Aqua CERES (Clouds and the Earth's Radiant Energy System) observations between July 2006 and June 2010 for tropical deep convective (DC) and cirrostratus (CS) cloud object types according to the real-time multivariate MJO index, which assigns the tropics to one of the eight MJO phases each day. A cloud object is a contiguous region of the earth with a single dominant cloud-system type. The criteria for defining these cloud types are overcast footprints and cloud top pressures less than 400 hPa, but DC has higher cloud optical depths (≥10) than those of CS (<10). The size distributions, defined as the footprint numbers as a function of cloud object diameter, for particular MJO phases depart greatly from the combined (8-phase) distribution at large cloud-object diameters, due to the reduced/increased numbers of cloud objects related to changes in the large-scale environments. The median diameter corresponding to the combined distribution is determined and used to partition all cloud objects of a particular phase into "small" and "large" groups. The two groups corresponding to the combined distribution have nearly equal numbers of footprints. The median diameters are 502 km for DC and 310 km for CS. The range of variation between two extreme phases (typically, the most active and most depressed phases) for the small group is 6-11% in terms of the numbers of cloud objects and the total footprint numbers. The corresponding range for the large group is 19-44%. In terms of the probability density functions of radiative and cloud physical properties, there are virtually no differences between the MJO phases for the small group, but there are significant differences for the large group for both DC and CS types. These results suggest that the intraseasonal variation signals reside in the large cloud clusters, while the small cloud clusters represent background noise resulting from various types of tropical waves with different wavenumbers and propagation speeds/directions.
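    The partition step, splitting each cloud-object population at the diameter that divides the total footprint count roughly in half, can be sketched as follows; the (diameter, footprint) pairs are hypothetical, not CERES values:

```python
def median_diameter(objects):
    """Smallest diameter at which the cumulative footprint count reaches half
    the total: the 'median' used to split objects into small and large groups."""
    total = sum(n for _, n in objects)
    running = 0
    for diameter, footprints in sorted(objects):
        running += footprints
        if running * 2 >= total:
            return diameter

# Hypothetical (diameter_km, footprint_count) pairs for one cloud-object type.
objs = [(100, 10), (300, 30), (500, 40), (900, 20)]
m = median_diameter(objs)
small = [o for o in objs if o[0] <= m]
large = [o for o in objs if o[0] > m]
assert m == 500
```

    With real size distributions, the two groups end up with nearly equal footprint counts, which is what makes the phase-to-phase comparison of their variability meaningful.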

  16. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

    The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, AL. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments, and visualization and manipulation methods, for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using a data glove, hand controller, or mouse. These simulated objects are solid or surfaced three-dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  17. Development of a three-dimensional multistage inverse design method for aerodynamic matching of axial compressor blading

    NASA Astrophysics Data System (ADS)

    van Rooij, Michael P. C.

    Current turbomachinery design systems increasingly rely on multistage Computational Fluid Dynamics (CFD) as a means to assess performance of designs. However, design weaknesses attributed to improper stage matching are addressed using often ineffective strategies involving a costly iterative loop between blading modification, revision of design intent, and evaluation of aerodynamic performance. A design methodology is presented which greatly improves the process of achieving design-point aerodynamic matching. It is based on a three-dimensional viscous inverse design method which generates the blade camber surface based on prescribed pressure loading, thickness distribution and stacking line. This inverse design method has been extended to allow blading analysis and design in a multi-blade row environment. Blade row coupling was achieved through a mixing plane approximation. Parallel computing capability in the form of MPI has been implemented to reduce the computational time for multistage calculations. Improvements have been made to the flow solver to reach the level of accuracy required for multistage calculations. These include inclusion of heat flux, temperature-dependent treatment of viscosity, and improved calculation of stress components and artificial dissipation near solid walls. A validation study confirmed that the obtained accuracy is satisfactory at design point conditions. Improvements have also been made to the inverse method to increase robustness and design fidelity. These include the possibility to exclude spanwise sections of the blade near the endwalls from the design process, and a scheme that adjusts the specified loading area for changes resulting from the leading and trailing edge treatment. Furthermore, a pressure loading manager has been developed. Its function is to automatically adjust the pressure loading area distribution during the design calculation in order to achieve a specified design objective. 
Possible objectives are overall mass flow and compression ratio, and radial distribution of exit flow angle. To supplement the loading manager, mass flow inlet and exit boundary conditions have been implemented. Through appropriate combination of pressure or mass flow inflow/outflow boundary conditions and loading manager objectives, increased control over the design intent can be obtained. The three-dimensional multistage inverse design method with pressure loading manager was demonstrated to offer greatly enhanced blade row matching capabilities. Multistage design allows for simultaneous design of blade rows in a mutually interacting environment, which permits the redesigned blading to adapt to changing aerodynamic conditions resulting from the redesign. This ensures that the obtained blading geometry and performance implied by the prescribed pressure loading distribution are consistent with operation in the multi-blade row environment. The developed methodology offers high aerodynamic design quality and productivity, and constitutes a significant improvement over existing approaches used to address design-point aerodynamic matching.
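The mixing-plane coupling used for blade-row interfaces replaces circumferentially varying flow quantities with their pitchwise averages in each spanwise band. A generic sketch of that averaging step (not the author's solver; the band count and inputs are illustrative assumptions):

```python
import numpy as np

def mixing_plane_average(span, values, n_bands=4):
    """Pitchwise (circumferential) averaging at a blade-row interface:
    within each spanwise band, replace the varying quantity (e.g. total
    pressure at sample points) with its band mean."""
    span = np.asarray(span, float)
    values = np.asarray(values, float)
    edges = np.linspace(span.min(), span.max(), n_bands + 1)
    band = np.clip(np.searchsorted(edges, span, side='right') - 1,
                   0, n_bands - 1)
    out = values.copy()
    for b in range(n_bands):
        mask = band == b
        if mask.any():
            out[mask] = values[mask].mean()
    return out
```

A real mixing plane conserves mass-averaged fluxes rather than simple means, but the band-and-average structure is the same.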

  18. Estimating the center of mass of a free-floating body in microgravity.

    PubMed

    Lejeune, L; Casellato, C; Pattyn, N; Neyt, X; Migeotte, P-F

    2013-01-01

    This paper addresses the issue of estimating the position of the center of mass (CoM) of a free-floating object of unknown mass distribution in microgravity using a stereoscopic imaging system. The method presented here is applied to an object of known mass distribution for validation purposes. In the context of a study of 3-dimensional ballistocardiography in microgravity, and the elaboration of a physical model of the cardiovascular adaptation to weightlessness, the hypothesis that the fluid shift towards the head of astronauts induces a significant shift of their CoM needs to be tested. The experiments were conducted during the 57th parabolic flight campaign of the European Space Agency (ESA). At the beginning of the microgravity phase, the object was given an initial translational and rotational velocity. A 3D point cloud corresponding to the object was then generated, to which a motion-based method inspired by rigid body physics was applied. Through simulations, the effects of the centroid-to-CoM distance and the number of frames of the sequence are investigated. In experimental conditions, given the significant residual accelerations of the airplane during the microgravity phases, the CoM estimation errors (16 to 76 mm) were consistent with simulations. Overall, our results suggest that the method has a good potential for its later generalization to a free-floating human body in a weightless environment.
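The motion-based idea can be sketched as follows: in microgravity the CoM travels in a straight line, so given rigid-body poses (R_t, t_t) recovered from registering the point cloud across frames, the body-frame CoM is the point whose world trajectory is best fit by uniform motion. A linear least-squares sketch under those assumptions (not the authors' implementation):

```python
import numpy as np

def estimate_com(rotations, translations, times):
    """Estimate the body-frame CoM x of a free-floating rigid body.

    In microgravity the CoM moves at constant velocity, so we solve
    R_t x + t_t = c0 + v * tau_t jointly for (x, c0, v) by stacking
    three linear equations per frame and using least squares."""
    rows, rhs = [], []
    for R, t, tau in zip(rotations, translations, times):
        # R_t x - c0 - tau * v = -t_t
        rows.append(np.hstack([R, -np.eye(3), -tau * np.eye(3)]))
        rhs.append(-np.asarray(t, float))
    sol, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs),
                              rcond=None)
    return sol[:3]  # sol[3:6] = c0, sol[6:9] = v
```

The solution is only unique if the rotation axis varies over the sequence; rotation about a single fixed axis leaves the CoM component along that axis unobservable.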

  19. CQPSO scheduling algorithm for heterogeneous multi-core DAG task model

    NASA Astrophysics Data System (ADS)

    Zhai, Wenzheng; Hu, Yue-Li; Ran, Feng

    2017-07-01

    Efficient task scheduling is critical for achieving high performance in a heterogeneous multi-core computing environment. The paper focuses on the heterogeneous multi-core directed acyclic graph (DAG) task model and proposes a novel task scheduling method based on an improved chaotic quantum-behaved particle swarm optimization (CQPSO) algorithm. A task priority scheduling list was built, and the processor with the minimum cumulative earliest finish time (EFT) was selected as the target of the first task assignment. The task precedence relationships were satisfied and the total execution time of all tasks was minimized. The experimental results show that the proposed algorithm offers strong optimization ability, simplicity and feasibility, and fast convergence, and can be applied to task scheduling optimization in other heterogeneous and distributed environments.
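For contrast with the CQPSO search, the EFT idea itself can be illustrated with a plain greedy list scheduler over a DAG with per-processor costs; the cost model and names below are illustrative assumptions, not the paper's formulation:

```python
def eft_schedule(tasks, deps, cost, comm):
    """Greedy earliest-finish-time list scheduling on heterogeneous cores.

    tasks: task names in a valid topological order
    deps:  dict task -> list of (parent, data_units) edges
    cost:  dict (task, proc) -> execution time on that processor
    comm:  communication cost per data unit between different processors
    """
    procs = sorted({p for (_, p) in cost})
    proc_free = {p: 0.0 for p in procs}
    finish, placed = {}, {}
    for t in tasks:
        best = None
        for p in procs:
            # a task may start once the processor is free and all parent
            # results have arrived (zero transfer cost on the same proc)
            ready = proc_free[p]
            for parent, data in deps.get(t, []):
                arrive = finish[parent] + (0 if placed[parent] == p
                                           else comm * data)
                ready = max(ready, arrive)
            eft = ready + cost[(t, p)]
            if best is None or eft < best[0]:
                best = (eft, p)
        finish[t], placed[t] = best
        proc_free[best[1]] = best[0]
    return finish, placed
```

Metaheuristics like CQPSO search over assignments that a greedy rule like this would never consider, trading runtime for shorter makespans.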

  20. Microgravity effects on water flow and distribution in unsaturated porous media: Analyses of flight experiments

    NASA Astrophysics Data System (ADS)

    Jones, Scott B.; Or, Dani

    1999-04-01

    Plants grown in porous media are part of a bioregenerative life support system designed for long-duration space missions. Reduced gravity conditions of orbiting spacecraft (microgravity) alter several aspects of liquid flow and distribution within partially saturated porous media. The objectives of this study were to evaluate the suitability of conventional capillary flow theory in simulating water distribution in porous media measured in a microgravity environment. Data from experiments aboard the Russian space station Mir and a U.S. space shuttle were simulated by elimination of the gravitational term from the Richards equation. Qualitative comparisons with media hydraulic parameters measured on Earth suggest narrower pore size distributions and inactive or nonparticipating large pores in microgravity. Evidence of accentuated hysteresis, altered soil-water characteristic, and reduced unsaturated hydraulic conductivity from microgravity simulations may be attributable to a number of proposed secondary mechanisms. These are likely spawned by enhanced and modified paths of interfacial flows and an altered force ratio of capillary to body forces in microgravity.
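The gravity-free simulations described above amount to dropping the gravitational (body-force) term from the Richards equation; in standard notation (not necessarily the authors' exact formulation):

```latex
% Richards equation for unsaturated flow on Earth:
%   theta = volumetric water content, h = matric head,
%   K(h) = unsaturated hydraulic conductivity, z = elevation
\frac{\partial \theta}{\partial t}
  = \nabla \cdot \big[ K(h)\, \nabla h \big]
  + \frac{\partial K(h)}{\partial z}

% Microgravity form: the body-force term dK/dz is eliminated,
% leaving purely capillary-driven (matric-potential) flow:
\frac{\partial \theta}{\partial t}
  = \nabla \cdot \big[ K(h)\, \nabla h \big]
```

With gravity removed, water redistribution is driven entirely by matric-potential gradients, which is why hysteresis and pore-size effects become more prominent in the flight data.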

  1. LED-CT Scan for pH Distribution on a Cross-Section of Cell Culture Medium.

    PubMed

    Higashino, Nobuya; Takayama, Toshio; Ito, Hiroaki; Horade, Mitsuhiro; Yamaguchi, Yasutaka; Dylan Tsai, Chia-Hung; Kaneko, Makoto

    2018-01-11

    In cell culture, the pH of the culture medium is one of the most important conditions. However, the culture medium may develop a non-uniform pH distribution due to the activities of cells and changes in the environment. Although it is possible to measure the pH distribution with an existing pH meter using distributed electrodes, that method involves direct contact with the medium and would greatly increase the risk of contamination. In this paper, we propose a computed tomography (CT) scan for measuring pH distribution using the color change of phenol red with a light-emitting diode (LED) light source. Using the principle of CT scanning, we can measure the pH distribution without contacting the culture medium and thus decrease the risk of contamination. We developed a device with an LED, an array of photo receivers, and a rotation mechanism. The system was first calibrated with opaque wooden objects of different shapes, and we succeeded in obtaining their 3D topographies. The system was also used to measure a culture medium with two different pH values, and it was possible to obtain a pH distribution that clearly shows the boundary.
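The inversion principle behind such a scanner can be illustrated with a tiny algebraic reconstruction (Kaczmarz/ART) example: recover a cross-sectional image from sums along rays. This is a generic sketch of CT-style reconstruction, not the authors' phenol-red absorbance pipeline, and the 4×4 grid is invented:

```python
import numpy as np

def kaczmarz(A, b, sweeps=1000):
    """Algebraic reconstruction technique: cyclically project the current
    estimate onto the hyperplane of each ray measurement a_i . x = b_i."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for a, bi in zip(A, b):
            x += (bi - a @ x) / (a @ a) * a
    return x

# toy 4x4 "cross-section": each ray is a 0/1 mask over pixels, and the
# measurements are sums of the unknown image along rows, columns and
# both diagonals (a crude stand-in for LED line integrals)
rays = []
for i in range(4):
    m = np.zeros((4, 4)); m[i, :] = 1; rays.append(m.ravel())
for j in range(4):
    m = np.zeros((4, 4)); m[:, j] = 1; rays.append(m.ravel())
rays.append(np.eye(4).ravel())
rays.append(np.fliplr(np.eye(4)).ravel())
A = np.array(rays)
truth = np.arange(16.0)
b = A @ truth
recon = kaczmarz(A, b)
```

With only 10 rays for 16 unknowns the system is underdetermined, so the reconstruction matches the measurements rather than the exact image; real scanners use many more projection angles.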

  2. Spatial and temporal variability of soil moisture on the field with and without plants*

    NASA Astrophysics Data System (ADS)

    Usowicz, B.; Marczewski, W.; Usowicz, J. B.

    2012-04-01

    Spatial and temporal variability of the natural environment is its inherent and unavoidable feature, and every element of the environment is characterized by its own variability. One kind of variability in the natural environment is the variability of the soil environment. To acquire better and deeper knowledge and understanding of the temporal and spatial variability of the physical, chemical and biological features of the soil environment, we should determine the causes that induce a given variability. Relatively stable features of soil include its texture and mineral composition; examples of features variable in time are the soil pH and organic matter content; features with strong dynamics are the soil temperature and moisture content. The aim of this study was to identify the variability of soil moisture in a field with and without plants using geostatistical methods. The soil moisture measurements were taken on plots with plant canopy and on plots without plants (as a reference). The measurements of soil moisture and meteorological components were taken within the period of April-July. The TDR moisture sensors covered 5 cm soil layers and were installed in the plots in the soil layers of 0-0.05, 0.05-0.1, 0.1-0.15, 0.2-0.25, 0.3-0.35, 0.4-0.45, 0.5-0.55, 0.8-0.85 m. Measurements of soil moisture were taken once a day, in the afternoon hours. For the determination of reciprocal correlation, precipitation data and data from soil moisture measurements with the TDR meter were used. Calculations of reciprocal correlation of precipitation and soil moisture at various depths were made for three objects - spring barley, rye, and bare soil - at the significance level of p<0.05. No significant reciprocal correlation was found between the precipitation and soil moisture in the soil profile for any of the objects studied.
Although the correlation analysis indicates a lack of correlation between the variables under consideration, observation of the soil moisture runs in particular objects and of the precipitation distribution shows clearly that rainfall has an effect on soil moisture. The amount of precipitation water that increased the soil moisture depended on the strength of the rainfall, on the hydrological properties of the soil (primarily the soil density), the status of the plant cover, and surface runoff. Based on the precipitation distribution and on the soil moisture runs, an attempt was made to find a temporal and spatial relationship between those variables, employing geostatistical methods, which permit time and space to be included in the analysis. The geostatistical parameters determined showed the temporal dependence of moisture distribution in the soil profile, with the autocorrelation radius increasing with increasing depth in the profile. The highest values of the radius were observed in the plots with plant cover below the arable horizon, and the lowest in the arable horizon on the barley and fallow plots. The fractal dimensions showed a clear decrease in values with increasing depth in the plots with plant cover, while in the bare plots they were relatively constant within the soil profile under study. This indicates that the temporal distribution of soil moisture within the soil profile in the bare field was more random in character than in the plots with plants. The results obtained and the analyses indicate that the moisture in the soil profile and its variability are significantly affected by the type and condition of the plant canopy. The differentiation in moisture content between the plots studied resulted from different precipitation interception and different intensity of water uptake by the roots.
* The work was financially supported in part by the ESA Programme for European Cooperating States (PECS), No.98084 "SWEX-R, Soil Water and Energy Exchange/Research", AO-3275.
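The temporal-dependence analysis above relies on the (semi)variogram. A minimal sketch of the empirical Matheron estimator for 1-D (e.g., time-indexed) data, with hypothetical inputs:

```python
import numpy as np

def semivariogram(positions, values, lags, tol):
    """Empirical (Matheron) semivariogram: for each lag h, average
    0.5 * (z_i - z_j)^2 over pairs whose separation is within tol of h."""
    positions = np.asarray(positions, float)
    values = np.asarray(values, float)
    d = np.abs(positions[:, None] - positions[None, :])
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        # upper triangle only, so each pair is counted once
        mask = np.triu(np.abs(d - h) <= tol, k=1)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)
```

An autocorrelation radius like the one reported in the study corresponds to the lag at which such a curve levels off at its sill.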

  3. Structure of massive star forming clumps from the Red MSX Source Survey

    NASA Astrophysics Data System (ADS)

    Figura, Charles C.; Urquhart, J. S.; Morgan, L.

    2014-01-01

    We present ammonia (1,1) and (2,2) emission maps of 61 high-mass star forming regions drawn from the Red MSX Source (RMS) Survey and observed with the Green Bank Telescope's K-Band Focal Plane Array. We use these observations to investigate the spatial distribution of the environmental conditions associated with this sample of embedded massive young stellar objects (MYSOs). Ammonia is an excellent high-density tracer of star-forming regions, as its hyperfine structure allows relatively simple characterisation of the molecular environment. These maps are used to measure the column density, kinetic gas temperature distribution and velocity structure across these regions. We compare the distribution of these properties to that of the associated dust and mid-infrared emission traced by the ATLASGAL 870 micron emission maps and the Spitzer GLIMPSE IRAC images. We present a summary of these results and highlight some of the more interesting findings.

  4. Hierarchical representation of shapes in visual cortex—from localized features to figural shape segregation

    PubMed Central

    Tschechne, Stephan; Neumann, Heiko

    2014-01-01

    Visual structures in the environment are segmented into image regions, and these are combined into a representation of surfaces and prototypical objects. Such a perceptual organization is performed by complex neural mechanisms in the visual cortex of primates. Multiple mutually connected areas in the ventral cortical pathway receive visual input and extract local form features that are subsequently grouped into increasingly complex, more meaningful image elements. Such a distributed processing network must be able to make accessible both highly articulated changes in shape boundary and very subtle curvature changes that contribute to the perception of an object. We propose a recurrent computational network architecture that utilizes hierarchical distributed representations of shape features to encode surface and object boundary over different scales of resolution. Our model makes use of neural mechanisms that model the processing capabilities of early and intermediate stages in visual cortex, namely areas V1–V4 and IT. We suggest that multiple specialized component representations interact by feedforward hierarchical processing that is combined with feedback signals driven by representations generated at higher stages. Based on this, global configurational as well as local information is made available to distinguish changes in the object's contour. Once the outline of a shape has been established, contextual contour configurations are used to assign border-ownership directions and thus achieve segregation of figure and ground. The model thus proposes how separate mechanisms contribute to a distributed hierarchical cortical shape representation and combine with processes of figure-ground segregation. Our model is probed with a selection of stimuli to illustrate processing results at different processing stages. We especially highlight how modulatory feedback connections contribute to the processing of visual input at various stages in the processing hierarchy. 
PMID:25157228

  5. Detecting Inspection Objects of Power Line from Cable Inspection Robot LiDAR Data

    PubMed Central

    Qin, Xinyan; Wu, Gongping; Fan, Fei

    2018-01-01

    Power lines are extending into complex environments (e.g., lakes and forests), and the distribution of power lines on a tower is becoming complicated (e.g., multi-loop and multi-bundle). As a result, the task of power line inspection is becoming heavier and more difficult. Advanced LiDAR technology is increasingly being used to address these difficulties. Based on precise cable inspection robot (CIR) LiDAR data and the distinctive position and orientation system (POS) data, we propose a novel methodology to detect inspection objects surrounding power lines. The proposed method mainly includes four steps: firstly, the original point cloud is divided into single-span data as a processing unit; secondly, an optimal elevation threshold is constructed to remove ground points without an existing filtering algorithm, improving data processing efficiency and extraction accuracy; thirdly, a single power line and its surrounding data can be respectively extracted by a structured partition based on POS data (SPPD) algorithm from “layer” to “block” according to power line distribution; finally, a partition recognition method is proposed based on the distribution characteristics of inspection objects, highlighting the feature information and improving the recognition effect. Local neighborhood statistics and the 3D region growing method are used to recognize different inspection objects surrounding power lines in a partition. Three datasets were collected by two CIR LiDAR systems in our study. The experimental results demonstrate that an average 90.6% accuracy and an average 98.2% precision at the point cloud level can be achieved. The successful extraction indicates that the proposed method is feasible and promising. Our study can be used to obtain precise dimensions of fittings for modeling, as well as automatic detection and location of security risks, so as to improve the intelligence level of power line inspection. PMID:29690560
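Two of the steps above, elevation-threshold ground removal and 3D region growing, can be sketched in simplified form (a fixed threshold and brute-force neighbor search; the paper's SPPD pipeline chooses its threshold adaptively and works per partition):

```python
import numpy as np
from collections import deque

def remove_ground(points, z_thresh):
    """Drop points below an elevation threshold (a crude stand-in for the
    paper's optimal elevation threshold)."""
    return points[points[:, 2] >= z_thresh]

def region_grow(points, radius):
    """Group points into connected clusters by BFS: any two points within
    `radius` of each other grow the same region (O(n^2) distance checks)."""
    n = len(points)
    labels = -np.ones(n, dtype=int)
    current = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        queue = deque([seed])
        while queue:
            i = queue.popleft()
            d = np.linalg.norm(points - points[i], axis=1)
            for j in np.nonzero((d <= radius) & (labels == -1))[0]:
                labels[j] = current
                queue.append(j)
        current += 1
    return labels
```

Each resulting cluster would then be classified (tower, conductor, vegetation, ...) from its shape statistics.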

  6. Hierarchical representation of shapes in visual cortex-from localized features to figural shape segregation.

    PubMed

    Tschechne, Stephan; Neumann, Heiko

    2014-01-01

    Visual structures in the environment are segmented into image regions, and these are combined into a representation of surfaces and prototypical objects. Such a perceptual organization is performed by complex neural mechanisms in the visual cortex of primates. Multiple mutually connected areas in the ventral cortical pathway receive visual input and extract local form features that are subsequently grouped into increasingly complex, more meaningful image elements. Such a distributed processing network must be able to make accessible both highly articulated changes in shape boundary and very subtle curvature changes that contribute to the perception of an object. We propose a recurrent computational network architecture that utilizes hierarchical distributed representations of shape features to encode surface and object boundary over different scales of resolution. Our model makes use of neural mechanisms that model the processing capabilities of early and intermediate stages in visual cortex, namely areas V1-V4 and IT. We suggest that multiple specialized component representations interact by feedforward hierarchical processing that is combined with feedback signals driven by representations generated at higher stages. Based on this, global configurational as well as local information is made available to distinguish changes in the object's contour. Once the outline of a shape has been established, contextual contour configurations are used to assign border-ownership directions and thus achieve segregation of figure and ground. The model thus proposes how separate mechanisms contribute to a distributed hierarchical cortical shape representation and combine with processes of figure-ground segregation. Our model is probed with a selection of stimuli to illustrate processing results at different processing stages. We especially highlight how modulatory feedback connections contribute to the processing of visual input at various stages in the processing hierarchy.

  7. Detecting Inspection Objects of Power Line from Cable Inspection Robot LiDAR Data.

    PubMed

    Qin, Xinyan; Wu, Gongping; Lei, Jin; Fan, Fei; Ye, Xuhui

    2018-04-22

    Power lines are extending into complex environments (e.g., lakes and forests), and the distribution of power lines on a tower is becoming complicated (e.g., multi-loop and multi-bundle). As a result, the task of power line inspection is becoming heavier and more difficult. Advanced LiDAR technology is increasingly being used to address these difficulties. Based on precise cable inspection robot (CIR) LiDAR data and the distinctive position and orientation system (POS) data, we propose a novel methodology to detect inspection objects surrounding power lines. The proposed method mainly includes four steps: firstly, the original point cloud is divided into single-span data as a processing unit; secondly, an optimal elevation threshold is constructed to remove ground points without an existing filtering algorithm, improving data processing efficiency and extraction accuracy; thirdly, a single power line and its surrounding data can be respectively extracted by a structured partition based on POS data (SPPD) algorithm from "layer" to "block" according to power line distribution; finally, a partition recognition method is proposed based on the distribution characteristics of inspection objects, highlighting the feature information and improving the recognition effect. Local neighborhood statistics and the 3D region growing method are used to recognize different inspection objects surrounding power lines in a partition. Three datasets were collected by two CIR LiDAR systems in our study. The experimental results demonstrate that an average 90.6% accuracy and an average 98.2% precision at the point cloud level can be achieved. The successful extraction indicates that the proposed method is feasible and promising. Our study can be used to obtain precise dimensions of fittings for modeling, as well as automatic detection and location of security risks, so as to improve the intelligence level of power line inspection.

  8. Adaptive particle filter for robust visual tracking

    NASA Astrophysics Data System (ADS)

    Dai, Jianghua; Yu, Shengsheng; Sun, Weiping; Chen, Xiaoping; Xiang, Jinhai

    2009-10-01

    Object tracking plays a key role in the field of computer vision. Particle filters have been widely used for visual tracking under nonlinear and/or non-Gaussian circumstances. In a particle filter, the state transition model for predicting the next location of the tracked object assumes the object motion is invariant, which cannot well approximate the varying dynamics of the motion changes. In addition, the state estimate calculated as the mean of all the weighted particles is coarse or inaccurate due to various noise disturbances. Both factors may degrade tracking performance greatly. In this work, an adaptive particle filter (APF) with a velocity-updating based transition model (VTM) and an adaptive state estimate approach (ASEA) is proposed to improve object tracking. In the APF, the motion velocity embedded in the state transition model is updated continuously by a recursive equation, and the state estimate is obtained adaptively according to the state posterior distribution. The experimental results show that the APF can increase tracking accuracy and efficiency in complex environments.
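The velocity-updating transition idea can be sketched with a 1-D toy filter whose velocity estimate is recursively smoothed from the motion of the state mean. This is a generic illustration under invented parameters, not the paper's APF:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500, obs_std=1.0, alpha=0.7):
    """1-D particle filter whose transition model moves particles by a
    recursively updated velocity estimate (exponential smoothing of the
    state-mean motion) instead of assuming static motion."""
    particles = rng.normal(observations[0], obs_std, n_particles)
    velocity, prev_mean = 0.0, observations[0]
    track = []
    for z in observations:
        # predict: shift particles by the current velocity estimate + noise
        particles = particles + velocity + rng.normal(0, obs_std, n_particles)
        # update: weight particles by Gaussian observation likelihood
        w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        w /= w.sum()
        est = float(np.sum(w * particles))
        # resample (multinomial, for brevity)
        particles = rng.choice(particles, n_particles, p=w)
        # recursive velocity update from the estimated motion
        velocity = alpha * velocity + (1 - alpha) * (est - prev_mean)
        prev_mean = est
        track.append(est)
    return np.array(track)
```

With a static transition model the particles would systematically lag a constantly moving target; the velocity term removes that lag once it has converged.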

  9. The Next Generation Virgo Cluster Survey. VII. The Intrinsic Shapes of Low-luminosity Galaxies in the Core of the Virgo Cluster, and a Comparison with the Local Group

    NASA Astrophysics Data System (ADS)

    Sánchez-Janssen, Rubén; Ferrarese, Laura; MacArthur, Lauren A.; Côté, Patrick; Blakeslee, John P.; Cuillandre, Jean-Charles; Duc, Pierre-Alain; Durrell, Patrick; Gwyn, Stephen; McConnacchie, Alan W.; Boselli, Alessandro; Courteau, Stéphane; Emsellem, Eric; Mei, Simona; Peng, Eric; Puzia, Thomas H.; Roediger, Joel; Simard, Luc; Boyer, Fred; Santos, Matthew

    2016-03-01

    We investigate the intrinsic shapes of low-luminosity galaxies in the central 300 kpc of the Virgo Cluster using deep imaging obtained as part of the Next Generation Virgo Cluster Survey (NGVS). We build a sample of nearly 300 red-sequence cluster members in the yet-unexplored range −14 < M_g < −8 mag, and we measure their apparent axis ratios, q, through Sérsic fits to their two-dimensional light distribution, which is well described by a constant ellipticity parameter. The resulting distribution of apparent axis ratios is then fit by families of triaxial models with normally distributed intrinsic ellipticities, E = 1 − C/A, and triaxialities, T = (A² − B²)/(A² − C²). We develop a Bayesian framework to explore the posterior distribution of the model parameters, which allows us to work directly on discrete data, and to account for individual, surface-brightness-dependent axis ratio uncertainties. For this population we infer a mean intrinsic ellipticity Ē = 0.43 ± 0.02 and a mean triaxiality T̄ = 0.16 (+0.07/−0.06). This implies that faint Virgo galaxies are best described as a family of thick, nearly oblate spheroids with mean intrinsic axis ratios 1:0.94:0.57. The core of Virgo lacks highly elongated low-luminosity galaxies, with 95% of the population having q > 0.45. We additionally attempt a study of the intrinsic shapes of Local Group (LG) satellites of similar luminosities. For the LG population we infer a slightly larger mean intrinsic ellipticity Ē = 0.51 (+0.07/−0.06), and the paucity of objects with round apparent shapes translates into more triaxial mean shapes, 1:0.76:0.49. Numerical studies that follow the tidal evolution of satellites within LG-sized halos are in good agreement with the inferred shape distributions, but the mismatch for faint galaxies in Virgo highlights the need for more adequate simulations of this population in the cluster environment. 
We finally compare the intrinsic shapes of NGVS low-mass galaxies with samples of more massive quiescent systems, and with field, star-forming galaxies of similar luminosities. We find that the intrinsic flattening in this low-luminosity regime is almost independent of the environment in which the galaxy resides, but there is a hint that objects may be slightly rounder in denser environments. The comparable flattening distributions of low-luminosity galaxies that have experienced very different degrees of environmental effects suggest that internal processes are the main drivers of galaxy structure at low masses, with external mechanisms playing a secondary role.
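The mapping from intrinsic to apparent shape is easy to illustrate in the purely oblate limit: a spheroid with intrinsic axis ratio c/a viewed at inclination i shows an apparent ratio q = sqrt(cos²i + (c/a)² sin²i). A Monte Carlo sketch with isotropic viewing directions (the 0.57 echoes the mean ratio inferred above; the triaxial inversion done in the paper is more involved):

```python
import numpy as np

rng = np.random.default_rng(1)

def apparent_axis_ratios(c_over_a, n=100_000):
    """Apparent axis ratios of an oblate spheroid seen from isotropically
    distributed directions: cos(i) is uniform on [0, 1], and
    q = sqrt(cos^2 i + (c/a)^2 sin^2 i)."""
    cos_i = rng.uniform(0.0, 1.0, n)
    sin2 = 1.0 - cos_i ** 2
    return np.sqrt(cos_i ** 2 + c_over_a ** 2 * sin2)

q = apparent_axis_ratios(0.57)
```

Inverting the observed q distribution for the intrinsic shape distribution is the (much harder) statistical problem the Bayesian framework above solves.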

  10. Parameter estimation for the 4-parameter Asymmetric Exponential Power distribution by the method of L-moments using R

    USGS Publications Warehouse

    Asquith, William H.

    2014-01-01

    The implementation characteristics of two method of L-moments (MLM) algorithms for parameter estimation of the 4-parameter Asymmetric Exponential Power (AEP4) distribution are studied using the R environment for statistical computing. The objective is to validate the algorithms for general application of the AEP4 using R. An algorithm was introduced in the original study of the L-moments for the AEP4. A second or alternative algorithm is shown to have a larger L-moment-parameter domain than the original. The alternative algorithm is shown to provide reliable parameter production and recovery of L-moments from fitted parameters. A proposal is made for AEP4 implementation in conjunction with the 4-parameter Kappa distribution to create a mixed-distribution framework encompassing the joint L-skew and L-kurtosis domains. The example application provides a demonstration of pertinent algorithms with L-moment statistics and two 4-parameter distributions (AEP4 and the Generalized Lambda) for MLM fitting to a modestly asymmetric and heavy-tailed dataset using R.
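Sample L-moments themselves are straightforward to compute from order statistics via probability-weighted moments; a generic sketch of the standard unbiased estimators (not the R implementation used in the study):

```python
import numpy as np
from math import comb

def sample_lmoments(data):
    """First four sample L-moments, returned as (l1, l2, t3, t4), from the
    probability-weighted moments b_r (Hosking's unbiased estimators)."""
    x = np.sort(np.asarray(data, float))
    n = len(x)
    b = [x.mean()]
    for r in range(1, 4):
        # b_r = n^-1 * sum_i C(i-1, r)/C(n-1, r) * x_(i), i being 1-based
        w = np.array([comb(i - 1, r) / comb(n - 1, r)
                      for i in range(1, n + 1)])
        b.append(float(np.sum(w * x)) / n)
    l1 = b[0]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l1, l2, l3 / l2, l4 / l2
```

The L-skew/L-kurtosis pair (t3, t4) is what locates a dataset inside the joint domain that the proposed Kappa/AEP4 mixed-distribution framework partitions.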

  11. Community health assessment using self-organizing maps and geographic information systems

    PubMed Central

    Basara, Heather G; Yuan, May

    2008-01-01

    Background: From a public health perspective, a healthier community environment correlates with fewer occurrences of chronic or infectious diseases. Our premise is that community health is a non-linear function of environmental and socioeconomic effects that are not normally distributed among communities. The objective was to integrate multivariate data sets representing social, economic, and physical environmental factors to evaluate the hypothesis that communities with similar environmental characteristics exhibit similar distributions of disease. Results: The SOM algorithm used the intrinsic distributions of 92 environmental variables to classify 511 communities into five clusters. SOM-determined clusters were reprojected to geographic space and compared with the distributions of several health outcomes. ANOVA results indicated that the variability between community clusters was significant with respect to the spatial distribution of disease occurrence. Conclusion: Our study demonstrated a positive relationship between environmental conditions and health outcomes in communities using the SOM-GIS method to overcome data and methodological challenges traditionally encountered in public health research. Results demonstrated that community health can be classified using environmental variables and that the SOM-GIS method may be applied to multivariate environmental health studies. PMID:19116020
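A minimal SOM, the core of the SOM-GIS method, is a best-matching-unit search plus a shrinking Gaussian neighborhood update on a small grid of code vectors. The two-blob toy data below stand in for the 92 scaled environmental variables per community; the grid size and schedules are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def train_som(data, grid=(3, 3), epochs=150, lr=0.5, sigma=1.0):
    """Minimal self-organizing map: for each sample, find the best-matching
    unit (BMU) and pull nearby code vectors toward the sample with a
    Gaussian neighborhood whose width and learning rate decay over time."""
    h, w = grid
    codes = rng.normal(size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for epoch in range(epochs):
        frac = epoch / epochs
        rate, sig = lr * (1 - frac), sigma * (1 - 0.9 * frac)
        for x in rng.permutation(data):
            bmu = np.argmin(((codes - x) ** 2).sum(axis=1))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            codes += rate * np.exp(-d2 / (2 * sig ** 2))[:, None] * (x - codes)
    return codes

# hypothetical communities: two well-separated groups in a 2-D feature space
data = np.vstack([rng.normal(0.0, 0.5, (20, 2)),
                  rng.normal(10.0, 0.5, (20, 2))])
codes = train_som(data)
labels = np.array([np.argmin(((codes - x) ** 2).sum(axis=1)) for x in data])
```

In the SOM-GIS workflow, these unit labels are then reprojected onto the communities' geographic locations for comparison against disease maps.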

  12. Measuring the Clustering Around Normal and Dust-Obscured Quasars at z ~ 2 in the Spitzer Extragalactic Representative Volume Survey (SERVS)

    NASA Astrophysics Data System (ADS)

    Jones, Kristen M.; Lacy, M.; Spitzer Extragalactic Representative Volume Survey Team

    2014-01-01

    Little is known about the environments of high redshift quasars, particularly those obscured by dust. Previous work suggests that dust-shrouded (type 2) quasars are at least as common as un-obscured optical (type 1) quasars; therefore, in order to fully understand the role quasars play in the evolutionary history of the universe, we must understand both types of objects. This project seeks to explore the environments in which obscured quasars form. In this poster, we present mid-infrared clustering measurements for a sample of 45 quasars with 1.3 < z < 2.5, a redshift range that is unexplored in the literature. The objects were selected using IRAC multi-color criteria to remove low-redshift starburst and quiescent galaxies, and subsequently had spectroscopy carried out to both obtain redshifts, and to distinguish between type 1 and type 2 quasars; the high-redshift sample presented in this paper is roughly evenly distributed between the two types. We use the SERVS galaxy catalogs to estimate the cross-correlation between each quasar and its surrounding galaxies. The amplitude of this function gives us the richness of the environments in which these quasars are found, and we compare our results with a matched sample with z < 1.3.
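In its crudest form, the cross-correlation amplitude reduces to comparing galaxy counts around each quasar with the expectation from the mean surface density. A flat-sky toy estimator (not the SERVS pipeline, which uses proper correlation-function estimators with masks and randoms):

```python
import numpy as np

def mean_overdensity(quasars, galaxies, radius, area):
    """Toy clustering statistic: mean galaxy count within `radius` of each
    quasar, divided by the count expected from the global mean surface
    density, minus 1. Flat-sky positions; no edge or mask corrections."""
    density = len(galaxies) / area
    expected = density * np.pi * radius ** 2
    counts = [np.sum(np.linalg.norm(galaxies - q, axis=1) <= radius)
              for q in quasars]
    return float(np.mean(counts) / expected - 1.0)
```

A value near zero means the quasar sits in an average field; a significantly positive value indicates a rich environment.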

  13. The Particle Physics Data Grid. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    2002-08-16

    The main objective of the Particle Physics Data Grid (PPDG) project has been to implement and evaluate distributed (Grid-enabled) data access and management technology for current and future particle and nuclear physics experiments. The specific goals of PPDG have been to design, implement, and deploy a Grid-based software infrastructure capable of supporting the data generation, processing and analysis needs common to the physics experiments represented by the participants, and to adapt experiment-specific software to operate in the Grid environment and to exploit this infrastructure. To accomplish these goals, the PPDG focused on the implementation and deployment of several critical services: reliable and efficient file replication service, high-speed data transfer services, multisite file caching and staging service, and reliable and recoverable job management services. The focus of the activity was the job management services and the interplay between these services and distributed data access in a Grid environment. Software was developed to study the interaction between HENP applications and distributed data storage fabric. One key conclusion was the need for a reliable and recoverable tool for managing large collections of interdependent jobs. An attached document provides an overview of the current status of the Directed Acyclic Graph Manager (DAGMan) with its main features and capabilities.
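The "large collections of interdependent jobs" problem that DAGMan addresses can be sketched as dependency-ordered execution with retries (Kahn's topological sort); a toy stand-in for illustration, not DAGMan itself:

```python
from collections import deque

def run_dag(jobs, parents, run, retries=2):
    """Execute interdependent jobs in dependency order, retrying failures.

    jobs:    iterable of job names
    parents: dict job -> list of jobs that must finish first
    run:     callable(job) -> bool, True on success
    """
    indeg = {j: len(parents.get(j, [])) for j in jobs}
    children = {j: [] for j in jobs}
    for j, ps in parents.items():
        for p in ps:
            children[p].append(j)
    ready = deque(j for j in jobs if indeg[j] == 0)
    done, failed = [], []
    while ready:
        j = ready.popleft()
        ok = any(run(j) for _ in range(retries + 1))  # retry until success
        if not ok:
            failed.append(j)  # descendants of a failed job never start
            continue
        done.append(j)
        for c in children[j]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    return done, failed
```

Real DAGMan adds the recoverability aspect as well: a rescue DAG records completed jobs so an interrupted workflow can resume where it stopped.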

  14. Distinguishing spin-aligned and isotropic black hole populations with gravitational waves.

    PubMed

    Farr, Will M; Stevenson, Simon; Miller, M Coleman; Mandel, Ilya; Farr, Ben; Vecchio, Alberto

    2017-08-23

    The direct detection of gravitational waves from merging binary black holes opens up a window into the environments in which binary black holes form. One signature of such environments is the angular distribution of the black hole spins. Binary systems that formed through dynamical interactions between already-compact objects are expected to have isotropic spin orientations (that is, the spins of the black holes are randomly oriented with respect to the orbit of the binary system), whereas those that formed from pairs of stars born together are more likely to have spins that are preferentially aligned with the orbit. The best-measured combination of spin parameters for each of the four likely binary black hole detections GW150914, LVT151012, GW151226 and GW170104 is the 'effective' spin. Here we report that, if the magnitudes of the black hole spins are allowed to extend to high values, the effective spins for these systems indicate a 0.015 odds ratio against an aligned angular distribution compared to an isotropic one. When considering the effect of ten additional detections, this odds ratio decreases to 2.9 × 10^-7 against alignment. The existing preference for either an isotropic spin distribution or low spin magnitudes for the observed systems will be confirmed (or overturned) confidently in the near future.
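
The model comparison described above can be caricatured with a toy Monte Carlo: draw spin magnitudes and tilts under an isotropic and a perfectly aligned model, and compare the predicted densities of the effective spin near zero. The distributional choices below are drastic simplifications for illustration only, not the paper's hierarchical Bayesian analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def chi_eff(a1, a2, cos1, cos2, q=1.0):
    # Effective spin: mass-weighted projection of the two spins onto the
    # orbital angular momentum, for mass ratio q.
    return (a1 * cos1 + q * a2 * cos2) / (1.0 + q)

a1, a2 = rng.uniform(0, 1, n), rng.uniform(0, 1, n)

# Isotropic model: spin tilts uniform in cos(tilt) over [-1, 1].
iso = chi_eff(a1, a2, rng.uniform(-1, 1, n), rng.uniform(-1, 1, n))
# Aligned model: both spins parallel to the orbit (cos(tilt) = 1).
ali = chi_eff(a1, a2, np.ones(n), np.ones(n))

# Histogram-based density of chi_eff under each model.
bins = np.linspace(-1, 1, 81)
p_iso, _ = np.histogram(iso, bins=bins, density=True)
p_ali, _ = np.histogram(ali, bins=bins, density=True)

# Likelihood ratio for an observation with chi_eff just above zero,
# as for the near-zero effective spins of the first LIGO events.
idx = np.searchsorted(bins, 0.0)          # bin starting at chi_eff = 0
odds_aligned_vs_iso = p_ali[idx] / p_iso[idx]
```

The aligned model piles its probability at positive chi_eff, so a near-zero observation is far more probable under the isotropic model, which is the qualitative origin of the odds ratio quoted in the abstract.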

  15. Effect of fiber distribution and realignment on the nonlinear and inhomogeneous mechanical properties of human supraspinatus tendon under longitudinal tensile loading.

    PubMed

    Lake, Spencer P; Miller, Kristin S; Elliott, Dawn M; Soslowsky, Louis J

    2009-12-01

    Tendon exhibits nonlinear stress-strain behavior that may be partly due to movement of collagen fibers through the extracellular matrix. While a few techniques have been developed to evaluate the fiber architecture of other soft tissues, the organizational behavior of tendon under load has not been determined. The supraspinatus tendon (SST) of the rotator cuff is of particular interest for investigation due to its complex mechanical environment and corresponding inhomogeneity. In addition, SST injury occurs frequently with limited success in treatment strategies, illustrating the need for a better understanding of SST properties. Therefore, the objective of this study was to quantitatively evaluate the inhomogeneous tensile mechanical properties, fiber organization, and fiber realignment under load of human SST utilizing a novel polarized light technique. Fiber distributions were found to become more aligned under load, particularly during the low stiffness toe-region, suggesting that fiber realignment may be partly responsible for observed nonlinear behavior. Fiber alignment was found to correlate significantly with mechanical parameters, providing evidence for strong structure-function relationships in tendon. Human SST exhibits complex, inhomogeneous mechanical properties and fiber distributions, perhaps due to its complex loading environment. Surprisingly, histological grade of degeneration did not correlate with mechanical properties.

  16. Design of Distortion-Invariant Optical ID Tags for Remote Identification and Verification of Objects

    NASA Astrophysics Data System (ADS)

    Pérez-Cabré, Elisabet; Millán, María Sagrario; Javidi, Bahram

    Optical identification (ID) tags [1] have a promising future in a number of applications, such as the surveillance of vehicles in transportation, control of restricted areas for homeland security, and item tracking on conveyor belts or in other industrial environments. More specifically, the passive optical ID tag [1] was introduced as an optical code containing a signature (that is, a characteristic image or other relevant information about the object), which permits real-time remote detection and identification. Since their introduction in the literature [1], several contributions have been proposed to increase their usefulness and robustness. To increase security and avoid counterfeiting, the signature was introduced into the optical code as an encrypted function [2-5] following the double-phase encryption technique [6]. Moreover, the optical ID tag was designed so as to achieve tolerance to variations in scale and rotation [2-5]; to do so, the encrypted information was multiplexed and distributed in the optical code following an appropriate topology. Further studies were carried out to analyze the influence of different sources of noise. In some proposals [5, 7], the designed ID tag consists of two optical codes in which the complex-valued encrypted signature is separately encoded in two real-valued functions according to its magnitude and phase distributions. This solution was introduced to overcome difficulties in the readout of complex values in outdoor environments. Recently, the fully phase encryption technique [8] has been proposed to increase the noise robustness of the authentication system.
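
The double-phase (double random phase encoding) technique referenced as [6] applies one random phase mask in the input plane and a second in the Fourier plane, turning the signature into complex white noise that only the two masks can invert. A minimal numpy sketch of that idea (the image size and masks here are arbitrary stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)

def drpe_encrypt(img, phi1, phi2):
    # Double random phase encoding: phase mask phi1 in the spatial domain,
    # phase mask phi2 in the Fourier domain.
    u = np.fft.fft2(img * np.exp(1j * phi1))
    return np.fft.ifft2(u * np.exp(1j * phi2))

def drpe_decrypt(enc, phi1, phi2):
    # Undo the two phase modulations in reverse order.
    u = np.fft.fft2(enc) * np.exp(-1j * phi2)
    return np.fft.ifft2(u) * np.exp(-1j * phi1)

img = rng.random((32, 32))                    # stand-in "signature" image
phi1 = rng.uniform(0, 2 * np.pi, img.shape)
phi2 = rng.uniform(0, 2 * np.pi, img.shape)

enc = drpe_encrypt(img, phi1, phi2)           # complex-valued, noise-like
rec = drpe_decrypt(enc, phi1, phi2).real      # exact recovery with both keys
```

The complex-valued `enc` is exactly what makes outdoor readout awkward, motivating the magnitude/phase split into two real-valued codes described above.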

  17. Optical technologies for the Internet of Things era

    NASA Astrophysics Data System (ADS)

    Ji, Philip N.

    2017-08-01

    Internet of Things (IoT) is a network of interrelated physical objects that can collect and exchange data with one another through embedded electronics, software, and sensors, over the Internet. It extends Internet connectivity beyond traditional networking devices to a diverse range of physical devices and everyday things that use embedded technologies to communicate and interact with the external environment. The IoT brings automation and efficiency improvements to everyday life, business, and society; consequently, IoT applications and markets are growing rapidly. Contrary to the common belief that IoT is only related to wireless technology, optical technologies actually play important roles in the growth of IoT and contribute to its advancement. First, fiber optics provides the backbone for transporting the large amounts of data generated by IoT networks, in the core, metro, and access networks as well as within buildings and the physical objects themselves. Second, optical switching technologies, including all-optical switching and hybrid optical-electrical switching, enable fast and high-bandwidth routing in IoT data processing centers. Third, optical sensing and imaging deliver comprehensive information about multiple physical phenomena by monitoring various optical properties such as intensity, phase, wavelength, frequency, polarization, and spectral distribution. In particular, fiber optic sensors have the advantages of high sensitivity, low latency, and long distributed sensing range; they are also immune to electromagnetic interference and can be deployed in harsh environments. In this paper, the architecture of the IoT is described, and optical technologies and their applications in IoT networks are discussed with practical examples.

  18. Comparing host and target environments for distributed Ada programs

    NASA Technical Reports Server (NTRS)

    Paulk, Mark C.

    1986-01-01

    The Ada programming language provides a means of specifying logical concurrency by using multitasking. Extending the Ada multitasking concurrency mechanism into a physically concurrent distributed environment which imposes its own requirements can lead to incompatibilities. These problems are discussed. Using distributed Ada for a target system may be appropriate, but when using the Ada language in a host environment, a multiprocessing model may be more suitable than retargeting an Ada compiler for the distributed environment. The tradeoffs between multitasking on distributed targets and multiprocessing on distributed hosts are discussed. Comparisons of the multitasking and multiprocessing models indicate different areas of application.

  19. A general-purpose development environment for intelligent computer-aided training systems

    NASA Technical Reports Server (NTRS)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  20. Sharing Data and Analytical Resources Securely in a Biomedical Research Grid Environment

    PubMed Central

    Langella, Stephen; Hastings, Shannon; Oster, Scott; Pan, Tony; Sharma, Ashish; Permar, Justin; Ervin, David; Cambazoglu, B. Barla; Kurc, Tahsin; Saltz, Joel

    2008-01-01

    Objectives To develop a security infrastructure to support controlled and secure access to data and analytical resources in a biomedical research Grid environment, while facilitating resource sharing among collaborators. Design A Grid security infrastructure, called Grid Authentication and Authorization with Reliably Distributed Services (GAARDS), is developed as a key architecture component of the NCI-funded cancer Biomedical Informatics Grid (caBIG™). GAARDS is designed to support, in a distributed environment: 1) efficient provisioning and federation of user identities and credentials; 2) group-based access control, with which resource providers can enforce policies based on community-accepted groups and local groups; and 3) management of a trust fabric so that policies can be enforced based on required levels of assurance. Measurements GAARDS is implemented as a suite of Grid services and administrative tools. It provides three core services: Dorian for management and federation of user identities, Grid Trust Service for maintaining and provisioning a federated trust fabric within the Grid environment, and Grid Grouper for enforcing authorization policies based on both local and Grid-level groups. Results The GAARDS infrastructure is available as a stand-alone system and as a component of the caGrid infrastructure. More information about GAARDS can be accessed at http://www.cagrid.org. Conclusions GAARDS provides a comprehensive system to address the security challenges associated with environments in which resources may be located at different sites, requests to access the resources may cross institutional boundaries, and user credentials are created, managed, and revoked dynamically in a de-centralized manner. PMID:18308979
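
The group-based authorization model above (Grid-level groups plus locally administered groups, checked by the resource provider) can be sketched in a few lines. All group names, users, and the policy table here are invented for illustration; they are not GAARDS or Grid Grouper APIs.

```python
# Hypothetical users and their memberships in Grid-level vs local groups.
GRID_GROUPS = {"alice": {"cabig/researchers"}, "bob": set()}
LOCAL_GROUPS = {"alice": {"osu/imaging"}, "bob": {"osu/imaging"}}

# Resource provider's policy: which groups may read each resource.
POLICY = {
    "tcga-images": {"cabig/researchers"},
    "local-scratch": {"osu/imaging"},
}

def can_read(user, resource):
    # Access is granted if any of the user's groups (federated Grid-level
    # or local) intersects the groups the policy allows for the resource.
    groups = GRID_GROUPS.get(user, set()) | LOCAL_GROUPS.get(user, set())
    return bool(groups & POLICY.get(resource, set()))
```

The point of the two-tier scheme is that `POLICY` can mix community-accepted groups with groups the local site manages itself, without the resource provider having to enumerate individual users.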

  1. From built environment to health inequalities: An explanatory framework based on evidence

    PubMed Central

    Gelormino, Elena; Melis, Giulia; Marietta, Cristina; Costa, Giuseppe

    2015-01-01

    Objective: The Health in All Policies strategy aims to engage every policy domain in health promotion. Socially disadvantaged groups are usually more affected by the potential negative impacts of policies that are not health oriented. The built environment represents an important policy domain and, apart from its housing component, its impact on health inequalities is seldom assessed. Methods: A scoping review of evidence on the built environment and its health equity impact was carried out, searching both urban and medical literature since 2000 and analysing socio-economic inequalities in relation to different components of the built environment. Results: The proposed explanatory framework assumes that key features of the built environment (identified as density, functional mix, and public spaces and services) may influence individual health through their impact on both the natural environment and the social context, as well as on behaviours, and that these effects may be unequally distributed according to the social position of individuals. Conclusion: In general, the links proposed by the framework are well documented in the literature; however, evidence of their impact on health inequalities remains uncertain owing to confounding factors, heterogeneity in study design, and the difficulty of generalizing evidence that remains strongly tied to local contexts. PMID:26844145

  2. AN-CASE NET-CENTRIC modeling and simulation

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Chruscicki, Mary Carol; Turck, Kurt

    2009-05-01

    The objective of mission training exercises is to immerse trainees in an environment that enables them to train as they would fight. The integration of modeling and simulation environments that can seamlessly leverage live systems and virtual or constructive models (LVC) as they become available offers a flexible and cost-effective way to extend the "war-gaming" environment into a realistic mission experience while evolving the development of the net-centric enterprise. From concept to full production, the impact of new capabilities on the infrastructure and concept of operations can be assessed in the context of the enterprise, while also exposing those capabilities to the warfighter. Training is extended to tomorrow's tools, processes, and Tactics, Techniques and Procedures (TTPs). This paper addresses the challenges of a net-centric modeling and simulation environment that is capable of representing a net-centric enterprise. An overview of the Air Force Research Laboratory's (AFRL) Airborne Networking Component Architecture Simulation Environment (AN-CASE) is provided, as well as a discussion of how it is being used to assess technologies for the purpose of experimenting with new infrastructure mechanisms that enhance the scalability and reliability of the distributed mission operations environment.

  3. Expressing Parallelism with ROOT

    NASA Astrophysics Data System (ADS)

    Piparo, D.; Tejedor, E.; Guiraud, E.; Ganis, G.; Mato, P.; Moneta, L.; Valls Pla, X.; Canal, P.

    2017-10-01

    The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.
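
The MultiProc pattern described above (fork workers, process independent chunks, merge the partial results) is explicitly compared with Python's own multiprocessing module, so a rough analogue can be sketched with the standard library. The "fill a histogram per chunk" workload and all names below are illustrative only, not ROOT's API.

```python
import multiprocessing as mp
from collections import Counter

def fill_histogram(chunk):
    # Each worker fills its own partial histogram from one chunk of "events".
    return Counter(value % 10 for value in chunk)

def parallel_histogram(events, n_workers=4):
    # Split the events across workers, map, then merge (reduce) the partials,
    # mirroring the fork-process-merge flow of a MultiProc-style run.
    chunks = [events[i::n_workers] for i in range(n_workers)]
    with mp.Pool(n_workers) as pool:
        partials = pool.map(fill_histogram, chunks)
    return sum(partials, Counter())

if __name__ == "__main__":
    hist = parallel_histogram(list(range(1000)))
```

The merge step is where frameworks differ most: ROOT objects need the thread/process-safe building blocks the paper discusses, whereas plain `Counter` objects merge trivially.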

  4. Radioactive contamination in the marine environment adjacent to the outfall of the radioactive waste treatment plant at ATOMFLOT, northern Russia.

    PubMed

    Brown, J E; Nikitin, A; Valetova, N K; Chumichev, V B; Katrich, I Yu; Berezhnoy, V I; Pegoev, N N; Kabanov, A I; Pichugin, S N; Vopiyashin, Yu Ya; Lind, B; Grøttheim, S; Sickel, M; Strand, P

    2002-01-01

    RTP "ATOMFLOT" is a civilian nuclear icebreaker base located on the Kola Bay of northwest Russia. The objectives of this study were to determine the distributions of man-made radionuclides in the marine environment adjacent to the base, to explain the form of the distributions in sediments, and to derive information concerning the fate of radionuclides discharged from ATOMFLOT. Mean activity concentrations (dry weight) for surface sediment of 63 Bq kg^-1 for 137Cs, 5.8 Bq kg^-1 for 90Sr, and 0.45 Bq kg^-1 for 239,240Pu were measured. Filtered seawater activity levels were in the range of 3-6.9 Bq m^-3 for 137Cs, 2.0-11.2 Bq m^-3 for 90Sr, and 16-40 mBq m^-3 for 239,240Pu. Short-lived radionuclides were present at sediment depths in excess of 10 cm, indicating a high degree of sediment mixing. Correlations of radionuclide activity concentrations with grain size appear to be absent; instead, the presence of relatively contaminated sediment appears to be related to the existence of radioactive particles.

  5. Development and application of a species sensitivity distribution for temperature-induced mortality in the aquatic environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Murk, Albertinka J; Smit, Mathijs G D

    2008-12-01

    Current European legislation has static water quality objectives for temperature effects, based on the most sensitive species. In the present study a species sensitivity distribution (SSD) for elevated temperatures is developed on the basis of temperature sensitivity data (mortality) of 50 aquatic species. The SSD applies to risk assessment of heat discharges that are localized in space or time. As collected median lethal temperatures (LT50 values) for different species depend on the acclimation temperature, the SSD is also a function of the acclimation temperature. Data from a thermal discharge in The Netherlands are used to show the applicability of the developed SSD in environmental risk assessment. Although restrictions exist in the application of the developed SSD, it is concluded that the SSD approach can be applied to assess the effects of elevated temperature. Application of the concept of SSD to temperature changes allows harmonization of environmental risk assessment for stressors in the aquatic environment. When a synchronization of the assessment methods is achieved, the steps to integration of risks from toxic and nontoxic stressors can be made.
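
The SSD construction described above amounts to fitting a distribution to per-species thermal tolerance values and reading off the potentially affected fraction of species at a given temperature. The LT50 values and the normal-distribution choice below are hypothetical stand-ins for the paper's 50-species, acclimation-conditioned dataset.

```python
import math
import numpy as np

# Hypothetical LT50 values (deg C) for ten species; the study fitted an
# SSD to mortality data for 50 aquatic species.
lt50 = np.array([24.0, 26.5, 28.0, 29.5, 30.0, 31.5, 33.0, 34.5, 36.0, 38.0])
mu, sigma = lt50.mean(), lt50.std(ddof=1)

def fraction_affected(temp_c):
    # Normal-CDF SSD: fraction of species whose LT50 lies below temp_c,
    # i.e. the potentially affected fraction at that water temperature.
    return 0.5 * (1.0 + math.erf((temp_c - mu) / (sigma * math.sqrt(2.0))))

# Temperature at which 5% of species are potentially affected
# (a heat analogue of the HC5 used for toxicants; z(0.05) ~ -1.645).
hc5 = mu - 1.645 * sigma
```

Because collected LT50 values shift with acclimation temperature, the fitted `mu` and `sigma` would themselves be functions of acclimation in the full analysis.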

  6. Effective user management with high strength crypto -key in dynamic group environment in cloud

    NASA Astrophysics Data System (ADS)

    Kumar, P. J.; Suganya, P.; Karthik, G.

    2017-11-01

    Cloud clusters consist of various collections of files that are accessed by multiple users of the cloud. The users are managed as groups, and the association of a user with a particular group is dynamic in nature. Every group has a manager who handles the membership of a user in a particular group by issuing keys for encryption and decryption. Because of this dynamic behavior, a user may leave the group at any time, yet may still attempt to access a file maintained by that group; key distribution therefore becomes a critical issue. Existing techniques for managing group users with respect to security and key distribution are investigated, with the objective of identifying opportunities to strengthen the security and key management schemes in the cloud. The use of various key combinations to measure the strength of security and the efficiency of user management in a dynamic cloud environment is also investigated.
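
The core requirement, that a departed member's old key must not decrypt data encrypted after the membership change, can be sketched with a toy re-keying group manager. The scheme, class names, and the XOR "cipher" below are invented simplifications for illustration; a real system would use an authenticated cipher and a proper key-distribution protocol.

```python
import hashlib
import secrets

class GroupManager:
    def __init__(self):
        self.members = set()
        self._rekey()

    def _rekey(self):
        # Fresh random epoch key on every membership change.
        self.group_key = secrets.token_bytes(32)

    def join(self, user):
        self.members.add(user)
        self._rekey()
        return self.group_key          # distributed to current members only

    def leave(self, user):
        self.members.discard(user)
        self._rekey()                  # old key is useless for new data

def encrypt(key, data):
    # Toy symmetric "encryption": XOR with a hash-derived keystream.
    # Applying it twice with the same key recovers the plaintext.
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % 32] for i, b in enumerate(data))

mgr = GroupManager()
old_key = mgr.join("eve")              # eve joins, receives the epoch key
mgr.leave("eve")                       # eve leaves; manager re-keys
ct = encrypt(mgr.group_key, b"new file")
```

Re-keying on every leave gives forward secrecy for new files; the efficiency question the abstract raises is precisely the cost of redistributing the new key to all remaining members.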

  7. Expressing Parallelism with ROOT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piparo, D.; Tejedor, E.; Guiraud, E.

    The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.

  8. Distributed tactical reasoning framework for intelligent vehicles

    NASA Astrophysics Data System (ADS)

    Sukthankar, Rahul; Pomerleau, Dean A.; Thorpe, Chuck E.

    1998-01-01

    In independent vehicle concepts for the Automated Highway System (AHS), the ability to make competent tactical-level decisions in real time is crucial. Traditional approaches to tactical reasoning typically involve the implementation of large monolithic systems, such as decision trees or finite state machines. However, as the complexity of the environment grows, the unforeseen interactions between components can make modifications to such systems very challenging. For example, changing an overtaking behavior may require several non-local changes to car-following, lane-changing, and gap-acceptance rules. This paper presents a distributed solution to the problem. PolySAPIENT consists of a collection of autonomous modules, each specializing in a particular aspect of the driving task, classified by traffic entities rather than tactical behaviors. Thus, the influence of the vehicle ahead on the available actions is managed by one reasoning object, while the implications of an approaching exit are managed by another. The independent recommendations from these reasoning objects are expressed in the form of votes and vetoes over a 'tactical action space', and are resolved by a voting arbiter. This local independence enables PolySAPIENT reasoning objects to be developed independently, using a heterogeneous implementation. PolySAPIENT vehicles are implemented in the SHIVA tactical highway simulator, whose vehicles are based on the Carnegie Mellon Navlab robots.
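
The vote-and-veto arbitration described above can be sketched as follows. The module behaviors, weights, and action names are invented for illustration; they are not PolySAPIENT's actual interfaces.

```python
# Tactical action space over which reasoning objects vote.
ACTIONS = ["keep_lane", "change_left", "change_right", "brake"]

def car_ahead_module(gap_m):
    # Reasoning object for the vehicle ahead: penalize keeping the lane
    # as the gap shrinks, favor braking; vetoes nothing.
    keep = min(gap_m / 50.0, 1.0)
    votes = {"keep_lane": keep, "change_left": 0.5,
             "change_right": 0.4, "brake": 1.0 - keep}
    return votes, set()

def exit_module(exit_in_m):
    # Reasoning object for an approaching exit: vote for moving toward it
    # and veto moving away from it when it is close.
    near = exit_in_m < 500
    votes = {"change_right": 1.0 if near else 0.0}
    vetoes = {"change_left"} if near else set()
    return votes, vetoes

def arbitrate(modules):
    # Voting arbiter: sum votes per action, drop vetoed actions, pick the best.
    total = {a: 0.0 for a in ACTIONS}
    vetoed = set()
    for votes, vetoes in modules:
        vetoed |= vetoes
        for action, weight in votes.items():
            total[action] += weight
    allowed = {a: w for a, w in total.items() if a not in vetoed}
    return max(allowed, key=allowed.get)

best = arbitrate([car_ahead_module(gap_m=10.0), exit_module(exit_in_m=300.0)])
```

Because each module only emits votes and vetoes, a new traffic entity (say, a merging vehicle) can be added as one more module without touching the others, which is the modifiability argument the abstract makes.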

  9. Studies for the Loss of Atomic and Molecular Species from Io

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.

    1997-01-01

    The general objective of this project has been to advance our theoretical understanding of Io's atmosphere and how various atomic and molecular species are lost from this atmosphere and are distributed in the circumplanetary environment of Jupiter. The scientific objectives of the larger collaborative program between AER, Inc., and the University of Michigan have been to undertake theoretical modeling studies to simulate the distributions of the exospheric gases in Io's corona and extended clouds, to investigate the importance of the various physical processes that shape their relative abundances, and with these tools to analyze observations of O, S and Na obtained by four observers: M.A. McGrath of the Space Telescope Science Institute and G.E. Ballester of the University of Michigan who each have obtained Hubble Space Telescope observations of O and S near Io, F. Scherb who continues an effort to obtain 6300 A OI observations as part of the University of Wisconsin Fabry-Perot program, and N.M. Schneider of the University of Colorado who obtained an extensive set of spectral and spatial observations of the Na emission near Io in the D-lines.

  10. Development of System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment

    DTIC Science & Technology

    2017-12-01

    Development of a system architecture to investigate the impact of integrated air and missile defense in a distributed lethality environment; thesis by Justin K. Davis, Lieutenant.

  11. Use of geostatistics to determine the spatial distribution and infestation rate of leaf-cutting ant nests (Hymenoptera: Formicidae) in eucalyptus plantations.

    PubMed

    Lasmar, O; Zanetti, R; dos Santos, A; Fernandes, B V

    2012-08-01

    One of the fundamental steps in pest sampling is the assessment of the population distribution in the field. Several studies have investigated the distribution of, and appropriate sampling methods for, leaf-cutting ants; however, more reliable methods are still required, such as those that use geostatistics. The objective of this study was to determine the spatial distribution and infestation rate of leaf-cutting ant nests in eucalyptus plantations by using geostatistics. The study was carried out in 2008 in two eucalyptus stands in Paraopeba, Minas Gerais, Brazil. All of the nests in the study area were located and used for the generation of GIS maps, and the spatial pattern of distribution was determined considering the number and size of nests. Each analysis and map was made using the R statistics program and the geoR package. The nest spatial distribution in a savanna area of Minas Gerais was clustered to a certain extent. The models generated allowed the production of kriging maps of areas infested with leaf-cutting ants, where chemical intervention would be necessary, reducing control costs and the impact on humans and the environment.
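
The first step in such a geostatistical analysis is the empirical semivariogram, to which the kriging model is then fitted. A numpy sketch on synthetic, spatially correlated data is shown below (the study itself used R with the geoR package; the coordinates and values here are invented).

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand: 200 sample locations (meters) with a spatially
# correlated value (e.g. local nest density) plus measurement noise.
pts = rng.uniform(0, 1000, size=(200, 2))
vals = np.sin(pts[:, 0] / 200.0) + 0.1 * rng.standard_normal(200)

def empirical_semivariogram(pts, vals, lag_edges):
    # gamma(h): half the mean squared difference between values at
    # pairs of locations separated by roughly h.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    sq = 0.5 * (vals[:, None] - vals[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (d >= lo) & (d < hi) & (d > 0)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

lag_edges = np.linspace(0, 400, 9)     # 8 lag bins of 50 m
gamma = empirical_semivariogram(pts, vals, lag_edges)
```

A semivariogram that rises with lag before leveling off, as here, is the signature of spatial clustering that justifies kriging rather than assuming a random nest distribution.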

  12. Speaker's comfort in teaching environments: voice problems in Swedish teaching staff.

    PubMed

    Åhlander, Viveka Lyberg; Rydell, Roland; Löfqvist, Anders

    2011-07-01

    The primary objective of this study was to examine how a group of Swedish teachers rate aspects of their working environment that can be presumed to have an impact on vocal behavior and voice problems. The secondary objective was to explore the prevalence of voice problems among Swedish teachers. Questionnaires were distributed to the teachers of 23 randomized schools. Teaching staff at all levels were included, except preschool teachers and teachers at specialized, vocational high schools. The response rate was 73%. The results showed that 13% of the whole group reported voice problems occurring sometimes, often, or always. The teachers reporting voice problems were compared with those without problems, and there were significant differences between the groups for several items. The teachers with voice problems rated items on room acoustics and work environment as more noticeable. This group also reported voice symptoms, such as hoarseness, throat clearing, and voice change, to a significantly higher degree, even though teachers in both groups reported some voice symptoms. Absence from work because of voice problems was also significantly more common in the group with voice problems: 35% versus 9% in the group without problems. We may conclude that teachers suffering from voice problems react more strongly to loading factors in the teaching environment, report more frequent symptoms of voice discomfort, and are more often absent from work because of voice problems than their voice-healthy colleagues. Copyright © 2011 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  13. COMPLEX CONDITIONAL CONTROL BY PIGEONS IN A CONTINUOUS VIRTUAL ENVIRONMENT

    PubMed Central

    Qadri, Muhammad A. J.; Reid, Sean; Cook, Robert G.

    2016-01-01

    We tested two pigeons in a continuously streaming digital environment. Using animation software that constantly presented a dynamic, three-dimensional (3D) environment, the animals were tested with a conditional object identification task. The correct object at a given time depended on the virtual context currently streaming in front of the pigeon. Pigeons were required to accurately peck correct target objects in the environment for food reward, while suppressing any pecks to intermixed distractor objects which delayed the next object’s presentation. Experiment 1 established that the pigeons’ discrimination of two objects could be controlled by the surface material of the digital terrain. Experiment 2 established that the pigeons’ discrimination of four objects could be conjunctively controlled by both the surface material and topography of the streaming environment. These experiments indicate that pigeons can simultaneously process and use at least two context cues from a streaming environment to control their identification behavior of passing objects. These results add to the promise of testing interactive digital environments with animals to advance our understanding of cognition and behavior. PMID:26781058

  14. GSHR-Tree: a spatial index tree based on dynamic spatial slot and hash table in grid environments

    NASA Astrophysics Data System (ADS)

    Chen, Zhanlong; Wu, Xin-cai; Wu, Liang

    2008-12-01

    Computation Grids enable the coordinated sharing of large-scale distributed heterogeneous computing resources that can be used to solve computationally intensive problems in science, engineering, and commerce. Grid spatial applications are made possible by high-speed networks and a new generation of Grid middleware that resides between networks and traditional GIS applications. The integration of the multi-sources and heterogeneous spatial information and the management of the distributed spatial resources and the sharing and cooperative of the spatial data and Grid services are the key problems to resolve in the development of the Grid GIS. The performance of the spatial index mechanism is the key technology of the Grid GIS and spatial database affects the holistic performance of the GIS in Grid Environments. In order to improve the efficiency of parallel processing of a spatial mass data under the distributed parallel computing grid environment, this paper presents a new grid slot hash parallel spatial index GSHR-Tree structure established in the parallel spatial indexing mechanism. Based on the hash table and dynamic spatial slot, this paper has improved the structure of the classical parallel R tree index. The GSHR-Tree index makes full use of the good qualities of R-Tree and hash data structure. This paper has constructed a new parallel spatial index that can meet the needs of parallel grid computing about the magnanimous spatial data in the distributed network. This arithmetic splits space in to multi-slots by multiplying and reverting and maps these slots to sites in distributed and parallel system. Each sites constructs the spatial objects in its spatial slot into an R tree. On the basis of this tree structure, the index data was distributed among multiple nodes in the grid networks by using large node R-tree method. The unbalance during process can be quickly adjusted by means of a dynamical adjusting algorithm. 
This tree structure accounts for the distribution, replication, and transfer operations of a spatial index in the grid environment. The design of GSHR-Tree ensures load balance in the parallel computation, and the structure is well suited to parallel processing of spatial information in distributed network environments. Instead of the recursive comparison of spatial objects used by the original R-tree, the algorithm builds the spatial index using binary code operations, which computers execute more efficiently, together with an extended dynamic hash code for bit comparison. In GSHR-Tree, a new server is assigned to the network whenever a full node must be split. We describe a more flexible allocation protocol that copes with a temporary shortage of storage resources. It uses a distributed balanced binary spatial tree that scales with insertions to potentially any number of storage servers through splits of the overloaded ones. An application manipulates the GSHR-Tree structure from a node in the grid environment. The node addresses the tree through a local image that splits can make outdated; the resulting addressing errors are resolved by forwarding among the servers. In this paper, a spatial index data distribution algorithm that limits the number of servers is proposed. We improve storage utilization at the cost of additional messages. Our proposal constitutes a flexible storage allocation method for a distributed spatial index. The insertion policy can be tuned dynamically to cope with periods of storage shortage; in such cases storage balancing should be favored for better space utilization, at the price of extra message exchanges between servers. 
This structure makes a compromise between updating the duplicated index and transferring the spatial index data. GSHR-Tree has a flexible structure that meets the needs of grid computing and can satisfy new requirements in the future. It provides R-tree capabilities for large spatial datasets stored over interconnected servers. The analysis, including the experiments, confirmed the efficiency of our design choices; the scheme should fit the needs of new applications using ever-larger spatial datasets. Using the system response time of a parallel spatial range query as the performance metric, the simulation experiments demonstrate the sound design and high performance of the proposed indexing structure.
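The slot-partitioning idea the abstract describes can be sketched generically: space is recursively split into slots identified by binary codes, and each slot is hashed to a server that indexes its objects locally. The sketch below is illustrative only; the function names, fixed depth, and modular server mapping are assumptions, not the paper's algorithm (which uses an extendible dynamic hash and per-site R-trees).

```python
def slot_code(x, y, depth=4, extent=1.0):
    """Interleave the bits of successive x/y splits into a binary slot code."""
    code = 0
    lo_x = lo_y = 0.0
    size = extent
    for _ in range(depth):
        size /= 2.0
        bx = 1 if x >= lo_x + size else 0
        by = 1 if y >= lo_y + size else 0
        lo_x += bx * size
        lo_y += by * size
        code = (code << 2) | (bx << 1) | by
    return code

def server_for(code, n_servers=4):
    """Map a slot code to a server; a real system would use extendible hashing."""
    return code % n_servers

# Each server would build a local R-tree over the objects in its slots;
# here we just bucket the points to show the distribution step.
points = [(0.1, 0.2), (0.9, 0.8), (0.55, 0.1), (0.3, 0.7)]
buckets = {}
for p in points:
    buckets.setdefault(server_for(slot_code(*p)), []).append(p)
```

Bit comparison on such codes replaces the recursive rectangle comparisons of a plain R-tree descent, which is the efficiency argument made above.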

  15. ADHydro: A Parallel Implementation of a Large-scale High-Resolution Multi-Physics Distributed Water Resources Model Using the Charm++ Run Time System

    NASA Astrophysics Data System (ADS)

    Steinke, R. C.; Ogden, F. L.; Lai, W.; Moreno, H. A.; Pureza, L. G.

    2014-12-01

Physics-based watershed models are useful tools for hydrologic studies, water resources management and economic analyses in the contexts of climate, land-use, and water-use changes. This poster presents a parallel implementation of a quasi 3-dimensional, physics-based, high-resolution, distributed water resources model suitable for simulating large watersheds in a massively parallel computing environment. Developing this model is one of the objectives of the NSF EPSCoR RII Track II CI-WATER project, a joint effort between the Wyoming and Utah EPSCoR jurisdictions. The model, which we call ADHydro, is aimed at simulating important processes in the Rocky Mountain west, including: rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow, water management and irrigation. Model forcing is provided by the Weather Research and Forecasting (WRF) model, and ADHydro is coupled with the NOAH-MP land-surface scheme for calculating fluxes between the land and atmosphere. The ADHydro implementation uses the Charm++ parallel run time system. Charm++ is based on location-transparent message passing between migratable C++ objects. Each object represents an entity in the model such as a mesh element. These objects can be migrated between processors or serialized to disk, allowing the Charm++ system to automatically provide capabilities such as load balancing and checkpointing. Objects interact with each other by passing messages that the Charm++ system routes to the correct destination object regardless of its current location. This poster discusses the algorithms, communication patterns, and caching strategies used to implement ADHydro with Charm++. The ADHydro model code will be released to the hydrologic community in late 2014.
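The Charm++ pattern described above can be imitated in a few lines of plain Python: objects are addressed by ID, a runtime routes messages to wherever an object currently lives, and migration (e.g. for load balancing) is invisible to senders. This is a language-neutral sketch of the idea only; the class and method names are invented and none of this is Charm++ API.

```python
class Runtime:
    def __init__(self, n_procs):
        self.location = {}      # object id -> processor index
        self.objects = {}       # object id -> object
        self.n_procs = n_procs

    def register(self, obj_id, obj, proc):
        self.objects[obj_id] = obj
        self.location[obj_id] = proc

    def migrate(self, obj_id, new_proc):
        # load balancer moves the object; senders are unaffected
        self.location[obj_id] = new_proc

    def send(self, obj_id, msg):
        # location-transparent delivery: look up the current home, then deliver
        proc = self.location[obj_id]
        self.objects[obj_id].receive(msg, proc)

class MeshElement:
    """Stand-in for a model entity such as an ADHydro mesh element."""
    def __init__(self):
        self.inbox = []
    def receive(self, msg, proc):
        self.inbox.append((proc, msg))

rt = Runtime(n_procs=2)
elem = MeshElement()
rt.register("elem-42", elem, proc=0)
rt.send("elem-42", "flux-update")
rt.migrate("elem-42", new_proc=1)
rt.send("elem-42", "flux-update")   # identical call, new location
```

Because senders only name the destination ID, the runtime is free to relocate or checkpoint objects between messages, which is exactly the property the model implementation relies on.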

  16. Adaptive planning for applications with dynamic objectives

    NASA Technical Reports Server (NTRS)

    Hadavi, Khosrow; Hsu, Wen-Ling; Pinedo, Michael

    1992-01-01

    We devise a qualitative control layer to be integrated into a real-time multi-agent reactive planner. The reactive planning system consists of distributed planning agents attending to various perspectives of the task environment. Each perspective corresponds to an objective. The set of objectives considered are sometimes in conflict with each other. Each agent receives information about events as they occur, and a set of actions based on heuristics can be taken by the agents. Within the qualitative control scheme, we use a set of qualitative feature vectors to describe the effects of applying actions. A qualitative transition vector is used to denote the qualitative distance between the current state and the target state. We will then apply on-line learning at the qualitative control level to achieve adaptive planning. Our goal is to design a mechanism to refine the heuristics used by the reactive planner every time an action is taken toward achieving the objectives, using feedback from the results of the actions. When the outcome is compared with expectations, our prior objectives may be modified and a new set of objectives (or a new assessment of the relative importance of the different objectives) can be introduced. Because we are able to obtain better estimates of the time-varying objectives, the reactive strategies can be improved and better prediction can be achieved.
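One plausible reading of the qualitative transition vector above is, per feature, the sign of the gap between the current state and the target state. The formalization below is a guessed illustration of that idea, not the paper's actual definition, and the feature names are invented.

```python
def transition_vector(current, target):
    """Return +1 / 0 / -1 per feature: the qualitative direction of change."""
    return [0 if c == t else (1 if t > c else -1)
            for c, t in zip(current, target)]

current = [3, 7, 5]   # e.g. tardiness, utilization, inventory (illustrative)
target  = [0, 7, 9]
qtv = transition_vector(current, target)   # [-1, 0, 1]
```

An on-line learner can then score an action by whether the post-action transition vector moves entries toward zero, which matches the feedback loop the abstract sketches.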

  17. Hyperspectral imaging simulation of object under sea-sky background

    NASA Astrophysics Data System (ADS)

    Wang, Biao; Lin, Jia-xuan; Gao, Wei; Yue, Hui

    2016-10-01

Remote sensing image simulation plays an important role in spaceborne/airborne payload demonstration and algorithm development. Hyperspectral imaging is valuable in marine monitoring and in search and rescue. To meet the demand for spectral imaging of objects in complex sea scenes, a physics-based method for simulating spectral images of objects under a sea scene is proposed. By developing an imaging simulation model that accounts for the object, background, atmospheric conditions, and sensor, it is possible to examine how changes in wind speed, atmospheric conditions, and other environmental factors affect spectral image quality in complex sea scenes. First, the sea scattering model is established based on the Phillips sea spectrum model, rough-surface scattering theory, and the volume scattering characteristics of water. Measured bidirectional reflectance distribution function (BRDF) data of the objects are fitted to a statistical model. MODTRAN software is used to obtain the solar illumination on the sea, the sky brightness, the atmospheric transmittance from sea to sensor, and the atmospheric backscattered radiance, and a Monte Carlo ray tracing method is used to calculate the composite scattering of the sea surface and object and the spectral image. Finally, the object spectrum is obtained by space transformation, radiometric degradation, and the addition of noise. The model connects the spectral image with the environmental, object, and sensor parameters, providing a tool for payload demonstration and algorithm development.
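The simulation chain above reduces, per band, to a standard at-sensor radiance composition: surface-leaving radiance attenuated by the sea-to-sensor transmittance, plus atmospheric path radiance, plus sensor noise. The sketch below shows that composition under a Lambertian approximation; the coefficient values are placeholders, not MODTRAN outputs or the paper's model.

```python
import math
import random

def at_sensor_radiance(reflectance, solar_irradiance, transmittance,
                       path_radiance, noise_sigma=0.0):
    """One-band at-sensor radiance: (rho * E / pi) * tau + L_path + noise."""
    surface = reflectance * solar_irradiance / math.pi   # Lambertian surface
    signal = surface * transmittance + path_radiance
    return signal + random.gauss(0.0, noise_sigma)

# Illustrative call: 10% reflectance, unit-scale irradiance, clear atmosphere
radiance = at_sensor_radiance(0.1, math.pi, 0.8, 0.05)
```

Applying this per wavelength with MODTRAN-derived transmittance and path-radiance curves yields the simulated spectrum; noise and radiometric degradation enter in the final step, as in the abstract.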

  18. Modeling and optimization of the multiobjective stochastic joint replenishment and delivery problem under supply chain environment.

    PubMed

    Wang, Lin; Qu, Hui; Liu, Shan; Dun, Cai-xia

    2013-01-01

As a practical inventory and transportation problem, it is important to synthesize several objectives for the joint replenishment and delivery (JRD) decision. In this paper, a new multiobjective stochastic JRD (MSJRD) of a one-warehouse, n-retailer system that balances service level and total cost simultaneously is proposed. The goal of this problem is to decide a reasonable replenishment interval, safety stock factor, and traveling route. Secondly, two approaches are designed to handle this complex multi-objective optimization problem: a linear programming (LP) approach converts the multiple objectives into a single objective, while a multi-objective evolutionary algorithm (MOEA) solves the multi-objective problem directly. Thirdly, three intelligent optimization algorithms, the differential evolution algorithm (DE), a hybrid DE (HDE), and a genetic algorithm (GA), are utilized in the LP-based and MOEA-based approaches. Results of the MSJRD with the LP-based and MOEA-based approaches are compared in a contrastive numerical example. To analyze the nondominated solutions of the MOEA, a metric is also used to measure the distribution of the last-generation solutions. Results show that HDE outperforms DE and GA whether the LP-based or the MOEA-based approach is adopted.
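The LP-based route above folds the objectives into one scalar. A common way to do this (a sketch of the general weighted-sum idea, not the paper's exact model) is a weighted sum of normalized objectives: cost is minimized while service level is maximized, so the service term enters as a shortfall. Weights and the reference cost below are invented for illustration.

```python
def scalarize(total_cost, service_level, w_cost=0.7, w_service=0.3,
              cost_ref=10000.0):
    """Lower is better: normalized cost plus weighted service-level shortfall."""
    return w_cost * (total_cost / cost_ref) + w_service * (1.0 - service_level)

candidates = [
    {"cost": 8200.0, "service": 0.95},
    {"cost": 7600.0, "service": 0.88},
]
best = min(candidates, key=lambda c: scalarize(c["cost"], c["service"]))
```

An MOEA would instead keep both candidates as (possibly) nondominated points and report the whole front, which is why the abstract needs a separate distribution metric to judge the last generation.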

  19. Modeling and Optimization of the Multiobjective Stochastic Joint Replenishment and Delivery Problem under Supply Chain Environment

    PubMed Central

    Dun, Cai-xia

    2013-01-01

As a practical inventory and transportation problem, it is important to synthesize several objectives for the joint replenishment and delivery (JRD) decision. In this paper, a new multiobjective stochastic JRD (MSJRD) of a one-warehouse, n-retailer system that balances service level and total cost simultaneously is proposed. The goal of this problem is to decide a reasonable replenishment interval, safety stock factor, and traveling route. Secondly, two approaches are designed to handle this complex multi-objective optimization problem: a linear programming (LP) approach converts the multiple objectives into a single objective, while a multi-objective evolutionary algorithm (MOEA) solves the multi-objective problem directly. Thirdly, three intelligent optimization algorithms, the differential evolution algorithm (DE), a hybrid DE (HDE), and a genetic algorithm (GA), are utilized in the LP-based and MOEA-based approaches. Results of the MSJRD with the LP-based and MOEA-based approaches are compared in a contrastive numerical example. To analyze the nondominated solutions of the MOEA, a metric is also used to measure the distribution of the last-generation solutions. Results show that HDE outperforms DE and GA whether the LP-based or the MOEA-based approach is adopted. PMID:24302880

  20. How CubeSats contribute to Science and Technology in Astronomy and Astrophysics

    NASA Astrophysics Data System (ADS)

    Cahoy, Kerri Lynn; Douglas, Ewan; Carlton, Ashley; Clark, James; Haughwout, Christian

    2017-01-01

CubeSats are nanosatellites, spacecraft typically the size of a shoebox or backpack. CubeSats are made up of one or more 10 cm x 10 cm x 10 cm units weighing 1.33 kg (each cube is called a “U”). CubeSats benefit from relatively easy and inexpensive access to space because they are designed to slide into fully enclosed spring-loaded deployer pods before being attached as an auxiliary payload to a larger vehicle, without adding risk to the vehicle or its primary payload(s). Even though CubeSats have inherent resource and aperture limitations due to their small size, over the past fifteen years, researchers and engineers have miniaturized components and subsystems, greatly increasing the capabilities of CubeSats. We discuss how state-of-the-art CubeSats can address both science objectives and technology objectives in Astronomy and Astrophysics. CubeSats can contribute toward science objectives such as cosmic dawn, galactic evolution, stellar evolution, extrasolar planets and interstellar exploration. CubeSats can contribute to understanding how key technologies for larger missions, like detectors, microelectromechanical systems, and integrated optical elements, can not only survive launch and operational environments (which can often be simulated on the ground), but also meet performance specifications over long periods of time in environments that are harder to simulate properly, such as ionizing radiation, the plasma environment, spacecraft charging, and microgravity. CubeSats can also contribute to both science and technology advancements as multi-element space-based platforms that coordinate distributed measurements and use formation flying and large separation baselines to counter their restricted individual apertures.

  1. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.
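The data-driven model described above can be sketched in a few lines: a node fires as soon as all of its inputs are available, so parallelism falls out of the data-flow graph rather than explicit threading. This is a generic illustration of data-driven execution, not MENTAT syntax; the graph and names are invented.

```python
def run_dataflow(graph, inputs):
    """graph: node -> (function, [input names]); returns all produced values."""
    values = dict(inputs)
    pending = dict(graph)
    while pending:
        ready = [n for n, (_, deps) in pending.items()
                 if all(d in values for d in deps)]
        if not ready:
            raise RuntimeError("cycle or missing input")
        for n in ready:                      # these nodes could fire in parallel
            fn, deps = pending.pop(n)
            values[n] = fn(*[values[d] for d in deps])
    return values

graph = {
    "sum":  (lambda a, b: a + b, ["x", "y"]),
    "prod": (lambda a, b: a * b, ["x", "y"]),
    "out":  (lambda s, p: s - p, ["sum", "prod"]),
}
result = run_dataflow(graph, {"x": 3, "y": 4})
```

A real-time extension would attach deadlines to each granule and let the scheduler pick among the `ready` set, which is where MENTAT's timing constructs and scheduling directives come in.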

  2. Hostile environments and high temperature measurements; Proceedings of the Conference, Kansas City, MO, Nov. 6-8, 1989

    NASA Astrophysics Data System (ADS)

Topics presented include the identification of stagnant regions in a fluidized bed combustor, high-sensitivity objective grating speckle, an X-ray beam method for displacement and strain distributions using the moire method, and high-temperature deformation of a Ti-alloy composite under complex loading. Also addressed are a hybrid procedure for dynamic characterization of ceramics at elevated temperature, thermo-structural measurements in a SiC-coated carbon-carbon hypersonic glide vehicle, and recent experience with elevated-temperature foil strain gages with application to thin-gage materials.

  3. National Computer Security Conference (16th) held at Baltimore Convention Center, Baltimore, Maryland on September 20-23, 1993. Proceedings

    DTIC Science & Technology

    1993-09-23

answer to the question: Is subject s allowed access type a on object o? An authorization was thus seen as a 3-tuple (s,o,a). This view of access...called trusted in a Bell-LaPadula architecture. Work at Carnegie Mellon University on type enforcement contemporaneous with Denning's was not addressed in..."Implementation Considerations for the Typed Access Matrix Model in a Distributed Environment," Proceedings of the 15th National Computer Security

  4. Developing science gateways for drug discovery in a grid environment.

    PubMed

    Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra

    2016-01-01

Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, it is increasingly difficult to provide the life-sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating throughput by several orders of magnitude.

  5. A System to Provide Real-Time Collaborative Situational Awareness by Web Enabling a Distributed Sensor Network

    NASA Technical Reports Server (NTRS)

    Panangadan, Anand; Monacos, Steve; Burleigh, Scott; Joswig, Joseph; James, Mark; Chow, Edward

    2012-01-01

    In this paper, we describe the architecture of both the PATS and SAP systems and how these two systems interoperate with each other forming a unified capability for deploying intelligence in hostile environments with the objective of providing actionable situational awareness of individuals. The SAP system works in concert with the UICDS information sharing middleware to provide data fusion from multiple sources. UICDS can then publish the sensor data using the OGC's Web Mapping Service, Web Feature Service, and Sensor Observation Service standards. The system described in the paper is able to integrate a spatially distributed sensor system, operating without the benefit of the Web infrastructure, with a remote monitoring and control system that is equipped to take advantage of SWE.

  6. Hawaiian Electric Advanced Inverter Grid Support Function Laboratory Validation and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Austin; Nagarajan, Adarsh; Prabakar, Kumar

The objective for this test plan was to better understand how to utilize the performance capabilities of advanced inverter functions to allow the interconnection of distributed energy resource (DER) systems to support the new Customer Self-Supply, Customer Grid-Supply, and other future DER programs. The purpose of this project was: 1) to characterize how the tested grid supportive inverters performed the functions of interest, 2) to evaluate the grid supportive inverters in an environment that emulates the dynamics of O'ahu's electrical distribution system, and 3) to gain insight into the benefits of the grid support functions on selected O'ahu island distribution feeders. These goals were achieved through laboratory testing of photovoltaic inverters, including power hardware-in-the-loop testing.

  7. Population evolution in the GEO vicinity

    NASA Astrophysics Data System (ADS)

    Wegener, P.; Bendisch, J.; Krag, H.; Stabroth, S.

The geostationary orbit has now been in use for nearly 40 years. Due to the absence of major energy-dissipating mechanisms, the object population in the GEO environment has grown steadily during this time. In mid 2001, a total of 762 known objects permanently resided within the GEO region (GEO +/-1000 km). Additionally, two explosion events are confirmed within the geostationary ring, which further enlarge the already existing population of payloads and upper stages. Recent observation results obtained by the ESA Space Debris Telescope (SDT) at Tenerife show strong indications of even more fragment clouds. Since the geostationary ring is a unique resource that is not protected by any significant self-cleaning effect, monitoring of the object environment in the vicinity of this orbit is mandatory. In a first step, this paper characterizes the history and current state of the GEO environment. The evolution of fresh object clouds within and out of the ring is analysed to obtain a better understanding of the short- and mid-term impact of explosion events as well as Solid Rocket Motor (SRM) firings on the overall population. Next to explosion prevention, the transfer of satellites to a graveyard orbit about 300 km above the geostationary altitude is agreed to be the most effective means of preserving GEO. Although this procedure is internationally recommended, only one third of retiring spacecraft are in fact brought to a sufficiently high orbit. Another 30% perform a re-orbiting but end up in an orbit in the direct vicinity of the GEO ring, or even touching or crossing it. The reason for this low performance can often be found in insufficient fuel gauging or an urgent need for several more months of operation. In the future, one possibility to mitigate the population growth by removing those spacecraft could be a dedicated vehicle transferring several large objects to the graveyard area before retiring there on its own. 
The number and distribution of possible candidates for such a removal are outlined. Assuming a future traffic model and different scenarios for re-orbiting and removal, the future evolution of the large-object population in and near GEO can be estimated.

  8. Scientific Objectives of Electron Losses and Fields INvestigation Onboard Lomonosov Satellite

    NASA Astrophysics Data System (ADS)

    Shprits, Y. Y.; Angelopoulos, V.; Russell, C. T.; Strangeway, R. J.; Runov, A.; Turner, D.; Caron, R.; Cruce, P.; Leneman, D.; Michaelis, I.; Petrov, V.; Panasyuk, M.; Yashin, I.; Drozdov, A.; Russell, C. L.; Kalegaev, V.; Nazarkov, I.; Clemmons, J. H.

    2018-02-01

The objective of the Electron Losses and Fields INvestigation on board the Lomonosov satellite (ELFIN-L) project is to determine the energy spectrum of precipitating energetic electrons and ions and, together with other polar-orbiting and equatorial missions, to better understand the mechanisms responsible for scattering these particles into the atmosphere. This mission will provide detailed measurements of the radiation environment at low altitudes. The 400-500 km sun-synchronous orbit of Lomonosov is ideal for observing electrons and ions precipitating into the atmosphere. This mission provides a unique opportunity to test the instruments. A similar suite of instruments will be flown on the future NSF- and NASA-supported spinning CubeSat ELFIN satellites, which will augment current measurements by providing detailed information on pitch-angle distributions of precipitating and trapped particles.

  9. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    PubMed

    Lemaitre, D; Sauquet, D; Fofol, I; Tanguy, L; Jean, F C; Degoulet, P

    1995-01-01

Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists of separating the interface from data storage and application control, and of using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described.

  10. Expert system technologies for Space Shuttle decision support: Two case studies

    NASA Technical Reports Server (NTRS)

    Ortiz, Christopher J.; Hasan, David A.

    1994-01-01

This paper addresses the issue of integrating the C Language Integrated Production System (CLIPS) into distributed data acquisition environments. In particular, it presents preliminary results of some ongoing software development projects aimed at exploiting CLIPS technology in the new mission control center (MCC) being built at NASA Johnson Space Center. One interesting aspect of the control center is its distributed architecture; it consists of networked workstations which acquire and share data through the NASA/JSC-developed information sharing protocol (ISP). This paper outlines some approaches taken to integrate CLIPS and ISP in order to permit the development of intelligent data analysis applications which can be used in the MCC. Three approaches to CLIPS/ISP integration are discussed. The initial approach involves clearly separating CLIPS from ISP using user-defined functions for gathering and sending data to and from a local storage buffer. Memory and performance drawbacks of this design are summarized. The second approach involves taking full advantage of CLIPS and the CLIPS Object-Oriented Language (COOL) by using objects to directly transmit data and state changes from ISP to COOL. Any changes within the object slots eliminate the need for both a data structure and an external function call, thus taking advantage of the object matching capabilities within CLIPS 6.0. The final approach is to treat CLIPS and ISP as peer toolkits. Neither is embedded in the other; rather, the application interweaves calls to each directly in its source code.
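The first integration approach above keeps the inference engine and the data feed apart through a local storage buffer. The sketch below imitates that decoupling generically in Python; none of it is CLIPS or ISP API, and the buffer, telemetry name, and rule are all invented for illustration.

```python
class DataBuffer:
    """Local store: the acquisition side writes, the rule engine polls."""
    def __init__(self):
        self._latest = {}

    def post(self, name, value):            # called by the acquisition side
        self._latest[name] = value

    def fetch(self, name, default=None):    # called from the rule engine side
        return self._latest.get(name, default)

buf = DataBuffer()
buf.post("cabin_pressure", 14.2)

# Stand-in for a rule whose condition reads telemetry through the buffer:
def pressure_rule(buffer, low=13.9):
    value = buffer.fetch("cabin_pressure")
    return "alert" if value is not None and value < low else "nominal"
```

The cost the abstract notes follows from this shape: every value is copied into the buffer and fetched again by the engine, whereas the COOL-object approach lets changed slot values drive rule matching directly.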

  11. Educational environment as perceived by dental students at King Saud University.

    PubMed

    Al-Saleh, Samar; Al-Madi, Ebtissam M; AlMufleh, Balqees; Al-Degheishem, Al-Hanoof

    2018-07-01

The main objectives of the present study were to develop baseline information about dental students' perception of their educational environment at the College of Dentistry, King Saud University (KSU) in Riyadh, and to investigate the role of four different variables in the students' perception. The Dundee Ready Education Environment Measure (DREEM) questionnaire was distributed among 497 undergraduate dental students, from second-year students to interns, in the second week of the first semester of the academic year at the College of Dentistry of King Saud University (KSU). The response rate was 60.73%. The mean total DREEM score was 108.42/200, and mean DREEM subscale scores were above 50% of the total score. The overall DREEM score showed no statistically significant difference across the four variables investigated, except academic year: second-year students scored significantly higher (118.36 ± 15.8) than interns (105 ± 21.3). Students' perception of the educational environment in the KSU College of Dentistry was satisfactory; however, several weak areas were identified that need attention and consideration.

  12. Measurement of vibrations at different sections of rail through fiber optic sensors

    NASA Astrophysics Data System (ADS)

    Barreda, A.; Molina-Jiménez, T.; Valero, E.; Recuero, S.

    2011-09-01

This paper presents the results of an investigation of how the vibration of railway vehicles affects nearby buildings. The overall objective is to study the vibration generated in urban environments by tram, train and subway, its transmission through the ground, and how buildings and constructions in the environment receive it. These vibrations can generate noise and structural vibration in buildings. For this reason it is necessary to characterize the level of vibration affecting rail, road infrastructure, sidewalks and nearby buildings; to assess the influence of the train (speed, type, wheel profile, ...), the rail (rolling area) and the route; and finally to define interim corrective measures. In this study, vibration energy levels and excitation frequencies are measured using optical techniques: optical fiber networks with distributed Bragg sensors. Measuring these vibrations in different configurations allows us to evaluate the suitability of different rail sections for different types of uses or environments. This study aims to help improve the safety of the built environment in the vicinity of a railway operation, and thus to increase passenger comfort and reduce environmental impact.
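Fiber Bragg grating sensors of the kind used above encode strain as a shift of the reflected Bragg wavelength. A standard first-order model (general FBG physics, independent of this study's specific setup) is dlambda/lambda = (1 - p_e) * strain, with p_e the effective photo-elastic coefficient, roughly 0.22 for silica fiber. The default values below are typical, not the paper's.

```python
def strain_from_shift(d_lambda_nm, lambda0_nm=1550.0, p_e=0.22):
    """Return axial strain inferred from a Bragg wavelength shift (both in nm)."""
    return (d_lambda_nm / lambda0_nm) / (1.0 - p_e)

# A 1.2 pm shift at 1550 nm corresponds to roughly 1 microstrain:
eps = strain_from_shift(0.0012)
```

Tracking such shifts at kHz interrogation rates turns each grating into a vibration sensor, and multiplexing many gratings along one fiber yields the distributed measurement network described in the abstract.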

  13. Developing CORBA-Based Distributed Scientific Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Kim, Chan; Lopez, Isaac

    2000-01-01

Recent progress in distributed object technology has enabled software applications to be developed and deployed easily such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that cooperates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, Smalltalk, etc., existing applications can be integrated into a new application and hence the tasks of code re-writing and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with CORBA objects can increase the code's reusability. For example, scientists could link their scientific applications to vintage Fortran programs such as Partial Differential Equation (PDE) solvers in a plug-and-play fashion. Unfortunately, an IDL-to-Fortran mapping has not been proposed and there seems to be no direct method of generating CORBA objects from Fortran without resorting to manually written C/C++ wrappers. In this paper, we present an efficient methodology to integrate Fortran legacy programs into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into CORBA objects are discussed. The following diagram shows the conversion and decomposition mechanism we proposed. Our goal is to keep the Fortran codes unmodified. 
The conversion-aided tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and IDL file for wrapping the Fortran code. Programmers must determine for themselves how to decompose the legacy application into several reusable components, based on the cohesion and coupling factors among the functions and subroutines. However, programming effort can still be greatly reduced because function headings and types have already been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of large numbers of variables among several functions; the COMMON block plays a role similar to that of global variables in C. In a CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to dealing with this problem is to put the COMMON variables into the parameter list. We do not adopt this approach because it requires modification of the Fortran source code, which violates our design constraints. Our approach is to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation result to the client. We have successfully tested the proposed conversion methodology based on the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet C++ and IDL syntax. For example, C++/IDL requires a tag in the structure type, while C does not. In this paper, we identify the necessary changes to the f2c converter in order to directly generate the C++ header and the IDL file. Our future work is to add a GUI interface to ease the decomposition task by simply dragging and dropping icons.
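The COMMON-block conversion above moves shared global state into a structure-typed attribute that clients set before invoking the wrapped routine and read afterwards. The real work targets C++/IDL; the Python analogue below only illustrates the shape of that pattern, and every name in it is invented.

```python
class FlowCommon:
    """Stand-in for a Fortran COMMON block: shared state, now encapsulated."""
    def __init__(self, mach=0.0, alpha=0.0):
        self.mach = mach
        self.alpha = alpha

class PdeSolverWrapper:
    """CORBA-object-style wrapper: the client populates the 'COMMON'
    attribute, invokes the legacy routine, then reads results back."""
    def __init__(self):
        self.common = FlowCommon()

    def solve(self):
        # the legacy computation would run here, reading/writing self.common
        self.common.alpha = self.common.mach * 2.0
        return self.common.alpha

solver = PdeSolverWrapper()
solver.common.mach = 0.8
result = solver.solve()
```

Because all shared state travels through the attribute rather than through globals, the wrapped component stays self-contained, which is the property the distributed object framework requires.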

  14. Comparison of Orbital Parameters for GEO Debris Predicted by LEGEND and Observed by MODEST: Can Sources of Orbital Debris be Identified?

    NASA Technical Reports Server (NTRS)

    Barker, Edwin S.; Matney, M. J.; Liou, J.-C.; Abercromby, K. J.; Rodriquez, H. M.; Seitzer, P.

    2006-01-01

    Since 2002 the National Aeronautics and Space Administration (NASA) has carried out an optical survey of the debris environment in the geosynchronous Earth-orbit (GEO) region with the Michigan Orbital Debris Survey Telescope (MODEST) in Chile. The survey coverage has been similar for 4 of the 5 years, allowing us to follow the orbital evolution of Correlated Targets (CTs), both controlled and uncontrolled objects, and Un-Correlated Targets (UCTs). Under gravitational perturbations, the distributions of uncontrolled objects, both CTs and UCTs, in GEO orbits evolve in predictable patterns, particularly evident in the inclination and right ascension of the ascending node (RAAN) distributions. There are several clusters (others have used a "cloud" nomenclature) in the observed distributions that show year-to-year evolution in their inclination and ascending node elements. However, when MODEST is in survey mode (field of view approx. 1.3 deg) it provides only short 5-8 minute orbital arcs, which can be fit only under the assumption of a circular orbit (ACO) to determine the orbital parameters. These ACO elements are useful only in a statistical sense, as dedicated observing runs would be required to obtain sufficient orbital coverage to determine a set of accurate orbital elements and then to follow their evolution. Identification of the source(s) of these "clusters of UCTs" would be advantageous to the overall definition of the GEO orbital debris environment. This paper sets out to determine whether the ACO elements can be used in a statistical sense to identify the source of the "clustering of UCTs" roughly centered on an inclination of 12 deg and a RAAN of 345 deg. The breakup of the Titan 3C-4 transtage on February 21, 1992 has been modeled using NASA's LEGEND (LEO-to-GEO Environment Debris) code to generate a GEO debris cloud.
Breakup fragments are created based on the NASA Standard Breakup Model (including fragment size, area-to-mass (A/M), and delta-V distributions). Once created, fragments are propagated forward in time with the subroutine GEOPROP. Perturbations included in GEOPROP are those due to solar/lunar gravity, radiation pressure, and major geopotential terms. The question to be addressed is: are the UCTs detected by MODEST in this inclination/RAAN region related to the Titan 3C-4 breakup? Discussion will include the observational biases in attempting to detect a specific, uncontrolled target during a given observing session. These restrictions include: (1) the length of the observing session, which is 8 hours or less at any given date or declination; (2) the assumption of ACO elements for detected objects when the breakup model predicts debris with non-zero eccentricities; and (3) the size and illumination or brightness of the debris predicted by the model and the telescope/sky limiting magnitude.

  15. The specificity of memory enhancement during interaction with a virtual environment.

    PubMed

    Brooks, B M; Attree, E A; Rose, F D; Clifford, B R; Leadbetter, A G

    1999-01-01

    Two experiments investigated differences between active and passive participation in a computer-generated virtual environment in terms of spatial memory, object memory, and object location memory. It was found that active participants, who controlled their movements in the virtual environment using a joystick, recalled the spatial layout of the virtual environment better than passive participants, who merely watched the active participants' progress. In contrast, there were no significant differences between the active and passive participants' recall or recognition of the virtual objects, nor in their recall of the correct locations of objects in the virtual environment. These findings are discussed in terms of subject-performed task research and the specificity of memory enhancement in virtual environments.

  16. Distribution of the near-Earth objects

    NASA Astrophysics Data System (ADS)

    Emel'Yanenko, V. V.; Naroenkov, S. A.; Shustov, B. M.

    2011-12-01

    This paper analyzes the distribution of the orbits of near-Earth minor bodies from data on more than 7500 objects. The distribution of large near-Earth objects (NEOs) with absolute magnitudes of H < 18 is generally consistent with earlier predictions (Bottke et al., 2002; Stuart, 2003), although we have revealed a previously undetected maximum in the distribution of perihelion distances q near q = 0.5 AU. The study of the orbital distribution for the entire sample of detected objects has revealed significant new features. In particular, the distribution of perihelion longitudes deviates markedly from a uniform pattern; its variations are roughly 40% of its mean value. These deviations cannot be stochastic, which is confirmed by the Kolmogorov-Smirnov test at a probability level above 0.9999. These features can be explained by the dynamical behavior of minor bodies associated with secular resonances with Jupiter. For the objects with H < 18, the variations in the perihelion longitude distribution are not so apparent. By extrapolating the orbital characteristics of the NEOs with H < 18, we have obtained longitudinal, latitudinal, and radial distributions of potentially hazardous objects in a heliocentric ecliptic coordinate frame. The differences in the orbital distributions of objects of different sizes appear not to be a consequence of observational selection, but could indicate different sources of the NEOs.
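The uniformity test invoked above can be reproduced in miniature. The following sketch draws synthetic perihelion longitudes (not the paper's data) with a roughly 40% sinusoidal modulation and applies a one-sample Kolmogorov-Smirnov test against a uniform distribution:

```python
import numpy as np
from scipy import stats

# Synthetic perihelion longitudes (degrees): a uniform background thinned by
# a sinusoidal acceptance with ~40% modulation, loosely mimicking the record's
# claim; these are NOT the paper's data.
rng = np.random.default_rng(0)
n = 7500
u = rng.uniform(0, 360, n)
keep = rng.uniform(0, 1.4, n) < 1 + 0.4 * np.cos(np.radians(u))
longitudes = u[keep]

# One-sample Kolmogorov-Smirnov test against a uniform law on [0, 360).
stat, p = stats.kstest(longitudes / 360.0, "uniform")
print(f"KS statistic = {stat:.3f}, p-value = {p:.2e}")
```

With a few thousand samples, a modulation of this amplitude yields a vanishingly small p-value, matching the record's claim that the deviations cannot be stochastic.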

  17. Electromagnetic radiation and behavioural response of ticks: an experimental test.

    PubMed

    Vargová, Blažena; Majláth, Igor; Kurimský, Juraj; Cimbala, Roman; Kosterec, Michal; Tryjanowski, Piotr; Jankowiak, Łukasz; Raši, Tomáš; Majláthová, Viktória

    2018-05-01

    Electronic devices, wireless technologies and mobile phones are nowadays present in increasing amounts in our environment. All living organisms are constantly affected by the electromagnetic radiation they emit, which causes serious environmental pollution. The distribution and density of ticks in natural habitats are influenced by a complex of abiotic and biotic factors. Exposure to radio-frequency electromagnetic fields (RF-EMF) constitutes a potential factor altering the presence and distribution of ticks in the environment. Our main objective was to determine the affinity of Dermacentor reticulatus ticks towards RF-EMF exposure. An originally designed and constructed radiation-shielded tube (RST) test was used to assess the affinity of ticks under controlled laboratory conditions. All tests were performed in an electromagnetic compatibility laboratory in an anechoic chamber. Ticks were irradiated with RF-EMF at 900 and 5000 MHz using a Double-Ridged Waveguide Horn Antenna; 0 MHz served as the control. RF-EMF exposure at 900 MHz induced a higher concentration of ticks on the irradiated arm of the RST, whereas RF-EMF at 5000 MHz caused ticks to escape to the shielded arm. This study represents the first experimental evidence of RF-EMF preference in D. reticulatus. Projecting the obtained results onto the natural environment could help assess the risk of tick-borne diseases and could serve as a tool of preventive medicine.

  18. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis

    PubMed Central

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web-interface visual environments built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible owing to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases lies in its additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing data access privileges to be determined in a finely granular manner without flooding the security subsystem with a multiplicity of rules. The HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153
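The unified application program interface described for the honeycomb model can be sketched generically. The class and method names below are illustrative, not HIVE's actual API; the point is that a single search/view interface serves every registered data type, so new types plug in without schema changes:

```python
# Illustrative sketch (not HIVE's actual API): a unified interface for
# heterogeneous data objects, with metadata carried on every object.

class DataObject:
    registry = []

    def __init__(self, meta):
        self.meta = meta                  # metadata travels with the object
        DataObject.registry.append(self)

    @classmethod
    def search(cls, **criteria):
        """The same search call works for every registered type."""
        return [o for o in cls.registry
                if all(o.meta.get(k) == v for k, v in criteria.items())]

    def view(self):
        return {"type": type(self).__name__, **self.meta}

class SequenceRun(DataObject): pass       # a new data type is just a subclass
class Annotation(DataObject): pass

SequenceRun({"id": "run1", "owner": "alice"})
Annotation({"id": "ann1", "owner": "alice"})
hits = DataObject.search(owner="alice")
print([h.view()["type"] for h in hits])   # both types found via one API
```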

  19. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    PubMed

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web-interface visual environments built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible owing to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases lies in its additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing data access privileges to be determined in a finely granular manner without flooding the security subsystem with a multiplicity of rules. The HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  20. A cooperative model for IS security risk management in distributed environment.

    PubMed

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across allied organizations is critical to effectively managing information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among interconnected IS in a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, from which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively.
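The way a BN turns a partner organization's shared security information into an updated risk level can be shown with a minimal two-node example. The probabilities and thresholds below are hypothetical, chosen only to illustrate the inference step, not taken from the paper:

```python
# Minimal sketch (hypothetical numbers): a two-node Bayesian network in which
# a partner organization's shared alert updates our belief that an attack is
# under way, which in turn sets the predicted risk level.

P_ATTACK = 0.05   # prior P(attack) for our own IS

def posterior_attack(alert_seen: bool) -> float:
    """Bayes' rule: P(attack | partner alert evidence)."""
    p_alert = {True: 0.9, False: 0.1}   # P(alert | attack is True/False)
    like_true = p_alert[True] if alert_seen else 1 - p_alert[True]
    like_false = p_alert[False] if alert_seen else 1 - p_alert[False]
    num = like_true * P_ATTACK
    return num / (num + like_false * (1 - P_ATTACK))

def risk_level(p: float) -> str:
    """Map a posterior probability to a coarse risk level."""
    return "high" if p > 0.3 else "medium" if p > 0.1 else "low"

p = posterior_attack(alert_seen=True)
print(f"P(attack | alert) = {p:.3f} -> {risk_level(p)} risk")
```

A production model would have many more nodes (threats, vulnerabilities, controls), but the update from shared evidence to risk level follows the same pattern.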

  1. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.

    PubMed

    Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian

    2017-10-10

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is its flexibility in computing the health application by using resources from available devices inside the user's body area network. The proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, to validate our proposal, we present a case study that consists of monitoring footballers' heart rates during a football match. The real-time data acquired by these devices serve the clear social objective of predicting not only situations of sudden death but also possible injuries.

  2. Flight Plasma Diagnostics for High-Power, Solar-Electric Deep-Space Spacecraft

    NASA Technical Reports Server (NTRS)

    Johnson, Lee; De Soria-Santacruz Pich, Maria; Conroy, David; Lobbia, Robert; Huang, Wensheng; Choi, Maria; Sekerak, Michael J.

    2018-01-01

    NASA's Asteroid Redirect Robotic Mission (ARRM) project plans included a set of plasma and space environment instruments, the Plasma Diagnostic Package (PDP), to fulfill ARRM requirements for technology extensibility to future missions. The PDP objectives were divided into the classes of 1) plasma thruster dynamics, 2) solar array-specific environmental effects, 3) plasma environmental spacecraft effects, and 4) energetic particle spacecraft environment. A reference design approach and interface requirements for ARRM's PDP were generated by the PDP team at JPL and GRC. The reference design consisted of redundant single-string avionics located on the ARRM spacecraft bus and solar array, driving and processing signals from multiple copies of several types of plasma, effects, and environment sensors distributed over the spacecraft and array. The reference design sensor types were derived in part from sensors previously developed for U.S. Air Force Research Laboratory (AFRL) plasma effects campaigns, such as those aboard TacSat-2 in 2007 and AEHF-2 in 2012.

  3. An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments

    PubMed Central

    Szymanski, Julian

    2017-01-01

    The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, enabling the development of sensors, embedded devices and other ‘things’ ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is its flexibility in computing the health application by using resources from available devices inside the user’s body area network. The proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, to validate our proposal, we present a case study that consists of monitoring footballers’ heart rates during a football match. The real-time data acquired by these devices serve the clear social objective of predicting not only situations of sudden death but also possible injuries. PMID:28994743

  4. Optical Studies of Orbital Debris at GEO Using Two Telescopes

    NASA Technical Reports Server (NTRS)

    Seitzer, P.; Abercromby, K. J.; Rodriquez,H. M.; Barker, E.

    2008-01-01

    Beginning in March 2007, optical observations of debris at geosynchronous orbit (GEO) were commenced using two telescopes simultaneously at the Cerro Tololo Inter-American Observatory (CTIO) in Chile. The University of Michigan's 0.6/0.9-m Schmidt telescope MODEST (for Michigan Orbital DEbris Survey Telescope) was used in survey mode to find objects that potentially could be at GEO. Because GEO objects appear in this telescope's field of view for an average of only 5 minutes, a full six-parameter orbit cannot be determined. Interrupting the survey for follow-up observations leads to incompleteness in the survey results. Instead, as objects are detected on MODEST, initial predictions assuming a circular orbit are made for where each object will be for the next hour, and the objects are reacquired as quickly as possible on the CTIO 0.9-m telescope. This second telescope then follows up during the first night and, if possible, over several more nights to obtain the maximum time arc possible and the best six-parameter orbit. Our goal is to obtain an initial orbit for all detected objects fainter than R = 15th magnitude in order to estimate the orbital distribution of objects selected on the basis of two observational criteria: magnitude and angular rate. Objects fainter than 15th magnitude are largely uncataloged and have a completely different angular rate distribution than brighter objects. Combining the information obtained for both faint and bright objects yields a more complete picture of the debris environment than concentrating on the faint debris alone. One objective is to estimate what fraction of objects selected on the basis of angular rate are not at GEO. A second objective is to obtain magnitudes and colors in standard astronomical filters (BVRI) for comparison with reflectance spectra of likely spacecraft materials.
This paper reports on results from two 14-night runs with both telescopes, in March and November 2007: (1) a significant fraction of objects fainter than R = 15th magnitude have eccentric orbits (e > 0.1); (2) virtually all objects selected on the basis of angular rate are in the GEO and GTO regimes; (3) calibrated magnitudes and colors in BVRI were obtained for many objects fainter than R = 15th magnitude. This work is supported by NASA's Orbital Debris Program Office, Johnson Space Center, Houston, Texas, USA.

  5. An interdisciplinary SWAT ecohydrological model to define catchment-scale hydrologic partitioning

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.

    2013-06-01

    Land use and climate change have long been implicated in modifying ecosystem services, such as water quality and water yield, biodiversity, and agricultural production. To account for future effects on ecosystem services, the integration of physical, biological, economic, and social data over several scales must be implemented to assess the effects on natural resource availability and use. Our objective is to assess the capability of the SWAT model to capture short-duration rainfall-runoff processes in complex mountainous terrain under rapid, event-driven conditions in a monsoonal environment. To accomplish this, we developed a unique quality-control gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. We calibrated the interdisciplinary model to a combination of statistical, hydrologic, and plant growth metrics. In addition, we used multiple locations of different drainage area, aspect, elevation, and geologic substrata distributed throughout the catchment. Results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. While our model accurately reproduced observed discharge variability, the addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. The results of this study provide a valuable resource for describing landscape controls and their implications for discharge, sediment transport, and nutrient loading. This study also shows the challenges of applying the SWAT model to complex terrain and extreme environments. By incorporating anthropogenic features into modeling scenarios, we can greatly enhance our understanding of the hydroecological impacts on ecosystem services.
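The quality-control gap-filling step for meteorological data can be sketched generically. The paper's actual algorithm is not specified here, so the range check and time interpolation below are a hedged stand-in using pandas:

```python
import numpy as np
import pandas as pd

# Generic stand-in for the record's gap-filling step (the paper's actual
# quality-control algorithm is not detailed here): flag physically
# implausible values, then fill short gaps by time interpolation.

idx = pd.date_range("2010-07-01", periods=8, freq="h")
temp = pd.Series([21.0, 21.4, np.nan, np.nan, 22.3, 99.0, 22.0, 21.8], index=idx)

temp[(temp < -40) | (temp > 50)] = np.nan         # QC: implausible readings
filled = temp.interpolate(method="time", limit=3)  # fill gaps up to 3 steps
print(filled.isna().sum())                         # 0 missing after filling
```

The `limit` argument keeps long outages unfilled, which is usually the safer choice before feeding the series to a hydrologic model.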

  6. apART: system for the acquisition, processing, archiving, and retrieval of digital images in an open, distributed imaging environment

    NASA Astrophysics Data System (ADS)

    Schneider, Uwe; Strack, Ruediger

    1992-04-01

    apART reflects the structure of an open, distributed environment. In line with the general trend in imaging, network-capable, general-purpose workstations with capabilities for open-system image communication and image input are used. Several heterogeneous components like CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface where devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are then processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); and consideration of current IPI standardization activities within ISO/IEC for further refinement and extensions.
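The audit-trail mechanism (record operations on a low-resolution derivative, replay them off-line on the original) can be sketched as follows; all names are illustrative rather than apART's actual interfaces:

```python
# Sketch of the audit-trail idea (names are illustrative, not apART's API):
# interactive operations are recorded against a low-resolution derivative,
# then replayed off-line on the full-resolution original image.

audit_trail = []

def record(op, **params):
    """Log an operation with resolution-independent parameters."""
    audit_trail.append((op, params))

def apply_ops(image, trail):
    """Replay the recorded trail on any image, regardless of resolution."""
    for op, params in trail:
        image = op(image, **params)
    return image

def crop(img, x0, x1):
    """Crop columns by fractional coordinates, so the op scales with size."""
    n = len(img[0])
    return [row[int(x0 * n):int(x1 * n)] for row in img]

preview = [[0] * 10 for _ in range(2)]      # low-resolution derivative
original = [[0] * 1000 for _ in range(2)]   # high-resolution original

record(crop, x0=0.2, x1=0.7)                # user edits the preview ...
preview_view = apply_ops(preview, audit_trail)
full = apply_ops(original, audit_trail)     # ... replayed on the original
print(len(preview_view[0]), len(full[0]))   # 5 500
```

Keeping parameters fractional is what makes the interactive edit cheap on the preview yet exact on the original.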

  7. The Visual Representation of 3D Object Orientation in Parietal Cortex

    PubMed Central

    Cowan, Noah J.; Angelaki, Dora E.

    2013-01-01

    An accurate representation of three-dimensional (3D) object orientation is essential for interacting with the environment. Where and how the brain visually encodes 3D object orientation remains unknown, but prior studies suggest the caudal intraparietal area (CIP) may be involved. Here, we develop rigorous analytical methods for quantifying 3D orientation tuning curves, and use these tools to study the neural coding of surface orientation. Specifically, we show that single neurons in area CIP of the rhesus macaque jointly encode the slant and tilt of a planar surface, and that across the population, the distribution of preferred slant-tilts is not statistically different from uniform. This suggests that all slant-tilt combinations are equally represented in area CIP. Furthermore, some CIP neurons are found to also represent the third rotational degree of freedom that determines the orientation of the image pattern on the planar surface. Together, the present results suggest that CIP is a critical neural locus for the encoding of all three rotational degrees of freedom specifying an object's 3D spatial orientation. PMID:24305830

  8. Integration of advanced technologies to enhance problem-based learning over distance: Project TOUCH.

    PubMed

    Jacobs, Joshua; Caudell, Thomas; Wilks, David; Keep, Marcus F; Mitchell, Steven; Buchanan, Holly; Saland, Linda; Rosenheimer, Julie; Lozanoff, Beth K; Lozanoff, Scott; Saiki, Stanley; Alverson, Dale

    2003-01-01

    Distance education delivery has increased dramatically in recent years as a result of the rapid advancement of communication technology. The National Computational Science Alliance's Access Grid represents a significant advancement in communication technology with potential for distance medical education. The purpose of this study is to provide an overview of the TOUCH project (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) with special emphasis on the process of problem-based learning case development for distribution over the Access Grid. The objective of the TOUCH project is to use emerging Internet-based technology to overcome geographic barriers for delivery of tutorial sessions to medical students pursuing rotations at remote sites. The TOUCH project also is aimed at developing a patient simulation engine and an immersive virtual reality environment to achieve a realistic health care scenario enhancing the learning experience. A traumatic head injury case is developed and distributed over the Access Grid as a demonstration of the TOUCH system. Project TOUCH serves as an example of a computer-based learning system for developing and implementing problem-based learning cases within the medical curriculum, but this system should be easily applied to other educational environments and disciplines involving functional and clinical anatomy. Future phases will explore PC versions of the TOUCH cases for increased distribution. Copyright 2003 Wiley-Liss, Inc.

  9. Biosignatures on Mars: What, Where, and How? Implications for the Search for Martian Life

    PubMed Central

    Foucher, Frédéric; Bost, Nicolas; Bertrand, Marylène; Loizeau, Damien; Vago, Jorge L.; Kminek, Gerhard; Gaboyer, Frédéric; Campbell, Kathleen A.; Bréhéret, Jean-Gabriel; Gautret, Pascale; Cockell, Charles S.

    2015-01-01

    The search for traces of life is one of the principal objectives of Mars exploration. Central to this objective is the concept of habitability, the set of conditions that allows the appearance of life and successful establishment of microorganisms in any one location. While environmental conditions may have been conducive to the appearance of life early in martian history, habitable conditions were always heterogeneous on a spatial scale and in a geological time frame. This “punctuated” scenario of habitability would have had important consequences for the evolution of martian life, as well as for the presence and preservation of traces of life at a specific landing site. We hypothesize that, given the lack of long-term, continuous habitability, if martian life developed, it was (and may still be) chemotrophic and anaerobic. Obtaining nutrition from the same kinds of sources as early terrestrial chemotrophic life and living in the same kinds of environments, the fossilized traces of the latter serve as useful proxies for understanding the potential distribution of martian chemotrophs and their fossilized traces. Thus, comparison with analog, anaerobic, volcanic terrestrial environments (Early Archean >3.5–3.33 Ga) shows that the fossil remains of chemotrophs in such environments were common, although sparsely distributed, except in the vicinity of hydrothermal activity where nutrients were readily available. Moreover, the traces of these kinds of microorganisms can be well preserved, provided that they are rapidly mineralized and that the sediments in which they occur are rapidly cemented. We evaluate the biogenicity of these signatures by comparing them to possible abiotic features. Finally, we discuss the implications of different scenarios for life on Mars for detection by in situ exploration, ranging from its non-appearance, through preserved traces of life, to the presence of living microorganisms.
Key Words: Mars—Early Earth—Anaerobic chemotrophs—Biosignatures—Astrobiology missions to Mars. Astrobiology 15, 998–1029. PMID:26575218

  10. Biosignatures on Mars: What, Where, and How? Implications for the Search for Martian Life.

    PubMed

    Westall, Frances; Foucher, Frédéric; Bost, Nicolas; Bertrand, Marylène; Loizeau, Damien; Vago, Jorge L; Kminek, Gerhard; Gaboyer, Frédéric; Campbell, Kathleen A; Bréhéret, Jean-Gabriel; Gautret, Pascale; Cockell, Charles S

    2015-11-01

    The search for traces of life is one of the principal objectives of Mars exploration. Central to this objective is the concept of habitability, the set of conditions that allows the appearance of life and successful establishment of microorganisms in any one location. While environmental conditions may have been conducive to the appearance of life early in martian history, habitable conditions were always heterogeneous on a spatial scale and in a geological time frame. This "punctuated" scenario of habitability would have had important consequences for the evolution of martian life, as well as for the presence and preservation of traces of life at a specific landing site. We hypothesize that, given the lack of long-term, continuous habitability, if martian life developed, it was (and may still be) chemotrophic and anaerobic. Obtaining nutrition from the same kinds of sources as early terrestrial chemotrophic life and living in the same kinds of environments, the fossilized traces of the latter serve as useful proxies for understanding the potential distribution of martian chemotrophs and their fossilized traces. Thus, comparison with analog, anaerobic, volcanic terrestrial environments (Early Archean >3.5-3.33 Ga) shows that the fossil remains of chemotrophs in such environments were common, although sparsely distributed, except in the vicinity of hydrothermal activity where nutrients were readily available. Moreover, the traces of these kinds of microorganisms can be well preserved, provided that they are rapidly mineralized and that the sediments in which they occur are rapidly cemented. We evaluate the biogenicity of these signatures by comparing them to possible abiotic features. Finally, we discuss the implications of different scenarios for life on Mars for detection by in situ exploration, ranging from its non-appearance, through preserved traces of life, to the presence of living microorganisms. 
Keywords: Mars; Early Earth; Anaerobic chemotrophs; Biosignatures; Astrobiology missions to Mars.

  11. Web-based modelling of energy, water and matter fluxes to support decision making in mesoscale catchments: the integrative perspective of GLOWA-Danube

    NASA Astrophysics Data System (ADS)

    Ludwig, R.; Mauser, W.; Niemeyer, S.; Colgan, A.; Stolz, R.; Escher-Vetter, H.; Kuhn, M.; Reichstein, M.; Tenhunen, J.; Kraus, A.; Ludwig, M.; Barth, M.; Hennicker, R.

    The GLOWA-initiative (Global Change of the water cycle), funded by the German Ministry of Research and Education (BMBF), has been established to address the manifold consequences of Global Change on regional water resources in a variety of catchment areas with different natural and cultural characteristics. Within this framework, the GLOWA-Danube project is dealing with the Upper Danube watershed as a representative mesoscale test site (~75,000 km²) for mountain-foreland regions in the temperate mid-latitudes. The principal objective is to identify, examine and develop new techniques of coupled distributed modelling for the integration of natural and socio-economic sciences. The transdisciplinary research in GLOWA-Danube develops an integrated decision support system, called DANUBIA, to investigate the sustainability of future water use. GLOWA-Danube, which is scheduled for a total run-time of eight years to operationally implement and establish DANUBIA, comprises a university-based network of experts with water-related competence in the fields of engineering, natural and social sciences. Co-operation with a network of stakeholders in water resources management of the Upper Danube catchment ensures that practical issues and future problems in the water sector of the region can be addressed. In order to synthesize a common understanding between the project partners, a standardized notation of parameters and functions and a platform-independent structure of computational methods and interfaces have been established, making use of the Unified Modelling Language (UML), an industry standard for the structuring and co-ordination of large projects in software development [Booch et al., The Unified Modelling Language User Guide, Addison-Wesley, Reading, 1999]. DANUBIA is object-oriented, spatially distributed and raster-based at its core.
It applies the concept of “proxels” (process pixels) as its basic objects, which have different dimensions depending on the viewing scale and connect to their environment through fluxes. This paper presents the hydrological viewpoint of GLOWA-Danube, its approach to model coupling and network-based communication, and the object-oriented techniques used to simulate physical processes and interactions at the land surface. The mechanisms and technologies applied to communicate data and model parameters across typical discipline borders are demonstrated from the perspective of the Landsurface object, which comprises the capabilities of interdependent expert models for energy exchange at various surface types, snowmelt, soil water movement, runoff formation and plant growth in a distributed Java-based modelling environment using Remote Method Invocation [Pitt et al., Java.rmi: The Remote Method Invocation Guide, Addison Wesley Professional, Reading, 2001, p. 320]. The text summarizes the GLOWA-Danube concept and shows the state of the implemented DANUBIA prototype after completion of the first project year (2001).
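
As a rough illustration of the proxel idea, the sketch below models each raster cell as an object that updates its internal storage from incoming and outgoing fluxes. All class names, parameter names, and values are invented for this sketch; DANUBIA itself is a Java/RMI system whose actual interfaces differ.

```python
from dataclasses import dataclass

@dataclass
class Proxel:
    """Hypothetical 'process pixel': one raster cell that exchanges
    fluxes with its environment (illustrative only, not the DANUBIA API)."""
    row: int
    col: int
    soil_water_mm: float = 100.0
    capacity_mm: float = 150.0  # assumed storage capacity for this sketch

    def apply_fluxes(self, precip_mm: float, evapotranspiration_mm: float) -> float:
        """Update the cell's storage from one time step of fluxes; return runoff (mm)."""
        self.soil_water_mm += precip_mm - evapotranspiration_mm
        runoff = max(0.0, self.soil_water_mm - self.capacity_mm)
        self.soil_water_mm -= runoff
        return runoff

# A tiny 2x2 raster stepped through one time interval
grid = [Proxel(r, c) for r in range(2) for c in range(2)]
total_runoff = sum(p.apply_fluxes(precip_mm=80.0, evapotranspiration_mm=5.0) for p in grid)
```

In the real system, each discipline model would read and write its own state on the shared proxel grid, with fluxes exchanged across model interfaces rather than inside one class.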

  12. A Reconstructed Vision of Environmental Science Literacy: The case of Qatar

    NASA Astrophysics Data System (ADS)

    Khishfe, Rola

    2014-12-01

    The purpose of this study was twofold: (a) develop a conceptual framework for environmental science literacy; and consequently (b) examine the potential of science standards/curricula to prepare environmentally literate citizens. The framework comprised four pillars: science content knowledge, scientific inquiry, nature of science (NOS), and socioscientific issues (SSI). A conceptual understanding of these pillars as interconnected was presented and justified. Then the developed framework was used to examine the potential of the Qatari science standards to prepare environmentally literate citizens. Results showed that the secondary Qatari science standards generally take up the pillars of science content and scientific inquiry in an explicit manner. The NOS pillar is rarely addressed, while the SSI pillar is not addressed in the objectives and activities in a way that aligns with the heavy emphasis given in the overall aims. Moreover, the connections among pillars are mostly manifested within the activities and between the science content and scientific inquiry. The objectives and activities targeting the environment were less frequent among the four pillars across the Qatari standards. Again, the connections related to the environment were less frequent in conformity with the limited environmental objectives and activities. Implications from this study relate to the need for the distribution of the four pillars across the standards as well as the presentation of the different pillars as interconnected.

  13. PLRP-3: Operational Perspectives of Conducting Science-Driven Extravehicular Activity with Communications Latency

    NASA Technical Reports Server (NTRS)

    Miller, Matthew J.; Lim, Darlene S. S.; Brady, Allyson; Cardman, Zena; Bell, Ernest; Garry, Brent; Reid, Donnie; Chappell, Steve; Abercromby, Andrew F. J.

    2016-01-01

    The Pavilion Lake Research Project (PLRP) is a unique platform where the combination of scientific research and human space exploration concepts can be tested in an underwater spaceflight analog environment. The 2015 PLRP field season was performed at Pavilion Lake, Canada, where science-driven exploration techniques focusing on microbialite characterization and acquisition were evaluated within the context of crew and robotic extravehicular activity (EVA) operations. The primary objectives of this analog study were to detail the capabilities, decision-making process, and operational concepts required to meet non-simulated scientific objectives under a 5-minute one-way communication latency utilizing crew and robotic assets. Furthermore, this field study served as an opportunity to build upon previous tests at PLRP, NASA Desert Research and Technology Studies (DRATS), and NASA Extreme Environment Mission Operations (NEEMO) to characterize the functional roles and responsibilities of the personnel involved in the distributed flight control team and identify operational constraints imposed by science-driven EVA operations. The relationship and interaction between ground and flight crew was found to depend on the specific scientific activities being addressed. In addition, a second intravehicular operator was found to be highly enabling when conducting science-driven EVAs. Future human spaceflight activities will need to cope with the added complexity of dynamic and rapid execution of scientific priorities both during and between EVAs to ensure scientific objectives are achieved.

  14. Reach and get capability in a computing environment

    DOEpatents

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
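
The interaction pattern described in the abstract can be sketched as follows; the class and method names are hypothetical illustrations, not taken from the patent itself.

```python
# Sketch of the "reach and get" interaction: reach marks a destination,
# the user navigates freely, and get copies an object back to that mark.
class Environment:
    def __init__(self):
        self.location = "documents"
        self.reach_location = None

    def reach(self):
        """Invoke 'reach' at the current location: remember where to return."""
        self.reach_location = self.location

    def navigate(self, where):
        """User browses elsewhere in the computing environment."""
        self.location = where

    def get(self, obj):
        """Invoke 'get' on an object: automatically navigate back to the
        reach location and copy the object there."""
        self.location = self.reach_location
        return self.reach_location, obj

env = Environment()
env.reach()                  # reach invoked in "documents"
env.navigate("downloads")    # user navigates to another object
dest, copied = env.get("report.pdf")
```

The key property is that the user never has to navigate back manually: invoking get restores the saved location as a side effect of the copy.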

  15. Idiosyncratic representation of peripersonal space depends on the success of one's own motor actions, but also the successful actions of others!

    PubMed Central

    Coello, Yann; Quesque, François; Gigliotti, Maria-Francesca; Ott, Laurent; Bruyelle, Jean-Luc

    2018-01-01

    Peripersonal space is a multisensory representation of the environment around the body in relation to the motor system, underlying the interactions with the physical and social world. Although changing body properties and social context have been shown to alter the functional processing of space, little is known about how changing the value of objects influences the representation of peripersonal space. In two experiments, we tested the effect of modifying the spatial distribution of reward-yielding targets on manual reaching actions and peripersonal space representation. Before and after performing a target-selection task consisting of manually selecting a set of targets on a touch-screen table, participants performed a two-alternative forced-choice reachability-judgment task. In the target-selection task, half of the targets were associated with a reward (change of colour from grey to green, providing 1 point), the other half being associated with no reward (change of colour from grey to red, providing no point). In Experiment 1, the target-selection task was performed individually with the aim of maximizing the point count, and the distribution of the reward-yielding targets was either 50%, 25% or 75% in the proximal and distal spaces. In Experiment 2, the target-selection task was performed in a social context involving cooperation between two participants to maximize the point count, and the distribution of the reward-yielding targets was 50% in the proximal and distal spaces. Results showed that changing the distribution of the reward-yielding targets or introducing the social context modified concurrently the amplitude of self-generated manual reaching actions and the representation of peripersonal space. Moreover, a decrease of the amplitude of manual reaching actions caused a reduction of peripersonal space when resulting from the distribution of reward-yielding targets, while this effect was not observed in a social interaction context. 
In that case, the decreased amplitude of manual reaching actions was accompanied by an increase of peripersonal space representation, which was not due to the mere presence of a confederate (control experiment). We conclude that reward-dependent modulation of object values in the environment modifies the representation of peripersonal space, whether resulting from self-generated motor actions or from observation of motor actions performed by a confederate. PMID:29771982

  16. Idiosyncratic representation of peripersonal space depends on the success of one's own motor actions, but also the successful actions of others!

    PubMed

    Coello, Yann; Quesque, François; Gigliotti, Maria-Francesca; Ott, Laurent; Bruyelle, Jean-Luc

    2018-01-01

    Peripersonal space is a multisensory representation of the environment around the body in relation to the motor system, underlying the interactions with the physical and social world. Although changing body properties and social context have been shown to alter the functional processing of space, little is known about how changing the value of objects influences the representation of peripersonal space. In two experiments, we tested the effect of modifying the spatial distribution of reward-yielding targets on manual reaching actions and peripersonal space representation. Before and after performing a target-selection task consisting of manually selecting a set of targets on a touch-screen table, participants performed a two-alternative forced-choice reachability-judgment task. In the target-selection task, half of the targets were associated with a reward (change of colour from grey to green, providing 1 point), the other half being associated with no reward (change of colour from grey to red, providing no point). In Experiment 1, the target-selection task was performed individually with the aim of maximizing the point count, and the distribution of the reward-yielding targets was either 50%, 25% or 75% in the proximal and distal spaces. In Experiment 2, the target-selection task was performed in a social context involving cooperation between two participants to maximize the point count, and the distribution of the reward-yielding targets was 50% in the proximal and distal spaces. Results showed that changing the distribution of the reward-yielding targets or introducing the social context modified concurrently the amplitude of self-generated manual reaching actions and the representation of peripersonal space. Moreover, a decrease of the amplitude of manual reaching actions caused a reduction of peripersonal space when resulting from the distribution of reward-yielding targets, while this effect was not observed in a social interaction context. 
In that case, the decreased amplitude of manual reaching actions was accompanied by an increase of peripersonal space representation, which was not due to the mere presence of a confederate (control experiment). We conclude that reward-dependent modulation of object values in the environment modifies the representation of peripersonal space, whether resulting from self-generated motor actions or from observation of motor actions performed by a confederate.

  17. Inter-cohort growth for three tropical resources: tilapia, octopus and lobster.

    PubMed

    Velázquez-Abunader, Iván; Gómez-Muñoz, Victor Manuel; Salas, Silvia; Ruiz-Velazco, Javier M J

    2015-09-01

    Growth parameters are an important component for the stock assessment of exploited aquatic species. However, it is often difficult to apply direct methods to estimate growth and to analyse the differences between males and females, particularly in tropical areas. The objective of this study was to analyse the inter-cohort growth of three tropical resources and discuss the possible fisheries management implications. A simple method was used to compare individual growth curves obtained from length frequency distribution analysis, illustrated by case studies of three tropical species from different aquatic environments: tilapia (Oreochromis aureus), red octopus (Octopus maya) and the Caribbean spiny lobster (Panulirus argus). The analysis undertaken compared the size distribution of males and females of a given cohort through modal progression analysis. The technique used proved to be useful for highlighting the differences in growth between females and males of a specific cohort. The potential effect of extrinsic and intrinsic factors on the organism's development as reflected in the size distribution of the cohorts is discussed.
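
Inter-cohort growth comparisons of this kind are commonly expressed through a fitted growth function; as a hedged illustration, the sketch below evaluates the widely used von Bertalanffy growth function for hypothetical female and male parameter sets (all values invented, not from this study).

```python
import math

def von_bertalanffy(t: float, L_inf: float, K: float, t0: float) -> float:
    """Von Bertalanffy growth function: expected length at age t,
    with asymptotic length L_inf, growth rate K, and theoretical age t0."""
    return L_inf * (1.0 - math.exp(-K * (t - t0)))

# Hypothetical female/male parameter sets for one cohort (values invented)
females = dict(L_inf=42.0, K=0.35, t0=-0.2)
males = dict(L_inf=38.0, K=0.40, t0=-0.2)

# Female-minus-male length difference across ages, the kind of sex-specific
# divergence a modal progression analysis would reveal
ages = [0.5, 1, 2, 3, 4]
gap = [von_bertalanffy(t, **females) - von_bertalanffy(t, **males) for t in ages]
```

With parameters like these, the sex difference in length widens with age as the curves approach their different asymptotes, which is why comparing cohorts at a single age can be misleading.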

  18. Characterization of branch complexity by fractal analyses

    USGS Publications Warehouse

    Alados, C.L.; Escos, J.; Emlen, J.M.; Freeman, D.C.

    1999-01-01

    The comparison between complexity in the sense of space occupancy (box-counting fractal dimension D(c) and information dimension D1) and heterogeneity in the sense of space distribution (average evenness index f and evenness variation coefficient J(cv)) were investigated in mathematical fractal objects and natural branch structures. In general, increased fractal dimension was paired with low heterogeneity. Comparisons between branch architecture in Anthyllis cytisoides under different slope exposure and grazing impact revealed that branches were more complex and more homogeneously distributed for plants on northern exposures than southern, while grazing had no impact during a wet year. Developmental instability was also investigated by the statistical noise of the allometric relation between internode length and node order. In conclusion, our study demonstrated that fractal dimension of branch structure can be used to analyze the structural organization of plants, especially if we consider not only fractal dimension but also shoot distribution within the canopy (lacunarity). These indexes together with developmental instability analyses are good indicators of growth responses to the environment.
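
The box-counting fractal dimension D(c) referred to above can be estimated by counting occupied boxes at several scales and fitting the slope of log N(s) against log(1/s). The sketch below is a minimal generic version of that procedure, not the authors' implementation.

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the box-counting dimension of a 2-D point set: count the
    boxes of side s occupied at each scale, then fit the least-squares
    slope of log N(s) versus log(1/s)."""
    logs = []
    for s in sizes:
        occupied = {(math.floor(x / s), math.floor(y / s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(occupied))))
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    slope = sum((x - mx) * (y - my) for x, y in logs) / sum((x - mx) ** 2 for x, _ in logs)
    return slope

# A filled unit square sampled on a fine grid should give a dimension near 2
pts = [(i / 100.0, j / 100.0) for i in range(100) for j in range(100)]
d = box_counting_dimension(pts, sizes=[0.5, 0.25, 0.125, 0.0625])
```

For a branch silhouette the estimated slope falls between 1 and 2, which is what makes it usable as a compactness measure of space occupancy.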

  19. Characterization of branch complexity by fractal analyses and detection of plant functional adaptations

    USGS Publications Warehouse

    Alados, C.L.; Escos, J.; Emlen, J.M.; Freeman, D.C.

    1999-01-01

    The comparison between complexity in the sense of space occupancy (box-counting fractal dimension Dc and information dimension DI) and heterogeneity in the sense of space distribution (average evenness index J̄ and evenness variation coefficient JCV) were investigated in mathematical fractal objects and natural branch structures. In general, increased fractal dimension was paired with low heterogeneity. Comparisons between branch architecture in Anthyllis cytisoides under different slope exposure and grazing impact revealed that branches were more complex and more homogeneously distributed for plants on northern exposures than southern, while grazing had no impact during a wet year. Developmental instability was also investigated by the statistical noise of the allometric relation between internode length and node order. In conclusion, our study demonstrated that fractal dimension of branch structure can be used to analyze the structural organization of plants, especially if we consider not only fractal dimension but also shoot distribution within the canopy (lacunarity). These indexes together with developmental instability analyses are good indicators of growth responses to the environment.

  20. Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) Technology Infrastructure for a Distributed Data Network

    PubMed Central

    Schilling, Lisa M.; Kwan, Bethany M.; Drolshagen, Charles T.; Hosokawa, Patrick W.; Brandt, Elias; Pace, Wilson D.; Uhrich, Christopher; Kamerick, Michael; Bunting, Aidan; Payne, Philip R.O.; Stephens, William E.; George, Joseph M.; Vance, Mark; Giacomini, Kelli; Braddy, Jason; Green, Mika K.; Kahn, Michael G.

    2013-01-01

    Introduction: Distributed Data Networks (DDNs) offer infrastructure solutions for sharing electronic health data from across disparate data sources to support comparative effectiveness research. Data sharing mechanisms must address technical and governance concerns stemming from network security and data disclosure laws and best practices, such as HIPAA. Methods: The Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) deploys TRIAD grid technology, a common data model, detailed technical documentation, and custom software for data harmonization to facilitate data sharing in collaboration with stakeholders in the care of safety net populations. Data sharing partners host TRIAD grid nodes containing harmonized clinical data within their internal or hosted network environments. Authorized users can use a central web-based query system to request analytic data sets. Discussion: SAFTINet DDN infrastructure achieved a number of data sharing objectives, including scalable and sustainable systems for ensuring harmonized data structures and terminologies and secure distributed queries. Initial implementation challenges were resolved through iterative discussions, development and implementation of technical documentation, governance, and technology solutions. PMID:25848567

  1. Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) Technology Infrastructure for a Distributed Data Network.

    PubMed

    Schilling, Lisa M; Kwan, Bethany M; Drolshagen, Charles T; Hosokawa, Patrick W; Brandt, Elias; Pace, Wilson D; Uhrich, Christopher; Kamerick, Michael; Bunting, Aidan; Payne, Philip R O; Stephens, William E; George, Joseph M; Vance, Mark; Giacomini, Kelli; Braddy, Jason; Green, Mika K; Kahn, Michael G

    2013-01-01

    Distributed Data Networks (DDNs) offer infrastructure solutions for sharing electronic health data from across disparate data sources to support comparative effectiveness research. Data sharing mechanisms must address technical and governance concerns stemming from network security and data disclosure laws and best practices, such as HIPAA. The Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) deploys TRIAD grid technology, a common data model, detailed technical documentation, and custom software for data harmonization to facilitate data sharing in collaboration with stakeholders in the care of safety net populations. Data sharing partners host TRIAD grid nodes containing harmonized clinical data within their internal or hosted network environments. Authorized users can use a central web-based query system to request analytic data sets. SAFTINet DDN infrastructure achieved a number of data sharing objectives, including scalable and sustainable systems for ensuring harmonized data structures and terminologies and secure distributed queries. Initial implementation challenges were resolved through iterative discussions, development and implementation of technical documentation, governance, and technology solutions.
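
A distributed query of the kind described, where each partner node evaluates the request against its own harmonized data and discloses only aggregate results, can be sketched as below. Node names and the query shape are hypothetical; SAFTINet's actual TRIAD grid machinery, governance, and security layers are far richer than this.

```python
# Sketch of a federated aggregate query: the central broker sends the same
# query to every data-sharing partner, and each node returns only a count,
# never patient-level rows (illustrative, not the SAFTINet API).
def run_node_query(local_records, predicate):
    """Evaluate the query locally; disclose only an aggregate count."""
    return sum(1 for rec in local_records if predicate(rec))

nodes = {
    "clinic_a": [{"age": 34, "dx": "diabetes"}, {"age": 51, "dx": "asthma"}],
    "clinic_b": [{"age": 62, "dx": "diabetes"}, {"age": 45, "dx": "diabetes"}],
}

def query(rec):
    return rec["dx"] == "diabetes"

per_node = {name: run_node_query(records, query) for name, records in nodes.items()}
total = sum(per_node.values())
```

Keeping row-level data behind each node and shipping only counts (or de-identified analytic sets through a governed channel) is what lets such networks satisfy disclosure rules like HIPAA while still supporting cross-site research.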

  2. Objectivity in a Noisy Photonic Environment through Quantum State Information Broadcasting

    NASA Astrophysics Data System (ADS)

    Korbicz, J. K.; Horodecki, P.; Horodecki, R.

    2014-03-01

    Recently, the emergence of classical objectivity as a property of a quantum state has been explicitly derived for a small object embedded in a photonic environment in terms of a spectrum broadcast form—a specific classically correlated state, redundantly encoding information about the preferred states of the object in the environment. However, the environment was in a pure state and the fundamental problem was how generic and robust is the conclusion. Here, we prove that despite the initial environmental noise, the emergence of the broadcast structure still holds, leading to the perceived objectivity of the state of the object. We also show how this leads to a quantum Darwinism-type condition, reflecting the classicality of proliferated information in terms of a limit behavior of the mutual information. Quite surprisingly, we find "singular points" of the decoherence, which can be used to faithfully broadcast a specific classical message through the noisy environment.
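
For reference, the spectrum broadcast form discussed in this abstract is a classically correlated state of the system S and the observed environment fragments E_1, ..., E_N with the structure

```latex
\rho_{S:E_1:\dots:E_N} \;=\; \sum_i p_i \,
  |x_i\rangle\langle x_i| \otimes \rho_i^{E_1} \otimes \dots \otimes \rho_i^{E_N},
\qquad \rho_i^{E_k}\,\rho_{i'}^{E_k} = 0 \ \text{for}\ i \neq i',
```

where |x_i⟩ are the preferred (pointer) states of the object. The orthogonality condition means the states ρ_i^{E_k} are perfectly distinguishable, so each fragment redundantly encodes which pointer state occurred, and independent observers reading different fragments agree on it; this is the sense in which the state is perceived as objective.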

  3. Virtual Workshop Environment (VWE): A Taxonomy and Service Oriented Architecture (SOA) Framework for Modularized Virtual Learning Environments (VLE)--Applying the Learning Object Concept to the VLE

    ERIC Educational Resources Information Center

    Paulsson, Fredrik; Naeve, Ambjorn

    2006-01-01

    Based on existing Learning Object taxonomies, this article suggests an alternative Learning Object taxonomy, combined with a general Service Oriented Architecture (SOA) framework, aiming to transfer the modularized concept of Learning Objects to modularized Virtual Learning Environments. The taxonomy and SOA-framework exposes a need for a clearer…

  4. NGC 3503 and its molecular environment

    NASA Astrophysics Data System (ADS)

    Duronea, N. U.; Vasquez, J.; Cappa, C. E.; Corti, M.; Arnal, E. M.

    2012-01-01

    Aims: We present a study of the molecular gas and interstellar dust distribution in the environs of the H II region NGC 3503 associated with the open cluster Pis 17, with the aim of investigating the spatial distribution of the molecular gas linked to the nebula and achieving a better understanding of the interaction of the nebula and Pis 17 with their molecular environment. Methods: We based our study on ¹²CO(1-0) observations of a region ~0.6° in size obtained with the 4-m NANTEN telescope, unpublished radio continuum data at 4800 and 8640 MHz obtained with the ATCA telescope, radio continuum data at 843 MHz obtained from SUMSS, and available IRAS, MSX, IRAC-GLIMPSE, and MIPSGAL images. Results: We found a molecular cloud (Component 1) having a mean velocity of -24.7 km s⁻¹, compatible with the velocity of the ionized gas, which is associated with the nebula and its surroundings. Adopting a distance of 2.9 ± 0.4 kpc, the total molecular mass yields (7.6 ± 2.1) × 10³ M⊙ and the density yields 400 ± 240 cm⁻³. The radio continuum data confirm the existence of an electron density gradient in NGC 3503. The IR emission shows a PDR bordering the higher density regions of the nebula. The spatial distribution of the CO emission shows that the nebula coincides with a molecular clump, and the strongest CO emission peak is located close to the higher electron density region. The more negative velocities of the molecular gas (about -27 km s⁻¹) are coincident with NGC 3503. Candidate young stellar objects (YSOs) were detected toward the H II region, suggesting that embedded star formation may be occurring in the neighborhood of the nebula. The clear electron density gradient, along with the spatial distribution of the molecular gas and PAHs in the region, indicates that NGC 3503 is a blister-type H II region that has probably undergone a champagne phase.

   5. Spectral and Structure Modeling of Low and High Mass Young Stars Using a Radiative Transfer Code

    NASA Astrophysics Data System (ADS)

    Robson Rocha, Will; Pilling, Sergio

    Spectroscopic data from space telescopes (ISO, Spitzer, Herschel) show that in addition to dust grains (e.g. silicates), frozen molecular species (astrophysical ices, such as H₂O, CO, CO₂, CH₃OH) are also present in circumstellar environments. In this work we present a study of the modeling of low- and high-mass young stellar objects (YSOs), in which we highlight the importance of using astrophysical ices processed by radiation (UV, cosmic rays) coming from stars in the formation process. This is important to characterize the physicochemical evolution of the ices distributed through the protostellar disk and its envelope in some situations. To perform this analysis, we gathered (i) observational data from the Infrared Space Observatory (ISO) related to the low-mass protostar Elias29 and the high-mass protostar W33A, (ii) experimental absorbance data in the infrared spectral range used to determine the optical constants of the materials observed around these objects, and (iii) a powerful radiative transfer code to simulate the astrophysical environment (RADMC-3D, Dullemond et al. 2012). Briefly, the radiative transfer calculation of the YSOs was done employing the RADMC-3D code. The model outputs were the spectral energy distribution and theoretical images at different wavelengths of the studied objects. The functionality of this code is based on the Monte Carlo method combined with Mie theory for the interaction between radiation and matter. The observational data from different space telescopes were used as reference for comparison with the modeled data. The optical constants in the infrared, used as input in the models, were calculated directly from absorbance data obtained in the laboratory for both unprocessed and processed simulated interstellar samples by using the NKABS code (Rocha & Pilling 2014).
We show from this study that some absorption bands in the infrared observed in the spectra of Elias29 and W33A can arise after the ices around the protostars are processed by the radiation coming from the central object. In addition, we were able to compare the observational data for these two objects with those obtained in the modeling. The authors would like to thank the agency FAPESP (JP #2009/18304-0 and PhD #2013/07657-5).

  6. Adaptive multi-sensor biomimetics for unsupervised submarine hunt (AMBUSH): Early results

    NASA Astrophysics Data System (ADS)

    Blouin, Stéphane

    2014-10-01

    Underwater surveillance is inherently difficult because acoustic wave propagation and transmission are limited and unpredictable when targets and sensors move around in the communication-opaque undersea environment. Today's Navy underwater sensors enable the collection of a massive amount of data, often analyzed offline. The Navy of tomorrow will dominate by making sense of that data in real time. DRDC's AMBUSH project proposes a new undersea-surveillance network paradigm that will enable such real-time operation. Nature abounds with examples of collaborative tasks taking place despite limited communication and computational capabilities. This publication describes a year's worth of research efforts finding inspiration in Nature's collaborative tasks, such as wolves hunting in packs. The project proposes the utilization of a heterogeneous network combining both static and mobile network nodes. The military objective is to enable an unsupervised surveillance capability while maximizing target localization performance and endurance. The scientific objective is to develop the necessary technology to acoustically and passively localize a noise source of interest in shallow waters. The project fulfills these objectives via distributed computing and adaptation to changing undersea conditions. Specific research interests discussed here relate to approaches for performing: (a) network self-discovery, (b) network connectivity self-assessment, (c) opportunistic network routing, (d) distributed data aggregation, and (e) simulation of underwater acoustic propagation. We present early results, followed by a discussion of future work.

  7. POSIX and Object Distributed Storage Systems Performance Comparison Studies With Real-Life Scenarios in an Experimental Data Taking Context Leveraging OpenStack Swift & Ceph

    NASA Astrophysics Data System (ADS)

    Poat, M. D.; Lauret, J.; Betts, W.

    2015-12-01

    The STAR online computing infrastructure has become an intensive dynamic system used for first-hand data collection and analysis, resulting in a dense collection of data output. As we have transitioned to our current state, inefficient, limited storage systems have become an impediment to fast feedback to online shift crews. A centrally accessible, scalable and redundant distributed storage system had become a necessity in this environment. OpenStack Swift Object Storage and Ceph Object Storage are two promising technologies, as community use and development have led to success elsewhere. In this contribution, OpenStack Swift and Ceph have been put to the test with single and parallel I/O tests, emulating real-world scenarios for data processing and workflows. Ceph's file system storage, offering a POSIX-compliant file system mounted similarly to an NFS share, was of particular interest as it aligned with our requirements and was retained as our solution. I/O performance tests run against the Ceph POSIX file system presented surprising results, indicating true potential for fast I/O and reliability. The STAR online compute farm has historically been used for job submission and first-hand data analysis; reusing it to also maintain a storage cluster will be an efficient use of the current infrastructure.
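
Single-stream I/O tests of the kind described can be sketched with a generic timing probe like the one below, run against any mounted file system path (for instance a CephFS mount point). This is an illustrative sketch, not the authors' benchmark suite; a real evaluation would add parallel streams, read tests, and larger working sets.

```python
import os
import tempfile
import time

def sequential_write_throughput(path: str, total_mb: int = 8, chunk_mb: int = 1) -> float:
    """Time a sequential chunked write (flushed and fsynced so the data
    actually reaches the storage layer) and return throughput in MB/s."""
    chunk = b"\0" * (chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # do not let the page cache hide the real cost
    elapsed = time.perf_counter() - start
    return total_mb / elapsed

# Point the probe at the file system under test, e.g. a CephFS mount
with tempfile.TemporaryDirectory() as d:
    mb_per_s = sequential_write_throughput(os.path.join(d, "probe.bin"))
```

Without the fsync, such probes largely measure the client's page cache rather than the distributed storage backend, which is one common source of misleadingly fast results.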

  8. Method of synthesis of abstract images with high self-similarity

    NASA Astrophysics Data System (ADS)

    Matveev, Nikolay V.; Shcheglov, Sergey A.; Romanova, Galina E.; Koneva, Tatiana A.

    2017-06-01

    Abstract images with high self-similarity can be used for drug-free stress therapy. This is based on the fact that a complex visual environment has a high affective appraisal. To create such an image we can use a setup based on three laser sources of small power and different colors (red, green, blue); the image is the pattern resulting from reflection and refraction by a complicated-form object placed into the laser ray paths. Images obtained experimentally in this way showed a good therapeutic effect. However, finding and choosing an object which gives the needed image structure is very difficult and requires many trials. The goal of this work is to develop a method and a procedure for finding the object form which, if placed into the ray paths, can provide the necessary structure of the image. In effect, the task means obtaining the necessary irradiance distribution on a given surface. Traditionally such problems are solved using non-imaging optics methods. In the given case this task is complicated by the intricate structure of the illuminance distribution and its high non-linearity. An alternative way is to use the projected image of a mask with a given structure. We consider both ways and discuss how they can help to speed up the synthesis procedure for a given abstract image of high self-similarity for drug-free therapy setups.

  9. Regional Differences in Tropical Lightning Distributions.

    NASA Astrophysics Data System (ADS)

    Boccippio, Dennis J.; Goodman, Steven J.; Heckman, Stan

    2000-12-01

    Observations from the National Aeronautics and Space Administration Optical Transient Detector (OTD) and Tropical Rainfall Measuring Mission (TRMM)-based Lightning Imaging Sensor (LIS) are analyzed for variability between land and ocean, various geographic regions, and different (objectively defined) convective `regimes.' The bulk of the order-of-magnitude differences between land and ocean regional flash rates are accounted for by differences in storm spacing (density) and/or frequency of occurrence, rather than differences in storm instantaneous flash rates, which only vary by a factor of 2 on average. Regional variability in cell density and cell flash rates closely tracks differences in 85-GHz microwave brightness temperatures. Monotonic relationships are found with the gross moist stability of the tropical atmosphere, a large-scale `adjusted state' parameter. This result strongly suggests that it will be possible, using TRMM observations, to objectively test numerical or theoretical predictions of how mesoscale convective organization interacts with the larger-scale environment. Further parameters are suggested for a complete objective definition of tropical convective regimes.

  10. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  11. Could Blobs Fuel Storage-Based Convergence between HPC and Big Data?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matri, Pierre; Alforov, Yevhen; Brandon, Alvaro

    The rapidly growing data sets processed on HPC platforms raise major challenges for the underlying storage layer. A promising alternative to POSIX-I/O-compliant file systems are simpler blobs (binary large objects), or object storage systems. Such systems offer lower overhead and better performance at the cost of largely unused features such as file hierarchies or permissions. Similarly, blobs are increasingly considered for replacing distributed file systems for big data analytics or as a base for storage abstractions such as key-value stores or time-series databases. This growing interest in such object storage on HPC and big data platforms raises the question: Are blobs the right level of abstraction to enable storage-based convergence between HPC and Big Data? In this paper we study the impact of blob-based storage for real-world applications on HPC and cloud environments. The results show that blob-based storage convergence is possible, leading to a significant performance improvement on both platforms.
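
    The trade-off described above, a flat key space without file hierarchies or permissions, can be illustrated with a minimal in-memory object store. The interface below is a generic sketch; it is not the API of Swift, Ceph, or any system evaluated in the paper.

    ```python
    class BlobStore:
        """Minimal flat-namespace object store: no directories, no
        permissions, just opaque string keys mapped to byte blobs."""

        def __init__(self):
            self._objects = {}

        def put(self, key: str, data: bytes) -> None:
            self._objects[key] = bytes(data)

        def get(self, key: str) -> bytes:
            return self._objects[key]

        def delete(self, key: str) -> None:
            del self._objects[key]

        def list_prefix(self, prefix: str):
            # Any "hierarchy" is only a naming convention on flat keys.
            return sorted(k for k in self._objects if k.startswith(prefix))

    store = BlobStore()
    store.put("sim/run1/output.h5", b"...")
    store.put("sim/run1/log.txt", b"...")
    print(store.list_prefix("sim/run1/"))
    ```

    Because lookups never traverse a directory tree or check permission bits, each operation is a single key access, which is the source of the lower overhead the paper attributes to blob systems.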

  12. Electric fish as natural models for technical sensor systems

    NASA Astrophysics Data System (ADS)

    von der Emde, Gerhard; Bousack, Herbert; Huck, Christina; Mayekar, Kavita; Pabst, Michael; Zhang, Yi

    2009-05-01

    Instead of vision, many animals use alternative senses for object detection. Weakly electric fish employ "active electrolocation", during which they discharge an electric organ emitting electrical current pulses (electric organ discharges, EOD). Local EODs are sensed by electroreceptors in the fish's skin, which respond to changes of the signal caused by nearby objects. Fish can gain information about attributes of an object, such as size, shape, distance, and complex impedance. When close to the fish, each object projects an 'electric image' onto the fish's skin. In order to get information about an object, the fish has to analyze the object's electric image by sampling its voltage distribution with the electroreceptors. We now know a great deal about the mechanisms the fish use to gain information about objects in their environment. Inspired by the remarkable capabilities of weakly electric fish in detecting and recognizing objects with their electric sense, we are designing technical sensor systems that can solve similar sensing problems. We applied the principles of active electrolocation to devices that produce electrical current pulses in water and simultaneously sense local current densities. Depending on the specific task, sensors can be designed which detect an object, localize it in space, determine its distance, and measure certain object properties such as material properties, thickness, or material faults. We present first experiments and FEM simulations on the optimal sensor arrangement with respect to the sensor requirements, e.g., localization of objects or distance measurements. Different methods of sensor read-out and signal processing are compared.

  13. Object Detection Applied to Indoor Environments for Mobile Robot Navigation.

    PubMed

    Hernández, Alejandra Carolina; Gómez, Clara; Crespo, Jonathan; Barber, Ramón

    2016-07-28

    To move around the environment, human beings depend on sight more than on their other senses, because it provides information about the size, shape, color and position of an object. The increasing interest in building autonomous mobile systems makes the detection and recognition of objects in indoor environments a very important and challenging task. In this work, a vision system to detect objects in typical human environments, able to work on a real mobile robot, is developed. In the proposed system, the classification method used is the Support Vector Machine (SVM), and RGB and depth images are used as input. Different segmentation techniques have been applied to each kind of object. Similarly, two alternatives for extracting object features are explored, based on geometric shape descriptors and on a bag of words. The experimental results have demonstrated the usefulness of the system for the detection and localization of objects in indoor environments. Furthermore, by comparing the two proposed feature extraction methods, it has been determined which alternative offers better performance. The final results have been obtained for the proposed problem in an unaltered environment; that is, the environment was not modified to perform the tests.
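
    As an illustration of the geometric-shape-descriptor alternative, a few simple descriptors can be computed from a binary object mask and stacked into a feature vector for an SVM. The descriptors below (area, aspect ratio, extent) are common choices assumed for illustration, not necessarily the ones used in the paper.

    ```python
    def shape_descriptors(mask):
        """Compute simple geometric descriptors from a binary mask
        given as a list of rows of 0/1 values."""
        pixels = [(r, c) for r, row in enumerate(mask)
                  for c, v in enumerate(row) if v]
        area = len(pixels)
        if area == 0:
            return {"area": 0, "aspect_ratio": 0.0, "extent": 0.0}
        rows = [r for r, _ in pixels]
        cols = [c for _, c in pixels]
        h = max(rows) - min(rows) + 1
        w = max(cols) - min(cols) + 1
        return {
            "area": area,             # object size in pixels
            "aspect_ratio": w / h,    # bounding-box shape
            "extent": area / (w * h), # fill ratio of the bounding box
        }

    square = [[1, 1],
              [1, 1]]
    print(shape_descriptors(square))  # area 4, aspect_ratio 1.0, extent 1.0
    ```

    In a full pipeline, each segmented region would yield such a vector, and the SVM would be trained on labeled examples of these vectors per object class.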

  14. Object Detection Applied to Indoor Environments for Mobile Robot Navigation

    PubMed Central

    Hernández, Alejandra Carolina; Gómez, Clara; Crespo, Jonathan; Barber, Ramón

    2016-01-01

    To move around the environment, human beings depend on sight more than on their other senses, because it provides information about the size, shape, color and position of an object. The increasing interest in building autonomous mobile systems makes the detection and recognition of objects in indoor environments a very important and challenging task. In this work, a vision system to detect objects in typical human environments, able to work on a real mobile robot, is developed. In the proposed system, the classification method used is the Support Vector Machine (SVM), and RGB and depth images are used as input. Different segmentation techniques have been applied to each kind of object. Similarly, two alternatives for extracting object features are explored, based on geometric shape descriptors and on a bag of words. The experimental results have demonstrated the usefulness of the system for the detection and localization of objects in indoor environments. Furthermore, by comparing the two proposed feature extraction methods, it has been determined which alternative offers better performance. The final results have been obtained for the proposed problem in an unaltered environment; that is, the environment was not modified to perform the tests. PMID:27483264

  15. Evaluating lightning hazards to building environments using explicit numerical solutions of Maxwell's equations

    NASA Astrophysics Data System (ADS)

    Collier, Richard S.; McKenna, Paul M.; Perala, Rodney A.

    1991-08-01

    The objective here is to describe the lightning hazards to buildings and their internal environments using advanced formulations of Maxwell's Equations. The method described is the Three Dimensional Finite Difference Time Domain Solution. It can be used to solve for the lightning interaction with such structures in three dimensions with the inclusion of a considerable amount of detail. Special techniques were developed for including wire, plumbing, and rebar into the model. Some buildings have provisions for lightning protection in the form of air terminals connected to a ground counterpoise system. It is shown that fields and currents within these structures can be significantly high during a lightning strike. Time lapse video presentations were made showing the electric and magnetic field distributions on selected cross sections of the buildings during a simulated lightning strike.
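
    The Finite Difference Time Domain method referenced above can be illustrated in one dimension. The sketch below is a toy free-space version (normalized units, Courant number 0.5, soft Gaussian source) showing only the leapfrog update scheme; it does not represent the authors' three-dimensional solver or its wire, plumbing, and rebar models.

    ```python
    import math

    def fdtd_1d(n_cells=200, n_steps=300, src_pos=100):
        """Minimal 1-D FDTD (Yee scheme) in free space: E and H are
        updated alternately from each other's spatial differences."""
        ez = [0.0] * n_cells   # electric field
        hy = [0.0] * n_cells   # magnetic field
        for t in range(n_steps):
            # Update H from the curl of E (Courant factor 0.5).
            for k in range(n_cells - 1):
                hy[k] += 0.5 * (ez[k + 1] - ez[k])
            # Update E from the curl of H.
            for k in range(1, n_cells):
                ez[k] += 0.5 * (hy[k] - hy[k - 1])
            # Soft Gaussian current source injected at one cell.
            ez[src_pos] += math.exp(-((t - 30) ** 2) / 100.0)
        return ez

    ez = fdtd_1d()
    print("peak |Ez| on the grid:", max(abs(v) for v in ez))
    ```

    The three-dimensional solver follows the same pattern with six field components on a staggered Yee grid, plus material and boundary models for the building structures.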

  16. Evaluating lightning hazards to building environments using explicit numerical solutions of Maxwell's equations

    NASA Technical Reports Server (NTRS)

    Collier, Richard S.; Mckenna, Paul M.; Perala, Rodney A.

    1991-01-01

    The objective here is to describe the lightning hazards to buildings and their internal environments using advanced formulations of Maxwell's Equations. The method described is the Three Dimensional Finite Difference Time Domain Solution. It can be used to solve for the lightning interaction with such structures in three dimensions with the inclusion of a considerable amount of detail. Special techniques were developed for including wire, plumbing, and rebar into the model. Some buildings have provisions for lightning protection in the form of air terminals connected to a ground counterpoise system. It is shown that fields and currents within these structures can be significantly high during a lightning strike. Time lapse video presentations were made showing the electric and magnetic field distributions on selected cross sections of the buildings during a simulated lightning strike.

  17. Stable polyurethane coatings for electronic circuits. NASA tech briefs, fall 1982, volume 7, no. 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    One of the most severe deficiencies of polyurethanes as engineering materials for electrical applications has been their sensitivity to combined humidity and temperature environments. Gross failure by reversion of urethane connector potting materials has occurred under these conditions, resulting both in scrapping of expensive hardware and, in other instances, in reduced reliability. A basic objective of this study has been to gain a more complete understanding of the mechanisms and interactions of moisture in urethane systems, to guide the development of reversion-resistant materials for connector potting and conformal coating applications in high-humidity environments. Basic polymer studies of molecular weight and distribution, polymer structure, and functionality were carried out to define those areas responsible for hydrolytic instability and to define polymer structural features conducive to optimum hydrolytic stability.

  18. Existence and predictors of soft drink advertisements in Pennsylvania high schools.

    PubMed

    Probart, Claudia; McDonnell, Elaine; Bailey-Davis, Lisa; Weirich, J Elaine

    2006-12-01

    The objective of this study was to describe the extent and locations of soft drink advertisements on high school campuses in Pennsylvania and identify factors related to the extent of these advertisements. Surveys were distributed to 271 school foodservice directors in a random sample of high schools in Pennsylvania. These high schools were selected to be representative of the entire population of high schools in Pennsylvania based on chosen demographic characteristics. A three-phase survey strategy was used, involving distribution of a postcard reminder 1 to 2 weeks after the initial survey distribution, and mailing of a second survey to nonrespondents 1 to 2 weeks after mailing of the postcard. Two hundred twenty-eight school foodservice directors (84%) returned surveys. Linear multiple regression analyses were done using SPSS (version 11.5.1, 2002, SPSS Inc, Chicago, IL). Approximately two thirds (66.5%) of respondents indicated soft drink advertisements exist in at least one location in their school, with the most prevalent locations being on vending machines (62%) and school grounds, such as playing fields (27%). Slightly more than 10% of respondents indicated soft drink advertisements were displayed in the cafeteria. Extent of soft drink advertisement locations was positively related to existence of a pouring-rights contract, subscription to Channel One, and receipt of incentives from soft drink bottlers based on sales, but negatively related to average daily participation in school lunch. These findings suggest that commercialization and sales incentives might interact to contribute to school environments that are not "nutrition-friendly." Schools' efforts to establish wellness policies as mandated by the Child Nutrition and WIC Reauthorization Act of 2004 provide ideal opportunities to examine school environments for advertising that might conflict with the healthful environments they are aiming to establish, and perhaps to develop policies to address these practices.

  19. Characterizing Spatial and Temporal Patterns of Thermal Environment and Air Quality in Taipei Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Juang, J. Y.; Sun, C. H.; Jiang, J. A.; Wen, T. H.

    2017-12-01

    The urban heat island (UHI) effect, driven by regional-to-global environmental changes, dramatic urbanization, and shifts in land-use composition, has become an important environmental issue in recent years. Over the past century, the urban area of the Taipei Basin has increased tenfold. The strengthening UHI effect significantly increases the frequency of warm nights and strongly influences the thermal environment of residents of the Greater Taipei Metropolitan Area. In addition, urban expansion driven by dramatic increases in urban population and traffic loading significantly degrades air quality and causes health issues in Taipei. The main objective of this study is to quantify and characterize the temporal and spatial distributions of the thermal environment and air quality in the Greater Taipei Metropolitan Area using monitoring data from the Central Weather Bureau and the Environmental Protection Administration. In addition, we analyze the distribution of physiological equivalent temperature at the micro scale in the metropolitan area, using observation data and quantitative simulation to investigate how the thermal environment responds under different conditions. Furthermore, we establish a real-time mobile monitoring system based on a wireless sensor network to investigate the correlation between the thermal environment, air quality, and other environmental factors, and we propose to develop an early warning system for heat stress and air quality in the metropolitan area. The results of this study can be integrated into management and planning systems and provide important background information for the future development of a smart city in the metropolitan area.

  20. Different facets of schizophrenia illustrated by the analysis of the homes of three patients diagnosed with schizophrenia.

    PubMed

    Murawiec, Sławomir; Britmann, Jonathan; Krysta, Krzysztof

    2013-09-01

    Diagnosis and observation of patients' behaviour during outpatient visits or hospitalisations strips the diagnostic process of the opportunity to consider their places of residence as their natural environment. In this way, patients present their symptoms and problems outside of the context of their daily life. Community-based psychiatric care, on the other hand, provides a chance to include, in the diagnostic process, the environment created by a patient in their home. This image of a patient's external reality can reflect a certain mental reality. Such elements as furniture and other objects, and their number, quality, and distribution, may reflect the inner mental world of the objects featuring in a person's mind. In some cases, this can become a valuable contribution to a diagnostic process. A description of three patients, all treated for schizophrenia, is presented in this paper in order to explore this possible relationship. The first individual, "Patient N", lives in a flat in a state of extreme depletion of elements. "Patient N" suffers from chronic schizophrenia with severe negative symptoms. The second individual, "Patient D", has also been diagnosed with schizophrenia. Yet his home is filled with a huge number of elements, writings on the wall, things, figurines and objects of symbolic meaning. A closer examination of his psychopathological symptoms (fantastic, colourful, bizarre content), the history of his illness (unstable diagnosis of schizophrenia), and his unpredictable response to antipsychotics may indicate a dissociative type of schizophrenia. Finally, "Patient K's" main living space is dominated by the cats that live with him. Patient K was exposed to physical violence as a child, and to him cats represent safe, non-threatening objects. He has also been treated for paranoid schizophrenia. The differences between these patients' personal histories and the courses of their illnesses are clearly manifested in the way they create their immediate environment.

  1. A Cooperative Model for IS Security Risk Management in Distributed Environment

    PubMed Central

    Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively. PMID:24563626
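
    The role of a Bayesian network in dynamically predicting a security risk level can be sketched with a toy two-node model evaluated by direct enumeration. The variables and probabilities below are purely illustrative assumptions, not values from the paper.

    ```python
    # Toy two-node security BN: a partner organization's alert status
    # influences our own risk level. P(Alert) and P(Risk | Alert) are
    # given as conditional probability tables.
    p_alert = {True: 0.2, False: 0.8}
    p_risk_given_alert = {
        (True, "high"): 0.7, (True, "low"): 0.3,
        (False, "high"): 0.1, (False, "low"): 0.9,
    }

    def posterior_alert(risk_obs):
        """P(Alert | Risk = risk_obs) by direct enumeration (Bayes' rule)."""
        joint = {a: p_alert[a] * p_risk_given_alert[(a, risk_obs)]
                 for a in (True, False)}
        z = sum(joint.values())  # normalizing constant P(Risk = risk_obs)
        return {a: p / z for a, p in joint.items()}

    print(posterior_alert("high"))
    ```

    In the cooperative setting the paper describes, such posteriors computed in one organization's BN would be passed as evidence into the allied organizations' networks; real BN tooling replaces the enumeration with efficient inference over many nodes.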

  2. Extremely Low Mass: The Circumstellar Envelope of a Potential Proto-Brown Dwarf

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer

    2011-01-01

    What is the environment for planet formation around extremely low mass stars? Is the environment around brown dwarfs and extremely low mass stars conducive and sufficiently massive for planet production? The determining conditions may be set very early in the process of the host object's formation. IRAS 16253-2429, the source of the Wasp-Waist Nebula seen in Spitzer IRAC images, is an isolated, very low luminosity ("VeLLO") Class 0 protostar in the nearby rho Ophiuchi cloud. We present VLA ammonia mapping observations of the dense gas envelope feeding the central core accreting system. We find a flattened envelope perpendicular to the outflow axis, and gas cavities that appear to cradle the outflow lobes as though carved out by the flow and associated (apparently precessing) jet, indicating environmental disruption. Based on the NH3 (1,1) and (2,2) emission distribution, we derive the mass, velocity fields and temperature distribution for the envelope. We discuss the combined evidence for this source to be one of the youngest and lowest mass sources in formation yet known, and discuss the ramifications for planet formation potential in this extremely low mass system.

  3. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  4. ROME (Request Object Management Environment)

    NASA Astrophysics Data System (ADS)

    Kong, M.; Good, J. C.; Berriman, G. B.

    2005-12-01

    Most current astronomical archive services are based on an HTML/CGI architecture where users submit HTML forms via a browser and CGI programs operating under a web server process the requests. Most services return an HTML result page with URL links to the result files or, for longer jobs, return a message indicating that email will be sent when the job is done. This paradigm has a few serious shortcomings. First, it is all too common for something to go wrong and for the user to never hear about the job again. Second, for long and complicated jobs there is often important intermediate information that would allow the user to adjust the processing. Finally, unless some sort of custom queueing mechanism is used, background jobs are started immediately upon receiving the CGI request. When there are many such requests the server machine can easily be overloaded and either slow to a crawl or crash. Request Object Management Environment (ROME) is a collection of middleware components being developed under the National Virtual Observatory Project to provide mechanisms for managing long jobs such as computationally intensive statistical analysis requests or the generation of large-scale mosaic images. Written as EJB objects within the open-source JBoss application server, ROME receives processing requests via a servlet interface, stores them in a DBMS using JDBC, distributes the processing (via queueing mechanisms) across multiple machines and environments (including Grid resources), manages real-time messages from the processing modules, and ensures proper user notification. The request processing modules are identical in structure to standard CGI programs -- though they can optionally implement status messaging -- and can be written in any language. ROME will persist these jobs across failures of processing modules, network outages, and even downtime of ROME and the DBMS, restarting them as necessary.

  5. The Moon: Biogenic elements

    NASA Technical Reports Server (NTRS)

    Gibson, Everett K., Jr.; Chang, Sherwood

    1992-01-01

    The specific objectives of the organic chemical exploration of the Moon involve the search for molecules of possible biological or prebiological origin. Detailed knowledge of the amount, distribution, and exact structure of organic compounds present on the Moon is extremely important to our understanding of the origin and history of the Moon and to its relationship to the history of the Earth and solar system. Specifically, such knowledge is essential for determining whether life on the Moon exists, ever did exist, or could develop. In the absence of life or organic matter, it is still essential to determine the abundance, distribution, and origin of the biogenic elements (e.g., H, C, O, N, S, P) in order to understand how the planetary environment may have influenced the course of chemical evolution. The history and scope of this effort is presented.

  6. Using AHP to analyze and ascertain the priority protective order of endangered plants in East Alashan-West Erdos

    NASA Astrophysics Data System (ADS)

    Tao, Zhang; Wei, Wang; Wang, Gary Z.; Luo, Hongyan; Liang, Cunzhu; Liu, Jiahui; An, Huijun; Pei, Hao; Zhong, Huidong; Chen, Xiaojun

    2006-08-01

    AHP is a very effective systematic analytical method, widely applied to energy utilization and resource analysis, talent prediction, economic management projects, urban industrial planning, communications and transportation, water resource use, and so on; applying it to ecological problems is also highly practical and valid. Taking 15 kinds of endangered plants in East Alashan-West Erdos as the research objects, this paper adopts 3S techniques and field investigation to determine the geographical distributions of the plants, their density within the distribution area, plant community structure, and plant community environment. Experts were then invited to score these data, and the AHP method was used to process the results, thereby obtaining the priority protective order of endangered plants in East Alashan-West Erdos.
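
    The core AHP computation, deriving priority weights as the principal eigenvector of a reciprocal pairwise comparison matrix, can be sketched as follows. The 3x3 matrix shown is an illustrative example on Saaty's 1-9 scale and does not reproduce the experts' actual criteria or judgments.

    ```python
    def ahp_weights(matrix, iters=50):
        """Priority weights of a pairwise comparison matrix via power
        iteration toward the principal eigenvector (standard AHP)."""
        n = len(matrix)
        w = [1.0 / n] * n
        for _ in range(iters):
            # One matrix-vector multiplication, then renormalize.
            w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w)
            w = [x / s for x in w]
        return w

    # Illustrative reciprocal comparison matrix for three criteria.
    m = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    print([round(x, 3) for x in ahp_weights(m)])
    ```

    In a full AHP application, a consistency ratio would also be computed from the principal eigenvalue to check that the expert judgments are not too contradictory before the weights are used for ranking.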

  7. Recent GRC Aerospace Technologies Applicable to Terrestrial Energy Systems

    NASA Technical Reports Server (NTRS)

    Kankam, David; Lyons, Valerie J.; Hoberecht, Mark A.; Tacina, Robert R.; Hepp, Aloysius F.

    2000-01-01

    This paper is an overview of a wide range of recent aerospace technologies under development at the NASA Glenn Research Center, in collaboration with other NASA centers, government agencies, industry and academia. The focused areas are space solar power, advanced power management and distribution systems, Stirling cycle conversion systems, fuel cells, advanced thin film photovoltaics and batteries, and combustion technologies. The aerospace-related objectives of the technologies are generation of space power, development of cost-effective and reliable, high performance power systems, cryogenic applications, energy storage, and reduction in gas-turbine emissions, with attendant clean jet engines. The terrestrial energy applications of the technologies include augmentation of bulk power in ground power distribution systems, and generation of residential, commercial and remote power, as well as promotion of pollution-free environment via reduction in combustion emissions.

  8. Health-Enabled Smart Sensor Fusion Technology

    NASA Technical Reports Server (NTRS)

    Wang, Ray

    2012-01-01

    A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), to provide reliable, high-confidence measurements for a variety of propulsion test articles. The object of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to succeed in providing health condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery reading, computing resource monitoring, and sensor data reading. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distributions, as well as providing appropriate communications protocols to enable complex interactions to support timely and high-quality flow of information among the system elements.
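
    One standard way to fuse readings of the same quantity from multiple sensors is inverse-variance weighting. The paper does not specify its fusion algorithm, so the sketch below is a generic illustration with made-up sensor values.

    ```python
    def fuse(measurements):
        """Inverse-variance weighted fusion of independent readings.
        Each item is (value, variance); lower-variance (more trusted)
        sensors receive proportionally more weight."""
        weights = [1.0 / var for _, var in measurements]
        total = sum(weights)
        value = sum(w * v for (v, _), w in zip(measurements, weights)) / total
        variance = 1.0 / total  # fused estimate is tighter than any input
        return value, variance

    # Three transceivers reporting the same quantity with different noise.
    fused_value, fused_var = fuse([(10.2, 0.5), (9.8, 0.5), (10.0, 0.1)])
    print(fused_value, fused_var)
    ```

    The fused variance is always smaller than the best single sensor's variance, which is the sense in which fusing multiple transceivers yields a "more accurate estimation of the environment" than any one reading.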

  9. Development of water allocation Model Based on ET-Control and Its Application in Haihe River Basin

    NASA Astrophysics Data System (ADS)

    You, Jinjun; Gan, Hong; Gan, Zhiguo; Wang, Lin

    2010-05-01

    Traditionally, water allocation distributes water to different regions and sectors without sufficient consideration of the amount of water actually consumed after distribution. Water allocation based on ET (evapotranspiration) control changes this idea and emphasizes the absolute amount of evapotranspiration in a specific area. Under this approach, the ET involved in water allocation includes not only water consumed by the sectors but also natural ET. The allocation therefore consists of two steps: first, a reasonable ET quota is estimated for each region; water is then allocated to more detailed regions and various sectors in accordance with the quota and the operational rules. To produce a sound ET distribution and water allocation across regions, this paper puts forward a framework in which two models analyze scenarios with predefined economic growth and ecological objectives. The first model determines a rational ET objective through multi-objective analysis, seeking a compromise between economic growth and ecological maintenance, with food security and environmental protection taken as constraints in the optimization. The second model provides hydraulic simulation and water balance to allocate the ET objective to the corresponding regions under operational rules. The two models are combined into an integrated ET-control water allocation model. Scenario analysis with this model can reveal the relations between economy and ecology and suggest measures to control water use under changing socio-economic growth and ecological objectives. To validate the methodology, the Haihe River Basin is taken as a case study. Rational water allocation is an important part of decision making in water planning and management in the Haihe River Basin, where water scarcity and a deteriorating environment make competition for water dramatic, and the division of water between economy and ecology is a central concern. Given the water scarcity in the basin, ET quotas are taken as the objective for water allocation among the provinces so as to meet the requirement for water inflow into the Bohai Sea. Scenario analysis provides the resulting water evaporation from the natural water cycle and from artificial use, and a trade-off curve based on the fulfilment of ecological and economic objectives in different scenarios reveals the competitive relation between human activities and nature.

  10. Peace Operations in Mali: Theory into Practice Then Measuring Effectiveness

    DTIC Science & Technology

    2017-06-09

    community’s response along two broad lines of effort (LOE): Creating a Safe and Secure Environment and promoting Stable Governance. When seeking to achieve a Safe and Secure Environment, two objectives were measured. Objective #1 sought the Cessation of Large Scale Violence. Success was attained, as...

  11. Congestion Avoidance Testbed Experiments. Volume 2

    NASA Technical Reports Server (NTRS)

    Denny, Barbara A.; Lee, Diane S.; McKenney, Paul E., Sr.; Lee, Danny

    1994-01-01

    DARTnet provides an excellent environment for executing networking experiments. Since the network is private and spans the continental United States, it gives researchers a great opportunity to test network behavior under controlled conditions. However, this opportunity is not available very often, and a support environment for such testing is therefore lacking. To help remedy this situation, part of SRI's effort in this project was devoted to advancing the state of the art in techniques for benchmarking network performance. The second objective of SRI's effort was to advance networking technology in the area of traffic control, and to test our ideas on DARTnet using the benchmarking tools we developed. Networks are becoming more common and are being used by more and more people, and applications such as multimedia conferencing and distributed simulation are placing greater demands on the resources networks provide. Hence, new traffic-control mechanisms must be created to enable networks to serve the needs of their users. SRI's objective, therefore, was to investigate a new queueing and scheduling approach that helps meet the needs of a large, diverse user population in a "fair" way.
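The abstract does not specify SRI's queueing discipline, but the notion of serving a diverse user population "fairly" can be illustrated with a classic fair scheduler. The sketch below is a generic deficit round-robin (DRR) scheduler, offered only as an assumed stand-in for the kind of mechanism discussed.

```python
from collections import deque

# Hedged sketch: deficit round-robin (DRR) fair queueing. Each flow accrues a
# fixed quantum of "credit" per round and may transmit packets only while its
# accumulated deficit covers the packet size, so flows with large packets
# cannot starve flows with small ones.

def drr_schedule(flows, quantum=300):
    """flows: dict name -> list of packet sizes (bytes). Return service order."""
    queues = {name: deque(sizes) for name, sizes in flows.items()}
    deficit = {name: 0 for name in flows}
    served = []
    while any(queues.values()):
        for name, q in queues.items():
            if not q:
                continue
            deficit[name] += quantum
            while q and q[0] <= deficit[name]:
                size = q.popleft()
                deficit[name] -= size
                served.append((name, size))
            if not q:
                deficit[name] = 0  # idle flows do not hoard credit
    return served

order = drr_schedule({"A": [200, 200, 200], "B": [600]})
```

With a 300-byte quantum, flow A's small packets and flow B's single large packet each receive roughly equal service per round, rather than B monopolizing the link.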

  12. The Arecibo Galaxy Environment Survey - VI. The Virgo cluster (II)

    NASA Astrophysics Data System (ADS)

    Taylor, R.; Davies, J. I.; Auld, R.; Minchin, R. F.; Smith, R.

    2013-01-01

    We present 21-cm observations of a 5 × 1 deg2 region in the Virgo cluster, obtained as part of the Arecibo Galaxy Environment Survey. 13 cluster members are detected, together with 36 objects in the background. We compare and contrast the results from this area with a larger 10 × 2 deg2 region. We combine the two data sets to produce an H i mass function, which shows a higher detection rate at low masses (but finds fewer massive galaxies) than less sensitive wider area surveys, such as ALFALFA. We find that the H i-detected galaxies are distributed differently to the non-detections, both spatially and in velocity, providing further evidence that the cluster is still assembling. We use the Tully-Fisher relation to examine the possibility of morphological evolution. We find that highly deficient galaxies, as well as some early-type galaxies, have much lower velocity widths than the Tully-Fisher relation predicts, indicating gas loss via ram-pressure stripping. We also find that H i detections without optical counterparts do not fit the predictions of the baryonic Tully-Fisher relation, implying that they are not primordial objects.

  13. Metric Scale Calculation for Visual Mapping Algorithms

    NASA Astrophysics Data System (ADS)

    Hanel, A.; Mitschke, A.; Boerner, R.; Van Opdenbosch, D.; Hoegner, L.; Brodie, D.; Stilla, U.

    2018-05-01

    Visual SLAM algorithms allow localizing the camera by mapping its environment as a point cloud based on visual cues. To obtain the camera locations in a metric coordinate system, the metric scale of the point cloud has to be known. This contribution describes a method to calculate the metric scale for a point cloud of an indoor environment, like a parking garage, by fusing multiple individual scale values. The individual scale values are calculated from structures and objects with a priori known metric extensions, which can be identified in the unscaled point cloud. Extensions of building structures, like the driving lane or the room height, are derived from density peaks in the point distribution. The extensions of objects, like traffic signs with a known metric size, are derived using projections of their detections in images onto the point cloud. The method is tested with synthetic image sequences of a drive with a front-looking mono camera through a virtual 3D model of a parking garage. It is shown that each individual scale value either improves the robustness of the fused scale value or reduces its error. The error of the fused scale is comparable to other recent works.
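The abstract does not state the fusion rule used to combine the individual scale values, so the following sketch assumes inverse-variance weighting, one common choice for fusing independent estimates; the numbers are illustrative.

```python
# Hedged sketch: fusing several individual metric-scale estimates by
# inverse-variance weighting (an assumed fusion rule, not necessarily the
# paper's). Each estimate carries a variance expressing its uncertainty.

def fuse_scales(estimates):
    """estimates: list of (scale_value, variance); return the fused scale."""
    weights = [1.0 / var for _, var in estimates]
    return sum(w * s for (s, _), w in zip(estimates, weights)) / sum(weights)

# e.g. scale estimates from lane width, room height, and a detected sign,
# where the sign-based estimate is the most certain and dominates the fusion
fused = fuse_scales([(1.02, 0.04), (0.98, 0.04), (1.00, 0.01)])
```

A low-variance estimate pulls the fused value toward itself, which matches the reported behavior that each added scale value either tightens the result or improves its robustness.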

  14. A method of extracting ontology module using concept relations for sharing knowledge in mobile cloud computing environment.

    PubMed

    Lee, Keonsoo; Rho, Seungmin; Lee, Seok-Won

    2014-01-01

    In a mobile cloud computing environment, the cooperation of distributed computing objects is one of the most important requirements for providing successful cloud services. To satisfy this requirement, all the members employed in the cooperation group need to share knowledge for mutual understanding. Even if ontology is the right tool for this goal, there are several issues in building a suitable ontology. As the cost and complexity of managing knowledge increase with its scale, reducing the size of the ontology is one of the critical issues. In this paper, we propose a method of extracting an ontology module to increase the utility of knowledge. For a given signature, this method extracts the ontology module, which is semantically self-contained to fulfill the needs of the service, by considering the syntactic structure and semantic relations of concepts. By employing this module instead of the original ontology, the cooperation of computing objects can be performed with less computing load and complexity. In particular, when multiple external ontologies need to be combined for more complex services, this method can be used to optimize the size of the shared knowledge.
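The core idea of signature-based module extraction can be sketched as collecting every concept reachable from the signature through declared relations. This is a simplification: the paper's method additionally weighs syntactic structure and the types of semantic relations, which the toy graph traversal below does not model.

```python
from collections import deque

# Hedged sketch: extract a self-contained ontology module as the set of
# concepts reachable from a signature through concept relations (a plain
# graph closure, standing in for the paper's richer extraction criteria).

def extract_module(relations, signature):
    """relations: dict concept -> set of related concepts."""
    module, frontier = set(signature), deque(signature)
    while frontier:
        concept = frontier.popleft()
        for related in relations.get(concept, ()):
            if related not in module:
                module.add(related)
                frontier.append(related)
    return module

ontology = {
    "CloudService": {"ComputeNode", "Storage"},
    "ComputeNode": {"CPU"},
    "Storage": set(),
    "Billing": {"Invoice"},  # unrelated branch: excluded from the module
}
module = extract_module(ontology, {"CloudService"})
```

Only the concepts needed to interpret the signature end up in the module; the unrelated `Billing` branch is left out, which is the size reduction the method targets.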

  15. NEXUS - Resilient Intelligent Middleware

    NASA Astrophysics Data System (ADS)

    Kaveh, N.; Hercock, R. Ghanea

    Service-oriented computing, a composition of distributed-object computing, component-based, and Web-based concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components makes them a suitable computing model in the pervasive domain.

  16. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    PubMed Central

    Lemaitre, D.; Sauquet, D.; Fofol, I.; Tanguy, L.; Jean, F. C.; Degoulet, P.

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the emergence of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy that has been developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists of separating the interface from the data storage and application control and of using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described. PMID:8563252

  17. Distributed GPU Computing in GIScience

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing amount of data from different domains, such as social media, earth observation and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative microprocessor, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. 
Visualization and Computer Graphics, IEEE Transactions on, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.

  18. Visual Depth from Motion Parallax and Eye Pursuit

    PubMed Central

    Stroyan, Keith; Nawrot, Mark

    2012-01-01

    A translating observer viewing a rigid environment experiences “motion parallax,” the relative movement upon the observer’s retina of variously positioned objects in the scene. This retinal movement of images provides a cue to the relative depth of objects in the environment; however, retinal motion alone cannot mathematically determine the relative depth of the objects. Visual perception of depth from lateral observer translation uses both retinal image motion and eye movement. In (Nawrot & Stroyan, 2009, Vision Res. 49, p.1969) we showed mathematically that the ratio of the rate of retinal motion over the rate of smooth eye pursuit determines depth relative to the fixation point in central vision. We also reported on psychophysical experiments indicating that this ratio is the important quantity for perception. Here we analyze the motion/pursuit cue for the more general, and more complicated, case when objects are distributed across the horizontal viewing plane beyond central vision. We show how the mathematical motion/pursuit cue varies with different points across the plane and with time as an observer translates. If the time-varying retinal motion and smooth eye pursuit are the only signals used for this visual process, it is important to know what is mathematically possible to derive about depth and structure. Our analysis shows that the motion/pursuit ratio determines an excellent description of depth and structure in these broader stimulus conditions, provides a detailed quantitative hypothesis of these visual processes for the perception of depth and structure from motion parallax, and provides a computational foundation to analyze the dynamic geometry of future experiments. PMID:21695531
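The central quantity is the motion/pursuit ratio: retinal motion rate divided by smooth-pursuit rate. The sketch below is a hedged numerical illustration only; the linear form used for the depth estimate is a first-order assumption for small ratios, not the exact law derived in the paper.

```python
# Hedged sketch: computing the motion/pursuit ratio and a first-order
# illustration of how relative depth scales with it. The linear relation
# d = f * ratio is an assumed small-ratio approximation, not the paper's
# exact formula.

def motion_pursuit_ratio(retinal_motion_rate, pursuit_rate):
    """Ratio of retinal image motion rate to smooth pursuit rate (both deg/s)."""
    return retinal_motion_rate / pursuit_rate

def relative_depth_estimate(ratio, fixation_distance):
    """First-order illustration: depth relative to fixation grows with the
    motion/pursuit ratio (assumed linear scaling for small ratios)."""
    return fixation_distance * ratio

r = motion_pursuit_ratio(0.5, 5.0)   # 0.1
d = relative_depth_estimate(r, 2.0)  # ~0.2 m beyond fixation at 2 m
```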

  19. Polar bear research in the Beaufort Sea

    USGS Publications Warehouse

    Amstrup, Steven C.; Durner, George M.; Wiig, Øystein; Born, Erik W.; Garner, Gerald W.; Wiig, Øystein; Born, Erik W.; Garner, Gerald W.

    1995-01-01

    Current research is designed to determine the status of the polar bear population in the Beaufort Sea and adjacent areas. One goal is to determine how polar bears are distributed relative to each other and to habitat features, and to define population boundaries. Another goal is to determine the population size and trend, and to assess how present and future management issues may affect the trend. Specific objectives of the research include the need to: Determine the movements of individuals comprising the polar bear population that uses the Beaufort Sea. Determine how movements vary by season and by year, and whether they can be modeled so as to allow meaningful census and mitigation efforts. Improve estimates of the size of the Beaufort Sea population relative to the capacity of the environment to sustain it. Determine factors regulating the rate of recruitment of new bears into the population. Determine the distribution of polar bear dens in northern Alaska and whether denning habitat may be a limiting factor on reproductive success. Determine the timing of den entrance and emergence. Determine the relative success rates (and thus the reproductive significance) of dens in various locations. This report summarizes the progress towards those objectives that has been made since the last meeting of the PBSG in 1988.

  20. BLUE STRAGGLER EVOLUTION CAUGHT IN THE ACT IN THE LARGE MAGELLANIC CLOUD GLOBULAR CLUSTER HODGE 11

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Chengyuan; De Grijs, Richard; Liu Xiangkun

    High-resolution Hubble Space Telescope imaging observations show that the radial distribution of the field-decontaminated sample of 162 'blue straggler' stars (BSs) in the 11.7{sup +0.2}{sub -0.1} Gyr old Large Magellanic Cloud cluster Hodge 11 exhibits a clear bimodality. In combination with their distinct loci in color-magnitude space, this offers new evidence in support of theoretical expectations that suggest different BS formation channels as a function of stellar density. In the cluster's color-magnitude diagram, the BSs in the inner 15'' (roughly corresponding to the cluster's core radius) are located more closely to the theoretical sequence resulting from stellar collisions, while those in the periphery (at radii between 85'' and 100'') are preferentially found in the region expected to contain objects formed through binary mass transfer or coalescence. In addition, the objects' distribution in color-magnitude space provides us with the rare opportunity in an extragalactic environment to quantify the evolution of the cluster's collisionally induced BS population and the likely period that has elapsed since their formation epoch, which we estimate to have occurred {approx}4-5 Gyr ago.

  1. e-Phys: a suite of intracellular neurophysiology programs integrating COM (component object model) technologies.

    PubMed

    Nguyen, Quoc-Thang; Miledi, Ricardo

    2003-09-30

    Current computer programs for intracellular recordings often lack advanced data management, are usually incompatible with other applications and are also difficult to adapt to new experiments. We have addressed these shortcomings in e-Phys, a suite of electrophysiology applications for intracellular recordings. The programs in e-Phys use Component Object Model (COM) technologies available in the Microsoft Windows operating system to provide enhanced data storage, increased interoperability between e-Phys and other COM-aware applications, and easy customization of data acquisition and analysis thanks to a script-based integrated programming environment. Data files are extensible, hierarchically organized and integrated in the Windows shell by using the Structured Storage technology. Data transfers to and from other programs are facilitated by implementing the ActiveX Automation standard and distributed COM (DCOM). ActiveX Scripting allows experimenters to write their own event-driven acquisition and analysis programs in the VBScript language from within e-Phys. Scripts can reuse components available from other programs on other machines to create distributed meta-applications. This paper describes the main features of e-Phys and how this package was used to determine the effect of the atypical antipsychotic drug clozapine on synaptic transmission at the neuromuscular junction.

  2. Maximizing Information Diffusion in the Cyber-physical Integrated Network †

    PubMed Central

    Lu, Hongliang; Lv, Shaohe; Jiao, Xianlong; Wang, Xiaodong; Liu, Juan

    2015-01-01

    Nowadays, our living environment has been embedded with smart objects, such as smart sensors, smart watches and smart phones. They make cyberspace and physical space integrated by their abundant abilities of sensing, communication and computation, forming a cyber-physical integrated network. In order to maximize information diffusion in such a network, a group of objects are selected as the forwarding points. To optimize the selection, a minimum connected dominating set (CDS) strategy is adopted. However, existing approaches focus on minimizing the size of the CDS, neglecting an important factor: the weight of links. In this paper, we propose a distributed maximizing the probability of information diffusion (DMPID) algorithm in the cyber-physical integrated network. Unlike previous approaches that only consider the size of CDS selection, DMPID also considers the information spread probability that depends on the weight of links. To weaken the effects of excessively-weighted links, we also present an optimization strategy that can properly balance the two factors. The results of extensive simulation show that DMPID can nearly double the information diffusion probability, while keeping a reasonable size of selection with low overhead in different distributed networks. PMID:26569254
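The key departure from minimum-CDS selection is weighting candidate forwarders by link reliability. The sketch below is not the authors' DMPID algorithm; it is a hedged greedy illustration of the underlying idea, where a node's chance of receiving the information from at least one selected forwarder is one minus the product of the per-link miss probabilities.

```python
# Hedged sketch (not DMPID itself): greedy forwarder selection that accounts
# for link weights. Each undirected link carries a delivery probability p;
# a node hears the message with probability 1 - prod(1 - p) over its links
# to the chosen forwarders.

def reach_probability(node, forwarders, links):
    miss = 1.0
    for f in forwarders:
        p = links.get((f, node), links.get((node, f), 0.0))
        miss *= 1.0 - p
    return 1.0 - miss

def greedy_forwarders(nodes, links, k):
    """Pick k forwarders, each maximizing total reach probability so far."""
    chosen = set()
    for _ in range(k):
        def gain(c):
            return sum(reach_probability(n, chosen | {c}, links)
                       for n in nodes if n != c)
        best = max((n for n in nodes if n not in chosen), key=gain)
        chosen.add(best)
    return chosen

links = {("a", "b"): 0.9, ("a", "c"): 0.8, ("b", "c"): 0.1,
         ("b", "d"): 0.2, ("c", "d"): 0.7}
picked = greedy_forwarders(["a", "b", "c", "d"], links, 1)
```

With one forwarder allowed, the greedy pass selects node `a`, whose two strong links (0.9 and 0.8) give the largest expected coverage, illustrating why link weight matters beyond the bare size of the selection.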

  3. Feasibility of an integrated X-ray instrument for Mars exobiology and geology. [Abstract only

    NASA Technical Reports Server (NTRS)

    Fonda, M. L.; Schwartz, D. E.; Koppel, L. N.; Franco, E. D.; Kerner, J. A.

    1994-01-01

    By employing an integrated X-ray instrument on a future Mars mission, data obtained will greatly augment those returned by Viking; details relevant to the possibility of the origin and evolution of life on Mars will be acquired. An integrated combined X Ray Fluorescence/X Ray Detection (XRF/XRD) instrument has been breadboarded and demonstrated to accommodate important exobiology and geology experiment objectives outlined for Mars Environmental Survey (MESUR) and future Mars missions. Among others, primary objectives for the exploration of Mars include: the intense study of local areas on Mars to 'establish the chemical, mineralogical, and petrological character of different components of the surface material; to determine the distribution, abundance and sources and sinks of volatile materials, including an assessment of the biologic potential, now and during past epochs; and to establish the global chemical and physical characteristics of the Martian surface'. The XRF/XRD breadboard instrument identifies and quantifies soil surface elemental, mineralogical, and petrological characteristics and acquires data necessary to address questions on volatile abundance and distribution. Additionally, the breadboard is able to characterize the biogenic element constituents of soil samples providing information on the biologic potential of the Mars environment.

  4. Long Term Land Cover and Seagrass Mapping using Landsat and Object-based Image Analysis from 1972 - 2010 in the Coastal Environment of South East Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Lyons, M. B.; Phinn, S. R.; Roelfsema, C. M.

    2011-12-01

    Long term global archives of high-moderate spatial resolution, multi-spectral satellite imagery are now readily accessible, but are not being fully utilised by management agencies due to the lack of appropriate methods to consistently produce accurate and timely management ready information. This work developed an object-based approach to map land cover and seagrass distribution in an Australian coastal environment for a 38 year Landsat image time-series archive. Landsat Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM+) imagery were used without in-situ field data input to produce land and seagrass cover maps every year data was available, resulting in over 60 individual map products over the 38 year archive. Land cover was mapped annually and included several vegetation, bare ground, urban and agricultural classes. Seagrass distribution was also mapped annually, and in some years monthly, via horizontal projective foliage cover classes, sand and deepwater. Land cover products were validated using aerial photography and seagrass was validated with field survey data, producing several measures of accuracy. An average overall accuracy of 65% and 81% was reported for seagrass and land cover respectively, which is consistent with other studies in the area. This study is the first to show moderate spatial resolution, long term annual changes in land cover and seagrass in an Australian environment, without the use of in-situ data; and only one of a few similar studies globally. The land cover products identify several long term trends; such as significant increases in South East Queensland's urban density, vegetation clearing in rural and rural-residential areas, and inter-annual variation in dry vegetation types in western South East Queensland. 
The seagrass cover products show that there has been a minimal overall change in seagrass extent, but that seagrass cover level distribution is extremely dynamic; evidenced by large scale migrations of higher seagrass cover levels and several events of sudden, significant changes in cover level. These mapping products will allow management agencies to build a baseline assessment of their resources, understand past changes and help inform implementation and planning of management policy to address potential future changes.

  5. Long term land cover and seagrass mapping using Landsat and object-based image analysis from 1972 to 2010 in the coastal environment of South East Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Lyons, Mitchell B.; Phinn, Stuart R.; Roelfsema, Chris M.

    2012-07-01

    Long term global archives of high-moderate spatial resolution, multi-spectral satellite imagery are now readily accessible, but are not being fully utilised by management agencies due to the lack of appropriate methods to consistently produce accurate and timely management ready information. This work developed an object-based remote sensing approach to map land cover and seagrass distribution in an Australian coastal environment for a 38 year Landsat image time-series archive (1972-2010). Landsat Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Enhanced Thematic Mapper (ETM+) imagery were used without in situ field data input (but still using field knowledge) to produce land and seagrass cover maps every year data were available, resulting in over 60 map products over the 38 year archive. Land cover was mapped annually using vegetation, bare ground, urban and agricultural classes. Seagrass distribution was also mapped annually, and in some years monthly, via horizontal projected foliage cover classes, sand and deep water. Land cover products were validated using aerial photography and seagrass maps were validated with field survey data, producing several measures of accuracy. An average overall accuracy of 65% and 80% was reported for seagrass and land cover products respectively, which is consistent with other studies in the area. This study is the first to show moderate spatial resolution, long term annual changes in land cover and seagrass in an Australian environment, created without the use of in situ data; and only one of a few similar studies globally. The land cover products identify several long term trends; such as significant increases in South East Queensland's urban density and extent, vegetation clearing in rural and rural-residential areas, and inter-annual variation in dry vegetation types in western South East Queensland. 
The seagrass cover products show that there has been a minimal overall change in seagrass extent, but that seagrass cover level distribution is extremely dynamic; evidenced by large scale migrations of higher seagrass cover levels and several sudden and significant changes in cover level. These mapping products will allow management agencies to build a baseline assessment of their resources, understand past changes and help inform implementation and planning of management policy to address potential future changes.

  6. Pharmaceutical logistics in the European theater.

    PubMed

    Spain, J

    1999-10-01

    This article describes the responsibilities and objectives of the pharmacy officer for the U.S. Army Medical Materiel Center, Europe. Pharmacists' experiences and knowledge offer advantages in the ordering, storage, and distribution of medical materiel. Exploitation of new technology and a customer-focused attitude encourage a working environment that capitalizes on pharmaceutical expertise. The use of temperature monitors, enhanced automation opportunities, expired drug return credits, and other customer-focused initiatives exemplify pharmacists' value to military medical logistics organizations. An overview of the pharmaceutical pipeline to U.S. military and State Department customers in the European theater is provided.

  7. Advanced Group Support Systems and Facilities

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1999-01-01

    The document contains the proceedings of the Workshop on Advanced Group Support Systems and Facilities held at NASA Langley Research Center, Hampton, Virginia, July 19-20, 1999. The workshop was jointly sponsored by the University of Virginia Center for Advanced Computational Technology and NASA. Workshop attendees came from NASA, other government agencies, industry, and universities. The objectives of the workshop were to assess the status of advanced group support systems and to identify the potential of these systems for use in future collaborative distributed design and synthesis environments. The presentations covered the current status and effectiveness of different group support systems.

  8. Autonomous mobile robot teams

    NASA Technical Reports Server (NTRS)

    Agah, Arvin; Bekey, George A.

    1994-01-01

    This paper describes autonomous mobile robot teams performing tasks in unstructured environments. The behavior and the intelligence of the group is distributed, and the system does not include a central command base or leader. The novel concept of the Tropism-Based Cognitive Architecture is introduced, which is used by the robots in order to produce behavior transforming their sensory information to proper action. The results of a number of simulation experiments are presented. These experiments include worlds where the robot teams must locate, decompose, and gather objects, and defend themselves against hostile predators, while navigating around stationary and mobile obstacles.

  9. OAST space power technology program

    NASA Technical Reports Server (NTRS)

    Mullin, J. P.

    1978-01-01

    The current research and technology (R and T) base program is first described, then special attention is directed toward outlining a new system technology specifically oriented toward providing the utility power plant technology base for semi-permanent earth orbital facilities expected to be needed in the middle to late 1980's. The R and T program involves five areas of research: (1) photovoltaic energy conversion; (2) chemical energy conversion and storage; (3) thermal-to-electric conversion; (4) environment interactions; and (5) power systems management and distribution. The general objectives and planned direction of efforts in each of these areas is summarized.

  10. Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Alruwaili, Manal

    With growing technology, the number of processors is becoming massive; current supercomputer-class processing will be available on desktops in the next decade. For mass-scale application software development on the massively parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massively parallel computing and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10 and UPC++, exploit distributed computing, data-parallel computing and thread parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) extensions for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; and 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object migration and object cloning; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work on different elements of distributed data concurrently using remote method invocations. I present the new constructs, their grammar and their behavior. The new constructs are explained using simple programs that utilize them.
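The place-based object distribution described above can be illustrated with a toy simulation. The thesis targets C++ extensions in a PGAS model; the Python classes and function names below are purely hypothetical stand-ins for the proposed constructs.

```python
import copy

# Hedged sketch: a toy, single-process simulation of PGAS "places" holding
# local objects, with object migration and cloning between places. This
# illustrates the concepts only; real PGAS runtimes span address spaces.

class Place:
    """One PGAS place (address-space partition) holding its local objects."""
    def __init__(self, pid):
        self.pid = pid
        self.objects = {}

def migrate(obj_name, src, dst):
    """Move an object between places, e.g. for load balancing."""
    dst.objects[obj_name] = src.objects.pop(obj_name)

def clone(obj_name, src, dst):
    """Copy an object to another place, leaving the original in place."""
    dst.objects[obj_name] = copy.deepcopy(src.objects[obj_name])

p0, p1 = Place(0), Place(1)
p0.objects["grid"] = [1, 2, 3]
clone("grid", p0, p1)    # both places now hold an independent copy
migrate("grid", p0, p1)  # the original moves; p0 no longer holds "grid"
```

In the MIDD paradigm sketched in the abstract, copies like these would be invoked concurrently on different elements of the distributed data via remote method invocations.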

  11. High-spatial resolution multispectral and panchromatic satellite imagery for mapping perennial desert plants

    NASA Astrophysics Data System (ADS)

    Alsharrah, Saad A.; Bruce, David A.; Bouabid, Rachid; Somenahalli, Sekhar; Corcoran, Paul A.

    2015-10-01

    The use of remote sensing techniques to extract vegetation cover information for the assessment and monitoring of land degradation in arid environments has gained increased interest in recent years. However, such a task can be challenging, especially for medium-spatial resolution satellite sensors, due to soil background effects and the distribution and structure of perennial desert vegetation. In this study, we utilised Pleiades high-spatial resolution multispectral (2 m) and panchromatic (0.5 m) imagery and focused on mapping small shrubs and low-lying trees using three classification techniques: 1) vegetation index (VI) threshold analysis, 2) pre-built object-oriented image analysis (OBIA), and 3) a newly developed vegetation shadow model (VSM). We evaluated the success of each approach using a root of the sum of the squares (RSS) metric, which incorporated field data as control and three error metrics relating to commission, omission, and percent cover. Results showed that the optimum VI performers returned good vegetation cover estimates at certain thresholds, but failed to accurately map the distribution of the desert plants. Using the pre-built IMAGINE Objective OBIA approach, we improved the vegetation distribution mapping accuracy, but this came at the cost of over-classification, similar to the results of lowering VI thresholds. We further introduced the VSM, which takes shadow into account to further refine the vegetation cover classification derived from VIs. The results showed significant improvements in vegetation cover and distribution accuracy compared to the other techniques. We argue that the VSM approach using high-spatial resolution imagery provides a more accurate representation of desert landscape vegetation and should be considered in assessments of desertification.
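The evaluation metric combines the three error components into a single score. A minimal sketch, assuming the straightforward root-sum-of-squares form the name suggests (the paper's exact normalization of each component is not stated in the abstract):

```python
import math

# Hedged sketch: combining commission error, omission error, and percent-cover
# error into a single root-sum-of-squares (RSS) score; lower is better, and a
# classifier must do well on all three axes to score well.

def rss(commission, omission, cover_error):
    """Root of the sum of the squares of the three error components."""
    return math.sqrt(commission**2 + omission**2 + cover_error**2)

score = rss(3.0, 4.0, 12.0)  # 13.0
```

Because the components are squared, one large error dominates the score, which penalizes the over-classification reported for the OBIA and low-threshold VI approaches even when their other errors are small.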

  12. Common object request broker architecture (CORBA)-based security services for the virtual radiology environment.

    PubMed

    Martinez, R; Cole, C; Rozenblit, J; Cook, J F; Chacko, A K

    2000-05-01

    The US Army Great Plains Regional Medical Command (GPRMC) has a requirement to conform to Department of Defense (DoD) and Army security policies for the Virtual Radiology Environment (VRE) Project. Within the DoD, security policy is defined as the set of laws, rules, and practices that regulate how an organization manages, protects, and distributes sensitive information. Security policy in the DoD is described by the Trusted Computer System Evaluation Criteria (TCSEC), Army Regulation (AR) 380-19, Defense Information Infrastructure Common Operating Environment (DII COE), the Military Health Services System Automated Information Systems Security Policy Manual, and National Computer Security Center-TG-005, "Trusted Network Interpretation." These documents were used to develop a security policy that defines information protection requirements with respect to the laws, rules, and practices required to protect the information stored and processed in the VRE Project. The goal of the security policy is to provide a C2 level of information protection while also satisfying the functional needs of the GPRMC's user community. This report summarizes the security policy for the VRE and defines the CORBA security services that satisfy the policy. In the VRE, the information to be protected is embedded in three major information components: (1) Patient information consists of Digital Imaging and Communications in Medicine (DICOM)-formatted fields. The patient information resides in the digital imaging network picture archiving and communication system (DIN-PACS) networks in the database archive systems and includes (a) patient demographics; (b) patient images from x-ray, computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound (US); and (c) prior patient images and related patient history. (2) Meta-Manager information to be protected consists of several data objects. 
This information is distributed to the Meta-Manager nodes and includes (a) radiologist schedules; (b) modality worklists; (c) routed case information; (d) DIN-PACS and Composite Health Care system (CHCS) messages, and Meta-Manager administrative and security information; and (e) patient case information. (3) Access control and communications security is required in the VRE to control who uses the VRE and Meta-Manager facilities and to secure the messages between VRE components. The CORBA Security Service Specification version 1.5 is designed to allow up to TCSEC's B2-level security for distributed objects. The CORBA Security Service Specification defines the functionality of several security features: identification and authentication, authorization and access control, security auditing, communication security, nonrepudiation, and security administration. This report describes the enhanced security features for the VRE and their implementation using commercial CORBA Security Service software products.
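The authorization and access-control feature listed among the CORBA security services can be illustrated with a generic role-based check; the roles and rights below are illustrative assumptions, not drawn from the VRE security policy.

```python
# Generic sketch of role-based authorization, in the spirit of the
# CORBA Security Service's access-control feature; the roles and
# rights here are hypothetical examples.
ACCESS_POLICY = {
    "radiologist": {"read_images", "write_report"},
    "administrator": {"read_audit_log", "manage_users"},
}

def authorize(role, operation):
    """Return True only if the role's granted rights include the operation."""
    return operation in ACCESS_POLICY.get(role, set())

print(authorize("radiologist", "read_images"))   # → True
print(authorize("radiologist", "manage_users"))  # → False
```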

  13. Arcade: A Web-Java Based Framework for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are increasingly being used to execute a variety of large-scale simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. The targeted applications consist of independent heterogeneous modules that can be executed in a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  14. An object oriented Python interface for atomistic simulations

    NASA Astrophysics Data System (ADS)

    Hynninen, T.; Himanen, L.; Parkkinen, V.; Musso, T.; Corander, J.; Foster, A. S.

    2016-01-01

    Programmable simulation environments allow one to monitor and control calculations efficiently and automatically before, during, and after runtime. Simulation environments directly accessible from a programming language can be interfaced with powerful external analysis tools and extensions to enhance the functionality of the core program, and by incorporating a flexible object-based structure they make building and analysing computational setups intuitive. In this work, we present a classical atomistic force field with an interface written in the Python language. The program is an extension of an existing object-based atomistic simulation environment.
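A minimal sketch of the kind of object-based simulation interface described above, using a simple Lennard-Jones pair potential; the class and method names are hypothetical, not the paper's actual API.

```python
import math

# Illustrative object-based atomistic setup; names are hypothetical.
class Atom:
    def __init__(self, symbol, position):
        self.symbol = symbol
        self.position = position  # (x, y, z)

class LennardJones:
    """Minimal pair potential acting as the 'force field' object."""
    def __init__(self, epsilon=1.0, sigma=1.0):
        self.epsilon = epsilon
        self.sigma = sigma

    def pair_energy(self, r):
        s6 = (self.sigma / r) ** 6
        return 4.0 * self.epsilon * (s6 * s6 - s6)

    def total_energy(self, atoms):
        # Sum the pair energy over all unique atom pairs.
        e = 0.0
        for i in range(len(atoms)):
            for j in range(i + 1, len(atoms)):
                r = math.dist(atoms[i].position, atoms[j].position)
                e += self.pair_energy(r)
        return e

# A dimer at the potential minimum separation r = 2**(1/6) * sigma.
dimer = [Atom("Ar", (0, 0, 0)), Atom("Ar", (2 ** (1 / 6), 0, 0))]
ff = LennardJones()
print(ff.total_energy(dimer))  # ≈ -epsilon at the minimum
```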

  15. An Intelligent System for Document Retrieval in Distributed Office Environments.

    ERIC Educational Resources Information Center

    Mukhopadhyay, Uttam; And Others

    1986-01-01

    MINDS (Multiple Intelligent Node Document Servers) is a distributed system of knowledge-based query engines for efficiently retrieving multimedia documents in an office environment of distributed workstations. By learning document distribution patterns and user interests and preferences during system usage, it customizes document retrievals for…

  16. Designing Distributed Learning Environments with Intelligent Software Agents

    ERIC Educational Resources Information Center

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  17. Inpatients' and outpatients' satisfaction: the mediating role of perceived quality of physical and social environment.

    PubMed

    Campos Andrade, Cláudia; Lima, Maria Luísa; Pereira, Cícero Roberto; Fornara, Ferdinando; Bonaiuto, Marino

    2013-05-01

    This study analyses the processes through which the physical environment of health care settings impacts patients' well-being. Specifically, we investigate the mediating role of perceptions of the physical and social environments, and whether this process is moderated by patient status, that is, whether the objective physical environment impacts inpatients' and outpatients' satisfaction through different social-psychological processes. Patients (N=206) evaluated the physical and social environments of the care unit where they were receiving treatment, and its objective physical conditions were independently evaluated by two architects. Results showed that objective environmental quality affects satisfaction through perceptions of environmental quality, and that patient status moderates this relationship. For inpatients, it is the perception of the quality of the social environment that mediates the relationship between objective environmental quality and satisfaction, whereas for outpatients it is the perception of the quality of the physical environment. This moderated mediation is discussed in terms of differences in patients' experiences of health care environments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Geospatial Analysis of Food Environment Demonstrates Associations with Gestational Diabetes

    PubMed Central

    KAHR, Maike K.; SUTER, Melissa A.; BALLAS, Jerasimos; RAMIN, Susan M.; MONGA, Manju; LEE, Wesley; HU, Min; SHOPE, Cindy D.; CHESNOKOVA, Arina; KRANNICH, Laura; GRIFFIN, Emily N.; MASTROBATTISTA, Joan; DILDY, Gary A.; STREHLOW, Stacy L.; RAMPHUL, Ryan; HAMILTON, Winifred J; AAGAARD, Kjersti M.

    2015-01-01

    Background Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, co-variant environmental and sociodemographic factors may be additional determinants of GDM occurrence. Objectives We hypothesized that environmental risk factors, in particular measures of the food environment, may contribute to GDM. We employed geospatial modeling in a populous U.S. county to characterize the association of the relative availability of fast food restaurants and supermarkets with GDM. Study Design Utilizing a perinatal database with over 4900 encoded antenatal and outcome variables inclusive of zip code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and the food environment based on county-wide food permit registration data. Linkage between pregnancies and the food environment was achieved on the basis of validated 5-digit zip code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each zip code was gathered from publicly available food permit sources. In order to independently authenticate our findings with objective data, we measured hemoglobin A1c (HbA1c) levels as a function of the geospatial distribution of the food environment in a matched subset (n=80). Results Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to first quartile, aOR: 1.63 [95% CI 1.21–2.19]). In multivariate analysis, this association held true after controlling for potential confounders (p=0.002). HbA1c levels in a matched subset were significantly increased in association with residence in a zip code with a higher fast food/supermarket ratio (n=80, r=0.251, p<0.05). 
    Conclusions Geospatial analysis identified a relationship between the food environment and the risk of gestational diabetes. PMID:26319053
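The quartile-based exposure classification used in the analysis above can be sketched with the standard library; the density values below are illustrative, not the study's data.

```python
import statistics

def exposure_quartile(value, all_values):
    """Assign a value to quartile 1-4 of its own distribution."""
    cuts = statistics.quantiles(all_values, n=4)  # three cut points
    return 1 + sum(value > c for c in cuts)

# Hypothetical fast food outlets per 100,000 inhabitants, by zip code.
densities = [2, 5, 7, 9, 12, 18, 25, 40]
print(exposure_quartile(40, densities))  # → 4 (highest-exposure quartile)
print(exposure_quartile(2, densities))   # → 1
```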

  19. Using Acoustics to Determine Eelgrass Bed Distribution and to Assess the Seasonal Variation of Ecosystem Service.

    PubMed

    Sonoki, Shiori; Shao, Huamei; Morita, Yuka; Minami, Kenji; Shoji, Jun; Hori, Masakazu; Miyashita, Kazushi

    2016-01-01

    Eelgrass beds are an important source of primary production in coastal ecosystems. Understanding seasonal variation in the abundance and distribution of eelgrass is important for conservation, and the objectives of this study were to 1) monitor seasonal variation in eelgrass beds using an acoustic monitoring method (Quantitative echo sounder) and 2) broadly quantify the carbon circulation function. We obtained acoustic data of eelgrass beds in coastal areas north and east of Ikunojima Island. Surveys were conducted nine times over the 3-year period from 2011 to 2013 in order to monitor seasonal variation. Acoustic data were obtained and used to estimate the spatial distribution of eelgrass by geostatistical methods. To determine supporting services, we determined carbon sink and carbon fixation by eelgrass beds using data from the National Research Institute of Fisheries and Environment of Inland Sea (2011). The height and distribution of eelgrass beds were at a maximum in May and at a minimum in November of each year. Distribution trends were different between the north and east areas. Supporting services showed the same patterns throughout the year. The area of distribution was considered to be coincident with the life history of eelgrass. Distribution differed by area and changed yearly due to the effects of bottom characteristics and wind direction. Quantifying the supporting services of eelgrass beds was shown to be useful for managing the conservation of coastal ecosystems.

  20. LYDIAN: An Extensible Educational Animation Environment for Distributed Algorithms

    ERIC Educational Resources Information Center

    Koldehofe, Boris; Papatriantafilou, Marina; Tsigas, Philippas

    2006-01-01

    LYDIAN is an environment to support the teaching and learning of distributed algorithms. It provides a collection of distributed algorithms as well as continuous animations. Users can combine algorithms and animations with arbitrary network structures defining the interconnection and behavior of the distributed algorithm. Further, it facilitates…

  1. Direct and Indirect Associations Between the Built Environment and Leisure and Utilitarian Walking in Older Women.

    PubMed

    Troped, Philip J; Tamura, Kosuke; McDonough, Meghan H; Starnes, Heather A; James, Peter; Ben-Joseph, Eran; Cromley, Ellen; Puett, Robin; Melly, Steven J; Laden, Francine

    2017-04-01

    The built environment predicts walking in older adults, but the degree to which associations between the objective built environment and walking for different purposes are mediated by environmental perceptions is unknown. We examined associations between the neighborhood built environment and leisure and utilitarian walking, and mediation by the perceived environment, among older women. Women (N = 2732, M age = 72.8 ± 6.8 years) from Massachusetts, Pennsylvania, and California completed a neighborhood built environment and walking survey. Objective population and intersection density and density of stores and services variables were created within residential buffers. Perceived built environment variables included measures of land use mix, street connectivity, infrastructure for walking, esthetics, traffic safety, and personal safety. Regression and bootstrapping were used to test associations and indirect effects. Objective population, stores/services, and intersection density indirectly predicted leisure and utilitarian walking via perceived land use mix (odds ratios (ORs) = 1.01-1.08; 95% bias-corrected and accelerated confidence intervals do not include 1). Objective density of stores/services directly predicted ≥150 min of utilitarian walking (OR = 1.11; 95% CI = 1.02, 1.22). Perceived land use mix (ORs = 1.16-1.44) and esthetics (ORs = 1.24-1.61) significantly predicted leisure and utilitarian walking. CONCLUSIONS: The perceived built environment mediated associations between objective built environment variables and walking for leisure and utilitarian purposes. Interventions for older adults should take into account how objective built environment characteristics may influence environmental perceptions and walking.
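The mediation analysis described above (an indirect effect tested with a percentile bootstrap) can be sketched on simulated data; the path coefficients and sample here are illustrative, not the study's estimates.

```python
import random
import statistics

random.seed(0)
n = 500
# Simulated chain: objective environment -> perception -> walking propensity.
x = [random.gauss(0, 1) for _ in range(n)]            # objective built environment
m = [0.5 * xi + random.gauss(0, 1) for xi in x]       # perceived environment (mediator)
y = [0.4 * mi + 0.1 * xi + random.gauss(0, 1) for xi, mi in zip(x, m)]

def slope(u, v):
    mu, mv = statistics.fmean(u), statistics.fmean(v)
    num = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
    return num / sum((ui - mu) ** 2 for ui in u)

def residuals(u, v):
    a, mu, mv = slope(u, v), statistics.fmean(u), statistics.fmean(v)
    return [vi - (mv + a * (ui - mu)) for ui, vi in zip(u, v)]

def indirect_effect(xs, ms, ys):
    a = slope(xs, ms)                                # x -> m path
    b = slope(residuals(xs, ms), residuals(xs, ys))  # m -> y path, controlling for x
    return a * b

est = indirect_effect(x, m, y)
boot = []
for _ in range(200):  # percentile bootstrap of the indirect effect
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(indirect_effect([x[i] for i in idx],
                                [m[i] for i in idx],
                                [y[i] for i in idx]))
boot.sort()
print(round(est, 2), round(boot[4], 2), round(boot[194], 2))  # estimate with ~95% CI
```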

  2. ON A POSSIBLE SIZE/COLOR RELATIONSHIP IN THE KUIPER BELT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pike, R. E.; Kavelaars, J. J., E-mail: repike@uvic.ca

    2013-10-01

    Color measurements and albedo distributions introduce non-intuitive observational biases in size-color relationships among Kuiper Belt Objects (KBOs) that cannot be disentangled without a well characterized sample population with systematic photometry. Peixinho et al. report that the form of the KBO color distribution varies with absolute magnitude, H. However, Tegler et al. find that KBO color distributions are a property of object classification. We construct synthetic models of observed KBO colors based on two B-R color distribution scenarios: color distribution dependent on H magnitude (H-Model) and color distribution based on object classification (Class-Model). These synthetic B-R color distributions were modified to account for observational flux biases. We compare our synthetic B-R distributions to the observed "Hot" and "Cold" detected objects from the Canada-France Ecliptic Plane Survey and the Meudon Multicolor Survey. For both surveys, the Hot population color distribution rejects the H-Model, but is well described by the Class-Model. The Cold objects reject the H-Model, but the Class-Model (while not statistically rejected) also does not provide a compelling match for the data. Although we formally reject models where the structure of the color distribution is a strong function of H magnitude, we also do not find that a simple dependence of color distribution on orbit classification is sufficient to describe the color distribution of classical KBOs.

  3. Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT

    NASA Technical Reports Server (NTRS)

    Dryer, David A.

    2002-01-01

    This paper describes a system-of-systems (metasystems) approach and models developed to help prepare engineering organizations for distributed engineering environments. Changes facing engineering enterprises include competition in increasingly global environments, new partnering opportunities created by advances in information and communication technologies, and virtual collaboration issues associated with dispersed teams. To help address challenges and needs in this environment, a framework is proposed that can be customized and adapted for NASA to assist in improved engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by having them learn and apply e-engineering methods and tools in a real-world engineering development scenario. The approach consists of two phases: an e-engineering basics phase and an e-engineering application phase. The e-engineering basics phase addresses the skills required for e-engineering. The e-engineering application phase applies these skills in a distributed collaborative environment to system development projects.

  4. Microservices in Web Objects Enabled IoT Environment for Enhancing Reusability

    PubMed Central

    Chong, Ilyoung

    2018-01-01

    In the ubiquitous Internet of Things (IoT) environment, reusing objects instead of creating new ones has become important in academia and industry. The situation becomes complex due to the availability of a huge number of connected IoT objects, where each individual service creates a new object instead of reusing an existing one to fulfill a requirement. A well-standardized mechanism not only improves the reusability of objects but also improves service modularity and extensibility, and reduces cost. The Web Objects enabled IoT environment applies the principle of reusability of objects in multiple IoT application domains through a central objects repository and microservices. To reuse objects with microservices and to maintain a relationship with them, this study presents an architecture for the Web of Objects platform. In the case of a similar request for an object, an already instantiated object from the same or another domain can be reused. Reuse of objects through microservices avoids duplication and reduces the time to search for and instantiate them from their registries. Further, this article presents an algorithm for the discovery of microservices and related objects that considers the reusability of objects through the central objects repository. To support the reusability of objects, the necessary algorithm for object matching is also presented. To realize the reusability of objects in the Web Objects enabled IoT environment, a prototype has been designed and implemented based on a use case scenario. Finally, the results of the prototype have been analyzed and discussed to validate the proposed approach. PMID:29373491
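The reuse-first lookup described above (try the same domain, then other domains, and only then instantiate a new object) can be sketched as follows; the registry API and matching rule are illustrative assumptions, not the paper's algorithm.

```python
# Sketch of a central objects repository with a reuse-first lookup;
# names and the matching rule are illustrative assumptions.
class ObjectRepository:
    def __init__(self):
        self._objects = {}  # (domain, kind) -> instantiated object

    def get_or_create(self, domain, kind, factory):
        # 1. Reuse an existing instance registered in the same domain.
        if (domain, kind) in self._objects:
            return self._objects[(domain, kind)]
        # 2. Otherwise reuse a matching instance from any other domain.
        for (d, k), obj in self._objects.items():
            if k == kind:
                return obj
        # 3. Only instantiate a new object when nothing matches.
        obj = factory()
        self._objects[(domain, kind)] = obj
        return obj

repo = ObjectRepository()
a = repo.get_or_create("smart-home", "temperature-sensor", dict)
b = repo.get_or_create("smart-city", "temperature-sensor", dict)
print(a is b)  # → True: the existing object is reused across domains
```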

  5. Microservices in Web Objects Enabled IoT Environment for Enhancing Reusability.

    PubMed

    Jarwar, Muhammad Aslam; Kibria, Muhammad Golam; Ali, Sajjad; Chong, Ilyoung

    2018-01-26

    In the ubiquitous Internet of Things (IoT) environment, reusing objects instead of creating new ones has become important in academia and industry. The situation becomes complex due to the availability of a huge number of connected IoT objects, where each individual service creates a new object instead of reusing an existing one to fulfill a requirement. A well-standardized mechanism not only improves the reusability of objects but also improves service modularity and extensibility, and reduces cost. The Web Objects enabled IoT environment applies the principle of reusability of objects in multiple IoT application domains through a central objects repository and microservices. To reuse objects with microservices and to maintain a relationship with them, this study presents an architecture for the Web of Objects platform. In the case of a similar request for an object, an already instantiated object from the same or another domain can be reused. Reuse of objects through microservices avoids duplication and reduces the time to search for and instantiate them from their registries. Further, this article presents an algorithm for the discovery of microservices and related objects that considers the reusability of objects through the central objects repository. To support the reusability of objects, the necessary algorithm for object matching is also presented. To realize the reusability of objects in the Web Objects enabled IoT environment, a prototype has been designed and implemented based on a use case scenario. Finally, the results of the prototype have been analyzed and discussed to validate the proposed approach.

  6. A POSSIBLE DIVOT IN THE SIZE DISTRIBUTION OF THE KUIPER BELT'S SCATTERING OBJECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shankman, C.; Gladman, B. J.; Kaib, N.

    Via joint analysis of a calibrated telescopic survey, which found scattering Kuiper Belt objects, and models of their expected orbital distribution, we explore the scattering-object (SO) size distribution. Although for D > 100 km the number of objects quickly rises as diameters decrease, we find a relative lack of smaller objects, ruling out a single power law at greater than 99% confidence. After studying traditional "knees" in the size distribution, we explore other formulations and find that, surprisingly, our analysis is consistent with a very sudden decrease (a divot) in the number distribution as diameters decrease below 100 km, which then rises again as a power law. Motivated by other dynamically hot populations and the Centaurs, we argue for a divot size distribution in which the number of smaller objects rises again as expected via collisional equilibrium. Extrapolation yields enough kilometer-scale SOs to supply the nearby Jupiter-family comets. Our interpretation is that this divot feature is a preserved relic of the size distribution made by planetesimal formation, now "frozen in" to portions of the Kuiper Belt sharing a "hot" orbital inclination distribution, explaining several puzzles in Kuiper Belt science. Additionally, we show that to match today's SO inclination distribution, the supply source that was scattered outward must already have been vertically heated to of order 10°.

  7. Resilience-based optimal design of water distribution network

    NASA Astrophysics Data System (ADS)

    Suribabu, C. R.

    2017-11-01

    Optimal design of a water distribution network generally aims to minimize the capital cost of investments in tanks, pipes, pumps, and other appurtenances. Minimizing the cost of pipes is usually considered the prime objective, as its proportion of the capital cost of a water distribution system project is very high. However, minimizing the capital cost of the pipeline alone may result in an economical network configuration, but it may not be a promising solution from a resilience point of view. Resilience of the water distribution network has been considered one of the popular surrogate measures of a network's ability to withstand failure scenarios. To improve the resilience of the network, pipe network optimization can be performed with two objectives, namely minimizing the capital cost as the first objective and maximizing a resilience measure of the configuration as the second. In the present work, these two objectives are combined into a single objective and the optimization problem is solved by the differential evolution technique. The paper illustrates a procedure for normalizing objective functions having distinct metrics. Two existing resilience indices and power efficiency are considered for the optimal design of the water distribution network. The proposed normalized objective function is found to be efficient under the weighted method of handling the multi-objective water distribution design problem. The numerical results of the design indicate the importance of sizing pipes telescopically along the shortest path of flow to obtain enhanced resilience indices.
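The normalization of two objectives with distinct metrics into one weighted objective, as outlined above, can be sketched as follows; the bounds, weight, and example values are illustrative assumptions, not the paper's formulation.

```python
def normalized_objective(cost, resilience, cost_bounds, res_bounds, w=0.5):
    """Combine a cost to minimize and a resilience index to maximize
    into one weighted objective on [0, 1], where 0 is best."""
    c_lo, c_hi = cost_bounds
    r_lo, r_hi = res_bounds
    c_norm = (cost - c_lo) / (c_hi - c_lo)        # 0 at the cheapest design
    r_norm = (r_hi - resilience) / (r_hi - r_lo)  # 0 at the most resilient design
    return w * c_norm + (1 - w) * r_norm

# A cheap but fragile design vs. a costlier, more resilient one:
print(normalized_objective(1.0e6, 0.2, (1.0e6, 2.0e6), (0.1, 0.9)))  # ≈ 0.4375
print(normalized_objective(1.4e6, 0.8, (1.0e6, 2.0e6), (0.1, 0.9)))  # ≈ 0.2625
```

With this scaling, a single-objective optimizer such as differential evolution can trade the two criteria off directly through the weight w.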

  8. Logistic Regression and Path Analysis Method to Analyze Factors influencing Students’ Achievement

    NASA Astrophysics Data System (ADS)

    Noeryanti, N.; Suryowati, K.; Setyawan, Y.; Aulia, R. R.

    2018-04-01

    Students' academic achievement cannot be separated from the influence of two factors namely internal and external factors. The first factors of the student (internal factors) consist of intelligence (X1), health (X2), interest (X3), and motivation of students (X4). The external factors consist of family environment (X5), school environment (X6), and society environment (X7). The objects of this research are eighth grade students of the school year 2016/2017 at SMPN 1 Jiwan Madiun sampled by using simple random sampling. Primary data are obtained by distributing questionnaires. The method used in this study is binary logistic regression analysis that aims to identify internal and external factors that affect student’s achievement and how the trends of them. Path Analysis was used to determine the factors that influence directly, indirectly or totally on student’s achievement. Based on the results of binary logistic regression, variables that affect student’s achievement are interest and motivation. And based on the results obtained by path analysis, factors that have a direct impact on student’s achievement are students’ interest (59%) and students’ motivation (27%). While the factors that have indirect influences on students’ achievement, are family environment (97%) and school environment (37).

  9. Building interactive virtual environments for simulated training in medicine using VRML and Java/JavaScript.

    PubMed

    Korocsec, D; Holobar, A; Divjak, M; Zazula, D

    2005-12-01

    Medicine is a difficult thing to learn. Experimenting with real patients should not be the only option; simulation deserves special attention here. Virtual Reality Modelling Language (VRML), as a tool for building virtual objects and scenes, has a good record of educational applications in medicine, especially for static and animated visualisations of body parts and organs. However, the level of interactivity and dynamics required to create computer simulations resembling situations in real environments is difficult to achieve. In the present paper we describe some approaches and techniques which we used to push the limits of current VRML technology further toward dynamic 3D representation of virtual environments (VEs). Our demonstration is based on the implementation of a virtual baby model whose vital signs can be controlled from an external Java application. The main contributions of this work are: (a) an outline and evaluation of the three-level VRML/Java implementation of the dynamic virtual environment, (b) a proposal for a modified VRML TimeSensor node, which greatly improves the overall control of system performance, and (c) the architecture of a prototype distributed virtual environment for training in neonatal resuscitation comprising the interactive virtual newborn, an active bedside monitor for vital signs, and a full 3D representation of the surgery room.

  10. Object Detection for Agricultural and Construction Environments Using an Ultrasonic Sensor.

    PubMed

    Dvorak, J S; Stone, M L; Self, K P

    2016-04-01

    This study tested an ultrasonic sensor's ability to detect several objects commonly encountered in outdoor agricultural or construction environments: a water jug, a sheet of oriented strand board (OSB), a metal fence post, a human model, a wooden fence post, a Dracaena plant, a juniper plant, and a dog model. Tests were performed with each target object at distances from 0.01 to 3 m. Five tests were performed with each object at each location, and the sensor's ability to detect the object during each test was categorized as "undetected," "intermittent," "incorrect distance," or "good." Rigid objects that presented a larger surface area to the sensor, such as the water jug and OSB, were better detected than objects with a softer surface texture, which were occasionally not detected as the distance approached 3 m. Objects with an extremely soft surface texture, such as the dog model, could be undetected at almost any distance from the sensor. The results of this testing should help designers of future systems for outdoor environments, as the target objects tested can be found in nearly any agricultural or construction environment.
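The four-way categorization described above can be sketched as a simple classifier over a test's echo readings; the 5% distance tolerance and the use of None for a missed echo are illustrative assumptions, not the study's thresholds.

```python
def categorize(readings, true_distance, tol=0.05):
    """Classify one detection test; None marks a reading with no echo.
    The tolerance is an illustrative assumption."""
    detections = [r for r in readings if r is not None]
    if not detections:
        return "undetected"
    if len(detections) < len(readings):
        return "intermittent"
    if all(abs(r - true_distance) <= tol * true_distance for r in detections):
        return "good"
    return "incorrect distance"

print(categorize([1.00, 1.01, 0.99], 1.0))   # → good
print(categorize([1.00, None, 1.00], 1.0))   # → intermittent
print(categorize([None, None, None], 1.0))   # → undetected
print(categorize([2.00, 2.00, 2.00], 1.0))   # → incorrect distance
```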

  11. Quantum Darwinism in an everyday environment: huge redundancy in scattered photons.

    PubMed

    Riedel, C Jess; Zurek, Wojciech H

    2010-07-09

    We study quantum Darwinism--the redundant recording of information about the preferred states of a decohering system by its environment--for an object illuminated by a blackbody. In the cases of point-source and isotropic illumination, we calculate the quantum mutual information between the object and its photon environment. We demonstrate that this realistic model exhibits fast and extensive proliferation of information about the object into the environment and results in redundancies orders of magnitude larger than the exactly soluble models considered to date.
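The central quantity in the analysis above is the quantum mutual information between the system S and a fragment F of its photon environment; in the standard notation of the quantum Darwinism literature (not quoted from the abstract),

```latex
I(\mathcal{S}:\mathcal{F}) \;=\; H_{\mathcal{S}} + H_{\mathcal{F}} - H_{\mathcal{S}\mathcal{F}},
```

where each H is a von Neumann entropy. Redundancy then counts how many disjoint fragments F each supply nearly all of the information H_S about the object's preferred states.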

  12. Studies for the Loss of Atomic and Molecular Species from Io

    NASA Technical Reports Server (NTRS)

    Combi, Michael R.

    1999-01-01

    The general objective of this project has been to advance our theoretical understanding of Io's atmosphere and how various atomic and molecular species are lost from this atmosphere and are distributed in the circumplanetary environment of Jupiter. This grant has provided support for the activities of Dr. Michael Combi at the University of Michigan to serve as a small part in collaboration with a larger project awarded to Atmospheric & Environmental Research, Inc., with primary principal investigator Dr. William H. Smyth. Dr. Combi is the Principal Investigator and Project Manager for the Michigan grant NAG5-6187. This Michigan grant has provided for a continuation of a collaboration between Drs. Smyth and Combi in related efforts beginning in 1981, with the objective of developing and applying sophisticated theoretical models to interpret and to relate a number of new and exciting observations of the satellite's atmospheric gases. The ability to interpret and then to relate, through the theoretical fabric, a number of these otherwise independent observations is a central strength of this program. This comprehensive approach provides a collective power, extracting more from the sum of the parts and seeing beyond the limitations that are inherent in any one observation. Although the approach is designed to unify, the program is divided into well-defined studies for the likely dominant atmospheric gases involving species of the SO2 family (SO2, SO, O2, S, and O), for the trace atmospheric gas atomic sodium, and for a likely escaping molecular ion NaX(+) (where NaX is the atmospheric molecule and X represents one or more atoms). Attachments: "Io's sodium corona and spatially extended cloud: a consistent flux speed distribution" and "Io's plasma environment during the Galileo flyby: global three-dimensional MHD modeling with adaptive mesh refinement."

  13. Web 2.0 and Pharmacy Education

    PubMed Central

    Fox, Brent I.

    2009-01-01

    New types of social Internet applications (often referred to as Web 2.0) are becoming increasingly popular within higher education environments. Although developed primarily for entertainment and social communication within the general population, applications such as blogs, social video sites, and virtual worlds are being adopted by higher education institutions. These newer applications differ from standard Web sites in that they involve the users in creating and distributing information, hence effectively changing how the Web is used for knowledge generation and dispersion. Although Web 2.0 applications offer exciting new ways to teach, they should not be the core of instructional planning, but rather selected only after learning objectives and instructional strategies have been identified. This paper provides an overview of prominent Web 2.0 applications, explains how they are being used within education environments, and elaborates on some of the potential opportunities and challenges that these applications present. PMID:19960079

  14. Energy consciousness in the design of lighting for people

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halldane, J.F.

    1975-01-01

    A comprehensive overview of energy and power distribution in the environment is presented as it relates to lighting. The objectives are to develop a consciousness of the effects of light and vision in order to utilize them more effectively. Notes are made of the physical effects of radiant power on living things and materials including thermal absorption, reflection, transmission, refraction, spectral conversion, interference, diffraction, polarization, phototropy, luminescence, photochemical changes, and photoelectric effects. Environmental issues are stressed. The evaluation process in design is briefly discussed. Reference is made to the goal, parameter, synthesis, and criterion specification as a checklist for evaluation. Particular concern is raised for the occupants who experience the constructed environment, since their interests do not appear to be sufficiently represented in the present-day design process. Meaningfulness of measurement is emphasized and some anomalies illustrated. (auth)

  15. Web 2.0 and pharmacy education.

    PubMed

    Cain, Jeff; Fox, Brent I

    2009-11-12

    New types of social Internet applications (often referred to as Web 2.0) are becoming increasingly popular within higher education environments. Although developed primarily for entertainment and social communication within the general population, applications such as blogs, social video sites, and virtual worlds are being adopted by higher education institutions. These newer applications differ from standard Web sites in that they involve the users in creating and distributing information, hence effectively changing how the Web is used for knowledge generation and dispersion. Although Web 2.0 applications offer exciting new ways to teach, they should not be the core of instructional planning, but rather selected only after learning objectives and instructional strategies have been identified. This paper provides an overview of prominent Web 2.0 applications, explains how they are being used within education environments, and elaborates on some of the potential opportunities and challenges that these applications present.

  16. 3-D Imaging In Virtual Environment: A Scientific Clinical and Teaching Tool

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; DeVincenzi, Donald L. (Technical Monitor)

    1996-01-01

    The advent of powerful graphics workstations and computers has led to the advancement of scientific knowledge through three-dimensional (3-D) reconstruction and imaging of biological cells and tissues. The Biocomputation Center at NASA Ames Research Center pioneered the effort to produce an entirely computerized method for reconstruction of objects from serial sections studied in a transmission electron microscope (TEM). The software developed, ROSS (Reconstruction of Serial Sections), is now being distributed to users across the United States through Space Act Agreements. The software is used in widely disparate fields such as geology, botany, biology and medicine. In the Biocomputation Center, ROSS serves as the basis for development of virtual environment technologies for scientific and medical use. This report will describe the Virtual Surgery Workstation Project that is ongoing with clinicians at Stanford University Medical Center, and the role of the Visible Human data in the project.

  17. Supporting capacity sharing in the cloud manufacturing environment based on game theory and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Argoneto, Pierluigi; Renna, Paolo

    2016-02-01

    This paper proposes a Framework for Capacity Sharing in Cloud Manufacturing (FCSCM) able to support capacity sharing among independent firms. The success of geographically distributed plants depends strongly on the use of appropriate tools to integrate their resources and demand forecasts in order to achieve a specific production objective. The proposed framework is based on two different tools: a cooperative game algorithm, based on the Gale-Shapley model, and a fuzzy engine. The capacity allocation policy takes into account the utility functions of the involved firms. It is shown how the proposed capacity allocation policy induces all firms to report their requirements truthfully. A discrete event simulation environment has been developed to test the proposed FCSCM. The numerical results show the drastic reduction of unsatisfied capacity obtained by the cooperation model implemented in this work.
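    The matching step the framework builds on can be illustrated with a minimal Gale-Shapley (deferred acceptance) sketch in Python. The firm and plant names are invented for illustration; the paper's actual model layers utility functions and a fuzzy engine on top of this basic algorithm:

    ```python
    def gale_shapley(proposer_prefs, acceptor_prefs):
        """Stable matching: proposers propose in preference order;
        acceptors tentatively hold the best offer seen so far."""
        free = list(proposer_prefs)                   # proposers not yet matched
        next_choice = {p: 0 for p in proposer_prefs}  # next acceptor to try
        engaged = {}                                  # acceptor -> proposer
        rank = {a: {p: i for i, p in enumerate(prefs)}
                for a, prefs in acceptor_prefs.items()}
        while free:
            p = free.pop(0)
            a = proposer_prefs[p][next_choice[p]]
            next_choice[p] += 1
            if a not in engaged:
                engaged[a] = p
            elif rank[a][p] < rank[a][engaged[a]]:    # acceptor prefers newcomer
                free.append(engaged[a])
                engaged[a] = p
            else:
                free.append(p)                        # rejected, try next choice
        return {p: a for a, p in engaged.items()}

    # firms requesting capacity propose to plants offering it (names illustrative)
    firms = {"F1": ["P1", "P2"], "F2": ["P1", "P2"]}
    plants = {"P1": ["F2", "F1"], "P2": ["F1", "F2"]}
    print(gale_shapley(firms, plants))  # {'F2': 'P1', 'F1': 'P2'}
    ```

    The deferred-acceptance structure is also what underlies the truthful-reporting property mentioned in the abstract: proposers have no incentive to misstate their preference order.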

  18. Crawling and walking infants encounter objects differently in a multi-target environment.

    PubMed

    Dosso, Jill A; Boudreau, J Paul

    2014-10-01

    From birth, infants move their bodies in order to obtain information and stimulation from their environment. Exploratory movements are important for the development of an infant's understanding of the world and are well established as being key to cognitive advances. Newly acquired motor skills increase the potential actions available to the infant. However, the way that infants employ potential actions in environments with multiple potential targets has not been described. The current work investigated the target object selections of infants across a range of self-produced locomotor experience (11- to 14-month-old crawlers and walkers). Infants repeatedly accessed objects among pairs of objects differing in both distance and preference status, some requiring locomotion. Overall, their object actions were found to be sensitive to object preference status; however, the role of object distance in shaping object encounters was moderated by movement status. Crawlers' actions appeared opportunistic and were biased towards nearby objects, while walkers' actions appeared intentional and were independent of object position. Moreover, walkers' movements favoured preferred objects more strongly for children with higher levels of self-produced locomotion experience. The multi-target experimental situation used in this work parallels conditions faced by foraging organisms, and infants' behaviours are discussed with respect to optimal foraging theory. There is a complex interplay between infants' agency, locomotor experience, and environment in shaping their motor actions. Infants' movements, in turn, determine the information and experiences offered to infants by their micro-environment.

  19. Hydrodynamical simulations of coupled and uncoupled quintessence models - I. Halo properties and the cosmic web

    NASA Astrophysics Data System (ADS)

    Carlesi, Edoardo; Knebe, Alexander; Lewis, Geraint F.; Wales, Scott; Yepes, Gustavo

    2014-04-01

    We present the results of a series of adiabatic hydrodynamical simulations of several quintessence models (both with a free and an interacting scalar field) in comparison to a standard Λ cold dark matter cosmology. For each we use 2 × 1024^3 particles in a 250 h^-1 Mpc periodic box assuming 7-year Wilkinson Microwave Anisotropy Probe cosmology. In this work we focus on the properties of haloes in the cosmic web at z = 0. The web is classified into voids, sheets, filaments and knots depending on the eigenvalues of the velocity shear tensor, which are an excellent proxy for the underlying overdensity distribution. We find that the properties of objects classified according to their surrounding environment show a substantial dependence on the underlying cosmology; for example, while Vmax shows average deviations of ≈5 per cent across the different models when considering the full halo sample, comparing objects classified according to their environment, the size of the deviation can be as large as 20 per cent. We also find that halo spin parameters are positively correlated to the coupling, whereas halo concentrations show the opposite behaviour. Furthermore, when studying the concentration-mass relation in different environments, we find that in all cosmologies underdense regions have a larger normalization and a shallower slope. While this behaviour is found to characterize all the models, differences in the best-fitting relations are enhanced in (coupled) dark energy models, thus providing a clearer prediction for this class of models.
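    The eigenvalue-counting classification of the cosmic web can be sketched in a few lines of Python. The threshold value below is arbitrary and purely illustrative; the papers using this scheme calibrate it to the visual appearance of the web:

    ```python
    import numpy as np

    def classify_web(shear_tensor, lam_th=0.1):
        """Classify a point of the cosmic web by the number of eigenvalues
        of the (symmetric) velocity shear tensor above a threshold:
        0 -> void, 1 -> sheet, 2 -> filament, 3 -> knot."""
        eigvals = np.linalg.eigvalsh(shear_tensor)   # sorted ascending
        n = int(np.sum(eigvals > lam_th))
        return ["void", "sheet", "filament", "knot"][n]

    # example: collapse along one axis only -> a sheet
    T = np.diag([0.5, -0.2, -0.3])
    print(classify_web(T))  # sheet
    ```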

  20. Patient privacy protection using anonymous access control techniques.

    PubMed

    Weerasinghe, D; Rajarajan, M; Elmufti, K; Rakocevic, V

    2008-01-01

    The objective of this study is to develop a solution to preserve security and privacy in a healthcare environment where health-sensitive information will be accessed by many parties and stored in various distributed databases. The solution should maintain anonymous medical records and it should be able to link anonymous medical information in distributed databases into a single patient medical record with the patient identity. In this paper we present a protocol that can be used to authenticate and authorize patients to healthcare services without providing the patient identification. Healthcare service can identify the patient using separate temporary identities in each identification session and medical records are linked to these temporary identities. Temporary identities can be used to enable record linkage and reverse track real patient identity in critical medical situations. The proposed protocol provides main security and privacy services such as user anonymity, message privacy, message confidentiality, user authentication, user authorization and message replay attacks. The medical environment validates the patient at the healthcare service as a real and registered patient for the medical services. Using the proposed protocol, the patient anonymous medical records at different healthcare services can be linked into one single report and it is possible to securely reverse track anonymous patient into the real identity. The protocol protects the patient privacy with a secure anonymous authentication to healthcare services and medical record registries according to the European and the UK legislations, where the patient real identity is not disclosed with the distributed patient medical records.
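    One common way to realize linkable temporary identities of the kind described is a keyed hash: only the holder of the master key can regenerate a pseudonym and thereby link records or reverse-track the real identity. This is a generic sketch of that idea, not the paper's actual protocol, and all names in it are invented:

    ```python
    import hmac, hashlib

    def temp_identity(master_key: bytes, patient_id: str, session: str) -> str:
        """Derive a per-session pseudonym; only the key holder can
        recompute it to link records back to the real identity."""
        msg = f"{patient_id}|{session}".encode()
        return hmac.new(master_key, msg, hashlib.sha256).hexdigest()[:16]

    key = b"registry-master-key"   # held only by the trusted registry
    t1 = temp_identity(key, "patient-42", "2008-03-01-cardiology")
    t2 = temp_identity(key, "patient-42", "2008-03-05-radiology")
    print(t1 != t2)   # distinct, unlinkable pseudonyms per session
    # reverse tracking: the registry recomputes a candidate's pseudonym
    print(temp_identity(key, "patient-42", "2008-03-01-cardiology") == t1)
    ```

    Anyone without the key sees only unrelated identifiers, while the registry can deterministically reproduce them when record linkage is medically required.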

  1. Rogoznica Lake - a Conceptual Framework to Study Sulfate-reducing Bacteria Across a Wide Range of Anoxic/hypoxic Marine Environments

    NASA Astrophysics Data System (ADS)

    Cankovic, M.; Collins, G.; Petrić, I.; Ciglenečki, I.

    2016-02-01

    Today's oceans and seas are experiencing, among other changes, oxygen depletion, resulting in hypoxia/anoxia. Consequently, toxic H2S, generated by sulfate-reducing bacteria (SRB), is released. The prevalence of this type of environment has increased rapidly over the past decades, especially in coastal zones. Rogoznica Lake (Croatia) is a typical, extremely euxinic seawater system with a permanently anoxic bottom water layer. As such, it represents a natural laboratory to study SRB. The objective of this study was to characterize the SRB community inhabiting the hypoxic/anoxic water column and sediment of Rogoznica Lake. The distribution, diversity, activity and abundance of SRB were investigated using different molecular techniques accompanied by physico-chemical and organic matter measurements. Results indicated seasonal variations in SRB diversity, abundance and activity, as well as variations between different samples. A complex and diverse distribution of SRB was revealed, supporting the idea that habitat-specific SRB communities are the main drivers of anaerobic degradation of organic matter, as well as of the cycling of sulfur and carbon species, in the lake. Furthermore, low sequence homology to cultured SRB indicated the presence of a specific SRB community in the lake. While eutrophication is a leading cause of impairment of many freshwater and coastal marine ecosystems in the world, hypoxia and anoxia continue to threaten tourism and fisheries worldwide. In such circumstances a better understanding of SRB spatio-temporal distribution and dynamics would be of ecological and economic importance.

  2. UGV navigation in wireless sensor and actuator network environments

    NASA Astrophysics Data System (ADS)

    Zhang, Guyu; Li, Jianfeng; Duncan, Christian A.; Kanno, Jinko; Selmic, Rastko R.

    2012-06-01

    We consider a navigation problem in a distributed, self-organized and coordinate-free Wireless Sensor and Actuator Network (WSAN). We first present navigation algorithms that are verified using simulation results. Considering more than one destination and multiple mobile Unmanned Ground Vehicles (UGVs), we introduce a distributed solution to the Multi-UGV, Multi-Destination navigation problem. The objective of the solution to this problem is to efficiently allocate UGVs to different destinations and carry out navigation in the network environment in a way that minimizes total travel distance. The main contribution of this paper is to develop a solution that does not attempt to localize either the UGVs or the sensor and actuator nodes. Other than some connectivity assumptions about the communication graph, we assume that no prior information about the WSAN is available. The solution presented here is distributed, and the UGV navigation is based solely on feedback from neighboring sensor and actuator nodes. One special case discussed in the paper, the Single-UGV, Multi-Destination navigation problem, is essentially equivalent to the well-known and difficult Traveling Salesman Problem (TSP). Simulation results are presented that illustrate the navigation distance traveled through the network. We also introduce an experimental testbed for the realization of coordinate-free and localization-free UGV navigation. We use the Cricket platform as the sensor and actuator network and a Pioneer 3-DX robot as the UGV. The experiments illustrate UGV navigation in a coordinate-free WSAN environment where the UGV successfully arrives at the assigned destinations.
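    The Single-UGV, Multi-Destination case can be illustrated with the classic nearest-neighbor TSP heuristic. Note the mismatch this sketch deliberately ignores: it assumes known pairwise distances, whereas the paper's solution is coordinate-free and relies only on feedback from neighboring nodes. The distance matrix is invented for illustration:

    ```python
    def nearest_neighbor_tour(dist, start=0):
        """Greedy TSP heuristic: always travel to the closest
        unvisited destination. Fast, but not optimal in general."""
        n = len(dist)
        tour, unvisited = [start], set(range(n)) - {start}
        while unvisited:
            cur = tour[-1]
            nxt = min(unvisited, key=lambda j: dist[cur][j])
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    # symmetric distance matrix among 4 destinations (illustrative)
    D = [[0, 2, 9, 10],
         [2, 0, 6, 4],
         [9, 6, 0, 3],
         [10, 4, 3, 0]]
    print(nearest_neighbor_tour(D))  # [0, 1, 3, 2]
    ```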

  3. dftools: Distribution function fitting

    NASA Astrophysics Data System (ADS)

    Obreschkow, Danail

    2018-05-01

    dftools, written in R, finds the most likely P parameters of a D-dimensional distribution function (DF) generating N objects, where each object is specified by D observables with measurement uncertainties. For instance, if the objects are galaxies, it can fit a mass function (D=1), a mass-size distribution (D=2) or the mass-spin-morphology distribution (D=3). Unlike most common fitting approaches, this method accurately accounts for measurement uncertainties and complex selection functions.
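    The core idea, maximizing a likelihood in which the DF is convolved with each object's measurement error, can be sketched in Python for a toy D=1 Gaussian case. This is an illustration of the principle only, not the dftools R code, and the convolution is analytic here only because both the DF and the errors are Gaussian:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Toy D=1 case: the true DF is Gaussian with unknown (mu, sigma);
    # each observation carries its own Gaussian measurement error s_i,
    # so the likelihood of x_i is N(mu, sqrt(sigma^2 + s_i^2)).
    rng = np.random.default_rng(0)
    mu_true, sig_true = 10.0, 2.0
    s = rng.uniform(0.5, 1.5, size=500)               # per-object uncertainties
    x = rng.normal(mu_true, sig_true, 500) + rng.normal(0.0, s)

    def neg_log_like(theta):
        mu, log_sig = theta                            # log-sigma keeps sigma > 0
        width = np.sqrt(np.exp(log_sig) ** 2 + s ** 2)
        return -np.sum(norm.logpdf(x, mu, width))

    fit = minimize(neg_log_like, x0=[8.0, 0.0])
    mu_hat, sig_hat = fit.x[0], np.exp(fit.x[1])
    print(mu_hat, sig_hat)   # close to the true values 10 and 2
    ```

    A naive fit that ignored the per-object errors would overestimate sigma, since it would absorb the measurement scatter into the DF width.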

  4. Digital Library Storage using iRODS Data Grids

    NASA Astrophysics Data System (ADS)

    Hedges, Mark; Blanke, Tobias; Hasan, Adil

    Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.

  5. New architectural paradigms for multi-petabyte distributed storage systems

    NASA Technical Reports Server (NTRS)

    Lee, Richard R.

    1994-01-01

    In the not too distant future, programs such as NASA's Earth Observing System, the NSF/ARPA/NASA Digital Libraries Initiative and the Intelligence Community's (NSA, CIA, NRO, etc.) mass storage system upgrades will all require multi-petabyte (petabyte: 10^15 bytes of bitfile data) or larger distributed storage solutions. None of these requirements, as currently defined, will meet their objectives utilizing either today's architectural paradigms or storage solutions. Radically new approaches will be required not only to store and manage veritable 'mountain ranges of data', but to make the cost of ownership affordable, much less practical, in today's (and certainly the future's) austere budget environment! Within this paper we will explore new architectural paradigms and project systems performance benefits and dollars per petabyte of information stored. We will discuss essential 'top down' approaches to achieving an overall systems-level performance capability sufficient to meet the challenges of these major programs.

  6. Coordinated scheduling for dynamic real-time systems

    NASA Technical Reports Server (NTRS)

    Natarajan, Swaminathan; Zhao, Wei

    1994-01-01

    In this project, we addressed issues in coordinated scheduling for dynamic real-time systems. In particular, we concentrated on design and implementation of a new distributed real-time system called R-Shell. The design objective of R-Shell is to provide computing support for space programs that have large, complex, fault-tolerant distributed real-time applications. In R-shell, the approach is based on the concept of scheduling agents, which reside in the application run-time environment, and are customized to provide just those resource management functions which are needed by the specific application. With this approach, we avoid the need for a sophisticated OS which provides a variety of generalized functionality, while still not burdening application programmers with heavy responsibility for resource management. In this report, we discuss the R-Shell approach, summarize the achievement of the project, and describe a preliminary prototype of R-Shell system.

  7. Cloud@Home: A New Enhanced Computing Paradigm

    NASA Astrophysics Data System (ADS)

    Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco

    Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)) and Green computing (a new frontier of ethical computing starting from the assumption that in the near future energy costs will be related to environmental pollution).

  8. Fiber-Optic Magnetometry and Thermometry Using Optically Detected Magnetic Resonance With Nitrogen-Vacancy Centers in Diamond

    NASA Astrophysics Data System (ADS)

    Blakley, Sean Michael

    Nitrogen-vacancy diamond (NVD) quantum sensors are an emerging technology that has shown great promise in areas like high-resolution thermometry and magnetometry. Optical fibers provide attractive new application paradigms for NVD technology. A detailed description of the fabrication processes associated with the development of novel fiber-optic NVD probes is presented in this work. The demonstrated probes are tested on paradigmatic model systems designed to ascertain their suitability for use in challenging biological environments. Methods employing optically detected magnetic resonance (ODMR) are used to accurately measure and map temperature distributions of small objects and to demonstrate emergent temperature-dependent phenomena in genetically modified living organisms. These methods are also used to create detailed high-resolution spatial maps of both magnetic scalar and magnetic vector field distributions of spatially localized weak-field features in the presence of a noisy, high-field background.
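    ODMR thermometry of the kind described rests on the temperature dependence of the NV center's zero-field splitting D. As a back-of-the-envelope sketch (not from this work), using the commonly quoted literature coefficient of roughly -74 kHz/K near room temperature, which varies between samples:

    ```python
    # Approximate NV thermometry from the ODMR resonance frequency.
    # D ~ 2.870 GHz at a reference temperature; shifts by ~ -74 kHz/K.
    # Both numbers are approximate literature values, not measurements
    # from this work.
    D0 = 2.870e9    # Hz, reference zero-field splitting
    DDT = -74e3     # Hz per kelvin, temperature coefficient

    def temperature_shift(measured_D_hz):
        """Infer the temperature change from a measured ODMR resonance."""
        return (measured_D_hz - D0) / DDT

    # a resonance measured 0.4 MHz below D0 implies ~5.4 K of warming
    print(temperature_shift(2.8696e9))
    ```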

  9. Integrating a Mobile Augmented Reality Activity to Contextualize Student Learning of a Socioscientific Issue

    ERIC Educational Resources Information Center

    Chang, Hsin-Yi; Wu, Hsin-Kai; Hsu, Ying-Shao

    2013-01-01

    virtual objects or information overlaying physical objects or environments, resulting in a mixed reality in which virtual objects and real environments coexist in a meaningful way to augment learning…

  10. ARK: Autonomous mobile robot in an industrial environment

    NASA Technical Reports Server (NTRS)

    Nickerson, S. B.; Jasiobedzki, P.; Jenkin, M.; Jepson, A.; Milios, E.; Down, B.; Service, J. R. R.; Terzopoulos, D.; Tsotsos, J.; Wilkes, D.

    1994-01-01

    This paper describes research on the ARK (Autonomous Mobile Robot in a Known Environment) project. The technical objective of the project is to build a robot that can navigate in a complex industrial environment using maps with permanent structures. The environment is not altered in any way by adding easily identifiable beacons; the robot relies on naturally occurring objects to use as visual landmarks for navigation. The robot is equipped with various sensors that can detect unmapped obstacles, landmarks and objects. In this paper we describe the robot's industrial environment, its architecture, a novel combined range and vision sensor, and our recent results on controlling the robot, on real-time detection of objects using their color, and on processing the robot's range and vision sensor data for navigation.

  11. A distributed computing system for magnetic resonance imaging: Java-based processing and binding of XML.

    PubMed

    de Beer, R; Graveron-Demilly, D; Nastase, S; van Ormondt, D

    2004-03-01

    Recently we have developed a Java-based heterogeneous distributed computing system for the field of magnetic resonance imaging (MRI). It is a software system for embedding the various image reconstruction algorithms that we have created for handling MRI data sets with sparse sampling distributions. Since these data sets may result from multi-dimensional MRI measurements, our system has to control the storage and manipulation of large amounts of data. In this paper we describe how we have employed the extensible markup language (XML) to realize this data handling in a highly structured way. To that end we have used Java packages, recently released by Sun Microsystems, to process XML documents and to compile pieces of XML code into Java classes. We have implemented a flexible storage and manipulation approach for all kinds of data within the MRI system, such as data describing and containing multi-dimensional MRI measurements, data configuring image reconstruction methods, and data representing and visualizing the various services of the system. We have found that the object-oriented approach, possible with the Java programming environment, combined with the XML technology, is a convenient way of describing and handling various data streams in heterogeneous distributed computing systems.
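    As a rough illustration of the configuration role XML plays in such a system, here is a minimal Python sketch. The element and attribute names are invented for illustration; the actual system binds XML to Java classes using Sun's packages rather than parsing into a dictionary:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical configuration document for a reconstruction method;
    # tag and attribute names are illustrative, not the paper's schema.
    doc = """
    <reconstruction method="bayesian">
      <param name="iterations" value="200"/>
      <param name="regularization" value="0.01"/>
    </reconstruction>
    """

    root = ET.fromstring(doc)
    config = {"method": root.get("method")}
    for p in root.findall("param"):
        config[p.get("name")] = float(p.get("value"))
    print(config)
    ```

    The appeal noted in the abstract is that the same structured format describes measurements, method configurations and service descriptions, so one parsing/binding layer serves every data stream in the system.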

  12. Independent motion detection with a rival penalized adaptive particle filter

    NASA Astrophysics Data System (ADS)

    Becker, Stefan; Hübner, Wolfgang; Arens, Michael

    2014-10-01

    Aggregation of pixel-based motion detection into regions of interest, which include views of single moving objects in a scene, is an essential pre-processing step in many vision systems. Motion events of this type provide significant information about the object type or build the basis for action recognition. Further, motion is an essential saliency measure, which is able to effectively support high-level image analysis. When applied to static cameras, background subtraction methods achieve good results. On the other hand, motion aggregation on freely moving cameras is still a widely unsolved problem. The image flow measured on a freely moving camera results from two major motion types: first, the ego-motion of the camera, and second, object motion that is independent of the camera motion. When capturing a scene with a camera these two motion types are adversely blended together. In this paper, we propose an approach to detect multiple moving objects from a mobile monocular camera system in an outdoor environment. The overall processing pipeline consists of a fast ego-motion compensation algorithm in the preprocessing stage. Real-time performance is achieved by using a sparse optical flow algorithm as an initial processing stage and a densely applied probabilistic filter in the post-processing stage. Thereby, we follow the idea proposed by Jung and Sukhatme. Normalized intensity differences originating from a sequence of ego-motion compensated difference images represent the probability of moving objects. Noise and registration artefacts are filtered out using a Bayesian formulation. The resulting a posteriori distribution is located on image regions showing strong amplitudes in the difference image which are in accordance with the motion prediction. In order to effectively estimate the a posteriori distribution, a particle filter is used.
In addition to the fast ego-motion compensation, the main contribution of this paper is the design of the probabilistic filter for real-time detection and tracking of independently moving objects. The proposed approach introduces a competition scheme between particles in order to ensure an improved multi-modality. Further, the filter design helps to generate a particle distribution which is homogenous even in the presence of multiple targets showing non-rigid motion patterns. The effectiveness of the method is shown on exemplary outdoor sequences.
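    A minimal bootstrap particle filter conveys the core of such a probabilistic post-processing stage. This 1-D Python sketch, with invented noise parameters and a single target rather than the paper's multi-target, rival-penalized design, tracks a moving object from noisy detections and resamples when the effective sample size degenerates:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 1000                                   # number of particles
    true_pos, velocity = 0.0, 1.0
    particles = rng.uniform(-10.0, 10.0, N)   # initial spread over the state
    weights = np.ones(N) / N

    for step in range(50):
        true_pos += velocity
        z = true_pos + rng.normal(0.0, 0.5)                    # noisy detection
        particles = particles + velocity + rng.normal(0.0, 0.3, N)  # predict
        weights = weights * np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
        weights = weights / weights.sum()                      # Bayes update
        if 1.0 / np.sum(weights ** 2) < N / 2:                 # low ESS?
            idx = rng.choice(N, size=N, p=weights)             # resample
            particles, weights = particles[idx], np.ones(N) / N

    estimate = float(np.sum(weights * particles))
    print(abs(estimate - true_pos) < 1.0)   # the filter tracks the target
    ```

    The competition scheme proposed in the paper modifies the weighting/resampling steps so that several such modes can coexist, one per independently moving object.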

  13. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and to permit greater ease of use. The code was rewritten for an interactive computer environment. Furthermore, a multiple-iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple-iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta-scale phenomena.
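    A Barnes-type analysis with the multiple-iteration correction can be sketched as follows: each pass interpolates the residuals (observation minus current background) onto the grid with Gaussian distance weights, and later passes tighten the weight function. The weight parameters and the analytic test field here are illustrative, not PROAM's:

    ```python
    import numpy as np

    def barnes(obs_xy, obs_val, grid_xy, kappa=2.0, gamma=0.3, passes=2):
        """Barnes objective analysis: Gaussian-weighted averaging of
        station data onto grid points, with successive correction
        passes that re-analyze residuals using a reduced kappa."""
        grid = np.zeros(len(grid_xy))
        background_at_obs = np.zeros(len(obs_xy))
        k = kappa
        for _ in range(passes):
            resid = obs_val - background_at_obs
            for pts, field in ((grid_xy, grid), (obs_xy, background_at_obs)):
                d2 = ((pts[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
                w = np.exp(-d2 / k)
                field += (w @ resid) / w.sum(axis=1)
            k *= gamma          # tighter weights on correction passes
        return grid

    # analytic test field T(x, y) = 20 + x, as in the error-bound experiments
    obs = np.array([[0., 0.], [1., 0.], [2., 0.], [3., 0.],
                    [0., 1.], [1., 1.], [2., 1.], [3., 1.]])
    vals = 20 + obs[:, 0]
    grid = np.array([[1.5, 0.5]])
    print(barnes(obs, vals, grid))  # close to the analytic value 21.5
    ```

    Using a known analytic field, as the abstract describes, lets one quantify exactly how much each correction pass reduces the interpolation error.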

  14. Self-Aware Vehicles: Mission and Performance Adaptation to System Health

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Leonard, Charles; Scotti, Stephen J.

    2016-01-01

    Advances in sensing (miniaturization, distributed sensor networks) combined with improvements in computational power leading to significant gains in perception, real-time decision making/reasoning and dynamic planning under uncertainty as well as big data predictive analysis have set the stage for realization of autonomous system capability. These advances open the design and operating space for self-aware vehicles that are able to assess their own capabilities and adjust their behavior to either complete the assigned mission or to modify the mission to reflect their current capabilities. This paper discusses the self-aware vehicle concept and associated technologies necessary for full exploitation of the concept. A self-aware aircraft, spacecraft or system is one that is aware of its internal state, has situational awareness of its environment, can assess its capabilities currently and project them into the future, understands its mission objectives, and can make decisions under uncertainty regarding its ability to achieve its mission objectives.

  15. Space and biotechnology: An industry profile

    NASA Technical Reports Server (NTRS)

    Johnston, Richard S.; Norton, David J.; Tom, Baldwin H.

    1988-01-01

    The results of a study conducted by the Center for Space and Advanced Technology (CSAT) for NASA-JSC are presented. The objectives were to determine the interests and attitudes of the U.S. biotechnology industry toward space biotechnology and to prepare a concise review of the current activities of the biotechnology industry. In order to accomplish these objectives, two primary actions were taken. First, a questionnaire was designed, reviewed, and distributed to U.S. biotechnology companies. Second, reviews of the various biotechnology fields were prepared in several aspects of the industry. For each review, leading figures in the field were asked to prepare a brief review pointing out key trends and current industry technical problems. The result is a readable narrative of the biotechnology industry which will provide space scientists and engineers valuable clues as to where the space environment can be explored to advance the U.S. biotechnology industry.

  16. Structured thermal surface for radiative camouflage.

    PubMed

    Li, Ying; Bai, Xue; Yang, Tianzhi; Luo, Hailu; Qiu, Cheng-Wei

    2018-01-18

    Thermal camouflage has been successful in the conductive regime, where thermal metamaterials embedded in a conductive system can manipulate heat conduction inside the bulk. Most reported approaches are background-dependent and not applicable to radiative heat emitted from the surface of the system. A coating with engineered emissivity is one option for radiative camouflage, but only when the background has uniform temperature. Here, we propose a strategy for radiative camouflage of external objects on a given background using a structured thermal surface. The device is non-invasive and restores arbitrary background temperature distributions on its top. For many practical candidates of the background material with similar emissivity as the device, the object can thereby be radiatively concealed without a priori knowledge of the host conductivity and temperature. We expect this strategy to meet the demands of anti-detection and thermal radiation manipulation in complex unknown environments and to inspire developments in phononic and photonic thermotronics.
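    For the emissivity-coating option mentioned above, which works only against a uniform-temperature background, the matching condition follows directly from the Stefan-Boltzmann law: the object's emitted intensity must equal the background's. The temperatures and emissivities in this sketch are illustrative, not from the paper:

    ```python
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def radiated_intensity(emissivity, T_kelvin):
        """Grey-body radiated intensity per unit area."""
        return emissivity * SIGMA * T_kelvin ** 4

    # A warmer object (310 K) blends into a 300 K background of
    # emissivity 0.9 if its surface emissivity is engineered so the
    # radiated intensities match (all numbers illustrative):
    target = radiated_intensity(0.9, 300.0)
    eps_needed = target / (SIGMA * 310.0 ** 4)
    print(round(eps_needed, 3))  # ~0.789
    ```

    The limitation the paper addresses is visible here: a single engineered emissivity matches one background temperature only, whereas the proposed structured surface restores arbitrary background temperature distributions.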

  17. Ordering actions for visibility. [distributed computing based on idea of atomic actions operating on data

    NASA Technical Reports Server (NTRS)

    Mckendry, M. S.

    1985-01-01

    The notion of 'atomic actions' has been considered in recent work on data integrity and reliability. It has been found that the standard database operations of 'read' and 'write' carry with them severe performance limitations. For this reason, systems are now being designed in which actions operate on 'objects' through operations with more-or-less arbitrary semantics. An object (i.e., an instance of an abstract data type) comprises data, a set of operations (procedures) to manipulate the data, and a set of invariants. An 'action' is a unit of work. It appears to be primitive to its surrounding environment, and 'atomic' to other actions. Attention is given to the conventional model of nested actions, ordering requirements, the maximum possible visibility (full visibility) for items which must be controlled by ordering constraints, item management paradigms, and requirements for blocking mechanisms which provide the required visibility.

  18. The ACES mission: scientific objectives and present status

    NASA Astrophysics Data System (ADS)

    Cacciapuoti, L.; Dimarcq, N.; Salomon, C.

    2017-11-01

    "Atomic Clock Ensemble in Space" (ACES) is a mission in fundamental physics that will operate a new generation of atomic clocks in the microgravity environment of the International Space Station (ISS). The ACES clock signal will combine the medium-term frequency stability of a space hydrogen maser (SHM) and the long-term stability and accuracy of a frequency standard based on cold cesium atoms (PHARAO). Fractional frequency stability and accuracy of a few parts in 10^16 will be achieved. The on-board time base, distributed on Earth via a microwave link (MWL), will be used to test fundamental laws of physics (Einstein's theories of Special and General Relativity, Standard Model Extension, string theories…) and to develop applications in time and frequency metrology, universal time scales, global positioning and navigation, geodesy and gravimetry. After a general overview of the mission concept and its scientific objectives, the present status of ACES instruments and sub-systems will be discussed.

  19. Combining Agile and Traditional: Customer Communication in Distributed Environment

    NASA Astrophysics Data System (ADS)

    Korkala, Mikko; Pikkarainen, Minna; Conboy, Kieran

    Distributed development is a rapidly growing phenomenon in modern software development environments. At the same time, traditional and agile methodologies, and combinations of the two, are being used in industry. Agile approaches place a large emphasis on customer communication, yet existing knowledge on customer communication in distributed agile development seems to be lacking. In order to shed light on this topic and provide practical guidelines for companies in distributed agile environments, a qualitative case study was conducted in a large, globally distributed software company. The key finding was that it can be difficult for an agile organization to get relevant information from a traditional type of customer organization, even though customer communication was reported to be active and carried out over multiple communication media. Several challenges discussed in this paper refer to an "information blackout", underlining the importance of an environment that fosters meaningful communication. To help create such an environment, a set of guidelines is proposed.

  20. Captive Bottlenose Dolphins (Tursiops truncatus) Spontaneously Using Water Flow to Manipulate Objects

    PubMed Central

    Yamamoto, Chisato; Furuta, Keisuke; Taki, Michihiro; Morisaka, Tadamichi

    2014-01-01

    Several terrestrial animals and delphinids manipulate objects in a tactile manner, using parts of their bodies such as their mouths or hands. In this paper, we report that bottlenose dolphins (Tursiops truncatus) manipulate objects not by direct bodily contact, but by spontaneously generated water flow. Three of four dolphins at Suma Aqualife Park performed object manipulation with food. The typical sequence of object manipulation consisted of a three-step procedure. First, the dolphins released the object from the sides of their mouths while assuming a head-down posture near the floor. They then manipulated the object around their mouths and caught it. Finally, they ceased their head-down posture and started to swim. When the dolphins moved the object, they used the water current in the pool or moved their heads. These results show that dolphins manipulate objects using movements that do not involve direct contact between a body part and the object. When the dolphins dropped the object on the floor, they lifted it by generating water flow in one of three ways: opening and closing their mouths repeatedly, moving their heads lengthwise, or making circular head motions. This result suggests that bottlenose dolphins spontaneously change their environment to manipulate objects. One reason aquatic animals such as dolphins manipulate objects by changing their environment, while terrestrial animals do not, may be that water is far more viscous than air. This is the first report of any non-human mammal engaging in object manipulation using several methods to change its environment. PMID:25250625

  1. 3D workflow for HDR image capture of projection systems and objects for CAVE virtual environments authoring with wireless touch-sensitive devices

    NASA Astrophysics Data System (ADS)

    Prusten, Mark J.; McIntyre, Michelle; Landis, Marvin

    2006-02-01

    A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring, the actual CAVE environment and a sculpture, are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. Traditional methods of controlling navigation through virtual environments include gloves, HUDs, and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols, the non-graphical input control device can be eliminated and replaced by wireless devices such as PDAs, smartphones, Tablet PCs, portable gaming consoles, and Pocket PCs.

  2. The Lunar Atmosphere and Dust Environment Explorer (LADEE) Mission

    NASA Technical Reports Server (NTRS)

    Spremo, Stevan; Turner, Mark; Caffrey, Robert T.; Hine, Butler Preston

    2010-01-01

    The Lunar Atmosphere and Dust Environment Explorer (LADEE) is a Lunar science orbiter mission currently under development to address the goals of the National Research Council decadal surveys and the recent "Scientific Context for Exploration of the Moon" (SCEM) [1] report to study the pristine state of the lunar atmosphere and dust environment prior to significant human activities. LADEE will determine the composition of the lunar atmosphere and investigate the processes that control its distribution and variability, including sources, sinks, and surface interactions. LADEE will also determine whether dust is present in the lunar exosphere, and reveal the processes that contribute to its sources and variability. These investigations are relevant to our understanding of surface boundary exospheres and dust processes throughout the solar system, address questions regarding the origin and evolution of lunar volatiles, and have potential implications for future exploration activities. LADEE employs a high heritage science instrument payload including a neutral mass spectrometer, ultraviolet spectrometer, and dust sensor. In addition to the science payloads, LADEE will fly a laser communications system technology demonstration that could provide a building block for future space communications architectures. LADEE is an important component in NASA's portfolio of near-term lunar missions, addressing objectives that are currently not covered by other U.S. or international efforts, and whose observations must be conducted before large-scale human or robotic activities irrevocably perturb the tenuous and fragile lunar atmosphere. LADEE will also demonstrate the effectiveness of a low-cost, rapid-development program utilizing a modular bus design launched on the new Minotaur V launch vehicle. Once proven, this capability could enable future lunar missions in a highly cost constrained environment. 
This paper describes the LADEE objectives, mission design, and technical approach.

  3. ENERGY-NET (Energy, Environment and Society Learning Network): Best Practices to Enhance Informal Geoscience Learning

    NASA Astrophysics Data System (ADS)

    Rossi, R.; Elliott, E. M.; Bain, D.; Crowley, K. J.; Steiner, M. A.; Divers, M. T.; Hopkins, K. G.; Giarratani, L.; Gilmore, M. E.

    2014-12-01

    While energy links all living and non-living systems, the integration of energy, the environment, and society is often not clearly represented in grade 9-12 classrooms and informal learning venues. However, objective public learning that integrates these components is essential for improving public environmental literacy. ENERGY-NET (Energy, Environment and Society Learning Network) is a National Science Foundation funded initiative that uses an Earth Systems Science framework to guide experiential learning for high school students and to improve public learning opportunities regarding the energy-environment-society nexus in a museum setting. One of the primary objectives of the ENERGY-NET project is to develop a rich set of experiential learning activities that are presented as exhibits at the Carnegie Museum of Natural History in Pittsburgh, Pennsylvania (USA). Here we detail the evolution of the ENERGY-NET exhibit-building process and of exhibit content over the past three years. While preliminary plans called for the development of five "exploration stations" (i.e., traveling activity carts) per calendar year, the opportunity arose to create a single, larger topical exhibit per semester, which was expected to have a greater impact on museum visitors. Evaluative assessments conducted to date reveal important practices to be incorporated into ongoing exhibit development: 1) undergraduate mentors and teen exhibit developers should receive additional content training to allow richer exhibit materials; 2) the development process should be distributed over as long a time period as possible and emphasize iteration. This project can serve as a model for other collaborations between geoscience departments and museums. In particular, these practices may streamline the development of public presentations and increase the effectiveness of experiential learning activities.

  4. The distribution of infrared point sources in nearby elliptical galaxies

    NASA Astrophysics Data System (ADS)

    Gogoi, Rupjyoti; Shalima, P.; Misra, Ranjeev

    2018-02-01

    Infrared (IR) point sources observed by Spitzer in nearby early-type galaxies should either be bright sources in the galaxy, such as globular clusters, or background sources such as AGNs. These objects are often counterparts of sources in other wavebands, such as optical and X-rays, and the IR data provide crucial information regarding their nature. However, many of the IR sources may be background objects, and it is important to identify them or at least quantify the level of background contamination. Moreover, the distribution of these IR point sources in flux, distance from the centre, and colour would be useful in understanding their origin. Archival Spitzer IRAC images provide a unique opportunity for such a study, and here we present the results of such an analysis for four nearby galaxies: NGC 1399, NGC 2768, NGC 4365 and NGC 4649. We estimate the background contamination using several blank fields. Our results suggest that IR colours can be effectively used to differentiate between sources in the galaxy and background ones. In particular, we find that sources with AGN-like colours are indeed consistent with being background AGNs. For sources with non-AGN-like colours we compute the distribution of flux and normalised distance from the centre, which is found to be of a power-law form. Although our sample size is small, the power-law indices for the galaxies differ, perhaps indicating that the galaxy environment plays a part in their origin and nature.

  5. Collaborative Scheduling Using JMS in a Mixed Java and .NET Environment

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Wax, Allan; Lam, Ray; Baldwin, John; Borden, Chet

    2006-01-01

    A viewgraph presentation demonstrating collaborative scheduling using Java Message Service (JMS) in a mixed Java and .NET environment is given. The topics include: 1) NASA Deep Space Network scheduling; 2) Collaborative scheduling concept; 3) Distributed computing environment; 4) Platform concerns in a distributed environment; 5) Messaging and data synchronization; and 6) The prototype.
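    As a rough illustration of the messaging idea behind such a prototype (a hypothetical sketch, not the actual JMS-based implementation), a topic-based publish/subscribe bus lets heterogeneous clients synchronize schedule data without knowing about each other:

```python
from collections import defaultdict

class TopicBus:
    """Toy stand-in for a JMS topic: publishers send messages,
    and every subscriber to that topic receives each message."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = TopicBus()
received = []
bus.subscribe("schedule.updates", received.append)  # e.g. a Java client
bus.subscribe("schedule.updates", received.append)  # e.g. a .NET client behind a bridge
bus.publish("schedule.updates", {"antenna": "DSS-14", "slot": "02:00-04:00"})
assert len(received) == 2   # both platforms see the same schedule update
```

    In the real system the broker, not an in-process object, carries the messages, which is what decouples the Java and .NET sides.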

  6. A master-slave parallel hybrid multi-objective evolutionary algorithm for groundwater remediation design under general hydrogeological conditions

    NASA Astrophysics Data System (ADS)

    Wu, J.; Yang, Y.; Luo, Q.; Wu, J.

    2012-12-01

    This study presents a new hybrid multi-objective evolutionary algorithm, the niched Pareto tabu search combined with a genetic algorithm (NPTSGA), whereby the global search ability of the niched Pareto tabu search (NPTS) is improved by the diversification of candidate solutions arising from the evolving nondominated sorting genetic algorithm II (NSGA-II) population. The NPTSGA, coupled with the commonly used groundwater flow and transport codes MODFLOW and MT3DMS, is developed for the multi-objective optimal design of groundwater remediation systems. The proposed methodology is then applied to a large-scale field groundwater remediation system for cleanup of a large trichloroethylene (TCE) plume at the Massachusetts Military Reservation (MMR) in Cape Cod, Massachusetts. Furthermore, a master-slave (MS) parallelization scheme based on the Message Passing Interface (MPI) is incorporated into the NPTSGA to carry out objective function evaluations in a distributed-processor environment, which can greatly improve the efficiency of the NPTSGA in finding Pareto-optimal solutions for the real-world application. This study shows that the MS parallel NPTSGA, in comparison with the original NPTS and NSGA-II, can balance the tradeoff between diversity and optimality of solutions during the search process and is an efficient and effective tool for optimizing the multi-objective design of groundwater remediation systems under complicated hydrogeologic conditions.
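    A minimal sketch of the master-slave idea, assuming the expensive simulations can be treated as independent black-box evaluations; Python's multiprocessing stands in here for MPI, and the toy objective functions are placeholders for the MODFLOW/MT3DMS runs:

```python
from multiprocessing import Pool

def evaluate(candidate):
    """Stand-in for one expensive flow-and-transport simulation:
    returns the two objective values for a candidate remediation design
    (toy formulas, not the real model)."""
    x, y = candidate
    return (x**2 + y, x + y**2)

def master_evaluate(population, workers=4):
    """Master distributes candidate designs to slave processes and
    collects one objective vector per candidate, in order."""
    with Pool(workers) as pool:
        return pool.map(evaluate, population)

if __name__ == "__main__":
    population = [(1, 2), (3, 1), (0, 5)]
    print(master_evaluate(population, workers=2))
```

    Because each evaluation is independent, the speed-up is close to linear in the number of workers until the per-candidate cost no longer dominates communication, which is the regime the abstract targets.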

  7. Cooperative multisensor system for real-time face detection and tracking in uncontrolled conditions

    NASA Astrophysics Data System (ADS)

    Marchesotti, Luca; Piva, Stefano; Turolla, Andrea; Minetti, Deborah; Regazzoni, Carlo S.

    2005-03-01

    The presented work describes an innovative architecture for multi-sensor distributed video surveillance applications. The aim of the system is to track moving objects in outdoor environments with a cooperative strategy exploiting two video cameras. The system can also focus its attention on the faces of detected pedestrians, collecting snapshot frames of face images by segmenting and tracking them over time at different resolutions. The system employs two video cameras in a cooperative client/server structure: the first camera monitors the entire area of interest and detects moving objects using change detection techniques. The detected objects are tracked over time and their positions are indicated on a map representing the monitored area. The objects' coordinates are sent to the server sensor in order to point its zooming optics towards the moving object. The second camera tracks the objects at high resolution. Like the client camera, this sensor is calibrated, and the position of the object detected in the image-plane reference system is translated into coordinates on the same area map. In the map's common reference system, data fusion techniques are applied to achieve a more precise and robust estimation of the objects' tracks and to perform face detection and tracking. The novelty and strength of the work reside in the cooperative multi-sensor approach, in the high-resolution long-distance tracking, and in the automatic collection of biometric data, such as a clip of a person's face, for recognition purposes.

  8. The Spatial Distribution of Attention within and across Objects

    PubMed Central

    Hollingworth, Andrew; Maxcey-Richard, Ashleigh M.; Vecera, Shaun P.

    2011-01-01

    Attention operates to select both spatial locations and perceptual objects. However, the specific mechanism by which attention is oriented to objects is not well understood. We examined the means by which object structure constrains the distribution of spatial attention (i.e., a “grouped array”). Using a modified version of the Egly et al. object cuing task, we systematically manipulated within-object distance and object boundaries. Four major findings are reported: 1) spatial attention forms a gradient across the attended object; 2) object boundaries limit the distribution of this gradient, with the spread of attention constrained by a boundary; 3) boundaries within an object operate similarly to across-object boundaries: we observed object-based effects across a discontinuity within a single object, without the demand to divide or switch attention between discrete object representations; and 4) the gradient of spatial attention across an object directly modulates perceptual sensitivity, implicating a relatively early locus for the grouped array representation. PMID:21728455

  9. Computational study of the heat transfer of an avian egg in a tray.

    PubMed

    Eren Ozcan, S; Andriessens, S; Berckmans, D

    2010-04-01

    The development of an embryo in an avian egg depends largely on its temperature. The embryo temperature is affected by its environment and the heat produced by the egg. In this paper, eggshell temperature and the heat transfer characteristics from one egg in a tray toward its environment are studied by means of computational fluid dynamics (CFD). Computational fluid dynamics simulations have the advantage of providing extensive 3-dimensional information on velocity and eggshell temperature distribution around an egg that otherwise is not possible to obtain by experiments. However, CFD results need to be validated against experimental data. The objectives were (1) to find out whether CFD can successfully simulate eggshell temperature from one egg in a tray by comparing to previously conducted experiments, (2) to visualize air flow and air temperature distribution around the egg in a detailed way, and (3) to perform sensitivity analysis on several variables affecting heat transfer. To this end, a CFD model was validated using 2 sets of temperature measurements yielding an effective model. From these simulations, it can be concluded that CFD can effectively be used to analyze heat transfer characteristics and eggshell temperature distribution around an egg. In addition, air flow and temperature distribution around the egg are visualized. It has been observed that temperature differences up to 2.6 degrees C are possible at high heat production (285 mW) and horizontal low flow rates (0.5 m/s). Sensitivity analysis indicates that average eggshell temperature is mainly affected by the inlet air velocity and temperature, flow direction, and the metabolic heat of the embryo and less by the thermal conductivity and emissivity of the egg and thermal emissivity of the tray.

  10. Butyltin sorption onto freshwater sediments: from batch experiments to the field values

    NASA Astrophysics Data System (ADS)

    Bancon-Montingy, C.; Aubert, G.; Chahinian, N.; Meyer, J.; Brunel, V.; Tournoud, M. G.

    2009-04-01

    Butyltins, and most particularly TBT, were widely used by industry in the 1970s and 1980s, namely in anti-fouling paints on ships. Although banned in Europe since 2003, surveys still point out the presence of these compounds in both coastal and terrestrial environments. The resilience of organotin (OT) compounds can be explained by their high adsorption capacity. OTs can bond easily to particulate matter and "migrate" from the water column onto the sediments, where their half-life can extend to a few decades. Consequently, sediments can become important organotin stores and release OT compounds during dredging operations, storms, tides or floods. Studies on OT behaviour in freshwater environments, mainly sediments, are scarce in the literature compared with marine sediments. However, it is known that the sorption behaviour of organotin compounds on sediments is governed by the constituents of the sediments and the composition of the interstitial and overlying water: grain size distribution, clay minerals, organic matter, iron and aluminium (hydr)oxides, and carbonate in the sediments; salinity, ionic composition, and pH of the interstitial and overlying water. The main objective of this work is to assess butyltin adsorption onto the sediments of an intermittent river located in southern France, the Vène. Sediments were collected during high- and low-flow conditions, and batch experiments were set up using "natural" and "crushed" sediments to assess the adsorption kinetics. Classical batch experiments and GC-ICP-MS analysis were carried out to measure the distribution coefficient (Kd). The influence of organic substances on sorption processes for organotin species was studied, and the role of grain size distribution was assessed by comparing natural and crushed sediments. The results indicated that organotin compounds are sorbed easily and quickly onto freshwater sediments.
The adsorption isotherm for butyltins follows the Freundlich equation, which is used to describe the adsorption behaviour of non-polar organic matter; this is due to their organic substituent groups. The presence of organic matter modifies the sorption process: less OT is adsorbed onto the sediments. This leads to increased OT concentrations in solution and consequently a higher probability of assimilation by freshwater organisms. A comparison of our results with those reported in the literature for marine environments could not be carried out because of the wide differences in salinity and grain size distribution between the two environments.

  11. Environmental impoverishment and aging alter object recognition, spatial learning, and dentate gyrus astrocytes.

    PubMed

    Diniz, Daniel G; Foro, César A R; Rego, Carla M D; Gloria, David A; de Oliveira, Fabio R R; Paes, Juliana M P; de Sousa, Aline A; Tokuhashi, Tatyana P; Trindade, Lucas S; Turiel, Maíra C P; Vasconcelos, Erick G R; Torres, João B; Cunnigham, Colm; Perry, Victor H; Vasconcelos, Pedro F da Costa; Diniz, Cristovam W P

    2010-08-01

    Environmental and age-related effects on learning and memory were analysed and compared with changes observed in astrocyte laminar distribution in the dentate gyrus. Aged (20 months) and young (6 months) adult female albino Swiss mice were housed from weaning either in impoverished conditions or in enriched conditions, and tested for episodic-like and water maze spatial memories. After these behavioral tests, brain hippocampal sections were immunolabeled for glial fibrillary acidic protein to identify astrocytes. The effects of environmental enrichment on episodic-like memory were not dependent on age, and enrichment may protect water maze spatial learning and memory from declines induced by aging or an impoverished environment. In the dentate gyrus, the number of astrocytes increased with both aging and an enriched environment in the molecular layer, increased only with aging in the polymorphic layer, and was unchanged in the granular layer. We suggest that long-term, experience-induced glial plasticity under an enriched environment may represent at least part of the circuitry groundwork for improvements in behavioral performance in the aged mouse brain.

  12. Assessing the professional development needs of experienced nurse executive leaders.

    PubMed

    Leach, Linda Searle; McFarland, Patricia

    2014-01-01

    The objective of this study was to identify the professional development topics that senior nurse leaders believe are important to their advancement and success. Senior/experienced nurse leaders at the executive level are able to influence the work environment of nurses as well as institutional and health policy. Their development needs are likely to reflect this and other contemporary healthcare issues, and may differ from those of middle and frontline managers. A systematic way of assessing the professional development needs of these nurse leaders is needed. In this descriptive study, an online survey was distributed to a convenience sample of nurse leaders who were members of the Association of California Nurse Leaders (ACNL) or had participated in an ACNL program. Visionary leadership, leading complexity, and effective teams were the highest-ranked leadership topics. Leading change, advancing health: the future of nursing; healthy work environments; and healthcare reform were also highly ranked topics. Executive-level nurse leaders are important to nurse retention, effective work environments, and leading change. Regular assessment of, and attention to, the distinct professional development needs of executive-level nurse leaders is a valuable human capital investment.

  13. A Markovian state-space framework for integrating flexibility into space system design decisions

    NASA Astrophysics Data System (ADS)

    Lafleur, Jarret M.

    The past decades have seen the state of the art in aerospace system design progress from a scope of simple optimization to one including robustness, with the objective of permitting a single system to perform well even in off-nominal future environments. Integrating flexibility, or the capability to easily modify a system after it has been fielded in response to changing environments, into system design represents a further step forward. One challenge in accomplishing this rests in that the decision-maker must consider not only the present system design decision, but also sequential future design and operation decisions. Despite extensive interest in the topic, the state of the art in designing flexibility into aerospace systems, and particularly space systems, tends to be limited to analyses that are qualitative, deterministic, single-objective, and/or limited to consider a single future time period. To address these gaps, this thesis develops a stochastic, multi-objective, and multi-period framework for integrating flexibility into space system design decisions. Central to the framework are five steps. First, system configuration options are identified and costs of switching from one configuration to another are compiled into a cost transition matrix. Second, probabilities that demand on the system will transition from one mission to another are compiled into a mission demand Markov chain. Third, one performance matrix for each design objective is populated to describe how well the identified system configurations perform in each of the identified mission demand environments. The fourth step employs multi-period decision analysis techniques, including Markov decision processes from the field of operations research, to find efficient paths and policies a decision-maker may follow. The final step examines the implications of these paths and policies for the primary goal of informing initial system selection. 
Overall, this thesis unifies state-centric concepts of flexibility from economics and engineering literature with sequential decision-making techniques from operations research. The end objective of this thesis’ framework and its supporting tools is to enable selection of the next-generation space systems today, tailored to decision-maker budget and performance preferences, that will be best able to adapt and perform in a future of changing environments and requirements. Following extensive theoretical development, the framework and its steps are applied to space system planning problems of (1) DARPA-motivated multiple- or distributed-payload satellite selection and (2) NASA human space exploration architecture selection.
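    The framework's first four steps can be sketched as a small numerical example (all matrices here are hypothetical illustrations, not values from the thesis): a cost transition matrix, a mission demand Markov chain, a performance matrix, and finite-horizon value iteration to extract a switching policy for each (configuration, mission) state:

```python
import numpy as np

# Hypothetical two-configuration, two-mission illustration.
C = np.array([[0.0, 4.0],   # C[i, j]: cost of switching configuration i -> j
              [4.0, 0.0]])
P = np.array([[0.8, 0.2],   # P[m, n]: probability mission demand m -> n
              [0.3, 0.7]])
R = np.array([[5.0, 1.0],   # R[i, m]: performance of configuration i in mission m
              [1.0, 5.0]])

def plan(horizon):
    """Finite-horizon value iteration over (configuration, mission) states:
    each period, the decision-maker may switch configurations (paying the
    switching cost), earns performance in the current mission, and mission
    demand then evolves according to the Markov chain P."""
    n_cfg, n_mis = R.shape
    V = np.zeros((n_cfg, n_mis))            # value-to-go at the horizon
    policy = None
    for _ in range(horizon):
        Q = np.zeros((n_cfg, n_mis, n_cfg))
        for i in range(n_cfg):
            for m in range(n_mis):
                for j in range(n_cfg):      # candidate next configuration
                    Q[i, m, j] = -C[i, j] + R[j, m] + P[m] @ V[j]
        policy = Q.argmax(axis=2)           # best switch for each state
        V = Q.max(axis=2)
    return policy, V

policy, V = plan(horizon=3)
# With these numbers the policy switches to whichever configuration
# matches the current mission, since the performance gain repays the cost.
```

    The fifth step, examining the paths such a policy induces, amounts to simulating the chain P under the computed policy and tallying costs and performance along each path.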

  14. Distributed Generation Planning using Peer Enhanced Multi-objective Teaching-Learning based Optimization in Distribution Networks

    NASA Astrophysics Data System (ADS)

    Selvam, Kayalvizhi; Vinod Kumar, D. M.; Siripuram, Ramakanth

    2017-04-01

    In this paper, an optimization technique called the peer enhanced teaching-learning based optimization (PeTLBO) algorithm is applied in a multi-objective problem domain. The PeTLBO algorithm is parameter-less, which reduces the computational burden. The proposed peer enhanced multi-objective TLBO (PeMOTLBO) algorithm has been utilized to find a set of non-dominated optimal solutions [distributed generation (DG) location and sizing in a distribution network]. The objectives considered are real power loss and voltage deviation, subject to voltage limits and a maximum penetration level of DG in the distribution network. Since the DG considered is capable of injecting real and reactive power into the distribution network, the power factor is taken as 0.85 leading. The proposed peer enhanced multi-objective optimization technique provides different trade-off solutions; to find the best compromise solution, a fuzzy set theory approach is used. The effectiveness of the proposed PeMOTLBO is tested on the IEEE 33-bus and Indian 85-bus distribution systems. The performance is validated with Pareto fronts and two performance metrics (C-metric and S-metric), comparing against the robust multi-objective technique non-dominated sorting genetic algorithm-II and the basic TLBO.
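    A minimal sketch of the two generic ingredients named above, Pareto non-domination and fuzzy best-compromise selection, assuming both objectives are minimized (toy numbers, not results for the test systems):

```python
def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

def best_compromise(front):
    """Fuzzy-membership selection: each objective value gets a membership
    in [0, 1] (1 = best on the front, 0 = worst); the solution with the
    largest membership sum is taken as the best compromise."""
    n_obj = len(front[0])
    lo = [min(s[k] for s in front) for k in range(n_obj)]
    hi = [max(s[k] for s in front) for k in range(n_obj)]
    def mu(s):
        return sum((hi[k] - s[k]) / (hi[k] - lo[k]) if hi[k] > lo[k] else 1.0
                   for k in range(n_obj))
    return max(front, key=mu)

# Toy (loss, voltage-deviation) pairs; the last point is dominated.
solutions = [(0.10, 0.05), (0.08, 0.09), (0.05, 0.20), (0.095, 0.06), (0.12, 0.21)]
front = pareto_front(solutions)
```

    The fuzzy step matters because a Pareto front alone offers no single answer; the membership sum rewards solutions that are reasonably good in every objective rather than extreme in one.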

  15. Modelling microbial metabolic rewiring during growth in a complex medium.

    PubMed

    Fondi, Marco; Bosi, Emanuele; Presta, Luana; Natoli, Diletta; Fani, Renato

    2016-11-24

    In their natural environment, bacteria face a wide range of environmental conditions that change over time and that impose continuous rearrangements at all the cellular levels (e.g. gene expression, metabolism). When facing a nutritionally rich environment, for example, microbes first use the preferred compound(s) and only later start metabolizing the other one(s). A systemic re-organization of the overall microbial metabolic network in response to a variation in the composition/concentration of the surrounding nutrients has been suggested, although the range and the entity of such modifications in organisms other than a few model microbes has been scarcely described up to now. We used multi-step constraint-based metabolic modelling to simulate the growth in a complex medium over several time steps of the Antarctic model organism Pseudoalteromonas haloplanktis TAC125. As each of these phases is characterized by a specific set of amino acids to be used as carbon and energy source our modelling framework describes the major consequences of nutrients switching at the system level. The model predicts that a deep metabolic reprogramming might be required to achieve optimal biomass production in different stages of growth (different medium composition), with at least half of the cellular metabolic network involved (more than 50% of the metabolic genes). Additionally, we show that our modelling framework is able to capture metabolic functional association and/or common regulatory features of the genes embedded in our reconstruction (e.g. the presence of common regulatory motifs). Finally, to explore the possibility of a sub-optimal biomass objective function (i.e. that cells use resources in alternative metabolic processes at the expense of optimal growth) we have implemented a MOMA-based approach (called nutritional-MOMA) and compared the outcomes with those obtained with Flux Balance Analysis (FBA). 
Growth simulations under this scenario revealed the deep impact of the choice among alternative objective functions on the resulting predictions of flux distribution. Here we provide a time-resolved, systems-level scheme of PhTAC125 metabolic rewiring as a consequence of carbon source switching in a nutritionally complex medium. Our analyses suggest the presence of an efficient metabolic reprogramming machinery that continuously and promptly adapts to this nutritionally changing environment, consistent with adaptation to fast growth in a favourable, but probably inconstant and highly competitive, environment. Also, we show i) how functional partnership and co-regulation features can be predicted by integrating multi-step constraint-based metabolic modelling with fed-batch growth data, and ii) that performing simulations under a sub-optimal objective function may lead to different flux distributions with respect to canonical FBA.

  16. Objective food environments and health outcomes.

    PubMed

    Minaker, Leia M; Raine, Kim D; Wild, T Cameron; Nykiforuk, Candace I J; Thompson, Mary E; Frank, Lawrence D

    2013-09-01

    Pathways by which food environments affect residents' diet-related outcomes are still unclear. Understanding pathways may help decision makers identify food environment strategies to promote healthy diets. To examine the hypothesis that residents' perceptions mediate the relationship between objective food environment and residents' diet quality and weight status. In the Waterloo Region, Ontario, objective food environment data were collected from 422 food stores and 912 restaurants using the Nutrition Environment Measure Survey in Stores and Restaurants, a shelf-space measure of fruits and vegetables, and the Retail Food Environment Index. Waterloo Region households (n=2223) completed a subjective food environment perception survey; household members (n=4102) self-reported weight, height, and waist circumference. A subsample (1170 individuals within 690 households) completed diet records. Food environment data were collected in 2010; respondent data were collected from 2009-2010; and data were analyzed in 2012. A series of gender-specific models were conducted to test mediation, adjusting for household income, car ownership, age, and education level. Residents' perceptions did not mediate the relationship between objective measures and diet-related outcomes; instead, results revealed the direct effect of several objectively measured factors of the food environment (notably food access and relative food affordability) on outcomes. Perceptions generally were not associated with diet-related outcomes. These results reveal that in this setting, strategies aimed at improving residents' perceptions may be less effective than those acting directly on food environments to improve food access and relative food affordability. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
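    The mediation hypothesis tested here (perceptions mediating between objective food environment and outcomes) can be illustrated with the classic product-of-coefficients decomposition. The sketch below uses synthetic data; the variable roles and effect sizes are illustrative assumptions, not the study's data or its exact modelling approach:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    X = rng.normal(size=n)                       # objective food-environment measure (illustrative)
    M = 0.5 * X + rng.normal(size=n)             # resident perception, the candidate mediator
    Y = 0.4 * X + 0.3 * M + rng.normal(size=n)   # diet-related outcome

    def ols(y, *cols):
        """Return OLS coefficients [intercept, b1, b2, ...]."""
        A = np.column_stack([np.ones(len(y))] + list(cols))
        return np.linalg.lstsq(A, y, rcond=None)[0]

    total = ols(Y, X)[1]            # total effect of X on Y
    a = ols(M, X)[1]                # X -> M path
    coeffs = ols(Y, X, M)
    direct, b = coeffs[1], coeffs[2]
    indirect = a * b                # mediated (indirect) effect
    # For linear OLS models the decomposition is exact: total = direct + indirect.
    ```

    A finding of "no mediation", as reported above, corresponds to the indirect component being negligible while the direct effect remains.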

  17. Effects of the duration and inorganic nitrogen composition of a nutrient-rich patch on soil exploration by the roots of Lolium perenne in a heterogeneous environment.

    PubMed

    Nakamura, Ryoji; Kachi, N; Suzuki, J-I

    2010-05-01

    We investigated the growth of and soil exploration by Lolium perenne in a heterogeneous environment before its roots reached a nutrient-rich patch. Temporal changes in the distribution of inorganic nitrogen, i.e., NO₃⁻-N and NH₄⁺-N, in the heterogeneous environment during the experimental period were also examined. The results showed that roots explored the soil randomly, irrespective of the patchy distribution of inorganic nitrogen and of differences in the chemical composition of inorganic nitrogen between heterogeneous and homogeneous environments. We also elucidated the potential effects of patch duration and inorganic nitrogen distribution on soil exploration by roots, and thus on plant growth.

  18. Inter-relationships between objective and subjective measures of the residential environment among urban African American women

    PubMed Central

    Sealy-Jefferson, Shawnita; Messer, Lynne; Slaughter-Acey, Jaime; Misra, Dawn P.

    2016-01-01

    Background The inter-relationships between objective (census-based) and subjective (resident reported) measures of the residential environment are understudied in African American (AA) populations. Methods Using data from the Life Influences on Fetal Environments Study (2009–2011) (n=1,387) of AA women, we quantified the area-level variation in subjective reports of residential healthy food availability, walkability, safety and disorder that can be accounted for with an objective neighborhood disadvantage index (NDI). Two-level generalized linear models estimated associations between objective and subjective measures of the residential environment, accounting for individual-level covariates. Results In unconditional models, intraclass correlation coefficients for block-group variance in subjective reports ranged from 11% (healthy food availability) to 30% (safety). Models accounting for the NDI (versus both NDI and individual level covariates) accounted for more variance in healthy food availability (23% versus 8%) and social disorder (40% versus 38%). The NDI and individual level variables accounted for 39% and 51% of the area-level variation in walkability and safety. Associations between subjective and objective measures of the residential environment were significant and in the expected direction. Conclusions Future studies on neighborhood effects on health, especially among AAs, should include a wide range of residential environment measures, including subjective, objective and spatial contextual variables. PMID:28160971

  19. Inter-relationships between objective and subjective measures of the residential environment among urban African American women.

    PubMed

    Sealy-Jefferson, Shawnita; Messer, Lynne; Slaughter-Acey, Jaime; Misra, Dawn P

    2017-03-01

    The inter-relationships between objective (census based) and subjective (resident reported) measures of the residential environment are understudied in African American (AA) populations. Using data from the Life Influences on Fetal Environments Study (2009-2011; n = 1387) of AA women, we quantified the area-level variation in subjective reports of residential healthy food availability, walkability, safety, and disorder that can be accounted for with an objective neighborhood disadvantage index (NDI). Two-level generalized linear models estimated associations between objective and subjective measures of the residential environment, accounting for individual-level covariates. In unconditional models, intraclass correlation coefficients for block-group variance in subjective reports ranged from 11% (healthy food availability) to 30% (safety). Models accounting for the NDI (vs. both NDI and individual-level covariates) accounted for more variance in healthy food availability (23% vs. 8%) and social disorder (40% vs. 38%). The NDI and individual-level variables accounted for 39% and 51% of the area-level variation in walkability and safety, respectively. Associations between subjective and objective measures of the residential environment were significant and in the expected direction. Future studies on neighborhood effects on health, especially among AAs, should include a wide range of residential environment measures, including subjective, objective, and spatial contextual variables. Copyright © 2016 Elsevier Inc. All rights reserved.
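    The intraclass correlation coefficients reported above (the share of variance in subjective reports attributable to block groups) follow from a one-way random-effects variance decomposition. A sketch on simulated block-group data, estimating the ICC from between- and within-group mean squares (all numbers illustrative, not the study's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    groups, per = 50, 20                                 # block groups, residents per group
    block = rng.normal(scale=1.0, size=groups)           # block-group effects (variance 1)
    y = block[:, None] + rng.normal(scale=2.0, size=(groups, per))  # within variance 4

    gm = y.mean(axis=1)
    msb = per * np.var(gm, ddof=1)                       # between-group mean square
    msw = np.mean(np.var(y, axis=1, ddof=1))             # pooled within-group mean square
    icc = (msb - msw) / (msb + (per - 1) * msw)          # ANOVA estimator of the ICC
    # True ICC here is 1 / (1 + 4) = 0.2; the estimate should land nearby.
    ```

    An ICC of 30%, as found for safety, means block-group membership alone explains nearly a third of the variation in residents' reports before any covariates enter the model.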

  20. Parallel and Distributed Methods for Constrained Nonconvex Optimization—Part I: Theory

    NASA Astrophysics Data System (ADS)

    Scutari, Gesualdo; Facchinei, Francisco; Lampariello, Lorenzo

    2017-04-01

    In Part I of this paper, we proposed and analyzed a novel algorithmic framework for the minimization of a nonconvex (smooth) objective function, subject to nonconvex constraints, based on inner convex approximations. This Part II is devoted to the application of the framework to some resource allocation problems in communication networks. In particular, we consider two non-trivial case-study applications, namely: (generalizations of) i) the rate profile maximization in MIMO interference broadcast networks; and ii) the max-min fair multicast multigroup beamforming problem in a multi-cell environment. We develop a new class of algorithms enjoying the following distinctive features: i) they are distributed across the base stations (with limited signaling) and lead to subproblems whose solutions are computable in closed form; and ii) differently from current relaxation-based schemes (e.g., semidefinite relaxation), they are proved to always converge to d-stationary solutions of the aforementioned class of nonconvex problems. Numerical results show that the proposed (distributed) schemes achieve larger worst-case rates (resp. signal-to-noise interference ratios) than state-of-the-art centralized ones while having comparable computational complexity.

  1. CICADA, CCD and Instrument Control Software

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Brooks, Mick; Meatheringham, Stephen J.; Roberts, William H.

    Computerised Instrument Control and Data Acquisition (CICADA) is a software system for control of telescope instruments in a distributed computing environment. It is designed using object-oriented techniques and built with standard computing tools such as RPC, SysV IPC, Posix threads, Tcl, and GUI builders. The system is readily extensible to new instruments and currently supports the Astromed 3200 CCD controller and MSSSO's new tip-tilt system. Work is currently underway to provide support for the SDSU CCD controller and MSSSO's Double Beam Spectrograph. A core set of processes handle common communication and control tasks, while specific instruments are "bolted" on using C++ inheritance techniques.
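    The pattern described here, a core framework with instruments "bolted on" via inheritance, can be sketched as follows. The class and method names are hypothetical, and the sketch is in Python rather than the C++ of the actual system:

    ```python
    class CCDController:
        """Base controller: shared communication and control plumbing."""

        def __init__(self, name):
            self.name = name

        def expose(self, seconds):
            # Common exposure sequencing lives in the base class;
            # instrument-specific behaviour is delegated to a hook.
            self.setup()
            return f"{self.name}: {seconds}s exposure"

        def setup(self):
            raise NotImplementedError  # each instrument must override this

    class Astromed3200(CCDController):
        """An instrument 'bolted on' by subclassing the core controller."""

        def __init__(self):
            super().__init__("Astromed 3200")

        def setup(self):
            pass  # e.g. program readout registers for this controller

    status = Astromed3200().expose(2)
    ```

    New hardware is supported by adding another subclass, leaving the core communication processes untouched, which is the extensibility property the abstract claims.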

  2. Microplastics in sediments: A review of techniques, occurrence and effects.

    PubMed

    Van Cauwenberghe, Lisbeth; Devriese, Lisa; Galgani, François; Robbens, Johan; Janssen, Colin R

    2015-10-01

    Microplastics are omnipresent in the marine environment and sediments are hypothesized to be major sinks of these plastics. Here, over 100 articles spanning the last 50 years are reviewed with the following objectives: (i) to evaluate current microplastic extraction techniques, (ii) to discuss the occurrence and worldwide distribution of microplastics in sediments, and (iii) to make a comprehensive assessment of the possible adverse effects of this type of pollution on marine organisms. Based on this review we propose future research needs and conclude that there is a clear need for standardized techniques, unified reporting units and more realistic effect assessments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Date prints on stranded macroplastics: Marine litter as a chronological marker in recent coastal deposits

    NASA Astrophysics Data System (ADS)

    Sander, Lasse

    2017-04-01

    Plastic is a collective term describing a group of synthetic materials, most of which were invented over the course of the last century. Already in the 1970s, the magnitude of plastic pollution was recognized as an issue of concern for the global marine environment. It is hence no longer a rare event to encounter plastic fragments and objects in coastal or marine deposits. Plastic carries a chronological indication: a deposit containing plastic must be younger than the invention of the material. The potential of this approach was tested in an investigation into the spatial distribution of stranded macroplastics in recent overwash deposits in SW Denmark. Larger litter items can be surveyed as discrete objects and allow the retrieval of more precise, though indirect, age information, such as production-date prints. A subgroup of >110 georeferenced surface samples containing date information was surveyed in summer 2015. Objects with ages from the late 1970s until 2014 were encountered. The distribution of the litter was clearly non-random in relation to overwash morphology, and based on the collected samples, it was possible to reconstruct indications of both the timing and the extent of extreme events since the 1990s. These observations were cross-compared with a dense time series of satellite images and orthophotos. It is proposed that an improved interpretation of indications from the plastic record may be obtained by broader surveys including additional parameters, such as the exact location, elevation, chemical composition, assemblage, origin, product design, decay, fracturing or the colonization by marine sessile organisms, from all encountered macroplastic objects. If calibrated properly, the plastic assemblages may serve as fast, cheap and reliable chronological markers in recent coastal deposits.

  4. Gas-Grain Chemical Models: Inclusion of a Grain Size Distribution and a Study Of Young Stellar Objects in the Magellanic Clouds

    NASA Astrophysics Data System (ADS)

    Pauly, Tyler Andrew

    2017-06-01

    Computational models of interstellar gas-grain chemistry have aided in our understanding of star-forming regions. Chemical kinetics models rely on a network of chemical reactions and a set of physical conditions in which atomic and molecular species are allowed to form and react. We replace the canonical single grain-size in our chemical model MAGICKAL with a grain size distribution and analyze the effects on the chemical composition of the gas and grain surface in quiescent and collapsing dark cloud models. We find that a grain size distribution coupled with a temperature distribution across grain sizes can significantly affect the bulk ice composition when dust temperatures fall near critical values related to the surface binding energies of common interstellar chemical species. We then apply the updated model to a study of ice formation in the cold envelopes surrounding massive young stellar objects in the Magellanic Clouds. The Magellanic Clouds are local satellite galaxies of the Milky Way, and they provide nearby environments to study star formation at low metallicity. We expand the model calculation of dust temperature to include a treatment for increased interstellar radiation field intensity; we vary the radiation field to model the elevated dust temperatures observed in the Magellanic Clouds. We also adjust the initial elemental abundances used in the model, guided by observations of Magellanic Cloud HII regions. We are able to reproduce the relative ice fractions observed, indicating that metal depletion and elevated grain temperature are important drivers of the envelope ice composition. The observed shortfall in CO in Small Magellanic Cloud sources can be explained by a combination of reduced carbon abundance and increased grain temperatures. The models indicate that a large variation in radiation field strength is required to match the range of observed LMC abundances. 
CH3OH abundance is found to be enhanced (relative to total carbon abundance) in low-metallicity models, providing seed material for complex organic molecule formation. We conclude with a preliminary study of the recently discovered hot core in the Large Magellanic Cloud; we create a grid of models to simulate hot core formation in Magellanic Cloud environments, comparing them to models and observations of well-characterized galactic counterparts.

  5. [Adsorption characteristics of proteins on membrane surface and effect of protein solution environment on permeation behavior of berberine].

    PubMed

    Li, Yi-Qun; Xu, Li; Zhu, Hua-Xu; Tang, Zhi-Shu; Li, Bo; Pan, Yong-Lan; Yao, Wei-Wei; Fu, Ting-Ming; Guo, Li-Wei

    2017-10-01

    In order to explore the adsorption characteristics of proteins on the membrane surface and the effect of the protein solution environment on the permeation behavior of berberine, berberine and proteins were used as the research objects to prepare a simulated solution. Low-field NMR, static adsorption experiments and membrane separation experiments were used to study the interactions between the proteins and the ceramic membrane and between the proteins and berberine. The static adsorption capacity of proteins, membrane relative flux, rejection rate of proteins, transmittance rate of berberine and the adsorption rate of proteins and berberine were used as the evaluation indices. Meanwhile, the membrane resistance distribution, the particle size distribution and scanning electron microscopy (SEM) observations were used to investigate the adsorption characteristics of proteins on the ceramic membrane and their effect on the membrane separation process of berberine. The results showed that the ceramic membrane could adsorb the proteins and the adsorption behavior was consistent with the Langmuir adsorption model. In simulating the membrane separation process, proteins were the main factor causing membrane fouling. However, when the concentration of proteins was 1 g•L⁻¹, the proteins had no significant effect on the membrane separation process of berberine. Copyright© by the Chinese Pharmaceutical Association.

  6. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as the collection of data. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  7. Students' perception of the learning environment in a distributed medical programme.

    PubMed

    Veerapen, Kiran; McAleer, Sean

    2010-09-24

    The learning environment of a medical school has a significant impact on students' achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of the atmosphere, and students' social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The work load and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. 
This study underlines the need for ongoing comparative evaluation of the learning environment at the distributed sites and interaction between leaders of these sites.

  8. Direct and indirect effects of climate change on the risk of infection by water-transmitted pathogens.

    PubMed

    Sterk, Ankie; Schijven, Jack; de Nijs, Ton; de Roda Husman, Ana Maria

    2013-11-19

    Climate change is likely to affect the infectious disease burden from exposure to pathogens in water used for drinking and recreation. Effective intervention measures require quantification of impacts of climate change on the distribution of pathogens in the environment and their potential effects on human health. Objectives of this systematic review were to summarize current knowledge available to estimate how climate change may directly and indirectly affect infection risks due to Campylobacter, Cryptosporidium, norovirus, and Vibrio. Secondary objectives were to prioritize natural processes and interactions that are susceptible to climate change and to identify knowledge gaps. Search strategies were determined based on a conceptual model and scenarios with the main emphasis on The Netherlands. The literature search resulted in a large quantity of publications on climate variables affecting pathogen input and behavior in aquatic environments. However, not all processes and pathogens are evenly covered by the literature, and in many cases, the direction of change is still unclear. To make useful predictions of climate change, it is necessary to combine both negative and positive effects. This review provides an overview of the most important effects of climate change on human health and shows the importance of quantitative microbial risk assessment (QMRA) to quantify the net effects.

  9. Music to knowledge: A visual programming environment for the development and evaluation of music information retrieval techniques

    NASA Astrophysics Data System (ADS)

    Ehmann, Andreas F.; Downie, J. Stephen

    2005-09-01

    The objective of the International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) project is the creation of a large, secure corpus of audio and symbolic music data accessible to the music information retrieval (MIR) community for the testing and evaluation of various MIR techniques. As part of the IMIRSEL project, a cross-platform Java-based visual programming environment called Music to Knowledge (M2K) is being developed for a variety of music information retrieval related tasks. The primary objective of M2K is to supply the MIR community with a toolset that provides the ability to rapidly prototype algorithms, as well as foster the sharing of techniques within the MIR community through the use of a standardized set of tools. Due to the relatively large size of audio data and the computational costs associated with some digital signal processing and machine learning techniques, M2K is also designed to support distributed computing across computing clusters. In addition, facilities to allow the integration of non-Java-based (e.g., C/C++, MATLAB, etc.) algorithms and programs are provided within M2K. [Work supported by the Andrew W. Mellon Foundation and NSF Grants No. IIS-0340597 and No. IIS-0327371.]

  10. Security Implications of OPC, OLE, DCOM, and RPC in Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2006-01-01

    OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.

  11. Structural, Physical, and Compositional Analysis of Lunar Simulants and Regolith

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul; Street, Kenneth W.; Gaier, James

    2008-01-01

    Relative to the prior manned Apollo and unmanned robotic missions, planned Lunar initiatives are comparatively complex and longer in duration. Individual crew rotations are envisioned to span several months, and various surface systems must function in the Lunar environment for periods of years. As a consequence, an increased understanding of the surface environment is required to engineer and test the associated materials, components, and systems necessary to sustain human habitation and surface operations. The effort described here concerns the analysis of existing simulant materials, with application to Lunar return samples. The interplay between these analyses fulfills the objective of ascertaining the critical properties of regolith itself, and the parallel objective of developing suitable simulant materials for a variety of engineering applications. Presented here are measurements of the basic physical attributes, i.e. particle size distributions and general shape factors. Also discussed are structural and chemical properties, as determined through a variety of techniques, such as optical microscopy, SEM and TEM microscopy, Mössbauer spectroscopy, X-ray diffraction, Raman microspectroscopy, inductively coupled argon plasma emission spectroscopy and energy dispersive X-ray fluorescence mapping. A comparative description of currently available simulant materials is discussed, with implications for more detailed analyses, as well as the requirements for continued refinement of methods for simulant production.

  12. Recent benthic foraminifera and sedimentary facies from mangrove swamps and channels of Abu Dhabi (United Arab Emirates)

    NASA Astrophysics Data System (ADS)

    Fiorini, Flavia; Odeh, Weaam A. S. Al; Lokier, Stephen W.; Paul, Andreas

    2016-04-01

    Zonation of Recent mangrove environments can be defined using benthic foraminifera; however, little is known about foraminifera from mangrove environments of the Arabian Gulf. The objective of this study is to produce a detailed micropaleontological and sedimentological analysis to identify foraminiferal associations in several coastline environments (mangrove swamps and channels) located on the eastern side of Abu Dhabi Island (UAE). Detailed sediment sample collection in mangal environments of eastern Abu Dhabi was carried out to assess the distribution of living and dead benthic foraminifera in different sedimentary facies in the mangal and in the surrounding area, comprising natural environments of the upper and lower intertidal area (mud flats and channels) and areas modified by anthropogenic activities (dredged channels). The fine-grained sediments collected near mangrove (Avicennia marina) roots presented a high abundance of living and dead foraminiferal tests. The assemblages in these samples show very low diversity and are almost entirely constituted of small-sized opportunistic species belonging to the genera Ammonia and Elphidium. In particular:
    • Samples collected on the mud flat and in ponds at the margin of the channel show a foraminiferal assemblage characterised by abundant foraminifera belonging to the genera Ammonia, Elphidium, Triloculina, Quinqueloculina, Peneroplis and Spirolina.
    • Samples collected in the lower (wet) intertidal area close to Avicennia marina roots presented a low-diversity assemblage mostly comprising opportunistic foraminifera of the genera Ammonia and Elphidium along with rare miliolids.
    • Samples from the upper (dry) intertidal area close to Avicennia marina roots produced an assemblage exclusively composed of small-sized opportunistic Ammonia and Elphidium, together with abundant specimens belonging to the genus Trochammina.
    Trochammina specimens have not previously been recorded from Recent sedimentary samples of the coastline environments of the Arabian Gulf. The samples collected in the higher-energy settings (channels) were characterised by a very low abundance of foraminiferal tests; no or rare living forms were found in the coarser-grained facies. Most of the samples collected in the dredged channels were barren. The distribution of Recent benthic foraminifera from the mangrove environments of the Abu Dhabi region presents a powerful tool for constructing zonations of marine coastline environments and can be employed as a modern analogue for interpreting the depositional environment of ancient coastline sediments.

  13. Using Acoustics to Determine Eelgrass Bed Distribution and to Assess the Seasonal Variation of Ecosystem Service

    PubMed Central

    Sonoki, Shiori; Shao, Huamei; Morita, Yuka; Minami, Kenji; Shoji, Jun; Hori, Masakazu; Miyashita, Kazushi

    2016-01-01

    Eelgrass beds are an important source of primary production in coastal ecosystems. Understanding seasonal variation in the abundance and distribution of eelgrass is important for conservation, and the objectives of this study were to 1) monitor seasonal variation in eelgrass beds using an acoustic monitoring method (quantitative echo sounder) and 2) broadly quantify the carbon circulation function. We obtained acoustic data on eelgrass beds in coastal areas north and east of Ikunojima Island. Surveys were conducted nine times over the 3-year period from 2011 to 2013 in order to monitor seasonal variation. The acoustic data were used to estimate the spatial distribution of eelgrass by geostatistical methods. To quantify supporting services, we estimated the carbon sink and carbon fixation by eelgrass beds using data from the National Research Institute of Fisheries and Environment of Inland Sea (2011). The height and distribution of eelgrass beds were at a maximum in May and at a minimum in November of each year. Distribution trends differed between the north and east areas. Supporting services showed the same patterns throughout the year. The area of distribution was considered to be coincident with the life history of eelgrass. Distribution differed by area and changed yearly due to the effects of bottom characteristics and wind direction. Quantifying the supporting services of eelgrass beds was shown to be useful for managing the conservation of coastal ecosystems. PMID:26954673

  14. Distributed query plan generation using multiobjective genetic algorithm.

    PubMed

    Panicker, Shina; Kumar, T V Vijay

    2014-01-01

    A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using single objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem with the two objectives being minimize total LPC and minimize total CC. These objectives are simultaneously optimized using a multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for an observed crossover and mutation probability.
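    With two objectives (total LPC and total CC) minimized simultaneously, NSGA-II ranks candidate query plans by Pareto dominance rather than a single summed cost. A minimal sketch of the dominance test and the resulting nondominated front, over hypothetical (LPC, CC) cost pairs rather than real query plans:

    ```python
    def dominates(p, q):
        """True if plan p is at least as cheap as q on both objectives and strictly cheaper on one."""
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

    def pareto_front(plans):
        """Keep only plans not dominated by any other plan (NSGA-II's first front)."""
        return [p for p in plans if not any(dominates(q, p) for q in plans if q != p)]

    # Hypothetical candidate plans as (total LPC, total CC) cost pairs.
    plans = [(10, 7), (8, 9), (12, 4), (9, 6), (11, 5)]
    front = pareto_front(plans)
    # (10, 7) is dominated by (9, 6); the other four plans trade off LPC against CC.
    ```

    NSGA-II then fills the next generation front by front, using crowding distance to break ties, so the algorithm converges toward this trade-off surface instead of a single scalar optimum.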

  15. Distributed Query Plan Generation Using Multiobjective Genetic Algorithm

    PubMed Central

    Panicker, Shina; Vijay Kumar, T. V.

    2014-01-01

    A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using a single-objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem, with the two objectives being minimization of total LPC and minimization of total CC. These objectives are simultaneously optimized using the multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single-objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for the observed crossover and mutation probabilities. PMID:24963513
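The core of the NSGA-II formulation above is Pareto dominance over the two cost objectives. A minimal sketch (not the authors' implementation; the candidate plan costs below are hypothetical) of extracting the non-dominated front from (LPC, CC) pairs:

```python
def dominates(a, b):
    """Plan a dominates plan b if it is no worse in both objectives
    (LPC, CC) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    """Return the non-dominated subset of (lpc, cc) cost pairs, i.e.
    the first front used for ranking individuals in NSGA-II."""
    return [p for p in plans if not any(dominates(q, p) for q in plans if q != p)]

# Hypothetical candidate plans: (total local processing cost, total communication cost)
plans = [(10, 90), (20, 60), (30, 30), (40, 35), (50, 20), (60, 25)]
front = pareto_front(plans)
print(front)  # the LPC/CC trade-off curve among these candidates
```

Plans such as (40, 35) are discarded because (30, 30) is cheaper in both objectives; the surviving front represents the trade-offs a biobjective optimizer presents to the user.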

  16. Using the SWAT model to improve process descriptions and define hydrologic partitioning in South Korea

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.

    2014-02-01

    Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds, it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimations of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated to a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, incorporating anthropogenic features into modeling scenarios can enhance our understanding of the hydroecological impact.
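The abstract does not detail the quality-control, gap-filling algorithm itself. As an illustrative sketch only (not the authors' method; the `max_gap` threshold is an assumed parameter), a common approach linearly interpolates short gaps in a high-frequency series and leaves longer gaps missing:

```python
def fill_gaps(series, max_gap=3):
    """Linearly interpolate runs of None no longer than max_gap samples;
    longer runs and gaps touching the series ends are left missing."""
    out = list(series)
    i, n = 0, len(out)
    while i < n:
        if out[i] is None:
            j = i
            while j < n and out[j] is None:   # find the end of this gap
                j += 1
            gap = j - i
            if 0 < i and j < n and gap <= max_gap:
                left, right = out[i - 1], out[j]
                for k in range(gap):          # evenly spaced fill values
                    out[i + k] = left + (right - left) * (k + 1) / (gap + 1)
            i = j
        else:
            i += 1
    return out

print(fill_gaps([1.0, None, None, 4.0]))  # [1.0, 2.0, 3.0, 4.0]
```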

  17. A Global, Multi-Waveband Model for the Zodiacal Cloud

    NASA Technical Reports Server (NTRS)

    Grogan, Keith; Dermott, Stanley F.; Kehoe, Thomas J. J.

    2003-01-01

    This recently completed three-year project was undertaken by the PI at the University of Florida, NASA Goddard and JPL, and by the Co-I and Collaborator at the University of Florida. The funding was used to support a continuation of research conducted at the University of Florida over the last decade which focuses on the dynamics of dust particles in the interplanetary environment. The main objectives of this proposal were: To produce improved dynamical models of the zodiacal cloud by performing numerical simulations of the orbital evolution of asteroidal and cometary dust particles. To provide visualizations of the results using our visualization software package, SIMUL, simulating the viewing geometries of IRAS and COBE and comparing the model results with archived data. To use the results to provide a more accurate model of the brightness distribution of the zodiacal cloud than existing empirical models. In addition, our dynamical approach can provide insight into fundamental properties of the cloud, including but not limited to the total mass and surface area of dust, the size-frequency distribution of dust, and the relative contributions of asteroidal and cometary material. The model can also be used to provide constraints on trace signals from other sources, such as dust associated with the "Plutinos", objects captured in the 2:3 resonance with Neptune.

  18. MIRADAS control system

    NASA Astrophysics Data System (ADS)

    Rosich Minguell, Josefina; Garzón Lopez, Francisco

    2012-09-01

    The Mid-resolution InfRAreD Astronomical Spectrograph (MIRADAS, a near-infrared multi-object echelle spectrograph operating at spectral resolution R=20,000 over the 1-2.5μm bandpass) was selected in 2010 by the Gran Telescopio Canarias (GTC) partnership as the next-generation near-infrared spectrograph for the world's largest optical/infrared telescope, and is being developed by an international consortium. The MIRADAS consortium includes the University of Florida, Universidad de Barcelona, Universidad Complutense de Madrid, Instituto de Astrofísica de Canarias, Institut de Física d'Altes Energies, Institut d'Estudis Espacials de Catalunya and Universidad Nacional Autónoma de México. This paper gives an overview of the MIRADAS control software, which follows the standards defined by the telescope to permit the integration of this software in the GTC Control System (GCS). The MIRADAS control system is based on a distributed architecture according to a component model where every subsystem is self-contained. The GCS is a distributed environment written in object-oriented C++, which runs components on different computers, using CORBA middleware for communications. Each MIRADAS observing mode, including engineering, monitoring and calibration modes, will have its own predefined sequence, which is executed in the GCS Sequencer. These sequences will have the ability to communicate with other telescope subsystems.

  19. A revised and updated catalog of quasi-stellar objects

    NASA Technical Reports Server (NTRS)

    Hewitt, A.; Burbidge, G.

    1993-01-01

    The paper contains a catalog of all known quasi-stellar objects (QSOs) with measured emission redshifts, and BL Lac objects, complete to 1992 December 31. The catalog contains 7315 objects, nearly all QSOs including about 90 BL Lac objects. The catalog and references contain extensive information on names, positions, magnitudes, colors, emission-line redshifts, absorption, variability, polarization, and X-ray, radio, and infrared data. A key in the form of subsidiary tables enables the reader to relate the name of a given object to its coordinate name, which is used throughout the compilation. Plots of the Hubble diagram, the apparent magnitude distribution, the emission redshift distribution, and the distribution of the QSOs on the sky are also given.

  20. 40 CFR 403.2 - Objectives of general pretreatment regulations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 30 2013-07-01 2012-07-01 true Objectives of general pretreatment regulations. 403.2 Section 403.2 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... POLLUTION § 403.2 Objectives of general pretreatment regulations. By establishing the responsibilities of...

  1. 40 CFR 85.1408 - Objections to certification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Objections to certification. 85.1408 Section 85.1408 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM MOBILE SOURCES Urban Bus Rebuild Requirements § 85.1408 Objections...

  2. Objective and subjective measures of neighborhood environment (NE): relationships with transportation physical activity among older persons.

    PubMed

    Nyunt, Ma Shwe Zin; Shuvo, Faysal Kabir; Eng, Jia Yen; Yap, Keng Bee; Scherer, Samuel; Hee, Li Min; Chan, Siew Pang; Ng, Tze Pin

    2015-09-15

    This study examined the associations of subjective and objective measures of the neighbourhood environment with the transportation physical activity of community-dwelling older persons in Singapore. A modified version of the Neighborhood Environment Walkability Scale (NEWS) and Geographical Information System (GIS) measures of built environment characteristics were related to the frequency of walking for transportation purposes in a study sample of older persons living in high-density apartment blocks within a public housing estate in Singapore. The complex relationships among built environment measures and transportation physical activity were examined using structural equation modelling and multiple regression analyses. The subjective measures of residential density, street connectivity, land use mix diversity and aesthetic environment and the objective GIS measure of Accessibility Index showed significant positive independent associations with transportation physical activity, after adjusting for demographic, socio-economic and health status. Subjective and objective measures are non-overlapping measures complementing each other in providing information on built environment characteristics. For elderly persons living in a high-density urban neighbourhood, well-connected streets, diverse land use mix, close proximity to amenities and facilities, and an aesthetic environment were associated with a higher frequency of walking for transportation purposes.

  3. Class imbalance in unsupervised change detection - A diagnostic analysis from urban remote sensing

    NASA Astrophysics Data System (ADS)

    Leichtle, Tobias; Geiß, Christian; Lakes, Tobia; Taubenböck, Hannes

    2017-08-01

    Automatic monitoring of changes on the Earth's surface is an intrinsic capability and simultaneously a persistent methodological challenge in remote sensing, especially regarding imagery with very-high spatial resolution (VHR) and complex urban environments. In order to enable a high level of automatization, the change detection problem is solved in an unsupervised way to alleviate efforts associated with collection of properly encoded prior knowledge. In this context, this paper systematically investigates the nature and effects of class distribution and class imbalance in an unsupervised binary change detection application based on VHR imagery over urban areas. For this purpose, a diagnostic framework for sensitivity analysis of a large range of possible degrees of class imbalance is presented, which is of particular importance with respect to unsupervised approaches where the content of images and thus the occurrence and the distribution of classes are generally unknown a priori. Furthermore, this framework can serve as a general technique to evaluate model transferability in any two-class classification problem. The applied change detection approach is based on object-based difference features calculated from VHR imagery and subsequent unsupervised two-class clustering using k-means, genetic k-means and self-organizing map (SOM) clustering. The results from two test sites with different structural characteristics of the built environment demonstrated that classification performance is generally worse in imbalanced class distribution settings, while best results were reached in balanced or close to balanced situations. Regarding suitable accuracy measures for evaluating model performance in imbalanced settings, this study revealed that the Kappa statistic shows a significant response to class distribution, while the true skill statistic was largely insensitive to imbalanced classes. In general, the genetic k-means clustering algorithm achieved the most robust results with respect to class imbalance, while the SOM clustering exhibited a distinct optimization towards a balanced distribution of classes.
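The reported contrast between the Kappa statistic and the true skill statistic (TSS) can be illustrated directly: holding sensitivity and specificity fixed while changing class prevalence leaves the TSS unchanged but shifts Kappa. A small sketch with hypothetical confusion matrices:

```python
def kappa_and_tss(tp, fn, fp, tn):
    """Cohen's kappa and the true skill statistic from a 2x2 confusion
    matrix; TSS = sensitivity + specificity - 1."""
    n = tp + fn + fp + tn
    po = (tp + tn) / n                                             # observed agreement
    pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    tss = tp / (tp + fn) + tn / (tn + fp) - 1
    return kappa, tss

# Same sensitivity/specificity (0.8/0.8), different class balance:
balanced = kappa_and_tss(80, 20, 20, 80)     # 100 "change" vs 100 "no change"
imbalanced = kappa_and_tss(16, 4, 36, 144)   # 20 "change" vs 180 "no change"
print(balanced)    # kappa = 0.6,   TSS = 0.6
print(imbalanced)  # kappa ~ 0.35,  TSS = 0.6
```

Kappa drops from 0.6 to about 0.35 purely because of prevalence, mirroring the study's finding that Kappa responds to class distribution while the TSS does not.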

  4. Confounding environmental colour and distribution shape leads to underestimation of population extinction risk.

    PubMed

    Fowler, Mike S; Ruokolainen, Lasse

    2013-01-01

    The colour of environmental variability influences the size of population fluctuations when filtered through density dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical feedback) models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample Skewness and Kurtosis and decreasing mean Kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single and series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations. 
We must let the characteristics of known natural environmental covariates (e.g., colour and distribution shape) guide us in our choice of how to best model the impact of coloured environmental variation on population dynamics.
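An AR(1) environmental series of the kind discussed above can be generated with a one-step recursion. In this sketch (the parameter name `kappa` is illustrative) the innovations are scaled so the stationary variance stays at 1, and the realised colour is checked via the lag-1 autocorrelation:

```python
import math
import random

def ar1_series(n, kappa, seed=0):
    """AR(1) noise: x[t] = kappa*x[t-1] + e[t]*sqrt(1-kappa^2) with white
    Gaussian innovations; kappa > 0 gives red noise, kappa < 0 blue."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1)]
    scale = math.sqrt(1 - kappa ** 2)   # keeps the stationary variance at 1
    for _ in range(n - 1):
        x.append(kappa * x[-1] + rng.gauss(0, 1) * scale)
    return x

def lag1_autocorr(x):
    m = sum(x) / len(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den

red = ar1_series(20000, 0.7)
blue = ar1_series(20000, -0.7)
print(round(lag1_autocorr(red), 2), round(lag1_autocorr(blue), 2))
```

With Gaussian innovations this AR(1) process is itself normally distributed; the paper's point is that other colouring procedures (and finite-sample effects) can distort the distribution shape, which spectral mimicry corrects for.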

  5. Instructional Design Issues in a Distributed Collaborative Engineering Design (CED) Instructional Environment

    ERIC Educational Resources Information Center

    Koszalka, Tiffany A.; Wu, Yiyan

    2010-01-01

    Changes in engineering practices have spawned changes in engineering education and prompted the use of distributed learning environments. A distributed collaborative engineering design (CED) course was designed to engage engineering students in learning about and solving engineering design problems. The CED incorporated an advanced interactive…

  6. The Spatial Distribution of Attention within and across Objects

    ERIC Educational Resources Information Center

    Hollingworth, Andrew; Maxcey-Richard, Ashleigh M.; Vecera, Shaun P.

    2012-01-01

    Attention operates to select both spatial locations and perceptual objects. However, the specific mechanism by which attention is oriented to objects is not well understood. We examined the means by which object structure constrains the distribution of spatial attention (i.e., a "grouped array"). Using a modified version of the Egly et…

  7. An object-based storage model for distributed remote sensing images

    NASA Astrophysics Data System (ADS)

    Yu, Zhanwu; Li, Zhongmin; Zheng, Sheng

    2006-10-01

    It is very difficult to design an integrated storage solution for distributed remote sensing images that offers high-performance network storage services and secure data sharing across platforms using current network storage models such as direct attached storage, network attached storage and storage area networks. Object-based storage, a new generation of network storage technology that has emerged recently, separates the data path, the control path and the management path, which solves the metadata bottleneck problem of traditional storage models, and has the characteristics of parallel data access, data sharing across platforms, intelligence of storage devices and security of data access. We use object-based storage in the storage management of remote sensing images to construct an object-based storage model for distributed remote sensing images. In the storage model, remote sensing images are organized as remote sensing objects stored in the object-based storage devices. Based on the storage model, we present the architecture of a distributed remote sensing image application system built on object-based storage, and give test results comparing the write performance of the traditional network storage model and the object-based storage model.
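The separation of data, control, and management paths described above can be sketched as a toy model (class names, placement rule, and object ids are hypothetical, not the authors' design): a metadata server resolves an image object's location once, after which clients read the data directly from the storage device, so bulk transfers bypass the metadata path:

```python
class ObjectStorageDevice:
    """Holds remote-sensing objects (id -> bytes): the data path."""
    def __init__(self):
        self._objects = {}

    def put(self, oid, data):
        self._objects[oid] = data

    def get(self, oid):
        return self._objects[oid]

class MetadataServer:
    """Maps object ids to devices (the control/management path); clients
    contact it once, then exchange data directly with the device."""
    def __init__(self, devices):
        self.devices = devices
        self.location = {}

    def store(self, oid, data):
        dev = self.devices[hash(oid) % len(self.devices)]  # trivial placement rule
        dev.put(oid, data)
        self.location[oid] = dev

    def locate(self, oid):
        return self.location[oid]

mds = MetadataServer([ObjectStorageDevice(), ObjectStorageDevice()])
mds.store("scene-001/tile-3", b"...raster bytes...")
device = mds.locate("scene-001/tile-3")   # one metadata lookup
print(device.get("scene-001/tile-3"))     # data read bypasses the metadata server
```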

  8. Overview of Mars Science Laboratory (MSL) Environmental Program

    NASA Technical Reports Server (NTRS)

    Forgave, John C.; Man, Kin F.; Hoffman, Alan R.

    2006-01-01

    This viewgraph presentation is an overview of the Mars Science Laboratory (MSL) program. The engineering objectives of the program are to create a mobile science laboratory capable of one Mars year of surface operational lifetime (670 Martian sols = 687 Earth days). It will be able to land and operate over a wide range of latitudes, altitudes and seasons. It must perform a controlled propulsive landing and demonstrate improved landing precision via guided entry. The general science objectives are to perform science that will focus on Mars habitability, perform next-generation analytical laboratory science investigations, perform remote sensing/contact investigations, and carry a suite of environmental monitoring instruments. Specific scientific objectives of the MSL are: (1) Characterization of geological features, contributing to deciphering geological history and the processes that have modified rocks and regolith, including the role of water. (2) Determination of the mineralogy and chemical composition (including an inventory of elements such as C, H, N, O, P, S, etc. known to be building blocks for life) of surface and near-surface materials. (3) Determination of energy sources that could be used to sustain biological processes. (4) Characterization of organic compounds and potential biomarkers in representative regolith, rocks, and ices. (5) Determination of the stable isotopic and noble gas composition of the present-day bulk atmosphere. (6) Identification of potential bio-signatures (chemical, textural, isotopic) in rocks and regolith. (7) Characterization of the broad spectrum of surface radiation, including galactic cosmic radiation, solar proton events, and secondary neutrons. (8) Characterization of the local environment, including basic meteorology, the state and cycling of water and CO2, and the near-surface distribution of hydrogen. Several views of the planned MSL and the rover are shown. The MSL environmental program is to: (1) Ensure the flight hardware design is capable of surviving all the environments throughout its mission lifetime, including ground, transportation, launch, cruise, entry, descent, and landing (EDL), and surface operation environments. (2) Verify that environmental testing and analysis have adequately validated the flight hardware's ability to withstand all natural, self-induced, and mission-activity-induced environments. The planned tests to ascertain the capability of the MSL to perform as desired are reviewed.

  9. A proposed-standard format to represent and distribute tomographic models and other earth spatial data

    NASA Astrophysics Data System (ADS)

    Postpischl, L.; Morelli, A.; Danecek, P.

    2009-04-01

    Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found in related documents or publications (if available at all) only. As a consequence, use of various tomographic models from different authors requires considerable effort, is more cumbersome than it should be and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data-structures, both human and machine readable, that are automatically recognised by general-purpose software agents, and easily imported in the scientific programming environment. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on Json (JavaScript Object Notation), a plain-text, human-readable lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure etc.) into a single resource. 
It is equally suited to represent other geo-referenced volumetric quantities — beyond tomographic models — as well as (structured and unstructured) computational meshes. This approach can exploit the capabilities of the web browser as a computing platform: a series of in-page quick tools for comparative analysis between models will be presented, as well as visualisation techniques for tomographic layers in Google Maps and Google Earth. We are working on tools for conversion into common scientific format like netCDF, to allow easy visualisation in GEON-IDV or gmt.
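A self-describing JSON model record in the spirit described above might look as follows; the field names and values are illustrative assumptions, not the proposed schema itself:

```python
import json

# Hypothetical self-describing tomographic model record: metadata and
# grid parameterization travel together with the values in one resource.
model = {
    "metadata": {
        "name": "example-vs-model",
        "authors": ["A. Seismologist"],
        "units": "km/s",
        "reference": "doi:10.0000/example",
    },
    "parameterization": {
        "type": "regular-grid",
        "axes": ["latitude", "longitude", "depth_km"],
        "shape": [2, 2, 2],
    },
    "values": [3.5, 3.6, 3.4, 3.7, 4.4, 4.5, 4.3, 4.6],
}

text = json.dumps(model, indent=2)   # plain text: human-readable and versionable
restored = json.loads(text)          # any JSON-aware agent can parse it back
print(restored["metadata"]["units"], len(restored["values"]))
```

Because the record is ordinary JSON, it loads directly in a browser, a scripting environment, or a scientific computing session without a format-specific parser, which is the interoperability argument the abstract makes.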

  10. Development of a Gamma-Ray Spectrometer for Korean Pathfinder Lunar Orbiter

    NASA Astrophysics Data System (ADS)

    Kim, Kyeong Ja; Park, Junghun; Choi, Yire; Lee, Sungsoon; Yeon, Youngkwang; Yi, Eung Seok; Jeong, Meeyoung; Sun, Changwan; van Gasselt, Stephan; Lee, K. B.; Kim, Yongkwon; Min, Kyungwook; Kang, Kyungin; Cho, Jinyeon; Park, Kookjin; Hasebe, Nobuyuki; Elphic, Richard; Englert, Peter; Gasnault, Olivier; Lim, Lucy; Shibamura, Eido; GRS Team

    2016-10-01

    Korea is preparing for a lunar orbiter mission (KPLO) to be developed no later than 2018. Onboard the spacecraft is a gamma-ray spectrometer (KLGRS) designed to collect gamma-ray signals in order to detect elements either by X-ray fluorescence or by natural radioactive decay, in the low- as well as the higher-energy regions of up to 10 MeV. Scientific objectives include lunar resources (water and volatile measurements, rare earth elements and precious metals, energy resources, and major elemental distributions for prospective in-situ utilization), investigation of the lunar geology, and studies of the lunar environment (mapping of the global radiation environment from keV to 10 MeV, and the high-energy cosmic ray flux using the plastic scintillator). The Gamma-Ray Spectrometer (GRS) system is a compact, low-weight instrument for the chemical analysis of lunar surface materials within a gamma-ray energy range from tens of keV to 10 MeV. The main LaBr3 detector is surrounded by an anti-coincidence counting module of BGO/PS scintillators to reduce both the low-energy gamma-ray background from the spacecraft and housing materials and the high-energy gamma-ray background from cosmic rays. The GRS system will determine the elemental compositions of the near surface of the Moon. The GRS system is a recently developed scintillation-based gamma-ray detector that can be used as a replacement for the HPGe GRS sensor, with the advantage of being able to operate over a wide range of temperatures with remarkable energy resolution. LaBr3 also has a high photoelectron yield, fast scintillation response, good linearity and thermal stability. With these major advantages, the LaBr3 GRS system will allow us to pursue these scientific objectives and assess important research questions on lunar geology and resource exploration. The GRS investigation will help to address open questions related to the spatial distribution and origin of the elements on the lunar surface and will contribute to unravelling the geological surface evolution and elemental distributions of potential lunar resources.

  11. Robot environment expert system

    NASA Technical Reports Server (NTRS)

    Potter, J. L.

    1985-01-01

    The Robot Environment Expert System uses a hexadecimal tree data structure to model a complex robot environment where not only the robot arm moves, but also the robot itself and other objects may move. The hextree model allows dynamic updating, collision avoidance, and path planning over time to avoid moving objects.
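The abstract does not specify the hextree's branching rule. One plausible reading, assumed here purely for illustration, is a 16-way tree obtained by halving each of four axes (x, y, z, t), which naturally supports collision queries over time:

```python
class HexTreeNode:
    """16-way tree over a 4-D region (x, y, z, t); each level halves every
    axis, giving 2**4 = 16 children. This branching scheme is an assumption:
    the abstract does not say how the hextree subdivides space-time."""
    def __init__(self, lo, hi, depth=0, max_depth=4):
        self.lo, self.hi = lo, hi           # 4-tuples: region corners
        self.depth, self.max_depth = depth, max_depth
        self.occupied = False
        self.children = None

    def _child_index(self, p):
        mid = [(a + b) / 2 for a, b in zip(self.lo, self.hi)]
        return sum((1 << i) for i in range(4) if p[i] >= mid[i])

    def insert(self, p):
        """Mark the leaf region containing point p = (x, y, z, t) occupied."""
        if self.depth == self.max_depth:
            self.occupied = True
            return
        if self.children is None:
            self.children = {}
        idx = self._child_index(p)
        if idx not in self.children:
            mid = [(a + b) / 2 for a, b in zip(self.lo, self.hi)]
            lo = tuple(mid[i] if idx >> i & 1 else self.lo[i] for i in range(4))
            hi = tuple(self.hi[i] if idx >> i & 1 else mid[i] for i in range(4))
            self.children[idx] = HexTreeNode(lo, hi, self.depth + 1, self.max_depth)
        self.children[idx].insert(p)

    def collides(self, p):
        """True if the leaf region containing p is occupied at that time."""
        if self.depth == self.max_depth:
            return self.occupied
        if self.children is None:
            return False
        idx = self._child_index(p)
        return idx in self.children and self.children[idx].collides(p)

root = HexTreeNode((0, 0, 0, 0), (16, 16, 16, 16))
root.insert((3, 3, 3, 5))              # obstacle at (3,3,3) around time t=5
print(root.collides((3, 3, 3, 5)))     # True
print(root.collides((3, 3, 3, 12)))    # False: same place, later time
```

Including time as a fourth axis lets the same query answer both "is this cell blocked?" and "is it blocked *when the robot will be there*?", which matches the abstract's "path planning over time."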

  12. Physical and Biological Controls of Copepod Aggregation and Baleen Whale Distribution

    DTIC Science & Technology

    2010-09-30

    DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited. The objectives of this study are to elucidate the mechanisms of copepod aggregation in the Great South Channel, a...

  13. Advanced Operating System Technologies

    NASA Astrophysics Data System (ADS)

    Cittolin, Sergio; Riccardi, Fabio; Vascotto, Sandro

    In this paper we describe an R&D effort to define an OS architecture suitable for the requirements of the data acquisition and control of an LHC experiment. Large distributed computing systems are foreseen to be the core part of the DAQ and control systems of the future LHC experiments. Networks of thousands of processors, handling dataflows of several gigabytes per second with very strict timing constraints (microseconds), will become a common experience in the following years. Problems like distributed scheduling, real-time communication protocols, failure tolerance, and distributed monitoring and debugging will have to be faced. A solid software infrastructure will be required to manage this very complicated environment; at this moment neither does CERN have the necessary expertise to build it, nor does any similar commercial implementation exist. Fortunately these problems are not unique to particle and high energy physics experiments, and current research in the distributed systems field, especially in the distributed operating systems area, is trying to address many of the above-mentioned issues. The world that we are going to face in the next ten years will be quite different and surely much more interconnected than the one we see now. Very ambitious projects exist, planning to link towns, nations and the world in a single "Data Highway". Teleconferencing, Video on Demand, and distributed multimedia applications are just a few examples of the very demanding tasks to which the computer industry is committing itself. These projects are triggering a great research effort in the distributed, real-time, micro-kernel-based operating systems field and in the software engineering areas. The purpose of our group is to collect the outcome of these different research efforts, and to establish a working environment where the different ideas and techniques can be tested, evaluated and possibly extended, to address the requirements of a DAQ and control system suitable for LHC. Our work started in the second half of 1994, with a research agreement between CERN and Chorus Systemes (France), world leader in micro-kernel OS technology. The Chorus OS is targeted at distributed real-time applications, and it can very efficiently support different "OS personalities" in the same environment, such as POSIX, UNIX, and a CORBA-compliant distributed object architecture. Projects are being set up to verify the suitability of our work for LHC applications: we are building a scaled-down prototype of the DAQ system foreseen for the CMS experiment at LHC, where we will directly test our protocols and where we will be able to make measurements and benchmarks, guiding our development and allowing us to build an analytical model of the system suitable for simulation and large-scale verification.

  14. Smart sensing surveillance system

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Chu, Kai-Dee; O'Looney, James; Blake, Michael; Rutar, Colleen

    2010-04-01

    Unattended ground sensor (UGS) networks have been widely used in remote battlefield and other tactical applications over the last few decades due to advances in digital signal processing. The UGS network can be applied in a variety of areas including border surveillance, special force operations, perimeter and building protection, target acquisition, situational awareness, and force protection. In this paper, a highly-distributed, fault-tolerant, and energy-efficient Smart Sensing Surveillance System (S4) is presented to efficiently provide 24/7, all-weather security operation in a situation management environment. The S4 is composed of a number of distributed nodes to collect, process, and disseminate heterogeneous sensor data. Nearly all S4 nodes have passive sensors to provide rapid omnidirectional detection. In addition, Pan-Tilt-Zoom (PTZ) electro-optical/infrared (EO/IR) cameras are integrated into selected nodes to track objects and capture associated imagery. These camera-connected S4 nodes will provide advanced on-board digital image processing capabilities to detect and track specific objects. The imaging detection operations include unattended object detection, human feature and behavior detection, and configurable alert triggers. In the S4, all the nodes are connected with a robust, reconfigurable, LPI/LPD (Low Probability of Intercept/Low Probability of Detect) wireless mesh network using ultra-wideband (UWB) RF technology, which can provide an ad-hoc, secure mesh network and the capability to relay network information, communicate, and pass situational awareness and messages. The S4 utilizes a Service Oriented Architecture such that remote applications can interact with the S4 network and use specific presentation methods.
The S4 capabilities and technologies have great potential for both military and civilian applications, enabling highly effective security support tools for improving surveillance activities in densely crowded environments and near perimeters and borders. The S4 is compliant with Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE®) standards. It would be directly applicable to solutions for emergency response personnel, law enforcement, and other homeland security missions, as well as in applications requiring the interoperation of sensor networks with handheld or body-worn interface devices.

  15. Logistic Model to Support Service Modularity for the Promotion of Reusability in a Web Objects-Enabled IoT Environment.

    PubMed

    Kibria, Muhammad Golam; Ali, Sajjad; Jarwar, Muhammad Aslam; Kumar, Sunil; Chong, Ilyoung

    2017-09-22

    Due to the very large number of connected virtual objects in the surrounding environment, intelligent service features in the Internet of Things require the reuse of existing virtual objects and composite virtual objects. If a new virtual object were created for each new service request, the number of virtual objects would increase exponentially. The Web of Objects applies the principle of service modularity in terms of virtual objects and composite virtual objects. Service modularity is a key concept in the Web Objects-enabled Internet of Things (IoT) environment, which allows the reuse of existing virtual objects and composite virtual objects across heterogeneous ontologies. When similar service requests occur at the same or different locations, the already-instantiated virtual objects and their composites that exist in the same or different ontologies can be reused. In this case, similar types of virtual objects and composite virtual objects are searched and matched. Their reuse avoids duplication under similar circumstances and reduces the time needed to search for and instantiate them from their repositories, where similar functionalities are provided by similar types of virtual objects and their composites. Controlling and maintaining a virtual object means controlling and maintaining the corresponding object in the real world. Although the functional costs of virtual objects are only a fraction of those of deploying and maintaining real-world objects, this article focuses on reusing virtual objects and composite virtual objects, and discusses similarity matching between them. The article proposes a logistic model that supports service modularity for the promotion of reusability in the Web Objects-enabled IoT environment. The necessary functional components and a flowchart of an algorithm for reusing composite virtual objects are discussed. 
To demonstrate service modularity, a use case scenario is also studied and implemented.
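The reuse decision described above, i.e. search for an already-instantiated virtual object before creating a new one, can be sketched as a registry lookup with similarity matching. This is a hedged sketch, not the article's algorithm: the Jaccard similarity over capability sets, the 0.8 threshold, and all names are illustrative assumptions.

```python
# Illustrative sketch of virtual-object reuse: before instantiating a new
# virtual object, search the registry for an existing one with sufficiently
# similar capabilities (Jaccard similarity; threshold is an assumption).
def jaccard(a: frozenset, b: frozenset) -> float:
    """Similarity of two capability sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 1.0

class VirtualObjectRegistry:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.objects = {}  # name -> frozenset of capabilities

    def register(self, name: str, capabilities: set) -> None:
        self.objects[name] = frozenset(capabilities)

    def find_or_create(self, name: str, capabilities: set) -> str:
        """Return a reusable virtual object's name, or register a new one."""
        wanted = frozenset(capabilities)
        best, best_score = None, 0.0
        for existing, caps in self.objects.items():
            score = jaccard(wanted, caps)
            if score > best_score:
                best, best_score = existing, score
        if best is not None and best_score >= self.threshold:
            return best  # reuse avoids duplicate instantiation
        self.register(name, capabilities)
        return name

reg = VirtualObjectRegistry()
reg.register("vo-thermo-1", {"temperature", "humidity"})
# An identical request reuses the existing virtual object:
assert reg.find_or_create("vo-new", {"temperature", "humidity"}) == "vo-thermo-1"
# A dissimilar request creates a new one:
assert reg.find_or_create("vo-cam", {"video"}) == "vo-cam"
```

The same pattern would apply to composite virtual objects, with the capability set replaced by the set of component virtual objects.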

  16. Logistic Model to Support Service Modularity for the Promotion of Reusability in a Web Objects-Enabled IoT Environment

    PubMed Central

    Chong, Ilyoung

    2017-01-01

    Due to a very large number of connected virtual objects in the surrounding environment, intelligent service features in the Internet of Things requires the reuse of existing virtual objects and composite virtual objects. If a new virtual object is created for each new service request, then the number of virtual object would increase exponentially. The Web of Objects applies the principle of service modularity in terms of virtual objects and composite virtual objects. Service modularity is a key concept in the Web Objects-Enabled Internet of Things (IoT) environment which allows for the reuse of existing virtual objects and composite virtual objects in heterogeneous ontologies. In the case of similar service requests occurring at the same, or different locations, the already-instantiated virtual objects and their composites that exist in the same, or different ontologies can be reused. In this case, similar types of virtual objects and composite virtual objects are searched and matched. Their reuse avoids duplication under similar circumstances, and reduces the time it takes to search and instantiate them from their repositories, where similar functionalities are provided by similar types of virtual objects and their composites. Controlling and maintaining a virtual object means controlling and maintaining a real-world object in the real world. Even though the functional costs of virtual objects are just a fraction of those for deploying and maintaining real-world objects, this article focuses on reusing virtual objects and composite virtual objects, as well as discusses similarity matching of virtual objects and composite virtual objects. This article proposes a logistic model that supports service modularity for the promotion of reusability in the Web Objects-enabled IoT environment. Necessary functional components and a flowchart of an algorithm for reusing composite virtual objects are discussed. 
Also, to realize the service modularity, a use case scenario is studied and implemented. PMID:28937590

  17. Socio-economic and Climate Factors Associated with Dengue Fever Spatial Heterogeneity: A Worked Example in New Caledonia

    PubMed Central

    Teurlai, Magali; Menkès, Christophe Eugène; Cavarero, Virgil; Degallier, Nicolas; Descloux, Elodie; Grangeon, Jean-Paul; Guillaumot, Laurent; Libourel, Thérèse; Lucio, Paulo Sergio; Mathieu-Daudé, Françoise; Mangeas, Morgan

    2015-01-01

    Background/Objectives Understanding the factors underlying the spatio-temporal distribution of infectious diseases provides useful information regarding their prevention and control. Dengue fever spatio-temporal patterns result from complex interactions between the virus, the host, and the vector. These interactions can be influenced by environmental conditions. Our objectives were to analyse dengue fever spatial distribution over New Caledonia during epidemic years, to identify some of the main underlying factors, and to predict the spatial evolution of dengue fever under changing climatic conditions by the year 2100. Methods We used principal component analysis and support vector machines to analyse and model the influence of climate and socio-economic variables on the mean spatial distribution of 24,272 dengue cases reported from 1995 to 2012 in thirty-three communes of New Caledonia. We then modelled and estimated the future evolution of dengue incidence rates using a regional downscaling of future climate projections. Results The spatial distribution of dengue fever cases is highly heterogeneous. The variables most associated with this observed heterogeneity are the mean temperature, the mean number of people per premise, and the mean percentage of unemployed people, a variable highly correlated with people's way of life. Rainfall does not seem to play an important role in the spatial distribution of dengue cases during epidemics. By the end of the 21st century, if temperature increases by approximately 3°C, mean incidence rates during epidemics could double. Conclusion In New Caledonia, a subtropical insular environment, both temperature and socio-economic conditions influence the spatial spread of dengue fever. Extension of this study to other countries worldwide should improve knowledge about the influence of climate on the dengue burden and about the complex interplay between different factors. 
This study presents a methodology that can be used as a step-by-step guide to model dengue spatial heterogeneity in other countries. PMID:26624008
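The principal component analysis step described above can be sketched numerically. This is an illustrative sketch under stated assumptions: the data are synthetic, the three variable columns (mean temperature, people per premise, percentage unemployed) merely mirror the variables named in the abstract, and the study's actual PCA/SVM pipeline is not reproduced.

```python
# Sketch of a PCA step over commune-level variables (synthetic data):
# standardize each variable, then obtain principal axes from the SVD
# of the centered data matrix.
import numpy as np

rng = np.random.default_rng(0)
# 33 communes x 3 variables (temperature, people/premise, % unemployed)
X = rng.normal(size=(33, 3))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize columns
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)             # fraction of variance per PC
scores = Xs @ Vt.T                          # commune coordinates on the PCs

print(explained.round(3), scores.shape)
```

The `scores` matrix is what a downstream classifier (such as the support vector machine mentioned in the abstract) would be trained on.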

  18. Application of new type of distributed multimedia databases to networked electronic museum

    NASA Astrophysics Data System (ADS)

    Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki

    1999-01-01

    Recently, various kinds of multimedia application systems have been actively developed, building on advances in high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system that can effectively perform cooperative retrieval among distributed databases. The proposed system introduces the concept of a 'Retrieval manager', which functions as an intelligent controller so that the user can treat a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, the concept of a 'domain' is defined as a managing unit of retrieval, and retrieval is performed effectively by cooperative processing among multiple domains. A communication language and protocols are also defined and are used for every communication action in the system. A language interpreter in each machine translates the communication language into that machine's internal language; thanks to this interpreter, internal modules such as the DBMS and the user interface modules can be freely selected. The concept of a 'content-set' is also introduced: a content-set is a package of mutually related contents, and the system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents by referring to data that indicate the relations among the contents in the content-set. To verify the functions of the proposed system, a networked electronic museum was built experimentally. 
The results of this experiment indicate that the proposed system can effectively retrieve the target contents under the control of a number of distributed domains. The results also indicate that the system works effectively even as it grows large.
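The 'Retrieval manager' idea above, i.e. fanning a query out to several retrieval domains and merging the results so the caller sees one logical database, can be sketched as follows. All class and method names are illustrative assumptions; the paper's actual communication language and protocols are not modelled.

```python
# Illustrative sketch of cooperative retrieval across domains: the manager
# queries every domain and merges the hits into one deduplicated result,
# so the user sees a single logical database.
class Domain:
    """A managing unit of retrieval holding part of the distributed data."""
    def __init__(self, name, records):
        self.name = name
        self.records = records  # list of (key, content) pairs

    def search(self, keyword):
        return [content for key, content in self.records if keyword in key]

class RetrievalManager:
    def __init__(self, domains):
        self.domains = domains

    def search(self, keyword):
        # Cooperative retrieval: query all domains, merge, deduplicate.
        merged = []
        for domain in self.domains:
            for hit in domain.search(keyword):
                if hit not in merged:
                    merged.append(hit)
        return merged

museum = RetrievalManager([
    Domain("paintings", [("ukiyo-e print", "Hokusai wave")]),
    Domain("sculpture", [("bronze statue", "Rodin cast"),
                         ("ukiyo-e print", "Hokusai wave")]),  # duplicate entry
])
print(museum.search("ukiyo-e"))  # ['Hokusai wave']
```

A real retrieval manager would also consult directory data to choose which domains to query, rather than broadcasting to all of them.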

  19. Electronic holography using binary phase modulation

    NASA Astrophysics Data System (ADS)

    Matoba, Osamu

    2014-06-01

    A 3D display system using a phase-only distribution is presented. In particular, a binary phase distribution is used to reconstruct a 3D object with a wide viewing-zone angle. The phase distribution to be displayed on a phase-mode spatial light modulator can be obtained by either an experimental or a numerical process. In this paper, we present a numerical process based on computer graphics data. A random phase distribution is attached to all polygons of the input 3D object so that the object is well reconstructed from the binary phase distribution. Numerical and experimental results are presented to show the effectiveness of the proposed system.
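The numerical process above can be illustrated in miniature: attach a random phase to an object field, propagate it with an FFT, keep only the phase, and quantize it to two levels. This is a hedged sketch, not the paper's method: a flat 2D square stands in for the polygonal 3D object, and the grid size and quantization rule are assumptions.

```python
# Minimal numerical sketch of binary phase hologram generation:
# random phase on the object, FFT propagation, phase-only extraction,
# two-level (0 / pi) quantization.
import numpy as np

rng = np.random.default_rng(1)
N = 64
obj = np.zeros((N, N))
obj[24:40, 24:40] = 1.0                                    # simple square "object"

field = obj * np.exp(1j * 2 * np.pi * rng.random((N, N)))  # attach random phase
hologram = np.fft.fft2(field)                              # far-field hologram

phase = np.angle(hologram)                                 # phase-only distribution
binary_phase = np.where(phase >= 0, np.pi, 0.0)            # binarize to {0, pi}

print(sorted(set(binary_phase.ravel().tolist())))  # [0.0, 3.141592653589793]
```

The random phase spreads the object's energy across the hologram plane, which is what lets a phase-only (and here binary) distribution still reconstruct the object.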

  20. Advanced air distribution: improving health and comfort while reducing energy use.

    PubMed

    Melikov, A K

    2016-02-01

    The indoor environment affects the health, comfort, and performance of building occupants, and the energy used for heating, cooling, ventilating, and air-conditioning buildings is substantial. Ventilation based on total-volume air distribution in spaces is not always an efficient way to provide a high-quality indoor environment together with low energy consumption. Advanced air distribution, designed to supply clean air where, when, and in the amount needed, makes it possible to efficiently achieve thermal comfort, control exposure to contaminants, provide high-quality air for breathing, and minimize the risk of airborne cross-infection, all while reducing energy use. This study justifies the need to improve present air distribution design in occupied spaces and, more generally, the need for a paradigm shift from the design of collective environments to the design of individually controlled environments. The focus is on advanced air distribution in spaces, its guiding principles, and its advantages and disadvantages. Examples of advanced air distribution solutions in spaces with different uses, such as offices, hospital rooms, and vehicle compartments, are presented. The potential of advanced air distribution, and of the individually controlled macro-environment in general, for achieving shared values, that is, improved health, comfort, and performance, energy saving, reduced healthcare costs, and improved well-being, is demonstrated. Performance criteria are defined and further research in the field is outlined. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Population-Based Study on the Effect of a Forest Environment on Salivary Cortisol Concentration

    PubMed Central

    Park, Bum-Jin; Lee, Juyoung

    2017-01-01

    The purpose of this study was to evaluate the effect of a forest environment on salivary cortisol concentration, particularly on the characteristics of its distribution. The participants were 348 young male subjects. The experimental sites were 34 forests and 34 urban areas across Japan. The subjects viewed the landscape (forest or urban environment) for a period of 15 min while sitting in a chair. Saliva was sampled from the participants at the end of this 15-min period and then analyzed for cortisol concentration. Differences in the skewness and kurtosis of the distributions between the two environments were tested by performing a permutation test. The cortisol concentrations exhibited larger skewness (0.76) and kurtosis (3.23) in a forest environment than in an urban environment (skewness = 0.49; kurtosis = 2.47), and these differences were statistically significant. The cortisol distribution exhibited a more peaked and longer right-tailed curve in a forest environment than in an urban environment. PMID:28820452

  2. Population-Based Study on the Effect of a Forest Environment on Salivary Cortisol Concentration.

    PubMed

    Kobayashi, Hiromitsu; Song, Chorong; Ikei, Harumi; Park, Bum-Jin; Lee, Juyoung; Kagawa, Takahide; Miyazaki, Yoshifumi

    2017-08-18

    The purpose of this study was to evaluate the effect of a forest environment on salivary cortisol concentration, particularly on the characteristics of its distribution. The participants were 348 young male subjects. The experimental sites were 34 forests and 34 urban areas across Japan. The subjects viewed the landscape (forest or urban environment) for a period of 15 min while sitting in a chair. Saliva was sampled from the participants at the end of this 15-min period and then analyzed for cortisol concentration. Differences in the skewness and kurtosis of the distributions between the two environments were tested by performing a permutation test. The cortisol concentrations exhibited larger skewness (0.76) and kurtosis (3.23) in a forest environment than in an urban environment (skewness = 0.49; kurtosis = 2.47), and these differences were statistically significant. The cortisol distribution exhibited a more peaked and longer right-tailed curve in a forest environment than in an urban environment.
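The permutation test described in the two records above can be sketched on synthetic data. This is an illustrative sketch, not the study's analysis: the samples are simulated (a right-skewed lognormal group versus a roughly symmetric Gaussian group), and the sample sizes and permutation count are assumptions.

```python
# Sketch of a permutation test on the difference in skewness between two
# samples: shuffle the pooled data, resplit, and count how often the
# permuted skewness difference is at least as large as the observed one.
import random

def skewness(xs):
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 3 for x in xs) / (n * s2 ** 1.5)

def permutation_test(a, b, n_perm=2000, seed=0):
    rng = random.Random(seed)
    observed = abs(skewness(a) - skewness(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(skewness(pa) - skewness(pb)) >= observed:
            hits += 1
    return hits / n_perm  # permutation p-value

rng = random.Random(42)
forest = [rng.lognormvariate(0, 0.8) for _ in range(100)]  # right-skewed
urban = [rng.gauss(2, 0.5) for _ in range(100)]            # roughly symmetric
p = permutation_test(forest, urban)
assert 0.0 <= p <= 1.0
```

The same machinery applies to kurtosis by swapping the statistic; the test makes no distributional assumptions, which is why it suits a skewness comparison.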

  3. An investigation of total bacterial communities, culturable antibiotic-resistant bacterial communities and integrons in the river water environments of Taipei city.

    PubMed

    Yang, Chu-Wen; Chang, Yi-Tang; Chao, Wei-Liang; Shiung, Iau-Iun; Lin, Han-Sheng; Chen, Hsuan; Ho, Szu-Han; Lu, Min-Jheng; Lee, Pin-Hsuan; Fan, Shao-Ning

    2014-07-30

    The intensive use of antibiotics may accelerate the development of antibiotic-resistant bacteria (ARB). The global geographical distribution of environmental ARB has been indicated by many studies; however, ARB in the water environments of Taiwan have not been extensively investigated. The objective of this study was to investigate the ARB communities in Huanghsi Stream, a naturally acidic (pH 4) water environment. Waishuanghsi Stream, a neutral (pH 7) water environment, was also monitored for comparison. Plate counts of culturable bacteria on media containing each of eight antibiotics indicate that the numbers of culturable carbenicillin- and vancomycin-resistant bacteria in both Huanghsi and Waishuanghsi Streams are greater than the numbers of bacteria resistant to the other antibiotics tested. Using a 16S rDNA sequencing approach, both the antibiotic-resistant bacterial communities (culture-based) and the total bacterial communities (metagenome-based) in Waishuanghsi Stream were observed to exhibit higher diversity than those in Huanghsi Stream. Of the three classes of integron, only class I integrons were identified, in Waishuanghsi Stream. Our results suggest that an acidic (pH 4) water environment may affect not only the community composition of antibiotic-resistant bacteria but also the horizontal gene transfer mediated by integrons. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. A distributed programming environment for Ada

    NASA Technical Reports Server (NTRS)

    Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.

    1986-01-01

    Despite considerable commercial exploitation of fault-tolerant systems, significant and difficult research problems remain in areas such as fault detection and correction. A research project is described that is constructing a distributed-computing test bed for loosely coupled computers. The project is building a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, a distributed debugger, test harnesses, and environment monitors. The Ada compiler is written in Ada and implements distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.

  5. Interactive model evaluation tool based on IPython notebook

    NASA Astrophysics Data System (ADS)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    In hydrological modelling, some form of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The choice of criterion for measuring goodness of fit (a likelihood or any objective function) is an essential step in all of these methodologies and affects the finally selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, optimization is an iterative procedure, so an increasing number of simulations is performed in the course of the modelling process; however, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to evaluate model performance intuitively. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded, along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space, and the user selects the two parameters to visualize. Furthermore, an objective function and a time period of interest must be selected. Based on this information, a two-dimensional parameter response surface is created: a scatter plot of the parameter combinations with a color scale corresponding to the goodness of fit of each combination. Finally, a slider is available to change the color mapping of the points. 
The slider provides a threshold that excludes non-behavioural parameter sets, and the color scale is attributed only to the remaining sets. By interactively changing the settings and interpreting the graph, the user gains insight into the structural behaviour of the model; moreover, a more deliberate choice of objective function can be made and periods of high information content can be identified. The environment is written in an IPython notebook and uses the interactive functions provided by the IPython community, illustrating the power of the IPython notebook as a development environment for scientific computing (Shen, 2014).
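The slider logic described above can be sketched in plain Python, independent of the notebook widgets. This is an illustrative sketch under stated assumptions: the function name, the "lower objective value is better" convention, and the normalization to [0, 1] are all choices made here, not details from the abstract.

```python
# Sketch of the threshold/colour-mapping step: objective values above the
# threshold are excluded (non-behavioural, colour None); the remaining
# behavioural sets get a colour index normalized to [0, 1].
def colour_map(objective_values, threshold):
    """Return one colour index per parameter set; None means excluded."""
    behavioural = [v for v in objective_values if v <= threshold]
    if not behavioural:
        return [None] * len(objective_values)
    lo, hi = min(behavioural), max(behavioural)
    span = (hi - lo) or 1.0  # avoid division by zero if all values equal
    return [None if v > threshold else (v - lo) / span
            for v in objective_values]

# Four parameter sets scored by (hypothetical) RMSE; threshold 0.6
# excludes the worst one, mimicking a slider movement.
rmse = [0.2, 0.5, 0.9, 0.35]
colours = colour_map(rmse, threshold=0.6)
print(colours)
```

Moving the slider corresponds to calling `colour_map` again with a new threshold and redrawing the scatter plot with the returned colours.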

  6. An intelligent robot for helping astronauts

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Grimm, K. A.; Pendleton, T. W.

    1994-01-01

    This paper describes the development status of a prototype supervised intelligent robot for space applications, intended (1) to help the crew of a spacecraft such as the Space Station with various tasks, such as holding objects and retrieving/replacing tools and other objects from/into storage, and (2) to retrieve detached objects, such as equipment or crew, that have become separated from their spacecraft. Beyond this set of tasks in the low-Earth-orbit spacecraft environment, it is argued that certain aspects of the technology are generic in approach, offering insight into intelligent robots for other tasks and environments. Candidate software architectures, and the key technical issues that enable real work in real environments to be accomplished safely and robustly, are addressed. Results of computer simulations of grasping floating objects are presented. Also described are characterization results on the usable reduced-gravity environment in an aircraft flying parabolas (to simulate weightlessness) and results on hardware performance there. These results show that it is feasible to use that environment for evaluative testing of dexterous grasping based on real-time vision of freely rotating and translating objects.

  7. Development of a Frost Risk Assessment Tool in Agriculture for a Mediterranean ecosystem Utilizing MODIS satellite observations Geomatics and Surface Data

    NASA Astrophysics Data System (ADS)

    Louka, Panagiota; Papanikolaou, Ioannis; Petropoulos, George; Migiros, George; Tsiros, Ioannis

    2014-05-01

    Frost risk in Mediterranean countries is a critical factor in agricultural planning and management. The rapid development of Earth Observation (EO) technology has dramatically improved our ability to map the spatio-temporal distribution of frost conditions over a given area and to evaluate its impacts on the environment and society. In this study, a frost risk model for agricultural crops cultivated in a Mediterranean environment has been developed, based primarily on EO data from the MODIS sensor and on ancillary spatial and point data. The ability of the model to predict frost conditions was validated for selected days on which frost had been observed in a region of northwestern Greece, according to ground observations obtained by the Agricultural Insurance Organization (ELGA). An extensive evaluation of the model predictions was performed to objectively assess its ability to predict the spatio-temporal distribution of frost risk in the studied region, including comparisons against physiographic factors of the study area. The topographical characteristics taken into consideration were latitude, altitude, slope steepness, topographic convergence, and the extent of the areas influenced by the water bodies (lake and sea) in the study area. Additional land-use and vegetation classification data (type and density) were also used. Our results showed that the model reasonably reproduced the spatio-temporal distribution of frost conditions in the study area, following largely explainable patterns with respect to the characteristics of the study site and the local weather conditions. All in all, the methodology implemented here proved capable of rapidly and cost-effectively producing cartography of frost risk in a Mediterranean environment, making it a potentially very useful tool for agricultural management and planning. 
The model presented here also has the potential to enhance conventional field-based surveying for monitoring frost changes over long timescales. KEYWORDS: Earth Observation, MODIS, frost, risk assessment, Greece

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magee, Thoman

    The Consolidated Edison, Inc., of New York (Con Edison) Secure Interoperable Open Smart Grid Demonstration Project (SGDP), sponsored by the United States (US) Department of Energy (DOE), demonstrated that the reliability, efficiency, and flexibility of the grid can be improved through a combination of enhanced monitoring and control capabilities using systems and resources that interoperate within a secure services framework. The project demonstrated the capability to shift, balance, and reduce load where and when needed in response to system contingencies or emergencies by leveraging controllable field assets. The range of field assets includes curtailable customer loads, distributed generation (DG), battery storage, electric vehicle (EV) charging stations, building management systems (BMS), home area networks (HANs), high-voltage monitoring, and advanced metering infrastructure (AMI). The SGDP enables the seamless integration and control of these field assets through a common, cyber-secure, interoperable control platform, which integrates a number of existing legacy control and data systems as well as new smart grid (SG) systems and applications. By integrating advanced technologies for monitoring and control, the SGDP helps target and reduce peak load growth, improves the reliability and efficiency of Con Edison's grid, and increases the ability to accommodate the growing use of distributed resources. Con Edison is dedicated to lowering costs, improving reliability and customer service, and reducing its impact on the environment for its customers. These objectives also align with the policy objectives of New York State as a whole. To help meet these objectives, Con Edison's long-term vision for the distribution grid relies on the successful integration and control of a growing penetration of distributed resources, including demand response (DR) resources, battery storage units, and DG. 
For example, Con Edison expects significant long-term growth of DG. The SGDP enables the efficient, flexible integration of these disparate resources and lays the architectural foundations for future scalability. Con Edison assembled an SGDP team of more than 16 project partners, including technology vendors and participating organizations, with the Con Edison team providing overall guidance and project management. Project team members are listed in Table 1-1.

  9. The food environment and student weight status, Los Angeles County, 2008-2009.

    PubMed

    Langellier, Brent A

    2012-01-01

    One factor believed to affect overweight status is the food environment, or the distribution of outlets that serve healthful or unhealthful foods in residential areas, workplaces, and schools. Few studies have investigated the association between the food environment and the prevalence of overweight among children and adolescents. The objective of this study was to investigate the association between the distribution of corner stores and fast food restaurants around Los Angeles County public schools and the prevalence of overweight among students. Hierarchical linear models were used to assess the association between the presence of corner stores or fast food restaurants within a half-mile of Los Angeles County schools (N = 1,694) and overweight prevalence among students in grades 5, 7, and 9. The presence of corner stores and fast food restaurants varied significantly by schools' racial/ethnic composition, Title 1 eligibility, and rural/suburban vs urban location. After adjustment for other factors, overweight prevalence was 1.6 percentage points higher at majority-Latino schools that had at least 1 corner store within a half-mile than at majority-Latino schools that did not have a corner store within a half-mile. The association between corner stores and overweight prevalence varied significantly between majority-Latino schools and schools that were majority-white or that had no racial/ethnic majority. The presence of fast food restaurants within a half-mile of schools was not associated with overweight prevalence among students. This study underscores the importance of interventions that seek to improve the healthfulness of corner store inventories and of student purchases.

  10. Probability distributions of whisker-surface contact: quantifying elements of the rat vibrissotactile natural scene.

    PubMed

    Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z

    2015-08-01

    Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.
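The 'concomitant contact' statistic defined above, i.e. the probability that whisker j contacts a surface given that whisker i does, can be estimated directly from binary contact patterns. This is an illustrative sketch: the contact patterns below are invented, and the function name is an assumption, not code from the study.

```python
# Sketch of estimating concomitant-contact probabilities from binary
# whisk-by-whisker contact patterns (rows = whisks, columns = whiskers):
# probs[i][j] estimates P(whisker j contacts | whisker i contacts).
def concomitant_contact(patterns):
    n_whiskers = len(patterns[0])
    probs = [[0.0] * n_whiskers for _ in range(n_whiskers)]
    for i in range(n_whiskers):
        whisks_with_i = [p for p in patterns if p[i]]
        if not whisks_with_i:
            continue  # whisker i never contacted; row stays zero
        for j in range(n_whiskers):
            probs[i][j] = sum(p[j] for p in whisks_with_i) / len(whisks_with_i)
    return probs

# Three whiskers, four whisks: whisker 1 always contacts when whisker 0 does.
patterns = [
    (1, 1, 0),
    (1, 1, 1),
    (0, 1, 0),
    (0, 0, 0),
]
P = concomitant_contact(patterns)
print(P[0][1], P[1][0])  # 1.0 0.6666666666666666
```

Note that the matrix is generally asymmetric, as here: contact of whisker 0 guarantees contact of whisker 1, but not the reverse. The row-wise versus diagonal structure discussed in the abstract corresponds to where the large entries of this matrix sit.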

  11. Companions and Environments of Low-Mass Stars: From Star-Forming Regions to the Field

    NASA Astrophysics Data System (ADS)

    Ward-Duong, Kimberly; Patience, Jenny; De Rosa, Robert J.; Bulger, Joanna; Rajan, Abhijith; Goodwin, Simon; Parker, Richard J.; McCarthy, Donald W.; Kulesa, Craig; van der Plas, Gerrit; Menard, Francois; Pinte, Christophe; Jackson, Alan Patrick; Bryden, Geoffrey; Turner, Neal J.; Harvey, Paul M.; Hales, Antonio

    2017-01-01

    We present results from two studies probing the multiplicity and environmental properties of low-mass stars: (1) The MinMs (M-dwarfs in Multiples) Survey, a large, volume-limited survey of 245 field M-dwarfs within 15 pc, and (2) the TBOSS (Taurus Boundary of Stellar/Substellar) Survey, an ongoing study of disk properties for the lowest-mass members within the Taurus star-forming region. The MinMs Survey provides new measurements of the companion star fraction, separation distribution, and mass ratio distribution for the nearest K7-M6 dwarfs, utilizing a combination of high-resolution adaptive optics imaging and digitized widefield archival plates to cover an unprecedented separation range of ~1-10,000 AU. Within these data, we also identify companions below the stellar/brown dwarf boundary, enabling characterization of the substellar companion population to low-mass field stars. For the much younger population in Taurus, we present results from ALMA Band 7 continuum observations of low-mass stellar and substellar Class II objects, spanning spectral types from M4-M7.75. The sub-millimeter detections of these disks provide key estimates of the dust mass in small grains, which is then assessed within the context of region age, environment, and viability for planet formation. This young population also includes a number of interesting young binary systems. Covering both young (1-2 Myr) and old (>5 Gyr) populations of low-mass stars, the results from these studies provide benchmark measurements on the population statistics of low-mass field stars, and on the early protoplanetary environments of their younger M-star counterparts.

  12. Problem-Based Learning and Problem-Solving Tools: Synthesis and Direction for Distributed Education Environments.

    ERIC Educational Resources Information Center

    Friedman, Robert S.; Deek, Fadi P.

    2002-01-01

    Discusses how the design and implementation of problem-solving tools used in programming instruction are complementary with both the theories of problem-based learning (PBL), including constructivism, and the practices of distributed education environments. Examines how combining PBL, Web-based distributed education, and a problem-solving…

  13. Some Technical Implications of Distributed Cognition on the Design of Interactive Learning Environments.

    ERIC Educational Resources Information Center

    Dillenbourg, Pierre

    1996-01-01

    Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…

  14. Development of the Policy Indicator Checklist: A Tool to Identify and Measure Policies for Calorie-Dense Foods and Sugar-Sweetened Beverages Across Multiple Settings

    PubMed Central

    Hallett, Allen M.; Parker, Nathan; Kudia, Ousswa; Kao, Dennis; Modelska, Maria; Rifai, Hanadi; O’Connor, Daniel P.

    2015-01-01

    Objectives. We developed the policy indicator checklist (PIC) to identify and measure policies for calorie-dense foods and sugar-sweetened beverages to determine how policies are clustered across multiple settings. Methods. In 2012 and 2013 we used existing literature, policy documents, government recommendations, and instruments to identify key policies. We then developed the PIC to examine the policy environments across 3 settings (communities, schools, and early care and education centers) in 8 communities participating in the Childhood Obesity Research Demonstration Project. Results. Principal components analysis revealed 5 components related to calorie-dense food policies and 4 components related to sugar-sweetened beverage policies. Communities with higher youth and racial/ethnic minority populations tended to have fewer and weaker policy environments concerning calorie-dense foods and healthy foods and beverages. Conclusions. The PIC was a helpful tool to identify policies that promote healthy food environments across multiple settings and to measure and compare the overall policy environments across communities. There is need for improved coordination across settings, particularly in areas with greater concentration of youths and racial/ethnic minority populations. Policies to support healthy eating are not equally distributed across communities, and disparities continue to exist in nutrition policies. PMID:25790397

  15. Searching for justice for body and self in a coercive environment: sex work in Kerala, India.

    PubMed

    Jayasree, A K

    2004-05-01

    Sex workers in Kerala, India, live in a coercive environment and face violence from the police and criminals, lack of shelter, lack of childcare support and have many physical and mental health problems. This paper documents the environment in which women have been selling sex in Kerala since 1995, and their efforts to claim their rights. It is based on sex workers' own reports and experiences, a situation analysis and a needs assessment study by the Foundation for Integrated Research in Mental Health. Involvement in HIV/AIDS prevention projects first gave sex workers in Kerala an opportunity to come together. Some have become peer educators and distribute condoms but they continue to be harassed by police. Most anti-trafficking interventions, including rescue and rehabilitation, either criminalise or victimise sex workers, and sex workers reject them as a solution to sex work. They understand that the lack of sexual fulfillment in other relationships and their own lack of access to other work and resources are the reasons why commercial sex flourishes. Sex workers are not mere victims without agency. They have a right to bodily integrity, pleasure, livelihood, self-determination and a safe working environment. Sex workers are organising themselves for these objectives and demand decriminalisation of sex work.

  16. Synthetic environments

    NASA Astrophysics Data System (ADS)

    Lukes, George E.; Cain, Joel M.

    1996-02-01

    The Advanced Distributed Simulation (ADS) Synthetic Environments Program seeks to create robust virtual worlds from operational terrain and environmental data sources of sufficient fidelity and currency to interact with the real world. While some applications can be met by direct exploitation of standard digital terrain data, more demanding applications -- particularly those supporting operations 'close to the ground' -- are well served by emerging capabilities for 'value-adding' by the user working with controlled imagery. For users to rigorously refine and exploit controlled imagery within functionally different workstations, they must have a shared framework that allows interoperability within and between these environments, in terms of passing image and object coordinates and other information using a variety of validated sensor models. The Synthetic Environments Program is now being expanded to address rapid construction of virtual worlds, with research initiatives in digital mapping, softcopy workstations, and cartographic image understanding. The Synthetic Environments Program is also participating in a joint initiative for a sensor model applications programmer's interface (API) to ensure that a common controlled-imagery exploitation framework is available to all researchers, developers and users. This presentation provides an introduction to ADS and the associated requirements for synthetic environments to support synthetic theaters of war. It provides a technical rationale for exploring applications of image understanding technology to automated cartography in support of ADS and related programs benefiting from automated analysis of mapping, earth resources and reconnaissance imagery. It also provides an overview and status of the joint initiative for a sensor model API.

  17. OhioView: Distribution of Remote Sensing Data Across Geographically Distributed Environments

    NASA Technical Reports Server (NTRS)

    Ramos, Calvin T.

    1998-01-01

    Various issues associated with the distribution of remote sensing data across geographically distributed environments are presented in viewgraph form. Specific topics include: 1) NASA education program background; 2) High level architectures, technologies and applications; 3) LeRC internal architecture and role; 4) Potential GIBN interconnect; 5) Potential areas of network investigation and research; 6) Draft of OhioView data model; and 7) the LeRC strategy and roadmap.

  18. Cooperative Control of Distributed Autonomous Vehicles in Adversarial Environments

    DTIC Science & Technology

    2006-08-14

    COOPERATIVE CONTROL OF DISTRIBUTED AUTONOMOUS VEHICLES IN ADVERSARIAL ENVIRONMENTS. Final Report, Grant #F49620-01-1-0361. Jeff Shamma, Department of... ...single dominant language or a distribution of languages. A relation to multivehicle systems is understanding how highly autonomous vehicles on extended

  19. Marine realms information bank: A distributed geolibrary for the ocean

    USGS Publications Warehouse

    Marincioni, F.; Lightsom, F.; ,

    2002-01-01

    The Marine Realms Information Bank (MRIB) is a prototype web-based distributed geolibrary that organizes, indexes, and delivers online information about oceanic and coastal environments. The significance of MRIB lies both in the utility of the information bank itself and in its implementation of the distributed-geolibraries concept.

  20. Fractionation and analysis of veterinary antibiotics and their related degradation products in agricultural soils and drainage waters following swine manure amendment.

    PubMed

    Solliec, Morgan; Roy-Lachapelle, Audrey; Gasser, Marc-Olivier; Coté, Caroline; Généreux, Mylène; Sauvé, Sébastien

    2016-02-01

    The fate of antimicrobial residues in the environment, and especially of antibiotics used in swine husbandry, is of particular interest because of their potential toxicity and contribution to antibiotic resistance. The presence of relatively high concentrations of bioactive compounds has been reported in agricultural areas, but little information is available on their degradation products. Veterinary antibiotics reach terrestrial environments through many routes, including the application of swine manure to soils. The objectives of this project were, first, to develop an analytical method able to quantify and identify veterinary antibiotics and their degradation products in manure, soil and water samples; and second, to study the distribution of these target compounds in soils and drainage waters. A brief evaluation of their potential toxicity in the environment was also made. To achieve these objectives, liquid chromatography coupled to high-resolution mass spectrometry was used for its ability to quantify contaminants with sensitivity and selectivity, and for its capacity to identify degradation products. Samples of manure, soil and water came from a long-term experimental site where swine manure containing veterinary antibiotics has been applied for many years. In this study, tetracycline antibiotics were found at several hundred μg L(-1) in the swine manure slurry used for fertilization, several hundred ng L(-1) in drainage waters and several ng g(-1) in soils, while degradation products were sometimes found at concentrations higher than those of the parent compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. A possible divot in the Kuiper belt's scattered-object size distribution

    NASA Astrophysics Data System (ADS)

    Shankman, C.; Kavelaars, J.; Gladman, B.; Petit, J.

    2014-07-01

    The formation and evolution history of the Solar System, while not directly accessible, has measurable signatures in the present-day size distributions of the Trans-Neptunian Object (TNO) populations. The form of the size distribution is modelled as a power law, with number going as size to some characteristic slope. Recent works have shown that a single power law does not match the observations across all sizes; the power law breaks to a different form [1, 2, 3]. The large-size objects record the accretion history, while the small-size objects record the collision history. The changes of size-distribution shape and slope as one moves from 'large' to 'medium' to 'small' KBOs are the signature needed to constrain the formation and collision history of the Solar System. The scattering TNOs are those undergoing strong (scattering) interactions with Neptune. The scattering objects can come to pericentre in the giant-planet region. This close-in pericentre passage allows for the observation of smaller objects, and thus for a constraint on the small-size end of the size distribution. Our recent analysis of the Canada-France Ecliptic Plane Survey's (CFEPS) scattering objects revealed an exciting potential form for the scattering-object size distribution: a divot (see Figure). Our divot (a sharp drop in the number of objects per unit size, which then returns at a potentially different slope) matches our observations well and can simultaneously explain observed features in other inclined (so-called "hot") Kuiper Belt populations. In this scenario all of the hot populations would share the same source and have been implanted in the outer Solar System through scattering processes. If confirmed, our divot would represent an exciting new paradigm for the formation history of the Kuiper Belt. Here we present the results of an extension of our previous work to include a new, deeper Kuiper Belt survey. With the addition of two new faint scattering objects from this survey, in tandem with the full characterization of the survey's biases (acting as non-detection limits), we better constrain the form of the scattering-object size distribution.
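The divot form described in this abstract can be sketched as a differential size distribution: a power law that drops by a contrast factor at a break size and continues with a (possibly different) slope. The slopes, break diameter, and contrast below are illustrative placeholders, not the fitted CFEPS values.

```python
def divot_size_distribution(d, slope_big=5.0, slope_small=2.5,
                            d_break=100.0, contrast=6.0, n0=1.0):
    """Differential number density dN/dD for a 'divot' size distribution.

    Objects larger than d_break follow a steep power law; at d_break the
    density drops sharply by `contrast` and then rises again toward small
    sizes with a shallower slope. All parameter values are hypothetical.
    """
    if d >= d_break:
        return n0 * (d / d_break) ** (-slope_big)
    # just below the break: previous value divided by the contrast factor
    return (n0 / contrast) * (d / d_break) ** (-slope_small)

above = divot_size_distribution(100.0)   # density at the break, = n0
below = divot_size_distribution(99.999)  # ~n0/contrast: the sharp drop
small = divot_size_distribution(10.0)    # recovers at the shallower slope
```

Sampling such a function against a survey's detection biases (the "non-detection limits" mentioned above) is how the divot hypothesis is tested against observed TNOs.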

  2. Solution to the SLAM problem in low dynamic environments using a pose graph and an RGB-D sensor.

    PubMed

    Lee, Donghwa; Myung, Hyun

    2014-07-11

    In this study, we propose a solution to the simultaneous localization and mapping (SLAM) problem in low dynamic environments by using a pose graph and an RGB-D (red-green-blue depth) sensor. Low dynamic environments are those in which the positions of objects change over long intervals. In such environments, robots have difficulty recognizing the repositioning of objects, unlike in highly dynamic environments, where relatively fast-moving objects can be detected using a variety of moving-object detection algorithms. The changes in the environment then cause groups of false loop closings when the same moved objects are observed for a while, which means that conventional SLAM algorithms produce incorrect results. To address this problem, we propose a novel SLAM method that handles low dynamic environments. The proposed method uses a pose graph structure and an RGB-D sensor. First, to prune the falsely grouped constraints efficiently, nodes of the graph, which represent robot poses, are grouped according to grouping rules with noise covariances. Next, false constraints of the pose graph are pruned according to an error metric based on the grouped nodes. The pose graph structure is reoptimized after eliminating the false information, and the corrected localization and mapping results are obtained. The performance of the method was validated in real experiments using a mobile robot system.
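The group-then-prune idea can be sketched in 2-D: consecutive pose nodes are grouped, and loop-closure constraints whose residual against the current pose estimates exceeds a threshold are discarded before reoptimization. This is a heavily simplified toy, not the authors' covariance-based implementation; the gap rule and error metric here are hypothetical stand-ins.

```python
import math

def group_nodes(poses, gap=1.0):
    """Group consecutive pose nodes; start a new group when poses jump.

    `poses` is a list of (x, y) robot positions; `gap` stands in for the
    noise-covariance-based grouping rule of the paper (simplified here).
    """
    groups, current = [], [0]
    for i in range(1, len(poses)):
        dx = poses[i][0] - poses[i - 1][0]
        dy = poses[i][1] - poses[i - 1][1]
        if math.hypot(dx, dy) > gap:
            groups.append(current)
            current = []
        current.append(i)
    groups.append(current)
    return groups

def prune_constraints(constraints, poses, max_error=0.5):
    """Keep loop-closure constraints (i, j, measured_dist) whose error
    against the current pose estimates stays below `max_error`."""
    kept = []
    for i, j, measured in constraints:
        dx = poses[j][0] - poses[i][0]
        dy = poses[j][1] - poses[i][1]
        if abs(math.hypot(dx, dy) - measured) <= max_error:
            kept.append((i, j, measured))
    return kept

poses = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (6.0, 0.0)]
groups = group_nodes(poses)                      # [[0, 1, 2], [3]]
# second constraint is false: the observed object moved between visits
constraints = [(0, 2, 2.0), (0, 3, 1.0)]
kept = prune_constraints(constraints, poses)     # [(0, 2, 2.0)]
```

After pruning, a real system would reoptimize the pose graph using only the surviving constraints.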

  3. Robot computer problem solving system

    NASA Technical Reports Server (NTRS)

    Merriam, E. W.; Becker, J. D.

    1973-01-01

    A robot computer problem solving system which represents a robot exploration vehicle in a simulated Mars environment is described. The model incorporates changes and improvements made to a previously designed robot in a city environment. The Martian environment is modeled in Cartesian coordinates; objects are scattered about a plane; arbitrary restrictions on the robot's vision have been removed; and the robot's path contains arbitrary curves. New environmental features, particularly the visual occlusion of objects by other objects, were added to the model. Two different algorithms were developed for computing occlusion. Movement and vision capabilities of the robot were established in the Mars environment, using a LISP/FORTRAN interface for computational efficiency. The graphical display program was redesigned to reflect the change to the Mars-like environment.

  4. Performance Analysis of Distributed Object-Oriented Applications

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1998-01-01

    The purpose of this research was to evaluate the efficiency of a distributed simulation architecture that creates individual modules which are made self-scheduling through the use of a message-based communication system for requesting input data from another module which is the source of that data. To make the architecture as general as possible, the message-based communication architecture was implemented using standard remote object architectures (Common Object Request Broker Architecture (CORBA) and/or Distributed Component Object Model (DCOM)). A series of experiments was run in which different systems were distributed in a variety of ways across multiple computers and the performance evaluated. The experiments were duplicated in each case so that the overhead due to message communication and data transmission could be separated from the time required to actually perform the computational update of a module each iteration. The software used to distribute the modules across multiple computers was developed in the first year of the current grant and was modified considerably to add a message-based communication scheme supported by the DCOM distributed object architecture. The resulting performance was analyzed using a model created during the first year of this grant which predicts the overhead due to CORBA and DCOM remote procedure calls and includes the effects of data passed to and from the remote objects. A report covering the distributed simulation software and the results of the performance experiments has been submitted separately. That report also discusses possible future work to apply the methodology to dynamically distribute the simulation modules so as to minimize overall computation time.
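The overhead-separation idea (duplicating a run so per-call messaging cost can be isolated from the computational update) can be sketched with a generic timing harness. The update functions and the simulated latency below are hypothetical stand-ins, not the grant's CORBA/DCOM code.

```python
import time

def time_iterations(update, n_iter):
    """Wall-clock time for n_iter calls of a module update function."""
    start = time.perf_counter()
    for _ in range(n_iter):
        update()
    return time.perf_counter() - start

def local_update():
    # stand-in for the computational update of one simulation module
    sum(i * i for i in range(200))

def remote_update():
    # same update preceded by a simulated per-message latency; in the
    # real experiments this was a CORBA/DCOM remote procedure call
    time.sleep(0.0001)
    local_update()

N = 200
t_local = time_iterations(local_update, N)
t_remote = time_iterations(remote_update, N)
# messaging overhead per iteration, isolated by differencing the runs
overhead_per_call = (t_remote - t_local) / N
```

Differencing the two timings is exactly the duplication trick described above: the compute cost cancels, leaving the communication overhead.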

  5. Quantum Darwinism in an Everyday Environment: Huge Redundancy in Scattered Photons

    NASA Astrophysics Data System (ADS)

    Riedel, C. Jess; Zurek, Wojciech H.

    2010-07-01

    We study quantum Darwinism—the redundant recording of information about the preferred states of a decohering system by its environment—for an object illuminated by a blackbody. In the cases of point-source and isotropic illumination, we calculate the quantum mutual information between the object and its photon environment. We demonstrate that this realistic model exhibits fast and extensive proliferation of information about the object into the environment and results in redundancies orders of magnitude larger than the exactly soluble models considered to date.

  6. An ultraviolet study of B[e] stars: evidence for pulsations, luminous blue variable type variations and processes in envelopes

    NASA Astrophysics Data System (ADS)

    Krtičková, I.; Krtička, J.

    2018-06-01

    Stars that exhibit the B[e] phenomenon comprise a very diverse group of objects at different evolutionary stages. These objects show common spectral characteristics, including the presence of Balmer lines in emission, forbidden lines and a strong infrared excess due to dust. Observations of emission lines indicate illumination by an ultraviolet ionizing source, which is key to understanding the elusive nature of these objects. We study the ultraviolet variability of many B[e] stars to specify the geometry of the circumstellar environment and its variability. We analyse massive hot B[e] stars from our Galaxy and from the Magellanic Clouds. We study the ultraviolet broad-band variability derived from the flux-calibrated data. We determine variations of individual lines and the correlation with the total flux variability. We detected variability of the spectral energy distribution and of the line profiles. The variability has several sources of origin, including light absorption by the disc, pulsations, luminous blue variable type variations, and eclipses in the case of binaries. The stellar radiation of most B[e] stars is heavily obscured by circumstellar material. This suggests that the circumstellar material is present not only in the disc but also above its plane. The flux and line variability is consistent with a two-component model of a circumstellar environment composed of a dense disc and an ionized envelope. Observations of B[e] supergiants show that many of these stars have nearly the same luminosity, about 1.9 × 10^5 L⊙, and similar effective temperatures.

  7. The development of a collaborative virtual environment for finite element simulation

    NASA Astrophysics Data System (ADS)

    Abdul-Jalil, Mohamad Kasim

    Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have substantial potential to bridge this communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This faster design process is achieved through the development of capabilities that better enable the multidisciplinary modeling and trade-off decisions that are so critical before launching into a formal detailed design.
    The features of the environment developed as a result of this research include the ability to view design models, to use voice interaction, and to link engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design is obtaining pertinent simulation results in real time, which is critical so that designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, the finite-difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress and displacement), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited to the virtual meeting environment, where fast response time is required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction in the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction. It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
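The finite-difference DSA idea can be sketched in a few lines: perturb a design variable once to estimate the response sensitivity, then approximate the response to any nearby design linearly, avoiding a full re-analysis on every change. The beam-stress "analysis" below is a hypothetical toy, not the dissertation's FEA module.

```python
def sensitivity(response, x, dx=1e-6):
    """First-order finite-difference design sensitivity dR/dx."""
    return (response(x + dx) - response(x)) / dx

def approximate_response(response, x0, x_new):
    """Linear approximation R(x_new) ~ R(x0) + (dR/dx) * (x_new - x0).

    One full analysis at x0 plus one perturbed analysis gives near
    real-time estimates for nearby designs, as in the DSA approach.
    """
    return response(x0) + sensitivity(response, x0) * (x_new - x0)

# Toy 'analysis': a stress measure scaling as 1/thickness**2
stress = lambda t: 1200.0 / t ** 2

estimate = approximate_response(stress, 10.0, 10.5)  # fast linearized value
exact = stress(10.5)                                  # full re-analysis
```

The gap between `estimate` and `exact` is the linearization error; it stays small for perturbations near the baseline design, which is the regime a virtual design meeting operates in.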

  8. Reusable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop

    NASA Technical Reports Server (NTRS)

    Cottrell, William L.

    1994-01-01

    The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object oriented software engineering including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object oriented analysis is depicted. Dynamic and functional modeling are used to develop a system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and timelines are charted.

  9. Distributed visualization framework architecture

    NASA Astrophysics Data System (ADS)

    Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger

    2010-01-01

    An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy-to-use and extensible framework for research in scientific visualization. The system provides both single-user and collaborative distributed environments. The system architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These lightweight objects can be easily streamed across the web and even integrated with smart clients running on a user's cell phone. The back end is supported by concrete implementations wherever needed (for instance, for rendering). A middle tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager, which keeps track of all of the registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. 
Hence if a new component is added that supports the IMaterial interface, any instances of this can be used in the various GUI components that work with this interface. One of the main features is an interactive shader designer. This allows rapid prototyping of new visualization renderings that are shader-based and greatly accelerates the development and debug cycle.
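The interface-plus-AssetManager pattern described in this abstract can be sketched as follows. The names `IMaterial` and `AssetManager` echo the abstract; the code itself is an illustrative sketch under assumed semantics, not the framework's actual API.

```python
class IMaterial:
    """Minimal interface; concrete materials supply a shader name."""
    def shader(self) -> str:
        raise NotImplementedError

class PhongMaterial(IMaterial):
    """A concrete component implementing the IMaterial interface."""
    def shader(self):
        return "phong"

class AssetManager:
    """Registry of proxy objects, queryable by interface type."""
    def __init__(self):
        self._assets = []

    def register(self, asset):
        self._assets.append(asset)

    def query(self, interface):
        # GUI components populate themselves from this query, so any
        # newly registered component that supports the interface
        # appears in those components with no extra wiring
        return [a for a in self._assets if isinstance(a, interface)]

manager = AssetManager()
manager.register(PhongMaterial())
materials = manager.query(IMaterial)
```

Because editors talk only to the interface, registering a second material class would make it show up in every material-aware GUI component automatically, which is the integration property the abstract claims.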

  10. Remote sensing using MIMO systems

    DOEpatents

    Bikhazi, Nicolas; Young, William F; Nguyen, Hung D

    2015-04-28

    A technique for sensing a moving object within a physical environment using a MIMO communication link includes generating a channel matrix based upon channel state information of the MIMO communication link. The physical environment operates as a communication medium through which communication signals of the MIMO communication link propagate between a transmitter and a receiver. A spatial information variable is generated for the MIMO communication link based on the channel matrix. The spatial information variable includes spatial information about the moving object within the physical environment. A signature for the moving object is generated based on values of the spatial information variable accumulated over time. The moving object is identified based upon the signature.
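The patent's pipeline (channel matrix → spatial information variable → signature accumulated over time) can be sketched with a toy spatial variable: the dominant singular value of the channel matrix, which shifts when an object perturbs the propagation paths. Real CSI processing is far more involved; the matrices below are synthetic.

```python
import numpy as np

def spatial_variable(H):
    """Toy spatial-information variable: the dominant singular value of
    the channel matrix H, which changes when an object alters the paths."""
    return float(np.linalg.svd(H, compute_uv=False)[0])

def signature(channel_matrices):
    """Accumulate the spatial variable over time into a signature vector."""
    return np.array([spatial_variable(H) for H in channel_matrices])

rng = np.random.default_rng(0)
# static environment: identical channel matrices, flat signature
static = [np.eye(4) for _ in range(5)]
# moving object: growing perturbation of the channel over time
moving = [np.eye(4) + 0.3 * t * rng.standard_normal((4, 4))
          for t in range(5)]

sig_static = signature(static)
sig_moving = signature(moving)
```

A classifier would then match `sig_moving` against stored signatures to identify the object, the final step the claim describes.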

  11. The Philippine Population Program strategic plan (1981-1985).

    PubMed

    1980-01-01

    The challenge of the population problem is to effectively mobilize the country's population for productive activity. Rather than simple concern with controlling numbers, the emphasis is on human resource management, the structure of employment, labor productivity and income distribution. The long-term Philippine Development Plans reflect recognition of the dynamic interaction between fertility, productivity and welfare. The objectives of the 5-Year Philippine Development Plan 1978-1982, the 10-Year Plan 1978-1987, and the Long-Term Plan to the year 2000 integrate population concerns and socioeconomic goals. These objectives include the following: promotion of social development and social justice; attainment of self-sufficiency in food and greater self-reliance in energy; increased development of lagging regions, especially rural areas; improvement of habitat through the development of human settlements and proper management of the environment; and maintenance of population growth at levels conducive to national welfare. Some population concerns that are directly relevant to welfare (in addition to those related to productivity) are the distribution patterns of social goods and services, access to services by sectors of the population, and the buying power of families. As a total population policy should establish closer operational linkages between the demographic aspects and the productivity and welfare aspects of development, the mission of the National Population Program encompasses 3 areas: fertility, productivity, and welfare. Strategic policies include the following: abortion is unacceptable as a contraceptive method; the population program shall be non-coercive; and the program shall view individual and family welfare in the context of, and as the main objective of, national socioeconomic programs.

  12. Learning Objects and Virtual Learning Environments Technical Evaluation Criteria

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Dagiene, Valentina

    2009-01-01

    The main scientific problems investigated in this article deal with the technical evaluation of quality attributes of the main components of e-Learning systems (referred to here as DLEs--Digital Libraries of Educational Resources and Services), i.e., Learning Objects (LOs) and Virtual Learning Environments (VLEs). The main research object of the work is…

  13. Using IMPRINT to Guide Experimental Design with Simulated Task Environments

    DTIC Science & Technology

    2015-06-18

    USING IMPRINT TO GUIDE EXPERIMENTAL DESIGN WITH SIMULATED TASK ENVIRONMENTS. Thesis, AFIT-ENG-MS-15-J-052, presented to the Faculty, Department of... Gregory... Civilian, USAF, June 2015. DISTRIBUTION STATEMENT A: Approved for public release; distribution unlimited.

  14. Iterative optimizing quantization method for reconstructing three-dimensional images from a limited number of views

    DOEpatents

    Lee, Heung-Rae

    1997-01-01

    A three-dimensional image reconstruction method comprises treating the object of interest as a group of elements with a size that is determined by the resolution of the projection data, e.g., by the size of each pixel. One of the projections is used as a reference projection, and a fictitious object is arbitrarily defined, constrained by that reference projection. The method modifies the known structure of the fictitious object by comparing and optimizing its four projections against those of the unknown structure of the real object, and continues to iterate until the optimization is limited by the residual sum of background noise. The method is composed of several sub-processes: acquire the four projections from the real data and from the fictitious object; generate an arbitrary distribution to define the fictitious object; optimize the four projections; generate a new distribution for the fictitious object; and enhance the reconstructed image. The sub-process for the acquisition of the four projections from the input real data is simply the function of acquiring the four projections from the data of the transmitted intensity. The transmitted intensity represents the density distribution, that is, the distribution of absorption coefficients through the object.
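The iterate-until-the-projections-match loop can be sketched in 2-D with an ART-style additive update: start from an arbitrary fictitious object, compare its projections with the measured ones, and spread each residual back over the elements. This toy uses two projections (row and column sums) standing in for the patent's four, and is an illustrative sketch rather than the patented procedure.

```python
import numpy as np

def reconstruct(row_sums, col_sums, shape, n_iter=200):
    """Iteratively adjust a fictitious object so that its projections
    match the measured row/column sums (two views stand in for four)."""
    obj = np.ones(shape)  # arbitrary initial distribution
    for _ in range(n_iter):
        # project onto the row-sum constraint: spread each row's
        # residual evenly over that row's elements
        obj += (row_sums - obj.sum(axis=1))[:, None] / shape[1]
        # likewise for the column-sum constraint
        obj += (col_sums - obj.sum(axis=0))[None, :] / shape[0]
    return obj

# 'Measured' projections taken from a known test object
true = np.array([[0.0, 1.0],
                 [2.0, 3.0]])
rec = reconstruct(true.sum(axis=1), true.sum(axis=0), true.shape)
```

Each step is an orthogonal projection onto one projection-consistency constraint, so with consistent data the iterates converge to an object whose projections match the measurements; with only two views the object itself is generally not unique, which is why the patent uses four.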

  15. Students' perception of the learning environment in a distributed medical programme

    PubMed Central

    Veerapen, Kiran; McAleer, Sean

    2010-01-01

    Background The learning environment of a medical school has a significant impact on students' achievements and learning outcomes. The importance of equitable learning environments across programme sites is implicit in distributed undergraduate medical programmes being developed and implemented. Purpose To study the learning environment and its equity across two classes and three geographically separate sites of a distributed medical programme at the University of British Columbia Medical School that commenced in 2004. Method The validated Dundee Ready Educational Environment Survey was sent to all students in their 2nd and 3rd year (classes graduating in 2009 and 2008) of the programme. The domains of the learning environment surveyed were: students' perceptions of learning, students' perceptions of teachers, students' academic self-perceptions, students' perceptions of the atmosphere, and students' social self-perceptions. Mean scores, frequency distribution of responses, and inter- and intrasite differences were calculated. Results The perception of the global learning environment at all sites was more positive than negative. It was characterised by a strongly positive perception of teachers. The work load and emphasis on factual learning were perceived negatively. Intersite differences within domains of the learning environment were more evident in the pioneer class (2008) of the programme. Intersite differences consistent across classes were largely related to on-site support for students. Conclusions Shared strengths and weaknesses in the learning environment at UBC sites were evident in areas that were managed by the parent institution, such as the attributes of shared faculty and curriculum. A greater divergence in the perception of the learning environment was found in domains dependent on local arrangements and social factors that are less amenable to central regulation. This study underlines the need for ongoing comparative evaluation of the learning environment at the distributed sites and interaction between leaders of these sites. PMID:20922033

  16. Using PVM to host CLIPS in distributed environments

    NASA Technical Reports Server (NTRS)

    Myers, Leonard; Pohl, Kym

    1994-01-01

    It is relatively easy to enhance CLIPS (C Language Integrated Production System) to support multiple expert systems running in a distributed environment with heterogeneous machines. The task is minimized by using the PVM (Parallel Virtual Machine) code from Oak Ridge Labs to provide the distributed utility. PVM is a library of C and FORTRAN subprograms that supports distributed computing on many different UNIX platforms. A PVM daemon is easily installed on each CPU that enters the virtual machine environment. Any user with rsh or rexec access to a machine can use the one PVM daemon to obtain a generous set of distributed facilities. The ready availability of both CLIPS and PVM makes the combination of software particularly attractive for budget-conscious experimentation with heterogeneous distributed computing using multiple CLIPS executables. This paper presents a design that is sufficient to provide essential message passing functions in CLIPS and enable the full range of PVM facilities.
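    The PVM calls themselves are C and FORTRAN, but the routing pattern the daemon provides, tagged messages addressed to task ids, can be mimicked in a few lines. Everything below, class and method names included, is a hypothetical single-process stand-in for illustration, not the PVM API.

    ```python
    # Hypothetical stand-in for the daemon-mediated message passing that
    # PVM gives CLIPS executables: messages are routed by (task id, tag).
    from collections import defaultdict, deque

    class VirtualMachine:
        """Plays the role of the per-host PVM daemon in one process."""
        def __init__(self):
            self._next_tid = 0
            self._mailboxes = defaultdict(deque)  # (tid, tag) -> queued payloads

        def spawn(self):
            # Loose analogue of pvm_spawn(): hand out a fresh task id.
            self._next_tid += 1
            return self._next_tid

        def send(self, tid, tag, payload):
            # Loose analogue of pvm_initsend()/pvm_pk*()/pvm_send() combined.
            self._mailboxes[(tid, tag)].append(payload)

        def recv(self, tid, tag):
            # Loose analogue of a non-blocking pvm_nrecv(): None if empty.
            box = self._mailboxes[(tid, tag)]
            return box.popleft() if box else None

    vm = VirtualMachine()
    clips_a, clips_b = vm.spawn(), vm.spawn()
    vm.send(clips_b, tag=1, payload="(assert (fact from-a))")
    print(vm.recv(clips_b, tag=1))  # -> (assert (fact from-a))
    ```

    In the paper's setting, the payload would be a CLIPS construct forwarded between expert systems; the tag discriminates message kinds, as PVM message tags do.
    
    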

  17. Distributed expert systems for ground and space applications

    NASA Technical Reports Server (NTRS)

    Buckley, Brian; Wheatcraft, Louis

    1992-01-01

    Presented here is the Spacecraft Command Language (SCL) concept of the unification of ground and space operations using a distributed approach. SCL is a hybrid software environment borrowing from expert system technology, fifth generation language development, and multitasking operating system environments. Examples of potential uses for the system and current distributed applications of SCL are given.

  18. 60-Hz electric and magnetic fields generated by a distribution network.

    PubMed

    Héroux, P

    1987-01-01

    From a mobile unit, 60-Hz electric and magnetic fields generated by Hydro-Québec's distribution network were measured. Nine runs, representative of various human environments, were investigated. Typical values were 32 V/m and 0.16 microT. The electrical distribution networks investigated were major contributors to the electric and magnetic environments.

  19. Supply chain optimization: a practitioner's perspective on the next logistics breakthrough.

    PubMed

    Schlegel, G L

    2000-08-01

    The objective of this paper is to profile a practitioner's perspective on supply chain optimization and to highlight the critical elements of this potential new logistics breakthrough. The introduction will briefly describe the existing distribution network and business environment, including operational statistics, manufacturing software, and hardware configurations. The first segment will cover the critical success factors, or foundation elements, that are prerequisites for success. The second segment will give a glimpse of a "working game plan" for a successful migration to supply chain optimization. The final segment will briefly profile the "bottom-line" benefits to be derived from the use of supply chain optimization as a strategy, tactical tool, and competitive advantage.

  20. Agent oriented programming

    NASA Technical Reports Server (NTRS)

    Shoham, Yoav

    1994-01-01

    The goal of our research is a methodology for creating robust software in distributed and dynamic environments. The approach taken is to endow software objects with explicit information about one another, to have them interact through a commitment mechanism, and to equip them with a speech-act communication language. System-level applications include software interoperation and compositionality. A government application of specific interest is an infrastructure for coordination among multiple planners. Daily activity applications include personal software assistants, such as programmable email, scheduling, and newsgroup agents. Research topics include definition of the mental state of agents, design of agent languages as well as interpreters for those languages, and mechanisms for coordination within agent societies, such as artificial social laws and conventions.
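    Shoham's actual agent language, AGENT-0, is not reproduced here; the fragment below is a hypothetical Python sketch of just the two ingredients the abstract names: agents carrying explicit mental state, and speech-act messages that can create commitments.

    ```python
    # Hypothetical sketch (not AGENT-0) of agent-oriented programming's core
    # ideas: explicit mental state, plus speech-act messages ("inform",
    # "request") whose accepted requests become commitments.

    class Agent:
        def __init__(self, name):
            self.name = name
            self.beliefs = set()    # explicit mental state: what the agent believes
            self.commitments = []   # (to_whom, action) pairs the agent has accepted

        def tell(self, speech_act, sender, content):
            if speech_act == "inform":
                # An inform updates the receiver's beliefs.
                self.beliefs.add(content)
            elif speech_act == "request":
                # A request the agent accepts becomes a commitment to the sender.
                self.commitments.append((sender.name, content))

    a, b = Agent("planner-1"), Agent("planner-2")
    a.tell("inform", b, "runway-closed")
    a.tell("request", b, "replan-route")
    print(a.beliefs, a.commitments)  # beliefs hold the fact; commitments record the request
    ```

    A fuller treatment would add conditions under which requests are refused and commitments retracted, which is where the abstract's social laws and conventions come in.
    
    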
