The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data
NASA Technical Reports Server (NTRS)
Tesoriero, Roseanne; Zelkowitz, Marvin
1997-01-01
Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, interpreting the data is just as important to improvement as collecting it. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes to be distributed across several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data in this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. WebME will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.
ERIC Educational Resources Information Center
Dillenbourg, Pierre
1996-01-01
Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…
The impact of distributed computing on education
NASA Technical Reports Server (NTRS)
Utku, S.; Lestingi, J.; Salama, M.
1982-01-01
In this paper, developments in digital computer technology since the early Fifties are reviewed briefly, and the parallelism which exists between these developments and developments in analysis and design procedures of structural engineering is identified. The recent trends in digital computer technology are examined in order to establish the fact that distributed processing is now an accepted philosophy for further developments. The impact of this on the analysis and design practices of structural engineering is assessed by first examining these practices from a data processing standpoint to identify the key operations and data bases, and then fitting them to the characteristics of distributed processing. The merits and drawbacks of the present philosophy in educating structural engineers are discussed and projections are made for the industry-academia relations in the distributed processing environment of structural analysis and design. An ongoing experiment of distributed computing in a university environment is described.
An approach for heterogeneous and loosely coupled geospatial data distributed computing
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui
2010-07-01
Most GIS (Geographic Information System) applications draw on heterogeneous, autonomous geospatial information resources whose availability is unpredictable and dynamic in a distributed computing environment. To make these local resources work together on larger geospatial information processing problems that concern an overall situation, we propose, with the support of peer-to-peer computing technologies, a geospatial data distributed computing mechanism built on loosely coupled geospatial resource directories and a construct we term the Equivalent Distributed Program of a global geospatial query, aimed at solving geospatial distributed computing problems in heterogeneous GIS environments. First, we present a geospatial query processing schema for distributed computing, together with a method for equivalently transforming a global geospatial query into distributed local queries at the SQL (Structured Query Language) level, which addresses the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, using the developed prototype system, example applications of simple and complex geospatial data distributed queries illustrate the procedure of global geospatial information processing.
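The core idea of the equivalent transformation, decomposing one global query into local queries whose merged results match running the query against a single unified store, can be illustrated with a minimal sketch. The peer registry, table names, bounding-box filter, and merge rule below are illustrative assumptions, not the paper's actual schema or API.

```python
# Minimal sketch: rewrite one global geospatial query into per-peer local SQL
# queries and merge the partial results. Peer names, table names and the
# bounding-box predicate are illustrative assumptions.
from typing import Dict, List, Tuple

def build_local_queries(peers: Dict[str, str],
                        bbox: Tuple[float, float, float, float]) -> Dict[str, str]:
    """Produce one local SQL query per peer; together they act as an
    'equivalent distributed program' for the global bounding-box query."""
    xmin, ymin, xmax, ymax = bbox
    template = ("SELECT id, name, geom FROM {table} "
                "WHERE xmin >= {xmin} AND ymin >= {ymin} "
                "AND xmax <= {xmax} AND ymax <= {ymax}")
    return {peer: template.format(table=table, xmin=xmin, ymin=ymin,
                                  xmax=xmax, ymax=ymax)
            for peer, table in peers.items()}

def merge_results(partial_results: List[List[dict]]) -> List[dict]:
    """Union the local result sets, dropping duplicates that appear on more
    than one peer (e.g. features replicated across peer boundaries)."""
    seen, merged = set(), []
    for rows in partial_results:
        for row in rows:
            if row["id"] not in seen:
                seen.add(row["id"])
                merged.append(row)
    return merged

# Example: two autonomous peers exposing local feature tables.
queries = build_local_queries(
    {"peer_a": "roads_local_a", "peer_b": "roads_local_b"},
    bbox=(116.0, 39.5, 117.0, 40.5),
)
```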
Distributed Architecture for the Object-Oriented Method for Interoperability
2003-03-01
[Figure-list residue: "…Collaborative Environment"; "Figure V-2. Distributed OOMI and the Collaboration Centric Paradigm".] …of systems are formed into a system federation to resolve differences in modeling. An OOMI Integrated Development Environment (OOMI IDE) lends… space for the creation of possible distributed systems is partitioned into User Centric systems, Processing/Storage Centric systems, Implementation…
Code of Federal Regulations, 2011 CFR
2011-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos... the manufacture, import, processing, or distribution in commerce of asbestos-containing products in...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos..., importation, processing, and distribution in commerce of the asbestos-containing products identified and at...
Research into software executives for space operations support
NASA Technical Reports Server (NTRS)
Collier, Mark D.
1990-01-01
Research concepts are presented for a software (workstation) executive that will support a distributed processing command and control system in which high-performance graphics workstations serve as computing nodes. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. To address these concerns, allow the environment to function as an integrated system, and present a usable development environment to application programmers, an additional layer of software must be developed. This 'executive' software integrates the system, provides real-time capabilities, and supplies the tools necessary to support the application requirements.
NASA Technical Reports Server (NTRS)
Hung, Ching-Chen (Inventor)
1999-01-01
A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported, utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.
NASA Technical Reports Server (NTRS)
Hung, Ching-Cheh (Inventor)
1999-01-01
A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported, utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.
Distributed semantic networks and CLIPS
NASA Technical Reports Server (NTRS)
Snyder, James; Rodriguez, Tony
1991-01-01
Semantic networks of frames are commonly used as a method of reasoning in many problems. In most of these applications the semantic network exists as a single entity in a single-process environment. Advances in workstation hardware provide support for more sophisticated applications involving multiple processes interacting in a distributed environment. In these applications the semantic network may well be distributed over several concurrently executing tasks. This paper describes the design and implementation of a frame-based, distributed semantic network in which frames are accessed both through C Language Integrated Production System (CLIPS) expert systems and procedural C++ language programs. The application area is a knowledge-based, cooperative decision-making model utilizing both rule-based and procedural experts.
Exploiting virtual synchrony in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Joseph, Thomas A.
1987-01-01
Applications of a virtually synchronous environment are described for distributed programming, which underlies a collection of distributed programming tools in the ISIS2 system. A virtually synchronous environment allows processes to be structured into process groups, and makes events like broadcasts to the group as an entity, group membership changes, and even migration of an activity from one place to another appear to occur instantaneously, in other words, synchronously. A major advantage to this approach is that many aspects of a distributed application can be treated independently without compromising correctness. Moreover, user code that is designed as if the system were synchronous can often be executed concurrently. It is argued that this approach to building distributed and fault tolerant software is more straightforward, more flexible, and more likely to yield correct solutions than alternative approaches.
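The central abstraction described above, a process group in which broadcasts and membership changes are delivered in a single agreed order at every member so that they appear to occur synchronously, can be sketched in miniature. The sketch below is an illustrative simplification using a single sequencer, not the ISIS protocol itself.

```python
# Illustrative sketch of virtual synchrony: a single sequencer totally orders
# both broadcasts and membership changes, so every member observes the same
# event sequence and group events appear instantaneous. A simplification for
# exposition only; real protocols order events without a central sequencer.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Member:
    name: str
    view: List[str] = field(default_factory=list)
    delivered: List[str] = field(default_factory=list)

    def deliver(self, event: Tuple[str, object]):
        kind, payload = event
        if kind == "view":                 # membership change
            self.view = list(payload)
        else:                              # ordinary broadcast
            self.delivered.append(payload)

class Group:
    def __init__(self):
        self.members: List[Member] = []
        self.log: List[Tuple[str, object]] = []   # the single total order

    def _apply(self, event):
        self.log.append(event)
        for m in self.members:             # everyone sees the same sequence
            m.deliver(event)

    def join(self, member: Member):
        self.members.append(member)
        self._apply(("view", [m.name for m in self.members]))

    def broadcast(self, text: str):
        self._apply(("msg", text))

g = Group()
g.join(Member("p1")); g.join(Member("p2"))
g.broadcast("state update")                # ordered after both view changes
```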
Fowler, Mike S; Ruokolainen, Lasse
2013-01-01
The colour of environmental variability influences the size of population fluctuations when filtered through density dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical feedback) models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample Skewness and Kurtosis and decreasing mean Kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single and series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations. We must let the characteristics of known natural environmental covariates (e.g., colour and distribution shape) guide us in our choice of how to best model the impact of coloured environmental variation on population dynamics.
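As a concrete illustration of the colouring and normalisation issue discussed above, the sketch below generates a red AR(1) series and applies a rank-based "spectral mimicry" step that forces the series to be normally distributed while approximately preserving its colour. The parameter values, series length, and the particular rank-based mimicry variant are assumptions for illustration, not the authors' exact procedure.

```python
# Sketch: red AR(1) environmental noise plus a rank-based spectral-mimicry
# step that yields a normally distributed series with roughly the same
# autocorrelation. Parameters (kappa, length) are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(0)

def ar1_series(n, kappa):
    """AR(1) noise: x[t] = kappa*x[t-1] + sqrt(1-kappa^2)*eps[t].
    kappa > 0 reddens the series; kappa < 0 makes it blue."""
    x = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        x[t] = kappa * x[t - 1] + np.sqrt(1.0 - kappa**2) * eps[t]
    return x

def spectral_mimicry(coloured):
    """Arrange freshly drawn normal deviates in the rank order of the
    coloured series: the result has an exactly normal marginal distribution
    and approximately the original colour."""
    normal_vals = np.sort(rng.standard_normal(len(coloured)))
    ranks = np.argsort(np.argsort(coloured))
    return normal_vals[ranks]

def sample_skew(x):
    """Sample skewness, for comparing distribution shapes."""
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 3))

red = ar1_series(4096, kappa=0.7)
red_normal = spectral_mimicry(red)
print(sample_skew(red), sample_skew(red_normal))
```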
Distributed collaborative environments for predictive battlespace awareness
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process, and can provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.
NASA Astrophysics Data System (ADS)
Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.
2016-06-01
We are developing a unified distributed communication environment for the processing of spatial data that integrates web, desktop and mobile platforms and combines a volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups that can be scaled according to the required data volume and computing power while keeping infrastructure costs at a minimum. It is based on the "single window" principle, combining data access via geoportal functionality, processing capabilities and communication between researchers. The recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated using this new software environment. The new system will provide spatial data processing, analysis and 3D visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. This approach will make it possible to organize research and present results at a new technological level, offering more opportunities for the immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams and will provide access to existing distributed spatial information, for which we suggest implementing a user interface as an advanced front-end, e.g., for a virtual globe system.
Process membership in asynchronous environments
NASA Technical Reports Server (NTRS)
Ricciardi, Aleta M.; Birman, Kenneth P.
1993-01-01
The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.
Efficient High Performance Collective Communication for Distributed Memory Environments
ERIC Educational Resources Information Center
Ali, Qasim
2009-01-01
Collective communication allows efficient communication and synchronization among a collection of processes, unlike point-to-point communication that only involves a pair of communicating processes. Achieving high performance for both kernels and full-scale applications running on a distributed memory system requires an efficient implementation of…
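One classic way collectives avoid the O(P) cost of naive point-to-point gathering is a recursive-doubling all-reduce, in which each rank exchanges with a partner whose ID differs in one bit per round. The sketch below simulates that communication pattern in a single process and is illustrative only; it is not tied to any particular MPI implementation.

```python
# Sketch of recursive-doubling allreduce over P = 2^k simulated ranks: after
# log2(P) pairwise-exchange rounds every rank holds the global sum, versus
# P-1 steps for naive point-to-point accumulation.
def allreduce_sum(values):
    p = len(values)
    assert p & (p - 1) == 0, "illustration assumes a power-of-two rank count"
    vals = list(values)
    step = 1
    while step < p:
        new_vals = list(vals)
        for rank in range(p):
            partner = rank ^ step          # XOR partner for this round
            new_vals[rank] = vals[rank] + vals[partner]
        vals = new_vals
        step <<= 1
    return vals                            # every "rank" now holds the sum

print(allreduce_sum([1, 2, 3, 4, 5, 6, 7, 8]))   # -> [36, 36, ..., 36]
```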
Process for producing metal compounds from graphite oxide
NASA Technical Reports Server (NTRS)
Hung, Ching-Cheh (Inventor)
2000-01-01
A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported, utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.
Process for Producing Metal Compounds from Graphite Oxide
NASA Technical Reports Server (NTRS)
Hung, Ching-Cheh (Inventor)
2000-01-01
A process for providing elemental metals or metal oxides distributed on a carbon substrate or self-supported, utilizing graphite oxide as a precursor. The graphite oxide is exposed to one or more metal chlorides to form an intermediary product comprising carbon, metal, chloride, and oxygen. This intermediary product can be further processed by direct exposure to carbonate solutions to form a second intermediary product comprising carbon, metal carbonate, and oxygen. Either intermediary product may be further processed: a) in air to produce metal oxide; b) in an inert environment to produce metal oxide on carbon substrate; c) in a reducing environment to produce elemental metal distributed on carbon substrate. The product generally takes the shape of the carbon precursor.
A gossip based information fusion protocol for distributed frequent itemset mining
NASA Astrophysics Data System (ADS)
Sohrabi, Mohammad Karim
2018-07-01
The computational complexity, huge memory requirements, and time-consuming nature of the frequent pattern mining process are the most important motivations for distributing and parallelizing it. At the same time, the emergence of distributed computational and operational environments, in which data are produced and maintained across different distributed sources, makes parallelization and distribution of the knowledge discovery process inevitable. In this paper, a gossip-based distributed itemset mining (GDIM) algorithm is proposed to extract frequent itemsets, a special type of frequent pattern, in a wireless sensor network environment. In this algorithm, the local frequent itemsets of each sensor are extracted using a bit-wise horizontal approach (LHPM) from nodes that are clustered using a LEACH-based protocol. Cluster heads use a gossip-based protocol to communicate with each other and find the patterns whose global support meets or exceeds the specified support threshold. Experimental results show that the proposed algorithm outperforms the best existing gossip-based algorithm in terms of execution time.
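The gossip stage of such an algorithm can be illustrated with a push-style averaging of local support counts among cluster heads, from which global support is then judged against the threshold. The node values, round count, and threshold below are made up for illustration; this is not the GDIM protocol itself.

```python
# Sketch: cluster heads gossip local support counts for candidate itemsets
# until every head converges on (approximately) the network-wide average.
# Counts, rounds and the threshold are illustrative, not GDIM's parameters.
import random

def gossip_average(local_counts, rounds=50, seed=1):
    """local_counts: list of dicts {itemset: local support}, one per head."""
    random.seed(seed)
    itemsets = set().union(*local_counts)
    est = [{i: float(c.get(i, 0)) for i in itemsets} for c in local_counts]
    n = len(est)
    for _ in range(rounds):
        a, b = random.sample(range(n), 2)        # random pairwise exchange
        for i in itemsets:
            avg = (est[a][i] + est[b][i]) / 2.0
            est[a][i] = est[b][i] = avg           # both adopt the average
    return est

heads = [{"AB": 30, "BC": 5}, {"AB": 22, "BC": 9}, {"AB": 35, "BC": 2}]
estimates = gossip_average(heads)
total_support = {i: estimates[0][i] * len(heads) for i in estimates[0]}
globally_frequent = [i for i, s in total_support.items() if s >= 60]
print(globally_frequent)                          # e.g. ['AB']
```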
Cardea: Providing Support for Dynamic Resource Access in a Distributed Computing Environment
NASA Technical Reports Server (NTRS)
Lepro, Rebekah
2003-01-01
The environment framing the modern authorization process spans domains of administration, relies on many different authentication sources, and manages complex attributes as part of the authorization process. Cardea facilitates dynamic access control within this environment as a central function of an inter-operable authorization framework. The system departs from the traditional authorization model by separating the authentication and authorization processes, distributing the responsibility for authorization data and allowing collaborating domains to retain control over their implementation mechanisms. Critical features of the system architecture and its handling of the authorization process differentiate the system from existing authorization components by addressing common needs not adequately addressed by existing systems. Continuing system research seeks to enhance the implementation of the current authorization model employed in Cardea, increase the robustness of current features, further the framework for establishing trust and promote interoperability with existing security mechanisms.
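The general pattern this abstract describes, deciding authorization from attribute assertions independently of how and where the subject was authenticated, can be shown in a minimal sketch. The class, attribute names, and policy format below are illustrative assumptions, not Cardea's actual API.

```python
# Sketch of separating authentication from authorization: a home domain
# authenticates the subject and issues attribute assertions; a separate
# authorization step evaluates those attributes against the resource policy.
# All names and the policy shape are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class AttributeAssertion:
    issuer: str                    # the authenticating (home) domain
    subject: str
    attributes: Dict[str, str]     # e.g. {"role": "researcher"}

def authorize(assertion: AttributeAssertion,
              trusted_issuers: List[str],
              policy: Dict[str, str]) -> bool:
    """Authorization is decided from attributes alone; how the subject was
    authenticated remains the issuing domain's concern."""
    if assertion.issuer not in trusted_issuers:
        return False
    return all(assertion.attributes.get(attr) == value
               for attr, value in policy.items())

grant = authorize(
    AttributeAssertion("domain-a.example", "alice", {"role": "researcher"}),
    trusted_issuers=["domain-a.example", "domain-b.example"],
    policy={"role": "researcher"},
)
print(grant)   # True
```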
NASA Astrophysics Data System (ADS)
Olasz, A.; Nguyen Thai, B.; Kristóf, D.
2016-06-01
Within recent years, several new approaches and solutions for Big Data processing have been developed. The geospatial world is still facing a lack of well-established distributed processing solutions tailored to the volume and heterogeneity of geodata, especially when fast data processing is a must. The goal of such systems is to improve processing time by distributing data transparently across processing (and/or storage) nodes. These methodologies are based on the divide-and-conquer concept. Nevertheless, in the context of geospatial processing, most distributed computing frameworks have important limitations regarding both data distribution and data partitioning methods. Moreover, flexibility and expandability for handling various data types (often in binary formats) are also strongly required. This paper presents a concept for tiling, stitching and processing of big geospatial data. The system is based on the IQLib concept (https://github.com/posseidon/IQLib/) developed in the frame of the IQmulus EU FP7 research and development project (http://www.iqmulus.eu). The data distribution framework has no limitations on programming language environment and can execute scripts (and workflows) written in different development frameworks (e.g. Python, R or C#). It is capable of processing raster, vector and point cloud data. The prototype is presented through a case study dealing with country-wide processing of raster imagery. Further investigation of algorithmic and implementation details is planned for the near future.
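The tile-process-stitch pattern for raster data can be shown in miniature: split a raster into tiles, process each tile independently (potentially on different nodes), then stitch the results back together. The NumPy-based operations below are an illustration of the concept under that assumption, not IQLib code.

```python
# Miniature version of the tile -> process -> stitch pattern for raster data.
# Tiles could be dispatched to different nodes; here they run in a plain loop.
import numpy as np

def tile(raster, tile_size):
    """Yield ((row, col), tile_array) pairs covering the raster."""
    h, w = raster.shape
    for r in range(0, h, tile_size):
        for c in range(0, w, tile_size):
            yield (r, c), raster[r:r + tile_size, c:c + tile_size]

def process(tile_data):
    return tile_data * 2.0              # stand-in for any per-tile algorithm

def stitch(pieces, shape):
    """Reassemble processed tiles into a full raster."""
    out = np.empty(shape)
    for (r, c), data in pieces:
        out[r:r + data.shape[0], c:c + data.shape[1]] = data
    return out

raster = np.arange(64, dtype=float).reshape(8, 8)
pieces = [((r, c), process(t)) for (r, c), t in tile(raster, 4)]
result = stitch(pieces, raster.shape)
assert np.allclose(result, raster * 2.0)
```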
Distributed computing testbed for a remote experimental environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butner, D.N.; Casper, T.A.; Howard, B.C.
1995-09-18
Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility.
Indiva: a middleware for managing distributed media environment
NASA Astrophysics Data System (ADS)
Ooi, Wei-Tsang; Pletcher, Peter; Rowe, Lawrence A.
2003-12-01
This paper presents a unified set of abstractions and operations for hardware devices, software processes, and media data in a distributed audio and video environment. These abstractions, which are provided through a middleware layer called Indiva, use a file system metaphor to access resources and high-level commands to simplify the development of Internet webcast and distributed collaboration control applications. The design and implementation of Indiva are described and examples are presented to illustrate the usefulness of the abstractions.
PILOT: An intelligent distributed operations support system
NASA Technical Reports Server (NTRS)
Rasmussen, Arthur N.
1993-01-01
The Real-Time Data System (RTDS) project is exploring the application of advanced technologies to the real-time flight operations environment of the Mission Control Centers at NASA's Johnson Space Center. The system, based on a network of engineering workstations, provides services such as delivery of real time telemetry data to flight control applications. To automate the operation of this complex distributed environment, a facility called PILOT (Process Integrity Level and Operation Tracker) is being developed. PILOT comprises a set of distributed agents cooperating with a rule-based expert system; together they monitor process operation and data flows throughout the RTDS network. The goal of PILOT is to provide unattended management and automated operation under user control.
Laadan, Oren; Nieh, Jason; Phung, Dan
2012-10-02
Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
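The suspend / save / restore sequence described in this claim can be outlined in a short sketch. Every class and method name below is invented for illustration; a real system would capture OS-level state (sockets, file descriptors, memory images) rather than the in-memory dictionaries used here.

```python
# Outline of checkpoint/restart for a distributed application: suspend the
# processes, save network and process state, then recreate connections and
# restart the processes. All names and structures are illustrative.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Proc:
    name: str
    state: dict
    running: bool = True
    def suspend(self): self.running = False
    def serialize(self) -> dict: return dict(self.state)

@dataclass
class Checkpoint:
    images: Dict[str, dict] = field(default_factory=dict)
    connections: List[Tuple[str, str]] = field(default_factory=list)

def checkpoint(processes: List[Proc],
               connections: List[Tuple[str, str]]) -> Checkpoint:
    ckpt = Checkpoint(connections=list(connections))   # save network state
    for p in processes:
        p.suspend()                                    # quiesce the process
        ckpt.images[p.name] = p.serialize()            # save process state
    return ckpt

def restore(ckpt: Checkpoint):
    conns = list(ckpt.connections)                     # recreate connections first
    procs = [Proc(name, dict(image)) for name, image in ckpt.images.items()]
    return procs, conns                                # then restart processes

ckpt = checkpoint([Proc("worker-1", {"step": 42})], [("worker-1", "worker-2")])
restored_procs, restored_conns = restore(ckpt)
```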
WaveJava: Wavelet-based network computing
NASA Astrophysics Data System (ADS)
Ma, Kun; Jiao, Licheng; Shi, Zhuoer
1997-04-01
Wavelet theory is powerful, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets. At the distributed sites around the net, these data packets undergo matching or recognition processing in parallel. The results are fed back to determine the next operation, so more robust results can be reached quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system. It also fits other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communication systems.
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user- friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
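The quantization idea, emitting a state update only when the continuous state crosses the next quantum boundary, can be demonstrated with a short sketch that integrates a trajectory and counts how many messages quantization would send versus per-step updates. The quantum size, step size and trajectory are arbitrary illustrations, not values from the DEVS/HLA environment.

```python
# Sketch of quantization: a continuous trajectory is sampled step by step,
# but a state-update message is generated only when the value crosses a
# quantum level. Quantum size, step size and the trajectory are arbitrary.
import math

def quantized_updates(trajectory, quantum):
    updates = []
    last_level = math.floor(trajectory[0] / quantum)
    for t, x in enumerate(trajectory):
        level = math.floor(x / quantum)
        if level != last_level:            # boundary crossing -> send update
            updates.append((t, level * quantum))
            last_level = level
    return updates

dt = 0.01
traj = [math.sin(2 * math.pi * 0.5 * k * dt) for k in range(400)]
msgs = quantized_updates(traj, quantum=0.1)
print(f"per-step updates: {len(traj)}, quantized updates: {len(msgs)}")
```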
Reducing acquisition risk through integrated systems of systems engineering
NASA Astrophysics Data System (ADS)
Gross, Andrew; Hobson, Brian; Bouwens, Christina
2016-05-01
In the fall of 2015, the Joint Staff J7 (JS J7) sponsored the Bold Quest (BQ) 15.2 event and conducted planning and coordination to combine this event into a joint event with the Army Warfighting Assessment (AWA) 16.1 sponsored by the U.S. Army. This multipurpose event combined a Joint/Coalition exercise (JS J7) with components of testing, training, and experimentation required by the Army. In support of Assistant Secretary of the Army for Acquisition, Logistics, and Technology (ASA(ALT)) System of Systems Engineering and Integration (SoSE&I), Always On-On Demand (AO-OD) used a system of systems (SoS) engineering approach to develop a live, virtual, constructive distributed environment (LVC-DE) to support risk mitigation utilizing this complex and challenging exercise environment for a system preparing to enter limited user test (LUT). AO-OD executed a requirements-based SoS engineering process starting with user needs and objectives from Army Integrated Air and Missile Defense (AIAMD), Patriot units, Coalition Intelligence, Surveillance and Reconnaissance (CISR), Focused End State 4 (FES4) Mission Command (MC) Interoperability with Unified Action Partners (UAP), and Mission Partner Environment (MPE) Integration and Training, Tactics and Procedures (TTP) assessment. The SoS engineering process decomposed the common operational, analytical, and technical requirements, while utilizing the Institute of Electrical and Electronics Engineers (IEEE) Distributed Simulation Engineering and Execution Process (DSEEP) to provide structured accountability for the integration and execution of the AO-OD LVC-DE. As a result of this process implementation, AO-OD successfully planned for, prepared, and executed a distributed simulation support environment that responsively satisfied user needs and objectives, demonstrating the viability of an LVC-DE environment to support multiple user objectives and support risk mitigation activities for systems in the acquisition process.
About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture
NASA Astrophysics Data System (ADS)
Grauer, Manfred; Barth, Thomas
2004-06-01
Continually increasing complexity of products and their manufacturing processes, combined with a shorter "time-to-market", leads to greater use of simulation and optimization software systems for product design. Finding a "good" product design implies solving computationally expensive optimization problems based on simulation results. Due to the computational load of solving these problems, the requirements on the Information & Telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources can comprise software systems, hardware systems, or communication networks. An appropriate IT infrastructure must provide the means to integrate all these resources and enable their use across a network to cope with requirements from geographically distributed scenarios, e.g. in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is essential in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, the use of knowledge-based systems must be supported by data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation, the CAD system CATIA is used, coupled with the FEM simulation system INDEED for the simulation of sheet-metal forming processes and the problem solving environment OpTiX for distributed optimization.
Power Aware Signal Processing Environment (PASPE) for PAC/C
2003-02-01
[Figure residue: "… vs. FFT Size".] For our implementation, the Annapolis FFT core was radix-256, and therefore the smallest PN code length that could be processed was the… PN-64. A C-code version of correlate was compared to the FPGA implementation. The results in Figure 68 show that for a PN-1024, the… Approved for public release; distribution unlimited.
NASA Astrophysics Data System (ADS)
Feehan, S.; Ruggiero, P.; Hempel, L. A.; Anderson, D. L.; Cohn, N.
2016-12-01
Characterizing Feedbacks Between Environmental Forcing and Sediment Characteristics in Fluvial and Coastal Systems (American Geophysical Union, 2016 Fall Meeting, San Francisco, CA). Linking transport processes and sediment characteristics within different environments along the source-to-sink continuum provides critical insight into the dominant feedbacks between grain size distributions and morphological evolution. This research is focused on evaluating differences in sediment size distributions across both fluvial and coastal environments in the U.S. Pacific Northwest. The Cascades' high relief is characterized by diverse flow regimes, with high peak/flashy flows and sub-threshold flows occurring in relative proximity, and one of the most energetic wave climates in the world. Combining analyses of both fluvial and coastal environments provides a broader understanding of the dominant forces driving differences between each system's grain size distributions, sediment transport processes, and resultant evolution. We consider sediment samples taken during a large-scale flume experiment that simulated floods representative of both high/flashy peak flows, analogous to runoff-dominated rivers, and sub-threshold flows, analogous to spring-fed rivers. High-discharge flows resulted in narrower grain size distributions, while low flows were less skewed. Relative sediment size showed clear dependence on distance from source and the environment's dominant fluid motion. Grain size distributions and sediment transport rates were also quantified in both wave-dominated nearshore and aeolian-dominated backshore portions of Long Beach Peninsula, Washington during SEDEX2, the Sandbar-aEolian-Dune EXchange Experiment of summer 2016. The distributions showed spatial patterns in mean grain size, skewness, and kurtosis dependent on the dominant sediment transport process. The feedback between these grain size distributions and the predominant driver of sediment transport controls the potential for geomorphic change on societally relevant time scales in multiple settings.
NASA Astrophysics Data System (ADS)
de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.
2014-10-01
Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research effort characterized by the design and implementation, based on a software product line model, of a real distributed reservoir engineering environment. Experimental results indicate the successful use of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and its processes for the reservoir engineers.
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
Distributed systems status and control
NASA Technical Reports Server (NTRS)
Kreidler, David; Vickers, David
1990-01-01
Concepts are investigated for an automated status and control system for a distributed processing environment. System characteristics, data requirements for health assessment, data acquisition methods, system diagnosis methods and control methods were investigated in an attempt to determine the high-level requirements for a system which can be used to assess the health of a distributed processing system and implement control procedures to maintain an accepted level of health for the system. A potential concept for automated status and control includes the use of expert system techniques to assess the health of the system, detect and diagnose faults, and initiate or recommend actions to correct the faults. Therefore, this research included the investigation of methods by which expert systems were developed for real-time environments and distributed systems. The focus is on the features required by real-time expert systems and the tools available to develop real-time expert systems.
Distribution of tunnelling times for quantum electron transport.
Rudge, Samuel L; Kosov, Daniel S
2016-03-28
In electron transport, the tunnelling time is the time taken for an electron to tunnel out of a system after it has tunnelled in. We define the tunnelling time distribution for quantum processes in a dissipative environment and develop a practical approach for calculating it, where the environment is described by the general Markovian master equation. We illustrate the theory by using the rate equation to compute the tunnelling time distribution for electron transport through a molecular junction. The tunnelling time distribution is exponential, which indicates that Markovian quantum tunnelling is a Poissonian statistical process. The tunnelling time distribution is used not only to study the quantum statistics of tunnelling along the average electric current but also to analyse extreme quantum events where an electron jumps against the applied voltage bias. The average tunnelling time shows distinctly different temperature dependence for p- and n-type molecular junctions and therefore provides a sensitive tool to probe the alignment of molecular orbitals relative to the electrode Fermi energy.
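For the simplest case the exponential form quoted above follows directly from a single escape rate; a minimal version of the relation, written here with a generic rate Γ rather than the paper's full Markovian master-equation notation, is:

```latex
% Minimal illustration: a single Markovian escape rate \Gamma gives an
% exponential tunnelling-time distribution, the signature of a Poisson process.
W(\tau) = \Gamma \, e^{-\Gamma \tau},
\qquad
\langle \tau \rangle = \int_0^{\infty} \tau \, W(\tau)\, \mathrm{d}\tau = \frac{1}{\Gamma}.
```

An exponential waiting time between uncorrelated events is exactly what characterizes a Poisson process, which is the sense in which Markovian quantum tunnelling is Poissonian.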
Virtual Collaborative Simulation Environment for Integrated Product and Process Development
NASA Technical Reports Server (NTRS)
Gulli, Michael A.
1997-01-01
Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factory and enterprises in one seamless simulation environment.
Cellular water distribution, transport, and its investigation methods for plant-based food material.
Khan, Md Imran H; Karim, M A
2017-09-01
Heterogeneous and hygroscopic characteristics of plant-based food material make it complex in structure, and therefore water distribution in its different cellular environments is very complex. There are three different cellular environments, namely the intercellular environment, the intracellular environment, and the cell wall environment inside the food structure. According to the bonding strength, intracellular water is defined as loosely bound water, cell wall water is categorized as strongly bound water, and intercellular water is known as free water (FW). During food drying, optimization of the heat and mass transfer process is crucial for the energy efficiency of the process and the quality of the product. For optimizing heat and mass transfer during food processing, understanding these three types of waters (strongly bound, loosely bound, and free water) in plant-based food material is essential. However, there are few studies that investigate cellular level water distribution and transport. As there is no direct method for determining the cellular level water distributions, various indirect methods have been applied to investigate the cellular level water distribution, and there is, as yet, no consensus on the appropriate method for measuring cellular level water in plant-based food material. Therefore, the main aim of this paper is to present a comprehensive review on the available methods to investigate the cellular level water, the characteristics of water at different cellular levels and its transport mechanism during drying. The effect of bound water transport on quality of food product is also discussed. This review article presents a comparative study of different methods that can be applied to investigate cellular water such as nuclear magnetic resonance (NMR), bioelectric impedance analysis (BIA), differential scanning calorimetry (DSC), and dilatometry. The article closes with a discussion of current challenges to investigating cellular water. Copyright © 2017 Elsevier Ltd. All rights reserved.
40 CFR 761.363 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Applicability. 761.363 Section 761.363 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Double...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Background. 761.360 Section 761.360 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Double...
40 CFR 761.320 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Applicability. 761.320 Section 761.320 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Self...
Hierarchical Process Composition: Dynamic Maintenance of Structure in a Distributed Environment
1988-01-01
One prominent line of research stresses the independence of address space and thread of control, and the resulting efficiencies due to shared memory… cooperating processes. StarOS focuses on ease of use and a general capability mechanism, while Medusa stresses the effect of distributed hardware on system… process structure and the asynchrony among agents and between agents and sources of failure. By stressing dynamic structure, we are led to adopt an…
Mubayi, Anuj; Greenwood, Priscilla E.; Castillo-Chávez, Carlos; Gruenewald, Paul; Gorman, Dennis M.
2009-01-01
Alcohol consumption is a function of social dynamics, environmental contexts, individuals’ preferences and family history. Empirical surveys have focused primarily on identification of risk factors for high-level drinking but have done little to clarify the underlying mechanisms at work. Also, there have been few attempts to apply nonlinear dynamics to the study of these mechanisms and processes at the population level. A simple framework where drinking is modeled as a socially contagious process in low- and high-risk connected environments is introduced. Individuals are classified as light, moderate (assumed mobile), and heavy drinkers. Moderate drinkers provide the link between both environments, that is, they are assumed to be the only individuals drinking in both settings. The focus here is on the effect of moderate drinkers, measured by the proportion of their time spent in “low-” versus “high-” risk drinking environments, on the distribution of drinkers. A simple model within our contact framework predicts that if the relative residence times of moderate drinkers is distributed randomly between low- and high-risk environments then the proportion of heavy drinkers is likely to be higher than expected. However, the full story even in a highly simplified setting is not so simple because “strong” local social mixing tends to increase high-risk drinking on its own. High levels of social interaction between light and moderate drinkers in low-risk environments can diminish the importance of the distribution of relative drinking times on the prevalence of heavy drinking. PMID:20161388
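A toy version of a two-environment contagion structure of this kind can be written down to make the mechanism concrete. The compartments, transition rules and parameter values below are illustrative assumptions only and are not the authors' model or their estimated parameters.

```python
# Toy two-environment contagion sketch: light (L), moderate (M) and heavy (H)
# drinkers, with moderate drinkers splitting their time between a low-risk
# and a high-risk environment. All rules and parameters are illustrative
# assumptions, not the authors' model.
def step(L, M, H, time_high, beta_low=0.02, beta_high=0.08, recover=0.01):
    n = L + M + H
    # moderate drinkers transmit in both environments, weighted by residence time
    exposure = (beta_low * (1 - time_high) * M / n
                + beta_high * time_high * (M + H) / n)
    new_moderate = exposure * L                      # light -> moderate
    new_heavy = beta_high * time_high * H / n * M    # moderate -> heavy (high-risk setting)
    recovered = recover * H                          # heavy -> light
    return (L - new_moderate + recovered,
            M + new_moderate - new_heavy,
            H + new_heavy - recovered)

state = (900.0, 90.0, 10.0)
for _ in range(500):
    state = step(*state, time_high=0.6)    # more time spent in high-risk setting
print(tuple(round(x, 1) for x in state))
```

Varying `time_high` in such a sketch is the analogue of varying the moderate drinkers' relative residence times, the quantity whose effect on the distribution of drinkers the abstract emphasizes.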
Technology, Learning and Instruction: Distributed Cognition in the Secondary English Classroom
ERIC Educational Resources Information Center
Gomez, Mary Louise; Schieble, Melissa; Curwood, Jen Scott; Hassett, Dawnene
2010-01-01
In this paper, we analyse interactions between secondary students and pre-service teachers in an online environment in order to understand how their meaning-making processes embody distributed cognition. We begin by providing a theoretical review of the ways in which literacy learning is distributed across learners, objects, tools, symbols,…
NASA Astrophysics Data System (ADS)
Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur
In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Inspections. 763.176 Section 763.176 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos...
40 CFR 763.178 - Recordkeeping.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Recordkeeping. 763.178 Section 763.178 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos...
40 CFR 761.93 - Import for disposal.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Import for disposal. 761.93 Section 761.93 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE...
40 CFR 761.366 - Cleanup equipment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Cleanup equipment. 761.366 Section 761.366 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS Double...
Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang
2013-01-01
The correspondence between species distribution and the environment depends on species’ ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics. PMID:23874815
Ai, Dexiecuo; Gravel, Dominique; Chu, Chengjin; Wang, Gang
2013-01-01
The correspondence between species distribution and the environment depends on species' ability to track favorable environmental conditions (via dispersal) and to maintain competitive hierarchy against the constant influx of migrants (mass effect) and demographic stochasticity (ecological drift). Here we report a simulation study of the influence of landscape structure on species distribution. We consider lottery competition for space in a spatially heterogeneous environment, where the landscape is represented as a network of localities connected by dispersal. We quantified the contribution of neutrality and species sorting to their spatial distribution. We found that neutrality increases and the strength of species-sorting decreases with the centrality of a community in the landscape when the average dispersal among communities is low, whereas the opposite was found at elevated dispersal. We also found that the strength of species-sorting increases with environmental heterogeneity. Our results illustrate that spatial structure of the environment and of dispersal must be taken into account for understanding species distribution. We stress the importance of spatial geographic structure on the relative importance of niche vs. neutral processes in controlling community dynamics.
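The lottery competition with dispersal across a network of localities described above can be reduced to a short simulation sketch. The community sizes, environmental values, fitness function and dispersal rate below are illustrative choices, not the study's actual configuration.

```python
# Sketch of lottery competition for space across localities connected by
# dispersal: each death is replaced by a recruit drawn in proportion to the
# fitness-weighted abundance of a source pool (local with prob. 1-m, the
# other locality with prob. m). All parameter values are illustrative.
import random

random.seed(2)
S, J = 4, 100                                   # species, sites per locality
env = [0.2, 0.8]                                # two localities, different conditions
opt = [0.1, 0.4, 0.6, 0.9]                      # species environmental optima
fitness = [[1.0 / (1e-3 + abs(env[k] - opt[s])) for s in range(S)]
           for k in range(len(env))]
comm = [[random.randrange(S) for _ in range(J)] for _ in env]

def recruit(k, m=0.1):
    """Weighted lottery in locality k: local pool with prob. 1-m, immigrants m."""
    source = k if random.random() > m else 1 - k
    weights = [fitness[k][s] for s in comm[source]]   # selection acts locally
    return random.choices(comm[source], weights=weights, k=1)[0]

for _ in range(20000):                          # repeated death/replacement events
    k = random.randrange(len(env))
    comm[k][random.randrange(J)] = recruit(k)

for k in range(len(env)):
    counts = [comm[k].count(s) for s in range(S)]
    print(f"locality {k}: {counts}")            # species sorting vs. mass effect
```

Raising the dispersal rate `m` in this sketch strengthens the mass effect relative to local species sorting, which is the qualitative trade-off the study quantifies.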
A THREE-DIMENSIONAL MODEL ASSESSMENT OF THE GLOBAL DISTRIBUTION OF HEXACHLOROBENZENE
The distributions of persistent organic pollutants (POPs) in the global environment have typically been studied with box/fugacity models with simplified treatments of atmospheric transport processes. Such models are incapable of simulating the complex three-dimensional mechanis…
Biogeochemical Processes in Microbial Ecosystems
NASA Technical Reports Server (NTRS)
DesMarais, David J.
2001-01-01
The hierarchical organization of microbial ecosystems determines process rates that shape Earth's environment, create the biomarker sedimentary and atmospheric signatures of life, and define the stage upon which major evolutionary events occurred. In order to understand how microorganisms have shaped the global environment of Earth and, potentially, other worlds, we must develop an experimental paradigm that links biogeochemical processes with ever-changing temporal and spatial distributions of microbial populations and their metabolic properties. Additional information is contained in the original extended abstract.
A distributed Clips implementation: dClips
NASA Technical Reports Server (NTRS)
Li, Y. Philip
1993-01-01
A distributed version of the Clips language, dClips, was implemented on top of two existing generic distributed messaging systems to show that: (1) it is easy to create a coarse-grained parallel programming environment out of an existing language if a high level messaging system is used; and (2) the computing model of a parallel programming environment can be changed easily if we change the underlying messaging system. dClips processes were first connected with a simple master-slave model. A client-server model with intercommunicating agents was later implemented. The concept of service broker is being investigated.
40 CFR 761.369 - Pre-cleaning the surface.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Pre-cleaning the surface. 761.369 Section 761.369 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jianhui; Lu, Xiaonan; Martino, Sal
Many distribution management systems (DMS) projects have achieved limited success because the electric utility did not sufficiently plan for actual use of the DMS functions in the control room environment. As a result, end users were not clear on how to use the new application software in actual production environments with existing, well-established business processes. An important first step in the DMS implementation process is development and refinement of the “to be” business processes. Development of use cases for the required DMS application functions is a key activity that leads to the formulation of the “to be” requirements. It is also an important activity that is needed to develop specifications that are used to procure a new DMS.
NASA Technical Reports Server (NTRS)
Allard, R.; Mack, B.; Bayoumi, M. M.
1989-01-01
Most robot systems lack a suitable hardware and software environment for the efficient research of new control and sensing schemes. Typically, engineers and researchers need to be experts in control, sensing, programming, communication and robotics in order to implement, integrate and test new ideas in a robot system. In order to reduce this time, the Robot Controller Test Station (RCTS) has been developed. It uses a modular hardware and software architecture allowing easy physical and functional reconfiguration of a robot. This is accomplished by emphasizing four major design goals: flexibility, portability, ease of use, and ease of modification. An enhanced distributed processing version of RCTS is described. It features an expanded and more flexible communication system design. Distributed processing results in the availability of more local computing power and retains the low cost of microprocessors. A large number of possible communication, control and sensing schemes can therefore be easily introduced and tested, using the same basic software structure.
NASA Integrated Services Environment
NASA Technical Reports Server (NTRS)
Ing, Sharon
2005-01-01
This slide presentation will begin with a discussion on NASA's current distributed environment for directories, identity management and account management. We will follow with information concerning the drivers, design, reviews and implementation of the NISE Project. The final component of the presentation discusses processes used, status and conclusions.
Using process groups to implement failure detection in asynchronous environments
NASA Technical Reports Server (NTRS)
Ricciardi, Aleta M.; Birman, Kenneth P.
1991-01-01
Agreement on the membership of a group of processes in a distributed system is a basic problem that arises in a wide range of applications. Such groups occur when a set of processes cooperate to perform some task, share memory, monitor one another, subdivide a computation, and so forth. The group membership problems is discussed as it relates to failure detection in asynchronous, distributed systems. A rigorous, formal specification for group membership is presented under this interpretation. A solution is then presented for this problem.
Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion
NASA Astrophysics Data System (ADS)
Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger
2007-12-01
Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.
Semantics-based distributed I/O with the ParaMEDIC framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, P.; Feng, W.; Lin, H.
2008-01-01
Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing' which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.
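The trade-off described above, extra computation in exchange for much less data on the wire, can be illustrated with a toy analogue. The sketch below is not the ParaMEDIC code; it simply assumes both sites hold the same reference dataset, so only compact match indices (the "metadata") cross the network and the full output is regenerated at the storage site.

```python
# Illustrative sketch of trading re-computation for reduced transfer (hypothetical names).
from typing import List

REFERENCE = [f"record-{i:06d}" for i in range(100_000)]   # assumed shared at both sites

def compute_site(query: str) -> List[int]:
    """Expensive search at the compute site; return only indices, the compact metadata."""
    return [i for i, rec in enumerate(REFERENCE) if rec.endswith(query)]

def storage_site(match_indices: List[int]) -> List[str]:
    """Cheap post-processing at the storage site regenerates the full output."""
    return [REFERENCE[i] for i in match_indices]

if __name__ == "__main__":
    metadata = compute_site("777")         # a handful of integers instead of the full records
    full_output = storage_site(metadata)   # regenerated without a bulk transfer
    print(len(metadata), full_output[:3])
```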
Distributed service-based approach for sensor data fusion in IoT environments.
Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L
2014-10-15
The Internet of Things (IoT) enables the communication among smart objects promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication, and must support scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments.
Distributed Service-Based Approach for Sensor Data Fusion in IoT Environments
Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A.; Gutiérrez-Guerrero, José M.; Muros-Cobos, Jesús L.
2014-01-01
The Internet of Things (IoT) enables the communication among smart objects promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication, and must support scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments. PMID:25320907
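A minimal sketch of the service-composition idea follows: each node exposes a read service, nodes may register and leave dynamically, and a fusion service combines whatever is currently available. The class and method names are hypothetical illustrations, not the authors' API.

```python
# Minimal service-based data-fusion sketch (hypothetical names, not the paper's code).
import statistics
from typing import Callable, Dict, List

class SensorService:
    def __init__(self, name: str, read: Callable[[], float]):
        self.name, self.read = name, read

class FusionService:
    """Composes distributed sensor services and fuses their readings."""
    def __init__(self):
        self.services: Dict[str, SensorService] = {}

    def register(self, service: SensorService) -> None:   # nodes may join dynamically
        self.services[service.name] = service

    def unregister(self, name: str) -> None:               # ...and leave again
        self.services.pop(name, None)

    def fuse(self) -> float:
        readings: List[float] = [s.read() for s in self.services.values()]
        return statistics.median(readings)                 # robust to a single bad node

if __name__ == "__main__":
    fusion = FusionService()
    fusion.register(SensorService("kitchen", lambda: 21.4))
    fusion.register(SensorService("office", lambda: 22.1))
    fusion.register(SensorService("faulty", lambda: 85.0))
    print(fusion.fuse())   # 22.1
```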
A Research Program in Computer Technology. 1987 Annual Technical Report
1990-07-01
1987 Annual Technical Report: A Research Program in Computer Technology (Unclassified). Keywords: distributed processing, survivable networks, local networks, personal computers, workstation environment. The views expressed are those of the authors and should not be interpreted as representing the official opinion or policy of DARPA, the U.S. Government, or any person or agency.
NASA Technical Reports Server (NTRS)
Sagan, Carl; Thompson, W. Reid; Chyba, Christopher F.; Khare, B. N.
1991-01-01
A review and partial summary of projects within several areas of research generally involving the origin, distribution, chemistry, and spectral/dielectric properties of volatiles and organic materials in the outer solar system and early terrestrial environments are presented. The major topics covered include: (1) impact delivery of volatiles and organic compounds to the early terrestrial planets; (2) optical constants measurements; (3) spectral classification, chemical processes, and distribution of materials; and (4) radar properties of ice, hydrocarbons, and organic heteropolymers.
How Can Innovative Learning Environments Promote the Diffusion of Innovation?
ERIC Educational Resources Information Center
Osborne, Mark
2016-01-01
Schools implementing innovative learning environments (ILEs) face many challenges, including the need to discard previously cherished practices and behaviours, adjust mindsets, and invent successful new ways of operating. Leaders can support these processes by implementing structures that: i) support ongoing, distributed, participatory innovation;…
Knebel, Harley J.; Circe, Ronald C.
1995-01-01
This report illustrates, describes, and briefly discusses the acoustic and textural characteristics and the distribution of bottom sedimentary environments in Boston Harbor and Massachusetts Bay. The study is an outgrowth of a larger research program designed to understand the regional processes that distribute sediments and related contaminants in the area. The report highlights the major findings presented in recent papers by Knebel and others (1991), Knebel, (1993), and Knebel and Circe (1995). The reader is urged to consult the full text of these earlier papers for a more definitive treatment of the data and for appropriate supporting references.
1998-05-22
Report number PR-98-1. Sponsoring/monitoring agency: Office of Naval Research, Ballston Centre Tower One, One North Quincy... Distribution unlimited. ...Environments With Application To Multitarget Tracking. This research project is concerned with two distinct aspects of analysis and processing of sig...
A distributed reasoning engine ecosystem for semantic context-management in smart environments.
Almeida, Aitor; López-de-Ipiña, Diego
2012-01-01
To be able to react adequately, a smart environment must be aware of the context and its changes. Modeling the context allows applications to better understand it and to adapt to its changes. In order to do this an appropriate formal representation method is needed. Ontologies have proven themselves to be one of the best tools to do it. Semantic inference provides a powerful framework to reason over the context data. But there are some problems with this approach. The inference over semantic context information can be cumbersome when working with a large amount of data. This situation has become more common in modern smart environments where there are a lot of sensors and devices available. To tackle this problem we have developed a mechanism that distributes the context reasoning problem into smaller parts in order to reduce the inference time. In this paper we describe a distributed peer-to-peer agent architecture of context consumers and context providers. We explain how this inference sharing process works, partitioning the context information according to the interests of the agents, their location and a certainty factor. We also discuss the system architecture, analyzing the negotiation process between the agents. Finally we compare the distributed reasoning with the centralized one, analyzing in which situations each approach is more suitable.
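The partitioning step described above, routing context facts to agents by interest, location and certainty, can be sketched as follows. This is a hedged illustration only; the data structures and threshold value are invented for the example, not taken from the paper.

```python
# Hedged sketch of partitioning context facts among reasoning agents (hypothetical names).
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ContextFact:
    topic: str          # e.g. "presence", "temperature"
    location: str       # e.g. "kitchen"
    certainty: float    # 0.0 .. 1.0

@dataclass
class ReasonerAgent:
    name: str
    interests: set      # topics this context consumer cares about
    location: str

def partition(facts: List[ContextFact], agents: List[ReasonerAgent],
              min_certainty: float = 0.6) -> Dict[str, List[ContextFact]]:
    """Assign each sufficiently certain fact to the interested, co-located agents."""
    shares: Dict[str, List[ContextFact]] = {a.name: [] for a in agents}
    for fact in facts:
        if fact.certainty < min_certainty:
            continue                      # low-certainty facts are not distributed
        for agent in agents:
            if fact.topic in agent.interests and fact.location == agent.location:
                shares[agent.name].append(fact)
    return shares

if __name__ == "__main__":
    facts = [ContextFact("presence", "kitchen", 0.9), ContextFact("noise", "office", 0.4)]
    agents = [ReasonerAgent("lighting", {"presence"}, "kitchen")]
    print(partition(facts, agents))
```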
NASA Astrophysics Data System (ADS)
Kodama, Yu; Hamagami, Tomoki
A distributed processing system for restoration of an electric power distribution network using a two-layered CNP is proposed. The goal of this study is to develop a restoration system that adjusts to the future power network with distributed generators. The state of the art of this study is that the two-layered CNP is applied to a distributed computing environment in practical use. The two-layered CNP has two classes of agents, named field agents and operating agents, in the network. In order to avoid conflicts of tasks, the operating agent controls the privilege for managers to send task announcement messages in CNP. This technique realizes coordination between agents which work asynchronously in parallel with others. Moreover, this study implements the distributed processing system using a de facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). This study conducts simulation experiments of power distribution network restoration and compares the proposed system with the previous system. The results confirm the effectiveness of the proposed system.
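To make the two-layered idea concrete, the sketch below shows an operating agent that hands out an announcement privilege so only one manager at a time can open a contract-net round, while field agents bid on the announced task. This is an illustrative Python sketch with hypothetical class names and costs; it does not reproduce the JADE implementation.

```python
# Illustrative two-layered contract-net sketch (hypothetical, not the JADE code).
from typing import Dict, List, Optional

class OperatingAgent:
    """Upper layer: serialises task announcements by granting a single privilege token."""
    def __init__(self):
        self._holder: Optional[str] = None

    def request_privilege(self, manager: str) -> bool:
        if self._holder is None:
            self._holder = manager
            return True
        return False

    def release_privilege(self, manager: str) -> None:
        if self._holder == manager:
            self._holder = None

class FieldAgent:
    """Lower layer: bids on announced restoration tasks with a simple cost."""
    def __init__(self, name: str, spare_capacity: float):
        self.name, self.spare_capacity = name, spare_capacity

    def bid(self, load: float) -> Optional[float]:
        return load / self.spare_capacity if self.spare_capacity >= load else None

def contract_net_round(manager: str, load: float, operator: OperatingAgent,
                       agents: List[FieldAgent]) -> Optional[str]:
    if not operator.request_privilege(manager):
        return None                      # another manager is already announcing
    bids: Dict[str, float] = {a.name: b for a in agents if (b := a.bid(load)) is not None}
    operator.release_privilege(manager)
    return min(bids, key=bids.get) if bids else None

if __name__ == "__main__":
    operator = OperatingAgent()
    agents = [FieldAgent("feeder-A", 5.0), FieldAgent("feeder-B", 2.0)]
    print(contract_net_round("manager-1", 1.5, operator, agents))   # feeder-A wins
```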
Electromagnetic Devices and Processes in Environment Protection: Post-Conference Materials.
1994-09-09
Includes the contribution "Determining the Response of Power Transmission and Distribution Lines to a Nuclear Detonation at a High Altitude" (R. Goleman, M. Pahczyk, M. Pawlot...). "...processes are unfortunately endothermal ones. However, if the necessary energy is produced in a CO2-free way (e.g. using solar, hydro, wind, nuclear or..."
Spatial probability models of fire in the desert grasslands of the southwestern USA
USDA-ARS?s Scientific Manuscript database
Fire is an important driver of ecological processes in semiarid environments; however, the role of fire in desert grasslands of the Southwestern US is controversial and the regional fire distribution is largely unknown. We characterized the spatial distribution of fire in the desert grassland region...
Evaluation of distributed hydrologic impacts of temperature-index and energy-based snow models
USDA-ARS?s Scientific Manuscript database
Proper characterizations of snow melt and accumulation processes in the snow-dominated mountain environment are needed to understand and predict spatiotemporal distribution of water cycle components. Two commonly used strategies in modeling of snow accumulation and melt are the full energy based and...
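For readers unfamiliar with the first of the two strategies mentioned above, a temperature-index model reduces melt to a degree-day relation. The sketch below illustrates that relation only; the degree-day factor and threshold are placeholder values, not the ones used in the study.

```python
# Minimal temperature-index (degree-day) melt sketch; parameter values are placeholders.
def degree_day_melt(daily_mean_temp_c, ddf_mm_per_degc_day=3.0, threshold_c=0.0):
    """Return daily melt depth (mm w.e.) for a sequence of daily mean air temperatures."""
    return [max(t - threshold_c, 0.0) * ddf_mm_per_degc_day for t in daily_mean_temp_c]

if __name__ == "__main__":
    # melt occurs only on days warmer than the threshold
    print(degree_day_melt([-5.0, 1.0, 4.5]))   # [0.0, 3.0, 13.5]
```

An energy-based model would instead balance shortwave, longwave, turbulent and ground heat fluxes, which is why its distributed hydrologic impacts can differ from the simple index above.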
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
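The migration-point idea, checkpoint only the necessary data at pre-inserted points and resume elsewhere, can be sketched as below. This is a hedged illustration with invented file names and variables, not the MpPVM mechanism itself.

```python
# Hedged sketch of checkpointing at a pre-inserted migration point (not MpPVM code).
import json

def checkpoint(migration_point: str, necessary_data: dict, path: str) -> None:
    """Persist the minimal variable set needed to resume at this migration point."""
    with open(path, "w") as f:
        json.dump({"point": migration_point, "data": necessary_data}, f)

def run(path: str, resume: bool = False) -> float:
    total, start = 0.0, 0
    if resume:
        with open(path) as f:
            state = json.load(f)
        total, start = state["data"]["total"], state["data"]["i"]
    for i in range(start, 1000):
        total += i * 0.5
        if i == 500 and not resume:              # a previously inserted migration point
            checkpoint("loop-midpoint", {"total": total, "i": i + 1}, path)
            return total                         # pretend the process migrates here
    return total

if __name__ == "__main__":
    partial = run("state.json")             # original host runs up to the migration point
    final = run("state.json", resume=True)  # new host resumes from the checkpoint
    print(partial, final)
```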
A Distributed Snow Evolution Modeling System (SnowModel)
NASA Astrophysics Data System (ADS)
Liston, G. E.; Elder, K.
2004-12-01
A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.
High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away
NASA Astrophysics Data System (ADS)
Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.
2012-09-01
By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) Calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) Calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons learned for continuing development. Applicability of Cloud Computing: Commercial Cloud providers generally charge for all operations, including processing, transfer of input and output data, and storage of data, and so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file-systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment, where issues of, e.g., remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services.
In most of our work, we provisioned compute resources using a custom application, called Wrangler. Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
A development framework for distributed artificial intelligence
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Cottman, Bruce H.
1989-01-01
The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.
Discrimination of particulate matter emission sources using stochastic methods
NASA Astrophysics Data System (ADS)
Szczurek, Andrzej; Maciejewska, Monika; Wyłomańska, Agnieszka; Sikora, Grzegorz; Balcerek, Michał; Teuerle, Marek
2016-12-01
Particulate matter (PM) is one of the criteria pollutants that have been determined to be harmful to public health and the environment. For this reason the ability to recognize its emission sources is very important. There are a number of measurement methods which allow PM to be characterized in terms of concentration, particle size distribution, and chemical composition. All of this information is useful for establishing a link between the dust found in the air, its emission sources and its influence on humans as well as the environment. However, the methods are typically quite sophisticated and not applicable outside laboratories. In this work, we considered a PM emission source discrimination method which is based on continuous measurements of PM concentration with a relatively cheap instrument and stochastic analysis of the obtained data. The stochastic analysis focuses on the temporal variation of PM concentration and involves two steps: (1) recognition of the category of distribution for the data, i.e. stable or in the domain of attraction of a stable distribution, and (2) finding the best matching distribution out of Gaussian, stable and normal-inverse Gaussian (NIG). We examined six PM emission sources. They were associated with material processing in an industrial environment, namely machining and welding of aluminum, forged carbon steel and plastic with various tools. As shown by the obtained results, PM emission sources may be distinguished based on the statistical distribution of PM concentration variations. The major factor responsible for the differences detectable with our method was the type of material processing and the tool applied. When different materials were processed with the same tool, distinguishing the emission sources was difficult. For successful discrimination it was crucial to consider size-segregated mass fraction concentrations. In our opinion the presented approach is very promising and deserves further study and development.
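Step (2) of the analysis, selecting the best-matching distribution for the concentration increments, can be sketched with standard statistical tooling. The code below assumes SciPy is available, compares only the Gaussian and NIG candidates by log-likelihood, and uses synthetic data; the stable-law classification of step (1) is omitted, so this is an illustration rather than the authors' procedure.

```python
# Illustrative distribution-fitting sketch for PM concentration increments (assumes SciPy).
import numpy as np
from scipy import stats

def best_fit(increments: np.ndarray) -> str:
    candidates = {
        "gaussian": stats.norm,
        "normal-inverse gaussian": stats.norminvgauss,
    }
    scores = {}
    for name, dist in candidates.items():
        params = dist.fit(increments)                       # maximum-likelihood fit
        scores[name] = np.sum(dist.logpdf(increments, *params))
    return max(scores, key=scores.get)                      # highest log-likelihood wins

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic heavy-tailed increments standing in for measured concentration changes
    data = rng.standard_t(df=3, size=2000)
    print(best_fit(data))
```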
Programming with process groups: Group and multicast semantics
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Cooper, Robert; Gleeson, Barry
1991-01-01
Process groups are a natural tool for distributed programming and are increasingly important in distributed computing environments. Discussed here is a new architecture that arose from an effort to simplify Isis process group semantics. The findings include a refined notion of how the clients of a group should be treated, what the properties of a multicast primitive should be when systems contain large numbers of overlapping groups, and a new construct called the causality domain. A system based on this architecture is now being implemented in collaboration with the Chorus and Mach projects.
EVALUATING LANDSCAPE CHANGE AND HYDROLOGICAL CONSEQUENCES IN A SEMI-ARID ENVIRONMENT
During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial analysis technologies have been used to better understand the distribution of natural communities and ecosystems, and the ecological processes that affect these ...
Research on Intelligent Synthesis Environment
NASA Technical Reports Server (NTRS)
Loftin, R. Bowen; Dryer, David; Major, Debra; Fletcher, Tom
2002-01-01
The ultimate goal of this research project is to develop a methodology for the assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments. This review provides the theoretical foundation upon which subsequent empirical work will be based. Our review of the team performance literature has identified the following 12 conceptually distinct team interaction processes as characteristic of effective teams. 1) Mission Analysis; 2) Resource Distribution; 3) Leadership; 4) Timing; 5) Intra-team Feedback; 6) Motivational Functions; 7) Team Orientation; 8) Communication; 9) Coordination; 10) Mutual Performance Monitoring; 11) Back-up Behaviors; and 12) Cooperation. In addition, this review summarizes how team task characteristics (i.e., task type, task complexity, motivation, and temporal changes), team characteristics (i.e., team structure and team knowledge), and individual team member characteristics (i.e., dispositions and teamwork knowledge, skills, and abilities) affect team interaction processes, determine the relevance of these processes, and influence team performance. The costs and benefits of distributed team collaboration are also considered. The review concludes with a brief discussion of the nature of collaborative team engineering tasks.
Research on Intelligent Synthesis Environment
NASA Astrophysics Data System (ADS)
Loftin, R. Bowen; Dryer, David; Major, Debra; Fletcher, Tom
2002-10-01
The ultimate goal of this research project is to develop a methodology for the assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments. This review provides the theoretical foundation upon which subsequent empirical work will be based. Our review of the team performance literature has identified the following 12 conceptually distinct team interaction processes as characteristic of effective teams. 1) Mission Analysis; 2) Resource Distribution; 3) Leadership; 4) Timing; 5) Intra-team Feedback; 6) Motivational Functions; 7) Team Orientation; 8) Communication; 9) Coordination; 10) Mutual Performance Monitoring; 11) Back-up Behaviors; and 12) Cooperation. In addition, this review summarizes how team task characteristics (i.e., task type, task complexity, motivation, and temporal changes), team characteristics (i.e., team structure and team knowledge), and individual team member characteristics (i.e., dispositions and teamwork knowledge, skills, and abilities) affect team interaction processes, determine the relevance of these processes, and influence team performance. The costs and benefits of distributed team collaboration are also considered. The review concludes with a brief discussion of the nature of collaborative team engineering tasks.
Distributed decision support for the 21st century mission space
NASA Astrophysics Data System (ADS)
McQuay, William K.
2002-07-01
The past decade has produced significant changes in the conduct of military operations: increased humanitarian missions, asymmetric warfare, the reliance on coalitions and allies, stringent rules of engagement, concern about casualties, and the need for sustained air operations. Future mission commanders will need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Integral to this process is creating situational assessment-understanding the mission space, simulation to analyze alternative futures, current capabilities, planning assessments, course-of-action assessments, and a common operational picture-keeping everyone on the same sheet of paper. Decision support tools in a distributed collaborative environment offer the capability of decomposing these complex multitask processes and distributing them over a dynamic set of execution assets. Decision support technologies can semi-automate activities, such as planning an operation, that have a reasonably well-defined process and provide machine-level interfaces to refine the myriad of information that is not currently fused. The marriage of information and simulation technologies provides the mission commander with a collaborative virtual environment for planning and decision support.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cappellari, Michele
2013-11-20
The distribution of galaxies on the mass-size plane as a function of redshift or environment is a powerful test for galaxy formation models. Here we use integral-field stellar kinematics to interpret the variation of the mass-size distribution in two galaxy samples spanning extreme environmental densities. The samples are both identically and nearly mass-selected (stellar mass M* ≳ 6 × 10^9 M☉) and volume-limited. The first consists of nearby field galaxies from the ATLAS3D parent sample. The second consists of galaxies in the Coma Cluster (Abell 1656), one of the densest environments for which good, resolved spectroscopy can be obtained. The mass-size distribution in the dense environment differs from the field one in two ways: (1) spiral galaxies are replaced by bulge-dominated disk-like fast-rotator early-type galaxies (ETGs), which follow the same mass-size relation and have the same mass distribution as in the field sample; (2) the slow-rotator ETGs are segregated in mass from the fast rotators, with their size increasing proportionally to their mass. A transition between the two processes appears around the stellar mass M_crit ≈ 2 × 10^11 M☉. We interpret this as evidence for bulge growth (outside-in evolution) and bulge-related environmental quenching dominating at low masses, with little influence from merging. In contrast, significant dry mergers (inside-out evolution) and halo-related quenching drive the mass and size growth at the high-mass end. The existence of these two processes naturally explains the diverse size evolution of galaxies of different masses and the separability of mass and environmental quenching.
Security and privacy issues of personal health.
Blobel, Bernd; Pharow, Peter
2007-01-01
While health systems in developed countries and increasingly also in developing countries are moving from organisation-centred to person-centred health service delivery, the supporting communication and information technology is faced with new risks regarding security and privacy of stakeholders involved. The comprehensively distributed environment puts special burden on guaranteeing communication security services, but even more on guaranteeing application security services dealing with privilege management, access control and audit regarding social implication and connected sensitivity of personal information recorded, processed, communicated and stored in an even internationally distributed environment.
Web-Based Learning Support System
NASA Astrophysics Data System (ADS)
Fan, Lisa
Web-based learning support systems offer many benefits over traditional learning environments and have become very popular. The Web is a powerful environment for distributing information and delivering knowledge to an increasingly wide and diverse audience. Typical Web-based learning environments, such as WebCT and Blackboard, include course content delivery tools, quiz modules, grade reporting systems, assignment submission components, etc. They are powerful integrated learning management systems (LMS) that support a number of activities performed by teachers and students during the learning process [1]. However, students who study a course on the Internet tend to be more heterogeneously distributed than those found in a traditional classroom situation. In order to achieve optimal efficiency in a learning process, an individual learner needs his or her own personalized assistance. For a web-based open and dynamic learning environment, personalized support for learners becomes more important. This chapter demonstrates how to realize personalized learning support in dynamic and heterogeneous learning environments by utilizing Adaptive Web technologies. It focuses on course personalization in terms of contents and teaching materials tailored to each student's needs and capabilities. An example of using Rough Set analysis of student personal information to assist students with effective learning and to predict student performance is presented.
Distributed Computing Framework for Synthetic Radar Application
NASA Technical Reports Server (NTRS)
Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael
2006-01-01
We are developing an extensible software framework, in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech), and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
A Process for Comparing Dynamics of Distributed Space Systems Simulations
NASA Technical Reports Server (NTRS)
Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.
2009-01-01
The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
40 CFR 98.420 - Definition of the source category.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Definition of the source category. 98.420 Section 98.420 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... distribution of CO2. (4) Purification, compression, or processing of CO2. (5) On-site use of CO2 captured on...
40 CFR 98.420 - Definition of the source category.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Definition of the source category. 98.420 Section 98.420 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... distribution of CO2. (4) Purification, compression, or processing of CO2. (5) On-site use of CO2 captured on...
40 CFR 98.420 - Definition of the source category.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Definition of the source category. 98.420 Section 98.420 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... distribution of CO2. (4) Purification, compression, or processing of CO2. (5) On-site use of CO2 captured on...
40 CFR 98.420 - Definition of the source category.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Definition of the source category. 98.420 Section 98.420 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... distribution of CO2. (4) Purification, compression, or processing of CO2. (5) On-site use of CO2 captured on...
40 CFR 98.420 - Definition of the source category.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Definition of the source category. 98.420 Section 98.420 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... distribution of CO2. (4) Purification, compression, or processing of CO2. (5) On-site use of CO2 captured on...
Architecture for distributed design and fabrication
NASA Astrophysics Data System (ADS)
McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.
1997-01-01
We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semi-conductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.
Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report
NASA Astrophysics Data System (ADS)
Wildt, Daniel; Prikladnicki, Rafael
Global companies that have experienced extensive waterfall, phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because they comprise project management as part of their practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy-to-update documentation and greater team interaction over exhaustive documentation, focusing on continuous team improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company in transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.
System and method for secure group transactions
Goldsmith, Steven Y [Rochester, MN
2006-04-25
A method and a secure system, processing on one or more computers, provides a way to control a group transaction. The invention uses group consensus access control and multiple distributed secure agents in a network environment. Each secure agent can organize with the other secure agents to form a secure distributed agent collective.
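The group-consensus idea described above can be illustrated with a small sketch: a transaction executes only if every secure agent in the collective approves it. The classes, policies and the unanimity rule below are hypothetical simplifications for illustration, not the patented system's API.

```python
# Minimal sketch of group-consensus access control (hypothetical names and policies).
from typing import Callable, Dict, List

class SecureAgent:
    def __init__(self, name: str, policy: Callable[[dict], bool]):
        self.name, self.policy = name, policy

    def vote(self, transaction: dict) -> bool:
        return self.policy(transaction)

def group_transaction(transaction: dict, collective: List[SecureAgent]) -> bool:
    votes: Dict[str, bool] = {a.name: a.vote(transaction) for a in collective}
    approved = all(votes.values())          # unanimous consent required in this sketch
    if approved:
        print("executing", transaction["action"])
    else:
        print("rejected by", [n for n, v in votes.items() if not v])
    return approved

if __name__ == "__main__":
    collective = [
        SecureAgent("finance", lambda t: t["amount"] <= 10_000),
        SecureAgent("security", lambda t: t["action"] != "delete-audit-log"),
    ]
    group_transaction({"action": "transfer", "amount": 2_500}, collective)
```

A quorum rule (e.g. majority instead of unanimity) would be a one-line change to the `approved` test, which is the kind of policy choice such a collective would make explicit.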
Rapid Processing of Radio Interferometer Data for Transient Surveys
NASA Astrophysics Data System (ADS)
Bourke, S.; Mooley, K.; Hallinan, G.
2014-05-01
We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster. The pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey, the data for the JVLA-COSMOS project, and most of the EVLA L-Band data archive, imaging each integration to search for short-duration transients.
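The parallelisation pattern implied above, process each integration independently across workers, can be sketched generically. The code below is only an illustration of that pattern with placeholder functions; it does not reproduce AIPSlite or any AIPS task.

```python
# Hedged sketch of per-integration parallel processing (placeholder functions only).
from concurrent.futures import ProcessPoolExecutor

def image_integration(integration_id: int) -> str:
    """Stand-in for imaging one time integration and searching it for transients."""
    return f"integration {integration_id}: no transient"

def run_pipeline(n_integrations: int) -> list:
    with ProcessPoolExecutor() as pool:
        return list(pool.map(image_integration, range(n_integrations)))

if __name__ == "__main__":
    for line in run_pipeline(8):
        print(line)
```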
Guidelines for developing distributed virtual environment applications
NASA Astrophysics Data System (ADS)
Stytz, Martin R.; Banks, Sheila B.
1998-08-01
We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework; and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.
Optimized distributed computing environment for mask data preparation
NASA Astrophysics Data System (ADS)
Ahn, Byoung-Sup; Bang, Ju-Mi; Ji, Min-Kyu; Kang, Sun; Jang, Sung-Hoon; Choi, Yo-Han; Ki, Won-Tai; Choi, Seong-Woon; Han, Woo-Sung
2005-11-01
As the critical dimension (CD) becomes smaller, various resolution enhancement techniques (RET) are widely adopted. In developing sub-100nm devices, the complexity of optical proximity correction (OPC) is severely increased and OPC is applied to non-critical layers as well. The transformation of designed pattern data by the OPC operation increases data complexity, which causes runtime overheads in subsequent steps such as mask data preparation (MDP), and collapses the existing design hierarchy. Therefore, many mask shops exploit the distributed computing method to reduce the runtime of mask data preparation rather than exploit the design hierarchy. Distributed computing uses a cluster of computers that are connected to a local network system. However, two things limit the benefit of the distributed computing method in MDP. First, running every MDP job sequentially with the maximum number of available CPUs is not efficient compared to parallel MDP job execution, due to the input data characteristics. Second, the runtime enhancement over input cost is not sufficient since the scalability of fracturing tools is limited. In this paper, we will discuss an optimum load balancing environment that is useful in increasing the uptime of the distributed computing system by assigning an appropriate number of CPUs to each input design data set. We will also describe the distributed processing (DP) parameter optimization to obtain maximum throughput in MDP job processing.
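One simple way to read the load-balancing idea above is proportional CPU assignment capped by the fracturing tool's scalability limit. The sketch below illustrates that reading only; the job names, sizes and cap are invented, and the actual optimization in the paper may differ.

```python
# Illustrative load-balancing sketch for MDP jobs (hypothetical data and cap).
from typing import Dict

def assign_cpus(job_sizes_gb: Dict[str, float], total_cpus: int,
                max_cpus_per_job: int = 8) -> Dict[str, int]:
    """Assign CPUs proportionally to input size, capped by per-job tool scalability."""
    total_size = sum(job_sizes_gb.values())
    plan = {}
    for job, size in job_sizes_gb.items():
        share = round(total_cpus * size / total_size)
        plan[job] = max(1, min(share, max_cpus_per_job))
    return plan

if __name__ == "__main__":
    jobs = {"metal1": 40.0, "via1": 10.0, "poly": 30.0}
    print(assign_cpus(jobs, total_cpus=32))   # e.g. {'metal1': 8, 'via1': 4, 'poly': 8}
```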
Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco
2016-01-01
ABSTRACT Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type of retail facility (LD or SD) did not apparently affect the contamination. IMPORTANCE The study provides an in-depth description of the microbiota of meat and meat processing environments. It highlights the importance of the environment as a contamination source of spoilage bacteria, and it shows that the size of the retail facility does not affect the level and type of contamination. PMID:27129965
Ecogenomics: Ensemble Analysis of Gene Expression in Microbial Communities
NASA Technical Reports Server (NTRS)
Sogin, Mitchell; DesMarais, David J.; Stahl, D. A.; Pace, Norman R.
2001-01-01
The hierarchical organization of microbial ecosystems determines process rates that shape Earth's environment, create the biomarker sedimentary and atmospheric signatures of life, and define the stage upon which major evolutionary events occurred. In order to understand how microorganisms have shaped the global environment of Earth and, potentially, other worlds, we must develop an experimental paradigm that links biogeochemical processes with ever-changing temporal and spatial distributions of microbial populations and their metabolic properties. Additional information is contained in the original extended abstract.
Distributed GPU Computing in GIScience
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.
2013-12-01
Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation, environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to traditional microprocessors, the modern GPU is a compelling alternative with outstanding parallel processing capability, cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as graphics-targeted devices, are used to greatly improve rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Keywords: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. IEEE Transactions on Visualization and Computer Graphics, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
NASA Astrophysics Data System (ADS)
Hérail, Gérard; Fornari, Michel; Rouhier, Michel
1989-10-01
Gold placers are formed as a result of surficial processes but glacial and fluvioglacial systems are generally considered to be unfavourable for placer genesis. Nevertheless, some important glacial and fluvioglacial placers have been discovered and are currently being exploited in the Andes of Peru and Bolivia. In the Plio-Pleistocene Ananea-Ancocala basin (4300-4900 m above sea-level), the gold content of the various formations indicates that only glacial and fluvioglacial sediments related to the Ancocala and Chaquiminas Glaciations (middle and upper Pleistocene) contain gold in any notable quantity. Local concentrations of economic interest occur only where a glacier has cut through a primary mineralized zone. Glacial erosion of dispersed primary mineralizations does not produce high-content placers of the kind found in fluviatile environments. Gold distribution in tills is more irregular than in fluviatile sediments and no marked enrichment at bedrock occurs. The transition from a glacial to a fluvioglacial environment is characterized by an increase in gold content due to a relative concentration of the biggest gold flakes and by the appearance of a gold distribution pattern similar to that found in a fluviatile environment. During their transport by glacial and fluvioglacial processes, gold particles acquire specific features; the size and morphology of a gold flake population are determined by the sedimentological and geomorphological environment in which the flakes are carried.
Advanced end-to-end fiber optic sensing systems for demanding environments
NASA Astrophysics Data System (ADS)
Black, Richard J.; Moslehi, Behzad
2010-09-01
Optical fibers are small-in-diameter, light-in-weight, electromagnetic-interference immune, electrically passive, chemically inert, flexible, embeddable into different materials, and distributed-sensing enabling, and can be temperature and radiation tolerant. With appropriate processing and/or packaging, they can be very robust and well suited to demanding environments. In this paper, we review a range of complete end-to-end fiber optic sensor systems that IFOS has developed, comprising not only (1) packaged sensors and mechanisms for integration with demanding environments, but also (2) ruggedized sensor interrogators, and (3) intelligent decision aid algorithm software systems. We examine the following examples: (1) Fiber Bragg Grating (FBG) optical sensor systems supporting arrays of environmentally conditioned multiplexed FBG point sensors on single or multiple optical fibers: In conjunction with advanced signal processing, decision aid algorithms and reasoners, FBG sensor based structural health monitoring (SHM) systems are expected to play an increasing role in extending the life and reducing costs of new generations of aerospace systems. Further, FBG based structural state sensing systems have the potential to considerably enhance the performance of dynamic structures interacting with their environment (including jet aircraft, unmanned aerial vehicles (UAVs), and medical or extravehicular space robots). (2) Raman-based distributed temperature sensing systems: The complete length of optical fiber acts as a very long distributed sensor which may be placed down an oil well or wrapped around a cryogenic tank.
Mission Assurance in a Distributed Environment
2009-06-01
• Business Process Modeling Notation (BPMN) – graphical representation of business processes in a workflow. • Unified Modeling Language (UML) – standard UML diagrams (component, sequence, and activity diagrams) to model the system.
Research into display sharing techniques for distributed computing environments
NASA Technical Reports Server (NTRS)
Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.
1990-01-01
The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Because the prototype implementation is modular and the system design provides flexibility for Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can serve as the baseline for a production Display Sharing implementation. To facilitate this process, the following discussions are presented: theory of operation; system architecture; using the prototype; software description; research tools; prototype evaluation; and outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, minimizing the impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to Display Sharing from that machine.
The WorkPlace distributed processing environment
NASA Technical Reports Server (NTRS)
Ames, Troy; Henderson, Scott
1993-01-01
Real time control problems require robust, high performance solutions. Distributed computing can offer high performance through parallelism and robustness through redundancy. Unfortunately, implementing distributed systems with these characteristics places a significant burden on the applications programmers. Goddard Code 522 has developed WorkPlace to alleviate this burden. WorkPlace is a small, portable, embeddable network interface which automates message routing, failure detection, and re-configuration in response to failures in distributed systems. This paper describes the design and use of WorkPlace, and its application in the construction of a distributed blackboard system.
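WorkPlace itself is not described at the code level in the abstract; the following Python sketch only illustrates, under assumed names (NodeMonitor, heartbeat, reconfigure), the general heartbeat pattern behind failure detection and reconfiguration in such an embeddable network layer.

```python
# Illustrative heartbeat-based failure detection and route reconfiguration,
# in the spirit of (but not the actual API of) an embeddable network interface.
import time

class NodeMonitor:
    def __init__(self, timeout_s=2.0):
        self.timeout_s = timeout_s
        self.last_seen = {}          # node id -> timestamp of last heartbeat
        self.routes = {}             # destination -> preferred node id

    def heartbeat(self, node_id, now=None):
        self.last_seen[node_id] = time.monotonic() if now is None else now

    def failed_nodes(self, now=None):
        now = time.monotonic() if now is None else now
        return {n for n, t in self.last_seen.items() if now - t > self.timeout_s}

    def reconfigure(self, now=None):
        """Drop routes that point at failed nodes so traffic is re-routed."""
        dead = self.failed_nodes(now)
        self.routes = {dst: n for dst, n in self.routes.items() if n not in dead}
        return dead

if __name__ == "__main__":
    mon = NodeMonitor(timeout_s=2.0)
    mon.routes = {"blackboard": "node-a", "logger": "node-b"}
    mon.heartbeat("node-a", now=0.0)
    mon.heartbeat("node-b", now=0.0)
    mon.heartbeat("node-a", now=3.0)        # node-b stops responding
    print("failed:", mon.reconfigure(now=3.5))
    print("routes:", mon.routes)
```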
Data on nearshore wave process and surficial beach deposits, central Tamil Nadu coast, India.
Joevivek, V; Chandrasekar, N
2017-08-01
The chronicles of nearshore morphology and surficial beach deposits provide valuable information about the nature of the beach condition and the depositional environment. They impart an understanding of the spatial and temporal relationship of nearshore waves and their influence on the distribution of beach sediments. This article contains data about the wave and sediment dynamics of ten sandy beaches along the central Tamil Nadu coast, India. The present dataset comprises nearshore wave parameters, breaker wave type, beach morphodynamic state, grain size distribution, and the weight percentage of heavy and light mineral distribution. The dataset characterizes the beach morphology and hydrodynamic condition with respect to the different monsoonal seasons and will act as a field reference for understanding coastal dynamics under open-sea conditions. The nearshore entities were obtained from an intensive field survey between January 2011 and December 2011, while the characteristics of the beach sediments were examined by chemical processing in the laboratory.
John M. Buffington; Daniele Tonina
2009-01-01
We propose that the mechanisms driving hyporheic exchange vary systematically with different channel morphologies and associated fluvial processes that occur in mountain basins, providing a framework for examining physical controls on hyporheic environments and their spatial variation across the landscape. Furthermore, the spatial distribution of hyporheic environments...
ERIC Educational Resources Information Center
Lee, Young-Jin
2012-01-01
This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…
Transport, behavior, and fate of volatile organic compounds in streams
Rathbun, R.E.
1998-01-01
Volatile organic compounds (VOCs) are compounds with chemical and physical properties that allow the compounds to move freely between the water and air phases of the environment. VOCs are widespread in the environment because of this mobility. Many VOCs have properties making them suspected or known hazards to the health of humans and aquatic organisms. Consequently, understanding the processes affecting the concentration and distribution of VOCs in the environment is necessary. The U.S. Geological Survey selected 55 VOCs for study. This report reviews the characteristics of the various processes that could affect the transport, behavior, and fate of these VOCs in streams.
Using {sup 222}Rn as a tracer of geophysical processes in underground environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacerda, T.; Anjos, R. M.; Valladares, D. L.
2014-11-11
Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today used for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on the variations of outside temperature. The results also indicate that the radon distribution pattern appears to be a good method to localize unknown ducts, fissures or secondary tunnels in subterranean environments.
Dusty Plasmas on the Lunar Surface
NASA Astrophysics Data System (ADS)
Horanyi, M.; Andersson, L.; Colwell, J.; Ergun, R.; Gruen, E.; McClintock, B.; Peterson, W. K.; Robertson, S.; Sternovsky, Z.; Wang, X.
2006-12-01
The electrostatic levitation and transport of lunar dust remains one of the most interesting and controversial science issues from the Apollo era. This issue is also of great engineering importance in designing human habitats and protecting optical and mechanical devices. As a function of time and location, the lunar surface is exposed to solar wind plasma, UV radiation, and/or the plasma environment of our magnetosphere. Dust grains on the lunar surface collect an electrostatic charge, alter the large-scale surface charge density distribution, and subsequently develop an interface region to the background plasma and radiation. There are several in situ and remote sensing observations that indicate that dusty plasma processes are likely to be responsible for the mobilization and transport of lunar soil. These processes are relevant to: a) understanding the lunar surface environment; b) developing dust mitigation strategies; and c) understanding the basic physical processes involved in the birth and collapse of dust-loaded plasma sheaths. This talk will focus on the dusty plasma processes on the lunar surface. We will review the existing body of observations, and will also consider future opportunities for the combination of in situ and remote sensing observations. Our goals are to characterize: a) the temporal variation of the spatial and size distributions of the levitated/transported dust; and b) the surface plasma environment.
The Emerging Importance of Business Process Standards in the Federal Government
2006-02-23
delivers enough value for its commercialization into the general industry. Today, we are seeing standards such as SOA, BPMN and BPEL hit that... Process Modeling Notation (BPMN) and the Business Process Execution Language (BPEL). BPMN provides a standard representation for capturing and... execution. The combination of BPMN and BPEL offers organizations the potential to standardize processes in a distributed environment, enabling...
Parallel task processing of very large datasets
NASA Astrophysics Data System (ADS)
Romig, Phillip Richardson, III
This research concerns the use of distributed computer technologies for the analysis and management of very large datasets. Improvements in sensor technology, an emphasis on global change research, and greater access to data warehouses are all increasing the number of non-traditional users of remotely sensed data. We present a framework for distributed solutions to the challenges of datasets which exceed the online storage capacity of individual workstations. This framework, called parallel task processing (PTP), incorporates both the task- and data-level parallelism exemplified by many image processing operations. An implementation based on the principles of PTP, called Tricky, is also presented. Additionally, we describe the challenges and practical issues in modeling the performance of parallel task processing with large datasets. We present a mechanism for estimating the running time of each unit of work within a system and an algorithm that uses these estimates to simulate the execution environment and produce estimated runtimes. Finally, we describe and discuss experimental results which validate the design. Specifically, the system (a) is able to perform computation on datasets which exceed the capacity of any one disk, (b) provides reduction of overall computation time as a result of the task distribution even with the additional cost of data transfer and management, and (c) in the simulation mode accurately predicts the performance of the real execution environment.
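As a hedged illustration of the runtime-estimation idea (not the actual PTP/Tricky cost model), the sketch below combines an invented per-task cost estimate with a greedy list-scheduling simulation to predict the makespan for different worker counts.

```python
# Toy version of estimating per-task runtimes and simulating their distribution
# across workers to predict overall completion time; the cost model is invented.
import heapq

def estimate_task_seconds(pixels, bytes_to_move, mflops_per_pixel=0.002,
                          worker_mflops=500.0, net_mb_per_s=10.0):
    compute = pixels * mflops_per_pixel / worker_mflops
    transfer = (bytes_to_move / 1e6) / net_mb_per_s
    return compute + transfer

def simulate_schedule(task_seconds, n_workers):
    """Greedy list scheduling: give the next task to the earliest-free worker."""
    workers = [0.0] * n_workers            # each worker's next-free time
    heapq.heapify(workers)
    for t in sorted(task_seconds, reverse=True):
        free_at = heapq.heappop(workers)
        heapq.heappush(workers, free_at + t)
    return max(workers)                    # estimated makespan

if __name__ == "__main__":
    tasks = [estimate_task_seconds(pixels=2_000_000, bytes_to_move=8_000_000)
             for _ in range(64)]
    for n in (1, 4, 8, 16):
        print(f"{n:2d} workers -> estimated runtime {simulate_schedule(tasks, n):7.1f} s")
```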
Aging Affects Adaptation to Sound-Level Statistics in Human Auditory Cortex.
Herrmann, Björn; Maess, Burkhard; Johnsrude, Ingrid S
2018-02-21
Optimal perception requires efficient and adaptive neural processing of sensory input. Neurons in nonhuman mammals adapt to the statistical properties of acoustic feature distributions such that they become sensitive to sounds that are most likely to occur in the environment. However, whether human auditory responses adapt to stimulus statistical distributions and how aging affects adaptation to stimulus statistics is unknown. We used MEG to study how exposure to different distributions of sound levels affects adaptation in auditory cortex of younger (mean: 25 years; n = 19) and older (mean: 64 years; n = 20) adults (male and female). Participants passively listened to two sound-level distributions with different modes (either 15 or 45 dB sensation level). In a control block with long interstimulus intervals, allowing neural populations to recover from adaptation, neural response magnitudes were similar between younger and older adults. Critically, both age groups demonstrated adaptation to sound-level stimulus statistics, but adaptation was altered for older compared with younger people: in the older group, neural responses continued to be sensitive to sound level under conditions in which responses were fully adapted in the younger group. The lack of full adaptation to the statistics of the sensory environment may be a physiological mechanism underlying the known difficulty that older adults have with filtering out irrelevant sensory information. SIGNIFICANCE STATEMENT Behavior requires efficient processing of acoustic stimulation. Animal work suggests that neurons accomplish efficient processing by adjusting their response sensitivity depending on statistical properties of the acoustic environment. Little is known about the extent to which this adaptation to stimulus statistics generalizes to humans, particularly to older humans. We used MEG to investigate how aging influences adaptation to sound-level statistics. Listeners were presented with sounds drawn from sound-level distributions with different modes (15 vs 45 dB). Auditory cortex neurons adapted to sound-level statistics in younger and older adults, but adaptation was incomplete in older people. The data suggest that the aging auditory system does not fully capitalize on the statistics available in sound environments to tune the perceptual system dynamically. Copyright © 2018 the authors 0270-6474/18/381989-11$15.00/0.
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from the solution of a single problem to the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Lv, Dong; Zhu, Tianle; Liu, Runwei; Li, Xinghua; Zhao, Yuan; Sun, Ye; Wang, Hongmei; Zhang, Fan; Zhao, Qinglin
2018-04-08
To understand the effects of co-processing sewage sludge in a cement kiln on non-criterion pollutant emissions and the surrounding environment, flue gas from the cement kiln stack, and ambient air and soil from background/downwind sites, were collected at the cement plant. Polycyclic aromatic hydrocarbons (PAHs) and heavy metals in the samples were analyzed. The results show that PAHs in the flue gas exist mainly in the gas phase and that low molecular weight PAHs are the predominant congeners. Co-processing sewage sludge increases PAH and heavy metal emissions, especially of high molecular weight PAHs and low-volatility heavy metals such as Cd and Pb in the particle phase, although it does not change their compositions and distribution patterns significantly. The concentrations and distributions of PAHs and heavy metals in the emissions and in the ambient air are positively correlated, and co-processing sewage sludge increases PAH and heavy metal concentrations in the ambient air. The PAH concentration level and distribution in soil are proportional to those in the particle phase of the flue gas, and co-processing sewage sludge can accelerate the accumulation of PAHs and heavy metals in the surrounding soil, especially high/middle molecular weight PAHs and low-volatility heavy metals.
A Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.
2015-07-01
Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data due to the data- and computing-intensive issues involved. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide affluent image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be directly processed in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
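The framework relies on Hadoop, HDFS and the Orfeo toolbox; the short Python sketch below only mimics the map-shuffle-reduce pattern in a single process, with invented scene identifiers and a made-up per-tile statistic, to show the style of computation being distributed.

```python
# Minimal, single-process illustration of the MapReduce pattern; it does not
# use Hadoop or the Orfeo toolbox, and the record layout is hypothetical.
from collections import defaultdict

def map_phase(record):
    """Emit (scene_id, cloud_fraction) pairs for one image tile record."""
    scene_id, cloud_fraction = record
    yield scene_id, cloud_fraction

def reduce_phase(scene_id, values):
    """Aggregate a per-scene statistic from all tiles of that scene."""
    return scene_id, sum(values) / len(values)

def run_mapreduce(records):
    shuffled = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            shuffled[key].append(value)            # "shuffle": group by key
    return [reduce_phase(k, v) for k, v in sorted(shuffled.items())]

if __name__ == "__main__":
    tiles = [("scene-001", 0.10), ("scene-001", 0.30), ("scene-002", 0.05)]
    print(run_mapreduce(tiles))   # [('scene-001', 0.2), ('scene-002', 0.05)]
```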
Distributed computing feasibility in a non-dedicated homogeneous distributed system
NASA Technical Reports Server (NTRS)
Leutenegger, Scott T.; Sun, Xian-He
1993-01-01
The low cost and availability of clusters of workstations have led researchers to re-explore distributed computing using independent workstations. This approach may provide better cost/performance than tightly coupled multiprocessors. In practice, this approach often utilizes wasted cycles to run parallel jobs. The feasibility of such a non-dedicated parallel processing environment, assuming workstation processes have preemptive priority over parallel tasks, is addressed. An analytical model is developed to predict parallel job response times. Our model provides insight into how significantly workstation owner interference degrades parallel program performance. A new term, task ratio, which relates the parallel task demand to the mean service demand of nonparallel workstation processes, is introduced. It is proposed that the task ratio is a useful metric for determining how large the demand of a parallel application must be in order to make efficient use of a non-dedicated distributed system.
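A minimal sketch of the task ratio metric as defined above follows; the efficiency threshold used in it is arbitrary and only illustrates how such a metric might be applied.

```python
# Sketch of the "task ratio" idea: parallel task demand relative to the mean
# service demand of the workstation owner's own processes. The threshold below
# is arbitrary and for illustration only.
def task_ratio(parallel_task_demand_s, mean_owner_demand_s):
    return parallel_task_demand_s / mean_owner_demand_s

if __name__ == "__main__":
    owner_jobs_s = [0.4, 0.7, 1.1, 0.6]                  # owner process service demands
    mean_owner = sum(owner_jobs_s) / len(owner_jobs_s)
    for demand in (0.5, 5.0, 50.0):                      # candidate parallel task sizes (s)
        r = task_ratio(demand, mean_owner)
        verdict = "likely efficient" if r >= 10 else "owner interference may dominate"
        print(f"task demand {demand:5.1f}s -> task ratio {r:6.1f} ({verdict})")
```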
Red mud flocculation process in alumina production
NASA Astrophysics Data System (ADS)
Fedorova, E. R.; Firsov, A. Yu
2018-05-01
The process of thickening and washing red mud is a bottleneck of alumina production. Existing automated control systems for the thickening process stabilize the parameters of the thickener's primary technological circuits. A current direction of research is the creation and improvement of model-based control systems for the thickening process. However, the known models do not fully account for perturbing effects, in particular the particle size distribution of the feed and the size distribution of floccules after aggregation in the feed barrel. The article is devoted to the basic concepts and terms used in formulating the population balance algorithm. The population balance model is implemented in the MatLab environment. The result of the simulation is the particle size distribution after the flocculation process. This model makes it possible to predict the floccule size distribution after aggregation of red mud in the feed barrel. Red mud from Jamaican bauxite served as the industrial sample; a Cytec Industries HX-3000 series flocculant at a concentration of 0.5% was used. Model constants obtained in a tubular tank in the CSIRO (Australia) laboratories were used in the simulation.
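The authors' model is implemented in MatLab and its constants are not reproduced here; the Python sketch below is a generic discrete population balance (Smoluchowski aggregation with a constant kernel and explicit Euler stepping) intended only to illustrate the kind of calculation involved, with invented parameter values.

```python
# Generic discrete population balance (Smoluchowski aggregation, constant
# kernel, explicit Euler stepping). Illustrative only: not the authors' model.
import numpy as np

def aggregate(n0, kernel=1e-3, dt=0.1, steps=200):
    """n0[k] = number concentration of flocs containing (k+1) primary particles."""
    n = np.array(n0, dtype=float)
    size = len(n)
    for _ in range(steps):
        birth = np.zeros(size)
        for i in range(size):
            for j in range(size - i - 1):
                # a floc of (i+1) primaries and one of (j+1) form one of (i+j+2)
                birth[i + j + 1] += 0.5 * kernel * n[i] * n[j]
        death = kernel * n * n.sum()          # loss of flocs by further aggregation
        # Aggregates larger than the largest tracked class are simply dropped.
        n = np.maximum(n + dt * (birth - death), 0.0)
    return n

if __name__ == "__main__":
    initial = [1000.0] + [0.0] * 19           # start from primary particles only
    final = aggregate(initial)
    for k, conc in enumerate(final[:5], start=1):
        print(f"flocs of {k} primaries: {conc:10.2f}")
```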
Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso
2016-01-01
ABSTRACT Despite effective inactivation procedures, small numbers of bacterial cells may still remain in food samples. The risk that bacteria will survive these procedures has not been estimated precisely because deterministic models cannot be used to describe the uncertain behavior of bacterial populations. We used the Poisson distribution as a representative probability distribution to estimate the variability in bacterial numbers during the inactivation process. Strains of four serotypes of Salmonella enterica, three serotypes of enterohemorrhagic Escherichia coli, and one serotype of Listeria monocytogenes were evaluated for survival. We prepared bacterial cell numbers following a Poisson distribution (indicated by the parameter λ, which was equal to 2) and plated the cells in 96-well microplates, which were stored in a desiccated environment at 10% to 20% relative humidity and at 5, 15, and 25°C. The survival or death of the bacterial cells in each well was confirmed by adding tryptic soy broth as an enrichment culture. Changes in the Poisson distribution parameter during the inactivation process, which represent the variability in the numbers of surviving bacteria, were described by nonlinear regression with an exponential function based on a Weibull distribution. We also examined random changes in the number of surviving bacteria using a random number generator and computer simulations to determine whether the number of surviving bacteria followed a Poisson distribution during the bacterial death process by use of the Poisson process. For small initial cell numbers, more than 80% of the simulated distributions (λ = 2 or 10) followed a Poisson distribution. The results demonstrate that variability in the number of surviving bacteria can be described as a Poisson distribution by use of the model developed by use of the Poisson process. IMPORTANCE We developed a model to enable the quantitative assessment of bacterial survivors of inactivation procedures because the presence of even one bacterium can cause foodborne disease. The results demonstrate that the variability in the numbers of surviving bacteria was described as a Poisson distribution by use of the model developed by use of the Poisson process. Description of the number of surviving bacteria as a probability distribution rather than as the point estimates used in a deterministic approach can provide a more realistic estimation of risk. The probability model should be useful for estimating the quantitative risk of bacterial survival during inactivation. PMID:27940547
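A rough numerical illustration of this modeling idea (not the authors' fitted model) is sketched below: initial per-well counts are Poisson with λ = 2, the survival probability decays with a Weibull-type term, and independent thinning keeps the survivor counts Poisson. All parameter values are arbitrary.

```python
# Rough numerical illustration: Poisson(lambda=2) initial cells per well, a
# Weibull-type decay of the survival probability, independent survival per cell.
import numpy as np

def simulate_surviving_counts(n_wells=10_000, lam0=2.0, delta=5.0, p=1.2,
                              t=10.0, seed=1):
    rng = np.random.default_rng(seed)
    initial = rng.poisson(lam0, size=n_wells)             # cells per well at t = 0
    survival_prob = np.exp(-(t / delta) ** p)              # Weibull-type decay
    return rng.binomial(initial, survival_prob)            # survivors per well

if __name__ == "__main__":
    survivors = simulate_surviving_counts()
    # Independent thinning of a Poisson count stays Poisson, so survivor counts
    # should have variance close to their mean.
    print(f"mean survivors per well: {survivors.mean():.3f}")
    print(f"variance (Poisson predicts ~mean): {survivors.var():.3f}")
    print(f"fraction of wells with zero survivors: {(survivors == 0).mean():.3f}")
```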
Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu
2017-02-15
Despite effective inactivation procedures, small numbers of bacterial cells may still remain in food samples. The risk that bacteria will survive these procedures has not been estimated precisely because deterministic models cannot be used to describe the uncertain behavior of bacterial populations. We used the Poisson distribution as a representative probability distribution to estimate the variability in bacterial numbers during the inactivation process. Strains of four serotypes of Salmonella enterica, three serotypes of enterohemorrhagic Escherichia coli, and one serotype of Listeria monocytogenes were evaluated for survival. We prepared bacterial cell numbers following a Poisson distribution (indicated by the parameter λ, which was equal to 2) and plated the cells in 96-well microplates, which were stored in a desiccated environment at 10% to 20% relative humidity and at 5, 15, and 25°C. The survival or death of the bacterial cells in each well was confirmed by adding tryptic soy broth as an enrichment culture. Changes in the Poisson distribution parameter during the inactivation process, which represent the variability in the numbers of surviving bacteria, were described by nonlinear regression with an exponential function based on a Weibull distribution. We also examined random changes in the number of surviving bacteria using a random number generator and computer simulations to determine whether the number of surviving bacteria followed a Poisson distribution during the bacterial death process by use of the Poisson process. For small initial cell numbers, more than 80% of the simulated distributions (λ = 2 or 10) followed a Poisson distribution. The results demonstrate that variability in the number of surviving bacteria can be described as a Poisson distribution by use of the model developed by use of the Poisson process. We developed a model to enable the quantitative assessment of bacterial survivors of inactivation procedures because the presence of even one bacterium can cause foodborne disease. The results demonstrate that the variability in the numbers of surviving bacteria was described as a Poisson distribution by use of the model developed by use of the Poisson process. Description of the number of surviving bacteria as a probability distribution rather than as the point estimates used in a deterministic approach can provide a more realistic estimation of risk. The probability model should be useful for estimating the quantitative risk of bacterial survival during inactivation. Copyright © 2017 Koyama et al.
1994-04-18
because they represent a microkernel and a monolithic kernel approach to MLS operating system issues. TMACH is based on MACH, a distributed operating... the operating system is based on a microkernel design or a monolithic kernel design. This distinction requires some caution since monolithic operating... are provided by user-level processes, in contrast to standard UNIX, which has a large monolithic kernel that pro... Distributed Operating
S. Wang; Z. Zhang; G. Sun; P. Strauss; J. Guo; Y. Tang; A. Yao
2012-01-01
Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available for model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically-based distributed hydrologic model, MIKE SHE, to contrast a lumped...
Light-induced electronic non-equilibrium in plasmonic particles.
Kornbluth, Mordechai; Nitzan, Abraham; Seideman, Tamar
2013-05-07
We consider the transient non-equilibrium electronic distribution that is created in a metal nanoparticle upon plasmon excitation. Following light absorption, the created plasmons decohere within a few femtoseconds, producing uncorrelated electron-hole pairs. The corresponding non-thermal electronic distribution evolves in response to the photo-exciting pulse and to subsequent relaxation processes. First, on the femtosecond timescale, the electronic subsystem relaxes to a Fermi-Dirac distribution characterized by an electronic temperature. Next, within picoseconds, thermalization with the underlying lattice phonons leads to a hot particle in internal equilibrium that subsequently equilibrates with the environment. Here we focus on the early stage of this multistep relaxation process, and on the properties of the ensuing non-equilibrium electronic distribution. We consider the form of this distribution as derived from the balance between the optical absorption and the subsequent relaxation processes, and discuss its implication for (a) heating of illuminated plasmonic particles, (b) the possibility to optically induce current in junctions, and (c) the prospect for experimental observation of such light-driven transport phenomena.
Oh, Jeongsu; Choi, Chi-Hwan; Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo
2016-01-01
High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology-a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA and is freely available at http://clustomcloud.kopri.re.kr.
Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo
2016-01-01
High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology–a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in JAVA and is freely available at http://clustomcloud.kopri.re.kr. PMID:26954507
Power laws, discontinuities and regional city size distributions
Garmestani, A.S.; Allen, Craig R.; Gallagher, C.M.
2008-01-01
Urban systems are manifestations of human adaptation to the natural environment. City size distributions are the expression of hierarchical processes acting upon urban systems. In this paper, we test the entire city size distributions for the southeastern and southwestern United States (1990), as well as the size classes in these regions for power law behavior. We interpret the differences in the size of the regional city size distributions as the manifestation of variable growth dynamics dependent upon city size. Size classes in the city size distributions are snapshots of stable states within urban systems in flux.
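For readers wanting a concrete starting point, the sketch below applies the standard maximum-likelihood exponent estimate alpha = 1 + n / sum(ln(x_i / x_min)) to a fabricated list of city sizes; it is not the statistical procedure used in the paper above.

```python
# Quick sketch of estimating a power-law tail exponent with the standard
# continuous MLE; the city sizes and x_min are fabricated for illustration.
import math

def power_law_alpha(sizes, x_min):
    tail = [s for s in sizes if s >= x_min]
    n = len(tail)
    alpha = 1.0 + n / sum(math.log(s / x_min) for s in tail)
    return alpha, n

if __name__ == "__main__":
    city_sizes = [12_000, 18_500, 25_000, 40_000, 61_000, 95_000,
                  150_000, 260_000, 420_000, 1_100_000]
    alpha, n_tail = power_law_alpha(city_sizes, x_min=10_000)
    print(f"estimated exponent alpha = {alpha:.2f} from {n_tail} cities")
```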
BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.
Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron
2009-06-01
BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license, along with additional documentation and a tutorial, from http://bioinf.nuigalway.ie.
Seafloor environments in Cape Cod Bay, a large coastal embayment
Knebel, H.J.; Rendigs, R. R.; List, J.H.; Signell, R.P.
1996-01-01
Cape Cod Bay is a glacial, semi-enclosed embayment that has a patchy distribution of modern seafloor sedimentary environments of erosion or nondeposition, deposition, and sediment reworking. Sidescan-sonar records and supplemental bathymetric, sedimentary, subbottom, and physical-oceanographic data indicate that the characteristics and distribution of these three categories of bottom environments are controlled by a combination of geologic and oceanographic processes that range from episodic to long-term and from regional to local. (1) Environments of erosion or nondeposition comprise exposures of bedrock, glacial drift, and coarse lag deposits that contain sediments (where present) ranging from boulder fields to gravelly coarse-to-medium sands. These environments are dominant on the shallow margins of the bay (water depths <30 m) where they reflect sediment resuspension, winnowing, and transport during modern northerly storms. (2) Environments of deposition are blanketed by fine-grained sediments ranging from muds to muddy fine sands. These environments are dominant across the floor of the central basin (water depths = 30-60 m) where fine-grained sediments (derived from regional and local sources and emplaced primarily during episodic wind- and density-driven flow) settle through the water column and accumulate under weak bottom currents during nonstorm conditions. (3) Environments of sediment reworking contain patches with diverse textures ranging from gravelly sands to muds. These environments occupy much of the transitional slopes between the margins and the basin floor and reflect a combination of erosion and deposition. The patchy distribution of sedimentary environments within the bay reflects not only regional changes in processes between the margins and the basin but local changes within each part of the bay as well. Small-scale patchiness is caused by local changes in the strengths of wave- and wind-driven currents and (on the margins) by local variations in the supply of fine-grained sediments. This study indicates areas within Cape Cod Bay where fine-grained sediments and associated contaminants are likely to be either moved or deposited. It also provides a guide to the locations and variability of benthic habitats.
Seafloor environments in Cape Cod Bay, a large coastal embayment
Knebel, H.J.; Rendigs, R. R.; List, J.H.; Signell, Richard P.
1996-01-01
Cape Cod Bay is a glacial, semi-enclosed embayment that has a patchy distribution of modern seafloor sedimentary environments of erosion or nondeposition, deposition, and sediment reworking. Sidescan-sonar records and supplemental bathymetric, sedimentary, subbottom, and physical-oceanographic data indicate that the characteristics and distribution of these three categories of bottom environments are controlled by a combination of geologic and oceanographic processes that range from episodic to long-term and from regional to local. (1) Environments of erosion or nondeposition comprise exposures of bedrock, glacial drift, and coarse lag deposits that contain sediments (where present) ranging from boulder fields to gravelly coarse-to-medium sands. These environments are dominant on the shallow margins of the bay (water depths < 30 m) where they reflect sediment resuspension, winnowing, and transport during modern northerly storms. (2) Environments of deposition are blanketed by fine-grained sediments ranging from muds to muddy fine sands. These environments are dominant across the floor of the central basin (water depths = 30–60 m) where fine-grained sediments (derived from regional and local sources and emplaced primarily during episodic wind- and density-driven flow) settle through the water column and accumulate under weak bottom currents during nonstorm conditions. (3) Environments of sediment reworking contain patches with diverse textures ranging from gravelly sands to muds. These environments occupy much of the transitional slopes between the margins and the basin floor and reflect a combination of erosion and deposition. The patchy distribution of sedimentary environments within the bay reflects not only regional changes in processes between the margins and the basin but local changes within each part of the bay as well. Small-scale patchiness is caused by local changes in the strengths of wave- and wind-driven currents and (on the margins) by local variations in the supply of fine-grained sediments. This study indicates areas within Cape Cod Bay where fine-grained sediments and associated contaminants are likely to be either moved or deposited. It also provides a guide to the locations and variability of benthic habitats.
Mission Assurance Analysis Protocol (MAAP): Assessing Risk in Complex Environments
2005-09-01
Table of contents excerpts: 1.7 Focus on Risk; 2 Defining Risk; 4.4 Extrinsic and Intrinsic Risk; 5 Operational Risk in Distributed Processes (CMU/SEI-2005-TN-032). In Section 5, "Operational Risk in Distributed Processes," we look at the characteristics of operational risk in processes where management control is...
Expressing Parallelism with ROOT
NASA Astrophysics Data System (ADS)
Piparo, D.; Tejedor, E.; Guiraud, E.; Ganis, G.; Mato, P.; Moneta, L.; Valls Pla, X.; Canal, P.
2017-10-01
The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.
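As a point of comparison for the MultiProc framework, the abstract mentions Python's multiprocessing module; the short sketch below shows that style of map-based multi-processing on a stand-in histogramming workload (it does not use ROOT itself).

```python
# Map-style multi-processing with Python's standard multiprocessing module;
# the per-chunk histogramming task is only a stand-in workload.
import numpy as np
from multiprocessing import Pool

def partial_histogram(seed):
    """Fill a histogram from one chunk of (simulated) event data."""
    values = np.random.default_rng(seed).normal(loc=91.2, scale=2.5, size=100_000)
    counts, _ = np.histogram(values, bins=50, range=(80.0, 100.0))
    return counts

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        partials = pool.map(partial_histogram, range(8))   # 8 chunks, 4 workers
    total = np.sum(partials, axis=0)                        # merge partial results
    print("total entries:", int(total.sum()))
```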
Expressing Parallelism with ROOT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piparo, D.; Tejedor, E.; Guiraud, E.
The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.
Distributed collaborative decision support environments for predictive awareness
NASA Astrophysics Data System (ADS)
McQuay, William K.; Stilman, Boris; Yakhnis, Vlad
2005-05-01
The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, rapidly assess the enemy's course of action (eCOA) or possible actions and promulgate their own course of action (COA) - a need for predictive awareness. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Revolutionary new approaches to strategy generation and assessment such as Linguistic Geometry (LG) permit the rapid development of COA vs. enemy COA (eCOA). LG tools automatically generate and permit the operators to take advantage of winning strategies and tactics for mission planning and execution in near real-time. LG is predictive and employs deep "look-ahead" from the current state and provides a realistic, reactive model of adversary reasoning and behavior. Collaborative environments provide the framework and integrate models, simulations, and domain specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing research efforts in applying distributed collaborative environments to decision support for predictive mission awareness.
Organization of the secure distributed computing based on multi-agent system
NASA Astrophysics Data System (ADS)
Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera
2018-04-01
Nowadays, developing methods for distributed computing receives much attention. One approach to distributed computing is the use of multi-agent systems. Distributed computing organized on conventional networked computers can be exposed to security threats posed by computational processes. The authors have developed a unified agent algorithm for controlling the operation of computing network nodes. Networked PCs are used as computing nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve a large task by creating a distributed computing system. Agents on a computer network can configure the distributed computing system, distribute the computational load among the agent-operated computers, and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers in the system can be increased by connecting new computers to the network, which increases the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computing. This organization of the distributed computing system reduces the problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (dynamic change in the number of computers on the network). The developed multi-agent system detects cases of falsification of the results of the distributed system, which could otherwise lead to wrong decisions; in addition, the system checks and corrects wrong results.
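The paper does not give the agent algorithm in detail; the toy Python sketch below only illustrates two of the ideas mentioned, proportional load sharing by node computing power and duplicate assignment of a fraction of tasks for result cross-checking, with invented node names and numbers.

```python
# Toy illustration of proportional load sharing and result cross-checking in a
# small computing network; the algorithm and all values are invented.
import random

def share_tasks(n_tasks, node_power):
    """Split task indices among nodes proportionally to their relative power."""
    total = sum(node_power.values())
    shares, assigned, nodes = {}, 0, list(node_power)
    for node in nodes[:-1]:
        k = round(n_tasks * node_power[node] / total)
        shares[node] = list(range(assigned, assigned + k))
        assigned += k
    shares[nodes[-1]] = list(range(assigned, n_tasks))      # remainder to last node
    return shares

def cross_check(shares, fraction=0.2, seed=0):
    """Pick a fraction of each node's tasks to be recomputed on another node."""
    rng = random.Random(seed)
    nodes = list(shares)
    duplicates = []
    for node in nodes:
        for task in shares[node]:
            if rng.random() < fraction:
                other = rng.choice([n for n in nodes if n != node])
                duplicates.append((task, node, other))
    return duplicates

if __name__ == "__main__":
    shares = share_tasks(20, {"pc-1": 1.0, "pc-2": 2.0, "pc-3": 1.0})
    print({node: len(tasks) for node, tasks in shares.items()})
    print("re-checked on a second node:", cross_check(shares)[:3])
```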
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Improving the Aircraft Design Process Using Web-based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.
2003-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos... Pennsylvania Ave., NW., Washington, DC 20460, ATTENTION: Asbestos Exemption. For information regarding the...
ENDOCRINE DISRUPTORS IN THE ENVIRONMENT
The endocrine system produces hormones which are powerful natural chemicals that regulate important life processes. Endocrine disruptors are human-made chemicals distributed globally which have the potential to interfere with the endocrine system and produce serious biological e...
A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator
NASA Technical Reports Server (NTRS)
Reed, John A.; Afjeh, Abdollah A.
1997-01-01
This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space- averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.
Extremophiles in Household Water Heaters
NASA Astrophysics Data System (ADS)
Wilpiszeski, R.; House, C. H.
2016-12-01
A significant fraction of Earth's microbial diversity comes from species living in extreme environments, but natural extreme environments can be difficult to access. Manmade systems like household water heaters serve as an effective proxy for thermophilic environments that are otherwise difficult to sample directly. As such, we are investigating the biogeography, taxonomic distribution, and evolution of thermophiles growing in domestic water heaters. Citizen scientists collected hot tap water culture- and filter- samples from 101 homes across the United States. We recovered a single species of thermophilic heterotroph from culture samples inoculated from water heaters across the United States, Thermus scotoductus. Whole-genome sequencing was conducted to better understand the distribution and evolution of this single species. We have also sequenced hyper-variable regions of the 16S rRNA gene from whole-community filter samples to identify the broad diversity and distribution of microbial cells captured from each water heater. These results shed light on the processes that shape thermophilic populations and genomes at a spatial resolution that is difficult to access in naturally occurring extreme ecosystems.
Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges
NASA Astrophysics Data System (ADS)
Maruping, Likoebe M.
Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.
Distributed automatic control of technological processes in conditions of weightlessness
NASA Technical Reports Server (NTRS)
Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.
1986-01-01
Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.
Vascular system modeling in parallel environment - distributed and shared memory approaches
Jurczuk, Krzysztof; Kretowski, Marek; Bezy-Wendling, Johanne
2011-01-01
The paper presents two approaches to parallel modeling of vascular system development in internal organs. In the first approach, new parts of tissue are distributed among processors and each processor is responsible for perfusing its assigned parts of tissue to all vascular trees. Communication between processors is accomplished by passing messages and therefore this algorithm is perfectly suited for distributed memory architectures. The second approach is designed for shared memory machines. It parallelizes the perfusion process during which individual processing units perform calculations concerning different vascular trees. The experiments, performed on a computing cluster and multi-core machines, show that both algorithms provide a significant speedup. PMID:21550891
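The sketch below schematically contrasts the two decompositions described above using placeholder work: partitioning tissue parts across processes (message-passing style) versus assigning one vascular tree per thread (shared-memory style). It is not the authors' algorithm, and all names are illustrative.

```python
# Placeholder-work contrast of the two decompositions: approach 1 partitions
# new tissue parts across processes, approach 2 assigns one vascular tree per
# thread. Names and the "work" are illustrative only.
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

TREES = ["arterial", "venous"]
NEW_PARTS = [f"part-{i}" for i in range(8)]

def perfuse_parts(chunk):
    """Approach 1 work unit: one worker connects its tissue parts to every tree."""
    return [(part, tree) for part in chunk for tree in TREES]

def perfuse_tree(tree):
    """Approach 2 work unit: one thread handles a single vascular tree."""
    return [(part, tree) for part in NEW_PARTS]

if __name__ == "__main__":
    # Approach 1: distribute tissue parts among processes.
    chunks = [NEW_PARTS[i::4] for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        connections1 = [c for result in pool.map(perfuse_parts, chunks) for c in result]

    # Approach 2: parallelize over vascular trees with threads.
    with ThreadPoolExecutor(max_workers=len(TREES)) as pool:
        connections2 = [c for result in pool.map(perfuse_tree, TREES) for c in result]

    print(len(connections1), len(connections2))   # both: 8 parts x 2 trees = 16
```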
An experimental paradigm for team decision processes
NASA Technical Reports Server (NTRS)
Serfaty, D.; Kleinman, D. L.
1986-01-01
The study of distributed information processing and decision making is presently hampered by two factors: (1) the inherent complexity of the mathematical formulation of decentralized problems has prevented the development of models that could be used to predict performance in a distributed environment; and (2) the lack of comprehensive scientific empirical data on human team decision making has hindered the development of significant descriptive models. As part of a comprehensive effort to find a new framework for multihuman decision making problems, a novel experimental research paradigm was developed involving human teams in decision making tasks. Attempts to construct parts of an integrated model with ideas from queueing networks, team theory, distributed estimation and decentralized resource management are described.
1983-11-01
transmission, FM(R) will only have to hold one message. 3. Program Control Block (PCB): The PCB [Deitel 82] will be maintained by the Executive in... (figure list: Use of Kernel to Process Interrupts; Layered Operating System Design; Program Control Block Table; Ready List Data Structure) ...examples of fully distributed systems in operation. An objective of the NPS research program for SPLICE is to advance our knowledge of distributed
DAI-CLIPS: Distributed, Asynchronous, Interacting CLIPS
NASA Technical Reports Server (NTRS)
Gagne, Denis; Garant, Alain
1994-01-01
DAI-CLIPS is a distributed computational environment within which each CLIPS is an active independent computational entity with the ability to communicate freely with other CLIPS. Furthermore, new CLIPS can be created, others can be deleted or modify their expertise, all dynamically in an asynchronous and independent fashion during execution. The participating CLIPS are distributed over a network of heterogeneous processors taking full advantage of the available processing power. We present the general framework encompassing DAI-CLIPS and discuss some of its advantages and potential applications.
Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco; Ercolini, Danilo
2016-07-01
Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type of retail facility (LD or SD) did not apparently affect the contamination. The study provides an in-depth description of the microbiota of meat and meat processing environments. It highlights the importance of the environment as a contamination source of spoilage bacteria, and it shows that the size of the retail facility does not affect the level and type of contamination. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Vedder, Aline; Smigielski, Lukasz; Gutyrchik, Evgeny; Bao, Yan; Blautzik, Janusch; Pöppel, Ernst; Zaytseva, Yuliya; Russell, Edmund
2015-01-01
This study capitalizes on individual episodic memories to investigate how different environments affect us on a neural level. Instead of using predefined environmental stimuli, this study relied on individual representations of beauty and pleasure. Drawing upon episodic memories, we conducted two experiments. Healthy subjects imagined pleasant and non-pleasant environments, as well as beautiful and non-beautiful environments, while neural activity was measured by using functional Magnetic Resonance Imaging. Although subjects found the different conditions equally simple to visualize, our results revealed more distributed brain activations for non-pleasant and non-beautiful environments than for pleasant and beautiful environments. The additional regions activated in non-pleasant (left lateral prefrontal cortex) and non-beautiful environments (supplementary motor area, anterior cortical midline structures) are involved in self-regulation and top-down cognitive control. Taken together, the results show that perceptual experiences and emotional evaluations of environments within a positive and a negative frame of reference are based on distinct patterns of neural activity. We interpret the data in terms of a different cognitive and processing load placed by exposure to different environments. The results hint at the efficiency of subject-generated representations as stimulus material.
Distributed Systems Technology Survey.
1987-03-01
and protocols. 2. Hardware Technology. Economic factors were a major reason for the proliferation of distributed systems. Processors, memory, and magnetic and optical ... destined messages and perform the appropriate forwarding. There is ... agreement that a lightweight process mechanism is essential to support commonly used ... Xerox PARC environment [31]. Shared file servers, discussed below, are essential to the success of such a scheme. 11. Security. A distributed
Fault Detection of Rotating Machinery using the Spectral Distribution Function
NASA Technical Reports Server (NTRS)
Davis, Sanford S.
1997-01-01
The spectral distribution function is introduced to characterize the process leading to faults in rotating machinery. It is shown to be a more robust indicator than conventional power spectral density estimates, but requires only slightly more computational effort. The method is illustrated with examples from seeded gearbox transmission faults and an analytical model of a defective bearing. Procedures are suggested for implementation in realistic environments.
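A simple way to see the idea is to accumulate a power spectral density estimate into a normalized spectral distribution function and compare a baseline signal against a test signal by the largest deviation between their curves. The sketch below is only an illustration of that general construction, not the paper's implementation; the Welch estimator, segment length, and comparison metric are assumptions.

    import numpy as np
    from scipy.signal import welch

    def spectral_distribution(x, fs):
        # Estimate the PSD, then accumulate it into a spectral distribution
        # function normalized to run from 0 to 1 over frequency.
        f, psd = welch(x, fs=fs, nperseg=1024)
        sdf = np.cumsum(psd)
        return f, sdf / sdf[-1]

    def fault_indicator(baseline, test, fs):
        # Largest vertical gap between the two distribution curves;
        # a growing gap suggests a change in the underlying process.
        _, s0 = spectral_distribution(baseline, fs)
        _, s1 = spectral_distribution(test, fs)
        return np.max(np.abs(s1 - s0))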
Bringing your tools to CyVerse Discovery Environment using Docker
Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric
2016-01-01
Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps them share their apps with collaborators and release them for public use. PMID:27803802
Design alternatives for process group membership and multicast
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Cooper, Robert; Gleeson, Barry
1991-01-01
Process groups are a natural tool for distributed programming, and are increasingly important in distributed computing environments. However, there is little agreement on the most appropriate semantics for process group membership and group communication. These issues are of special importance in the Isis system, a toolkit for distributed programming. Isis supports several styles of process group, and a collection of group communication protocols spanning a range of atomicity and ordering properties. This flexibility makes Isis adaptable to a variety of applications, but is also a source of complexity that limits performance. This paper reports on a new architecture that arose from an effort to simplify Isis process group semantics. Our findings include a refined notion of how the clients of a group should be treated, what the properties of a multicast primitive should be when systems contain large numbers of overlapping groups, and a new construct called the causality domain. As an illustration, we apply the architecture to the problem of converting processes into fault-tolerant process groups in a manner that is 'transparent' to other processes in the system.
Mechanistic modeling of pesticide exposure: The missing keystone of honey bee toxicology.
Sponsler, Douglas B; Johnson, Reed M
2017-04-01
The role of pesticides in recent honey bee losses is controversial, partly because field studies often fail to detect effects predicted by laboratory studies. This dissonance highlights a critical gap in the field of honey bee toxicology: there exists little mechanistic understanding of the patterns and processes of exposure that link honey bees to pesticides in their environment. The authors submit that 2 key processes underlie honey bee pesticide exposure: 1) the acquisition of pesticide by foraging bees, and 2) the in-hive distribution of pesticide returned by foragers. The acquisition of pesticide by foraging bees must be understood as the spatiotemporal intersection between environmental contamination and honey bee foraging activity. This implies that exposure is distributional, not discrete, and that a subset of foragers may acquire harmful doses of pesticide while the mean colony exposure would appear safe. The in-hive distribution of pesticide is a complex process driven principally by food transfer interactions between colony members, and this process differs importantly between pollen and nectar. High priority should be placed on applying the extensive literature on honey bee biology to the development of more rigorously mechanistic models of honey bee pesticide exposure. In combination with mechanistic effects modeling, mechanistic exposure modeling has the potential to integrate the field of honey bee toxicology, advancing both risk assessment and basic research. Environ Toxicol Chem 2017;36:871-881. © 2016 SETAC.
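The distributional (rather than discrete) nature of forager exposure can be illustrated with a toy Monte Carlo sketch: individual doses are drawn from a skewed distribution whose mean looks safe while a tail of foragers still exceeds a harmful threshold. The distribution, parameter values, and threshold below are hypothetical illustrations, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical per-forager doses (ng/bee) drawn from a right-skewed distribution.
    doses = rng.lognormal(mean=np.log(2.0), sigma=1.2, size=10_000)
    harmful = 25.0  # hypothetical harmful dose

    print(f"colony-mean dose: {doses.mean():.2f}")                      # looks modest
    print(f"foragers over threshold: {(doses > harmful).mean():.1%}")   # but the tail matters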
ERIC Educational Resources Information Center
Prince, Annabel N.; Pitts, Wesley B.; Parkin, David W.
2018-01-01
In this exploratory case study, we consider how students in an undergraduate biochemistry class engaged in the process of argumentation within an inquiry-oriented learning environment to investigate a chemical mechanism in a particular part of the tricarboxylic acid cycle. Audio/video recordings of student groups during the mechanism discussion…
Simulating the decentralized processes of the human immune system in a virtual anatomy model.
Sarpe, Vladimir; Jacob, Christian
2013-01-01
Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments namely, (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, that can communicate across multiple, local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.
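As a rough analogue of the two decentralized compartments described above (not the LINDSAY implementation), the toy sketch below steps a tissue site and a lymph-node site independently and exchanges exported agents through an abstract channel once per tick; all agent types and rates are invented for illustration.

    import random

    class Compartment:
        def __init__(self, name, agents):
            self.name, self.agents, self.outbox = name, list(agents), []

        def step(self):
            # Local interactions would go here; occasionally export an agent
            # to the other site via the abstract communication channel.
            if self.agents and random.random() < 0.1:
                self.outbox.append(self.agents.pop())

    tissue = Compartment("tissue", ["virus"] * 5 + ["t_cell"] * 3)
    lymph = Compartment("lymph_node", ["t_cell"] * 10)
    for _ in range(100):
        tissue.step()
        lymph.step()
        # Exchange exported agents between the two decentralized sites.
        tissue.agents += lymph.outbox
        lymph.agents += tissue.outbox
        tissue.outbox, lymph.outbox = [], []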
Fiacco, P. A.; Rice, W. H.
1991-01-01
Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732
NASA Astrophysics Data System (ADS)
Schwichtenberg, G.; Hildebrandt, A.; Samaniego-Eguiguren, L.; Kreutziger, Y.; Attinger, S.
2009-04-01
The spatio-temporal distribution of soil moisture in the unsaturated zone influences vegetation growth and governs the runoff generation processes as well as the energy balance at the interface between the biosphere and the atmosphere by influencing evapotranspiration. A better understanding of the spatio-temporal variability of soil moisture and its dependence on the living versus the abiotic environment would lead to an improved representation of soil-vegetation-atmosphere processes in hydrological and climate models. The Jena Experiment site (Germany) was established in October 2001 in order to analyse the interaction between plant diversity and ecosystem processes. The main experiment covers 92 plots of 20 x 20 m arranged into a grid, on which a mixture of up to 60 grassland species and of one to four plant functional groups has been seeded. Each of these plots is equipped with at least one measurement tube for soil moisture. Measurements have been conducted weekly for four growing seasons (SSF). Here, we use geostatistical methods, like variograms and multivariate regressions, to investigate to what extent the abiotic environment and the ecosystem explain the spatial and temporal variation of soil moisture at the Jena Experiment site. We test the influence of the soil environment, biodiversity, leaf area index and groundwater table. The poster will present the results of this analysis.
Motion/imagery secure cloud enterprise architecture analysis
NASA Astrophysics Data System (ADS)
DeLay, John L.
2012-06-01
Cloud computing with storage virtualization and new service-oriented architectures brings a new perspective to the distributed motion imagery and persistent surveillance enterprise. Our existing research focuses mainly on content management, distributed analytics, and WAN distributed cloud networking performance issues of cloud-based technologies. The potential of leveraging cloud-based technologies for hosting motion imagery, imagery and analytics workflows for DOD and security applications is relatively unexplored. This paper will examine technologies for managing, storing, processing and disseminating motion imagery and imagery within a distributed network environment. Finally, we propose areas for future research in the area of distributed cloud content management enterprises.
GALLIEN, Laure; MAZEL, Florent; LAVERGNE, Sébastien; RENAUD, Julien; DOUZET, Rolland; THUILLER, Wilfried
2015-01-01
Despite considerable efforts devoted to investigate the community assembly processes driving plant invasions, few general conclusions have been drawn so far. Three main processes, generally acting as successive filters, are thought to be of prime importance. The invader has to disperse (1st filter) into a suitable environment (2nd filter) and succeed in establishing in recipient communities through competitive interactions (3rd filter) using two strategies: competition avoidance by the use of different resources (resource opportunity), or competitive exclusion of native species. Surprisingly, despite the general consensus on the importance of investigating these three processes and their interplay, they are usually studied independently. Here we aim to analyse these three filters together, by including them all: abiotic environment, dispersal and biotic interactions, into models of invasive species distributions. We first propose a suite of indices (based on species functional dissimilarities) supposed to reflect the two competitive strategies (resource opportunity and competition exclusion). Then, we use a set of generalised linear models to explain the distribution of seven herbaceous invaders in natural communities (using a large vegetation database for the French Alps containing 5,000 community-plots). Finally, we measure the relative importance of competitive interaction indices, identify the type of coexistence mechanism involved and how this varies along environmental gradients. Adding competition indices significantly improved model’s performance, but neither resource opportunity nor competitive exclusion were common strategies among the seven species. Overall, we show that combining environmental, dispersal and biotic information to model invasions has excellent potential for improving our understanding of invader success. PMID:26290653
Puerta, Patricia; Hunsicker, Mary E.; Quetglas, Antoni; Álvarez-Berastegui, Diego; Esteban, Antonio; González, María; Hidalgo, Manuel
2015-01-01
Populations of the same species can experience different responses to the environment throughout their distributional range as a result of spatial and temporal heterogeneity in habitat conditions. This highlights the importance of understanding the processes governing species distribution at local scales. However, research on species distribution often averages environmental covariates across large geographic areas, missing variability in population-environment interactions within geographically distinct regions. We used spatially explicit models to identify interactions between species and environmental conditions, including chlorophyll a (Chla) and sea surface temperature (SST), as well as trophic conditions (prey density), along with processes governing the distribution of two cephalopods with contrasting life histories (octopus and squid) across the western Mediterranean Sea. This approach is relevant for cephalopods, since their population dynamics are especially sensitive to variations in habitat conditions and rarely stable in abundance and location. The regional distributions of the two cephalopod species matched two different trophic pathways present in the western Mediterranean Sea, associated with the Gulf of Lion upwelling and the Ebro river discharges, respectively. The effects of the studied environmental and trophic conditions were spatially variant in both species, with usually stronger effects along their distributional boundaries. We identify areas where prey availability limited the abundance of cephalopod populations, as well as contrasting effects of temperature in the warmest regions. Despite distributional patterns matching productive areas, a general negative effect of Chla on cephalopod densities suggests that competition pressure is common in the study area. Additionally, the results highlight the importance of trophic interactions, beyond other common environmental factors, in shaping the distribution of cephalopod populations. Our study presents a valuable approach for understanding the spatially variant ecology of cephalopod populations, which is important for fisheries and ecosystem management. PMID:26201075
Securing Provenance of Distributed Processes in an Untrusted Environment
NASA Astrophysics Data System (ADS)
Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi
Recently, there is much concern about the provenance of distributed processes, that is about the documentation of the origin and the processes to produce an object in a distributed system. The provenance has many applications in the forms of medical records, documentation of processes in the computer systems, recording the origin of data in the cloud, and also documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity, and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of the correct nodes, additions of fake nodes and edges, and unauthorized accesses to the sensitive nodes and edges. In this paper, we propose an integrity mechanism for provenance graph using the digital signature involving three parties: the process executors who are responsible in the nodes' creation, a provenance owner that records the nodes to the provenance store, and a trusted party that we call the Trusted Counter Server (TCS) that records the number of nodes stored by the provenance owner. We show that the mechanism can detect the integrity problem in the provenance graph, namely unauthorized and malicious “authorized” updates even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS only needs a very minimal storage (linear with the number of the provenance owners). To protect the confidentiality and for an efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect the provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism, and perform experiments to measure the performance of both mechanisms.
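A minimal sketch of the counting idea, under the assumption that executors sign the nodes they create and the trusted counter only stores one integer per provenance owner (HMAC stands in for a real digital signature here; all names are hypothetical):

    import hashlib
    import hmac

    def sign(key, payload):
        # Stand-in for a digital signature scheme; keeps the sketch self-contained.
        return hmac.new(key, payload, hashlib.sha256).hexdigest()

    class TrustedCounterServer:
        # Storage is linear in the number of provenance owners, as the paper claims.
        def __init__(self):
            self.counts = {}

        def record(self, owner):
            self.counts[owner] = self.counts.get(owner, 0) + 1
            return self.counts[owner]

    executor_key = b"executor-secret"                 # hypothetical key material
    node = b"process: align_reads -> bam_v1"          # hypothetical provenance node
    signature = sign(executor_key, node)              # executor vouches for its node
    tcs = TrustedCounterServer()
    n = tcs.record("owner-A")                         # owner registers the stored node
    # An auditor later checks that the store holds exactly n validly signed nodes for
    # owner-A, so silent deletions or additions by colluding parties become detectable.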
DAVE: A plug and play model for distributed multimedia application development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mines, R.F.; Friesen, J.A.; Yang, C.L.
1994-07-01
This paper presents a model being used for the development of distributed multimedia applications. The Distributed Audio Video Environment (DAVE) was designed to support the development of a wide range of distributed applications. The implementation of this model is described. DAVE is unique in that it combines a simple "plug and play" programming interface, supports both centralized and fully distributed applications, provides device and media extensibility, promotes object reusability, and supports interoperability and network independence. This model enables application developers to easily develop distributed multimedia applications and create reusable multimedia toolkits. DAVE was designed for developing applications such as video conferencing, media archival, remote process control, and distance learning.
40 CFR 763.179 - Confidential business information claims.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.179 Confidential... asbestos on human health and the environment? If your answer is yes, explain. ...
Replication in Mobile Environments
2007-12-01
Data Replication Over Disadvantaged ... Communication, Information Processing, and Ergonomics (KIE). What is the problem? Data replication among distributed databases occurring over disadvantaged
Patient Data Synchronization Process in a Continuity of Care Environment
Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice
2005-01-01
In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and the tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, of local network clients, of workstations running user interfaces, and of data exchange and synchronization tools. PMID:16779049
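The abstract does not give the synchronization schema, but as a purely hypothetical sketch of the kind of portable XML envelope such a model could exchange (only items modified since the peer's last synchronization point are shipped), one might write:

    import xml.etree.ElementTree as ET

    # Hypothetical sync envelope; element and attribute names are invented.
    sync = ET.Element("sync", attrib={"patient": "12345", "since": "2005-01-10T08:00:00"})
    item = ET.SubElement(sync, "item", attrib={"id": "obs-77", "modified": "2005-01-12T09:30:00"})
    item.text = "systolic=120"
    print(ET.tostring(sync, encoding="unicode"))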
NASA Technical Reports Server (NTRS)
Hinke, Thomas H.
2004-01-01
Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and then can utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises that are under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose and, if necessary, substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities. In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid services discovered using semantic grid technology. As required, high-end computational resources could be drawn from available grid resource pools. Using grid technology, this confluence of data, services and computational resources could easily be harnessed to transform data from many different sources into a desired product that is delivered to a user's workstation or to a web portal through which it could be accessed by its intended audience.
Extending the granularity of representation and control for the MIL-STD CAIS 1.0 node model
NASA Technical Reports Server (NTRS)
Rogers, Kathy L.
1986-01-01
The Common APSE (Ada Program Support Environment) Interface Set (CAIS) (DoD85) node model provides an excellent baseline for interfaces in a single-host development environment. To encompass the entire spectrum of computing, however, the CAIS model should be extended in four areas. It should provide the interface between the engineering workstation and the host system throughout the entire lifecycle of the system. It should provide a basis for communication and integration functions needed by distributed host environments. It should provide common interfaces for communications mechanisms to and among target processors. It should provide facilities for integration, validation, and verification of test beds extending to distributed systems on geographically separate processors with heterogeneous instruction set architectures (ISAs). Additions to the PROCESS NODE model to extend the CAIS into these four areas are proposed.
Al-Hwaiti, M. S.; Zielinski, R.A.; Bundham, J.R.; Ranville, J.F.; Ross, P.E.
2010-01-01
Phosphogypsum (PG) is a by-product of the chemical reaction called the "wet process", whereby sulphuric acid reacts with phosphate rock (PR) to produce phosphoric acid, needed for fertilizer production. Through the wet process, some impurities naturally present in the PR become incorporated in PG. Among these, U decay-series radionuclides are the main concern, since they could affect the surrounding environment and prevent the safe utilization of PG. In order to determine the distribution and bioavailability of radionuclides to the surrounding environment, we used sequential leaching of PG samples from the Aqaba and Eshidiya fertilizer industry. The results showed that the percentages of 226Ra and 210Pb in PG exceed those in the corresponding phosphate rocks (PG/PR), where 85% of the 226Ra and 85% of the 210Pb fractionate to PG. The sequential extraction results showed that most of the 226Ra and 210Pb is bound in the residual phase (non-CaSO4) fraction, ranging from 45%-65% and 55%-75%, respectively, whereas only 10%-15% and 10%-20%, respectively, of these radionuclides are distributed in the most labile fraction. The results obtained from this study showed that the radionuclides are not incorporated in the gypsum itself and may not pose a threat to the surrounding environment. © 2010 Science Press, Institute of Geochemistry, CAS and Springer Berlin Heidelberg.
NASA Astrophysics Data System (ADS)
Yamamoto, K.; Murata, K.; Kimura, E.; Honda, R.
2006-12-01
In the Solar-Terrestrial Physics (STP) field, the amount of satellite observation data has been increasing every year. It is necessary to solve the following three problems to achieve large-scale statistical analyses of such data. (i) More CPU power and larger memory and disk sizes are required. However, the total power of personal computers is not enough to analyze such an amount of data. Super-computers provide high-performance CPUs and rich memory, but they are usually separated from the Internet or connected only for the purpose of programming or data file transfer. (ii) Most of the observation data files are managed at distributed data sites over the Internet. Users have to know where the data files are located. (iii) Since no common data format in the STP field is available now, users have to prepare a reading program for each data set by themselves. To overcome problems (i) and (ii), we constructed a parallel and distributed data analysis environment based on the Gfarm reference implementation of the Grid Datafarm architecture. The Gfarm shares computational resources and performs parallel distributed processing. In addition, the Gfarm provides the Gfarm filesystem, which can be used as a virtual directory tree among nodes. The Gfarm environment is composed of three parts: a metadata server to manage distributed file information, filesystem nodes to provide computational resources, and a client to submit jobs to the metadata server and manage data processing schedules. In the present study, both data files and data processes are parallelized on the Gfarm with 6 filesystem nodes: the CPU clock frequency of each node is Pentium V 1GHz, with 256MB memory and 40GB disk. To evaluate the performance of the present Gfarm system, we scanned many data files, each about 300MB in size, with three processing methods: sequential processing in one node, sequential processing by each node, and parallel processing by each node. As a result, in a comparison between the number of files and the elapsed time, parallel and distributed processing shortened the elapsed time to 1/5 of that of sequential processing. On the other hand, sequential processing times were shorter in another experiment, whose file size was smaller than 100KB. In this case, the elapsed time to scan one file is within one second. This implies that disk swapping took place in the case of parallel processing by each node. We note that the operation became unstable when the number of files exceeded 1000. To overcome problem (iii), we developed an original data class. This class supports reading data files with various data formats by converting them into an original data format: it defines schemata for every type of data and encapsulates the structure of data files. In addition, since this class provides a function of time re-sampling, users can easily convert multiple data arrays with different time resolutions into arrays with the same time resolution. Finally, using the Gfarm, we achieved a high-performance environment for large-scale statistical data analyses. It should be noted that the present method is effective only when each data file is large enough. At present, we are restructuring the new Gfarm environment with 8 nodes: the CPU is an Athlon 64 X2 Dual Core 2GHz, with 2GB memory and a 1.2TB disk (using RAID0) for each node. Our original class is to be implemented on the new Gfarm environment.
In the present talk, we show the latest results of applying the present system to data analyses with a huge number of satellite observation data files.
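The effect reported above (parallel scanning pays off only when individual files are large) can be mimicked on a single machine with a worker pool; this is a shared-memory analogue for illustration only, not Gfarm itself, and the file layout is hypothetical.

    import glob
    from multiprocessing import Pool

    def scan(path):
        # Hypothetical per-file analysis: count records in one observation file.
        with open(path, "rb") as f:
            return sum(1 for _ in f)

    if __name__ == "__main__":
        files = glob.glob("data/*.dat")       # hypothetical layout of observation files
        with Pool(processes=6) as pool:       # one worker per node in the paper's 6-node setup
            counts = pool.map(scan, files)
        print(sum(counts))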
Karpievitch, Yuliya V; Almeida, Jonas S
2006-03-15
Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.
Surveys of the earth's resources and environment by satellites
NASA Technical Reports Server (NTRS)
Nordberg, W.; Tiedemann, H.; Bohn, C.
1975-01-01
The potential and promise of observing the earth from the vantage point of space is discussed. The systematic surveying of processes and phenomena occurring on the surface of the earth by Landsat 1 and Nimbus 5 is considered to be useful in the following areas: assessment of water resources; mineral and petroleum exploration; land use planning; crop, forest, and rangeland inventory; assessment of flood, earthquake, and other environmental hazards; monitoring coastal processes; environmental effects of industrial effluents and of air pollution; mapping the distribution and types of ice covering the earth's polar caps and global soil moisture distributions.
NASA Astrophysics Data System (ADS)
Romanosky, Robert R.
2017-05-01
The National Energy Technology Laboratory (NETL) under the Department of Energy (DOE) Fossil Energy (FE) Program is leading the effort not only to develop near-zero-emission power generation systems, but also to increase the efficiency and availability of current power systems. The overarching goal of the program is to provide clean, affordable power using domestic resources. Highly efficient, low-emission power systems can have extreme conditions of high temperatures up to 1600 °C, high pressures up to 600 psi, high particulate loadings, and corrosive atmospheres that require monitoring. Sensing in these harsh environments can provide key information that directly impacts process control and system reliability. The lack of suitable measurement technology serves as a driver for the innovations in harsh-environment sensor development. Advancements in sensing using optical fibers are key efforts within NETL's sensor development program, as these approaches offer the potential to survive and provide critical information about these processes. An overview of the sensor development supported by the National Energy Technology Laboratory (NETL) will be given, including research in the areas of sensor materials, designs, and measurement types. New approaches to intelligent sensing, sensor placement and process control using networked sensors will be discussed, as will novel approaches to fiber device design concurrent with materials development, and research and development in modified and coated silica and sapphire fiber-based sensors. The use of these sensors for both single-point and distributed measurements of temperature, pressure, strain, and a select suite of gases will be addressed. Additional areas of research include novel control architecture and communication frameworks, device integration for distributed sensing, and imaging and other novel approaches to monitoring and controlling advanced processes. The close coupling of the sensor program with process modeling and control will be discussed for the overarching goal of clean power production.
Ling, Yu-Chen; Bush, Richard; Grice, Kliti; Tulipani, Svenja; Berwick, Lyndon; Moreau, John W
2015-01-01
Coastal acid sulfate soils (CASS) constitute a serious and global environmental problem. Oxidation of iron sulfide minerals exposed to air generates sulfuric acid with consequently negative impacts on coastal and estuarine ecosystems. Tidal inundation represents one current treatment strategy for CASS, with the aim of neutralizing acidity by triggering microbial iron- and sulfate-reduction and inducing the precipitation of iron-sulfides. Although well-known functional guilds of bacteria drive these processes, their distributions within CASS environments, as well as their relationships to tidal cycling and the availability of nutrients and electron acceptors, are poorly understood. These factors will determine the long-term efficacy of "passive" CASS remediation strategies. Here we studied microbial community structure and functional guild distribution in sediment cores obtained from 10 depths ranging from 0 to 20 cm in three sites located in the supra-, inter- and sub-tidal segments, respectively, of a CASS-affected salt marsh (East Trinity, Cairns, Australia). Whole community 16S rRNA gene diversity within each site was assessed by 454 pyrotag sequencing and bioinformatic analyses in the context of local hydrological, geochemical, and lithological factors. The results illustrate spatial overlap, or close association, of iron-, and sulfate-reducing bacteria (SRB) in an environment rich in organic matter and controlled by parameters such as acidity, redox potential, degree of water saturation, and mineralization. The observed spatial distribution implies the need for empirical understanding of the timing, relative to tidal cycling, of various terminal electron-accepting processes that control acid generation and biogeochemical iron and sulfur cycling.
Function Allocation in a Robust Distributed Real-Time Environment
1991-12-01
A fundamental characteristic of a distributed system is its ability to map individual logical functions of an application program onto many physical nodes ... how much of a node's processor time is scheduled for function processing. IMC is the function-to-function communication required to facilitate ... indicator of how much excess processor time a node has. The reconfiguration algorithms use these variables to determine the most appropriate node(s) to
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond, P.A.
1993-03-01
The global geochemical cycle for an element tracks its path from its various sources to its sinks via processes of weathering and transportation. The cycle may then be quantified in a necessarily approximate manner. The geochemical cycle (thus quantified) reveals constraints (known and unknown) on an element's behavior imposed by the various processes which act on it. In the context of a global geochemical cycle, a continent becomes essentially a source term. If, however, an element's behavior is examined in a local or regional context, sources and their related sinks may be identified. This suggests that small-scale geochemical cycles may be superimposed on global geochemical cycles. Definition of such sub-cycles may clarify the distribution of an element in the earth's near-surface environment. In Florida, phosphate minerals of the Hawthorn Group act as a widely distributed source of uranium. Uranium is transported by surface- and ground-waters. Florida is the site of extensive wetlands and peatlands. The organic matter associated with these deposits adsorbs uranium and may act as a local sink depending on its hydrogeologic setting. This work examines the role of organic matter in the distribution of uranium in the surface and shallow subsurface environments of central and north Florida.
NASA Astrophysics Data System (ADS)
Steenhuis, T. S.; Mendoza, G.; Lyon, S. W.; Gerard Marchant, P.; Walter, M. T.; Schneiderman, E.
2003-04-01
Because the traditional Soil Conservation Service Curve Number (SCS-CN) approach continues to be ubiquitously used in GIS-based water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. We developed within an integrated GIS modeling environment a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Spatial representation of hydrologic processes is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point source pollution. The methodology presented here uses the traditional SCS-CN method to predict runoff volume and spatial extent of saturated areas and uses a topographic index to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was incorporated in an existing GWLF water quality model and applied to sub-watersheds of the Delaware basin in the Catskill Mountains region of New York State. We found that the distributed CN-VSA approach provided a physically-based method that gives realistic results for watersheds with VSA hydrology.
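For reference, the traditional SCS-CN runoff equation that the distributed CN-VSA method redistributes over the landscape is straightforward to compute. The sketch below uses the customary inch-based form with the usual 0.2 initial-abstraction ratio and leaves out the topographic-index redistribution step described above.

    def scs_runoff(P, CN, ia_ratio=0.2):
        """Traditional SCS-CN runoff depth (inches) for a storm depth P (inches)."""
        S = 1000.0 / CN - 10.0           # potential maximum retention
        Ia = ia_ratio * S                # initial abstraction
        if P <= Ia:
            return 0.0
        return (P - Ia) ** 2 / (P - Ia + S)

    # Example: a 2.5-inch storm on a watershed with curve number 75.
    print(round(scs_runoff(2.5, 75), 2))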
The R-Shell approach - Using scheduling agents in complex distributed real-time systems
NASA Technical Reports Server (NTRS)
Natarajan, Swaminathan; Zhao, Wei; Goforth, Andre
1993-01-01
Large, complex real-time systems such as space and avionics systems are extremely demanding in their scheduling requirements. The current OS design approaches are quite limited in the capabilities they provide for task scheduling. Typically, they simply implement a particular uniprocessor scheduling strategy and do not provide any special support for network scheduling, overload handling, fault tolerance, distributed processing, etc. Our design of the R-Shell real-time environment facilitates the implementation of a variety of sophisticated but efficient scheduling strategies, including incorporation of all these capabilities. This is accomplished by the use of scheduling agents which reside in the application run-time environment and are responsible for coordinating the scheduling of the application.
Role of environmental variability in the evolution of life history strategies.
Hastings, A; Caswell, H
1979-09-01
We reexamine the role of environmental variability in the evolution of life history strategies. We show that normally distributed deviations in the quality of the environment should lead to normally distributed deviations in the logarithm of year-to-year survival probabilities, which leads to interesting consequences for the evolution of annual and perennial strategies and reproductive effort. We also examine the effects of using differing criteria to determine the outcome of selection. Some predictions of previous theory are reversed, allowing distinctions between r and K theory and a theory based on variability. However, these distinctions require information about both the environment and the selection process not required by current theory.
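The link asserted above can be illustrated with a short simulation: if environmental quality in a given year is a normal deviate that enters survival through the exponent, then log-survival varies normally and survival itself is lognormal, which is what penalizes long-run (geometric-mean) fitness. All parameter values below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)
    env = rng.normal(loc=0.0, scale=0.2, size=10_000)    # year-to-year environmental quality
    survival = np.exp(-1.0 + env)                        # log-survival is normal, survival lognormal

    arithmetic_mean = survival.mean()
    geometric_mean = np.exp(np.mean(np.log(survival)))   # long-run growth is governed by this
    print(arithmetic_mean, geometric_mean)               # variability drags the geometric mean down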
Applications integration in a hybrid cloud computing environment: modelling and platform
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang
2013-08-01
With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed and a cross-computing environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.
A Domain-Specific Language for Aviation Domain Interoperability
ERIC Educational Resources Information Center
Comitz, Paul
2013-01-01
Modern information systems require a flexible, scalable, and upgradeable infrastructure that allows communication and collaboration between heterogeneous information processing and computing environments. Aviation systems from different organizations often use differing representations and distribution policies for the same data and messages,…
40 CFR 763.165 - Manufacture and importation prohibitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Manufacture and importation...) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.165 Manufacture...
Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 2
NASA Technical Reports Server (NTRS)
Bernard, Douglas E. (Editor); Man, Guy K. (Editor)
1989-01-01
This volume of the conference proceedings contains papers and discussions in the following topical areas: Parallel processing; Emerging integrated capabilities; Low order controllers; Real time simulation; Multibody component representation; User environment; and Distributed parameter techniques.
45 CFR 153.700 - Distributed data environment.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Distributed data environment. 153.700 Section 153... Distributed Data Collection for HHS-Operated Programs § 153.700 Distributed data environment. (a) Dedicated distributed data environments. For each benefit year in which HHS operates the risk adjustment or reinsurance...
45 CFR 153.700 - Distributed data environment.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Distributed data environment. 153.700 Section 153... Distributed Data Collection for HHS-Operated Programs § 153.700 Distributed data environment. (a) Dedicated distributed data environments. For each benefit year in which HHS operates the risk adjustment or reinsurance...
r-process nucleosynthesis in the high-entropy supernova bubble
NASA Technical Reports Server (NTRS)
Meyer, B. S.; Mathews, G. J.; Howard, W. M.; Woosley, S. E.; Hoffman, R. D.
1992-01-01
We show that the high-temperature, high-entropy evacuated region outside the nascent neutron star in a core-collapse supernova may be an ideal r-process site. In this high-entropy environment it is possible that most nucleons are in the form of free neutrons or are bound into alpha particles. Thus, there can be many neutrons per seed nucleus even though the material is not particularly neutron rich. The predicted amount of r-process material ejected per event from this environment agrees well with that required by simple galactic evolution arguments. When averaged over regions of different neutron excess in the supernova ejecta, the calculated r-process abundance curve can give a good representation of the solar-system r-process abundances as long as the entropy per baryon is sufficiently high. Neutrino irradiation may aid in smoothing the final abundance distribution.
NASA Technical Reports Server (NTRS)
Stubbs, T. J.; Glenar, D. A.; Wang, Y.; Hermalyn, B.; Sarantos, M.; Colaprete, A.; Elphic, R. C.
2015-01-01
The scientific objectives of the Lunar Atmosphere and Dust Environment Explorer (LADEE) mission are: (1) determine the composition of the lunar atmosphere and investigate the processes controlling its distribution and variability - sources, sinks, and surface interactions; and (2) characterize the lunar exospheric dust environment, measure its spatial and temporal variability, and its influences on the lunar atmosphere. Impacts on the lunar surface from meteoroid streams encountered by the Earth-Moon system are anticipated to result in enhancements in both the lunar atmosphere and the dust environment. Here we describe the annual meteoroid streams expected to be incident at the Moon during the LADEE mission, and their anticipated effects on the lunar environment.
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Putt, Charles W.
1997-01-01
The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
[Organization of monitoring of electromagnetic radiation in the urban environment].
Savel'ev, S I; Dvoeglazova, S V; Koz'min, V A; Kochkin, D E; Begishev, M R
2008-01-01
The authors describe new approaches to monitoring the environment, including the sources of electromagnetic radiation and noise. Electronic maps of the area under study are shown to be produced by constructing isolines or plotting the actual levels of the controlled factors. These approaches to electromagnetic and acoustic monitoring make it possible to automate the measurement process, to analyze the established situation, and to simplify the risk control methodology.
Enhanced Vehicle Beddown Approximations for the Improved Theater Distribution Model
2014-03-27
processed utilizing a heuristic routing and scheduling procedure the authors called the Airlift Planning Algorithm (APA). The linear programming model ... LINGO 13 environment. The model is then solved by LINGO 13 and solution data is passed back to the Excel environment in a readable format. All original ... DSS is relatively unchanged when solutions to the ITDM are referenced for comparison testing. Readers are encouraged to see Appendix I for ITDM VBA
Moving code - Sharing geoprocessing logic on the Web
NASA Astrophysics Data System (ADS)
Müller, Matthias; Bernard, Lars; Kadner, Daniel
2013-09-01
Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility to coordinate development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.
Distributed collaborative environments for virtual capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research that focuses on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.
Graph Partitioning for Parallel Applications in Heterogeneous Grid Environments
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Kumar, Shailendra; Das, Sajal K.; Biegel, Bryan (Technical Monitor)
2002-01-01
The problem of partitioning irregular graphs and meshes for parallel computations on homogeneous systems has been extensively studied. However, these partitioning schemes fail when the target system architecture exhibits heterogeneity in resource characteristics. With the emergence of technologies such as the Grid, it is imperative to study the partitioning problem taking into consideration the differing capabilities of such distributed heterogeneous systems. In our model, the heterogeneous system consists of processors with varying processing power and an underlying non-uniform communication network. We present in this paper a novel multilevel partitioning scheme for irregular graphs and meshes that takes into account issues pertinent to Grid computing environments. Our partitioning algorithm, called MiniMax, generates and maps partitions onto a heterogeneous system with the objective of minimizing the maximum execution time of the parallel distributed application. For the experimental performance study, we have considered both a realistic mesh problem from NASA and synthetic workloads. Simulation results demonstrate that MiniMax generates high quality partitions for various classes of applications targeted for parallel execution in a distributed heterogeneous environment.
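The abstract does not reproduce the MiniMax algorithm itself; as a rough illustration of the objective it optimizes, the hypothetical Python sketch below assigns weighted graph vertices to processors of differing speeds and reports the maximum per-processor execution time, the quantity MiniMax seeks to minimize. Communication over the non-uniform network, and the multilevel coarsening, are omitted; this is a simple greedy heuristic, not the authors' method.

```python
# Hypothetical illustration of the MiniMax objective: minimize the maximum
# per-processor execution time when mapping weighted vertices onto processors
# with heterogeneous speeds. Greedy heuristic only; not the multilevel scheme.

def greedy_map(vertex_work, proc_speed):
    """Assign each vertex (work units) to the processor that finishes it earliest."""
    finish = [0.0] * len(proc_speed)          # current finish time per processor
    mapping = {}
    for v, work in sorted(vertex_work.items(), key=lambda kv: -kv[1]):
        times = [finish[p] + work / proc_speed[p] for p in range(len(proc_speed))]
        p_best = min(range(len(proc_speed)), key=lambda p: times[p])
        finish[p_best] = times[p_best]
        mapping[v] = p_best
    return mapping, max(finish)               # max finish time = the MiniMax objective

vertex_work = {"a": 4.0, "b": 3.0, "c": 2.0, "d": 2.0, "e": 1.0}
proc_speed = [1.0, 2.0]                       # the second processor is twice as fast
mapping, makespan = greedy_map(vertex_work, proc_speed)
print(mapping, makespan)
```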
Time distributions of solar energetic particle events: Are SEPEs really random?
NASA Astrophysics Data System (ADS)
Jiggens, P. T. A.; Gabriel, S. B.
2009-10-01
Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models, with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compare these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent "memory" in the system. The inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory, indicating that events are not completely random, with activity levels varying even during solar active periods and characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment be modified to incorporate long-term event dependency and short-term system memory.
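A schematic of the kind of waiting-time comparison described above (not the authors' analysis, and with made-up numbers) is sketched below in Python: an exponential (stationary Poisson) model and a heavier-tailed Pareto alternative are both fitted by maximum likelihood to a list of inter-event times and their log-likelihoods compared.

```python
import numpy as np

# Hypothetical sketch: compare a Poisson-process (exponential waiting time)
# model against a heavier-tailed (Pareto, Levy-like) alternative for
# inter-event waiting times. Illustrative data only.

waits = np.array([1.2, 0.4, 9.7, 0.8, 30.1, 2.3, 0.6, 15.4, 1.1, 0.9])  # days, made up

# Exponential MLE: rate = 1 / mean waiting time
lam = 1.0 / waits.mean()
ll_exp = np.sum(np.log(lam) - lam * waits)

# Pareto MLE with the lower bound at the smallest observed wait
xmin = waits.min()
alpha = len(waits) / np.sum(np.log(waits / xmin))
ll_par = np.sum(np.log(alpha / xmin) - (alpha + 1) * np.log(waits / xmin))

print(f"exponential log-likelihood: {ll_exp:.2f}")
print(f"pareto      log-likelihood: {ll_par:.2f}")
# A markedly better heavy-tailed fit would hint at clustering / memory,
# i.e. a departure from the stationary Poisson assumption.
```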
NASA Astrophysics Data System (ADS)
Mueller, Sebastian B.; Kueppers, Ulrich; Huber, Matthew S.; Hess, Kai-Uwe; Poesges, Gisela; Ruthensteiner, Bernhard; Dingwell, Donald B.
2018-04-01
Aggregation is a common process occurring in many diverse particulate gas mixtures (e.g. those derived from explosive volcanic eruptions, meteorite impact events, and fluid bed processing). It results from the collision and sticking of particles suspended in turbulent gas/air. To date, there is no generalized model of the underlying physical processes. Here, we investigate aggregates from 18 natural deposits (16 volcanic deposits and two meteorite impact deposits) as well as aggregates produced experimentally via fluidized bed techniques. All aggregates were analyzed for their size, internal structuring, and constituent particle size distribution. Commonalities and differences between the aggregate types are then used to infer salient features of the aggregation process. Average core to rim ratios of internally structured aggregates (accretionary lapilli) are found to be similar for artificial and volcanic aggregates but up to an order of magnitude different from impact-related aggregates. Rim structures of artificial and volcanic aggregates appear to be physically similar (single, sub-spherical, regularly-shaped rims) whereas impact-related aggregates more often show multiple or irregularly shaped rims. The particle size distributions (PSDs) of all three aggregate types are similar (< 200 μm). This proves that in all three environments, aggregation occurs under broadly similar conditions despite the significant differences in source conditions (particle volume fraction, particle size distribution, particle composition, temperature), residence times, plume conditions (e.g., humidity and temperature), and dynamics of fallout and deposition. Impact-generated and volcanic aggregates share many similarities, and in some cases may be indistinguishable without their stratigraphic context.
Extreme deconstruction supports niche conservatism driving New World bird diversity
NASA Astrophysics Data System (ADS)
Diniz-Filho, José Alexandre Felizola; Rangel, Thiago Fernando; dos Santos, Mariana Rocha
2012-08-01
It is expected that if environment fully establishes the borders of species' geographic distributions, then richness patterns will arise simply by changing the parameters of how environment affects each of the species. However, if other mechanisms (i.e., non-equilibrium of species' distributions with climate and historical contingency, shifts in adaptive peaks, or biotic interactions) are driving species' geographic distributions, models for species distribution and richness will not entirely match. Here we used the extreme deconstruction principle to test how niche conservatism, by keeping species' geographic distributions within certain parts of environmental space, drives richness patterns in New World birds under tropical niche conservatism. Eight environmental variables were used to model the geographic distribution of 2790 species within 28 bird families using a GLM. Spatial patterns in richness for each of these families were also modeled as a function of these same variables using a standard OLS regression. The fit of these two types of models (mean McFadden's ρ2 for GLM and R2 of OLS) across families and the match between GLM and OLS standardized slopes within and among bird families were then compared. We found a positive and significant correlation between GLM and OLS model fit (r = 0.601; P < 0.01), indicating that when environment strongly determines the richness of a family, it also explains its species' geographic distributions. The match between GLM and OLS slopes is significantly correlated with families' phylogenetic root distance (r = -0.467; P = 0.012), so that more basal families tend to have a better match between the environmental drivers of richness and of geographic distribution. This is expected under the tropical niche conservatism model and provides an integrated explanation of how processes at a lower hierarchical level (species' geographic distributions) drive diversity patterns.
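The core comparison (a binomial GLM fit per species versus an OLS fit for richness, with McFadden's ρ2 compared against R2) can be sketched as follows; this is a minimal, hypothetical Python/statsmodels illustration on synthetic data, not the authors' analysis or dataset.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic sketch of the comparison described above: fit a binomial GLM to a
# single species' presence/absence, fit an OLS model to richness, and compare
# model fits (McFadden's rho^2 vs R^2). All data are made up.
rng = np.random.default_rng(0)
n_cells = 500
env = sm.add_constant(rng.normal(size=(n_cells, 2)))      # two environmental predictors

# synthetic species occurrence and richness driven by the same gradients
p = 1.0 / (1.0 + np.exp(-(env @ np.array([-0.5, 1.2, -0.8]))))
presence = rng.binomial(1, p)
richness = 5 + 3.0 * env[:, 1] - 2.0 * env[:, 2] + rng.normal(scale=1.5, size=n_cells)

glm_fit = sm.GLM(presence, env, family=sm.families.Binomial()).fit()
null_fit = sm.GLM(presence, np.ones((n_cells, 1)), family=sm.families.Binomial()).fit()
mcfadden = 1.0 - glm_fit.llf / null_fit.llf                # McFadden's pseudo rho^2

ols_fit = sm.OLS(richness, env).fit()
print(f"McFadden rho^2 (distribution model): {mcfadden:.3f}")
print(f"R^2 (richness model):                {ols_fit.rsquared:.3f}")
# Repeating this per species/family and correlating the two fit measures
# across families reproduces the kind of test reported in the abstract.
```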
A distributed version of the NASA Engine Performance Program
NASA Technical Reports Server (NTRS)
Cours, Jeffrey T.; Curlett, Brian P.
1993-01-01
Distributed NEPP, a version of the NASA Engine Performance Program, uses the original NEPP code but executes it in a distributed computer environment. Multiple workstations connected by a network increase the program's speed and, more importantly, the complexity of the cases it can handle in a reasonable time. Distributed NEPP uses the public domain software package, called Parallel Virtual Machine, allowing it to execute on clusters of machines containing many different architectures. It includes the capability to link with other computers, allowing them to process NEPP jobs in parallel. This paper discusses the design issues and granularity considerations that entered into programming Distributed NEPP and presents the results of timing runs.
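Distributed NEPP itself farms cases out over PVM on networked workstations; as a loose stand-in for that master/worker case-farming idea, the hypothetical Python sketch below distributes independent engine-cycle cases across a process pool. The "engine model" is a placeholder, not NEPP.

```python
from multiprocessing import Pool

# Illustrative stand-in for the master/worker case farming that Distributed
# NEPP performs with PVM: independent engine-cycle cases are evaluated in
# parallel on a pool of workers. The engine model below is a placeholder.

def run_case(case):
    """Placeholder for one engine-cycle evaluation (hypothetical relation)."""
    mach, altitude_ft = case
    thrust = 25000.0 / (1.0 + mach) - 0.1 * altitude_ft
    return {"mach": mach, "alt": altitude_ft, "thrust": thrust}

if __name__ == "__main__":
    cases = [(m, alt) for m in (0.0, 0.4, 0.8) for alt in (0.0, 10000.0, 30000.0)]
    with Pool(processes=4) as pool:          # workers stand in for networked hosts
        results = pool.map(run_case, cases)
    for r in results:
        print(r)
```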
NASA Astrophysics Data System (ADS)
Onda, Yuichi; Kato, Hiroaki; Patin, Jeremy; Yoshimura, Kazuya; Tsujimura, Maki; Wakahara, Taeko; Fukushima, Takehiko
2013-04-01
Previous experience, such as the Chernobyl Nuclear Power Plant accident, has confirmed that fallout radionuclides on the ground surface migrate through the natural environment, including soils and rivers. Therefore, in order to estimate future changes in radionuclide deposition, the migration processes of radionuclides in forests, soils, ground water, and rivers should be monitored. However, such comprehensive studies on migration through forests, soils, ground water, and rivers have not been conducted so far. Here, we present a comprehensive investigation conducted to document the migration of radionuclides through the natural environment, including soils and rivers: 1) study of the depth distribution of radiocaesium in soils within forests, fields, and grassland; 2) confirmation of radionuclide distribution and investigation of migration in forests; 3) study of radionuclide migration due to soil erosion under different land use; 4) measurement of radionuclides entrained from the natural environment, including forests and soils; 5) investigation of radionuclide migration through soil water, ground water, stream water, and spring water under different land use; 6) study of paddy-to-river transfer of radionuclides through suspended sediments; 7) study of river-to-ocean transfer of radionuclides via suspended sediments; 8) confirmation of radionuclide deposition in ponds and reservoirs.
Derived virtual devices: a secure distributed file system mechanism
NASA Technical Reports Server (NTRS)
VanMeter, Rodney; Hotz, Steve; Finn, Gregory
1996-01-01
This paper presents the design of derived virtual devices (DVDs). DVDs are the mechanism used by the Netstation Project to provide secure shared access to network-attached peripherals distributed in an untrusted network environment. DVDs improve Input/Output efficiency by allowing user processes to perform I/O operations directly from devices without intermediate transfer through the controlling operating system kernel. The security enforced at the device through the DVD mechanism includes resource boundary checking, user authentication, and restricted operations, e.g., read-only access. To illustrate the application of DVDs, we present the interactions between a network-attached disk and a file system designed to exploit the DVD abstraction. We further discuss third-party transfer as a mechanism intended to provide for efficient data transfer in a typical NAP environment. We show how DVDs facilitate third-party transfer, and provide the security required in a more open network environment.
LAD Dissertation Prize Talk: Molecular Collisional Excitation in Astrophysical Environments
NASA Astrophysics Data System (ADS)
Walker, Kyle M.
2017-06-01
While molecular excitation calculations are vital in determining particle velocity distributions, internal state distributions, abundances, and ionization balance in gaseous environments, both theoretical calculations and experimental data for these processes are lacking. Reliable molecular collisional data with the most abundant species - H2, H, He, and electrons - are needed to probe material in astrophysical environments such as nebulae, molecular clouds, comets, and planetary atmospheres. However, excitation calculations with the main collider, H2, are computationally expensive and therefore various approximations are used to obtain unknown rate coefficients. The widely-accepted collider-mass scaling approach is flawed, and alternate scaling techniques based on physical and mathematical principles are presented here. The most up-to-date excitation data are used to model the chemical evolution of primordial species in the Recombination Era and produce accurate non-thermal spectra of the molecules H2+, HD, and H2 in a primordial cloud as it collapses into a first generation star.
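The "collider-mass scaling" the abstract criticizes is commonly written as a reduced-mass correction applied to He rate coefficients to estimate H2 ones (equal cross sections assumed, so rates scale with mean relative speed). A minimal sketch of that conventional factor, shown only for reference and not the author's improved technique, with hypothetical input values:

```python
import math

# Conventional reduced-mass ("collider-mass") scaling: estimate a rate
# coefficient for collisions with H2 from the He value by assuming equal
# cross sections, so k scales as 1/sqrt(reduced mass). The abstract argues
# this widely used approach is flawed; it is sketched here only for context.

def reduced_mass(m1, m2):
    return m1 * m2 / (m1 + m2)

def scale_he_to_h2(k_he, m_target, m_he=4.0026, m_h2=2.0159):
    mu_he = reduced_mass(m_target, m_he)
    mu_h2 = reduced_mass(m_target, m_h2)
    return k_he * math.sqrt(mu_he / mu_h2)

# Example: a heavy target molecule (mass 28, e.g. CO) with a hypothetical k_He
print(scale_he_to_h2(k_he=3.0e-11, m_target=28.0))   # roughly 1.4 x the He value
```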
NASA Technical Reports Server (NTRS)
Quinn, Todd M.; Walters, Jerry L.
1991-01-01
Future space exploration will require a long-term human presence in space. Space environments that provide working and living quarters for manned missions are becoming increasingly large and sophisticated. Monitoring and control of the space environment subsystems by expert system software, which emulates human reasoning processes, could maintain the health of the subsystems and help reduce the human workload. The autonomous power expert (APEX) system was developed to emulate a human expert's reasoning processes used to diagnose fault conditions in the domain of space power distribution. APEX is a fault detection, isolation, and recovery (FDIR) system, capable of autonomous monitoring and control of the power distribution system. APEX consists of a knowledge base, a data base, an inference engine, and various support and interface software. APEX provides the user with an easy-to-use interactive interface. When a fault is detected, APEX will inform the user of the detection. The user can direct APEX to isolate the probable cause of the fault. Once a fault has been isolated, the user can ask APEX to justify its fault isolation and to recommend actions to correct the fault. APEX implementation and capabilities are discussed.
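The general shape of such a rule-based fault detection and isolation loop can be illustrated with the toy Python sketch below; the rules, thresholds, and telemetry names are hypothetical and are not drawn from the actual APEX knowledge base.

```python
# Toy illustration (not the APEX knowledge base) of rule-based fault detection
# and isolation for a power distribution channel: compare telemetry against
# expected ranges and map symptom patterns to probable causes and actions.

RULES = [
    # (symptom predicate, probable cause, recommended action) -- all hypothetical
    (lambda t: t["bus_voltage"] < 110.0 and t["load_current"] > 8.0,
     "overloaded load converter", "shed non-critical loads on this bus"),
    (lambda t: t["bus_voltage"] < 110.0 and t["load_current"] < 0.1,
     "open switchgear contact", "command switchgear reset and re-verify"),
    (lambda t: t["temperature"] > 70.0,
     "cooling loop degradation", "reduce bus load and inspect cold plate"),
]

def diagnose(telemetry):
    findings = [(cause, action) for pred, cause, action in RULES if pred(telemetry)]
    return findings or [("no fault isolated", "continue monitoring")]

sample = {"bus_voltage": 104.2, "load_current": 9.3, "temperature": 45.0}
for cause, action in diagnose(sample):
    print(f"probable cause: {cause}; recommended action: {action}")
```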
A functional model of sensemaking in a neurocognitive architecture.
Lebiere, Christian; Pirolli, Peter; Thomson, Robert; Paik, Jaehyon; Rutledge-Taylor, Matthew; Staszewski, James; Anderson, John R
2013-01-01
Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment.
Adaptation to stimulus statistics in the perception and neural representation of auditory space.
Dahmen, Johannes C; Keating, Peter; Nodal, Fernando R; Schulz, Andreas L; King, Andrew J
2010-06-24
Sensory systems are known to adapt their coding strategies to the statistics of their environment, but little is still known about the perceptual implications of such adjustments. We investigated how auditory spatial processing adapts to stimulus statistics by presenting human listeners and anesthetized ferrets with noise sequences in which interaural level differences (ILD) rapidly fluctuated according to a Gaussian distribution. The mean of the distribution biased the perceived laterality of a subsequent stimulus, whereas the distribution's variance changed the listeners' spatial sensitivity. The responses of neurons in the inferior colliculus changed in line with these perceptual phenomena. Their ILD preference adjusted to match the stimulus distribution mean, resulting in large shifts in rate-ILD functions, while their gain adapted to the stimulus variance, producing pronounced changes in neural sensitivity. Our findings suggest that processing of auditory space is geared toward emphasizing relative spatial differences rather than the accurate representation of absolute position.
Physical process in the coma of comet 67P derived from narrowband imaging of fragment species
NASA Astrophysics Data System (ADS)
Perez Lopez, F.; Küppers, M.; Marín-Yaseli de la Parra, J.; Besse, S.; Moissl, R.
2017-09-01
During the rendezvous of the Rosetta spacecraft with comet 67P/Churyumov-Gerasimenko, the OSIRIS scientific cameras monitored the near-nucleus gas environment in various narrow-band filters, observing various fragment species. It turned out that the excitation processes in the innermost coma are significantly different from the overall coma, as observed from the ground [1]. In particular, some of the observed emissions of fragments (daughter molecules) are created by direct dissociation of parent molecules, and in those cases the spatial distribution of the emission directly maps the distribution of parent molecules. We investigate the evolution of the brightness and distribution of the emissions over time to improve our understanding of the underlying emission mechanisms and to derive the spatial distribution of H2O and CO2. The outcome will provide constraints on the homogeneity of the cometary nucleus.
Distributed intrusion detection system based on grid security model
NASA Astrophysics Data System (ADS)
Su, Jie; Liu, Yahui
2008-03-01
Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then describes how the characteristics of grid computing can be applied in a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it presents a distributed intrusion detection system based on the grid security model that can reduce processing delay and assure detection rates.
NASA Technical Reports Server (NTRS)
Zhang, Zhong
1997-01-01
The development of large-scale, composite software in a geographically distributed environment is an evolutionary process. Often, in such evolving systems, striving for consistency is complicated by many factors, because development participants have various locations, skills, responsibilities, roles, opinions, languages, and terminology, and employ different degrees of abstraction. This naturally leads to many partial specifications, or viewpoints. These multiple views on the system being developed usually overlap. At the same time, these multiple views give rise to the potential for inconsistency. Existing CASE tools do not efficiently manage inconsistencies in a distributed development environment for a large-scale project. Based on the ViewPoints framework, the WHERE (Web-Based Hypertext Environment for Requirements Evolution) toolkit aims to tackle inconsistency management issues within geographically distributed software development projects. Consequently, the WHERE project helps produce more robust software and supports the software assurance process. The long-term goal of the WHERE tools is inconsistency analysis and management in requirements specifications. A framework based on Graph Grammar theory and the TCMJAVA toolkit is proposed to detect inconsistencies among viewpoints. This systematic approach uses three basic operations (UNION, DIFFERENCE, INTERSECTION) to study the static behaviors of graphic and tabular notations. From these operations, subgraph Query, Selection, Merge, and Replacement operations can be derived. This approach uses graph PRODUCTIONS (rewriting rules) to study the dynamic transformations of graphs. We discuss the feasibility of implementing these operations. We also present the process of porting the original TCM (Toolkit for Conceptual Modeling) project from C++ to the Java programming language in this thesis. A scenario based on the NASA International Space Station specification is discussed to show the applicability of our approach. Finally, conclusions and future work on inconsistency management issues in the WHERE project are summarized.
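The three basic graph operations named above can be made concrete with a minimal sketch. The Python below (hypothetical, not the WHERE/TCMJAVA implementation) represents a viewpoint graph as a set of nodes plus a set of labelled, directed edges and applies UNION, INTERSECTION, and DIFFERENCE to two toy viewpoints.

```python
# Minimal sketch (not the WHERE/TCMJAVA implementation) of the three basic
# operations on viewpoint graphs, with a graph represented as a set of nodes
# plus a set of directed, labelled edges (source, label, target).

def graph(nodes, edges):
    return {"nodes": set(nodes), "edges": set(edges)}

def union(g1, g2):
    return {"nodes": g1["nodes"] | g2["nodes"], "edges": g1["edges"] | g2["edges"]}

def intersection(g1, g2):
    return {"nodes": g1["nodes"] & g2["nodes"], "edges": g1["edges"] & g2["edges"]}

def difference(g1, g2):
    nodes = g1["nodes"] - g2["nodes"]
    # keep only edges whose endpoints both survive, so the result stays a graph
    edges = {e for e in g1["edges"] - g2["edges"] if e[0] in nodes and e[2] in nodes}
    return {"nodes": nodes, "edges": edges}

vp_a = graph({"Crew", "Lab"}, {("Crew", "operates", "Lab")})
vp_b = graph({"Crew", "Node1"}, {("Crew", "occupies", "Node1")})
print(intersection(vp_a, vp_b))   # overlap suggests shared concepts to check for conflicts
print(union(vp_a, vp_b))          # merged view of both viewpoints
```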
Detection and Distribution of Natural Gaps in Tropical Rainforest
NASA Astrophysics Data System (ADS)
Goulamoussène, Y.; Linguet, L.; Hérault, B.
2014-12-01
Forest management is important for assessing biodiversity and ecological processes. Requirements for disturbance information have also been expressed by the scientific community. Therefore, understanding and monitoring the frequency distribution of treefall gaps is relevant to better understanding and predicting the carbon budget in response to global change and land use change. In this work we characterize and quantify the frequency distribution of natural canopy gaps. We then examine the interaction between environmental variables and gap formation across the tropical rainforest of French Guiana using high-resolution airborne Light Detection and Ranging (LiDAR). We mapped gaps from the canopy model over 40,000 ha of forest. We used a Bayesian modelling framework to estimate and select useful covariate model parameters. Topographic variables are included in a model to predict the gap size distribution. We discuss results on the interaction between environment and gap size distribution, mainly topographic indexes. The use of both airborne and space-based techniques has improved our ability to supply needed disturbance information. This work is an approach at the plot scale. The use of satellite data will allow us to work at the forest scale. The inclusion of climate variables in our model will let us assess the impact of global change on tropical rainforest.
Assessing the Nexus of Built, Natural, and Social Environments and Public Health Outcomes
NASA Astrophysics Data System (ADS)
Archer, R.; Alexander, S.; Douglas, J.
2017-12-01
This study investigates community-related environmental justice concerns and chemical and non-chemical health stressors from built, natural, and social environments in Southeast Los Angeles (SELA) County and East Oakland, California. The geographical distribution of health outcomes is related to the built and natural environments, as well as impacts from the social environment. A holistic systems view is important in assessing healthy behaviors within a community, because they do not occur in isolation. Geospatial analysis will be performed to integrate a total environment framework and explore the spatial patterns of exposure to chemical and non-chemical stressors and access to health-promoting environments. Geographic Information Systems (GIS) analysis using primary and secondary existing data will be performed to determine how social environments impact exposure to chemical health stressors and access to health-promoting built and natural environments. This project will develop a comprehensive list of health-promoting built and natural environments (e.g., parks and community gardens) and polluting sites (e.g., shipping ports and sources of pollution not included in federal regulatory databases) in East Oakland and SELA. California Department of Public Health and U.S. Decennial Census data will also be included for geospatial analysis to overlay the distribution of air pollution-related morbidities (e.g. asthma, diabetes, and cancer) and access to health-promoting built and natural environments and related community assets, exposure to polluting industries, social disorganization, and public health outcomes in the target areas. This research will help identify the spatial and temporal distribution and cumulative impacts of critical pollution hotspots causing community environmental health impacts. The research team will also map how social environments impact exposure to chemical health stressors and access to health-promoting built and natural environments. The process and outcomes of this research should empower communities and aid decision-makers to integrate environmental justice considerations into public health policies.
Construction of integrated case environments.
Losavio, Francisca; Matteo, Alfredo; Pérez, María
2003-01-01
The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software system development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. Technically speaking, the tendency of the CASE approach is toward the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration has been studied for two decades. Current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use the Brown approach. The environment resulting from the application of this model is called a federative environment, reflecting the fact that this architecture pays special attention to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience in the civil engineering and architecture fields in constructing an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to achieve the integration of the different tools. This intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style, which has been implemented as a Mediator pattern and has been used in the interactive systems domain. In addition, a process is given for constructing the integrated CASE environment.
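The intermediary/Mediator idea named above can be sketched in a few lines: tools never call each other directly, they notify a mediator that routes events to the other registered tools. The Python below is a generic, hypothetical illustration of the pattern, not the paper's actual environment or tool set.

```python
# Minimal sketch of the Mediator (intermediary) pattern used for tool
# integration: tools publish events to the mediator, which forwards them to
# every other registered tool. Tool names and payloads are hypothetical.

class Mediator:
    def __init__(self):
        self._tools = []

    def register(self, tool):
        self._tools.append(tool)
        tool.mediator = self

    def notify(self, sender, event, payload):
        for tool in self._tools:
            if tool is not sender:
                tool.receive(event, payload)

class Tool:
    def __init__(self, name):
        self.name = name
        self.mediator = None

    def publish(self, event, payload):
        self.mediator.notify(self, event, payload)

    def receive(self, event, payload):
        print(f"{self.name} received {event}: {payload}")

m = Mediator()
cad, analysis = Tool("CAD editor"), Tool("Structural analysis")
m.register(cad)
m.register(analysis)
cad.publish("model-updated", {"beam": "B12", "length_m": 3.5})
```

Because every inter-tool dependency passes through the mediator, adding or replacing a tool only requires registering it, which is the loose coupling the federative architecture aims for.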
Autoplan: A self-processing network model for an extended blocks world planning environment
NASA Technical Reports Server (NTRS)
Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank
1990-01-01
Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.
Microelectromechanical Systems
NASA Technical Reports Server (NTRS)
Gabriel, Kaigham J.
1995-01-01
Micro-electromechanical systems (MEMS) is an enabling technology that merges computation and communication with sensing and actuation to change the way people and machines interact with the physical world. MEMS is a manufacturing technology that will impact widespread applications including: miniature inertial measurement units for competent munitions and personal navigation; distributed unattended sensors; mass data storage devices; miniature analytical instruments; embedded pressure sensors; non-invasive biomedical sensors; fiber-optics components and networks; distributed aerodynamic control; and on-demand structural strength. The long term goal of ARPA's MEMS program is to merge information processing with sensing and actuation to realize new systems and strategies for both perceiving and controlling systems, processes, and the environment. The MEMS program has three major thrusts: advanced devices and processes, system design, and infrastructure.
The role of graphics super-workstations in a supercomputing environment
NASA Technical Reports Server (NTRS)
Levin, E.
1989-01-01
A new class of very powerful workstations has recently become available which integrate near supercomputer computational performance with very powerful and high quality graphics capability. These graphics super-workstations are expected to play an increasingly important role in providing an enhanced environment for supercomputer users. Their potential uses include: off-loading the supercomputer (by serving as stand-alone processors, by post-processing of the output of supercomputer calculations, and by distributed or shared processing), scientific visualization (understanding of results, communication of results), and by real time interaction with the supercomputer (to steer an iterative computation, to abort a bad run, or to explore and develop new algorithms).
A distributed computing model for telemetry data processing
NASA Astrophysics Data System (ADS)
Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.
1994-05-01
We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
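The information-sharing idea behind the hybrid model can be illustrated with a toy publish/subscribe hub: downlinked parameters are distributed to subscribing client applications, and a client can republish values it synthesizes so other consumers receive them as well. The Python sketch below is hypothetical and is not the actual protocol described in the paper.

```python
from collections import defaultdict

# Toy sketch of the information-sharing idea: a hub distributes processed
# telemetry parameters to subscribers, and a client can republish synthesized
# parameters peer-style. Parameter names and formulas are hypothetical.

class TelemetryHub:
    def __init__(self):
        self._subscribers = defaultdict(list)      # parameter name -> callbacks

    def subscribe(self, parameter, callback):
        self._subscribers[parameter].append(callback)

    def publish(self, parameter, value, source="downlink"):
        for cb in self._subscribers[parameter]:
            cb(parameter, value, source)

hub = TelemetryHub()
hub.subscribe("cabin_pressure", lambda p, v, s: print(f"display: {p}={v} ({s})"))

def derive_leak_rate(parameter, value, source):
    # a client-side synthesized parameter, republished for other consumers
    hub.publish("leak_rate_estimate", round((14.7 - value) * 0.01, 4), source="synthesized")

hub.subscribe("cabin_pressure", derive_leak_rate)
hub.subscribe("leak_rate_estimate", lambda p, v, s: print(f"display: {p}={v} ({s})"))
hub.publish("cabin_pressure", 14.2)
```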
2008-07-01
generation of process partitioning, a thread pipelining becomes possible. In this paper we briefly summarize the requirements and trends for FADEC based... FADEC environment, presenting a hypothetical realization of an example application. Finally we discuss the application of Time-Triggered... based control applications of the future. Subject terms: gas turbine, FADEC, multi-core processing technology, distributed based control
Distributed Motor Controller (DMC) for Operation in Extreme Environments
NASA Technical Reports Server (NTRS)
McKinney, Colin M.; Yager, Jeremy A.; Mojarradi, Mohammad M.; Some, Rafi; Sirota, Allen; Kopf, Ted; Stern, Ryan; Hunter, Don
2012-01-01
This paper presents an extreme environment capable Distributed Motor Controller (DMC) module suitable for operation with a distributed architecture of future spacecraft systems. This motor controller is designed to be a bus-based electronics module capable of operating a single Brushless DC motor in extreme space environments: temperature (-120 C to +85 C required, -180 C to +100 C stretch goal); radiation (>20 krad required, >100 krad stretch goal); >360 cycles of operation. Achieving this objective will result in a scalable modular configuration for motor control with enhanced reliability that will greatly lower cost during the design, fabrication and ATLO phases of future missions. Within the heart of the DMC lies a pair of cold-capable Application Specific Integrated Circuits (ASICs) and a Field Programmable Gate Array (FPGA) that enable its miniaturization and operation in extreme environments. The ASICs are fabricated in the IBM 0.5 micron Silicon Germanium (SiGe) BiCMOS process and comprise analog circuitry to provide telemetry information, sensor interface, and health and status of the DMC. The FPGA contains logic to provide motor control, status monitoring and spacecraft interface. The testing and characterization of these ASICs have yielded excellent functionality at cold temperatures (-135 C). The DMC module has demonstrated successful operation of a motor at temperature.
Improved Protocols for Illumina Sequencing
Bronner, Iraad F.; Quail, Michael A.; Turner, Daniel J.; Swerdlow, Harold
2013-01-01
In this unit, we describe a set of improvements we have made to the standard Illumina protocols to make the sequencing process more reliable in a high-throughput environment, reduce amplification bias, narrow the distribution of insert sizes, and reliably obtain high yields of data. PMID:19582764
A multi-agent intelligent environment for medical knowledge.
Vicari, Rosa M; Flores, Cecilia D; Silvestre, André M; Seixas, Louise J; Ladeira, Marcelo; Coelho, Helder
2003-03-01
AMPLIA is a multi-agent intelligent learning environment designed to support training in diagnostic reasoning and modelling of domains with complex and uncertain knowledge. AMPLIA focuses on the medical area. It is a system that deals with uncertainty under the Bayesian network approach, where the learner-modelling task consists of creating a Bayesian network for a problem the system presents. The construction of a network involves qualitative and quantitative aspects. The qualitative part concerns the network topology, that is, the causal relations among the domain variables. After it is ready, the quantitative part is specified; it is composed of the conditional probability distributions of the represented variables. A negotiation process (managed by an intelligent MediatorAgent) handles the differences in topology and probability distribution between the model the learner built and the one built into the system. That negotiation process occurs between the agent that represents the expert domain knowledge (DomainAgent) and the agent that represents the learner's knowledge (LearnerAgent).
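The qualitative/quantitative split described above is easy to see on a two-node example: the topology (Disease -> TestResult) is the qualitative part, and the conditional probability tables are the quantitative part. The Python sketch below uses made-up numbers and a generic medical toy problem, not AMPLIA's actual domain models.

```python
# Tiny illustration of a Bayesian network: qualitative part = the edge
# Disease -> TestResult; quantitative part = the probability tables below.
# All numbers are made up for illustration.

p_disease = {True: 0.05, False: 0.95}                       # prior P(disease)
p_test_given_disease = {True: {True: 0.90, False: 0.10},    # P(test | disease)
                        False: {True: 0.08, False: 0.92}}

def posterior_disease_given_test(test_positive=True):
    """P(disease | test result) by simple enumeration over the two-node network."""
    joint = {d: p_disease[d] * p_test_given_disease[d][test_positive]
             for d in (True, False)}
    evidence = sum(joint.values())
    return joint[True] / evidence

print(f"P(disease | positive test) = {posterior_disease_given_test(True):.3f}")
```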
DataFed: A Federated Data System for Visualization and Analysis of Spatio-Temporal Air Quality Data
NASA Astrophysics Data System (ADS)
Husar, R. B.; Hoijarvi, K.
2017-12-01
DataFed is a distributed, web-services-based computing environment for accessing, processing, and visualizing atmospheric data in support of air quality science and management. The flexible, adaptive environment facilitates the access and flow of atmospheric data from providers to users by enabling the creation of user-driven data processing and visualization applications. DataFed "wrapper" components non-intrusively wrap heterogeneous, distributed datasets for access by standards-based GIS web services. The mediator components (also web services) map the heterogeneous data into a spatio-temporal data model. Chained web services provide homogeneous data views (e.g., geospatial and time views) using a global multi-dimensional data model. In addition to data access and rendering, the data processing component services can be programmed for filtering, aggregation, and fusion of multidimensional data. Complete applications are written in a custom data-flow language. Currently, the federated data pool consists of over 50 datasets originating from globally distributed data providers delivering surface-based air quality measurements, satellite observations, and emissions data, as well as regional- and global-scale air quality models. The web browser-based user interface allows point-and-click navigation and browsing of the XYZT multi-dimensional data space. The key applications of DataFed are exploring spatial patterns of pollutants; seasonal, weekly, and diurnal cycles; and frequency distributions for exploratory air quality research. Since 2008, DataFed has been used to support EPA in the implementation of the Exceptional Event Rule. The data system is also used at universities in the US, Europe, and Asia.
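The service-chaining idea (filter, then aggregate, then render a spatial view of a multidimensional cube) can be mimicked in-process on a small synthetic (time, lat, lon) array; the Python sketch below is only a schematic analogy, since the real DataFed components are remote web services rather than local functions.

```python
import numpy as np

# Schematic of the chaining idea (filter -> aggregate -> spatial view) applied
# to a small synthetic (time, lat, lon) concentration cube. Not DataFed code.

rng = np.random.default_rng(1)
cube = rng.gamma(shape=2.0, scale=5.0, size=(24, 10, 12))   # hourly values, 10x12 grid

def filter_exceedances(data, threshold=8.0):
    """Keep values above a threshold; mask the rest (a simple 'filter' service)."""
    return np.where(data > threshold, data, np.nan)

def temporal_aggregate(data):
    """Collapse the time axis to a daily mean (an 'aggregation' service)."""
    return np.nanmean(data, axis=0)

def spatial_view(data):
    """Stand-in for a map-rendering service: report a grid-wide summary."""
    return {"max": float(np.nanmax(data)), "mean": float(np.nanmean(data))}

print(spatial_view(temporal_aggregate(filter_exceedances(cube))))
```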
NASA Astrophysics Data System (ADS)
Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.
2015-12-01
The study of grain size distributions is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport, and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (from blocks to sand grain size) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, owing to their potential advantages with respect to classical ones: speed and the detailed content of the final information (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method for counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually with the same Rosiwal method by experts. The new algorithm has the same accuracy as a classical manual count process, but the application of this innovative methodology is much easier and dramatically less time-consuming. The final productivity of the new software for the analysis of clast deposits after recording field outcrop images can therefore be increased significantly.
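The Rosiwal intercept step, applied after segmentation, amounts to measuring clast runs crossed by linear transects. A minimal Python sketch of that counting step on an already-segmented binary image is given below; the Markov measure field segmentation itself is assumed done and the toy mask is hypothetical.

```python
import numpy as np

# Sketch of Rosiwal intercept counting on an already-segmented image: along
# sampled horizontal transects, measure the length of each run of clast
# pixels crossed. The segmentation (the paper's Markov measure field step)
# is assumed to have produced `mask`; here it is a toy binary image.

def intercept_lengths(mask, every_n_rows=4):
    """Return chord lengths (in pixels) of clast runs along sampled transects."""
    lengths = []
    for row in mask[::every_n_rows]:
        padded = np.concatenate(([0], row.astype(int), [0]))
        edges = np.flatnonzero(np.diff(padded))      # run starts and ends
        starts, ends = edges[::2], edges[1::2]
        lengths.extend((ends - starts).tolist())
    return np.array(lengths)

mask = np.zeros((32, 64), dtype=bool)
mask[5:9, 10:30] = True            # toy "clasts"
mask[18:26, 40:52] = True
chords = intercept_lengths(mask)
print(chords, "mean chord length:", chords.mean())
```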
NASA Astrophysics Data System (ADS)
Vulcani, Benedetta
2015-08-01
What physical processes regulate star formation in dense environments? Understanding why galaxy evolution is environment dependent is one of the key questions of current astrophysics. I will present the first characterization of the spatial distribution of star formation in cluster galaxies at z~0.5, in order to quantify the role of the different physical processes that are believed to be responsible for shutting down star formation. The analysis makes use of data from the Grism Lens-Amplified Survey from Space (GLASS), a large HST cycle-21 program targeting 10 massive galaxy clusters with extensive HST imaging from CLASH and the Frontier Fields Initiative. The program consists of 140 primary and 140 parallel orbits of near-infrared WFC3 and optical ACS slitless grism observations, which result in 3D spectroscopy of hundreds of galaxies. The grism data are used to produce spatially resolved maps of the star formation density, while the stellar mass density and optical surface brightness are obtained from multiband imaging. I will describe quantitative measures of the spatial location and extent of the star formation, showing that about half of the cluster members with significant Halpha detection have diffuse star formation extending beyond the optical counterpart. This suggests that star formation occurs out to larger radii than the rest-frame continuum. For some systems, nuclear star-forming regions are found. I will also present a comparison between the Halpha distribution observed in cluster and field galaxies. The characterization of the spatial distribution of Halpha provides a new, as yet poorly exploited window on the mechanisms that regulate star formation and morphological transformation in dense environments.
Jo, Hyeyeong; Son, Min-Hui; Seo, Sung-Hee; Chang, Yoon-Seok
2017-07-01
Hexabromocyclododecane (HBCD) contamination and its diastereomeric profile were investigated in a multi-media environment along a river at the local scale in air, soil, sludge, sediment, and fish samples. The spatial distribution of HBCD differed among the matrices. The highest concentrations of HBCD in air and soil were detected near a general industrial complex; in the sediment and sludge samples, they were detected in the downstream region (i.e., urban area). Each matrix showed specific distribution patterns of HBCD diastereomers, suggesting continuous inputs of contaminants, different physicochemical properties, or isomerization. The particle phases in the air, sludge, and fish matrices were dominated by α-HBCD, owing to HBCD's various isomerization processes and different degradation rates in the environment, and to the metabolic capabilities of fish; in contrast, the sediment and soil matrices were dominated by γ-HBCD because of the major composition of the technical mixtures and the strong adsorption onto solid particles. Based on these results, the prevalent and matrix-specific distribution of HBCD diastereomers suggests that more careful consideration should be given to the characteristics of the matrices and their effects on the potential influence of HBCD at the diastereomeric level.
RELATIVE CONTRIBUTIONS OF THE WEAK, MAIN, AND FISSION-RECYCLING r-PROCESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shibagaki, S.; Kajino, T.; Mathews, G. J.
There has been a persistent conundrum in attempts to model the nucleosynthesis of heavy elements by rapid neutron capture (the r-process). Although the locations of the abundance peaks near nuclear mass numbers 130 and 195 identify an environment of rapid neutron capture near closed nuclear shells, the abundances of elements just above and below those peaks are often underproduced by more than an order of magnitude in model calculations. At the same time, there is a debate in the literature as to what degree the r-process elements are produced in supernovae or the mergers of binary neutron stars. In this paper we propose a novel solution to both problems. We demonstrate that the underproduction of nuclides above and below the r-process peaks in main or weak r-process models (like magnetohydrodynamic jets or neutrino-driven winds in core-collapse supernovae) can be supplemented via fission fragment distributions from the recycling of material in a neutron-rich environment such as that encountered in neutron star mergers (NSMs). In this paradigm, the abundance peaks themselves are well reproduced by a moderately neutron-rich, main r-process environment such as that encountered in the magnetohydrodynamical jets in supernovae supplemented with a high-entropy, weakly neutron-rich environment such as that encountered in the neutrino-driven-wind model to produce the lighter r-process isotopes. Moreover, we show that the relative contributions to the r-process abundances in both the solar system and metal-poor stars from the weak, main, and fission-recycling environments required by this proposal are consistent with estimates of the relative Galactic event rates of core-collapse supernovae for the weak and main r-process and NSMs for the fission-recycling r-process.
Horneck, G
1995-01-01
The primary goal of exobiological research is to reach a better understanding of the processes leading to the origin, evolution and distribution of life on Earth or elsewhere in the universe. In this endeavour, scientists from a wide variety of disciplines are involved, such as astronomy, planetary research, organic chemistry, palaeontology and the various subdisciplines of biology including microbial ecology and molecular biology. Space technology plays an important part by offering the opportunity for exploring our solar system, for collecting extraterrestrial samples, and for utilizing the peculiar environment of space as a tool. Exobiological activities include comparison of the overall pattern of chemical evolution of potential precursors of life, in the interstellar medium, and on the planets and small bodies of our solar system; tracing the history of life on Earth back to its roots; deciphering the environments of the planets in our solar system and of their satellites, throughout their history, with regard to their habitability; searching for other planetary systems in our Galaxy and for signals of extraterrestrial civilizations; testing the impact of space environment on survivability of resistant life forms. This evolutionary approach towards understanding the phenomenon of life in the context of cosmic evolution may eventually contribute to a better understanding of the processes regulating the interactions of life with its environment on Earth.
Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline
Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur
2010-01-01
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408
Franks, Bernard J.
1987-01-01
Because of the widespread distribution of creosote in the environment, an abandoned wood-treatment plant in Pensacola, Fla., was selected by the U.S. Geological Survey Office of Hazardous Waste Hydrology as one of three national research demonstration areas in order to increase our understanding of hydrologic processes affecting the distributions of contaminants in ground water. The site was selected because of its long, uninterrupted history (1902-81) of discharging wastewaters to unlined surface impoundments, availability of a preliminary data base (Troutman and others, 1984), and the high probability of useful technology transfer from an investigation of the fate of organic compounds associated with wood-preserving wastewaters in the subsurface environment.
NASA Astrophysics Data System (ADS)
Niles, P. B.; Golden, D. C.; Michalski, J. R.; Ming, D. W.
2017-12-01
Sulfur concentrations in the Mars soils are elevated above 1 wt% in nearly every location visited by landed spacecraft. This observation was first made by the Viking landers, and has been confirmed by subsequent missions. The wide distribution of sulfur in martian soils has been attributed to volcanic degassing, formation of sulfate aerosols, and later incorporation into martian soils during gravitational sedimentation. However, later discoveries of more concentrated sulfur-bearing sediments by the Opportunity rover have led some to believe that sulfates may instead be a product of evaporation and aeolian redistribution. One question that has not been addressed is whether the modern surface conditions are too cold for weathering of volcanic materials by sulfate aerosols. We suggest here that mixtures of atmospheric aerosols, ice, and dust have the potential for creating small films of cryo-concentrated acidic solutions that may represent an important unexamined environment for understanding weathering processes on Mars. Laboratory experiments were conducted to simulate weathering of olivine under Mars-like conditions. The weathering rates measured in this study suggest that fine-grained olivine on Mars would weather into sulfate minerals in short time periods if it is exposed to H2SO4 aerosols at temperatures at or above -40°C. In this system, the strength of the acidic solution is maximized through eutectic freezing in an environment where the silicate minerals are extremely fine grained and have high surface areas. This provides an ideal environment for olivine weathering despite the very low temperatures. The likelihood of substantial sulfur-rich volcanism on Mars and creation of abundant sulfate aerosols suggests that this process would have been important during formation of martian soils and sediments. Future work modeling sulfur release rates during volcanic eruptions and aerosol distribution over the surface will help determine how well this process could concentrate sulfate minerals in nearby surface materials or whether this process would simply result in widespread globally distributed sulfur materials.
Telearch - Integrated visual simulation environment for collaborative virtual archaeology.
NASA Astrophysics Data System (ADS)
Kurillo, Gregorij; Forte, Maurizio
Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration among geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate remote presence of users. In this paper, we present several experimental case studies to demonstrate the integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.
Tsunami sediments and their grain size characteristics
NASA Astrophysics Data System (ADS)
Sulastya Putra, Purna
2018-02-01
The characteristics of tsunami deposits are very complex because deposition by a tsunami is a very complex process. The grain size characteristics of tsunami deposits are often generalized regardless of the local conditions in which deposition took place. The general characteristics are fining upward and landward, poor sorting, and a grain size distribution that is not unimodal. Here I review the grain size characteristics of tsunami deposits in various environments: swale, coastal marsh and lagoon/lake. The review shows that although there are similar characteristics in some environments and cases, in detail the characteristics in each environment can be distinguished; therefore, the tsunami deposit in each environment has its own characteristics. The local geological and geomorphological conditions of the environment may greatly affect the grain size characteristics.
Experimental extraction of an entangled photon pair from two identically decohered pairs.
Yamamoto, Takashi; Koashi, Masato; Ozdemir, Sahin Kaya; Imoto, Nobuyuki
2003-01-23
Entanglement is considered to be one of the most important resources in quantum information processing schemes, including teleportation, dense coding and entanglement-based quantum key distribution. Because entanglement cannot be generated by classical communication between distant parties, distribution of entangled particles between them is necessary. During the distribution process, entanglement between the particles is degraded by the decoherence and dissipation processes that result from unavoidable coupling with the environment. Entanglement distillation and concentration schemes are therefore needed to extract pairs with a higher degree of entanglement from these less-entangled pairs; this is accomplished using local operations and classical communication. Here we report an experimental demonstration of extraction of a polarization-entangled photon pair from two decohered photon pairs. Two polarization-entangled photon pairs are generated by spontaneous parametric down-conversion and then distributed through a channel that induces identical phase fluctuations to both pairs; this ensures that no entanglement is available as long as each pair is manipulated individually. Then, through collective local operations and classical communication we extract from the two decohered pairs a photon pair that is observed to be polarization-entangled.
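The reasoning behind the extraction can be illustrated with a standard textbook-style calculation (our notation, not the paper's derivation): collective phase noise washes out the entanglement of either pair alone, yet the two-pair state retains a component that acquires only a global phase and can therefore be recovered by collective local operations.

```latex
% Illustrative sketch, not the paper's formalism. A phase theta on the V
% component maps the Bell state |Phi+> = (|HH> + |VV>)/sqrt(2) to
\[
  |\Phi_\theta\rangle = \tfrac{1}{\sqrt{2}}\bigl(|HH\rangle + e^{i\theta}|VV\rangle\bigr),
\]
% and averaging over theta leaves no entanglement in a single pair. Two pairs
% subject to the SAME theta give
\[
  |\Phi_\theta\rangle \otimes |\Phi_\theta\rangle
  = \tfrac{1}{2}\Bigl[\,|HHHH\rangle
  + e^{i\theta}\bigl(|HHVV\rangle + |VVHH\rangle\bigr)
  + e^{2i\theta}|VVVV\rangle\Bigr].
\]
% The component (|HHVV> + |VVHH>)/sqrt(2) picks up only a global phase, so
% local operations and classical communication that project onto it yield one
% polarization-entangled pair independently of theta.
```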
Reward skewness coding in the insula independent of probability and loss
Tobler, Philippe N.
2011-01-01
Rewards in the natural environment are rarely predicted with complete certainty. Uncertainty relating to future rewards has typically been defined as the variance of the potential outcomes. However, the asymmetry of predicted reward distributions, known as skewness, constitutes a distinct but neuroscientifically underexplored risk term that may also have an impact on preference. By changing only reward magnitudes, we study skewness processing in equiprobable ternary lotteries involving only gains and constant probabilities, thus excluding probability distortion or loss aversion as mechanisms for skewness preference formation. We show that individual preferences are sensitive to not only the mean and variance but also to the skewness of predicted reward distributions. Using neuroimaging, we show that the insula, a structure previously implicated in the processing of reward-related uncertainty, responds to the skewness of predicted reward distributions. Some insula responses increased in a monotonic fashion with skewness (irrespective of individual skewness preferences), whereas others were similarly elevated to both negative and positive as opposed to no reward skew. These data support the notion that the asymmetry of reward distributions is processed in the brain and, taken together with replicated findings of mean coding in the striatum and variance coding in the cingulate, suggest that the brain codes distinct aspects of reward distributions in a distributed fashion. PMID:21849610
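Because the study varies only the reward magnitudes of equiprobable ternary lotteries, the three risk terms it separates can be written out explicitly; the formulas and the numeric example below use our own notation and are illustrative rather than taken from the paper.

```latex
% Moments of an equiprobable three-outcome lottery with gain magnitudes
% m_1, m_2, m_3 (each outcome has probability 1/3; notation is ours).
\[
  \mu = \frac{1}{3}\sum_{i=1}^{3} m_i, \qquad
  \sigma^2 = \frac{1}{3}\sum_{i=1}^{3} (m_i - \mu)^2, \qquad
  \gamma = \frac{\frac{1}{3}\sum_{i=1}^{3} (m_i - \mu)^3}{\sigma^{3}}
\]
% Example: the lotteries (2, 4, 9) and (1, 6, 8) share the same mean (5) and
% variance (26/3) but have skewness of opposite sign, so any difference in
% preference or neural response between them cannot be attributed to mean or
% variance alone.
```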
Command and Control of Joint Air Operations through Mission Command
2016-06-01
and outlines the C2 architecture systems, processes, and philosophy of command required to enable mission command effectively. Mission Command...General Dempsey highlights the fact that "trust is the moral sinew that binds the distributed Joint Force 2020 together" and observes that "unless...confident about how their subordinates will make decisions and adapt to the dynamic battlespace environment. Processes, Systems, and Philosophy of
Computer-Mediated Group Processes in Distributed Command and Control Systems
1988-06-01
Linville, Michael J. Liebhaber, and Richard W. Obermayer, Vreuls Corporation; Jon J. Fallesen, Army Research Institute ...control staffs who will operate in a computer-mediated environment. The Army Research Institute has initiated research to examine selected issues...computer-mediated group processes is needed. Procedure: The identification and selection of key research issues followed a three-step procedure. Previous
NASA Astrophysics Data System (ADS)
Onda, Y.; Kato, H.; Fukushima, T.; Wakahara, T.; Kita, K.; Takahashi, Y.; Sakaguchi, A.; Tanaka, K.; Yamashiki, Y.; Yoshida, N.
2012-12-01
After the Fukushima Daiichi Nuclear Power Plant accident, fallout radionuclides deposited on the ground surface will be transferred through geomorphic processes. Therefore, in order to estimate future changes in radionuclide deposition, the migration of radionuclides in forests, soils, ground water, and rivers, and their entrainment from trees and soils, must be quantified. Our group (FMWSE) was funded by MEXT (Japanese government), and follow-up monitoring has been conducted for about one year. 1. Migration of radionuclides in the natural environment, including forests and rivers: 1) depth distribution of radiocaesium in soils within forests, fields, and grassland; 2) radionuclide distribution and migration in forests; 3) radionuclide migration due to soil erosion under different land uses; 4) measurement of radionuclides entrained from the natural environment, including forests and soils. 2. Migration of radionuclides through the hydrological cycle, such as soil water, rivers, lakes and ponds, and ground water: 1) radionuclide migration through soil water, ground water, stream water, and spring water under different land uses; 2) paddy-to-river transfer of radionuclides through suspended sediment; 3) river-to-ocean transfer of radionuclides via suspended sediment; 4) radionuclide deposition in ponds and reservoirs. We will present how and where the fallout radionuclides are transferred through geomorphic processes.
Molina, Manuel; Mota, Manuel; Ramos, Alfonso
2015-01-01
This work deals with mathematical modeling through branching processes. We consider sexually reproducing animal populations where, in each generation, the number of progenitor couples is determined in a non-predictable environment. By using a class of two-sex branching processes, we describe their demographic dynamics and provide several probabilistic and inferential contributions. They include results about the extinction of the population and the estimation of the offspring distribution and its main moments. We also present an application to salmonid populations.
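A minimal simulation sketch of a two-sex branching process in a random environment is given below; the Poisson offspring law, the 50/50 sex ratio, the min(females, males) mating function, and all parameter values are illustrative assumptions, not the authors' model.

```python
import numpy as np

def simulate_two_sex_branching(generations=50, initial_couples=10, seed=1):
    """Toy two-sex branching process in a non-predictable environment.

    Each generation an environment ("good" or "bad" year) is drawn at random
    and sets the mean of a Poisson offspring distribution per couple;
    offspring are female with probability 0.5, and the next generation's
    progenitor couples are formed by the mating function min(females, males).
    """
    rng = np.random.default_rng(seed)
    couples = initial_couples
    history = [couples]
    for _ in range(generations):
        if couples == 0:                           # population is extinct
            history.append(0)
            continue
        mean_offspring = rng.choice([2.4, 0.8])    # randomly drawn environment
        offspring = rng.poisson(mean_offspring, size=couples).sum()
        females = rng.binomial(offspring, 0.5)
        males = offspring - females
        couples = int(min(females, males))         # mating: L(f, m) = min(f, m)
        history.append(couples)
    return history

if __name__ == "__main__":
    # Crude Monte Carlo estimate of the extinction probability.
    runs = [simulate_two_sex_branching(seed=s) for s in range(200)]
    extinct = sum(run[-1] == 0 for run in runs)
    print(f"estimated extinction probability: {extinct / len(runs):.2f}")
```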
Planning applications in image analysis
NASA Technical Reports Server (NTRS)
Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.
1994-01-01
We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.
Coagulation of dust particles in a plasma
NASA Technical Reports Server (NTRS)
Horanyi, M.; Goertz, C. K.
1990-01-01
The electrostatic charge of small dust grains in a plasma in which the temperature varies in time is discussed, pointing out that secondary electron emission might introduce charge separation. If the sign of the charge on small grains is opposite to that on big ones, enhanced coagulation can occur which will affect the size distribution of grains in a plasma. Two scenarios where this process might be relevant are considered: a hot plasma environment with temperature fluctuations and a cold plasma environment with transient heating events. The importance of the enhanced coagulation is uncertain, because the plasma parameters in grain-producing environments such as a molecular cloud or a protoplanetary disk are not known. It is possible, however, that this process is the most efficient mechanism for the growth of grains in the size range of 0.1-500 microns.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
Advanced Image Processing for NASA Applications
NASA Technical Reports Server (NTRS)
LeMoign, Jacqueline
2007-01-01
The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.
Wang, Y.; Boyd, E.; Crane, S.; Lu-Irving, P.; Krabbenhoft, D.; King, S.; Dighton, J.; Geesey, G.; Barkay, T.
2011-01-01
The distribution and phylogeny of extant protein-encoding genes recovered from geochemically diverse environments can provide insight into the physical and chemical parameters that led to the origin and which constrained the evolution of a functional process. Mercuric reductase (MerA) plays an integral role in mercury (Hg) biogeochemistry by catalyzing the transformation of Hg(II) to Hg(0). Putative merA sequences were amplified from DNA extracts of microbial communities associated with mats and sulfur precipitates from physicochemically diverse Hg-containing springs in Yellowstone National Park, Wyoming, using four PCR primer sets that were designed to capture the known diversity of merA. The recovery of novel and deeply rooted MerA lineages from these habitats supports previous evidence that indicates merA originated in a thermophilic environment. Generalized linear models indicate that the distribution of putative archaeal merA lineages was constrained by a combination of pH, dissolved organic carbon, dissolved total mercury and sulfide. The models failed to identify statistically well supported trends for the distribution of putative bacterial merA lineages as a function of these or other measured environmental variables, suggesting that these lineages were either influenced by environmental parameters not considered in the present study, or the bacterial primer sets were designed to target too broad of a class of genes which may have responded differently to environmental stimuli. The widespread occurrence of merA in the geothermal environments implies a prominent role for Hg detoxification in these environments. Moreover, the differences in the distribution of the merA genes amplified with the four merA primer sets suggests that the organisms putatively engaged in this activity have evolved to occupy different ecological niches within the geothermal gradient. © 2011 Springer Science+Business Media, LLC.
Wang, Yanping; Boyd, Eric; Crane, Sharron; Lu-Irving, Patricia; Krabbenhoft, David; King, Susan; Dighton, John; Geesey, Gill; Barkay, Tamar
2011-11-01
The distribution and phylogeny of extant protein-encoding genes recovered from geochemically diverse environments can provide insight into the physical and chemical parameters that led to the origin and which constrained the evolution of a functional process. Mercuric reductase (MerA) plays an integral role in mercury (Hg) biogeochemistry by catalyzing the transformation of Hg(II) to Hg(0). Putative merA sequences were amplified from DNA extracts of microbial communities associated with mats and sulfur precipitates from physicochemically diverse Hg-containing springs in Yellowstone National Park, Wyoming, using four PCR primer sets that were designed to capture the known diversity of merA. The recovery of novel and deeply rooted MerA lineages from these habitats supports previous evidence that indicates merA originated in a thermophilic environment. Generalized linear models indicate that the distribution of putative archaeal merA lineages was constrained by a combination of pH, dissolved organic carbon, dissolved total mercury and sulfide. The models failed to identify statistically well supported trends for the distribution of putative bacterial merA lineages as a function of these or other measured environmental variables, suggesting that these lineages were either influenced by environmental parameters not considered in the present study, or the bacterial primer sets were designed to target too broad of a class of genes which may have responded differently to environmental stimuli. The widespread occurrence of merA in the geothermal environments implies a prominent role for Hg detoxification in these environments. Moreover, the differences in the distribution of the merA genes amplified with the four merA primer sets suggests that the organisms putatively engaged in this activity have evolved to occupy different ecological niches within the geothermal gradient.
Software life cycle methodologies and environments
NASA Technical Reports Server (NTRS)
Fridge, Ernest
1991-01-01
Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology in the form of environments, such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an intelligent user interface for cost avoidance in setting up operational computer runs, a framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and in the form of methodologies, such as a method for developing fault-tolerant, distributed systems and a method for developing systems for common sense reasoning and for solving expert systems problems when only approximate truths are known.
Photosynthetic microbial mats in the 3,416-Myr-old ocean.
Tice, Michael M; Lowe, Donald R
2004-09-30
Recent re-evaluations of the geological record of the earliest life on Earth have led to the suggestion that some of the oldest putative microfossils and carbonaceous matter were formed through abiotic hydrothermal processes. Similarly, many early Archaean (more than 3,400-Myr-old) cherts have been reinterpreted as hydrothermal deposits rather than products of normal marine sedimentary processes. Here we present the results of a field, petrographic and geochemical study testing these hypotheses for the 3,416-Myr-old Buck Reef Chert, South Africa. From sedimentary structures and distributions of sand and mud, we infer that deposition occurred in normal open shallow to deep marine environments. The siderite enrichment that we observe in deep-water sediments is consistent with a stratified early ocean. We show that most carbonaceous matter was formed by photosynthetic mats within the euphotic zone and distributed as detrital matter by waves and currents to surrounding environments. We find no evidence that hydrothermal processes had any direct role in the deposition of either the carbonaceous matter or the enclosing sediments. Instead, we conclude that photosynthetic organisms had evolved and were living in a stratified ocean supersaturated in dissolved silica 3,416 Myr ago.
Photosynthetic microbial mats in the 3,416-Myr-old ocean
NASA Astrophysics Data System (ADS)
Tice, Michael M.; Lowe, Donald R.
2004-09-01
Recent re-evaluations of the geological record of the earliest life on Earth have led to the suggestion that some of the oldest putative microfossils and carbonaceous matter were formed through abiotic hydrothermal processes. Similarly, many early Archaean (more than 3,400-Myr-old) cherts have been reinterpreted as hydrothermal deposits rather than products of normal marine sedimentary processes. Here we present the results of a field, petrographic and geochemical study testing these hypotheses for the 3,416-Myr-old Buck Reef Chert, South Africa. From sedimentary structures and distributions of sand and mud, we infer that deposition occurred in normal open shallow to deep marine environments. The siderite enrichment that we observe in deep-water sediments is consistent with a stratified early ocean. We show that most carbonaceous matter was formed by photosynthetic mats within the euphotic zone and distributed as detrital matter by waves and currents to surrounding environments. We find no evidence that hydrothermal processes had any direct role in the deposition of either the carbonaceous matter or the enclosing sediments. Instead, we conclude that photosynthetic organisms had evolved and were living in a stratified ocean supersaturated in dissolved silica 3,416 Myr ago.
Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming
2011-01-01
This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots' search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot's detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection-diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method.
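A compact sketch of the two phases described above, fusing per-robot probability maps by a distance-weighted superposition and then using the fused map as the PSO fitness, is given below; the weighting rule, PSO coefficients, and all names are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def fuse_probability_maps(maps, robot_positions, grid_xy):
    """Fuse per-robot odor-source probability maps by distance-based
    superposition: cells near the estimating robot receive more weight.
    maps: list of (h, w) arrays; grid_xy: (h, w, 2) cell coordinates."""
    fused = np.zeros_like(maps[0])
    for prob_map, pos in zip(maps, robot_positions):
        dist = np.linalg.norm(grid_xy - np.asarray(pos), axis=-1)
        fused += prob_map / (1.0 + dist)
    return fused / fused.sum()

def pso_search(fitness_map, n_particles=20, iters=50, seed=0):
    """Minimal particle swarm over grid cells; the fused probability map acts
    as the fitness function that coordinates the robots' search."""
    rng = np.random.default_rng(seed)
    h, w = fitness_map.shape
    fit = lambda p: fitness_map[int(round(p[0])), int(round(p[1]))]
    pos = rng.uniform([0, 0], [h - 1, w - 1], size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fit(p) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, [0, 0], [h - 1, w - 1])
        vals = np.array([fit(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest  # estimated odor-source location in grid coordinates
```

In the iterative scheme the paper describes, the fused map would be re-estimated after each batch of detection events and the swarm re-run, so estimation and search inform each other.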
Meng, Qing-Hao; Yang, Wei-Xing; Wang, Yang; Zeng, Ming
2011-01-01
This paper addresses the collective odor source localization (OSL) problem in a time-varying airflow environment using mobile robots. A novel OSL methodology which combines odor-source probability estimation and multiple robots’ search is proposed. The estimation phase consists of two steps: firstly, the separate probability-distribution map of odor source is estimated via Bayesian rules and fuzzy inference based on a single robot’s detection events; secondly, the separate maps estimated by different robots at different times are fused into a combined map by way of distance based superposition. The multi-robot search behaviors are coordinated via a particle swarm optimization algorithm, where the estimated odor-source probability distribution is used to express the fitness functions. In the process of OSL, the estimation phase provides the prior knowledge for the searching while the searching verifies the estimation results, and both phases are implemented iteratively. The results of simulations for large-scale advection–diffusion plume environments and experiments using real robots in an indoor airflow environment validate the feasibility and robustness of the proposed OSL method. PMID:22346650
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.
2016-12-01
The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models and aircraft icing warnings, and to support aircraft field campaigns. Next generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution. The volume of data is expected to increase approximately tenfold. This increase in data volume will require additional IT resources to keep up with the processing demands to satisfy NRT requirements. In addition, these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both the AWS public cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.
Software/hardware distributed processing network supporting the Ada environment
NASA Astrophysics Data System (ADS)
Wood, Richard J.; Pryk, Zen
1993-09-01
A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC processor, VHSIC ASICs for high speed, reliable, inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network supports all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.
Semiautomatic mapping of permafrost in the Yukon Flats, Alaska
NASA Astrophysics Data System (ADS)
Gulbrandsen, Mats Lundh; Minsley, Burke J.; Ball, Lyndsay B.; Hansen, Thomas Mejer
2016-12-01
Thawing of permafrost due to global warming can have major impacts on hydrogeological processes, climate feedback, arctic ecology, and local environments. To understand these effects and processes, it is crucial to know the distribution of permafrost. In this study we exploit the fact that airborne electromagnetic (AEM) data are sensitive to the distribution of permafrost and demonstrate how the distribution of permafrost in the Yukon Flats, Alaska, is mapped in an efficient (semiautomatic) way, using a combination of supervised and unsupervised (machine) learning algorithms, i.e., Smart Interpretation and K-means clustering. Clustering is used to sort unfrozen and frozen regions, and Smart Interpretation is used to predict the depth of permafrost based on expert interpretations. This workflow allows, for the first time, a quantitative and objective approach to efficiently map permafrost based on large amounts of AEM data.
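A minimal sketch of the semiautomatic workflow is given below, assuming AEM resistivity profiles as input: K-means separates frozen from unfrozen soundings, and a simple regressor stands in for the Smart Interpretation step that predicts permafrost depth from expert picks. The two-cluster assumption, the "more resistive means frozen" heuristic, and the regressor choice are all ours, not the paper's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def classify_and_predict(resistivity, expert_depths=None, expert_idx=None):
    """Semiautomatic permafrost mapping sketch (assumed workflow).

    resistivity   : (n_soundings, n_layers) resistivities from AEM inversion
    expert_depths : optional expert-interpreted permafrost depths for a subset
    expert_idx    : indices of the soundings the expert interpreted
    """
    X = np.log10(resistivity)
    # Unsupervised step: K-means sorts soundings into two clusters, taken here
    # as unfrozen (conductive) and frozen (resistive) ground.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    frozen_label = labels[X.mean(axis=1).argmax()]   # most resistive sounding
    frozen = labels == frozen_label

    depths = None
    if expert_depths is not None:
        # Supervised stand-in for Smart Interpretation: learn a mapping from
        # the resistivity profile to the expert's permafrost-depth picks and
        # apply it to every sounding.
        model = LinearRegression().fit(X[expert_idx], expert_depths)
        depths = model.predict(X)
    return frozen, depths
```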
Semiautomatic mapping of permafrost in the Yukon Flats, Alaska
Gulbrandsen, Mats Lundh; Minsley, Burke J.; Ball, Lyndsay B.; Hansen, Thomas Mejer
2016-01-01
Thawing of permafrost due to global warming can have major impacts on hydrogeological processes, climate feedback, arctic ecology, and local environments. To understand these effects and processes, it is crucial to know the distribution of permafrost. In this study we exploit the fact that airborne electromagnetic (AEM) data are sensitive to the distribution of permafrost and demonstrate how the distribution of permafrost in the Yukon Flats, Alaska, is mapped in an efficient (semiautomatic) way, using a combination of supervised and unsupervised (machine) learning algorithms, i.e., Smart Interpretation and K-means clustering. Clustering is used to sort unfrozen and frozen regions, and Smart Interpretation is used to predict the depth of permafrost based on expert interpretations. This workflow allows, for the first time, a quantitative and objective approach to efficiently map permafrost based on large amounts of AEM data.
Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Marks, D. G.; Gurney, R. J.
2009-12-01
The spatial organization and scaling relationships of snow distribution in mountain environs are ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally-dependent radiation inputs. In large scale snow modeling it is vital to know these dependencies to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependency of distributed snowmelt simulations on their scaling. A base model simulation characterized these processes with 10 m resolution over a 14.0 km2 basin with an elevation range of 1474 - 2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - was independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time as they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.
A Distributed Leadership Change Process Model for Higher Education
ERIC Educational Resources Information Center
Jones, Sandra; Harvey, Marina
2017-01-01
The higher education sector operates in an increasingly complex global environment that is placing it under considerable stress and resulting in widespread change to the operating context and leadership of higher education institutions. The outcome has been the increased likelihood of conflict between academics and senior leaders, presaging the…
Factors Affecting Information Seeking and Evaluation in a Distributed Learning Environment
ERIC Educational Resources Information Center
Lee, Jae-Shin; Cho, Hichang
2011-01-01
The purpose of this study was to identify and analyze the processes of seeking information online and evaluating this information. We hypothesized that individuals' social network, in-out group categorization, and cultural proclivity would influence their online information-seeking behavior. Also, we tested whether individuals differentiated…
40 CFR 750.40 - Cross-examination.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Cross-examination. 750.40 Section 750... Processing and Distribution in Commerce Exemptions § 750.40 Cross-examination. (a) After the close of the... cross-examination. The request must be received by EPA within one week after a full transcript of the...
40 CFR 761.205 - Notification of PCB waste activity (EPA Form 7710-53).
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS PCB Waste Disposal Records and Reports § 761.205 Notification of... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Notification of PCB waste activity...
40 CFR 761.205 - Notification of PCB waste activity (EPA Form 7710-53).
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS PCB Waste Disposal Records and Reports § 761.205 Notification of... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Notification of PCB waste activity...
40 CFR 761.205 - Notification of PCB waste activity (EPA Form 7710-53).
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS PCB Waste Disposal Records and Reports § 761.205 Notification of... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Notification of PCB waste activity...
40 CFR 761.205 - Notification of PCB waste activity (EPA Form 7710-53).
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) TOXIC SUBSTANCES CONTROL ACT POLYCHLORINATED BIPHENYLS (PCBs) MANUFACTURING, PROCESSING, DISTRIBUTION IN COMMERCE, AND USE PROHIBITIONS PCB Waste Disposal Records and Reports § 761.205 Notification of... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Notification of PCB waste activity...
Vertical distribution of scab in large pecan trees
USDA-ARS?s Scientific Manuscript database
Pecan scab (caused by Fusicladium effusum) is a destructive disease of pecan (Carya illinoensis) grown in humid environments, such as the southeastern US. The disease can cause severe yield loss, and although much is known about the processes of dispersal and infection, there is no information on di...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Record. 750.34 Section 750.34... Processing and Distribution in Commerce Exemptions § 750.34 Record. (a) No later than the date of proposal of a rule subject to this subpart, a rulemaking record for that rule will be established. It will...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Record. 750.34 Section 750.34... Processing and Distribution in Commerce Exemptions § 750.34 Record. (a) No later than the date of proposal of a rule subject to this subpart, a rulemaking record for that rule will be established. It will...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Record. 750.34 Section 750.34... Processing and Distribution in Commerce Exemptions § 750.34 Record. (a) No later than the date of proposal of a rule subject to this subpart, a rulemaking record for that rule will be established. It will...
Hagen, R. W.; Ambos, H. D.; Browder, M. W.; Roloff, W. R.; Thomas, L. J.
1979-01-01
The Clinical Physiologic Research System (CPRS) developed from our experience in applying computers to medical instrumentation problems. This experience revealed a set of applications with a commonality in data acquisition, analysis, input/output, and control needs that could be met by a portable system. The CPRS demonstrates a practical methodology for integrating commercial instruments with distributed modular elements of local design in order to make facile responses to changing instrumentation needs in clinical environments.
Advanced algorithms for distributed fusion
NASA Astrophysics Data System (ADS)
Gelfand, A.; Smith, C.; Colony, M.; Bowman, C.; Pei, R.; Huynh, T.; Brown, C.
2008-03-01
The US Military has been undergoing a radical transition from a traditional "platform-centric" force to one capable of performing in a "Network-Centric" environment. This transformation will place all of the data needed to efficiently meet tactical and strategic goals at the warfighter's fingertips. With access to this information, the challenge of fusing data from across the battlespace into an operational picture for real-time Situational Awareness emerges. In such an environment, centralized fusion approaches will have limited application due to the constraints of real-time communications networks and computational resources. To overcome these limitations, we are developing a formalized architecture for fusion and track adjudication that allows the distribution of fusion processes over a dynamically created and managed information network. This network will support the incorporation and utilization of low level tracking information within the Army Distributed Common Ground System (DCGS-A) or Future Combat System (FCS). The framework is based on Bowman's Dual Node Network (DNN) architecture that utilizes a distributed network of interlaced fusion and track adjudication nodes to build and maintain a globally consistent picture across all assets.
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems
NASA Technical Reports Server (NTRS)
Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.
2000-01-01
The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many subdomains called blocks, and solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computations. In environments with computers of different architecture, operating systems, CPU speed, memory size, load, and network speed, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computing environment during execution time. More recently, these tools were extended to a second operating system: NT. In this paper, the problems associated with this application will be discussed. Also, the developed algorithms were combined with the load sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results will be presented on running a NASA based code ADPAC to demonstrate the developed tools for dynamic load balancing.
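A toy version of the block-to-processor assignment such tools perform is sketched below, using a greedy longest-processing-time heuristic with per-processor speeds; the cost model, parameter values, and names are illustrative assumptions, not the actual load balancing software.

```python
def balance_blocks(block_costs, proc_speeds):
    """Assign CFD blocks to heterogeneous processors so that the predicted
    wall-clock time (assigned cost divided by processor speed) stays as even
    as possible. Greedy longest-processing-time heuristic (illustrative)."""
    assignment = {p: [] for p in range(len(proc_speeds))}
    finish = [0.0] * len(proc_speeds)
    # Place the most expensive blocks first, each on the processor that would
    # finish it earliest given the work it already holds.
    for block, cost in sorted(enumerate(block_costs), key=lambda kv: -kv[1]):
        p = min(range(len(proc_speeds)),
                key=lambda i: finish[i] + cost / proc_speeds[i])
        assignment[p].append(block)
        finish[p] += cost / proc_speeds[p]
    return assignment, finish

if __name__ == "__main__":
    # 12 blocks with varying cell counts, 3 processors with relative speeds.
    blocks = [9.0, 7.5, 7.0, 6.0, 5.5, 4.0, 3.5, 3.0, 2.5, 2.0, 1.5, 1.0]
    speeds = [1.0, 0.7, 1.8]
    mapping, times = balance_blocks(blocks, speeds)
    print(mapping)
    print([round(t, 2) for t in times])
    # Re-running this periodically with measured block costs and machine loads
    # gives the "dynamic" part: blocks migrate as the environment changes.
```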
Parallel processing for scientific computations
NASA Technical Reports Server (NTRS)
Alkhatib, Hasan S.
1995-01-01
The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
Raster Data Partitioning for Supporting Distributed GIS Processing
NASA Astrophysics Data System (ADS)
Nguyen Thai, B.; Olasz, A.
2015-08-01
Big data concepts have already made an impact in the geospatial sector. Several studies apply techniques that originated in computer science to the GIS processing of huge amounts of geospatial data, while other studies consider geospatial data to have always been big data (Lee and Kang, 2015). Data acquisition methods have improved substantially, increasing not only the amount of raw data but also its spectral, spatial and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). From the increasing volume of raw data, in differing formats, representations and purposes, the wealth of derived information is what represents the truly valuable result. However, computing capability and processing speed face real limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Lately, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing likewise requires appropriate processing algorithms to be distributed in order to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capabilities for non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms to the Map-Reduce model, and GIS data cannot be partitioned like text-based data by lines or bytes. Hence, we look for an alternative solution for data partitioning, data distribution and execution of existing algorithms without rewriting them, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as of GIS (raster) data partitioning, distribution and distributed processing of GIS algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution and processing, and the first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on the application area, so data partitioning can be considered a preprocessing step before applying processing services to the data. As a proof of concept we have implemented a simple tile-based partitioning method that splits an image into smaller grids (NxM tiles) and compared the processing time to existing methods using NDVI calculation. The concept is demonstrated using our own open source processing framework.
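A minimal sketch of the tile-based partitioning used in the proof of concept, splitting an image into an NxM grid of independent tiles and computing NDVI per tile, is shown below; the band order, function names, and reassembly scheme are illustrative assumptions, not the framework's actual code.

```python
import numpy as np

def split_into_tiles(raster, n_rows, n_cols):
    """Split a (bands, height, width) raster into an n_rows x n_cols grid,
    yielding (row, col, tile) tuples that can be shipped to different workers.
    Edge tiles absorb the remainder pixels."""
    _, h, w = raster.shape
    row_edges = np.linspace(0, h, n_rows + 1, dtype=int)
    col_edges = np.linspace(0, w, n_cols + 1, dtype=int)
    for i in range(n_rows):
        for j in range(n_cols):
            yield i, j, raster[:, row_edges[i]:row_edges[i + 1],
                               col_edges[j]:col_edges[j + 1]]

def ndvi(tile, red_band=0, nir_band=1):
    """Per-tile NDVI = (NIR - red) / (NIR + red); band indices are assumptions."""
    red = tile[red_band].astype(float)
    nir = tile[nir_band].astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-9)

# Tiles are independent, so they can be handed to a process pool or to remote
# workers; the mosaic is reassembled from the (row, col) keys afterwards.
if __name__ == "__main__":
    image = np.random.randint(0, 255, size=(2, 1000, 1500), dtype=np.uint16)
    results = {(i, j): ndvi(t) for i, j, t in split_into_tiles(image, 4, 6)}
    print(len(results), "tiles processed")
```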
Implications of movement for species distribution models - Rethinking environmental data tools.
Bruneel, Stijn; Gobeyn, Sacha; Verhelst, Pieterjan; Reubens, Jan; Moens, Tom; Goethals, Peter
2018-07-01
Movement is considered an essential process in shaping the distributions of species. Nevertheless, most species distribution models (SDMs) still focus solely on environment-species relationships to predict the occurrence of species. Furthermore, the currently used indirect estimates of movement allow assessment of habitat accessibility, but do not provide an accurate description of movement. Better proxies of movement are needed to assess the dispersal potential of individual species and to gain a more practical insight into the interconnectivity of communities. Telemetry techniques are rapidly evolving and highly capable of providing explicit descriptions of movement, but their usefulness for SDMs will mainly depend on the ability of these models to deal with hitherto unconsidered ecological processes. More specifically, the integration of movement is likely to affect the environmental data requirements as the connection between environmental and biological data is crucial to provide reliable results. Mobility implies the occupancy of a continuum of space, hence an adequate representation of both geographical and environmental space is paramount to study mobile species distributions. In this context, environmental models, remote sensing techniques and animal-borne environmental sensors are discussed as potential techniques to obtain suitable environmental data. In order to provide an in-depth review of the aforementioned methods, we have chosen to use the modelling of fish distributions as a case study. The high mobility of fish and the often highly variable nature of the aquatic environment generally complicate model development, making it an adequate subject for research. Furthermore, insight into the distribution of fish is of great interest for fish stock assessments and water management worldwide, underlining its practical relevance. Copyright © 2018 Elsevier B.V. All rights reserved.
GSHR-Tree: a spatial index tree based on dynamic spatial slot and hash table in grid environments
NASA Astrophysics Data System (ADS)
Chen, Zhanlong; Wu, Xin-cai; Wu, Liang
2008-12-01
Computational Grids enable the coordinated sharing of large-scale distributed heterogeneous computing resources that can be used to solve computationally intensive problems in science, engineering, and commerce. Grid spatial applications are made possible by high-speed networks and a new generation of Grid middleware that resides between networks and traditional GIS applications. The integration of multi-source, heterogeneous spatial information, the management of distributed spatial resources, and the sharing and cooperative use of spatial data and Grid services are the key problems to resolve in the development of Grid GIS. The performance of the spatial index mechanism is a key technology of Grid GIS and spatial databases, and it affects the overall performance of a GIS in Grid environments. In order to improve the efficiency of parallel processing of massive spatial data in a distributed parallel grid computing environment, this paper presents GSHR-Tree, a new grid slot hash parallel spatial index structure. Based on a hash table and dynamic spatial slots, it improves on the structure of the classical parallel R-tree index and makes full use of the good qualities of both the R-tree and the hash data structure. The resulting parallel spatial index can meet the needs of parallel grid computing over massive spatial data in distributed networks. The algorithm splits space into multiple slots by multiplying and reverting and maps these slots to sites in the distributed, parallel system. Each site builds the spatial objects in its slots into an R-tree. On the basis of this tree structure, the index data are distributed among multiple nodes in the grid network using a large-node R-tree method. Load imbalance during processing can be quickly corrected by a dynamic adjustment algorithm. The tree structure accounts for the distribution, replication, and transfer of the spatial index in the grid environment, and its design ensures load balance during parallel computation, making it well suited to parallel processing of spatial information in distributed network environments. Instead of the recursive comparison of spatial objects used by the original R-tree, the algorithm builds the spatial index with binary code operations, which computers execute more efficiently, and uses extended dynamic hash codes for bit comparison. In GSHR-Tree, a new server is assigned to the network whenever a split of a full node is required. We describe a flexible allocation protocol that copes with a temporary shortage of storage resources. It uses a distributed balanced binary spatial tree that scales with insertions to potentially any number of storage servers through splits of the overloaded ones. An application manipulates the GSHR-Tree structure from a node in the grid environment and addresses the tree through an image that splits can make outdated; the resulting addressing errors are resolved by forwarding among the servers. In this paper, a spatial index data distribution algorithm that limits the number of servers is also proposed; it improves storage utilization at the cost of additional messages. We believe this grid spatial index scheme can fit the needs of new applications that use ever larger sets of spatial data.
Our proposal constitutes a flexible storage allocation method for a distributed spatial index. The insertion policy can be tuned dynamically to cope with periods of storage shortage; in such cases storage balancing should be favored for better space utilization, at the price of extra message exchanges between servers. The structure strikes a compromise between updating the duplicated index and transferring the spatial index data. To meet the needs of grid computing, GSHR-Tree has a flexible structure that can satisfy new needs in the future. GSHR-Tree provides R-tree capabilities for large spatial datasets stored over interconnected servers. The analysis, including the experiments, confirmed the efficiency of our design choices, and the scheme should fit the needs of new applications using ever larger spatial datasets. Using the system response time of parallel spatial range query processing as the performance metric, simulation experiments demonstrate the sound design and high performance of the proposed indexing structure.
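As a rough illustration of the slot-and-hash idea, the sketch below derives a bit code for a point by recursive halving of the bounding box (a stand-in for the "multiplying and reverting" split described above) and hashes that code to a server, which would then hold the slot's objects in its own local R-tree; this is an assumption-laden sketch, not the GSHR-Tree algorithm itself.

```python
import hashlib

def slot_code(x, y, bounds, depth=8):
    """Binary slot code of a point, obtained by recursively halving the
    bounding box while alternating the split axis (illustrative only)."""
    xmin, ymin, xmax, ymax = bounds
    bits = []
    for level in range(depth):
        if level % 2 == 0:                      # split in x
            mid = (xmin + xmax) / 2.0
            bits.append(1 if x >= mid else 0)
            xmin, xmax = (mid, xmax) if x >= mid else (xmin, mid)
        else:                                   # split in y
            mid = (ymin + ymax) / 2.0
            bits.append(1 if y >= mid else 0)
            ymin, ymax = (mid, ymax) if y >= mid else (ymin, mid)
    return "".join(map(str, bits))

def slot_to_server(code, n_servers):
    """Hash a slot's bit code to one of the grid servers; each server would
    index the objects of its slots in a local R-tree."""
    digest = hashlib.sha1(code.encode()).hexdigest()
    return int(digest, 16) % n_servers

# Example: route two objects in a 0..100 x 0..100 world to 4 servers.
if __name__ == "__main__":
    world = (0.0, 0.0, 100.0, 100.0)
    for pt in [(12.5, 80.0), (73.0, 21.0)]:
        code = slot_code(*pt, bounds=world)
        print(pt, "-> slot", code, "-> server", slot_to_server(code, 4))
```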
Ejecta Production and Properties
NASA Astrophysics Data System (ADS)
Williams, Robin
2017-06-01
The interaction of an internal shock with the free surface of a dense material leads to the production of jets of particulate material from the surface into its environment. Understanding the processes which control the production of these jets -- both their occurrence, and properties such as the mass, velocity, and particle size distribution of material injected -- has been a topic of active research at AWE for over 50 years. I will discuss the effect of material physics, such as strength and spall, on the production of ejecta, drawing on experimental history and recent calculations, and consider the processes which determine the distribution of particle sizes which result as ejecta jets break up. British Crown Owned Copyright 2017/AWE.
Urbanization and Greenhouse Gas Emissions from Industry
NASA Astrophysics Data System (ADS)
Didenko, N. I.; Skripnuk, D. F.; Mirolyubova, O. V.
2017-06-01
This article analyses the global environment. It describes the processes that characterize the global environment and suggests specific indicators that can be used to measure changes in it. Cities and all urbanized territories have a negative effect on the global environment. Originally, the authors wanted to call the article «City as a source of destruction of the global environment», but they took into account the fact that urbanization contributes to improving the economic efficiency of the state: cities are centers of economic, cultural and informational potential that provide a «breakthrough» in the development of the economy. The article assesses the impact of urbanization on the global environment. For the analysis of the impact of urbanization on the natural habitat, an autoregressive distributed lag (ADL) model is chosen.
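For reference, a general autoregressive distributed lag specification of the kind mentioned, written in our own notation with an environmental indicator y_t and an urbanization measure x_t (the lag orders and symbols are illustrative, not the article's estimated model), is:

```latex
% ADL(p, q): the indicator depends on its own lags and on current plus lagged
% values of the urbanization measure.
\[
  y_t = \alpha + \sum_{i=1}^{p} \phi_i \, y_{t-i}
        + \sum_{j=0}^{q} \beta_j \, x_{t-j} + \varepsilon_t
\]
% The implied long-run effect of urbanization on the indicator is
% \bigl(\sum_{j=0}^{q} \beta_j\bigr) / \bigl(1 - \sum_{i=1}^{p} \phi_i\bigr).
```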
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi
Enacting a supply-chain process involves various partners and different IT systems. REST is receiving increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Based on this meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. In this way, a unified resource representation and RESTful service descriptions are obtained for more effective cross-system integration. A case study illustrates the approach, and its desirable features are discussed.
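A much-reduced sketch of the control-flow side of such a model is given below: a plain place/transition net fired over a hypothetical order life cycle. The real XML-net additionally carries XML documents on its tokens and maps states to RESTful resources, which is only noted in comments; all names here are our own examples.

```python
class PetriNet:
    """Minimal place/transition net (a stand-in for the XML-net described
    above; XML-nets also attach XML documents to tokens)."""

    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical supply-chain fragment: an order resource moves through states
# that would be exposed as RESTful resources (e.g. /orders/42 with a state field).
if __name__ == "__main__":
    net = PetriNet({"order_placed": 1})
    net.add_transition("confirm", ["order_placed"], ["order_confirmed"])
    net.add_transition("ship", ["order_confirmed"], ["order_shipped"])
    net.fire("confirm")
    net.fire("ship")
    print(net.marking)   # one token ends up in 'order_shipped'
```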
Kylin, Henrik; Svensson, Teresia; Jensen, Sören; Strachan, William M J; Franich, Robert; Bouwman, Hindrik
2017-10-01
The production and use of pentachlorophenol (PCP) was recently prohibited/restricted by the Stockholm Convention on persistent organic pollutants (POPs), but environmental data are few and of varying quality. We here present the first extensive dataset of the continent-wide (Eurasia and Canada) occurrence of PCP and its methylation product pentachloroanisole (PCA) in the environment, specifically in pine needles. The highest concentrations of PCP were found close to expected point sources, while PCA chiefly shows a northern and/or coastal distribution not correlating with PCP distribution. Although long-range transport and environmental methylation of PCP or formation from other precursors cannot be excluded, the distribution patterns suggest that such processes may not be the only source of PCA to remote regions and unknown sources should be sought. We suggest that natural sources, e.g., chlorination of organic matter in Boreal forest soils enhanced by chloride deposition from marine sources, should be investigated as a possible partial explanation of the observed distributions. The results show that neither PCA nor total PCP (ΣPCP = PCP + PCA) should be used to approximate the concentrations of PCP; PCP and PCA must be determined and quantified separately to understand their occurrence and fate in the environment. The background work shows that the accumulation of airborne POPs in plants is a complex process. The variations in life cycles and physiological adaptations have to be taken into account when using plants to evaluate the concentrations of POPs in remote areas. Copyright © 2017 Elsevier Ltd. All rights reserved.
Collaborative environments for capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2005-05-01
Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading edge program in developing distributed collaborative technologies targeted to the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.
Werner, Nicole E; Jolliff, Anna F; Casper, Gail; Martell, Thomas; Ponto, Kevin
2018-08-01
Managing chronic illness requires personal health information management (PHIM) to be performed by lay individuals. Paramount to understanding the PHIM process is understanding the sociotechnical system in which it frequently occurs: the home environment. We combined distributed cognition theory and the patient work system model to investigate how characteristics of the home interact with the cognitive work of PHIM. We used a 3D virtual reality CAVE that enabled participants who had been diagnosed with diabetes (N = 20) to describe how they would perform PHIM in the home context. We found that PHIM is distinctly cognitive work, and rarely performed 'in the head'. Rather, features of the physical environment, tasks, people, and tools and technologies present, continuously shape and are shaped by the PHIM process. We suggest that approaches in which the individual (sans context) is considered the relevant unit of analysis overlook the pivotal role of the environment in shaping PHIM. Practitioner Summary: We examined how Personal Health Information Management (PHIM) is performed in the homes of diabetic patients. We found that approaches to studying cognition that focus on the individual, to the exclusion of their context, overlook the pivotal role of environmental, social, and technological features in shaping PHIM.
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing enables researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet and apply it to large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily allow visitors to volunteer their computer resources toward running advanced hydrological models and simulations. Because the system is web based, users can start volunteering their computational resources within seconds, without installing any software. The framework splits the model simulation into small spatial and computational units and distributes them to thousands of nodes. A relational database system manages data connections and the task queue for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
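A minimal sketch of the server-side queue management such a platform needs, with SQLite standing in for the relational database mentioned in the abstract; the table layout, column names, and task payloads are hypothetical.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tasks (
    id INTEGER PRIMARY KEY,
    payload TEXT,                 -- e.g. JSON describing a sub-basin and time window
    status TEXT DEFAULT 'queued'  -- queued -> assigned -> done
)""")
db.executemany("INSERT INTO tasks (payload) VALUES (?)",
               [(f"subbasin-{i}",) for i in range(5)])

def assign_next_task():
    """Hand the oldest queued task to a volunteer browser, if any."""
    row = db.execute(
        "SELECT id, payload FROM tasks WHERE status='queued' ORDER BY id LIMIT 1"
    ).fetchone()
    if row:
        db.execute("UPDATE tasks SET status='assigned' WHERE id=?", (row[0],))
    return row

def complete_task(task_id):
    """Mark a finished simulation unit as done."""
    db.execute("UPDATE tasks SET status='done' WHERE id=?", (task_id,))

task = assign_next_task()
print("assigned:", task)
complete_task(task[0])
```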
NASA Astrophysics Data System (ADS)
Li, Jun-Wei; Cao, Jun-Wei
2010-04-01
One challenge in large-scale scientific data analysis is monitoring data in real time in a distributed environment. For the LIGO (Laser Interferometer Gravitational-wave Observatory) project, a dedicated suite of data monitoring tools (DMT) has been developed, offering good extensibility to new data types and high flexibility in a distributed environment. Several services are provided, including visualization of data in various forms and file output of monitoring results. In this work, a DMT monitor, OmegaMon, is developed for tracking statistics of gravitational-wave (GW) burst triggers generated by a specific GW burst data analysis pipeline, the Omega Pipeline. Such results provide diagnostic information as a reference for trigger post-processing and interferometer maintenance.
Random heteropolymers preserve protein function in foreign environments
NASA Astrophysics Data System (ADS)
Panganiban, Brian; Qiao, Baofu; Jiang, Tao; DelRe, Christopher; Obadia, Mona M.; Nguyen, Trung Dac; Smith, Anton A. A.; Hall, Aaron; Sit, Izaac; Crosby, Marquise G.; Dennis, Patrick B.; Drockenmuller, Eric; Olvera de la Cruz, Monica; Xu, Ting
2018-03-01
The successful incorporation of active proteins into synthetic polymers could lead to a new class of materials with functions found only in living systems. However, proteins rarely function under the conditions suitable for polymer processing. On the basis of an analysis of trends in protein sequences and characteristic chemical patterns on protein surfaces, we designed four-monomer random heteropolymers to mimic intrinsically disordered proteins for protein solubilization and stabilization in non-native environments. The heteropolymers, with optimized composition and statistical monomer distribution, enable cell-free synthesis of membrane proteins with proper protein folding for transport and enzyme-containing plastics for toxin bioremediation. Controlling the statistical monomer distribution in a heteropolymer, rather than the specific monomer sequence, affords a new strategy to interface with biological systems for protein-based biomaterials.
Design of supply chain in fuzzy environment
NASA Astrophysics Data System (ADS)
Rao, Kandukuri Narayana; Subbaiah, Kambagowni Venkata; Singh, Ganja Veera Pratap
2013-05-01
Nowadays, customer expectations are increasing and organizations must operate in an uncertain environment. In this uncertain environment, the ultimate success of the firm depends on its ability to integrate business processes among supply chain partners. Supply chain management emphasizes cross-functional links to improve the competitive strategy of organizations. Companies are moving from decoupled decision processes towards more integrated design and control of their components to achieve strategic fit. In this paper, a new approach is developed to design a multi-echelon, multi-facility, multi-product supply chain in a fuzzy environment. At the strategic level, a mixed integer programming problem is formulated through fuzzy goal programming, with supply chain cost and volume flexibility as fuzzy goals; these fuzzy goals are aggregated using the minimum operator. At the tactical level, continuous review policies for controlling raw material inventories in the supplier echelon and finished product inventories in the plant and distribution center echelons are treated as fuzzy goals, and a non-linear programming model is formulated through fuzzy goal programming using the minimum operator. The proposed approach is illustrated with a numerical example.
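A compact statement of the minimum-operator (max-min) aggregation used when several fuzzy goals are combined. The symbols are generic (membership functions for minimize-type goals such as cost), not the paper's exact notation.

```latex
% Max-min aggregation of K fuzzy goals g_k over decision set X:
\max_{x \in X} \; \lambda
\quad \text{s.t.} \quad \lambda \le \mu_k(x), \; k = 1,\dots,K,
\qquad
\mu_k(x) =
\begin{cases}
1, & g_k(x) \le \ell_k, \\
\dfrac{u_k - g_k(x)}{u_k - \ell_k}, & \ell_k < g_k(x) < u_k, \\
0, & g_k(x) \ge u_k,
\end{cases}
```

Here lambda is the satisfaction level of the least-satisfied goal, and ell_k and u_k are the aspiration and tolerance limits of fuzzy goal k; for maximize-type goals such as volume flexibility the membership function is mirrored.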
The Spatial Distributions and Variations of Water Environmental Risk in Yinma River Basin, China.
Di, Hui; Liu, Xingpeng; Zhang, Jiquan; Tong, Zhijun; Ji, Meichen
2018-03-15
Water environmental risk is the probability of the occurrence of events caused by human activities or the interaction of human activities and natural processes that will damage a water environment. This study proposed a water environmental risk index (WERI) model to assess the water environmental risk in the Yinma River Basin based on hazards, exposure, vulnerability, and regional management ability indicators in a water environment. The data for each indicator were gathered from 2000, 2005, 2010, and 2015 to assess the spatial and temporal variations in water environmental risk using particle swarm optimization and the analytic hierarchy process (PSO-AHP) method. The results showed that the water environmental risk in the Yinma River Basin decreased from 2000 to 2015. The risk level of the water environment was high in Changchun, while the risk levels in Yitong and Yongji were low. The research methods provide information to support future decision making by the risk managers in the Yinma River Basin, which is in a high-risk water environment. Moreover, water environment managers could reduce the risks by adjusting the indicators that affect water environmental risks.
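A minimal sketch of the AHP side of the PSO-AHP weighting: deriving indicator weights from a pairwise comparison matrix via its principal eigenvector. The 3x3 matrix values are made up for illustration, and the PSO step used in the paper to refine the comparisons is not shown.

```python
import numpy as np

# Hypothetical pairwise comparisons among hazard, exposure, and vulnerability
# indicators on the Saaty 1-9 scale (values are illustrative only).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized indicator weights

CI = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
print("weights:", np.round(w, 3), "CI:", round(CI, 3))
```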
NASA Astrophysics Data System (ADS)
Ott, Stephan; Herschel Science Ground Segment Consortium
2010-05-01
The Herschel Space Observatory, the fourth cornerstone mission in the ESA science program, was launched on 14 May 2009. With a 3.5 m telescope, it is the largest space telescope ever launched. Herschel's three instruments (HIFI, PACS, and SPIRE) perform photometry and spectroscopy in the 55 - 672 micron range and will deliver exciting science for the astronomical community during at least three years of routine observations. Since 2 December 2009, Herschel has been performing and processing observations in routine science mode. The development of the Herschel Data Processing System started eight years ago to support the data analysis for Instrument Level Tests. To fulfil the expectations of the astronomical community, additional resources were made available to implement a freely distributable Data Processing System capable of interactively and automatically reducing Herschel data at different processing levels. The system combines data retrieval, pipeline execution and scientific analysis in a single environment. The Herschel Interactive Processing Environment (HIPE) is the user-friendly face of Herschel Data Processing. The software is coded in Java and Jython to be platform independent and to avoid the need for commercial licenses. It is distributed under the GNU Lesser General Public License (LGPL), permitting everyone to access and to re-use its code. We will summarise the current capabilities of the Herschel Data Processing System and give an overview of future development milestones and plans, and of how the astronomical community can contribute to HIPE. The Herschel Data Processing System is a joint development by the Herschel Science Ground Segment Consortium, consisting of ESA, the NASA Herschel Science Center, and the HIFI, PACS and SPIRE consortium members.
A distributed pipeline for DIDSON data processing
Li, Liling; Danner, Tyler; Eickholt, Jesse; McCann, Erin L.; Pangle, Kevin; Johnson, Nicholas
2018-01-01
Technological advances in the field of ecology allow data on ecological systems to be collected at high resolution, both temporally and spatially. Devices such as Dual-frequency Identification Sonar (DIDSON) can be deployed in aquatic environments for extended periods and easily generate several terabytes of underwater surveillance data which may need to be processed multiple times. Due to the large amount of data generated and need for flexibility in processing, a distributed pipeline was constructed for DIDSON data making use of the Hadoop ecosystem. The pipeline is capable of ingesting raw DIDSON data, transforming the acoustic data to images, filtering the images, detecting and extracting motion, and generating feature data for machine learning and classification. All of the tasks in the pipeline can be run in parallel and the framework allows for custom processing. Applications of the pipeline include monitoring migration times, determining the presence of a particular species, estimating population size and other fishery management tasks.
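A minimal sketch of the kind of frame-differencing motion extraction step the pipeline describes, written here as plain functions rather than Hadoop jobs; the threshold, array shapes, and feature set are illustrative assumptions.

```python
import numpy as np

def extract_motion(frames, threshold=10.0):
    """Given a stack of acoustic-image frames (T, H, W), return per-frame
    binary motion masks by differencing consecutive frames."""
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))   # frame-to-frame change
    return diffs > threshold                  # simple fixed threshold

def motion_features(masks):
    """Reduce each motion mask to a small feature vector (area, centroid row, centroid col)."""
    feats = []
    for m in masks:
        ys, xs = np.nonzero(m)
        area = ys.size
        feats.append((area,
                      ys.mean() if area else -1.0,
                      xs.mean() if area else -1.0))
    return np.array(feats)

frames = np.random.rand(5, 64, 48) * 20        # stand-in for DIDSON frames
print(motion_features(extract_motion(frames)).shape)  # (4, 3)
```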
A Systematic Survey of Star Formation with the ORION MIDEX Mission
NASA Astrophysics Data System (ADS)
Scowen, P.; Morse, J.; Beasley, M.; Hester, J.; Windhorst, R.; Desch, S.; Jansen, R.; Calzetti, D.; Padgett, D.; Hartigan, P.; Oey, S.; Bally, J.; Gallagher, J.; O'Connell, R.; Kennicutt, R.; Lauer, T.
2004-05-01
The ORION MIDEX mission is a 1.2m UV-visual observatory orbiting at L2 that will conduct the first-ever high spatial resolution survey of a statistically significant sample of visible star-forming environments in the Solar neighborhood in emission lines and continuum. This survey will be used to characterize the star and planet forming environments within 2.5 kpc of the Sun, infer global properties and star formation history in these regions, understand how the environment influences the process of star and planet formation, and develop a classification scheme for star forming regions incorporating the earlier results. Based on these findings we will then conduct a similar high spatial resolution survey of large portions of the Magellanic Clouds, applying the classification scheme from local star forming environments to analogous regions in nearby galaxies, extending the classification scheme to regions that do not have nearby analogs but are common in external galaxies. The results from the local survey will allow us to infer characteristics of low mass star forming environments in the Magellanic Clouds, study the spatial distribution of star forming environments and analyze stellar population photometry to trace star formation history. Finally we will image a representative sample of external galaxies using the same filters used to characterize nearby star formation regions. We will map the distribution of star forming region type as a function of galactic environment for galaxies out to 5 Mpc to infer the distribution and history of low-mass star formation over galactic scales, characterize the stellar content and star formation history of galaxies, and relate these results to the current star forming environments in these galaxies. Ultimately we intend to use these diagnostics to extrapolate to star formation environments in the higher redshift Universe. We will also present an update on the technology development, project planning and operations for the proposed mission.
Space-based Observations of Star Formation using ORION: THE MIDEX
NASA Astrophysics Data System (ADS)
Scowen, P.; Morse, J.; Beasley, M.; Hester, J.; Windhorst, R.; Jansen, R.; Lauer, T.; Danielson, E.; Sepulveda, C.; Olarte, G.; ORION MIDEX Science Team
2003-12-01
The ORION MIDEX mission is a 1.2m UV-visual observatory orbiting at L2 that will conduct the first-ever high spatial resolution survey of a statistically significant sample of visible star-forming environments in the Solar neighborhood in emission lines and continuum. This survey will be used to characterize the star and planet forming environments within 2.5 kpc of the Sun, infer global properties and star formation history in these regions, understand how the environment influences the process of star and planet formation, and develop a classification scheme for star forming regions incorporating the earlier results. Based on these findings we will then conduct a similar high spatial resolution survey of large portions of the Magellanic Clouds, applying the classification scheme from local star forming environments to analogous regions in nearby galaxies, extending the classification scheme to regions that do not have nearby analogs but are common in external galaxies. The results from the local survey will allow us to infer characteristics of low mass star forming environments in the Magellanic Clouds, study the spatial distribution of star forming environments and analyze stellar population photometry to trace star formation history. Finally we will image a representative sample of external galaxies using the same filters used to characterize nearby star formation regions. We will map the distribution of star forming region type as a function of galactic environment for galaxies out to 5 Mpc to infer the distribution and history of low-mass star formation over galactic scales, characterize the stellar content and star formation history of galaxies, and relate these results to the current star forming environments in these galaxies. Ultimately we intend to use these diagnostics to extrapolate to star formation environments in the higher redshift Universe. We will also present details on technology development, project planning and operations for the proposed mission.
ORION: Hierarchical Space-based Observations of Star Formation, From Near to Far
NASA Astrophysics Data System (ADS)
Scowen, P. A.; Morse, J. A.; Beasley, M.; Veach, T.; ORION Science Team
2005-12-01
The ORION MIDEX mission is a 1.2m UV-visual observatory orbiting at L2 that will conduct the first-ever high spatial resolution survey of a statistically significant sample of visible star-forming environments in the Solar neighborhood in emission lines and continuum. This survey will be used to characterize the star and planet forming environments within 2.5 kpc of the Sun, infer global properties and star formation history in these regions, understand how the environment influences the process of star and planet formation, and develop a classification scheme for star forming regions incorporating the earlier results. Based on these findings we will then conduct a similar high spatial resolution survey of large portions of the Magellanic Clouds, applying the classification scheme from local star forming environments to analogous regions in nearby galaxies, extending the classification scheme to regions that do not have nearby analogs but are common in external galaxies. The results from the local survey will allow us to infer characteristics of low mass star forming environments in the Magellanic Clouds, study the spatial distribution of star forming environments and analyze stellar population photometry to trace star formation history. Finally we will image a representative sample of external galaxies using the same filters used to characterize nearby star formation regions. We will map the distribution of star forming region type as a function of galactic environment for galaxies out to 5 Mpc to infer the distribution and history of low-mass star formation over galactic scales, characterize the stellar content and star formation history of galaxies, and relate these results to the current star forming environments in these galaxies. Ultimately we intend to use these diagnostics to extrapolate to star formation environments in the higher redshift Universe. We will also present details on technology development, project planning and operations for the proposed mission.
A Systematic Survey of Star Formation with the ORION MIDEX Mission
NASA Astrophysics Data System (ADS)
Scowen, P.; Morse, J.; Beasley, M.; Hester, J.; Windhorst, R.; Desch, S.; Jansen, R.; Calzetti, D.; Padgett, D.; Hartigan, P.; Oey, S.; Bally, J.; Gallagher, J.; O'Connell, R.; Kennicutt, R.; Lauer, T.; McCaughrean, M.
2004-12-01
The ORION MIDEX mission is a 1.2m UV-visual observatory orbiting at L2 that will conduct the first-ever high spatial resolution survey of a statistically significant sample of visible star-forming environments in the Solar neighborhood in emission lines and continuum. This survey will be used to characterize the star and planet forming environments within 2.5 kpc of the Sun, infer global properties and star formation history in these regions, understand how the environment influences the process of star and planet formation, and develop a classification scheme for star forming regions incorporating the earlier results. Based on these findings we will then conduct a similar high spatial resolution survey of large portions of the Magellanic Clouds, applying the classification scheme from local star forming environments to analogous regions in nearby galaxies, extending the classification scheme to regions that do not have nearby analogs but are common in external galaxies. The results from the local survey will allow us to infer characteristics of low mass star forming environments in the Magellanic Clouds, study the spatial distribution of star forming environments and analyze stellar population photometry to trace star formation history. Finally we will image a representative sample of external galaxies using the same filters used to characterize nearby star formation regions. We will map the distribution of star forming region type as a function of galactic environment for galaxies out to 5 Mpc to infer the distribution and history of low-mass star formation over galactic scales, characterize the stellar content and star formation history of galaxies, and relate these results to the current star forming environments in these galaxies. Ultimately we intend to use these diagnostics to extrapolate to star formation environments in the higher redshift Universe. We will also present an update on the technology development, project planning and operations for the proposed mission.
Decentralized Control of Scheduling in Distributed Systems.
1983-03-18
the job scheduling algorithm adapts to the changing busyness of the various hosts in the system. The environment in which the job scheduling entities...resources and processes that constitute the node and a set of interfaces for accessing these processes and resources. The structure of a node could change ...parallel. Chang [CHNG82] has also described some algorithms for detecting properties of general graphs by traversing paths in a graph in parallel. One of
Converting customer expectations into achievable results.
Landis, G A
1999-11-01
It is not enough in today's environment to just meet customers' expectations--we must exceed them. Therefore, one must learn what constitutes those expectations. These expectations have expanded during the past few years: it is no longer enough to manufacture the product and judge the outcome from a provincial standpoint. Now we must understand and satisfy the entire supply chain. Managing this process and satisfying the customer now involves the supplier, the manufacturer, and the entire distribution system.
Design & implementation of distributed spatial computing node based on WPS
NASA Astrophysics Data System (ADS)
Liu, Liping; Li, Guoqing; Xie, Jibo
2014-03-01
Current research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in the grid environment, while the importance of spatial computing resources is ignored. To enable the sharing and cooperation of spatial computing resources in the grid environment, this paper systematically investigates the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and verified in this environment.
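A minimal sketch of how a client might probe such a WPS-based Spatial Computing Node. The endpoint URL is a placeholder, but the query parameters follow the standard OGC WPS 1.0.0 key-value encoding.

```python
import requests  # third-party: pip install requests

WPS_ENDPOINT = "http://example.org/spatial-node/wps"  # placeholder endpoint

def get_capabilities():
    """Ask the node which geoprocessing operations it offers."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "GetCapabilities",
    }
    resp = requests.get(WPS_ENDPOINT, params=params, timeout=30)
    resp.raise_for_status()
    return resp.text  # XML capabilities document listing the available processes

# print(get_capabilities())  # would only succeed against a real WPS endpoint
```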
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.
2017-12-01
Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.
Pan, Tony; Flick, Patrick; Jain, Chirag; Liu, Yongchao; Aluru, Srinivas
2017-10-09
Counting and indexing fixed length substrings, or k-mers, in biological sequences is a key step in many bioinformatics tasks including genome alignment and mapping, genome assembly, and error correction. While advances in next generation sequencing technologies have dramatically reduced the cost and improved latency and throughput, few bioinformatics tools can efficiently process the datasets at the current generation rate of 1.8 terabases every 3 days. We present Kmerind, a high performance parallel k-mer indexing library for distributed memory environments. The Kmerind library provides a set of simple and consistent APIs with sequential semantics and parallel implementations that are designed to be flexible and extensible. Kmerind's k-mer counter performs similarly or better than the best existing k-mer counting tools even on shared memory systems. In a distributed memory environment, Kmerind counts k-mers in a 120 GB sequence read dataset in less than 13 seconds on 1024 Xeon CPU cores, and fully indexes their positions in approximately 17 seconds. Querying for 1% of the k-mers in these indices can be completed in 0.23 seconds and 28 seconds, respectively. Kmerind is the first k-mer indexing library for distributed memory environments, and the first extensible library for general k-mer indexing and counting. Kmerind is available at https://github.com/ParBLiSS/kmerind.
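A minimal illustration of what k-mer counting and position indexing mean, in plain Python rather than Kmerind's C++/MPI API (which this sketch does not attempt to reproduce).

```python
from collections import Counter

def count_kmers(seq, k):
    """Count all overlapping substrings of length k in a DNA sequence."""
    seq = seq.upper()
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def build_index(seq, k):
    """Map each k-mer to the list of positions where it starts."""
    index = {}
    for i in range(len(seq) - k + 1):
        index.setdefault(seq[i:i + k], []).append(i)
    return index

reads = "ACGTACGTGACG"
print(count_kmers(reads, 4).most_common(3))
print(build_index(reads, 4)["ACGT"])  # [0, 4]
```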
An analysis of the orbital distribution of solid rocket motor slag
NASA Astrophysics Data System (ADS)
Horstman, Matthew F.; Mulrooney, Mark
2009-01-01
The contribution by solid rocket motors (SRMs) to the orbital debris environment is potentially significant and insufficiently studied. Design and combustion processes can lead to the emission of enough by-products to warrant assessment of their contribution to orbital debris. These particles are formed during SRM tail-off, or burn termination, by the rapid solidification of molten Al2O3 slag accumulated during the burn. The propensity of SRMs to generate particles larger than 100μm raises concerns regarding the debris environment. Sizes as large as 1 cm have been witnessed in ground tests, and comparable sizes have been estimated via observations of sub-orbital tail-off events. Utilizing previous research we have developed more sophisticated size distributions and modeled the time evolution of resultant orbital populations using a historical database of SRM launches, propellant, and likely location and time of tail-off. This analysis indicates that SRM ejecta is a significant component of the debris environment.
The Impact Ejecta Environment of Near Earth Asteroids
NASA Astrophysics Data System (ADS)
Szalay, Jamey R.; Horányi, Mihály
2016-10-01
Impact ejecta production is a ubiquitous process that occurs on all airless bodies throughout the solar system. Unlike the Moon, which retains a large fraction of its ejecta, asteroids primarily shed their ejecta into the interplanetary dust population. These grains carry valuable information about the chemical compositions of their parent bodies that can be measured via in situ dust detection. Here, we use recent Lunar Atmosphere and Dust Environment Explorer/Lunar Dust Experiment measurements of the lunar dust cloud to calculate the dust ejecta distribution for any airless body near 1 au. We expect this dust distribution to be highly asymmetric, due to non-isotropic impacting fluxes. We predict that flybys near these asteroids would collect many times more dust impacts by transiting the apex side of the body compared to its anti-apex side. While these results are valid for bodies at 1 au, they can be used to qualitatively infer the ejecta environment for all solar-orbiting airless bodies.
Separation and reconstruction of high pressure water-jet reflective sound signal based on ICA
NASA Astrophysics Data System (ADS)
Yang, Hongtao; Sun, Yuling; Li, Meng; Zhang, Dongsu; Wu, Tianfeng
2011-12-01
The impact of a high pressure water-jet on targets of different materials produces different reflective mixed sounds. In order to accurately reconstruct the distribution of reflective sound signals along the linear detecting line and to effectively separate the environmental noise, the mixed sound signals acquired by a linear microphone array were processed by ICA. The basic principle of ICA and the FastICA algorithm are described in detail. A simulation experiment was designed: the environmental noise was simulated with band-limited white noise, the reflective sound signal was simulated with a pulse signal, and the attenuation of the reflective signal over different transmission distances was simulated by weighting the signal with different coefficients. The mixed sound signals of the linear microphone array were synthesized from these simulated signals and then whitened and separated by ICA. The results verified that environmental noise separation and reconstruction of the sound distribution along the detecting line can be realized effectively.
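A minimal sketch of the separation step using scikit-learn's FastICA on simulated two-channel mixtures; the sources, mixing matrix, and signal shapes are synthetic and only illustrate the method, not the paper's experimental setup.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)

# Synthetic sources: a pulse-like "reflective" signal and Gaussian noise
# standing in for the environmental noise.
pulse = (np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 2 * t) > 0.9)).reshape(-1, 1)
noise = rng.normal(0, 0.3, size=(t.size, 1))
S = np.hstack([pulse, noise])

# Mix the sources as if recorded by two microphones at different distances.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)      # estimated independent components
print(S_est.shape)                # (2000, 2): separated signal estimates
```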
Zhang, Pan; Hu, Rijun; Zhu, Longhai; Wang, Peng; Yin, Dongxiao; Zhang, Lianjie
2017-06-15
The contents of heavy metals (Cu, Pb, Cr, Cd and As) in surface sediments from western Laizhou Bay were analysed to evaluate their spatial distribution pattern and contamination level. As was mainly concentrated in the coastal area near the estuaries, while the other metals were mainly concentrated in the central part of the study area. Overall, the heavy metals were present at unpolluted levels as evaluated by sediment quality guidelines and the geoaccumulation index. Principal component analysis suggests that Cu, Pb and Cd were mainly sourced from natural processes and As was mainly derived from anthropogenic inputs, while Cr originated from both natural processes and anthropogenic contributions. Tidal currents, sediments and human activities were important factors affecting the distribution of heavy metals. The heavy metal environment was divided into four subareas to provide a reference for understanding the distribution and pollution of heavy metals in the estuary-bay system. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Distributed Control System Prototyping Environment to Support Control Room Modernization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lew, Roger Thomas; Boring, Ronald Laurids; Ulrich, Thomas Anthony
Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs; however, the set of tools for developing and designing HMIs is still in its infancy. Here we propose a rapid prototyping approach for integrating proposed HMIs into their native environments before a design is finalized. This approach allows researchers and developers to test design ideas and eliminate design flaws prior to fully developing the new system. We illustrate this approach with four prototype designs developed using Microsoft’s Windows Presentation Foundation (WPF). One example is integrated into a microworld environment to test the functionality of the design and identify the optimal level of automation for a new system in a nuclear power plant. The other three examples are integrated into a full-scale, glasstop digital simulator of a nuclear power plant. One example demonstrates the capabilities of next generation control concepts; another aims to expand the current state of the art; lastly, an HMI prototype was developed as a test platform for a new control system currently in development at U.S. nuclear power plants. WPF possesses several characteristics that make it well suited to HMI design. It provides a tremendous amount of flexibility, agility, robustness, and extensibility. Distributed control system (DCS) specific environments tend to focus on the safety and reliability requirements for real-world interfaces and consequently have less emphasis on providing functionality to support novel interaction paradigms. Because of WPF’s large user-base, Microsoft can provide an extremely mature tool. Within process control applications, WPF is platform independent and can communicate with popular full-scope process control simulator vendor plant models and DCS platforms.
Harris, M. Camille; Pearce, John M.; Prosser, Diann J.; White, C. LeAnn; Miles, A. Keith; Sleeman, Jonathan M.; Brand, Christopher J.; Cronin, James P.; De La Cruz, Susan; Densmore, Christine L.; Doyle, Thomas W.; Dusek, Robert J.; Fleskes, Joseph P.; Flint, Paul L.; Guala, Gerald F.; Hall, Jeffrey S.; Hubbard, Laura E.; Hunt, Randall J.; Ip, Hon S.; Katz, Rachel A.; Laurent, Kevin W.; Miller, Mark P.; Munn, Mark D.; Ramey, Andy M.; Richards, Kevin D.; Russell, Robin E.; Stokdyk, Joel P.; Takekawa, John Y.; Walsh, Daniel P.
2016-08-18
Through the Science Strategy for Highly Pathogenic Avian Influenza (HPAI) in Wildlife and the Environment, the USGS will assess avian influenza (AI) dynamics in an ecological context to inform decisions made by resource managers and policymakers from the local to national level. Through collection of unbiased scientific information on the ecology of AI viruses and wildlife hosts in a changing world, the U.S. Geological Survey (USGS) will enhance the development of AI forecasting tools and ensure this information is integrated with a quality decision process for managing HPAI. The overall goal of this USGS Science Strategy for HPAI in Wildlife and the Environment goes beyond documenting the occurrence and distribution of AI viruses in wild birds. The USGS aims to understand the epidemiological processes and environmental factors that influence HPAI distribution and describe the mechanisms of transmission between wild birds and poultry. USGS scientists developed a conceptual model describing the process linking HPAI dispersal in wild waterfowl to the outbreaks in poultry. This strategy focuses on five long-term science goals, which include: Science Goal 1: Augment the National HPAI Surveillance Plan; Science Goal 2: Determine mechanisms of HPAI disease spread in wildlife and the environment; Science Goal 3: Characterize HPAI viruses circulating in wildlife; Science Goal 4: Understand implications of avian ecology on HPAI spread; and Science Goal 5: Develop HPAI forecasting and decision-making tools. These goals will help define and describe the processes outlined in the conceptual model with the ultimate goal of facilitating biosecurity and minimizing transfer of diseases across the wildlife-poultry interface. The first four science goals are focused on scientific discovery and the fifth goal is application-based. Decision analyses in the fifth goal will guide prioritization of proposed actions in the first four goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, Kelley M.; Wilcots, Eric M., E-mail: hess@ast.uct.ac.za, E-mail: ewilcots@astro.wisc.edu
We present an analysis of the neutral hydrogen (H I) content and distribution of galaxies in groups as a function of their parent dark matter halo mass. The Arecibo Legacy Fast ALFA survey α.40 data release allows us, for the first time, to study the H I properties of over 740 galaxy groups in the volume of sky common to the Sloan Digital Sky Survey (SDSS) and ALFALFA surveys. We assigned ALFALFA H I detections a group membership based on an existing magnitude/volume-limited SDSS Data Release 7 group/cluster catalog. Additionally, we assigned group 'proximity' membership to H I detected objects whose optical counterpart falls below the limiting optical magnitude—thereby not contributing substantially to the estimate of the group stellar mass, but significantly to the total group H I mass. We find that only 25% of the H I detected galaxies reside in groups or clusters, in contrast to approximately half of all optically detected galaxies. Further, we plot the relative positions of optical and H I detections in groups as a function of parent dark matter halo mass to reveal strong evidence that H I is being processed in galaxies as a result of the group environment: as optical membership increases, groups become increasingly deficient of H I rich galaxies at their center and the H I distribution of galaxies in the most massive groups starts to resemble the distribution observed in comparatively more extreme cluster environments. We find that the lowest H I mass objects lose their gas first as they are processed in the group environment, and it is evident that the infall of gas rich objects is important to the continuing growth of large scale structure at the present epoch, replenishing the neutral gas supply of groups. Finally, we compare our results to those of cosmological simulations and find that current models cannot simultaneously predict the H I selected halo occupation distribution for both low and high mass halos.
[Study of changes in Chinese herbal medicine distribution channel].
Lv, Hua; Yang, Guang; Huang, Lu-Qi
2014-07-01
The distribution channels of Chinese herbal medicines have been changing. From the Han to the Ming Dynasty, Chinese herbal medicines were mainly carried to urban markets by dealers or farmers. From the Ming Dynasty to the founding of new China, distribution was primarily mediated by township "bazaars" and national distribution centers with fixed locations and regular trading hours. In the planned economy period, the state-owned herbal medicine company was the sole intermediary, holding a monopoly position. From the mid-1980s to the end of the last century, the planned economy and the market economy co-existed. Entering the 21st century, producing areas have become prominent in the distribution channels. The presence or absence and the rise or fall of different types of distribution markets run through the whole evolution of the distribution channels and serve as an important clue. The changes were motivated by the economic considerations of the channel participants, which in turn originated from commodity characteristics and changes in the social environment.
Comparing host and target environments for distributed Ada programs
NASA Technical Reports Server (NTRS)
Paulk, Mark C.
1986-01-01
The Ada programming language provides a means of specifying logical concurrency by using multitasking. Extending the Ada multitasking concurrency mechanism into a physically concurrent distributed environment which imposes its own requirements can lead to incompatibilities. These problems are discussed. Using distributed Ada for a target system may be appropriate, but when using the Ada language in a host environment, a multiprocessing model may be more suitable than retargeting an Ada compiler for the distributed environment. The tradeoffs between multitasking on distributed targets and multiprocessing on distributed hosts are discussed. Comparisons of the multitasking and multiprocessing models indicate different areas of application.
Dusty Plasmas in Planetary Magnetospheres Award
NASA Technical Reports Server (NTRS)
Horanyi, Mihaly
2005-01-01
This is my final report for the grant Dusty Plasmas in Planetary Magnetospheres. The funding from this grant supported our research on dusty plasmas to study: a) dust plasma interactions in general plasma environments, and b) dusty plasma processes in planetary magnetospheres (Earth, Jupiter and Saturn). We have developed a general purpose transport code in order to follow the spatial and temporal evolution of dust density distributions in magnetized plasma environments. The code allows the central body to be represented by a multipole expansion of its gravitational and magnetic fields. The density and the temperature of the possibly many-component plasma environment can be pre-defined as a function of coordinates and, if necessary, the time as well. The code simultaneously integrates the equations of motion with the equations describing the charging processes. The charging currents are dependent not only on the instantaneous plasma parameters but on the velocity, as well as on the previous charging history of the dust grains.
Sedimentary controls on modern sand grain coat formation
NASA Astrophysics Data System (ADS)
Dowey, Patrick J.; Worden, Richard H.; Utley, James; Hodgson, David M.
2017-05-01
Coated sand grains can influence reservoir quality evolution during sandstone diagenesis. Porosity can be reduced and fluid flow restricted where grain coats encroach into pore space. Conversely pore-lining grain coats can restrict the growth of pore-filling quartz cement in deeply buried sandstones, and thus can result in unusually high porosity in deeply buried sandstones. Being able to predict the distribution of coated sand grains within petroleum reservoirs is thus important to help find good reservoir quality. Here we report a modern analogue study of 12 sediment cores from the Anllóns Estuary, Galicia, NW Spain, collected from a range of sub-environments, to help develop an understanding of the occurrence and distribution of coated grains. The cores were described for grain size, bioturbation and sedimentary structures, and then sub-sampled for electron and light microscopy, laser granulometry, and X-ray diffraction analysis. The Anllóns Estuary is sand-dominated with intertidal sand flats and saltmarsh environments at the margins; there is a shallowing/fining-upwards trend in the estuary-fill succession. Grain coats are present in nearly every sample analysed; they are between 1 μm and 100 μm thick and typically lack internal organisation. The extent of grain coat coverage can exceed 25% in some samples with coverage highest in the top 20 cm of cores. Samples from muddy intertidal flat and the muddy saltmarsh environments, close to the margins of the estuary, have the highest coat coverage (mean coat coverage of 20.2% and 21.3%, respectively). The lowest mean coat coverage occurs in the sandy saltmarsh (10.4%), beyond the upper tidal limit and sandy intertidal flat environments (8.4%), close to the main estuary channel. Mean coat coverage correlates with the concentration of clay fraction. The primary controls on the distribution of fine-grained sediment, and therefore grain coat distribution, are primary sediment transport and deposition processes that concentrate the clay fraction in the sediment towards the margins of the estuary. Bioturbation and clay illuviation/mechanical infiltration are secondary processes that may redistribute fine-grained sediment and produce grain coats. Here we have shown that detrital grain coats are more likely in marginal environments of ancient estuary-fills, which are typically found in the fining-upward part of progradational successions.
Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)
NASA Astrophysics Data System (ADS)
Raskovic, Dejan
Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.
NASA Astrophysics Data System (ADS)
Chasse, Kevin Robert
Duplex stainless steels (DSS) generally have superior strength and corrosion resistance as compared to most standard austenitic and ferritic stainless grades owing to a balanced microstructure of austenite and ferrite. As a result of having favorable properties, DSS have been selected for the construction of equipment in pulp and paper, chemical processing, nuclear, oil and gas as well as other industries. The use of DSS has been restricted in some cases because of stress corrosion cracking (SCC), which can initiate and grow in either the ferrite or austenite phase depending on the environment. Thorough understanding of SCC mechanisms of DSS in chloride- and hydrogen sulfide-containing solutions has been useful for material selection in many environments. However, understanding of SCC mechanisms of DSS in sulfide-containing caustic solutions is limited, which has restricted the capacity to optimize process and equipment design in pulp and paper environments. Process environments may contain different concentrations of hydroxide, sulfide, and chloride, altering corrosion and SCC susceptibility of each phase. Crack initiation and growth behavior will also change depending on the relative phase distribution and properties of austenite and ferrite. The role of microstructure and environment on the SCC of standard grade UNS S32205 and lean grade UNS S32101 in hot alkaline-sulfide solution were evaluated in this work using electrochemical, film characterization, mechanical testing, X-ray diffraction, and microscopy techniques. Microstructural aspects, which included residual stress state, phase distribution, phase ratio, and microhardness, were related to the propensity for SCC crack initiation in different simulated alkaline pulping liquors at 170 °C. Other grades of DSS and reference austenitic and superferritic grades of stainless steel were studied using exposure coupons for comparison to understand compositional effects and individual phase susceptibility. Environments having different ionic concentrations of inorganic salts, i.e. sodium hydroxide, sodium sulfide, and sodium chloride, were used to understand the effect of liquor alkalinity, percent sulfidity, and chloride content on the corrosion and SCC behavior. Hydrogen embrittlement of S32205 was studied to understand the electrochemical conditions and fracture features associated with this failure mode. The results showed that there is an appreciable increase in the susceptibility of DSS to SCC in the presence of sulfide and chloride in hot alkaline environments. Sulfide and chloride adsorption at active sites on the metal surface caused unstable passivity and defective film formation. Chloride and sulfide available at the electrolyte/film surface reduced the charge transfer resistance and shifted the response of the films to lower frequencies indicating the films became more defective. The surface films had an outer, discontinuous layer, and an inner, barrier layer. Fe, Mo, and Mn were selectively dissolved in hot alkaline environments. The onset of SCC was related to the extent of selective dissolution and was consistent with a slip-step dissolution mechanism. Selective corrosion of the austenite phase depended on percent sulfidity and liquor alkalinity. Chlorides enhanced crack initiation and coalescence along the austenite/ferrite boundaries. Crack initiation and transgranular growth strongly depended on the phase distribution in the banded microstructure of DSS. 
These findings will augment understanding of SCC in this alloy-environment combination and facilitate materials selection in hot alkaline-sulfide environments, particularly in the petrochemical, nuclear, chemical processing, and pulp and paper industries.
NASA Astrophysics Data System (ADS)
Oliva, Marc; Ruiz-Fernández, Jesús
2015-04-01
Elephant Point constitutes an ice-free environment of only 1.16 km2 in the south-western corner of Livingston Island (South Shetland Islands, Antarctica). In January 2014 we conducted detailed geomorphological mapping in situ, examining the distribution of processes and landforms in Elephant Point. Four main geomorphological environments were identified: proglacial area, moraine system, bedrock plateaus and marine terraces. The ice cap covering most of the western half of this island has retreated significantly during recent decades, in parallel with the accelerated warming trend recorded in the Antarctic Peninsula. Between 1956 and 2010 this rapid retreat exposed 17.3% of the present-day land surface in Elephant Point. Two of these geomorphological units are located in this new ice-free area: a polygenic moraine stretching from the western to the eastern edges of the peninsula and a relatively flat proglacial environment. The glacier sat next to the northern slope of the moraine in 1956, but the retreat of the Rotch dome glacier during the last decades left these environments free of glacier ice. Following the deglaciation, the postglacial dynamics in these areas showed the characteristic response of paraglacial systems. Very different geomorphological processes occur today on the northern and southern slopes of the moraine, which is related to the different stages of paraglacial adjustment on the two sides. The southern slope shows low to moderate activity of slope processes operating on coarser sediments that have built pronival ramparts, debris flows and alluvial fans. By contrast, mass wasting processes are very active on the northern slope, which is composed of fine-grained unconsolidated sediments. Here, ice-rich permafrost has been observed in slumps degrading the moraine. The sediments of the moraine are being mobilized down-slope in large amounts by landslides and slumps. Up to 9.6% of the surface of the moraine is affected by retrogressive-thaw slumps. Other features indicative of the degradation of the ground ice were found in Elephant Point, such as the kettle lakes distributed in the hummocky terrain between the moraine ridges and in the proglacial environment. It is expected that paraglacial processes and permafrost degradation will continue in this maritime permafrost environment in the near future, though their intensity and extent will depend on the future climate conditions prevailing in the northern Antarctic Peninsula. Acknowledgements: This research was financially supported by the HOLOANTAR project (Holocene environmental change in the Maritime Antarctic. Interactions between permafrost and the lacustrine environment, reference PTDC/CTE-GIX/119582/2010) funded by the Portuguese Science Foundation as well as by the PROPOLAR (Portuguese Polar Program).
An approach for modelling snowcover ablation and snowmelt runoff in cold region environments
NASA Astrophysics Data System (ADS)
Dornes, Pablo Fernando
Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomenon; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models, a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation. Application of the same modelling approach at a larger scale using the same landscape based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape based parameters are a feasible regionalisation framework for distributed and physically based models. In summary, the proposed modelling philosophy, based on the combination of an inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.
Simulating neutron star mergers as r-process sources in ultrafaint dwarf galaxies
NASA Astrophysics Data System (ADS)
Safarzadeh, Mohammadtaher; Scannapieco, Evan
2017-10-01
To explain the high observed abundances of r-process elements in local ultrafaint dwarf (UFD) galaxies, we perform cosmological zoom simulations that include r-process production from neutron star mergers (NSMs). We model star formation stochastically and simulate two different haloes with total masses ≈108 M⊙ at z = 6. We find that the final distribution of [Eu/H] versus [Fe/H] is relatively insensitive to the energy by which the r-process material is ejected into the interstellar medium, but strongly sensitive to the environment in which the NSM event occurs. In one halo, the NSM event takes place at the centre of the stellar distribution, leading to high levels of r-process enrichment such as seen in a local UFD, Reticulum II (Ret II). In a second halo, the NSM event takes place outside of the densest part of the galaxy, leading to a more extended r-process distribution. The subsequent star formation occurs in an interstellar medium with shallow levels of r-process enrichment that results in stars with low levels of [Eu/H] compared to Ret II stars even when the maximum possible r-process mass is assumed to be ejected. This suggests that the natal kicks of neutron stars may also play an important role in determining the r-process abundances in UFD galaxies, a topic that warrants further theoretical investigation.
ERIC Educational Resources Information Center
McDonald, Jacquie; McPhail, Janelle; Maguire, Michael; Millett, Bruce
2004-01-01
The University of Southern Queensland (USQ), Australia, has more than 25 years' experience in distributed education. More recently, USQ has embraced information and communication technologies to deliver learning resources in a more integrated and interactive environment to on-campus and external students. To understand the complex reactions of…
Internet Wargaming with Distributed Processing Using the Client-Server Model
1997-03-01
in for war game development. There are tool kits for writing binary files that are interpreted by a particular plug-in. The most popular plug-in set...multi-player game development, the speed with which the environment is changing should be taken into account. For this project JavaScript was chosen
40 CFR 761.35 - Storage for reuse.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 CFR Protection of Environment, Vol. 32 (2012-07-01): Storage for reuse, § 761.35... Manufacturing, Processing, Distribution in Commerce, and Use of PCBs and PCB Items, § 761.35 Storage for reuse... to the EPA Regional Administrator at least 6 months before the 5-year storage for reuse period...
40 CFR 761.35 - Storage for reuse.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 CFR Protection of Environment, Vol. 31 (2011-07-01): Storage for reuse, § 761.35... Manufacturing, Processing, Distribution in Commerce, and Use of PCBs and PCB Items, § 761.35 Storage for reuse... to the EPA Regional Administrator at least 6 months before the 5-year storage for reuse period...
40 CFR 761.35 - Storage for reuse.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 CFR Protection of Environment, Vol. 31 (2014-07-01): Storage for reuse, § 761.35... Manufacturing, Processing, Distribution in Commerce, and Use of PCBs and PCB Items, § 761.35 Storage for reuse... to the EPA Regional Administrator at least 6 months before the 5-year storage for reuse period...
40 CFR 761.35 - Storage for reuse.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 CFR Protection of Environment, Vol. 32 (2013-07-01): Storage for reuse, § 761.35... Manufacturing, Processing, Distribution in Commerce, and Use of PCBs and PCB Items, § 761.35 Storage for reuse... to the EPA Regional Administrator at least 6 months before the 5-year storage for reuse period...
Sensors Technology and Advanced Signal Processing Concepts for Layered Warfare/Layered Sensing
2010-04-01
for challenged environments will require contributions from many diverse technical disciplines across AFRL, the Air Force and beyond. By providing a...
The Information Environment. Education and Curriculum Series No. 3. Syllabus for IST 501.
ERIC Educational Resources Information Center
Taylor, Robert S.
This syllabus outlines a graduate-level introductory overview of the agencies, industries, and services whose primary concerns are the creation, processing, storage, distribution, and use of information; also considered are questions relating to technological impact, the role of the information professional, and cost-benefits. The course is…
USDA-ARS?s Scientific Manuscript database
Listeria monocytogenes is widely distributed in the environment. The ubiquitous nature of this bacterium can result in contamination of foods. Listeriosis is a food-borne disease caused by consumption of L. monocytogenes-contaminated food. It is a public health problem of low incidence but high mort...
Szoboszlai, Z; Kertész, Zs; Szikszai, Z; Angyal, A; Furu, E; Török, Zs; Daróczi, L; Kiss, A Z
2012-02-15
In this case study, the elemental composition and mass size distribution of indoor aerosol particles were determined in a working environment where soldering of printed circuit boards (PCB) took place. Single particle analysis using ion and electron microscopy was carried out to obtain more detailed and reliable data about the origin of these particles. As a result, outdoor and indoor aerosol sources such as wave soldering, fluxing processes, workers' activity, mineral dust, biomass burning, fertilizing and other anthropogenic sources could be separated. With the help of scanning electron microscopy, characteristic particle types were identified. On the basis of the mass size distribution data, a stochastic lung deposition model was used to calculate the total and regional deposition efficiencies of the different types of particles within the human respiratory system. The information presented in this study aims to give insights into the detailed characteristics and the health impact of aerosol particles in a working environment where different kinds of soldering activity take place. Copyright © 2011 Elsevier B.V. All rights reserved.
Algorithms and programming tools for image processing on the MPP, part 2
NASA Technical Reports Server (NTRS)
Reeves, Anthony P.
1986-01-01
A number of algorithms were developed for image warping and pyramid image filtering. Techniques were investigated for the parallel processing of a large number of independent irregular shaped regions on the MPP. In addition some utilities for dealing with very long vectors and for sorting were developed. Documentation pages for the algorithms which are available for distribution are given. The performance of the MPP for a number of basic data manipulations was determined. From these results it is possible to predict the efficiency of the MPP for a number of algorithms and applications. The Parallel Pascal development system, which is a portable programming environment for the MPP, was improved and better documentation including a tutorial was written. This environment allows programs for the MPP to be developed on any conventional computer system; it consists of a set of system programs and a library of general purpose Parallel Pascal functions. The algorithms were tested on the MPP and a presentation on the development system was made to the MPP users group. The UNIX version of the Parallel Pascal System was distributed to a number of new sites.
NASA Astrophysics Data System (ADS)
Nair, B. G.; Winter, N.; Daniel, B.; Ward, R. M.
2016-07-01
Direct measurement of the flow of electric current during VAR is extremely difficult due to the aggressive environment, as the arc process itself controls the distribution of current. In previous studies the technique of “magnetic source tomography” was presented; this was shown to be effective but it used a computationally intensive iterative method to analyse the distribution of arc centre position. In this paper we present faster computational methods requiring less numerical optimisation to determine the centre position of a single distributed arc both numerically and experimentally. Numerical validation of the algorithms was carried out on models, and experimental validation on measurements of titanium and nickel alloys (Ti6Al4V and INCONEL 718). The results are used to comment on the effects of process parameters on arc behaviour during VAR.
Development of climate data storage and processing model
NASA Astrophysics Data System (ADS)
Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.
2016-11-01
We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on the socio-economic processes on local and regional scales. The model is based on a “shared nothing” distributed computing architecture and assumes using a computing network where each computing node is independent and self-sufficient. Each node hosts dedicated software for the processing and visualization of geospatial data providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories in the framework of a file system. To speed up data reading and processing, three approaches are proposed: a precalculation of intermediate products, a distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of the previously obtained products. For a fast search and retrieval of the required data, according to the data storage and processing model, a metadata database is developed. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. The model and the metadata database together will provide a reliable technological basis for development of a high-performance virtual research environment for climatic and environmental monitoring.
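As a rough, hypothetical illustration of the storage model described above (not code from the paper), the sketch below shows how a metadata catalogue could map a requested variable and time span to the netCDF files and nodes holding it, and how previously computed products could be cached for reuse; the table layout, node names, file paths and time strings are all invented.

    import sqlite3

    def build_demo_catalogue():
        """Create an in-memory metadata catalogue (stand-in for the real one)."""
        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE datasets "
                    "(variable TEXT, node TEXT, path TEXT, t_start TEXT, t_end TEXT)")
        con.executemany("INSERT INTO datasets VALUES (?, ?, ?, ?, ?)", [
            ("tas", "node-1", "/data/tas_1990_1999.nc", "1990-01", "1999-12"),
            ("tas", "node-2", "/data/tas_2000_2009.nc", "2000-01", "2009-12"),
        ])
        return con

    _cache = {}  # (variable, start, end) -> previously computed product

    def find_files(con, variable, start, end):
        """Return (node, path) pairs whose time span overlaps [start, end]."""
        return con.execute(
            "SELECT node, path FROM datasets "
            "WHERE variable = ? AND t_end >= ? AND t_start <= ?",
            (variable, start, end)).fetchall()

    def get_product(con, variable, start, end, compute):
        """Reuse a cached product if present; otherwise compute and cache it."""
        key = (variable, start, end)
        if key not in _cache:
            _cache[key] = compute(find_files(con, variable, start, end))
        return _cache[key]

    con = build_demo_catalogue()
    print(get_product(con, "tas", "1995-01", "2005-12", compute=len))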
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, M.; Anderson, D.P.
1988-01-01
Marionette is a system for distributed parallel programming in an environment of networked heterogeneous computer systems. It is based on a master/slave model. The master process can invoke worker operations (asynchronous remote procedure calls to single slaves) and context operations (updates to the state of all slaves). The master and slaves also interact through shared data structures that can be modified only by the master. The master and slave processes are programmed in a sequential language. The Marionette runtime system manages slave process creation, propagates shared data structures to slaves as needed, queues and dispatches worker and context operations, and manages recovery from slave processor failures. The Marionette system also includes tools for automated compilation of program binaries for multiple architectures, and for distributing binaries to remote file systems. A UNIX-based implementation of Marionette is described.
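The master/worker split described above can be pictured with a small Python sketch (Marionette itself is not Python and this is not its API): worker operations are asynchronous calls dispatched to a pool, while context operations update shared data that only the master modifies; the scale function and the factor value are made-up examples.

    from concurrent.futures import ThreadPoolExecutor

    class Master:
        def __init__(self, n_workers):
            self.pool = ThreadPoolExecutor(max_workers=n_workers)
            self.context = {}          # shared data, modified only by the master

        def context_op(self, key, value):
            self.context[key] = value  # visible to workers on their next call

        def worker_op(self, func, *args):
            # asynchronous "remote procedure call" to a single worker
            return self.pool.submit(func, dict(self.context), *args)

    def scale(ctx, x):
        return ctx["factor"] * x

    master = Master(n_workers=4)
    master.context_op("factor", 10)
    futures = [master.worker_op(scale, i) for i in range(8)]
    print([f.result() for f in futures])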
Hyperswitch communication network
NASA Technical Reports Server (NTRS)
Peterson, J.; Pniel, M.; Upchurch, E.
1991-01-01
The Hyperswitch Communication Network (HCN) is a large scale parallel computer prototype being developed at JPL. Commercial versions of the HCN computer are planned. The HCN computer being designed is a message passing multiple instruction multiple data (MIMD) computer, and offers many advantages in price-performance ratio, reliability and availability, and manufacturing over traditional uniprocessors and bus based multiprocessors. The design of the HCN operating system is a uniquely flexible environment that combines both parallel processing and distributed processing. This programming paradigm can achieve a balance among the following competing factors: performance in processing and communications, user friendliness, and fault tolerance. The prototype is being designed to accommodate a maximum of 64 state of the art microprocessors. The HCN is classified as a distributed supercomputer. The HCN system is described, and the performance/cost analysis and other competing factors within the system design are reviewed.
A Framework for WWW Query Processing
NASA Technical Reports Server (NTRS)
Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)
2000-01-01
Query processing is the most common operation in a DBMS. Sophisticated query processing has been mainly targeted at a single enterprise environment providing centralized control over data and metadata. Queries submitted by anonymous users on the web differ in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and the ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).
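To make the load-balancing idea concrete, here is a minimal, hypothetical sketch (not the paper's framework) that routes each incoming web query to the backend node with the least accumulated cost; the node names and query costs are invented.

    import heapq

    def make_balancer(nodes):
        heap = [(0, n) for n in nodes]    # (accumulated cost, node)
        heapq.heapify(heap)
        def dispatch(query_cost):
            load, node = heapq.heappop(heap)
            heapq.heappush(heap, (load + query_cost, node))
            return node
        return dispatch

    dispatch = make_balancer(["backend-a", "backend-b", "backend-c"])
    for cost in [3, 1, 4, 1, 5]:
        print(dispatch(cost))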
Working toward integrated models of alpine plant distribution.
Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2013-10-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Hillyer, T. N.; Wilkins, J.
2012-12-01
The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
Overview of the LINCS architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, J.G.; Watson, R.W.
1982-01-13
Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years with a computer-network-based resource-sharing environment. The increasing use of low cost and high performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost effective, reliable, and human engineered applications. We believe the answer lies in developing a layered, communication oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.
2017-12-01
System Architecture to Investigate the Impact of Integrated Air and Missile Defense in a Distributed Lethality Environment, by Justin K. Davis, Lieutenant.
NASA Astrophysics Data System (ADS)
Spindler, Ashley; Wake, David; Belfiore, Francesco; Bershady, Matthew; Bundy, Kevin; Drory, Niv; Masters, Karen; Thomas, Daniel; Westfall, Kyle; Wild, Vivienne
2018-05-01
We study the spatially resolved star formation of 1494 galaxies in the SDSS-IV MaNGA Survey. Star formation rates (SFRs) are calculated using a two-step process, using H α in star-forming regions and Dn4000 in regions identified as active galactic nucleus/low-ionization (nuclear) emission region [AGN/LI(N)ER] or lineless. The roles of secular and environmental quenching processes are investigated by studying the dependence of the radial profiles of specific star formation rate on stellar mass, galaxy structure, and environment. We report on the existence of `centrally suppressed' galaxies, which have suppressed Specific Star Formation Rate (SSFR) in their cores compared to their discs. The profiles of centrally suppressed and unsuppressed galaxies are distributed in a bimodal way. Galaxies with high stellar mass and core velocity dispersion are found to be much more likely to be centrally suppressed than low-mass galaxies, and we show that this is related to morphology and the presence of AGN/LI(N)ER like emission. Centrally suppressed galaxies also display lower star formation at all radii compared to unsuppressed galaxies. The profiles of central and satellite galaxies are also compared, and we find that satellite galaxies experience lower specific star formation rates at all radii than central galaxies. This uniform suppression could be a signal of the stripping of hot halo gas in the process known as strangulation. We find that satellites are not more likely to be suppressed in their cores than centrals, indicating that the core suppression is an entirely internal process. We find no correlation between the local environment density and the profiles of star formation rate surface density.
NASA Astrophysics Data System (ADS)
Burone, Leticia; Ortega, Leonardo; Franco-Fraguas, Paula; Mahiques, Michel; García-Rodriguez, Felipe; Venturini, Natalia; Marin, Yamandú; Brugnoli, Ernesto; Nagai, Renata; Muniz, Pablo; Bícego, Marcia; Figueira, Rubens; Salaroli, Alexandre
2013-03-01
Proxies of terrigenous versus marine input (Al and Ti, Fe/Ca and Ti/Ca ratios), origin of organic matter (δ¹³C, δ¹⁵N and C/N ratio), productivity (Corg; Nt; CaCO₃, P, Ca, and Ba content; and Ba/Al and Ba/Ti ratios), hydrodynamics (grain size, mean diameter and sorting) and biological records of the main features of the environment (benthic foraminifera assemblage distribution) were used to assess the sediment footprint of river vs. marine influence along the salinity gradient between the Rio de la Plata (RdlP) estuary and the adjacent South Western Atlantic Shelf. These criteria permitted characterisation and interpretation of the sedimentary processes influencing transition between three known environments: tidal river, estuarine and marine zones. Increases in sand and clay content at the transition between tidal river and proper estuarine zones indicate resuspension/deposition processes associated with the maximum turbidity zone (MTZ). The MTZ was also characterised by an increase in mixed organic matter content indicated by stable carbon and nitrogen isotope values, an increment in productivity proxies (Corg, Nt and CaCO₃) and the substitution of the Miliammina fusca assemblage (brackish environments) for the Ammonia tepida assemblage (estuarine environments). The transition between estuarine and marine environments was characterised by a sharp (up to 99%) increase in sand content, reflecting the progradation of modern RdlP sediments toward relict continental shelf sediment. C/N values typical of the marine environment, decreased trace element concentrations and the distribution of the Buliminella elegantissima assemblage (a more marine assemblage) also highlight the marine environment. This paper is particularly important as a tool both to better understand sedimentological dynamics in salinity fronts (along the shelf sediment of large estuaries) and to elaborate more precise palaeoenvironmental and palaeoceanographic reconstructions.
NASA Astrophysics Data System (ADS)
García, T.; Velo, A.; Fernandez-Bastero, S.; Gago-Duport, L.; Santos, A.; Alejo, I.; Vilas, F.
2005-02-01
This paper examines the linkages between the space-distribution of grain sizes and the relative percentage of the amount of mineral species that result from the mixing process of siliciclastic and carbonate sediments at the Ria de Vigo (NW of Spain). The space-distribution of minerals was initially determined, starting from a detailed mineralogical study based on XRD-Rietveld analysis of the superficial sediments. Correlations between the maps obtained for grain sizes, average fractions of either siliciclastics or carbonates, as well as for individual minerals, were then established. From this analysis, spatially organized patterns were found between carbonates and several minerals involved in the siliciclastic fraction. In particular, a coupled behaviour is observed between plagioclases and carbonates, in terms of their relative percentage amounts and the grain size distribution. In order to explain these results a conceptual model is proposed, based on the interplay between chemical processes at the seawater-sediment interface and hydrodynamical factors. This model suggests the existence of chemical control mechanisms that, by selective processes of dissolution-crystallization, constrain the mixed environment's long-term evolution, inducing the formation of self-organized sedimentary patterns.
An ecological perspective of Listeria monocytogenes biofilms in food processing facilities.
Valderrama, Wladir B; Cutter, Catherine N
2013-01-01
Listeria monocytogenes can enter the food chain at virtually any point. However, food processing environments seem to be of particular importance. From an ecological point of view, food processing facilities are microbial habitats that are constantly disturbed by cleaning and sanitizing procedures. Although L. monocytogenes is considered ubiquitous in nature, it is important to recognize that not all L. monocytogenes strains appear to be equally distributed; the distribution of the organism seems to be related to certain habitats. Currently, no direct evidence exists that L. monocytogenes-associated biofilms have played a role in food contamination or foodborne outbreaks, likely because biofilm isolation and identification are not part of an outbreak investigation, or the definition of biofilm is unclear. Because L. monocytogenes is known to colonize surfaces, we suggest that contamination patterns may be studied in the context of how biofilm formation is influenced by the environment within food processing facilities. In this review, direct and indirect epidemiological and phenotypic evidence of lineage-related biofilm formation capacity to specific ecological niches will be discussed. A critical view on the development of the biofilm concept, focused on the practical implications, strengths, and weaknesses of the current definitions also is discussed. The idea that biofilm formation may be an alternative surrogate for microbial fitness is proposed. Furthermore, current research on the influence of environmental factors on biofilm formation is discussed.
NASA Technical Reports Server (NTRS)
Twombly, I. Alexander; Smith, Jeffrey; Bruyns, Cynthia; Montgomery, Kevin; Boyle, Richard
2003-01-01
The International Space Station will soon provide an unparalleled research facility for studying the near- and longer-term effects of microgravity on living systems. Using the Space Station Glovebox Facility - a compact, fully contained reach-in environment - astronauts will conduct technically challenging life sciences experiments. Virtual environment technologies are being developed at NASA Ames Research Center to help realize the scientific potential of this unique resource by facilitating the experimental hardware and protocol designs and by assisting the astronauts in training. The Virtual GloveboX (VGX) integrates high-fidelity graphics, force-feedback devices and real-time computer simulation engines to achieve an immersive training environment. Here, we describe the prototype VGX system, the distributed processing architecture used in the simulation environment, and modifications to the visualization pipeline required to accommodate the display configuration.
Gong, Nina; Hong, Hanlie; Huff, Warren D; Fang, Qian; Bae, Christopher J; Wang, Chaowen; Yin, Ke; Chen, Shuling
2018-05-16
Permian-Triassic (P-Tr) altered volcanic ashes (tuffs) are widely distributed within the P-Tr boundary successions in South China. Altered volcanic ashes from the terrestrial Chahe (CH) section and the marine Shangsi (SS) section were selected to further understand the influence of sedimentary environments and volcanic sources on the diagenetic alteration of volcanic tuffs. The zircon ²⁰⁶Pb/²³⁸U ages of the corresponding beds in the two sections are almost synchronous. The sedimentary environment of the altered tuffs was characterized by a low pH and did not experience a hydrothermal process. The dominant clay minerals of all the tuff beds are illite-smectite (I-S) minerals, with minor chlorite and kaolinite. I-S minerals of CH (R3) are more ordered than those of SS (R1), suggesting that CH also shows a higher diagenetic grade and more intensive chemical weathering. In addition, the tuff beds studied derive from different magma sources. The clay mineral compositions of the tuffs bear little relation to the types of source volcanism and the depositional environments. Instead, the ordering degree of the mixed-layer clay minerals and the REE distribution are mainly dependent upon the sedimentary environments. Thus, the mixed-layer clay mineral ratio and its geochemical indices can be used as paleoenvironmental indicators.
NASA Astrophysics Data System (ADS)
Korobova, Elena; Romanov, Sergey
2016-04-01
Distribution of artificial radionuclides in the environment has long been used successfully for revealing the migration pathways of their stable analogues. Migration of water in natural conjugated elementary landscapes, characterizing the top-slope-resulting-depression system, has a specific structure, and the radionuclide tracer inevitably reflects it through specific sorption and exchange processes. Other important issues are the concentration levels and the difference in the characteristic time of chemical element dispersion. The modern biosphere has acquired its sustainable structure over a long period of time and is formed by basic macroelements, with the water-soluble portion of elements functioning as activators of chemical exchange. Water migration is controlled by gravitation, climate and relief, while fixation depends upon the parameters of surfaces and chemical composition. The resulting structure depends on the specificity and duration of the process. The long-term redistribution of chemical elements in the terrestrial environment has led to a distinct geochemical structure of conjugated landscapes with a specific geometry of redistribution and accumulation of chemical elements. Migration of the newly born anthropogenic radionuclides followed natural pathways in the biosphere. The initial deposition of the Chernobyl radionuclides within the elementary landscape-geochemical system was spatially even owing to aerial deposition, but the further exchange process is controlled by the strength of fixation and the migration ability of the carriers. Therefore, the patterns of spatial distribution of artificial radionuclides in natural landscapes are considerably different from those formed over the long term as the basic structure of chemical fields in the biosphere. Our monitoring of the radial and lateral distribution of Cs-137 in test plots characterizing natural, undisturbed conjugated elementary landscapes, performed from 2005 until now, has revealed a stable and distinctly polycentric structure of radiocesium distribution, believed to reflect the character of radial and lateral water migration and a high sensitivity of water distribution to surface parameters. This leads to an unusual wavy type of Cs-137 distribution down, along and across all the slopes examined for surface Cs-137 activity at every measured point. The finding is believed to have an important practical outcome, allowing a much more detailed evaluation of micronutrient distribution and optimization of their application.
Latency Hiding in Dynamic Partitioning and Load Balancing of Grid Computing Applications
NASA Technical Reports Server (NTRS)
Das, Sajal K.; Harvey, Daniel J.; Biswas, Rupak
2001-01-01
The Information Power Grid (IPG) concept developed by NASA is aimed to provide a metacomputing platform for large-scale distributed computations, by hiding the intricacies of a highly heterogeneous environment and yet maintaining adequate security. In this paper, we propose a latency-tolerant partitioning scheme that dynamically balances processor workloads on the IPG, and minimizes data movement and runtime communication. By simulating an unsteady adaptive mesh application on a wide area network, we study the performance of our load balancer under the Globus environment. The number of IPG nodes, the number of processors per node, and the interconnect speeds are parameterized to derive conditions under which the IPG would be suitable for parallel distributed processing of such applications. Experimental results demonstrate that effective solutions are achieved when the IPG nodes are connected by a high-speed asynchronous interconnection network.
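A simple way to picture the workload-balancing step is the greedy sketch below; it is not the partitioning scheme of the paper, only a longest-processing-time heuristic that assigns mesh partitions of given weights to processors, with all weights and counts invented.

    def balance(partition_weights, n_processors):
        loads = [0.0] * n_processors
        assignment = {}
        # heaviest partitions first (longest-processing-time heuristic)
        for pid, w in sorted(enumerate(partition_weights),
                             key=lambda t: -t[1]):
            proc = min(range(n_processors), key=loads.__getitem__)
            assignment[pid] = proc
            loads[proc] += w
        return assignment, loads

    assignment, loads = balance([8, 3, 5, 7, 2, 4, 6, 1], n_processors=3)
    print(assignment, loads)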
Rethinking the evolution of specialization: A model for the evolution of phenotypic heterogeneity.
Rubin, Ilan N; Doebeli, Michael
2017-12-21
Phenotypic heterogeneity refers to genetically identical individuals that express different phenotypes, even when in the same environment. Traditionally, "bet-hedging" in fluctuating environments is offered as the explanation for the evolution of phenotypic heterogeneity. However, there are an increasing number of examples of microbial populations that display phenotypic heterogeneity in stable environments. Here we present an evolutionary model of phenotypic heterogeneity of microbial metabolism and a resultant theory for the evolution of phenotypic versus genetic specialization. We use two-dimensional adaptive dynamics to track the evolution of the population phenotype distribution of the expression of two metabolic processes with a concave trade-off. Rather than assume a Gaussian phenotype distribution, we use a Beta distribution that is capable of describing genotypes that manifest as individuals with two distinct phenotypes. Doing so, we find that environmental variation is not a necessary condition for the evolution of phenotypic heterogeneity, which can evolve as a form of specialization in a stable environment. There are two competing pressures driving the evolution of specialization: directional selection toward the evolution of phenotypic heterogeneity and disruptive selection toward genetically determined specialists. Because of the lack of a singular point in the two-dimensional adaptive dynamics and the fact that directional selection is a first order process, while disruptive selection is of second order, the evolution of phenotypic heterogeneity dominates and often precludes speciation. We find that branching, and therefore genetic specialization, occurs mainly under two conditions: the presence of a cost to maintaining a high phenotypic variance or when the effect of mutations is large. A cost to high phenotypic variance dampens the strength of selection toward phenotypic heterogeneity and, when sufficiently large, introduces a singular point into the evolutionary dynamics, effectively guaranteeing eventual branching. Large mutations allow the second order disruptive selection to dominate the first order selection toward phenotypic heterogeneity. Copyright © 2017 Elsevier Ltd. All rights reserved.
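The Beta-distributed phenotype idea can be illustrated numerically with the hedged sketch below; it is not the authors' adaptive-dynamics code, and the pay-off function and parameter values are arbitrary placeholders, chosen only to show that Beta(a, b) with a, b < 1 describes a bimodal (phenotypically heterogeneous) genotype while larger a, b describe a generalist.

    import random

    def payoff(x):
        # placeholder trade-off between the two metabolic processes
        return x**0.5 + (1.0 - x)**0.5

    def mean_payoff(a, b, n=100_000, seed=0):
        """Average pay-off of a genotype whose phenotype x ~ Beta(a, b)."""
        rng = random.Random(seed)
        return sum(payoff(rng.betavariate(a, b)) for _ in range(n)) / n

    # Beta(0.2, 0.2) is bimodal (phenotypic heterogeneity);
    # Beta(5, 5) is unimodal around a generalist phenotype.
    print(mean_payoff(0.2, 0.2), mean_payoff(5.0, 5.0))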
Process-driven and biological characterisation and mapping of seabed habitats sensitive to trawling.
Foveau, Aurélie; Vaz, Sandrine; Desroy, Nicolas; Kostylev, Vladimir E
2017-01-01
The increase of anthropogenic pressures on the marine environment, together with the necessity of sustainable management of marine living resources, has underlined the need to map and model coastal environments, particularly for the purposes of spatial planning and for the implementation of an integrated ecosystem-based management approach. The present study compares outputs of a process-driven benthic habitat sensitivity (PDS) model to the structure, composition and distribution of benthic invertebrates in the Eastern English Channel and southern part of the North Sea. Trawl disturbance indicators (TDI) computed from species biological traits and benthic community composition were produced from samples collected with a bottom trawl. The TDI was found to be highly correlated with the PDS, further validating the latter's purpose to identify natural process-driven patterns of sensitivity. PDS was found to reflect an environmental potential that may no longer be fully observable in the field, and differences from in situ biological observations could be partially explained by the spatial distribution of fishery pressure on the seafloor. The management implications of these findings are discussed, and we suggest that, used in conjunction with TDI approaches, PDS may help monitor management effort by evaluating the difference between the current state and the presumed optimal environmental status of marine benthic habitats.
Integrated resource scheduling in a distributed scheduling environment
NASA Technical Reports Server (NTRS)
Zoch, David; Hall, Gardiner
1988-01-01
The Space Station era presents a highly-complex multi-mission planning and scheduling environment exercised over a highly distributed system. In order to automate the scheduling process, customers require a mechanism for communicating their scheduling requirements to NASA. A request language that a remotely-located customer can use to specify his scheduling requirements to a NASA scheduler, thus automating the customer-scheduler interface, is described. This notation, Flexible Envelope-Request Notation (FERN), allows the user to completely specify his scheduling requirements such as resource usage, temporal constraints, and scheduling preferences and options. The FERN also contains mechanisms for representing schedule and resource availability information, which are used in the inter-scheduler inconsistency resolution process. Additionally, a scheduler is described that can accept these requests, process them, generate schedules, and return schedule and resource availability information to the requester. The Request-Oriented Scheduling Engine (ROSE) was designed to function either as an independent scheduler or as a scheduling element in a network of schedulers. When used in a network of schedulers, each ROSE communicates schedule and resource usage information to other schedulers via the FERN notation, enabling inconsistencies to be resolved between schedulers. Individual ROSE schedules are created by viewing the problem as a constraint satisfaction problem with a heuristically guided search strategy.
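A toy sketch of the underlying idea (not FERN or ROSE themselves): each request asks for an amount of a resource over a duration within an allowed start window, and a heuristically ordered greedy search places requests so that total usage never exceeds capacity; the request names and numbers are invented.

    def schedule(requests, capacity, horizon):
        usage = [0] * horizon                      # resource used per time slot
        placed = {}
        # heuristic: requests with the least scheduling slack go first
        for name, amount, dur, window in sorted(
                requests, key=lambda r: (r[3][1] - r[3][0]) - r[2]):
            start_lo, start_hi = window
            for t in range(start_lo, min(start_hi, horizon - dur) + 1):
                if all(usage[s] + amount <= capacity for s in range(t, t + dur)):
                    for s in range(t, t + dur):
                        usage[s] += amount
                    placed[name] = t
                    break
        return placed

    reqs = [("obs-A", 2, 3, (0, 5)), ("obs-B", 3, 2, (1, 4)), ("obs-C", 2, 4, (0, 6))]
    print(schedule(reqs, capacity=4, horizon=10))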
Do social utility judgments influence attentional processing?
Shore, Danielle M; Heerey, Erin A
2013-10-01
Research shows that social judgments influence decision-making in social environments. For example, judgments about an interaction partner's trustworthiness affect a variety of social behaviors and decisions. One mechanism by which social judgments may influence social decisions is by biasing the automatic allocation of attention toward certain social partners, thereby shaping the information people acquire. Using an attentional blink paradigm, we investigate how trustworthiness judgments alter the allocation of attention to social stimuli in a set of two experiments. The first experiment investigates trustworthiness judgments based solely on a social partner's facial appearance. The second experiment examines the effect of trustworthiness judgments based on experienced behavior. In the first, strong appearance-based judgments (positive and negative) enhanced stimulus recognizability but did not alter the size of the attentional blink, suggesting that appearance-based social judgments enhance face memory but do not affect pre-attentive processing. However, in the second experiment, in which judgments were based on behavioral experience rather than appearance, positive judgments enhanced pre-attentive processing of trustworthy faces. This suggests that a stimulus's potential benefits, rather than its disadvantages, shape the automatic distribution of attentional resources. These results have implications for understanding how appearance- and behavior-based social cues shape attention distribution in social environments. Copyright © 2013 Elsevier B.V. All rights reserved.
Li, Yi-Qun; Xu, Li; Zhu, Hua-Xu; Tang, Zhi-Shu; Li, Bo; Pan, Yong-Lan; Yao, Wei-Wei; Fu, Ting-Ming; Guo, Li-Wei
2017-10-01
In order to explore the adsorption characteristics of proteins on the membrane surface and the effect of the protein solution environment on the permeation behavior of berberine, berberine and proteins were used as the research objects to prepare simulated solutions. Low-field NMR, static adsorption experiments and membrane separation experiments were used to study the interactions between the proteins and the ceramic membrane and between the proteins and berberine. The static adsorption capacity of proteins, relative membrane flux, rejection rate of proteins, transmittance rate of berberine and the adsorption rate of proteins and berberine were used as the evaluation indices. Meanwhile, the membrane resistance distribution, the particle size distribution and scanning electron microscopy (SEM) observations were used to investigate the adsorption characteristics of proteins on the ceramic membrane and their effect on the membrane separation process of berberine. The results showed that the ceramic membrane could adsorb the proteins and that the adsorption was consistent with the Langmuir adsorption model. In simulating the membrane separation process, proteins were the main factor causing membrane fouling. However, when the concentration of proteins was 1 g•L⁻¹, the proteins had no significant effect on the membrane separation process of berberine. Copyright© by the Chinese Pharmaceutical Association.
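For readers unfamiliar with the Langmuir model mentioned above, the sketch below shows the isotherm q = q_max·K·c/(1 + K·c) and a linearised least-squares fit; the concentrations and parameter values are synthetic and are not the study's data.

    def langmuir(c, q_max, k):
        """Equilibrium adsorption q for protein concentration c."""
        return q_max * k * c / (1.0 + k * c)

    def fit_langmuir(conc, q_obs):
        """Linearised fit: c/q = 1/(q_max*k) + c/q_max (ordinary least squares)."""
        x = conc
        y = [c / q for c, q in zip(conc, q_obs)]
        n = len(x)
        sx, sy = sum(x), sum(y)
        sxx = sum(v * v for v in x)
        sxy = sum(a * b for a, b in zip(x, y))
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        intercept = (sy - slope * sx) / n
        q_max = 1.0 / slope
        k = slope / intercept
        return q_max, k

    conc = [0.2, 0.5, 1.0, 2.0]
    q_obs = [langmuir(c, q_max=3.0, k=1.5) for c in conc]  # synthetic data
    print(fit_langmuir(conc, q_obs))   # recovers approximately (3.0, 1.5)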
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. The prototype will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of huge heterogeneous climate and environmental geospatial datasets used in the project, their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, export of data processing results through WMS and WFS services will be used to ensure their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
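As a hypothetical illustration of the WMS interoperability mentioned above (the endpoint URL is a placeholder, not a real DRC address), a client could list the layers a geoportal advertises via a GetCapabilities request:

    from urllib.parse import urlencode
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    def wms_layer_names(base_url):
        """Return layer names advertised by a WMS 1.3.0 GetCapabilities document."""
        query = urlencode({"service": "WMS", "request": "GetCapabilities",
                           "version": "1.3.0"})
        with urlopen(f"{base_url}?{query}") as resp:
            tree = ET.parse(resp)
        ns = {"wms": "http://www.opengis.net/wms"}
        return [el.text for el in tree.iterfind(".//wms:Layer/wms:Name", ns)]

    # Example with a placeholder URL:
    # print(wms_layer_names("https://drc.example.org/geoserver/wms"))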
A comparison of queueing, cluster and distributed computing systems
NASA Technical Reports Server (NTRS)
Kaplan, Joseph A.; Nelson, Michael L.
1993-01-01
Using workstation clusters for distributed computing has become popular with the proliferation of inexpensive, powerful workstations. Workstation clusters offer both a cost effective alternative to batch processing and an easy entry into parallel computing. However, a number of workstations on a network does not constitute a cluster. Cluster management software is necessary to harness the collective computing power. A variety of cluster management and queuing systems are compared: Distributed Queueing Systems (DQS), Condor, Load Leveler, Load Balancer, Load Sharing Facility (LSF - formerly Utopia), Distributed Job Manager (DJM), Computing in Distributed Networked Environments (CODINE), and NQS/Exec. The systems differ in their design philosophy and implementation. Based on published reports on the different systems and conversations with the systems' developers and vendors, a comparison of the systems is made on the integral issues of clustered computing.
Muška, Milan; Tušer, Michal; Frouzová, Jaroslava; Mrkvička, Tomáš; Ricard, Daniel; Seďa, Jaromír; Morelli, Federico; Kubečka, Jan
2018-03-29
Understanding the spatial distribution of organisms in heterogeneous environments remains one of the chief issues in ecology. The spatial organization of freshwater fish has been investigated predominantly at large scales, neglecting important local conditions and ecological processes. However, small-scale processes are of essential importance for individual habitat preferences and hence for structuring trophic cascades and species coexistence. In this work, we analysed the real-time spatial distribution of pelagic freshwater fish in the Římov Reservoir (Czechia), observed by hydroacoustics in relation to important environmental predictors over 48 hours at 3-h intervals. The diurnal cycle was the most significant effect in all spatial models, with generally inverse trends between fish distribution and predictors by day and night. Our findings highlighted daytime pelagic fish distribution as highly aggregated, with general fish preferences for central, deep and highly illuminated areas, whereas nighttime distribution was more dispersed and fish preferred steep-sloped nearshore areas of greater depth. This turnover suggests prominent movements of a significant part of the fish assemblage between pelagic and nearshore areas on a diel basis. In conclusion, hydroacoustics, GIS and spatial modelling proved to be valuable tools for predicting local fish distribution and elucidating its drivers, which has far-reaching implications for understanding freshwater ecosystem functioning.
Kalvelage, Thomas A.; Willems, Jennifer
2005-01-01
The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during their execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and could be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve the application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application for obtaining debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain key characteristics for each technique. In addition, we discuss the limitations of existing event filtering mechanisms and outline how our architecture will improve key aspects of event filtering.
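The subscription-based filtering idea can be sketched in a few lines of Python; this is an illustrative toy, not the architecture described in the abstract, and the event fields and severity threshold are invented.

    class EventFilter:
        def __init__(self):
            self.subscriptions = []        # (predicate, callback) pairs

        def subscribe(self, predicate, callback):
            self.subscriptions.append((predicate, callback))

        def publish(self, event):
            # forward only the events that match a subscriber's predicate
            for predicate, callback in self.subscriptions:
                if predicate(event):
                    callback(event)

    monitor = EventFilter()
    monitor.subscribe(lambda e: e["severity"] >= 3,
                      lambda e: print("ALERT:", e))
    monitor.publish({"source": "node-7", "severity": 1, "msg": "heartbeat"})
    monitor.publish({"source": "node-2", "severity": 4, "msg": "queue overflow"})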
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), which is the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX, which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
A Distributed Simulation Software System for Multi-Spacecraft Missions
NASA Technical Reports Server (NTRS)
Burns, Richard; Davis, George; Cary, Everett
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture, and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, with functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archival and distribution addressed.
Bennetts, Victor Hernandez; Schaffernicht, Erik; Pomareda, Victor; Lilienthal, Achim J; Marco, Santiago; Trincavelli, Marco
2014-09-17
In this paper, we address the task of gas distribution modeling in scenarios where multiple heterogeneous compounds are present. Gas distribution modeling is particularly useful in emission monitoring applications where spatial representations of the gaseous patches can be used to identify emission hot spots. In realistic environments the presence of multiple chemicals is expected, and therefore gas discrimination has to be incorporated in the modeling process. The approach presented in this work addresses the task of gas distribution modeling by combining different non-selective gas sensors. Gas discrimination is addressed with an open sampling system, composed of an array of metal oxide sensors and a probabilistic algorithm tailored to uncontrolled environments. For each of the identified compounds, the mapping algorithm generates a calibrated gas distribution model using the classification uncertainty and the concentration readings acquired with a photoionization detector. The meta-parameters of the proposed modeling algorithm are automatically learned from the data. The approach was validated with a gas-sensitive robot patrolling outdoor and indoor scenarios, where two different chemicals were released simultaneously. The experimental results show that the generated multi-compound maps can be used to accurately predict the location of emitting gas sources.
Distributed agile software development for the SKA
NASA Astrophysics Data System (ADS)
Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David
2012-09-01
The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, like industries and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but still they have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment to allow for efficient detection and tracking of interface and integration issues, in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment on the other hand can provide much faster feedback of integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development, and experience gained in the development of large-scale commercial software projects.
MID-INFRARED EVIDENCE FOR ACCELERATED EVOLUTION IN COMPACT GROUP GALAXIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lisa May; Johnson, Kelsey E.; Gallagher, Sarah C.
2010-11-15
Compact galaxy groups are at the extremes of the group environment, with high number densities and low velocity dispersions that likely affect member galaxy evolution. To explore the impact of this environment in detail, we examine the distribution in the mid-infrared (MIR) 3.6-8.0 μm color space of 42 galaxies from 12 Hickson compact groups (HCGs) in comparison with several control samples, including the LVL+SINGS galaxies, interacting galaxies, and galaxies from the Coma Cluster. We find that the HCG galaxies are strongly bimodal, with statistically significant evidence for a gap in their distribution. In contrast, none of the other samples show such a marked gap, and only galaxies in the Coma infall region have a distribution that is statistically consistent with the HCGs in this parameter space. To further investigate the cause of the HCG gap, we compare the galaxy morphologies of the HCG and LVL+SINGS galaxies, and also probe the specific star formation rate (SSFR) of the HCG galaxies. While galaxy morphology in HCG galaxies is strongly linked to position within MIR color space, the more fundamental property appears to be the SSFR, or star formation rate normalized by stellar mass. We conclude that the unusual MIR color distribution of HCG galaxies is a direct product of their environment, which is most similar to that of the Coma infall region. In both cases, galaxy densities are high, but gas has not been fully processed or stripped. We speculate that the compact group environment fosters accelerated evolution of galaxies from star-forming and neutral gas-rich to quiescent and neutral gas-poor, leaving few members in the MIR gap at any time.
Cetinceviz, Yucel; Bayindir, Ramazan
2012-05-01
The network requirements of control systems in industrial applications increase day by day. Internet-based control systems and various fieldbus systems have been designed in order to meet these requirements. This paper describes an Internet-based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the remote connection of the distributed control devices in the lowest levels of the industrial production environment are provided by fieldbus networks. In this paper, an Internet-based control system that is able to meet the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at the field level and implemented to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system comprises a hardware structure with a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure with the WinCC flexible program used for the SCADA (Supervisory Control And Data Acquisition) screen and the SIMATIC MANAGER package program ("STEP7") used for the hardware and network configuration and for downloading the control program to the PLC. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
Menke, S.B.; Holway, D.A.; Fisher, R.N.; Jetz, W.
2009-01-01
Aim: Species distribution models (SDMs) or, more specifically, ecological niche models (ENMs) are a useful and rapidly proliferating tool in ecology and global change biology. ENMs attempt to capture associations between a species and its environment and are often used to draw biological inferences, to predict potential occurrences in unoccupied regions and to forecast future distributions under environmental change. The accuracy of ENMs, however, hinges critically on the quality of occurrence data. ENMs often use haphazardly collected data rather than data collected across the full spectrum of existing environmental conditions. Moreover, it remains unclear how processes affecting ENM predictions operate at different spatial scales. The scale (i.e. grain size) of analysis may be dictated more by the sampling regime than by biologically meaningful processes. The aim of our study is to jointly quantify how issues relating to region and scale affect ENM predictions using an economically important and ecologically damaging invasive species, the Argentine ant (Linepithema humile). Location: California, USA. Methods: We analysed the relationship between sampling sufficiency, regional differences in environmental parameter space and cell size of analysis and resampling environmental layers using two independently collected sets of presence/absence data. Differences in variable importance were determined using model averaging and logistic regression. Model accuracy was measured with area under the curve (AUC) and Cohen's kappa. Results: We first demonstrate that insufficient sampling of environmental parameter space can cause large errors in predicted distributions and biological interpretation. Models performed best when they were parametrized with data that sufficiently sampled environmental parameter space. Second, we show that altering the spatial grain of analysis changes the relative importance of different environmental variables. These changes apparently result from how environmental constraints and the sampling distributions of environmental variables change with spatial grain. Conclusions: These findings have clear relevance for biological inference. Taken together, our results illustrate potentially general limitations for ENMs, especially when such models are used to predict species occurrences in novel environments. We offer basic methodological and conceptual guidelines for appropriate sampling and scale matching. © 2009 The Authors. Journal compilation © 2009 Blackwell Publishing.
Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André
2010-01-01
Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collections systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polarization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.
NASA Astrophysics Data System (ADS)
Swartz, Charles S.
2003-05-01
The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.
GES DISC Data Recipes in Jupyter Notebooks
NASA Astrophysics Data System (ADS)
Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.
2017-12-01
The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs) which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package in order to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals. One, transition data recipes created by the Goddard Earth Science Data and Information Service Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks. Two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprised of Jupyter Notebooks, Python library dependencies, and command line tools, and configure it into an easy to deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.
Space Propulsion Hazards Analysis Manual (SPHAM). Volume 2. Appendices
1988-10-01
2011-08-09
fastest 10 supercomputers in the world. Both systems rely on GPU co-processing, one using AMD cards, the second, called Nebulae, using NVIDIA Tesla... capability of almost 3 petaflop/s, the highest in the TOP500, Nebulae only holds the No. 2 position on the TOP500 list of the
ERIC Educational Resources Information Center
Fernández-Alemán, José Luis; Carrillo-de-Gea, Juan Manuel; Meca, Joaquín Vidal; Ros, Joaquín Nicolás; Toval, Ambrosio; Idri, Ali
2016-01-01
This paper presents the results of two educational experiments carried out to determine whether the process of specifying requirements (catalog-based reuse as opposed to conventional specification) has an impact on effectiveness and productivity in co-located and distributed software development environments. The participants in the experiments…
Multi-filter spectrophotometry of quasar environments
NASA Technical Reports Server (NTRS)
Craven, Sally E.; Hickson, Paul; Yee, Howard K. C.
1993-01-01
A many-filter photometric technique for determining redshifts and morphological types, by fitting spectral templates to spectral energy distributions, has good potential for application in surveys. Despite success in studies performed on simulated data, the results have not been fully reliable when applied to real, low signal-to-noise data. We are investigating techniques to improve the fitting process.
NASA Technical Reports Server (NTRS)
Munchak, Stephen Joseph; Kummerow, Christian; Elsaesser, Gregory
2013-01-01
Variability in the raindrop size distribution (DSD) has long been recognized as a source of uncertainty in relationships between radar reflectivity Z and rain rate R. In this study, we analyze DSD retrievals from two years of data gathered by the Tropical Rainfall Measuring Mission (TRMM) satellite and processed with a combined radar-radiometer retrieval algorithm over the global oceans equatorward of 35°. Numerous variables describing properties of each reflectivity profile, large-scale organization, and the background environment are examined for relationships to the reflectivity-normalized median drop diameter, εDSD. In general, we find that higher freezing levels and relative humidities are associated with smaller εDSD. Within a given environment, the mesoscale organization of precipitation and the vertical profile of reflectivity are associated with DSD characteristics. In the tropics, the smallest εDSD values are found in large but shallow convective systems, where warm rain formation processes are thought to be predominant, whereas larger sizes are found in the stratiform regions of organized deep convection. In the extratropics, the largest εDSD values are found in the scattered convection that occurs when cold, dry continental air moves over the much warmer ocean after the passage of a cold front. The geographical distribution of the retrieved DSDs is consistent with many of the observed regional Z-R relationships found in the literature as well as discrepancies between the TRMM radar-only and radiometer-only precipitation products. In particular, mid-latitude and tropical regions near land tend to have larger drops for a given reflectivity, whereas the smallest drops are found in the eastern Pacific Intertropical Convergence Zone.
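For context, the reflectivity-rain-rate relationships referred to above are conventionally written as a power law. The coefficients shown below are the classic Marshall-Palmer values and are purely illustrative; they are not the values derived in this study.

```latex
% Conventional Z-R power law; the coefficients a and b depend on the
% drop size distribution of the rainfall regime.
Z = a\,R^{b}, \qquad \text{e.g. } Z = 200\,R^{1.6}
% (Z in mm^6\,m^{-3}, R in mm\,h^{-1}; Marshall-Palmer coefficients shown.)
```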
Divide and Conquer (DC) BLAST: fast and easy BLAST execution within HPC environments
Yim, Won Cheol; Cushman, John C.
2017-07-22
Bioinformatics is currently faced with very large-scale data sets that lead to computational jobs, especially sequence similarity searches, that can take absurdly long times to run. For example, the National Center for Biotechnology Information (NCBI) Basic Local Alignment Search Tool (BLAST and BLAST+) suite, which is by far the most widely used tool for rapid similarity searching among nucleic acid or amino acid sequences, is highly central processing unit (CPU) intensive. While the BLAST suite of programs perform searches very rapidly, they have the potential to be accelerated. In recent years, distributed computing environments have become more widely accessible and used due to the increasing availability of high-performance computing (HPC) systems. Therefore, simple solutions for data parallelization are needed to expedite BLAST and other sequence analysis tools. However, existing software for parallel sequence similarity searches often requires extensive computational experience and skill on the part of the user. In order to accelerate BLAST and other sequence analysis tools, Divide and Conquer BLAST (DCBLAST) was developed to perform NCBI BLAST searches within a cluster, grid, or HPC environment by using a query sequence distribution approach. Scaling from one (1) to 256 CPU cores resulted in significant improvements in processing speed. Thus, DCBLAST dramatically accelerates the execution of BLAST searches using a simple, accessible, robust, and parallel approach. DCBLAST works across multiple nodes automatically and it overcomes the speed limitation of single-node BLAST programs. DCBLAST can be used on any HPC system, can take advantage of hundreds of nodes, and has no output limitations. Thus, this freely available tool simplifies distributed computation pipelines to facilitate the rapid discovery of sequence similarities between very large data sets.
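As a rough illustration of the query-distribution idea described above (this is not the DCBLAST implementation itself), the Python sketch below splits a multi-FASTA query file into chunks that could each be submitted as an independent BLAST+ job on a separate node; the file names and chunk count are arbitrary.

```python
from pathlib import Path

def split_fasta(query_path, n_chunks, out_dir):
    """Split a multi-FASTA file into n_chunks roughly equal pieces."""
    records, current = [], []
    for line in Path(query_path).read_text().splitlines():
        if line.startswith(">") and current:
            records.append("\n".join(current))   # close the previous record
            current = []
        current.append(line)
    if current:
        records.append("\n".join(current))

    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    chunk_files = []
    for i in range(n_chunks):
        chunk = records[i::n_chunks]             # round-robin assignment of sequences
        path = out_dir / f"query_chunk_{i:03d}.fa"
        path.write_text("\n".join(chunk) + "\n")
        chunk_files.append(path)
    return chunk_files

# Each chunk could then be searched independently, e.g. one cluster job per chunk:
#   blastn -query query_chunk_000.fa -db nt -outfmt 6 -out chunk_000.tsv
# and the per-chunk tabular outputs concatenated afterwards.
```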
NASA Astrophysics Data System (ADS)
Hunter, K. S.; Van Cappellen, P.
2000-01-01
Our paper, 'Kinetic modeling of microbially-driven redox chemistry of subsurface environments: coupling transport, microbial metabolism and geochemistry' (Hunter et al., 1998), presents a theoretical exploration of biogeochemical reaction networks and their importance to the biogeochemistry of groundwater systems. As with any other model, the kinetic reaction-transport model developed in our paper includes only a subset of all physically, biologically and chemically relevant processes in subsurface environments. It considers aquifer systems where the primary energy source driving microbial activity is the degradation of organic matter. In addition to the primary biodegradation pathways of organic matter (i.e. respiration and fermentation), the redox chemistry of groundwaters is also affected by reactions not directly involving organic matter oxidation. We refer to the latter as secondary reactions. By including secondary redox reactions which consume reduced reaction products (e.g., Mn2+, FeS, H2S), and in the process compete with microbial heterotrophic populations for available oxidants (i.e. O2, NO3-, Mn(IV), Fe(III), SO42-), we predict spatio-temporal distributions of microbial activity which differ significantly from those of models which consider only the biodegradation reactions. That is, the secondary reactions have a significant impact on the distributions of the rates of heterotrophic and chemolithotrophic metabolic pathways. We further show that secondary redox reactions, as well as non-redox reactions, significantly influence the acid-base chemistry of groundwaters. The distributions of dissolved inorganic redox species along flowpaths, however, are similar in simulations with and without secondary reactions (see Figs. 3(b) and 7(b) in Hunter et al., 1998), indicating that very different biogeochemical reaction dynamics may lead to essentially the same chemical redox zonation of a groundwater system.
Göhler, Daniel; Stintz, Michael; Hillemann, Lars; Vorbau, Manuel
2010-01-01
Nanoparticles are used in industrial and domestic applications to control customized product properties, but there are several uncertainties concerning possible hazards to health, safety and the environment. Hence, it is necessary to search for methods to analyze the particle release from typical application processes. Based on a survey of commercial sanding machines, the relevant sanding process parameters were employed for the design of a miniature sanding test setup in a particle-free environment for the quantification of the nanoparticle release into air from surface coatings. The released particles were moved by a defined airflow to a fast mobility particle sizer and other aerosol measurement equipment to enable the determination of released particle numbers in addition to the particle size distribution. First results revealed a strong impact of the coating material on the swarf mass and the number of released particles.
ARTEMIS: a collaborative framework for health care.
Reddy, R; Jagannathan, V; Srinivas, K; Karinthi, R; Reddy, S M; Gollapudy, C; Friedman, S
1993-01-01
Patient-centered healthcare delivery is an inherently collaborative process. This involves a wide range of individuals and organizations with diverse perspectives: primary care physicians, hospital administrators, labs, clinics, and insurance. The key to cost reduction and quality improvement in health care is effective management of this collaborative process. The use of multi-media collaboration technology can facilitate timely delivery of patient care and reduce cost at the same time. During the last five years, the Concurrent Engineering Research Center (CERC), under the sponsorship of DARPA (Defense Advanced Research Projects Agency, recently renamed ARPA), developed a number of generic key subsystems of a comprehensive collaboration environment. These subsystems are intended to overcome the barriers that inhibit the collaborative process. Three subsystems developed under this program include: MONET (Meeting On the Net)--to provide consultation over a computer network, ISS (Information Sharing Server)--to provide access to multi-media information, and PCB (Project Coordination Board)--to better coordinate focused activities. These systems have been integrated into an open environment to enable collaborative processes. This environment is being used to create a wide-area (geographically distributed) research testbed under DARPA sponsorship, ARTEMIS (Advanced Research Testbed for Medical Informatics), to explore collaborative health care processes. We believe this technology will play a key role in the current national thrust to reengineer the present health-care delivery system.
Entropic cohering power in quantum operations
NASA Astrophysics Data System (ADS)
Xi, Zhengjun; Hu, Ming-Liang; Li, Yongming; Fan, Heng
2018-02-01
Coherence is a basic feature of quantum systems and a common necessary condition for quantum correlations. It is also an important physical resource in quantum information processing. In this paper, using relative entropy, we consider a more general definition of the cohering power of quantum operations. First, we calculate the cohering power of unitary quantum operations and show that the amount of distributed coherence caused by non-unitary quantum operations cannot exceed the quantum-incoherent relative entropy between the system of interest and its environment. We then find that the difference between the distributed coherence and the cohering power is larger than the quantum-incoherent relative entropy. As an application, we consider the distributed coherence caused by purification.
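For reference, the relative-entropy coherence measure underlying this kind of analysis is commonly defined as below (standard resource-theory notation). The paper's generalized cohering power builds on, but is not necessarily identical to, these textbook expressions.

```latex
% Relative entropy of coherence of a state \rho, where \Delta(\rho) is the fully
% dephased (diagonal) state, \mathcal{I} the set of incoherent states, and S the
% von Neumann entropy:
C_{r}(\rho) \;=\; \min_{\delta \in \mathcal{I}} S(\rho \,\|\, \delta)
            \;=\; S\bigl(\Delta(\rho)\bigr) - S(\rho).
% A commonly used notion of the cohering power of a channel \Phi is the maximal
% coherence it can generate from incoherent inputs:
\mathcal{C}(\Phi) \;=\; \max_{\delta \in \mathcal{I}} C_{r}\bigl(\Phi(\delta)\bigr).
```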
Use of EPANET solver to manage water distribution in Smart City
NASA Astrophysics Data System (ADS)
Antonowicz, A.; Brodziak, R.; Bylka, J.; Mazurkiewicz, J.; Wojtecki, S.; Zakrzewski, P.
2018-02-01
This paper presents a method of using the EPANET solver to support the management of a water distribution system in a Smart City. The main task is to develop an application that allows remote access to the simulation model of the water distribution network developed in the EPANET environment. The application allows both single and cyclic simulations to be performed with a specified step of changing the values of the selected process variables. The architecture of the application is shown in the paper. The application supports the selection of the best device control algorithm using optimization methods. Optimization procedures are possible with the following methods: brute force, SLSQP (Sequential Least SQuares Programming), and the Modified Powell Method. The article is supplemented by an example of using the developed computer tool.
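The abstract names the optimization methods but not the code. As a purely illustrative sketch, the same three methods are available in SciPy; the objective function below is a stub standing in for an EPANET simulation call, and the decision variables and bounds are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize, brute

def run_network_simulation(setpoints):
    """Stub for a call into a hydraulic network model (e.g. pump speed set-points).
    Returns a cost such as energy use plus pressure-constraint penalties.
    The quadratic form here is a placeholder, not a hydraulic model."""
    target = np.array([0.8, 0.6])
    return float(np.sum((np.asarray(setpoints) - target) ** 2))

bounds = [(0.0, 1.0), (0.0, 1.0)]
x0 = np.array([0.5, 0.5])

# Sequential Least SQuares Programming (SLSQP)
res_slsqp = minimize(run_network_simulation, x0, method="SLSQP", bounds=bounds)

# Modified Powell method (derivative-free)
res_powell = minimize(run_network_simulation, x0, method="Powell", bounds=bounds)

# Brute-force grid search over the same box
best_grid = brute(run_network_simulation, ranges=bounds, Ns=21, finish=None)

print(res_slsqp.x, res_powell.x, best_grid)
```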
Learning Multirobot Hose Transportation and Deployment by Distributed Round-Robin Q-Learning.
Fernandez-Gauna, Borja; Etxeberria-Agiriano, Ismael; Graña, Manuel
2015-01-01
Multi-Agent Reinforcement Learning (MARL) algorithms face two main difficulties: the curse of dimensionality, and environment non-stationarity due to the independent learning processes carried out by the agents concurrently. In this paper we formalize and prove the convergence of a Distributed Round Robin Q-learning (D-RR-QL) algorithm for cooperative systems. The computational complexity of this algorithm increases linearly with the number of agents. Moreover, it eliminates environment non-stationarity by carrying out a round-robin scheduling of the action selection and execution. This learning scheme allows the implementation of Modular State-Action Vetoes (MSAV) in cooperative multi-agent systems, which speeds up learning convergence in over-constrained systems by vetoing state-action pairs which lead to undesired termination states (UTS) in the relevant state-action subspace. Each agent's local state-action value function learning is an independent process, including the MSAV policies. Coordination of locally optimal policies to obtain the global optimal joint policy is achieved by a greedy selection procedure using message passing. We show that D-RR-QL improves over state-of-the-art approaches, such as Distributed Q-Learning, Team Q-Learning and Coordinated Reinforcement Learning in a paradigmatic Linked Multi-Component Robotic System (L-MCRS) control problem: the hose transportation task. L-MCRS are over-constrained systems with many UTS induced by the interaction of the passive linking element and the active mobile robots.
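As a simplified, illustrative sketch of round-robin coordinated tabular Q-learning (not the authors' full D-RR-QL algorithm with state-action vetoes), two agents can alternate action selection and execution so that only one agent acts and updates at each step, which removes the non-stationarity caused by simultaneous learning. The toy environment dynamics below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions, n_agents = 10, 4, 2
alpha, gamma, epsilon = 0.1, 0.95, 0.1

# One independent Q-table per agent.
Q = [np.zeros((n_states, n_actions)) for _ in range(n_agents)]

def step(state, agent, action):
    """Toy environment stub: returns (next_state, reward). Placeholder dynamics."""
    next_state = (state + action) % n_states
    reward = 1.0 if next_state == n_states - 1 else 0.0
    return next_state, reward

state = 0
for t in range(10_000):
    agent = t % n_agents                     # round-robin: agents take turns acting
    if rng.random() < epsilon:               # epsilon-greedy exploration
        action = int(rng.integers(n_actions))
    else:
        action = int(np.argmax(Q[agent][state]))
    next_state, reward = step(state, agent, action)
    # Standard tabular Q-learning update, applied only to the acting agent.
    td_target = reward + gamma * Q[agent][next_state].max()
    Q[agent][state, action] += alpha * (td_target - Q[agent][state, action])
    state = next_state
```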
Combined distributed and concentrated transducer network for failure indication
NASA Astrophysics Data System (ADS)
Ostachowicz, Wieslaw; Wandowski, Tomasz; Malinowski, Pawel
2010-03-01
In this paper an algorithm for discontinuity localisation in thin panels made of aluminium alloy is presented. The algorithm uses Lamb wave propagation methods for discontinuity localisation. Elastic waves were generated and received using piezoelectric transducers. They were arranged in concentrated arrays distributed on the specimen surface. In this way almost the whole specimen could be monitored using this combined distributed-concentrated transducer network. Excited elastic waves propagate and reflect from panel boundaries and discontinuities existing in the panel. Wave reflections were registered by the piezoelectric transducers and used in the signal processing algorithm. The proposed processing algorithm consists of two parts: signal filtering and extraction of obstacle locations. The first part was used in order to enhance the signals by removing noise from them. The second part allowed the extraction of features connected with wave reflections from discontinuities. The extracted features were the basis for creating damage influence maps. Damage maps indicate the intensity of elastic wave reflections, which corresponds to obstacle coordinates. The described signal processing algorithms were implemented in the MATLAB environment. It should be underlined that in this work results based only on experimental signals are presented.
Adapting high-level language programs for parallel processing using data flow
NASA Technical Reports Server (NTRS)
Standley, Hilda M.
1988-01-01
EASY-FLOW, a very high-level data flow language, is introduced for the purpose of adapting programs written in a conventional high-level language to a parallel environment. The level of parallelism provided is of the large-grained variety in which parallel activities take place between subprograms or processes. A program written in EASY-FLOW is a set of subprogram calls as units, structured by iteration, branching, and distribution constructs. A data flow graph may be deduced from an EASY-FLOW program.
Simulation modeling for the health care manager.
Kennedy, Michael H
2009-01-01
This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
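As a minimal illustration of the Monte Carlo idea mentioned above (the distributions and parameters are invented for the example and are not taken from the article), patient waiting time at a single clinic station can be estimated by repeatedly sampling arrival and service times.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_clinic(n_patients, mean_interarrival=10.0, mean_service=8.0):
    """Single-server queue: exponential inter-arrival and service times (minutes).
    Returns the average patient waiting time for one simulated day."""
    arrivals = np.cumsum(rng.exponential(mean_interarrival, n_patients))
    service = rng.exponential(mean_service, n_patients)
    start = np.zeros(n_patients)
    finish = np.zeros(n_patients)
    for i in range(n_patients):
        start[i] = max(arrivals[i], finish[i - 1]) if i else arrivals[i]
        finish[i] = start[i] + service[i]
    return float(np.mean(start - arrivals))

# Monte Carlo: repeat the simulated day many times to obtain a distribution of waits.
waits = [simulate_clinic(100) for _ in range(1_000)]
print(f"mean wait {np.mean(waits):.1f} min, "
      f"95th percentile {np.percentile(waits, 95):.1f} min")
```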
A Widely-Accessible Distributed MEMS Processing Environment. The MEMS Exchange Program
2012-10-29
promise for high-aspect and deep etching into fused silica. This process capability is important for a DARPA project called the Navigation-Grade...on fused silica samples that appear to allow 2 to 1 aspect ratios in fused silica with a depth of etch of around 125 microns – a dramatic result in a...very hard to etch material such as fused silica! After receiving approval from DARPA, the MEMS Exchange purchased a previously-owned Ulvac etcher
2011-10-27
public release; distribution is unlimited. Dr. Keith Bowman, AFRL, Precision Airdrop (PAD) Program Manager; Ms. Carol Ventresca, SynGenics Corporation... Presentation Outline: Entrance Criteria for PAD Integrated Product Team (IPT); S&T SE Process Steps; Initial Project S&T Development Strategy; User Understanding of
Enforcement of entailment constraints in distributed service-based business processes.
Hummer, Waldemar; Gaubatz, Patrick; Strembeck, Mark; Zdun, Uwe; Dustdar, Schahram
2013-11-01
A distributed business process is executed in a distributed computing environment. The service-oriented architecture (SOA) paradigm is a popular option for the integration of software services and execution of distributed business processes. Entailment constraints, such as mutual exclusion and binding constraints, are important means to control process execution. Mutually exclusive tasks result from the division of powerful rights and responsibilities to prevent fraud and abuse. In contrast, binding constraints define that a subject who performed one task must also perform the corresponding bound task(s). We aim to provide a model-driven approach for the specification and enforcement of task-based entailment constraints in distributed service-based business processes. Based on a generic metamodel, we define a domain-specific language (DSL) that maps the different modeling-level artifacts to the implementation-level. The DSL integrates elements from role-based access control (RBAC) with the tasks that are performed in a business process. Process definitions are annotated using the DSL, and our software platform uses automated model transformations to produce executable WS-BPEL specifications which enforce the entailment constraints. We evaluate the impact of constraint enforcement on runtime performance for five selected service-based processes from existing literature. Our evaluation demonstrates that the approach correctly enforces task-based entailment constraints at runtime. The performance experiments illustrate that the runtime enforcement operates with an overhead that scales well up to the order of several ten thousand logged invocations. Using our DSL annotations, the user-defined process definition remains declarative and clean of security enforcement code. Our approach decouples the concerns of (non-technical) domain experts from technical details of entailment constraint enforcement. The developed framework integrates seamlessly with WS-BPEL and the Web services technology stack. Our prototype implementation shows the feasibility of the approach, and the evaluation points to future work and further performance optimizations.
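As a language-agnostic illustration of what run-time enforcement of such constraints involves (this is not the authors' WS-BPEL/DSL tooling), the sketch below flags violations of mutual-exclusion and binding constraints in a log of task executions; the task names, subjects, and log format are hypothetical.

```python
# Each log entry records which subject performed which task of a process instance.
log = [
    {"instance": 1, "task": "create_order",  "subject": "alice"},
    {"instance": 1, "task": "approve_order", "subject": "alice"},   # violates mutual exclusion
    {"instance": 2, "task": "sign_contract", "subject": "bob"},
    {"instance": 2, "task": "file_contract", "subject": "carol"},   # violates binding
]
mutual_exclusion = [("create_order", "approve_order")]   # must be performed by different subjects
binding = [("sign_contract", "file_contract")]           # must be performed by the same subject

def check_instance(entries):
    """Return human-readable violations of the entailment constraints for one instance."""
    done = {e["task"]: e["subject"] for e in entries}
    violations = []
    for a, b in mutual_exclusion:
        if a in done and b in done and done[a] == done[b]:
            violations.append(f"mutual exclusion violated: {a}/{b} both by {done[a]}")
    for a, b in binding:
        if a in done and b in done and done[a] != done[b]:
            violations.append(f"binding violated: {a} by {done[a]}, {b} by {done[b]}")
    return violations

for inst in sorted({e["instance"] for e in log}):
    entries = [e for e in log if e["instance"] == inst]
    for v in check_instance(entries):
        print(f"instance {inst}: {v}")
```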
The Spatial Distributions and Variations of Water Environmental Risk in Yinma River Basin, China
Di, Hui; Liu, Xingpeng; Tong, Zhijun; Ji, Meichen
2018-01-01
Water environmental risk is the probability of the occurrence of events caused by human activities or the interaction of human activities and natural processes that will damage a water environment. This study proposed a water environmental risk index (WERI) model to assess the water environmental risk in the Yinma River Basin based on hazards, exposure, vulnerability, and regional management ability indicators in a water environment. The data for each indicator were gathered from 2000, 2005, 2010, and 2015 to assess the spatial and temporal variations in water environmental risk using particle swarm optimization and the analytic hierarchy process (PSO-AHP) method. The results showed that the water environmental risk in the Yinma River Basin decreased from 2000 to 2015. The risk level of the water environment was high in Changchun, while the risk levels in Yitong and Yongji were low. The research methods provide information to support future decision making by the risk managers in the Yinma River Basin, which is in a high-risk water environment. Moreover, water environment managers could reduce the risks by adjusting the indicators that affect water environmental risks.
Analysis of Operational Parameters Affecting the Sulfur Content in Hot Metal of the COREX Process
NASA Astrophysics Data System (ADS)
Wu, Shengli; Wang, Laixin; Kou, Mingyin; Wang, Yujue; Zhang, Jiacong
2017-02-01
The COREX process, which has obvious advantages in environmental protection, still has some disadvantages: it has a higher sulfur content in hot metal (HM) than the blast furnace. In the present work, the distribution and transfer of sulfur in the COREX process have been analyzed and several operational parameters related to the sulfur content in HM ([pct S]) have been identified. Based on this, the effects of the coal rate, slag ratio, temperature of HM, melting rate, binary basicity (R2), the ratio of MgO/Al2O3, utilization of reducing gas, top gas consumption per ton of burden solid, metallization rate, oxidation degree of reducing gas, and coal and DRI distribution index on the sulfur content in HM are investigated. Moreover, a linear model has been developed and subsequently used for predicting and controlling the S content in HM of the COREX process.
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
Gravel, Dominique; Beaudet, Marilou; Messier, Christian
2008-10-01
Understanding coexistence of highly shade-tolerant tree species is a longstanding challenge for forest ecologists. A conceptual model for the coexistence of sugar maple (Acer saccharum) and American beech (Fagus grandifolia) has been proposed, based on a low-light survival/high-light growth trade-off, which interacts with soil fertility and small-scale spatiotemporal variation in the environment. In this study, we first tested whether the spatial distribution of seedlings and saplings can be predicted by the spatiotemporal variability of light availability and soil fertility, and second, the manner in which the process of environmental filtering changes with regeneration size. We evaluate the support for this hypothesis relative to the one for a neutral model, i.e., for seed rain density predicted from the distribution of adult trees. To do so, we performed intensive sampling over 86 quadrats (5 x 5 m) in a 0.24-ha plot in a mature maple-beech community in Quebec, Canada. Maple and beech abundance, soil characteristics, light availability, and growth history (used as a proxy for spatiotemporal variation in light availability) were finely measured to model variation in sapling composition across different size classes. Results indicate that the variables selected to model species distribution do effectively change with size, but not as predicted by the conceptual model. Our results show that variability in the environment is not sufficient to differentiate these species' distributions in space. Although species differ in their spatial distribution in the small size classes, they tend to correlate at the larger size class in which recruitment occurs. Overall, the results are not supportive of a model of coexistence based on small-scale variations in the environment. We propose that, at the scale of a local stand, the lack of fit of the model could result from the high similarity of species in the range of environmental conditions encountered, and we suggest that coexistence would be stable only at larger spatial scales at which variability in the environment is greater.
Automating an integrated spatial data-mining model for landfill site selection
NASA Astrophysics Data System (ADS)
Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul
2017-10-01
An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in the integrated model is the complicated processing and modelling due to the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) and a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia, and 22 criteria were selected as input data to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
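The abstract reports accuracy from 10-fold cross-validation of a neural network. As a generic, hedged sketch of that validation step only (synthetic data, default hyper-parameters, and scikit-learn rather than the authors' 22-criterion Python-ArcGIS/MATLAB pipeline):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 22))                   # placeholder for 22 site-selection criteria
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # placeholder suitability labels

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
scores = cross_val_score(model, X, y, cv=10)     # 10-fold cross-validation accuracy
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```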
Listeria monocytogenes in Fresh Produce: Outbreaks, Prevalence and Contamination Levels
Zhu, Qi; Gooneratne, Ravi; Hussain, Malik Altaf
2017-01-01
Listeria monocytogenes, a member of the genus Listeria, is widely distributed in agricultural environments, such as soil, manure and water. This organism is a recognized foodborne pathogenic bacterium that causes many diseases, from mild gastroenteritis to severe blood and/or central nervous system infections, as well as abortion in pregnant women. Generally, processed ready-to-eat and cold-stored meat and dairy products are considered high-risk foods for L. monocytogenes infections that cause human illness (listeriosis). However, recently, several listeriosis outbreaks have been linked to fresh produce contamination around the world. Additionally, many studies have detected L. monocytogenes in fresh produce samples and even in some minimally processed vegetables. Thus, L. monocytogenes may contaminate fresh produce if present in the growing environment (soil and water). Prevention of biofilm formation is an important control measure to reduce the prevalence and survival of L. monocytogenes in growing environments and on fresh produce. This article specifically focuses on fresh produce–associated listeriosis outbreaks, prevalence in growing environments, contamination levels of fresh produce, and associated fresh produce safety challenges.
Final Report for DOE Award ER25756
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesselman, Carl
2014-11-17
The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.
Thermal activation of dislocations in large scale obstacle bypass
NASA Astrophysics Data System (ADS)
Sobie, Cameron; Capolungo, Laurent; McDowell, David L.; Martinez, Enrique
2017-08-01
Dislocation dynamics simulations have been used extensively to predict hardening caused by dislocation-obstacle interactions, including irradiation defect hardening in the athermal case. Incorporating the role of thermal energy in these interactions is possible with a framework provided by harmonic transition state theory (HTST), which enables direct access to thermally activated reaction rates using the Arrhenius equation, including rates of dislocation-obstacle bypass processes. Moving beyond unit dislocation-defect reactions to a representative environment containing a large number of defects requires coarse-graining the activation energy barriers of a population of obstacles into an effective energy barrier that accurately represents the large-scale collective process. The work presented here investigates the relationship between unit dislocation-defect bypass processes and the distribution of activation energy barriers calculated for ensemble bypass processes. A significant difference between these cases is observed, which is attributed to the inherent cooperative nature of dislocation bypass processes. In addition to the dislocation-defect interaction, the morphology of the dislocation segments pinned to the defects plays an important role in the activation energies for bypass. A phenomenological model for the stress dependence of the activation energy is shown to describe well the effect of a distribution of activation energies, and a probabilistic activation energy model incorporating the stress distribution in a material is presented.
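For reference, the thermally activated bypass rates referred to above follow the standard harmonic transition state theory (Arrhenius) form. The notation below is generic, and the stress-dependent activation energy shown is a commonly used phenomenological form, not necessarily the exact expression fitted in this work.

```latex
% Arrhenius rate for a thermally activated dislocation-obstacle bypass event,
% with attempt frequency \nu_0, activation energy \Delta E, and temperature T:
k(\sigma, T) \;=\; \nu_0 \, \exp\!\left(-\frac{\Delta E(\sigma)}{k_{\mathrm{B}} T}\right),
% A widely used phenomenological stress dependence of the barrier
% (Kocks-type form, with critical stress \sigma_c and shape exponents p, q):
\Delta E(\sigma) \;=\; \Delta E_0 \left[ 1 - \left(\frac{\sigma}{\sigma_c}\right)^{p} \right]^{q}.
```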
NASA Astrophysics Data System (ADS)
Thomas, Zahra; Rousseau-Gueutin, Pauline; Kolbe, Tamara; Abbott, Ben; Marcais, Jean; Peiffer, Stefan; Frei, Sven; Bishop, Kevin; Le Henaff, Geneviève; Squividant, Hervé; Pichelin, Pascal; Pinay, Gilles; de Dreuzy, Jean-Raynald
2017-04-01
The distribution of groundwater residence time in a catchment provides synoptic information about catchment functioning (e.g. nutrient retention and removal, hydrograph flashiness). In contrast with interpreted model results, which are often not directly comparable between studies, residence time distribution is a general output that could be used to compare catchment behaviors and test hypotheses about landscape controls on catchment functioning. To this end, we created a virtual observatory platform called the Catchment Virtual Observatory for Sharing Flow and Transport Model Outputs (COnSOrT). The main goal of COnSOrT is to collect outputs from calibrated groundwater models from a wide range of environments. By comparing a wide variety of catchments from different climatic, topographic and hydrogeological contexts, we expect to enhance understanding of catchment connectivity, resilience to anthropogenic disturbance, and overall functioning. The web-based observatory will also provide software tools to analyze model outputs. The observatory will enable modelers to test their models in a wide range of catchment environments to evaluate the generality of their findings and the robustness of their post-processing methods. Researchers with calibrated numerical models can benefit from the observatory by using the post-processing methods to implement a new approach to analyzing their data. Field scientists interested in contributing data could invite modelers associated with the observatory to test their models against observed catchment behavior. COnSOrT will allow meta-analyses with community contributions to generate new understanding and identify promising pathways for moving beyond single-catchment ecohydrology. Keywords: Residence time distribution, Model outputs, Catchment hydrology, Inter-catchment comparison
NASA Technical Reports Server (NTRS)
Noffke, Nora; Knoll, Andrew H.
2001-01-01
Shallow-marine, siliciclastic depositional systems are governed by physical sedimentary processes. Mineral precipitation or penecontemporaneous cementation play minor roles. Today, coastal siliciclastic environments may be colonized by a variety of epibenthic, mat-forming cyanobacteria. Studies on microbial mats showed that they are not randomly distributed in modern tidal environments. Distribution and abundance are mainly a function of the particular sedimentary facies. Fine-grained sands composed of "clear" (translucent) quartz particles constitute preferred substrates for cyanobacteria. Mat-builders also favor sites characterized by moderate hydrodynamic flow regimes, which permit biomass enrichment and construction of mat fabrics without lethal burial of mat populations by fine sediments. A comparable facies relationship can be observed in ancient siliciclastic shelf successions from the terminal Neoproterozoic Nama Group, Namibia. Wrinkle structures that record microbial mats are present but sparsely distributed in mid- to inner-shelf sandstones of the Nudaus Formation. The sporadic distribution of these structures reflects both the narrow ecological window that governs mat development and the distinctive taphonomic conditions needed to preserve the structures. These observations caution that statements about changing mat abundance across the Proterozoic-Cambrian boundary must be firmly rooted in paleoenvironmental and taphonomic analysis. Understanding the factors that influence the formation and preservation of microbial structures in siliciclastic regimes can facilitate exploration for biological signatures in Earth's oldest rocks. Moreover, insofar as these structures can be preserved on bedding surfaces and are not easily mimicked by physical processes, they constitute a set of biological markers that can be searched for on Mars by remotely controlled rovers.
Use hyperspectral remote sensing technique to monitoring pine wood nematode disease preliminary
NASA Astrophysics Data System (ADS)
Qin, Lin; Wang, Xianghong; Jiang, Jing; Yang, Xianchang; Ke, Daiyan; Li, Hongqun; Wang, Dingyi
2016-10-01
The pine wilt disease is a devastating disease of pine trees. In China, the disease was first discovered in 1982 at Dr. Sun Yat-sen's Mausoleum in Nanjing. In 2005 it affected an area of 77,000 hm2, with more than 1,540,000 pine trees dying in that year. Many districts of Chongqing in the Three Gorges Reservoir area show different degrees of pine wilt disease occurrence, which is a serious threat to the ecological environment of the reservoir area. In this study, an unmanned airship carrying hyperspectral remote sensing equipment was used to develop early diagnosis, early warning and forecasting of pine wood nematode disease. Hyperspectral data and digital orthophoto map data of Yongsheng Forestry in Fuling District were acquired in September 2015. The digital orthophoto map was processed with digital image processing techniques to automatically identify the number and distribution of diseased trees. The hyperspectral remote sensing data were processed with a spectrum comparison algorithm, and the number and distribution of diseased pine trees were also obtained. The two results were compared, and the distribution areas of diseased pine trees are basically the same, indicating that using low-altitude remote sensing technology to monitor the pine wood nematode distribution is successful. The results also show that the hyperspectral data analysis is more accurate and less affected by environmental factors than the digital orthophoto map analysis, and that more environmental variables can be extracted, so hyperspectral data analysis is the future development direction.
Uranus: a rapid prototyping tool for FPGA embedded computer vision
NASA Astrophysics Data System (ADS)
Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.
2007-01-01
The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms with support to migrate them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.
GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.
2010-01-01
The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
Intelligent computer-aided training authoring environment
NASA Technical Reports Server (NTRS)
Way, Robert D.
1994-01-01
Although there has been much research into intelligent tutoring systems (ITS), there are few authoring systems available that support ITS metaphors. Instructional developers are generally obliged to use tools designed for creating on-line books. We are currently developing an authoring environment derived from NASA's research on intelligent computer-aided training (ICAT). The ICAT metaphor, currently in use at NASA, has proven effective in disciplines from satellite deployment to high school physics. This technique provides a personal trainer (PT) who instructs the student using a simulated work environment (SWE). The PT acts as a tutor, providing individualized instruction and assistance to each student. Teaching in an SWE allows the student to learn tasks by doing them, rather than by reading about them. This authoring environment will expedite ICAT development by providing a tool set that guides the trainer modeling process. Additionally, this environment provides a vehicle for distributing NASA's ICAT technology to the private sector.
A cooperative model for IS security risk management in distributed environment.
Feng, Nan; Zheng, Chundong
2014-01-01
Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. The actual case studied illustrates the cooperative model presented in this paper and how it can be exploited to manage the distributed IS security risk effectively.
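As a minimal illustration of the kind of inference described here (the network structure and probabilities are hypothetical, not taken from the paper), the sketch below propagates shared security evidence through a two-node Bayesian network using plain NumPy.

    import numpy as np

    # Hypothetical two-node network: Threat -> Risk (states: low=0, high=1).
    p_threat = np.array([0.7, 0.3])                  # prior P(Threat)
    p_risk_given_threat = np.array([[0.9, 0.1],      # P(Risk | Threat=low)
                                    [0.4, 0.6]])     # P(Risk | Threat=high)

    # Predicted risk level before any shared security information arrives.
    p_risk = p_threat @ p_risk_given_threat
    print("P(Risk)            :", p_risk)

    # A partner organisation reports evidence that the threat level is high;
    # update the risk prediction by conditioning on Threat=high.
    print("P(Risk | Threat=hi):", p_risk_given_threat[1])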
Working toward integrated models of alpine plant distribution
Carlson, Bradley Z.; Randin, Christophe F.; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2014-01-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial–temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution. PMID:24790594
Falcone, U; Gilardi, Luisella; Pasqualini, O; Santoro, S; Coffano, Elena
2010-01-01
Exposure to carcinogens is still widespread in working environments. For the purpose of defining priorities for intervention, it is necessary to estimate the number and the geographic distribution of workers potentially exposed to carcinogens. It could therefore be useful to test the use of tools and information sources already available in order to map the distribution of exposure to carcinogens. Formaldehyde is suggested as an example of an occupational carcinogen in this study. The study aimed at verifying and investigating the potential of 3 integrated databases: MATline, CAREX, and company databases resulting from occupational accident and disease claims (INAIL), in order to estimate the number of workers exposed to formaldehyde and map their distribution in the Piedmont Region. The list of manufacturing processes involving exposure to formaldehyde was obtained from MATline; for each process the number of firms and employees were obtained from the INAIL archives. By applying the prevalence of exposed workers obtained with CAREX, an estimate of exposure for each process was determined. A map of the distribution of employees associated with a specific process was produced using ArcView GIS software. It was estimated that more than 13,000 employees are exposed to formaldehyde in the Piedmont Region. The manufacture of furniture was identified as the process with the highest number of workers exposed to formaldehyde (3,130), followed by metal workers (2,301 exposed) and synthetic resin processing (1,391 exposed). The results obtained from the integrated use of databases provide a basis for defining priorities for the preventive interventions required in the industrial processes involving exposure to carcinogens in the Piedmont Region.
Xiao, Zhu; Havyarimana, Vincent; Li, Tong; Wang, Dong
2016-05-13
In this paper, a novel nonlinear smoothing framework, the non-Gaussian delayed particle smoother (nGDPS), is proposed, which enables vehicle state estimation (VSE) with high accuracy while taking into account the non-Gaussianity of the measurement and process noises. Within the proposed method, the multivariate Student's t-distribution is adopted in order to compute the probability distribution function (PDF) related to the process and measurement noises, which are assumed to be non-Gaussian distributed. A computation approach based on the Ensemble Kalman Filter (EnKF) is designed to cope with the mean and the covariance matrix of the proposal non-Gaussian distribution. A delayed Gibbs sampling algorithm, which incorporates smoothing of the sampled trajectories over a fixed delay, is proposed to deal with the sample degeneracy of particles. The performance is investigated using real-world data collected by low-cost on-board vehicle sensors. The comparison study based on the real-world experiments and the statistical analysis demonstrates that the proposed nGDPS achieves a significant improvement in vehicle state accuracy and outperforms the existing filtering and smoothing methods.
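The nGDPS itself is not reproduced here; the following is only a minimal sketch of how heavy-tailed, multivariate Student's t-distributed noise samples of the kind the smoother assumes can be generated via the standard Gaussian scale-mixture construction (dimension, covariance and degrees of freedom are illustrative).

    import numpy as np

    def multivariate_t(mean, shape, df, size, rng=np.random.default_rng(0)):
        """Draw samples from a multivariate Student's t-distribution
        using the Gaussian scale-mixture construction."""
        d = len(mean)
        g = rng.chisquare(df, size) / df                 # chi-square mixing variable
        z = rng.multivariate_normal(np.zeros(d), shape, size)
        return mean + z / np.sqrt(g)[:, None]

    # Hypothetical 2-D measurement noise with heavy tails (df=3), illustrating
    # the kind of non-Gaussian noise model assumed by such a smoother.
    samples = multivariate_t(np.zeros(2), np.diag([0.5, 0.5]), df=3.0, size=5000)
    print("empirical covariance:\n", np.cov(samples.T))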
Advances in methods for detection of anaerobic ammonium oxidizing (anammox) bacteria.
Li, Meng; Gu, Ji-Dong
2011-05-01
Anaerobic ammonium oxidation (anammox), the biochemical process that oxidizes ammonium into dinitrogen gas using nitrite as an electron acceptor, was recognized for its significant role in the global nitrogen cycle only recently, and its ubiquitous distribution across a wide range of environments has changed our knowledge of the contributors to the global nitrogen cycle. Currently, several groups of methods are used to detect anammox bacteria, based on their physiological and biochemical characteristics, cellular chemical composition, and both the 16S rRNA gene and selective functional genes as biomarkers, including the hydrazine oxidoreductase and nitrite reductase encoding genes hzo and nirS, respectively. Results from these methods, coupled with advances in quantitative PCR, reverse transcription of mRNA and stable isotope labeling, have improved our understanding of the distribution, diversity, and activity of anammox bacteria in different environments, both natural and engineered. In this review, we summarize the methods used to detect anammox bacteria in various environments, highlight the strengths and weaknesses of these methods, and discuss potential future developments of existing and new techniques.
System Level Uncertainty Assessment for Collaborative RLV Design
NASA Technical Reports Server (NTRS)
Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew
2002-01-01
A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources by both the government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood of these future designs to meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system level output metrics of interest for an RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
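The study's discipline tools are not available here, so the sketch below only illustrates the general Monte Carlo step described above: sampling input uncertainty distributions, pushing them through a toy system-level response, and estimating a probability of meeting objectives. All distributions, the simplified rocket-equation model, and the thresholds are illustrative, not the ACRE-92 values.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical input uncertainty distributions (illustrative only):
    isp         = rng.triangular(440.0, 455.0, 465.0, n)   # vacuum Isp [s]
    mass_growth = rng.normal(1.15, 0.05, n)                 # dry-mass growth factor
    cost_factor = rng.lognormal(0.0, 0.10, n)               # cost uncertainty multiplier

    # Toy system-level response: ideal delta-v from the rocket equation and a
    # cost metric; the real study chains discipline tools instead.
    payload   = 20_000.0                                    # kg
    dry_mass  = 95_000.0 * mass_growth                      # kg
    prop_mass = 780_000.0                                   # kg
    dv   = isp * 9.81 * np.log((dry_mass + prop_mass + payload) / (dry_mass + payload))
    cost = 2.4e9 * cost_factor                              # USD

    # Probability of meeting both objectives (thresholds are illustrative).
    p_success = np.mean((dv >= 8_600.0) & (cost <= 2.6e9))
    print(f"P(meets delta-v and cost targets) = {p_success:.2f}")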
Chen, Bang-Yuan; Wang, Chung-Yi; Wang, Chia-Lan; Fan, Yang-Chi; Weng, I-Ting; Chou, Chung-Hsi
2016-11-01
A 2-year study was performed at two ready-to-eat tilapia sashimi processing plants (A and B) to identify possible routes of contamination with Listeria monocytogenes during processing. Samples were collected from the aquaculture environments, transportation tanks, processing plants, and final products. Seventy-nine L. monocytogenes isolates were found in the processing environments and final products; 3.96% (50 of 1,264 samples) and 3.86% (29 of 752 samples) of the samples from plants A and B, respectively, were positive for L. monocytogenes. No L. monocytogenes was detected in the aquaculture environments or transportation tanks. The predominant L. monocytogenes serotypes were 1/2b (55.70%) and 4b (37.97%); serotypes 3b and 4e were detected at much lower percentages. At both plants, most processing sections were contaminated with L. monocytogenes before the start of processing, which indicated that the cleaning and sanitizing methods did not achieve adequate pathogen removal. Eleven seropulsotypes were revealed by pulsed-field gel electrophoresis and serotyping. Analysis of seropulsotype distribution revealed that the contamination was disseminated by the processing work; the same seropulsotypes were repeatedly found along the work flow line and in the final products. Specific seropulsotypes were persistently found during different sampling periods, which suggests that the sanitation procedures or equipment used at these plants were inadequate. Plant staff should improve the sanitation procedures and equipment to reduce the risk of L. monocytogenes cross-contamination and ensure the safety of ready-to-eat tilapia products.
Chaotic mixing by microswimmers moving on quasiperiodic orbits
NASA Astrophysics Data System (ADS)
Jalali, Mir Abbas; Khoshnood, Atefeh; Alam, Mohammad-Reza
2015-11-01
Life on Earth is strongly dependent upon mixing across a vast range of scales. For example, mixing distributes nutrients for microorganisms in aquatic environments, and balances the spatial energy distribution in the oceans and the atmosphere. From an industrial point of view, mixing is essential in many microfluidic processes and lab-on-a-chip operations, polymer engineering, pharmaceutics, food engineering, petroleum engineering, and biotechnology. Efficient mixing, typically characterized by chaotic advection, is hard to achieve in low Reynolds number conditions because of the linear nature of the Stokes equation that governs the motion. We report the first demonstration of chaotic mixing induced by a microswimmer that strokes on quasiperiodic orbits with multi-loop turning paths. Our findings can be utilized to understand the interactions of microorganisms with their environments, and to design autonomous robotic mixers that can sweep and mix an entire volume of complex-geometry containers.
Taniguchi, Makoto; Shimada, Jun; Fukuda, Yoichi; Yamano, Makoto; Onodera, Shin-ichi; Kaneko, Shinji; Yoshikoshi, Akihisa
2009-04-15
Anthropogenic effects in both Osaka and Bangkok were evaluated to compare the relationships between the subsurface environment and the development stage of the two cities. Subsurface thermal anomalies due to heat island effects were found in both cities. The Surface Warming Index (SWI), the departure depth from the steady geothermal gradient, was used as an indicator of the heat island effect. SWI increases (deepens) with the magnitude of the heat island effect and the elapsed time since the onset of surface warming. Distributions of subsurface thermal anomalies due to the heat island effect agreed well with the distribution of changes in air temperature due to the same process, which is described by the distribution of population density in both Osaka and Bangkok. Different time lags between groundwater depression and subsidence in the two cities were found. This is attributed to differences in hydrogeologic characteristics, such as porosity and hydraulic conductivity. We find that differences in subsurface degradation in Osaka and Bangkok, including subsurface thermal anomalies, groundwater depression, and land subsidence, depend on differences in the stage of urban development and in hydrogeological characteristics.
A shared-world conceptual model for integrating space station life sciences telescience operations
NASA Technical Reports Server (NTRS)
Johnson, Vicki; Bosley, John
1988-01-01
Mental models of the Space Station and its ancillary facilities will be employed by users of the Space Station as they draw upon past experiences, perform tasks, and collectively plan for future activities. The operational environment of the Space Station will incorporate telescience, a new set of operational modes. To investigate properties of the operational environment, distributed users, and the mental models they employ to manipulate resources while conducting telescience, an integrating shared-world conceptual model of Space Station telescience is proposed. The model comprises distributed users and resources (active elements); agents who mediate interactions among these elements on the basis of intelligent processing of shared information; and telescience protocols which structure the interactions of agents as they engage in cooperative, responsive interactions on behalf of users and resources distributed in space and time. Examples from the life sciences are used to instantiate and refine the model's principles. Implications for transaction management and autonomy are discussed. Experiments employing the model are described which the authors intend to conduct using the Space Station Life Sciences Telescience Testbed currently under development at Ames Research Center.
Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A
2016-01-01
The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers.
Lightweight causal and atomic group multicast
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Schiper, Andre; Stephenson, Pat
1991-01-01
The ISIS toolkit is a distributed programming environment based on support for virtually synchronous process groups and group communication. A suite of protocols is presented to support this model. The approach revolves around a multicast primitive, called CBCAST, which implements a fault-tolerant, causally ordered message delivery. This primitive can be used directly or extended into a totally ordered multicast primitive, called ABCAST. It normally delivers messages immediately upon reception, and imposes a space overhead proportional to the size of the groups to which the sender belongs, usually a small number. It is concluded that process groups and group communication can achieve performance and scaling comparable to that of a raw message transport layer. This finding contradicts the widespread concern that this style of distributed computing may be unacceptably costly.
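CBCAST's exact implementation is not spelled out in the abstract; the sketch below only illustrates the standard vector-timestamp delivery rule that causally ordered multicast protocols of this family rely on (process indices and group size are illustrative).

    from typing import List

    def can_deliver(msg_vt: List[int], sender: int, local_vt: List[int]) -> bool:
        """Causal-delivery test: the message is the next one expected from its
        sender, and the sender had seen nothing the receiver has not seen."""
        if msg_vt[sender] != local_vt[sender] + 1:
            return False
        return all(msg_vt[k] <= local_vt[k]
                   for k in range(len(local_vt)) if k != sender)

    # Process p2's view in a 3-member group: it has seen one message from p0.
    local = [1, 0, 0]
    print(can_deliver([1, 1, 0], sender=1, local_vt=local))  # True: deliverable now
    print(can_deliver([2, 0, 1], sender=2, local_vt=local))  # False: wait for p0's 2nd msg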
Subglacial Antarctic Lake Environment Research in the IPY
NASA Astrophysics Data System (ADS)
Kennicutt, M. C.; Priscu, J. C.
2006-12-01
Subglacial environments are continental-scale phenomena that occur under thick ice sheets. These environments differ in geologic setting, age, evolutionary history, and limnological conditions and may be connected by sub-ice hydrologic systems. Evidence suggests that subglacial lakes are linked to the onset of ice streams, influencing the dynamics of overlying ice sheets. Outbursts of fresh water from subglacial environments have been invoked as an agent of landscape change in the past, and there is speculation that subglacial freshwater discharges have influenced past climate. Subglacial environments rest at the intersection of continental ice sheets and the underlying lithosphere. The distribution of subglacial lakes is determined by the availability of water and of basins for it to collect in. The distribution of water in subglacial environments is related to surface temperature, accumulation rates, ice thickness, ice velocities, and geothermal flux. The interconnectedness of these environments exerts a fundamental influence on subglacial physical, chemical, and ecological environments; the degree of isolation; and the evolution of life. Subglacial hydrology at a continental scale must be mapped and modeled to evaluate past drainage events, map subglacial water, and quantify subglacial discharges. The geologic records of past hydrologic events will reveal the impact of hydrological events on sediment distribution and landscape evolution. Subglacial environments are "natural" earth-bound macrocosms. In some instances these environments trace their origins to more than 35 million years before present, when Antarctica became encased in ice. As opposed to other habitats on Earth, where solar energy is a primary influence, processes in subglacial environments are mediated by the flow of the overlying ice (a glacial boundary condition) and the flux of heat and possibly fluids from the underlying basin (a tectonic control). Recent findings suggest that a third control on these environments is subglacial hydrology, which will influence water residence time and the delivery of water, materials, and heat to and through subglacial systems. Owing to the lack of solar energy, any microbiological metabolism in these systems must rely on energy and nutrition derived from glacial ice, the bedrock, and/or geothermal sources. For millions of years, many Antarctic subglacial environments have been insulated from weather, the seasons, and celestially controlled climatic changes that establish fundamental constraints on the structure and functioning of most other ecosystems. Subglacial environments provide an opportunity to advance understanding of how life, the environment, climate, and planetary history combine to produce the world as we know it today. Multi-national, interdisciplinary field campaigns during the IPY 2007-2008 will provide fundamental knowledge about the importance of subglacial environments in the history and evolution of Antarctica.
Sun, Qiyuan; Jiang, Juan; Zheng, Yuyi; Wang, Feifeng; Wu, Chunshan; Xie, Rong-Rong
2017-07-01
The distribution variation in chromophoric dissolved organic matter (CDOM) content in mid-latitude subtropical drinking water source reservoirs (MDWSRs) is of great significance for the safety of aquatic environments and human health. CDOM distribution is heavily influenced by biogeochemical processes and anthropogenic activity. However, little is known regarding the impact of component variation and phytoplankton growth on CDOM distribution variation in MDWSRs. Therefore, samples were collected from a representative MDWSR (the Shanzai Reservoir) for analysis. CDOM absorption and fluorescence were measured and analyzed in combination with parallel factor analysis. The results indicated that only two CDOM components were found in the surface water of Shanzai Reservoir: fulvic acid and high-excitation tryptophan, originating from terrestrial and autochthonous sources, respectively. The types of components did not change with the season. The average molecular weight of CDOM increased in proportion to its fulvic acid content. The distribution variation in CDOM content mainly resulted from variation in both CDOM components in summer and from high-excitation tryptophan in winter. Phytoplankton growth strongly influenced the distribution variation of CDOM content in summer; the metabolic processes of Cyanobacteria and Bacillariophyta consumed fulvic acid, while those of Cryptophyta produced high-excitation tryptophan.
Grace, Damian
2009-04-01
The papers in this volume deal with various aspects of the HCB legacy at the Orica plant at Botany. Whether explicitly or implicitly, they are concerned with questions of ethics; with the just distribution of burdens and benefits; with just processes for disposing of dangerous industrial waste; and with a just custodianship of the Botany environment. These ethical issues illustrate the difficulty of securing corporate accountability, and the elusiveness of responsibility within organisations. This paper reflects on some of the issues for ethics raised by the Orica case and their significance for corporate ethics.
The impact of common APSE interface set specifications on space station information systems
NASA Technical Reports Server (NTRS)
Diaz-Herrera, Jorge L.; Sibley, Edgar H.
1986-01-01
Certain types of software facilities are needed in a Space Station Information Systems Environment; the Common APSE (Ada Program Support Environment) Interface Set (CAIS) was proposed as a means of satisfying them. The reasonableness of this is discussed by examining the current CAIS, considering the changes due to the latest Requirements and Criteria (RAC) document, and postulating the effects on the CAIS 2.0. Finally, a few additional comments are made on the problems inherent in the Ada language itself, especially on its deficiencies when used for implementing large distributed processing and data base applications.
Probabilistic Verification of Multi-Robot Missions in Uncertain Environments
2015-11-01
... has been used to measure the environment, including any dynamic obstacles. However, no matter how the model originates, this approach is based on ... modeled as bivariate Gaussian distributions and estimated by calibration measurements. The Robot process model is described in prior work [13] ...
Liu, Hou-Qi; Lam, James C W; Li, Wen-Wei; Yu, Han-Qing; Lam, Paul K S
2017-05-15
Municipal wastewater treatment plants (WWTPs) are an important source of pharmaceuticals released into the environment. Understanding how various pharmaceuticals are distributed and handled in WWTPs is a prerequisite to optimizing their abatement. Here we investigated the spatial distribution and removal efficiencies of pharmaceuticals in China's WWTPs. A total of 35 pharmaceuticals in wastewater samples from 12 WWTPs in different cities of China were analyzed. Among the detected pharmaceuticals, caffeine showed the highest concentration (up to 1775.98 ng L-1) in the WWTP influent. In addition, there were significant regional differences in pharmaceutical distribution, with higher influent concentrations of total pharmaceuticals detected in WWTPs in the northern cities than in the southern ones. The state-of-the-art treatment processes were generally inefficient in removing pharmaceuticals. Only 14.3% of the pharmaceuticals were removed effectively (mean removal efficiency > 70%), while 51.4% had a removal rate below 30%. The anaerobic/anoxic/oxic (AAO)-membrane bioreactor (MBR) integrated process and the sequencing batch reactor (SBR) showed better performance than the AAO and oxidation ditch (OD) processes. Ofloxacin, erythromycin-H2O, clarithromycin, roxithromycin and sulfamethoxazole in WWTP effluents exhibited a high or medium ecological risk and deserve special attention.
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach.
An IoT-Based Computational Framework for Healthcare Monitoring in Mobile Environments.
Mora, Higinio; Gil, David; Terol, Rafael Muñoz; Azorín, Jorge; Szymanski, Julian
2017-10-10
The new Internet of Things paradigm allows for small devices with sensing, processing and communication capabilities to be designed, which enable the development of sensors, embedded devices and other 'things' ready to understand the environment. In this paper, a distributed framework based on the Internet of Things paradigm is proposed for monitoring human biomedical signals in activities involving physical exertion. The main advantage and novelty of the proposed system is the flexibility in computing the health application by using resources from available devices inside the body area network of the user. This proposed framework can be applied to other mobile environments, especially those where intensive data acquisition and high processing needs take place. Finally, we present a case study to validate our proposal, which consists of monitoring footballers' heart rates during a football match. The real-time data acquired by these devices serve a clear social objective: predicting not only situations of sudden death but also possible injuries.
Evidence of Chemical Cloud Processing from In Situ Measurements in the Polluted Marine Environment
NASA Astrophysics Data System (ADS)
Hudson, J. G.; Noble, S. R., Jr.
2017-12-01
Chemical cloud processing alters activated cloud condensation nuclei (CCN). Aqueous oxidation of trace gases dissolved within cloud droplets adds soluble material. As most cloud droplets evaporate, the residual material produces CCN that are larger and have a different hygroscopicity (κ). This improves the CCN, lowering the critical supersaturation (Sc) and making them more easily activated. This process separates the processed (accumulation) and unprocessed (Aitken) modes, creating bimodal CCN distributions (Hudson et al., 2015). Various measurements made during the MArine Stratus/stratocumulus Experiment (MASE), including CCN, exhibited aqueous processing signals. Particle size distributions, measured by a differential mobility analyzer, were compared with CCN distributions, measured by the Desert Research Institute CCN spectrometer, by converting size to Sc using κ to overlay concurrent distributions. By tuning each mode to the best agreement, κ for each mode is determined: the processed κ (κp) and the unprocessed κ (κu). In MASE, 59% of bimodal distributions had different κ for the two modes, indicating dominance of chemical processing via aqueous oxidation. This is consistent with Hudson et al. (2015). Figure 1A also indicates chemical processing with larger κp between 0.35 and 0.75. Processed CCN had an influx of soluble material from aqueous oxidation, which increased κp versus κu. Above 0.75, κp is lower than κu (Fig. 1A). When κu is high and sulfate material is added, κp tends towards the κ of the added material. Thus, κp is reduced by additional material that is less soluble than the original material. Chemistry measurements in MASE also indicate in-cloud aqueous oxidation (Fig. 1B and 1C). Higher fractions of CCN concentrations in the processed mode are also associated with larger amounts of sulfates (Fig. 1B, red) and nitrates (Fig. 1C, orange), while SO2 (Fig. 1B, black) and O3 (Fig. 1C, blue) have lower amounts. This larger amount of sulfate is at the expense of SO2, indicating aqueous oxidation within cloud associated with larger concentrations in the processed mode. Thus, in situ measurements indicate that chemical cloud processing alters the size, Sc and κ of activated CCN. Hudson et al. (2015), JGRA, 120, 3436-3452.
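The size-to-Sc conversion mentioned above is commonly done with the single-parameter κ-Köhler approximation; the sketch below is a minimal illustration of that relation, with constants and example values that are illustrative rather than taken from the MASE data.

    import numpy as np

    def critical_supersaturation(d_dry_nm, kappa, T=293.15):
        """Approximate critical supersaturation (%) for a dry diameter [nm]
        and hygroscopicity kappa, using the kappa-Koehler approximation."""
        sigma_w, M_w, rho_w, R = 0.072, 0.018, 1000.0, 8.314   # SI units
        A = 4.0 * sigma_w * M_w / (R * T * rho_w)               # Kelvin term [m]
        d = d_dry_nm * 1e-9
        return (np.exp(np.sqrt(4.0 * A**3 / (27.0 * kappa * d**3))) - 1.0) * 100.0

    # A 100 nm particle: a higher (processed-mode-like) kappa gives a lower Sc
    # than a lower (unprocessed-mode-like) kappa.
    print(critical_supersaturation(100.0, kappa=0.55))
    print(critical_supersaturation(100.0, kappa=0.20))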
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDPs). In the past ten years, stamping simulation has become the most effective validation tool in predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to supporting fast VDPs and tooling readiness. Since 1997, General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Miniature Brain Decision Making in Complex Visual Environments
2008-07-18
The grantee investigated, using the honeybee (Apis mellifera) as a model ... successful for understanding face processing in both human adults and infants. Individual honeybees (Apis mellifera) were trained with ... for 30 bees (group 3) of the target stimuli. Bernard J, Stach S, Giurfa M (2007) Categorization of visual stimuli in the honeybee Apis mellifera.
ERIC Educational Resources Information Center
Harjunen, Elina
2012-01-01
In this theoretical paper the role of power in classroom interactions is examined in terms of a dominance continuum to advance a theoretical framework justifying the emergence of three ways of distributing power when it comes to dealing with the control over the teaching-studying-learning (TSL) "pattern of teacher domination," "pattern of…
Planetary geomorphology research: FY 1990-1991
NASA Technical Reports Server (NTRS)
Malin, M. C.
1991-01-01
Progress in the following research areas is discussed: (1) volatile ice sublimation in a simulated Martian polar environment; (2) a global synthesis of Venusian tectonics; (3) a summary of nearly a decade of field studies of eolian processes in cold volcanic deserts; and (4) a model for interpretation of Martian sediment distribution using Viking observations. Some conclusions from the research are presented.
Jeffrey R. Garnas; David R. Houston; Mark J. Twery; Matthew P. Ayres; Celia Evans
2013-01-01
Spatial pattern in the distribution and abundance of organisms is an emergent property of collective rates of reproduction, survival and movement of individuals in a heterogeneous environment. The form, intensity and scale of spatial patterning can be used to test hypotheses regarding the relative importance of candidate processes to population dynamics. Using 84 plots...
A temporal analysis of urban forest carbon storage using remote sensing
Soojeong Myeong; David J. Nowak; Michael J. Duggin
2006-01-01
Quantifying the carbon storage, distribution, and change of urban trees is vital to understanding the role of vegetation in the urban environment. At present, this is mostly achieved through ground study. This paper presents a method based on the satellite image time series, which can save time and money and greatly speed the process of urban forest carbon storage...
ERIC Educational Resources Information Center
Turk-Browne, Nicholas B.; Scholl, Brian J.; Chun, Marvin M.; Johnson, Marcia K.
2009-01-01
Our environment contains regularities distributed in space and time that can be detected by way of statistical learning. This unsupervised learning occurs without intent or awareness, but little is known about how it relates to other types of learning, how it affects perceptual processing, and how quickly it can occur. Here we use fMRI during…
NASA Astrophysics Data System (ADS)
Sposini, Vittoria; Chechkin, Aleksei V.; Seno, Flavio; Pagnini, Gianni; Metzler, Ralf
2018-04-01
A considerable number of systems have recently been reported in which Brownian yet non-Gaussian dynamics was observed. These are processes characterised by a linear growth in time of the mean squared displacement, yet the probability density function of the particle displacement is distinctly non-Gaussian, and often of exponential (Laplace) shape. This apparently ubiquitous behaviour observed in very different physical systems has been interpreted as resulting from diffusion in inhomogeneous environments and mathematically represented through a variable, stochastic diffusion coefficient. Indeed different models describing a fluctuating diffusivity have been studied. Here we present a new view of the stochastic basis describing time-dependent random diffusivities within a broad spectrum of distributions. Concretely, our study is based on the very generic class of the generalised Gamma distribution. Two models for the particle spreading in such random diffusivity settings are studied. The first belongs to the class of generalised grey Brownian motion while the second follows from the idea of diffusing diffusivities. The two processes exhibit significant characteristics which reproduce experimental results from different biological and physical systems. We promote these two physical models for the description of stochastic particle motion in complex environments.
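As a toy illustration of the random-diffusivity idea described above (not the paper's generalised-Gamma formulation), the sketch below draws a Gamma-distributed diffusivity for each trajectory; this already yields a linearly growing mean squared displacement together with a non-Gaussian (heavy-tailed) displacement distribution. Parameters are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n_traj, n_steps, dt = 20_000, 200, 0.01

    # Each trajectory gets its own diffusivity drawn from a Gamma distribution
    # (a special case of the generalised Gamma class discussed in the paper).
    D = rng.gamma(shape=1.0, scale=1.0, size=n_traj)

    # Brownian steps with trajectory-specific variance 2*D*dt.
    steps = rng.normal(0.0, np.sqrt(2.0 * D[:, None] * dt), size=(n_traj, n_steps))
    x = np.cumsum(steps, axis=1)

    msd = np.mean(x**2, axis=0)                  # grows linearly in time
    excess_kurtosis = np.mean(x[:, -1]**4) / np.mean(x[:, -1]**2)**2 - 3.0
    print("MSD/t at final time ~", msd[-1] / (n_steps * dt))   # ~ 2*<D>
    print("excess kurtosis of displacements:", excess_kurtosis)  # > 0: non-Gaussian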
Limiting similarity and functional diversity along environmental gradients
Schwilk, D.W.; Ackerly, D.D.
2005-01-01
Recent developments in community models emphasize the importance of incorporating stochastic processes (e.g. ecological drift) in models of niche-structured community assembly. We constructed a finite, spatially explicit, lottery model to simulate the distribution of species in a one-dimensional landscape with an underlying gradient in environmental conditions. Our framework combines the potential for ecological drift with environmentally-mediated competition for space in a heterogeneous environment. We examined the influence of niche breadth, dispersal distances, community size (total number of individuals) and the breadth of the environmental gradient on levels of species and functional trait diversity (i.e. differences in niche optima). Three novel results emerge from this model: (1) niche differences between adjacent species (e.g. limiting similarity) increase in smaller communities, because of the interaction of competitive effects and finite population sizes; (2) immigration from a regional species pool, stochasticity and niche-assembly generate a bimodal distribution of species residence times ('transient' and 'resident') under a heterogeneous environment; and (3) the magnitude of environmental heterogeneity has a U-shaped effect on diversity, because of shifts in species richness of resident vs. transient species. These predictions illustrate the potential importance of stochastic (although not necessarily neutral) processes in community assembly.
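The authors' model is not reproduced here; the sketch below is only a toy one-dimensional lottery model in the spirit of the one described, with niche-based recruitment and local dispersal on an environmental gradient. All parameter values are illustrative.

    import numpy as np

    rng = np.random.default_rng(7)
    n_sites, n_species, n_steps = 200, 10, 20_000

    env = np.linspace(0.0, 1.0, n_sites)            # 1-D environmental gradient
    optima = np.linspace(0.0, 1.0, n_species)       # species niche optima
    niche_breadth = 0.15

    community = rng.integers(n_species, size=n_sites)   # one individual per site

    for _ in range(n_steps):
        site = rng.integers(n_sites)                     # an individual dies
        # Nearby individuals (limited dispersal) compete for the empty site,
        # weighted by how well their niche optimum matches the local environment.
        lo, hi = max(0, site - 5), min(n_sites, site + 6)
        candidates = community[lo:hi]
        fitness = np.exp(-((optima[candidates] - env[site]) / niche_breadth) ** 2)
        community[site] = rng.choice(candidates, p=fitness / fitness.sum())

    # Surviving species sort along the gradient: mean position per species.
    for s in np.unique(community):
        print(f"species {s}: mean position {env[community == s].mean():.2f}")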
Artificial intelligence in a mission operations and satellite test environment
NASA Technical Reports Server (NTRS)
Busse, Carl
1988-01-01
A Generic Mission Operations System using Expert System technology to demonstrate the potential of Artificial Intelligence (AI) for automated monitoring and control functions in a Mission Operations and Satellite Test environment will be developed at the National Aeronautics and Space Administration (NASA) Jet Propulsion Laboratory (JPL). Expert system techniques in a real-time operations environment are being studied and applied to science and engineering data processing. Advanced decommutation schemes and intelligent display technology will be examined to develop imaginative improvements in rapid interpretation and distribution of information. The Generic Payload Operations Control Center (GPOCC) will demonstrate improved data handling accuracy, flexibility, and responsiveness in a complex mission environment. The ultimate goal is to automate repetitious mission operations, instrument, and satellite test functions through the application of expert system technology and artificial intelligence resources, and to enhance the level of man-machine sophistication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whicker, F.W.; Pinder, J.E. III; Bowling, J.W.
1989-05-01
The gradual senescence of present-day operating nuclear facilities, and resultant contamination of aquatic and terrestrial ecosystems, emphasize the importance of understanding the behavior of radionuclides in the environment. Observations and deductions concerning mechanisms of radionuclide transport can contribute significantly to knowledge of fundamental ecological processes. This study emphasized the ecosystem-level distribution of several long-lived radionuclides in an abandoned reactor cooling impoundment after a twenty year period of chemical and biological equilibration. 90 refs., 14 figs., 5 tabs.
An Intelligent Archive Testbed Incorporating Data Mining
NASA Technical Reports Server (NTRS)
Ramapriyan, H.; Isaac, D.; Yang, W.; Bonnlander, B.; Danks, D.
2009-01-01
Many significant advances have occurred during the last two decades in remote sensing instrumentation, computation, storage, and communication technology. A series of Earth observing satellites have been launched by U.S. and international agencies and have been operating and collecting global data on a regular basis. These advances have created a data-rich environment for scientific research and applications. NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) has been operational since August 1994 with support for pre-EOS data. Currently, EOSDIS supports all the EOS missions including Terra (1999), Aqua (2002), ICESat (2002) and Aura (2004). EOSDIS has been effectively capturing, processing and archiving several terabytes of standard data products each day. It has also been distributing these data products at a rate of several terabytes per day to a diverse and globally distributed user community (Ramapriyan et al. 2009). There are other NASA-sponsored data system activities including measurement-based systems such as the Ocean Data Processing System and the Precipitation Processing System, and several projects under the Research, Education and Applications Solutions Network (REASoN), Making Earth Science Data Records for Use in Research Environments (MEaSUREs), and the Advancing Collaborative Connections for Earth-Sun System Science (ACCESS) programs. Together, these activities provide a rich set of resources constituting a value chain for users to obtain data at various levels ranging from raw radiances to interdisciplinary model outputs. The result has been a significant leap in our understanding of the Earth systems that all humans depend on for their enjoyment, livelihood, and survival. The trend in the community today is towards many distributed sets of providers of data and services. Despite this, visions for the future include users being able to locate, fuse and utilize data with location transparency and a high degree of interoperability, and being able to convert data to information and usable knowledge in an efficient, convenient manner, aided significantly by automation (Ramapriyan et al. 2004; NASA 2005). We can look upon the distributed provider environment with capabilities to convert data to information and to knowledge as an Intelligent Archive in the Context of a Knowledge Building System (IA-KBS). Some of the key capabilities of an IA-KBS are: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting (Ramapriyan et al. 2004).
Smart sensing surveillance system
NASA Astrophysics Data System (ADS)
Hsu, Charles; Chu, Kai-Dee; O'Looney, James; Blake, Michael; Rutar, Colleen
2010-04-01
An effective public safety sensor system for heavily-populated applications requires sophisticated and geographically-distributed infrastructures, centralized supervision, and deployment of large-scale security and surveillance networks. Artificial intelligence in sensor systems is a critical design element for raising awareness levels, improving system performance, and adapting to changing scenarios and environments. In this paper, a highly-distributed, fault-tolerant, and energy-efficient Smart Sensing Surveillance System (S4) is presented to efficiently provide 24/7, all-weather security operation in crowded environments or restricted areas. Technically, the S4 consists of a number of distributed sensor nodes integrated with specific passive sensors to rapidly collect, process, and disseminate heterogeneous sensor data from nearly all directions. These distributed sensor nodes can work cooperatively to send immediate security information when new objects appear. When new objects are detected, the S4 will smartly select the available node with a Pan-Tilt-Zoom (PTZ) electro-optical/infrared (EO/IR) camera to track the objects and capture associated imagery. The S4 provides advanced on-board digital image processing capabilities to detect and track specific objects. The imaging detection operations include unattended object detection, human feature and behavior detection, and configurable alert triggers, among others. Other imaging processes can be updated to meet specific requirements and operations. In the S4, all the sensor nodes are connected with a robust, reconfigurable, LPI/LPD (Low Probability of Intercept/Low Probability of Detection) wireless mesh network using ultra-wideband (UWB) RF technology. This UWB RF technology can provide an ad-hoc, secure mesh network and the capability to relay network information, communicate, and pass situational awareness and messages. The Service Oriented Architecture of S4 enables remote applications to interact with the S4 network and use the specific presentation methods. In addition, the S4 is compliant with Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards to efficiently discover, access, use, and control heterogeneous sensors and their metadata. These S4 capabilities and technologies have great potential for both military and civilian applications, enabling highly effective security support tools for improving surveillance activities in densely crowded environments. The S4 system is directly applicable to solutions for emergency response personnel, law enforcement, and other homeland security missions, as well as in applications requiring the interoperation of sensor networks with handheld or body-worn interface devices.
Development of a comprehensive software engineering environment
NASA Technical Reports Server (NTRS)
Hartrum, Thomas C.; Lamont, Gary B.
1987-01-01
The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial global approach was developed in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts emphasize natural language interfaces, expert system software development associates, and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.
SigmaCLIPSE = presentation management + NASA CLIPS + SQL
NASA Technical Reports Server (NTRS)
Weiss, Bernard P., Jr.
1990-01-01
SigmaCLIPSE provides an expert systems and 'intelligent' data base development program for diverse systems integration environments that require support for automated reasoning and expert systems technology, presentation management, and access to 'intelligent' SQL data bases. The SigmaCLIPSE technology, and its integrated ability to access 4th generation application development and decision support tools through a portable SQL interface, comprises a sophisticated software development environment for solving knowledge engineering and expert systems development problems in information-intensive commercial environments -- financial services, health care, and distributed process control -- where the expert system must be extendable, a major architectural advantage of NASA CLIPS. SigmaCLIPSE is a research effort intended to test the viability of merging SQL data bases with expert systems technology.
Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework
NASA Astrophysics Data System (ADS)
Gannon, C.
2017-12-01
As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
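The abstract names XArray and Dask but not a specific workflow, so the sketch below only illustrates the general pattern of lazily opening chunked netCDF files and computing a site-specific climate indicator; the file pattern, variable name, location, and threshold are placeholders.

    import xarray as xr

    # Open a multi-file daily temperature dataset lazily with Dask-backed chunks
    # (file pattern and variable name are placeholders for a downscaled product).
    ds = xr.open_mfdataset("tasmax_day_*.nc", chunks={"time": 365}, combine="by_coords")

    # Example site-specific risk metric: annual count of days above 35 C at the
    # grid cell nearest a location of interest (tasmax assumed to be in kelvin).
    site = ds["tasmax"].sel(lat=37.8, lon=-122.3, method="nearest")
    hot_days = (site > 35.0 + 273.15).groupby("time.year").sum()

    # Nothing has been read yet; .compute() triggers the distributed computation.
    print(hot_days.compute())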
An Autonomous Distributed Fault-Tolerant Local Positioning System
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2017-01-01
We describe a fault-tolerant, GPS-independent (Global Positioning System), distributed autonomous positioning system for static/mobile objects and present solutions for providing highly accurate geo-location data for these objects in dynamic environments. The reliability and accuracy of a positioning system fundamentally depend on two factors: its timeliness in broadcasting signals and the knowledge of its geometry, i.e., the locations of and distances between the beacons. Existing distributed positioning systems either synchronize to a common external source like GPS or establish their own time synchrony using a master-slave-like scheme, designating a particular beacon as the master to which the other beacons synchronize, resulting in a single point of failure. Another drawback of existing positioning systems is that they do not address various fault manifestations, in particular communication link failures, which, as in wireless networks, increasingly dominate process failures and are typically transient and mobile, in the sense that they affect different messages to/from different processes over time.
Butyltin sorption onto freshwater sediments: from batch experiments to the field values
NASA Astrophysics Data System (ADS)
Bancon-Montingy, C.; Aubert, G.; Chahinian, N.; Meyer, J.; Brunel, V.; Tournoud, M. G.
2009-04-01
Butyltins, and most particularly TBT, were widely used by industry in the 1970s and 1980s, namely as anti-fouling paints on ships. Although banned since 2003 in Europe, surveys still point out the presence of these compounds in both coastal and terrestrial environments. The resilience of organotin (OT) compounds can be explained by their high adsorption capacity. OTs can bond easily to particulate matter and "migrate" from the water column onto the sediments, where their half-life can extend to a few decades. Consequently, sediments can become important organotin stores and release OT compounds during dredging operations, storms, tides or floods. Studies on OT behavior in freshwater environments, mainly sediments, are scarce in the literature compared with marine sediments. However, it is known that the sorption behaviour of organotin compounds on sediments is governed by the constituents of the sediments and the composition of the interstitial water in the sediments and overlying water, i.e. grain size distribution, clay minerals, organic matter, iron, aluminium (hydr)oxides and carbonate in the sediments; salinity, ionic composition, and pH of interstitial water in the sediments and overlying water. The main objective of this work is to assess butyltin adsorption onto the sediments of an intermittent river located in southern France: the Vène. Sediments were collected during high and low flow conditions, and batch experiments were set up using "natural" and "crushed" sediments to assess the adsorption kinetics. Classical batch experiments and GC-ICP-MS analysis were carried out to measure the distribution coefficient (Kd). The influence of organic substances on sorption processes for organotin species was studied, and the role of grain size distribution was assessed by comparing natural and crushed sediments. The results indicated that organotin compounds are sorbed easily and quickly on freshwater sediments. The adsorption isotherm for butyltins follows the Freundlich equation, which is used to describe the adsorption behaviour of non-polar organic matter. This is due to their organic substituent groups. The presence of organic matter modifies the sorption process: less OT is adsorbed onto the sediments. This leads to increased OT concentrations in solution and consequently a higher probability of assimilation by freshwater organisms. The comparison of our results to those reported in the literature for marine environments could not be carried out because of the wide differences in salinity and grain size distribution between the two environments.
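The study's measured values are not reported in the abstract, so the sketch below only illustrates, with invented example numbers, how batch-experiment data of this kind are typically reduced: computing the distribution coefficient Kd for each point and fitting a Freundlich isotherm qe = Kf * Ce^(1/n) by nonlinear least squares.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical batch-experiment data: equilibrium TBT concentration in
    # solution Ce [ug/L] and sorbed amount qe [ug/kg] (values are illustrative).
    Ce = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 20.0])
    qe = np.array([180., 320., 700., 1200., 2000., 3300.])

    # Distribution coefficient Kd = qe / Ce for each point [L/kg].
    print("Kd:", qe / Ce)

    # Freundlich isotherm qe = Kf * Ce**(1/n), fitted by nonlinear least squares.
    def freundlich(ce, kf, n_inv):
        return kf * ce ** n_inv

    (kf, n_inv), _ = curve_fit(freundlich, Ce, qe, p0=(300.0, 0.8))
    print(f"Kf = {kf:.1f}, 1/n = {n_inv:.2f}")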
NASA Technical Reports Server (NTRS)
Davis, George; Cary, Everett; Higinbotham, John; Burns, Richard; Hogie, Keith; Hallahan, Francis
2003-01-01
The paper will provide an overview of the web-based distributed simulation software system developed for end-to-end, multi-spacecraft mission design, analysis, and test at the NASA Goddard Space Flight Center (GSFC). This software system was developed for an internal research and development (IR&D) activity at GSFC called the Distributed Space Systems (DSS) Distributed Synthesis Environment (DSE). The long-term goal of the DSS-DSE is to integrate existing GSFC stand-alone test beds, models, and simulation systems to create a "hands on", end-to-end simulation environment for mission design, trade studies and simulations. The short-term goal of the DSE was therefore to develop the system architecture and then to prototype the core software simulation capability based on a distributed computing approach, with demonstrations of some key capabilities by the end of Fiscal Year 2002 (FY02). To achieve the DSS-DSE IR&D objective, the team adopted a reference model and mission upon which FY02 capabilities were developed. The software was prototyped according to the reference model, and demonstrations were conducted for the reference mission to validate interfaces, concepts, etc. The reference model, illustrated in Fig. 1, included both space and ground elements, addressing functional capabilities such as spacecraft dynamics and control, science data collection, space-to-space and space-to-ground communications, mission operations, science operations, and data processing, archiving and distribution.
NASA Astrophysics Data System (ADS)
Dong, Yaxue; Fang, Xiaohua; Brain, D. A.; McFadden, James P.; Halekas, Jasper; Connerney, Jack
2015-04-01
The Mars-solar wind interaction accelerates and transports planetary ions away from the Martian atmosphere through a number of processes, including ‘pick-up’ by electromagnetic fields. The MAVEN spacecraft has made routine observations of escaping planetary ions since its arrival at Mars in September 2014. The SupraThermal And Thermal Ion Composition (STATIC) instrument measures the ion energy, mass, and angular spectra. It has detected energetic planetary ions during most of the spacecraft orbits, which are attributed to the pick-up process. We found significant variations in the escaping ion mass and velocity distributions from the STATIC data, which can be explained by factors such as varying solar wind conditions, contributions of particles from different source locations and different phases during the pick-up process. We also study the spatial distributions of different planetary ion species, which can provide insight into the physics of ion escaping process and enhance our understanding of atmospheric erosion by the solar wind. Our results will be further interpreted within the context of the upstream solar wind conditions measured by the MAVEN Solar Wind Ion Analyzer (SWIA) instrument and the magnetic field environment measured by the Magnetometer (MAG) instrument. Our study shows that the ion spatial distribution in the Mars-Sun-Electric-Field (MSE) coordinate system and the velocity space distribution with respect to the local magnetic field line can be used to distinguish the ions escaping through the polar plume and those through the tail region. The contribution of the polar plume ion escape to the total escape rate will also be discussed.
NASA Astrophysics Data System (ADS)
Trenkel, V. M.; Huse, G.; MacKenzie, B. R.; Alvarez, P.; Arrizabalaga, H.; Castonguay, M.; Goñi, N.; Grégoire, F.; Hátún, H.; Jansen, T.; Jacobsen, J. A.; Lehodey, P.; Lutcavage, M.; Mariani, P.; Melvin, G. D.; Neilson, J. D.; Nøttestad, L.; Óskarsson, G. J.; Payne, M. R.; Richardson, D. E.; Senina, I.; Speirs, D. C.
2014-12-01
This paper reviews the current knowledge on the ecology of widely distributed pelagic fish stocks in the North Atlantic basin with emphasis on their role in the food web and the factors determining their relationship with the environment. We consider herring (Clupea harengus), mackerel (Scomber scombrus), capelin (Mallotus villosus), blue whiting (Micromesistius poutassou), and horse mackerel (Trachurus trachurus), which have distributions extending beyond the continental shelf and predominantly occur on both sides of the North Atlantic. We also include albacore (Thunnus alalunga), bluefin tuna (Thunnus thynnus), swordfish (Xiphias gladius), and blue marlin (Makaira nigricans), which, by contrast, show large-scale migrations at the basin scale. We focus on the links between life history processes and the environment, horizontal and vertical distribution, spatial structure and trophic role. Many of these species carry out extensive migrations from spawning grounds to nursery and feeding areas. Large oceanographic features such as the North Atlantic subpolar gyre play an important role in determining spatial distributions and driving variations in stock size. Given the large biomasses of especially the smaller species considered here, these stocks can exert significant top-down pressures on the food web and are important in supporting higher trophic levels. The review reveals commonalities and differences between the ecology of widely distributed pelagic fish in the NE and NW Atlantic basins, identifies knowledge gaps and modelling needs that the EURO-BASIN project attempts to address.
A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data
NASA Astrophysics Data System (ADS)
Li, Z.; Hodgson, M.; Li, W.
2016-12-01
Light detection and ranging (LiDAR) technologies have proven efficient for quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for Earth and ecological sciences as well as natural disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to its data intensity and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for some specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, this framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe that the proposed framework provides a valuable reference for developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
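As a rough illustration of the tile-based decomposition idea described above, the sketch below bins LiDAR points into fixed-size tiles so that each tile can be processed independently (for example by separate Hadoop map tasks). All names and the tile size are hypothetical; the actual framework's index and decomposition strategies are more sophisticated.

```python
from collections import defaultdict
from multiprocessing import Pool

TILE_SIZE = 500.0  # tile edge length in metres (hypothetical choice)

def tile_key(x, y):
    """Map a point to the key of the tile that contains it."""
    return (int(x // TILE_SIZE), int(y // TILE_SIZE))

def partition(points):
    """Group (x, y, z) points by tile so tiles can be processed independently."""
    tiles = defaultdict(list)
    for x, y, z in points:
        tiles[tile_key(x, y)].append((x, y, z))
    return tiles

def process_tile(item):
    """Stand-in for a per-tile LiDAR task (here: minimum elevation in the tile)."""
    key, pts = item
    return key, min(p[2] for p in pts)

if __name__ == "__main__":
    # Tiny synthetic point cloud; a real run would stream points from LAS files.
    cloud = [(120.0, 80.0, 312.5), (610.0, 90.0, 298.1), (630.0, 70.0, 301.7)]
    with Pool() as pool:
        results = dict(pool.map(process_tile, partition(cloud).items()))
    print(results)
```

In a Hadoop setting the same key-by-tile step would be performed in the map phase and the per-tile work in the reduce phase; the local process pool is only used here to keep the sketch self-contained.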
Minimization of Blast furnace Fuel Rate by Optimizing Burden and Gas Distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Chenn Zhou
2012-08-15
The goal of the research is to improve the competitive edge of steel mills by using advanced CFD technology to optimize the gas and burden distributions inside a blast furnace for achieving the best gas utilization. A state-of-the-art 3-D CFD model has been developed for simulating the gas distribution inside a blast furnace at given burden conditions, burden distribution and blast parameters. The comprehensive 3-D CFD model has been validated by plant measurement data from an actual blast furnace. Validation of the sub-models has also been achieved. A user-friendly software package named Blast Furnace Shaft Simulator (BFSS) has been developed to simulate the blast furnace shaft process. The research has significant benefits to the steel industry in terms of higher productivity, lower energy consumption, and an improved environment.
A geometric theory for Lévy distributions
NASA Astrophysics Data System (ADS)
Eliazar, Iddo
2014-08-01
Lévy distributions are of prime importance in the physical sciences, and their universal emergence is commonly explained by the Generalized Central Limit Theorem (CLT). However, the Generalized CLT is a geometry-less probabilistic result, whereas physical processes usually take place in an embedding space whose spatial geometry is often of substantial significance. In this paper we introduce a model of random effects in random environments which, on the one hand, retains the underlying probabilistic structure of the Generalized CLT and, on the other hand, adds a general and versatile underlying geometric structure. Based on this model we obtain geometry-based counterparts of the Generalized CLT, thus establishing a geometric theory for Lévy distributions. The theory explains the universal emergence of Lévy distributions in physical settings which are well beyond the realm of the Generalized CLT.
A geometric theory for Lévy distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2014-08-15
Lévy distributions are of prime importance in the physical sciences, and their universal emergence is commonly explained by the Generalized Central Limit Theorem (CLT). However, the Generalized CLT is a geometry-less probabilistic result, whereas physical processes usually take place in an embedding space whose spatial geometry is often of substantial significance. In this paper we introduce a model of random effects in random environments which, on the one hand, retains the underlying probabilistic structure of the Generalized CLT and, on the other hand, adds a general and versatile underlying geometric structure. Based on this model we obtain geometry-based counterparts of the Generalized CLT, thus establishing a geometric theory for Lévy distributions. The theory explains the universal emergence of Lévy distributions in physical settings which are well beyond the realm of the Generalized CLT.
Garbage Collection in a Distributed Object-Oriented System
NASA Technical Reports Server (NTRS)
Gupta, Aloke; Fuchs, W. Kent
1993-01-01
An algorithm is described in this paper for garbage collection in distributed systems with object sharing across processor boundaries. The algorithm allows local garbage collection at each node in the system to proceed independently of local collection at the other nodes. It requires no global synchronization or knowledge of the global state of the system and exhibits the capability of graceful degradation. The concept of a specialized dump node is proposed to facilitate the collection of inaccessible circular structures. An experimental evaluation of the algorithm is also described. The algorithm is compared with a corresponding scheme that requires global synchronization. The results show that the algorithm works well in distributed processing environments even when the locality of object references is low.
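A minimal single-process sketch of the general idea that local collection can proceed using only local information: each node's mark-and-sweep treats locally exported objects (those still referenced from other nodes) as additional roots, so no global synchronization is required. This is an illustration of the general approach, not the paper's algorithm; the class, fields and object names are invented here.

```python
class Node:
    """One node's local heap with mark-and-sweep over local references only."""

    def __init__(self):
        self.objects = {}        # object id -> list of locally referenced ids
        self.local_roots = set() # ids reachable from this node's own programs
        self.exported = set()    # ids known to be referenced from remote nodes

    def add(self, oid, refs=()):
        self.objects[oid] = list(refs)

    def collect(self):
        """Mark from local roots plus exported objects, then sweep the rest."""
        marked = set()
        stack = list(self.local_roots | self.exported)
        while stack:
            oid = stack.pop()
            if oid in marked or oid not in self.objects:
                continue
            marked.add(oid)
            stack.extend(self.objects[oid])
        for oid in list(self.objects):
            if oid not in marked:
                del self.objects[oid]   # reclaim locally unreachable objects

node = Node()
node.add("a", ["b"]); node.add("b"); node.add("c"); node.add("d", ["a"])
node.local_roots.add("a")
node.exported.add("c")       # some remote node still holds a reference to "c"
node.collect()
print(sorted(node.objects))  # ['a', 'b', 'c'] -- "d" is unreachable and reclaimed
```

Collecting inaccessible cycles that span nodes (the role of the paper's specialized dump node) is exactly what this purely local view cannot handle, which is why the abstract calls it out as a separate mechanism.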
Issues in ATM Support of High-Performance, Geographically Distributed Computing
NASA Technical Reports Server (NTRS)
Claus, Russell W.; Dowd, Patrick W.; Srinidhi, Saragur M.; Blade, Eric D.G
1995-01-01
This report experimentally assesses the effect of the underlying network in a cluster-based computing environment. The assessment is quantified by application-level benchmarking, process-level communication, and network file input/output. Two testbeds were considered, one small cluster of Sun workstations and another large cluster composed of 32 high-end IBM RS/6000 platforms. The clusters had Ethernet, fiber distributed data interface (FDDI), Fibre Channel, and asynchronous transfer mode (ATM) network interface cards installed, providing the same processors and operating system for the entire suite of experiments. The primary goal of this report is to assess the suitability of an ATM-based, local-area network to support interprocess communication and remote file input/output systems for distributed computing.
Distributed run of a one-dimensional model in a regional application using SOAP-based web services
NASA Astrophysics Data System (ADS)
Smiatek, Gerhard
This article describes the setup of a distributed computing system in Perl. It facilitates the parallel run of a one-dimensional environmental model on a number of simple network PC hosts. The system uses Simple Object Access Protocol (SOAP) driven web services offering the model run on remote hosts and a multi-thread environment distributing the work and accessing the web services. Its application is demonstrated in a regional run of a process-oriented biogenic emission model for the area of Germany. Within a network consisting of up to seven web services implemented on Linux and MS-Windows hosts, a performance increase of approximately 400% has been reached compared to a model run on the fastest single host.
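The article's setup (worker threads dispatching model runs to remote web-service hosts) can be illustrated generically with a thread pool. The sketch below uses hypothetical HTTP/JSON endpoints rather than the original Perl SOAP services, so the URLs, payloads and function names are placeholders.

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request
import json

# Hypothetical model-run endpoints on remote worker hosts.
HOSTS = ["http://host1.example/run", "http://host2.example/run"]

def run_remote(host, cell_id):
    """Ask one remote host to run the 1-D model for a single grid cell."""
    payload = json.dumps({"cell": cell_id}).encode()
    req = urllib.request.Request(host, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=300) as resp:
        return cell_id, json.load(resp)

def run_all(cells):
    """Distribute grid cells across hosts, round-robin, one worker thread per host."""
    with ThreadPoolExecutor(max_workers=len(HOSTS)) as pool:
        futures = [pool.submit(run_remote, HOSTS[i % len(HOSTS)], c)
                   for i, c in enumerate(cells)]
        return dict(f.result() for f in futures)

# Example: results = run_all(range(100))
```

Because each grid cell's model run is independent, the achievable speed-up scales roughly with the number of remote hosts, which is consistent with the roughly 400% gain reported for seven service hosts.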
NASA Astrophysics Data System (ADS)
Klaus, Julian; Smettem, Keith; Pfister, Laurent; Harris, Nick
2017-04-01
There is ongoing interest in understanding and quantifying the travel times and dispersion of solutes moving through stream environments, including the hyporheic zone and/or in-channel dead zones, where retention affects biogeochemical cycling processes that are critical to stream ecosystem functioning. Modelling these transport and retention processes requires acquisition of tracer data from injection experiments where the concentrations are recorded downstream. Such experiments are often time consuming and costly, which may be the reason many modelling studies of chemical transport have tended to rely on relatively few well-documented field case studies. This leads to the need for fast and cheap distributed sensor arrays that respond instantly and record chemical transport at points of interest on timescales of seconds at various locations in the stream environment. To tackle this challenge we present data from several tracer experiments carried out in the Attert river catchment in Luxembourg, employing low-cost (on the order of a euro per sensor) potentiometric chloride sensors in a distributed array. We injected NaCl under various baseflow conditions in streams of different morphologies and observed solute transport at various distances and locations. These data are used to benchmark the sensors against data obtained from more expensive electrical conductivity meters. Furthermore, the data allowed spatial resolution of hydrodynamic mixing processes and identification of chemical 'dead zones' in the study reaches.
Chiral Polychlorinated Biphenyl Transport, Metabolism and Distribution - A Review
Lehmler, Hans-Joachim; Harrad, Stuart J.; Hühnerfuss, Heinrich; Kania-Korwel, Izabela; Lee, Cindy M.; Lu, Zhe; Wong, Charles S.
2009-01-01
Chirality can be exploited to gain insight into enantioselective fate processes that may otherwise remain undetected, because only biological processes, not the physical and chemical transport and transformation processes of an achiral environment, will change enantiomer compositions. This review provides an in-depth overview of the application of chirality to the study of chiral polychlorinated biphenyls (PCBs), an important group of legacy pollutants. Like other chiral compounds, individual PCB enantiomers may interact enantioselectively (or enantiospecifically) with chiral macromolecules, such as cytochrome P-450 enzymes or ryanodine receptors, leading to differences in their toxicological effects and the enantioselective formation of chiral biotransformation products. Species- and congener-specific enantiomer enrichment has been demonstrated in environmental compartments, wildlife and mammals, including humans, typically due to a complex combination of biotransformation processes and uptake via the diet by passive diffusion. Changes in the enantiomer composition of chiral PCBs in the environment have been used to understand complex aerobic and anaerobic microbial transformation pathways, to delineate and quantify PCB sources and transport in the environment, to gain insight into the biotransformation of PCBs in aquatic food webs, and to investigate the enantioselective disposition of PCBs and their methylsulfonyl PCB metabolites in rodents. Overall, changes in chiral signatures are powerful, but currently underutilized, tools for studies of environmental and biological processes of PCBs. PMID:20384371
Automatic extraction of tree crowns from aerial imagery in urban environment
NASA Astrophysics Data System (ADS)
Liu, Jiahang; Li, Deren; Qin, Xunwen; Yang, Jianfeng
2006-10-01
Traditionally, field-based investigation has been the main method for surveying greenbelt in urban environments, but it is costly and has a low update frequency. In high-resolution imagery, the structure and texture of tree canopies show great statistical similarity despite large differences in canopy configuration, and the surface structure and texture of tree crowns differ markedly from those of other land-cover types. In this paper, we present an automatic method to detect tree crowns from high resolution imagery in urban environments without any a priori knowledge. Our method captures the distinctive structure and texture of tree crown surfaces, uses the variance and mathematical expectation of a defined image window to coarsely position candidate canopy blocks, and then analyses their inner structure and texture to refine these candidate blocks. The admissible ranges of all feature parameters used in our method are generated automatically from a small number of samples, and holes and their distribution are introduced as an important characteristic in the refinement step. The isotropy of candidate image blocks and of the hole distribution is also integrated into the method. After introducing the theory of our method, aerial imagery (with a resolution of about 0.3 m) was used to test it, and the results indicate that our method is an effective approach for automatically detecting tree crowns in urban environments.
Rendell, L.; Boyd, R.; Enquist, M.; Feldman, M. W.; Fogarty, L.; Laland, K. N.
2011-01-01
Darwinian processes should favour those individuals that deploy the most effective strategies for acquiring information about their environment. We organized a computer-based tournament to investigate which learning strategies would perform well in a changing environment. The most successful strategies relied almost exclusively on social learning (here, learning a behaviour performed by another individual) rather than asocial learning, even when environments were changing rapidly; moreover, successful strategies focused learning effort on periods of environmental change. Here, we use data from tournament simulations to examine how these strategies might affect cultural evolution, as reflected in the amount of culture (i.e. number of cultural traits) in the population, the distribution of cultural traits across individuals, and their persistence through time. We found that high levels of social learning are associated with a larger amount of more persistent knowledge, but a smaller amount of less persistent expressed behaviour, as well as more uneven distributions of behaviour, as individuals concentrated on exploiting a smaller subset of behaviour patterns. Increased rates of environmental change generated increases in the amount and evenness of behaviour. These observations suggest that copying confers on cultural populations an adaptive plasticity, allowing them to respond to changing environments rapidly by drawing on a wider knowledge base. PMID:21357234
Rendell, L; Boyd, R; Enquist, M; Feldman, M W; Fogarty, L; Laland, K N
2011-04-12
Darwinian processes should favour those individuals that deploy the most effective strategies for acquiring information about their environment. We organized a computer-based tournament to investigate which learning strategies would perform well in a changing environment. The most successful strategies relied almost exclusively on social learning (here, learning a behaviour performed by another individual) rather than asocial learning, even when environments were changing rapidly; moreover, successful strategies focused learning effort on periods of environmental change. Here, we use data from tournament simulations to examine how these strategies might affect cultural evolution, as reflected in the amount of culture (i.e. number of cultural traits) in the population, the distribution of cultural traits across individuals, and their persistence through time. We found that high levels of social learning are associated with a larger amount of more persistent knowledge, but a smaller amount of less persistent expressed behaviour, as well as more uneven distributions of behaviour, as individuals concentrated on exploiting a smaller subset of behaviour patterns. Increased rates of environmental change generated increases in the amount and evenness of behaviour. These observations suggest that copying confers on cultural populations an adaptive plasticity, allowing them to respond to changing environments rapidly by drawing on a wider knowledge base.
A system for distributed intrusion detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snapp, S.R.; Brentano, J.; Dias, G.V.
1991-01-01
The study of providing security in computer networks is a rapidly growing area of interest because the network is the medium over which most attacks or intrusions on computer systems are launched. One approach to solving this problem is the intrusion-detection concept, whose basic premise is that abandoning the existing, huge infrastructure of possibly insecure computer and network systems is impossible, and that replacing them with totally secure systems may not be feasible or cost-effective. Previous work on intrusion-detection systems was performed on stand-alone hosts and on a broadcast local area network (LAN) environment. The focus of our present research is to extend our network intrusion-detection concept from the LAN environment to arbitrarily wider areas, with the network topology being arbitrary as well. The generalized distributed environment is heterogeneous, i.e., the network nodes can be hosts or servers from different vendors, and some of them could be LAN managers, such as the network security monitor (NSM) of our previous work. The proposed architecture for this distributed intrusion-detection system consists of the following components: a host manager in each host; a LAN manager for monitoring each LAN in the system; and a central manager, placed at a single secure location, which receives reports from the various host and LAN managers, processes these reports, correlates them, and detects intrusions. 11 refs., 2 figs.
Anaerobic methane oxidation coupled to denitrification is the dominant methane sink in a deep lake
Deutzmann, Joerg S.; Stief, Peter; Brandes, Josephin; Schink, Bernhard
2014-01-01
Anaerobic methane oxidation coupled to denitrification, also known as “nitrate/nitrite-dependent anaerobic methane oxidation” (n-damo), was discovered in 2006. Since then, only a few studies have identified this process and the associated microorganisms in natural environments. In aquatic sediments, the close proximity of oxygen- and nitrate-consumption zones can mask n-damo as aerobic methane oxidation. We therefore investigated the vertical distribution and the abundance of denitrifying methanotrophs related to Candidatus Methylomirabilis oxyfera with cultivation-independent molecular techniques in the sediments of Lake Constance. Additionally, the vertical distribution of methane oxidation and nitrate consumption zones was inferred from high-resolution microsensor profiles in undisturbed sediment cores. M. oxyfera-like bacteria were virtually absent at shallow-water sites (littoral sediment) and were very abundant at deep-water sites (profundal sediment). In profundal sediment, the vertical distribution of M. oxyfera-like bacteria showed a distinct peak in anoxic layers that coincided with the zone of methane oxidation and nitrate consumption, a strong indication for n-damo carried out by M. oxyfera-like bacteria. Both potential n-damo rates calculated from cell densities (660–4,890 µmol CH4⋅m−2⋅d−1) and actual rates calculated from microsensor profiles (31–437 µmol CH4⋅m−2⋅d−1) were sufficiently high to prevent methane release from profundal sediment solely by this process. Additionally, when nitrate was added to sediment cores exposed to anoxic conditions, the n-damo zone reestablished well below the sediment surface, completely preventing methane release from the sediment. We conclude that the previously overlooked n-damo process can be the major methane sink in stable freshwater environments if nitrate is available in anoxic zones. PMID:25472842
Anaerobic methane oxidation coupled to denitrification is the dominant methane sink in a deep lake.
Deutzmann, Joerg S; Stief, Peter; Brandes, Josephin; Schink, Bernhard
2014-12-23
Anaerobic methane oxidation coupled to denitrification, also known as "nitrate/nitrite-dependent anaerobic methane oxidation" (n-damo), was discovered in 2006. Since then, only a few studies have identified this process and the associated microorganisms in natural environments. In aquatic sediments, the close proximity of oxygen- and nitrate-consumption zones can mask n-damo as aerobic methane oxidation. We therefore investigated the vertical distribution and the abundance of denitrifying methanotrophs related to Candidatus Methylomirabilis oxyfera with cultivation-independent molecular techniques in the sediments of Lake Constance. Additionally, the vertical distribution of methane oxidation and nitrate consumption zones was inferred from high-resolution microsensor profiles in undisturbed sediment cores. M. oxyfera-like bacteria were virtually absent at shallow-water sites (littoral sediment) and were very abundant at deep-water sites (profundal sediment). In profundal sediment, the vertical distribution of M. oxyfera-like bacteria showed a distinct peak in anoxic layers that coincided with the zone of methane oxidation and nitrate consumption, a strong indication for n-damo carried out by M. oxyfera-like bacteria. Both potential n-damo rates calculated from cell densities (660-4,890 µmol CH4⋅m(-2)⋅d(-1)) and actual rates calculated from microsensor profiles (31-437 µmol CH4⋅m(-2)⋅d(-1)) were sufficiently high to prevent methane release from profundal sediment solely by this process. Additionally, when nitrate was added to sediment cores exposed to anoxic conditions, the n-damo zone reestablished well below the sediment surface, completely preventing methane release from the sediment. We conclude that the previously overlooked n-damo process can be the major methane sink in stable freshwater environments if nitrate is available in anoxic zones.
Arcade: A Web-Java Based Framework for Distributed Computing
NASA Technical Reports Server (NTRS)
Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)
2000-01-01
Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.
Phylogeny, biogeography and diversification patterns of side-necked turtles (Testudines: Pleurodira)
Langer, Max C.; Sterli, Juliana
2018-01-01
Pleurodires or side-necked turtles are today restricted to freshwater environments of South America, Africa–Madagascar and Australia, but in the past they were distributed much more broadly, being found also in Eurasia, India and North America, and in marine environments. Two hypotheses were proposed to explain this distribution; in the first, vicariance would have shaped the current geographical distribution and, in the second, extinctions constrained a previously widespread distribution. Here, we aim to reconstruct pleurodiran biogeographic history and diversification patterns based on a new phylogenetic hypothesis recovered from the analysis of the largest morphological dataset yet compiled for the lineage, testing which biogeographical process prevailed during its evolutionary history. The resulting topology generally agrees with previous hypotheses of the group and shows that most diversification shifts were related to the exploration of new niches, e.g. littoral or marine radiations. In addition, as other turtles, pleurodires do not seem to have been much affected by either the Cretaceous–Palaeogene or the Eocene–Oligocene mass extinctions. The biogeographic analyses highlight the predominance of both anagenetic and cladogenetic dispersal events and support the importance of transoceanic dispersals as a more common driver of area changes than previously thought, agreeing with previous studies with other non-turtle lineages. PMID:29657780
NASA Astrophysics Data System (ADS)
Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal
2015-05-01
When running data intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for the overall computation performance and can reduce the CPU/WallTime ratio through excessive I/O wait. Building on our previous research, we propose a constraint programming based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storages and CPUs) is oversaturated at any moment of time and either (a) that the data is pre-placed at the site where the job runs or (b) that the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles occurring when a job is waiting for I/O from a remote site and would have wide application in the community. Our planner was evaluated and simulated based on data extracted from log files of the batch and data management systems of the STAR experiment. The results of the evaluation and an estimation of the performance improvements are discussed in this paper.
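The planner described above is constraint-programming based; as a much simpler stand-in for the core idea (co-locate jobs with their data so CPUs do not idle on remote I/O, while never exceeding per-site capacity), the greedy sketch below prefers the site that already holds a job's input and otherwise plans a transfer. Site names, capacities and dataset labels are invented for illustration.

```python
# Each job names its input dataset; each site has a CPU capacity and local datasets.
jobs = [{"id": "j1", "data": "d1"}, {"id": "j2", "data": "d2"},
        {"id": "j3", "data": "d1"}, {"id": "j4", "data": "d3"}]
sites = {"siteA": {"cpus": 2, "data": {"d1"}},
         "siteB": {"cpus": 2, "data": {"d2", "d3"}}}

def schedule(jobs, sites):
    """Greedy placement: prefer a site holding the input; otherwise plan a transfer."""
    free = {s: cfg["cpus"] for s, cfg in sites.items()}
    plan = []
    for job in jobs:
        # Sites that already hold the data sort first among sites with a free CPU.
        candidates = sorted((s for s in sites if free[s] > 0),
                            key=lambda s: job["data"] not in sites[s]["data"])
        if not candidates:
            plan.append((job["id"], None, "wait"))   # no capacity this round
            continue
        site = candidates[0]
        free[site] -= 1
        action = "local" if job["data"] in sites[site]["data"] else "transfer"
        plan.append((job["id"], site, action))
    return plan

print(schedule(jobs, sites))
# [('j1', 'siteA', 'local'), ('j2', 'siteB', 'local'),
#  ('j3', 'siteA', 'local'), ('j4', 'siteB', 'local')]
```

A real constraint-programming formulation would additionally bound link bandwidth and storage over time, which is what allows the planner in the paper to guarantee that no resource is oversaturated at any moment.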
A virtual data language and system for scientific workflow management in data grid environments
NASA Astrophysics Data System (ADS)
Zhao, Yong
With advances in scientific instrumentation and simulation, scientific data is growing fast in both size and analysis complexity. So-called Data Grids aim to provide high performance, distributed data analysis infrastructure for data- intensive sciences, where scientists distributed worldwide need to extract information from large collections of data, and to share both data products and the resources needed to produce and store them. However, the description, composition, and execution of even logically simple scientific workflows are often complicated by the need to deal with "messy" issues like heterogeneous storage formats and ad-hoc file system structures. We show how these difficulties can be overcome via a typed workflow notation called virtual data language, within which issues of physical representation are cleanly separated from logical typing, and by the implementation of this notation within the context of a powerful virtual data system that supports distributed execution. The resulting language and system are capable of expressing complex workflows in a simple compact form, enacting those workflows in distributed environments, monitoring and recording the execution processes, and tracing the derivation history of data products. We describe the motivation, design, implementation, and evaluation of the virtual data language and system, and the application of the virtual data paradigm in various science disciplines, including astronomy, cognitive neuroscience.
Phylogeny, biogeography and diversification patterns of side-necked turtles (Testudines: Pleurodira)
NASA Astrophysics Data System (ADS)
Ferreira, Gabriel S.; Bronzati, Mario; Langer, Max C.; Sterli, Juliana
2018-03-01
Pleurodires or side-necked turtles are today restricted to freshwater environments of South America, Africa-Madagascar and Australia, but in the past they were distributed much more broadly, being found also on Eurasia, India and North America, and marine environments. Two hypotheses were proposed to explain this distribution; in the first, vicariance would have shaped the current geographical distribution and, in the second, extinctions constrained a previously widespread distribution. Here, we aim to reconstruct pleurodiran biogeographic history and diversification patterns based on a new phylogenetic hypothesis recovered from the analysis of the largest morphological dataset yet compiled for the lineage, testing which biogeographical process prevailed during its evolutionary history. The resulting topology generally agrees with previous hypotheses of the group and shows that most diversification shifts were related to the exploration of new niches, e.g. littoral or marine radiations. In addition, as other turtles, pleurodires do not seem to have been much affected by either the Cretaceous-Palaeogene or the Eocene-Oligocene mass extinctions. The biogeographic analyses highlight the predominance of both anagenetic and cladogenetic dispersal events and support the importance of transoceanic dispersals as a more common driver of area changes than previously thought, agreeing with previous studies with other non-turtle lineages.
ARTEMIS: a collaborative framework for health care.
Reddy, R.; Jagannathan, V.; Srinivas, K.; Karinthi, R.; Reddy, S. M.; Gollapudy, C.; Friedman, S.
1993-01-01
Patient centered healthcare delivery is an inherently collaborative process. This involves a wide range of individuals and organizations with diverse perspectives: primary care physicians, hospital administrators, labs, clinics, and insurance. The key to cost reduction and quality improvement in health care is effective management of this collaborative process. The use of multi-media collaboration technology can facilitate timely delivery of patient care and reduce cost at the same time. During the last five years, the Concurrent Engineering Research Center (CERC), under the sponsorship of DARPA (Defense Advanced Research Projects Agency, recently renamed ARPA) developed a number of generic key subsystems of a comprehensive collaboration environment. These subsystems are intended to overcome the barriers that inhibit the collaborative process. Three subsystems developed under this program include: MONET (Meeting On the Net)--to provide consultation over a computer network, ISS (Information Sharing Server)--to provide access to multi-media information, and PCB (Project Coordination Board)--to better coordinate focussed activities. These systems have been integrated into an open environment to enable collaborative processes. This environment is being used to create a wide-area (geographically distributed) research testbed under DARPA sponsorship, ARTEMIS (Advance Research Testbed for Medical Informatics) to explore the collaborative health care processes. We believe this technology will play a key role in the current national thrust to reengineer the present health-care delivery system. PMID:8130536
An Intelligent System for Document Retrieval in Distributed Office Environments.
ERIC Educational Resources Information Center
Mukhopadhyay, Uttam; And Others
1986-01-01
MINDS (Multiple Intelligent Node Document Servers) is a distributed system of knowledge-based query engines for efficiently retrieving multimedia documents in an office environment of distributed workstations. By learning document distribution patterns and user interests and preferences during system usage, it customizes document retrievals for…
Designing Distributed Learning Environments with Intelligent Software Agents
ERIC Educational Resources Information Center
Lin, Fuhua, Ed.
2005-01-01
"Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…
Assessment of Lead and Beryllium deposition and adsorption to exposed stream channel sediments
NASA Astrophysics Data System (ADS)
Pawlowski, E.; Karwan, D. L.
2016-12-01
The fallout radionuclides Beryllium-7 and Lead-210 have been shown to be effective sediment tracers that readily bind to particles. Their adsorption capacity has primarily been assessed in marine and coastal environments, with an important assumption being the radionuclides' uniform spatial distribution as fallout from the atmosphere. This neglects localized storm events that may mine stratospheric reserves, creating variable distributions. To test this assumption, atmospheric deposition is collected at the University of Minnesota St. Paul Campus weather station during individual storm events and subsequently analyzed for Beryllium-7 and Lead-210. This provides further insight into continental effects on radionuclide deposition. The study of Beryllium-7 and Lead-210 adsorption in marine and coastal environments has provided valuable insights into the processes that influence the elements' binding to particles, but research has been limited in freshwater river environments. These environments have greater variation in pH, iron oxide content, and dissolved organic carbon (DOC) levels, which have been shown to influence the adsorption of Beryllium and Lead in marine settings. This research assesses the adsorption of Beryllium and Lead to river sediments collected from in-channel deposits by utilizing batch experiments that mimic the stream conditions from which the deposits were collected. Soils were collected from Difficult Run, VA, and the West Swan River, MN. Agitating the soils in a controlled solution of known background electrolyte and pH, while varying the levels of iron oxides and DOC in steps, provides a better understanding of the sorption of Lead and Beryllium under the conditions found within freshwater streams. Pairing the partitioning of Lead and Beryllium with their inputs to streams via depositional processes, from this study and others, allows for their assessment as possible sediment tracers and age-dating tools within the respective watersheds.
O'Donnell, Michael
2015-01-01
State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximately 96.6% decrease in computing time. With a single multicore compute node (bottom result), the computing time showed an 81.8% decrease relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
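The embarrassingly parallel structure mentioned above (independent Monte Carlo replicates of a state-and-transition model) maps naturally onto worker pools. The toy sketch below runs independent replicates of a two-state transition model on local cores, standing in for SyncroSim runs farmed out to HTC/HPC nodes; the states, transition probabilities and names are invented for illustration only.

```python
import random
from multiprocessing import Pool

def replicate(seed, years=50, p_burn=0.05, p_recover=0.20):
    """One Monte Carlo replicate of a toy two-state (shrub/burned) model."""
    rng = random.Random(seed)          # independent stream per replicate
    state, burned_years = "shrub", 0
    for _ in range(years):
        if state == "shrub" and rng.random() < p_burn:
            state = "burned"
        elif state == "burned" and rng.random() < p_recover:
            state = "shrub"
        burned_years += state == "burned"
    return burned_years

if __name__ == "__main__":
    with Pool() as pool:               # each replicate is independent of the others
        results = pool.map(replicate, range(100))
    print(sum(results) / len(results), "mean burned years per 50-year replicate")
```

Because replicates share no state, the same pattern scales from a single multicore node (as here) to a high-throughput cluster simply by changing how the seeds are dispatched, which is essentially the workflow trade-off the study quantifies.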
Analyzing Distributed Functions in an Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
NASA Astrophysics Data System (ADS)
Graham, N. M.
2015-12-01
The evolution and speciation of plants are directly tied to the environment, as the constrained stages of dispersal create strong genetic differentiation among populations. This can result in differing genetic patterns between nuclear and chloroplast loci, where genes are inherited differently and dispersed via separate vectors. By developing distribution models based on genetic patterns found within a species, it is possible to begin understanding the influence of historic geomorphic and/or climatic processes on population evolution. If genetic patterns of the current range correlate with specific patterns of climate variability within the Pleistocene, it is possible that future shifts in species distribution in response to climate change can be more accurately modelled, owing to the historic signature found within inherited genes. Preliminary genetic analyses of Linanthus dichotomus, an annual herb distributed across California, suggest that the current taxonomic treatment does not accurately depict how this species is evolving. Genetic patterns of chloroplast genes suggest that populations are more correlated with biogeography than the current nomenclature states. Additionally, chloroplast and nuclear genes show discrepancies in dispersal across the landscape, suggesting pollinator-driven gene flow overcoming seed-dispersal boundaries. By comparing discrepancies between pollinator- and seed-induced gene flow we may be able to gain insight into historical pollinator communities within the Pleistocene. This information can then be applied to projected climate models to more accurately understand how species and/or communities will respond to a changing environment.
Statham, P J; Connelly, D P; German, C R; Brand, T; Overnell, J O; Bulukin, E; Millard, N; McPhail, S; Pebody, M; Perrett, J; Squire, M; Stevenson, P; Webb, A
2005-12-15
Loch Etive is a fjordic system on the west coast of Scotland. The deep waters of the upper basin are periodically isolated, and during these periods oxygen is lost through benthic respiration and concentrations of dissolved manganese increase. In April 2000 the autonomous underwater vehicle (AUV) Autosub was fitted with an in situ dissolved manganese analyzer and was used to study the spatial variability of this element together with oxygen, salinity, and temperature throughout the basin. Six along-loch transects were completed at either constant height above the seafloor or at constant depth below the surface. The ca. 4000 in situ 10-s-average dissolved Mn (Mnd) data points obtained provide a new quasi-synoptic and highly detailed view of the distribution of manganese in this fjordic environment not possible using conventional (water bottle) sampling. There is substantial variability in concentrations (<25 to >600 nM) and distributions of Mnd. Surface waters are characteristically low in Mnd reflecting mixing of riverine and marine end-member waters, both of which are low in Mnd. The deeper waters are enriched in Mnd, and as the water column always contains some oxygen, this must reflect primarily benthic inputs of reduced dissolved Mn. However, this enrichment of Mnd is spatially very variable, presumably as a result of variability in release of Mn coupled with mixing of water in the loch and removal processes. This work demonstrates how AUVs coupled with chemical sensors can reveal substantial small-scale variability of distributions of chemical species in coastal environments that would not be resolved by conventional sampling approaches. Such information is essential if we are to improve our understanding of the nature and significance of the underlying processes leading to this variability.
Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra
2004-01-01
Science Operations Services Grid is focused on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new, evolvable methods to conduct science in a distributed environment. The Science Operations Services Grid (SOSG) will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost-effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing to schools the same tools used by NASA, with the ability for the schools to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video and telemetry processing and display to data mining, high-level processing and visualization tools, all accessible from a single portal. In this environment, users would not require high-end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The Science Operations Services Grid will focus on four tools that are currently used by the ISS Payload community, along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users. They are a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO, and an Education and Public Outreach VO. The User-based services will be implemented to replicate the operational voice, video, telemetry and commanding systems. Once the User-based services are in place, they will be analyzed to establish feasibility for Grid enabling. If feasible, each User-based service will be Grid enabled. The remaining non-Grid services, if not already Web enabled, will be so enabled. In the end, four portals will be developed, one for each VO. Each portal will contain the appropriate User-based services required for that VO to operate.
LYDIAN: An Extensible Educational Animation Environment for Distributed Algorithms
ERIC Educational Resources Information Center
Koldehofe, Boris; Papatriantafilou, Marina; Tsigas, Philippas
2006-01-01
LYDIAN is an environment to support the teaching and learning of distributed algorithms. It provides a collection of distributed algorithms as well as continuous animations. Users can combine algorithms and animations with arbitrary network structures defining the interconnection and behavior of the distributed algorithm. Further, it facilitates…
Using Java for distributed computing in the Gaia satellite data processing
NASA Astrophysics Data System (ADS)
O'Mullane, William; Luri, Xavier; Parsons, Paul; Lammers, Uwe; Hoar, John; Hernandez, Jose
2011-10-01
In recent years Java has matured to a stable easy-to-use language with the flexibility of an interpreter (for reflection etc.) but the performance and type checking of a compiled language. When we started using Java for astronomical applications around 1999 they were the first of their kind in astronomy. Now a great deal of astronomy software is written in Java as are many business applications. We discuss the current environment and trends concerning the language and present an actual example of scientific use of Java for high-performance distributed computing: ESA's mission Gaia. The Gaia scanning satellite will perform a galactic census of about 1,000 million objects in our galaxy. The Gaia community has chosen to write its processing software in Java. We explore the manifold reasons for choosing Java for this large science collaboration. Gaia processing is numerically complex but highly distributable, some parts being embarrassingly parallel. We describe the Gaia processing architecture and its realisation in Java. We delve into the astrometric solution which is the most advanced and most complex part of the processing. The Gaia simulator is also written in Java and is the most mature code in the system. This has been successfully running since about 2005 on the supercomputer "Marenostrum" in Barcelona. We relate experiences of using Java on a large shared machine. Finally we discuss Java, including some of its problems, for scientific computing.
NASA Astrophysics Data System (ADS)
Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes
2002-05-01
If new technology is introduced into medical practice, it must prove to make a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and economic benefits are still under debate. A participatory process analysis was performed to compare workflow in a film-based hospital and in a PACS environment. This included direct observation of work processes, interviews with the staff involved, structural analysis, and discussion of observations with staff members. After common structures were defined, strong and weak workflow steps were evaluated. Within a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of 'radiography for intensive care' from the ordering of an image to the provision of the final product (image plus report). Interference of the electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.
Localization of Acoustic Transients in Shallow Water Environments
1992-12-01
...effect of the source signal uncertainty on localizer performance. The localization process consists of two parts. First, a time domain propagation... Approved for public release; distribution is unlimited.
USDA-ARS?s Scientific Manuscript database
The occurrence of Listeria monocytogenes (LM) has been widely investigated in the poultry production chain from the processing plant to the final product. However, limited data are available on Listeria spp., including LM, in the poultry farm environment. Therefore, fecal and soil samples from 37 pa...
Sustainable Range Management of RDX and TNT by Phytoremediation with Engineered Plants
2016-04-01
...transformation products in the environment. Dinitrotoluenes are often co-contaminants at TNT-manufacturing sites, and dinitrotoluene-mineralizing bacteria... Decades of military activity on live-fire training ranges have resulted in the contamination of...
The Nature and Process of Science and Applications to Geography Education: A US Perspective
ERIC Educational Resources Information Center
Gillette, Brandon
2015-01-01
Place-name geography, as it is sometimes called, is merely the tip of the iceberg in a field that aims to understand people and places and their interactions with the environment. Geography is also the study of spatial distributions and interpreting what they mean. This review lays out the definition of the nature of science as it relates to…
Operational Reconnaissance for the Anti-Access /Area Denial environment
2015-04-01
...locations, the Air Force Distributed Common Ground System (DCGS) collects, processes, analyzes, and disseminates over 1.3 million megabits of... DCGS; satellite data link between the aircraft and ground-based receiver; and fiber-optic connection between the receiver, RPA crew, and DCGS. This... analysts and end users. DCGS Integration: The Air Force global ISR enterprise is not configured to efficiently receive, exploit, or disseminate fighter...
Blocking Strategies for Performing Entity Resolution in a Distributed Computing Environment
ERIC Educational Resources Information Center
Wang, Pei
2016-01-01
Entity resolution (ER) is an O(n²) problem, where n is the number of records to be processed. The pair-wise nature of ER makes it impractical to perform on large datasets without the use of a technique called blocking. In blocking, the records are separated into groups (called blocks) in such a way that the records most likely to match are…
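To make the blocking idea concrete, the sketch below groups records by a simple blocking key (here, the first two letters of the surname plus birth year) so that pairwise comparison happens only within blocks instead of across all n² pairs. The key choice and record fields are hypothetical, not taken from the dissertation.

```python
from collections import defaultdict
from itertools import combinations

records = [
    {"id": 1, "surname": "Johnson", "year": 1982},
    {"id": 2, "surname": "Jonson",  "year": 1982},
    {"id": 3, "surname": "Smith",   "year": 1975},
]

def blocking_key(rec):
    """Cheap key: records that cannot share a key are never compared."""
    return (rec["surname"][:2].lower(), rec["year"])

def candidate_pairs(records):
    """Generate comparison pairs only within blocks, avoiding full O(n^2) matching."""
    blocks = defaultdict(list)
    for rec in records:
        blocks[blocking_key(rec)].append(rec)
    for block in blocks.values():
        yield from combinations(block, 2)

for a, b in candidate_pairs(records):
    print(a["id"], b["id"])   # prints "1 2"; pairs involving record 3 are skipped
```

In a distributed setting the blocking key doubles as the partitioning key, so each block (and its pairwise comparisons) can be handled by a different worker.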
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, enables the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between the computational Grid and the OGC Web service protocols; the advantages offered by Grid technology, such as providing secure interoperability between distributed geospatial resources; and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain, represented by the OGC Web services, with the Grid domain, represented by the gLite middleware. The parallelism offered by Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for the Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalogue Service for the Web (CSW). Issues regarding the mapping and the interoperability between the OGC and Grid standards and protocols are analyzed, as they are the basis for solving the communication problems between the two environments: Grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability.
Interoperability between geospatial and Grid infrastructures provides features such as specialized geospatial functionality combined with the high-performance computation and security of the Grid, high spatial model resolution and wide geographical coverage, and flexible combination and interoperability of geographical models. In accordance with Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main piece of functionality is exposed through the enviroGRIDS Portal and, consequently, to end-user applications such as Decision Maker/Citizen oriented applications. The enviroGRIDS Portal is the single entry point for users into the system and presents a uniform graphical user interface. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
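As a concrete illustration of the OGC side of this interoperability, the sketch below builds a standard WMS 1.3.0 GetMap request; the endpoint URL and layer name are hypothetical, and the Grid-side handling is only indicated in a comment, since the abstract does not detail the gLite integration:

    from urllib.parse import urlencode

    def build_getmap_url(base_url, layers, bbox, width=800, height=600,
                         crs="EPSG:4326", fmt="image/png"):
        # Standard OGC WMS 1.3.0 GetMap parameters.
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": ",".join(layers),
            "STYLES": "",
            "CRS": crs,
            "BBOX": ",".join(str(v) for v in bbox),
            "WIDTH": width,
            "HEIGHT": height,
            "FORMAT": fmt,
        }
        return base_url + "?" + urlencode(params)

    # Hypothetical endpoint; in an enviroGRIDS-style architecture such a request
    # would typically be wrapped in a Grid job rather than called directly.
    url = build_getmap_url("https://example.org/wms", ["black_sea_catchment"],
                           (40.0, 27.0, 48.0, 42.0))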
ICESat Science Investigator led Processing System (I-SIPS)
NASA Astrophysics Data System (ADS)
Bhardwaj, S.; Bay, J.; Brenner, A.; Dimarzio, J.; Hancock, D.; Sherman, M.
2003-12-01
The ICESat Science Investigator-led Processing System (I-SIPS) generates the GLAS standard data products. It consists of two main parts: the Scheduling and Data Management System (SDMS) and the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software. The system has been operational since the successful launch of ICESat. It ingests data from the GLAS instrument, generates GLAS data products, and distributes them to the GLAS Science Computing Facility (SCF), the Instrument Support Facility (ISF), and the National Snow and Ice Data Center (NSIDC) ECS DAAC. The SDMS is the planning, scheduling, and data management system that runs the GLAS Science Algorithm Software (GSAS). GSAS is based on the Algorithm Theoretical Basis Documents provided by the Science Team and is developed independently of SDMS. The SDMS provides the processing environment to plan jobs based on existing data, control job flow, distribute data, and archive products. The SDMS design is based on a mission-independent architecture that imposes few constraints on the science code, thereby facilitating I-SIPS integration. I-SIPS currently works autonomously to ingest GLAS instrument data, distribute these data to the ISF, run the science processing algorithms to produce the GLAS standard products, reprocess data when new versions of the science algorithms are released, and distribute the products to the SCF, ISF, and NSIDC. I-SIPS has a proven performance record, delivering data to the SCF within hours of initial instrument activation. The I-SIPS design philosophy gives this system a high potential for reuse in other science missions.
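A minimal sketch of the autonomous ingest-process-distribute loop described above (function and granule names are placeholders; the real SDMS planner also handles product dependencies, reprocessing campaigns, and archiving):

    import time

    def plan_jobs(available_granules, completed):
        # Plan a processing job for each granule that has not been processed yet.
        return [g for g in available_granules if g not in completed]

    def run_pipeline(ingest, run_gsas, distribute, poll_seconds=60):
        completed = set()
        while True:
            granules = ingest()                  # new GLAS instrument data
            for granule in plan_jobs(granules, completed):
                products = run_gsas(granule)     # science algorithm software
                distribute(products)             # e.g., to SCF, ISF, and NSIDC
                completed.add(granule)
            time.sleep(poll_seconds)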
[Evoked Potential Blind Extraction Based on Fractional Lower Order Spatial Time-Frequency Matrix].
Long, Junbo; Wang, Haibin; Zha, Daifeng
2015-04-01
The impulsive electroencephalograph (EEG) noise in evoked potential (EP) signals is very strong, typically with heavy-tailed, infinite-variance characteristics, as observed under conditions such as acceleration impact and hypoxia in special tests. Such noise can be described by an alpha-stable distribution model. In this paper, the Wigner-Ville distribution (WVD) and the pseudo Wigner-Ville distribution (PWVD) are improved using fractional lower order moments, yielding the fractional lower order WVD (FLO-WVD) and fractional lower order PWVD (FLO-PWVD) time-frequency distributions, which are suitable for alpha-stable distributed processes. We also propose the concept of a fractional lower order spatial time-frequency distribution matrix (FLO-STFM). Combining this with time-frequency underdetermined blind source separation (TF-UBSS), we propose a new fractional lower order spatial time-frequency underdetermined blind source separation (FLO-TF-UBSS) method that can work in an alpha-stable distribution environment, and we use the FLO-TF-UBSS algorithm to extract EPs. Simulations showed that the proposed method could effectively extract EPs in EEG noise, and the separated EPs and EEG signals based on FLO-TF-UBSS were almost the same as the original signals, whereas blind separation based on TF-UBSS showed a certain deviation. The correlation coefficient of the FLO-TF-UBSS algorithm was higher than that of the TF-UBSS algorithm when the generalized signal-to-noise ratio (GSNR) varied from 10 dB to 30 dB and alpha varied from 1.06 to 1.94, and it was approximately equal to 1. Hence, the proposed FLO-TF-UBSS method may be better than the second-order TF-UBSS algorithm for extracting EP signals in an EEG noise environment.
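For reference, the conventional Wigner-Ville distribution of a signal $x(t)$ is

    W_x(t,f) = \int_{-\infty}^{\infty} x\left(t+\tfrac{\tau}{2}\right)\, x^{*}\left(t-\tfrac{\tau}{2}\right)\, e^{-j 2\pi f \tau}\, d\tau .

A plausible fractional lower order variant (the exact operator and moment order used by the authors are not given in this abstract) replaces each signal factor with a fractional lower order operation such as $z^{\langle p \rangle} = |z|^{p-1} z^{*}$:

    W_x^{(p)}(t,f) = \int_{-\infty}^{\infty} \left[x\left(t+\tfrac{\tau}{2}\right)\right]^{\langle p \rangle} \left[x^{*}\left(t-\tfrac{\tau}{2}\right)\right]^{\langle p \rangle} e^{-j 2\pi f \tau}\, d\tau , \qquad 0 < p < \alpha ,

with $p$ kept below the characteristic exponent $\alpha$ of the stable noise so that the moments involved remain finite.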
The development of a collaborative virtual environment for finite element simulation
NASA Astrophysics Data System (ADS)
Abdul-Jalil, Mohamad Kasim
Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features, and hidden parts are often difficult to describe verbally or even with traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies offer substantial potential to bridge this communication barrier. VR technology allows designers to immerse themselves in a virtual environment and to view and manipulate a model just as in real life. Fast Internet connectivity has enabled rapid data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed over the past decade, they rely on high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications in which participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding up the entire process. This faster design process is achieved through capabilities that better enable multidisciplinary modeling and the trade-off decisions that are so critical before launching into a formal detailed design. The features of the environment developed in this research include the ability to view design models, use voice interaction, and link engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design is obtaining pertinent simulation results in real time, so that designers can quickly make decisions based on these results. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, a finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress and displacement) and thereby demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximations and is well suited for the virtual meeting environment, where fast response times are required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction of the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction.
It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
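A minimal sketch of the finite difference sensitivity idea assumed here (the analysis function and design variables are placeholders for the actual FEA model): the expensive analysis is run once at the baseline and once per perturbed design variable, and subsequent design changes during the virtual meeting are approximated with a first-order Taylor expansion instead of a full re-analysis:

    def finite_difference_sensitivities(analyze, x0, h=1e-4):
        # analyze(x) runs the expensive FEA and returns a scalar response
        # (e.g., maximum stress or a displacement of interest).
        r0 = analyze(x0)
        sens = []
        for i in range(len(x0)):
            xp = list(x0)
            xp[i] += h
            sens.append((analyze(xp) - r0) / h)  # forward-difference derivative
        return r0, sens

    def approximate_response(r0, sens, x0, x):
        # First-order Taylor approximation: cheap enough for interactive trade-offs.
        return r0 + sum(s * (xi - x0i) for s, xi, x0i in zip(sens, x, x0))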
Local impact of humidification on degradation in polymer electrolyte fuel cells
NASA Astrophysics Data System (ADS)
Sanchez, Daniel G.; Ruiu, Tiziana; Biswas, Indro; Schulze, Mathias; Helmly, Stefan; Friedrich, K. Andreas
2017-06-01
The water level in a polymer electrolyte membrane fuel cell (PEMFC) affects its durability, as seen from the degradation processes during operation of a PEMFC with fully humidified and non-humidified gas streams. These were analyzed using an in-situ segmented cell for local current density measurements during a 300 h test under constant operating conditions, together with ex-situ SEM/EDX and XPS post-test analysis of specific regions. The impact of the relative humidity (RH) on the spatial distribution of the degradation processes results from differing water distributions, which give rise to different local chemical environments. With non-humidified gas streams, the cathode inlet region exhibits increased degradation, whereas with fully humidified gases the bottom of the cell showed the higher performance losses. The degradation and the degree of reversibility produced by Pt dissolution, PTFE defluorination, and contaminants such as silicon (Si) and nickel (Ni) were evaluated locally.
2015-09-01
Contents: List of Figures; List of Tables; 1. Introduction; 2. Device Status Data (2.1 SNMP, 2.2 NMS, 2.3 ICMP Ping); 3. Data Collection; 4. Hydra Configuration (4.1 Status Codes, 4.2 Request Time, 4.3 Hydra BLOb Metadata); 5. Data Processing (5.1 Hydra Data Processing Framework: 5.1.1 Basic Components, 5.1.2 Map Component, 5.1.3 Postmap Methods, 5.1.4 Data Flow, 5.1.5 Distributed Processing Considerations; 5.2 Specific Hydra …
Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT
NASA Technical Reports Server (NTRS)
Dryer, David A.
2002-01-01
This paper describes a system-of-systems, or metasystems, approach and models developed to help prepare engineering organizations for distributed engineering environments. Changes facing engineering enterprises include competition in increasingly global environments, new partnering opportunities created by advances in information and communication technologies, and virtual collaboration issues associated with dispersed teams. To help address the challenges and needs of this environment, a framework is proposed that can be customized and adapted by NASA to assist in improved engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by learning and applying e-engineering methods and tools in a real-world engineering development scenario. The approach consists of two phases: an e-engineering basics phase and an e-engineering application phase. The e-engineering basics phase addresses the skills required for e-engineering. The e-engineering application phase applies these skills to system development projects in a distributed collaborative environment.
Simulation Environment Synchronizing Real Equipment for Manufacturing Cell
NASA Astrophysics Data System (ADS)
Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro
Recently, manufacturing industries have faced various problems such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, before a manufacturing system is implemented, no methods have been available to evaluate the facility control programs while mixing and synchronizing real equipment with virtual factory models on computers. This difficulty is caused by the complexity of manufacturing systems, which are composed of a great variety of equipment, and it has prevented precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) that supports manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE makes it possible to create and evaluate facility control programs using virtual factory models on computers before the manufacturing systems are implemented.
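A minimal sketch of the soft-wiring idea (class and signal names are illustrative assumptions): control-program outputs are routed through a signal map to either a real device driver or its virtual counterpart, so the same facility control logic can drive a mixed real/virtual cell:

    class SoftWiring:
        # Routes named signals from a facility control program to device
        # endpoints, which may be real equipment drivers or virtual models.
        def __init__(self):
            self.routes = {}

        def wire(self, signal, endpoint):
            self.routes[signal] = endpoint

        def write(self, signal, value):
            self.routes[signal].set(signal, value)

    class VirtualConveyor:
        def set(self, signal, value):
            print(f"[virtual] {signal} <- {value}")

    wiring = SoftWiring()
    wiring.wire("conveyor_run", VirtualConveyor())  # a real device driver could be wired here instead
    wiring.write("conveyor_run", True)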
Camargo, Anderson Carlos; Woodward, Joshua John; Call, Douglas Ruben; Nero, Luís Augusto
2017-11-01
Listeria monocytogenes is a foodborne pathogen that contaminates food-processing environments and persists within biofilms on equipment, utensils, floors, and drains, ultimately reaching final products through cross-contamination. This pathogen grows even under high-salt conditions or refrigeration temperatures, remaining viable in various food products until the end of their shelf life. While the estimated incidence of listeriosis is lower than that of other enteric illnesses, infections caused by L. monocytogenes are more likely to lead to hospitalizations and fatalities. Despite reports of L. monocytogenes occurrence in Brazilian food-processing facilities and foods, there is a lack of consistent data regarding listeriosis cases and outbreaks directly associated with food consumption. Listeriosis requires rapid treatment with antibiotics, and most drugs suitable for Gram-positive bacteria are effective against L. monocytogenes. Only a minority of clinical antibiotic-resistant L. monocytogenes strains have been described so far, whereas many strains recovered from food-processing facilities and foods exhibited resistance to antimicrobials not suitable for treating listeriosis. L. monocytogenes control in food industries is a challenge, demanding proper cleaning and sanitization procedures to eliminate this foodborne pathogen from the food-processing environment and ensure food safety. This review presents the distribution of L. monocytogenes in the food-processing environment, food contamination, and control in the food industry, as well as the consequences of listeriosis for human health, comparing the current Brazilian situation with the international scenario.
A sampling model of social judgment.
Galesic, Mirta; Olsson, Henrik; Rieskamp, Jörg
2018-04-01
Studies of social judgments have demonstrated a number of diverse phenomena that have so far been difficult to explain within a single theoretical framework. Prominent examples are false consensus and false uniqueness, as well as self-enhancement and self-depreciation. Here we show that these seemingly complex phenomena can be a product of an interplay between basic cognitive processes and the structure of social and task environments. We propose and test a new process model of social judgment, the social sampling model (SSM), which provides a parsimonious quantitative account of different types of social judgments. In the SSM, judgments about characteristics of broader social environments are based on sampling of social instances from memory, where instances receive activation if they belong to a target reference class and have a particular characteristic. These sampling processes interact with the properties of social and task environments, including homophily, the shapes of frequency distributions, and question formats. For example, in line with the model's predictions, we found that whether false consensus or false uniqueness occurs depends on the level of homophily in people's social circles and on the way questions are asked. The model also explains some previously unaccounted-for patterns of self-enhancement and self-depreciation. People seem to be well informed about many characteristics of their immediate social circles, which in turn influence how they evaluate broader social environments and their position within them. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
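A minimal illustrative sketch of the sampling idea (the parameter names and values are assumptions, not the authors' parameterization): a judgment about a broader population is formed by sampling instances from memory, with one's own social circle over-represented in proportion to homophily:

    import random

    def social_judgment(own_circle, population, homophily=0.7, n_samples=20):
        # Each instance is True/False for the characteristic being judged.
        # With probability `homophily` an instance is drawn from one's own
        # circle, otherwise from the broader population.
        hits = 0
        for _ in range(n_samples):
            source = own_circle if random.random() < homophily else population
            hits += random.choice(source)
        return hits / n_samples

    # A characteristic common in one's circle but rare overall tends to be
    # overestimated (false consensus); the reverse pattern yields false uniqueness.
    own = [True] * 8 + [False] * 2
    pop = [True] * 2 + [False] * 8
    estimate = social_judgment(own, pop)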
Automated monitoring of medical protocols: a secure and distributed architecture.
Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F
2003-03-01
The control of the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully, or semi, autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are independent of negotiation processes, and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity, and authentication during the process of exchanging information between agents.
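The abstract does not detail the security mechanisms used; as a hedged illustration of message integrity and authentication between agents, the sketch below signs a request from a monitoring agent to the database broker agent with a shared-key HMAC (the key, agent names, and message fields are hypothetical):

    import hashlib
    import hmac
    import json

    SHARED_KEY = b"hypothetical-shared-key"  # real deployments would use per-pair keys or certificates

    def sign_message(payload: dict) -> dict:
        body = json.dumps(payload, sort_keys=True).encode()
        tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
        return {"body": payload, "mac": tag}

    def verify_message(message: dict) -> bool:
        body = json.dumps(message["body"], sort_keys=True).encode()
        expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, message["mac"])

    # A system agent requests clinical history from the database broker agent:
    request = sign_message({"agent": "monitor-1", "query": "clinical_history", "patient": "P-001"})
    assert verify_message(request)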
NASA Technical Reports Server (NTRS)
Ferrario, Joseph; Byrne, Christian
2002-01-01
Processed ball clay samples used in the production of ceramics and samples of the ceramic products were collected and analyzed for the presence and concentration of the 2,3,7,8-Cl substituted polychlorinated dibenzo-p-dioxins and -furans (PCDDs/PCDFs). The processed ball clay had average PCDD concentrations of 3.2 ng/g toxic equivalents, and a congener profile and isomer distribution consistent with those found previously in raw ball clay. The PCDF concentrations were below the average limit of detection (LOD) of 0.5 pg/g. The final fired ceramic products were found to be free of PCDDs/PCDFs at the LODs. A consideration of the conditions involved in the firing process suggests that the PCDDs, if not destroyed, may be released to the atmosphere and could represent an as yet unidentified source of dioxins to the environment. In addition, the PCDDs in clay dust generated during manufacturing operations may represent a potential occupational exposure.
Performance and modeling of cesium ion exchange by ENGI neered form crystalline silicotitanates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony, R.G.; Gu, D.; Huckman, M.
1996-10-01
TAM-5, a hydrous crystalline silicotitanate (CST) powder developed by Sandia National Laboratories and Texas A&M University, and commercialized by UOP as IONSIV® Ion Exchanger Type IE-910, is a highly selective material for removing cesium and strontium from aqueous radioactive wastes such as those found at the Hanford site in Washington. An engineered form of the material suitable for column ion exchange operations has been developed and tested. Data relevant to processing radioactive tank wastes, including equilibrium distribution coefficients and column testing, will be presented. The impact of exposing the engineered form to chemically aggressive environments, such as it might experience during waste processing, and to the less aggressive environments it might experience during post-processing storage, has been assessed. The thermal stability of the material has also been evaluated. The experimental results have been integrated with an effort to model the material's equilibrium and kinetic behavior.
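For reference, the equilibrium distribution coefficient reported for such ion exchange data is conventionally defined as the ratio of the loading on the solid to the equilibrium solution concentration (the exact form and units used in this work are not given in the abstract):

    K_d = \frac{q_e}{C_e},

where $q_e$ is the cesium loading on the exchanger (e.g., mmol per gram of CST) and $C_e$ is the equilibrium cesium concentration in solution (e.g., mmol per mL), giving $K_d$ in mL/g.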
Lunestad, B T; Truong, T T T; Lindstedt, B-A
2013-10-01
The objective of this study was to characterize Listeria monocytogenes isolated from farmed Atlantic salmon (Salmo salar) and the processing environment in three different Norwegian factories, and to compare these with clinical isolates by multiple-locus variable-number tandem repeat analysis (MLVA). The 65 L. monocytogenes isolates obtained gave 15 distinct MLVA profiles. There was great heterogeneity in the distribution of MLVA profiles among the factories and within each factory. Nine of the 15 MLVA profiles found in the fish-associated isolates matched human profiles. The MLVA profile 07-07-09-10-06 was the most common among Norwegian listeriosis patients. L. monocytogenes with this profile has previously been associated with at least two known listeriosis outbreaks in Norway, neither of which was determined to be due to fish consumption. However, since this profile was also found in fish and in the processing environment, fish should be considered a possible food vehicle in sporadic cases and outbreaks of listeriosis.
SoS Notebook: An Interactive Multi-Language Data Analysis Environment.
Peng, Bo; Wang, Gao; Ma, Jun; Leong, Man Chong; Wakefield, Chris; Melott, James; Chiu, Yulun; Du, Di; Weinstein, John N
2018-05-22
Complex bioinformatic data analysis workflows involving multiple scripts in different languages can be difficult to consolidate, share, and reproduce. An environment that streamlines the entire process of data collection, analysis, visualization, and reporting for such multi-language analyses has been lacking. We developed Script of Scripts (SoS) Notebook, a web-based notebook environment that allows the use of multiple scripting languages in a single notebook, with data flowing freely within and across languages. SoS Notebook enables researchers to perform sophisticated bioinformatic analyses using the most suitable tools for different parts of the workflow, without the limitations of a particular language or the complications of cross-language communication. SoS Notebook is hosted at http://vatlab.github.io/SoS/ and is distributed under a BSD license. bpeng@mdanderson.org.
Biogeochemical Processes in Microbial Ecosystems
NASA Technical Reports Server (NTRS)
DesMarais, David J.; DeVincenzi, Donald L. (Technical Monitor)
2001-01-01
The hierarchical organization of microbial ecosystems determines process rates that shape Earth's environment, create the biomarker sedimentary and atmospheric signatures of life, and define the stage upon which major evolutionary events occurred. In order to understand how microorganisms have shaped the global environment of Earth and, potentially, other worlds, we must develop an experimental paradigm that links biogeochemical processes with the ever-changing temporal and spatial distributions of microbial populations and their metabolic properties. Photosynthetic microbial mats offer an opportunity to define holistic functionality at the millimeter scale. At the same time, their biogeochemistry contributes to environmental processes on a planetary scale. These mats are possibly direct descendants of the most ancient biological communities, communities in which oxygenic photosynthesis might have been invented. Mats provide one of the best natural systems for studying how microbial populations associate to control dynamic biogeochemical gradients. They are self-sustaining, complete ecosystems in which light energy absorbed over a diel (24 hour) cycle drives the synthesis of spatially organized, diverse biomass. Tightly coupled microorganisms in the mat have specialized metabolisms that catalyze transformations of carbon, nitrogen, sulfur, and a host of other elements.