Sample records for distribution system facilitating

  1. Design considerations, architecture, and use of the Mini-Sentinel distributed data system.

    PubMed

    Curtis, Lesley H; Weiner, Mark G; Boudreau, Denise M; Cooper, William O; Daniel, Gregory W; Nair, Vinit P; Raebel, Marsha A; Beaulieu, Nicolas U; Rosofsky, Robert; Woodworth, Tiffany S; Brown, Jeffrey S

    2012-01-01

    We describe the design, implementation, and use of a large, multiorganizational distributed database developed to support the Mini-Sentinel Pilot Program of the US Food and Drug Administration (FDA). As envisioned by the US FDA, this implementation will inform and facilitate the development of an active surveillance system for monitoring the safety of medical products (drugs, biologics, and devices) in the USA. A common data model was designed to address the priorities of the Mini-Sentinel Pilot and to leverage the experience and data of participating organizations and data partners. A review of existing common data models informed the process. Each participating organization designed a process to extract, transform, and load its source data, applying the common data model to create the Mini-Sentinel Distributed Database. Transformed data were characterized and evaluated using a series of programs developed centrally and executed locally by participating organizations. A secure communications portal was designed to facilitate queries of the Mini-Sentinel Distributed Database and transfer of confidential data, analytic tools were developed to facilitate rapid response to common questions, and distributed querying software was implemented to facilitate rapid querying of summary data. As of July 2011, information on 99,260,976 health plan members was included in the Mini-Sentinel Distributed Database. The database includes 316,009,067 person-years of observation time, with members contributing, on average, 27.0 months of observation time. All data partners have successfully executed distributed code and returned findings to the Mini-Sentinel Operations Center. This work demonstrates the feasibility of building a large, multiorganizational distributed data system in which organizations retain possession of their data that are used in an active surveillance system. Copyright © 2012 John Wiley & Sons, Ltd.

  2. Abstractions for Fault-Tolerant Distributed System Verification

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.; Maddalon, Jeffrey M.; Miner, Paul S.; Geser, Alfons

    2004-01-01

    Four kinds of abstraction for the design and analysis of fault tolerant distributed systems are discussed. These abstractions concern system messages, faults, fault masking voting, and communication. The abstractions are formalized in higher order logic, and are intended to facilitate specifying and verifying such systems in higher order theorem provers.

  3. Efficient implementation of multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2012-01-10

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.
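    The decomposition described in this abstract (a one-dimensional FFT along a locally held dimension, a global redistribution, then a one-dimensional FFT along the newly local dimension) can be illustrated with a minimal serial sketch in Python, assuming NumPy is available. The distributed "all-to-all" exchange of the patent is replaced here by an in-memory transpose, so this shows only the mathematical decomposition, not the network behaviour.

        # Conceptual sketch of a 2-D FFT built from successive 1-D FFTs.
        # In the patented scheme the "transpose" step is a distributed all-to-all
        # exchange across nodes (in randomized order); here it is an in-memory
        # transpose, used only to illustrate the decomposition.
        import numpy as np

        def fft2_by_decomposition(a):
            # 1-D FFT along the dimension that is locally complete on each node
            stage1 = np.fft.fft(a, axis=1)
            # Redistribute so the other dimension becomes local (all-to-all in MPI terms)
            redistributed = stage1.T
            # 1-D FFT along the newly local dimension
            stage2 = np.fft.fft(redistributed, axis=1)
            # Undo the transpose to restore the original layout
            return stage2.T

        a = np.random.rand(8, 8)
        assert np.allclose(fft2_by_decomposition(a), np.fft.fft2(a))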

  4. Efficient implementation of a multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    DOEpatents

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2008-01-01

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.

  5. Electricity distribution networks: Changing regulatory approaches

    NASA Astrophysics Data System (ADS)

    Cambini, Carlo

    2016-09-01

    Increasing the penetration of distributed generation and smart grid technologies requires substantial investments. A study proposes an innovative approach that combines four regulatory tools to provide economic incentives for distribution system operators to facilitate these innovative practices.

  6. Distributed decision-making in electric power system transmission maintenance scheduling using multi-agent systems (MAS)

    NASA Astrophysics Data System (ADS)

    Zhang, Zhong

    In this work, motivated by the need to coordinate transmission maintenance scheduling among a multiplicity of self-interested entities in the restructured power industry, a distributed decision support framework based on multiagent negotiation systems (MANS) is developed. An innovative risk-based transmission maintenance optimization procedure is introduced. Several models for linking condition monitoring information to the equipment's instantaneous failure probability are presented, which enable quantitative evaluation of the effectiveness of maintenance activities in terms of system cumulative risk reduction. Methodologies of statistical processing, equipment deterioration evaluation and time-dependent failure probability calculation are also described. A novel framework capable of facilitating distributed decision-making through multiagent negotiation is developed. A multiagent negotiation model is developed and illustrated that accounts for uncertainty and enables social rationality. Some issues of multiagent negotiation convergence and scalability are discussed. The relationships between agent-based negotiation and auction systems are also identified. A four-step MAS design methodology for constructing multiagent systems for power system applications is presented. A generic multiagent negotiation system, capable of inter-agent communication and distributed decision support through inter-agent negotiations, is implemented. A multiagent system framework for facilitating the automated integration of condition monitoring information and maintenance scheduling for power transformers is developed. Simulations of multiagent negotiation-based maintenance scheduling among several independent utilities are provided. This approach is shown to be a viable alternative solution paradigm to the traditional centralized optimization approach in today's deregulated environment. This multiagent system framework not only facilitates decision-making among competing power system entities, but also provides a tool to use in studying competitive industry relative to monopolistic industry.

  7. Diversity of free-living amoebae in a dual distribution (potable and recycled) water system

    EPA Science Inventory

    Free-living amoebae are known to facilitate the growth of water associated pathogens. This study, for the first time, explored the diversity of free-living amoebae in a dual distribution (potable and recycled) water system in Rouse Hill NSW, Australia. Water and biofilm samples w...

  8. Integrating security in a group oriented distributed system

    NASA Technical Reports Server (NTRS)

    Reiter, Michael; Birman, Kenneth; Gong, Li

    1992-01-01

    A distributed security architecture is proposed for incorporation into group oriented distributed systems, and in particular, into the Isis distributed programming toolkit. The primary goal of the architecture is to make common group oriented abstractions robust in hostile settings, in order to facilitate the construction of high performance distributed applications that can tolerate both component failures and malicious attacks. These abstractions include process groups and causal group multicast. Moreover, a delegation and access control scheme is proposed for use in group oriented systems. The focus is the security architecture; particular cryptosystems and key exchange protocols are not emphasized.

  9. Reflexive reasoning for distributed real-time systems

    NASA Technical Reports Server (NTRS)

    Goldstein, David

    1994-01-01

    This paper discusses the implementation and use of reflexive reasoning in real-time, distributed knowledge-based applications. Recently there has been a great deal of interest in agent-oriented systems. Implementing such systems implies a mechanism for sharing knowledge, goals and other state information among the agents. Our techniques facilitate an agent examining both state information about other agents and the parameters of the knowledge-based system shell implementing its reasoning algorithms. The shell implementing the reasoning is the Distributed Artificial Intelligence Toolkit, which is a derivative of CLIPS.

  10. Using Hypertext to Facilitate Information Sharing in Biomedical Research Groups

    PubMed Central

    Chaney, R. Jesse; Shipman, Frank M.; Gorry, G. Anthony

    1989-01-01

    As part of our effort to create an Integrated Academic Information Management System at Baylor College of Medicine, we are developing information technology to support the efforts of scientific work groups. Many of our ideas in this regard are embodied in a system called the Virtual Notebook which is intended to facilitate information sharing and management in such groups. Here we discuss the foundations of that system - a hypertext system that we have developed using a relational data base and the distributable interface that we have written in the X Window System.

  11. Proton beam therapy control system

    DOEpatents

    Baumann, Michael A [Riverside, CA; Beloussov, Alexandre V [Bernardino, CA; Bakir, Julide [Alta Loma, CA; Armon, Deganit [Redlands, CA; Olsen, Howard B [Colton, CA; Salem, Dana [Riverside, CA

    2008-07-08

    A tiered communications architecture for managing network traffic in a distributed system. Communication between client or control computers and a plurality of hardware devices is administered by agent and monitor devices whose activities are coordinated to reduce the number of open channels or sockets. The communications architecture also improves the transparency and scalability of the distributed system by reducing network mapping dependence. The architecture is desirably implemented in a proton beam therapy system to provide flexible security policies which improve patient safety and facilitate system maintenance and development.
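    The central idea of the tiered architecture, as described above, is that many clients reach many hardware devices through a small number of coordinated agent processes rather than through one socket per client-device pair. The following is a hypothetical, non-networked Python sketch of that multiplexing idea; the device names ("beam_shutter", "gantry") and the Agent interface are illustrative assumptions, not the patented design.

        # Hypothetical illustration of the "agent" tier: many clients share one
        # downstream channel to the hardware devices instead of opening a socket each.
        from collections import deque

        class Agent:
            def __init__(self, devices):
                self.devices = devices          # device_id -> handler function
                self.channel = deque()          # stands in for the single open socket

            def submit(self, client_id, device_id, command):
                # Clients enqueue tagged requests; no per-client device connection is made.
                self.channel.append((client_id, device_id, command))

            def pump(self):
                # The agent drains the shared channel and routes replies back by tag.
                replies = []
                while self.channel:
                    client_id, device_id, command = self.channel.popleft()
                    result = self.devices[device_id](command)
                    replies.append((client_id, device_id, result))
                return replies

        agent = Agent({"beam_shutter": lambda cmd: f"shutter {cmd} ok",
                       "gantry": lambda cmd: f"gantry {cmd} ok"})
        agent.submit("console_1", "beam_shutter", "close")
        agent.submit("console_2", "gantry", "rotate 30")
        print(agent.pump())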

  12. Proton beam therapy control system

    DOEpatents

    Baumann, Michael A.; Beloussov, Alexandre V.; Bakir, Julide; Armon, Deganit; Olsen, Howard B.; Salem, Dana

    2010-09-21

    A tiered communications architecture for managing network traffic in a distributed system. Communication between client or control computers and a plurality of hardware devices is administered by agent and monitor devices whose activities are coordinated to reduce the number of open channels or sockets. The communications architecture also improves the transparency and scalability of the distributed system by reducing network mapping dependence. The architecture is desirably implemented in a proton beam therapy system to provide flexible security policies which improve patient safety and facilitate system maintenance and development.

  13. Proton beam therapy control system

    DOEpatents

    Baumann, Michael A; Beloussov, Alexandre V; Bakir, Julide; Armon, Deganit; Olsen, Howard B; Salem, Dana

    2013-06-25

    A tiered communications architecture for managing network traffic in a distributed system. Communication between client or control computers and a plurality of hardware devices is administered by agent and monitor devices whose activities are coordinated to reduce the number of open channels or sockets. The communications architecture also improves the transparency and scalability of the distributed system by reducing network mapping dependence. The architecture is desirably implemented in a proton beam therapy system to provide flexible security policies which improve patient safety and facilitate system maintenance and development.

  14. Proton beam therapy control system

    DOEpatents

    Baumann, Michael A; Beloussov, Alexandre V; Bakir, Julide; Armon, Deganit; Olsen, Howard B; Salem, Dana

    2013-12-03

    A tiered communications architecture for managing network traffic in a distributed system. Communication between client or control computers and a plurality of hardware devices is administered by agent and monitor devices whose activities are coordinated to reduce the number of open channels or sockets. The communications architecture also improves the transparency and scalability of the distributed system by reducing network mapping dependence. The architecture is desirably implemented in a proton beam therapy system to provide flexible security policies which improve patient safety and facilitate system maintenance and development.

  15. Distributed hierarchical control architecture for integrating smart grid assets during normal and disrupted operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalsi, Karan; Fuller, Jason C.; Somani, Abhishek

    Disclosed herein are representative embodiments of methods, apparatus, and systems for facilitating operation and control of a resource distribution system (such as a power grid). Among the disclosed embodiments is a distributed hierarchical control architecture (DHCA) that enables smart grid assets to effectively contribute to grid operations in a controllable manner, while helping to ensure system stability and equitably rewarding their contribution. Embodiments of the disclosed architecture can help unify the dispatch of these resources to provide both market-based and balancing services.

  16. Assessing the distribution of environmental stewardship organizations and their relationship to the demographics of Los Angeles County

    Treesearch

    Krystle M. Golly

    2017-01-01

    An equal distribution of environmental stewardship organizations across the urban landscape provides an environment that facilitates community empowerment. The systemic issues found in Los Angeles County play an important role in the social development of the area. Through the utilization of modern technology and geographical mapping software, spatial distribution of...

  17. Contents of the NASA ocean data system archive, version 11-90

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1990-01-01

    The National Aeronautics and Space Administration (NASA) Ocean Data System (NODS) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and surface pigment concentration. NODS will become the Data Archive and Distribution Service of the JPL Distributed Active Archive Center for the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  18. 78 FR 9951 - Excepted Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-12

    ...) Not to exceed 3000 positions that require unique cyber security skills and knowledge to perform cyber..., distributed control systems security, cyber incident response, cyber exercise facilitation and management, cyber vulnerability detection and assessment, network and systems engineering, enterprise architecture...

  19. Distributed Mission Operations: Training Today’s Warfighters for Tomorrow’s Conflicts

    DTIC Science & Technology

    2016-02-01

    systems or include dissimilar weapons systems to rehearse more complex mission sets. In addition to networking geographically separated simulators...over the past decade. Today, distributed mission operations can facilitate the rehearsal of theater wide operations, integrating all the anticipated...effective that many aviators earn their basic aircraft qualification before their first flight in the airplane.11 Computer memory was once a

  20. Distributed and parallel Ada and the Ada 9X recommendations

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Goldsack, Stephen J.; Theriault, R.; Waldrop, Raymond S.; Holzbacher-Valero, A. A.

    1992-01-01

    Recently, the DoD has sponsored work towards a new version of Ada, intended to support the construction of distributed systems. The revised version, often called Ada 9X, will become the new standard sometime in the 1990s. It is intended that Ada 9X should provide language features giving limited support for distributed system construction. The requirements for such features are given. Many of the most advanced computer applications involve embedded systems that are comprised of parallel processors or networks of distributed computers. If Ada is to become the widely adopted language envisioned by many, it is essential that suitable compilers and tools be available to facilitate the creation of distributed and parallel Ada programs for these applications. The major language issues impacting distributed and parallel programming are reviewed, and some principles upon which distributed/parallel language systems should be built are suggested. Based upon these, alternative language concepts for distributed/parallel programming are analyzed.

  1. ClusterControl: a web interface for distributing and monitoring bioinformatics applications on a Linux cluster.

    PubMed

    Stocker, Gernot; Rieder, Dietmar; Trajanoski, Zlatko

    2004-03-22

    ClusterControl is a web interface to simplify distributing and monitoring bioinformatics applications on Linux cluster systems. We have developed a modular concept that enables integration of command line oriented programs into the application framework of ClusterControl. The system facilitates integration of different applications accessed through one interface and executed on a distributed cluster system. The package is based on freely available technologies like Apache as web server, PHP as server-side scripting language and OpenPBS as queuing system and is available free of charge for academic and non-profit institutions. http://genome.tugraz.at/Software/ClusterControl
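    Behind a web front end of this kind, wrapping a command-line program usually amounts to generating a job script and handing it to the OpenPBS qsub command. The following Python sketch is illustrative only: it is not taken from ClusterControl, and the job script contents, queue name, and the commented-out blastall command are assumptions.

        # Illustrative only: submit a command-line bioinformatics job to an
        # OpenPBS queue the way a web front end like ClusterControl might.
        import subprocess, tempfile, os

        def submit_job(command, job_name="blast_run", queue="batch"):
            # Build a minimal PBS job script; qsub reads the #PBS directives.
            script = "\n".join([
                "#!/bin/sh",
                f"#PBS -N {job_name}",
                f"#PBS -q {queue}",
                "cd $PBS_O_WORKDIR",
                command,
                "",
            ])
            with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
                f.write(script)
                path = f.name
            try:
                # qsub prints the new job identifier on stdout
                job_id = subprocess.run(["qsub", path], check=True,
                                        capture_output=True, text=True).stdout.strip()
            finally:
                os.unlink(path)
            return job_id

        # Example (requires a PBS installation; command shown is hypothetical):
        # job_id = submit_job("blastall -p blastn -d nt -i query.fa -o query.out")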

  2. Building A Cloud Based Distributed Active Data Archive Center

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin

    2017-01-01

    NASA's Earth Science Data System (ESDS) Program facilitates the implementation of NASA's Earth Science strategic plan, which is committed to the full and open sharing of Earth science data obtained from NASA instruments to all users. The Earth Science Data and Information System (ESDIS) project manages the Earth Observing System Data and Information System (EOSDIS). Data within EOSDIS are held at Distributed Active Archive Centers (DAACs). One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data.

  3. The Persistence of Hierarchy: How One School District's Top Administrators Worked to Guide a Culture Change towards Collaborative Leadership

    ERIC Educational Resources Information Center

    Bravo, Robert Ronald

    2011-01-01

    There is no shortage of scholars that believe that traditional, top-down leadership is antithetical to facilitating the improvements that must be made to the American educational system if all children are to achieve academically. These scholars call for distributive, facilitative, and/or collaborative leadership, yet little is known about how…

  4. Advanced Distributed Simulation Technology Advanced Rotary Wing Aircraft. System/Segment Specification. Volume 1. Simulation System Module

    DTIC Science & Technology

    1994-03-31

    overhead water sprinklers in enclosed personnel areas not already protected by existing facility fire suppression systems. Sprinkler systems shall not...facilitate future changes and updates to remain current with the application aircraft. 3.4.4 Availability. The ARWA SS shall be designed and constructed to

  5. Contents of the JPL Distributed Active Archive Center (DAAC) archive, version 2-91

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1991-01-01

    The Distributed Active Archive Center (DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea surface height, surface wind vector, sea surface temperature, atmospheric liquid water, and surface pigment concentration. The Jet Propulsion Laboratory DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for the Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  6. JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC) data availability, version 1-94

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and integrated water vapor. The JPL PO.DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and is the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  7. Research into display sharing techniques for distributed computing environments

    NASA Technical Reports Server (NTRS)

    Hugg, Steven B.; Fitzgerald, Paul F., Jr.; Rosson, Nina Y.; Johns, Stephen R.

    1990-01-01

    The X-based Display Sharing solution for distributed computing environments is described. The Display Sharing prototype includes the base functionality for telecast and display copy requirements. Since the prototype implementation is modular and the system design provided flexibility for the Mission Control Center Upgrade (MCCU) operational considerations, the prototype implementation can be the baseline for a production Display Sharing implementation. To facilitate the process, the following discussions are presented: Theory of operation; System architecture; Using the prototype; Software description; Research tools; Prototype evaluation; and Outstanding issues. The prototype is based on the concept of a dedicated central host performing the majority of the Display Sharing processing, allowing minimal impact on each individual workstation. Each workstation participating in Display Sharing hosts programs that facilitate the user's access to Display Sharing on the host machine.

  8. Distributed Optimal Consensus Over Resource Allocation Network and Its Application to Dynamical Economic Dispatch.

    PubMed

    Li, Chaojie; Yu, Xinghuo; Huang, Tingwen; He, Xing

    2018-06-01

    The resource allocation problem is studied and reformulated by a distributed interior point method via a logarithmic barrier. Facilitated by the graph Laplacian, a fully distributed continuous-time multiagent system is developed for solving the problem. Specifically, to avoid high singularity of the logarithmic barrier at the boundary, an adaptive parameter switching strategy is introduced into this dynamical multiagent system. The convergence rate of the distributed algorithm is obtained. Moreover, a novel distributed primal-dual dynamical multiagent system is designed in a smart grid scenario to seek the saddle point of dynamical economic dispatch, which coincides with the optimal solution. The dual decomposition technique is applied to transform the optimization problem into easily solvable resource allocation subproblems with local inequality constraints. The good performance of the new dynamical systems is, respectively, verified by a numerical example and the IEEE six-bus test system-based simulations.
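    For orientation, the barrier idea above can be written generically in LaTeX notation (a sketch of a standard interior-point reformulation, not the paper's exact formulation): a resource allocation problem with a coupling constraint and box constraints,

        \min_{x_1,\dots,x_n} \sum_{i=1}^{n} f_i(x_i) \quad \text{s.t.} \quad \sum_{i=1}^{n} x_i = d, \quad \underline{x}_i \le x_i \le \overline{x}_i,

    is relaxed, for a barrier parameter \mu > 0, to

        \min_{x} \sum_{i=1}^{n} \left[ f_i(x_i) - \mu \log(x_i - \underline{x}_i) - \mu \log(\overline{x}_i - x_i) \right] \quad \text{s.t.} \quad \sum_{i=1}^{n} x_i = d.

    As \mu \to 0 the barrier solution approaches the constrained optimum, but the barrier becomes singular as x_i approaches a bound, which is the kind of singularity the adaptive parameter switching strategy mentioned above is designed to handle.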

  9. Facilitation of macroalgae by the sedimentary tube forming polychaete Diopatra cuprea

    NASA Astrophysics Data System (ADS)

    Thomsen, M. S.; McGlathery, K.

    2005-01-01

    Marine foundation organisms such as seagrasses, corals, and kelps facilitate the distribution of numerous organisms by creating refuges from environmental stressors and by providing food and substrate for settlement and growth. Barren soft-sediment systems often have faunal organisms that facilitate other species by habitat modification. We investigated how an abundant (21 m-2) tube cap forming polychaete, Diopatra cuprea, facilitates macroalgal distribution in Hog Island Bay, a turbid shallow tidal lagoon in Virginia (USA). Seventy percent of mudflat macroalgae were found incorporated into protruding D. cuprea tube caps, and field experiments showed that D. cuprea facilitates algal persistence by providing a stable substrate retaining algae against hydrodynamic forces such as tidal flushing and storm surge. If tube caps were removed, simulating storm-induced erosion, they were rebuilt within days and new drift algae incorporated. Also, D. cuprea facilitated the algal assemblage by fragmenting thalli in the attachment process, thereby ensuring a constant fragment supply for vegetative re-growth if storm-induced pruning occurs. On a species-specific level, Gracilaria verrucosa and Ulva curvata benefited more from tube cap construction compared to Fucus vesiculosus, Agardhiella subulata and the alien Codium fragile ssp. tomentosoides. This was partly because G. verrucosa and U. curvata were incorporated and fragmented more readily, and partly because they probably have physiological, morphological and biomechanical traits that enable them to better co-exist with D. cuprea. These results suggest that macroalgal distribution throughout Hog Island Bay is to a large extent linked to the distribution of D. cuprea. The processes of algal attachment, retention, recovery, re-growth and fragmentation can have important ecosystem implications because of the sheer abundance of the Diopatra-Gracilaria/Ulva association.

  10. Linear Power-Flow Models in Multiphase Distribution Networks: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernstein, Andrey; Dall'Anese, Emiliano

    This paper considers multiphase unbalanced distribution systems and develops approximate power-flow models where bus-voltages, line-currents, and powers at the point of common coupling are linearly related to the nodal net power injections. The linearization approach is grounded on a fixed-point interpretation of the AC power-flow equations, and it is applicable to distribution systems featuring (i) wye connections; (ii) ungrounded delta connections; (iii) a combination of wye-connected and delta-connected sources/loads; and, (iv) a combination of line-to-line and line-to-grounded-neutral devices at the secondary of distribution transformers. The proposed linear models can facilitate the development of computationally-affordable optimization and control applications -- from advanced distribution management systems settings to online and distributed optimization routines. Performance of the proposed models is evaluated on different test feeders.
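    For intuition, a single-phase analogue of this fixed-point construction can be sketched in LaTeX notation (an assumed simplification for illustration, not the multiphase model of the preprint). With Y_{LL} the admittance submatrix of the non-slack buses, Y_{L0} their coupling to the slack bus at voltage v_0, s the vector of net complex power injections, and overlines denoting complex conjugation, the power-flow equations take the fixed-point form

        V = w + Y_{LL}^{-1} \operatorname{diag}(\overline{V})^{-1} \overline{s}, \qquad w = -Y_{LL}^{-1} Y_{L0} v_0,

    and evaluating the nonlinear term at the zero-injection voltage profile w yields the linear model

        V \approx w + Y_{LL}^{-1} \operatorname{diag}(\overline{w})^{-1} \overline{s},

    which is linear in the injections s and therefore convenient to embed in optimization and control routines.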

  11. Experiences with an Augmented Human Intellect System: A Revolution in Communication.

    ERIC Educational Resources Information Center

    Bair, James H.

    The Augmented Human Intellect System (AHI) has been designed to facilitate communication among knowledge workers who may accomplish their entire job utilizing this advanced technology. The system is capable of sending information to geographically distributed users. It permits access to and modification of stored information by a number of persons…

  12. Impact of Actual Facilitator Alignment, Co-Location and Video Intervention on the Efficacy of Distributed Group Support Systems

    DTIC Science & Technology

    1999-12-01

    was operated over a network of four distributed clients connected to a Windows NT 4.0 server. The CU-SeeMe software was selected over University of...Acquires From Cornell University Full Intellectual Property Ownership Rights to CU-SeeMe and MeetingPoint Technologies, http://www.wpine.com

  13. Cardea: Dynamic Access Control in Distributed Systems

    NASA Technical Reports Server (NTRS)

    Lepro, Rebekah

    2004-01-01

    Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an inter-operable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. Then the SAML and XACML standards, as incorporated into the system, are analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.

  14. Garbage Collection in a Distributed Object-Oriented System

    NASA Technical Reports Server (NTRS)

    Gupta, Aloke; Fuchs, W. Kent

    1993-01-01

    An algorithm is described in this paper for garbage collection in distributed systems with object sharing across processor boundaries. The algorithm allows local garbage collection at each node in the system to proceed independently of local collection at the other nodes. It requires no global synchronization or knowledge of the global state of the system and exhibits the capability of graceful degradation. The concept of a specialized dump node is proposed to facilitate the collection of inaccessible circular structures. An experimental evaluation of the algorithm is also described. The algorithm is compared with a corresponding scheme that requires global synchronization. The results show that the algorithm works well in distributed processing environments even when the locality of object references is low.
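    The property claimed above, that each node can collect garbage independently of the others, is commonly achieved by treating locally held objects that are referenced from other nodes as additional roots. The following is a minimal, hypothetical Python sketch of node-local mark-and-sweep under that convention; it is not the paper's exact algorithm, which additionally collects inter-node circular structures via the specialized dump node.

        # Hypothetical sketch: node-local mark-and-sweep in which objects exported to
        # other nodes (the "in-table") are treated as extra roots, so each node can
        # collect without global synchronization.  Cross-node cycles are NOT collected
        # here; the paper introduces a specialized dump node for that purpose.
        class Node:
            def __init__(self):
                self.objects = {}     # object_id -> list of locally referenced object_ids
                self.roots = set()    # ids reachable from local activity
                self.in_table = set() # ids referenced by remote nodes

            def local_collect(self):
                marked = set()
                stack = list(self.roots | self.in_table)
                while stack:
                    oid = stack.pop()
                    if oid in marked or oid not in self.objects:
                        continue
                    marked.add(oid)
                    stack.extend(self.objects[oid])
                garbage = set(self.objects) - marked
                for oid in garbage:
                    del self.objects[oid]
                return garbage

        n = Node()
        n.objects = {"a": ["b"], "b": [], "c": ["d"], "d": ["c"], "e": []}
        n.roots = {"a"}
        n.in_table = {"e"}                 # "e" is referenced from another node
        print(sorted(n.local_collect()))   # ['c', 'd'] -- a purely local cycle with no roots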

  15. Displaced path integral formulation for the momentum distribution of quantum particles.

    PubMed

    Lin, Lin; Morrone, Joseph A; Car, Roberto; Parrinello, Michele

    2010-09-10

    The proton momentum distribution, accessible by deep inelastic neutron scattering, is a very sensitive probe of the potential of mean force experienced by the protons in hydrogen-bonded systems. In this work we introduce a novel estimator for the end-to-end distribution of the Feynman paths, i.e., the Fourier transform of the momentum distribution. In this formulation, free particle and environmental contributions factorize. Moreover, the environmental contribution has a natural analogy to a free energy surface in statistical mechanics, facilitating the interpretation of experiments. The new formulation is not only conceptually but also computationally advantageous. We illustrate the method with applications to an empirical water model, ab initio ice, and one dimensional model systems.
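    Schematically, the relation exploited here is that the momentum distribution and the end-to-end distribution of the open Feynman path form a Fourier pair, and the end-to-end distribution factorizes into a free-particle part and an environmental part. In LaTeX notation, with prefactors omitted (a sketch consistent with the abstract, not the paper's exact expressions):

        n(\mathbf{p}) \propto \int d\mathbf{x}\, e^{-i \mathbf{p} \cdot \mathbf{x} / \hbar}\, \widetilde{n}(\mathbf{x}), \qquad \widetilde{n}(\mathbf{x}) = \widetilde{n}_{\text{free}}(\mathbf{x})\, e^{-\beta U(\mathbf{x})},

    where \widetilde{n}(\mathbf{x}) is the end-to-end distribution and U(\mathbf{x}) plays the role of the free-energy-like environmental contribution referred to above.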

  16. HERA: A New Platform for Embedding Agents in Heterogeneous Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Alonso, Ricardo S.; de Paz, Juan F.; García, Óscar; Gil, Óscar; González, Angélica

    Ambient Intelligence (AmI) based systems require the development of innovative solutions that integrate distributed intelligent systems with context-aware technologies. In this sense, Multi-Agent Systems (MAS) and Wireless Sensor Networks (WSN) are two key technologies for developing distributed systems based on AmI scenarios. This paper presents the new HERA (Hardware-Embedded Reactive Agents) platform, which allows using dynamic and self-adaptable heterogeneous WSNs on which agents are directly embedded on the wireless nodes. This approach facilitates the inclusion of context-aware capabilities in AmI systems to gather data from their surrounding environments, achieving a higher level of ubiquitous and pervasive computing.

  17. A Survey on Distributed Mobile Database and Data Mining

    NASA Astrophysics Data System (ADS)

    Goel, Ajay Mohan; Mangla, Neeraj; Patel, R. B.

    2010-11-01

    The anticipated increase in popular use of the Internet has created more opportunity in information dissemination, Ecommerce, and multimedia communication. It has also created more challenges in organizing information and facilitating its efficient retrieval. In response to this, new techniques have evolved which facilitate the creation of such applications. Certainly the most promising among the new paradigms is the use of mobile agents. In this paper, mobile agent and distributed database technologies are applied in the banking system. Many approaches have been proposed to schedule data items for broadcasting in a mobile environment. In this paper, an efficient strategy for accessing multiple data items in mobile environments and the bottleneck of current banking will be proposed.

  18. Electrically conductive concrete : a laboratory study.

    DOT National Transportation Integrated Search

    1987-01-01

    In the cathodic protection of existing reinforced concrete bridge decks, there is a need for a simple secondary-anode system to facilitate the distribution of direct current over the structure being protected. It is believed that a durable, electrica...

  19. Twenty-Eighth Annual Rank-Order Distribution of Administrative Salaries Paid, 1994-95.

    ERIC Educational Resources Information Center

    Arkansas Univ., Fayetteville. Office of Institutional Research.

    This report presents comparative data collected from 103 state-supported universities or university systems in 47 states, and 38 university systems representing 30 states, on the administrative salaries they paid in 1994-95. The salaries are presented in rank-order (from highest to lowest) to facilitate comparisons of a participant's relative…

  20. Predictors of Interpersonal Trust in Virtual Distributed Teams

    DTIC Science & Technology

    2008-09-01

    understand systems that are very complex in nature. Such understanding is essential to facilitate building or maintaining operators' mental models of the...a significant impact on overall system performance. Specifically, the level of automation that combined human generation of options with computer...and/or computer servers had a significant impact on automated system performance. Additionally, Parasuraman, Sheridan, & Wickens (2000) proposed

  1. Data catalog for JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC)

    NASA Technical Reports Server (NTRS)

    Digby, Susan

    1995-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory contains satellite data sets and ancillary in-situ data for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Geophysical parameters available from the archive include sea-surface height, surface-wind vector, surface-wind speed, surface-wind stress vector, sea-surface temperature, atmospheric liquid water, integrated water vapor, phytoplankton pigment concentration, heat flux, and in-situ data. PO.DAAC is an element of the Earth Observing System Data and Information System and is the United States distribution site for TOPEX/POSEIDON data and metadata.

  2. Towards scalable Byzantine fault-tolerant replication

    NASA Astrophysics Data System (ADS)

    Zbierski, Maciej

    2017-08-01

    Byzantine fault-tolerant (BFT) replication is a powerful technique, enabling distributed systems to remain available and correct even in the presence of arbitrary faults. Unfortunately, existing BFT replication protocols are mostly load-unscalable, i.e. they fail to respond with adequate performance increase whenever new computational resources are introduced into the system. This article proposes a universal architecture facilitating the creation of load-scalable distributed services based on BFT replication. The suggested approach exploits parallel request processing to fully utilize the available resources, and uses a load balancer module to dynamically adapt to the properties of the observed client workload. The article additionally provides a discussion on selected deployment scenarios, and explains how the proposed architecture could be used to increase the dependability of contemporary large-scale distributed systems.

  3. Conserved directed percolation: exact quasistationary distribution of small systems and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    César Mansur Filho, Júlio; Dickman, Ronald

    2011-05-01

    We study symmetric sleepy random walkers, a model exhibiting an absorbing-state phase transition in the conserved directed percolation (CDP) universality class. Unlike most examples of this class studied previously, this model possesses a continuously variable control parameter, facilitating analysis of critical properties. We study the model using two complementary approaches: analysis of the numerically exact quasistationary (QS) probability distribution on rings of up to 22 sites, and Monte Carlo simulation of systems of up to 32 000 sites. The resulting estimates for critical exponents, including β, ...

  4. The Arbo‑zoonet Information System.

    PubMed

    Di Lorenzo, Alessio; Di Sabatino, Daria; Blanda, Valeria; Cioci, Daniela; Conte, Annamaria; Bruno, Rossana; Sauro, Francesca; Calistri, Paolo; Savini, Lara

    2016-06-30

    The Arbo‑zoonet Information System has been developed as part of the 'International Network for Capacity Building for the Control of Emerging Viral Vector Borne Zoonotic Diseases (Arbo‑zoonet)' project. The project aims to create common knowledge by sharing data, expertise, experiences, and scientific information on West Nile Disease (WND), Crimean‑Congo haemorrhagic fever (CCHF), and Rift Valley fever (RVF). These arthropod‑borne diseases of domestic and wild animals can affect humans, posing a great threat to public health. Since November 2011, when the Schmallenberg virus (SBV) was discovered for the first time in Northern Europe, the Arbo‑zoonet Information System has been used to collect information on the newly discovered disease and to manage the epidemic emergency. The system monitors the geographical distribution and epidemiological evolution of CCHF, RVF, and WND since 1946. More recently, it has also been deployed to monitor the SBV data. The Arbo‑zoonet Information System includes a web application for the management of the database in which data are stored and a WebGIS application to explore spatial disease distributions, facilitating the epidemiological analysis. The WebGIS application is an effective tool to show and share the information and to facilitate the exchange and dissemination of relevant data among the project's participants.

  5. Twenty-Seventh Annual Rank-Order Distribution of Administrative Salaries Paid, 1993-94.

    ERIC Educational Resources Information Center

    Arkansas Univ., Fayetteville. Office of Institutional Research.

    This study presents comparative data collected from 85 state-supported universities or university systems in 45 states, and 35 university systems representing 28 states on the administrative salaries they paid in 1993-94. The salaries are presented in rank-order (from highest to lowest) to facilitate comparisons of a particular position's salary…

  6. European Credit Transfer and Accumulation System: An Alternative Way to Calculate the ECTS Grades

    ERIC Educational Resources Information Center

    Grosges, Thomas; Barchiesi, Dominique

    2007-01-01

    The European Credit Transfer and Accumulation System (ECTS) has been developed and instituted to facilitate student mobility and academic recognition. This paper presents, discusses, and illustrates the pertinence and the limitation of the current statistical distribution of the ECTS grades, and we propose an alternative way to calculate the ECTS…

  7. Saguaro: A Distributed Operating System Based on Pools of Servers.

    DTIC Science & Technology

    1988-03-25

    asynchronous message passing, multicast, and semaphores are supported. We have found this flexibility to be very useful for distributed programming. The...variety of communication primitives provided by SR has facilitated the research of Stella Atkins, who was a visiting professor at Arizona during Spring...data bits in a raw communication channel to help keep the source and destination synchronized, Psync explicitly embeds timing information drawn from the

  8. Function Allocation in a Robust Distributed Real-Time Environment

    DTIC Science & Technology

    1991-12-01

    fundamental characteristic of a distributed system is its ability to map individual logical functions of an application program onto many physical nodes... how much of a node's processor time is scheduled for function processing. IMC is the function-to-function communication required to facilitate...indicator of how much excess processor time a node has. The reconfiguration algorithms use these variables to determine the most appropriate node(s) to

  9. Logic Nanocells Within 3-Terminal Ordered Arrays

    DTIC Science & Technology

    2007-02-28

    DISTRIBUTION/AVAILABILITY STATEMENT - DISTRIBUTION STATEMENT A: UNLIMITED (AFRL-SR-AR-TR-07-0494). ... sputter-coating a 200 nm Au layer. Molecular grafting. Compounds 1, 2 and 3 were synthesized according to literature methods [24, 26]. The synthesis of 4...neutral (no counterions). In order to facilitate molecular conduction, the molecule was designed to be small and contain a continuous π-electron system

  10. Real-time sensor validation and fusion for distributed autonomous sensors

    NASA Astrophysics Data System (ADS)

    Yuan, Xiaojing; Li, Xiangshang; Buckles, Bill P.

    2004-04-01

    Multi-sensor data fusion has found widespread applications in industrial and research sectors. The purpose of real-time multi-sensor data fusion is to dynamically estimate an improved system model from a set of different data sources, i.e., sensors. This paper presents a systematic and unified real-time sensor validation and fusion framework (RTSVFF) based on distributed autonomous sensors. The RTSVFF is an open architecture which consists of four layers - the transaction layer, the process fusion layer, the control layer, and the planning layer. This paradigm facilitates distribution of intelligence to the sensor level and sharing of information among sensors, controllers, and other devices in the system. The openness of the architecture also provides a platform to test different sensor validation and fusion algorithms and thus facilitates the selection of near-optimal algorithms for specific sensor fusion applications. In the version of the model presented in this paper, confidence-weighted averaging is employed to address the dynamic system state issue noted above. The state is computed using an adaptive estimator and dynamic validation curve for numeric data fusion and a robust diagnostic map for decision-level qualitative fusion. The framework is then applied to automatic monitoring of a gas-turbine engine, including a performance comparison of the proposed real-time sensor fusion algorithms and a traditional numerical weighted average.
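    The numeric fusion step described above, confidence-weighted averaging of validated sensor readings, can be sketched in a few lines of Python. The weighting scheme and the temperature values below are generic illustrations, not the RTSVFF's specific adaptive estimator.

        # Generic confidence-weighted averaging: each sensor reading carries a
        # confidence in [0, 1]; the fused estimate is the confidence-weighted mean.
        def fuse(readings):
            """readings: list of (value, confidence) pairs from distributed sensors."""
            total_weight = sum(conf for _, conf in readings)
            if total_weight == 0:
                raise ValueError("no sensor reported a usable confidence")
            return sum(value * conf for value, conf in readings) / total_weight

        # Three temperature sensors; the faulty one has been assigned low confidence
        # by the validation layer, so it barely influences the fused value.
        print(fuse([(612.0, 0.9), (608.5, 0.8), (900.0, 0.05)]))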

  11. Universal bursty behavior in the air transportation system.

    PubMed

    Ito, Hidetaka; Nishinari, Katsuhiro

    2015-12-01

    Social activities display bursty behavior characterized by heavy-tailed interevent time distributions. We examine the bursty behavior of airplanes' arrivals in hub airports. The analysis indicates that the air transportation system universally follows a power-law interarrival time distribution with an exponent α=2.5 and an exponential cutoff. Moreover, we investigate the mechanism of this bursty behavior by introducing a simple model to describe it. In addition, we compare the extent of the hub-and-spoke structure and the burstiness of various airline networks in the system. Remarkably, the results suggest that the hub-and-spoke network of the system and the carriers' strategy to facilitate transit are the origins of this universality.
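    Written out, the distribution referred to above is a power law with an exponential cutoff; in LaTeX notation, with the normalization constant omitted and \tau_0 denoting the cutoff scale,

        P(\tau) \propto \tau^{-\alpha} e^{-\tau/\tau_0}, \qquad \alpha = 2.5,

    where \tau is the interarrival time between consecutive airplane arrivals at a hub airport.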

  12. Atmospheric Composition Data and Information Services Center (ACDISC)

    NASA Technical Reports Server (NTRS)

    Kempler, S.

    2005-01-01

    NASA's GSFC Earth Sciences (GES) Data and Information Services Center (DISC) manages the archive, distribution and data access for atmospheric composition data from Aura's OMI, MLS, and, hopefully one day, HIRDLS instruments, as well as heritage datasets from TOMS, UARS, MODIS, and AIRS. This data is currently archived in the GES Distributed Active Archive Center (DAAC). The GES DISC has begun the development of a community-driven data management system whose sole purpose is to manage and provide value-added services for NASA's Atmospheric Composition (AC) data. This system, called the Atmospheric Composition Data and Information Services Center (ACDISC), will provide access to all AC datasets from the above-mentioned instruments, as well as AC datasets residing at remote archive sites (e.g., LaRC DAAC). The goals of the ACDISC are to: 1) Provide a data center for atmospheric scientists, guided by atmospheric scientists; 2) Be absolutely responsive to the data and data service needs of the Atmospheric Composition (AC) community; 3) Provide services (i.e., expertise) that will facilitate effortless access to and usage of AC data; 4) Collaborate with AC scientists to facilitate the use of data from multiple sensors for long-term atmospheric research. The ACDISC is an AC-specific, user-driven, multi-sensor, on-line, easy-access archive and distribution system employing data analysis and visualization, data mining, and other user-requested techniques that facilitate science data usage. The purpose of this presentation is to describe the evolution path that the GES DISC is following in order to better serve AC data, and also to receive continued community feedback and further foster collaboration with AC data users and providers.

  13. On the Path to SunShot - Emerging Issues and Challenges with Integrating High Levels of Solar into the Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Broderick, Robert; Mather, Barry

    2016-05-01

    Wide use of advanced inverters could double the electricity-distribution system’s hosting capacity for distributed PV at low costs—from about 170 GW to 350 GW (see Palmintier et al. 2016). At the distribution system level, increased variable generation due to high penetrations of distributed PV (typically rooftop and smaller ground-mounted systems) could challenge the management of distribution voltage, potentially increase wear and tear on electromechanical utility equipment, and complicate the configuration of circuit-breakers and other protection systems—all of which could increase costs, limit further PV deployment, or both. However, improved analysis of distribution system hosting capacity—the amount of distributed PV that can be interconnected without changing the existing infrastructure or prematurely wearing out equipment—has overturned previous rule-of-thumb assumptions such as the idea that distributed PV penetrations higher than 15% require detailed impact studies. For example, new analysis suggests that the hosting capacity for distributed PV could rise from approximately 170 GW using traditional inverters to about 350 GW with the use of advanced inverters for voltage management, and it could be even higher using accessible and low-cost strategies such as careful siting of PV systems within a distribution feeder and additional minor changes in distribution operations. Also critical to facilitating distributed PV deployment is the improvement of interconnection processes, associated standards and codes, and compensation mechanisms so they embrace PV’s contributions to system-wide operations. Ultimately SunShot-level PV deployment will require unprecedented coordination of the historically separate distribution and transmission systems along with incorporation of energy storage and “virtual storage,” which exploits improved management of electric vehicle charging, building energy systems, and other large loads. Additional analysis and innovation are needed...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION--FUELCELL ENERGY, INC.: DFC 300A MOLTEN CARBONATE FUEL CELL COMBINED HEAT AND POWER SYSTEM

    EPA Science Inventory

    The U.S. EPA operates the Environmental Technology Verification program to facilitate the deployment of innovative technologies through performance verification and information dissemination. A technology area of interest is distributed electrical power generation, particularly w...

  15. Validity Issues in Standard-Setting Studies

    ERIC Educational Resources Information Center

    Pant, Hans A.; Rupp, Andre A.; Tiffin-Richards, Simon P.; Koller, Olaf

    2009-01-01

    Standard-setting procedures are a key component within many large-scale educational assessment systems. They are consensual approaches in which committees of experts set cut-scores on continuous proficiency scales, which facilitate communication of proficiency distributions of students to a wide variety of stakeholders. This communicative function…

  16. Sunlamp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nutaro, James J

    The purpose of this model was to facilitate the design of a control system that uses fine-grained control of residential and small commercial HVAC loads to counterbalance voltage swings caused by intermittent solar power sources (e.g., rooftop panels) installed in the same distribution circuit. Included are the source code and a pre-compiled 64-bit DLL for adding building HVAC loads to an OpenDSS distribution circuit. As written, the Makefile assumes you are using the Microsoft C++ development tools.

  17. BTFS: The Border Trade Facilitation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, L.R.

    The author demonstrates the Border Trade Facilitation System (BTFS), an agent-based bilingual e-commerce system built to expedite the regulation, control, and execution of commercial trans-border shipments during the delivery phase. The system was built to serve maquila industries at the US/Mexican border. The BTFS uses foundation technology developed here at Sandia Laboratories' Advanced Information Systems Lab (AISL), including a distributed object substrate, a general-purpose agent development framework, dynamically generated agent-human interaction via the World-Wide Web, and a collaborative agent architecture. This technology is also the substrate for the Multi-Agent Simulation Management System (MASMAS) proposed for demonstration at this conference. The BTFS executes authenticated transactions among agents performing open trading over the Internet. With the BTFS in place, one could conduct secure international transactions from any site with an Internet connection and a web browser. The BTFS is currently being evaluated for commercialization.

  18. [Good drug distribution practice and its implementation in drug distribution companies].

    PubMed

    Draksiene, Gailute

    2002-01-01

    Good Distribution Practice is based on the Directive of the Board of the European Community 92/25/EEC regarding the wholesale distribution of drugs for human consumption. The Directive states that the whole drug distribution channel is to be controlled from the point of drug production or import down to the supply to the end user. In order to reach this goal, the drug distribution company must create a quality assurance system and ensure its correct functioning. This aim requires development of the rules of Good Distribution Practice. Those rules set the general requirements of Good Distribution Practice that distribution companies must follow. The article explains the main requirements postulated in the rules of Good Distribution Practice and the implementation of those requirements in drug distribution companies.

  19. Accurate calculation of field and carrier distributions in doped semiconductors

    NASA Astrophysics Data System (ADS)

    Yang, Wenji; Tang, Jianping; Yu, Hongchun; Wang, Yanguo

    2012-06-01

    We use the numerical squeezing algorithm (NSA) combined with the shooting method to accurately calculate the built-in fields and carrier distributions in doped silicon films (SFs) in the micron and sub-micron thickness range, and results are presented in graphical form for a variety of doping profiles under different boundary conditions. As a complementary approach, we also present the methods and results of the inverse problem (IVP): finding the doping profile in the SFs for a given field distribution. The solution of the IVP provides an approach to arbitrarily design the field distribution in SFs, which is very important for low-dimensional (LD) systems and device design. Furthermore, the solution of the IVP is direct and straightforward for one-, two-, and three-dimensional semiconductor systems. With current efforts focused on LD physics, knowledge of the field and carrier distributions in LD systems will facilitate further research on other aspects, and hence the current work provides a platform for that research.
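    The boundary-value problem behind such calculations can be sketched in one dimension under Boltzmann statistics (an assumed textbook formulation for illustration, not necessarily the authors' exact equations). In LaTeX notation, the electrostatic potential \varphi obeys the Poisson equation coupled to the equilibrium carrier densities,

        \frac{d^2 \varphi}{dx^2} = -\frac{q}{\varepsilon} \left[ p(x) - n(x) + N_D^+(x) - N_A^-(x) \right], \qquad n = n_i e^{q\varphi/kT}, \quad p = n_i e^{-q\varphi/kT},

    with \varphi referenced to intrinsic material. The built-in field follows as E = -d\varphi/dx, and the inverse problem described above amounts to finding the doping profile N_D^+(x) - N_A^-(x) that produces a prescribed E(x).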

  20. Upconversion-based receivers for quantum hacking-resistant quantum key distribution

    NASA Astrophysics Data System (ADS)

    Jain, Nitin; Kanter, Gregory S.

    2016-07-01

    We propose a novel upconversion (sum frequency generation)-based quantum-optical system design that can be employed as a receiver (Bob) in practical quantum key distribution systems. The pump governing the upconversion process is produced and utilized inside the physical receiver, making its access or control unrealistic for an external adversary (Eve). This pump facilitates several properties which permit Bob to define and control the modes that can participate in the quantum measurement. Furthermore, by manipulating and monitoring the characteristics of the pump pulses, Bob can detect a wide range of quantum hacking attacks launched by Eve.

  1. Analysis of the access patterns at GSFC distributed active archive center

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore; Bedet, Jean-Jacques

    1996-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational for more than two years. Its mission is to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test Earth Observing System Data and Information System (EOSDIS) concepts. Over 550,000 files and documents have been archived, and more than six terabytes have been distributed to the scientific community. Information about user requests and file access patterns, and their impact on system loading, is needed to optimize current operations and to plan for future archives. To facilitate the management of daily activities, the GSFC DAAC has developed a database system to track correspondence, requests, ingestion, and distribution. In addition, several log files that record transactions on Unitree are maintained and periodically examined. This study identifies some of the user request and file access patterns at the GSFC DAAC during 1995. The analysis is limited to the subset of orders for which the data files are under the control of the Hierarchical Storage Management (HSM) system Unitree. The results show that most of the data volume ordered was for two data products; the volume was mostly made up of level 3 and 4 data, and most of it was distributed on 8 mm and 4 mm tapes. In addition, most of the volume ordered was for deliveries in North America, although there was significant worldwide use. There was a wide range of request sizes in terms of volume and number of files ordered; on average, 78.6 files were ordered per request. Using the data managed by Unitree, several caching algorithms have been evaluated for both hit rate and the overhead ('cost') associated with the movement of data from near-line devices to disks. The algorithm called LRU/2 bin was found to be the best for this workload, but the STbin algorithm also worked well.
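
    The caching comparison described above can be reproduced in outline by replaying an access trace against a size-limited cache and counting hits and staged bytes. The sketch below does this for plain LRU only; the study's LRU/2 bin and STbin variants are more elaborate, and the file identifiers and sizes here are hypothetical.

    ```python
    from collections import OrderedDict

    def simulate_lru(trace, capacity_bytes, file_sizes):
        """Replay an access trace against an LRU cache of fixed byte capacity.

        Returns (hit_rate, bytes_staged_from_near_line_storage).
        """
        cache = OrderedDict()            # file id -> size, oldest first
        used = hits = misses = staged = 0
        for f in trace:
            size = file_sizes[f]
            if f in cache:
                hits += 1
                cache.move_to_end(f)     # mark as most recently used
            else:
                misses += 1
                staged += size           # the "cost": bytes moved to disk cache
                while cache and used + size > capacity_bytes:
                    _, evicted = cache.popitem(last=False)
                    used -= evicted
                cache[f] = size
                used += size
        total = hits + misses
        return (hits / total if total else 0.0), staged

    # Hypothetical tiny trace:
    sizes = {"granule_a": 40, "granule_b": 70, "granule_c": 30}
    trace = ["granule_a", "granule_b", "granule_a", "granule_c", "granule_b"]
    print(simulate_lru(trace, capacity_bytes=150, file_sizes=sizes))
    ```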

  2. An Ontology Driven Information Architecture for Interoperable Disparate Data Sources

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Dan; Hardman, Sean; Joyner, Ronald; Mattmann, Chris; Ramirez, Paul; Kelly, Sean; Castano, Rebecca

    2011-01-01

    The mission of the Planetary Data System is to facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data produced by or relevant to NASA's planetary missions, research programs, and data analysis programs. The vision is: (1) to gather and preserve the data obtained from exploration of the Solar System by the U.S. and other nations; (2) to facilitate new and exciting discoveries by providing access to and ensuring usability of those data to the worldwide community; (3) to inspire the public through availability and distribution of the body of knowledge reflected in the PDS data collection. PDS is a federation of heterogeneous nodes, including science and support nodes.

  3. Transportable educational programs for scientific and technical professionals: More effective utilization of automated scientific and technical data base systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D.

    1987-01-01

    This grant final report executive summary documents a major, long-term program addressing innovative educational issues associated with the development, administration, evaluation, and widespread distribution of transportable educational programs for scientists and engineers to increase their knowledge of, and facilitate their utilization of, automated scientific and technical information storage and retrieval systems. This educational program is of very broad scope, being targeted at Colleges of Engineering and Colleges of Physical Sciences at a large number of colleges and universities throughout the United States. The educational program is designed to incorporate extensive hands-on, interactive usage of the NASA RECON system and is supported by a number of microcomputer-based software systems to facilitate the delivery and usage of the educational course materials developed as part of the program.

  4. Thinking Together: Modeling Clinical Decision-Support as a Sociotechnical System

    PubMed Central

    Hussain, Mustafa I.; Reynolds, Tera L.; Mousavi, Fatemeh E.; Chen, Yunan; Zheng, Kai

    2017-01-01

    Computerized clinical decision-support systems are members of larger sociotechnical systems, composed of human and automated actors who send, receive, and manipulate artifacts. Sociotechnical consideration is rare in the literature. This makes it difficult to comparatively evaluate the success of CDS implementations, and it may also indicate that sociotechnical context receives inadequate consideration in practice. To facilitate sociotechnical consideration, we developed the Thinking Together model, a flexible diagrammatical means of representing CDS systems as sociotechnical systems. To develop this model, we examined the literature through the lens of Distributed Cognition (DCog) theory. We then present two case studies of vastly different CDSSs, one almost fully automated and the other with minimal automation, to illustrate the flexibility of the Thinking Together model. We show that this model, informed by DCog and the CDS literature, is capable of supporting both research, by enabling comparative evaluation, and practice, by facilitating explicit sociotechnical planning and communication. PMID:29854164

  5. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    NASA Astrophysics Data System (ADS)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.

  6. [Drug vectorization or how to modulate tissular and cellular distribution of biologically active compounds].

    PubMed

    Couvreur, P

    2001-07-01

    Drug vectorization has undergone considerable development over the last few years. This review focuses on the intravenous route of administration. Colloid formulations allow a modulation of drug tissue distribution. Using liposomes and nanoparticles with unmodified surfaces, drugs can be targeted to macrophages of the reticuloendothelial system. When the liposomes or nanoparticles are covered with hydrophilic or flexible polymers, the vascular phase can be favored in order, for example, to facilitate selective extravasation at a tumor site. Therapeutic applications of these systems are presented. The development of "intelligent" vectors capable of modulating the intracellular distribution of an active compound is an equally interesting approach, for example pH-sensitive liposomes or nanoparticles decorated with folic acid capable of targeting the intracellular cytoplasm.

  7. Distributed run of a one-dimensional model in a regional application using SOAP-based web services

    NASA Astrophysics Data System (ADS)

    Smiatek, Gerhard

    This article describes the setup of a distributed computing system in Perl. It facilitates the parallel run of a one-dimensional environmental model on a number of simple networked PC hosts. The system uses Simple Object Access Protocol (SOAP)-driven web services offering the model run on remote hosts and a multi-thread environment distributing the work and accessing the web services. Its application is demonstrated in a regional run of a process-oriented biogenic emission model for the area of Germany. Within a network consisting of up to seven web services implemented on Linux and MS-Windows hosts, a performance increase of approximately 400% has been reached compared to a model run on the fastest single host.
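
    The original implementation distributes the model via SOAP services written in Perl. As a loose stand-in, the sketch below shows the same pattern (a thread pool farming independent grid-cell runs out to remote RPC endpoints) using Python's standard-library XML-RPC client; the host URLs and the remote method name run_model are assumptions for illustration only.

    ```python
    from concurrent.futures import ThreadPoolExecutor
    import xmlrpc.client

    # Hypothetical endpoints offering the one-dimensional model as a remote call.
    HOSTS = [
        "http://model-host1.example.org:8080/RPC2",
        "http://model-host2.example.org:8080/RPC2",
    ]

    def run_cell(job):
        """Run the 1-D model for one grid cell on a remote host."""
        host, cell_id = job
        proxy = xmlrpc.client.ServerProxy(host)
        return cell_id, proxy.run_model(cell_id)   # assumed remote method name

    def run_region(cell_ids):
        """Distribute independent grid-cell runs round-robin over the hosts."""
        jobs = [(HOSTS[i % len(HOSTS)], c) for i, c in enumerate(cell_ids)]
        with ThreadPoolExecutor(max_workers=2 * len(HOSTS)) as pool:
            return dict(pool.map(run_cell, jobs))

    # results = run_region(range(1000))   # e.g. grid cells covering the domain
    ```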

  8. Observer-based distributed adaptive iterative learning control for linear multi-agent systems

    NASA Astrophysics Data System (ADS)

    Li, Jinsha; Liu, Sanyang; Li, Junmin

    2017-10-01

    This paper investigates the consensus problem for linear multi-agent systems from the viewpoint of two-dimensional systems when the state information of each agent is not available. An observer-based, fully distributed adaptive iterative learning protocol is designed in this paper. A local observer is designed for each agent, and it is shown that, without using any global information about the communication graph, all agents achieve consensus perfectly for any undirected, connected communication graph as the number of iterations tends to infinity. A Lyapunov-like energy function is employed to facilitate the learning protocol design and property analysis. Finally, a simulation example is given to illustrate the theoretical analysis.

  9. Pure random search for ambient sensor distribution optimisation in a smart home environment.

    PubMed

    Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming

    2011-01-01

    Smart homes are living spaces facilitated with technology to allow individuals to remain in their own homes for longer, rather than be institutionalised. Sensors are the fundamental physical layer within any smart home, as the data they generate are used to inform decision support systems, facilitating appropriate actuator actions. Positioning of sensors is therefore a fundamental characteristic of a smart home. Contemporary smart home sensor distribution is aligned to either (a) a total coverage approach or (b) a human assessment approach. These methods for sensor arrangement are not data-driven strategies; they are unempirical and frequently irrational. This study hypothesised that sensor deployment directed by an optimisation method that utilises inhabitants' spatial frequency data as the search space would produce more optimal sensor distributions than the current method of sensor deployment by engineers. Seven human engineers were tasked to create sensor distributions based on perceived utility for 9 deployment scenarios. A Pure Random Search (PRS) algorithm was then tasked to create matched sensor distributions. The PRS method produced superior distributions in 98.4% of test cases (n=64) against human-engineer-instructed deployments when the engineers had no access to the spatial frequency data, and in 92.0% of test cases (n=64) when engineers had full access to these data. These results thus confirmed the hypothesis.
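
    A minimal sketch of the Pure Random Search step described above: candidate sensor placements are drawn uniformly at random and scored against the inhabitants' spatial frequency grid, keeping the best placement found. The grid values, sensing radius, and scoring rule below are illustrative assumptions, not the study's exact formulation.

    ```python
    import random

    def pure_random_search(freq_grid, n_sensors, radius, iterations=10000, seed=1):
        """Pure Random Search over candidate sensor placements on a 2-D grid."""
        rng = random.Random(seed)
        rows, cols = len(freq_grid), len(freq_grid[0])

        def score(placement):
            # Sum of spatial-frequency counts covered by at least one sensor.
            covered = set()
            for r, c in placement:
                for dr in range(-radius, radius + 1):
                    for dc in range(-radius, radius + 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            covered.add((rr, cc))
            return sum(freq_grid[r][c] for r, c in covered)

        best, best_score = None, float("-inf")
        for _ in range(iterations):
            candidate = [(rng.randrange(rows), rng.randrange(cols))
                         for _ in range(n_sensors)]
            if (s := score(candidate)) > best_score:
                best, best_score = candidate, s
        return best, best_score

    # Hypothetical 4x4 occupancy-frequency grid, two sensors, one-cell radius:
    grid = [[0, 2, 8, 1], [1, 9, 12, 0], [0, 3, 4, 0], [5, 0, 0, 0]]
    print(pure_random_search(grid, n_sensors=2, radius=1, iterations=2000))
    ```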

  10. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  11. Application of Benchmark Dose Methodology to a Variety of Endpoints and Exposures

    EPA Science Inventory

    This latest beta version (1.1b) of the U.S. Environmental Protection Agency (EPA) Benchmark Dose Software (BMDS) is being distributed for public comment. The BMDS system is being developed as a tool to facilitate the application of benchmark dose (BMD) methods to EPA hazardous p...

  12. Turbidity-controlled sampling for suspended sediment load estimation

    Treesearch

    Jack Lewis

    2003-01-01

    Abstract - Automated data collection is essential to effectively measure suspended sediment loads in storm events, particularly in small basins. Continuous turbidity measurements can be used, along with discharge, in an automated system that makes real-time sampling decisions to facilitate sediment load estimation. The Turbidity Threshold Sampling method distributes...

  13. 26 CFR 1.355-7 - Recognition of gain on certain distributions of stock or securities in connection with an...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... distribution, if the distribution was motivated in whole or substantial part by a corporate business purpose (within the meaning of § 1.355-2(b)) other than a business purpose to facilitate the acquisition or a..., the distribution was motivated by a business purpose to facilitate the acquisition or a similar...

  14. Developing a complex systems perspective for medical education to facilitate the integration of basic science and clinical medicine.

    PubMed

    Aron, David C

    2017-04-01

    The purpose of medical education is to produce competent and capable professional practitioners who can combine the art and science of medicine. Moreover, this process must prepare individuals to practise in a field in which knowledge is increasing and the contexts in which that knowledge is applied are changing in unpredictable ways. The 'basic sciences' are important in the training of a physician. The goal of basic science training is to learn the material in a way that allows it to be applied in practice. Much effort has been expended to integrate basic science and clinical training, while adding many other topics to the medical curriculum. This effort has been challenging. The aims of the paper are (1) to propose a unifying conceptual framework that facilitates knowledge integration among all levels of living systems from cell to society and (2) to illustrate the organizing principles with two examples of the framework in action - cybernetic systems (with feedback) and distributed robustness. Literature related to hierarchical and holarchical frameworks was reviewed. An organizing framework derived from living systems theory and spanning the range from molecular biology to health systems management was developed. The application of cybernetic systems to three levels (regulation of pancreatic beta cell production of insulin, physician adjustment of medication for glycaemic control, and development and action of performance measures for diabetes care) was illustrated. Similarly, distributed robustness was illustrated by the DNA damage response system and principles underlying patient safety. Each of the illustrated organizing principles offers a means to facilitate the weaving of basic science and clinical medicine throughout the course of study. The use of such an approach may promote systems thinking, which is a core competency for effective and capable medical practice. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  15. Current sheet in plasma as a system with a controlling parameter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fridman, Yu. A., E-mail: yulya-fridman@yandex.ru; Chukbar, K. V., E-mail: Chukbar-KV@nrcki.ru

    2015-08-15

    A simple kinetic model describing stationary solutions with bifurcated and single-peaked current density profiles of a plane electron beam or current sheet in plasma is presented. A connection is established between the two-dimensional constructions arising in terms of the model and the one-dimensional considerations of Bernstein-Greene-Kruskal, facilitating the reconstruction of the distribution function of trapped particles when both the profile of the electric potential and the free-particle distribution function are known.

  16. Distributed user interfaces for clinical ubiquitous computing applications.

    PubMed

    Bång, Magnus; Larsson, Anders; Berglund, Erik; Eriksson, Henrik

    2005-08-01

    Ubiquitous computing with multiple interaction devices requires new interface models that support user-specific modifications to applications and facilitate the fast development of active workspaces. We have developed NOSTOS, a computer-augmented work environment for clinical personnel, to explore new user interface paradigms for ubiquitous computing. NOSTOS uses several devices, such as digital pens, an active desk, and walk-up displays, that allow the system to track documents and activities in the workplace. We present the distributed user interface (DUI) model, which allows standalone applications to distribute their user interface components to several devices dynamically at run-time. This mechanism permits clinicians to develop their own user interfaces and forms for clinical information systems to match their specific needs. We discuss the underlying technical concepts of DUIs and show how service discovery, component distribution, events, and layout management are dealt with in the NOSTOS system. Our results suggest that DUIs, and similar network-based user interfaces, will be a prerequisite of future mobile user interfaces and essential to developing clinical multi-device environments.

  17. Compilation of 1982 Annual Reports of the Navy ELF (Extremely Low Frequency) Communications System Ecological Monitoring Program.

    DTIC Science & Technology

    1983-05-01

    systems are generally considered difficult to study, several features of ectomycorrhizae facilitate such investigations. Ectomycorrhizal fungi...communications antenna area and 2) distribution features of the induced field in the forest floor. The overall objective of these studies is to quantify key...Sultanova, K. Kayumov, and O. Khasahov. 1981. Some features of microbiological processes under alfalfa depending on hoeing depth and fertilizer

  18. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  19. Determinants of pulmonary blood flow distribution.

    PubMed

    Glenny, Robb W; Robertson, H Thomas

    2011-01-01

    The primary function of the pulmonary circulation is to deliver blood to the alveolar capillaries to exchange gases. Distributing blood over a vast surface area facilitates gas exchange, yet the pulmonary vascular tree must be constrained to fit within the thoracic cavity. In addition, pressures must remain low within the circulatory system to protect the thin alveolar capillary membranes that allow efficient gas exchange. The pulmonary circulation is engineered for these unique requirements and in turn these special attributes affect the spatial distribution of blood flow. As the largest organ in the body, the physical characteristics of the lung vary regionally, influencing the spatial distribution on large-, moderate-, and small-scale levels. © 2011 American Physiological Society.

  20. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as collecting it. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  1. Thirtieth Annual Rank-Order Distribution of Administrative Salaries Paid, 1996-97.

    ERIC Educational Resources Information Center

    Arkansas Univ., Fayetteville. Office of Institutional Research.

    This report presents comparative data collected from 87 state-supported universities in 45 states, and 27 university systems representing 21 states, on the administrative salaries they paid in 1996-97. The salaries are presented in rank-order (from highest to lowest) to facilitate comparisons of a participant's relative standing with other…

  2. Twenty-Ninth Annual Rank-Order Distribution of Administrative Salaries Paid, 1995-96.

    ERIC Educational Resources Information Center

    Arkansas Univ., Fayetteville. Office of Institutional Research.

    This report presents comparative data collected from 98 state-supported universities in 47 states, and 38 university systems representing 30 states, on the administrative salaries they paid in 1994-95. The salaries are presented in rank-order (from highest to lowest) to facilitate comparisons of a participant's relative standing with other…

  3. Spatial discretization of large watersheds and its influence on the estimation of hillslope sediment yield

    USDA-ARS?s Scientific Manuscript database

    The combined use of water erosion models and geographic information systems (GIS) has facilitated soil loss estimation at the watershed scale. Tools such as the Geo-spatial interface for the Water Erosion Prediction Project (GeoWEPP) model provide a convenient spatially distributed soil loss estimat...

  4. Managing Data and Facilitating Science: A spectrum of activities in the Centre for Environmental Data Archival. (Invited)

    NASA Astrophysics Data System (ADS)

    Lawrence, B.; Bennett, V.; Callaghan, S.; Juckes, M. N.; Pepler, S.

    2013-12-01

    The UK Centre for Environmental Data Archival (CEDA) hosts a number of formal data centres, including the British Atmospheric Data Centre (BADC), and is a partner in a range of national and international data federations, including the InfraStructure for the European Network for Earth system Simulation, the Earth System Grid Federation, and the distributed IPCC Data Distribution Centres. The mission of CEDA is to formally curate data from, and facilitate the doing of, environmental science. The twin aims are symbiotic: data curation helps facilitate science, and facilitating science helps with data curation. Here we cover how CEDA delivers this strategy by established internal processes supplemented by short-term projects, supported by staff with a range of roles. We show how CEDA adds value to data in the curated archive, and how it supports science, and show examples of the aforementioned symbiosis. We begin by discussing curation: CEDA has the formal responsibility for curating the data products of atmospheric science and earth observation research funded by the UK Natural Environment Research Council (NERC). However, curation is not just about the provider community, the consumer communities matter too, and the consumers of these data cross the boundaries of science, including engineers, medics, as well as the gamut of the environmental sciences. There is a small, and growing cohort of non-science users. For both producers and consumers of data, information about data is crucial, and a range of CEDA staff have long worked on tools and techniques for creating, managing, and delivering metadata (as well as data). CEDA "science support" staff work with scientists to help them prepare and document data for curation. As one of a spectrum of activities, CEDA has worked on data Publication as a method of both adding value to some data, and rewarding the effort put into the production of quality datasets. As such, we see this activity as both a curation and a facilitation activity. A range of more focused facilitation activities are carried out, from providing a computing platform suitable for big-data analytics (the Joint Analysis System, JASMIN), to working on distributed data analysis (EXARCH), and the acquisition of third party data to support science and impact (e.g. in the context of the facility for Climate and Environmental Monitoring from Space, CEMS). We conclude by confronting the view of Parsons and Fox (2013) that metaphors such as Data Publication, Big Iron, Science Support etc are limiting, and suggest the CEDA experience is that these sorts of activities can and do co-exist, much as they conclude they should. However, we also believe that within co-existing metaphors, production systems need to be limited in their scope, even if they are on a road to a more joined up infrastructure. We shouldn't confuse what we can do now with what we might want to do in the future.

  5. Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.

    PubMed

    Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra

    2008-01-01

    Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high-quality archetypes that enable comprehensive semantic interoperability require adequate development and maintenance processes. Therefore, structures have to be created involving different health professions. In the following, we present a model that facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals, including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process considering these different committees.

  6. An RFID-Based Manufacturing Control Framework for Loosely Coupled Distributed Manufacturing System Supporting Mass Customization

    NASA Astrophysics Data System (ADS)

    Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur

    In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.

  7. The Global File System

    NASA Technical Reports Server (NTRS)

    Soltis, Steven R.; Ruwart, Thomas M.; O'Keefe, Matthew T.

    1996-01-01

    The global file system (GFS) is a prototype design for a distributed file system in which cluster nodes physically share storage devices connected via a network-like fiber channel. Networks and network-attached storage devices have advanced to a level of performance and extensibility so that the previous disadvantages of shared disk architectures are no longer valid. This shared storage architecture attempts to exploit the sophistication of storage device technologies whereas a server architecture diminishes a device's role to that of a simple component. GFS distributes the file system responsibilities across processing nodes, storage across the devices, and file system resources across the entire storage pool. GFS caches data on the storage devices instead of the main memories of the machines. Consistency is established by using a locking mechanism maintained by the storage devices to facilitate atomic read-modify-write operations. The locking mechanism is being prototyped in the Silicon Graphics IRIX operating system and is accessed using standard Unix commands and modules.
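
    The consistency mechanism described above (a lock maintained by the storage device guarding each read-modify-write) can be illustrated with the simplified in-process analogy below; the threading.Lock stands in for GFS's device-maintained lock, and the metadata being updated is a made-up example.

    ```python
    import threading

    class SharedBlock:
        """A shared metadata block whose updates must be atomic."""

        def __init__(self):
            self._lock = threading.Lock()     # stand-in for a device-held lock
            self.data = {"link_count": 0}

        def read_modify_write(self, update):
            """Atomically read the block, apply an update, and write it back."""
            with self._lock:                  # acquire the (device) lock
                snapshot = dict(self.data)    # read
                self.data = update(snapshot)  # modify and write back
                return self.data              # lock released on exit

    block = SharedBlock()
    threads = [
        threading.Thread(
            target=block.read_modify_write,
            args=(lambda d: {**d, "link_count": d["link_count"] + 1},),
        )
        for _ in range(8)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(block.data)   # always {'link_count': 8}: each update is atomic
    ```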

  8. PRMS-IV, the precipitation-runoff modeling system, version 4

    USGS Publications Warehouse

    Markstrom, Steven L.; Regan, R. Steve; Hay, Lauren E.; Viger, Roland J.; Webb, Richard M.; Payn, Robert A.; LaFontaine, Jacob H.

    2015-01-01

    Computer models that simulate the hydrologic cycle at a watershed scale facilitate assessment of variability in climate, biota, geology, and human activities on water availability and flow. This report describes an updated version of the Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of various combinations of climate and land use on streamflow and general watershed hydrology. Several new model components were developed, and all existing components were updated, to enhance performance and supportability. This report describes the history, application, concepts, organization, and mathematical formulation of the Precipitation-Runoff Modeling System and its model components. This updated version provides improvements in (1) system flexibility for integrated science, (2) verification of conservation of water during simulation, (3) methods for spatial distribution of climate boundary conditions, and (4) methods for simulation of soil-water flow and storage.

  9. Framework and Method for Controlling a Robotic System Using a Distributed Computer Network

    NASA Technical Reports Server (NTRS)

    Sanders, Adam M. (Inventor); Strawser, Philip A. (Inventor); Barajas, Leandro G. (Inventor); Permenter, Frank Noble (Inventor)

    2015-01-01

    A robotic system for performing an autonomous task includes a humanoid robot having a plurality of compliant robotic joints, actuators, and other integrated system devices that are controllable in response to control data from various control points, and having sensors for measuring feedback data at the control points. The system includes a multi-level distributed control framework (DCF) for controlling the integrated system components over multiple high-speed communication networks. The DCF has a plurality of first controllers each embedded in a respective one of the integrated system components, e.g., the robotic joints, a second controller coordinating the components via the first controllers, and a third controller for transmitting a signal commanding performance of the autonomous task to the second controller. The DCF virtually centralizes all of the control data and the feedback data in a single location to facilitate control of the robot across the multiple communication networks.

  10. System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures

    NASA Technical Reports Server (NTRS)

    Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger

    2007-01-01

    This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, which is called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and use as a realistic test bed for Exploration automation technology research and development.

  11. A modular approach to addressing model design, scale, and parameter estimation issues in distributed hydrological modelling

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Restrepo, Pedro J.; Viger, R.J.

    2002-01-01

    A modular approach to model design and construction provides a flexible framework in which to focus the multidisciplinary research and operational efforts needed to facilitate the development, selection, and application of the most robust distributed modelling methods. A variety of modular approaches have been developed, but with little consideration for compatibility among systems and concepts. Several systems are proprietary, limiting any user interaction. The US Geological Survey modular modelling system (MMS) is a modular modelling framework that uses an open source software approach to enable all members of the scientific community to address collaboratively the many complex issues associated with the design, development, and application of distributed hydrological and environmental models. Implementation of a common modular concept is not a trivial task. However, it brings the resources of a larger community to bear on the problems of distributed modelling, provides a framework in which to compare alternative modelling approaches objectively, and provides a means of sharing the latest modelling advances. The concepts and components of the MMS are described and an example application of the MMS, in a decision-support system context, is presented to demonstrate current system capabilities. Copyright © 2002 John Wiley and Sons, Ltd.

  12. Public Marketing: An Alternative Policy Decision-Making Idea for Small Cities. Community Development Research Series.

    ERIC Educational Resources Information Center

    Meyers, James; And Others

    The concept of public marketing presents a strategy for the systems approach to community development that would facilitate the community decision making process via improved communication. Basic aspects of the social marketing process include: (1) product policy; (2) channels of distribution; (3) pricing (perceived price vs quality and quantity…

  13. In Vitro Exposure Systems and Dosimetry Assessment Tools for Inhaled Tobacco Products: Workshop Proceedings, Conclusions, and Paths Forward for In Vitro Model Use

    EPA Science Inventory

    In 2009, the passing of The Family Smoking Prevention and Tobacco Control Act facilitated the establishment of the FDA Center for Tobacco Products (CTP) and gave it regulatory authority over the marketing, manufacture and distribution of tobacco products, including those termed “...

  14. 500 C Electronic Packaging and Dielectric Materials for High Temperature Applications

    NASA Technical Reports Server (NTRS)

    Chen, Liang-yu; Neudeck, Philip G.; Spry, David J.; Beheim, Glenn M.; Hunter, Gary W.

    2016-01-01

    High-temperature environment operable sensors and electronics are required for exploring the inner solar planets and distributed control of next generation aeronautical engines. Various silicon carbide (SiC) high temperature sensors, actuators, and electronics have been demonstrated at and above 500C. A compatible packaging system is essential for long-term testing and application of high temperature electronics and sensors. High temperature passive components are also necessary for high temperature electronic systems. This talk will discuss ceramic packaging systems developed for high temperature electronics, and related testing results of SiC circuits at 500C and silicon-on-insulator (SOI) integrated circuits at temperatures beyond commercial limit facilitated by these high temperature packaging technologies. Dielectric materials for high temperature multilayers capacitors will also be discussed.

  15. Investigation of the applicability of a functional programming model to fault-tolerant parallel processing for knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Harper, Richard

    1989-01-01

    In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.

  16. Mentat: An object-oriented macro data flow system

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Liu, Jane W. S.

    1988-01-01

    Mentat, an object-oriented macro data flow system designed to facilitate parallelism in distributed systems, is presented. The macro data flow model is a model of computation similar to the data flow model with two principal differences: the computational complexity of the actors is much greater than in traditional data flow systems, and there are persistent actors that maintain state information between executions. Mentat is a system that combines the object-oriented programming paradigm and the macro data flow model of computation. Mentat programs use a dynamic structure called a future list to represent the future of computations.
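
    Mentat's future lists are a construct of its own object-oriented framework; as a loose analogy only, the sketch below uses Python futures to show the underlying idea of naming the result of a coarse-grained actor before it has been produced, so downstream work can be expressed data-flow style. The stage functions and data are invented for illustration.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def filter_stage(raw):
        """A coarse-grained 'actor': drop negative readings."""
        return [x for x in raw if x >= 0]

    def reduce_stage(filtered):
        """A second actor consuming the first actor's result."""
        return sum(filtered) / max(len(filtered), 1)

    with ThreadPoolExecutor() as pool:
        raw_future = pool.submit(lambda: [3, -1, 4, 1, -5, 9])
        # Downstream work is wired to the future's eventual value.
        filtered_future = pool.submit(lambda: filter_stage(raw_future.result()))
        print(reduce_stage(filtered_future.result()))   # 4.25
    ```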

  17. Cardea: Providing Support for Dynamic Resource Access in a Distributed Computing Environment

    NASA Technical Reports Server (NTRS)

    Lepro, Rebekah

    2003-01-01

    The environment framing the modern authorization process spans domains of administration, relies on many different authentication sources, and manages complex attributes as part of the authorization process. Cardea facilitates dynamic access control within this environment as a central function of an interoperable authorization framework. The system departs from the traditional authorization model by separating the authentication and authorization processes, distributing the responsibility for authorization data, and allowing collaborating domains to retain control over their implementation mechanisms. Critical features of the system architecture and its handling of the authorization process differentiate the system from existing authorization components by addressing common needs not adequately addressed by existing systems. Continuing system research seeks to enhance the implementation of the current authorization model employed in Cardea, increase the robustness of current features, further the framework for establishing trust, and promote interoperability with existing security mechanisms.

  18. High Level Analysis, Design and Validation of Distributed Mobile Systems with CoreASM

    NASA Astrophysics Data System (ADS)

    Farahbod, R.; Glässer, U.; Jackson, P. J.; Vajihollahi, M.

    System design is a creative activity calling for abstract models that facilitate reasoning about the key system attributes (desired requirements and resulting properties) so as to ensure these attributes are properly established prior to actually building a system. We explore here the practical side of using the abstract state machine (ASM) formalism in combination with the CoreASM open source tool environment for high-level design and experimental validation of complex distributed systems. Emphasizing the early phases of the design process, a guiding principle is to support freedom of experimentation by minimizing the need for encoding. CoreASM has been developed and tested building on a broad scope of applications, spanning computational criminology, maritime surveillance and situation analysis. We critically reexamine here the CoreASM project in light of three different application scenarios.

  19. A mobile field-work data collection system for the wireless era of health surveillance.

    PubMed

    Forsell, Marianne; Sjögren, Petteri; Renard, Matthew; Johansson, Olle

    2011-03-01

    In many countries or regions the capacity of health care resources is below the needs of the population, and new approaches for health surveillance are needed. Innovative projects utilizing wireless communication technology contribute to reliable methods for field-work data collection and reporting to databases. The objective was to describe a new version of a wireless IT-support system for field-work data collection and administration. The system requirements were drawn from the design objective and translated to system functions. The system architecture was based on field-work experiences and administrative requirements. The smartphone devices were HTC Touch Diamond2s, while the system was based on a platform with Microsoft .NET components and a SQL Server 2005 database running on the Microsoft Windows Server 2003 operating system. The user interfaces were based on .NET programming and the Microsoft Windows Mobile operating system. A synchronization module enabled download of field data to the database via a General Packet Radio Service (GPRS) to Local Area Network (LAN) interface. The field-workers considered the applications described here user-friendly and almost self-instructing. The office administrators considered that the back-office interface facilitated retrieval of health reports and invoice distribution. The current IT-support system facilitates short lead times from field-work data registration to analysis and is suitable for various applications. The advantages of wireless technology and paper-free data administration need to be increasingly emphasized in development programs, in order to facilitate reliable and transparent use of limited resources.

  20. The Warwick system of prospective workload allocation in cellular pathology—an aid to subspecialisation: a comparison with the Royal College of Pathologists' system

    PubMed Central

    Carr, R A; Sanders, D S A; Stores, O P; Smew, F A; Parkes, M E; Ross‐Gilbertson, V; Chachlani, N; Simon, J

    2006-01-01

    Background: Guidelines on staffing and workload for histopathology and cytopathology departments was published by the Royal College of Pathologists (RCPath) in July 2003. In this document, a system is provided whereby the workload of a cellular pathology department and individual pathologists can be assessed with a scoring system based on specialty and complexity of the specimens. A similar, but simplified, system of scoring specimens by specialty was developed in the Warwick District General Hospital. The system was based on the specimen type and suggested clinical diagnosis, so that specimens could be allocated prospectively by the laboratory technical staff to even out workload and support subspecialisation in a department staffed by 4.6 whole‐time equivalent consultant pathologists.

    Methods: The pathologists were asked to indicate their reporting preferences to determine specialist reporting teams. The workload was allocated according to the "prospective" Warwick system (based on specimen type and suggested clinical diagnosis, not affected by final diagnosis or individual pathologist variation in reference to numbers of blocks, sections and special stains examined) for October 2003. The cumulative Warwick score was compared with the "retrospective" RCPath scoring system for each pathologist and between specialties. Four pathologists recorded their time for cut-up and reporting for the month audited.

    Results: The equitable distribution of work between pathologists was ensured by the Warwick allocation and workload system, hence facilitating specialist reporting. Less variation was observed in points reported per hour by the Warwick system (6.3 (range 5.5–6.9)) than by the RCPath system (11.5 (range 9.3–15)).

    Conclusions: The RCPath system of scoring is inherently complex, is applied retrospectively and is not consistent across subspecialities. The Warwick system is simpler, prospective and can be run by technical staff; it facilitates even workload distribution throughout the day. Subspecialisation within a small‐sized or medium‐sized department with fair distribution of work between pathologists is also allowed for by this system. Reporting times among pathologists were shown by time and motion studies to be more consistent with Warwick points per hour than with RCPath points per hour. PMID:16524963

  1. Modernizing Distribution System Restoration to Achieve Grid Resiliency Against Extreme Weather Events: An Integrated Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Wang, Jianhui; Ton, Dan

    Recent severe power outages caused by extreme weather hazards have highlighted the importance and urgency of improving the resilience of the electric power grid. As the distribution grids still remain vulnerable to natural disasters, the power industry has focused on methods of restoring distribution systems after disasters in an effective and quick manner. The current distribution system restoration practice for utilities is mainly based on predetermined priorities and tends to be inefficient and suboptimal, and the lack of situational awareness after the hazard significantly delays the restoration process. As a result, customers may experience an extended blackout, which causes large economic loss. On the other hand, the emerging advanced devices and technologies enabled through grid modernization efforts have the potential to improve the distribution system restoration strategy. However, utilizing these resources to aid the utilities in better distribution system restoration decision-making in response to extreme weather events is a challenging task. Therefore, this paper proposes an integrated solution: a distribution system restoration decision support tool designed by leveraging resources developed for grid modernization. We first review the current distribution restoration practice and discuss why it is inadequate in response to extreme weather events. Then we describe how the grid modernization efforts could benefit distribution system restoration, and we propose an integrated solution in the form of a decision support tool to achieve the goal. The advantages of the solution include improving situational awareness of the system damage status and facilitating survivability for customers. The paper provides a comprehensive review of how the existing methodologies in the literature could be leveraged to achieve the key advantages. The benefits of the developed system restoration decision support tool include the optimal and efficient allocation of repair crews and resources, the expediting of the restoration process, and the reduction of outage durations for customers, in response to severe blackouts due to extreme weather hazards.

  2. Automation and hypermedia technology applications

    NASA Technical Reports Server (NTRS)

    Jupin, Joseph H.; Ng, Edward W.; James, Mark L.

    1993-01-01

    This paper represents a progress report on HyLite (Hypermedia Library technology): a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. Proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to process relevant information into a more efficient organization for presentation, retrieval, and readability. To accomplish this end, we have incorporated various AI techniques into the HyLite hypermedia engine to facilitate parameters and properties of the system. The proposed techniques include intelligent searching tools for the libraries, intelligent retrievals, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC) which used hypermedia to facilitate and encourage software reuse.

  3. On intra-supply chain system with an improved distribution plan, multiple sales locations and quality assurance.

    PubMed

    Chiu, Singa Wang; Huang, Chao-Chih; Chiang, Kuo-Wei; Wu, Mei-Fang

    2015-01-01

    Transnational companies, operating in extremely competitive global markets, always seek to lower different operating costs, such as inventory holding costs in their intra-supply chain system. This paper incorporates a cost-reducing product distribution policy into an intra-supply chain system with multiple sales locations and quality assurance studied by [Chiu et al., Expert Syst Appl, 40:2669-2676, (2013)]. Under the proposed cost-reducing distribution policy, an added initial delivery of end items is distributed to multiple sales locations to meet their demand during the production unit's uptime and rework time. After rework, when the remaining production lot has gone through quality assurance, n fixed-quantity installments of finished items are then transported to sales locations at a fixed time interval. Mathematical modeling and optimization techniques are used to derive closed-form optimal operating policies for the proposed system. Furthermore, the study demonstrates significant savings in stock holding costs for both the production unit and the sales locations. The alternative of outsourcing the product delivery task to an external distributor is analyzed to assist managerial decision-making on potential outsourcing issues in order to facilitate further reduction in operating costs.

  4. Directing and Facilitating Distributed Pedagogical Leadership: Best Practices in Early Childhood Education

    ERIC Educational Resources Information Center

    Bøe, Marit; Hognestad, Karin

    2017-01-01

    This paper uses a hybrid leadership framework to examine how formal teacher leaders at the middle management level direct and facilitate staff resources for distributed pedagogical leadership. By conducting qualitative shadowing, involving video observation, field notes and stimulated recall interviews, and abductive analysis, this study…

  5. Ecological Complexity in a Coffee Agroecosystem: Spatial Heterogeneity, Population Persistence and Biological Control

    PubMed Central

    Liere, Heidi; Jackson, Doug; Vandermeer, John

    2012-01-01

    Background: Spatial heterogeneity is essential for the persistence of many inherently unstable systems such as predator-prey and parasitoid-host interactions. Since biological interactions themselves can create heterogeneity in space, the heterogeneity necessary for the persistence of an unstable system could be the result of local interactions involving elements of the unstable system itself.

    Methodology/Principal Findings: Here we report on a predatory ladybird beetle whose natural history suggests that the beetle requires the patchy distribution of the mutualism between its prey, the green coffee scale, and the arboreal ant, Azteca instabilis. Based on known ecological interactions and the natural history of the system, we constructed a spatially-explicit model and showed that the clustered spatial pattern of ant nests facilitates the persistence of the beetle populations. Furthermore, we show that the dynamics of the beetle consuming the scale insects can cause the clustered distribution of the mutualistic ants in the first place.

    Conclusions/Significance: From a theoretical point of view, our model represents a novel situation in which a predator indirectly causes a spatial pattern of an organism other than its prey, and in doing so facilitates its own persistence. From a practical point of view, it is noteworthy that one of the elements in the system is a persistent pest of coffee, an important world commodity. This pest, we argue, is kept within limits of control through a complex web of ecological interactions that involves the emergent spatial pattern. PMID:23029061

  6. Multiple mechanisms sustain a plant-animal facilitation on a coastal ecotone

    PubMed Central

    He, Qiang; Cui, Baoshan

    2015-01-01

    Theory suggests that species distributions are expanded by positive species interactions, but the importance of facilitation in expanding species distributions at physiological range limits has not been widely recognized. We investigated the effects of the nurse shrub Tamarix chinensis on the crab Helice tientsinensis on the terrestrial borders of salt marshes, a typical coastal ecotone, where Tamarix and Helice were on their lower and upper elevational distribution edges, respectively. Crab burrows were abundant under Tamarix, but were absent in open areas between Tamarix. Removing Tamarix decreased associated crab burrows with time, while simulating Tamarix in open areas by shading, excluding predators, and adding Tamarix branches as crab food, increased crab burrows. Measurements of soil and microclimate factors showed that removing Tamarix increased abiotic stress, while simulating Tamarix by shading decreased abiotic stress. Survival of tethered crabs was high only when protected from desiccation and predation. Thus, by alleviating abiotic and biotic stresses, as well as by food provision, Tamarix expanded the upper intertidal distribution of Helice. Our study provides clear evidence for the importance of facilitation in expanding species distributions at their range limits, and suggests that facilitation is a crucial biological force maintaining the ecotones between ecosystems. PMID:25721758

  7. Multiple mechanisms sustain a plant-animal facilitation on a coastal ecotone.

    PubMed

    He, Qiang; Cui, Baoshan

    2015-02-27

    Theory suggests that species distributions are expanded by positive species interactions, but the importance of facilitation in expanding species distributions at physiological range limits has not been widely recognized. We investigated the effects of the nurse shrub Tamarix chinensis on the crab Helice tientsinensis on the terrestrial borders of salt marshes, a typical coastal ecotone, where Tamarix and Helice were on their lower and upper elevational distribution edges, respectively. Crab burrows were abundant under Tamarix, but were absent in open areas between Tamarix. Removing Tamarix decreased associated crab burrows with time, while simulating Tamarix in open areas by shading, excluding predators, and adding Tamarix branches as crab food, increased crab burrows. Measurements of soil and microclimate factors showed that removing Tamarix increased abiotic stress, while simulating Tamarix by shading decreased abiotic stress. Survival of tethered crabs was high only when protected from desiccation and predation. Thus, by alleviating abiotic and biotic stresses, as well as by food provision, Tamarix expanded the upper intertidal distribution of Helice. Our study provides clear evidence for the importance of facilitation in expanding species distributions at their range limits, and suggests that facilitation is a crucial biological force maintaining the ecotones between ecosystems.

  8. Agent-based Decision Support System for the Third Generation Distributed Dynamic Decision-making (DDD-III) Simulator

    DTIC Science & Technology

    2004-06-01

    suitable form of organizational adaptation is effective organizational diagnosis and analysis. The organizational diagnosis and analysis involve... related to the mission environment, organizational structure, and strategy is imperative for an effective and efficient organizational diagnosis. The... not easily articulated nor expressed otherwise. These displays are crucial to facilitate effective organizational diagnosis and analysis, and

  9. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  10. Diagnostic layer integration in FPGA-based pipeline measurement systems for HEP experiments

    NASA Astrophysics Data System (ADS)

    Pozniak, Krzysztof T.

    2007-08-01

    Integrated triggering and data acquisition systems for high energy physics experiments may be considered as fast, multichannel, synchronous, distributed, pipeline measurement systems. A considerable extension of functional, technological and monitoring demands, which has recently been imposed on them, has forced the common use of large field-programmable gate array (FPGA), digital-signal-processing-enhanced matrices and fast optical transmission for their realization. This paper discusses the modelling, design, realization and testing of pipeline measurement systems. The distribution of synchronous data-stream flows in the network is considered. A general functional structure of a single network node is presented. The suggested novel block structure of the node model facilitates full implementation in the FPGA chip, circuit standardization and parametrization, as well as integration of the functional and diagnostic layers. A general method for pipeline system design is derived, based on a unified model of the synchronous data network node. A few examples of practically realized, FPGA-based pipeline measurement systems are presented; the described systems have been applied in ZEUS and CMS.

  11. Coordinated distribution network control of tap changer transformers, capacitors and PV inverters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ceylan, Oğuzhan; Liu, Guodong; Tomsovic, Kevin

    A power distribution system operates most efficiently with voltage deviations along a feeder kept to a minimum, and it must ensure that all voltages remain within specified limits. Recently, with the increased integration of photovoltaics, the variable power output has led to increased voltage fluctuations and violations of operating limits. This study proposes an optimization model based on a recently developed heuristic search method, grey wolf optimization, to coordinate the various distribution controllers. Several case studies are performed on IEEE 33- and 69-bus test systems modified by including tap-changing transformers, capacitors and photovoltaic solar panels. Simulation results are compared to two other heuristic-based optimization methods: harmony search and differential evolution. Finally, the simulation results show the effectiveness of the method and indicate that using the reactive power outputs of the PVs facilitates a better voltage magnitude profile.
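
    The feeder-level objective (tap positions, capacitor steps and PV reactive power evaluated through a power flow) is beyond a short sketch, but the grey wolf search itself is compact. In the sketch below, a placeholder sphere objective stands in for the voltage-deviation cost; the position updates follow the standard grey wolf scheme in which each wolf moves toward the three best wolves of the current population.

```python
import numpy as np

def grey_wolf_optimize(objective, bounds, n_wolves=20, n_iter=100, seed=0):
    """Minimal grey wolf optimizer: wolves move toward the alpha, beta and delta wolves."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iter):
        fitness = np.apply_along_axis(objective, 1, wolves)
        order = np.argsort(fitness)
        alpha, beta, delta = wolves[order[:3]]        # three best of this iteration
        a = 2.0 * (1.0 - t / n_iter)                  # decreases linearly from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                new_pos += leader - A * D
            wolves[i] = np.clip(new_pos / 3.0, lo, hi)
    fitness = np.apply_along_axis(objective, 1, wolves)
    best = wolves[np.argmin(fitness)]
    return best, objective(best)

# Placeholder objective standing in for the feeder voltage-deviation cost.
objective = lambda x: np.sum((x - 0.3) ** 2)
bounds = np.tile([[-1.0, 1.0]], (5, 1))               # 5 hypothetical control variables
best, val = grey_wolf_optimize(objective, bounds)
print("best controls:", best, "objective:", val)
```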

  12. Coordinated distribution network control of tap changer transformers, capacitors and PV inverters

    DOE PAGES

    Ceylan, Oğuzhan; Liu, Guodong; Tomsovic, Kevin

    2017-06-08

    A power distribution system operates most efficiently with voltage deviations along a feeder kept to a minimum, and it must ensure that all voltages remain within specified limits. Recently, with the increased integration of photovoltaics, the variable power output has led to increased voltage fluctuations and violations of operating limits. This study proposes an optimization model based on a recently developed heuristic search method, grey wolf optimization, to coordinate the various distribution controllers. Several case studies are performed on IEEE 33- and 69-bus test systems modified by including tap-changing transformers, capacitors and photovoltaic solar panels. Simulation results are compared to two other heuristic-based optimization methods: harmony search and differential evolution. Finally, the simulation results show the effectiveness of the method and indicate that using the reactive power outputs of the PVs facilitates a better voltage magnitude profile.

  13. Hierarchical control framework for integrated coordination between distributed energy resources and demand response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Di; Lian, Jianming; Sun, Yannan

    Demand response represents a significant but largely untapped resource that can greatly enhance the flexibility and reliability of power systems. In this paper, a hierarchical control framework is proposed to facilitate the integrated coordination between distributed energy resources and demand response. The proposed framework consists of coordination and device layers. In the coordination layer, various resource aggregations are optimally coordinated in a distributed manner to achieve the system-level objectives. In the device layer, individual resources are controlled in real time to follow the optimal power generation or consumption dispatched from the coordination layer. For practical applications, a method is presented to determine the utility functions of controllable loads by taking into account the real-time load dynamics and the preferences of individual customers. The effectiveness of the proposed framework is validated by detailed simulation studies.

  14. Reflections on organizational issues in developing, implementing, and maintaining state Web-based data query systems.

    PubMed

    Love, Denise; Shah, Gulzar H

    2006-01-01

    Emerging technologies, such as Web-based data query systems (WDQSs), provide opportunities for state and local agencies to systematically organize and disseminate data to broad audiences and streamline the data distribution process. Despite the progress in WDQSs' implementation, led by agencies considered the "early adopters," there are still agencies left behind. This article explores the organizational issues and barriers to development of WDQSs in public health agencies and highlights factors facilitating the implementation of WDQSs.

  15. Small-Animal Imaging Using Diffuse Fluorescence Tomography.

    PubMed

    Davis, Scott C; Tichauer, Kenneth M

    2016-01-01

    Diffuse fluorescence tomography (DFT) has been developed to image the spatial distribution of fluorescence-tagged tracers in living tissue. This capability facilitates the recovery of any number of functional parameters, including enzymatic activity, receptor density, blood flow, and gene expression. However, deploying DFT effectively is complex and often requires years of know-how, especially for newer multimodal systems that combine DFT with conventional imaging systems. In this chapter, we step through the process of MRI-DFT imaging of a receptor-targeted tracer in small animals.

  16. Cost efficient command management

    NASA Technical Reports Server (NTRS)

    Brandt, Theresa; Murphy, C. W.; Kuntz, Jon; Barlett, Tom

    1996-01-01

    The design and implementation of a command management system (CMS) for a NASA control center is described. The technology innovations implemented in the CMS provide the infrastructure required for operations cost reduction and future development cost reduction through increased operational efficiency and reuse in future missions. The command management design facilitates error-free operations, which enables the automation of routine control center functions and allows scheduling responsibility to be distributed to the instrument teams. The reusable system was developed using object-oriented methodologies.

  17. Dose factor entry and display tool for BNCT radiotherapy

    DOEpatents

    Wessol, Daniel E.; Wheeler, Floyd J.; Cook, Jeremy L.

    1999-01-01

    A system for use in Boron Neutron Capture Therapy (BNCT) radiotherapy planning where a biological distribution is calculated using a combination of conversion factors and a previously calculated physical distribution. Conversion factors are presented in a graphical spreadsheet so that a planner can easily view and modify the conversion factors. For radiotherapy in multi-component modalities, such as Fast-Neutron and BNCT, it is necessary to combine each conversion factor component to form an effective dose which is used in radiotherapy planning and evaluation. The Dose Factor Entry and Display System is designed to facilitate planner entry of appropriate conversion factors in a straightforward manner for each component. The effective isodose is then immediately computed and displayed over the appropriate background (e.g. digitized image).
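
    The central computation described, combining per-component physical dose distributions with planner-entered conversion factors into an effective dose, is straightforward to illustrate. The component names, factor values and grid below are hypothetical placeholders, not values from the BNCT planning system.

```python
import numpy as np

# Hypothetical per-component physical dose distributions on a 2-D grid (Gy),
# e.g. boron, nitrogen, fast-neutron and gamma components of a BNCT field.
grid = (64, 64)
physical_dose = {
    "boron": np.random.rand(*grid) * 10.0,
    "nitrogen": np.random.rand(*grid) * 1.0,
    "fast_neutron": np.random.rand(*grid) * 2.0,
    "gamma": np.random.rand(*grid) * 3.0,
}

# Hypothetical conversion factors a planner might enter for each component.
conversion_factor = {"boron": 3.8, "nitrogen": 3.2, "fast_neutron": 3.2, "gamma": 1.0}

# Effective dose = sum over components of (conversion factor * physical component dose).
effective_dose = sum(conversion_factor[c] * physical_dose[c] for c in physical_dose)

# Isodose levels (fractions of the maximum) that a display tool might contour
# over a background image.
levels = np.array([0.2, 0.5, 0.8, 0.95]) * effective_dose.max()
print("effective dose grid:", effective_dose.shape, "isodose levels (Gy-eq):", levels.round(2))
```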

  18. Remote maintenance monitoring system

    NASA Technical Reports Server (NTRS)

    Simpkins, Lorenz G. (Inventor); Owens, Richard C. (Inventor); Rochette, Donn A. (Inventor)

    1992-01-01

    A remote maintenance monitoring system retrofits to a given hardware device with a sensor implant which gathers and captures failure data from the hardware device, without interfering with its operation. Failure data is continuously obtained from predetermined critical points within the hardware device, and is analyzed with a diagnostic expert system, which isolates failure origin to a particular component within the hardware device. For example, monitoring of a computer-based device may include monitoring of parity error data therefrom, as well as monitoring power supply fluctuations therein, so that parity error and power supply anomaly data may be used to trace the failure origin to a particular plane or power supply within the computer-based device. A plurality of sensor implants may be retrofit to corresponding plural devices comprising a distributed large-scale system. Transparent interface of the sensors to the devices precludes operative interference with the distributed network. Retrofit capability of the sensors permits monitoring of even older devices having no built-in testing technology. Continuous real time monitoring of a distributed network of such devices, coupled with diagnostic expert system analysis thereof, permits capture and analysis of even intermittent failures, thereby facilitating maintenance of the monitored large-scale system.

  19. Distributed event-triggered consensus strategy for multi-agent systems under limited resources

    NASA Astrophysics Data System (ADS)

    Noorbakhsh, S. Mohammad; Ghaisari, Jafar

    2016-01-01

    The paper proposes a distributed structure to address an event-triggered consensus problem for multi-agent systems, aiming at a concurrent reduction in inter-agent communication, control input actuation and energy consumption. Under the proposed approach, asymptotic convergence of all agents to consensus requires that each agent broadcast its sampled state to its neighbours and update its control input only at its own triggering instants, unlike existing related works. This decreases network bandwidth usage, sensor energy consumption, computational resource usage and actuator wear. As a result, it facilitates the implementation of the proposed consensus protocol in real-world applications with limited resources. The stability of the closed-loop system under the event-based protocol is proved analytically. Numerical results are presented which confirm the analytical discussion of the effectiveness of the proposed design.
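
    The triggering condition in the paper is defined analytically; a toy discrete-time simulation of the underlying idea (an agent rebroadcasts its state only when its measurement error exceeds a threshold, and controls are computed from the last broadcast values) is sketched below. The graph, threshold and step size are illustrative and not taken from the paper.

```python
import numpy as np

# Undirected ring of 4 single-integrator agents (adjacency matrix).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

dt, steps, threshold = 0.01, 2000, 0.05
x = np.array([1.0, -2.0, 0.5, 3.0])      # true states
x_hat = x.copy()                          # last broadcast states
broadcasts = 0

for _ in range(steps):
    # Event rule (illustrative): rebroadcast when |x_i - x_hat_i| exceeds the threshold.
    trigger = np.abs(x - x_hat) > threshold
    x_hat = np.where(trigger, x, x_hat)
    broadcasts += int(trigger.sum())
    # Each agent's control uses only broadcast values: u_i = -sum_j a_ij (x_hat_i - x_hat_j).
    u = -(A * (x_hat[:, None] - x_hat[None, :])).sum(axis=1)
    x = x + dt * u

print("final spread:", x.max() - x.min(), "total broadcasts:", broadcasts)
```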

  20. Analysis of the use of industrial control systems in simulators: state of the art and basic guidelines.

    PubMed

    Carrasco, Juan A; Dormido, Sebastián

    2006-04-01

    The use of industrial control systems in simulators facilitates the execution of engineering activities related to the installation and optimization of control systems in real plants. "Industrial control system" is intended as a general term covering all the control systems that can be installed in an industrial plant, ranging from complex distributed control systems and SCADA packages to small single control devices. This paper summarizes the current alternatives for the development of simulators of industrial plants and presents an analysis of the process of integrating an industrial control system into a simulator, with the aim of helping in the installation of real control systems in simulators.

  1. Thermodynamic Vent System for an On-Orbit Cryogenic Reaction Control Engine

    NASA Technical Reports Server (NTRS)

    Hurlbert, Eric A.; Romig, Kris A.; Jimenez, Rafael; Flores, Sam

    2012-01-01

    A report discusses a cryogenic reaction control system (RCS) that integrates a Joule-Thompson (JT) device (expansion valve) and thermodynamic vent system (TVS) with a cryogenic distribution system to allow fine control of the propellant quality (subcooled liquid) during operation of the device. It enables zero venting when coupled with an RCS engine. Proper attachment locations and orifice sizing along the propellant distribution line are required to facilitate line conditioning. During operations, system instrumentation was strategically installed along the distribution/TVS line assembly, and temperature control bands were identified. A sub-scale run tank, full-scale distribution line, open-loop TVS, and a combination of procured and custom-fabricated cryogenic components were used in the cryogenic RCS build-up. Simulated on-orbit activation and thruster firing profiles were performed to quantify system heat gain and evaluate the TVS's capability to maintain the required propellant conditions at the inlet to the engine valves. Test data determined that a small control valve, such as a piezoelectric valve, is optimal for continuously providing the required thermal control. The data obtained from testing have also assisted with the development of fluid and thermal models of an RCS to refine integrated cryogenic propulsion system designs. This system allows a liquid oxygen-based main propulsion and reaction control system for a spacecraft, which improves performance, safety, and cost over conventional hypergolic systems due to higher performance, use of nontoxic propellants, potential for integration with life support and power subsystems, and compatibility with in-situ produced propellants.

  2. BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.

    PubMed

    Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron

    2009-06-01

    BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported, including oligonucleotide and dual- or single-dye experiments, with post-processing by Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a Creative Commons license, along with additional documentation and a tutorial, from http://bioinf.nuigalway.ie.

  3. Enhanced training using the life support for trauma and transport (LSTAT)

    NASA Astrophysics Data System (ADS)

    Hanson, Matthew E.; Toth, Louis S.; White, William H.

    1999-07-01

    The Life Support for Trauma and Transport (LSTAT) is an intensive care unit (ICU) in a 'stretcher' only 5 inches thick. LSTAT is a portable intensive care system which integrates state-of-the-art, commercial-off-the-shelf, hospital grade ICU devices into a single patient resuscitation, stabilization, evacuation, and surgical platform. LSTAT's current and evolving attributes include compact volume, low weight, integrated devices and subsystems, ergonomic patient-caregiver interface, patient and system information system, near-universal power interface, patient- caregiver hazardous environment isolation, and extensive evacuation vehicle interface compatibility. Although the LSTAT system architecture was established primarily to support diagnosis, monitoring and telemedicine consulting, the information architecture and communications suite can also support hosting training experiences and scenarios. The training scenario capabilities and features include: (1) moving training out to the field, (2) facilitating distributed training, (3) off-setting training with remote experts (or potentially embedded expert systems), and (4) facilitating training-by-simulation. Equipping the caregiver via such enhanced equipment and training should ultimately translate into better care for the patient.

  4. Characterizing opto-electret based paper speakers by using a real-time projection Moiré metrology system

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Ling; Hsu, Kuan-Yu; Lee, Chih-Kung

    2016-03-01

    Advances in distributed piezo-electret sensors and actuators facilitate the development of various smart systems, including paper speakers, opto-piezo/electret bio-chips, etc. The array-based loudspeaker system possesses several advantages over conventional coil speakers, such as light weight, flexibility, low power consumption, and directivity. With the understanding that the performance of large-area piezo-electret loudspeakers, or even the transport behavior of microfluidic biochips, could be tailored by changing their dynamic behaviors, a full-field, real-time, high-resolution, non-contact metrology system was developed. In this paper, the influence of the resonance modes and transient vibrations of an array-based loudspeaker system on the acoustic output was measured using a real-time projection moiré metrology system and microphones. To make the paper speaker even more versatile, we incorporated the photosensitive material TiOPc into the original electret loudspeaker. The vibration of this newly developed opto-electret loudspeaker can be manipulated by illuminating it with different light-intensity patterns. To facilitate the tailoring of the opto-electret loudspeaker, projection moiré was adopted to measure its vibration. By recording the projected fringes, which are modulated by the contours of the test sample, the phase unwrapping algorithm yields a continuous phase distribution proportional to the object height variations. With the aid of the projection moiré metrology system, the vibrations associated with each distinct light pattern could be characterized. We therefore expect that the overall acoustic performance can be improved by finding suitable illumination patterns. In this manuscript, the performance of the projection moiré system and the opto-electret paper speakers were cross-examined and verified against the experimental results obtained.
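
    The processing step singled out in the abstract, unwrapping the projected-fringe phase into a continuous map proportional to surface height, can be sketched with standard NumPy tools. The phase-to-height factor and synthetic surface below are placeholders; a real system calibrates this factor from the projection geometry and would use a more robust 2-D unwrapper on noisy fringes.

```python
import numpy as np

# Synthetic vibrating-membrane height map (placeholder for a loudspeaker mode shape).
y, x = np.mgrid[0:128, 0:128] / 128.0
height = 5e-6 * np.sin(np.pi * x) * np.sin(2 * np.pi * y)   # metres

k = 2 * np.pi / 2e-6                      # assumed phase-to-height sensitivity (rad per metre)
wrapped = np.angle(np.exp(1j * k * height))                  # wrapped phase the camera "sees"

# Row-wise then column-wise 1-D unwrapping (adequate for smooth, low-noise fringes).
unwrapped = np.unwrap(np.unwrap(wrapped, axis=1), axis=0)
height_est = unwrapped / k

# Remove the unknown global phase offset before comparing against the ground truth.
height_est -= height_est.mean() - height.mean()
print("max reconstruction error (m):", np.abs(height_est - height).max())
```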

  5. Intelligent Systems Technologies to Assist in Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.; McConaughy, Gail; Lynnes, Christopher; McDonald, Kenneth; Kempler, Steven

    2003-01-01

    With the launch of several Earth observing satellites over the last decade, we are now in a data-rich environment. From NASA's Earth Observing System (EOS) satellites alone, we are accumulating more than 3 TB per day of raw data and derived geophysical parameters. The data products are being distributed to a large user community comprising scientific researchers, educators and operational government agencies. Notable progress has been made in the last decade in facilitating access to data. However, to realize the full potential of the growing archives of valuable scientific data, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system. Potential Intelligent Archive concepts include: 1) Mining archived data holdings using Intelligent Data Understanding algorithms to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services involved in a scientific enterprise; 3) Recognizing the value of results, indexing and formatting them for easy access, and delivering them to concerned individuals; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building (i.e., the transformations from data to information to knowledge) instead of just data pipelining; and 5) Being aware of other nodes in the knowledge building system, participating in open systems interfaces and protocols for virtualization, and collaborative interoperability. This paper presents some of these concepts and identifies issues to be addressed by research in future intelligent systems technology.

  6. Acquisition of He3 Cryostat Insert for Experiments on Topological Insulators

    DTIC Science & Technology

    2016-02-03

    The award enabled the PI to acquire a complete cryogenic system with a 9-Tesla superconducting magnet, which facilitated transport experiments on topological insulators and Dirac and Weyl semimetals. These experiments resulted in several notable achievements. (Approved for public release; distribution unlimited.)

  7. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition, the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785
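
    One concrete piece of such a platform, the interface tool that ingests DICOM RT objects, can be approximated with the pydicom library: an RTDOSE file stores an integer grid plus a scaling factor, and the physical dose is their product. The file name and the whole-grid histogram below are illustrative; a real analysis would restrict the histogram to voxels inside a contoured structure.

```python
import numpy as np
import pydicom

# Hypothetical file path; any RTDOSE object exported from a planning system would do.
ds = pydicom.dcmread("plan_rtdose.dcm")

# RTDOSE stores an integer grid plus a scaling factor; physical dose in Gy is the product.
dose = ds.pixel_array * float(ds.DoseGridScaling)

# Toy "outcome analysis" step: a cumulative dose-volume histogram over the whole grid.
bins = np.linspace(0.0, dose.max(), 101)
volume_fraction = [(dose >= b).mean() for b in bins]

print("max dose (Gy):", dose.max())
print("fraction of voxels receiving >= 50% of max:", (dose >= 0.5 * dose.max()).mean())
print("cumulative DVH samples:", volume_fraction[:3], "...")
```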

  8. Looking for Evidence of the Impact of Introducing a Human Rights-Based Approach in Health: The SaluDerecho Experience.

    PubMed

    Escobar, María-Luisa; Cubillos, Leonardo; Iunes, Roberto

    2015-12-10

    This paper summarizes the background, methodology, results, and lessons learned from SaluDerecho, the Initiative on Priority Setting, Equity and Constitutional Mandates in Health. Originally facilitated by the capacity-building arm of the World Bank in 2010, it was implemented in Latin American countries and later expanded to other regions of the world. Segmentation, decentralization, and lack of coordination in health systems; weak information systems; stratified societies; and hierarchical power relations in participating countries are some of the characteristics that inhibit a human rights-based approach to health. Hence, deliberate interventions like SaluDerecho are vital. Facilitating the participation of multiple stakeholders in a more informed and transparent dialogue creates a "safe" working environment to co-create policy solutions to improve transparency and accountability. The proposed evaluation methodology involves several steps that begin with an assessment of behavioral changes in actors (including policy makers, citizens, payers, and health care providers) that reshape relationships and, over time, change the functioning of health systems. Despite certain limitations, SaluDerecho has provided evidence of positive change among participating countries. Copyright © 2015 Escobar, Cubillos, Iunes. This is an open access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original author and source are credited.

  9. Representing situation awareness in collaborative systems: a case study in the energy distribution domain.

    PubMed

    Salmon, P M; Stanton, N A; Walker, G H; Jenkins, D; Baber, C; McMaster, R

    2008-03-01

    The concept of distributed situation awareness (DSA) is currently receiving increasing attention from the human factors community. This article investigates DSA in a collaborative real-world industrial setting by discussing the results derived from a recent naturalistic study undertaken within the UK energy distribution domain. The results describe the DSA-related information used by the networks of agents involved in the scenarios analysed, the sharing of this information between the agents and the salience of different information elements used. Thus, the structure, quality and content of each network's DSA are discussed, along with the implications for DSA theory. The findings reinforce the notion that when viewing situation awareness (SA) in collaborative systems, it is useful to focus on the coordinated behaviour of the system itself, rather than on the individual as the unit of analysis, and suggest that the findings from such assessments can potentially be used to inform system, procedure and training design. SA is a critical commodity for teams working in industrial systems, and systems, procedures and training programmes should be designed to facilitate efficient system SA acquisition and maintenance. This article presents approaches for describing and understanding SA during real-world collaborative tasks, the outputs from which can potentially be used to inform system, training programme and procedure design.

  10. Roles of Course Facilitators, Learners, and Technology in the Flow of Information of a cMOOC

    ERIC Educational Resources Information Center

    Skrypnyk, Oleksandra; Joksimovic, Srećko; Kovanovic, Vitomir; Gašević, Dragan; Dawson, Shane

    2015-01-01

    Distributed Massive Open Online Courses (MOOCs) are based on the premise that online learning occurs through a network of interconnected learners. The teachers' role in distributed courses extends to forming such a network by facilitating communication that connects learners and their separate personal learning environments scattered around the…

  11. A resilient and secure software platform and architecture for distributed spacecraft

    NASA Astrophysics Data System (ADS)

    Otte, William R.; Dubey, Abhishek; Karsai, Gabor

    2014-06-01

    A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. The openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, and information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objective of this layer.

  12. Packaging Technology for SiC High Temperature Electronics

    NASA Technical Reports Server (NTRS)

    Chen, Liang-Yu; Neudeck, Philip G.; Spry, David J.; Meredith, Roger D.; Nakley, Leah M.; Beheim, Glenn M.; Hunter, Gary W.

    2017-01-01

    High-temperature environment operable sensors and electronics are required for long-term exploration of Venus and distributed control of next generation aeronautical engines. Various silicon carbide (SiC) high temperature sensors, actuators, and electronics have been demonstrated at and above 500 C. A compatible packaging system is essential for long-term testing and application of high temperature electronics and sensors in relevant environments. This talk will discuss a ceramic packaging system developed for high temperature electronics, and related testing results of SiC integrated circuits at 500 C facilitated by this high temperature packaging system, including the most recent progress.

  13. The CSM testbed software system: A development environment for structural analysis methods on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Gillian, Ronnie E.; Lotts, Christine G.

    1988-01-01

    The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.

  14. Workflow based framework for life science informatics.

    PubMed

    Tiwari, Abhishek; Sekhar, Arvind K T

    2007-10-01

    Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and system biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.

  15. A Nonlinear Model for Interactive Data Analysis and Visualization and an Implementation Using Progressive Computation for Massive Remote Climate Data Ensembles

    NASA Astrophysics Data System (ADS)

    Christensen, C.; Liu, S.; Scorzelli, G.; Lee, J. W.; Bremer, P. T.; Summa, B.; Pascucci, V.

    2017-12-01

    The creation, distribution, analysis, and visualization of large spatiotemporal datasets is a growing challenge for the study of climate and weather phenomena in which increasingly massive domains are utilized to resolve finer features, resulting in datasets that are simply too large to be effectively shared. Existing workflows typically consist of pipelines of independent processes that preclude many possible optimizations. As data sizes increase, these pipelines are difficult or impossible to execute interactively and instead simply run as large offline batch processes. Rather than limiting our conceptualization of such systems to pipelines (or dataflows), we propose a new model for interactive data analysis and visualization systems in which we comprehensively consider the processes involved from data inception through analysis and visualization in order to describe systems composed of these processes in a manner that facilitates interactive implementations of the entire system rather than of only a particular component. We demonstrate the application of this new model with the implementation of an interactive system that supports progressive execution of arbitrary user scripts for the analysis and visualization of massive, disparately located climate data ensembles. It is currently in operation as part of the Earth System Grid Federation server running at Lawrence Livermore National Lab, and accessible through both web-based and desktop clients. Our system facilitates interactive analysis and visualization of massive remote datasets up to petabytes in size, such as the 3.5 PB 7km NASA GEOS-5 Nature Run simulation, previously only possible offline or at reduced resolution. To support the community, we have enabled general distribution of our application using public frameworks including Docker and Anaconda.

  16. Fiber in the Local Loop: The Role of Electric Utilities

    NASA Astrophysics Data System (ADS)

    Meehan, Charles M.

    1990-01-01

    Electric utilities are beginning to make heavy use of fiber for a number of applications beyond the transmission of voice and data among operating centers and plant facilities, which employed fiber on the electric transmission systems. These additional uses include load management and automatic meter reading. Thus, utilities are beginning to place fiber on the electric distribution systems, which in many cases cover the same customer base as the "local loop". This shift to fiber on the distribution system is due to the advantages offered by fiber and to congestion in the radio bands used for load management. The shift has been facilitated by a regulatory policy permitting utilities to lease reserve capacity on their fiber systems on an unregulated basis. This, in turn, has interested electric utilities in building fiber to their residential and commercial customers for voice, data and video. This will also provide for sophisticated load management systems and, possibly, generation of revenue.

  17. A Multiagent System for Dynamic Data Aggregation in Medical Research

    PubMed Central

    Urovi, Visara; Barba, Imanol; Aberer, Karl; Schumacher, Michael Ignaz

    2016-01-01

    The collection of medical data for research purposes is a challenging and long-lasting process. In an effort to accelerate and facilitate this process we propose a new framework for dynamic aggregation of medical data from distributed sources. We use agent-based coordination between medical and research institutions. Our system employs principles of peer-to-peer network organization and coordination models to search over already constructed distributed databases and to identify the potential contributors when a new database has to be built. Our framework takes into account both the requirements of a research study and current data availability. This leads to better definition of database characteristics such as schema, content, and privacy parameters. We show that this approach enables a more efficient way to collect data for medical research. PMID:27975063

  18. Derived virtual devices: a secure distributed file system mechanism

    NASA Technical Reports Server (NTRS)

    VanMeter, Rodney; Hotz, Steve; Finn, Gregory

    1996-01-01

    This paper presents the design of derived virtual devices (DVDs). DVDs are the mechanism used by the Netstation Project to provide secure shared access to network-attached peripherals distributed in an untrusted network environment. DVDs improve Input/Output efficiency by allowing user processes to perform I/O operations directly from devices without intermediate transfer through the controlling operating system kernel. The security enforced at the device through the DVD mechanism includes resource boundary checking, user authentication, and restricted operations, e.g., read-only access. To illustrate the application of DVDs, we present the interactions between a network-attached disk and a file system designed to exploit the DVD abstraction. We further discuss third-party transfer as a mechanism intended to provide for efficient data transfer in a typical NAP environment. We show how DVDs facilitate third-party transfer, and provide the security required in a more open network environment.

  19. The Lord of the Rings - Deep Learning Craters on the Moon and Other Bodies

    NASA Astrophysics Data System (ADS)

    Silburt, Ari; Ali-Dib, Mohamad; Zhu, Chenchong; Jackson, Alan; Valencia, Diana; Kissin, Yevgeni; Tamayo, Daniel; Menou, Kristen

    2018-01-01

    Crater detection has traditionally been done via manual inspection of images, leading to statistically significant disagreements between scientists for the Moon's crater distribution. In addition, there are millions of uncategorized craters on the Moon and other Solar System bodies that will never be classified by humans due to the time required to manually detect craters. I will show that a deep learning model trained on the near-side of the Moon can successfully reproduce the crater distribution on the far-side, as well as detect thousands of small, new craters that were previously uncharacterized. In addition, this Moon-trained model can be transferred to accurately classify craters on Mercury. It is therefore likely that this model can be extended to classify craters on all Solar System bodies with Digital Elevation Maps. This will facilitate, for the first time ever, a systematic, accurate, and reproducible study of the crater records throughout the Solar System.

  20. Soil Infiltration Characteristics in Agroforestry Systems and Their Relationships with the Temporal Distribution of Rainfall on the Loess Plateau in China

    PubMed Central

    Wang, Lai; Zhong, Chonggao; Gao, Pengxiang; Xi, Weimin; Zhang, Shuoxin

    2015-01-01

    Many previous studies have shown that land use patterns are the main factors influencing soil infiltration. Thus, increasing soil infiltration and reducing runoff are crucial for soil and water conservation, especially in semi-arid environments. To explore the effects of agroforestry systems on soil infiltration and associated properties in a semi-arid area of the Loess Plateau in China, we compared three plant systems: a walnut (Juglans regia) monoculture system (JRMS), a wheat (Triticum aestivum) monoculture system (TAMS), and a walnut-wheat alley cropping system (JTACS) over a period of 11 years. Our results showed that the JTACS facilitated infiltration, and its infiltration rate temporal distribution was more strongly coupled with the rainfall temporal distribution than in the two monoculture systems during the growing season. However, the effect of JTACS on the infiltration capacity was only significant in the shallow soil layer, i.e., the 0–40 cm soil depth. Within JTACS, the speed of the wetting front’s downward movement was significantly faster than that in the two monoculture systems when the amount of rainfall and its intensity were higher. In the alley cropping system, the soil infiltration rate was improved and the two peaks of the infiltration rate temporal distribution coincided with those of the rainfall temporal distribution in the rainy season, which is of considerable significance for soil and water conservation. The results of this empirical study provide new insights into the sustainability of agroforestry, which may help farmers select rational planting patterns in this region, as well as other regions with similar climatic and environmental characteristics throughout the world. PMID:25893832

  1. Soil Infiltration Characteristics in Agroforestry Systems and Their Relationships with the Temporal Distribution of Rainfall on the Loess Plateau in China.

    PubMed

    Wang, Lai; Zhong, Chonggao; Gao, Pengxiang; Xi, Weimin; Zhang, Shuoxin

    2015-01-01

    Many previous studies have shown that land use patterns are the main factors influencing soil infiltration. Thus, increasing soil infiltration and reducing runoff are crucial for soil and water conservation, especially in semi-arid environments. To explore the effects of agroforestry systems on soil infiltration and associated properties in a semi-arid area of the Loess Plateau in China, we compared three plant systems: a walnut (Juglans regia) monoculture system (JRMS), a wheat (Triticum aestivum) monoculture system (TAMS), and a walnut-wheat alley cropping system (JTACS) over a period of 11 years. Our results showed that the JTACS facilitated infiltration, and its infiltration rate temporal distribution was more strongly coupled with the rainfall temporal distribution than in the two monoculture systems during the growing season. However, the effect of JTACS on the infiltration capacity was only significant in the shallow soil layer, i.e., the 0-40 cm soil depth. Within JTACS, the speed of the wetting front's downward movement was significantly faster than that in the two monoculture systems when the amount of rainfall and its intensity were higher. In the alley cropping system, the soil infiltration rate was improved and the two peaks of the infiltration rate temporal distribution coincided with those of the rainfall temporal distribution in the rainy season, which is of considerable significance for soil and water conservation. The results of this empirical study provide new insights into the sustainability of agroforestry, which may help farmers select rational planting patterns in this region, as well as other regions with similar climatic and environmental characteristics throughout the world.

  2. Can we infer plant facilitation from remote sensing? A test across global drylands

    PubMed Central

    Xu, Chi; Holmgren, Milena; Van Nes, Egbert H.; Maestre, Fernando T.; Soliveres, Santiago; Berdugo, Miguel; Kéfi, Sonia; Marquet, Pablo A.; Abades, Sebastian; Scheffer, Marten

    2016-01-01

    Facilitation is a major force shaping the structure and diversity of plant communities in terrestrial ecosystems. Detecting positive plant-plant interactions relies on the combination of field experimentation and the demonstration of spatial association between neighboring plants. This has often restricted the study of facilitation to particular sites, limiting the development of systematic assessments of facilitation over regional and global scales. Here we explore whether the frequency of plant spatial associations detected from high-resolution remotely-sensed images can be used to infer plant facilitation at the community level in drylands around the globe. We correlated the information from remotely-sensed images freely available through Google Earth™ with detailed field assessments, and used a simple individual-based model to generate patch-size distributions using different assumptions about the type and strength of plant-plant interactions. Most of the patterns found from the remotely-sensed images were more right-skewed than the patterns from the null model simulating a random distribution. This suggests that the plants in the studied drylands show stronger spatial clustering than expected by chance. We found that positive plant co-occurrence, as measured in the field, was significantly related to the skewness of vegetation patch-size distribution measured using Google Earth™ images. Our findings suggest that the relative frequency of facilitation may be inferred from spatial pattern signals measured from remotely-sensed images, since facilitation often determines positive co-occurrence among neighboring plants. They pave the road for a systematic global assessment of the role of facilitation in terrestrial ecosystems. PMID:26552256
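
    The remote-sensing signal used in the paper is the skewness of the vegetation patch-size distribution. Given a binary vegetation mask already classified from an image, a minimal version of that measurement could look like the sketch below (the mask here is synthetic; real masks would come from classified Google Earth imagery).

```python
import numpy as np
from scipy import ndimage
from scipy.stats import skew

rng = np.random.default_rng(1)

# Synthetic binary vegetation mask (True = vegetated pixel); a real mask would be
# derived by classifying a high-resolution satellite image.
mask = rng.random((200, 200)) < 0.35

# Label 4-connected vegetation patches and measure their sizes in pixels.
labels, n_patches = ndimage.label(mask)
patch_sizes = ndimage.sum(mask, labels, index=np.arange(1, n_patches + 1))

# Right-skewness of the patch-size distribution: larger values indicate stronger
# spatial clustering than expected under a random (null) pattern.
print("patches:", n_patches, "patch-size skewness:", skew(patch_sizes))
```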

  3. Decomposed fuzzy systems and their application in direct adaptive fuzzy control.

    PubMed

    Hsueh, Yao-Chu; Su, Shun-Feng; Chen, Ming-Chang

    2014-10-01

    In this paper, a novel fuzzy structure termed the decomposed fuzzy system (DFS) is proposed to act as the fuzzy approximator for adaptive fuzzy control systems. The proposed structure decomposes each fuzzy variable into layers of fuzzy systems, with each layer characterizing one traditional fuzzy set. Similar to forming fuzzy rules in traditional fuzzy systems, layers from different variables form the so-called component fuzzy systems. The DFS provides more adjustable parameters to facilitate possible adaptation in fuzzy rules, but without introducing a learning burden, because the component fuzzy systems are independent, which keeps learning effects minimally distributed among them. Our experiments show that even when the rule number increases, the learning time in terms of cycles remains almost constant. They also show that the function approximation capability and learning efficiency of the DFS are much better than those of traditional fuzzy systems when employed in adaptive fuzzy control systems. In addition, to further reduce the computational burden, a simplified DFS is proposed in this paper to satisfy the real-time constraints required in many applications. Our simulation results show that the simplified DFS performs fairly well with a more concise decomposition structure.

  4. Web-services-based spatial decision support system to facilitate nuclear waste siting

    NASA Astrophysics Data System (ADS)

    Huang, L. Xinglai; Sheng, Grant

    2006-10-01

    The availability of spatial web services enables data sharing among managers, decision and policy makers and other stakeholders in much simpler ways than before and subsequently has created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment to explore the decision problem, understand and refine problem definition, and generate and evaluate multiple alternatives for decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data and may reside in different computers and locations. WSDSS includes six key components, namely: database management system, catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing web services-based spatial decision support system are described.

  5. JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System

    NASA Astrophysics Data System (ADS)

    Soppera, N.; Bossant, M.; Dupont, E.

    2014-06-01

    JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries, and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Used offline, the system also makes use of an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, only requiring a browser. The features added in the latest version of the software and this new web interface are described.

  6. JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soppera, N., E-mail: nicolas.soppera@oecd.org; Bossant, M.; Dupont, E.

    JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries, and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Used offline, the system also makes use of an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, only requiring a browser. The features added in the latest version of the software and this new web interface are described.

  7. Single-user MIMO versus multi-user MIMO in distributed antenna systems with limited feedback

    NASA Astrophysics Data System (ADS)

    Schwarz, Stefan; Heath, Robert W.; Rupp, Markus

    2013-12-01

    This article investigates the performance of cellular networks employing distributed antennas in addition to the central antennas of the base station. Distributed antennas are likely to be implemented using remote radio units, which is enabled by a low latency and high bandwidth dedicated link to the base station. This facilitates coherent transmission from potentially all available antennas at the same time. Such a distributed antenna system (DAS) is an effective way to deal with path loss and large-scale fading in cellular systems. DAS can apply precoding across multiple transmission points to implement single-user MIMO (SU-MIMO) and multi-user MIMO (MU-MIMO) transmission. The throughput performance of various SU-MIMO and MU-MIMO transmission strategies is investigated in this article, employing a Long-Term Evolution (LTE) standard-compliant simulation framework. The previously theoretically established cell-capacity improvement of MU-MIMO in comparison to SU-MIMO in DASs is confirmed under the practical constraints imposed by the LTE standard, even under the assumption of imperfect channel state information (CSI) at the base station. Because practical systems will use quantized feedback, the performance of different CSI feedback algorithms for DASs is investigated. It is shown that significant gains in the CSI quantization accuracy and in the throughput of especially MU-MIMO systems can be achieved with relatively simple quantization codebook constructions that exploit the available temporal correlation and channel gain differences.
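
    Limited feedback means the receiver reports only the index of the codebook entry best aligned with its channel direction. The sketch below uses a random vector quantization codebook rather than the structured LTE codebooks, purely to illustrate how quantization accuracy is measured; the antenna count and feedback-bit budget are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_bits = 4, 6                        # transmit antennas, feedback bits
codebook_size = 2 ** n_bits

# Random unit-norm codebook (RVQ); LTE uses structured codebooks instead.
codebook = rng.standard_normal((codebook_size, n_tx)) + 1j * rng.standard_normal((codebook_size, n_tx))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

def quantize(h):
    """Return the index of the codeword best aligned with the channel direction h."""
    h_dir = h / np.linalg.norm(h)
    alignment = np.abs(codebook @ h_dir.conj()) ** 2
    return int(np.argmax(alignment)), alignment.max()

# Average quantization accuracy over random Rayleigh channels.
acc = []
for _ in range(1000):
    h = (rng.standard_normal(n_tx) + 1j * rng.standard_normal(n_tx)) / np.sqrt(2)
    _, a = quantize(h)
    acc.append(a)
print("mean squared alignment |h^H w|^2 / |h|^2:", np.mean(acc))
```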

  8. ICESat Science Investigator led Processing System (I-SIPS)

    NASA Astrophysics Data System (ADS)

    Bhardwaj, S.; Bay, J.; Brenner, A.; Dimarzio, J.; Hancock, D.; Sherman, M.

    2003-12-01

    The ICESat Science Investigator-led Processing System (I-SIPS) generates the GLAS standard data products. It consists of two main parts: the Scheduling and Data Management System (SDMS) and the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software. The system has been operational since the successful launch of ICESat. It ingests data from the GLAS instrument, generates GLAS data products, and distributes them to the GLAS Science Computing Facility (SCF), the Instrument Support Facility (ISF) and the National Snow and Ice Data Center (NSIDC) ECS DAAC. The SDMS is the planning, scheduling and data management system that runs the GLAS Science Algorithm Software (GSAS). GSAS is based on the Algorithm Theoretical Basis Documents provided by the Science Team and is developed independently of SDMS. The SDMS provides the processing environment to plan jobs based on existing data and to control job flow, data distribution, and archiving. The SDMS design is based on a mission-independent architecture that imposes few constraints on the science code, thereby facilitating I-SIPS integration. I-SIPS currently works in an autonomous manner to ingest GLAS instrument data, distribute these data to the ISF, run the science processing algorithms to produce the GLAS standard products, reprocess data when new versions of science algorithms are released, and distribute the products to the SCF, ISF, and NSIDC. I-SIPS has a proven performance record, delivering data to the SCF within hours of the initial instrument activation. The I-SIPS design philosophy gives this system a high potential for reuse in other science missions.

  9. Quantitative refractive index distribution of single cell by combining phase-shifting interferometry and AFM imaging.

    PubMed

    Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu

    2017-05-31

    Cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscope (AFM) imaging, we constructed a label-free, non-invasive, quantitative system for measuring the refractive index of single cells, in which the accurate phase map of a single cell is retrieved with the PSI technique and the cell morphology with nanoscale resolution is obtained with AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell, respectively. Further, the quantitative change of the refractive index distribution during Daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, from which the content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for the analysis of intracellular biochemical components, which will facilitate its application for revealing cell structure and pathological state from a new perspective.
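    As a rough illustration of how a PSI phase map and an AFM height map combine into a refractive index map, the sketch below applies the standard quantitative-phase relation Δφ = (2π/λ)(n_cell − n_medium)·h. The wavelength, buffer index, and the tiny phase/height arrays are placeholder assumptions, not data from this study.

    ```python
    # Illustrative use of the standard quantitative-phase relation
    #   delta_phi = (2*pi / wavelength) * (n_cell - n_medium) * height,
    # combining a PSI phase map with an AFM height map; all values are made up.
    import numpy as np

    wavelength = 632.8e-9          # laser wavelength in metres (assumed He-Ne)
    n_medium = 1.335               # refractive index of the surrounding buffer (assumed)

    phase = np.array([[2.1, 2.4], [1.9, 2.2]])                # unwrapped phase in radians (PSI)
    height = np.array([[2.0e-6, 2.2e-6], [1.8e-6, 2.1e-6]])   # cell thickness in metres (AFM)

    n_cell = n_medium + phase * wavelength / (2 * np.pi * height)
    print(n_cell)                  # per-pixel refractive index estimate
    ```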

  10. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems.

    PubMed

    Shehzad, Danish; Bozkuş, Zeki

    2016-01-01

    The increasing complexity of neuronal network models has escalated efforts to make the NEURON simulation environment more efficient. Computational neuroscientists divide the model equations into subnets spread amongst multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large share of the overall simulation time for neuronal networks. NEURON uses the Message Passing Interface (MPI) for communication between processors, and the MPI_Allgather collective is invoked for spike exchange after each interval across distributed memory systems. Increasing the number of processors improves concurrency and performance, but it also increases the cost of MPI_Allgather and hence the communication time between processors. This necessitates an improved communication methodology to decrease spike exchange time over distributed memory systems. This work improves the MPI_Allgather step using Remote Memory Access (RMA), replacing two-sided communication with one-sided communication, and a recursive doubling mechanism achieves efficient communication between processors in a precise number of steps. This approach enhances communication concurrency and improves overall runtime, making NEURON more efficient for simulating large neuronal network models.
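    The sketch below shows the recursive-doubling exchange pattern mentioned above using mpi4py point-to-point calls; it assumes a power-of-two number of ranks and a pickled Python payload, and it is not the authors' RMA-based implementation.

    ```python
    # Recursive-doubling allgather pattern, written with mpi4py point-to-point calls.
    # Assumes the number of ranks is a power of two. Run with, e.g.:
    #   mpirun -n 4 python recursive_doubling.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank starts with its own "spike" payload, keyed by rank.
    gathered = {rank: f"spikes from rank {rank}"}

    step = 1
    while step < size:
        partner = rank ^ step                 # exchange with the rank differing in this bit
        received = comm.sendrecv(gathered, dest=partner, source=partner)
        gathered.update(received)             # after log2(size) steps every rank has all data
        step <<= 1

    if rank == 0:
        print(gathered)
    ```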

  11. Optimizing NEURON Simulation Environment Using Remote Memory Access with Recursive Doubling on Distributed Memory Systems

    PubMed Central

    Bozkuş, Zeki

    2016-01-01

    The increasing complexity of neuronal network models has escalated efforts to make the NEURON simulation environment more efficient. Computational neuroscientists divide the model equations into subnets spread amongst multiple processors to achieve better hardware performance. On parallel machines, interprocessor spike exchange consumes a large share of the overall simulation time for neuronal networks. NEURON uses the Message Passing Interface (MPI) for communication between processors, and the MPI_Allgather collective is invoked for spike exchange after each interval across distributed memory systems. Increasing the number of processors improves concurrency and performance, but it also increases the cost of MPI_Allgather and hence the communication time between processors. This necessitates an improved communication methodology to decrease spike exchange time over distributed memory systems. This work improves the MPI_Allgather step using Remote Memory Access (RMA), replacing two-sided communication with one-sided communication, and a recursive doubling mechanism achieves efficient communication between processors in a precise number of steps. This approach enhances communication concurrency and improves overall runtime, making NEURON more efficient for simulating large neuronal network models. PMID:27413363

  12. Extreme event statistics in a drifting Markov chain

    NASA Astrophysics Data System (ADS)

    Kindermann, Farina; Hohmann, Michael; Lausch, Tobias; Mayer, Daniel; Schmidt, Felix; Widera, Artur

    2017-07-01

    We analyze extreme event statistics of experimentally realized Markov chains with various drifts. Our Markov chains are individual trajectories of a single atom diffusing in a one-dimensional periodic potential. Based on more than 500 individual atomic traces, we verify the applicability of the Sparre Andersen theorem to our system despite the presence of a drift. We present a detailed analysis of four different rare-event statistics for our system: the distributions of extreme values, of record values, of extreme value occurrence in the chain, and of the number of records in the chain. We observe that, for our data, the shape of the extreme event distributions is dominated by the underlying exponential distance distribution extracted from the atomic traces. Furthermore, we find that even small drifts influence the statistics of extreme events and record values, which is supported by numerical simulations, and we identify cases in which the drift can be determined without information about the underlying random variable distributions. Our results facilitate the use of extreme event statistics as a signal for small drifts in correlated trajectories.
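    A minimal Monte-Carlo illustration of one of the record statistics discussed above: counting running-maximum records in random walks with a small drift. The step distribution, drift value, and trajectory counts are arbitrary assumptions, not the experimental parameters.

    ```python
    # Count running-maximum records in simulated drifting trajectories.
    import numpy as np

    rng = np.random.default_rng(1)

    def count_records(trajectory):
        """Number of running-maximum records in a 1-D trajectory."""
        running_max = np.maximum.accumulate(trajectory)
        # a new record occurs whenever the running maximum increases
        return 1 + np.count_nonzero(np.diff(running_max) > 0)

    n_traj, n_steps, drift = 500, 200, 0.05
    steps = rng.exponential(scale=1.0, size=(n_traj, n_steps)) \
            * rng.choice([-1, 1], size=(n_traj, n_steps)) + drift
    trajectories = np.cumsum(steps, axis=1)

    records = np.array([count_records(t) for t in trajectories])
    print("mean number of records:", records.mean())
    ```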

  13. Carrier Mediated Distribution System (CAMDIS): a new approach for the measurement of octanol/water distribution coefficients.

    PubMed

    Wagner, Bjoern; Fischer, Holger; Kansy, Manfred; Seelig, Anna; Assmus, Frauke

    2015-02-20

    Here we present a miniaturized assay, referred to as the Carrier-Mediated Distribution System (CAMDIS), for fast and reliable measurement of octanol/water distribution coefficients, log D(oct). By introducing a filter support for octanol, phase separation from water is facilitated and the tendency toward emulsion formation at the interface is reduced. A guideline for the best practice of CAMDIS is given, describing a strategy to manage drug adsorption at the filter-supported octanol/buffer interface. We validated the assay on a set of 52 structurally diverse drugs with known shake-flask log D(oct) values. Excellent agreement with literature data (r(2) = 0.996, standard error of estimate, SEE = 0.111), high reproducibility (standard deviation, SD < 0.1 log D(oct) units), minimal sample consumption (10 μL of 100 μM DMSO stock solution) and a broad analytical range (log D(oct) range = -0.5 to 4.2) make CAMDIS a valuable tool for the high-throughput assessment of log D(oct). Copyright © 2014 Elsevier B.V. All rights reserved.
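    For orientation, the distribution coefficient itself is simply the base-10 logarithm of the equilibrium octanol/buffer concentration ratio; the toy numbers below are illustrative, not CAMDIS measurements.

    ```python
    # Toy calculation of an octanol/water distribution coefficient from measured
    # phase concentrations; the concentrations are illustrative placeholders.
    import math

    def log_d(conc_octanol_uM, conc_buffer_uM):
        """log D(oct) = log10 of the octanol/buffer concentration ratio at equilibrium."""
        return math.log10(conc_octanol_uM / conc_buffer_uM)

    print(log_d(95.0, 5.0))   # ~1.28: most of the compound partitions into octanol
    ```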

  14. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  15. Barriers, Facilitators and Priorities for Implementation of WHO Maternal and Perinatal Health Guidelines in Four Lower-Income Countries: A GREAT Network Research Activity.

    PubMed

    Vogel, Joshua P; Moore, Julia E; Timmings, Caitlyn; Khan, Sobia; Khan, Dina N; Defar, Atkure; Hadush, Azmach; Minwyelet Terefe, Marta; Teshome, Luwam; Ba-Thike, Katherine; Than, Kyu Kyu; Makuwani, Ahmad; Mbaruku, Godfrey; Mrisho, Mwifadhi; Mugerwa, Kidza Yvonne; Puchalski Ritchie, Lisa M; Rashid, Shusmita; Straus, Sharon E; Gülmezoglu, A Metin

    2016-01-01

    Health systems often fail to use evidence in clinical practice. In maternal and perinatal health, the majority of maternal, fetal and newborn mortality is preventable through implementing effective interventions. To meet this challenge, WHO's Department of Reproductive Health and Research partnered with the Knowledge Translation Program at St. Michael's Hospital (SMH), University of Toronto, Canada to establish a collaboration on knowledge translation (KT) in maternal and perinatal health, called the GREAT Network (Guideline-driven, Research priorities, Evidence synthesis, Application of evidence, and Transfer of knowledge). We applied a systematic approach incorporating evidence and theory to identifying barriers and facilitators to implementation of WHO maternal health recommendations in four lower-income countries and to identifying implementation strategies to address these. We conducted a mixed-methods study in Myanmar, Uganda, Tanzania and Ethiopia. In each country, stakeholder surveys, focus group discussions and prioritization exercises were used, involving multiple groups of health system stakeholders (including administrators, policymakers, NGOs, professional associations, frontline healthcare providers and researchers). Despite differences in guideline priorities and contexts, barriers identified across countries were often similar. Health system level factors, including health workforce shortages, and need for strengthened drug and equipment procurement, distribution and management systems, were consistently highlighted as limiting the capacity of providers to deliver high-quality care. Evidence-based health policies to support implementation, and improve the knowledge and skills of healthcare providers were also identified. Stakeholders identified a range of tailored strategies to address local barriers and leverage facilitators. This approach to identifying barriers, facilitators and potential strategies for improving implementation proved feasible in these four lower-income country settings. Further evaluation of the impact of implementing these strategies is needed.

  16. Barriers, Facilitators and Priorities for Implementation of WHO Maternal and Perinatal Health Guidelines in Four Lower-Income Countries: A GREAT Network Research Activity

    PubMed Central

    Vogel, Joshua P.; Moore, Julia E.; Timmings, Caitlyn; Khan, Sobia; Khan, Dina N.; Defar, Atkure; Hadush, Azmach; Minwyelet Terefe, Marta; Teshome, Luwam; Ba-Thike, Katherine; Than, Kyu Kyu; Makuwani, Ahmad; Mbaruku, Godfrey; Mrisho, Mwifadhi; Mugerwa, Kidza Yvonne; Puchalski Ritchie, Lisa M.; Rashid, Shusmita; Straus, Sharon E.; Gülmezoglu, A. Metin

    2016-01-01

    Background Health systems often fail to use evidence in clinical practice. In maternal and perinatal health, the majority of maternal, fetal and newborn mortality is preventable through implementing effective interventions. To meet this challenge, WHO’s Department of Reproductive Health and Research partnered with the Knowledge Translation Program at St. Michael’s Hospital (SMH), University of Toronto, Canada to establish a collaboration on knowledge translation (KT) in maternal and perinatal health, called the GREAT Network (Guideline-driven, Research priorities, Evidence synthesis, Application of evidence, and Transfer of knowledge). We applied a systematic approach incorporating evidence and theory to identifying barriers and facilitators to implementation of WHO maternal health recommendations in four lower-income countries and to identifying implementation strategies to address these. Methods We conducted a mixed-methods study in Myanmar, Uganda, Tanzania and Ethiopia. In each country, stakeholder surveys, focus group discussions and prioritization exercises were used, involving multiple groups of health system stakeholders (including administrators, policymakers, NGOs, professional associations, frontline healthcare providers and researchers). Results Despite differences in guideline priorities and contexts, barriers identified across countries were often similar. Health system level factors, including health workforce shortages, and need for strengthened drug and equipment procurement, distribution and management systems, were consistently highlighted as limiting the capacity of providers to deliver high-quality care. Evidence-based health policies to support implementation, and improve the knowledge and skills of healthcare providers were also identified. Stakeholders identified a range of tailored strategies to address local barriers and leverage facilitators. Conclusion This approach to identifying barriers, facilitators and potential strategies for improving implementation proved feasible in these four lower-income country settings. Further evaluation of the impact of implementing these strategies is needed. PMID:27806041

  17. [The Herceptin® case : A case of falsification of medicinal products to a greater extent].

    PubMed

    Streit, Renz

    2017-11-01

    Falsified medicines are a rising problem for the German drug market. The complex distribution channels across the European market facilitate the introduction of falsified and stolen medicines into the legal supply chain and may pose a risk for patients. The "Herceptin® case" from 2014, involving falsified medicines of Italian origin, demonstrates how complex distribution systems have been misused by criminal organizations to introduce stolen and thus falsified medicines into the market via parallel trade, and which measures the authorities and the parallel traders in the national and European network have taken to ensure patient safety. Falsified medicines will continue to be a problem in the future, so new monitoring systems have to be established and used effectively for prevention. The introduction of the EU-wide serialisation system in February 2019 is therefore intended to identify falsified drugs and to prevent both their further trade and their dispensing to patients. Furthermore, maintaining and intensifying the cooperation between all EU authorities involved remains indispensable to close gateways in the distribution system for falsified medicines and to minimise the risk to the population.

  18. Rnomads: An R Interface with the NOAA Operational Model Archive and Distribution System

    NASA Astrophysics Data System (ADS)

    Bowman, D. C.; Lees, J. M.

    2014-12-01

    The National Oceanic and Atmospheric Administration Operational Model Archive and Distribution System (NOMADS) facilitates rapid delivery of real time and archived environmental data sets from multiple agencies. These data are distributed free to the scientific community, industry, and the public. The rNOMADS package provides an interface between NOMADS and the R programming language. Like R itself, rNOMADS is open source and cross platform. It utilizes server-side functionality on the NOMADS system to subset model outputs for delivery to client R users. There are currently 57 real time and 10 archived models available through rNOMADS. Atmospheric models include the Global Forecast System and North American Mesoscale. Oceanic models include WAVEWATCH III and U. S. Navy Operational Global Ocean Model. rNOMADS has been downloaded 1700 times in the year since it was released. At the time of writing, it is being used for wind and solar power modeling, climate monitoring related to food security concerns, and storm surge/inundation calculations, among others. We introduce this new package and show how it can be used to extract data for infrasonic waveform modeling in the atmosphere.

  19. Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) Technology Infrastructure for a Distributed Data Network

    PubMed Central

    Schilling, Lisa M.; Kwan, Bethany M.; Drolshagen, Charles T.; Hosokawa, Patrick W.; Brandt, Elias; Pace, Wilson D.; Uhrich, Christopher; Kamerick, Michael; Bunting, Aidan; Payne, Philip R.O.; Stephens, William E.; George, Joseph M.; Vance, Mark; Giacomini, Kelli; Braddy, Jason; Green, Mika K.; Kahn, Michael G.

    2013-01-01

    Introduction: Distributed Data Networks (DDNs) offer infrastructure solutions for sharing electronic health data from across disparate data sources to support comparative effectiveness research. Data sharing mechanisms must address technical and governance concerns stemming from network security and data disclosure laws and best practices, such as HIPAA. Methods: The Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) deploys TRIAD grid technology, a common data model, detailed technical documentation, and custom software for data harmonization to facilitate data sharing in collaboration with stakeholders in the care of safety net populations. Data sharing partners host TRIAD grid nodes containing harmonized clinical data within their internal or hosted network environments. Authorized users can use a central web-based query system to request analytic data sets. Discussion: SAFTINet DDN infrastructure achieved a number of data sharing objectives, including scalable and sustainable systems for ensuring harmonized data structures and terminologies and secure distributed queries. Initial implementation challenges were resolved through iterative discussions, development and implementation of technical documentation, governance, and technology solutions. PMID:25848567

  20. Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) Technology Infrastructure for a Distributed Data Network.

    PubMed

    Schilling, Lisa M; Kwan, Bethany M; Drolshagen, Charles T; Hosokawa, Patrick W; Brandt, Elias; Pace, Wilson D; Uhrich, Christopher; Kamerick, Michael; Bunting, Aidan; Payne, Philip R O; Stephens, William E; George, Joseph M; Vance, Mark; Giacomini, Kelli; Braddy, Jason; Green, Mika K; Kahn, Michael G

    2013-01-01

    Distributed Data Networks (DDNs) offer infrastructure solutions for sharing electronic health data from across disparate data sources to support comparative effectiveness research. Data sharing mechanisms must address technical and governance concerns stemming from network security and data disclosure laws and best practices, such as HIPAA. The Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) deploys TRIAD grid technology, a common data model, detailed technical documentation, and custom software for data harmonization to facilitate data sharing in collaboration with stakeholders in the care of safety net populations. Data sharing partners host TRIAD grid nodes containing harmonized clinical data within their internal or hosted network environments. Authorized users can use a central web-based query system to request analytic data sets. SAFTINet DDN infrastructure achieved a number of data sharing objectives, including scalable and sustainable systems for ensuring harmonized data structures and terminologies and secure distributed queries. Initial implementation challenges were resolved through iterative discussions, development and implementation of technical documentation, governance, and technology solutions.

  1. Flight deck benefits of integrated data link communication

    NASA Technical Reports Server (NTRS)

    Waller, Marvin C.

    1992-01-01

    A fixed-base, piloted simulation study was conducted to determine the operational benefits that result when air traffic control (ATC) instructions are transmitted to the deck of a transport aircraft over a digital data link. The ATC instructions include altitude, airspeed, heading, radio frequency, and route assignment data. The interface between the flight deck and the data link was integrated with other subsystems of the airplane to facilitate data management. Data from the ATC instructions were distributed to the flight guidance and control system, the navigation system, and an automatically tuned communication radio. The co-pilot initiated the automation-assisted data distribution process. Digital communications and automated data distribution were compared with conventional voice radio communication and manual input of data into other subsystems of the simulated aircraft. Less time was required in the combined communication and data management process when data link ATC communication was integrated with the other subsystems. The test subjects, commercial airline pilots, provided favorable evaluations of both the digital communication and data management processes.

  2. Variation in chromosome number and breeding systems: implications for diversification in Pachycereus pringlei (Cactaceae).

    PubMed

    Gutiérrez-Flores, Carina; la Luz, José L León-de; León, Francisco J García-De; Cota-Sánchez, J Hugo

    2018-01-01

    Polyploidy, the possession of more than two sets of chromosomes, is a major biological process affecting plant evolution and diversification. In the Cactaceae, genome doubling has also been associated with reproductive isolation, changes in breeding systems, colonization ability, and speciation. Pachycereus pringlei (S. Watson, 1885) Britton & Rose, 1909, is a columnar cactus that has long drawn the attention of ecologists, geneticists, and systematists due to its wide distribution range and remarkable assortment of breeding systems in the Mexican Sonoran Desert and the Baja California Peninsula (BCP). However, several important evolutionary questions, such as the distribution of chromosome numbers and whether the diploid condition is dominant over a potential polyploid condition driving the evolution and diversity in floral morphology and breeding systems in this cactus, are still unclear. In this study, we determined chromosome numbers in 11 localities encompassing virtually the entire geographic range of distribution of P. pringlei . Our data revealed the first diploid (2n = 22) count in this species restricted to the hermaphroditic populations of Catalana (ICA) and Cerralvo (ICE) Islands, whereas the tetraploid (2n = 44) condition is consistently distributed throughout the BCP and mainland Sonora populations distinguished by a non-hermaphroditic breeding system. These results validate a wider distribution of polyploid relative to diploid individuals and a shift in breeding systems coupled with polyploidisation. Considering that the diploid base number and hermaphroditism are the proposed ancestral conditions in Cactaceae, we suggest that ICE and ICA populations represent the relicts of a southern diploid ancestor from which both polyploidy and unisexuality evolved in mainland BCP, facilitating the northward expansion of this species. This cytogeographic distribution in conjunction with differences in floral attributes suggests the distinction of the diploid populations as a new taxonomic entity. We suggest that chromosome doubling in conjunction with allopatric distribution, differences in neutral genetic variation, floral traits, and breeding systems has driven the reproductive isolation, evolution, and diversification of this columnar cactus.

  3. Variation in chromosome number and breeding systems: implications for diversification in Pachycereus pringlei (Cactaceae)

    PubMed Central

    Gutiérrez-Flores, Carina; la Luz, José L. León-de; León, Francisco J. García-De; Cota-Sánchez, J. Hugo

    2018-01-01

    Abstract Polyploidy, the possession of more than two sets of chromosomes, is a major biological process affecting plant evolution and diversification. In the Cactaceae, genome doubling has also been associated with reproductive isolation, changes in breeding systems, colonization ability, and speciation. Pachycereus pringlei (S. Watson, 1885) Britton & Rose, 1909, is a columnar cactus that has long drawn the attention of ecologists, geneticists, and systematists due to its wide distribution range and remarkable assortment of breeding systems in the Mexican Sonoran Desert and the Baja California Peninsula (BCP). However, several important evolutionary questions, such as the distribution of chromosome numbers and whether the diploid condition is dominant over a potential polyploid condition driving the evolution and diversity in floral morphology and breeding systems in this cactus, are still unclear. In this study, we determined chromosome numbers in 11 localities encompassing virtually the entire geographic range of distribution of P. pringlei. Our data revealed the first diploid (2n = 22) count in this species restricted to the hermaphroditic populations of Catalana (ICA) and Cerralvo (ICE) Islands, whereas the tetraploid (2n = 44) condition is consistently distributed throughout the BCP and mainland Sonora populations distinguished by a non-hermaphroditic breeding system. These results validate a wider distribution of polyploid relative to diploid individuals and a shift in breeding systems coupled with polyploidisation. Considering that the diploid base number and hermaphroditism are the proposed ancestral conditions in Cactaceae, we suggest that ICE and ICA populations represent the relicts of a southern diploid ancestor from which both polyploidy and unisexuality evolved in mainland BCP, facilitating the northward expansion of this species. This cytogeographic distribution in conjunction with differences in floral attributes suggests the distinction of the diploid populations as a new taxonomic entity. We suggest that chromosome doubling in conjunction with allopatric distribution, differences in neutral genetic variation, floral traits, and breeding systems has driven the reproductive isolation, evolution, and diversification of this columnar cactus. PMID:29675137

  4. Numerical Propulsion System Simulation Architecture

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia G.

    2004-01-01

    The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.

  5. CMS distributed data analysis with CRAB3

    NASA Astrophysics Data System (ADS)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.
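    The client/primary-server interaction described above is REST-based; the sketch below shows what a minimal task submission of that general shape could look like in Python. The endpoint URL and payload fields are hypothetical placeholders, not the actual CRAB3 REST API.

    ```python
    # Hypothetical REST task submission of the client/server kind described above;
    # the server URL and payload fields are made-up placeholders, not the CRAB3 API.
    import json
    from urllib.request import Request, urlopen

    SERVER = "https://crab-server.example.org/api/task"     # placeholder URL

    task = {"dataset": "/SomePrimary/SomeProcessed/MINIAOD",   # hypothetical fields
            "splitting": "FileBased",
            "units_per_job": 10}
    request = Request(SERVER,
                      data=json.dumps(task).encode(),
                      headers={"Content-Type": "application/json"},
                      method="POST")

    with urlopen(request) as response:
        print(json.load(response))      # e.g. a task identifier returned by the server
    ```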

  6. The New England School Effectiveness Project: A Facilitator's Sourcebook.

    ERIC Educational Resources Information Center

    Northeast Regional Exchange, Inc., Chelmsford, MA.

    The School Team Facilitator assists participating New England secondary schools in planning and implementing improvement efforts based on school effectiveness research. This publication, distributed at a team training conference, begins with the conference schedule, a list of facilitators, instructions on choosing a school team, and letters to…

  7. Drinking water for dairy cattle: always a benefit or a microbiological risk?

    PubMed

    Van Eenige, M J E M; Counotte, G H M; Noordhuizen, J P T M

    2013-02-01

    Drinking water can be considered an essential nutrient for dairy cattle. However, because it comes from different sources, its chemical and microbiological quality does not always reach accepted standards. Moreover, water quality is not routinely assessed on dairy farms. The microecology of drinking water sources and distribution systems is rather complex and still not fully understood. Water quality is adversely affected by the formation of biofilms in distribution systems, which form a persistent reservoir for potentially pathogenic bacteria. Saprophytic microorganisms associated with such biofilms interact with organic and inorganic matter in water, with pathogens, and even with each other. In addition, the presence of biofilms in water distribution systems makes cleaning and disinfection difficult and sometimes impossible. This article describes the complex dynamics of microorganisms in water distribution systems. Water quality is diminished primarily as a result of faecal contamination and rarely as a result of putrefaction in water distribution systems. The design of such systems (with/without anti-backflow valves and pressure) and the materials used (polyethylene enhances biofilm; stainless steel does not) affect the quality of water they provide. The best option is an open, funnel-shaped galvanized drinking trough, possibly with a pressure system, air inlet, and anti-backflow valves. A poor microbiological quality of drinking water may adversely affect feed intake, and herd health and productivity. In turn, public health may be affected because cattle can become a reservoir of microorganisms hazardous to humans, such as some strains of E. coli, Yersinia enterocolitica, and Campylobacter jejuni. A better understanding of the biological processes in water sources and distribution systems and of the viability of microorganisms in these systems may contribute to better advice on herd health and productivity at a farm level. Certain on-farm risk factors for water quality have been identified. A practical approach will facilitate the control and management of these risks, and thereby improve herd health and productivity.

  8. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  9. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
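    As a toy analogue of the scheduler/worker split described above, the sketch below dispatches prioritized analysis tasks through a three-stage pipeline; the stage functions and task names are made-up stand-ins for the real acquisition, processing, and indexing modules.

    ```python
    # Toy priority-based task dispatch, not the deployed distributed system.
    import queue
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Task:
        priority: int
        name: str = field(compare=False)

    def acquire(name):  print(f"acquiring {name}")
    def analyze(name):  print(f"analyzing {name}")
    def index(name):    print(f"indexing {name}")

    PIPELINE = [acquire, analyze, index]      # media acquisition -> content processing -> indexing

    tasks = queue.PriorityQueue()
    tasks.put(Task(2, "archived-lecture.mp4"))
    tasks.put(Task(1, "live-news-feed"))      # lower number = higher priority

    while not tasks.empty():
        task = tasks.get()
        for stage in PIPELINE:
            stage(task.name)
    ```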

  10. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1992-01-01

    One of the biggest problems facing NASA today is to provide scientists efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed data sets and directories. VIEWCACHE allows database browsing and searching, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers pointing to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image data sets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate distributed database search.

  11. An integrated chronostratigraphic data system for the twenty-first century

    USGS Publications Warehouse

    Sikora, P.J.; Ogg, James G.; Gary, A.; Cervato, C.; Gradstein, Felix; Huber, B.T.; Marshall, C.; Stein, J.A.; Wardlaw, B.

    2006-01-01

    Research in stratigraphy is increasingly multidisciplinary and conducted by diverse research teams whose members can be widely separated. This developing distributed-research process, facilitated by the availability of the Internet, promises tremendous future benefits to researchers. However, its full potential is hindered by the absence of a development strategy for the necessary infrastructure. At a National Science Foundation workshop convened in November 2001, thirty quantitative stratigraphers and database specialists from both academia and industry met to discuss how best to integrate their respective chronostratigraphic databases. The main goal was to develop a strategy that would allow efficient distribution and integration of existing data relevant to the study of geologic time. Discussions concentrated on three major themes: database standards and compatibility, strategies and tools for information retrieval and analysis of all types of global and regional stratigraphic data, and future directions for database integration and centralization of currently distributed depositories. The result was a recommendation to establish an integrated chronostratigraphic database, to be called Chronos, which would facilitate greater efficiency in stratigraphic studies (http://www.chronos.org/). The Chronos system will both provide greater ease of data gathering and allow for multidisciplinary synergies, functions of fundamental importance in a variety of research, including time scale construction, paleoenvironmental analysis, paleoclimatology and paleoceanography. Beyond scientific research, Chronos will also provide educational and societal benefits by providing an accessible source of information of general interest (e.g., mass extinctions) and concern (e.g., climatic change). The National Science Foundation has currently funded a three-year program for implementing Chronos. © 2006 Geological Society of America. All rights reserved.

  12. Laminar distribution of cholinergic- and serotonergic-dependent plasticity within kitten visual cortex.

    PubMed

    Kojic, L; Gu, Q; Douglas, R M; Cynader, M S

    2001-02-28

    Both cholinergic and serotonergic modulatory projections to mammalian striate cortex have been demonstrated to be involved in the regulation of postnatal plasticity, and a striking alteration in the number and intracortical distribution of cholinergic and serotonergic receptors takes place during the critical period for cortical plasticity. As well, agonists of cholinergic and serotonergic receptors have been demonstrated to facilitate induction of long-term synaptic plasticity in visual cortical slices, supporting their involvement in the control of activity-dependent plasticity. We recorded field potentials from layers 4 and 2/3 in visual cortex slices of 60-80 day old kittens after white matter stimulation, before and after a period of high frequency stimulation (HFS), in the absence or presence of either cholinergic or serotonergic agonists. At these ages, the HFS protocol alone almost never induced long-term changes of synaptic plasticity in either layers 2/3 or 4. In layer 2/3, agonist stimulation of m1 receptors facilitated induction of long-term potentiation (LTP) with HFS stimulation, while the activation of serotonergic receptors had only a modest effect. By contrast, a strong serotonin-dependent LTP facilitation and insignificant muscarinic effects were observed after HFS within layer 4. The results show that receptor-dependent laminar stratification of synaptic modifiability occurs in the cortex at these ages. This plasticity may underlie a control system gating the experience-dependent changes of synaptic organization within developing visual cortex.

  13. Development of the PHAST model: generating standard public health services data and evidence for decision-making.

    PubMed

    Bekemeier, Betty; Park, Seungeun

    2018-04-01

    Standardized data regarding the distribution, quality, reach, and variation in public health services provided at the community level and in wide use across states and communities do not exist. This leaves a major gap in our nation's understanding of the value of prevention activities and, in particular, the contributions of our government public health agencies charged with assuring community health promotion and protection. Public health and community leaders, therefore, are eager for accessible and comparable data regarding preventive services that can inform policy decisions about where to invest resources. We used literature review and a practice-based approach, employing an iterative process to identify factors that facilitate data provision among public health practitioners. This paper describes the model, systematically developed by our research team and with input from practice partners, that guides our process toward maximizing the uptake and integration of these standardized measures into state and local data collection systems. The model we developed, using a dissemination and implementation science framework, is intended to foster greater interest in and accountability for data collection around local health department services and to facilitate spatial exploration and statistical analysis of local health department service distribution, change, and performance. Our model is the first of its kind to thoroughly develop a means to guide research and practice in realizing the National Academy of Medicine's recommendation for developing systems to measure and track state and local public health system contributions to population health.

  14. Effects of reduced nitrogen inputs on crop yield and nitrogen use efficiency in a long-term maize-soybean relay strip intercropping system.

    PubMed

    Chen, Ping; Du, Qing; Liu, Xiaoming; Zhou, Li; Hussain, Sajad; Lei, Lu; Song, Chun; Wang, Xiaochun; Liu, Weiguo; Yang, Feng; Shu, Kai; Liu, Jiang; Du, Junbo; Yang, Wenyu; Yong, Taiwen

    2017-01-01

    The blind pursuit of high yields via increased fertilizer inputs increases the environmental costs. Relay intercropping has advantages for yield, but a strategy for N management is urgently required to decrease N inputs without yield loss in maize-soybean relay intercropping systems (IMS). Experiments were conducted with three levels of N and three planting patterns, and dry matter accumulation, nitrogen uptake, nitrogen use efficiency (NUE), competition ratio (CR), system productivity index (SPI), land equivalent ratio (LER), and crop root distribution were investigated. Our results showed that the CR of soybean was greater than 1, and that the change in root distribution in space and time resulted in an interspecific facilitation in IMS. The maximum yield of maize under monoculture maize (MM) occurred with conventional nitrogen (CN), whereas under IMS, the maximum yield occurred with reduced nitrogen (RN). The yield of monoculture soybean (MS) and of soybean in IMS both reached a maximum under RN. The LER of IMS varied from 1.85 to 2.36, and the SPI peaked under RN. Additionally, the NUE of IMS increased by 103.7% under RN compared with that under CN. In conclusion, the separation of the root ecological niche contributed to a positive interspecific facilitation, which increased the land productivity. Thus, maize-soybean relay intercropping with reduced N input provides a very useful approach to increase land productivity and avert environmental pollution.
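    For reference, the land equivalent ratio reported above is conventionally computed as the sum of the intercrop-to-monoculture yield ratios of the component crops; the yields in the sketch below are placeholders, not data from this experiment.

    ```python
    # Standard land equivalent ratio (LER) calculation for a maize-soybean intercrop;
    # the yield values are illustrative placeholders.
    def land_equivalent_ratio(maize_inter, maize_mono, soy_inter, soy_mono):
        """LER = Y_maize,intercrop / Y_maize,mono + Y_soy,intercrop / Y_soy,mono."""
        return maize_inter / maize_mono + soy_inter / soy_mono

    ler = land_equivalent_ratio(maize_inter=7.5, maize_mono=8.0, soy_inter=1.8, soy_mono=2.0)
    print(f"LER = {ler:.2f}")  # > 1 indicates a land-use advantage of intercropping
    ```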

  15. Moving Uphill: Microbial Facilitation at the Leading Edge of Plant Species Distributional Shifts

    NASA Astrophysics Data System (ADS)

    Suding, K.; Farrer, E.; Spasojevic, M.; Porazinska, D.; Bueno de Mesquita, C.; Schmidt, S. K.

    2016-12-01

    Climate change is expected to influence species distributions and reshuffle patterns of biodiversity. A key challenge to our understanding of these effects is that biotic interactions - new species to compete with, new stressors that increase dependence on facilitation, new prey or predators - will likely affect the ability of species to track climate at the leading edges of their distributional range. While it is well established that soil biota strongly influence plant abundance and diversity, it has been difficult to quantify the key belowground dynamics. This presentation will investigate the influence of one key biotic interaction, between plants and soil microbiota, on the ability of plant species to track climate change and expand their range uphill in a high montane system in the Front Range of Colorado. High-resolution photography from 1972 and 2008 indicate colonization of tundra vegetation in formerly unvegetated areas. Observational work on the distributions patterns of both plants and soil microbiota (bacteria, fungi and nematodes) in a spatially-explicit grid at the upper edge of plant distributions indicate strong, mostly positive, associations between plant species and soil taxa. Abiotic factors, while important, consistently underpredicted the occurrence of plant species and, in nine of the 12 most common tundra plants, co-occurring microbial taxa were important predictors of plant occurrence. Comparison of plant and microbial distributional patterns in 2007 and 2015 indicate the influence of microbial community composition on assembly and beta-diversity of the plant community over time. Plant colonization patterns in this region previously devoid of vegetation will likely influence carbon, nitrogen and phosphorus dynamics, with downstream consequences on nutrient limitation and phytoplankton composition in alpine lakes.

  16. Matching of electron beams for conformal therapy of target volumes at moderate depths.

    PubMed

    Zackrisson, B; Karlsson, M

    1996-06-01

    The basic requirements for conformal electron therapy are an accelerator with a wide range of energies and field shapes. The beams should be well characterised in a full 3-D dose planning system which has been verified for the geometries of the current application. Differences in the basic design of treatment units have been shown to have a large influence on beam quality and dosimetry. Modern equipment can deliver electron beams of good quality with a high degree of accuracy. A race-track microtron with minimised electron scattering and a multi-leaf collimator (MLC) for electron collimating will facilitate the isocentric technique as a general treatment technique for electrons. This will improve the possibility of performing combined electron field techniques in order to conform the dose distribution with no or minimal use of a bolus. Furthermore, the isocentric technique will facilitate multiple field arrangements that decrease the problems with distortion of the dose distribution due to inhomogeneities, etc. These situations are demonstrated by clinical examples where isocentric, matched electron fields for treatment of the nose, thyroid and thoracic wall have been used.

  17. Experimental OAI-Based Digital Library Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L. (Editor); Maly, Kurt (Editor); Zubair, Mohammad (Editor); Rusch-Feja, Diann (Editor)

    2002-01-01

    The objective of the Open Archives Initiative (OAI) is to develop a simple, lightweight framework to facilitate the discovery of content in distributed archives (http://www.openarchives.org). The focus of the workshop held at the 5th European Conference on Research and Advanced Technology for Digital Libraries (ECDL 2001) was to bring together researchers in the area of digital libraries who are building OAI-based systems, so that they could share their experiences, the problems they are facing, and the approaches they are taking to address them. The workshop consisted of invited talks from well-established researchers working on building OAI-based digital library systems, along with short paper presentations.
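    The OAI framework referenced above is built around simple HTTP requests such as ListRecords; the sketch below issues one such harvesting request and prints record identifiers. The repository base URL is a placeholder, not one of the workshop systems.

    ```python
    # Minimal OAI-PMH harvesting request; the repository URL is a placeholder.
    from urllib.parse import urlencode
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    BASE_URL = "https://example.org/oai"          # hypothetical OAI-PMH endpoint

    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    with urlopen(f"{BASE_URL}?{urlencode(params)}") as response:
        tree = ET.parse(response)

    ns = {"oai": "http://www.openarchives.org/OAI/2.0/"}
    for record in tree.findall(".//oai:record", ns):
        identifier = record.find(".//oai:identifier", ns)
        print(identifier.text if identifier is not None else "<no identifier>")
    ```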

  18. Important drug-nutrient interactions in the elderly.

    PubMed

    Thomas, J A; Burns, R A

    1998-09-01

    Several drug-nutrient interactions can occur, but their prevalence may be accentuated in the elderly. Geriatric patients may experience age-related changes in the pharmacokinetics of a drug: absorption, distribution, metabolism, and excretion. When drug-nutrient interactions occur, they usually affect absorptive processes more frequently. Specific transporter systems facilitate the absorption of many drugs. Little is known about how these transporter systems are affected by aging. Co-existing disease states in the elderly may exaggerate the action of a drug and represent a confounding factor in drug-nutrient interactions. While several different drug-nutrient interactions are important in the elderly, those affecting the cardiovascular system warrant special attention.

  19. The Limited Use of Non-Physician Providers: is More Research the Cure.

    DTIC Science & Technology

    1977-12-01

    American society during the 1960's. Activism, the questioning of the distribution of power and resources, and the re-examination of society's...going to health care and the fact that many low income and minority groups found health beyond their purchasing power, and that the cost of educating a..."facilitation" solutions are insufficient where the presently structured system is characterized by an inadequate supply of facilities and personnel

  20. Camouflage Traffic: Minimizing Message Delay for Smart Grid Applications Under Jamming

    DTIC Science & Technology

    2015-01-16

    ...technologies. To facilitate efficient information exchange, wireless networks have been proposed to be widely used in the smart grid. However, the jamming...

  1. Gravitational field calculations on a dynamic lattice by distributed computing.

    NASA Astrophysics Data System (ADS)

    Mähönen, P.; Punkka, V.

    A new method of calculating numerically time evolution of a gravitational field in general relativity is introduced. Vierbein (tetrad) formalism, dynamic lattice and massively parallelized computation are suggested as they are expected to speed up the calculations considerably and facilitate the solution of problems previously considered too hard to be solved, such as the time evolution of a system consisting of two or more black holes or the structure of worm holes.

  2. Gravitation Field Calculations on a Dynamic Lattice by Distributed Computing

    NASA Astrophysics Data System (ADS)

    Mähönen, Petri; Punkka, Veikko

    A new method of calculating numerically time evolution of a gravitational field in General Relativity is introduced. Vierbein (tetrad) formalism, dynamic lattice and massively parallelized computation are suggested as they are expected to speed up the calculations considerably and facilitate the solution of problems previously considered too hard to be solved, such as the time evolution of a system consisting of two or more black holes or the structure of worm holes.

  3. Collaborative Information Agents on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Mathe, Nathalie; Wolfe, Shawn; Koga, Dennis J. (Technical Monitor)

    1998-01-01

    In this paper, we present DIAMS, a system of distributed, collaborative information agents which help users access, collect, organize, and exchange information on the World Wide Web. Personal agents provide their owners dynamic displays of well organized information collections, as well as friendly information management utilities. Personal agents exchange information with one another. They also work with other types of information agents such as matchmakers and knowledge experts to facilitate collaboration and communication.

  4. Scleroderma Mimickers

    PubMed Central

    Morgan, Nadia D.; Hummers, Laura K.

    2017-01-01

    Opinion statement Cutaneous fibrosing disorders encompass a diverse array of diseases united by the presence of varying degrees of dermal sclerosis. The quality and distribution of skin involvement, presence or absence of systemic complications and unique associated laboratory abnormalities often help to distinguish between these diseases. It is imperative that an effort is made to accurately differentiate between scleroderma and its mimics, in order to guide long-term management and facilitate implementation of the appropriate treatment modality where indicated. PMID:28473954

  5. Dynamic Power Distribution System Management With a Locally Connected Communication Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Zhang, Kaiqing; Basar, Tamer

    Coordinated optimization and control of distribution-level assets can enable a reliable and optimal integration of massive amounts of distributed energy resources (DERs) and facilitate distribution system management (DSM). Accordingly, the objective is to coordinate the power injection at the DERs to maintain certain quantities across the network, e.g., voltage magnitude, line flows, or line losses, close to a desired profile. By and large, the performance of DSM algorithms has been challenged by two factors: i) the possibly non-strongly connected communication network over DERs that hinders the coordination; ii) the dynamics of the real system caused by the DERs with heterogeneous capabilities, time-varying operating conditions, and real-time measurement mismatches. In this paper, we investigate the modeling and the algorithm design and analysis with consideration of these two factors. In particular, a game theoretic characterization is first proposed to account for a locally connected communication network over DERs, along with the analysis of the existence and uniqueness of the Nash equilibrium (NE) therein. To achieve the equilibrium in a distributed fashion, a projected-gradient-based asynchronous DSM algorithm is then advocated. The algorithm performance, including the convergence speed and the tracking error, is analytically guaranteed under the dynamic setting. Extensive numerical tests on both synthetic and realistic cases corroborate the analytical results derived.
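    To make the projected-gradient idea concrete, the sketch below runs a generic projected-gradient loop in which DER injections track a desired aggregate profile under box constraints; the linear sensitivity model, limits, and step size are simplified assumptions and do not reproduce the paper's game-theoretic formulation.

    ```python
    # Generic projected-gradient sketch of DER coordination: a gradient step on a
    # quadratic tracking objective followed by projection onto injection limits.
    import numpy as np

    rng = np.random.default_rng(2)

    n_der = 5
    sensitivity = rng.uniform(0.5, 1.5, n_der)            # assumed injection-to-output sensitivities
    p_min, p_max = np.zeros(n_der), np.full(n_der, 2.0)   # per-DER injection limits (kW, assumed)
    target = 6.0                                           # desired aggregate contribution (assumed)

    p = np.zeros(n_der)                                    # current injections
    step = 0.05
    for _ in range(200):
        error = sensitivity @ p - target                   # deviation from the desired profile
        grad = sensitivity * error                         # gradient of 0.5 * error**2 w.r.t. p
        p = np.clip(p - step * grad, p_min, p_max)         # gradient step, then projection

    print("injections:", np.round(p, 3), "aggregate:", round(float(sensitivity @ p), 3))
    ```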

  6. Spatial analysis and characteristics of pig farming in Thailand.

    PubMed

    Thanapongtharm, Weerapong; Linard, Catherine; Chinson, Pornpiroon; Kasemsuwan, Suwicha; Visser, Marjolein; Gaughan, Andrea E; Epprech, Michael; Robinson, Timothy P; Gilbert, Marius

    2016-10-06

    In Thailand, pig production intensified significantly during the last decade, with many economic, epidemiological and environmental implications. Strategies toward more sustainable future developments are currently investigated, and these could be informed by a detailed assessment of the main trends in the pig sector, and on how different production systems are geographically distributed. This study had two main objectives. First, we aimed to describe the main trends and geographic patterns of pig production systems in Thailand in terms of pig type (native, breeding, and fattening pigs), farm scales (smallholder and large-scale farming systems) and type of farming systems (farrow-to-finish, nursery, and finishing systems) based on a very detailed 2010 census. Second, we aimed to study the statistical spatial association between these different types of pig farming distribution and a set of spatial variables describing access to feed and markets. Over the last decades, pig population gradually increased, with a continuously increasing number of pigs per holder, suggesting a continuing intensification of the sector. The different pig-production systems showed very contrasted geographical distributions. The spatial distribution of large-scale pig farms corresponds with that of commercial pig breeds, and spatial analysis conducted using Random Forest distribution models indicated that these were concentrated in lowland urban or peri-urban areas, close to means of transportation, facilitating supply to major markets such as provincial capitals and the Bangkok Metropolitan region. Conversely the smallholders were distributed throughout the country, with higher densities located in highland, remote, and rural areas, where they supply local rural markets. A limitation of the study was that pig farming systems were defined from the number of animals per farm, resulting in their possible misclassification, but this should have a limited impact on the main patterns revealed by the analysis. The very contrasted distribution of different pig production systems present opportunities for future regionalization of pig production. More specifically, the detailed geographical analysis of the different production systems will be used to spatially-inform planning decisions for pig farming accounting for the specific health, environment and economical implications of the different pig production systems.

  7. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of distances of neighboring cells and the thermal wave propagation rate, for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. In the absence of an analytical solution, and because the differences in ignition times of adjacent reaction cells follow non-Markovian statistics, the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulation. However, such Monte Carlo simulations require repeated calculations over different realizations of the distribution of adjacent cells. For several one-dimensional systems, differing in the value of the shaping parameter of the gamma distribution, we show that the average reaction front propagation rates obtained from a discrete probability between two limits agree closely with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. This approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method eliminates the need for laborious, time-intensive numerical calculations: the thermal wave propagation rate can now be calculated from a single macroscopic quantity, the discrete probability.
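
    As a minimal illustration of the quantity used above, the sketch below evaluates the discrete probability that a gamma-distributed neighbor distance falls between two limits; the shape, scale, and lower limit are illustrative (the abstract fixes only the upper limit at 1.3).

```python
from scipy.stats import gamma

# Gamma-distributed distances between neighboring reaction cells
# (shape k and scale theta are illustrative; the study varies the shaping parameter).
k, theta = 2.0, 0.5
lower, upper = 0.6, 1.3   # upper limit 1.3 per the abstract; lower depends on ignition temperature

# Discrete probability that a neighboring cell lies between the two limits
p_between = gamma.cdf(upper, k, scale=theta) - gamma.cdf(lower, k, scale=theta)
print(p_between)
```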

  8. Cardiological database management system as a mediator to clinical decision support.

    PubMed

    Pappas, C; Mavromatis, A; Maglaveras, N; Tsikotis, A; Pangalos, G; Ambrosiadou, V

    1996-03-01

    An object-oriented medical database management system is presented for a typical cardiologic center, facilitating epidemiological trials. Object-oriented analysis and design were used for the system design, offering advantages for the integrity and extendibility of medical information systems. The system was developed using object-oriented design and programming methodology, the C++ language and the Borland Paradox Relational Data Base Management System in an MS-Windows NT environment. Particular attention was paid to system compatibility, portability, ease of use, and the design of the patient record so as to support the decisions of medical personnel in cardiovascular centers. The system was designed to accept complex, heterogeneous, distributed data in various formats and from different kinds of examinations such as Holter, Doppler and electrocardiography.

  9. Understanding the organization of public health delivery systems: an empirical typology.

    PubMed

    Mays, Glen P; Scutchfield, F Douglas; Bhandari, Michelyn W; Smith, Sharla A

    2010-03-01

    Policy discussions about improving the U.S. health care system increasingly recognize the need to strengthen its capacities for delivering public health services. A better understanding of how public health delivery systems are organized across the United States is critical to improvement. To facilitate the development of such evidence, this article presents an empirical method of classifying and comparing public health delivery systems based on key elements of their organizational structure. This analysis uses data collected through a national longitudinal survey of local public health agencies serving communities with at least 100,000 residents. The survey measured the availability of twenty core public health activities in local communities and the types of organizations contributing to each activity. Cluster analysis differentiated local delivery systems based on the scope of activities delivered, the range of organizations contributing, and the distribution of effort within the system. Public health delivery systems varied widely in organizational structure, but the observed patterns of variation suggested that systems adhere to one of seven distinct configurations. Systems frequently migrated from one configuration to another over time, with an overall trend toward offering a broader scope of services and engaging a wider range of organizations. Public health delivery systems exhibit important structural differences that may influence their operations and outcomes. The typology developed through this analysis can facilitate comparative studies to identify which delivery system configurations perform best in which contexts.

  10. Resource acquisition, distribution and end-use efficiencies and the growth of industrial society

    NASA Astrophysics Data System (ADS)

    Jarvis, A.; Jarvis, S.; Hewitt, N.

    2015-01-01

    A key feature of the growth of industrial society is the acquisition of increasing quantities of resources from the environment and their distribution for end use. With respect to energy, growth has been near exponential for the last 160 years. We attempt to show that the global distribution of resources that underpins this growth may be facilitated by the continual development and expansion of near optimal directed networks. If so, the distribution efficiencies of these networks must decline as they expand due to path lengths becoming longer and more tortuous. To maintain long-term exponential growth the physical limits placed on the distribution networks appear to be counteracted by innovations deployed elsewhere in the system: namely at the points of acquisition and end use. We postulate that the maintenance of growth at the specific rate of ~2.4% yr-1 stems from an implicit desire to optimise patterns of energy use over human working lifetimes.
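
    For reference, a constant specific growth rate implies a fixed doubling time; at the ~2.4% yr-1 rate quoted above (standard exponential-growth arithmetic, not a result from the paper):

$$
t_{2} = \frac{\ln 2}{0.024\ \mathrm{yr}^{-1}} \approx 29\ \mathrm{yr}.
$$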

  11. Dispersion, sorption and photodegradation of petroleum hydrocarbons in dispersant-seawater-sediment systems.

    PubMed

    Zhao, Xiao; Liu, Wen; Fu, Jie; Cai, Zhengqing; O'Reilly, S E; Zhao, Dongye

    2016-08-15

    This work examined effects of model oil dispersants on dispersion, sorption and photodegradation of petroleum hydrocarbons in simulated marine systems. Three dispersants (Corexit 9500A, Corexit 9527A and SPC 1000) were used to prepare dispersed water accommodated oil (DWAO). While higher doses of dispersants dispersed more n-alkanes and PAHs, Corexit 9500A preferentially dispersed C11-C20 n-alkanes, whereas Corexit 9527A was more favorable for smaller alkanes (C10-C16), and SPC 1000 for C12-C28 n-alkanes. Sorption of petroleum hydrocarbons on sediment was proportional to TPH types/fractions in the DWAOs. Addition of 18mg/L of Corexit 9500A increased sediment uptake of 2-3 ring PAHs, while higher dispersant doses reduced the uptake, due to micelle-enhanced solubilization effects. Both dispersed n-alkanes and PAHs were susceptible to photodegradation under simulated sunlight. For PAHs, both photodegradation and photo-facilitated alkylation were concurrently taking place. The information can facilitate sounder assessment of fate and distribution of dispersed oil hydrocarbons in marine systems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Functional Development of the Circadian Clock in the Zebrafish Pineal Gland

    PubMed Central

    Ben-Moshe, Zohar; Foulkes, Nicholas S.

    2014-01-01

    The zebrafish constitutes a powerful model organism with unique advantages for investigating the vertebrate circadian timing system and its regulation by light. In particular, the remarkably early and rapid development of the zebrafish circadian system has facilitated exploring the factors that control the onset of circadian clock function during embryogenesis. Here, we review our understanding of the molecular basis underlying functional development of the central clock in the zebrafish pineal gland. Furthermore, we examine how the directly light-entrainable clocks in zebrafish cell lines have facilitated unravelling the general mechanisms underlying light-induced clock gene expression. Finally, we summarize how analysis of the light-induced transcriptome and miRNome of the zebrafish pineal gland has provided insight into the regulation of the circadian system by light, including the involvement of microRNAs in shaping the kinetics of light- and clock-regulated mRNA expression. The relative contributions of the pineal gland central clock and the distributed peripheral oscillators to the synchronization of circadian rhythms at the whole animal level are a crucial question that still remains to be elucidated in the zebrafish model. PMID:24839600

  13. A broadband multimedia TeleLearning system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ruiping; Karmouch, A.

    1996-12-31

    In this paper we discuss a broadband multimedia TeleLearning system under development in the Multimedia Information Research Laboratory at the University of Ottawa. The system aims at providing a seamless environment for TeleLearning using the latest telecommunication and multimedia information processing technology. It basically consists of a media production center, a courseware author site, a courseware database, a courseware user site, and an on-line facilitator site. All these components are distributed over an ATM network and work together to offer a multimedia interactive courseware service. An MHEG-based model is exploited in designing the system architecture to achieve real-time, interactive, and reusable information interchange across heterogeneous platforms. The system architecture, courseware processing strategies, and courseware document models are presented.

  14. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  15. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  16. Semantic Coherence Facilitates Distributional Learning.

    PubMed

    Ouyang, Long; Boroditsky, Lera; Frank, Michael C

    2017-04-01

    Computational models have shown that purely statistical knowledge about words' linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that "postman" and "mailman" are semantically similar because they have quantitatively similar patterns of association with other words (e.g., they both tend to occur with words like "deliver," "truck," "package"). In contrast to these computational results, artificial language learning experiments suggest that distributional statistics alone do not facilitate learning of linguistic categories. However, experiments in this paradigm expose participants to entirely novel words, whereas real language learners encounter input that contains some known words that are semantically organized. In three experiments, we show that (a) the presence of familiar semantic reference points facilitates distributional learning and (b) this effect crucially depends both on the presence of known words and the adherence of these known words to some semantic organization. Copyright © 2016 Cognitive Science Society, Inc.
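
    The distributional idea referenced above can be made concrete with a toy co-occurrence example (invented counts, not the authors' materials): words that appear with similar context words end up with similar vectors under cosine similarity.

```python
import numpy as np

# Toy co-occurrence counts with context words ["deliver", "truck", "package", "bark"]
vectors = {
    "postman": np.array([8, 5, 9, 0]),
    "mailman": np.array([7, 6, 8, 0]),
    "dog":     np.array([0, 1, 0, 9]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vectors["postman"], vectors["mailman"]))  # high: similar contexts
print(cosine(vectors["postman"], vectors["dog"]))      # low: different contexts
```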

  17. Facilitating the Specification Capture and Transformation Process in the Development of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Filho, Aluzio Haendehen; Caminada, Numo; Haeusler, Edward Hermann; vonStaa, Arndt

    2004-01-01

    To support the development of flexible and reusable MAS, we have built a framework designated MAS-CF. MAS-CF is a component framework that implements a layered architecture based on contextual composition. Interaction rules, controlled by architecture mechanisms, ensure very low coupling, making possible the sharing of distributed services in a transparent, dynamic and independent way. These properties foster large-scale reuse, since organizational abstractions can be reused and propagated to all instances created from the framework. The objective is to reduce the complexity and development time of multi-agent systems through the reuse of generic organizational abstractions.

  18. Analysis of spatial configuration of the Palace Museum: an application of the axial-based space syntax

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Lu, Feng

    2006-10-01

    Movement in a spatial system is produced and determined by the structure of the complex space itself, rather than by special attractors within the whole spatial system. According to this theory of space syntax, tourists' convergence and dispersal in the Palace Museum should originate from the distribution and form of its internal constructions. This article presents an application of the space syntax approach to the Palace Museum. After analyzing its internal spatial configuration, the paper concludes with some rational recommendations to facilitate tourist movement as well as to protect this invaluable cultural heritage.

  19. Microstrip Yagi array for MSAT vehicle antenna application

    NASA Technical Reports Server (NTRS)

    Huang, John; Densmore, Arthur; Pozar, David

    1990-01-01

    A microstrip Yagi array was developed for the MSAT system as a low-cost mechanically steered medium-gain vehicle antenna. Because its parasitic reflector and director patches are not connected to the RF power-distribution circuit, yet still contribute to the directional beam required by MSAT, the antenna is a very efficient radiating system. With the complete monopulse beamforming circuit etched on a thin stripline board, the planar microstrip Yagi array is capable of achieving a very low profile. A theoretical model using the Method of Moments was developed to facilitate the design and understanding of this antenna.

  20. Wind Energy Resource Atlas of the Dominican Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, D.; Schwartz, M.; George, R.

    2001-10-01

    The Wind Energy Resource Atlas of the Dominican Republic identifies the wind characteristics and the distribution of the wind resource in this country. This major project is the first of its kind undertaken for the Dominican Republic. The information contained in the atlas is necessary to facilitate the use of wind energy technologies, both for utility-scale power generation and off-grid wind energy applications. A computerized wind mapping system developed by NREL generated detailed wind resource maps for the entire country. This technique uses Geographic Information Systems (GIS) to produce high-resolution (1-square kilometer) annual average wind resource maps.

  1. Enabling NVM for Data-Intensive Scientific Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carns, Philip; Jenkins, John; Seo, Sangmin

    Specialized, transient data services are playing an increasingly prominent role in data-intensive scientific computing. These services offer flexible, on-demand pairing of applications with storage hardware using semantics that are optimized for the problem domain. Concurrent with this trend, upcoming scientific computing and big data systems will be deployed with emerging NVM technology to achieve the highest possible price/productivity ratio. Clearly, therefore, we must develop techniques to facilitate the confluence of specialized data services and NVM technology. In this work we explore how to enable the composition of NVM resources within transient distributed services while still retaining their essential performance characteristics. Our approach involves eschewing the conventional distributed file system model and instead projecting NVM devices as remote microservices that leverage user-level threads, RPC services, RMA-enabled network transports, and persistent memory libraries in order to maximize performance. We describe a prototype system that incorporates these concepts, evaluate its performance for key workloads on an exemplar system, and discuss how the system can be leveraged as a component of future data-intensive architectures.

  2. SSPI - Space Service Provider Infrastructure: Image Information Mining and Management Prototype for a Distributed Environment

    NASA Astrophysics Data System (ADS)

    Candela, L.; Ruggieri, G.; Giancaspro, A.

    2004-09-01

    Within the Italian Space Agency's "Multi-Mission Ground Segment" project, several innovative technologies, such as CORBA[1], Z39.50[2], XML[3], Java[4], JavaServer Pages[4] and C++, have been investigated. The SSPI system (Space Service Provider Infrastructure) is the prototype of a distributed environment aimed at facilitating access to Earth Observation (EO) data. SSPI can ingest, archive, consolidate, visualize and evaluate these data. Hence, SSPI is not just a database or a data repository, but an application that, by means of a set of protocols, standards and specifications, provides unified access to multi-mission EO data.

  3. A DRM based on renewable broadcast encryption

    NASA Astrophysics Data System (ADS)

    Ramkumar, Mahalingam; Memon, Nasir

    2005-07-01

    We propose an architecture for digital rights management based on a renewable, random key pre-distribution (KPD) scheme, HARPS (hashed random preloaded subsets). The proposed architecture caters for broadcast encryption by a trusted authority (TA) and by "parent" devices (devices used by vendors who manufacture compliant devices) for periodic revocation of devices. The KPD also facilitates broadcast encryption by peer devices, which permits peers to distribute content, and efficiently control access to the content encryption secret using subscription secrets. The underlying KPD also caters for broadcast authentication and mutual authentication of any two devices, irrespective of the vendors manufacturing the device, and thus provides a comprehensive solution for securing interactions between devices taking part in a DRM system.
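
    The following is a toy illustration of renewable random key pre-distribution with hashed key depths, in the spirit of (but not reproducing) the HARPS scheme named above; the key pool, subset sizes, depths, and derivation rule are all invented for illustration.

```python
import hashlib
import random

MASTER = {i: f"master-key-{i}".encode() for i in range(20)}   # trusted authority's key pool (toy)

def hash_n(key, n):
    """Apply a one-way hash n times (the depth-n version of a key)."""
    for _ in range(n):
        key = hashlib.sha256(key).digest()
    return key

def preload(device_seed, subset_size=8, max_depth=5):
    """Preload a device with hashed versions of a random subset of the master keys."""
    rng = random.Random(device_seed)
    holdings = {}
    for key_id in rng.sample(sorted(MASTER), subset_size):
        depth = rng.randint(1, max_depth)
        holdings[key_id] = (depth, hash_n(MASTER[key_id], depth))
    return holdings

def pairwise_secret(dev_a, dev_b):
    """Derive a shared secret from the keys two devices hold in common by
    hashing the shallower copy forward to the deeper depth."""
    parts = []
    for key_id in sorted(set(dev_a) & set(dev_b)):
        d_a, k_a = dev_a[key_id]
        d_b, k_b = dev_b[key_id]
        common = hash_n(k_a, d_b - d_a) if d_b > d_a else hash_n(k_b, d_a - d_b)
        parts.append(common)
    return hashlib.sha256(b"".join(parts)).hexdigest() if parts else None

alice, bob = preload("device-A"), preload("device-B")
print(pairwise_secret(alice, bob) == pairwise_secret(bob, alice))   # both sides derive the same value
```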

  4. Blackboard system generator (BSG) - An alternative distributed problem-solving paradigm

    NASA Technical Reports Server (NTRS)

    Silverman, Barry G.; Feggos, Kostas; Chang, Joseph Shih

    1989-01-01

    A status review is presented for a generic blackboard-based distributed problem-solving environment in which multiple-agent cooperation can be effected. This environment is organized into a shared information panel, a chairman control panel, and a metaplanning panel. Each panel contains a number of embedded AI techniques that facilitate its operation and that provide heuristics for solving the underlying team-agent decision problem. The status of these panels and heuristics is described along with a number of robustness considerations. The techniques for each of the three panels and for four sets of paradigm-related advances are described, along with selected results from classroom teaching experiments and from three applications.

  5. Review of the socket design and interface pressure measurement for transtibial prosthesis.

    PubMed

    Pirouzi, Gh; Abu Osman, N A; Eshraghi, A; Ali, S; Gholizadeh, H; Wan Abas, W A B

    2014-01-01

    The socket is an important part of every prosthetic limb, serving as the interface between the residual limb and the prosthetic components. The biomechanics of the socket-residual limb interface, especially the pressure and force distribution, affect patient satisfaction and function. This paper aimed to review and evaluate studies conducted in the last decades on socket design, in-socket interface pressure measurement, and socket biomechanics. The literature was searched for keywords related to transtibial amputation, socket-residual limb interface, socket measurement, socket design, modeling, computational modeling, and suspension systems. In accordance with the selection criteria, 19 articles were selected for further analysis. It was revealed that pressure and stress have been studied in the last decades, but quantitative evaluations remain inapplicable in clinical settings. This study also illustrates prevailing systems that may facilitate improvements in socket design for improved quality of life for individuals ambulating with a transtibial prosthesis. It is hoped that the review will facilitate better understanding and help determine the clinical relevance of quantitative evaluations.

  6. Review of the Socket Design and Interface Pressure Measurement for Transtibial Prosthesis

    PubMed Central

    Pirouzi, Gh.; Abu Osman, N. A.; Eshraghi, A.; Ali, S.; Gholizadeh, H.; Wan Abas, W. A. B.

    2014-01-01

    The socket is an important part of every prosthetic limb, serving as the interface between the residual limb and the prosthetic components. The biomechanics of the socket-residual limb interface, especially the pressure and force distribution, affect patient satisfaction and function. This paper aimed to review and evaluate studies conducted in the last decades on socket design, in-socket interface pressure measurement, and socket biomechanics. The literature was searched for keywords related to transtibial amputation, socket-residual limb interface, socket measurement, socket design, modeling, computational modeling, and suspension systems. In accordance with the selection criteria, 19 articles were selected for further analysis. It was revealed that pressure and stress have been studied in the last decades, but quantitative evaluations remain inapplicable in clinical settings. This study also illustrates prevailing systems that may facilitate improvements in socket design for improved quality of life for individuals ambulating with a transtibial prosthesis. It is hoped that the review will facilitate better understanding and help determine the clinical relevance of quantitative evaluations. PMID:25197716

  7. Supervised interpretation of echocardiograms with a psychological model of expert supervision

    NASA Astrophysics Data System (ADS)

    Revankar, Shriram V.; Sher, David B.; Shalin, Valerie L.; Ramamurthy, Maya

    1993-07-01

    We have developed a collaborative scheme that facilitates active human supervision of the binary segmentation of an echocardiogram. The scheme complements the reliability of a human expert with the precision of segmentation algorithms. In the developed system, an expert user compares the computer-generated segmentation with the original image in a user-friendly graphics environment, and interactively indicates the incorrectly classified regions either by pointing or by circling. The precise boundaries of the indicated regions are computed by studying the original image properties in that region, together with a human visual attention distribution map obtained from published psychological and psychophysical research. We use the developed system to extract contours of heart chambers from a sequence of two-dimensional echocardiograms. We are currently extending this method to incorporate a richer set of inputs from the human supervisor, to facilitate multi-classification of image regions depending on their functionality. We are also integrating into our system the knowledge-related constraints that cardiologists use, to improve the capabilities of the existing system. This extension involves developing a psychological model of expert reasoning, functional and relational models of typical views in echocardiograms, and corresponding interface modifications to map the suggested actions to image processing algorithms.

  8. Semantics-enabled knowledge management for global Earth observation system of systems

    NASA Astrophysics Data System (ADS)

    King, Roger L.; Durbha, Surya S.; Younan, Nicolas H.

    2007-10-01

    The Global Earth Observation System of Systems (GEOSS) is a distributed system of systems built on current international cooperation efforts among existing Earth observing and processing systems. The goal is to formulate an end-to-end process that enables the collection and distribution of accurate, reliable Earth Observation data, information, products, and services to both suppliers and consumers worldwide. One of the critical components in the development of such systems is the ability to obtain seamless access to data across geopolitical boundaries. In order to gain the support and willingness of countries around the world to participate in such an endeavor, it is necessary to devise mechanisms whereby the data and the intellectual capital are protected through procedures that implement the policies specific to a country. Earth Observations (EO) are obtained from a multitude of sources, and coordination among different agencies and user groups is required to reach a shared understanding of the concepts involved in a domain. It is envisaged that the volume of data and information in a GEOSS context will be unprecedented, and current data archiving and delivery methods will need to be transformed to allow seamless interoperability. Thus, EO data integration depends on the resolution of conflicts arising from a variety of areas. Modularization is inevitable in distributed environments to facilitate flexible and efficient reuse of existing ontologies. Therefore, we propose a modular-ontology-based knowledge management framework for GEOSS and present methods to enable efficient reasoning in such systems.

  9. Interconnection Assessment Methodology and Cost Benefit Analysis for High-Penetration PV Deployment in the Arizona Public Service System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baggu, Murali; Giraldez, Julieta; Harris, Tom

    In an effort to better understand the impacts of high penetrations of photovoltaic (PV) generators on distribution systems, Arizona Public Service and its partners completed a multi-year project to develop the tools and knowledge base needed to safely and reliably integrate high penetrations of utility- and residential-scale PV. Building upon the APS Community Power Project-Flagstaff Pilot, this project investigates the impact of PV on a representative feeder in northeast Flagstaff. To quantify and catalog the effects of the estimated 1.3 MW of PV that will be installed on the feeder (both smaller units at homes and large, centrally located systems), high-speed weather and electrical data acquisition systems and digital 'smart' meters were designed and installed to facilitate monitoring and to build and validate comprehensive, high-resolution models of the distribution system. These models are being developed to analyze the impacts of PV on distribution circuit protection systems (including coordination and anti-islanding), predict voltage regulation and phase balance issues, and develop volt/VAr control schemes. This paper continues from a paper presented at the 2014 IEEE PVSC conference that described feeder model evaluation and high penetration advanced scenario analysis, specifically feeder reconfiguration. This paper presents results from Phase 5 of the project. Specifically, the paper discusses tool automation; interconnection assessment methodology and cost benefit analysis.

  10. Facility Monitoring: A Qualitative Theory for Sensor Fusion

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2001-01-01

    Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory in which each sensor in the system is a HAS. Given the rich qualitative information from each HAS, a paradigm and formal definitions are provided so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes it possible to implement intuitive and effective methods to monitor, diagnose, and compensate processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) across the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.
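
    A minimal sketch of the kind of qualitative extraction a highly autonomous sensor might perform on its own trace is shown below; the heuristics, thresholds, and data are illustrative, not the author's implementation.

```python
import numpy as np

def qualitative_features(samples, lo, hi, step_factor=3.0):
    """Extract simple qualitative events (off-limit excursions, step changes) from a trace."""
    samples = np.asarray(samples, dtype=float)
    events = []
    # Off-limit excursions
    for idx in np.where((samples < lo) | (samples > hi))[0]:
        events.append((int(idx), "off-limit"))
    # Step changes: jumps much larger than a robust estimate of the sample-to-sample noise
    diffs = np.diff(samples)
    noise = np.median(np.abs(diffs)) + 1e-12
    for idx in np.where(np.abs(diffs) > step_factor * noise)[0]:
        events.append((int(idx) + 1, "step-change"))
    return sorted(events)

trace = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 12.0]   # synthetic data
print(qualitative_features(trace, lo=0.0, hi=10.0))
```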

  11. Competition and facilitation structure plant communities under nurse tree canopies in extremely stressful environments.

    PubMed

    Al-Namazi, Ali A; El-Bana, Magdy I; Bonser, Stephen P

    2017-04-01

    Nurse plant facilitation in stressful environments can produce an environment with relatively low stress under its canopy. These nurse plants may produce the conditions promoting intense competition between coexisting species under the canopy, and canopies may establish stress gradients, where stress increases toward the edge of the canopy. Competition and facilitation on these stress gradients may control species distributions in the communities under canopies. We tested the following predictions: (1) interactions between understory species shift from competition to facilitation in habitats experiencing increasing stress from the center to the edge of canopy of a nurse plant, and (2) species distributions in understory communities are controlled by competitive interactions at the center of canopy, and facilitation at the edge of the canopy. We tested these predictions using a neighbor removal experiment under nurse trees growing in arid environments. Established individuals of each of four of the most common herbaceous species in the understory were used in the experiment. Two species were more frequent in the center of the canopy, and two species were more frequent at the edge of the canopy. Established individuals of each species were subjected to neighbor removal or control treatments in both canopy center and edge habitats. We found a shift from competitive to facilitative interactions from the center to the edge of the canopy. The shift in the effect of neighbors on the target species can help to explain species distributions in these canopies. Canopy-dominant species only perform well in the presence of neighbors in the edge microhabitat. Competition from canopy-dominant species can also limit the performance of edge-dominant species in the canopy microhabitat. The shift from competition to facilitation under nurse plant canopies can structure the understory communities in extremely stressful environments.

  12. Simulated disparity and peripheral blur interact during binocular fusion.

    PubMed

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-07-17

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual’s aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. © 2014 ARVO.

  13. Simulated disparity and peripheral blur interact during binocular fusion

    PubMed Central

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-01-01

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual's aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. PMID:25034260

  14. Spatial optimization for decentralized non-potable water reuse

    NASA Astrophysics Data System (ADS)

    Kavvada, Olga; Nelson, Kara L.; Horvath, Arpad

    2018-06-01

    Decentralization has the potential to reduce the scale of the piped distribution network needed to enable non-potable water reuse (NPR) in urban areas by producing recycled water closer to its point of use. However, tradeoffs exist between the economies of scale of treatment facilities and the size of the conveyance infrastructure, including energy for upgradient distribution of recycled water. To adequately capture the impacts from distribution pipes and pumping requirements, site-specific conditions must be accounted for. In this study, a generalized framework (a heuristic modeling approach using geospatial algorithms) is developed that estimates the financial cost, the energy use, and the greenhouse gas emissions associated with NPR (for toilet flushing) as a function of scale of treatment and conveyance networks with the goal of determining the optimal degree of decentralization. A decision-support platform is developed to assess and visualize NPR system designs considering topography, economies of scale, and building size. The platform can be used for scenario development to explore the optimal system size based on the layout of current or new buildings. The model also promotes technology innovation by facilitating the systems-level comparison of options to lower costs, improve energy efficiency, and lower greenhouse gas emissions.
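
    The upgradient conveyance energy mentioned above can be approximated from first principles; the sketch below is a simplification (it ignores friction losses and is not the authors' model), with illustrative values.

```python
RHO = 1000.0   # kg/m^3, density of water
G = 9.81       # m/s^2, gravitational acceleration

def pumping_energy_kwh(volume_m3, lift_m, pump_efficiency=0.7):
    """Approximate energy to lift recycled water upgradient (ignores pipe friction)."""
    joules = RHO * G * lift_m * volume_m3 / pump_efficiency
    return joules / 3.6e6   # J -> kWh

# e.g., 100 m^3/day of non-potable reuse water lifted 30 m
print(round(pumping_energy_kwh(100, 30), 2), "kWh/day")
```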

  15. Electro-osmotic flow of a model electrolyte

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Singer, Sherwin J.; Zheng, Zhi; Conlisk, A. T.

    2005-04-01

    Electro-osmotic flow is studied by nonequilibrium molecular dynamics simulations in a model system chosen to elucidate various factors affecting the velocity profile and facilitate comparison with existing continuum theories. The model system consists of spherical ions and solvent, with stationary, uniformly charged walls that make a channel with a height of 20 particle diameters. We find that hydrodynamic theory adequately describes simple pressure-driven (Poiseuille) flow in this model. However, Poisson-Boltzmann theory fails to describe the ion distribution in important situations, and therefore continuum fluid dynamics based on the Poisson-Boltzmann ion distribution disagrees with simulation results in those situations. The failure of Poisson-Boltzmann theory is traced to the exclusion of ions near the channel walls resulting from reduced solvation of the ions in that region. When a corrected ion distribution is used as input for hydrodynamic theory, agreement with numerical simulations is restored. An analytic theory is presented that demonstrates that repulsion of the ions from the channel walls increases the flow rate, and attraction to the walls has the opposite effect. A recent numerical study of electro-osmotic flow is reanalyzed in the light of our findings, and the results conform well to our conclusions for the model system.
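
    For context, the mean-field description being tested is the standard Poisson-Boltzmann relation between the potential and the ion densities, from which the classical Helmholtz-Smoluchowski electro-osmotic velocity follows (textbook forms, not equations reproduced from the paper):

$$
\nabla^{2}\psi = -\frac{e}{\epsilon}\sum_{i} z_i\, n_i^{0}\, \exp\!\left(-\frac{z_i e\,\psi}{k_B T}\right),
\qquad
u_{\mathrm{eo}} = -\frac{\epsilon\,\zeta\,E}{\mu}.
$$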

  16. Fluid Dynamic Modeling to Support the Development of Flow-Based Hepatocyte Culture Systems for Metabolism Studies

    PubMed Central

    Pedersen, Jenny M.; Shim, Yoo-Sik; Hans, Vaibhav; Phillips, Martin B.; Macdonald, Jeffrey M.; Walker, Glenn; Andersen, Melvin E.; Clewell, Harvey J.; Yoon, Miyoung

    2016-01-01

    Accurate prediction of metabolism is a significant outstanding challenge in toxicology. The best predictions are based on experimental data from in vitro systems using primary hepatocytes. The predictivity of the primary hepatocyte-based culture systems, however, is still limited due to well-known phenotypic instability and rapid decline of metabolic competence within a few hours. Dynamic flow bioreactors for three-dimensional cell cultures are thought to be better at recapitulating tissue microenvironments and show potential to improve in vivo extrapolations of chemical or drug toxicity based on in vitro test results. These more physiologically relevant culture systems hold potential for extending metabolic competence of primary hepatocyte cultures as well. In this investigation, we used computational fluid dynamics to determine the optimal design of a flow-based hepatocyte culture system for evaluating chemical metabolism in vitro. The main design goals were (1) minimization of shear stress experienced by the cells to maximize viability, (2) rapid establishment of a uniform distribution of test compound in the chamber, and (3) delivery of sufficient oxygen to cells to support aerobic respiration. Two commercially available flow devices – RealBio® and QuasiVivo® (QV) – and a custom developed fluidized bed bioreactor were simulated, and turbulence, flow characteristics, test compound distribution, oxygen distribution, and cellular oxygen consumption were analyzed. Experimental results from the bioreactors were used to validate the simulation results. Our results indicate that maintaining adequate oxygen supply is the most important factor to the long-term viability of liver bioreactor cultures. Cell density and system flow patterns were the major determinants of local oxygen concentrations. The experimental results closely corresponded to the in silico predictions. Of the three bioreactors examined in this study, we were able to optimize the experimental conditions for long-term hepatocyte cell culture using the QV bioreactor. This system facilitated the use of low system volumes coupled with higher flow rates. This design supports cellular respiration by increasing oxygen concentrations in the vicinity of the cells and facilitates long-term kinetic studies of low clearance test compounds. These two goals were achieved while simultaneously keeping the shear stress experienced by the cells within acceptable limits. PMID:27747210

  17. Development of a muon radiographic imaging electronic board system for a stable solar power operation

    NASA Astrophysics Data System (ADS)

    Uchida, T.; Tanaka, H. K. M.; Tanaka, M.

    2010-02-01

    Cosmic-ray muon radiography is a method used to study the internal structure of volcanoes. We have developed a muon radiographic imaging board with a power consumption low enough to be powered by a small solar power system. The imaging board generates an angular distribution of the muons. Used for real-time readout, the method may facilitate the prediction of eruptions. For real-time observations, Ethernet is employed, and the board works as a web server for remote operation. The angular distribution can be obtained from a remote PC over a network using a standard web browser. We have collected and analyzed data obtained from a 3-day field study of cosmic-ray muons at the Satsuma-Iwojima volcano. The data provided a clear image of the mountain ridge as a cosmic-ray muon shadow. The measured performance of the system is sufficient for a stand-alone cosmic-ray muon radiography experiment.

  18. Range Information Systems Management (RISM) Phase 1 Report

    NASA Technical Reports Server (NTRS)

    Bastin, Gary L.; Harris, William G.; Nelson, Richard A.

    2002-01-01

    RISM investigated alternative approaches, technologies, and communication network architectures to facilitate building the Spaceports and Ranges of the future. RISM started by documenting most existing US ranges and their capabilities. In parallel, RISM obtained inputs from the following: 1) NASA and NASA-contractor engineers and managers, and; 2) Aerospace leaders from Government, Academia, and Industry, participating through the Space Based Range Distributed System Working Group (SBRDSWG), many of whom are also; 3) Members of the Advanced Range Technology Working Group (ARTWG) subgroups, and; 4) Members of the Advanced Spaceport Technology Working Group (ASTWG). These diverse inputs helped to envision advanced technologies for implementing future Ranges and Range systems that build on today's cabled and wireless legacy infrastructures while seamlessly integrating both today's emerging and tomorrow's building-block communication techniques. The fundamental key is to envision a transition to a Space Based Range Distributed Subsystem. The enabling concept is to identify the specific needs of Range users that can be solved through applying emerging communication technologies.

  19. Improving Distribution Resiliency with Microgrids and State and Parameter Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuffner, Francis K.; Williams, Tess L.; Schneider, Kevin P.

    Modern society relies on low-cost, reliable electrical power, both to maintain industry and to provide basic social services to the populace. When major disturbances occur, such as Hurricane Katrina or Hurricane Sandy, the nation's electrical infrastructure can experience significant outages. To help prevent the spread of these outages, as well as to facilitate faster restoration after an outage, various improvements to the resiliency of the power system are needed. Two such approaches are breaking the system into smaller microgrid sections and improving insight into operations so that failures or mis-operations are detected before they become critical. By breaking the system into smaller microgrid islands, power can be maintained in areas where distributed generation and energy storage resources are still available but bulk power generation is no longer connected. Additionally, microgrid systems can maintain service to local pockets of customers when there has been extensive damage to the local distribution system. However, microgrids are grid-connected a majority of the time, and implementing and operating a microgrid in that mode is much different than when islanded. This report discusses work conducted by the Pacific Northwest National Laboratory that developed improvements for simulation tools to capture the characteristics of microgrids and how they can be used to develop new operational strategies. These operational strategies reduce the cost of microgrid operation and increase the reliability and resilience of the nation's electricity infrastructure. In addition to the ability to break the system into microgrids, improved observability into the state of the distribution grid can make the power system more resilient. State estimation on the transmission system already provides great insight into grid operations and into detecting abnormal conditions by leveraging existing measurements. These transmission-level approaches are expanded to use advanced metering infrastructure and other distribution-level measurements to create a three-phase, unbalanced distribution state estimation approach. With distribution-level state estimation, the grid can be operated more efficiently, and outages or equipment failures can be caught faster, improving the overall resilience and reliability of the grid.

  20. VIEWCACHE: An incremental pointer-base access method for distributed databases. Part 1: The universal index system design document. Part 2: The universal index system low-level design document. Part 3: User's guide. Part 4: Reference manual. Part 5: UIMS test suite

    NASA Technical Reports Server (NTRS)

    Kelley, Steve; Roussopoulos, Nick; Sellis, Timos

    1992-01-01

    The goal of the Universal Index System (UIS), is to provide an easy-to-use and reliable interface to many different kinds of database systems. The impetus for this system was to simplify database index management for users, thus encouraging the use of indexes. As the idea grew into an actual system design, the concept of increasing database performance by facilitating the use of time-saving techniques at the user level became a theme for the project. This Final Report describes the Design, the Implementation of UIS, and its Language Interfaces. It also includes the User's Guide and the Reference Manual.

  1. State Fall Prevention Coalitions as Systems Change Agents: An Emphasis on Policy.

    PubMed

    Schneider, Ellen C; Smith, Matthew Lee; Ory, Marcia G; Altpeter, Mary; Beattie, Bonita Lynn; Scheirer, Mary Ann; Shubert, Tiffany E

    2016-03-01

    Falls among older adults are an escalating public health issue, which requires a multidisciplinary and multilevel approach to effect the systems change needed to address this problem effectively. The National Council on Aging established the Falls Free® Initiative, enfolding and facilitating statewide Fall Prevention Coalitions. Falls Free® activities included developing the State Policy Toolkit for Advancing Falls Prevention to promote sustainable change by supporting the dissemination and adoption of evidence-based strategies. The objectives were to (1) determine whether the policies being implemented were recommended and supported by the Toolkit, (2) identify the perceived barriers and facilitators to implementing policies, and (3) identify Coalitions' current and future fall prevention policy activities. A 63-item online survey was distributed to State Coalition Leads. Descriptive statistics (frequencies and counts) were used to describe Coalition characteristics and activities. Coalitions had several similarities, but varied greatly in their number of member organizations and members as well as meeting frequencies. Key activities included building partnerships, disseminating programs, and pursuing at least one of the eight National Council on Aging-recommended policy goals. The most commonly reported facilitator was active support from the Coalition Leads, whereas lack of funding was the most cited barrier. This study serves as the first national census of empirical evidence regarding Falls Coalitions' composition, goals, and activities. Results indicate that Coalitions are actively pursuing evidence-based policies but could benefit from additional technical assistance and resources. Findings support the value of the Toolkit recommendations by documenting what is feasible and being implemented. Knowledge about facilitators and barriers will inform future efforts to foster sustainable systems change in states with active Coalitions and encourage Coalitions in other states. © 2015 Society for Public Health Education.

  2. Secure multi-party communication with quantum key distribution managed by trusted authority

    DOEpatents

    Nordholt, Jane Elizabeth; Hughes, Richard John; Peterson, Charles Glen

    2013-07-09

    Techniques and tools for implementing protocols for secure multi-party communication after quantum key distribution ("QKD") are described herein. In example implementations, a trusted authority facilitates secure communication between multiple user devices. The trusted authority distributes different quantum keys by QKD under trust relationships with different users. The trusted authority determines combination keys using the quantum keys and makes the combination keys available for distribution (e.g., for non-secret distribution over a public channel). The combination keys facilitate secure communication between two user devices even in the absence of QKD between the two user devices. With the protocols, benefits of QKD are extended to multi-party communication scenarios. In addition, the protocols can retain benefit of QKD even when a trusted authority is offline or a large group seeks to establish secure communication within the group.
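
    The patent abstract does not spell out how combination keys are formed; one common construction (shown purely as an assumption) is for the trusted authority to publish the XOR of the quantum keys it shares with two users, so that each user can recover the other's key without it ever crossing the public channel in the clear.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Quantum keys established separately by QKD between the trusted authority and each user (simulated)
k_alice = secrets.token_bytes(32)
k_bob = secrets.token_bytes(32)

# The trusted authority publishes the combination key over a public channel
combination = xor(k_alice, k_bob)

# Alice recovers Bob's key (and vice versa) to derive a shared pairwise secret
k_bob_at_alice = xor(combination, k_alice)
assert k_bob_at_alice == k_bob
```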

  3. Secure multi-party communication with quantum key distribution managed by trusted authority

    DOEpatents

    Hughes, Richard John; Nordholt, Jane Elizabeth; Peterson, Charles Glen

    2015-01-06

    Techniques and tools for implementing protocols for secure multi-party communication after quantum key distribution ("QKD") are described herein. In example implementations, a trusted authority facilitates secure communication between multiple user devices. The trusted authority distributes different quantum keys by QKD under trust relationships with different users. The trusted authority determines combination keys using the quantum keys and makes the combination keys available for distribution (e.g., for non-secret distribution over a public channel). The combination keys facilitate secure communication between two user devices even in the absence of QKD between the two user devices. With the protocols, benefits of QKD are extended to multi-party communication scenarios. In addition, the protocols can retain benefit of QKD even when a trusted authority is offline or a large group seeks to establish secure communication within the group.

  4. Elastomeric load sharing device

    NASA Technical Reports Server (NTRS)

    Isabelle, Charles J. (Inventor); Kish, Jules G. (Inventor); Stone, Robert A. (Inventor)

    1992-01-01

    An elastomeric load sharing device, interposed in combination between a driven gear and a central drive shaft to facilitate balanced torque distribution in split power transmission systems, includes a cylindrical elastomeric bearing and a plurality of elastomeric bearing pads. The elastomeric bearing and bearing pads comprise one or more layers, each layer including an elastomer having a metal backing strip secured thereto. The elastomeric bearing is configured to have a high radial stiffness and a low torsional stiffness and is operative to radially center the driven gear and to minimize torque transfer through the elastomeric bearing. The bearing pads are configured to have a low radial and torsional stiffness and a high axial stiffness and are operative to compressively transmit torque from the driven gear to the drive shaft. The elastomeric load sharing device has spring rates that compensate for mechanical deviations in the gear train assembly to provide balanced torque distribution between complementary load paths of split power transmission systems.

  5. A Distributed Multi-Agent System for Collaborative Information Management and Learning

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this paper, we present DIAMS, a system of distributed, collaborative agents to help users access, manage, share and exchange information. A DIAMS personal agent helps its owner find information most relevant to current needs. It provides tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Flexible hierarchical display is integrated with indexed query search to support effective information access. Automatic indexing methods are employed to support user queries and communication between agents. Contents of a repository are kept in object-oriented storage to facilitate information sharing. Collaboration between users is aided by easy sharing utilities as well as automated information exchange. Matchmaker agents are designed to establish connections between users with similar interests and expertise. DIAMS agents provide the services needed for users to share and learn information from one another on the World Wide Web.

  6. Miocene marine incursions and marine/freshwater transitions: Evidence from Neotropical fishes

    NASA Astrophysics Data System (ADS)

    Lovejoy, Nathan R.; Albert, James S.; Crampton, William G. R.

    2006-03-01

    Amazonian rivers contain a remarkable fauna of endemic species derived from taxa that generally occur in oceans and seas. Several hypotheses have been proposed to explain the origin of marine-derived lineages, including opportunistic invasions via estuaries, vicariance related to uplift of the Andes, and vicariance related to Miocene marine incursions and connections. Here, we examine available data for marine-derived lineages of four groups: stingrays (Myliobatiformes), drums (Sciaenidae), anchovies (Engraulididae), and needlefish (Belonidae). Geographic distributions, age estimates (determined using fossils, biogeography, and molecular data sets), and phylogenies for these taxa are most compatible with origination during the Miocene from marine sister groups distributed along the northern coast of South America. We speculate that unique ecological and biogeographic aspects of the Miocene upper Amazonian wetland system, most notably long-term connections with marine systems, facilitated the evolutionary transition from marine to freshwater habitats.

  7. Reflective random indexing for semi-automatic indexing of the biomedical literature.

    PubMed

    Vasuki, Vidya; Cohen, Trevor

    2010-10-01

    The rapid growth of biomedical literature is evident in the increasing size of the MEDLINE research database. Medical Subject Headings (MeSH), a controlled set of keywords, are used to index all the citations contained in the database to facilitate search and retrieval. This volume of citations calls for efficient tools to assist indexers at the US National Library of Medicine (NLM). Currently, the Medical Text Indexer (MTI) system provides assistance by recommending MeSH terms based on the title and abstract of an article using a combination of distributional and vocabulary-based methods. In this paper, we evaluate a novel approach toward indexer assistance by using nearest neighbor classification in combination with Reflective Random Indexing (RRI), a scalable alternative to the established methods of distributional semantics. On a test set provided by the NLM, our approach significantly outperforms the MTI system, suggesting that the RRI approach would make a useful addition to the current methodologies.
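    A minimal sketch of the nearest-neighbour recommendation step is shown below: MeSH terms are pooled from the most similar already-indexed citations, weighted by similarity. In the paper the document vectors come from Reflective Random Indexing; here they are small placeholder arrays and every value is illustrative.

```python
# Minimal sketch of k-nearest-neighbour MeSH recommendation over document
# vectors. In the paper the vectors come from Reflective Random Indexing;
# here they are tiny placeholder arrays.
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Indexed citations: document vector plus the MeSH terms assigned by indexers.
indexed = [
    (np.array([0.9, 0.1, 0.0]), {"Neoplasms", "Mutation"}),
    (np.array([0.8, 0.3, 0.1]), {"Neoplasms", "Genes, p53"}),
    (np.array([0.0, 0.2, 0.9]), {"Influenza, Human"}),
]

def recommend_mesh(query_vec, indexed, k=2):
    """Pool MeSH terms from the k most similar already-indexed citations."""
    ranked = sorted(indexed, key=lambda d: cosine(query_vec, d[0]), reverse=True)
    scores = {}
    for vec, mesh in ranked[:k]:
        for term in mesh:
            scores[term] = scores.get(term, 0.0) + cosine(query_vec, vec)
    return sorted(scores, key=scores.get, reverse=True)

print(recommend_mesh(np.array([0.85, 0.2, 0.05]), indexed))
```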

  8. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by a protective branch-switching operation. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using a single-processor-based dynamic simulation solution. High-performance computing (HPC) based parallel computing is a very promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation using Open Multi-processing (OpenMP) on a shared-memory platform, and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performances for running parallel dynamic simulation are compared and demonstrated.
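    To make the distributed-memory variant concrete, the sketch below shows one possible shape of a parallel time-stepping loop using MPI (via the mpi4py bindings, rather than the paper's implementation language): each rank integrates the equations of its own subset of generators and exchanges state with an allgather at every step. The single-variable "dynamics" is a toy placeholder, not the paper's solver; an OpenMP version would instead parallelize the inner loop on one shared-memory node.

```python
# Sketch of the distributed-memory flavour of one integration loop, using
# mpi4py. Each rank advances its own subset of generators and then exchanges
# boundary variables with an allgather. The swing-equation-like update is a
# toy placeholder.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_gen = 8                                   # total generators (toy value)
local = np.arange(rank, n_gen, size)        # generators owned by this rank
delta = np.zeros(n_gen)                     # rotor angles (global view)
omega = np.zeros(len(local))                # local rotor speed deviations
dt = 0.01

for step in range(100):
    # Advance only the locally owned generators.
    for i, g in enumerate(local):
        accel = 1.0 - np.sin(delta[g])      # placeholder dynamics
        omega[i] += dt * accel
        delta[g] += dt * omega[i]
    # Exchange updated angles so every rank sees a consistent network state.
    for part in comm.allgather([(g, delta[g]) for g in local]):
        for g, val in part:
            delta[g] = val

if rank == 0:
    print("final angles:", np.round(delta, 3))
```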

  9. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oldham, Mark, E-mail: mark.oldham@duke.edu; Thomas, Andrew; O'Daniel, Jennifer

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also, to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and the clinical significance of these is presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient's anatomy. The latter step represents an important development that advances the clinical relevance of complex treatment QA.
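    The 3%/2 mm passing rates quoted above come from a gamma-index comparison of measured versus planned dose. The sketch below computes a deliberately simplified, brute-force 2D gamma passing rate on toy arrays to make the criterion concrete; clinical gamma analysis is 3D, interpolated, and performed with dedicated software, so this is an illustration only.

```python
# Simplified, brute-force 2D gamma-index calculation (3% / 2 mm, global
# normalisation) to illustrate what a "passing rate" measures. The dose
# grids here are synthetic toys.
import numpy as np

def gamma_pass_rate(measured, planned, spacing_mm=1.0,
                    dose_crit=0.03, dist_crit_mm=2.0):
    ref_dose = planned.max()
    ny, nx = measured.shape
    search = int(np.ceil(dist_crit_mm / spacing_mm))
    passed = 0
    for j in range(ny):
        for i in range(nx):
            best = np.inf
            for dj in range(-search, search + 1):
                for di in range(-search, search + 1):
                    jj, ii = j + dj, i + di
                    if not (0 <= jj < ny and 0 <= ii < nx):
                        continue
                    dd = (measured[j, i] - planned[jj, ii]) / (dose_crit * ref_dose)
                    dr = spacing_mm * np.hypot(dj, di) / dist_crit_mm
                    best = min(best, dd * dd + dr * dr)
            passed += best <= 1.0            # gamma <= 1 means the point passes
    return 100.0 * passed / (nx * ny)

planned = np.outer(np.hanning(32), np.hanning(32))
measured = planned * 1.01                    # 1% uniform dose difference
print(f"{gamma_pass_rate(measured, planned):.1f}% of points pass")
```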

  10. Two-dimensional simulation of a two-phase, regenerative pumped radiator loop utilizing direct contact heat transfer with phase change

    NASA Astrophysics Data System (ADS)

    Rhee, Hyop S.; Begg, Lester L.; Wetch, Joseph R.; Jang, Jong H.; Juhasz, Albert J.

    An innovative pumped loop concept for 600 K space power system radiators utilizing direct contact heat transfer, which facilitates repeated startup/shutdown of the power system without complex and time-consuming coolant thawing during power startup, is under development. The heat transfer process with melting/freezing of Li in an NaK flow was studied through two-dimensional time-dependent numerical simulations to characterize and predict the Li/NaK radiator performance during startup (thawing) and shutdown (cold-trapping). Effects of system parameters and the criteria for the plugging domain are presented together with temperature distribution patterns in solid Li and subsequent melting surface profile variations in time.

  11. Standardized Semantic Markup for Reference Terminologies, Thesauri and Coding Systems: Benefits for distributed E-Health Applications.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Liu, Raymond; Rudolf, Dirk; Rieger, Joerg; Dudeck, Joachim

    2005-01-01

    With the introduction of the ICD-10 as the standard for diagnosis, the development of an electronic representation of its complete content, inherent semantics and coding rules is necessary. Our concept refers to current efforts of the CEN/TC 251 to establish a European standard for hierarchical classification systems in healthcare. We have developed an electronic representation of the ICD-10 with the Extensible Markup Language (XML) that facilitates integration into current information systems or coding software, taking into account different languages and versions. In this context, XML offers a complete framework of related technologies and standard tools for processing that helps to develop interoperable applications.
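    As a small illustration of the approach (not the CEN/TC 251 schema; the element and attribute names here are invented), standard XML tooling can traverse a hierarchical classification directly:

```python
# Tiny illustration of why an XML encoding of a hierarchical classification
# is convenient: standard tooling can walk the hierarchy directly. The
# element and attribute names below are invented, not the standardized schema.
import xml.etree.ElementTree as ET

ICD_FRAGMENT = """
<chapter code="IX" title="Diseases of the circulatory system">
  <block code="I10-I15" title="Hypertensive diseases">
    <category code="I10" title="Essential (primary) hypertension"/>
    <category code="I11" title="Hypertensive heart disease"/>
  </block>
</chapter>
"""

root = ET.fromstring(ICD_FRAGMENT)
for category in root.iter("category"):
    print(category.get("code"), "-", category.get("title"))
```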

  12. Premotor neural correlates of predictive motor timing for speech production and hand movement: evidence for a temporal predictive code in the motor system.

    PubMed

    Johari, Karim; Behroozmand, Roozbeh

    2017-05-01

    The predictive coding model suggests that neural processing of sensory information is facilitated for temporally-predictable stimuli. This study investigated how temporal processing of visually-presented sensory cues modulates movement reaction time and neural activities in speech and hand motor systems. Event-related potentials (ERPs) were recorded in 13 subjects while they were visually cued to prepare to produce a steady vocalization of a vowel sound or press a button in a randomized order, and to initiate the cued movement following the onset of a go signal on the screen. The experiment was conducted in two counterbalanced blocks in which the time interval between visual cue and go signal was temporally predictable (fixed delay at 1000 ms) or unpredictable (variable between 1000 and 2000 ms). Results of the behavioral response analysis indicated that movement reaction time was significantly decreased for temporally-predictable stimuli in both speech and hand modalities. We identified premotor ERP activities with a left-lateralized parietal distribution for hand and a frontocentral distribution for speech that were significantly suppressed in response to temporally-predictable compared with unpredictable stimuli. The premotor ERPs were elicited approximately 100 ms before movement onset and were significantly correlated with speech and hand motor reaction times only in response to temporally-predictable stimuli. These findings suggest that the motor system establishes a predictive code to facilitate movement in response to temporally-predictable sensory stimuli. Our data suggest that the premotor ERP activities are robust neurophysiological biomarkers of such predictive coding mechanisms. These findings provide novel insights into the temporal processing mechanisms of speech and hand motor systems.

  13. Addressing Hydro-economic Modeling Limitations - A Limited Foresight Sacramento Valley Model and an Open-source Modeling Platform

    NASA Astrophysics Data System (ADS)

    Harou, J. J.; Hansen, K. M.

    2008-12-01

    Increased scarcity of world water resources is inevitable given the limited supply and increased human pressures. The idea that "some scarcity is optimal" must be accepted for rational resource use and infrastructure management decisions to be made. Hydro-economic systems models are unique in representing the overlap of economic drivers, socio-political forces and distributed water resource systems. They demonstrate the tangible benefits of cooperation and integrated flexible system management. Further improvement of models, quality control practices and software will be needed for these academic policy tools to become accepted into mainstream water resource practice. Promising features include calibration methods, limited foresight optimization formulations, linked simulation-optimization approaches (e.g., embedding pre-existing calibrated simulation models), spatial groundwater models, stream-aquifer interactions and stream routing, etc. Conventional user-friendly decision support systems helped spread simulation models on a massive scale. Hydro-economic models must also find a means to facilitate construction, distribution and use. Some of these issues and model features are illustrated with a hydro-economic optimization model of the Sacramento Valley. Carry-over storage value functions are used to limit hydrologic foresight of the multi-period optimization model. Pumping costs are included in the formulation by tracking regional piezometric head of groundwater sub-basins. To help build and maintain this type of network model, an open-source water management modeling software platform is described and initial project work is discussed. The objective is to generically facilitate the connection of models, such as those developed in a modeling environment (e.g., GAMS, MatLab, Octave), to a geographic user interface (drag-and-drop node-link network) and a database (topology, parameters and time series). These features aim to incrementally move hydro-economic models in the direction of more practical implementation.

  14. A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises

    NASA Astrophysics Data System (ADS)

    Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.

    2012-04-01

    The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold, and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system, one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision support workflows.
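    A minimal, in-process sketch of the publish/detect/consume pattern described above is given below; the real system would rely on distributed message-oriented middleware, and the topic and handler names are purely illustrative.

```python
# Minimal in-process event bus illustrating the publish/detect/consume
# pattern of an event-driven architecture. Topic names, thresholds and
# handlers are illustrative only.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()

def on_seismic(event):
    # A detection component turns a raw sensor event into a higher-level one.
    if event["magnitude"] >= 7.0:
        bus.publish("tsunami.candidate", {"source": event})

def on_candidate(event):
    print("trigger simulation + notify decision support for", event["source"])

bus.subscribe("sensor.seismic", on_seismic)
bus.subscribe("tsunami.candidate", on_candidate)
bus.publish("sensor.seismic", {"magnitude": 7.4, "lat": 36.1, "lon": -5.4})
```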

  15. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
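    The appeal of the distributed shared memory paradigm is that workers simply read and write one common data structure instead of exchanging explicit messages. The sketch below shows that programming style using Python's standard shared-memory facility between local processes only; DICE itself extends the idea across workstations on a LAN, which this sketch does not attempt.

```python
# Single-machine analogue of the shared-memory programming style that a
# distributed shared memory layer aims to preserve across workstations:
# workers fill slices of one common array rather than exchanging messages.
from multiprocessing import Process, shared_memory
import numpy as np

def worker(name, lo, hi):
    shm = shared_memory.SharedMemory(name=name)
    x = np.ndarray((8,), dtype=np.float64, buffer=shm.buf)
    x[lo:hi] = np.arange(lo, hi) ** 2       # each worker fills its own slice
    shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=8 * 8)  # 8 float64 values
    data = np.ndarray((8,), dtype=np.float64, buffer=shm.buf)
    data[:] = 0.0
    ps = [Process(target=worker, args=(shm.name, 0, 4)),
          Process(target=worker, args=(shm.name, 4, 8))]
    for p in ps:
        p.start()
    for p in ps:
        p.join()
    print(data)                              # squares 0..49 written by workers
    shm.close()
    shm.unlink()
```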

  16. Photonic sensor opportunities for distributed and wireless systems in security applications

    NASA Astrophysics Data System (ADS)

    Krohn, David

    2006-10-01

    There is a broad range of homeland security sensing applications that can be facilitated by distributed fiber optic sensors and photonics-integrated wireless systems. These applications include [1]: pipelines (monitoring, security); smart structures (bridges, tunnels, dams, public spaces); power lines (monitoring, security); transportation security; chemical/biological detection; wide-area perimeter surveillance; and port security (underwater surveillance, cargo containers). Many vital assets which cover wide areas, such as pipelines and borders, are under constant threat of being attacked or breached. There is a rapidly emerging need to be able to identify intrusion threats to such vital assets. Similar problems exist for monitoring basic infrastructure such as water supplies, power utilities, communications systems and transportation. There is a need to develop a coordinated and integrated solution for the detection of threats. From a sensor standpoint, consideration must not be limited to detection, but must also address how detection leads to intervention and deterrence. Fiber optic sensor technology must be compatible with other surveillance technologies, such as wireless mote technology, to facilitate integration. In addition, the multi-functionality of fiber optic sensors must be expanded to include bio-chemical detection. There have been a number of barriers to the acceptance and broad use of smart fiber optic sensors. Compared to telecommunications, the volume is low. This fact, coupled with proprietary and custom specifications, has kept the price of fiber optic sensors high. There is a general lack of a manufacturing infrastructure and a lack of standards for packaging and reliability. Also, there are several competing technologies, some photonic-based and others based on conventional non-photonic technologies.

  17. Evaluation of blood neutrophil-lymphocyte ratio and platelet distribution width as inflammatory markers in patients with fibromyalgia.

    PubMed

    Aktürk, Semra; Büyükavcı, Raikan

    2017-08-01

    Fibromyalgia syndrome (FMS) is characterized by chronic widespread pain and systemic symptoms. The aetiology and pathogenesis of fibromyalgia are not yet fully understood. The blood neutrophil/lymphocyte ratio (NLR) is a marker of the systemic inflammatory response. Platelet distribution width (PDW) and mean platelet volume (MPV) are determinants of platelet activation and have been studied as markers in inflammatory diseases. The aim of the present study was to evaluate levels of NLR, PDW and MPV in patients with fibromyalgia. A total of 197 FMS patients and 53 healthy controls were included in the study. Demographic characteristics, erythrocyte sedimentation rate, C-reactive protein, neutrophil, lymphocyte and platelet counts, platelet distribution width and mean platelet volume levels were recorded. In the patient group, the blood NLR and MPV were significantly higher and the PDW was significantly lower compared with the control group. In the ROC curve analysis, the blood PDW cut-off had 90.4% sensitivity and 90% specificity in predicting fibromyalgia. The results of this study suggest NLR and PDW as promising inflammatory markers for fibromyalgia that may be beneficial in facilitating the diagnosis of FMS.

  18. Full-Scale Wind-Tunnel Investigation of Wing-Cooling Ducts Effects of Propeller Slipstream, Special Report

    NASA Technical Reports Server (NTRS)

    Nickle, F. R.; Freeman, Arthur B.

    1939-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  19. CMS distributed data analysis with CRAB3

    DOE PAGES

    Mascheroni, M.; Balcas, J.; Belforte, S.; ...

    2015-12-23

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB was successfully employed by an average of 350 distinct users each week, executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. Furthermore, the new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  20. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  1. Developing a Coalition Battle Management Language to Facilitate Interoperability Between Operation CIS, and Simulations in Support of Training and Mission Rehearsal

    DTIC Science & Technology

    2005-06-01

    virtualisation of distributed computing and data resources such as processing, network bandwidth, and storage capacity, to create a single system...and Simulation (M&S) will be integrated into this heterogeneous SOA. M&S functionality will be available in the form of operational M&S services. One...documents defining net centric warfare, the use of M&S functionality is a common theme. Alberts and Hayes give a good overview on net centric operations

  2. A digital library for medical imaging activities

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Furuie, Sérgio S.

    2007-03-01

    This work presents the development of an electronic infrastructure to make available a free, online, multipurpose and multimodality medical image database. The proposed infrastructure implements a distributed architecture for the medical image database, authoring tools, and a repository for multimedia documents. It also includes a peer-review model that assures the quality of the datasets. This public repository provides a single point of access for medical images and related information to facilitate retrieval tasks. The proposed approach has also been used as an electronic teaching system in Radiology.

  3. Saguaro: a distributed operating system based on pools of servers. Annual report, 1 January 1984-31 December 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, G.R.

    1986-03-03

    Prototypes of components of the Saguaro distributed operating system were implemented, and the design of the entire system was refined based on the experience. The philosophy behind Saguaro is to support the illusion of a single virtual machine while taking advantage of the concurrency and robustness that are possible in a network architecture. Within the system, these advantages are realized by the use of pools of server processes and decentralized allocation protocols. Potential concurrency and robustness are also made available to the user through low-cost mechanisms to control placement of executing commands and files, and to support semi-transparent file replication and access. Another unique aspect of Saguaro is its extensive use of a type system to describe user data such as files and to specify the types of arguments to commands and procedures. This enables the system to assist in type checking and leads to a user interface in which command-specific templates are available to facilitate command invocation. A mechanism, channels, is also provided to enable users to construct applications containing general graphs of communicating processes.

  4. The breakdown of coordinated decision making in distributed systems.

    PubMed

    Bearman, Christopher; Paletz, Susannah B F; Orasanu, Judith; Thomas, Matthew J W

    2010-04-01

    This article aims to explore the nature and resolution of breakdowns in coordinated decision making in distributed safety-critical systems. In safety-critical domains, people with different roles and responsibilities often must work together to make coordinated decisions while geographically distributed. Although there is likely to be a large degree of overlap in the shared mental models of these people on the basis of procedures and experience, subtle differences may exist. Study 1 involves using Aviation Safety Reporting System reports to explore the ways in which coordinated decision making breaks down between pilots and air traffic controllers and the way in which the breakdowns are resolved. Study 2 replicates and extends those findings with the use of transcripts from the Apollo 13 National Aeronautics and Space Administration space mission. Across both studies, breakdowns were caused in part by different types of lower-level breakdowns (or disconnects), which are labeled as operational, informational, or evaluative. Evaluative disconnects were found to be significantly harder to resolve than other types of disconnects. Considering breakdowns according to the type of disconnect involved appears to capture useful information that should assist accident and incident investigators. The current trend in aviation of shifting responsibilities and providing increasingly more information to pilots may have a hidden cost of increasing evaluative disconnects. The proposed taxonomy facilitates the investigation of breakdowns in coordinated decision making and draws attention to the importance of considering subtle differences between participants' mental models when considering complex distributed systems.

  5. Integration of Earth System Models and Workflow Management under iRODS for the Northeast Regional Earth System Modeling Project

    NASA Astrophysics Data System (ADS)

    Lengyel, F.; Yang, P.; Rosenzweig, B.; Vorosmarty, C. J.

    2012-12-01

    The Northeast Regional Earth System Model (NE-RESM, NSF Award #1049181) integrates weather research and forecasting models, terrestrial and aquatic ecosystem models, a water balance/transport model, and mesoscale and energy-system input-output economic models developed by an interdisciplinary research team from academia and government with expertise in physics, biogeochemistry, engineering, energy, economics, and policy. NE-RESM is intended to forecast the implications of planning decisions on the region's environment, ecosystem services, energy systems and economy through the 21st century. Integration of model components and the development of cyberinfrastructure for interacting with the system are facilitated with the integrated Rule Oriented Data System (iRODS), a distributed data grid that provides archival storage with metadata facilities and a rule-based workflow engine for automating and auditing scientific workflows.

  6. Secure multi-party communication with quantum key distribution managed by trusted authority

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Richard John; Nordholt, Jane Elizabeth; Peterson, Charles Glen

    Techniques and tools for implementing protocols for secure multi-party communication after quantum key distribution ("QKD") are described herein. In example implementations, a trusted authority facilitates secure communication between multiple user devices. The trusted authority distributes different quantum keys by QKD under trust relationships with different users. The trusted authority determines combination keys using the quantum keys and makes the combination keys available for distribution (e.g., for non-secret distribution over a public channel). The combination keys facilitate secure communication between two user devices even in the absence of QKD between the two user devices. With the protocols, the benefits of QKD are extended to multi-party communication scenarios. In addition, the protocols can retain the benefits of QKD even when a trusted authority is offline or a large group seeks to establish secure communication within the group.
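    A plain-arithmetic sketch of the combination-key idea is shown below, under the simplifying assumption that a combination key is the XOR of the two per-user QKD keys held by the trusted authority; the actual protocols described here include additional authentication and key-management steps.

```python
# Sketch of the combination-key idea under a simplifying assumption: the
# trusted authority holds a QKD-derived secret key with each user and
# publishes only the XOR of two such keys. Either user can then recover the
# other's key locally, giving a pairwise secret without direct QKD between
# them. Toy 16-byte keys; not the full protocol described in the record.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key_ta_alice = secrets.token_bytes(16)   # established by QKD with Alice
key_ta_bob = secrets.token_bytes(16)     # established by QKD with Bob

combination_key = xor(key_ta_alice, key_ta_bob)   # may be published openly

# Alice combines the public value with her own key to obtain Bob's key.
pairwise_at_alice = xor(combination_key, key_ta_alice)
assert pairwise_at_alice == key_ta_bob
print("shared pairwise key established:", pairwise_at_alice.hex())
```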

  7. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  8. Anisotropy Induced Switching Field Distribution in High-Density Patterned Media

    NASA Astrophysics Data System (ADS)

    Talapatra, A.; Mohanty, J.

    We present here a micromagnetic study of the variation of the switching field distribution (SFD) in high-density patterned media as a function of the magnetic anisotropy of the system. We consider the manifold effects of magnetic anisotropy in terms of its magnitude, tilt of the anisotropy axis, and random arrangements of magnetic islands with random anisotropy values. Our calculation shows that a reduction in anisotropy causes a linear decrease in coercivity, because the anisotropy energy tries to align the spins along a preferred crystallographic direction. A tilt in the anisotropy axis results in a decrease in the squareness of the hysteresis loop and hence facilitates switching. Finally, experimental realities such as the lithographic distribution of magnetic islands, their orientation, and the creation of defects demand that the anisotropy be modeled as a random distribution with random repetitions. We show that the range of anisotropy values and the number of bits with different anisotropy play a key role in the SFD, whereas the position of the bits and their repetitions do not contribute considerably.
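    The reported linear dependence of coercivity on anisotropy, and the easier switching of islands with a tilted easy axis, are consistent with the standard Stoner-Wohlfarth picture for a single-domain island; the relations below are quoted from that model with assumed notation, not taken from the paper itself.

```latex
% Stoner--Wohlfarth switching field of a single-domain island (assumed
% notation, SI units): the aligned-field switching field scales linearly
% with the uniaxial anisotropy constant K_u, and a tilt \theta of the easy
% axis relative to the applied field lowers it (minimum at 45 degrees).
\[
  H_{\mathrm{sw}}(0) \;=\; \frac{2K_{u}}{\mu_{0} M_{s}},
  \qquad
  H_{\mathrm{sw}}(\theta) \;=\; \frac{2K_{u}}{\mu_{0} M_{s}}
  \left(\cos^{2/3}\theta + \sin^{2/3}\theta\right)^{-3/2},
\]
% where M_s is the saturation magnetisation. A reduced K_u therefore gives a
% proportionally reduced switching (coercive) field, and a tilted easy axis
% facilitates switching, in line with the trends reported above.
```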

  9. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    NASA Astrophysics Data System (ADS)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data pose a grand challenge to climatologists seeking to efficiently manage and analyze them. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high-performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
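    For context, the kind of spatiotemporal subsetting-and-aggregation query such a framework serves can be written against the stock PySpark DataFrame/SQL API as sketched below; ClimateSpark's ClimateRDD, chunked layout, and spatiotemporal index are custom extensions, and the column names and file path here are invented.

```python
# Sketch of a spatiotemporal subsetting + aggregation query using plain
# PySpark. Column names and the Parquet path are hypothetical; ClimateSpark's
# own data model and index are not reproduced here.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("climate-demo").getOrCreate()

df = spark.read.parquet("/data/climate/tas_chunks.parquet")  # hypothetical path

monthly_mean = (
    df.where((F.col("time") >= "2000-01-01") & (F.col("time") < "2010-01-01"))
      .where(F.col("lat").between(30, 60) & F.col("lon").between(-130, -60))
      .groupBy(F.year("time").alias("year"), F.month("time").alias("month"))
      .agg(F.avg("tas").alias("mean_tas"))
      .orderBy("year", "month")
)
monthly_mean.show(12)
```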

  10. Design and Implementation of Replicated Object Layer

    NASA Technical Reports Server (NTRS)

    Koka, Sudhir

    1996-01-01

    One of the widely used techniques for construction of fault-tolerant applications is the replication of resources, so that if one copy fails, sufficient copies may still remain operational to allow the application to continue to function. This thesis involves the design and implementation of an object-oriented framework for replicating data on multiple sites and across different platforms. Our approach, called the Replicated Object Layer (ROL), provides a mechanism for consistent replication of data over dynamic networks. ROL uses the Reliable Multicast Protocol (RMP) as a communication protocol that provides for reliable delivery, serialization and fault tolerance. Besides providing type registration, this layer facilitates distributed atomic transactions on replicated data. A novel algorithm called the RMP Commit Protocol, which commits transactions efficiently in a reliable multicast environment, is presented. ROL provides recovery procedures to ensure that site and communication failures do not corrupt persistent data, and makes the system fault tolerant to network partitions. ROL will facilitate building distributed fault-tolerant applications by handling the burdensome details of replica consistency operations and making them completely transparent to the application. Replicated databases are a major class of applications that could be built on top of ROL.

  11. Datasets on demographic trends in enrollment into undergraduate engineering programs at Covenant University, Nigeria.

    PubMed

    Popoola, Segun I; Atayero, Aderemi A; Badejo, Joke A; Odukoya, Jonathan A; Omole, David O; Ajayi, Priscilla

    2018-06-01

    In this data article, we present and analyze the demographic data of undergraduates admitted into engineering programs at Covenant University, Nigeria. The population distribution of 2649 candidates admitted into Chemical Engineering, Civil Engineering, Computer Engineering, Electrical and Electronics Engineering, Information and Communication Engineering, Mechanical Engineering, and Petroleum Engineering programs between 2002 and 2009 is analyzed by gender, age, and state of origin. The data provided in this data article were retrieved from the student bio-data submitted to the Department of Admissions and Student Records (DASR) and the Center for Systems and Information Services (CSIS) by the candidates during the application process into the various engineering undergraduate programs. This vital information is made publicly available, after proper data anonymization, to facilitate empirical research in the emerging field of demographic analytics in higher education. A Microsoft Excel spreadsheet file is attached to this data article and the data are thoroughly described for easy reuse. Descriptive statistics and frequency distributions of the demographic data are presented in tables, plots, graphs, and charts. Unrestricted access to these demographic data will facilitate reliable and evidence-based research findings for sustainable education in developing countries.

  12. Dynamic shared state maintenance in distributed virtual environments

    NASA Astrophysics Data System (ADS)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for sensor-based distributed VE that has the potential to improve the system real-time behavior and scalability. (Abstract shortened by UMI.)
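    One common quaternion-based way to quantify how far two sites' copies of a shared object have drifted apart is the angle of the relative rotation between their orientation quaternions, sketched below; the paper's exact assessment metric may differ.

```python
# One common quaternion-based measure of orientation discrepancy between two
# sites' copies of a shared virtual object: the angle of the relative
# rotation. Quaternions are unit [w, x, y, z] arrays; the paper's exact
# metric may differ from this illustration.
import numpy as np

def orientation_error_deg(q_site_a, q_site_b):
    qa = q_site_a / np.linalg.norm(q_site_a)
    qb = q_site_b / np.linalg.norm(q_site_b)
    # |dot| handles the double cover (q and -q represent the same rotation).
    dot = abs(float(np.dot(qa, qb)))
    return np.degrees(2.0 * np.arccos(np.clip(dot, -1.0, 1.0)))

q_local = np.array([0.9239, 0.0, 0.3827, 0.0])   # 45 deg rotation about y
q_remote = np.array([0.9659, 0.0, 0.2588, 0.0])  # 30 deg rotation about y
print(f"orientation divergence: {orientation_error_deg(q_local, q_remote):.1f} deg")
```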

  13. Knowledge-base browsing: an application of hybrid distributed/local connectionist networks

    NASA Astrophysics Data System (ADS)

    Samad, Tariq; Israel, Peggy

    1990-08-01

    We describe a knowledge-base browser based on a connectionist (or neural network) architecture that employs both distributed and local representations. The distributed representations are used for input and output, thereby enabling associative, noise-tolerant interaction with the environment. Internally, all representations are fully local. This simplifies weight assignment and facilitates network configuration for specific applications. In our browser, concepts and relations in a knowledge base are represented using "microfeatures." The microfeatures can encode semantic attributes, structural features, contextual information, etc. Desired portions of the knowledge base can then be associatively retrieved based on a structured cue. An ordered list of partial matches is presented to the user for selection. Microfeatures can also be used as "bookmarks": they can be placed dynamically at appropriate points in the knowledge base and subsequently used as retrieval cues. A proof-of-concept system has been implemented for an internally developed, Honeywell-proprietary knowledge acquisition tool.

  14. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1993-01-01

    One of the biggest problems facing NASA today is to provide scientists efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed datasets and directories. VIEWCACHE allows database browsing and searching, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing Astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers pointing to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image datasets, which provide much easier query formulation by referring directly to the image and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate database search.

  15. Hydrographic Data Curation and Stewardship: GO-SHIP

    NASA Astrophysics Data System (ADS)

    Stephen, Diggs; Lynne, Talley; Martin, Kramp; Bernadette, Sloyan

    2014-05-01

    Expert data management (access, formats, data life-cycle) facilitates the successful re-use of information that addresses many important scientific questions, such as detecting decadal and longer-term changes in global ocean heat and freshwater content. Modern hydrographic data management has its origins in the WOCE program, where new and existing distributed resources were identified and organized into an effective "super DAC". Data from this program are referenced in hundreds of scientific papers. The distributed hydrographic data system, now under the name GO-SHIP, exists today and has adapted to the new geoscience demands of the 21st century. This presentation will describe the science drivers and the required data center resources (CCHDO, CDIAC, JCOMMOPS) which together provide reliable access for the global research community.

  16. Flow field and dissolved oxygen distributions in the outer channel of the Orbal oxidation ditch by monitor and CFD simulation.

    PubMed

    Guo, Xuesong; Zhou, Xin; Chen, Qiuwen; Liu, Junxin

    2013-04-01

    In the Orbal oxidation ditch, denitrification is primarily accomplished in the outer channel. However, the detailed characteristics of the flow field and dissolved oxygen (DO) distribution in the outer channel are not well understood. Therefore, in this study, the flow velocity and DO concentration in the outer channel of an Orbal oxidation ditch system in a wastewater treatment plant in Beijing (China) were monitored under actual operating conditions. The flow field and DO concentration distributions were analyzed by computational fluid dynamics (CFD) modeling. In situ monitoring and modeling both showed that the flow velocity was heterogeneous in the outer channel. As a result, the DO was also heterogeneously distributed in the outer channel, with concentration gradients occurring along the flow direction as well as in the cross-section. This heterogeneous DO distribution created many anoxic and aerobic zones, which may have facilitated simultaneous nitrification-denitrification in the channel. These findings may provide supporting information for rational optimization of the performance of the Orbal oxidation ditch.

  17. Mapping the potential distribution of the invasive Red Shiner, Cyprinella lutrensis (Teleostei: Cyprinidae) across waterways of the conterminous United States

    USGS Publications Warehouse

    Poulos, Helen M.; Chernoff, Barry; Fuller, Pam L.; Butman, David

    2012-01-01

    Predicting the future spread of non-native aquatic species continues to be a high priority for natural resource managers striving to maintain biodiversity and ecosystem function. Modeling the potential distributions of alien aquatic species through spatially explicit mapping is an increasingly important tool for risk assessment and prediction. Habitat modeling also facilitates the identification of key environmental variables influencing species distributions. We modeled the potential distribution of an aggressive invasive minnow, the red shiner (Cyprinella lutrensis), in waterways of the conterminous United States using maximum entropy (Maxent). We used inventory records from the USGS Nonindigenous Aquatic Species Database, native records for C. lutrensis from museum collections, and a geographic information system of 20 raster climatic and environmental variables to produce a map of potential red shiner habitat. Summer climatic variables were the most important environmental predictors of C. lutrensis distribution, which was consistent with the high temperature tolerance of this species. Results from this study provide insights into the locations and environmental conditions in the US that are susceptible to red shiner invasion.

  18. The use of electronic health records in Spanish hospitals.

    PubMed

    Marca, Guillem; Perez, Angel; Blanco-Garcia, Martin German; Miravalles, Elena; Soley, Pere; Ortiga, Berta

    The aims of this study were to describe the level of adoption of electronic health records (EHRs) in Spanish hospitals and to identify potential barriers and facilitators to this process. We used an observational cross-sectional design. The survey was conducted between September and December 2011, using an electronic questionnaire distributed through email. We obtained a 30% response rate from the 214 hospitals contacted, all belonging to the Spanish National Health Service. The level of adoption of electronic health records in Spanish hospitals was found to be high: 39.1% of hospitals surveyed had a comprehensive EHR system, while a basic system was functioning in 32.8% of the cases. However, in 2011 one third of the hospitals did not have a basic electronic health record system, although some have since implemented electronic functionalities, particularly those related to clinical documentation and patient administration. Respondents cited acquisition and implementation costs as the main barriers to implementation. Facilitators of EHR implementation were: the possibility of hiring technical support, both during and after implementation; security certification warranty; and objective third-party evaluations of EHR products. In conclusion, the proportion of hospitals with electronic health records is in general high, and is relatively higher in medium-sized hospitals.

  19. Large fluctuations of the macroscopic current in diffusive systems: a numerical test of the additivity principle.

    PubMed

    Hurtado, Pablo I; Garrido, Pedro L

    2010-04-01

    Most systems, when pushed out of equilibrium, respond by building up currents of locally conserved observables. Understanding how microscopic dynamics determines the averages and fluctuations of these currents is one of the main open problems in nonequilibrium statistical physics. The additivity principle is a theoretical proposal that allows one to compute the current distribution in many one-dimensional nonequilibrium systems. Using simulations, we validate this conjecture in a simple and general model of energy transport, both in the presence of a temperature gradient and in canonical equilibrium. In particular, we show that the current distribution displays a Gaussian regime for small current fluctuations, as prescribed by the central limit theorem, and non-Gaussian (exponential) tails for large current deviations, obeying in all cases the Gallavotti-Cohen fluctuation theorem. In order to facilitate a given current fluctuation, the system adopts a well-defined temperature profile different from that of the steady state and in accordance with the predictions of the additivity hypothesis. The system statistics during a large current fluctuation are independent of the sign of the current, which implies that the optimal profile (as well as higher-order profiles and spatial correlations) is invariant upon current inversion. We also demonstrate that finite-time joint fluctuations of the current and the profile are well described by the additivity functional. These results suggest the additivity hypothesis as a general and powerful tool to compute current distributions in many nonequilibrium systems.
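    For reference, the additivity principle is usually stated as a variational formula for the current large-deviation function of a one-dimensional diffusive system; the schematic Bodineau-Derrida form is given below with assumed notation and is not reproduced from the paper.

```latex
% Schematic additivity-principle formula for a 1D diffusive system with
% diffusivity D(\rho) and mobility \sigma(\rho) (assumed notation). The
% large-deviation function of the time-averaged current q follows from an
% optimal time-independent density (or temperature) profile:
\[
  \mathcal{G}(q) \;=\; \min_{\rho(x)}
  \int_{0}^{1}
  \frac{\bigl[\,q + D\!\bigl(\rho(x)\bigr)\,\rho'(x)\,\bigr]^{2}}
       {2\,\sigma\!\bigl(\rho(x)\bigr)}\;dx ,
\]
% with \rho(0) and \rho(1) fixed by the boundary reservoirs. Expanding the
% square, the cross term integrates to a boundary contribution, so the
% minimizing profile depends on q only through q^2 and is therefore
% invariant under current inversion, consistent with the observation above.
```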

  20. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of the hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort and reduce opportunities for error. Processes automated so far include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
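    A minimal sketch of the modular composition idea is given below: each stage of the hydrological cycle is an interchangeable function, and a model is an ordered composition of the selected options. The module names, parameters, and toy process equations are invented for illustration and are not the package's actual components.

```python
# Minimal sketch of a modular hydrologic model: swap any stage by passing a
# different function. All equations and parameter values are illustrative toys.
def pet_simple(temp_c, daylight_hours):
    """Toy potential-evapotranspiration option (mm/day)."""
    return max(0.0, 0.2 * temp_c) * daylight_hours / 12.0

def snow_degree_day(precip, temp_c, pack, melt_rate=3.0):
    """Partition precipitation into rain/snow and melt the pack (mm/day)."""
    snowfall = precip if temp_c < 0.0 else 0.0
    melt = min(pack + snowfall, melt_rate * max(temp_c, 0.0))
    return precip - snowfall + melt, pack + snowfall - melt

def soil_bucket(water_in, pet, storage, capacity=150.0):
    """Single-bucket soil-moisture accounting with saturation-excess runoff."""
    storage += water_in
    runoff = max(0.0, storage - capacity)
    storage = min(storage, capacity)
    storage -= min(storage, pet)             # actual ET limited by storage
    return runoff, storage

def run_model(forcing, modules):
    pack, storage, flows = 0.0, 140.0, []
    for precip, temp_c, daylight in forcing:
        pet = modules["pet"](temp_c, daylight)
        rain_plus_melt, pack = modules["snow"](precip, temp_c, pack)
        runoff, storage = modules["soil"](rain_plus_melt, pet, storage)
        flows.append(round(runoff, 2))
    return flows

forcing = [(15.0, -2.0, 9.0), (20.0, 4.0, 10.0), (0.0, 8.0, 11.0)]
print(run_model(forcing, {"pet": pet_simple,
                          "snow": snow_degree_day,
                          "soil": soil_bucket}))
```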

  1. Gene Therapy for Neurologic Manifestations of Mucopolysaccharidoses

    PubMed Central

    Wolf, Daniel A.; Banerjee, Sharbani; Hackett, Perry B.; Whitley, Chester B.; McIvor, R. Scott; Low, Walter C.

    2015-01-01

    Introduction: Mucopolysaccharidoses are a family of lysosomal disorders caused by mutations in genes that encode enzymes involved in the catabolism of glycosaminoglycans. These mutations affect multiple organ systems and can be particularly deleterious to the nervous system. At the present time, enzyme replacement therapy and hematopoietic stem-cell therapy are used to treat patients with different forms of these disorders. However, to a great extent the nervous system is not adequately responsive to current therapeutic approaches. Areas Covered: Recent advances in gene therapy show great promise for treating mucopolysaccharidoses. This article reviews the current state of the art for routes of delivery in developing genetic therapies for treating the neurologic manifestations of mucopolysaccharidoses. Expert Opinion: Gene therapy for treating neurological manifestations of mucopolysaccharidoses can be achieved by intraventricular, intrathecal, intranasal, and systemic administration. The intraventricular route of administration appears to provide the most widespread distribution of gene therapy vectors to the brain. The intrathecal route of delivery results in predominant distribution to the caudal areas of the brain, while the intranasal route of delivery results in good distribution to the rostral areas of the brain. The systemic route of delivery via intravenous administration can also achieve widespread delivery to the CNS; however, the distribution to the brain is greatly dependent on the vector system. Intravenous delivery using lentiviral vectors appears to be less effective than adeno-associated viral (AAV) vectors. Moreover, some subtypes of AAV vectors are more effective than others in crossing the blood-brain barrier. In summary, the recent advances in gene vector technology and routes of delivery to the CNS will facilitate the clinical translation of gene therapy for the treatment of the neurological manifestations of mucopolysaccharidoses. PMID:25510418

  2. A Multiprocessor Operating System Simulator

    NASA Technical Reports Server (NTRS)

    Johnston, Gary M.; Campbell, Roy H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall semester of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the 'Choices' family of operating systems for loosely- and tightly-coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.
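    A compact analogue of such a class hierarchy is sketched below in Python rather than the original C++: a base component class, a round-robin scheduler policy that a homework assignment might subclass, and a processor that runs coroutine-style tasks. All names are invented, and the original simulator is considerably richer.

```python
# Compact, invented analogue of an operating-system-simulator class hierarchy:
# students could subclass Scheduler to experiment with different policies.
from collections import deque

class SystemComponent:
    """Root of the simulated hardware/software class hierarchy."""
    def __init__(self, name):
        self.name = name

class Task(SystemComponent):
    def __init__(self, name, work):
        super().__init__(name)
        self.work = work                     # generator acts as a coroutine-style task

class Scheduler(SystemComponent):
    """Round-robin policy; subclass and override next() for other policies."""
    def __init__(self):
        super().__init__("rr-scheduler")
        self.ready = deque()
    def add(self, task):
        self.ready.append(task)
    def next(self):
        return self.ready.popleft() if self.ready else None

class Processor(SystemComponent):
    def __init__(self, name, scheduler):
        super().__init__(name)
        self.scheduler = scheduler
    def run(self):
        while (task := self.scheduler.next()) is not None:
            try:
                print(self.name, "runs", task.name, "->", next(task.work))
                self.scheduler.add(task)     # not finished: requeue
            except StopIteration:
                print(self.name, ":", task.name, "finished")

def counting(n):
    for i in range(n):
        yield i

sched = Scheduler()
sched.add(Task("t1", counting(2)))
sched.add(Task("t2", counting(3)))
Processor("cpu0", sched).run()
```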

  3. Distributed Leadership, Teacher Morale, and Teacher Enthusiasm: Unravelling the Leadership Pathways to School Success

    ERIC Educational Resources Information Center

    Sheppard, Bruce; Hurley, Noel; Dibbon, David

    2010-01-01

    The study reported in this paper advances the understanding of distributed leadership in schools, the role of the school principal in the facilitation of distributed leadership and its impact upon teachers' morale and enthusiasm for their work. While both the empirical base and practical application of distributed leadership has grown phenomenally…

  4. Intraparenchymal ultrasound application and improved distribution of infusate with convection-enhanced delivery in rodent and nonhuman primate brain.

    PubMed

    Mano, Yui; Saito, Ryuta; Haga, Yoichi; Matsunaga, Tadao; Zhang, Rong; Chonan, Masashi; Haryu, Shinya; Shoji, Takuhiro; Sato, Aya; Sonoda, Yukihiko; Tsuruoka, Noriko; Nishiyachi, Keisuke; Sumiyoshi, Akira; Nonaka, Hiroi; Kawashima, Ryuta; Tominaga, Teiji

    2016-05-01

    OBJECT Convection-enhanced delivery (CED) is an effective drug delivery method that delivers high concentrations of drugs directly into the targeted lesion beyond the blood-brain barrier. However, the drug distribution attained using CED has not satisfactorily covered the entire targeted lesion in tumors such as glioma. Recently, the efficacy of ultrasound assistance was reported for various drug delivery applications. The authors developed a new ultrasound-facilitated drug delivery (UFD) system that enables the application of ultrasound at the infusion site. The purpose of this study was to demonstrate the efficacy of the UFD system and to examine effective ultrasound profiles. METHODS The authors fabricated a steel bar-based device that generates ultrasound and enables infusion of the aqueous drug from one end of the bar. The volume of distribution (Vd) after infusion of 10 μl of 2% Evans blue dye (EBD) into rodent brain was tested with different frequencies and applied voltages: 252 kHz/30 V; 252 kHz/60 V; 524 kHz/13 V; 524 kHz/30 V; and 524 kHz/60 V. In addition, infusion of 5 mM gadopentetate dimeglumine (Gd-DTPA) was tested with 260 kHz/60 V, the distribution of which was evaluated using a 7-T MRI unit. In a nonhuman primate (Macaca fascicularis) study, 300 μl of 1 mM Gd-DTPA/EBD was infused. The final distribution was evaluated using MRI. Two-sample comparisons were made by the Student t-test, and 1-way ANOVA was used for multiple comparisons. Significance was set at p < 0.05. RESULTS After infusion of 10 μl of EBD into the rat brain using the UFD system, the Vds of EBD in the UFD groups were significantly larger than those of the control group. When a frequency of 252 kHz was applied, the Vd of the group in which 60 V was applied was significantly larger than that of the group in which 30 V was used. When a frequency of 524 kHz was applied, the Vd tended to increase with application of a higher voltage; however, the differences were not significant (1-way ANOVA). The Vd of Gd-DTPA was also significantly larger in the UFD group than in the control group (p < 0.05, Student t-test). The volume of Gd-DTPA in the nonhuman primate used in this study was 1209.8 ± 193.6 mm³. This volume was much larger than that achieved by conventional CED (568.6 ± 141.0 mm³). CONCLUSIONS The UFD system facilitated the distribution of EBD and Gd-DTPA more effectively than conventional CED. Lower frequency and higher applied voltage using resonance frequencies might be more effective at enlarging the Vd. The UFD system may provide a new treatment approach for CNS disorders.

  5. States of Cybersecurity: Electricity Distribution System Discussions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pena, Ivonne; Ingram, Michael; Martin, Maurice

    State and local entities that oversee the reliable, affordable provision of electricity are faced with growing and evolving threats from cybersecurity risks to our nation's electricity distribution system. All-hazards system resilience is a shared responsibility among electric utilities and their regulators or policy-setting boards of directors. Cybersecurity presents new challenges and should be a focus for states, local governments, and Native American tribes that are developing energy-assurance plans to protect critical infrastructure. This research sought to investigate the implementation of governance and policy at the distribution utility level that facilitates cybersecurity preparedness to inform the U.S. Department of Energy (DOE), Office of Energy Policy and Systems Analysis; states; local governments; and other stakeholders on the challenges, gaps, and opportunities that may exist for future analysis. The need is urgent to identify the challenges and inconsistencies in how cybersecurity practices are being applied across the United States to inform the development of best practices, mitigations, and future research and development investments in securing the electricity infrastructure. By examining the current practices and applications of cybersecurity preparedness, this report seeks to identify the challenges and persistent gaps between policy and execution and reflect the underlying motivations of distinct utility structures as they play out at the local level. This study aims to create an initial baseline of cybersecurity preparedness within the distribution electricity sector. The focus of this study is on distribution utilities not bound by the cybersecurity guidelines of the North American Electric Reliability Corporation (NERC) to examine the range of mechanisms taken by state regulators, city councils that own municipal utilities, and boards of directors of rural cooperatives.

  6. A Health Systems Approach to Integrated Community Case Management of Childhood Illness: Methods and Tools

    PubMed Central

    McGorman, Laura; Marsh, David R.; Guenther, Tanya; Gilroy, Kate; Barat, Lawrence M.; Hammamy, Diaa; Wansi, Emmanuel; Peterson, Stefan; Hamer, Davidson H.; George, Asha

    2012-01-01

    Integrated community case management (iCCM) of childhood illness is an increasingly popular strategy to expand life-saving health services to underserved communities. However, community health approaches vary widely across countries and do not always distribute resources evenly across local health systems. We present a harmonized framework, developed through interagency consultation and review, which supports the design of CCM by using a systems approach. To verify that the framework produces results, we also suggest a list of complementary indicators, including nine global metrics, and a menu of 39 country-specific measures. When used by program managers and evaluators, we propose that the framework and indicators can facilitate the design, implementation, and evaluation of community case management. PMID:23136280

  7. Projection systems with a cut-off line for automotive applications

    NASA Astrophysics Data System (ADS)

    Kloos, G.; Eichhorn, K.

    2005-08-01

    The lighting systems of a car provide a variety of challenges from the point of view of illumination science and technology. Engineering work in this field has to deal with reflector and lens design as well as with opto-mechanical design and sensor technology. It has direct implications for traffic safety and the efficiency with which energy is used. Therefore, these systems are continuously improved and optimized. In this context, adaptive systems that we investigate for automotive applications gain increasing importance. The properties of the light distribution in the vicinity of the cut-off line are of key importance for the safe and efficient operation of automotive headlamps. An alternative approach is proposed to refine the description of these properties in an attempt to make it more quantitative. This description is intended to facilitate intercomparison between different systems and/or to study environmental influences on the cut-off line of a system under investigation. When designing projection systems, it is necessary to take into account a delicate trade-off between efficiency, light-distribution characteristics, mechanical boundary conditions, and legal requirements. Considerations and results on the optical properties of three-axial reflectors as a function of layout parameters will be given. They can serve as a guideline for the optical workshop and for free-form optimization.

  8. Sensitivity Analysis of Data Link Alternatives for LVLASO

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1998-01-01

    As part of this research, we have modeled the Mode-S system when used to enhance communications among several ground vehicles to facilitate low-visibility landing and surface operations. The model has then been simulated using Bones Designer software. The effectiveness of the model has been evaluated under several conditions: (i) different numbers of vehicles (100, 200, and 300), (ii) different distributions of interarrival times for squitters: uniform, exponential, and constrained exponential, and (iii) different safe distances (for collision purposes): squitter length, 1.5*squitter length, and 2*squitter length. The model has been developed in a modular fashion to facilitate any future modifications. The results from the simulations suggest that the Mode-S system is indeed capable of functioning satisfactorily even when covering up to 300 vehicles. Certainly, about 10 percent of the squitters undergo collisions, and hence the interarrival times for these are much larger than the expected time of 500 msec. In fact, the delay could be as much as 2 seconds. The model could be further enhanced to incorporate more realistic scenarios.
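
    As a rough illustration of the kind of channel model described above, the sketch below runs a minimal Monte Carlo of squitter collisions; the squitter duration, mean interarrival time, guard-window factor, and simulation horizon are illustrative assumptions and not values taken from the report.

    ```python
    """Illustrative Monte Carlo of squitter collisions on a shared channel.

    A simplified sketch of the kind of model described in the record above; the
    squitter duration, mean interarrival time, and guard window are assumptions,
    not values from the study."""
    import random

    def simulate(num_vehicles=300, mean_interarrival=1.0, duration=120e-6,
                 guard_factor=1.0, horizon=60.0, seed=1):
        random.seed(seed)
        # Generate emission times for every vehicle (exponential interarrivals).
        events = []
        for _ in range(num_vehicles):
            t = random.expovariate(1.0 / mean_interarrival)
            while t < horizon:
                events.append(t)
                t += random.expovariate(1.0 / mean_interarrival)
        events.sort()
        # A squitter "collides" if another emission starts within the guard window.
        window = guard_factor * duration
        collided = 0
        for i, t in enumerate(events):
            prev_close = i > 0 and t - events[i - 1] < window
            next_close = i + 1 < len(events) and events[i + 1] - t < window
            collided += prev_close or next_close
        return collided / len(events)

    for n in (100, 200, 300):
        print(n, "vehicles -> collision fraction ~", round(simulate(num_vehicles=n), 4))
    ```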

  9. Facilitating Children’s Ability to Distinguish Symbols for Emotions: The Effects of Background Color Cues and Spatial Arrangement of Symbols on Accuracy and Speed of Search

    PubMed Central

    Wilkinson, Krista M.; Snell, Julie

    2012-01-01

    Purpose Communication about feelings is a core element of human interaction. Aided augmentative and alternative communication systems must therefore include symbols representing these concepts. The symbols must be readily distinguishable in order for users to communicate effectively. However, emotions are represented within most systems by schematic faces in which subtle distinctions are difficult to represent. We examined whether background color cuing and spatial arrangement might help children identify symbols for different emotions. Methods Thirty nondisabled children searched for symbols representing emotions within an 8-choice array. On some trials, a color cue signaled the valence of the emotion (positive vs. negative). Additionally, symbols were either organized with the negatively-valenced symbols at the top and the positive symbols on the bottom of the display, or the symbols were distributed randomly throughout. Dependent variables were accuracy and speed of responses. Results The speed with which children could locate a target was significantly faster for displays in which symbols were clustered by valence, but only when the symbols had white backgrounds. Addition of a background color cue did not facilitate responses. Conclusions Rapid search was facilitated by a spatial organization cue, but not by the addition of background color. Further examination of the situations in which color cues may be useful is warranted. PMID:21813821

  10. Facilitating children's ability to distinguish symbols for emotions: the effects of background color cues and spatial arrangement of symbols on accuracy and speed of search.

    PubMed

    Wilkinson, Krista M; Snell, Julie

    2011-11-01

    Communication about feelings is a core element of human interaction. Aided augmentative and alternative communication systems must therefore include symbols representing these concepts. The symbols must be readily distinguishable in order for users to communicate effectively. However, emotions are represented within most systems by schematic faces in which subtle distinctions are difficult to represent. We examined whether background color cuing and spatial arrangement might help children identify symbols for different emotions. Thirty nondisabled children searched for symbols representing emotions within an 8-choice array. On some trials, a color cue signaled the valence of the emotion (positive vs. negative). Additionally, the symbols were either (a) organized with the negatively valenced symbols at the top and the positive symbols on the bottom of the display or (b) distributed randomly throughout. Dependent variables were accuracy and speed of responses. The speed with which children could locate a target was significantly faster for displays in which symbols were clustered by valence, but only when the symbols had white backgrounds. Addition of a background color cue did not facilitate responses. Rapid search was facilitated by a spatial organization cue, but not by the addition of background color. Further examination of the situations in which color cues may be useful is warranted.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordetsky, A; Dougan, A D; Nekoogar, F

    The paper addresses technological and operational challenges of developing a global plug-and-play Maritime Domain Security testbed for the Global War on Terrorism mission. This joint NPS-LLNL project is based on the NPS Tactical Network Topology (TNT) composed of long-haul OFDM networks combined with self-forming wireless mesh links to air, surface, ground, and underwater unmanned vehicles. This long-haul network is combined with ultra-wideband (UWB) communications systems for wireless communications in harsh radio propagation channels. LLNL's UWB communication prototypes are designed to overcome shortcomings of the present narrowband communications systems in heavy metallic and constricted corridors inside ships. In the center of our discussion are networking solutions for the Maritime Interdiction Operation (MIO) Experiments in which geographically distributed command centers and subject matter experts collaborate with the Boarding Party in real time to facilitate situational understanding and course of action selection. The most recent experiment conducted via the testbed extension to the Alameda Island exercised several key technologies aimed at improving MIO. These technologies included UWB communications from within the ship to Boarding Party leader sending data files and pictures, advanced radiation detection equipment for search and identification, biometric equipment to record and send fingerprint files to facilitate rapid positive identification of crew members, and the latest updates of the NPS Tactical Network Topology facilitating reachback to LLNL, Biometric Fusion Center, USCG, and DTRA experts.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A.; El Hariri, Mohamad; Elsayed, Ahmed T.

    The smart grid is seen as a power system with real-time communication and control capabilities between the consumer and the utility. This modern platform facilitates the optimization of energy usage based on several factors including environmental conditions, price preferences, and system technical issues. In this paper a real-time energy management system (EMS) for microgrids or nanogrids was developed. The developed system involves an online optimization scheme to adapt its parameters based on previous, current, and forecasted future system states. The communication requirements for all EMS modules were analyzed and are all integrated over a data distribution service (DDS) Ethernet network with appropriate quality of service (QoS) profiles. In conclusion, the developed EMS was emulated with actual residential energy consumption and irradiance data from Miami, Florida and proved its effectiveness in reducing consumers’ bills and achieving flat peak load profiles.

  13. System Engineering Analysis For Improved Scout Business Information Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Slyke, D. A.

    The project uses system engineering principles to address the need of Boy Scout leaders for an integrated system to facilitate advancement and awards records, leader training, and planning for meetings and activities. Existing products addressing the needs of Scout leaders and relevant stakeholders support record keeping and some communication functions, but an opportunity exists for a better system to fully integrate these functions with training delivery and recording, activity planning, and feedback and information gathering from stakeholders. Key stakeholders for the system include Scouts and their families, leaders, training providers, sellers of supplies and awards, content generators, and facilities that serve Scout activities. Key performance parameters for the system are protection of personal information, availability of current information, information accuracy, and information content that has depth. Implementation concepts considered for the system include (1) owned and operated by the Boy Scouts of America, (2) contracted out to a vendor, and (3) a distributed system that functions with BSA-managed interfaces. The selected concept is to contract out to a vendor to maximize the likelihood of successful integration and take advantage of the best technology. Development of requirements considers three key use cases: (1) the system facilitates planning a hike, with required training satisfied in advance and advancement recorded in real time; (2) scheduling and documenting in-person training; and (3) a family interested in Scouting receives information and can request follow-up. Non-functional requirements are analyzed with the Quality Function Deployment tool. Requirements addressing frequency of backup, compatibility with legacy and new technology, language support, and software updates are developed to address system reliability and an intuitive interface. System functions analyzed include update of the activity database, maintenance of advancement status, archiving of documents, and monitoring of accessible content. The study examines risks associated with information security, technological change, and the continued popularity of Scouting. Mitigation is based on the system functions that are defined. The approach to developing an improved system for facilitating Boy Scout leader functions was iterative, with insights into capabilities coming in the course of working through the use cases and sequence diagrams.

  14. Three-dimensional evidence network plot system: covariate imbalances and effects in network meta-analysis explored using a new software tool.

    PubMed

    Batson, Sarah; Score, Robert; Sutton, Alex J

    2017-06-01

    The aim of the study was to develop the three-dimensional (3D) evidence network plot system, a novel web-based interactive 3D tool to facilitate the visualization and exploration of covariate distributions and imbalances across evidence networks for network meta-analysis (NMA). We developed the 3D evidence network plot system within an AngularJS environment using a third party JavaScript library (Three.js) to create the 3D element of the application. Data used to enable the creation of the 3D element for a particular topic are inputted via a Microsoft Excel template spreadsheet that has been specifically formatted to hold these data. We display and discuss the findings of applying the tool to two NMA examples considering multiple covariates. These two examples have been previously identified as having potentially important covariate effects and allow us to document the various features of the tool while illustrating how it can be used. The 3D evidence network plot system provides an immediate, intuitive, and accessible way to assess the similarity and differences between the values of covariates for individual studies within and between each treatment contrast in an evidence network. In this way, differences between the studies, which may invalidate the usual assumptions of an NMA, can be identified for further scrutiny. Hence, the tool facilitates NMA feasibility/validity assessments and aids in the interpretation of NMA results. The 3D evidence network plot system is the first tool designed specifically to visualize covariate distributions and imbalances across evidence networks in 3D. This will be of primary interest to systematic review and meta-analysis researchers and, more generally, those assessing the validity and robustness of an NMA to inform reimbursement decisions. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Neurobiologically realistic determinants of self-organized criticality in networks of spiking neurons.

    PubMed

    Rubinov, Mikail; Sporns, Olaf; Thivierge, Jean-Philippe; Breakspear, Michael

    2011-06-01

    Self-organized criticality refers to the spontaneous emergence of self-similar dynamics in complex systems poised between order and randomness. The presence of self-organized critical dynamics in the brain is theoretically appealing and is supported by recent neurophysiological studies. Despite this, the neurobiological determinants of these dynamics have not been previously sought. Here, we systematically examined the influence of such determinants in hierarchically modular networks of leaky integrate-and-fire neurons with spike-timing-dependent synaptic plasticity and axonal conduction delays. We characterized emergent dynamics in our networks by distributions of active neuronal ensemble modules (neuronal avalanches) and rigorously assessed these distributions for power-law scaling. We found that spike-timing-dependent synaptic plasticity enabled a rapid phase transition from random subcritical dynamics to ordered supercritical dynamics. Importantly, modular connectivity and low wiring cost broadened this transition, and enabled a regime indicative of self-organized criticality. The regime only occurred when modular connectivity, low wiring cost and synaptic plasticity were simultaneously present, and the regime was most evident when between-module connection density scaled as a power-law. The regime was robust to variations in other neurobiologically relevant parameters and favored systems with low external drive and strong internal interactions. Increases in system size and connectivity facilitated internal interactions, permitting reductions in external drive and facilitating convergence of postsynaptic-response magnitude and synaptic-plasticity learning rate parameter values towards neurobiologically realistic levels. We hence infer a novel association between self-organized critical neuronal dynamics and several neurobiologically realistic features of structural connectivity. The central role of these features in our model may reflect their importance for neuronal information processing.
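
    The power-law assessment mentioned above can be illustrated with a minimal maximum-likelihood estimate of the avalanche-size exponent (a continuous, Clauset-style approximation on synthetic data); this is only a sketch of the idea, not the rigorous scaling analysis used in the study.

    ```python
    """Sketch: estimate a power-law exponent for avalanche sizes by maximum
    likelihood (continuous approximation). Purely illustrative synthetic data."""
    import numpy as np

    rng = np.random.default_rng(0)

    def mle_powerlaw_exponent(sizes, xmin=1.0):
        """alpha_hat = 1 + n / sum(ln(x / xmin)) for x >= xmin."""
        x = np.asarray(sizes, dtype=float)
        x = x[x >= xmin]
        return 1.0 + x.size / np.sum(np.log(x / xmin))

    # Synthetic avalanche sizes drawn from a power law with exponent 1.5
    # (the canonical critical-branching value), via inverse-transform sampling.
    alpha_true, xmin = 1.5, 1.0
    u = rng.random(20000)
    sizes = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

    print("estimated exponent:", round(mle_powerlaw_exponent(sizes, xmin), 3))
    ```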

  16. Parallel computation and the Basis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, G.R.

    1992-12-16

    A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communication costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.
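
    For readers unfamiliar with the master-and-slaves pattern the record describes, the sketch below shows an analogous master/worker domain decomposition using Python's multiprocessing queues as the message-passing layer; it is not PROTOPAR, Basis, or PVM code.

    ```python
    """Analogous sketch of a master/worker domain decomposition: the master
    partitions a 1-D domain, workers process their subdomains, and results are
    gathered by message passing (here, multiprocessing queues)."""
    from multiprocessing import Process, Queue

    def worker(rank, subdomain, results):
        # Each worker handles its own slice of the domain (here: sum of squares).
        partial = sum(x * x for x in subdomain)
        results.put((rank, partial))

    def main():
        domain = list(range(1000))
        nworkers = 4
        chunk = len(domain) // nworkers
        results = Queue()
        procs = []
        for rank in range(nworkers):
            lo = rank * chunk
            hi = len(domain) if rank == nworkers - 1 else lo + chunk
            p = Process(target=worker, args=(rank, domain[lo:hi], results))
            p.start()
            procs.append(p)
        total = sum(results.get()[1] for _ in procs)   # gather partial results
        for p in procs:
            p.join()
        print("global result:", total)

    if __name__ == "__main__":
        main()
    ```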

  17. Parallel computation and the basis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, G.R.

    1993-05-01

    A software package has been written that can facilitate efforts to develop powerful, flexible, and easy-to-use programs that can run in single-processor, massively parallel, and distributed computing environments. Particular attention has been given to the difficulties posed by a program consisting of many science packages that represent subsystems of a complicated, coupled system. Methods have been found to maintain independence of the packages by hiding data structures without increasing the communications costs in a parallel computing environment. Concepts developed in this work are demonstrated by a prototype program that uses library routines from two existing software systems, Basis and Parallel Virtual Machine (PVM). Most of the details of these libraries have been encapsulated in routines and macros that could be rewritten for alternative libraries that possess certain minimum capabilities. The prototype software uses a flexible master-and-slaves paradigm for parallel computation and supports domain decomposition with message passing for partitioning work among slaves. Facilities are provided for accessing variables that are distributed among the memories of slaves assigned to subdomains. The software is named PROTOPAR.

  18. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca

    2013-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  19. The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cinquini, Luca; Crichton, Daniel; Miller, Neill

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  20. The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data

    NASA Technical Reports Server (NTRS)

    Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark

    2012-01-01

    The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

  1. Simulations of the Sampling Distribution of the Mean Do Not Necessarily Mislead and Can Facilitate Learning

    ERIC Educational Resources Information Center

    Lane, David M.

    2015-01-01

    Recently Watkins, Bargagliotti, and Franklin (2014) discovered that simulations of the sampling distribution of the mean can mislead students into concluding that the mean of the sampling distribution of the mean depends on sample size. This potential error arises from the fact that the mean of a simulated sampling distribution will tend to be…
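
    A minimal simulation of the point at issue is sketched below (using an arbitrary skewed population): the mean of the sampling distribution of the mean stays essentially fixed as the sample size changes, while its spread shrinks roughly as sigma over the square root of n. This is only an illustrative sketch, not the simulations discussed in the article.

    ```python
    """Minimal simulation of the sampling distribution of the mean: its mean
    does not depend on sample size, while its spread shrinks as sigma/sqrt(n)."""
    import numpy as np

    rng = np.random.default_rng(42)
    population = rng.exponential(scale=2.0, size=100_000)   # a skewed population

    for n in (5, 25, 100):
        # 10,000 simulated samples of size n, drawn with replacement.
        idx = rng.integers(0, population.size, size=(10_000, n))
        sample_means = population[idx].mean(axis=1)
        print(f"n={n:3d}  mean of sample means={sample_means.mean():.3f}  "
              f"SD of sample means={sample_means.std(ddof=1):.3f}")
    ```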

  2. Interrupted: The roles of distributed effort and incubation in preventing fixation and generating problem solutions.

    PubMed

    Sio, Ut Na; Kotovsky, Kenneth; Cagan, Jonathan

    2017-05-01

    Fixation on inappropriate concepts is a key barrier to problem solving. Previous research has shown that continuous work is likely to cause repeated retrieval of those concepts, resulting in increased fixation. Accordingly, distributing effort across problems through multiple, brief, and interlaced sessions (distributed effort) should prevent such fixation and in turn enhance problem solving. This study examined whether distributed effort can provide an advantage for problem solving, particularly for problems that can induce fixation (Experiment 1), and whether and how incubation can be combined with distributed effort to further enhance performance (Experiment 2). Remote Associates Test (RAT) problems were used as the problem-solving tasks. Half of them (i.e., misleading RAT) were more likely to mislead individuals to fixate on incorrect associates than the other half. Experiments revealed a superiority of distributed over massed effort on misleading RAT performance and a differing time course of incubation for the massed and distributed groups. We conclude that distributed effort facilitates problem solving, most likely via overcoming fixation. Cognitive mechanisms other than the commonly posited forgetting of inappropriate ideas may occur during incubation to facilitate problem solving. The experiments in this article offer support for the occurrence of spreading activation during incubation.

  3. Traceability from a US perspective.

    PubMed

    Smith, G C; Tatum, J D; Belk, K E; Scanga, J A; Grandin, T; Sofos, J N

    2005-09-01

    Traceability of a food consists of development of "an information trail that follows the food product's physical trail". Internationally, the US is lagging behind many countries in developing traceability systems for food in general and especially for livestock, poultry and their products. The US food industry is developing, implementing and maintaining traceability systems designed to improve food supply management, facilitate traceback for food safety and quality, and differentiate and market foods with subtle or undetectable quality attributes. Traceability, for livestock, poultry and meat, in its broadest context, can, could, or will eventually be used: (1) to ascertain origin and ownership, and to deter theft and misrepresentation, of animals and meat; (2) for surveillance, control and eradication of foreign animal diseases; (3) for biosecurity protection of the national livestock population; (4) for compliance with requirements of international customers; (5) for compliance with country-of-origin labeling requirements; (6) for improvement of supply-side management, distribution/delivery systems and inventory controls; (7) to facilitate value-based marketing; (8) to facilitate value-added marketing; (9) to isolate the source and extent of quality-control and food-safety problems; and (10) to minimize product recalls and make crisis management protocols more effective. Domestically and internationally, it has now become essential that producers, packers, processors, wholesalers, exporters and retailers assure that livestock, poultry and meat are identified, that record-keeping assures traceability through all or parts of the complete life-cycle, and that, in some cases, the source, the production-practices and/or the process of generating final products, can be verified. At issue, as the US develops traceback capabilities, will be the breadth, depth and precision of its specific traceability systems.

  4. Pitfalls in Persuasion: How Do Users Experience Persuasive Techniques in a Web Service?

    NASA Astrophysics Data System (ADS)

    Segerståhl, Katarina; Kotro, Tanja; Väänänen-Vainio-Mattila, Kaisa

    Persuasive technologies are designed by utilizing a variety of interactive techniques that are believed to promote target behaviors. This paper describes a field study in which the aim was to discover possible pitfalls of persuasion, i.e., situations in which persuasive techniques do not function as expected. The study investigated persuasive functionality of a web service targeting weight loss. A qualitative online questionnaire was distributed through the web service and a total of 291 responses were extracted for interpretative analysis. The Persuasive Systems Design model (PSD) was used for supporting systematic analysis of persuasive functionality. Pitfalls were identified through situations that evoked negative user experiences. The primary pitfalls discovered were associated with manual logging of eating and exercise behaviors, appropriateness of suggestions and source credibility issues related to social facilitation. These pitfalls, when recognized, can be addressed in design by applying functional and facilitative persuasive techniques in meaningful combinations.

  5. High-voltage electrode optimization towards uniform surface treatment by a pulsed volume discharge

    NASA Astrophysics Data System (ADS)

    Ponomarev, A. V.; Pedos, M. S.; Scherbinin, S. V.; Mamontov, Y. I.; Ponomarev, S. V.

    2015-11-01

    In this study, the shape and material of the high-voltage electrode of an atmospheric-pressure plasma generation system were optimised. The research was performed with the goal of achieving maximum uniformity of plasma treatment of the surface of the low-voltage electrode, which has a diameter of 100 mm. In order to generate low-temperature plasma with a volume of roughly 1 cubic decimetre, a pulsed volume discharge initiated by a corona discharge was used. The uniformity of the plasma in the region of the low-voltage electrode was assessed using a system for measuring the distribution of discharge current density. The system's low-voltage electrode (the collector) was a disc 100 mm in diameter, the conducting surface of which was divided into 64 radially located segments of equal surface area. The current at each segment was registered by a high-speed measuring system controlled by an ARM™-based 32-bit microcontroller. To facilitate the interpretation of the results obtained, a computer program was developed to visualise them, providing a 3D image of the current density distribution on the surface of the low-voltage electrode. Based on the results obtained, an optimum shape for the high-voltage electrode was determined. The uniformity of the distribution of discharge current density in relation to the distance between electrodes was also studied, and it was shown that the level of non-uniformity depends on the size of the gap between the electrodes. Experiments indicated that it is advantageous to use graphite felt VGN-6 (a Russian designation) as the material of the high-voltage electrode's emitting surface.

  6. Habitat Demonstration Unit (HDU) Pressurized Excursion Module (PEM) Systems Integration Strategy

    NASA Technical Reports Server (NTRS)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Toups, Larry; Howe, A. Scott

    2011-01-01

    The Habitat Demonstration Unit (HDU) project team constructed an analog prototype lunar surface laboratory called the Pressurized Excursion Module (PEM). The prototype unit subsystems were integrated in a short amount of time, utilizing a rapid prototyping approach that brought together over 20 habitation-related technologies from a variety of NASA centers. This paper describes the system integration strategies and lessons learned that allowed the PEM to be brought from paper design to working field prototype by a multi-center team. The system integration process was based on a rapid prototyping approach, facilitated by tailored design review, test, and integration processes. The use of collaboration tools, both electronic tools and documentation, enabled a geographically distributed team to take a paper concept to an operational prototype in approximately one year. One of the major tools used in the integration strategy was a coordinated effort to accurately model all the subsystems using computer-aided design (CAD), so conflicts were identified before physical components came together. A deliberate effort was made following the deployment of the HDU PEM for field operations to collect lessons learned, to facilitate process improvement and inform the design of future flight or analog versions of habitat systems. Significant items within those lessons learned were limitations of the CAD integration approach and the impact of shell design on the flexibility of placing systems within the HDU shell.

  7. The semantic category-based grouping in the Multiple Identity Tracking task.

    PubMed

    Wei, Liuqing; Zhang, Xuemin; Li, Zhen; Liu, Jingyao

    2018-01-01

    In the Multiple Identity Tracking (MIT) task, categorical distinctions between targets and distractors have been found to facilitate tracking (Wei, Zhang, Lyu, & Li in Frontiers in Psychology, 7, 589, 2016). The purpose of this study was to further investigate the reasons for the facilitation effect, through six experiments. The results of Experiments 1-3 excluded the potential explanations of visual distinctiveness, attentional distribution strategy, and a working memory mechanism, respectively. When objects' visual information was preserved and categorical information was removed, the facilitation effect disappeared, suggesting that the visual distinctiveness between targets and distractors was not the main reason for the facilitation effect. Moreover, the facilitation effect was not the result of strategically shifting the attentional distribution, because the targets received more attention than the distractors in all conditions. Additionally, the facilitation effect did not come about because the identities of targets were encoded and stored in visual working memory to assist in the recovery from tracking errors; when working memory was disturbed by the object identities changing during tracking, the facilitation effect still existed. Experiments 4 and 5 showed that observers grouped targets together and segregated them from distractors on the basis of their categorical information. By doing this, observers could largely avoid distractor interference with tracking and improve tracking performance. Finally, Experiment 6 indicated that category-based grouping is not an automatic, but a goal-directed and effortful, strategy. In summary, the present findings show that a semantic category-based target-grouping mechanism exists in the MIT task, which is likely to be the major reason for the tracking facilitation effect.

  8. Effectiveness of a web-based automated cell distribution system.

    PubMed

    Niland, Joyce C; Stiller, Tracey; Cravens, James; Sowinski, Janice; Kaddis, John; Qian, Dajun

    2010-01-01

    In recent years, industries have turned to the field of operations research to help improve the efficiency of production and distribution processes. Largely absent is the application of this methodology to biological materials, such as the complex and costly procedure of human pancreas procurement and islet isolation. Pancreatic islets are used for basic science research and in a promising form of cell replacement therapy for a subset of patients afflicted with severe type 1 diabetes mellitus. Having an accurate and reliable system for cell distribution is therefore crucial. The Islet Cell Resource Center Consortium was formed in 2001 as the first and largest cooperative group of islet production and distribution facilities in the world. We previously reported on the development of a Matching Algorithm for Islet Distribution (MAID), an automated web-based tool used to optimize the distribution of human pancreatic islets by matching investigator requests to islet characteristics. This article presents an assessment of that algorithm and compares it to the manual distribution process used prior to MAID. A comparison was done using an investigator's ratio of the number of islets received divided by the number requested pre- and post-MAID. Although the supply of islets increased between the pre- versus post-MAID period, the median received-to-requested ratio remained around 60% due to an increase in demand post-MAID. A significantly smaller variation in the received-to-requested ratio was achieved in the post- versus pre-MAID period. In particular, the undesirable outcome of providing users with more islets than requested, ranging up to four times their request, was greatly reduced through the algorithm. In conclusion, this analysis demonstrates, for the first time, the effectiveness of using an automated web-based cell distribution system to facilitate efficient and consistent delivery of human pancreatic islets by enhancing the islet matching process.
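
    The evaluation metric described above, the received-to-requested ratio and its spread, can be computed with a few lines of code; the sketch below uses made-up shipment numbers purely to illustrate the comparison, not data from the study.

    ```python
    """Sketch of the received-to-requested ratio comparison described above.
    The shipment numbers are hypothetical placeholders, not study data."""
    import numpy as np

    def summarize(received, requested, label):
        ratio = np.asarray(received, dtype=float) / np.asarray(requested, dtype=float)
        iqr = np.percentile(ratio, 75) - np.percentile(ratio, 25)
        print(f"{label}: median ratio={np.median(ratio):.2f}, IQR={iqr:.2f}, "
              f"over-supplied (>1x request)={np.mean(ratio > 1.0):.0%}")

    # Hypothetical (requested, received) islet shipments pre- and post-algorithm.
    pre_requested,  pre_received  = [100, 200, 150, 80, 120], [60, 400, 90, 300, 70]
    post_requested, post_received = [100, 200, 150, 80, 120], [65, 130, 95, 50, 75]

    summarize(pre_received, pre_requested, "pre-MAID ")
    summarize(post_received, post_requested, "post-MAID")
    ```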

  9. Simulation services and analysis tools at the CCMC to study multi-scale structure and dynamics of Earth's magnetopause

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M. M.; Liu, Y. H.; Rastaetter, L.; Pembroke, A. D.; Chen, L. J.; Hesse, M.; Glocer, A.; Komar, C. M.; Dorelli, J.; Roytershteyn, V.

    2016-12-01

    The presentation will provide an overview of new tools, services, and models implemented at the Community Coordinated Modeling Center (CCMC) to facilitate MMS dayside results analysis. We will provide updates on the implementation of Particle-in-Cell (PIC) simulations at the CCMC and opportunities for on-line visualization and analysis of results of PIC simulations of asymmetric magnetic reconnection for different guide fields and boundary conditions. Fields, plasma parameters, particle distribution moments, as well as particle distribution functions calculated in selected regions of the vicinity of reconnection sites, can be analyzed through the web-based interactive visualization system. In addition, there are options to request distribution functions in user-selected regions of interest and to fly through simulated magnetic reconnection configurations and a map of distributions to facilitate comparisons with observations. A broad collection of global magnetosphere models hosted at the CCMC provides an opportunity to put MMS observations and local PIC simulations into global context. We recently implemented the RECON-X post-processing tool (Glocer et al, 2016), which allows users to determine the location of the separator surface around closed field lines and between open field lines and solar wind field lines. The tool also finds the separatrix line where the two surfaces touch and the positions of magnetic nulls. The surfaces and the separatrix line can be visualized relative to satellite positions in the dayside magnetosphere using an interactive HTML-5 visualization for each time step processed. To validate global magnetosphere models' capability to simulate the locations of dayside magnetosphere boundaries, we will analyze the proximity of MMS to simulated separatrix locations for a set of MMS diffusion region crossing events.

  10. Transition from single to multiple axial potential structure in expanding helicon plasma

    NASA Astrophysics Data System (ADS)

    Ghosh, Soumen; Chattopadhyay, P. K.; Ghosh, J.; Pal, R.; Bora, D.

    2017-02-01

    Transition from single to multiple axial potential structure (MAPS) formation is reported in expanding helicon plasma. This transition is created by forming a cusp magnetic field downstream, after the expansion throat. Two distinct potential drops are separated by a uniform axial potential zone. A non-uniform axial density distribution exists in expanding helicon systems, and the cusp-like field sustains both axial density gradients sufficiently for the formation of these two distinct potential drops. It is also shown that both single and multiple axial potential structures are observed only when the geometric and magnetic expansions closely coincide with each other. Coexistence of these two expansions at the same location enhances plasma expansion, which facilitates deviation from the Boltzmann distribution and violates quasi-neutrality locally.

  11. Space-time thermodynamics of the glass transition

    NASA Astrophysics Data System (ADS)

    Merolle, Mauro; Garrahan, Juan P.; Chandler, David

    2005-08-01

    We consider the probability distribution for fluctuations in dynamical action and similar quantities related to dynamic heterogeneity. We argue that the so-called “glass transition” is a manifestation of low action tails in these distributions where the entropy of trajectory space is subextensive in time. These low action tails are a consequence of dynamic heterogeneity and an indication of phase coexistence in trajectory space. The glass transition, where the system falls out of equilibrium, is then an order-disorder phenomenon in space-time occurring at a temperature Tg, which is a weak function of measurement time. We illustrate our perspective ideas with facilitated lattice models and note how these ideas apply more generally.
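
    As a toy illustration of the facilitated lattice models mentioned above, the sketch below runs a one-spin-facilitated (Fredrickson-Andersen-type) chain and records the number of accepted flips per trajectory as a crude stand-in for the dynamical action; the choice of model variant and all parameters are illustrative assumptions, not the authors' calculations.

    ```python
    """Toy one-spin-facilitated lattice model: a site may flip only if at least
    one neighbour is excited; 0->1 with probability c, 1->0 with probability 1-c.
    The per-trajectory activity (accepted flips) is histogrammed across runs."""
    import random

    def run_trajectory(L=50, c=0.2, sweeps=200, seed=None):
        rng = random.Random(seed)
        spins = [1 if rng.random() < c else 0 for _ in range(L)]
        activity = 0
        for _ in range(sweeps * L):
            i = rng.randrange(L)
            if spins[(i - 1) % L] == 1 or spins[(i + 1) % L] == 1:   # facilitation
                if spins[i] == 0 and rng.random() < c:
                    spins[i] = 1
                    activity += 1
                elif spins[i] == 1 and rng.random() < 1.0 - c:
                    spins[i] = 0
                    activity += 1
        return activity

    activities = sorted(run_trajectory(seed=s) for s in range(200))
    print("min / median / max activity:",
          activities[0], activities[len(activities) // 2], activities[-1])
    ```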

  12. Ultrashort pulse chirp measurement via transverse second-harmonic generation in strontium barium niobate crystal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trull, J.; Wang, B.; Parra, A.

    2015-06-01

    Pulse compression in dispersive strontium barium niobate crystal with a random size and distribution of the anti-parallel orientated nonlinear domains is observed via transverse second harmonic generation. The dependence of the transverse width of the second harmonic trace along the propagation direction allows for the determination of the initial chirp and duration of pulses in the femtosecond regime. This technique permits a real-time analysis of the pulse evolution and facilitates fast in-situ correction of pulse chirp acquired in the propagation through an optical system.

  13. Modal Interfaces in Hawaii

    NASA Technical Reports Server (NTRS)

    Wright, E. Alvey

    1974-01-01

    Hawaii, an archipelago where transportation distances are short but the interfaces are many, seeks elimination of modal changes by totally-submerged hydrofoil craft operating at the water surface directly between tourist resort destinations, by dual mode rapid transit vehicles operating directly between the deplaning bridges at Honolulu International Airport and hotel porte-cochere at Waikiki, by demand responsive vehicles for collection and distribution operating on fixed guideways for line haul, and by roll-on/roll-off inter-island ferries for all models of manually operated ground vehicles. The paper also describes facilitation of unavoidable interfaces by innovative sub-systems.

  14. Multiple Weyl points and the sign change of their topological charges in woodpile photonic crystals

    NASA Astrophysics Data System (ADS)

    Chang, Ming-Li; Xiao, Meng; Chen, Wen-Jie; Chan, C. T.

    2017-03-01

    We show that Weyl points with topological charges 1 and 2 can be found in very simple chiral woodpile photonic crystals and the distribution of the charges can be changed by changing the material parameters without altering space-group symmetry. The underlying physics can be understood through a tight-binding model. Gapless surface states and their backscattering immune properties also are demonstrated in these systems. Obtaining Weyl points in these easily fabricated woodpile photonic crystals will facilitate the realization of Weyl point physics in optical and IR frequencies.

  15. Use of a control chart to monitor diarrhoea admissions: a quality improvement exercise in West Kalimantan Provincial Hospital, Pontianak, Indonesia.

    PubMed

    Purba, M

    1999-09-01

    Data on the number of admissions for diarrhoea each week to the West Kalimantan Provincial Hospital, Pontianak, Indonesia over a 5 year period, 1992-1996, were collected. After cleaning and exclusion of extreme values, transformation was then performed to ensure that the data were free of special cause variation and normally distributed. A control chart was then constructed to provide an 'early warning' system for hospital authorities in order to facilitate the management of the epidemic and to improve patient care.
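
    A minimal version of such a control chart is sketched below: baseline weekly counts are transformed (a square-root transform is assumed here, since the record states only that a transformation was applied), control limits are set at the mean plus or minus three standard deviations, and new weeks are flagged when they fall outside the limits. All counts are hypothetical.

    ```python
    """Minimal control chart for weekly admission counts. The square-root
    transform and all counts are illustrative assumptions, not study data."""
    import math

    baseline = [12, 9, 15, 11, 8, 14, 10, 13, 12, 9, 11, 10]   # hypothetical weekly counts
    new_weeks = [14, 13, 35, 12]                                # hypothetical monitoring data

    z = [math.sqrt(x) for x in baseline]                        # variance-stabilizing transform
    mean = sum(z) / len(z)
    sd = math.sqrt(sum((v - mean) ** 2 for v in z) / (len(z) - 1))
    ucl, lcl = mean + 3 * sd, max(mean - 3 * sd, 0.0)           # 3-sigma control limits

    for week, count in enumerate(new_weeks, start=1):
        flag = "ALERT" if not (lcl <= math.sqrt(count) <= ucl) else "ok"
        print(f"week {week}: {count} admissions -> {flag}")
    ```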

  16. Implementing Parquet equations using HPX

    NASA Astrophysics Data System (ADS)

    Kellar, Samuel; Wagle, Bibek; Yang, Shuxiang; Tam, Ka-Ming; Kaiser, Hartmut; Moreno, Juana; Jarrell, Mark

    A new C++ runtime system (HPX) enables simulations of complex systems to run more efficiently on parallel and heterogeneous systems. This increased efficiency allows for solutions to larger simulations of the parquet approximation for a system with impurities. The relevancy of the parquet equations depends upon the ability to solve systems which require long runs and large amounts of memory. These limitations, in addition to numerical complications arising from the stability of the solutions, necessitate running on large distributed systems. As computational resources trend towards the exascale and the limitations arising from computational resources vanish, the efficiency of large-scale simulations becomes a focus. HPX facilitates efficient simulations through intelligent overlapping of computation and communication. Simulations such as the parquet equations, which require the transfer of large amounts of data, should benefit from HPX implementations. Supported by the NSF EPSCoR Cooperative Agreement No. EPS-1003897 with additional support from the Louisiana Board of Regents.

  17. Housing is positively associated with invasive exotic plant species richness in New England, USA

    Treesearch

    Gregorio I. Gavier-Pizarro; Volker C. Radeloff; Susan I. Stewart; Cynthia D. Huebner; Nicholas S. Keuler

    2010-01-01

    Understanding the factors related to invasive exotic species distributions at broad spatial scales has important theoretical and management implications, because biological invasions are detrimental to many ecosystem functions and processes. Housing development facilitates invasions by disturbing land cover, introducing nonnative landscaping plants, and facilitating...

  18. Data-driven planning of distributed energy resources amidst socio-technical complexities

    NASA Astrophysics Data System (ADS)

    Jain, Rishee K.; Qin, Junjie; Rajagopal, Ram

    2017-08-01

    New distributed energy resources (DER) are rapidly replacing centralized power generation due to their environmental, economic and resiliency benefits. Previous analyses of DER systems have been limited in their ability to account for socio-technical complexities, such as intermittent supply, heterogeneous demand and balance-of-system cost dynamics. Here we develop ReMatch, an interdisciplinary modelling framework, spanning engineering, consumer behaviour and data science, and apply it to 10,000 consumers in California, USA. Our results show that deploying DER would yield nearly a 50% reduction in the levelized cost of electricity (LCOE) over the status quo even after accounting for socio-technical complexities. We abstract a detailed matching of consumers to DER infrastructure from our results and discuss how this matching can facilitate the development of smart and targeted renewable energy policies, programmes and incentives. Our findings point to the large-scale economic and technical feasibility of DER and underscore the pertinent role DER can play in achieving sustainable energy goals.
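
    The levelized cost of electricity referred to above is a standard discounted-cost-per-discounted-energy ratio; the sketch below shows that calculation with hypothetical rooftop-PV-like numbers and is not the ReMatch model itself.

    ```python
    """Standard levelized-cost-of-electricity (LCOE) calculation:
    LCOE = discounted lifetime cost / discounted lifetime energy."""

    def lcoe(capex, annual_opex, annual_energy_kwh, lifetime_years, discount_rate):
        cost = capex
        energy = 0.0
        for t in range(1, lifetime_years + 1):
            df = (1.0 + discount_rate) ** t
            cost += annual_opex / df
            energy += annual_energy_kwh / df
        return cost / energy   # $/kWh

    # Hypothetical rooftop-PV-like numbers, for illustration only.
    print(f"LCOE ~ ${lcoe(12000, 150, 6500, 25, 0.05):.3f}/kWh")
    ```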

  19. MicroCT-based phenomics in the zebrafish skeleton reveals virtues of deep phenotyping in a distributed organ system.

    PubMed

    Hur, Matthew; Gistelinck, Charlotte A; Huber, Philippe; Lee, Jane; Thompson, Marjorie H; Monstad-Rios, Adrian T; Watson, Claire J; McMenamin, Sarah K; Willaert, Andy; Parichy, David M; Coucke, Paul; Kwon, Ronald Y

    2017-09-08

    Phenomics, which ideally involves in-depth phenotyping at the whole-organism scale, may enhance our functional understanding of genetic variation. Here, we demonstrate methods to profile hundreds of phenotypic measures comprised of morphological and densitometric traits at a large number of sites within the axial skeleton of adult zebrafish. We show the potential for vertebral patterns to confer heightened sensitivity, with similar specificity, in discriminating mutant populations compared to analyzing individual vertebrae in isolation. We identify phenotypes associated with human brittle bone disease and thyroid stimulating hormone receptor hyperactivity. Finally, we develop allometric models and show their potential to aid in the discrimination of mutant phenotypes masked by alterations in growth. Our studies demonstrate virtues of deep phenotyping in a spatially distributed organ system. Analyzing phenotypic patterns may increase productivity in genetic screens, and facilitate the study of genetic variants associated with smaller effect sizes, such as those that underlie complex diseases.
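
    The allometric modelling mentioned above can be illustrated by fitting a power law on log-log axes in a reference group and scoring other fish by their residuals, so that growth differences do not masquerade as skeletal phenotypes; the data and parameter values below are synthetic placeholders, not the authors' models.

    ```python
    """Sketch of an allometric normalization: fit trait = a * length^b on log-log
    axes in wild-type fish, then score other fish by their log residuals."""
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic wild-type data: trait scales roughly as length^2 with noise.
    wt_length = rng.uniform(15, 30, 80)
    wt_trait = 0.5 * wt_length ** 2 * rng.lognormal(0.0, 0.05, 80)

    b, log_a = np.polyfit(np.log(wt_length), np.log(wt_trait), 1)   # slope, intercept

    def allometric_residual(length, trait):
        expected = np.exp(log_a) * length ** b
        return np.log(trait / expected)   # positive = larger than growth predicts

    print(f"fitted exponent b ~ {b:.2f}")
    print("residual for a hypothetical fish:", round(float(allometric_residual(18.0, 250.0)), 3))
    ```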

  20. Collective learning for the emergence of social norms in networked multiagent systems.

    PubMed

    Yu, Chao; Zhang, Minjie; Ren, Fenghui

    2014-12-01

    Social norms such as social rules and conventions play a pivotal role in sustaining system order by regulating and controlling individual behaviors toward a global consensus in large-scale distributed systems. Systematic studies of efficient mechanisms that can facilitate the emergence of social norms enable us to build and design robust distributed systems, such as electronic institutions and norm-governed sensor networks. This paper studies the emergence of social norms via learning from repeated local interactions in networked multiagent systems. A collective learning framework, which imitates the opinion aggregation process in human decision making, is proposed to study the impact of agent local collective behaviors on the emergence of social norms in a number of different situations. In the framework, each agent interacts repeatedly with all of its neighbors. At each step, an agent first takes a best-response action toward each of its neighbors and then combines all of these actions into a final action using ensemble learning methods. Extensive experiments are carried out to evaluate the framework with respect to different network topologies, learning strategies, numbers of actions, influences of nonlearning agents, and so on. Experimental results reveal some significant insights into the manipulation and control of norm emergence in networked multiagent systems achieved through local collective behaviors.
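
    A toy re-creation of the collective-learning idea is sketched below: in a two-action coordination game on a ring network, each agent best-responds to every neighbour (which here amounts to copying that neighbour) and aggregates those per-neighbour responses by majority vote, a simple ensemble rule. This is an illustrative reading of the framework, not the authors' implementation.

    ```python
    """Toy norm-emergence model: per-neighbour best responses in a coordination
    game, aggregated by majority vote; agents settle into shared conventions."""
    import random

    def step(actions, neighbours, rng):
        new_actions = []
        for nbrs in neighbours:
            # Best response to each neighbour in a pure coordination game is to
            # copy that neighbour; the ensemble rule is a majority vote.
            votes = [actions[j] for j in nbrs]
            majority = max(set(votes), key=votes.count)
            tie = votes.count(majority) * 2 == len(votes)
            new_actions.append(rng.choice(votes) if tie else majority)
        return new_actions

    def simulate(n=100, k=4, rounds=200, seed=3):
        rng = random.Random(seed)
        neighbours = [[(i + d) % n for d in range(-k // 2, k // 2 + 1) if d != 0]
                      for i in range(n)]
        actions = [rng.randint(0, 1) for _ in range(n)]
        for _ in range(rounds):
            actions = step(actions, neighbours, rng)
        print("fraction playing action 1 after", rounds, "rounds:", sum(actions) / n)

    simulate()
    ```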

  1. Dynamic Transition and Resonance in Coupled Oscillators Under Symmetry-Breaking Fields

    NASA Astrophysics Data System (ADS)

    Choi, J.; Choi, M. Y.; Chung, M. S.; Yoon, B.-G.

    2013-06-01

    We investigate numerically the dynamic properties of a system of globally coupled oscillators driven by periodic symmetry-breaking fields in the presence of noise. The phase distribution of the oscillators is computed and a dynamic transition is disclosed. It is further found that the stochastic resonance is closely related to the behavior of the dynamic order parameter, which is in turn explained by the formation of a bi-cluster in the system. Here noise tends to symmetrize the motion of the oscillators, facilitating the bi-cluster formation. The observed resonance appears to be of the same class as the resonance present in the two-dimensional Ising model under oscillating fields.
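
    One common way to realize a periodically driven symmetry-breaking field in a mean-field Kuramoto-type model is a drift term of the form h cos(Omega t) sin(theta); the sketch below integrates such a noisy globally coupled system with Euler-Maruyama steps and tracks the order parameter. The model form and all parameter values are illustrative assumptions, not the exact equations of the study.

    ```python
    """Globally coupled noisy phase oscillators with a periodic symmetry-breaking
    field (illustrative Kuramoto-type model; parameters are made up)."""
    import numpy as np

    rng = np.random.default_rng(0)

    N, K, h, Omega, D = 500, 1.5, 0.5, 0.3, 0.2   # size, coupling, field, drive freq, noise
    dt, steps = 0.01, 20000
    omega = rng.normal(0.0, 0.5, N)               # natural frequencies
    theta = rng.uniform(0.0, 2.0 * np.pi, N)

    r_trace = []
    for n in range(steps):
        t = n * dt
        z = np.exp(1j * theta).mean()             # complex order parameter
        coupling = K * np.abs(z) * np.sin(np.angle(z) - theta)
        field = h * np.cos(Omega * t) * np.sin(theta)
        noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(N)
        theta = theta + dt * (omega + coupling + field) + noise
        r_trace.append(np.abs(z))

    print("time-averaged order parameter r ~", round(float(np.mean(r_trace[steps // 2:])), 3))
    ```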

  2. Dicoogle Mobile: a medical imaging platform for Android.

    PubMed

    Viana-Ferreira, Carlos; Ferreira, Daniel; Valente, Frederico; Monteiro, Eriksson; Costa, Carlos; Oliveira, José Luís

    2012-01-01

    Mobile computing technologies are increasingly becoming a valuable asset in healthcare information systems. The adoption of these technologies helps to improve quality of care, increase productivity, and facilitate clinical decision support. They provide practitioners with ubiquitous access to patient records and are an important component in telemedicine and tele-work environments. We have developed Dicoogle Mobile, an Android application that provides remote access to distributed medical imaging data through a cloud relay service. In addition, this application can store and index local imaging data so that they can also be searched and visualized. In this paper, we describe the Dicoogle Mobile concept as well as the architecture of the whole system that supports it.

  3. AphidBase: A centralized bioinformatic resource for annotation of the pea aphid genome

    PubMed Central

    Legeai, Fabrice; Shigenobu, Shuji; Gauthier, Jean-Pierre; Colbourne, John; Rispe, Claude; Collin, Olivier; Richards, Stephen; Wilson, Alex C. C.; Tagu, Denis

    2015-01-01

    AphidBase is a centralized bioinformatic resource that was developed to facilitate community annotation of the pea aphid genome by the International Aphid Genomics Consortium (IAGC). The AphidBase Information System, designed to organize and distribute genomic data and annotations for a large international community, was constructed using open source software tools from the Generic Model Organism Database (GMOD). The system includes Apollo and GBrowse utilities as well as a wiki, BLAST search capabilities, and a full-text search engine. AphidBase strongly supported community cooperation and coordination in the curation of gene models during community annotation of the pea aphid genome. AphidBase can be accessed at http://www.aphidbase.com. PMID:20482635

  4. Using Punnett Squares to Facilitate Students' Understanding of Isotopic Distributions in Mass Spectrometry

    ERIC Educational Resources Information Center

    Sein, Lawrence T., Jr.

    2006-01-01

    The use of isotopic distributions in mass spectrometry to identify pure compounds and to distinguish molecular fragments by their masses is described. Punnett squares are familiar, easy to compute, and often graphical, which makes them helpful to students, and the relative abundance of each isotopic combination is easily generated for even isotopic…
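    A short sketch of the idea, assuming natural chlorine abundances of roughly 75.8% for 35Cl and 24.2% for 37Cl: enumerating every cell of the "Punnett square" (every isotope combination) and summing the probabilities with equal total mass reproduces the familiar M, M+2, M+4 pattern for Cl2.

      from itertools import product
      from collections import Counter

      isotopes = {35: 0.758, 37: 0.242}   # approximate natural abundances of chlorine

      def punnett(n_atoms):
          """Combine n_atoms atoms the way a Punnett square combines alleles:
          each cell is one isotope combination; its probability is the product of abundances."""
          dist = Counter()
          for combo in product(isotopes, repeat=n_atoms):
              prob = 1.0
              for iso in combo:
                  prob *= isotopes[iso]
              dist[sum(combo)] += prob
          return dist

      for mass, prob in sorted(punnett(2).items()):   # Cl2: nominal masses 70, 72, 74
          print(f"mass {mass}: relative abundance {prob:.3f}")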

  5. Distributed nestmate recognition in ants.

    PubMed

    Esponda, Fernando; Gordon, Deborah M

    2015-05-07

    We propose a distributed model of nestmate recognition, analogous to the one used by the vertebrate immune system, in which colony response results from the diverse reactions of many ants. The model describes how individual behaviour produces colony response to non-nestmates. No single ant knows the odour identity of the colony. Instead, colony identity is defined collectively by all the ants in the colony. Each ant responds to the odour of other ants by reference to its own unique decision boundary, which is a result of its experience of encounters with other ants. Each ant thus recognizes a particular set of chemical profiles as being those of non-nestmates. This model predicts, as experimental results have shown, that the outcome of behavioural assays is likely to be variable, that it depends on the number of ants tested, that response to non-nestmates changes over time and that it changes in response to the experience of individual ants. A distributed system allows a colony to identify non-nestmates without requiring that all individuals have the same complete information and helps to facilitate the tracking of changes in cuticular hydrocarbon profiles, because only a subset of ants must respond to provide an adequate response.
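    A loose computational sketch of the distributed-recognition idea: each ant carries its own decision boundary (here reduced to a scalar threshold on odour distance), and the colony-level response is whatever fraction of the ants actually encountered happens to reject the intruder. The threshold distribution and parameter values are purely illustrative.

      import numpy as np

      rng = np.random.default_rng(1)
      n_ants = 100
      thresholds = rng.normal(loc=1.0, scale=0.3, size=n_ants)   # hypothetical per-ant boundaries

      def colony_response(odour_distance, n_encounters):
          """Fraction of a random subset of ants that classify the intruder as a non-nestmate."""
          met = rng.choice(n_ants, size=n_encounters, replace=False)
          return np.mean(odour_distance > thresholds[met])

      # The outcome is variable and depends on how many ants are tested, as the model predicts.
      for n in (5, 20, 80):
          print(f"{n:2d} ants tested -> rejection fraction {colony_response(1.1, n):.2f}")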

  6. Combining aesthetic with ecological values for landscape sustainability.

    PubMed

    Yang, Dewei; Luo, Tao; Lin, Tao; Qiu, Quanyi; Luo, Yunjian

    2014-01-01

    Humans receive multiple benefits from various landscapes that foster ecological services and aesthetic attractiveness. In this study, a hybrid framework was proposed to evaluate the ecological and aesthetic values of five landscape types in the Houguanhu Region of central China. Data from a public aesthetic survey and a professional ecological assessment were converted into a two-dimensional coordinate system and distribution maps of landscape values. Results showed that natural landscapes (i.e., water bodies and forest) contributed more to both aesthetic and ecological values than semi-natural and human-dominated landscapes (i.e., farmland and non-ecological land). The distribution maps indicated that the aesthetic, ecological, and integrated landscape values were significantly associated with landscape attributes and the intensity of human activity. To combine aesthetic preferences with ecological services, several methods (field survey, landscape value coefficients, a normalization method, a two-dimensional coordinate system, and landscape value distribution maps) were employed in the landscape assessment. Our results could help to identify the underlying structure-function-value chain and improve the understanding of multiple functions in landscape planning. Emphasizing the situational context could also bring ecological and aesthetic goals into better alignment.

  7. Temperature distribution in target tumor tissue and photothermal tissue destruction during laser immunotherapy

    NASA Astrophysics Data System (ADS)

    Doughty, Austin; Hasanjee, Aamr; Pettitt, Alex; Silk, Kegan; Liu, Hong; Chen, Wei R.; Zhou, Feifan

    2016-03-01

    Laser Immunotherapy is a novel cancer treatment modality that has seen much success in treating many different types of cancer, both in animal studies and in clinical trials. The treatment consists of the synergistic interaction between photothermal laser irradiation and the local injection of an immunoadjuvant. As a result of the therapy, the host immune system launches a systemic antitumor response. The photothermal effect induced by the laser irradiation has multiple effects at different temperature elevations which are all required for optimal response. Therefore, determining the temperature distribution in the target tumor during the laser irradiation in laser immunotherapy is crucial to facilitate the treatment of cancers. To investigate the temperature distribution in the target tumor, female Wistar Furth rats were injected with metastatic mammary tumor cells and, upon sufficient tumor growth, underwent laser irradiation and were monitored using thermocouples connected to locally-inserted needle probes and infrared thermography. From the study, we determined that the maximum central tumor temperature was higher for tumors of less volume. Additionally, we determined that the temperature near the edge of the tumor as measured with a thermocouple had a strong correlation with the maximum temperature value in the infrared camera measurement.

  8. Combining Aesthetic with Ecological Values for Landscape Sustainability

    PubMed Central

    Yang, Dewei; Luo, Tao; Lin, Tao; Qiu, Quanyi; Luo, Yunjian

    2014-01-01

    Humans receive multiple benefits from various landscapes that foster ecological services and aesthetic attractiveness. In this study, a hybrid framework was proposed to evaluate the ecological and aesthetic values of five landscape types in the Houguanhu Region of central China. Data from a public aesthetic survey and a professional ecological assessment were converted into a two-dimensional coordinate system and distribution maps of landscape values. Results showed that natural landscapes (i.e., water bodies and forest) contributed more to both aesthetic and ecological values than semi-natural and human-dominated landscapes (i.e., farmland and non-ecological land). The distribution maps indicated that the aesthetic, ecological, and integrated landscape values were significantly associated with landscape attributes and the intensity of human activity. To combine aesthetic preferences with ecological services, several methods (field survey, landscape value coefficients, a normalization method, a two-dimensional coordinate system, and landscape value distribution maps) were employed in the landscape assessment. Our results could help to identify the underlying structure-function-value chain and improve the understanding of multiple functions in landscape planning. Emphasizing the situational context could also bring ecological and aesthetic goals into better alignment. PMID:25050886

  9. Robust Operation of Soft Open Points in Active Distribution Networks with High Penetration of Photovoltaic Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Ji, Haoran; Wang, Chengshan

    Distributed generators (DGs), including photovoltaic panels (PVs), have been integrated dramatically into active distribution networks (ADNs). Owing to its strong volatility and uncertainty, the high penetration of PV generation greatly exacerbates voltage violations in ADNs. However, the emerging flexible interconnection technology based on soft open points (SOPs) provides increased controllability and flexibility for system operation. To fully exploit the regulation ability of SOPs in addressing the problems caused by PV, this paper proposes a robust optimization method to achieve robust optimal operation of SOPs in ADNs. A two-stage adjustable robust optimization model is built to tackle the uncertainties of PV outputs, in which robust operation strategies of SOPs are generated to eliminate voltage violations and reduce the power losses of ADNs. A column-and-constraint generation (C&CG) algorithm is developed to solve the proposed robust optimization model, which is formulated as a second-order cone program (SOCP) to improve accuracy and computational efficiency. Case studies on the modified IEEE 33-node system and comparisons with a deterministic optimization approach are conducted to verify the effectiveness and robustness of the proposed method.

  10. Plasmonic modes in nanowire dimers: A study based on the hydrodynamic Drude model including nonlocal and nonlinear effects

    NASA Astrophysics Data System (ADS)

    Moeferdt, Matthias; Kiel, Thomas; Sproll, Tobias; Intravaia, Francesco; Busch, Kurt

    2018-02-01

    A combined analytical and numerical study of the modes in two distinct plasmonic nanowire systems is presented. The computations are based on a discontinuous Galerkin time-domain approach, and a fully nonlinear and nonlocal hydrodynamic Drude model for the metal is utilized. In the linear regime, these computations demonstrate the strong influence of nonlocality on the field distributions as well as on the scattering and absorption spectra. Based on these results, second-harmonic-generation efficiencies are computed over a frequency range that covers all relevant modes of the linear spectra. In order to interpret the physical mechanisms that lead to corresponding field distributions, the associated linear quasielectrostatic problem is solved analytically via conformal transformation techniques. This provides an intuitive classification of the linear excitations of the systems that is then applied to the full Maxwell case. Based on this classification, group theory facilitates the determination of the selection rules for the efficient excitation of modes in both the linear and nonlinear regimes. This leads to significantly enhanced second-harmonic generation via judiciously exploiting the system symmetries. These results regarding the mode structure and second-harmonic generation are of direct relevance to other nanoantenna systems.

  11. Human factor engineering based design and modernization of control rooms with new I and C systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larraz, J.; Rejas, L.; Ortega, F.

    2012-07-01

    Instrumentation and Control (I and C) systems of the latest nuclear power plants are based on the use of digital technology, distributed control systems and the integration of information in data networks (Distributed Control and Instrumentation Systems). This has a repercussion on Control Rooms (CRs), where the operations and monitoring interfaces correspond to these systems. These technologies are also used in modernizing I and C systems in currently operating nuclear power plants. The new interfaces provide additional capabilities for operation and supervision, as well as a high degree of flexibility, versatility and reliability. An example of this is the implementation of solutions such as compact stations, high level supervision screens, overview displays, computerized procedures, new operational support systems or intelligent alarm processing systems in the modernized Man-Machine Interface (MMI). These changes in the MMI are accompanied by newly added Software (SW) controls and new solutions in automation. Tecnatom has been leading various projects in this area for several years, both in Asian countries and in the United States, using in all cases international standards from which Tecnatom's own methodologies have been developed and optimized. The experience acquired in applying this methodology to the design of new control rooms is to a large extent applicable also to the modernization of current control rooms. An adequate design of the interface between the operator and the systems will facilitate safe operation, contribute to the prompt identification of problems and help in the distribution of tasks and communications between the different members of the operating shift. Based on Tecnatom's experience in the field, this article presents the methodological approach used as well as the most relevant aspects of this kind of project. (authors)

  12. Towards an integrated EU data system within AtlantOS project

    NASA Astrophysics Data System (ADS)

    Pouliquen, Sylvie; Harscoat, Valerie; Waldmann, Christoph; Koop-Jakobsen, Ketill

    2017-04-01

    The H2020 AtlantOS project started in June 2015 and aims to optimise and enhance the Integrated Atlantic Ocean Observing Systems (IAOOS). One goal is to ensure that data from different and diverse in-situ observing networks are readily accessible and usable to the wider international ocean science community and other stakeholders in this field. To achieve that, the strategy is to move towards an integrated data system within AtlantOS that harmonises work flows, data processing and distribution across the in-situ observing network systems, and integrates in-situ observations into existing European and international data infrastructures (Copernicus marine service, SeaDataNet NODCs, EMODnet, OBIS, GEOSS), the so-called Integrators. The targeted integrated system will address the data management challenges required for an efficient and reliable data service to users: • quality control commons for heterogeneous and near-real-time data • standardisation of mandatory metadata for efficient data exchange • interoperability of network and integrator data management systems. At present, the data acquired by the different in-situ observing networks contributing to the AtlantOS project are processed and distributed using different methodologies and means. Depending on the network data management organization, the data are either processed following recommendations elaborated by the network teams and accessible through a unique portal (FTP or Web), or processed by individual scientific researchers and made available through National Data Centres or directly at institution level. Some datasets are available through Integrators, such as Copernicus or EMODnet, but connected through ad-hoc links. To facilitate access to the Atlantic observations and avoid "mixing pears with apples", it has been necessary to agree on (1) the EOVs list and definitions across the Networks, (2) a minimum set of common vocabularies for metadata and data description to be used by all the Networks, and (3) a minimum level of near-real-time quality control procedures for selected EOVs. A data exchange backbone has then been defined and is being set up to facilitate discovery, viewing and downloading by the users. Some tools will be recommended to help networks plug their data into this backbone and facilitate integration by the Integrators. Finally, existing services for data discovery, viewing and downloading will be enhanced to ease access to existing observations. An initial working phase relying on existing international standards and protocols, involving data providers, both Networks and Integrators, and dealing with data harmonisation and integration objectives, has led to agreements and recommendations. The setup phase has started, on both the Networks and Integrators sides, to adapt the existing systems in order to move toward this integrated EU data system within AtlantOS, as well as collaboration with international partners around the Atlantic Ocean.

  13. A tool for teaching three-dimensional dermatomes combined with distribution of cutaneous nerves on the limbs.

    PubMed

    Kooloos, Jan G M; Vorstenbosch, Marc A T M

    2013-01-01

    A teaching tool that facilitates student understanding of a three-dimensional (3D) integration of dermatomes with peripheral cutaneous nerve field distributions is described. This model is inspired by the confusion in novice learners between dermatome maps and nerve field distribution maps. This confusion leads to the misconception that these two distribution maps fully overlap, and may stem from three sources: (1) the differences in dermatome maps in anatomical textbooks, (2) the limited views in the figures of dermatome maps and cutaneous nerve field maps, hampering the acquisition of a 3D picture, and (3) the lack of figures showing both maps together. To clarify this concept, the learning process can be facilitated by transforming the 2D drawings in textbooks to a 3D hands-on model and by merging the information from the separate maps. Commercially available models were covered with white cotton pantyhose, and borders between dermatomes were marked using the drawings from the students' required study material. Distribution maps of selected peripheral nerves were cut out from color transparencies. Both the model and the cut-out nerve fields were then at the students' disposal during a laboratory exercise. The students were instructed to affix the transparencies in the right place according to the textbook's figures. This model facilitates integrating the spatial relationships of the two types of nerve distributions. By highlighting the spatial relationship and aiming to provoke student enthusiasm, this model follows the advantages of other low-fidelity models. © 2013 American Association of Anatomists.

  14. A hybrid approach to advancing quantitative prediction of tissue distribution of basic drugs in human

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poulin, Patrick, E-mail: patrick-poulin@videotron.ca; Ekins, Sean; Department of Pharmaceutical Sciences, School of Pharmacy, University of Maryland, 20 Penn Street, Baltimore, MD 21201

    A general toxicity of basic drugs is related to phospholipidosis in tissues. Therefore, it is essential to predict the tissue distribution of basic drugs to facilitate an initial estimate of that toxicity. The objective of the present study was to further assess the original prediction method that consisted of using the binding to red blood cells measured in vitro for the unbound drug (RBCu) as a surrogate for tissue distribution, by correlating it to the unbound tissue:plasma partition coefficients (Kpu) of several tissues, and finally to predict the volume of distribution at steady state (Vss) in humans under in vivo conditions. This correlation method demonstrated inaccurate predictions of Vss for particular basic drugs that did not follow the original correlation principle. Therefore, the novelty of this study is to provide clarity on the actual hypotheses to identify i) the impact of pharmacological mode of action on the generic RBCu-Kpu correlation, ii) additional mechanisms of tissue distribution for the outlier drugs, iii) molecular features and properties that differentiate compounds as outliers in the original correlation analysis, in order to refine its applicability domain alongside the properties already used, and finally iv) to present a novel and refined correlation method that is superior to what has been previously published for the prediction of human Vss of basic drugs. Applying a refined correlation method after identifying outliers would facilitate the prediction of more accurate distribution parameters as key inputs used in physiologically based pharmacokinetic (PBPK) and phospholipidosis models.
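    For orientation only, the sketch below shows the standard way unbound tissue:plasma partition coefficients feed into a steady-state volume of distribution, Vss ~ Vp + fup * sum(Kpu,t * Vt), with made-up tissue volumes, Kpu values, and unbound fraction; it omits blood-cell and other corrections and is not the refined correlation method proposed in the study.

      # Hypothetical inputs: plasma volume (L), unbound fraction in plasma, and
      # per-tissue (volume in L, Kpu) pairs chosen purely for illustration.
      fup = 0.2
      v_plasma = 3.0
      tissues = {
          "muscle":  (29.0, 15.0),
          "adipose": (18.0, 10.0),
          "liver":   (1.8,  40.0),
      }

      # Vss = Vp + fup * sum(Kpu_t * Vt), since Kp = fup * Kpu for each tissue.
      vss = v_plasma + fup * sum(vol * kpu for vol, kpu in tissues.values())
      print(f"estimated Vss ~ {vss:.0f} L")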

  15. Anodal Cerebellar Direct Current Stimulation Reduces Facilitation of Propriospinal Neurons in Healthy Humans.

    PubMed

    Chothia, Muhammed; Doeltgen, Sebastian; Bradnam, Lynley V

    2016-01-01

    Coordinated muscle synergies in the human upper limb are controlled, in part, by a neural distribution network located in the cervical spinal cord, known as the cervical propriospinal system. Studies in the cat and non-human primate indicate the cerebellum is indirectly connected to this system via output pathways to the brainstem. Therefore, the cerebellum may indirectly modulate excitability of putative propriospinal neurons (PNs) in humans during upper limb coordination tasks. This study aimed to test whether anodal direct current stimulation (DCS) of the cerebellum modulates PNs and upper limb coordination in healthy adults. The hypothesis was that cerebellar anodal DCS would reduce descending facilitation of PNs and improve upper limb coordination. Transcranial magnetic stimulation (TMS), paired with peripheral nerve stimulation, probed activity in facilitatory and inhibitory descending projections to PNs following an established protocol. Coordination was tested using a pursuit rotor task performed by the non-dominant (ipsilateral) hand. Anodal and sham DCS were delivered over the cerebellum ipsilateral to the non-dominant hand in separate experimental sessions. Anodal DCS was applied to a control site lateral to the vertex in a third session. Twelve right-handed healthy adults participated. Pairing TMS with sub-threshold peripheral nerve stimulation facilitated motor evoked potentials at intensities just above threshold in accordance with the protocol. Anodal cerebellar DCS reduced facilitation without influencing inhibition, but the reduction in facilitation was not associated with performance of the pursuit rotor task. The results of this study indicate dissociated indirect control over cervical PNs by the cerebellum in humans. Anodal DCS of the cerebellum reduced excitability in the facilitatory descending pathway with no effect on the inhibitory pathway to cervical PNs. The reduction in PN excitability is likely secondary to modulation of primary motor cortex or brainstem nuclei, and identifies a neuroanatomical pathway for the cerebellum to assist in coordination of upper limb muscle synergies in humans. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Implementing a distributed intranet-based information system.

    PubMed

    O'Kane, K C; McColligan, E E; Davis, G A

    1996-11-01

    The article discusses Internet and intranet technologies and describes how to install an intranet-based information system using the Merle language facility and other readily available components. Merle is a script language designed to support decentralized medical record information retrieval applications on the World Wide Web. The goal of this work is to provide a script language tool to facilitate construction of efficient, fully functional, multipoint medical record information systems that can be accessed anywhere by low-cost Web browsers to search, retrieve, and analyze patient information. The language allows legacy MUMPS applications to function in a Web environment and to make use of the Web graphical, sound, and video presentation services. It also permits downloading of script applets for execution on client browsers, and it can be used in standalone mode with the Unix, Windows 95, Windows NT, and OS/2 operating systems.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Auslander, David; Culler, David; Wright, Paul

    The goal of the 2.5 year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (OpenADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load during the study period was 1175 kW. Several new tools facilitated this work, such as the Smart Energy Box, the distributed load controller or Energy Information Gateway, the web-based DR controller (dubbed the Central Load-Shed Coordinator or CLSC), and the Demand Response Capacity Assessment & Operation Assistance Tool (DRCAOT). In addition, an innovative data aggregator called sMAP (simple Measurement and Actuation Profile) allowed data from different sources to be collected in a compact form and facilitated detailed analysis of the building systems operation. A smart phone application (RAP or Rapid Audit Protocol) facilitated an inventory of the building’s plug loads. Carbon dioxide sensors located in conference rooms and classrooms allowed demand-controlled ventilation. The extensive submetering and nimble access to this data provided great insight into the details of the building operation as well as quick diagnostics and analyses of tests. For example, students discovered a short-cycling chiller, a stuck damper, and a leaking cooling coil in the first field tests. For our final field tests, we were able to see how each zone was affected by the DR strategies (e.g., the offices on the 7th floor grew very warm quickly) and fine-tune the strategies accordingly.

  18. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provide unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  19. SPANG: a SPARQL client supporting generation and reuse of queries for distributed RDF databases.

    PubMed

    Chiba, Hirokazu; Uchiyama, Ikuo

    2017-02-08

    Toward improved interoperability of distributed biological databases, an increasing number of datasets have been published in the standardized Resource Description Framework (RDF). Although the powerful SPARQL Protocol and RDF Query Language (SPARQL) provides a basis for exploiting RDF databases, writing SPARQL code is burdensome for users including bioinformaticians. Thus, an easy-to-use interface is necessary. We developed SPANG, a SPARQL client that has unique features for querying RDF datasets. SPANG dynamically generates typical SPARQL queries according to specified arguments. It can also call SPARQL template libraries constructed in a local system or published on the Web. Further, it enables combinatorial execution of multiple queries, each with a distinct target database. These features facilitate easy and effective access to RDF datasets and integrative analysis of distributed data. SPANG helps users to exploit RDF datasets by generation and reuse of SPARQL queries through a simple interface. This client will enhance integrative exploitation of biological RDF datasets distributed across the Web. This software package is freely available at http://purl.org/net/spang .
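    For readers unfamiliar with programmatic SPARQL access, the snippet below shows the kind of query a client such as SPANG submits to a public endpoint, here using the generic SPARQLWrapper library rather than SPANG itself; the endpoint and query are illustrative.

      from SPARQLWrapper import SPARQLWrapper, JSON

      endpoint = SPARQLWrapper("https://sparql.uniprot.org/sparql")
      endpoint.setQuery("""
          PREFIX up: <http://purl.uniprot.org/core/>
          SELECT ?protein WHERE { ?protein a up:Protein } LIMIT 5
      """)
      endpoint.setReturnFormat(JSON)

      results = endpoint.query().convert()
      for row in results["results"]["bindings"]:
          print(row["protein"]["value"])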

  20. The Impact of Uncertain Physical Parameters on HVAC Demand Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Elizondo, Marcelo A.; Lu, Shuai

    HVAC units are currently one of the major resources providing demand response (DR) in residential buildings. Models of HVAC units with DR functionality can improve understanding of their impact on power system operations and facilitate the deployment of DR technologies. This paper investigates the importance of various physical parameters and their distributions to the HVAC response to DR signals, which is a key step in constructing HVAC models for a population of units when data are insufficient. These parameters include floor size, insulation efficiency, the amount of solid mass in the house, and the efficiency of the HVAC units. These parameters are usually assumed to follow Gaussian or uniform distributions. We study the effect of uncertainty in the chosen parameter distributions on the aggregate HVAC response to DR signals, during the transient phase and in steady state. We use a quasi-Monte Carlo sampling method with linear regression and Prony analysis to evaluate the sensitivity of the DR output to the uncertainty in the distribution parameters. A significance ranking of the uncertainty sources is given to guide future modeling of HVAC demand response.
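    The sketch below illustrates the sampling-and-regression half of such a study (it does not attempt the Prony analysis): Sobol quasi-Monte Carlo samples are drawn over assumed parameter ranges, pushed through a stand-in aggregate response function, and ranked by standardized regression coefficients. The response function, ranges, and parameter names are placeholders, not the paper's model.

      import numpy as np
      from scipy.stats import qmc

      def dr_response(x):
          # Stand-in for the aggregate HVAC demand-response output.
          floor, insul, mass, eff = x.T
          return 0.8 * floor - 1.5 * insul + 0.4 * mass - 0.9 * eff + 0.05 * floor * insul

      sampler = qmc.Sobol(d=4, scramble=True, seed=0)
      u = sampler.random_base2(m=10)                       # 2**10 quasi-random points in [0, 1)^4
      lo = np.array([100.0, 0.5, 1000.0, 2.0])             # assumed lower bounds
      hi = np.array([300.0, 0.9, 4000.0, 4.0])             # assumed upper bounds
      x = qmc.scale(u, lo, hi)
      y = dr_response(x)

      # Standardized regression coefficients as a simple significance ranking.
      xs = (x - x.mean(0)) / x.std(0)
      ys = (y - y.mean()) / y.std()
      beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(xs)), xs]), ys, rcond=None)
      for name, b in zip(["floor area", "insulation", "solid mass", "efficiency"], beta[1:]):
          print(f"{name:12s} SRC = {b:+.2f}")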

  1. Preservation of distributive vs. tributive and other fluvial system deposits in the rock record (Invited)

    NASA Astrophysics Data System (ADS)

    Fielding, C. R.

    2010-12-01

    A recent paper (Weissmann et al., 2010, Geology 38, 39-42) has suggested that deposits of distributive fluvial systems (DFS) “may represent the norm in the continental rock record, with axial and incised river deposits composing a relatively minor proportion of the succession”. Herein, I examine this hypothesis by reference to a number of well-exposed fluvial successions from a variety of basinal settings. The cited paper suggests that DFS dominate modern fluvial landscapes in subsiding sedimentary basins, while acknowledging that many merge into a trunk stream in the basin depocenter. Most of the modern world’s largest rivers, however, are tributive, and many of them preserve significant thicknesses of alluvium beneath and lateral to the modern channel belt. The abundance of DFS on modern landscapes does not necessarily mean that they will be proportionately well represented in the ancient record. Consideration must also be given to the location within a basin where fluvial systems are most likely to be preserved (the depocenter), and to other factors. DFS (or fluvial/alluvial fans) are commonly developed on the tilted margins of asymmetric basins (hangingwalls of half-grabens and supradetachment basins, transtensional and foreland basins), but not in the depocenters. Symmetrically subsiding basins and long wavelength passive margin basins, however, facilitate development of extensive, very low-gradient plains where trunk streams with tributive or anabranching planforms are typical. Such basins, and the depocenters of asymmetric basins, are most likely to facilitate long-term establishment of trunk systems that have the greatest preservation potential. Incised and/or trunk stream deposits have, furthermore, been interpreted from a large number of ancient examples, some long-lived on timescales of millions of years. In the latter cases it has been argued that tectonic stability of the drainage basin is a key characteristic. A survey of the modern landscape therefore represents only a snapshot of time and one minor component of any climatically- or tectonically-driven cycle. It seems unlikely that DFS dominate alluvial stratigraphy. Criteria for recognition of DFS in the ancient have not yet been fully formulated, but might include 1) a relatively tightly constrained width vs. thickness distribution of channel lithosomes, and 2) lack of outsized channel bodies, in association with 3) centrifugal palaeocurrent distributions, and 4) down-paleoslope decreases in channel body dimensions. Neither these criteria, nor those cited in Weissmann et al. (2010), are necessarily unique to DFS, however. Accordingly, I consider it unlikely that a dominance of DFS in the alluvial rock record could be persuasively demonstrated even if it were true.

  2. Resource acquisition, distribution and end-use efficiencies and the growth of industrial society

    NASA Astrophysics Data System (ADS)

    Jarvis, A. J.; Jarvis, S. J.; Hewitt, C. N.

    2015-10-01

    A key feature of the growth of industrial society is the acquisition of increasing quantities of resources from the environment and their distribution for end-use. With respect to energy, the growth of industrial society appears to have been near-exponential for the last 160 years. We provide evidence that indicates that the global distribution of resources that underpins this growth may be facilitated by the continual development and expansion of near-optimal directed networks (roads, railways, flight paths, pipelines, cables etc.). However, despite this continual striving for optimisation, the distribution efficiencies of these networks must decline over time as they expand due to path lengths becoming longer and more tortuous. Therefore, to maintain long-term exponential growth the physical limits placed on the distribution networks appear to be counteracted by innovations deployed elsewhere in the system, namely at the points of acquisition and end-use of resources. We postulate that the maintenance of the growth of industrial society, as measured by global energy use, at the observed rate of ~ 2.4 % yr-1 stems from an implicit desire to optimise patterns of energy use over human working lifetimes.
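    As a back-of-envelope check on the quoted rate (added here for context; the figure is not from the paper), sustained growth of about 2.4 % per year implies a doubling time of roughly

      T_2 = \frac{\ln 2}{0.024\,\mathrm{yr}^{-1}} \approx 29\ \mathrm{yr},

    i.e. global energy use doubling about every three decades.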

  3. Global Gridded Data from the Goddard Earth Observing System Data Assimilation System (GEOS-DAS)

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The Goddard Earth Observing System Data Assimilation System (GEOS-DAS) timeseries is a globally gridded atmospheric data set for use in climate research. This near real-time data set is produced by the Data Assimilation Office (DAO) at the NASA Goddard Space Flight Center in direct support of the operational EOS instrument product generation from the Terra (12/1999 launch), Aqua (05/2002 launch) and Aura (01/2004 launch) spacecraft. The data is archived in the EOS Core System (ECS) at the Goddard Earth Sciences Data and Information Services Center/Distributed Active Archive Center (GES DISC DAAC). The data is only a selection of the products available from the GEOS-DAS. The data is organized chronologically in timeseries format to facilitate the computation of statistics. GEOS-DAS data will be available for the time period January 1, 2000, through the present.

  4. System Simulation by Recursive Feedback: Coupling a Set of Stand-Alone Subsystem Simulations

    NASA Technical Reports Server (NTRS)

    Nixon, D. D.

    2001-01-01

    Conventional construction of digital dynamic system simulations often involves collecting differential equations that model each subsystem, arranging them in a standard form, and obtaining their numerical solution as a single coupled, total-system simultaneous set. Simulation by numerical coupling of independent stand-alone subsimulations is a fundamentally different approach that is attractive because, among other things, the architecture naturally facilitates high fidelity, broad scope, and discipline independence. Recursive feedback is defined and discussed as a candidate approach to multidiscipline dynamic system simulation by numerical coupling of self-contained, single-discipline subsystem simulations. A satellite motion example containing three subsystems (orbit dynamics, attitude dynamics, and aerodynamics) has been defined and constructed using this approach. Conventional solution methods are used in the subsystem simulations. Distributed and centralized implementations of coupling have been considered. Numerical results are evaluated by direct comparison with a standard total-system, simultaneous-solution approach.
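    A minimal sketch of the recursive-feedback idea (using an assumed toy system and interface variables, not the paper's implementation): two stand-alone subsystem integrators exchange their latest outputs and repeat the same time step until the exchanged values stop changing, then both advance together.

      import math

      dt, t_end = 0.01, 5.0

      def step_subsystem_a(x, y_in):      # stand-alone integrator for dx/dt = -y, with y held fixed
          return x - y_in * dt

      def step_subsystem_b(y, x_in):      # stand-alone integrator for dy/dt = +x, with x held fixed
          return y + x_in * dt

      x, y, t = 1.0, 0.0, 0.0
      while t < t_end:
          x_new, y_new = x, y
          for _ in range(20):                       # recursive feedback within one time step
              x_next = step_subsystem_a(x, y_new)   # A uses B's latest output
              y_next = step_subsystem_b(y, x_new)   # B uses A's latest output
              if abs(x_next - x_new) + abs(y_next - y_new) < 1e-12:
                  x_new, y_new = x_next, y_next
                  break
              x_new, y_new = x_next, y_next
          x, y = x_new, y_new
          t += dt

      print(f"coupled result ({x:+.3f}, {y:+.3f}) vs analytic reference ({math.cos(t):+.3f}, {math.sin(t):+.3f})")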

  5. A DDS-Based Energy Management Framework for Small Microgrid Operation and Control

    DOE PAGES

    Youssef, Tarek A.; El Hariri, Mohamad; Elsayed, Ahmed T.; ...

    2017-09-26

    The smart grid is seen as a power system with real-time communication and control capabilities between the consumer and the utility. This modern platform facilitates the optimization of energy usage based on several factors, including environmental conditions, price preferences, and system technical issues. In this paper, a real-time energy management system (EMS) for microgrids or nanogrids was developed. The developed system involves an online optimization scheme to adapt its parameters based on previous, current, and forecasted future system states. The communication requirements for all EMS modules were analyzed, and all modules are integrated over a data distribution service (DDS) Ethernet network with appropriate quality of service (QoS) profiles. In conclusion, the developed EMS was emulated with actual residential energy consumption and irradiance data from Miami, Florida, and proved effective in reducing consumers’ bills and achieving flat peak load profiles.

  6. Overview of the LINCS architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.; Watson, R.W.

    1982-01-13

    Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years toward a computer-network-based resource-sharing environment. The increasing use of low cost and high performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost effective, reliable, and human engineered applications. We believe the answer lies in developing a layered, communication oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.

  7. School Leadership of the Future: How the National Education Institute in Slovenia Supported Schools to Develop Distributed Leadership Practice

    ERIC Educational Resources Information Center

    Sentocnik, Sonja; Rupar, Brigita

    2009-01-01

    Current educational literature suggests that distributing leadership in schools can facilitate individual and organizational development. While many state agencies in the United States and Europe are encouraging schools to reshape their leadership practice to distribute responsibilities for leadership tasks across roles, empirical research on how…

  8. A Tool for Teaching Three-Dimensional Dermatomes Combined with Distribution of Cutaneous Nerves on the Limbs

    ERIC Educational Resources Information Center

    Kooloos, Jan G. M.; Vorstenbosch, Marc A. T. M.

    2013-01-01

    A teaching tool that facilitates student understanding of a three-dimensional (3D) integration of dermatomes with peripheral cutaneous nerve field distributions is described. This model is inspired by the confusion in novice learners between dermatome maps and nerve field distribution maps. This confusion leads to the misconception that these two…

  9. Empirical Study of Nova Scotia Nurses' Adoption of Healthcare Information Systems: Implications for Management and Policy-Making.

    PubMed

    Ifinedo, Princely

    2017-08-13

    This paper used an extended Theory of Planned Behavior (TPB) to investigate nurses' adoption of healthcare information systems (HIS) in Nova Scotia, Canada. Data were collected from 197 nurses in a survey, and data analysis was carried out using the partial least squares (PLS) technique. In contrast to findings in prior studies that used TPB to investigate clinicians' adoption of technologies in Canada and elsewhere, this study found no statistically significant relationships between attitude or subjective norm and nurses' intention to use HIS. Rather, facilitating organizational conditions was the only TPB variable that explained sampled nurses' intention to use HIS at work. In particular, the effects of computer habit and computer anxiety among older nurses were significant. To encourage nurses' adoption of HIS, healthcare administrators need to pay attention to facilitating organizational conditions at work. Enhancing computer knowledge or competence is important for acceptance. Information presented in the study can be used by administrators of healthcare facilities in the research location and comparable parts of the world to further improve HIS adoption among nurses. The management of nursing professionals, especially in certain contexts (eg, prevalence of older nursing professionals), can make use of this study's insights. © 2018 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  10. Multilevel Drivers of HIV/AIDS among Black Philadelphians: Exploration Using Community Ethnography and Geographic Information Systems

    PubMed Central

    Brawner, Bridgette M.; Reason, Janaiya L.; Goodman, Bridget A.; Schensul, Jean J.; Guthrie, Barbara

    2014-01-01

    Background Unequal HIV/AIDS distribution is influenced by certain social and structural contexts that facilitate HIV transmission and concentrate HIV in disease epicenters. Thus, one of the first steps in designing effective community-level HIV/AIDS initiatives is to disentangle the influence of individual, social, and structural factors on HIV risk. Combining ethnographic methodology with geographic information systems (GIS) mapping can allow for a complex exploration of multilevel factors within communities that facilitate HIV transmission in highly affected areas. Objectives We present the formative comparative community-based case study findings of an investigation of individual-, social- , and structural-level factors that contribute to the HIV/AIDS epidemic among Black Philadelphians. Methods Communities were defined using census tracts. The methodology included ethnographic and GIS mapping, observation, informal conversations with residents and business owners, and secondary analyses of census tract-level data in four Philadelphia neighborhoods. Results Factors such as overcrowding, disadvantage, permeability in community boundaries, and availability and accessibility of health-related resources varied significantly. Further, HIV/AIDS trended with social and structural inequities above and beyond the community’s racial composition. Discussion This study was a first step to disentangle relationships between community-level factors and potential risk for HIV in an HIV epicenter. The findings also highlight stark sociodemographic differences within and across racial groups, and further substantiate the need for comprehensive, community-level HIV prevention interventions. These findings from targeted United States urban communities have potential applicability for examining the distribution of HIV/AIDS in broader national and international geosocial contexts. PMID:25738621

  11. WE-E-BRB-11: RIVIEW: A Web-Based Viewer for Radiotherapy.

    PubMed

    Apte, A; Wang, Y; Deasy, J

    2012-06-01

    Collaborations involving radiotherapy data collection, such as the recently proposed international radiogenomics consortium, require robust, web-based tools to facilitate reviewing treatment planning information. We present the architecture and prototype characteristics for a web-based radiotherapy viewer. The web-based environment developed in this work consists of the following components: 1) Import of DICOM/RTOG data: CERR was leveraged to import DICOM/RTOG data and to convert to database friendly RT objects. 2) Extraction and Storage of RT objects: The scan and dose distributions were stored as .png files per slice and view plane. The file locations were written to the MySQL database. Structure contours and DVH curves were written to the database as numeric data. 3) Web interfaces to query, retrieve and visualize the RT objects: The Web application was developed using HTML 5 and Ruby on Rails (RoR) technology following the MVC philosophy. The open source ImageMagick library was utilized to overlay scan, dose and structures. The application allows users to (i) QA the treatment plans associated with a study, (ii) Query and Retrieve patients matching anonymized ID and study, (iii) Review up to 4 plans simultaneously in 4 window panes (iv) Plot DVH curves for the selected structures and dose distributions. A subset of data for lung cancer patients was used to prototype the system. Five user accounts were created to have access to this study. The scans, doses, structures and DVHs for 10 patients were made available via the web application. A web-based system to facilitate QA, and support Query, Retrieve and the Visualization of RT data was prototyped. The RIVIEW system was developed using open source and free technology like MySQL and RoR. We plan to extend the RIVIEW system further to be useful in clinical trial data collection, outcomes research, cohort plan review and evaluation. © 2012 American Association of Physicists in Medicine.
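    As a small illustration of one piece of such a system, the following sketch computes a cumulative dose-volume histogram of the kind the viewer plots, from a synthetic dose grid and structure mask; the array shapes and dose values are made up, and no DICOM/RTOG handling or database access is shown.

      import numpy as np

      rng = np.random.default_rng(0)
      dose = rng.gamma(shape=8.0, scale=7.0, size=(40, 128, 128))   # synthetic dose grid (Gy)
      mask = np.zeros(dose.shape, dtype=bool)
      mask[15:25, 40:80, 40:80] = True                              # synthetic structure volume

      doses_in_structure = dose[mask]
      bins = np.arange(0.0, doses_in_structure.max() + 1.0, 1.0)    # 1 Gy dose bins
      # Cumulative DVH: fraction of the structure volume receiving at least each dose level.
      volume_fraction = [(doses_in_structure >= d).mean() for d in bins]

      for d, v in list(zip(bins, volume_fraction))[::20]:
          print(f">= {d:5.1f} Gy : {100 * v:5.1f} % of structure volume")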

  12. Annotation and visualization of endogenous retroviral sequences using the Distributed Annotation System (DAS) and eBioX

    PubMed Central

    Martínez Barrio, Álvaro; Lagercrantz, Erik; Sperber, Göran O; Blomberg, Jonas; Bongcam-Rudloff, Erik

    2009-01-01

    Background The Distributed Annotation System (DAS) is a widely used network protocol for sharing biological information. The distributed aspects of the protocol enable the use of various reference and annotation servers for connecting biological sequence data to pertinent annotations in order to depict an integrated view of the data for the final user. Results An annotation server has been devised to provide information about the endogenous retroviruses detected and annotated by a specialized in silico tool called RetroTector. We describe the procedure to implement the DAS 1.5 protocol commands necessary for constructing the DAS annotation server. We use our server to exemplify those steps. Data distribution is kept separate from visualization, which is carried out by eBioX, an easy-to-use open source program incorporating multiple bioinformatics utilities. Some well characterized endogenous retroviruses are shown in two different DAS clients. A rapid analysis of areas free from retroviral insertions could be facilitated by our annotations. Conclusion The DAS protocol has proven advantageous in the distribution of endogenous retrovirus data. The distributed nature of the protocol is also found to aid in combining annotation and visualization along a genome in order to enhance the understanding of ERV contribution to its evolution. Reference and annotation servers are conjointly used by eBioX to provide visualization of ERV annotations as well as other data sources. Our DAS data source can be found in the central public DAS service repository. PMID:19534743

  13. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline networks facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  14. The Diesel Combustion Collaboratory: Combustion Researchers Collaborating over the Internet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. M. Pancerella; L. A. Rahn; C. Yang

    2000-02-01

    The Diesel Combustion Collaboratory (DCC) is a pilot project to develop and deploy collaborative technologies to combustion researchers distributed throughout the DOE national laboratories, academia, and industry. The result is a problem-solving environment for combustion research. Researchers collaborate over the Internet using DCC tools, which include: a distributed execution management system for running combustion models on widely distributed computers, including supercomputers; web-accessible data archiving capabilities for sharing graphical experimental or modeling data; electronic notebooks and shared workspaces for facilitating collaboration; visualization of combustion data; and video-conferencing and data-conferencing among researchers at remote sites. Security is a key aspect of the collaborative tools. In many cases, the authors have integrated these tools to allow data, including large combustion data sets, to flow seamlessly, for example, from modeling tools to data archives. In this paper the authors describe the work of a larger collaborative effort to design, implement and deploy the DCC.

  15. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    NASA Astrophysics Data System (ADS)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.

  16. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment.

    PubMed

    Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel

    2016-08-30

    Distributed computing has achieved tremendous development since cloud computing was proposed in 2006 and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of its core components, MapReduce facilitates the allocation, processing, and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were also collected and analyzed in a detailed report. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs.
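    A loose reading of the two-phase idea, with synthetic progress samples and an assumed phase boundary (not necessarily the authors' exact formulation): fit separate linear models to the early and late phases of a task's progress curve and extrapolate the late-phase model to 100% progress to estimate the finishing time.

      import numpy as np

      t = np.array([2, 4, 6, 8, 10, 12, 14, 16, 18, 20], dtype=float)             # seconds elapsed
      p = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.33, 0.41, 0.49, 0.57, 0.65])  # task progress

      split = 5   # assumed index separating the two phases

      slope1, intercept1 = np.polyfit(t[:split], p[:split], 1)   # early-phase model
      slope2, intercept2 = np.polyfit(t[split:], p[split:], 1)   # late-phase model

      predicted_finish = (1.0 - intercept2) / slope2             # time at which progress reaches 1.0
      print(f"early rate {slope1:.3f}/s, late rate {slope2:.3f}/s, "
            f"predicted finishing time ~ {predicted_finish:.1f} s")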

  17. Advantages of Brahms for Specifying and Implementing a Multiagent Human-Robotic Exploration System

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron

    2003-01-01

    We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, all-terrain vehicles, robotic assistant, crew in a local habitat, and mission support team. Software processes ('agents') implemented in the Brahms language run on multiple, mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a runtime system. Thus, Brahms provides a language, engine, and system builder's toolkit for specifying and implementing multiagent systems.

  18. Application of the 15N tracer method to study the effect of pyrolysis temperature and atmosphere on the distribution of biochar nitrogen in the biomass-biochar-plant system.

    PubMed

    Tan, Zhongxin; Ye, Zhixiong; Zhang, Limei; Huang, Qiaoyun

    2018-05-01

    Biochar nitrogen is key to improving soil fertility, but the distribution of biochar nitrogen in the biomass-biochar-plant system is still unclear. To provide clarity, the 15N tracer method was utilised to study the distribution of biochar nitrogen both before and after the biochar's addition to the soil. The results can be summarised as follows. 1) The retention rate of 15N in biochar decreases from 45.23% to 20.09% as the pyrolysis temperature increases from 400 to 800°C in a CO2 atmosphere. 2) The retention rate of 15N in biochar prepared in a CO2 atmosphere is higher than that prepared in a N2 atmosphere when the pyrolysis temperature is below 600°C. 3) Not only can biochar N slowly facilitate the adsorption of N by plants, but the addition of biochar to the soil can also promote the supply of soil nitrogen to the plant; in contrast, the direct return of wheat straw biomass to the soil inhibits the absorption of soil N by plants. 4) In addition, the distribution of nitrogen was clarified; that is, when biochar was prepared by the pyrolysis of wheat straw at 400°C in a CO2 atmosphere, the biochar retained 45.23% of the N, and after the addition of this biochar to the soil, 39.99% of the N was conserved in the biochar residue, 4.55% was released into the soil, and 0.69% was contained in the wheat after growth for 31 days. Therefore, this study very clearly shows the distribution of nitrogen in the biomass-biochar-plant system. Copyright © 2017 Elsevier B.V. All rights reserved.
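
    The reported fractions are consistent with a simple mass balance, assuming all percentages are expressed relative to the originally labelled 15N in the biomass:

        \[
        \underbrace{39.99\%}_{\text{biochar residue}} \;+\; \underbrace{4.55\%}_{\text{soil}} \;+\; \underbrace{0.69\%}_{\text{wheat}} \;=\; 45.23\%,
        \]

    which equals the 15N retained by the biochar pyrolysed at 400°C in a CO2 atmosphere.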

  19. Building a Propulsion Experiment Project Management Environment

    NASA Technical Reports Server (NTRS)

    Keiser, Ken; Tanner, Steve; Hatcher, Danny; Graves, Sara

    2004-01-01

    What do you get when you cross rocket scientists with computer geeks? It is an interactive, distributed computing web of tools and services providing a more productive environment for propulsion research and development. The Rocket Engine Advancement Program 2 (REAP2) project involves researchers at several institutions collaborating on propulsion experiments and modeling. In an effort to facilitate these collaborations among researchers at different locations and with different specializations, researchers at the Information Technology and Systems Center, University of Alabama in Huntsville, are creating a prototype web-based interactive information system in support of propulsion research. This system, to be based on experience gained in creating similar systems for NASA Earth science field experiment campaigns such as the Convection and Moisture Experiments (CAMEX), will assist in the planning and analysis of model and experiment results across REAP2 participants. The initial version of the Propulsion Experiment Project Management Environment (PExPM) consists of a controlled-access web portal facilitating the drafting and sharing of working documents and publications. Interactive tools for building and searching an annotated bibliography of publications related to REAP2 research topics have been created to help organize and maintain the results of literature searches. Work is also underway, with some initial prototypes in place, on interactive project management tools that allow project managers to schedule experiment activities, track status, and report on results. This paper describes current successes, plans, and expected challenges for this project.

  20. Microgrid to enable optimal distributed energy retail and end-user demand response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Ming; Feng, Wei; Marnay, Chris

    In the face of unprecedented challenges in environmental sustainability and grid resilience, there is an increasingly held consensus regarding the adoption of distributed and renewable energy resources such as microgrids (MGs), and the utilization of flexible electric loads by demand response (DR), to potentially drive a necessary paradigm shift in energy production and consumption patterns. However, the potential value of distributed generation and demand flexibility has not yet been fully realized in the operation of MGs. This study investigates the pricing and operation strategy with DR for a MG retailer in an integrated energy system (IES). Based on co-optimizing retail rates and MG dispatch, formulated as a mixed integer quadratic programming (MIQP) problem, our model devises a dynamic pricing scheme that reflects the cost of generation and promotes DR, in tandem with an optimal dispatch plan that exploits spark spread and facilitates the integration of renewables, resulting in improved retailer profits and system stability. Main issues such as integrated energy coupling and customer bill reduction are addressed during pricing to ensure rate competitiveness and customer protection. Evaluation on real datasets demonstrates that the system optimally coordinates storage, renewables, and combined heat and power (CHP), reduces carbon dioxide emissions while maintaining profits, and effectively alleviates the PV curtailment problem. Finally, the model can be used by retailers and MG operators to optimize their operations, as well as by regulators to design new utility rates in support of the ongoing transformation of energy systems.
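
    As a hedged illustration only (the paper's full MIQP includes integer unit-commitment variables, storage and CHP dynamics, and rate-design constraints not shown here), the co-optimization of retail rates and dispatch can be sketched as

        \[
        \max_{p,\,u}\;\sum_{t=1}^{T}\Big[\,p_t\,d_t(p_t) \;-\; C(u_t)\,\Big]
        \quad\text{s.t.}\quad
        g(u_t) = d_t(p_t),\;\; u_t \in \mathcal{U},\;\; \underline{p} \le p_t \le \overline{p},
        \]

    where \(p_t\) is the retail rate in period \(t\), \(d_t(p_t)\) the price-responsive demand, \(u_t\) the microgrid dispatch decisions, \(C(\cdot)\) the operating cost, \(g(u_t)=d_t(p_t)\) the energy balance, and the rate bounds encode customer protection.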

  1. SCSODC: Integrating Ocean Data for Visualization Sharing and Application

    NASA Astrophysics Data System (ADS)

    Xu, C.; Li, S.; Wang, D.; Xie, Q.

    2014-02-01

    The South China Sea Ocean Data Center (SCSODC) was founded in 2010 in order to improve the collection and management of ocean data at the South China Sea Institute of Oceanology (SCSIO). The mission of SCSODC is to ensure the long-term scientific stewardship of ocean data, information and products - collected through research groups, monitoring stations and observation cruises - and to facilitate their efficient use and distribution to potential users. However, data sharing and applications were limited due to the characteristics of distribution and heterogeneity that made it difficult to integrate the data. To surmount those difficulties, the Data Sharing System has been developed by the SCSODC using the most appropriate information management and information technology. The Data Sharing System uses open standards and tools to promote the capability to integrate ocean data and to interact with other data portals or users, and it includes a full range of processes such as data discovery, evaluation and access, combining C/S and B/S modes. It provides a visualized management interface for data managers and a transparent and seamless data access and application environment for users. Users are allowed to access data using the client software and to access the interactive visualization application interface via a web browser. The architecture, key technologies and functionality of the system are discussed briefly in this paper. It is shown that the SCSODC system is able to implement web-based visualization sharing and seamless access to ocean data in a distributed and heterogeneous environment.

  2. Microgrid to enable optimal distributed energy retail and end-user demand response

    DOE PAGES

    Jin, Ming; Feng, Wei; Marnay, Chris; ...

    2018-06-07

    In the face of unprecedented challenges in environmental sustainability and grid resilience, there is an increasingly held consensus regarding the adoption of distributed and renewable energy resources such as microgrids (MGs), and the utilization of flexible electric loads by demand response (DR), to potentially drive a necessary paradigm shift in energy production and consumption patterns. However, the potential value of distributed generation and demand flexibility has not yet been fully realized in the operation of MGs. This study investigates the pricing and operation strategy with DR for a MG retailer in an integrated energy system (IES). Based on co-optimizing retail rates and MG dispatch, formulated as a mixed integer quadratic programming (MIQP) problem, our model devises a dynamic pricing scheme that reflects the cost of generation and promotes DR, in tandem with an optimal dispatch plan that exploits spark spread and facilitates the integration of renewables, resulting in improved retailer profits and system stability. Main issues such as integrated energy coupling and customer bill reduction are addressed during pricing to ensure rate competitiveness and customer protection. Evaluation on real datasets demonstrates that the system optimally coordinates storage, renewables, and combined heat and power (CHP), reduces carbon dioxide emissions while maintaining profits, and effectively alleviates the PV curtailment problem. Finally, the model can be used by retailers and MG operators to optimize their operations, as well as by regulators to design new utility rates in support of the ongoing transformation of energy systems.

  3. Local oscillator distribution using a geostationary satellite

    NASA Technical Reports Server (NTRS)

    Bardin, Joseph; Weinreb, Sander; Bagri, Durga

    2004-01-01

    A satellite communication system suitable for distributing local oscillator reference signals for a widely spaced microwave array has been developed and tested experimentally. The system uses a round-trip correction method through the satellite. The experiment was carried out using Telstar-5, a commercial Ku-band geostationary satellite. For this initial experiment, both earth stations were located at the same site to facilitate direct comparison of the received signals. The local oscillator reference frequency was chosen to be 300 MHz and was sent as the difference between two Ku-band tones. The residual error after applying the round-trip correction was measured to be better than 3 ps for integration times ranging from 1 to 2000 seconds. For integration times greater than 500 seconds, the system outperforms a pair of hydrogen masers, with the limitation believed to be ground-based equipment phase stability. The idea of distributing local oscillators using a geostationary satellite is not new; several researchers experimented with this technique in the eighties, but the achieved accuracy was 3 to 100 times worse than the present results. Since then, conditions have changed substantially and the performance of various components has improved. An important factor is the leasing of small amounts of satellite communication bandwidth: we lease three 100 kHz bands at approximately one hundredth the cost of a full 36 MHz transponder. Further tests of the system using terminals separated by large distances, together with comparison tests against two hydrogen masers and radio interferometry, are needed.
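
    As a generic illustration of the round-trip principle mentioned above (not the exact correction applied in this experiment), assume the up- and down-link paths are reciprocal; the one-way delay can then be estimated as half the measured round-trip delay and removed from the received reference phase:

        \[
        \tau_{\text{one}}(t) \;\approx\; \tfrac{1}{2}\,\tau_{\text{rt}}(t),
        \qquad
        \phi_{\text{corr}}(t) \;=\; \phi_{\text{rx}}(t) \;-\; 2\pi f_{\text{ref}}\,\tau_{\text{one}}(t),
        \]

    so that slow variations in the satellite path largely cancel, leaving ground-equipment phase stability as the limiting factor.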

  4. Design and methodology of the Geo-social Analysis of Physicians' settlement (GAP-Study) in Germany.

    PubMed

    Groneberg, David A; Boll, Michael; Bauer, Jan

    2016-01-01

    Unequally distributed disease burdens within populations are well known and occur worldwide, depending on residents' social status and/or ethnic background. Country-specific health care systems - especially the coverage and distribution of health care providers - are both a potential cause of and an important solution for health inequalities. Registers of all accredited physicians and psychotherapists within the outpatient care system in German metropolises are built by utilizing the database of the Associations of Statutory Health Insurance Physicians. The physicians' practice neighborhoods will be analyzed from socioeconomic and demographic perspectives. Therefore, official city district statistics will be assigned to the physicians and psychotherapists according to their practice location. Averages of neighborhood indicators will be calculated for each specialty. Moreover, advanced studies will inspect differences by physicians' gender or practice type. Geo-spatial analyses of the intra-city distribution of practices will complete the settlement characteristics of physicians and psychotherapists within the outpatient care system in German metropolises. The project "Geo-social Analysis of Physicians' settlement" (GAP) is designed to elucidate gaps in physician coverage within the outpatient care system, dependent on neighborhood residents' social status or ethnicity, in German metropolises. The methodology of the GAP-Study enables the standardized investigation of physicians' settlement behavior in German metropolises and inter-city comparisons. The identification of potential gaps in physician coverage should facilitate the delineation of approaches for solving health care inequality problems.

  5. Influenza Virus Database (IVDB): an integrated information resource and analysis platform for influenza virus research.

    PubMed

    Chang, Suhua; Zhang, Jiajie; Liao, Xiaoyun; Zhu, Xinxing; Wang, Dahai; Zhu, Jiang; Feng, Tao; Zhu, Baoli; Gao, George F; Wang, Jian; Yang, Huanming; Yu, Jun; Wang, Jing

    2007-01-01

    Frequent outbreaks of highly pathogenic avian influenza and the increasing data available for comparative analysis require a central database specialized in influenza viruses (IVs). We have established the Influenza Virus Database (IVDB) to integrate information and create an analysis platform for genetic, genomic, and phylogenetic studies of the virus. IVDB hosts complete genome sequences of influenza A virus generated by Beijing Institute of Genomics (BIG) and curates all other published IV sequences after expert annotation. Our Q-Filter system classifies and ranks all nucleotide sequences into seven categories according to sequence content and integrity. IVDB provides a series of tools and viewers for comparative analysis of the viral genomes, genes, genetic polymorphisms and phylogenetic relationships. A search system has been developed for users to retrieve a combination of different data types by setting search options. To facilitate analysis of global viral transmission and evolution, the IV Sequence Distribution Tool (IVDT) has been developed to display the worldwide geographic distribution of chosen viral genotypes and to couple genomic data with epidemiological data. The BLAST, multiple sequence alignment and phylogenetic analysis tools were integrated for online data analysis. Furthermore, IVDB offers instant access to pre-computed alignments and polymorphisms of IV genes and proteins, and presents the results as SNP distribution plots and minor allele distributions. IVDB is publicly available at http://influenza.genomics.org.cn.

  6. Design of object-oriented distributed simulation classes

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D. (Principal Investigator)

    1995-01-01

    Distributed simulation of aircraft engines as part of a computer aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for 'Numerical Propulsion Simulation System'. NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is a need for communication among the parallel executing processors, which in turn implies a need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT 'Actor' model of a concurrent object and uses 'connectors' to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out, with the result that communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented. Its application to realistic configurations has not been carried out.
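
    As a hedged, thread-based sketch of the 'connector' idea (names are hypothetical, and the real NPSS design distributes components across machines rather than threads), connectors route messages between simulation components and can be wired together at run time.

        # Minimal sketch: connector objects link simulation components by message
        # passing; components know only their connectors, not each other.
        import queue, threading

        class Connector:
            def __init__(self):
                self._q = queue.Queue()
            def send(self, msg):
                self._q.put(msg)
            def receive(self):
                return self._q.get()

        class Component(threading.Thread):
            def __init__(self, name, inlet, outlet, steps=3):
                super().__init__()
                self.name, self.inlet, self.outlet, self.steps = name, inlet, outlet, steps
            def run(self):
                for _ in range(self.steps):
                    value = self.inlet.receive()      # wait for upstream data
                    self.outlet.send(value + 1)       # compute and pass downstream

        # Connectors can be created at run time to wire components together.
        c_in, c_mid, c_out = Connector(), Connector(), Connector()
        compressor = Component("compressor", c_in, c_mid)
        turbine    = Component("turbine",    c_mid, c_out)
        compressor.start(); turbine.start()
        for step in range(3):
            c_in.send(step)
            print("output:", c_out.receive())
        compressor.join(); turbine.join()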

  7. Design of Object-Oriented Distributed Simulation Classes

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1995-01-01

    Distributed simulation of aircraft engines as part of a computer aided design package is being developed by NASA Lewis Research Center for the aircraft industry. The project is called NPSS, an acronym for "Numerical Propulsion Simulation System". NPSS is a flexible object-oriented simulation of aircraft engines requiring high computing speed. It is desirable to run the simulation on a distributed computer system with multiple processors executing portions of the simulation in parallel. The purpose of this research was to investigate object-oriented structures such that individual objects could be distributed. The set of classes used in the simulation must be designed to facilitate parallel computation. Since the portions of the simulation carried out in parallel are not independent of one another, there is a need for communication among the parallel executing processors, which in turn implies a need for their synchronization. Communication and synchronization can lead to decreased throughput as parallel processors wait for data or synchronization signals from other processors. As a result of this research, the following have been accomplished. The design and implementation of a set of simulation classes which result in a distributed simulation control program have been completed. The design is based upon the MIT "Actor" model of a concurrent object and uses "connectors" to structure dynamic connections between simulation components. Connectors may be dynamically created according to the distribution of objects among machines at execution time without any programming changes. Measurements of the basic performance have been carried out, with the result that communication overhead of the distributed design is swamped by the computation time of modules unless modules have very short execution times per iteration or time step. An analytical performance model based upon queuing network theory has been designed and implemented. Its application to realistic configurations has not been carried out.

  8. The integrated information architecture: a pilot study approach to leveraging logistics management with regard to influenza preparedness.

    PubMed

    Lin, Chinho; Lin, Chun-Mei; Yen, David C; Wu, Wu-Han

    2012-02-01

    Pandemic influenza is considered catastrophic to global health, with severe economic and social effects. Consequently, a strategy is needed for the rapid deployment of essential medical supplies used to prevent influenza transmission and to alleviate the public panic caused by the expected shortage of such supplies. Therefore, we employ integrated information concepts to develop a simulated influenza medical material supply system to facilitate a rapid response to such a crisis. Various scenarios are analyzed to estimate the appropriate inventory policy needed under different pandemic influenza outbreaks, and to establish a mechanism for evaluating the necessary stockpiles of medications and other requirements in the different phases of the pandemic. This study constructed a web-based decision support system framework prototype that displays transparent data related to medical stockpiles in each district and integrates expert opinion about the best distribution of these supplies in the influenza pandemic scenarios. A data collection system was also designed to gather information through daily VPN transmissions into one central repository for reporting and distribution purposes. This study provides timely and transparent medical supply distribution information that can help decision makers make appropriate decisions under different pandemic influenza outbreaks, and it also attempts to establish a mechanism for evaluating the stockpiles and requirements in the different phases of the pandemic.

  9. Canine spontaneous glioma: A translational model system for convection-enhanced delivery

    PubMed Central

    Dickinson, Peter J.; LeCouteur, Richard A.; Higgins, Robert J.; Bringas, John R.; Larson, Richard F.; Yamashita, Yoji; Krauze, Michal T.; Forsayeth, John; Noble, Charles O.; Drummond, Daryl C.; Kirpotin, Dmitri B.; Park, John W.; Berger, Mitchel S.; Bankiewicz, Krystof S.

    2010-01-01

    Canine spontaneous intracranial tumors bear striking similarities to their human tumor counterparts and have the potential to provide a large animal model system for more realistic validation of novel therapies typically developed in small rodent models. We used spontaneously occurring canine gliomas to investigate the use of convection-enhanced delivery (CED) of liposomal nanoparticles, containing topoisomerase inhibitor CPT-11. To facilitate visualization of intratumoral infusions by real-time magnetic resonance imaging (MRI), we included identically formulated liposomes loaded with Gadoteridol. Real-time MRI defined distribution of infusate within both tumor and normal brain tissues. The most important limiting factor for volume of distribution within tumor tissue was the leakage of infusate into ventricular or subarachnoid spaces. Decreased tumor volume, tumor necrosis, and modulation of tumor phenotype correlated with volume of distribution of infusate (Vd), infusion location, and leakage as determined by real-time MRI and histopathology. This study demonstrates the potential for canine spontaneous gliomas as a model system for the validation and development of novel therapeutic strategies for human brain tumors. Data obtained from infusions monitored in real time in a large, spontaneous tumor may provide information, allowing more accurate prediction and optimization of infusion parameters. Variability in Vd between tumors strongly suggests that real-time imaging should be an essential component of CED therapeutic trials to allow minimization of inappropriate infusions and accurate assessment of clinical outcomes. PMID:20488958

  10. A Role for Semantic Web Technologies in Patient Record Data Collection

    NASA Astrophysics Data System (ADS)

    Ogbuji, Chimezie

    Business Process Management Systems (BPMS) are a component of the stack of Web standards that comprise Service Oriented Architecture (SOA). Such systems are representative of the architectural framework of modern information systems built in an enterprise intranet and are in contrast to systems built for deployment on the larger World Wide Web. The REST architectural style is an emerging style for building loosely coupled systems based purely on the native HTTP protocol. It is a coordinated set of architectural constraints with a goal to minimize latency, maximize the independence and scalability of distributed components, and facilitate the use of intermediary processors. Within the development community for distributed, Web-based systems, there has been a debate regarding the merits of both approaches. In some cases, there are legitimate concerns about the differences in both architectural styles. In other cases, the contention seems to be based on concerns that are marginal at best. In this chapter, we will attempt to contribute to this debate by focusing on a specific, deployed use case that emphasizes the role of the Semantic Web, a simple Web application architecture that leverages the use of declarative XML processing, and the needs of a workflow system. The use case involves orchestrating a work process associated with the data entry of structured patient record content into a research registry at the Cleveland Clinic's Clinical Investigation department in the Heart and Vascular Institute.

  11. Cognitive Systems Modeling and Analysis of Command and Control Systems

    NASA Technical Reports Server (NTRS)

    Norlander, Arne

    2012-01-01

    Military operations, counter-terrorism operations and emergency response often oblige operators and commanders to operate within distributed organizations and systems for safe and effective mission accomplishment. Tactical commanders and operators frequently encounter violent threats and critical demands on cognitive capacity and reaction time. In the future they will make decisions in situations where operational and system characteristics are highly dynamic and non-linear, i.e. minor events, decisions or actions may have serious and irreversible consequences for the entire mission. Commanders and other decision makers must manage true real-time properties at all levels; individual operators, stand-alone technical systems, higher-order integrated human-machine systems and joint operations forces alike. Coping with these conditions in performance assessment, system development and operational testing is a challenge for both practitioners and researchers. This paper reports on research whose results led to a breakthrough: an integrated approach to information-centered systems analysis to support future command and control systems research and development. This approach integrates several areas of research into a coherent framework, Action Control Theory (ACT). It comprises measurement techniques and methodological advances that facilitate a more accurate and deeper understanding of the operational environment, its agents, actors and effectors, generating new and updated models. This in turn generates theoretical advances. Some good examples of successful approaches are found in the research areas of cognitive systems engineering, systems theory, and psychophysiology, and in the fields of dynamic, distributed decision making and naturalistic decision making.

  12. Team Learning in Technology-Mediated Distributed Teams

    ERIC Educational Resources Information Center

    Andres, Hayward P.; Shipps, Belinda P.

    2010-01-01

    This study examines technological, educational/learning, and social affordances associated with the facilitation of project-based learning and problem solving in technology-mediated distributed teams. An empirical interpretive research approach using direct observation is used to interpret, evaluate and rate observable manifested behaviors and…

  13. Visual Computing Environment

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Putt, Charles W.

    1997-01-01

    The Visual Computing Environment (VCE) is a NASA Lewis Research Center project to develop a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis. The objectives of VCE are to (1) develop a visual computing environment for controlling the execution of individual simulation codes that are running in parallel and are distributed on heterogeneous host machines in a networked environment, (2) develop numerical coupling algorithms for interchanging boundary conditions between codes with arbitrary grid matching and different levels of dimensionality, (3) provide a graphical interface for simulation setup and control, and (4) provide tools for online visualization and plotting. VCE was designed to provide a distributed, object-oriented environment. Mechanisms are provided for creating and manipulating objects, such as grids, boundary conditions, and solution data. This environment includes parallel virtual machine (PVM) for distributed processing. Users can interactively select and couple any set of codes that have been modified to run in a parallel distributed fashion on a cluster of heterogeneous workstations. A scripting facility allows users to dictate the sequence of events that make up the particular simulation.
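
    As an illustration of one coupling primitive mentioned above, the sketch below interpolates a boundary-condition profile computed on one code's grid onto another code's non-matching grid; the grids and values are made up for illustration, and the real VCE coupling handles higher dimensionality and arbitrary grid matching.

        # Sketch of one coupling step: interpolate a boundary profile from the
        # donor code's grid onto the receiving code's (non-matching) grid.
        import numpy as np

        grid_a   = np.linspace(0.0, 1.0, 11)            # donor code's boundary grid
        values_a = np.sin(np.pi * grid_a)                # e.g., a pressure profile

        grid_b   = np.linspace(0.0, 1.0, 7)              # receiving code's boundary grid
        values_b = np.interp(grid_b, grid_a, values_a)   # 1-D interpolation onto grid_b

        print(values_b)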

  14. A multiprocessor operating system simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, G.M.; Campbell, R.H.

    1988-01-01

    This paper describes a multiprocessor operating system simulator that was developed by the authors in the Fall of 1987. The simulator was built in response to the need to provide students with an environment in which to build and test operating system concepts as part of the coursework of a third-year undergraduate operating systems course. Written in C++, the simulator uses the co-routine style task package that is distributed with the AT&T C++ Translator to provide a hierarchy of classes that represents a broad range of operating system software and hardware components. The class hierarchy closely follows that of the Choices family of operating systems for loosely and tightly coupled multiprocessors. During an operating system course, these classes are refined and specialized by students in homework assignments to facilitate experimentation with different aspects of operating system design and policy decisions. The current implementation runs on the IBM RT PC under 4.3bsd UNIX.

  15. The Glymphatic System: A Beginner's Guide.

    PubMed

    Jessen, Nadia Aalling; Munk, Anne Sofie Finmann; Lundgaard, Iben; Nedergaard, Maiken

    2015-12-01

    The glymphatic system is a recently discovered macroscopic waste clearance system that utilizes a unique system of perivascular tunnels, formed by astroglial cells, to promote efficient elimination of soluble proteins and metabolites from the central nervous system. Besides waste elimination, the glymphatic system also facilitates brain-wide distribution of several compounds, including glucose, lipids, amino acids, growth factors, and neuromodulators. Intriguingly, the glymphatic system functions mainly during sleep and is largely disengaged during wakefulness. The biological need for sleep across all species may therefore reflect that the brain must enter a state of activity that enables the elimination of potentially neurotoxic waste products, including β-amyloid. Since the concept of the glymphatic system is relatively new, we here review its basic structural elements, organization, regulation, and functions. We also discuss recent studies indicating that glymphatic function is suppressed in various diseases and that failure of glymphatic function in turn might contribute to pathology in neurodegenerative disorders, traumatic brain injury and stroke.

  16. Emergent Aerospace Designs Using Negotiating Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Deshmukh, Abhijit; Middelkoop, Timothy; Krothapalli, Anjaneyulu; Smith, Charles

    2000-01-01

    This paper presents a distributed design methodology where designs emerge as a result of negotiations between different stakeholders in the process, such as cost, performance, reliability, etc. The proposed methodology uses autonomous agents to represent design decision makers. Each agent influences specific design parameters in order to maximize its utility. Since the design parameters depend on the aggregate demand of all the agents in the system, design agents need to negotiate with others in the market economy in order to reach an acceptable utility value. This paper addresses several interesting research issues related to distributed design architectures. First, we present a flexible framework which facilitates decomposition of the design problem. Second, we present an overview of a market mechanism for generating acceptable design configurations. Finally, we integrate learning mechanisms into the design process to reduce the computational overhead.

  17. Intelligent manufacturing: the challenge for manufacturing strategy in China in the 21st century--what we will do

    NASA Astrophysics Data System (ADS)

    Yang, Shuzi; Lei, Ming; Guan, Zai-Lin; Xiong, Youlun

    1995-08-01

    This paper first introduces the project of intelligent manufacturing in China and the research state of the IIMRC (Intelligent and Integrated Manufacturing Research Centre) of HUST (Huazhong University of Science and Technology), then reviews the recent advances in object- oriented and distributed artificial intelligence and puts forth the view that these advances open up the prospect of systems that will enable the true integration of enterprises. In an attempt to identify domain requirements and match them with research achievements, the paper examines the current literature and distinguishes 14 features that are common. It argues that effective enterprise-wide support could be greatly facilitated by the existence of intelligent software entities with autonomous processing capabilities, that possess coordination and negotiation facilities and are organized in distributed hierarchical states.

  18. Disturbance facilitates the coexistence of antagonistic ecosystem engineers in California estuaries.

    PubMed

    Castorani, Max C N; Hovel, Kevin A; Williams, Susan L; Baskett, Marissa L

    2014-08-01

    Ecological theory predicts that interactions between antagonistic ecosystem engineers can lead to local competitive exclusion, but disturbance can facilitate broader coexistence. However, few empirical studies have tested the potential for disturbance to mediate competition between engineers. We examined the capacity for disturbance and habitat modification to explain the disjunct distributions of two benthic ecosystem engineers, eelgrass Zostera marina and the burrowing ghost shrimp Neotrypaea californiensis, in two California estuaries. Sediment sampling in eelgrass and ghost shrimp patches revealed that ghost shrimp change benthic biogeochemistry over small scales (centimeters) but not patch scales (meters to tens of meters), suggesting a limited capacity for sediment modification to explain species distributions. To determine the relative competitive abilities of engineers, we conducted reciprocal transplantations of ghost shrimp and eelgrass. Local ghost shrimp densities declined rapidly following the addition of eelgrass, and transplanted eelgrass expanded laterally into the surrounding ghost shrimp-dominated areas. When transplanted into eelgrass patches, ghost shrimp failed to persist. Ghost shrimp were also displaced from plots with structural mimics of eelgrass rhizomes and roots, suggesting that autogenic habitat modification by eelgrass is an important mechanism determining ghost shrimp distributions. However, ghost shrimp were able to rapidly colonize experimental disturbances to eelgrass patch edges, which are common in shallow estuaries. We conclude that coexistence in this system is maintained by spatiotemporally asynchronous disturbances and a competition-colonization trade-off: eelgrass is a competitively superior ecosystem engineer, but benthic disturbances permit the coexistence of ghost shrimp at the landscape scale by modulating the availability of space.

  19. Strategies for distributing cancer screening decision aids in primary care.

    PubMed

    Brackett, Charles; Kearing, Stephen; Cochran, Nan; Tosteson, Anna N A; Blair Brooks, W

    2010-02-01

    Decision aids (DAs) have been shown to facilitate shared decision making about cancer screening. However, little data exist on optimal strategies for dissemination. Our objective was to compare different decision aid distribution models. Eligible patients received video decision aids for prostate cancer (PSA) or colon cancer screening (CRC) through 4 distribution methods. Outcome measures included DA loans (N), % of eligible patients receiving DA, and patient and provider satisfaction. Automatically mailing DAs to all age/gender appropriate patients led to near universal receipt by screening-eligible patients, but also led to ineligible patients receiving DAs. Three different elective (non-automatic) strategies led to low rates of receipt. Clinician satisfaction was higher when patients viewed the DA before the visit, and this model facilitated implementation of the screening choice. Regardless of timing or distribution method, patient satisfaction was high. An automatic DA distribution method is more effective than relying on individual initiative. Enabling patients to view the DA before the visit is preferred. Systematically offering DAs to all eligible patients before their appointments is the ideal strategy, but may be challenging to implement. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.

  20. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    NASA Astrophysics Data System (ADS)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate the scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers about the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, as well as mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.

  1. Remote sensing of the distribution and abundance of host species for spruce budworm in Northern Minnesota and Ontario

    Treesearch

    Peter T. Wolter; Philip A. Townsend; Brian R. Sturtevant; Clayton C. Kingdon

    2008-01-01

    Insects and disease affect large areas of forest in the U.S. and Canada. Understanding ecosystem impacts of such disturbances requires knowledge of host species distribution patterns on the landscape. In this study, we mapped the distribution and abundance of host species for the spruce budworm (Choristoneura fumiferana) to facilitate landscape scale...

  2. Macromolecular Structure Database. Final Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilliland, Gary L.

    2003-09-23

    The central activity of the PDB continues to be the collection, archiving and distribution of high quality structural data to the scientific community on a timely basis. In support of these activities NIST has continued its roles in developing the physical archive, in developing data uniformity, in dealing with NMR issues and in the distribution of PDB data through CD-ROMs. The physical archive holdings have been organized and inventoried, and a database has been created to facilitate their use. Data from individual PDB entries have been annotated to produce uniform values, tremendously improving the accuracy of query results. Working with the NMR community, we have established data items specific to NMR that will be included in new entries and facilitate data deposition. PDB CD-ROM production has continued on a quarterly basis, and new products are being distributed.

  3. A novel key management solution for reinforcing compliance with HIPAA privacy/security regulations.

    PubMed

    Lee, Chien-Ding; Ho, Kevin I-J; Lee, Wei-Bin

    2011-07-01

    Digitizing medical records facilitates the healthcare process. However, it can also cause serious security and privacy problems, which are the major concern in the Health Insurance Portability and Accountability Act (HIPAA). While various conventional encryption mechanisms can solve some aspects of these problems, they cannot address the illegal distribution of decrypted medical images, which violates the regulations defined in the HIPAA. To protect decrypted medical images from being illegally distributed by an authorized staff member, the model proposed in this paper provides a way to integrate several cryptographic mechanisms. In this model, the malicious staff member can be tracked by a watermarked clue. By combining several well-designed cryptographic mechanisms and developing a key management scheme to facilitate the interoperation among these mechanisms, the risk of illegal distribution can be reduced.
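
    As an illustration only of the "watermarked clue" idea (this is not the paper's scheme, which integrates several cryptographic mechanisms under a key management scheme), a least-significant-bit embedding of a staff identifier into image pixels might look like the following sketch; the identifier width and pixel layout are assumptions.

        # Illustration only: embed a traceable staff identifier into image pixels
        # via least-significant bits, and recover it later. NOT the paper's scheme.
        import numpy as np

        def embed_id(pixels, staff_id, bits=16):
            flat = pixels.flatten()                      # flatten() returns a copy
            for i in range(bits):
                bit = (staff_id >> i) & 1
                flat[i] = (flat[i] & 0xFE) | bit         # overwrite the LSB
            return flat.reshape(pixels.shape)

        def extract_id(pixels, bits=16):
            flat = pixels.flatten()
            return sum((int(flat[i]) & 1) << i for i in range(bits))

        image  = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
        marked = embed_id(image, staff_id=0x2A7)
        print(hex(extract_id(marked)))                   # -> 0x2a7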

  4. Astronomical large projects managed with MANATEE: management tool for effective engineering

    NASA Astrophysics Data System (ADS)

    García-Vargas, M. L.; Mujica-Alvarez, E.; Pérez-Calpena, A.

    2012-09-01

    This paper describes MANATEE, the project management web tool developed by FRACTAL, specifically designed for managing large astronomical projects. MANATEE facilitates management by providing an overall view of the project and the capability to control the three main project parameters: scope, schedule and budget. MANATEE is one of the three tools of the FRACTAL System & Project Suite, which also comprises GECO (System Engineering Tool) and DOCMA (Documentation Management Tool). These tools are especially suited for those Consortia and teams collaborating on a multi-discipline, complex project in a geographically distributed environment. Our management view has been applied successfully in several projects and is currently being used for managing MEGARA, the next instrument for the GTC 10 m telescope.

  5. Pi-Sat: A Low Cost Small Satellite and Distributed Spacecraft Mission System Test Platform

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan

    2015-01-01

    Current technology and budget trends indicate a shift in satellite architectures from large, expensive single-satellite missions to small, low-cost distributed spacecraft missions. At the center of this shift is the SmallSat/CubeSat architecture. The primary goal of the Pi-Sat project is to create a low-cost and easy-to-use Distributed Spacecraft Mission (DSM) test bed to facilitate the research and development of next-generation DSM technologies and concepts. This test bed also serves as a realistic software development platform for SmallSat and CubeSat architectures. The Pi-Sat is based on the popular $35 Raspberry Pi single-board computer featuring a 700 MHz ARM processor, 512 MB of RAM, a flash memory card, and a wealth of I/O options. The Raspberry Pi runs the Linux operating system and can easily run Code 582's Core Flight System flight software architecture. The low cost and high availability of the Raspberry Pi make it an ideal platform for Distributed Spacecraft Mission and CubeSat software development. The Pi-Sat models currently include a Pi-Sat 1U Cube, a Pi-Sat Wireless Node, and a Pi-Sat CubeSat processor card. The Pi-Sat project takes advantage of many popular trends in the Maker community, including low-cost electronics, 3D printing, and rapid prototyping, in order to provide a realistic platform for flight software testing, training, and technology development. The Pi-Sat has also provided fantastic hands-on training opportunities for NASA summer interns and Pathways students.

  6. Numeric stratigraphic modeling: Testing sequence stratigraphic concepts using high resolution geologic examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armentrout, J.M.; Smith-Rouch, L.S.; Bowman, S.A.

    1996-08-01

    Numeric simulations based on integrated data sets enhance our understanding of depositional geometry and facilitate quantification of depositional processes. Numeric values tested against well-constrained geologic data sets can then be used in iterations testing each variable, and in predicting lithofacies distributions under various depositional scenarios using the principles of sequence stratigraphic analysis. The stratigraphic modeling software provides a broad spectrum of techniques for modeling and testing elements of the petroleum system. Using well-constrained geologic examples, variations in depositional geometry and lithofacies distributions between different tectonic settings (passive vs. active margin) and climate regimes (hothouse vs. icehouse) can provide insight into potential source rock and reservoir rock distribution, maturation timing, migration pathways, and trap formation. Two data sets are used to illustrate such variations: both include a seismic reflection profile calibrated by multiple wells. The first is a Pennsylvanian mixed carbonate-siliciclastic system in the Paradox basin, and the second a Pliocene-Pleistocene siliciclastic system in the Gulf of Mexico. Numeric simulations result in geometry and facies distributions consistent with those interpreted using the integrated stratigraphic analysis of the calibrated seismic profiles. An exception occurs in the Gulf of Mexico study, where the simulated sediment thickness from 3.8 to 1.6 Ma within an upper slope minibasin was less than that mapped using a regional seismic grid. Regional depositional patterns demonstrate that this extra thickness was probably sourced from out of the plane of the modeled transect, illustrating the necessity for three-dimensional constraints on two-dimensional modeling.

  7. Synchrotron Radiation Microcomputed Tomography Guided Chromatographic Analysis for Displaying the Material Distribution in Tablets.

    PubMed

    Zhang, Liu; Wu, Li; Wang, Caifen; Zhang, Guoqing; Yu, Lin; Li, Haiyan; Maharjan, Abi; Tang, Yan; He, Dunwei; York, Peter; Sun, Huimin; Yin, Xianzhen; Zhang, Jiwen; Sun, Lixin

    2018-03-06

    One unusual and challenging scientific field that has received only cursory attention to date is the three-dimensional (3D) microstructure and spatial distribution of drug(s) and formulation materials in solid dosage forms. This study aims to provide deeper insight into the relationships between the microstructure of multiple-unit pellet system (MUPS) tablets and the spatial distribution of the active pharmaceutical ingredient (API) and excipients to facilitate the design of quantitative models for drug delivery systems. Synchrotron radiation X-ray microcomputed tomography (SR-μCT) was established as a 3D structure elucidation technique, which, in conjunction with liquid chromatography coupled to mass spectrometry (LC-MS) or liquid chromatography with evaporative light-scattering detector (LC-ELSD) enables chemical analysis of tablets. On the basis of the specific interior construction of theophylline MUPS tablets, the spatial distribution of materials was acquired by quantifying microregion samples that had been validated by SR-μCT for their locations in the MUPS tablets. The 3D structure of the MUPS tablets was catalogued as three structural domains: a matrix layer (ML), a protective cushion layer (PCL), and pellets (PL). Compared with the components in the ML, components in the PL had a larger proportion of theophylline, sucrose, and diethyl phthalate and a smaller proportion of lactose and sodium lauryl sulfate, whereas glyceryl monostearate was found to account for a large portion of the PCL. Microstructural characterization-guided zonal chemical determination represents a new approach for quality assessment and the development of drug delivery systems with in-depth insight into their constituent layers on a new scale.

  8. Collaborative Research: Robust Climate Projections and Stochastic Stability of Dynamical Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilya Zaliapin

    This project focused on conceptual exploration of El Niño/Southern Oscillation (ENSO) variability and sensitivity using a delay differential equation model developed in the project. We have (i) established the existence and continuous dependence of solutions of the model, (ii) explored multiple model solutions and the distribution of solution extrema, and (iii) established and explored the phase-locking phenomenon and the existence of multiple solutions for the same values of the model parameters. In addition, we have applied to our model the concept of a pullback attractor, which greatly facilitated a predictive understanding of the nonlinear model's behavior.
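
    The specific equation is not given in this record; a commonly studied delay model of this type (an assumption for illustration, not necessarily the project's exact formulation) is

        \[
        \frac{dh}{dt} \;=\; -a\,\tanh\!\big[\kappa\,h(t-\tau)\big] \;+\; b\cos(2\pi\omega t),
        \]

    where \(h\) is a thermocline-depth anomaly, \(\tau\) the delay associated with oceanic wave feedback, \(\kappa\) the coupling strength, and the periodic term represents seasonal forcing; phase locking and multiple coexisting solutions arise as \(\kappa\), \(\tau\) and \(b\) are varied.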

  9. The Extratropical Transition of Tropical Storm Cindy From a GLM, ISS LIS and GPM Perspective

    NASA Technical Reports Server (NTRS)

    Heuscher, Lena; Gatlin, Patrick; Petersen, Walt; Liu, Chuntao; Cecil, Daniel J.

    2017-01-01

    The distribution of lightning with respect to tropical convective precipitation systems has been well established in previous studies and more recently by the successful Tropical Rainfall Measuring Mission (TRMM). However, TRMM did not provide information about precipitation features poleward of +/-38 deg latitude. Hence we focus on the evolution of lightning within extra-tropical cyclones traversing the mid-latitudes, especially over the oceans. To facilitate such studies, lightning data from the Geostationary Lightning Mapper (GLM) onboard GOES-16 were combined with precipitation features obtained from the Global Precipitation Measurement (GPM) mission constellation of satellites.

  10. Content-based image exploitation for situational awareness

    NASA Astrophysics Data System (ADS)

    Gains, David

    2008-04-01

    Image exploitation is of increasing importance to the enterprise of building situational awareness from multi-source data. It involves image acquisition, identification of objects of interest in imagery, storage, search and retrieval of imagery, and the distribution of imagery over possibly bandwidth limited networks. This paper describes an image exploitation application that uses image content alone to detect objects of interest, and that automatically establishes and preserves spatial and temporal relationships between images, cameras and objects. The application features an intuitive user interface that exposes all images and information generated by the system to an operator thus facilitating the formation of situational awareness.

  11. Telearch - Integrated visual simulation environment for collaborative virtual archaeology.

    NASA Astrophysics Data System (ADS)

    Kurillo, Gregorij; Forte, Maurizio

    Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for the remote collaboration of geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.

  12. Design and evaluation of a microgrid for PEV charging with flexible distribution of energy sources and storage

    NASA Astrophysics Data System (ADS)

    Pyne, Moinak

    This thesis aspires to model and control the flow of power in a DC microgrid. Specifically, the energy sources are a photovoltaic system and the utility grid, a lead-acid battery provides energy storage, and twenty PEV charging stations constitute the loads. Theoretical principles of large-scale state-space modeling are applied to model the considerable number of power electronic converters needed for controlling voltage and current thresholds. The energy storage system is modeled using principles of neural networks to facilitate a stable and uncomplicated model of the lead-acid battery. Power flow control is structured as a hierarchical problem with multiple interactions between individual components of the microgrid. The implementation is done using fuzzy logic, scheduling the maximum use of available solar energy, compensating demand or excess power with the energy storage system, and minimizing utility grid use, while providing multiple speeds of charging for the PEVs.
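
    As a crisp, simplified stand-in for the fuzzy-logic scheduling described above (thresholds, limits and names are hypothetical; a real fuzzy controller would use membership functions and rule aggregation rather than hard cut-offs), the priority order is solar first, then the battery, then the utility grid.

        # Simplified dispatch rules: positive battery_kw = discharge, negative = charge;
        # positive grid_kw = import from the utility grid, negative = export/curtail.
        def dispatch(solar_kw, demand_kw, soc):
            net = demand_kw - solar_kw
            if net <= 0:                          # solar surplus: charge battery if room
                battery = max(net, -5.0) if soc < 0.95 else 0.0
                return battery, net - battery
            if soc > 0.25:                        # deficit: discharge within power limit
                battery = min(net, 5.0)
                return battery, net - battery
            return 0.0, net                       # low state of charge: buy from grid

        print(dispatch(solar_kw=30.0, demand_kw=22.0, soc=0.6))   # surplus -> charging
        print(dispatch(solar_kw=5.0,  demand_kw=18.0, soc=0.6))   # deficit -> discharge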

  13. Pleasure systems in the brain

    PubMed Central

    Berridge, Kent C.; Kringelbach, Morten L.

    2015-01-01

    Pleasure is mediated by well-developed mesocorticolimbic circuitry, and serves adaptive functions. In affective disorders anhedonia (lack of pleasure) or dysphoria (negative affect) can result from breakdowns of that hedonic system. Human neuroimaging studies indicate that surprisingly similar circuitry is activated by quite diverse pleasures, suggesting a common neural currency shared by all. Wanting for rewards is generated by a large and distributed brain system. Liking, or pleasure itself, is generated by a smaller set of hedonic hotspots within limbic circuitry. Those hotspots also can be embedded in broader anatomical patterns of valence organization, such as in a keyboard pattern of nucleus accumbens generators for desire versus dread. In contrast, some of the best known textbook candidates for pleasure generators, including classic pleasure electrodes and the mesolimbic dopamine system, may not generate pleasure after all. These emerging insights into brain pleasure mechanisms may eventually facilitate better treatments for affective disorders. PMID:25950633

  14. Robust Stabilization of T-S Fuzzy Stochastic Descriptor Systems via Integral Sliding Modes.

    PubMed

    Li, Jinghao; Zhang, Qingling; Yan, Xing-Gang; Spurgeon, Sarah K

    2017-09-19

    This paper addresses the robust stabilization problem for T-S fuzzy stochastic descriptor systems using an integral sliding mode control paradigm. A classical integral sliding mode control scheme and a nonparallel distributed compensation (Non-PDC) integral sliding mode control scheme are presented. It is shown that two restrictive assumptions previously adopted in the development of sliding mode controllers for Takagi-Sugeno (T-S) fuzzy stochastic systems are not required within the proposed framework. A unified framework for sliding mode control of T-S fuzzy systems is formulated. The proposed Non-PDC integral sliding mode control scheme encompasses existing schemes when the previously imposed assumptions hold. Stability of the sliding motion is analyzed, and the sliding mode controller is parameterized in terms of the solutions of a set of linear matrix inequalities, which facilitates design. The methodology is applied to an inverted pendulum model to validate the effectiveness of the results presented.
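
    For orientation only, a classical (non-fuzzy, deterministic) integral sliding surface for \( \dot{x} = Ax + Bu \) with nominal control \(u_0\) takes the form below; the paper extends this construction to T-S fuzzy stochastic descriptor systems, which is not reproduced here:

        \[
        s(t) \;=\; G\big[x(t) - x(0)\big] \;-\; G\!\int_{0}^{t}\!\big(A\,x(\sigma) + B\,u_{0}(\sigma)\big)\,d\sigma,
        \]

    with \(G\) chosen so that \(GB\) is nonsingular; since \(s(0)=0\), sliding is enforced from the initial time and the nominal performance is recovered on the sliding manifold.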

  15. Shigella Iron Acquisition Systems and their Regulation.

    PubMed

    Wei, Yahan; Murphy, Erin R

    2016-01-01

    Survival of Shigella within the host is strictly dependent on the ability of the pathogen to acquire essential nutrients, such as iron. As an innate immune defense against invading pathogens, the level of bio-available iron within the human host is maintained at exceedingly low levels by sequestration of the element within heme and other host iron-binding compounds. In response to sequestration-mediated iron limitation, Shigella produce multiple iron-uptake systems that each function to facilitate the utilization of a specific host-associated source of nutrient iron. As a mechanism to balance the essential need for iron and the toxicity of the element when in excess, the production of bacterial iron acquisition systems is tightly regulated by a variety of molecular mechanisms. This review summarizes the current state of knowledge on the iron-uptake systems produced by Shigella species, their distribution within the genus, and the molecular mechanisms that regulate their production.

  16. An XML-based system for the flexible classification and retrieval of clinical practice guidelines.

    PubMed Central

    Ganslandt, T.; Mueller, M. L.; Krieglstein, C. F.; Senninger, N.; Prokosch, H. U.

    2002-01-01

    Beneficial effects of clinical practice guidelines (CPGs) have not yet reached expectations due to limited routine adoption. Electronic distribution and reminder systems have the potential to overcome implementation barriers. Existing electronic CPG repositories like the National Guideline Clearinghouse (NGC) provide individual access but lack standardized computer-readable interfaces necessary for automated guideline retrieval. The aim of this paper was to facilitate automated context-based selection and presentation of CPGs. Using attributes from the NGC classification scheme, an XML-based metadata repository was successfully implemented, providing document storage, classification and retrieval functionality. Semi-automated extraction of attributes was implemented for the import of XML guideline documents using XPath. A hospital information system interface was exemplarily implemented for diagnosis-based guideline invocation. Limitations of the implemented system are discussed and possible future work is outlined. Integration of standardized computer-readable search interfaces into existing CPG repositories is proposed. PMID:12463831
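
    A small sketch of the kind of XPath-driven attribute extraction described above, using lxml; the guideline markup and element names are hypothetical and do not reproduce the NGC classification schema.

```python
from lxml import etree

# Hypothetical guideline document layout; element and attribute names are illustrative only.
xml = b"""
<guideline id="cpg-042">
  <title>Perioperative management of anticoagulation</title>
  <classification>
    <attribute name="clinical_specialty">Surgery</attribute>
    <attribute name="icd10">Z51.81</attribute>
  </classification>
</guideline>
"""

doc = etree.fromstring(xml)
record = {
    "id": doc.get("id"),
    "title": doc.xpath("string(title)"),
    # semi-automated attribute extraction via XPath, for loading into a metadata repository
    "attributes": {a.get("name"): a.text for a in doc.xpath("classification/attribute")},
}
print(record)
```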

  17. RTEMS CENTRE- Support and Maintenance CENTRE to RTEMS Operating System

    NASA Astrophysics Data System (ADS)

    Silva, H.; Constantino, A.; Coutunho, M.; Freitas, D.; Faustino, S.; Mota, M.; Colaço, P.; Zulianello, M.

    2008-08-01

    RTEMS stands for Real-Time Operating System for Multiprocessor Systems. It is a full-featured real-time operating system that supports a variety of open APIs and interface standards. It provides a high-performance environment for embedded applications, including a fixed-priority preemptive/non-preemptive scheduler, a comprehensive set of multitasking operations, and a large range of supported architectures. The Support and Maintenance Centre to the RTEMS operating system (RTEMS CENTRE) is a joint initiative of the ESA-Portugal Task Force, aiming to build strong technical competence in space flight (on-board) software and to offer support, maintenance, and improvements to RTEMS. This paper provides a high-level description of the current and future activities of the RTEMS CENTRE. It presents a brief description of the RTEMS operating system, a description of the tools developed and distributed to the community [1], and the improvements to be made to the operating system, including facilitating the qualification of RTEMS (4.8.0) [2] for space missions.

  18. Utilization of negative beat-frequencies for maximizing the update-rate of OFDR

    NASA Astrophysics Data System (ADS)

    Gabai, Haniel; Botsev, Yakov; Hahami, Meir; Eyal, Avishay

    2015-07-01

    In traditional OFDR systems, the backscattered profile of a sensing fiber is inefficiently duplicated into the negative band of the spectrum. In this work, we present a new OFDR design and algorithm that remove this redundancy and make use of negative beat frequencies. In contrast to conventional OFDR designs, it facilitates efficient use of the available system bandwidth and enables distributed sensing with the maximum allowable interrogation update-rate for a given fiber length. To enable the reconstruction of negative beat frequencies, an I/Q-type receiver is used. In this receiver, both the in-phase (I) and quadrature (Q) components of the backscatter field are detected. Following detection, both components are digitally combined to produce a complex backscatter signal. Accordingly, due to its asymmetric nature, the produced spectrum is not corrupted by the appearance of negative beat-frequencies. Here, via a comprehensive computer simulation, we show that in contrast to conventional OFDR systems, I/Q OFDR can be operated at the maximum interrogation update-rate for a given fiber length. In addition, we experimentally demonstrate, for the first time, the ability of I/Q OFDR to utilize negative beat-frequencies for long-range distributed sensing.
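
    The following numerical sketch illustrates why combining the I and Q components resolves negative beat frequencies: the complex signal has an asymmetric spectrum, so a reflector at a negative beat no longer folds onto the positive band. The sample rate, record length, and beat frequencies are illustrative values only.

```python
import numpy as np

fs, n = 1.0e6, 2**14                      # sample rate (Hz) and record length
df = fs / n                               # FFT bin spacing
t = np.arange(n) / fs

# Two illustrative reflectors, placed on exact FFT bins for a clean sketch.
beat_pos, beat_neg = 2000 * df, -800 * df
i_sig = np.cos(2*np.pi*beat_pos*t) + np.cos(2*np.pi*beat_neg*t)   # in-phase component
q_sig = np.sin(2*np.pi*beat_pos*t) + np.sin(2*np.pi*beat_neg*t)   # quadrature component

# Combining I and Q yields a complex signal whose spectrum is asymmetric,
# so the negative beat is resolved instead of folding onto the positive band.
z = i_sig + 1j * q_sig
spec = np.abs(np.fft.fft(z))
freqs = np.fft.fftfreq(n, d=1/fs)
peaks = np.sort(freqs[np.argsort(spec)[-2:]])
print("recovered beat frequencies [Hz]:", peaks)   # one negative, one positive
```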

  19. Chlorine stress mediates microbial surface attachment in drinking water systems.

    PubMed

    Liu, Li; Le, Yang; Jin, Juliang; Zhou, Yuliang; Chen, Guowei

    2015-03-01

    Microbial attachment to drinking water pipe surfaces facilitates pathogen survival and deteriorates disinfection performance, directly threatening the safety of drinking water. Although biofilm formation has been studied for decades, the underlying mechanisms for the origins of microbial surface attachment in biofilm development in drinking water pipelines remain largely elusive. We combined experimental and mathematical methods to investigate the role of environmental stress-mediated cell motility on microbial surface attachment in chlorination-stressed drinking water distribution systems. Results show that at low levels of disinfectant (0.0-1.0 mg/L), the presence of chlorine promotes initiation of microbial surface attachment, while higher amounts of disinfectant (>1.0 mg/L) inhibit microbial attachment. The proposed mathematical model further demonstrates that chlorination stress (0.0-5.0 mg/L)-mediated microbial cell motility regulates the frequency of cell-wall collision and thereby controls initial microbial surface attachment. The results reveal that transport processes and decay patterns of chlorine in drinking water pipelines regulate microbial cell motility and, thus, control initial surface cell attachment. This work provides a mechanistic understanding of microbial attachment shaped by environmental disinfection stress and leads to new insights into microbial safety protocols in water distribution systems.

  20. Low Noise Cruise Efficient Short Take-Off and Landing Transport Vehicle Study

    NASA Technical Reports Server (NTRS)

    Kim, Hyun D.; Berton, Jeffrey J.; Jones, Scott M.

    2007-01-01

    The saturation of the airspace around current airports combined with increasingly stringent community noise limits represents a serious impediment to growth in world aviation travel. Breakthrough concepts that both increase throughput and reduce noise impacts are required to enable growth in aviation markets. Concepts with a 25 year horizon must facilitate a 4x increase in air travel while simultaneously meeting community noise constraints. Attacking these horizon issues holistically is the concept study of a Cruise Efficient Short Take-Off and Landing (CESTOL) high subsonic transport under the NASA's Revolutionary Systems Concepts for Aeronautics (RSCA) project. The concept is a high-lift capable airframe with a partially embedded distributed propulsion system that takes a synergistic approach in propulsion-airframe-integration (PAI) by fully integrating the airframe and propulsion systems to achieve the benefits of both low-noise short take-off and landing (STOL) operations and efficient high speed cruise. This paper presents a summary of the recent study of a distributed propulsion/airframe configuration that provides low-noise STOL operation to enable 24-hour use of the untapped regional and city center airports to increase the capacity of the overall airspace while still maintaining efficient high subsonic cruise flight capability.

  1. 3D in-vivo imaging of GFP-expressing T-cells in mice with non-contact fluorescence molecular tomography (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Garofalakis, Anikitos; Meyer, Heiko; Zacharakis, Giannis; Economou, Eleftherios N.; Mamalaki, Clio; Papamatheakis, Joseph; Ntziachristos, Vasilis; Ripoll, Jorge

    2005-06-01

    Optical imaging and tomography in tissues can facilitate the quantitative study of several important chromophores and fluorophores in-vivo. Due to this fact, there has been great interest in developing imaging systems offering quantitative information on the location and concentration of chromophores and fluorescent probes. In this study we present a novel imaging system that enables three-dimensional (3D) imaging of fluorescent signals in bodies of arbitrary shapes in a non-contact geometry, in combination with a 3D surface reconstruction algorithm, which is appropriate for in-vivo small animal imaging of fluorescent probes. The system consists of a rotating sample holder and a lens-coupled Charge Coupled Device (CCD) camera in combination with a fiber-coupled laser scanning device. An Argon ion laser is used as the source and different filters are used for the detection of various fluorophores or fluorescing proteins. With this new setup a large measurement dataset can be acquired, while the use of inversion models gives a high capacity for quantitative 3D reconstruction of fluorochrome distributions as well as high spatial resolution. The system has already been tested in the observation of the distribution of Green Fluorescent Protein (GFP) expressing T-lymphocytes in order to study the function of the immune system in a murine model, which can then be related to the function of the human immune system.

  2. Improvements in Space Geodesy Data Discovery at the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, C.; Pollack, N.; Michael, P.

    2011-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. Several activities are currently under development at the CDDIS to aid users in data discovery, both within the current community and beyond. The CDDIS is cooperating in the development of Geodetic Seamless Archive Centers (GSAC) with colleagues at UNAVCO and SIO. The activity will provide web services to facilitate data discovery within and across participating archives. In addition, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. These enhancements will permit information about CDDIS archive holdings to be made available through other data portals such as the Earth Observing System (EOS) Clearinghouse (ECHO) and integration into the Global Geodetic Observing System (GGOS) portal.

  3. Understanding Barriers and Facilitators to the use of Clinical Information Systems for Intensive Care Units and Anesthesia Record Keeping: A Rapid Ethnography

    PubMed Central

    Saleem, Jason J.; Plew, William R.; Speir, Ross C.; Herout, Jennifer; Wilck, Nancy R.; Ryan, Dale Marie; Cullen, Theresa A.; Scott, Jean M.; Beene, Murielle S.; Phillips, Toni

    2017-01-01

    Objective This study evaluated the current use of commercial-off-the-shelf Clinical Information Systems (CIS) for intensive care units (ICU) and Anesthesia Record Keeping (ARK) for operating rooms and post-anesthesia care recovery settings at three Veterans Affairs Medical Centers (VAMCs). Clinicians and administrative staff use these applications at bedside workstations, in operating rooms, at nursing stations, in physician’s rooms, and in other various settings. The intention of a CIS or an ARK system is to facilitate creation of electronic records of data, assessments, and procedures from multiple medical devices. The US Department of Veterans Affairs (VA) Office of the Chief of Nursing Informatics sought to understand usage barriers and facilitators to optimize these systems in the future. Therefore, a human factors study was carried out to observe the CIS and ARK systems in use at three VAMCs in order to identify best practices and suggested improvements to currently implemented CIS and ARK systems. Methods We conducted a rapid ethnographic study of clinical end-users interacting with the CIS and ARK systems in the critical care and anesthesia care areas in each of three geographically distributed VAMCs. Two observers recorded interactions and/or interview responses from 88 CIS and ARK end-users. We coded and sorted into logical categories field notes from 69 shadowed participants. The team transcribed and combined data from key informant interviews with 19 additional participants with the observation data. We then integrated findings across observations into meaningful patterns and abstracted the data into themes, which translated directly to barriers to effective adoption and optimization of the CIS and ARK systems. Results Effective optimization of the CIS and ARK systems was impeded by: (1) integration issues with other software systems; (2) poor usability; (3) software challenges; (4) hardware challenges; (5) training concerns; (6) unclear roles and lack of coordination among stakeholders; and (7) insufficient technical support. Many of these barriers are multi-faceted and have associated sub-barriers, which are described in detail along with relevant quotes from participants. In addition, regionalized purchases of different CIS and ARK systems, as opposed to enterprise level purchases, contributed to some of the identified barriers. Facilitators to system use included (1) automation and (2) a dedicated facility-level CIS-ARK coordinator. Conclusions We identified barriers that explain some of the challenges with the optimization of the CIS and ARK commercial systems across the Veterans Health Administration (VHA). To help address these barriers, and evolve them into facilitators, we categorized report findings as (1) interface and system-level changes that vendors or VA healthcare systems can implement; (2) implementation factors under VA control and not under VA control; and (3) factors that may be used to inform future application purchases. We outline several recommendations for improved adoption of CIS and ARK systems and further recommend that human factors engineering and usability requirements become an integral part of VA health information technology (HIT) application procurement, customization, and implementation in order to help eliminate or mitigate some of the barriers of use identified in this study. 
Human factors engineering methods can be utilized to apply a user-centered approach to application requirements specification, application evaluation, system integration, and application implementation. PMID:25843931

  4. Understanding barriers and facilitators to the use of Clinical Information Systems for intensive care units and Anesthesia Record Keeping: A rapid ethnography.

    PubMed

    Saleem, Jason J; Plew, William R; Speir, Ross C; Herout, Jennifer; Wilck, Nancy R; Ryan, Dale Marie; Cullen, Theresa A; Scott, Jean M; Beene, Murielle S; Phillips, Toni

    2015-07-01

    This study evaluated the current use of commercial-off-the-shelf Clinical Information Systems (CIS) for intensive care units (ICUs) and Anesthesia Record Keeping (ARK) for operating rooms and post-anesthesia care recovery settings at three Veterans Affairs Medical Centers (VAMCs). Clinicians and administrative staff use these applications at bedside workstations, in operating rooms, at nursing stations, in physician's rooms, and in other various settings. The intention of a CIS or an ARK system is to facilitate creation of electronic records of data, assessments, and procedures from multiple medical devices. The US Department of Veterans Affairs (VA) Office of the Chief of Nursing Informatics sought to understand usage barriers and facilitators to optimize these systems in the future. Therefore, a human factors study was carried out to observe the CIS and ARK systems in use at three VAMCs in order to identify best practices and suggested improvements to currently implemented CIS and ARK systems. We conducted a rapid ethnographic study of clinical end-users interacting with the CIS and ARK systems in the critical care and anesthesia care areas in each of three geographically distributed VAMCs. Two observers recorded interactions and/or interview responses from 88 CIS and ARK end-users. We coded and sorted into logical categories field notes from 69 shadowed participants. The team transcribed and combined data from key informant interviews with 19 additional participants with the observation data. We then integrated findings across observations into meaningful patterns and abstracted the data into themes, which translated directly to barriers to effective adoption and optimization of the CIS and ARK systems. Effective optimization of the CIS and ARK systems was impeded by: (1) integration issues with other software systems; (2) poor usability; (3) software challenges; (4) hardware challenges; (5) training concerns; (6) unclear roles and lack of coordination among stakeholders; and (7) insufficient technical support. Many of these barriers are multi-faceted and have associated sub-barriers, which are described in detail along with relevant quotes from participants. In addition, regionalized purchases of different CIS and ARK systems, as opposed to enterprise level purchases, contributed to some of the identified barriers. Facilitators to system use included (1) automation and (2) a dedicated facility-level CIS-ARK Coordinator. We identified barriers that explain some of the challenges with the optimization of the CIS and ARK commercial systems across the Veterans Health Administration (VHA). To help address these barriers, and evolve them into facilitators, we categorized report findings as (1) interface and system-level changes that vendors or VA healthcare systems can implement; (2) implementation factors under VA control and not under VA control; and (3) factors that may be used to inform future application purchases. We outline several recommendations for improved adoption of CIS and ARK systems and further recommend that human factors engineering and usability requirements become an integral part of VA health information technology (HIT) application procurement, customization, and implementation in order to help eliminate or mitigate some of the barriers of use identified in this study. 
Human factors engineering methods can be utilized to apply a user-centered approach to application requirements specification, application evaluation, system integration, and application implementation. Published by Elsevier Ireland Ltd.

  5. pysimm: A Python Package for Simulation of Molecular Systems

    NASA Astrophysics Data System (ADS)

    Fortunato, Michael; Colina, Coray

    pysimm, short for python simulation interface for molecular modeling, is a python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and design philosophies that highlight a generalized methodology for incorporation of third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation. The extension from a specific application library to general purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains capable of controlling chain morphology such as molecular weight distribution and monomer composition.

  6. Why is it so hard to implement change? A qualitative examination of barriers and facilitators to distribution of naloxone for overdose prevention in a safety net environment.

    PubMed

    Drainoni, Mari-Lynn; Koppelman, Elisa A; Feldman, James A; Walley, Alexander Y; Mitchell, Patricia M; Ellison, Jacqueline; Bernstein, Edward

    2016-10-18

    The increase in opioid overdose deaths has become a national public health crisis. Naloxone is an important tool in opioid overdose prevention. Distribution of nasal naloxone has been found to be a feasible and effective intervention in community settings and may have high applicability in the emergency department, which is often the initial point of care for persons at high risk of overdose. One safety net hospital introduced an innovative policy to offer take-home nasal naloxone via a standing order to ensure distribution to patients at risk for overdose. The aims of this study were to examine acceptance and uptake of the policy and assess facilitators and barriers to implementation. After obtaining pre-post data on naloxone distribution, we conducted a qualitative study. The PARiHS framework steered development of the qualitative guide. We used theoretical sampling in order to include the range of types of emergency department staff (50 total). The constant comparative method was initially used to code the transcripts and identify themes; the themes that emerged from the coding were then mapped back to the evidence, context and facilitation constructs of the PARiHS framework. Acceptance of the policy was good but uptake was low. Primary themes related to facilitators included: real-world driven intervention with philosophical, clinician and leadership support; basic education and training efforts; availability of resources; and ability to leave the ED with the naloxone kit in hand. Barriers fell into five general categories: protocol and policy; workflow and logistical; patient-related; staff roles and responsibilities; and education and training. The actual implementation of a new innovation in healthcare delivery is largely driven by factors beyond acceptance. Despite support and resources, implementation was challenging, with low uptake. While the potential of this innovation is unknown, understanding the experience is important to improve uptake in this setting and offer possible solutions for other facilities to address the opioid overdose crisis. Use of the PARiHS framework allowed us to recognize and understand key evidence, contextual and facilitation barriers to the successful implementation of the policy and to identify areas for improvement.

  7. Fostering Self-Regulation in Distributed Learning

    ERIC Educational Resources Information Center

    Terry, Krista P.; Doolittle, Peter

    2006-01-01

    Although much has been written about fostering self-regulated learning in traditional classroom settings, there has been little that addresses how to facilitate self-regulated learning skills in distributed and online environments. This article will examine some such strategies by specifically focusing on time management. Specific principles for…

  8. Astro-WISE: Chaining to the Universe

    NASA Astrophysics Data System (ADS)

    Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.

    2007-10-01

    The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power over the high-energy physical community, while the international astronomical community is building an Internet geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution (Valentijn & Kuijken 2004) to the problem initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This break-through is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature and thus is the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by forcing full backward and forward chaining in the data modelling.

  9. The evolution of phenotypic correlations and ‘developmental memory’

    PubMed Central

    Watson, Richard A.; Wagner, Günter P.; Pavlicev, Mihaela; Weinreich, Daniel M.; Mills, Rob

    2014-01-01

    Development introduces structured correlations among traits that may constrain or bias the distribution of phenotypes produced. Moreover, when suitable heritable variation exists, natural selection may alter such constraints and correlations, affecting the phenotypic variation available to subsequent selection. However, exactly how the distribution of phenotypes produced by complex developmental systems can be shaped by past selective environments is poorly understood. Here we investigate the evolution of a network of recurrent non-linear ontogenetic interactions, such as a gene regulation network, in various selective scenarios. We find that evolved networks of this type can exhibit several phenomena that are familiar in cognitive learning systems. These include formation of a distributed associative memory that can ‘store’ and ‘recall’ multiple phenotypes that have been selected in the past, recreate complete adult phenotypic patterns accurately from partial or corrupted embryonic phenotypes, and ‘generalise’ (by exploiting evolved developmental modules) to produce new combinations of phenotypic features. We show that these surprising behaviours follow from an equivalence between the action of natural selection on phenotypic correlations and associative learning, well-understood in the context of neural networks. This helps to explain how development facilitates the evolution of high-fitness phenotypes and how this ability changes over evolutionary time. PMID:24351058
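
    The analogy to associative memory invoked above can be made concrete with a toy Hopfield-style network: Hebbian storage of a few "phenotype" patterns followed by recall from a corrupted probe. This is a generic illustration of the learning-system equivalence, not the authors' ontogenetic network model, and the pattern sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))      # three stored "adult phenotypes"

# Hebbian storage: W accumulates pairwise trait correlations across stored patterns.
W = sum(np.outer(p, p) for p in patterns) / patterns.shape[1]
np.fill_diagonal(W, 0)

# Recall from a corrupted "embryonic" pattern by iterating the network dynamics.
probe = patterns[0].copy()
flip = rng.choice(64, size=16, replace=False)     # corrupt a quarter of the traits
probe[flip] *= -1

state = probe.copy()
for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1
print("overlap with stored phenotype:", (state == patterns[0]).mean())
```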

  10. Regionalization of Habitat Suitability of Masson’s Pine based on geographic information system and Fuzzy Matter-Element Model

    PubMed Central

    Zhou, Xiuteng; Zhao, Manxi; Zhou, Liangyun; Yang, Guang; Huang, Luqi; Yan, Cuiqi; Huang, Quanshu; Ye, Liang; Zhang, Xiaobo; Guo, Lanpin; Ke, Xiao; Guo, Jiao

    2016-01-01

    Pine needles have been widely used in the development of anti-hypertensive and anti-hyperlipidemic agents and health food. However, the widespread distribution of this tree poses great obstacles to the quality control and efficacy evaluation. To facilitate the effective and rational exploitation of Masson’s pine (Pinus massoniana Lamb), as well as ensure effective development of Masson’s pine needles as a medicinal agent, we investigated the spatial distribution of habitat suitability and evaluated the optimal ranges of ecological factors of P. massoniana with 280 samples collected from 12 provinces in China through the evaluation of four constituents known to be effective medicinally. The results of habitat suitability evaluation were also verified by Root Mean Square Error (RMSE). Finally, five ecological factors were chosen in the establishment of a habitat suitability evaluation system. The most suitable areas for P. massoniana growth were mainly concentrated in the middle and lower reaches of the Yangtze River basin, such as Sichuan, Guizhou, and Jiangxi provinces, while the best quality needles were from Guizhou, Sichuan, and the junction area of Chongqing, Hunan, and Hubei provinces. This information revealed that suitable areas for effective constituent accumulation of Masson’s pine needles accounted for only 7.41% of its distribution area. PMID:27694967

  11. Regionalization of Habitat Suitability of Masson’s Pine based on geographic information system and Fuzzy Matter-Element Model

    NASA Astrophysics Data System (ADS)

    Zhou, Xiuteng; Zhao, Manxi; Zhou, Liangyun; Yang, Guang; Huang, Luqi; Yan, Cuiqi; Huang, Quanshu; Ye, Liang; Zhang, Xiaobo; Guo, Lanpin; Ke, Xiao; Guo, Jiao

    2016-10-01

    Pine needles have been widely used in the development of anti-hypertensive and anti-hyperlipidemic agents and health food. However, the widespread distribution of this tree poses great obstacles to the quality control and efficacy evaluation. To facilitate the effective and rational exploitation of Masson’s pine (Pinus massoniana Lamb), as well as ensure effective development of Masson’s pine needles as a medicinal agent, we investigated the spatial distribution of habitat suitability and evaluated the optimal ranges of ecological factors of P. massoniana with 280 samples collected from 12 provinces in China through the evaluation of four constituents known to be effective medicinally. The results of habitat suitability evaluation were also verified by Root Mean Square Error (RMSE). Finally, five ecological factors were chosen in the establishment of a habitat suitability evaluation system. The most suitable areas for P. massoniana growth were mainly concentrated in the middle and lower reaches of the Yangtze River basin, such as Sichuan, Guizhou, and Jiangxi provinces, while the best quality needles were from Guizhou, Sichuan, and the junction area of Chongqing, Hunan, and Hubei provinces. This information revealed that suitable areas for effective constituent accumulation of Masson’s pine needles accounted for only 7.41% of its distribution area.

  12. Method and system for producing sputtered thin films with sub-angstrom thickness uniformity or custom thickness gradients

    DOEpatents

    Folta, James A.; Montcalm, Claude; Walton, Christopher

    2003-01-01

    A method and system for producing a thin film with highly uniform (or highly accurate custom graded) thickness on a flat or graded substrate (such as concave or convex optics), by sweeping the substrate across a vapor deposition source with controlled (and generally, time-varying) velocity. In preferred embodiments, the method includes the steps of measuring the source flux distribution (using a test piece that is held stationary while exposed to the source), calculating a set of predicted film thickness profiles, each film thickness profile assuming the measured flux distribution and a different one of a set of sweep velocity modulation recipes, and determining from the predicted film thickness profiles a sweep velocity modulation recipe which is adequate to achieve a predetermined thickness profile. Aspects of the invention include a practical method of accurately measuring source flux distribution, and a computer-implemented method employing a graphical user interface to facilitate convenient selection of an optimal or nearly optimal sweep velocity modulation recipe to achieve a desired thickness profile on a substrate. Preferably, the computer implements an algorithm in which many sweep velocity function parameters (for example, the speed at which each substrate spins about its center as it sweeps across the source) can be varied or set to zero.
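
    A sketch of the central calculation under simplifying assumptions (a 1-D geometry, a Gaussian stand-in for the measured flux, and hypothetical sweep recipes): the predicted thickness at each substrate point is the source flux accumulated over the local dwell time of the sweep, so modulating the sweep velocity reshapes the thickness profile.

```python
import numpy as np

# Hypothetical 1-D flux profile, as if measured on a stationary test piece (a.u./s).
y = np.linspace(-60.0, 60.0, 241)             # mm, coordinate relative to the source centre
flux = np.exp(-(y / 20.0) ** 2)               # assumed Gaussian flux distribution

def predicted_thickness(x_sub, sweep_pos, velocity):
    """Thickness at substrate points x_sub after one sweep whose velocity (mm/s)
    may vary with the source-relative sweep position."""
    dwell = np.gradient(sweep_pos) / velocity(sweep_pos)      # time spent at each sweep step
    thick = np.zeros_like(x_sub)
    for xs, dt in zip(sweep_pos, dwell):
        thick += np.interp(x_sub - xs, y, flux, left=0.0, right=0.0) * dt
    return thick

x_sub = np.linspace(-20.0, 20.0, 81)          # substrate coordinates, mm
sweep = np.linspace(-100.0, 100.0, 401)       # sweep trajectory of the substrate centre

constant = predicted_thickness(x_sub, sweep, lambda s: np.full_like(s, 10.0))
graded = predicted_thickness(x_sub, sweep, lambda s: 10.0 + 4.0 * np.exp(-(s / 30.0) ** 2))
print("non-uniformity, constant sweep:", np.ptp(constant) / constant.mean())
print("non-uniformity, sped-up centre:", np.ptp(graded) / graded.mean())
```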

  13. Providing traceability for neuroimaging analyses.

    PubMed

    McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran

    2013-09-01

    With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuance of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the finding of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of various analyses and provides provenance traceability throughout the lifecycle of their studies. As the Provenance Service has been designed to be generic it can be applied across the medical domain as a reusable tool for supporting medical researchers thus providing communities of researchers for the first time with the necessary tools to conduct widely distributed collaborative programmes of medical analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. Fabricating cooled electronic system with liquid-cooled cold plate and thermal spreader

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chainer, Timothy J.; Graybill, David P.; Iyengar, Madhusudan K.

    Methods are provided for facilitating cooling of an electronic component. The method includes providing a liquid-cooled cold plate and a thermal spreader associated with the cold plate. The cold plate includes multiple coolant-carrying channel sections extending within the cold plate, and a thermal conduction surface with a larger surface area than a surface area of the component to be cooled. The thermal spreader includes one or more heat pipes including multiple heat pipe sections. One or more heat pipe sections are partially aligned to a first region of the cold plate, that is, where aligned to the surface to be cooled, and partially aligned to a second region of the cold plate, which is outside the first region. The one or more heat pipes facilitate distribution of heat from the electronic component to coolant-carrying channel sections of the cold plate located in the second region of the cold plate.

  15. Fabricating cooled electronic system with liquid-cooled cold plate and thermal spreader

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chainer, Timothy J.; Graybill, David P.; Iyengar, Madhusudan K.

    Methods are provided for facilitating cooling of an electronic component. The methods include providing a liquid-cooled cold plate and a thermal spreader associated with the cold plate. The cold plate includes multiple coolant-carrying channel sections extending within the cold plate, and a thermal conduction surface with a larger surface area than a surface area of the component to be cooled. The thermal spreader includes one or more heat pipes including multiple heat pipe sections. One or more heat pipe sections are partially aligned to a first region of the cold plate, that is, where aligned to the surface to be cooled, and partially aligned to a second region of the cold plate, which is outside the first region. The one or more heat pipes facilitate distribution of heat from the electronic component to coolant-carrying channel sections of the cold plate located in the second region of the cold plate.

  16. Cooled electronic system with liquid-cooled cold plate and thermal spreader coupled to electronic component

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chainer, Timothy J.; Graybill, David P.; Iyengar, Madhusudan K.

    Apparatus and method are provided for facilitating cooling of an electronic component. The apparatus includes a liquid-cooled cold plate and a thermal spreader associated with the cold plate. The cold plate includes multiple coolant-carrying channel sections extending within the cold plate, and a thermal conduction surface with a larger surface area than a surface area of the component to be cooled. The thermal spreader includes one or more heat pipes including multiple heat pipe sections. One or more heat pipe sections are partially aligned to a first region of the cold plate, that is, where aligned to the surface to be cooled, and partially aligned to a second region of the cold plate, which is outside the first region. The one or more heat pipes facilitate distribution of heat from the electronic component to coolant-carrying channel sections of the cold plate located in the second region of the cold plate.

  17. Cooled electronic system with liquid-cooled cold plate and thermal spreader coupled to electronic component

    DOEpatents

    Chainer, Timothy J.; Graybill, David P.; Iyengar, Madhusudan K.; Kamath, Vinod; Kochuparambil, Bejoy J.; Schmidt, Roger R.; Steinke, Mark E.

    2016-08-09

    Apparatus and method are provided for facilitating cooling of an electronic component. The apparatus includes a liquid-cooled cold plate and a thermal spreader associated with the cold plate. The cold plate includes multiple coolant-carrying channel sections extending within the cold plate, and a thermal conduction surface with a larger surface area than a surface area of the component to be cooled. The thermal spreader includes one or more heat pipes including multiple heat pipe sections. One or more heat pipe sections are partially aligned to a first region of the cold plate, that is, where aligned to the surface to be cooled, and partially aligned to a second region of the cold plate, which is outside the first region. The one or more heat pipes facilitate distribution of heat from the electronic component to coolant-carrying channel sections of the cold plate located in the second region of the cold plate.

  18. Cooled electronic system with liquid-cooled cold plate and thermal spreader coupled to electronic component

    DOEpatents

    Chainer, Timothy J.; Graybill, David P.; Iyengar, Madhusudan K.; Kamath, Vinod; Kochuparambil, Bejoy J.; Schmidt, Roger R.; Steinke, Mark E.

    2016-04-05

    Apparatus and method are provided for facilitating cooling of an electronic component. The apparatus includes a liquid-cooled cold plate and a thermal spreader associated with the cold plate. The cold plate includes multiple coolant-carrying channel sections extending within the cold plate, and a thermal conduction surface with a larger surface area than a surface area of the component to be cooled. The thermal spreader includes one or more heat pipes including multiple heat pipe sections. One or more heat pipe sections are partially aligned to a first region of the cold plate, that is, where aligned to the surface to be cooled, and partially aligned to a second region of the cold plate, which is outside the first region. The one or more heat pipes facilitate distribution of heat from the electronic component to coolant-carrying channel sections of the cold plate located in the second region of the cold plate.

  19. Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data

    NASA Technical Reports Server (NTRS)

    Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli

    2011-01-01

    The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.

  20. Bridging the gap: linking a legacy hospital information system with a filmless radiology picture archiving and communications system within a nonhomogeneous environment.

    PubMed

    Rubin, R K; Henri, C J; Cox, R D

    1999-05-01

    A health level 7 (HL7)-conformant data link to exchange information between the mainframe hospital information system (HIS) of our hospital and our home-grown picture archiving and communications system (PACS) is the result of a collaborative effort between the HIS department and the PACS development team. Based on the ability to link examination requisitions and image studies, applications have been generated to optimise workflow and to improve the reliability and distribution of radiology information. Now, images can be routed to individual radiologists and clinicians; worklists facilitate radiology reporting; applications exist to create, edit, and view reports and images via the internet; and automated quality control now limits the incidence of "lost" cases and errors in image routing. By following the HL7 standard to develop the gateway to the legacy system, the development of a radiology information system for booking, reading, reporting, and billing remains universal and does not preclude the option to integrate off-the-shelf commercial products.
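
    A minimal sketch of handling the pipe-delimited HL7 v2 messages exchanged over such a link, linking an order segment to a patient segment; the sample message, identifiers, and field choices are fabricated for illustration and do not reproduce the hospital's actual interface.

```python
# Minimal HL7 v2 segment/field split -- a sketch, not a full parser or validator.
sample = (
    "MSH|^~\\&|HIS|HOSP|PACS|RAD|199905011200||ORM^O01|MSG00001|P|2.3\r"
    "PID|||123456||DOE^JOHN\r"
    "OBR|1|REQ123||71020^CHEST XRAY PA AND LAT\r"
)

def parse_hl7(message):
    """Return {segment_id: [list of fields, ...]} for a pipe-delimited HL7 v2 message."""
    segments = {}
    for seg in filter(None, message.split("\r")):
        fields = seg.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

msg = parse_hl7(sample)
# Link the examination requisition (OBR-2, placer order number) to the patient name (PID-5).
print("requisition:", msg["OBR"][0][1], "patient:", msg["PID"][0][4])
```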

  1. Oligonucleotide facilitators may inhibit or activate a hammerhead ribozyme.

    PubMed Central

    Jankowsky, E; Schwenzer, B

    1996-01-01

    Facilitators are oligonucleotides capable of affecting hammerhead ribozyme activity by interacting with the substrate at the termini of the ribozyme. Facilitator effects were determined in vitro using a system consisting of a ribozyme with 7 nucleotides in every stem sequence and two substrates with inverted facilitator binding sequences. The effects of 9mer and 12mer RNA as well as DNA facilitators which bind either adjacent to the 3'- or 5'-end of the ribozyme were investigated. A kinetic model was developed which allows determination of the apparent dissociation constant of the ribozyme-substrate complex from single turnover reactions. We observed a decreased dissociation constant of the ribozyme-substrate complex due to facilitator addition, corresponding to an additional stabilization energy of ΔΔG = -1.7 kcal/mol with 3'-end facilitators. The cleavage rate constant was increased by 3'-end facilitators and decreased by 5'-end facilitators. Values for Km were slightly lowered by all facilitators, and kcat was increased by 3'-end facilitators and decreased by 5'-end facilitators in our system. Generally the facilitator effects increased with the length of the facilitators, and RNA provided greater effects than DNA of the same sequence. Results suggest facilitator influences on several steps of the hammerhead reaction: substrate association, cleavage, and dissociation of products. Moreover, these effects depend in different ways on ribozyme and substrate concentration, so whether facilitators cause activation or inhibition is concentration dependent. Conclusions are drawn with regard to the design of hammerhead ribozyme facilitator systems. PMID:8602353
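
    The reported stabilization energy follows from the ratio of apparent dissociation constants via ΔΔG = RT ln(Kd,fac/Kd,0). The short calculation below, assuming a temperature of 37 °C (an illustrative choice, not taken from the paper), shows that a roughly 16-fold reduction in Kd corresponds to about -1.7 kcal/mol.

```python
import math

R = 1.987e-3          # gas constant, kcal mol^-1 K^-1
T = 310.0             # K (37 degC, assumed reaction temperature; illustrative)

def ddG(kd_without, kd_with):
    """Additional stabilization of the ribozyme-substrate complex from the ratio
    of apparent dissociation constants (negative values mean stabilization)."""
    return R * T * math.log(kd_with / kd_without)

# A Kd lowered ~16-fold by a 3'-end facilitator reproduces roughly -1.7 kcal/mol.
print(round(ddG(1.0, 1.0 / 16), 2), "kcal/mol")
```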

  2. Dynamic wave field synthesis: enabling the generation of field distributions with a large space-bandwidth product.

    PubMed

    Kamau, Edwin N; Heine, Julian; Falldorf, Claas; Bergmann, Ralf B

    2015-11-02

    We present a novel approach for the design and fabrication of multiplexed computer generated volume holograms (CGVH) which allow for a dynamic synthesis of arbitrary wave field distributions. To achieve this goal, we developed a hybrid system that consists of a CGVH as the static element and an electronically addressed spatial light modulator as the dynamic element. We thereby derived a new model for describing the scattering process within the inhomogeneous dielectric material of the hologram. This model is based on the linearization of the scattering process within the Rytov approximation and incorporates physical constraints that account for voxel-based laser lithography used to micro-fabricate the holograms in a nonlinear optical material. In this article we demonstrate that this system facilitates a high angular Bragg selectivity on the order of 1°. Additionally, it allows for low cross-talk dynamic synthesis of predefined wave fields with a much larger space-bandwidth product (SBWP ≥ 8.7 × 10^6) than the current state of the art in computer generated holography.

  3. Best Practice to Order Authors in Multi/Interdisciplinary Health Sciences Research Publications.

    PubMed

    Smith, Elise; Master, Zubin

    2017-01-01

    Misunderstanding and disputes about authorship are commonplace among members of multi/interdisciplinary health research teams. If left unmanaged and unresolved, these conflicts can undermine knowledge sharing and collaboration, obscure accountability for research, and contribute to the incorrect attribution of credit. To mitigate these issues, certain researchers suggest quantitative authorship distributions schemes (e.g., point systems), while others wish to replace or minimize the importance of authorship by using "contributorship"-a system based on authors' self-reporting contributions. While both methods have advantages, we argue that authorship and contributorship will most likely continue to coexist for multiple ethical and practical reasons. In this article, we develop a five-step "best practice" that incorporates the distribution of both contributorship and authorship for multi/interdisciplinary research. This procedure involves continuous dialogue and the use of a detailed contributorship taxonomy ending with a declaration explaining contributorship, which is used to justify authorship order. Institutions can introduce this approach in responsible conduct of research training as it promotes greater fairness, trust, and collegiality among team members and ultimately reduces confusion and facilitates resolution of time-consuming disagreements.

  4. Dynamics of translational friction in needle-tissue interaction during needle insertion.

    PubMed

    Asadian, Ali; Patel, Rajni V; Kermani, Mehrdad R

    2014-01-01

    In this study, a distributed approach to account for dynamic friction during needle insertion in soft tissue is presented. As is well known, friction is a complex nonlinear phenomenon. It appears that classical or static models are unable to capture some of the observations made in systems subjected to significant frictional effects. In needle insertion, translational friction would be a matter of importance when the needle is very flexible, or a stop-and-rotate motion profile at low insertion velocities is implemented, and thus, the system is repeatedly transitioned from a pre-sliding to a sliding mode and vice versa. In order to characterize friction components, a distributed version of the LuGre model in the state-space representation is adopted. This method also facilitates estimating cutting force in an intra-operative manner. To evaluate the performance of the proposed family of friction models, experiments were conducted on homogeneous artificial phantoms and animal tissue. The results illustrate that our approach enables us to represent the main features of friction which is a major force component in needle-tissue interaction during needle-based interventions.
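
    For reference, the lumped (single-state) LuGre model that the distributed version generalizes can be integrated in a few lines: an internal bristle state z captures pre-sliding behaviour, and the friction force combines bristle stiffness, bristle damping, and a viscous term. The parameter values below are illustrative, not fitted to needle-tissue data.

```python
import numpy as np

# Standard LuGre parameters -- illustrative values only.
sigma0, sigma1, sigma2 = 1e4, 100.0, 0.5   # bristle stiffness, bristle damping, viscous coefficient
Fc, Fs, vs = 1.0, 1.5, 0.01                # Coulomb level, stiction level, Stribeck velocity (m/s)

def g(v):
    """Velocity-dependent Stribeck function normalized by the bristle stiffness."""
    return (Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)) / sigma0

def simulate(v_profile, dt=1e-4):
    """Integrate the bristle state z and return the friction force history."""
    z, forces = 0.0, []
    for v in v_profile:
        dz = v - abs(v) * z / g(v)
        z += dz * dt
        forces.append(sigma0 * z + sigma1 * dz + sigma2 * v)
    return np.array(forces)

# Stop-and-go velocity profile: repeated transitions between pre-sliding and sliding.
t = np.arange(0.0, 2.0, 1e-4)
v = 0.005 * (np.sin(2 * np.pi * t) > 0)        # 5 mm/s on/off pulses
F = simulate(v)
print("peak friction:", F.max(), "final friction:", F[-1])
```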

  5. Method and system using power modulation and velocity modulation producing sputtered thin films with sub-angstrom thickness uniformity or custom thickness gradients

    DOEpatents

    Montcalm, Claude [Livermore, CA; Folta, James Allen [Livermore, CA; Walton, Christopher Charles [Berkeley, CA

    2003-12-23

    A method and system for determining a source flux modulation recipe for achieving a selected thickness profile of a film to be deposited (e.g., with highly uniform or highly accurate custom graded thickness) over a flat or curved substrate (such as concave or convex optics) by exposing the substrate to a vapor deposition source operated with time-varying flux distribution as a function of time. Preferably, the source is operated with time-varying power applied thereto during each sweep of the substrate to achieve the time-varying flux distribution as a function of time. Preferably, the method includes the steps of measuring the source flux distribution (using a test piece held stationary while exposed to the source with the source operated at each of a number of different applied power levels), calculating a set of predicted film thickness profiles, each film thickness profile assuming the measured flux distribution and a different one of a set of source flux modulation recipes, and determining from the predicted film thickness profiles a source flux modulation recipe which is adequate to achieve a predetermined thickness profile. Aspects of the invention include a computer-implemented method employing a graphical user interface to facilitate convenient selection of an optimal or nearly optimal source flux modulation recipe to achieve a desired thickness profile on a substrate. The method enables precise modulation of the deposition flux to which a substrate is exposed to provide a desired coating thickness distribution.

  6. 16 CFR 1302.1 - Scope and application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... produced or distributed for sale to, or for the personal use, consumption or enjoyment of consumers in or... manufacturer, distributor, or retailer customarily produces or distributes the product for sale to, or use by consumers, or if the manufacturer, distributor, or retailer fosters or facilitates the product's sale to, or...

  7. Development of a novel imaging informatics-based system with an intelligent workflow engine (IWEIS) to support imaging-based clinical trials

    PubMed Central

    Wang, Ximing; Liu, Brent J; Martinez, Clarisa; Zhang, Xuejun; Winstein, Carolee J

    2015-01-01

    Imaging based clinical trials can benefit from a solution to efficiently collect, analyze, and distribute multimedia data at various stages within the workflow. Currently, the data management needs of these trials are typically addressed with custom-built systems. However, software development of custom-built systems for versatile workflows can be resource-consuming. To address these challenges, we present a system with a workflow engine for imaging based clinical trials. The system enables a project coordinator to build a data collection and management system specifically related to study protocol workflow without programming. A Web Access to DICOM Objects (WADO) module with novel features is integrated to further facilitate imaging related study. The system was initially evaluated by an imaging based rehabilitation clinical trial. The evaluation shows that the cost of development of the system can be much reduced compared to a custom-built system. By providing a solution to customize a system and automate the workflow, the system will save on development time and reduce errors, especially for imaging clinical trials. PMID:25870169
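
    For context, a WADO-URI retrieval of a single DICOM object is a plain HTTP GET with a handful of query parameters; the endpoint and UIDs below are placeholders, and the paper's WADO module presumably builds on requests of this general form rather than being reproduced here.

```python
import requests

# Hypothetical WADO-URI endpoint; all UIDs are placeholders, not real identifiers.
WADO_URL = "https://pacs.example.org/wado"
params = {
    "requestType": "WADO",
    "studyUID": "1.2.840.113619.2.55.3",        # placeholder Study Instance UID
    "seriesUID": "1.2.840.113619.2.55.3.1",     # placeholder Series Instance UID
    "objectUID": "1.2.840.113619.2.55.3.1.1",   # placeholder SOP Instance UID
    "contentType": "application/dicom",          # request the native DICOM object
}

resp = requests.get(WADO_URL, params=params, timeout=30)
resp.raise_for_status()
with open("retrieved_object.dcm", "wb") as fh:
    fh.write(resp.content)
```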

  8. Basic technologies of web services framework for research, discovery, and processing the disparate massive Earth observation data from heterogeneous sources

    NASA Astrophysics Data System (ADS)

    Savorskiy, V.; Lupyan, E.; Balashov, I.; Burtsev, M.; Proshin, A.; Tolpin, V.; Ermakov, D.; Chernushich, A.; Panova, O.; Kuznetsov, O.; Vasilyev, V.

    2014-04-01

    Both the development and application of remote sensing involve a considerable expenditure of material and intellectual resources. Therefore, it is important to use high-tech means of distributing remote sensing data and processing results in order to facilitate access for as many researchers as possible. This should be accompanied by the creation of capabilities for potentially more thorough and comprehensive, i.e. ultimately deeper, acquisition and complex analysis of information about the state of Earth's natural resources. An objective need for a higher degree of Earth observation (EO) data assimilation also follows from the conditions of satellite observations, in which the observed objects are in an uncontrolled state. Progress in addressing this problem is determined to a large extent by how the distributed EO information system (IS) functions. Namely, it is largely dependent on reducing the cost of communication processes (data transfer) between spatially distributed IS nodes and data users. One of the most effective ways to improve the efficiency of data exchange processes is the creation of an integrated EO IS optimized for running distributed data processing procedures. An effective EO IS implementation should be based on a specific software architecture.

  9. Equilibrium Molecular Thermodynamics from Kirkwood Sampling

    PubMed Central

    2015-01-01

    We present two methods for barrierless equilibrium sampling of molecular systems based on the recently proposed Kirkwood method (J. Chem. Phys.2009, 130, 134102). Kirkwood sampling employs low-order correlations among internal coordinates of a molecule for random (or non-Markovian) sampling of the high dimensional conformational space. This is a geometrical sampling method independent of the potential energy surface. The first method is a variant of biased Monte Carlo, where Kirkwood sampling is used for generating trial Monte Carlo moves. Using this method, equilibrium distributions corresponding to different temperatures and potential energy functions can be generated from a given set of low-order correlations. Since Kirkwood samples are generated independently, this method is ideally suited for massively parallel distributed computing. The second approach is a variant of reservoir replica exchange, where Kirkwood sampling is used to construct a reservoir of conformations, which exchanges conformations with the replicas performing equilibrium sampling corresponding to different thermodynamic states. Coupling with the Kirkwood reservoir enhances sampling by facilitating global jumps in the conformational space. The efficiency of both methods depends on the overlap of the Kirkwood distribution with the target equilibrium distribution. We present proof-of-concept results for a model nine-atom linear molecule and alanine dipeptide. PMID:25915525
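
    The biased Monte Carlo variant amounts to Metropolis sampling with an independence proposal: trial states are drawn from a fixed distribution q (Kirkwood sampling in the paper) and accepted with probability min(1, p(x')q(x) / (p(x)q(x'))). The toy one-dimensional sketch below, with a double-well target and a Gaussian stand-in for q, illustrates the bias correction; it is not the molecular implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_p(x):                 # target: double-well Boltzmann factor (kT = 1, arbitrary units)
    return -((x**2 - 1.0) ** 2)

mu, sig = 0.0, 1.5            # fixed independence proposal, stand-in for Kirkwood draws
def log_q(x):
    return -0.5 * ((x - mu) / sig) ** 2

x = rng.normal(mu, sig)
samples = []
for _ in range(50_000):
    x_new = rng.normal(mu, sig)
    # Biased-MC acceptance: correct for drawing trial moves from q instead of p.
    log_acc = (log_p(x_new) + log_q(x)) - (log_p(x) + log_q(x_new))
    if np.log(rng.random()) < log_acc:
        x = x_new
    samples.append(x)

print("mean |x| ~", np.mean(np.abs(samples)))   # samples concentrate near the wells at x = +/-1
```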

  10. Seasonal distribution, aggregation, and habitat selection of common carp in Clear Lake, Iowa

    USGS Publications Warehouse

    Penne, C.R.; Pierce, C.L.

    2008-01-01

    The common carp Cyprinus carpio is widely distributed and frequently considered a nuisance species outside its native range. Common carp are abundant in Clear Lake, Iowa, where their presence is both a symptom of degradation and an impediment to improving water quality and the sport fishery. We used radiotelemetry to quantify seasonal distribution, aggregation, and habitat selection of adult and subadult common carp in Clear Lake during 2005-2006 in an effort to guide future control strategies. Over a 22-month period, we recorded 1,951 locations of 54 adults and 60 subadults implanted with radio transmitters. Adults demonstrated a clear tendency to aggregate in an offshore area during the late fall and winter and in shallow, vegetated areas before and during spring spawning. Late-fall and winter aggregations were estimated to include a larger percentage of the tracked adults than spring aggregations. Subadults aggregated in shallow, vegetated areas during the spring and early summer. Our study, when considered in combination with previous research, suggests repeatable patterns of distribution, aggregation, and habitat selection that should facilitate common carp reduction programs in Clear Lake and similar systems. © Copyright by the American Fisheries Society 2008.

  11. Lessons from the Chilean earthquake: how a human rights framework facilitates disaster response.

    PubMed

    Arbour, MaryCatherine; Murray, Kara; Arriet, Felipe; Moraga, Cecilia; Vega, Miguel Cordero

    2011-07-14

    The earthquake of 2010 in Chile holds important lessons about how a rights-based public health system can guide disaster response to protect vulnerable populations. This article tells the story of Chile Grows With You (Chile Crece Contigo), an intersectoral system created three years before the earthquake for protection of child rights and development, and its role in the disaster response. The creation of Chile Grows With You with an explicit rights-oriented mandate established intersectoral mechanisms, relationships, and common understanding between governmental groups at the national and local levels. After the earthquake, Chile Grows With You organized its activities according to its founding principles: it provided universal access and support for all Chilean children, with special attention and services for those at greatest risk. This tiered approach involved public health and education materials for all children and families; epidemiologic data for local planners about children in their municipalities at-risk before the earthquake; and an instrument developed to assist in the assessment and intervention of children put at risk by the earthquake. This disaster response illustrates how a rights-based framework defined and operationalized in times of stability facilitated organization, prioritization, and sustained action to protect and support children and families in the acute aftermath of the earthquake, despite a change in government from a left-wing to a right-wing president, and into the early recovery period. Copyright © 2011 Arbour, Murray, Arriet, Moraga, and Vega. This is an open access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original author and source are credited.

  12. Architecture-Centric Development in Globally Distributed Projects

    NASA Astrophysics Data System (ADS)

    Sauer, Joachim

    In this chapter, architecture-centric development is proposed as a means to strengthen the cohesion of distributed teams and to tackle challenges due to geographical and temporal distances and the clash of different cultures. A shared software architecture serves as a blueprint for all activities in the development process and ties them together. Architecture-centric development thus provides a plan for task allocation, facilitates the cooperation of globally distributed developers, and enables continuous integration reaching across distributed teams. Advice is also provided for software architects who work with distributed teams in an agile manner.

  13. Fuel cell with interdigitated porous flow-field

    DOEpatents

    Wilson, Mahlon S.

    1997-01-01

    A polymer electrolyte membrane (PEM) fuel cell is formed with an improved system for distributing gaseous reactants to the membrane surface. A PEM fuel cell has an ionic transport membrane with opposed catalytic surfaces formed thereon and separates gaseous reactants that undergo reactions at the catalytic surfaces of the membrane. The fuel cell may also include a thin gas diffusion layer having first and second sides with a first side contacting at least one of the catalytic surfaces. A macroporous flow-field with interdigitated inlet and outlet reactant channels contacts the second side of the thin gas diffusion layer for distributing one of the gaseous reactants over the thin gas diffusion layer for transport to an adjacent one of the catalytic surfaces of the membrane. The porous flow field may be formed from a hydrophilic material and provides uniform support across the backside of the electrode assembly to facilitate the use of thin backing layers.

  14. Diamond Eye: a distributed architecture for image data mining

    NASA Astrophysics Data System (ADS)

    Burl, Michael C.; Fowlkes, Charless; Roden, Joe; Stechert, Andre; Mukhtar, Saleem

    1999-02-01

    Diamond Eye is a distributed software architecture, which enables users (scientists) to analyze large image collections by interacting with one or more custom data mining servers via a Java applet interface. Each server is coupled with an object-oriented database and a computational engine, such as a network of high-performance workstations. The database provides persistent storage and supports querying of the 'mined' information. The computational engine provides parallel execution of expensive image processing, object recognition, and query-by-content operations. Key benefits of the Diamond Eye architecture are: (1) the design promotes trial evaluation of advanced data mining and machine learning techniques by potential new users (all that is required is to point a web browser to the appropriate URL), (2) software infrastructure that is common across a range of science mining applications is factored out and reused, and (3) the system facilitates closer collaborations between algorithm developers and domain experts.

  15. Fuel cell with interdigitated porous flow-field

    DOEpatents

    Wilson, M.S.

    1997-06-24

    A polymer electrolyte membrane (PEM) fuel cell is formed with an improved system for distributing gaseous reactants to the membrane surface. A PEM fuel cell has an ionic transport membrane with opposed catalytic surfaces formed thereon and separates gaseous reactants that undergo reactions at the catalytic surfaces of the membrane. The fuel cell may also include a thin gas diffusion layer having first and second sides with a first side contacting at least one of the catalytic surfaces. A macroporous flow-field with interdigitated inlet and outlet reactant channels contacts the second side of the thin gas diffusion layer for distributing one of the gaseous reactants over the thin gas diffusion layer for transport to an adjacent one of the catalytic surfaces of the membrane. The porous flow field may be formed from a hydrophilic material and provides uniform support across the backside of the electrode assembly to facilitate the use of thin backing layers. 9 figs.

  16. Prey field switching based on preferential behaviour can induce Lévy flights

    PubMed Central

    Lundy, Mathieu G.; Harrison, Alan; Buckley, Daniel J.; Boston, Emma S.; Scott, David D.; Teeling, Emma C.; Montgomery, W. Ian; Houghton, Jonathan D. R.

    2013-01-01

    Using the foraging movements of an insectivorous bat, Myotis mystacinus, we describe temporal switching of foraging behaviour in response to resource availability. These observations conform to predictions of optimized search under the Lévy flight paradigm. However, we suggest that this occurs as a result of a preference behaviour and knowledge of resource distribution. Preferential behaviour and knowledge of a familiar area generate distinct movement patterns as resource availability changes on short temporal scales. The behavioural response of predators to changes in prey fields can elicit different functional responses, which are considered to be central in the development of stable predator–prey communities. Recognizing how the foraging movements of an animal relate to environmental conditions also elucidates the evolution of optimized search and the prevalence of discrete strategies in natural systems. Applying techniques that use changes in the frequency distribution of movements facilitates exploration of the processes that underpin behavioural changes. PMID:23054951

  17. Brain Radiation Information Data Exchange (BRIDE): integration of experimental data from low-dose ionising radiation research for pathway discovery.

    PubMed

    Karapiperis, Christos; Kempf, Stefan J; Quintens, Roel; Azimzadeh, Omid; Vidal, Victoria Linares; Pazzaglia, Simonetta; Bazyka, Dimitry; Mastroberardino, Pier G; Scouras, Zacharias G; Tapio, Soile; Benotmane, Mohammed Abderrafi; Ouzounis, Christos A

    2016-05-11

    The underlying molecular processes representing stress responses to low-dose ionising radiation (LDIR) in mammals are just beginning to be understood. In particular, LDIR effects on the brain and their possible association with neurodegenerative disease are currently being explored using omics technologies. We describe a light-weight approach for the storage, analysis and distribution of relevant LDIR omics datasets. The data integration platform, called BRIDE, contains information from the literature as well as experimental information from transcriptomics and proteomics studies. It deploys a hybrid, distributed solution using both local storage and cloud technology. BRIDE can act as a knowledge broker for LDIR researchers, to facilitate molecular research on the systems biology of LDIR response in mammals. Its flexible design can capture a range of experimental information for genomics, epigenomics, transcriptomics, and proteomics. The data collection is available at: .

  18. Experimental study on all-fiber-based unidimensional continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Wang, Xuyang; Liu, Wenyuan; Wang, Pu; Li, Yongmin

    2017-06-01

    We experimentally demonstrated an all-fiber-based unidimensional continuous-variable quantum key distribution (CV QKD) protocol and analyzed its security under collective attack in realistic conditions. A pulsed balanced homodyne detector, which could not be accessed by eavesdroppers, with phase-insensitive efficiency and electronic noise, was considered. Furthermore, a modulation method and an improved relative phase-locking technique with one amplitude modulator and one phase modulator were designed. The relative phase could be locked precisely with a standard deviation of 0.5° and a mean of almost zero. Secret key bit rates of 5.4 kbps and 700 bps were achieved for transmission fiber lengths of 30 and 50 km, respectively. The protocol, which simplified the CV QKD system and reduced the cost, displayed a performance comparable to that of a symmetrical counterpart under realistic conditions. It is expected that the developed protocol can facilitate the practical application of the CV QKD.

  19. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is emerging as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to access resources from both systems efficiently. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools for managing the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand across public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, originally written for Amazon EC2, to connect with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, such as the DIRAC Web Portal. The main purpose of this integration is to obtain a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to worry about installing the standard software available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach to the existing Grid solution. License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.

  20. User interface for ground-water modeling: Arcview extension

    USGS Publications Warehouse

    Tsou, Ming‐shu; Whittemore, Donald O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software and provides a robust, user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension to facilitate modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by externally linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  1. An application of queuing theory to waterfowl migration

    USGS Publications Warehouse

    Sojda, Richard S.; Cornely, John E.; Fredrickson, Leigh H.; Rizzoli, A.E.; Jakeman, A.J.

    2002-01-01

    There has always been great interest in the migration of waterfowl and other birds. We have applied queuing theory to modelling waterfowl migration, beginning with a prototype system for the Rocky Mountain Population of trumpeter swans (Cygnus buccinator) in Western North America. The queuing model can be classified as a D/BB/28 system, and we describe the input sources, service mechanism, and network configuration of queues and servers. The intrinsic nature of queuing theory is to represent the spatial and temporal characteristics of entities and how they move, are placed in queues, and are serviced. The service mechanism in our system is an algorithm representing how swans move through the flyway based on seasonal life cycle events. The system uses an observed number of swans at each of 27 areas for a breeding season as input and simulates their distribution through four seasonal steps. The result is a simulated distribution of birds for the subsequent year's breeding season. The model was built as a multiagent system with one agent handling movement algorithms, one facilitating the user interface, and one to seven agents representing specific geographic areas for which swan management interventions can be implemented. The many parallels in queuing model servers and service mechanisms with waterfowl management areas and annual life cycle events made the transfer of the theory to practical application straightforward.
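
    The core of such a model is a service step that redistributes individuals among areas once per season. The sketch below illustrates that idea with a three-area toy network; the areas, season list, and movement fractions are illustrative stand-ins, not the 27-area trumpeter swan parameterization described above.

      # Toy sketch of stepping a bird population through seasonal "servers";
      # areas and movement fractions are invented for illustration only.
      SEASONS = ["breeding", "fall", "winter", "spring"]

      # movement[season][(src, dst)] = fraction of birds at src that move to dst
      movement = {
          "fall":     {("A", "B"): 0.6, ("A", "A"): 0.4, ("B", "B"): 1.0},
          "winter":   {("B", "C"): 0.8, ("B", "B"): 0.2, ("A", "A"): 1.0, ("C", "C"): 1.0},
          "spring":   {("C", "A"): 0.9, ("C", "C"): 0.1, ("A", "A"): 1.0, ("B", "B"): 1.0},
          "breeding": {},                      # no movement during breeding
      }

      def step(counts, season):
          """Apply one seasonal service step: redistribute birds among areas."""
          nxt = {area: 0.0 for area in counts}
          rules = movement.get(season, {})
          for (src, dst), frac in rules.items():
              nxt[dst] += counts.get(src, 0.0) * frac
          ruled = {src for (src, _dst) in rules}   # areas with no outgoing rule keep their birds
          for area, n in counts.items():
              if area not in ruled:
                  nxt[area] += n
          return nxt

      counts = {"A": 120.0, "B": 40.0, "C": 0.0}   # observed breeding-season counts
      for season in SEASONS[1:] + ["breeding"]:
          counts = step(counts, season)
      print(counts)                                 # simulated next breeding-season distribution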

  2. Plug-in hybrid electric vehicles as a source of distributed frequency regulation

    NASA Astrophysics Data System (ADS)

    Mullen, Sara Kathryn

    The movement to transform the North American power grid into a smart grid may be accomplished by expanding integrated sensing, communications, and control technologies to include every part of the grid to the point of end-use. Plug-in hybrid electric vehicles (PHEV) provide an opportunity for small-scale distributed storage while they are plugged in. With large numbers of PHEV and the communications and sensing associated with the smart grid, PHEV could provide ancillary services for the grid. Frequency regulation is an ideal service for PHEV because the duration of supply is short (order of minutes) and it is the highest priced ancillary service on the market, offering greater financial returns for vehicle owners. Using Simulink, a power system simulator modeling the IEEE 14 Bus System was combined with a model of PHEV charging and the controllers that facilitate vehicle-to-grid (V2G) regulation supply. The system includes a V2G controller for each vehicle which makes regulation supply decisions based on battery state, user preferences, and the recommended level of supply. A PHEV coordinator controller located higher in the system has access to reliable frequency measurements and can determine a suitable local automatic generation control (AGC) raise/lower signal for participating vehicles. A first-step implementation of the V2G supply system, where battery charging is modulated to provide regulation, was developed. The system was simulated following a step change in loading using three scenarios: (1) Central generating units provide frequency regulation, (2) PHEV contribute to primary regulation analogous to generator speed governor control, and (3) PHEV contribute to primary and secondary regulation using an additional integral term in the PHEV control signal. In both PHEV scenarios, the additional regulation provided by PHEV reduced the area control error (ACE) compared to the base case. Unique contributions resulting from this work include: (1) Studied PHEV energy systems and limitations on battery charging/discharging, (2) Reviewed standards for interconnection of distributed resources and electric vehicle charging [1], [2], (3) Explored strategies for distributed control of PHEV charging, (4) Developed controllers to accommodate PHEV regulation, and (5) Developed a simulator combining a power system model and PHEV/V2G components.
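
    To make the control structure concrete, the sketch below modulates a vehicle's charging power around a nominal set-point using a droop-like term proportional to the frequency deviation (primary regulation) plus an integral term (secondary regulation). The gains, power limits, and frequency values are assumptions for illustration, not parameters from the dissertation.

      # Hedged sketch of a per-vehicle V2G charging controller: primary (droop-like)
      # plus secondary (integral) response to frequency deviation. All constants
      # are illustrative assumptions.
      F_NOMINAL = 60.0          # Hz
      P_NOMINAL = 3.3           # kW, nominal charging power (assumed)
      K_DROOP   = 20.0          # kW per Hz of deviation (assumed)
      K_INT     = 2.0           # kW per Hz-second (assumed)
      P_MIN, P_MAX = 0.0, 6.6   # charging-only modulation, no discharge (assumed)

      def v2g_charge_setpoint(freq, integral_error, dt):
          """Return (charging power in kW, updated integral of frequency error)."""
          error = freq - F_NOMINAL
          integral_error += error * dt
          # Over-frequency -> charge harder (absorb energy); under-frequency -> back off.
          p = P_NOMINAL + K_DROOP * error + K_INT * integral_error
          return min(P_MAX, max(P_MIN, p)), integral_error

      # Example: respond to a step change that drops frequency to 59.95 Hz
      integral = 0.0
      for t in range(10):
          p, integral = v2g_charge_setpoint(59.95, integral, dt=1.0)
          print(f"t={t}s charge power = {p:.2f} kW")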

  3. Multi-Disciplinary Analysis for Future Launch Systems Using NASA's Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Monell, D.; Mathias, D.; Reuther, J.; Garn, M.

    2003-01-01

    A new engineering environment constructed for the purposes of analyzing and designing Reusable Launch Vehicles (RLVs) is presented. The new environment has been developed to allow NASA to perform independent analysis and design of emerging RLV architectures and technologies. The new Advanced Engineering Environment (AEE) is both collaborative and distributed. It facilitates integration of the analyses by both vehicle performance disciplines and life-cycle disciplines. Current performance disciplines supported include: weights and sizing, aerodynamics, trajectories, propulsion, structural loads, and CAD-based geometries. Current life-cycle disciplines supported include: DDT&E cost, production costs, operations costs, flight rates, safety and reliability, and system economics. Involving six NASA centers (ARC, LaRC, MSFC, KSC, GRC and JSC), AEE has been tailored to serve as a web-accessed agency-wide source for all of NASA's future launch vehicle systems engineering functions. Thus, it is configured to facilitate (a) data management, (b) automated tool/process integration and execution, and (c) data visualization and presentation. The core components of the integrated framework are a customized PTC Windchill product data management server, a set of RLV analysis and design tools integrated using Phoenix Integration's Model Center, and an XML-based data capture and transfer protocol. The AEE system has seen production use during the Initial Architecture and Technology Review for the NASA 2nd Generation RLV program, and it continues to undergo development and enhancements in support of its current main customer, the NASA Next Generation Launch Technology (NGLT) program.

  4. SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology.

    PubMed

    Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E; Troein, Carl; Millar, Andrew J; Goryanin, Igor; Gilmore, Stephen

    2013-03-01

    Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI's use of standard data formats. All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials.

  5. Descriptive statistics and spatial distributions of geochemical variables associated with manganese oxide-rich phases in the northern Pacific

    USGS Publications Warehouse

    Botbol, Joseph Moses; Evenden, Gerald Ian

    1989-01-01

    Tables, graphs, and maps are used to portray the frequency characteristics and spatial distribution of manganese oxide-rich phase geochemical data, to characterize the northern Pacific in terms of publicly available nodule geochemical data, and to develop data portrayal methods that will facilitate data analysis. Source data are a subset of the Scripps Institution of Oceanography's Sediment Data Bank. The study area is bounded by 0° N., 40° N., 120° E., and 100° W. and is arbitrarily subdivided into 14 geographic subregions of 20° × 20°. Frequency distributions of trace metals characterized in the original raw data are graphed as ogives, and salient parameters are tabulated. All variables are transformed to enrichment values relative to median concentration within their host subregions. Scatter plots of all pairs of original variables and their enrichment transforms are provided as an aid to the interpretation of correlations between variables. Gridded spatial distributions of all variables are portrayed as gray-scale maps. The use of tables and graphs to portray frequency statistics and gray-scale maps to portray spatial distributions is an effective way to prepare for and facilitate multivariate data analysis.
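
    The enrichment transform is simply each observation divided by the median of that variable within its host subregion. A minimal pandas sketch follows; the column names and sample values are illustrative, not taken from the Sediment Data Bank subset.

      # Sketch of the enrichment transform: value / within-subregion median.
      import pandas as pd

      df = pd.DataFrame({
          "subregion": ["R1", "R1", "R1", "R2", "R2"],
          "Mn":        [18.0, 22.0, 30.0, 10.0, 14.0],
          "Ni":        [0.50, 0.75, 0.60, 0.20, 0.40],
      })

      for metal in ["Mn", "Ni"]:
          region_median = df.groupby("subregion")[metal].transform("median")
          df[metal + "_enrichment"] = df[metal] / region_median

      print(df)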

  6. Design and development of novel bandages for compression therapy.

    PubMed

    Rajendran, Subbiyan; Anand, Subhash

    2003-03-01

    During the past few years there have been increasing concerns relating to the performance of bandages, especially their pressure distribution properties for the treatment of venous leg ulcers. This is because compression therapy is a complex system that requires two-layer or multi-layer bandages, and the performance properties of each layer differ from those of the other layers. The widely accepted sustained graduated compression mainly depends on the uniform pressure distribution of the different layers of bandages, in which textile fibres and bandage structures play a major role. This article examines how the fibres, fibre blends and structures influence the absorption and pressure distribution properties of bandages. It is hoped that the research findings will help medical professionals, especially nurses, to gain an insight into the development of bandages. A total of 12 padding bandages have been produced using various fibres and fibre blends. A new technique that would facilitate good resilience and cushioning properties, higher and more uniform pressure distribution, and enhanced water absorption and retention was adopted during production. It has been found that the properties of the developed padding bandages, which include uniform pressure distribution around the leg, are superior to existing commercial bandages and possess a number of additional properties required to meet the criteria stipulated for an ideal padding bandage. Results have indicated that none of the most widely used commercial padding bandages provide the required uniform pressure distribution around the limb.

  7. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper and the sequel, new theoretical work is contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system model was used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper and the sequel support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding, with minimal computational complexity.
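
    As a point of reference for the fading model alone, the sketch below runs a Monte Carlo BER estimate for a single BPSK user over flat Rayleigh fading with additive Gaussian noise and checks it against the closed form Pb = (1/2)(1 - sqrt(g/(1+g))), where g is the average SNR per bit. This toy baseline omits the asynchronous multi-user interference and spreading-code structure that the paper's analysis actually addresses.

      # Monte Carlo BER for coherent BPSK over flat Rayleigh fading (toy baseline).
      import numpy as np

      rng = np.random.default_rng(0)
      snr_db = 10.0
      g = 10 ** (snr_db / 10)                 # average SNR per bit
      n = 1_000_000

      bits = rng.integers(0, 2, n)
      symbols = 2 * bits - 1                                             # BPSK: {-1, +1}
      h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)    # Rayleigh fading, E|h|^2 = 1
      noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) * np.sqrt(1 / (2 * g))
      r = h * symbols + noise
      decisions = (np.real(np.conj(h) * r) > 0).astype(int)              # coherent detection

      ber_sim = np.mean(decisions != bits)
      ber_theory = 0.5 * (1 - np.sqrt(g / (1 + g)))
      print(ber_sim, ber_theory)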

  8. Implications of Mycobacterium Major Facilitator Superfamily for Novel Measures against Tuberculosis.

    PubMed

    Wang, Rui; Zhang, Zhen; Xie, Longxiang; Xie, Jianping

    2015-01-01

    Major facilitator superfamily (MFS) is an important secondary membrane transport protein superfamily conserved from prokaryotes to eukaryotes. The MFS proteins are widespread among bacteria and are responsible for the transfer of substrates. Pathogenic Mycobacterium MFS transporters, their distribution, function, phylogeny, and predicted crystal structures were studied to better understand the function of MFS and to discover specific inhibitors of MFS for better tuberculosis control.

  9. Integrated hydraulic and organophosphate pesticide injection simulations for enhancing event detection in water distribution systems.

    PubMed

    Schwartz, Rafi; Lahav, Ori; Ostfeld, Avi

    2014-10-15

    As a complementary step towards solving the general event detection problem of water distribution systems, injection of the organophosphate pesticides chlorpyrifos (CP) and parathion (PA) was simulated at various locations within example networks, and hydraulic parameters were calculated over a 24-h duration. The uniqueness of this study is that the chemical reactions and byproducts of the contaminants' oxidation were also simulated, as well as other indicative water quality parameters such as alkalinity, acidity, pH and the total concentration of free chlorine species. The information on the change in water quality parameters induced by the contaminant injection may facilitate on-line detection of an actual event involving this specific substance and pave the way for the development of a generic methodology for detecting events involving introduction of pesticides into water distribution systems. Simulation of the contaminant injection was performed at several nodes within two different networks. For each injection, concentrations of the relevant contaminants' mother and daughter species, free chlorine species and water quality parameters were simulated at nodes downstream of the injection location. The results indicate that injection of these substances can be detected under certain conditions by a very rapid drop in Cl2, functioning as the indicative parameter, as well as a drop in alkalinity concentration and a small decrease in pH, both functioning as supporting parameters, whose usage may reduce false positive alarms. Copyright © 2014 Elsevier Ltd. All rights reserved.
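
    The detection logic implied by these results can be illustrated with a simple rule: flag an event when free chlorine drops sharply (the indicative parameter) and alkalinity and pH also decline (the supporting parameters, used to suppress false positives). The thresholds in the sketch below are assumptions for illustration, not values derived in the study.

      # Illustrative threshold rule for contamination event detection; thresholds assumed.
      def detect_event(prev, curr,
                       cl2_drop=0.3,      # mg/L drop in free chlorine (assumed)
                       alk_drop=5.0,      # mg/L as CaCO3 drop in alkalinity (assumed)
                       ph_drop=0.05):     # pH units (assumed)
          indicative = (prev["cl2"] - curr["cl2"]) >= cl2_drop
          supporting = ((prev["alkalinity"] - curr["alkalinity"]) >= alk_drop
                        and (prev["ph"] - curr["ph"]) >= ph_drop)
          return indicative and supporting

      prev = {"cl2": 0.80, "alkalinity": 120.0, "ph": 7.60}
      curr = {"cl2": 0.35, "alkalinity": 112.0, "ph": 7.52}
      print(detect_event(prev, curr))   # True under the assumed thresholds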

  10. Interactive analysis of geographically distributed population imaging data collections over light-path data networks

    NASA Astrophysics Data System (ADS)

    van Lew, Baldur; Botha, Charl P.; Milles, Julien R.; Vrooman, Henri A.; van de Giessen, Martijn; Lelieveldt, Boudewijn P. F.

    2015-03-01

    The cohort size required in epidemiological imaging genetics studies often mandates the pooling of data from multiple hospitals. Patient data, however, is subject to strict privacy protection regimes, and physical data storage may be legally restricted to a hospital network. To enable biomarker discovery, fast data access and interactive data exploration must be combined with high-performance computing resources, while respecting privacy regulations. We present a system using fast and inherently secure light-paths to access distributed data, thereby obviating the need for a central data repository. A secure private cloud computing framework facilitates interactive, computationally intensive exploration of this geographically distributed, privacy-sensitive data. As a proof of concept, MRI brain imaging data hosted at two remote sites were processed in response to a user command at a third site. The system was able to automatically start virtual machines, run a selected processing pipeline and write results to a user-accessible database, while keeping data locally stored in the hospitals. Individual tasks took approximately 50% longer compared to a locally hosted blade server, but the cloud infrastructure reduced the total elapsed time by a factor of 40 using 70 virtual machines in the cloud. We demonstrated that the combination of light-paths and a private cloud is a viable means of building an analysis infrastructure for secure data analysis. The system requires further work in the areas of error handling, load balancing and secure support of multiple users.

  11. Facilitative Effect of a Generalist Herbivore on the Recovery of a Perennial Alga: Consequences for Persistence at the Edge of Their Geographic Range.

    PubMed

    Aguilera, Moisés A; Valdivia, Nelson; Broitman, Bernardo R

    2015-01-01

    Understanding the impacts of consumers on the abundance, growth rate, recovery and persistence of their resources across their distributional range can shed light on the role of trophic interactions in determining species range shifts. Here, we examined whether consumptive effects of the intertidal grazer Scurria viridula positively influence the abundance and recovery from disturbances of the alga Mazzaella laminarioides at the edge of its geographic distribution in northern-central Chilean rocky shores. Through field experiments conducted at a site in the region where M. laminarioides overlaps with the polar range edge of S. viridula, we estimated the effects of grazing on different life stages of M. laminarioides. We also used long-term abundance surveys conducted across ~700 km of the shore to evaluate co-occurrence patterns of the study species across their range overlap. We found that S. viridula had positive net effects on M. laminarioides by increasing its cover and re-growth from perennial basal crusts. Probability of occurrence of M. laminarioides increased significantly with increasing density of S. viridula across the range overlap. The negative effect of S. viridula on the percentage cover of opportunistic green algae-shown to compete for space with corticated algae-suggests that competitive release may be part of the mechanism driving the positive effect of the limpet on the abundance and recovery from disturbance of M. laminarioides. We suggest that grazer populations contribute to enhancing the abundance of M. laminarioides, facilitating its recolonization and persistence at its distributional range edge. Our study highlights that indirect facilitation can determine the recovery and persistence of a resource at the limit of its distribution, and may well contribute to the ecological mechanisms governing species distributions and range shifts.

  12. Facilitative Effect of a Generalist Herbivore on the Recovery of a Perennial Alga: Consequences for Persistence at the Edge of Their Geographic Range

    PubMed Central

    Aguilera, Moisés A.; Valdivia, Nelson; Broitman, Bernardo R.

    2015-01-01

    Understanding the impacts of consumers on the abundance, growth rate, recovery and persistence of their resources across their distributional range can shed light on the role of trophic interactions in determining species range shifts. Here, we examined whether consumptive effects of the intertidal grazer Scurria viridula positively influence the abundance and recovery from disturbances of the alga Mazzaella laminarioides at the edge of its geographic distribution in northern-central Chilean rocky shores. Through field experiments conducted at a site in the region where M. laminarioides overlaps with the polar range edge of S. viridula, we estimated the effects of grazing on different life stages of M. laminarioides. We also used long-term abundance surveys conducted across ~700 km of the shore to evaluate co-occurrence patterns of the study species across their range overlap. We found that S. viridula had positive net effects on M. laminarioides by increasing its cover and re-growth from perennial basal crusts. Probability of occurrence of M. laminarioides increased significantly with increasing density of S. viridula across the range overlap. The negative effect of S. viridula on the percentage cover of opportunistic green algae—shown to compete for space with corticated algae—suggests that competitive release may be part of the mechanism driving the positive effect of the limpet on the abundance and recovery from disturbance of M. laminarioides. We suggest that grazer populations contribute to enhancing the abundance of M. laminarioides, facilitating its recolonization and persistence at its distributional range edge. Our study highlights that indirect facilitation can determine the recovery and persistence of a resource at the limit of its distribution, and may well contribute to the ecological mechanisms governing species distributions and range shifts. PMID:26716986

  13. Implementing an overdose education and naloxone distribution program in a health system.

    PubMed

    Devries, Jennifer; Rafie, Sally; Polston, Gregory

    To design and implement a health system-wide program increasing provision of take-home naloxone in patients at risk for opioid overdose, with the downstream aim of reducing fatalities. The program includes health care professional education and guidelines, development, and dissemination of patient education materials, electronic health record changes to promote naloxone prescriptions, and availability of naloxone in pharmacies. Academic health system, San Diego, California. University of California, San Diego Health (UCSDH), offers both inpatient and outpatient primary care and specialty services with 563 beds spanning 2 hospitals and 6 pharmacies. UCSDH is part of the University of California health system, and it serves as the county's safety net hospital. In January 2016, a multisite academic health system initiated a system-wide overdose education and naloxone distribution program to prevent opioid overdose and opioid overdose-related deaths. An interdisciplinary, interdepartmental team came together to develop and implement the program. To strengthen institutional support, naloxone prescribing guidelines were developed and approved for the health system. Education on naloxone for physicians, pharmacists, and nurses was provided through departmental trainings, bulletins, and e-mail notifications. Alerts in the electronic health record and preset naloxone orders facilitated co-prescribing of naloxone with opioid prescriptions. Electronic health record reports captured naloxone prescriptions ordered. Summary reports on the electronic health record measured naloxone reminder alerts and response rates. Since the start of the program, the health system has trained 252 physicians, pharmacists, and nurses in overdose education and take-home naloxone. There has been an increase in the number of prescriptions for naloxone from a baseline of 4.5 per month to an average of 46 per month during the 3 months following full implementation of the program including implementation of electronic health record alerts. Initiating and implementing an overdose education and naloxone distribution program is feasible in an academic health system. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Impact of matric potential and pore size distribution on growth dynamics of filamentous and non-filamentous soil bacteria.

    PubMed

    Wolf, Alexandra B; Vos, Michiel; de Boer, Wietse; Kowalchuk, George A

    2013-01-01

    The filamentous growth form is an important strategy for soil microbes to bridge air-filled pores in unsaturated soils. In particular, fungi perform better than bacteria in soils during drought, a property that has been ascribed to the hyphal growth form of fungi. However, it is unknown if, and to what extent, filamentous bacteria may also display similar advantages over non-filamentous bacteria in soils with low hydraulic connectivity. In addition to allowing for microbial interactions and competition across connected micro-sites, water films also facilitate the motility of non-filamentous bacteria. To examine these issues, we constructed and characterized a series of quartz sand microcosms differing in matric potential and pore size distribution and, consequently, in connection of micro-habitats via water films. Our sand microcosms were used to examine the individual and competitive responses of a filamentous bacterium (Streptomyces atratus) and a motile rod-shaped bacterium (Bacillus weihenstephanensis) to differences in pore sizes and matric potential. The Bacillus strain had an initial advantage in all sand microcosms, which could be attributed to its faster growth rate. At later stages of the incubation, Streptomyces became dominant in microcosms with low connectivity (coarse pores and dry conditions). These data, combined with information on bacterial motility (expansion potential) across a range of pore-size and moisture conditions, suggest that, like their much larger fungal counterparts, filamentous bacteria also use this growth form to facilitate growth and expansion under conditions of low hydraulic conductivity. The sand microcosm system developed and used in this study allowed for precise manipulation of hydraulic properties and pore size distribution, thereby providing a useful approach for future examinations of how these properties influence the composition, diversity and function of soil-borne microbial communities.

  15. Evaluation of a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography of scaphoid fixation screws.

    PubMed

    Filli, Lukas; Marcon, Magda; Scholz, Bernhard; Calcagni, Maurizio; Finkenstädt, Tim; Andreisek, Gustav; Guggenberger, Roman

    2014-12-01

    The aim of this study was to evaluate a prototype correction algorithm to reduce metal artefacts in flat detector computed tomography (FDCT) of scaphoid fixation screws. FDCT has gained interest in imaging small anatomic structures of the appendicular skeleton. Angiographic C-arm systems with flat detectors allow fluoroscopy and FDCT imaging in a one-stop procedure emphasizing their role as an ideal intraoperative imaging tool. However, FDCT imaging can be significantly impaired by artefacts induced by fixation screws. Following ethical board approval, commercially available scaphoid fixation screws were inserted into six cadaveric specimens in order to fix artificially induced scaphoid fractures. FDCT images corrected with the algorithm were compared to uncorrected images both quantitatively and qualitatively by two independent radiologists in terms of artefacts, screw contour, fracture line visibility, bone visibility, and soft tissue definition. Normal distribution of variables was evaluated using the Kolmogorov-Smirnov test. In case of normal distribution, quantitative variables were compared using paired Student's t tests. The Wilcoxon signed-rank test was used for quantitative variables without normal distribution and all qualitative variables. A p value of < 0.05 was considered to indicate statistically significant differences. Metal artefacts were significantly reduced by the correction algorithm (p < 0.001), and the fracture line was more clearly defined (p < 0.01). The inter-observer reliability was "almost perfect" (intra-class correlation coefficient 0.85, p < 0.001). The prototype correction algorithm in FDCT for metal artefacts induced by scaphoid fixation screws may facilitate intra- and postoperative follow-up imaging. Flat detector computed tomography (FDCT) is a helpful imaging tool for scaphoid fixation. The correction algorithm significantly reduces artefacts in FDCT induced by scaphoid fixation screws. This may facilitate intra- and postoperative follow-up imaging.

  16. Efficient statistically accurate algorithms for the Fokker-Planck equation in large dimensions

    NASA Astrophysics Data System (ADS)

    Chen, Nan; Majda, Andrew J.

    2018-02-01

    Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat-tailed, highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for computing both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. Particularly, the parametric method provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace and is therefore computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. It is shown in a stringent set of test problems that the method requires only on the order of O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6 dimensions with only small errors.
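
    Structurally, the hybrid estimate is a mixture over ensemble members in which each member contributes a kernel density factor in the low-dimensional subspace times an analytically computed conditional Gaussian in the high-dimensional subspace. The sketch below shows that structure on a toy 1-D/2-D split; the ensemble values, bandwidth, and conditional moments are illustrative stand-ins rather than outputs of the conditional Gaussian filtering formulae.

      # Toy sketch of the hybrid KDE / conditional-Gaussian mixture density estimate.
      import numpy as np
      from scipy.stats import multivariate_normal, norm

      # Toy setup: 1-D "low-dimensional" subspace u1, 2-D "high-dimensional" subspace u2.
      ensemble_u1 = np.array([0.1, -0.4, 0.7])                    # ensemble samples of u1
      cond_means  = [np.array([0.0, 0.2]),                        # E[u2 | u1_i] (stand-ins)
                     np.array([-0.3, 0.1]),
                     np.array([0.5, -0.2])]
      cond_covs   = [np.eye(2) * 0.2] * 3                         # Cov[u2 | u1_i] (stand-ins)
      bandwidth   = 0.3                                           # KDE bandwidth for u1 (assumed)

      def hybrid_pdf(u1, u2):
          """Mixture over ensemble members: KDE factor in u1 times conditional Gaussian in u2."""
          vals = []
          for x1, m, c in zip(ensemble_u1, cond_means, cond_covs):
              kde_factor = norm.pdf(u1, loc=x1, scale=bandwidth)
              gauss_factor = multivariate_normal.pdf(u2, mean=m, cov=c)
              vals.append(kde_factor * gauss_factor)
          return np.mean(vals)

      print(hybrid_pdf(0.0, np.array([0.0, 0.0])))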

  17. Internetting tactical security sensor systems

    NASA Astrophysics Data System (ADS)

    Gage, Douglas W.; Bryan, W. D.; Nguyen, Hoa G.

    1998-08-01

    The Multipurpose Surveillance and Security Mission Platform (MSSMP) is a distributed network of remote sensing packages and control stations, designed to provide a rapidly deployable, extended-range surveillance capability for a wide variety of military security operations and other tactical missions. The baseline MSSMP sensor suite consists of a pan/tilt unit with video and FLIR cameras and laser rangefinder. With an additional radio transceiver, MSSMP can also function as a gateway between existing security/surveillance sensor systems such as TASS, TRSS, and IREMBASS, and IP-based networks, to support the timely distribution of both threat detection and threat assessment information. The MSSMP system makes maximum use of Commercial Off The Shelf (COTS) components for sensing, processing, and communications, and of both established and emerging standard communications networking protocols and system integration techniques. Its use of IP-based protocols allows it to freely interoperate with the Internet -- providing geographic transparency, facilitating development, and allowing fully distributed demonstration capability -- and prepares it for integration with the IP-based tactical radio networks that will evolve in the next decade. Unfortunately, the Internet's standard Transport layer protocol, TCP, is poorly matched to the requirements of security sensors and other quasi-autonomous systems in being oriented to conveying a continuous data stream, rather than discrete messages. Also, its canonical 'socket' interface both conceals short losses of communications connectivity and simply gives up and forces the Application layer software to deal with longer losses. For MSSMP, a software applique is being developed that will run on top of User Datagram Protocol (UDP) to provide a reliable message-based Transport service. In addition, a Session layer protocol is being developed to support the effective transfer of control of multiple platforms among multiple control stations.
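
    A minimal way to provide a reliable, message-oriented service on top of UDP is stop-and-wait: number each message, retransmit until an acknowledgment for that number arrives, and give up after a bounded number of tries. The sketch below illustrates the sender side; addresses, timeouts, and framing are assumptions for illustration, not the MSSMP applique itself.

      # Stop-and-wait reliable message delivery over UDP (illustrative sender side).
      import socket
      import struct

      def send_reliable(sock, dest, seq, payload, timeout=0.5, retries=5):
          """Send one message and wait for a matching ACK; returns True on success."""
          packet = struct.pack("!I", seq) + payload        # 4-byte sequence number + body
          sock.settimeout(timeout)
          for _ in range(retries):
              sock.sendto(packet, dest)
              try:
                  data, _addr = sock.recvfrom(64)
                  if len(data) >= 4 and struct.unpack("!I", data[:4])[0] == seq:
                      return True                          # ACK for this sequence number
              except socket.timeout:
                  continue                                 # no ACK in time: retransmit
          return False

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      ok = send_reliable(sock, ("127.0.0.1", 9999), seq=1, payload=b"threat detection report")
      print("delivered" if ok else "gave up")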

  18. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  19. Northern Everglades, Florida, satellite image map

    USGS Publications Warehouse

    Thomas, Jean-Claude; Jones, John W.

    2002-01-01

    These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program with support from the Everglades National Park. The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.

  20. Value Streams in Microgrids: A literature Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stadler, Michael; Center for Energy and Innovative Technologies; Cardoso, Gonçalo

    2015-10-01

    Microgrids are an increasingly common component of the evolving electricity grids with the potential to improve local reliability, reduce costs, and increase penetration rates for distributed renewable generation. The additional complexity of microgrids often leads to increased investment costs, creating a barrier for widespread adoption. These costs may result directly from specific needs for islanding detection, protection systems and power quality assurance that would otherwise be avoided in simpler system configurations. However, microgrids also facilitate additional value streams that may make up for their increased costs and improve the economic viability of microgrid deployment. This paper analyses the literature currently available on research relevant to value streams occurring in microgrids that may contribute to offsetting the increased investment costs. A review of research related to specific microgrid requirements is also presented.

  1. A coastal and marine digital library at USGS

    USGS Publications Warehouse

    Lightsom, Fran

    2003-01-01

    The Marine Realms Information Bank (MRIB) is a distributed geolibrary [NRC, 1999] from the U.S. Geological Survey (USGS) and the Woods Hole Oceanographic Institution (WHOI), whose purpose is to classify, integrate, and facilitate access to Earth systems science information about ocean, lake, and coastal environments. Core MRIB services are: (1) the search and display of information holdings by place and subject, and (2) linking of information assets that exist in remote physical locations. The design of the MRIB features a classification system to integrate information from remotely maintained sources. This centralized catalogue organizes information using 12 criteria: locations, geologic time, physiographic features, biota, disciplines, research methods, hot topics, project names, agency names, authors, content type, and file type. For many of these fields, MRIB has developed classification hierarchies.

  2. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences

    PubMed Central

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org. PMID:26401099

  3. BioImg.org: A Catalog of Virtual Machine Images for the Life Sciences.

    PubMed

    Dahlö, Martin; Haziza, Frédéric; Kallio, Aleksi; Korpelainen, Eija; Bongcam-Rudloff, Erik; Spjuth, Ola

    2015-01-01

    Virtualization is becoming increasingly important in bioscience, enabling assembly and provisioning of complete computer setups, including operating system, data, software, and services packaged as virtual machine images (VMIs). We present an open catalog of VMIs for the life sciences, where scientists can share information about images and optionally upload them to a server equipped with a large file system and fast Internet connection. Other scientists can then search for and download images that can be run on the local computer or in a cloud computing environment, providing easy access to bioinformatics environments. We also describe applications where VMIs aid life science research, including distributing tools and data, supporting reproducible analysis, and facilitating education. BioImg.org is freely available at: https://bioimg.org.

  4. Image Guided Biodistribution and Pharmacokinetic Studies of Theranostics

    PubMed Central

    Ding, Hong; Wu, Fang

    2012-01-01

    Image-guided techniques are playing an increasingly important role in the investigation of the biodistribution and pharmacokinetics of drugs or drug delivery systems in various diseases, especially cancers. Besides anatomical imaging modalities such as computed tomography (CT) and magnetic resonance imaging (MRI), molecular imaging strategies including optical imaging, positron emission tomography (PET) and single-photon emission computed tomography (SPECT) will facilitate the localization and quantification of radioisotope- or optical-probe-labeled nanoparticle delivery systems in the category of theranostics. This review summarizes the quantitative measurement of the biodistribution and pharmacokinetics of theranostics in new drug/probe development, in monitoring of diagnosis and treatment, and in tracking blood-brain barrier (BBB) penetration with highly sensitive imaging methods, together with the applications of the representative imaging modalities. PMID:23227121

  5. How I do it: a practical database management system to assist clinical research teams with data collection, organization, and reporting.

    PubMed

    Lee, Howard; Chapiro, Julius; Schernthaner, Rüdiger; Duran, Rafael; Wang, Zhijun; Gorodetski, Boris; Geschwind, Jean-François; Lin, MingDe

    2015-04-01

    The objective of this study was to demonstrate that an intra-arterial liver therapy clinical research database system is a more workflow-efficient and robust tool for clinical research than a spreadsheet storage system. The database system could be used to generate clinical research study populations easily with custom search and retrieval criteria. A questionnaire was designed and distributed to 21 board-certified radiologists to assess current data storage problems and clinician reception to a database management system. Based on the questionnaire findings, a customized database and user interface system were created to perform automatic calculations of clinical scores, including staging systems such as the Child-Pugh and Barcelona Clinic Liver Cancer, and to facilitate data input and output. Questionnaire participants were favorable toward a database system. The interface retrieved study-relevant data accurately and effectively. The database effectively produced easy-to-read study-specific patient populations with custom-defined inclusion/exclusion criteria. The database management system is workflow efficient and robust in retrieving, storing, and analyzing data. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  6. Policy-based Distributed Data Management

    NASA Astrophysics Data System (ADS)

    Moore, R. W.

    2009-12-01

    The analysis and understanding of climate variability and change builds upon access to massive collections of observational and simulation data. The analyses involve distributed computing, both at the storage systems (which support data subsetting) and at compute engines (for assimilation of observational data into simulations). The integrated Rule Oriented Data System (iRODS) organizes the distributed data into collections to facilitate enforcement of management policies, support remote data processing, and enable development of reference collections. Currently at RENCI, the iRODS data grid is being used to manage ortho-photos and lidar data for the State of North Carolina, provide a unifying storage environment for engagement centers across the state, support distributed access to visualizations of weather data, and is being explored to manage and disseminate collections of ensembles of meteorological and hydrological model results. In collaboration with the National Climatic Data Center, an iRODS data grid is being established to support data transmission from NCDC to ORNL, and to integrate NCDC archives with ORNL compute services. To manage the massive data transfers, parallel I/O streams are used between High Performance Storage System tape archives and the supercomputers at ORNL. Further, we are exploring the movement and management of large RADAR and in situ datasets to be used for data mining between RENCI and NCDC, and for the distributed creation of decision support and climate analysis tools. The iRODS data grid supports all phases of the scientific data life cycle, from management of data products for a project, to sharing of data between research institutions, to publication of data in a digital library, to preservation of data for use in future research projects. Each phase is characterized by a broader user community, with higher expectations for more detailed descriptions and analysis mechanisms for manipulating the data. The higher usage requirements are enforced by management policies that define the required metadata, the required data formats, and the required analysis tools. The iRODS policy based data management system automates the creation of the community chosen data products, validates integrity and authenticity assessment criteria, and enforces management policies across all accesses of the system.

  7. A cloud-based system for automatic glaucoma screening.

    PubMed

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resultant medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling early intervention for more efficient disease management.

  8. Impact of Targeted Programs on Health Systems: A Case Study of the Polio Eradication Initiative

    PubMed Central

    Loevinsohn, Benjamin; Aylward, Bruce; Steinglass, Robert; Ogden, Ellyn; Goodman, Tracey; Melgaard, Bjorn

    2002-01-01

    The results of 2 large field studies on the impact of the polio eradication initiative on health systems and 3 supplementary reports were presented at a December 1999 meeting convened by the World Health Organization. All of these studies concluded that positive synergies exist between polio eradication and health systems but that these synergies have not been vigorously exploited. The eradication of polio has probably improved health systems worldwide by broadening distribution of vitamin A supplements, improving cooperation among enterovirus laboratories, and facilitating linkages between health workers and their communities. The results of these studies also show that eliminating polio did not cause a diminution of funding for immunization against other illnesses. Relatively little is known about the opportunity costs of polio eradication. Improved planning in disease eradication initiatives can minimize disruptions in the delivery of other services. Future initiatives should include indicators and baseline data for monitoring effects on health systems development. PMID:11772750

  9. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
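
    As a purely illustrative sketch of the throttling idea described above (not the CATALYST implementation), a workload manager can cap the number of concurrently running, I/O-bound jobs; the job commands, concurrency limit, and polling interval below are assumptions.

```python
# Illustrative throttling job manager: never run more than MAX_CONCURRENT
# I/O-bound jobs at once. Not the CERES CATALYST code; parameters are assumed.
import subprocess
import time
from collections import deque

MAX_CONCURRENT = 4      # assumed I/O budget for the cluster
POLL_INTERVAL_S = 5

def run_throttled(commands):
    """Launch each shell command, keeping at most MAX_CONCURRENT running."""
    pending = deque(commands)
    running = []
    while pending or running:
        # Reap jobs that have finished.
        running = [p for p in running if p.poll() is None]
        # Top up to the concurrency limit.
        while pending and len(running) < MAX_CONCURRENT:
            running.append(subprocess.Popen(pending.popleft(), shell=True))
        time.sleep(POLL_INTERVAL_S)

if __name__ == "__main__":
    # Hypothetical processing-stream commands (a real system would invoke
    # its instrument-processing executables here).
    jobs = [f"echo processing granule_{i:04d}" for i in range(20)]
    run_throttled(jobs)
```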

  10. Facilitators and Barriers of Implementing a Measurement Feedback System in Public Youth Mental Health.

    PubMed

    Kotte, Amelia; Hill, Kaitlin A; Mah, Albert C; Korathu-Larson, Priya A; Au, Janelle R; Izmirian, Sonia; Keir, Scott S; Nakamura, Brad J; Higa-McMillan, Charmaine K

    2016-11-01

    This study examines implementation facilitators and barriers of a statewide roll-out of a measurement feedback system (MFS) in a youth public mental health system. 76 % of all state care coordinators (N = 47) completed interviews, which were coded via content analysis until saturation. Facilitators (e.g., recognition of the MFS's clinical utility) and barriers (e.g., MFS's reliability and validity) emerged paralleling the Exploration, Adoption/Preparation, Implementation, and Sustainment framework outlined by Aarons et al. (Adm Policy Mental Health Mental Health Serv Res, 38:4-23, 2011). Sustainment efforts may leverage innovation fit, individual adopter, and system related facilitators.

  11. Ubiquitous Low-Cost Functionalized Multi-Walled Carbon Nanotube Sensors for Distributed Methane Leak Detection

    DOE PAGES

    Humayun, Md Tanim; Divan, Ralu; Stan, Liliana; ...

    2016-06-16

    This paper presents a highly sensitive, energy-efficient and low-cost distributed methane (CH4) sensor system (DMSS) for continuous monitoring, detection, and localization of CH4 leaks in natural gas infrastructure, such as transmission and distribution pipelines, wells, and production pads. The CH4 sensing element, a key component of the DMSS, consists of a metal oxide nanocrystal (MONC) functionalized multi-walled carbon nanotube (MWCNT) mesh which, in comparison to existing literature, shows a stronger relative resistance change while interacting with lower parts-per-million (ppm) concentrations of CH4. A Gaussian plume triangulation algorithm has been developed for the DMSS. Given a geometric model of the surrounding environment, the algorithm can precisely detect and localize a CH4 leak as well as estimate its mass emission rate. A UV-based surface recovery technique that makes the sensor recover 10 times faster than previously reported ones is presented for the DMSS. In conclusion, a control algorithm based on the UV-accelerated recovery is developed which facilitates faster leak detection.
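
    A minimal sketch of the Gaussian plume forward model that such a triangulation and localization scheme could invert is given below; the dispersion coefficients, source parameters, and sensor geometry are illustrative assumptions, not values from the paper.

```python
# Standard Gaussian plume forward model (illustrative parameterization).
# A leak locator could scan candidate source positions and emission rates
# and pick the combination minimizing mismatch with distributed sensor readings.
import numpy as np

def plume_concentration(x, y, z, Q, u, H=0.0, a=0.08, b=0.06):
    """Concentration (kg/m^3) at downwind distance x (m), crosswind offset y (m),
    height z (m), for emission rate Q (kg/s), wind speed u (m/s), release
    height H (m). a and b are assumed linear growth rates of sigma_y, sigma_z."""
    sigma_y = a * x
    sigma_z = b * x
    coeff = Q / (2.0 * np.pi * u * sigma_y * sigma_z)
    crosswind = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # ground reflection
    return coeff * crosswind * vertical

# Example: predicted concentration 50 m downwind of a 1 g/s leak in a 3 m/s wind.
print(plume_concentration(x=50.0, y=5.0, z=1.5, Q=1e-3, u=3.0))
```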

  12. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  13. Estimation Accuracy on Execution Time of Run-Time Tasks in a Heterogeneous Distributed Environment

    PubMed Central

    Liu, Qi; Cai, Weidong; Jin, Dandan; Shen, Jian; Fu, Zhangjie; Liu, Xiaodong; Linge, Nigel

    2016-01-01

    Distributed Computing has achieved tremendous development since cloud computing was proposed in 2006, and has played a vital role in promoting the rapid growth of data collection and analysis models, e.g., the Internet of Things, Cyber-Physical Systems, Big Data Analytics, etc. Hadoop has become a data convergence platform for sensor networks. As one of the core components, MapReduce facilitates allocating, processing and mining of collected large-scale data, where speculative execution strategies help solve straggler problems. However, there is still no efficient solution for accurately estimating the execution time of run-time tasks, which can affect task allocation and distribution in MapReduce. In this paper, task execution data have been collected and employed for the estimation. A two-phase regression (TPR) method is proposed to predict the finishing time of each task accurately. Detailed data for each task were collected and examined, and a detailed analysis report was produced. According to the results, the prediction accuracy of concurrent tasks' execution time can be improved, in particular for some regular jobs. PMID:27589753
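
    In the spirit of the two-phase regression idea (a hedged sketch, not the authors' TPR implementation), one can fit separate linear models to early and late task progress and extrapolate the finishing time; the breakpoint and the synthetic data below are assumptions.

```python
# Piecewise ("two-phase") linear regression sketch for task finishing-time
# estimation. Illustrative only; breakpoint and data are assumed.
import numpy as np

def fit_two_phase(progress, elapsed, breakpoint=0.5):
    """Fit (slope, intercept) pairs to the early and late progress phases."""
    early = progress <= breakpoint
    late = ~early
    p_early = np.polyfit(progress[early], elapsed[early], 1)
    p_late = np.polyfit(progress[late], elapsed[late], 1)
    return p_early, p_late

def predict_finish(p_late):
    """Extrapolate the late-phase model to progress = 1.0."""
    slope, intercept = p_late
    return slope * 1.0 + intercept

# Synthetic task that slows down in its second phase.
progress = np.linspace(0.05, 0.95, 19)
elapsed = np.where(progress <= 0.5, 100 * progress, 50 + 140 * (progress - 0.5))
_, p_late = fit_two_phase(progress, elapsed)
print("estimated finishing time (s):", predict_finish(p_late))
```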

  14. We are not the 99 percent: quantifying asphericity in the distribution of Local Group satellites

    NASA Astrophysics Data System (ADS)

    Forero-Romero, Jaime E.; Arias, Verónica

    2018-05-01

    We use simulations to build an analytic probability distribution for the asphericity in the satellite distribution around Local Group (LG) type galaxies in the Lambda Cold Dark Matter (LCDM) paradigm. We use this distribution to estimate the atypicality of the satellite distributions in the LG even when the underlying simulations do not have enough systems fully resembling the LG in terms of its typical masses, separation and kinematics. We demonstrate the method using three different simulations (Illustris-1, Illustris-1-Dark and ELVIS) and a number of satellites ranging from 11 to 15. Detailed results differ greatly among the simulations suggesting a strong influence of the typical DM halo mass, the number of satellites and the simulated baryonic effects. However, there are three common trends. First, at most 2% of the pairs are expected to have satellite distributions with the same asphericity as the LG; second, at most 80% of the pairs have a halo with a satellite distribution as aspherical as in M31; and third, at most 4% of the pairs have a halo with satellite distribution as planar as in the MW. These quantitative results place the LG at the level of a 3σ outlier in the LCDM paradigm. We suggest that understanding the reasons for this atypicality requires quantifying the asphericity probability distribution as a function of halo mass and large scale environment. The approach presented here can facilitate that kind of study and other comparisons between different numerical setups and choices to study satellites around LG pairs in simulations.
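
    One common way to quantify how aspherical (planar) a satellite distribution is uses the eigenvalues of the satellites' shape (moment-of-inertia) tensor; the sketch below illustrates that generic computation with synthetic positions and is not the specific statistic constructed in the paper.

```python
# Axis ratios of a satellite distribution from its shape tensor.
# A small c/a indicates a flattened (planar, aspherical) configuration.
import numpy as np

def axis_ratios(positions):
    """positions: (N, 3) satellite coordinates relative to the host galaxy."""
    centered = positions - positions.mean(axis=0)
    shape_tensor = centered.T @ centered / len(centered)
    eigvals = np.sort(np.linalg.eigvalsh(shape_tensor))  # ascending
    c, b, a = np.sqrt(eigvals)
    return c / a, b / a

# Synthetic, flattened set of 11 satellites (units arbitrary, e.g. kpc).
rng = np.random.default_rng(1)
sats = rng.normal(size=(11, 3)) * np.array([250.0, 250.0, 50.0])
print(axis_ratios(sats))
```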

  15. Development of building energy asset rating using stock modelling in the USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Makhmalbaf, Atefe

    2016-01-29

    The US Building Energy Asset Score helps building stakeholders quickly gain insight into the efficiency of building systems (envelope, electrical and mechanical systems). A robust, easy-to-understand 10-point scoring system was developed to facilitate an unbiased comparison of similar building types across the country. The Asset Score does not rely on a database or specific building baselines to establish a rating. Rather, distributions of energy use intensity (EUI) for various building use types were constructed using Latin hypercube sampling and converted to a series of stepped linear scales to score buildings. A score is calculated based on the modelled source EUI after adjusting for climate. A web-based scoring tool, which incorporates an analytical engine and a simulation engine, was developed to standardize energy modelling and reduce implementation cost. This paper discusses the methodology used to perform several hundred thousand building simulation runs and develop the scoring scales.
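
    A minimal Latin hypercube sampling sketch (not the Asset Score code) illustrates the stratified sampling step mentioned above; the number of samples and parameters are arbitrary, and the unit-interval draws would still need to be mapped through each building parameter's inverse CDF.

```python
# Latin hypercube sampling: one draw per equal-probability stratum for each
# parameter, with columns shuffled so parameters are paired randomly.
import numpy as np

def latin_hypercube(n_samples, n_params, seed=None):
    rng = np.random.default_rng(seed)
    # One uniform draw inside each of n_samples strata, per parameter.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_params))) / n_samples
    # Independently permute each column to decouple the parameters.
    for j in range(n_params):
        rng.shuffle(u[:, j])
    return u  # values in (0, 1)

# Example: 1000 samples of 3 hypothetical envelope/system parameters.
samples = latin_hypercube(1000, 3, seed=42)
print(samples.shape, samples.min(), samples.max())
```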

  16. Ice Sheet and Sea Ice Observations from Unmanned Aircraft Systems

    NASA Astrophysics Data System (ADS)

    Crocker, R. I.; Maslanik, J. A.

    2011-12-01

    A suite of sensors has been assembled to map ice sheet and sea ice surface topography at fine resolution from small unmanned aircraft systems (UAS). This payload is optimized to provide coincident surface elevation and imagery data, and with its low cost and ease of reproduction, it has the potential to become a widely-distributed observational resource to complement polar manned-aircraft and satellite missions. To date, it has been deployed to map ice sheet elevations near Jakobshavn Isbræ in Greenland, and to measure sea ice freeboard and roughness in Fram Strait off the coast of Svalbard. Data collected during these campaigns have facilitated a detailed assessment of the system's surface elevation measurement accuracy, and provide a glimpse of the summer 2009 Fram Strait sea ice conditions. These findings are presented, along with a brief overview of our future Arctic UAS operations.

  17. A RESTful Service Oriented Architecture for Science Data Processing

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Tilmes, C.; Durbin, P.; Masuoka, E.

    2012-12-01

    The Atmospheric Composition Processing System is an implementation of a RESTful Service Oriented Architecture which handles incoming data from the Ozone Monitoring Instrument and the Ozone Mapping and Profiler Suite aboard the Aura and NPP spacecraft, respectively. The system has been built entirely from open source components, such as Postgres, Perl, and SQLite, and has leveraged the vast resources of the Comprehensive Perl Archive Network (CPAN). The modular design of the system also allows for many of the components to be easily released and integrated into the CPAN ecosystem and reused independently. At minimal expense, the CPAN infrastructure and community provide peer review, feedback and continuous testing in a wide variety of environments and architectures. A well defined set of conventions also facilitates dependency management, packaging, and distribution of code. Test driven development also provides a way to ensure stability despite a continuously changing base of dependencies.

  18. Landlord project multi-year program plan, fiscal year 1999, WBS 1.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dallas, M.D.

    The MYWP technical baseline describes the work to be accomplished by the Project and the technical standards which govern that work. The mission of Landlord Project is to provide more maintenance replacement of general infrastructure facilities and systems to facilitate the Hanford Site cleanup mission. Also, once an infrastructure facility or system is no longer needed the Landlord Project transitions the facility to final closure/removal through excess, salvage or demolition. Landlord Project activities will be performed in an environmentally sound, safe, economical, prudent, and reliable manner. The Landlord Project consists of the following facilities systems: steam, water, liquid sanitary waste, electrical distribution, telecommunication, sanitary landfill, emergency services, general purpose offices, general purpose shops, general purpose warehouses, environmental supports facilities, roads, railroad, and the site land. The objectives for general infrastructure support are reflected in two specific areas, (1) Core Infrastructure Maintenance, and (2) Infrastructure Risk Mitigation.

  19. Patient Summary and medicines reconciliation: application of the ISO/CEN EN 13606 standard in clinical practice.

    PubMed

    Farfán Sedano, Francisco J; Terrón Cuadrado, Marta; Castellanos Clemente, Yolanda; Serrano Balazote, Pablo; Moner Cano, David; Robles Viejo, Montserrat

    2011-01-01

    The comparison of the patient's current medication list with the medication being ordered when admitted to Hospital, identifying omissions, duplications, dosing errors, and potential interactions, constitutes the core process of medicines reconciliation. Access to the medication the patient is taking at home could be unfeasible as this information is frequently stored in various locations and in diverse proprietary formats. The lack of interoperability between those information systems, namely the Primary Care and the Specialized Electronic Health Records (EHRs), facilitates medication errors and endangers patient safety. Thus, the development of a Patient Summary that includes clinical data from different electronic systems will allow doctors access to relevant information enabling a safer and more efficient assistance. Such a collection of data from heterogeneous and distributed systems has been achieved in this Project through the construction of a federated view based on the ISO/CEN EN13606 Standard for architecture and communication of EHRs.

  20. MISSE in the Materials and Processes Technical Information System (MAPTIS )

    NASA Technical Reports Server (NTRS)

    Burns, DeWitt; Finckenor, Miria; Henrie, Ben

    2013-01-01

    Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather these data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are access control, browsing, searching, reports, and record comparison. The search capabilities will search within any searchable files, so data can still be retrieved even if the desired metadata have not been associated. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.

  1. Calibration of a Modified Andersen Bacterial Aerosol Sampler

    PubMed Central

    May, K. R.

    1964-01-01

    A study of the flow regime in the commercial Andersen sampler revealed defects in the sampling of the larger airborne particles. Satisfactory sampling was obtained by redesigning the hole pattern of the top stages and adding one more stage to extend the range of the instrument. A new, rational hole pattern is suggested for the lower stages. With both patterns a special colony-counting mask can be used to facilitate the assay. A calibration of the modified system is presented which enables particle size distribution curves to be drawn from the colony counts. PMID:14106938

  2. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    PubMed

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  3. Health care: economic impact of caring for geriatric patients.

    PubMed

    Rich, Preston B; Adams, Sasha D

    2015-02-01

    National health care expenditures constitute a continuously expanding component of the US economy. Health care resources are distributed unequally among the population, and geriatric patients are disproportionately represented. Characterizing this group of individuals that accounts for the largest percentage of US health spending may facilitate the introduction of targeted interventions in key high-impact areas. Changing demographics, an increasing incidence of chronic disease and progressive disability, rapid technological advances, and systemic market failures in the health care sector combine to drive cost. A multidisciplinary approach will become increasingly necessary to balance the delicate relationship between our constrained supply and increasing demand. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Software environment for implementing engineering applications on MIMD computers

    NASA Technical Reports Server (NTRS)

    Lopez, L. A.; Valimohamed, K. A.; Schiff, S.

    1990-01-01

    In this paper the concept for a software environment for developing engineering application systems for multiprocessor hardware (MIMD) is presented. The philosophy employed is to solve the largest problems possible in a reasonable amount of time, rather than solve existing problems faster. In the proposed environment most of the problems concerning parallel computation and handling of large distributed data spaces are hidden from the application program developer, thereby facilitating the development of large-scale software applications. Applications developed under the environment can be executed on a variety of MIMD hardware; it protects the application software from the effects of a rapidly changing MIMD hardware technology.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    A. Alfonsi; C. Rabiti; D. Mandelli

    The Reactor Analysis and Virtual control ENviroment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the Phase Space; perform both Monte-Carlo sampling of random distributed events and Dynamic Event Tree based analysis; and facilitate the input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  6. [Commentary on "group and organization: a dimension of collaboration of anthropology and epidemiology" by Song LM and Wang N].

    PubMed

    Liao, Su-Su; Zhang, Qing-Ning; Hou, Lei

    2012-10-01

    Epidemiology, as the study of the occurrence and distribution of diseases or health events in specified populations and the application of that study to control health problems, is not just a method to study determinants of diseases at the individual level through the analysis of mass data based on individuals. To achieve its aims in the control of health problems in specified populations, epidemiology should be public health-oriented to reduce incidence, prevalence and mortality, and should include the study of determinants at the population level. Interdisciplinarity and systems science will facilitate breakthroughs in improving the health of populations.

  7. Development of a high precision dosimetry system for the measurement of surface dose rate distribution for eye applicators.

    PubMed

    Eichmann, Marion; Flühs, Dirk; Spaan, Bernhard

    2009-10-01

    The therapeutic outcome of the therapy with ophthalmic applicators is highly dependent on the application of a sufficient dose to the tumor, whereas the dose applied to the surrounding tissue needs to be minimized. The goal for the newly developed apparatus described in this work is the determination of the individual applicator surface dose rate distribution with a high spatial resolution and a high precision in dose rate with respect to time and budget constraints especially important for clinical procedures. Inhomogeneities of the dose rate distribution can be detected and taken into consideration for the treatment planning. In order to achieve this, a dose rate profile as well as a surface profile of the applicator are measured and correlated with each other. An instrumental setup has been developed consisting of a plastic scintillator detector system and a newly designed apparatus for guiding the detector across the applicator surface at a constant small distance. It performs an angular movement of detector and applicator with high precision. The measurements of surface dose rate distributions discussed in this work demonstrate the successful operation of the measuring setup. Measuring the surface dose rate distribution with a small distance between applicator and detector and with a high density of measuring points results in a complete and gapless coverage of the applicator surface, being capable of distinguishing small sized spots with high activities. The dosimetrical accuracy of the measurements and its analysis is sufficient (uncertainty in the dose rate in terms of absorbed dose to water is <7%), especially when taking the surgical techniques in positioning of the applicator on the eyeball into account. The method developed so far allows a fully automated quality assurance of eye applicators even under clinical conditions. These measurements provide the basis for future calculation of a full 3D dose rate distribution, which then can be used as input for a refined clinical treatment planning system. The improved dose rate measurements will facilitate a clinical study, which could correlate the therapeutic outcome of a brachytherapy treatment with an applicator and its individual dose rate distribution.

  8. Development of a high precision dosimetry system for the measurement of surface dose rate distribution for eye applicators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichmann, Marion; Fluehs, Dirk; Spaan, Bernhard

    2009-10-15

    Purpose: The therapeutic outcome of the therapy with ophthalmic applicators is highly dependent on the application of a sufficient dose to the tumor, whereas the dose applied to the surrounding tissue needs to be minimized. The goal for the newly developed apparatus described in this work is the determination of the individual applicator surface dose rate distribution with a high spatial resolution and a high precision in dose rate with respect to time and budget constraints especially important for clinical procedures. Inhomogeneities of the dose rate distribution can be detected and taken into consideration for the treatment planning. Methods: In order to achieve this, a dose rate profile as well as a surface profile of the applicator are measured and correlated with each other. An instrumental setup has been developed consisting of a plastic scintillator detector system and a newly designed apparatus for guiding the detector across the applicator surface at a constant small distance. It performs an angular movement of detector and applicator with high precision. Results: The measurements of surface dose rate distributions discussed in this work demonstrate the successful operation of the measuring setup. Measuring the surface dose rate distribution with a small distance between applicator and detector and with a high density of measuring points results in a complete and gapless coverage of the applicator surface, being capable of distinguishing small sized spots with high activities. The dosimetrical accuracy of the measurements and its analysis is sufficient (uncertainty in the dose rate in terms of absorbed dose to water is <7%), especially when taking the surgical techniques in positioning of the applicator on the eyeball into account. Conclusions: The method developed so far allows a fully automated quality assurance of eye applicators even under clinical conditions. These measurements provide the basis for future calculation of a full 3D dose rate distribution, which then can be used as input for a refined clinical treatment planning system. The improved dose rate measurements will facilitate a clinical study, which could correlate the therapeutic outcome of a brachytherapy treatment with an applicator and its individual dose rate distribution.

  9. Malaria in Kakuma refugee camp, Turkana, Kenya: facilitation of Anopheles arabiensis vector populations by installed water distribution and catchment systems

    PubMed Central

    2011-01-01

    Background Malaria is a major health concern for displaced persons occupying refugee camps in sub-Saharan Africa, yet there is little information on the incidence of infection and nature of transmission in these settings. Kakuma Refugee Camp, located in a dry area of north-western Kenya, has hosted ca. 60,000 to 90,000 refugees since 1992, primarily from Sudan and Somalia. The purpose of this study was to investigate malaria prevalence and attack rate and sources of Anopheles vectors in Kakuma refugee camp, in 2005-2006, after a malaria epidemic was observed by staff at camp clinics. Methods Malaria prevalence and attack rate was estimated from cases of fever presenting to camp clinics and the hospital in August 2005, using rapid diagnostic tests and microscopy of blood smears. Larval habitats of vectors were sampled and mapped. Houses were sampled for adult vectors using the pyrethrum knockdown spray method, and mapped. Vectors were identified to species level and their infection with Plasmodium falciparum determined. Results Prevalence of febrile illness with P. falciparum was highest among the 5 to 17 year olds (62.4%) while malaria attack rate was highest among the two to 4 year olds (5.2/1,000/day). Infected individuals were spatially concentrated in three of the 11 residential zones of the camp. The indoor densities of Anopheles arabiensis, the sole malaria vector, were similar during the wet and dry seasons, but were distributed in an aggregated fashion and predominantly in the same zones where malaria attack rates were high. Larval habitats and larval populations were also concentrated in these zones. Larval habitats were man-made pits of water associated with tap-stands installed as the water delivery system to residents with year round availability in the camp. Three percent of A. arabiensis adult females were infected with P. falciparum sporozoites in the rainy season. Conclusions Malaria in Kakuma refugee camp was due mainly to infection with P. falciparum and showed a hyperendemic age-prevalence profile, in an area with otherwise low risk of malaria given prevailing climate. Transmission was sustained by A. arabiensis, whose populations were facilitated by installation of man-made water distribution and catchment systems. PMID:21639926

  10. A multiprojection noncontact fluorescence tomography setup for imaging arbitrary geometries

    NASA Astrophysics Data System (ADS)

    Meyer, H.; Garofalakis, A.; Zacharakis, G.; Economou, E. N.; Mamalaki, C.; Kioussis, D.; Ntziachristos, V.; Ripoll, J.

    2005-04-01

    Optical imaging and tomography in tissues can facilitate the quantitative study of several important chromophores and fluorophores in vivo. Due to this fact, there has been great interest in developing imaging systems offering quantitative information on the location and concentration of chromophores and fluorescent probes. However, most imaging systems currently used in research make use of fiber technology for delivery and detection, which restricts the size of the photon collecting arrays, leading to insufficient spatial sampling and field of view. To enable large data sets and full 360° angular measurements, we developed a novel imaging system that enables 3D imaging of fluorescent signals in bodies of arbitrary shapes in a non-contact geometry in combination with a 3D surface reconstruction algorithm. The system consists of a rotating subject holder and a lens coupled Charge Coupled Device (CCD) camera in combination with a fiber coupled laser scanning device. An Argon ion laser is used as the source and different filters are used for the detection of various fluorophores or fluorescing proteins. With this new setup, a large measurement dataset can be acquired, while the use of inversion models gives a high capacity for quantitative 3D reconstruction of fluorochrome distributions as well as high spatial resolution. The system is currently being tested in the observation of the distribution of Green Fluorescent Protein (GFP) expressing T-lymphocytes in order to study the function of the immune system in a murine model.

  11. Pragmatic open space box utilization: asteroid survey model using distributed objects management based articulation (DOMBA)

    NASA Astrophysics Data System (ADS)

    Mohammad, Atif Farid; Straub, Jeremy

    2015-05-01

    A multi-craft asteroid survey has significant data synchronization needs. Limited communication speeds drive exacting performance requirements. Tables have been used in Relational Databases, which are structured; however, DOMBA (Distributed Objects Management Based Articulation) deals with data in terms of collections. With this, no read/write roadblocks to the data exist. A master/slave architecture is created by utilizing the Gossip protocol. This facilitates expanding a mission that makes an important discovery via the launch of another spacecraft. The Open Space Box Framework facilitates the foregoing while also providing a virtual caching layer to make sure that continuously accessed data is available in memory and that, upon closing the data file, recharging is applied to the data.
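
    A conceptual sketch of a push-style gossip round of the kind alluded to above is shown below; the node and object model is an assumption for illustration and is not the DOMBA implementation.

```python
# Push-style gossip synchronization sketch: each node periodically pushes its
# newest object versions to a random peer, so collections converge without a
# central read/write lock. Illustrative only.
import random

class Node:
    def __init__(self, name):
        self.name = name
        self.store = {}  # object_id -> (version, value)

    def gossip_to(self, peer):
        """Push entries the peer is missing or holds at an older version."""
        for oid, (ver, val) in self.store.items():
            peer_ver = peer.store.get(oid, (-1, None))[0]
            if ver > peer_ver:
                peer.store[oid] = (ver, val)

def gossip_round(nodes):
    for node in nodes:
        peer = random.choice([n for n in nodes if n is not node])
        node.gossip_to(peer)

# Example: survey data written on one spacecraft spreads to the others.
nodes = [Node(f"craft{i}") for i in range(3)]
nodes[0].store["asteroid_042"] = (1, "spectral scan")
for _ in range(5):
    gossip_round(nodes)
print({n.name: list(n.store) for n in nodes})
```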

  12. Transcriptional Modulation of Tumor-Associated Macrophages to Facilitate Prostate Cancer Immunotherapy

    DTIC Science & Technology

    2017-09-01

    Award number: W81XWH-16-1-0334. Title: Transcriptional Modulation of Tumor-Associated Macrophages to Facilitate Prostate Cancer Immunotherapy. Type of report: Annual; dates covered: 1 Sep 2016 to 31 Aug 2017. Prepared for: U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012.

  13. Inter-species competition-facilitation in stochastic riparian vegetation dynamics.

    PubMed

    Tealdi, Stefano; Camporeale, Carlo; Ridolfi, Luca

    2013-02-07

    Riparian vegetation is a highly dynamic community that lives on river banks and which depends to a great extent on the fluvial hydrology. The stochasticity of the discharge and erosion/deposition processes in fact play a key role in determining the distribution of vegetation along a riparian transect. These abiotic processes interact with biotic competition/facilitation mechanisms, such as plant competition for light, water, and nutrients. In this work, we focus on the dynamics of plants characterized by three components: (1) stochastic forcing due to river discharges, (2) competition for resources, and (3) inter-species facilitation due to the interplay between vegetation and fluid dynamics processes. A minimalist stochastic bio-hydrological model is proposed for the dynamics of the biomass of two vegetation species: one species is assumed dominant and slow-growing, the other is subdominant, but fast-growing. The stochastic model is solved analytically and the probability density function of the plant biomasses is obtained as a function of both the hydrologic and biologic parameters. The impact of the competition/facilitation processes on the distribution of vegetation species along the riparian transect is investigated and remarkable effects are observed. Finally, a good qualitative agreement is found between the model results and field data. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Internest food sharing within wood ant colonies: resource redistribution behavior in a complex system

    PubMed Central

    Robinson, Elva J.H.

    2016-01-01

    Resource sharing is an important cooperative behavior in many animals. Sharing resources is particularly important in social insect societies, as division of labor often results in most individuals including, importantly, the reproductives, relying on other members of the colony to provide resources. Sharing resources between individuals is therefore fundamental to the success of social insects. Resource sharing is complicated if a colony inhabits several spatially separated nests, a nesting strategy common in many ant species. Resources must be shared not only between individuals in a single nest but also between nests. We investigated the behaviors facilitating resource redistribution between nests in a dispersed-nesting population of wood ant Formica lugubris. We marked ants, in the field, as they transported resources along the trails between nests of a colony, to investigate how the behavior of individual workers relates to colony-level resource exchange. We found that workers from a particular nest “forage” to other nests in the colony, treating them as food sources. Workers treating other nests as food sources means that simple, pre-existing foraging behaviors are used to move resources through a distributed system. It may be that this simple behavioral mechanism facilitates the evolution of this complex life-history strategy. PMID:27004016

  15. High-resolution remote sensing of water quality in the San Francisco Bay-Delta Estuary

    USGS Publications Warehouse

    Fichot, Cédric G.; Downing, Bryan D.; Bergamaschi, Brian; Windham-Myers, Lisamarie; Marvin-DiPasquale, Mark C.; Thompson, David R.; Gierach, Michelle M.

    2015-01-01

    The San Francisco Bay–Delta Estuary watershed is a major source of freshwater for California and a profoundly human-impacted environment. The water quality monitoring that is critical to the management of this important water resource and ecosystem relies primarily on a system of fixed water-quality monitoring stations, but the limited spatial coverage often hinders understanding. Here, we show how the latest technology in visible/near-infrared imaging spectroscopy can facilitate water quality monitoring in this highly dynamic and heterogeneous system by enabling simultaneous depictions of several water quality indicators at very high spatial resolution. The airborne portable remote imaging spectrometer (PRISM) was used to derive high-spatial-resolution (2.6 × 2.6 m) distributions of turbidity, and dissolved organic carbon (DOC) and chlorophyll-a concentrations in a wetland-influenced region of this estuary. A filter-passing methylmercury vs DOC relationship was also developed using in situ samples and enabled the high-spatial-resolution depiction of surface methylmercury concentrations in this area. The results illustrate how high-resolution imaging spectroscopy can inform management and policy development in important inland and estuarine water bodies by facilitating the detection of point- and nonpoint-source pollution, and by providing data to help assess the complex impacts of wetland restoration and climate change on water quality and ecosystem productivity.

  16. Protist-facilitated transport of soil bacteria in an artificial soil micromodel

    NASA Astrophysics Data System (ADS)

    Rubinstein, R. L.; Cousens, V.; Gage, D. J.; Shor, L. M.

    2013-12-01

    Soil bacteria within the rhizosphere benefit plants by protecting roots from pathogens, producing growth factors, and improving nutrient availability. These effects can greatly improve overall plant health and increase crop yield, but as roots grow out from the tips they quickly outpace their bacterial partners. Some soil bacteria are motile and can chemotact towards root tips, but bacterial mobility in unsaturated soils is limited to interconnected hydrated pores. Mobility is further reduced by the tendency of soil bacteria to form biofilms. The introduction of protists to the rhizosphere has been shown to benefit plants, purportedly by selective grazing on harmful bacteria or release of nutrients otherwise sequestered in bacteria. We propose that an additional benefit to the presence of protists is the facilitated transport of beneficial bacteria along root systems. Using microfluidic devices designed to imitate narrow, fluid-filled channels in soil, we have shown that the distribution of bacteria through micro-channels is accelerated in the presence of protists. Furthermore, we have observed that even with predation effects, the bacteria remain viable and continue to reproduce for the duration of our experiments. These results expand upon our understanding of complex bio-physical interactions in the rhizosphere system, and may have important implications for agricultural practices.

  17. A markup language for electrocardiogram data acquisition and analysis (ecgML)

    PubMed Central

    Wang, Haiying; Azuaje, Francisco; Jung, Benjamin; Black, Norman

    2003-01-01

    Background The storage and distribution of electrocardiogram data is based on different formats. There is a need to promote the development of standards for their exchange and analysis. Such models should be platform-, system- and application-independent, flexible and open to every member of the scientific community. Methods A minimum set of information for the representation and storage of electrocardiogram signals has been synthesised from existing recommendations. This specification is encoded into an XML vocabulary. The model may aid in a flexible exchange and analysis of electrocardiogram information. Results Based on the advantages of XML technologies, ecgML has the ability to present a system-, application- and format-independent solution for the representation and exchange of electrocardiogram data. The distinction between the proposal developed by the U.S. Food and Drug Administration and the ecgML model is given. A series of tools, which aim to facilitate ecgML-based applications, are presented. Conclusions The models proposed here can facilitate the generation of a data format, which opens ways for better and clearer interpretation by both humans and machines. Its structured and transparent organisation will allow researchers to expand and test its capabilities in different application domains. The specification and programs for this protocol are publicly available. PMID:12735790

  18. Local density variation of gold nanoparticles in aquatic environments

    NASA Astrophysics Data System (ADS)

    Hosseinzadeh, F.; Shirazian, F.; Shahsavari, R.; Khoei, A. R.

    2016-10-01

    Gold (Au) nanoparticles are widely used in diagnosing cancer, imaging, and identification of therapeutic methods due to their particular quantum characteristics. This research presents different types of aqueous models and potentials used in TIP3P, to study the effect of the particle size and density of Au clusters in aquatic environments; so it can be useful to facilitate future investigation of the interaction of proteins with Au nanoparticles. The EAM potential is used to model the structure of gold clusters. It is observed that in the systems with identical gold/water density and different cluster radii, gold particles are distributed in aqueous environment almost identically. Thus, Au particles have identical local densities, and the root mean square displacement (RMSD) increases with a constant slope. However in systems with constant cluster radii and different gold/water densities, Au particle dispersion increases with density; as a result, the local density decreases and the RMSD increases with a larger slope. In such systems, the larger densities result in more blunted second peaks in gold-gold radial distribution functions, owing to more intermixing of the clusters and less FCC crystalline features at longer range, a mechanism that is mediated by the competing effects of gold-water and gold-gold interactions.

  19. A distributed computational search strategy for the identification of diagnostics targets: application to finding aptamer targets for methicillin-resistant staphylococci.

    PubMed

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-30

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.
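
    The core filtering step of such a pipeline can be sketched as a set operation over fixed-length protein fragments: keep fragments shared by all target strains and absent from all off-target strains. The fragment length and toy sequences below are assumptions, and a real pipeline would add genome parsing, clustering, and distributed execution.

```python
# Candidate diagnostic-target fragments: present in every target strain,
# absent from every off-target strain. Toy data; illustrative only.
def fragments(seq, k=8):
    """All k-length substrings (peptide fragments) of a protein sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def candidate_targets(target_proteomes, offtarget_proteomes, k=8):
    shared = set.intersection(*(fragments(p, k) for p in target_proteomes))
    excluded = set.union(*(fragments(p, k) for p in offtarget_proteomes))
    return shared - excluded

targets = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
           "MKTAYIAKQRQISFVKSHFSRQAEERLGLIEVQ"]
offtargets = ["MSTNPKPQRKTKRNTNRRPQDVKFPGG"]
print(sorted(candidate_targets(targets, offtargets))[:5])
```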

  20. Collaborative environments for capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2005-05-01

    Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading edge program in developing distributed collaborative technologies targeted at the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.

  1. A web-based institutional DICOM distribution system with the integration of the Clinical Trial Processor (CTP).

    PubMed

    Aryanto, K Y E; Broekema, A; Langenhuysen, R G A; Oudkerk, M; van Ooijen, P M A

    2015-05-01

    To develop and test a fast and easy rule-based web environment with optional de-identification of imaging data to facilitate data distribution within a hospital environment. A web interface was built using Hypertext Preprocessor (PHP), an open source scripting language for web development, and Java, with SQL Server to handle the database. The system allows for the selection of patient data and for de-identifying these when necessary. Using the services provided by the RSNA Clinical Trial Processor (CTP), the selected images were pushed to the appropriate services using a protocol based on the module created for the associated task. Five pipelines, each performing a different task, were set up in the server. Over a 75-month period, more than 2,000,000 images were transferred and de-identified in a proper manner, while 20,000,000 images were moved from one node to another without de-identification. While maintaining a high level of security and stability, the proposed system is easy to set up, integrates well with our clinical and research practice, and provides a fast and accurate vendor-neutral process of transferring, de-identifying, and storing DICOM images. Its ability to run different de-identification processes in parallel pipelines is a major advantage in both clinical and research settings.
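
    The de-identification step itself is handled by the RSNA CTP anonymizer in the system described; as a hedged illustration of what such a step does, the sketch below uses pydicom to blank a few direct identifiers. The file paths, replacement values, and the specific choice of tags are assumptions.

```python
# Minimal DICOM de-identification sketch using pydicom (illustrative; the
# described system relies on the CTP anonymizer rather than custom code).
import pydicom

def deidentify(in_path, out_path, new_id="RESEARCH_0001"):
    ds = pydicom.dcmread(in_path)
    # Replace or blank direct identifiers before distribution.
    ds.PatientName = "ANONYMIZED"
    ds.PatientID = new_id
    for keyword in ("PatientBirthDate", "PatientAddress", "OtherPatientIDs"):
        if keyword in ds:
            setattr(ds, keyword, "")
    ds.remove_private_tags()
    ds.save_as(out_path)

if __name__ == "__main__":
    # Hypothetical input/output paths.
    deidentify("study/IM0001.dcm", "export/IM0001_anon.dcm")
```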

  2. A distributed computational search strategy for the identification of diagnostics targets: Application to finding aptamer targets for methicillin-resistant staphylococci.

    PubMed

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-01

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.

  3. The Gemini Recipe System: A Dynamic Workflow for Automated Data Reduction

    NASA Astrophysics Data System (ADS)

    Labrie, K.; Hirst, P.; Allen, C.

    2011-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. Building on concepts that originated in ORAC-DR, a data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps called Primitives. The Primitives are written in Python and can be launched from the PyRAF user interface by users wishing for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines. All observatory or instrument specific definitions are isolated from the core of the AstroData system and distributed in external configuration packages that define a lexicon including classifications, uniform metadata elements, and transformations.
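
    The recipe-as-a-sequence-of-primitives design can be illustrated conceptually as below; this is an assumption-laden sketch of the general pattern, not the Gemini Recipe System API, and the primitives and dataset structure are invented for illustration.

```python
# Conceptual "recipe of primitives" pattern: each primitive transforms a
# dataset and the recipe simply chains them in order. Illustrative only.
def subtract_bias(data):
    data["pixels"] = data["pixels"] - data.get("bias_level", 0.0)
    return data

def flat_correct(data):
    data["pixels"] = data["pixels"] / data.get("flat", 1.0)
    return data

RECIPE_REDUCE = [subtract_bias, flat_correct]  # a "recipe" of primitives

def run_recipe(recipe, data):
    for primitive in recipe:
        data = primitive(data)
    return data

print(run_recipe(RECIPE_REDUCE, {"pixels": 1200.0, "bias_level": 200.0, "flat": 0.5}))
```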

  4. Analysis of the Space Propulsion System Problem Using RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diego Mandelli; Curtis Smith; Cristian Rabiti

    This paper presents the solution of the space propulsion problem using a PRA code currently under development at Idaho National Laboratory (INL). RAVEN (Reactor Analysis and Virtual control ENvironment) is a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities. It is designed to derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), and to perform both Monte-Carlo sampling of randomly distributed events and Event Tree based analysis. To facilitate input/output handling, a Graphical User Interface (GUI) and a post-processing data-mining module are available. RAVEN can also interface with several numerical codes, such as RELAP5 and RELAP-7, and with ad hoc system simulators. For the space propulsion system problem, an ad hoc simulator was developed in Python and interfaced to RAVEN. This simulator fully models both deterministic behaviors (e.g., system dynamics and interactions between system components) and stochastic behaviors (i.e., failures of components/systems such as distribution lines and thrusters). Stochastic analysis is performed using random-sampling-based methodologies (i.e., Monte-Carlo). This analysis is carried out both to determine the reliability of the space propulsion system and to propagate the uncertainties associated with a specific set of parameters. As indicated in the scope of the benchmark problem, the results generated by the stochastic analysis are used to derive risk-informed insights, such as the conditions under which different strategies can be followed.
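    The sketch below illustrates the kind of random-sampling reliability estimate described above: component failures are drawn at random and the fraction of successful missions is counted. The system layout (two distribution lines, each feeding two thrusters) and the failure probabilities are invented for illustration and are not taken from the benchmark problem.

```python
# Toy Monte-Carlo reliability estimate in the spirit of the analysis described
# above: sample random component failures and count mission successes.
# The system layout and failure probabilities below are made up for illustration.
import random

P_FAIL_LINE = 0.01      # per-mission failure probability of a distribution line
P_FAIL_THRUSTER = 0.05  # per-mission failure probability of a thruster

def mission_succeeds(rng):
    # Assume two redundant distribution lines, each feeding two thrusters;
    # the mission needs at least two working thrusters overall.
    working = 0
    for _ in range(2):  # distribution lines
        if rng.random() < P_FAIL_LINE:
            continue  # both thrusters on this line are lost
        for _ in range(2):  # thrusters on the surviving line
            if rng.random() >= P_FAIL_THRUSTER:
                working += 1
    return working >= 2

def estimate_reliability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    successes = sum(mission_succeeds(rng) for _ in range(n_samples))
    return successes / n_samples

print(f"estimated mission reliability: {estimate_reliability():.4f}")
```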

  5. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding their services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional standalone systems incapable of processing these data. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel with large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. We also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
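    As a minimal illustration of the MapReduce pattern used here, the sketch below implements a Hadoop-Streaming-style mapper and reducer that count actions per hospital information system user, with the map-sort-reduce cycle simulated locally. The tab-separated log format is an assumption for illustration, not the format of the systems studied.

```python
# Minimal Hadoop-Streaming-style MapReduce sketch: count actions per HIS user.
# The log format (tab-separated "user_id<TAB>action<TAB>timestamp") is assumed
# for illustration; a real deployment would match the hospital systems' logs.
from itertools import groupby

def mapper(lines):
    """Emit (user_id, 1) for every log line."""
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 2:
            yield fields[0], 1

def reducer(pairs):
    """Sum counts for each user_id; pairs must arrive sorted by key."""
    for user_id, group in groupby(pairs, key=lambda kv: kv[0]):
        yield user_id, sum(count for _, count in group)

if __name__ == "__main__":
    # Local simulation of the map -> sort -> reduce cycle Hadoop provides.
    sample = ["u01\tlogin\t2015-01-01", "u02\tquery\t2015-01-01", "u01\torder\t2015-01-02"]
    mapped = sorted(mapper(sample))
    for user, total in reducer(mapped):
        print(user, total)
```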

  6. Distributed morality in an information society.

    PubMed

    Floridi, Luciano

    2013-09-01

    The phenomenon of distributed knowledge is well-known in epistemic logic. In this paper, a similar phenomenon in ethics, somewhat neglected so far, is investigated, namely distributed morality. The article explains the nature of distributed morality, as a feature of moral agency, and explores the implications of its occurrence in advanced information societies. In the course of the analysis, the concept of infraethics is introduced, in order to refer to the ensemble of moral enablers, which, although morally neutral per se, can significantly facilitate or hinder both positive and negative moral behaviours.

  7. 75 FR 59771 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Order Approving the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-28

    ... participating in distributions of listed and unlisted securities and is designed to ensure that FINRA receives pertinent distribution-related information from its members in a timely fashion to facilitate its Regulation... Section 15A(b)(6) of the Act,\\6\\ which requires, among other things, that FINRA rules be designed to...

  8. Distributed Leadership as a Factor in and Outcome of Teacher Action Learning

    ERIC Educational Resources Information Center

    Dinham, Stephen; Aubusson, Peter; Brady, Laurie

    2008-01-01

    This paper reports an evaluation of Quality Teaching Action Learning (QTAL) projects conducted at New South Wales (NSW), Australia public (state) primary and secondary schools and explores how distributed leadership facilitated and was an outcome of the QTAL projects. The evaluation encompassed all 50 projects at 82 NSW public schools, and nine of…

  9. 78 FR 20983 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ... the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the principal... in a manner to facilitate its distribution via Web sites or mobile devices. \\4\\ See Securities... broadcasters, Web site and mobile device service providers, and others to distribute this data product to their...

  10. New consumer load prototype for electricity theft monitoring

    NASA Astrophysics Data System (ADS)

    Abdullateef, A. I.; Salami, M. J. E.; Musse, M. A.; Onasanya, M. A.; Alebiosu, M. I.

    2013-12-01

    Illegal connection, i.e. direct connection to the distribution feeder, and tampering with energy meters have been identified as the major means by which nefarious consumers steal electricity on low-voltage distribution systems. This theft contributes enormously to the revenue losses incurred by power and energy providers. A Consumer Load Prototype (CLP) is constructed and proposed in this study in order to understand the patterns through which the stealing process is effected in real-life power consumption. The consumer load prototype facilitates real-time simulation and data collection for the monitoring and detection of electricity theft on low-voltage distribution systems. The prototype involves the electrical design and construction of consumer loads in accordance with the standard regulations of the Institution of Engineering and Technology (IET), formerly known as the Institution of Electrical Engineers (IEE). A LabVIEW platform was used for data acquisition, and the acquired data give a good representation of the connected loads. The prototype will assist researchers and power utilities, who currently face challenges in obtaining real-time data for the study and monitoring of electricity theft. The simulation of electricity theft in real time is one of the contributions of this prototype. The power and energy community, including students, will also appreciate the practical approach the prototype provides for obtaining real-time information, rather than the software-only simulation that has hitherto been used in the study of electricity theft.
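    One simple analysis that real-time feeder and meter data from such a prototype makes possible is an energy-balance check: compare the energy supplied at the feeder with the sum of the consumer meter readings over the same interval and flag intervals with an unexplained gap. The sketch below is illustrative only; the tolerance value and data are invented, and the study itself does not prescribe this particular algorithm.

```python
# Sketch of one simple check such a prototype enables: compare the energy
# measured at the feeder with the sum of the consumer meters over the same
# interval and flag intervals with an unexplained gap. Threshold and data
# are illustrative, not values from the study.

TOLERANCE_KWH = 0.5  # allowance for technical (non-theft) losses per interval

def flag_theft_intervals(feeder_kwh, meter_kwh_by_consumer):
    """feeder_kwh: list per interval; meter_kwh_by_consumer: dict of lists."""
    flags = []
    for i, supplied in enumerate(feeder_kwh):
        billed = sum(readings[i] for readings in meter_kwh_by_consumer.values())
        gap = supplied - billed
        flags.append((i, round(gap, 3), gap > TOLERANCE_KWH))
    return flags

feeder = [10.2, 10.1, 12.9]
meters = {"house_a": [5.0, 5.0, 5.1], "house_b": [5.0, 5.0, 5.0]}
for interval, gap, suspicious in flag_theft_intervals(feeder, meters):
    print(f"interval {interval}: gap {gap} kWh, suspicious={suspicious}")
```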

  11. Determination of Organic Partitioning Coefficients in Water-Supercritical CO2 Systems by Simultaneous in Situ UV and Near-Infrared Spectroscopies.

    PubMed

    Bryce, David A; Shao, Hongbo; Cantrell, Kirk J; Thompson, Christopher J

    2016-06-07

    CO2 injected into depleted oil or gas reservoirs for long-term storage has the potential to mobilize organic compounds and distribute them between sediments and reservoir brines. Understanding this process is important when considering health and environmental risks, but little quantitative data currently exists on the partitioning of organics between supercritical CO2 and water. In this work, a high-pressure, in situ measurement capability was developed to assess the distribution of organics between CO2 and water at conditions relevant to deep underground storage of CO2. The apparatus consists of a titanium reactor with quartz windows, near-infrared and UV spectroscopic detectors, and switching valves that facilitate quantitative injection of organic reagents into the pressurized reactor. To demonstrate the utility of the system, partitioning coefficients were determined for benzene in water/supercritical CO2 over the range 35-65 °C and approximately 25-150 bar. Density changes in the CO2 phase with increasing pressure were shown to have dramatic impacts on benzene's partitioning behavior. Our partitioning coefficients were approximately 5-15 times lower than values previously determined by ex situ techniques that are prone to sampling losses. The in situ methodology reported here could be applied to quantify the distribution behavior of a wide range of organic compounds that may be present in geologic CO2 storage scenarios.
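    For readers unfamiliar with the quantity being measured, the sketch below shows how a partitioning coefficient can be formed from phase-specific concentrations obtained via the Beer-Lambert law (A = ε·l·c). The molar absorptivities, path lengths, and absorbances are illustrative placeholders, not values from the study or its calibration procedure.

```python
# Minimal sketch of how an in situ partitioning coefficient can be derived
# from absorbance measurements via the Beer-Lambert law (A = epsilon * l * c).
# Molar absorptivities, path lengths, and absorbances below are illustrative
# placeholders, not values from the study.

def concentration(absorbance, epsilon, path_cm):
    """Beer-Lambert: c = A / (epsilon * l), in mol/L."""
    return absorbance / (epsilon * path_cm)

def partition_coefficient(a_co2, eps_co2, l_co2, a_water, eps_water, l_water):
    """K = [benzene]_CO2 / [benzene]_water from the two phase-specific spectra."""
    c_co2 = concentration(a_co2, eps_co2, l_co2)
    c_water = concentration(a_water, eps_water, l_water)
    return c_co2 / c_water

K = partition_coefficient(a_co2=0.42, eps_co2=180.0, l_co2=1.0,
                          a_water=0.05, eps_water=200.0, l_water=1.0)
print(f"benzene partitioning coefficient (CO2/water): {K:.2f}")
```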

  12. Memory guidance in distractor suppression is governed by the availability of cognitive control.

    PubMed

    Wen, Wen; Hou, Yin; Li, Sheng

    2018-03-26

    Information stored in the memory systems can affect visual search. Previous studies have shown that holding the to-be-ignored features of distractors in working memory (WM) could accelerate target selection. However, such a facilitation effect was only observed when the cued to-be-ignored features remained unchanged within an experimental block (i.e., the fixed cue condition). No search benefit was obtained if the to-be-ignored features varied from trial to trial (i.e., the varied cue condition). In the present study, we conducted three behavioral experiments to investigate whether the WM and long-term memory (LTM) representations of the to-be-ignored features could facilitate visual search in the fixed cue (Experiment 1) and varied cue (Experiments 2 and 3) conditions. Given the importance of the processing time of cognitive control in distractor suppression, we divided visual search trials into five quintiles based on their reaction times (RTs) and examined the temporal characteristics of the suppression effect. Results showed that both the WM and LTM representations of the to-be-ignored features could facilitate distractor suppression in the fixed cue condition, and the facilitation effects were evident across the quintiles of the RT distribution. However, in the varied cue condition, the RT benefits of the WM-matched distractors occurred only in the trials with the longest RTs, whereas no advantage of the LTM-matched distractors was observed. These results suggest that effective WM-guided distractor suppression depends on the availability of cognitive control, and that LTM-guided suppression occurs only if sufficient WM resources are accessible for LTM reactivation.
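    The quintile analysis mentioned above can be sketched as follows: sort each condition's reaction times, split them into five equal-sized bins, and compare bin means across conditions. The simulated data and condition names below are placeholders, not the experimental data.

```python
# Sketch of the quintile-style analysis described above: split each condition's
# reaction times into five RT-ordered bins and compare the bin means across
# conditions. Data below are simulated placeholders.
import random
import statistics

def quintile_means(rts):
    """Sort RTs and return the mean of each of the five quintile bins."""
    ordered = sorted(rts)
    n = len(ordered)
    bounds = [round(i * n / 5) for i in range(6)]
    return [statistics.mean(ordered[bounds[i]:bounds[i + 1]]) for i in range(5)]

rng = random.Random(1)
wm_matched = [rng.gauss(620, 80) for _ in range(200)]   # simulated RTs (ms)
unrelated  = [rng.gauss(650, 80) for _ in range(200)]

for q, (m_wm, m_un) in enumerate(zip(quintile_means(wm_matched),
                                     quintile_means(unrelated)), start=1):
    print(f"quintile {q}: WM-matched {m_wm:.0f} ms vs unrelated {m_un:.0f} ms")
```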

  13. Use of Open Architecture Middleware for Autonomous Platforms

    NASA Astrophysics Data System (ADS)

    Naranjo, Hector; Diez, Sergio; Ferrero, Francisco

    2011-08-01

    Network Enabled Capabilities (NEC) is the vision for next-generation systems in the defence domain formulated by governments, the European Defence Agency (EDA) and the North Atlantic Treaty Organization (NATO). It involves the federation of military information systems, rather than just a simple interconnection, to provide each user with the "right information, right place, right time - and not too much". It defines openness, standardization and flexibility principles in military systems that are likewise applicable to civilian space applications. This paper provides the conclusions drawn from the "Architecture for Embarked Middleware" (EMWARE) study, funded by the European Defence Agency (EDA). The aim of the EMWARE project was to provide the information and understanding needed to facilitate informed decisions regarding the specification and implementation of Open Architecture Middleware in future distributed systems, linking it with the NEC goal. The EMWARE project included the definition of four business cases, each devoted to a different field of application (Unmanned Aerial Vehicles, Helicopters, Unmanned Ground Vehicles and the Satellite Ground Segment).

  14. A Rocket Engine Design Expert System

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1989-01-01

    The overall structure and capabilities of an expert system designed to evaluate rocket engine performance are described. The expert system incorporates a JANNAF standard reference computer code to determine rocket engine performance and a state-of-the-art finite element computer code to calculate the interactions between propellant injection, energy release in the combustion chamber, and regenerative cooling heat transfer. Rule-of-thumb heuristics were incorporated for the H2-O2 coaxial injector design, including a minimum gap size constraint on the total number of injector elements. One-dimensional equilibrium chemistry was used in the energy release analysis of the combustion chamber. A 3-D conduction and/or 1-D advection analysis is used to predict heat transfer and coolant channel wall temperature distributions, in addition to coolant temperature and pressure drop. Values describing the geometry and state properties of the entire system are input directly from the computer keyboard. Graphical display of all output results from the computer code analyses is facilitated by menu selection of up to five dependent variables per plot.
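    The kind of quick engineering check such rule-of-thumb heuristics encode can be illustrated with a one-dimensional coolant energy balance, ΔT = Q / (ṁ·cp), followed by a margin test against an assumed wall temperature limit. All numbers and the margin rule below are illustrative assumptions, not values or rules from the described expert system.

```python
# Back-of-the-envelope 1-D coolant energy balance of the kind such heuristics
# can check: the bulk temperature rise along a regenerative cooling channel is
# dT = Q / (m_dot * cp). All numbers are illustrative, not from the cited system.

def coolant_temperature_rise(heat_load_w, mass_flow_kg_s, cp_j_per_kg_k):
    """Bulk coolant temperature rise for a given absorbed heat load."""
    return heat_load_w / (mass_flow_kg_s * cp_j_per_kg_k)

def passes_wall_margin(t_inlet_k, dt_k, t_wall_limit_k, margin_k=50.0):
    """Rule-of-thumb check: keep the coolant outlet well below the wall limit."""
    return (t_inlet_k + dt_k) < (t_wall_limit_k - margin_k)

dT = coolant_temperature_rise(heat_load_w=2.0e6, mass_flow_kg_s=8.0,
                              cp_j_per_kg_k=14_300.0)  # hydrogen-like coolant
print(f"coolant temperature rise: {dT:.1f} K, "
      f"margin OK: {passes_wall_margin(40.0, dT, 800.0)}")
```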

  15. Masked Morphological Priming in German-Speaking Adults and Children: Evidence from Response Time Distributions

    PubMed Central

    Hasenäcker, Jana; Beyersmann, Elisabeth; Schroeder, Sascha

    2016-01-01

    In this study, we looked at masked morphological priming effects in German children and adults beyond mean response times by taking into account response time distributions. We conducted an experiment comparing suffixed word primes (kleidchen-KLEID), suffixed nonword primes (kleidtum-KLEID), nonsuffixed nonword primes (kleidekt-KLEID), and unrelated controls (träumerei-KLEID). The pattern of priming in adults showed facilitation from suffixed words, suffixed nonwords, and nonsuffixed nonwords relative to unrelated controls, and from both suffixed conditions relative to nonsuffixed nonwords, thus providing evidence for morpho-orthographic and embedded stem priming. Children also showed facilitation from real suffixed words, suffixed nonwords, and nonsuffixed nonwords compared to unrelated words, but no difference between the suffixed and nonsuffixed conditions, thus suggesting that German elementary school children do not make use of morpho-orthographic segmentation. Interestingly, for all priming effects, a shift of the response time distribution was observed. Consequences for theories of morphological processing are discussed. PMID:27445899

  16. Superresolution imaging of viral protein trafficking

    PubMed Central

    Salka, Kyle; Bhuvanendran, Shivaprasad; Yang, David

    2015-01-01

    The endoplasmic reticulum (ER) membrane is closely apposed to the outer mitochondrial membrane (OMM), which facilitates communication between these organelles. These contacts, known as mitochondria-associated membranes (MAM), facilitate calcium signaling, lipid transfer, as well as antiviral and stress responses. How cellular proteins traffic to the MAM, are distributed therein, and interact with ER and mitochondrial proteins are subjects of great interest. The human cytomegalovirus UL37 exon 1 protein, or viral mitochondria-localized inhibitor of apoptosis (vMIA), is crucial for viral growth. Upon synthesis at the ER, vMIA traffics to the MAM and OMM, where it reprograms the organization and function of these compartments. vMIA significantly changes the abundance of cellular proteins at the MAM and OMM, including proteins that regulate calcium homeostasis and cell death. Through the use of superresolution imaging, we have shown that vMIA is distributed at the OMM in nanometer-scale clusters. This is similar to the clusters reported for the mitochondrial calcium channel, VDAC, as well as electron transport chain, translocase of the OMM complex, and mitochondrial inner membrane organizing system components. Thus, aside from addressing how vMIA targets the MAM and regulates survival of infected cells, biochemical studies and superresolution imaging of vMIA offer insights into the formation, organization, and functioning of the MAM. Here, we discuss these insights into the trafficking, function, and organization of vMIA at the MAM and OMM, and discuss how the use of superresolution imaging is contributing to the study of the formation and trafficking of viruses. PMID:25724304

  17. Efficient Use of Distributed Systems for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques

    2000-01-01

    Distributed computing has been regarded as the future of high performance computing. Nationwide high-speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, the finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which must also be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency of up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes, with the number of elements ranging from 11,451 for the Barth4 mesh to 30,269 for the Barth5 mesh. Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the original document, entails an integration of finite element and fluid dynamic simulations to address the cooling of turbine blades in a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with millions of degrees of freedom. This results from the complexity of the various components of the airfoils, which require fine-grained meshing for accuracy. Additional information is contained in the original.
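    The following toy sketch conveys the idea of simulated-annealing partitioning with heterogeneous processor speeds: elements are reassigned at random and a move is kept if it reduces (or, with decreasing probability, increases) the maximum of load divided by speed. The cost function, cooling schedule, and absence of communication terms are deliberate simplifications; PART's actual parallel algorithm also accounts for network heterogeneity and communication patterns.

```python
# Toy simulated-annealing partitioner in the spirit of PART: assign mesh
# elements to processors so that work is proportional to processor speed.
# The cost function and cooling schedule are simplified illustrations only.
import math
import random

def imbalance_cost(assignment, element_work, proc_speeds):
    """Max over processors of (assigned work / speed): lower is better."""
    load = [0.0] * len(proc_speeds)
    for elem, proc in enumerate(assignment):
        load[proc] += element_work[elem]
    return max(l / s for l, s in zip(load, proc_speeds))

def anneal_partition(element_work, proc_speeds, steps=20_000, seed=0):
    rng = random.Random(seed)
    n_proc = len(proc_speeds)
    assignment = [rng.randrange(n_proc) for _ in element_work]
    cost = imbalance_cost(assignment, element_work, proc_speeds)
    for step in range(steps):
        temperature = 0.999 ** step
        elem = rng.randrange(len(element_work))
        old_proc = assignment[elem]
        assignment[elem] = rng.randrange(n_proc)
        new_cost = imbalance_cost(assignment, element_work, proc_speeds)
        # Accept improvements always, uphill moves with Boltzmann probability.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temperature):
            cost = new_cost
        else:
            assignment[elem] = old_proc
    return assignment, cost

work = [1.0] * 100                      # uniform element work
speeds = [1.0, 2.0, 4.0]                # heterogeneous processor speeds
_, final_cost = anneal_partition(work, speeds)
print(f"final max(load/speed): {final_cost:.2f}")
```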

  18. Glymphatic distribution of CSF-derived apoE into brain is isoform specific and suppressed during sleep deprivation.

    PubMed

    Achariyar, Thiyagaragan M; Li, Baoman; Peng, Weiguo; Verghese, Philip B; Shi, Yang; McConnell, Evan; Benraiss, Abdellatif; Kasper, Tristan; Song, Wei; Takano, Takahiro; Holtzman, David M; Nedergaard, Maiken; Deane, Rashid

    2016-12-08

    Apolipoprotein E (apoE) is a major carrier of cholesterol and is essential for synaptic plasticity. In the brain it is expressed by many cell types, but most highly by the choroid plexus, and it is the predominant apolipoprotein in cerebrospinal fluid (CSF). The role of apoE in the CSF is unclear. Recently, the glymphatic system was described as a clearance system whereby CSF and ISF (interstitial fluid) are exchanged via the peri-arterial space, with convective flow of ISF clearance mediated by aquaporin 4 (AQP4), a water channel. We reasoned that this system also serves to distribute essential molecules in CSF into brain. The aim was to establish whether apoE in CSF, secreted by the choroid plexus, is distributed into brain, and whether this distribution pattern is altered by sleep deprivation. We used fluorescently labeled lipidated apoE isoforms, lenti-apoE3 delivered to the choroid plexus, immunohistochemistry to map apoE brain distribution, immunolabeled cells and proteins in brain, Western blot analysis and ELISA to determine apoE levels, and radiolabeled molecules to quantify CSF inflow into brain and brain clearance in mice. Data were statistically analyzed using ANOVA or Student's t-test. We show that the glymphatic fluid transporting system contributes to the delivery of choroid plexus/CSF-derived human apoE to neurons. CSF-delivered human apoE entered brain via the perivascular space of penetrating arteries and flowed radially around arteries, but not veins, in an isoform-specific manner (apoE2 > apoE3 > apoE4). Flow of apoE around arteries was facilitated by AQP4, a characteristic feature of the glymphatic system. ApoE3, delivered by lentivirus to the choroid plexus and ependymal layer but not to the parenchymal cells, was present in the CSF, penetrating arteries and neurons. The inflow of CSF, which contains apoE, into brain and its clearance from the interstitium were severely suppressed by sleep deprivation compared with the sleep state. Thus, the choroid plexus/CSF provides an additional source of apoE, and the glymphatic fluid transporting system delivers it to brain via the periarterial space. By implication, failure of this essential physiological role of glymphatic fluid flow and ISF clearance may also contribute to apoE isoform-specific disorders in the long term.

  19. Programmes for advance distribution of misoprostol to prevent post-partum haemorrhage: a rapid literature review of factors affecting implementation.

    PubMed

    Smith, Helen J; Colvin, Christopher J; Richards, Esther; Roberson, Jeffrey; Sharma, Geeta; Thapa, Kusum; Gülmezoglu, A Metin

    2016-02-01

    Recent efforts to prevent post-partum haemorrhage (PPH) in low-income countries have focused on providing women with access to oral misoprostol during home birth. The WHO recommends using lay health workers (LHWs) to administer misoprostol in settings where skilled birth attendants are not available. This review synthesizes current knowledge about the barriers and facilitators affecting implementation of advance community distribution of misoprostol to prevent PPH, where misoprostol may be self-administered or administered by an LHW. We searched for and summarized available empirical evidence, and collected primary data from programme stakeholders about their experiences of programme implementation. We present key outcomes and features of advance distribution programmes that are in operation or have been piloted globally. We categorized factors influencing implementation into those that operate at the health system level, factors related to the community and policy context, and factors more closely connected to the end user. Debates around advance distribution have centred on the potential risks and benefits of making misoprostol available to pregnant women and community members during pregnancy for administration in the home. However, the risks of advance distribution appear manageable, and the benefits of self-administration, especially for women who have little chance of expert care for PPH, are considerable. © The Author 2015. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  20. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    USGS Publications Warehouse

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.
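    As a minimal illustration of the correlative component such a framework can integrate, the sketch below fits a logistic regression of simulated presence/absence records on two environmental covariates and predicts habitat suitability for a new site. The simulated data and the plain gradient-descent fit are assumptions for illustration; they do not implement the adaptive, 10-step iSDM framework itself.

```python
# Minimal correlative-SDM sketch: logistic regression of simulated presence/
# absence records on two environmental covariates (e.g. temperature, rainfall).
# Data and the plain gradient-descent fit are illustrative only; they stand in
# for the correlative component of the framework, not the full adaptive iSDM.
import math
import random

def predict(weights, x):
    """Logistic suitability for covariate vector x."""
    z = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(records, lr=0.1, epochs=500):
    """records: list of (covariates, presence 0/1). Returns fitted weights."""
    weights = [0.0] * (len(records[0][0]) + 1)
    for _ in range(epochs):
        for x, y in records:
            err = predict(weights, x) - y
            weights[0] -= lr * err
            for j, xi in enumerate(x):
                weights[j + 1] -= lr * err * xi
    return weights

rng = random.Random(0)
# Simulated occurrences: the invader prefers warm, wet sites.
records = []
for _ in range(300):
    temp, rain = rng.uniform(-1, 1), rng.uniform(-1, 1)
    p_presence = 1.0 / (1.0 + math.exp(-(2.0 * temp + 1.5 * rain)))
    records.append(([temp, rain], 1 if rng.random() < p_presence else 0))

w = fit_logistic(records)
print("predicted suitability of a warm, wet site:", round(predict(w, [0.8, 0.8]), 2))
```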
