Science.gov

Sample records for distributed block virtualization

  1. Recoverable distributed shared virtual memory

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent

    1990-01-01

    The problem of rollback recovery in distributed shared virtual environments, in which the shared memory is implemented in software in a loosely coupled distributed multicomputer system, is examined. A user-transparent checkpointing recovery scheme and a new twin-page disk storage management technique are presented for implementing recoverable distributed shared virtual memory. The checkpointing scheme can be integrated with the memory coherence protocol for managing the shared virtual memory. The twin-page disk design allows checkpointing to proceed in an incremental fashion without an explicit undo at the time of recovery. The recoverable distributed shared virtual memory allows the system to restart computation from a checkpoint without a global restart.
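
    To make the twin-page idea above concrete, here is a minimal sketch, assuming a simplified in-memory page store; the class name `TwinPageStore` and its bookkeeping are illustrative assumptions, not the authors' implementation. Each page has two disk slots, checkpoint writes go incrementally to the inactive twin, and committing flips the active pointer, so recovery reads the last committed twins with no undo pass.

```python
# Illustrative sketch of twin-page checkpointing (not the paper's actual code).
# Each page has two disk slots ("twins"); writes during a checkpoint go to the
# inactive twin, and committing the checkpoint atomically flips the active bit.

class TwinPageStore:
    def __init__(self, num_pages):
        self.twins = [[None, None] for _ in range(num_pages)]  # two slots per page
        self.active = [0] * num_pages          # committed twin per page
        self.pending = {}                      # pages written since last commit

    def checkpoint_write(self, page_id, data):
        """Write incrementally to the inactive twin; committed data stays untouched."""
        inactive = 1 - self.active[page_id]
        self.twins[page_id][inactive] = data
        self.pending[page_id] = inactive

    def commit(self):
        """Atomically make the new twins the recovery state (no undo log needed)."""
        for page_id, slot in self.pending.items():
            self.active[page_id] = slot
        self.pending.clear()

    def recover(self, page_id):
        """After a failure, simply read the last committed twin."""
        return self.twins[page_id][self.active[page_id]]


store = TwinPageStore(num_pages=4)
store.checkpoint_write(0, b"page-0 at checkpoint 1")
store.commit()
store.checkpoint_write(0, b"page-0, uncommitted work")   # crash before commit...
print(store.recover(0))                                   # ...recovers checkpoint 1
```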

  2. DIstributed VIRtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, Clifford B.

    1995-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the Virtual System Model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.
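
    As a rough illustration of the "virtual configuration of processors" idea in item (1), the sketch below hides the virtual-to-physical mapping behind a small interface; the class, methods, and node names are hypothetical stand-ins, not part of DIVIRS.

```python
# Hypothetical sketch: programs address virtual processors; a job manager owns
# the mapping to physical nodes, which can change without touching user code.

class VirtualConfiguration:
    def __init__(self, num_virtual):
        self.num_virtual = num_virtual
        self.mapping = {}                      # virtual id -> physical node name

    def allocate(self, physical_nodes):
        """Assign each virtual processor to a physical node (round-robin here)."""
        for v in range(self.num_virtual):
            self.mapping[v] = physical_nodes[v % len(physical_nodes)]

    def send(self, virtual_id, message):
        """User code names only virtual ids; the manager resolves the real node."""
        node = self.mapping[virtual_id]
        print(f"deliver {message!r} to virtual {virtual_id} on {node}")


config = VirtualConfiguration(num_virtual=8)
config.allocate(["node-a", "node-b", "node-c"])   # hypothetical node names
config.send(5, "task chunk 5")
```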

  3. Distributed Virtual System (DIVIRS) Project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1993-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on contract NCC 2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to program parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the virtual system model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  4. DIstributed VIRtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1994-01-01

    As outlined in our continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC2-539, we are (1) developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; (2) developing communications routines that support the abstractions implemented in item one; (3) continuing the development of file and information systems based on the Virtual System Model; and (4) incorporating appropriate security measures to allow the mechanisms developed in items 1 through 3 to be used on an open network. The goal throughout our work is to provide a uniform model that can be applied to both parallel and distributed systems. We believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. Our work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  5. Distributed Virtual System (DIVIRS) project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford

    1993-01-01

    As outlined in the continuation proposal 92-ISI-50R (revised) on NASA cooperative agreement NCC 2-539, the investigators are developing software, including a system manager and a job manager, that will manage available resources and that will enable programmers to develop and execute parallel applications in terms of a virtual configuration of processors, hiding the mapping to physical nodes; developing communications routines that support the abstractions implemented; continuing the development of file and information systems based on the Virtual System Model; and incorporating appropriate security measures to allow the mechanisms developed to be used on an open network. The goal throughout the work is to provide a uniform model that can be applied to both parallel and distributed systems. The authors believe that multiprocessor systems should exist in the context of distributed systems, allowing them to be more easily shared by those that need them. The work provides the mechanisms through which nodes on multiprocessors are allocated to jobs running within the distributed system and the mechanisms through which files needed by those jobs can be located and accessed.

  6. Evidence of Blocking with Geometric Cues in a Virtual Watermaze

    ERIC Educational Resources Information Center

    Redhead, Edward S.; Hamilton, Derek A.

    2009-01-01

    Three computer based experiments, testing human participants in a non-immersive virtual watermaze task, used a blocking design to assess whether two sets of geometric cues would compete in a manner described by associative models of learning. In stage 1, participants were required to discriminate between visually distinct platforms. In stage 2,…

  7. Block data distribution for parallel nested dissection

    SciTech Connect

    Charrier, P.; Facq, L.; Roman, J.

    1995-12-01

    In this paper, we consider the problem of data partitioning for block sparse Cholesky factorization on distributed memory MIMD computers. We propose a preprocessing algorithm which computes and distributes a column block partition based on an initial partition induced by a nested dissection ordering. This preprocessing algorithm works by optimizing load balancing under precedence constraints and communication traffic. It can be performed in linear time and space complexities.
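
    A toy sketch of the load-balancing part of such a distribution: column blocks with estimated factorization work are assigned greedily to the least-loaded processor. This is only a plain greedy bin assignment under assumed work estimates, not the paper's linear-time preprocessing algorithm, and it ignores the precedence and communication constraints the paper optimizes.

```python
# Toy greedy assignment of column blocks to processors by estimated work.
# The real preprocessing also respects elimination-tree precedence and
# communication traffic; only load balance is illustrated here.
import heapq

def assign_blocks(block_work, num_procs):
    """block_work: {block_id: estimated factorization work}. Returns block -> proc."""
    heap = [(0.0, p) for p in range(num_procs)]   # (current load, processor)
    heapq.heapify(heap)
    placement = {}
    for block, work in sorted(block_work.items(), key=lambda kv: -kv[1]):
        load, proc = heapq.heappop(heap)          # least-loaded processor first
        placement[block] = proc
        heapq.heappush(heap, (load + work, proc))
    return placement

work = {0: 120.0, 1: 80.0, 2: 75.0, 3: 40.0, 4: 35.0, 5: 10.0}  # invented numbers
print(assign_blocks(work, num_procs=2))
```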

  8. Distributed virtual environment for emergency medical training

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.; Garcia, Brian W.; Godsell-Stytz, Gayl M.

    1997-07-01

    In many professions where individuals must work in a team in a high-stress environment to accomplish a time-critical task, individual and team performance can benefit from joint training using distributed virtual environments (DVEs). One professional field that lacks but needs a high-fidelity team training environment is the field of emergency medicine. Currently, emergency department (ED) medical personnel train by using words to create a mental picture of a situation for the physician and staff, who then cooperate to solve the problems portrayed by the word picture. The need in emergency medicine for realistic virtual team training is critical because ED staff typically encounter rarely occurring but life-threatening situations only once in their careers and because ED teams currently have no realistic environment in which to practice their team skills. The resulting lack of experience and teamwork makes diagnosis and treatment more difficult. Virtual environment-based training has the potential to redress these shortfalls. The objective of our research is to develop a state-of-the-art virtual environment for emergency medicine team training. The virtual emergency room (VER) allows ED physicians and medical staff to realistically prepare for emergency medical situations by performing triage, diagnosis, and treatment on virtual patients within an environment that provides them with the tools they require and the team environment they need to realistically perform these three tasks. There are several issues that must be addressed before this vision is realized. The key issues deal with distribution of computations; the doctor and staff interface to the virtual patient and ED equipment; the accurate simulation of individual patient organs' response to injury, medication, and treatment; and an accurate modeling of the symptoms and appearance of the patient while maintaining a real-time interaction capability. Our ongoing work addresses all of these issues. In this

  9. Guidelines for developing distributed virtual environment applications

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    1998-08-01

    We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework; and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.

  10. Exploiting virtual synchrony in distributed systems

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.; Joseph, Thomas A.

    1987-01-01

    Applications of a virtually synchronous environment are described for distributed programming, which underlies a collection of distributed programming tools in the ISIS system. A virtually synchronous environment allows processes to be structured into process groups, and makes events like broadcasts to the group as an entity, group membership changes, and even migration of an activity from one place to another appear to occur instantaneously, in other words, synchronously. A major advantage to this approach is that many aspects of a distributed application can be treated independently without compromising correctness. Moreover, user code that is designed as if the system were synchronous can often be executed concurrently. It is argued that this approach to building distributed and fault-tolerant software is more straightforward, more flexible, and more likely to yield correct solutions than alternative approaches.
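
    To make the process-group abstraction concrete, here is a hedged, single-process toy in which broadcasts and membership changes are delivered to every member in one total order, mimicking the "appear to occur instantaneously" guarantee only; it is not ISIS's protocol and has no fault tolerance.

```python
# Toy model of the virtual-synchrony ordering guarantee: every group member
# observes broadcasts and membership changes in the same order, as if each
# event happened instantaneously.  This mimics the abstraction, not the protocol.

class ProcessGroup:
    def __init__(self):
        self.members = {}                      # member name -> delivered event log

    def join(self, name):
        self._deliver(("join", name))          # existing members see the view change
        self.members[name] = []                # new member's log starts at its join

    def broadcast(self, sender, payload):
        self._deliver(("msg", sender, payload))

    def _deliver(self, event):
        # A single total order: every current member appends the same event.
        for log in self.members.values():
            log.append(event)


group = ProcessGroup()
group.join("p1")
group.join("p2")
group.broadcast("p1", "state update")
assert group.members["p1"][-1] == group.members["p2"][-1]   # same last event everywhere
```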

  11. Orientation Distribution for Thin Film Block Copolymers

    NASA Astrophysics Data System (ADS)

    Jones, Ronald; Zhang, Xiaohua; Kim, Sangcheol; Karim, Alamgir; Briber, Robert; Kim, Ho-Cheol

    2008-03-01

    The directed self-assembly of nanostructured films with vertically oriented morphologies is a potential solution for manufacture of next generation data storage platforms, microelectronic devices, and nanoporous membranes. In many of these applications, the distribution of orientation must be tightly controlled to enable pattern transfer. This parameter is expected to depend on factors such as the Flory-Huggins chi parameter, but little data has been reported to date. We present results from tomographic small angle scattering on a series of block copolymer films whose assembly has been directed through solvent annealing. Films of poly(styrene-b-ethylene oxide) are cast as a function of annealing time and their orientation distribution reported. The results provide significant insight into the fundamental limits of line edge roughness and defect control possible using this fabrication technique.

  12. Performance Studies on Distributed Virtual Screening

    PubMed Central

    Krüger, Jens; de la Garza, Luis; Kohlbacher, Oliver; Nagel, Wolfgang E.

    2014-01-01

    Virtual high-throughput screening (vHTS) is an invaluable method in modern drug discovery. It permits screening large datasets or databases of chemical structures for structures that may bind to a drug target. Virtual screening is typically performed by docking code, which often runs sequentially. Processing of huge vHTS datasets can be parallelized by chunking the data because individual docking runs are independent of each other. The goal of this work is to find an optimal splitting that maximizes speedup while considering overhead and available cores on Distributed Computing Infrastructures (DCIs). We have conducted thorough performance studies accounting not only for the runtime of the docking itself, but also for structure preparation. Performance studies were conducted via the workflow-enabled science gateway MoSGrid (Molecular Simulation Grid). As input we used benchmark datasets for protein kinases. Our performance studies show that docking workflows can be made to scale almost linearly up to 500 concurrent processes distributed even over large DCIs, thus accelerating vHTS campaigns significantly. PMID:25032219
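
    A small sketch of the splitting trade-off described above: given independent docking runs, per-chunk overhead, and a fixed number of cores, pick a chunk count that roughly minimizes the makespan. The cost model, timings, and function names are illustrative assumptions, not MoSGrid's scheduler.

```python
# Illustrative cost model for splitting a vHTS dataset into chunks.
# Total time ~ (chunk runtime + per-chunk overhead) * number of waves;
# independent docking runs let chunks execute in parallel on the cores.

def estimated_makespan(n_ligands, n_chunks, n_cores, t_dock=2.0, t_overhead=30.0):
    """Seconds to finish all chunks; t_dock per ligand, t_overhead per chunk (assumed)."""
    ligands_per_chunk = -(-n_ligands // n_chunks)          # ceil division
    chunk_time = ligands_per_chunk * t_dock + t_overhead
    waves = -(-n_chunks // n_cores)                        # chunks run in waves
    return waves * chunk_time

best = min(range(1, 2001), key=lambda k: estimated_makespan(100_000, k, n_cores=500))
print(best, estimated_makespan(100_000, best, 500))        # best split and its makespan
```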

  13. Virtuality Distributions and Pion Transition Form Factor

    DOE PAGES Beta

    Radyushkin, Anatoly V.

    2015-03-01

    Using the example of hard exclusive transition process γ*γ → π0 at the handbag level, we outline basics of a new approach to transverse momentum dependence in hard processes. In coordinate representation, matrix elements of operators (in the simplest case, bilocal O(0,z)) describing a hadron with momentum p, are functions of (pz) and z2 parametrized through virtuality distribution amplitudes (VDA) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z2. For intervals with z+=0, we introduce the transverse momentum distribution amplitude (TMDA) Ψ(x, k⊥), and write it in terms of VDA Φ(x,σ). We propose models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BaBar and BELLE) data. We also discuss the generation of hard tails of TMDAs from initially soft forms.
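
    To make the conjugate-variable structure explicit, here is a schematic version of the two statements in the abstract. The overall normalizations, the 1/4 in the exponent, and the iε prescription are assumptions of this sketch, not taken from the paper; only the Fourier/Laplace pairing and the z+=0 transverse Fourier transform are stated above.

```latex
% Schematic only: x is Fourier-conjugate to (pz), sigma is Laplace-conjugate to z^2;
% normalizations, the 1/4 convention, and i*epsilon prescriptions are assumed here.
\langle\, \mathcal{O}(0,z)\, \rangle_{p}
  \;\sim\; \int_0^1 dx \int_0^{\infty} d\sigma \;
  \Phi(x,\sigma)\, e^{-i x (pz)}\, e^{-i \sigma (z^2 - i\epsilon)/4} ,
\qquad
\Psi(x, k_{\perp})
  \;\sim\; \int d^2 z_{\perp}\, e^{\,i k_{\perp}\cdot z_{\perp}}\,
  \langle\, \mathcal{O}(0,z)\, \rangle_{p} \Big|_{z^{+}=0} .
```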

  14. Virtuality Distributions and Pion Transition Form Factor

    NASA Astrophysics Data System (ADS)

    Radyushkin, A. V.

    2015-02-01

    Using the example of hard exclusive transition process γ*γ → π0 at the handbag level, we outline basics of a new approach to transverse momentum dependence in hard processes. In coordinate representation, matrix elements of operators (in the simplest case, bilocal 𝒪(0, z)) describing a hadron with momentum p, are functions of (pz) and z2 parametrized through virtuality distribution amplitudes (VDA) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z2. For intervals with z+ = 0, we introduce the transverse momentum distribution amplitude (TMDA) Ψ(x, k⊥), and write it in terms of VDA Φ(x, σ). We propose models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BaBar and BELLE) data. We also discuss the generation of hard tails of TMDAs from initially soft forms.

  15. Virtual Solar Observatory Distributed Query Construction

    NASA Technical Reports Server (NTRS)

    Gurman, J. B.; Dimitoglou, G.; Bogart, R.; Davey, A.; Hill, F.; Martens, P.

    2003-01-01

    Through a prototype implementation (Tian et al., this meeting) the VSO has already demonstrated the capability of unifying geographically distributed data sources following the Web Services paradigm and utilizing mechanisms such as the Simple Object Access Protocol (SOAP). So far, four participating sites (Stanford, Montana State University, National Solar Observatory and the Solar Data Analysis Center) permit Web-accessible, time-based searches that allow browse access to a number of diverse data sets. Our latest work includes the extension of the simple, time-based queries to include numerous other searchable observation parameters. For VSO users, this extended functionality enables more refined searches. For the VSO, it is a proof of concept that more complex, distributed queries can be effectively constructed and that results from heterogeneous, remote sources can be synthesized and presented to users as a single, virtual data product.

  16. Data management system for distributed virtual screening.

    PubMed

    Zhou, Ting; Caflisch, Amedeo

    2009-01-01

    High throughput docking (HTD) using high performance computing platforms is a multidisciplinary challenge. To handle HTD data effectively and efficiently, we have developed a distributed virtual screening data management system (DVSDMS) in which the data handling and the distribution of jobs are realized by the open-source structured query language database software MySQL. The essential concept of DVSDMS is the separation of the data management from the docking and ranking applications. DVSDMS can be used to dock millions of molecules effectively, monitor the process in real time, analyze docking results promptly, and process up to 10^8 poses by energy ranking techniques. In an HTD campaign to identify kinase inhibitors, a low-cost Linux PC has allowed DVSDMS to efficiently assign the workload to more than 500 computing clients. Notably, in a stress test of DVSDMS that emulated a large number of clients, about 60 molecules per second were distributed to the clients for docking, which indicates that DVSDMS can run efficiently on very large compute clusters (up to about 40,000 cores). PMID:19072299
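
    A minimal sketch of the "database as job distributor" pattern described above, using SQLite in place of MySQL so the snippet is self-contained; the table and column names are invented for illustration and are not DVSDMS's schema. Each client claims a pending molecule inside a transaction and later writes back its docking score, so ranking becomes an ordinary ordered query.

```python
# Sketch of a DB-mediated work queue in the spirit of DVSDMS: the database,
# not the docking code, decides which client gets which molecule.
# SQLite stands in for MySQL here; schema and names are illustrative only.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE jobs (
    mol_id INTEGER PRIMARY KEY, smiles TEXT,
    status TEXT DEFAULT 'pending', client TEXT, score REAL)""")
db.executemany("INSERT INTO jobs (mol_id, smiles) VALUES (?, ?)",
               [(1, "CCO"), (2, "c1ccccc1"), (3, "CC(=O)O")])

def claim_job(client_name):
    """Hand one pending molecule to a docking client inside a transaction."""
    with db:
        row = db.execute(
            "SELECT mol_id, smiles FROM jobs WHERE status = 'pending' LIMIT 1"
        ).fetchone()
        if row is None:
            return None
        db.execute("UPDATE jobs SET status = 'running', client = ? WHERE mol_id = ?",
                   (client_name, row[0]))
        return row

def report_result(mol_id, score):
    """Client writes its docking score back; ranking is a later ORDER BY query."""
    with db:
        db.execute("UPDATE jobs SET status = 'done', score = ? WHERE mol_id = ?",
                   (score, mol_id))

job = claim_job("client-42")
report_result(job[0], score=-9.3)
print(db.execute("SELECT * FROM jobs ORDER BY score").fetchall())
```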

  17. Virtuality Distributions and Pion Transition Form Factor

    SciTech Connect

    Radyushkin, Anatoly V.

    2015-03-01

    Using the example of hard exclusive transition process γ*γ → π0 at the handbag level, we outline basics of a new approach to transverse momentum dependence in hard processes. In coordinate representation, matrix elements of operators (in the simplest case, bilocal O(0,z)) describing a hadron with momentum p, are functions of (pz) and z2 parametrized through virtuality distribution amplitudes (VDA) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z2. For intervals with z+=0, we introduce the transverse momentum distribution amplitude (TMDA) Ψ(x, k⊥), and write it in terms of VDA Φ(x,σ). We propose models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BaBar and BELLE) data. We also discuss the generation of hard tails of TMDAs from initially soft forms.

  18. MOCHA/ISAIA: Building Blocks for Interoperability in a Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Cheung, C. Y.; Hanisch, R. J.; McGlynn, T. A.; Plante, R. L.; Shaya, E. J.

    2000-12-01

    Some basic building blocks must be put in place before we can realize the vision of a National or Global Virtual Observatory. MOCHA is a project that is building a prototype interoperability infrastructure for a Virtual Observatory. ISAIA is an effort that defines the astrophysics query profile to enable searches of networked astrophysics resources that have very different data structures. Both projects are funded by the NASA Applied Information Systems Research Program. We shall describe a joint demonstration by these two projects that involves four data centers: the Astronomical Data Center (ADC), the High Energy Astrophysics Science Archive Research Center (HEASARC), the Astronomical Digital Image Library (ADIL), the Space Telescope Science Institute (STScI), and the University of Maryland. We shall show how a positional query for astrophysical data in a region of arbitrary geometrical boundary can be carried out using these basic components. We shall also describe a scheme by which user software can be deployed to a data center to extend its services, and how the system will return to the researcher only the desired scientific results. This capability is very important for multispectral studies using the large all-sky surveys that reside in distributed data archives.

  19. Mechanics of distributed fault and block rotation

    NASA Technical Reports Server (NTRS)

    Nur, A.; Scotti, O.; Ron, H.

    1989-01-01

    Paleomagnetic data, structural geology, and rock mechanics are used to explore the validity and significance of the block rotation concept. The analysis is based on data from Northern Israel, where fault slip and spacing are used to predict block rotation; the Mojave Desert, with well-documented strike-slip sets; the Lake Mead, Nevada fault system with well-defined sets of strike-slip faults; and the San Gabriel Mountains domain with a multiple set of strike-slip faults. The results of the analysis indicate that block rotations can have a profound influence on the interpretation of geodetic measurements and the inversion of geodetic data. Furthermore, the block rotations and domain boundaries may be involved in creating the heterogeneities along active fault systems which may be responsible for the initiation and termination of earthquake rupture.

  20. Recoverable distributed shared virtual memory - Memory coherence and storage structures

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent

    1989-01-01

    This paper examines the problem of implementing rollback recovery in multicomputer distributed shared virtual memory environments, in which the shared memory is implemented in software and exists only virtually. A user-transparent checkpointing recovery scheme and new twin-page disk storage management are presented to implement a recoverable distributed shared virtual memory. The checkpointing scheme is integrated with the shared virtual memory management. The twin-page disk approach allows incremental checkpointing without an explicit undo at the time of recovery. A single consistent checkpoint state is maintained on stable disk storage. The recoverable distributed shared virtual memory allows the system to restart computation from a previous checkpoint due to a processor failure without a global restart.

  1. Distributed virtual worlds in high-speed networks

    NASA Astrophysics Data System (ADS)

    Schiffner, Norbert

    1998-09-01

    Recent research efforts have concentrated on determining how the distributed workplace can be transformed into a shared virtual environment. Interaction among people and processes in virtual worlds has to be provided and improved. To enhance the usability of our virtual collaborative environment, we integrated a multicast communication environment. With the availability of global information highways, 3D graphical intercontinental collaboration will become a part of our daily work routine. This paper describes the basics of our network infrastructure and the multicast support. As a proof of concept, a virtual world scenario is also presented in this paper.

  2. Virtual Observatory and Distributed Data Mining

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2012-03-01

    New modes of discovery are enabled by the growth of data and computational resources (i.e., cyberinfrastructure) in the sciences. This cyberinfrastructure includes structured databases, virtual observatories (distributed data, as described in Section 20.2.1 of this chapter), high-performance computing (petascale machines), distributed computing (e.g., the Grid, the Cloud, and peer-to-peer networks), intelligent search and discovery tools, and innovative visualization environments. Data streams from experiments, sensors, and simulations are increasingly complex and growing in volume. This is true in most sciences, including astronomy, climate simulations, Earth observing systems, remote sensing data collections, and sensor networks. At the same time, we see an emerging confluence of new technologies and approaches to science, most clearly visible in the growing synergism of the four modes of scientific discovery: sensors-modeling-computing-data (Eastman et al. 2005). This has been driven by numerous developments, including the information explosion, development of large-array sensors, acceleration in high-performance computing (HPC) power, advances in algorithms, and efficient modeling techniques. Among these, the most extreme is the growth in new data. Specifically, the acquisition of data in all scientific disciplines is rapidly accelerating and causing a data glut (Bell et al. 2007). It has been estimated that data volumes double every year—for example, the NCSA (National Center for Supercomputing Applications) reported that their users cumulatively generated one petabyte of data over the first 19 years of NCSA operation, but they then generated their next one petabyte in the next year alone, and the data production has been growing by almost 100% each year after that (Butler 2008). The NCSA example is just one of many demonstrations of the exponential (annual data-doubling) growth in scientific data collections. In general, this putative data-doubling is an

  3. A distributed framework for inter-domain virtual network embedding

    NASA Astrophysics Data System (ADS)

    Wang, Zihua; Han, Yanni; Lin, Tao; Tang, Hui

    2013-03-01

    Network virtualization has been a promising technology for overcoming the Internet impasse. A main challenge in network virtualization is the efficient assignment of virtual resources. Existing work focused on intra-domain solutions, whereas the inter-domain situation is more practical in realistic settings. In this paper, we present a distributed inter-domain framework for mapping virtual networks to physical networks, which can improve the performance of virtual network embedding. The distributed framework is based on a multi-agent approach. A set of messages for information exchange is defined. We design different operations and IPTV use scenarios to validate the advantages of our framework. Use cases show that our framework can solve the inter-domain problem efficiently.

  4. Distributed Cognition in a Virtual World

    ERIC Educational Resources Information Center

    Gillen, Julia; Ferguson, Rebecca; Peachey, Anna; Twining, Peter

    2012-01-01

    Over a 13-month period, the Schome Park Programme operated the first "closed" (i.e. protected) Teen Second Life project in Europe. The project organised diverse educational events that centred on use of a virtual world and an associated asynchronous forum and wiki. Students and staff together exploited the affordances of the environment to develop…

  5. On the asymptotic distribution of block-modified random matrices

    NASA Astrophysics Data System (ADS)

    Arizmendi, Octavio; Nechita, Ion; Vargas, Carlos

    2016-01-01

    We study random matrices acting on tensor product spaces which have been transformed by a linear block operation. Using operator-valued free probability theory, under some mild assumptions on the linear map acting on the blocks, we compute the asymptotic eigenvalue distribution of the modified matrices in terms of the initial asymptotic distribution. Moreover, using recent results on operator-valued subordination, we present an algorithm that computes, numerically but in full generality, the limiting eigenvalue distribution of the modified matrices. Our analytical results cover many cases of interest in quantum information theory: we unify some known results and we obtain new distributions and various generalizations.
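
    To make "linear block operation" concrete, here is a short numerical sketch using the partial transpose over one tensor factor, a standard example of a map acting on the blocks of a matrix on a tensor product space, applied to a Wishart-type random matrix. It merely computes the modified spectrum numerically; it is not the operator-valued free probability machinery of the paper, and the matrix sizes are arbitrary.

```python
# Numerical illustration: apply a linear block operation (here, the partial
# transpose over the second tensor factor) to a Wishart-type random matrix
# and inspect the eigenvalues of the modified matrix.
import numpy as np

n, d = 40, 40                      # block structure: (n x n) grid of (d x d) blocks
rng = np.random.default_rng(0)

# Wishart-type random matrix on the tensor product space C^n (x) C^d
G = rng.standard_normal((n * d, n * d)) + 1j * rng.standard_normal((n * d, n * d))
W = G @ G.conj().T / (n * d)

# Partial transpose: transpose each (d x d) block's internal indices
W4 = W.reshape(n, d, n, d)
W_pt = W4.transpose(0, 3, 2, 1).reshape(n * d, n * d)

eigs = np.linalg.eigvalsh((W_pt + W_pt.conj().T) / 2)   # Hermitian after the map
print("smallest eigenvalues:", np.round(np.sort(eigs)[:5], 3))
```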

  6. Virtual Files in a Distributed Environment.

    ERIC Educational Resources Information Center

    Bennett, K. H.; And Others

    1986-01-01

    Demonstrates that hierarchical naming schemes--one with a single namespace and fixed root spread over all machines in a distributed system, and a second with total namespace composed of some aggregation of individual namespaces of each component system's filestores--can coexist in distributed computer systems. Combined system's design is…

  7. Distribution of Parental Genome Blocks in Recombinant Inbred Lines

    PubMed Central

    Martin, Olivier C.; Hospital, Frédéric

    2011-01-01

    We consider recombinant inbred lines obtained by crossing two given homozygous parents and then applying multiple generations of self-crossings or full-sib matings. The chromosomal content of any such line forms a mosaic of blocks, each alternatively inherited identically by descent from one of the parents. Quantifying the statistical properties of such mosaic genomes has remained an open challenge for many years. Here, we solve this problem by taking a continuous chromosome picture and assuming crossovers to be noninterfering. Using a continuous-time random walk framework and Markov chain theory, we determine the statistical properties of these identical-by-descent blocks. We find that successive block lengths are only very slightly correlated. Furthermore, the blocks on the ends of chromosomes are larger on average than the others, a feature understandable from the nonexponential distribution of block lengths. PMID:21840856
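
    As a toy complement to the analytical treatment, the sketch below simulates non-interfering crossovers as a Poisson process on a gridded chromosome through repeated selfing and measures the resulting identical-by-descent block lengths. The grid discretization, map length, and generation count are arbitrary assumptions, and this is a brute-force simulation, not the paper's continuous-chromosome Markov-chain derivation.

```python
# Toy grid-based simulation of identical-by-descent blocks in a recombinant
# inbred line produced by repeated selfing; crossovers are Poisson (no interference).
import numpy as np

rng = np.random.default_rng(2)
L, bins, generations = 1.0, 10_000, 12           # Morgans, grid cells, selfings
x = np.linspace(0.0, L, bins, endpoint=False)

def gamete(h1, h2):
    """Recombine two haplotypes (0/1 arrays on the grid) with Poisson crossovers."""
    cuts = rng.uniform(0, L, rng.poisson(L))
    phase = (np.searchsorted(np.sort(cuts), x) + rng.integers(2)) % 2
    return np.where(phase == 0, h1, h2)

# F1 individual: one chromosome from each homozygous parent (labels 0 and 1)
pair = (np.zeros(bins, dtype=int), np.ones(bins, dtype=int))
for _ in range(generations):                     # self-fertilize repeatedly
    pair = (gamete(*pair), gamete(*pair))

line = pair[0]                                   # (essentially) fixed haplotype
boundaries = np.flatnonzero(np.r_[True, np.diff(line) != 0, True])
block_lengths = np.diff(boundaries) * (L / bins)
print(len(block_lengths), "blocks; mean length ~", round(block_lengths.mean(), 3), "Morgans")
```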

  8. Information visualization in a distributed virtual decision support environment

    NASA Astrophysics Data System (ADS)

    Blocher, Timothy W.

    2002-07-01

    The visualization of and interaction with decision quality information is critical for effective decision makers in today's data rich environments. The generation and presentation of intuitively meaningful decision support information is the challenge. In order to investigate various visualization approaches to improve the timeliness and quality of Commander decisions, a robust, distributed virtual simulation environment, based on AFRL's Global Awareness Virtual Testbed (GAVTB), is being developed to represent an Air Operations Center (AOC) environment. The powerful Jview visualization technology is employed to efficiently and effectively utilize the simulation products to experiment with various decision quality representations and interactions required by military commanders.

  9. Decentralized commanding and supervision: the distributed projective virtual reality approach

    NASA Astrophysics Data System (ADS)

    Rossmann, Juergen

    2000-10-01

    As part of the cooperation between the University of Southern California (USC) and the Institute of Robotics Research (IRF) of the University of Dortmund, experiments regarding the control of robots over long distances by means of virtual reality-based man-machine interfaces have been successfully carried out. In this paper, the newly developed virtual reality system that is being used for the control of a multi-robot system for space applications as well as for the control and supervision of industrial robotics and automation applications is presented. The general aim of the development was to provide the framework for Projective Virtual Reality which allows users to project their actions in the virtual world into the real world primarily by means of robots but also by other means of automation. The framework is based on a new approach which builds on the task deduction capabilities of a newly developed virtual reality system and a task planning component. The advantage of this new approach is that robots which work at great distances from the control station can be controlled as easily and intuitively as robots that work right next to the control station. Robot control technology now provides the user in the virtual world with a prolonged arm into the physical environment, thus paving the way for a new quality of user-friendly man-machine interfaces for automation applications. Lately, this work has been enhanced by a new structure that allows the virtual reality application to be distributed over multiple computers. With this new step, it is now possible for multiple users to work together in the same virtual room, although they may physically be thousands of miles apart. They only need an Internet or ISDN connection to share this new experience. Last but not least, the distribution technology has been further developed not just to let users cooperate but to run the virtual world on many synchronized PCs so that a panorama projection or even a cave can

  10. Distributed collaborative environments for virtual capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research that is focusing on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning and the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.

  11. A distributed virtual environment prototype for emergency medical procedures training.

    PubMed

    Stytz, M R; Garcia, B W; Godsell-Stytz, G M; Banks, S B

    1997-01-01

    Because of the increasing complexity of emergency medical care, medical staffs require increasingly sophisticated training systems. Virtual environments offer a low cost means to achieve a widely usable yet sophisticated training capability. We describe the Virtual Emergency Room (VER) project, a simulation system designed to enable emergency department personnel within level I and II emergency rooms to practice emergency medical procedures and protocols. Because emergency rooms are manned by a wide variety of medical professionals, we are developing a simulation facility that uses a distributed virtual environment architecture to enable real-time, multi-participant simulations. The potential advantages of this system include the ability to evaluate and refine treatment skills, and the ability to provide scenario-specific training for mobile military field hospital teams. These advantages will ultimately improve the readiness of emergency department staffs for a wide variety of trauma situations. This paper describes the VER and the major components of its distributed virtual environment. The current capabilities of our system are described followed by a discussion of recommended follow-on work. PMID:10168942

  12. Running a distributed virtual observatory: U.S. Virtual Astronomical Observatory operations

    NASA Astrophysics Data System (ADS)

    McGlynn, Thomas A.; Hanisch, Robert J.; Berriman, G. Bruce; Thakar, Aniruddha R.

    2012-09-01

    Operation of the US Virtual Astronomical Observatory shares some issues with modern physical observatories, e.g., intimidating data volumes and rapid technological change, and must also address unique concerns like the lack of direct control of the underlying and scattered data resources, and the distributed nature of the observatory itself. In this paper we discuss how the VAO has addressed these challenges to provide the astronomical community with a coherent set of science-enabling tools and services. The distributed nature of our virtual observatory, with data and personnel spanning geographic, institutional, and regime boundaries, is simultaneously a major operational headache and the primary science motivation for the VAO. Most astronomy today uses data from many resources. Facilitation of matching heterogeneous datasets is a fundamental reason for the virtual observatory. Key aspects of our approach include continuous monitoring and validation of VAO and VO services and the datasets provided by the community, monitoring of user requests to optimize access, caching for large datasets, and providing distributed storage services that allow users to collect results near large data repositories. Some elements are now fully implemented, while others are planned for subsequent years. The distributed nature of the VAO requires careful attention to what can be a straightforward operation at a conventional observatory, e.g., the organization of the web site or the collection and combined analysis of logs. Many of these strategies use and extend protocols developed by the international virtual observatory community. Our long-term challenge is working with the underlying data providers to ensure high quality implementation of VO data access protocols (new and better 'telescopes'), assisting astronomical developers to build robust integrating tools (new 'instruments'), and coordinating with the research community to maximize the science enabled.

  13. Distributed deformation and block rotation in 3D

    NASA Technical Reports Server (NTRS)

    Scotti, Oona; Nur, Amos; Estevez, Raul

    1990-01-01

    The authors address how block rotation and complex distributed deformation in the Earth's shallow crust may be explained within a stationary regional stress field. Distributed deformation is characterized by domains of sub-parallel fault-bounded blocks. In response to the contemporaneous activity of neighboring domains some domains rotate, as suggested by both structural and paleomagnetic evidence. Rotations within domains are achieved through the contemporaneous slip and rotation of the faults and of the blocks they bound. Thus, in regions of distributed deformation, faults must remain active in spite of their poor orientation in the stress field. The authors developed a model that tracks the orientation of blocks and their bounding faults during rotation in a 3D stress field. In the model, the effective stress magnitudes of the principal stresses (sigma sub 1, sigma sub 2, and sigma sub 3) are controlled by the orientation of fault sets in each domain. Therefore, adjacent fault sets with differing orientations may be active and may display differing faulting styles, and a given set of faults may change its style of motion as it rotates within a stationary stress regime. The style of faulting predicted by the model depends on a dimensionless parameter phi = (sigma sub 2 - sigma sub 3)/(sigma sub 1 - sigma sub 3). Thus, the authors present a model for complex distributed deformation and complex offset history requiring neither geographical nor temporal changes in the stress regime. They apply the model to the Western Transverse Range domain of southern California. There, it is mechanically feasible for blocks and faults to have experienced up to 75 degrees of clockwise rotation in a phi = 0.1 strike-slip stress regime. The results of the model suggest that this domain may first have accommodated deformation along preexisting NNE-SSW faults, reactivated as normal faults. After rotation, these same faults became strike-slip in nature.

  14. TeleMed: A distributed virtual patient record system

    SciTech Connect

    Forslund, D.W.; Phillips, R.L.; Kilman, D.G.; Cook, J.L.

    1996-06-01

    TeleMed is a distributed diagnosis and analysis system, which permits physicians who are not collocated to consult on the status of a patient. The patient's record is dynamically constructed from data that may reside at several sites but which can be quickly assembled for viewing by pointing to the patient's name. Then, a graphical patient record appears, through which consulting physicians can retrieve textual and radiographic data with a single mouse click. TeleMed uses modern distributed object technology and emerging telecollaboration tools. The authors describe in this paper some of the motivation for this change, what they mean by a virtual patient record, and results of some early implementations of a virtual patient record.

  15. A distributed architecture for a loosely coupled virtual microscopy system

    NASA Astrophysics Data System (ADS)

    Sánchez, César; Romero, Eduardo

    2011-03-01

    Virtual microscopy systems are typically implemented following standard client-server architectures, under which the server must store a huge quantity of data. The server must serve requests from many clients for several Regions of Interest (RoIs) at any desired levels of magnification and quality. The communication bandwidth limitation, the I/O image data accesses, the decompression processing and specific raw image data operations such as clipping or zooming to a desired magnification, are highly time-consuming processes. All this together may result in poor navigation experiences with annoying effects produced by the delayed response times. This article presents a virtual microscope system with a distributed storage system and parallel processing. The system attends each request in parallel, using a clustered Java virtual machine and a distributed filesystem. Images are stored in JPEG2000, which allows natural parallelization by splitting the image data into a set of small codeblocks that contain independent information of an image patch, namely, a particular magnification, a specific image location and a pre-established quality level. The compressed J2K file is replicated within the distributed filesystem, providing fault tolerance and fast access. A requested RoI is split into stripes which are independently decoded from the distributed filesystem, using an index file that makes it easy to locate the particular node containing the required set of codeblocks. Compared with a non-parallelized version of the virtual microscope software, user experience is improved by speeding up RoI display by about 60% using two computers.
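
    A rough sketch of the stripe-parallel decode described above; `locate_codeblocks` and `decode_stripe` are hypothetical placeholders for the index lookup and JPEG2000 decoder, since the article's actual components are not named here.

```python
# Sketch of parallel RoI decoding: the requested region is split into stripes,
# each decoded independently from codeblocks located via an index file.
# locate_codeblocks/decode_stripe are hypothetical placeholders, not a real API.
from concurrent.futures import ThreadPoolExecutor

def locate_codeblocks(index, stripe, level, quality):
    """Hypothetical: return which node stores the codeblocks for this stripe."""
    return index.get((stripe, level, quality), "node-0")

def decode_stripe(node, stripe):
    """Hypothetical: fetch and decode one stripe's codeblocks into pixels."""
    return f"pixels[{stripe}]@{node}"

def render_roi(index, roi_stripes, level=2, quality=1, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(decode_stripe,
                               locate_codeblocks(index, s, level, quality), s)
                   for s in roi_stripes]
        return [f.result() for f in futures]   # reassemble stripes in order

index = {(0, 2, 1): "node-a", (1, 2, 1): "node-b"}     # toy index file contents
print(render_roi(index, roi_stripes=[0, 1, 2]))
```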

  16. Dynamic shared state maintenance in distributed virtual environments

    NASA Astrophysics Data System (ADS)

    Hamza-Lup, Felix George

    Advances in computer networks and rendering systems facilitate the creation of distributed collaborative environments in which the distribution of information at remote locations allows efficient communication. Particularly challenging are distributed interactive Virtual Environments (VE) that allow knowledge sharing through 3D information. The purpose of this work is to address the problem of latency in distributed interactive VE and to develop a conceptual model for consistency maintenance in these environments based on the participant interaction model. An area that needs to be explored is the relationship between the dynamic shared state and the interaction with the virtual entities present in the shared scene. Mixed Reality (MR) and VR environments must bring the human participant interaction into the loop through a wide range of electronic motion sensors, and haptic devices. Part of the work presented here defines a novel criterion for categorization of distributed interactive VE and introduces, as well as analyzes, an adaptive synchronization algorithm for consistency maintenance in such environments. As part of the work, a distributed interactive Augmented Reality (AR) testbed and the algorithm implementation details are presented. Currently the testbed is part of several research efforts at the Optical Diagnostics and Applications Laboratory including 3D visualization applications using custom built head-mounted displays (HMDs) with optical motion tracking and a medical training prototype for endotracheal intubation and medical prognostics. An objective method using quaternion calculus is applied for the algorithm assessment. In spite of significant network latency, results show that the dynamic shared state can be maintained consistent at multiple remotely located sites. In further consideration of the latency problems and in the light of the current trends in interactive distributed VE applications, we propose a hybrid distributed system architecture for
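
    One way to make the quaternion-based assessment concrete: the angular difference between the orientation of a shared virtual entity as reported at two sites is a natural consistency measure. The sketch below computes that angle from unit quaternions; it is a generic formula, not necessarily the exact metric used in this work.

```python
# Generic consistency metric for a shared rotational state: the angle between
# the orientations reported by two sites, computed from unit quaternions.
import numpy as np

def orientation_error_deg(q_site_a, q_site_b):
    """Angle (degrees) between two unit quaternions given as [w, x, y, z]."""
    qa = np.asarray(q_site_a, dtype=float); qa /= np.linalg.norm(qa)
    qb = np.asarray(q_site_b, dtype=float); qb /= np.linalg.norm(qb)
    dot = abs(np.dot(qa, qb))                  # |cos(theta/2)|; handles q ~ -q
    return np.degrees(2.0 * np.arccos(np.clip(dot, -1.0, 1.0)))

# Example: site B lags site A by a small rotation about the z-axis
q_a = [np.cos(np.radians(10)), 0.0, 0.0, np.sin(np.radians(10))]   # 20 deg about z
q_b = [1.0, 0.0, 0.0, 0.0]                                          # identity
print(round(orientation_error_deg(q_a, q_b), 1))                    # ~20.0
```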

  17. Addressing security issues related to virtual institute distributed activities

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    2008-03-01

    One issue confounding the development and experimentation of distributed modeling and simulation environments is the inability of the project team to identify and collaborate with resources, both human and technical, from outside the United States. This limitation is especially significant within the human behavior representation area where areas such as cultural effects research and joint command team behavior modeling require the participation of various cultural and national representatives. To address this limitation, as well as other human behavior representation research issues, the NATO Research and Technology Organization initiated a project to develop a NATO virtual institute that enables more effective and more collaborative research into human behavior representation. However, in building and operating a virtual institute, one of the chief concerns must be the cyber security of the institute. Because the institute "exists" in cyberspace, all of its activities are susceptible to cyberattacks, subterfuge, denial of service and all of the vulnerabilities that networked computers must face. In our opinion, for the concept of virtual institutes to be successful and useful, their operations and services must be protected from the threats in the cyber environment. A key to developing the required protection is the development and promulgation of standards for cyber security. In this paper, we discuss the types of cyber standards that are required, how new internet technologies can be exploited and can benefit the promulgation, development, maintenance, and robustness of the standards. This paper is organized as follows. Section One introduces the concept of the virtual institutes, the expected benefits, and the motivation for our research and for research in this area. Section Two presents background material and a discussion of topics related to VIs, human behavior and cultural modeling, and network-centric warfare. Section Three contains a discussion of the

  18. Deeply Virtual Exclusive Processes and Generalized Parton Distributions

    SciTech Connect

    2011-06-01

    The goal of the comprehensive program in Deeply Virtual Exclusive Scattering at Jefferson Laboratory is to create transverse spatial images of quarks and gluons as a function of their longitudinal momentum fraction in the proton, the neutron, and in nuclei. These functions are the Generalized Parton Distributions (GPDs) of the target nucleus. Cross section measurements of the Deeply Virtual Compton Scattering (DVCS) reaction ep → epγ in Hall A support the QCD factorization of the scattering amplitude for Q^2 ≥ 2 GeV^2. Quasi-free neutron-DVCS measurements on the deuteron indicate sensitivity to the quark angular momentum sum rule. Fully exclusive H(e, e'pγ) measurements have been made in a wide kinematic range in CLAS with polarized beam, and with both unpolarized and longitudinally polarized targets. Existing models are qualitatively consistent with the JLab data, but there is a clear need for less constrained models. Deeply virtual vector meson production is studied in CLAS. The 12 GeV upgrade will be essential for these channels. The ρ and ω channels offer the prospect of flavor sensitivity to the quark GPDs, while the φ-production channel is dominated by the gluon distribution.

  19. Evolution History of Asteroid Itokawa Based on Block Distribution Analysis

    NASA Astrophysics Data System (ADS)

    Mazrouei, Sara; Daly, Michael; Barnouin, Olivier; Ernst, Carolyn

    2013-04-01

    This work investigates trends in the global and regional distribution of blocks on asteroid 25143 Itokawa in order to discover new findings to better understand the history of this asteroid. Itokawa is a near-Earth object, and the first asteroid that was targeted for a sample return mission. Trends in block population provide new insights in regards to Itokawa's current appearance following the disruption of a possible parent body, and how its surface might have changed since then. Here blocks are defined as rocks or features with distinctive positive relief that are larger than a few meters in size. The size and distribution of blocks are measured by mapping the outline of the blocks using the Small Body Mapping Tool (SBMT) created by the Johns Hopkins University Applied Physics Laboratory [1]. The SBMT allows the user to overlap correctly geo-located Hayabusa images [2] onto the Itokawa shape model. This study provides additional inferences on the original disruption and subsequent re-accretion of Itokawa's "head" and "body" from block analyses. A new approach is taken by analyzing the population of blocks with respect to latitude for both Itokawa's current state, and a hypothetical elliptical body. Itokawa currently rotates approximately about its maximum moment of inertia, which is expected due to conservation of momentum and minimum energy arguments. After the possible disruption of the parent body of Itokawa, the "body" of Itokawa would have tended to a similar rotation. The shape of this body is made by removing the head of Itokawa and applying a semispherical cap. Using the method of [3] inertial properties of this object are calculated. With the assumption that this object had settled to its stable rotational axis, it is found that the pole axis could have been tilted about 13° away from the current axis in the direction opposite the head, equivalent to a 33 meter change in the center of mass. The results of this study provide means to test the hypothesis

  20. 3D structure of nucleon with virtuality distributions

    NASA Astrophysics Data System (ADS)

    Radyushkin, Anatoly

    2014-09-01

    We describe a new approach to transverse momentum dependence in hard processes. Our starting point is coordinate representation for matrix elements of operators (in the simplest case, bilocal O(0, z)) describing a hadron with momentum p. Treated as functions of (pz) and z2, they are parametrized through parton virtuality distribution (PVD) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z2. For intervals with z+ = 0, we introduce the transverse momentum distribution (TMD) f(x, k⊥), and write it in terms of PVD Φ(x, σ). The results of covariant calculations, written in terms of Φ(x, σ), are converted into expressions involving f(x, k⊥). We propose models for soft PVDs/TMDs, and describe how one can generate high-k⊥ tails of TMDs from primordial soft distributions. Supported by Jefferson Science Associates, LLC under U.S. DOE Contract #DE-AC05-06OR23177 and by U.S. DOE Grant #DE-FG02-97ER41028.

  1. Generalized parton distributions from deep virtual compton scattering at CLAS

    SciTech Connect

    Guidal, M.

    2010-04-24

    Here, we have analyzed the beam spin asymmetry and the longitudinally polarized target spin asymmetry of the Deep Virtual Compton Scattering process, recently measured by the Jefferson Lab CLAS collaboration. Our aim is to extract information about the Generalized Parton Distributions of the proton. By fitting these data, in a largely model-independent procedure, we are able to extract numerical values for the two Compton Form Factors $H_{Im}$ and $\tilde{H}_{Im}$ with uncertainties, on average, of the order of 30%.

  2. Generalized parton distributions from deep virtual compton scattering at CLAS

    DOE PAGES Beta

    Guidal, M.

    2010-04-24

    Here, we have analyzed the beam spin asymmetry and the longitudinally polarized target spin asymmetry of the Deep Virtual Compton Scattering process, recently measured by the Jefferson Lab CLAS collaboration. Our aim is to extract information about the Generalized Parton Distributions of the proton. By fitting these data, in a largely model-independent procedure, we are able to extract numerical values for the two Compton Form Factors $H_{Im}$ and $\tilde{H}_{Im}$ with uncertainties, on average, of the order of 30%.

  3. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

    In this paper, Distributed Virtual Reality (DVR) technology applied in Electronic Commerce (EC) is discussed. DVR has the capability of providing a new means for human beings to recognize, analyze, and resolve large-scale, complex problems, which has driven its rapid development in EC. The technology of CSCW (Computer Supported Cooperative Work) and middleware is introduced into the development of the EC-DVR system to meet the need for a platform that provides the necessary cooperation and communication services, so that basic modules need not be developed repeatedly. Finally, the paper gives a platform structure for the EC-DVR system.

  4. A Virtual Hosting Environment for Distributed Online Gaming

    NASA Astrophysics Data System (ADS)

    Brossard, David; Prieto Martinez, Juan Luis

    With enterprise boundaries becoming fuzzier, it’s become clear that businesses need to share resources, expose services, and interact in many different ways. In order to achieve such a distribution in a dynamic, flexible, and secure way, we have designed and implemented a virtual hosting environment (VHE) which aims at integrating business services across enterprise boundaries and virtualising the ICT environment within which these services operate in order to exploit economies of scale for the businesses as well as achieve shorter concept-to-market time scales. To illustrate the relevance of the VHE, we have applied it to the online gaming world. Online gaming is an early adopter of distributed computing and more than 30% of gaming developer companies, being aware of the shift, are focusing on developing high performance platforms for the new online trend.

  5. Managing distributed software development in the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Plante, Raymond L.; Boneventura, Nina; Busko, Ivo; Cresitello-Dittmar, Mark; D'Abrusco, Raffaele; Doe, Stephen; Ebert, Rick; Laurino, Omar; Pevunova, Olga; Refsdal, Brian; Thomas, Brian

    2012-09-01

    The U.S. Virtual Astronomical Observatory (VAO) is a product-driven organization that provides new scientific research capabilities to the astronomical community. Software development for the VAO follows a lightweight framework that guides development of science applications and infrastructure. Challenges to be overcome include distributed development teams, part-time efforts, and highly constrained schedules. We describe the process we followed to conquer these challenges while developing Iris, the VAO application for analysis of 1-D astronomical spectral energy distributions (SEDs). Iris was successfully built and released in less than a year with a team distributed across four institutions. The project followed existing International Virtual Observatory Alliance inter-operability standards for spectral data and contributed a SED library as a by-product of the project. We emphasize lessons learned that will be folded into future development efforts. In our experience, a well-defined process that provides guidelines to ensure the project is cohesive and stays on track is key to success. Internal product deliveries with a planned test and feedback loop are critical. Release candidates are measured against use cases established early in the process, and provide the opportunity to assess priorities and make course corrections during development. Also key is the participation of a stakeholder such as a lead scientist who manages the technical questions, advises on priorities, and is actively involved as a lead tester. Finally, frequent scheduled communications (for example a bi-weekly tele-conference) assure issues are resolved quickly and the team is working toward a common vision.

  6. Pion Electromagnetic Form Factor in Virtuality Distribution Formalism

    SciTech Connect

    Radyushkin, Anatoly V.

    2016-01-01

    We discuss two applications of the Virtuality Distribution Amplitudes (VDA) formalism developed in our recent papers. We start with an overview of the main properties of the pion distribution amplitude, emphasizing the quantitative measures of its width and the possibility to access them through pion transition form factor studies. We formulate the basic concepts of the VDA approach and introduce the pion transverse momentum distribution amplitude (TMDA) which plays, in a covariant Lagrangian formulation, a role similar to that of the pion wave function in the 3-dimensional Hamiltonian light-front approach. We propose simple factorized models for soft TMDAs, and use them to describe existing data on the pion transition form factor, thus fixing the scale determining the size of the transverse-momentum effects. Finally, we apply the VDA approach to the one-gluon exchange contribution for the pion electromagnetic form factor. We observe a very late $Q^2 \gtrsim 20$ GeV$^2$ onset of the transition to the asymptotic pQCD predictions and show that in the $Q^2 \lesssim 10$ GeV$^2$ region there is essentially no sensitivity to the shape of the pion distribution amplitude. Furthermore, the magnitude of the one-gluon exchange contribution in this region is estimated to be an order of magnitude below the Jefferson Lab data, thus leaving the Feynman mechanism as the only one relevant to the pion electromagnetic form factor behavior for accessible $Q^2$.

  7. Estimate Soil Erodibility Factors Distribution for Maioli Block

    NASA Astrophysics Data System (ADS)

    Lee, Wen-Ying

    2014-05-01

    The natural conditions in Taiwan are poor. Because of steep slopes, rushing rivers and fragile geology, soil erosion has become a serious problem. It not only degrades sloping land but also creates sediment disasters such as reservoir sedimentation and river obstruction. Therefore, predicting and controlling the amount of soil erosion has become an important research topic. The soil erodibility factor (K) is a quantitative index of the ability of soil to resist erosive detachment and transport. In Taiwan, erodibility factors for 280 soil samples were calculated by Wann and Huang (1989) using the Wischmeier and Smith nomograph. 221 samples were collected in the Maioli block in Miaoli. The coordinates of every sample point and the land use situations were recorded, and the physical properties were analyzed for each sample. Three estimation methods, consisting of Kriging, Inverse Distance Weighted (IDW) and Spline, were applied to estimate the soil erodibility factor distribution for the Maioli block using 181 points of data, with the remaining 40 points kept for validation. SPSS regression analysis was then used to compare the accuracy of the training and validation data for the three methods, so that the best method could be determined. In the future, this method can be used to predict soil erodibility factors in other areas.
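
    As a concrete illustration of the simplest of the three interpolators, the Python sketch below implements a plain inverse-distance-weighted (IDW) estimator for K at unsampled points; the coordinates, K values and power parameter are synthetic assumptions, not the Maioli measurements.

      # Minimal IDW sketch: estimate soil erodibility K at query points as a
      # distance-weighted mean of sampled K values (synthetic data only).
      import numpy as np

      def idw(sample_xy, sample_k, query_xy, power=2.0, eps=1e-12):
          d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=-1)
          w = 1.0 / (d + eps) ** power               # closer samples weigh more
          return (w * sample_k).sum(axis=1) / w.sum(axis=1)

      rng = np.random.default_rng(0)
      train_xy = rng.uniform(0, 1000, (181, 2))      # 181 training sample locations
      train_k = rng.uniform(0.2, 0.5, 181)           # their K values
      valid_xy = rng.uniform(0, 1000, (40, 2))       # 40 validation locations
      k_hat = idw(train_xy, train_k, valid_xy)       # estimated K at validation points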

  8. Derived virtual devices: a secure distributed file system mechanism

    NASA Technical Reports Server (NTRS)

    VanMeter, Rodney; Hotz, Steve; Finn, Gregory

    1996-01-01

    This paper presents the design of derived virtual devices (DVDs). DVDs are the mechanism used by the Netstation Project to provide secure shared access to network-attached peripherals distributed in an untrusted network environment. DVDs improve Input/Output efficiency by allowing user processes to perform I/O operations directly from devices without intermediate transfer through the controlling operating system kernel. The security enforced at the device through the DVD mechanism includes resource boundary checking, user authentication, and restricted operations, e.g., read-only access. To illustrate the application of DVDs, we present the interactions between a network-attached disk and a file system designed to exploit the DVD abstraction. We further discuss third-party transfer as a mechanism intended to provide for efficient data transfer in a typical NAP environment. We show how DVDs facilitate third-party transfer, and provide the security required in a more open network environment.
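
    A hedged Python sketch of the access checks the abstract attributes to a derived virtual device: a delegated block range, a credential check, and a read-only restriction, all enforced per request at the device. Field and method names are illustrative, not the Netstation interfaces.

      from dataclasses import dataclass

      @dataclass
      class DerivedVirtualDevice:
          owner_token: str      # credential established when the DVD is derived
          first_block: int      # resource boundary: start of delegated range
          num_blocks: int
          read_only: bool = True

          def check(self, token: str, op: str, block: int, count: int) -> bool:
              if token != self.owner_token:            # user authentication
                  return False
              if op == "write" and self.read_only:     # restricted operations
                  return False
              return (self.first_block <= block and    # resource boundary checking
                      block + count <= self.first_block + self.num_blocks)

      dvd = DerivedVirtualDevice("secret-cap", first_block=4096, num_blocks=1024)
      print(dvd.check("secret-cap", "read", 4200, 16))    # True
      print(dvd.check("secret-cap", "write", 4200, 16))   # False: read-only DVD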

  9. Virtuality and transverse momentum dependence of the pion distribution amplitude

    NASA Astrophysics Data System (ADS)

    Radyushkin, A. V.

    2016-03-01

    We describe basics of a new approach to transverse momentum dependence in hard exclusive processes. We develop it in application to the transition process γ*γ → π0 at the handbag level. Our starting point is coordinate representation for matrix elements of operators [in the simplest case, bilocal O(0, z)] describing a hadron with momentum p. Treated as functions of (pz) and z², they are parametrized through virtuality distribution amplitudes (VDA) Φ(x, σ), with x being Fourier conjugate to (pz) and σ Laplace conjugate to z². For intervals with z+ = 0, we introduce the transverse momentum distribution amplitude (TMDA) Ψ(x, k⊥), and write it in terms of VDA Φ(x, σ). The results of covariant calculations, written in terms of Φ(x, σ), are converted into expressions involving Ψ(x, k⊥). Starting with scalar toy models, we extend the analysis onto the case of spin-1/2 quarks and QCD. We propose simple models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BABAR and BELLE) data on the pion transition form factor. We also discuss how one can generate high-k⊥ tails from primordial soft distributions.

  10. Virtuality and transverse momentum dependence of the pion distribution amplitude

    DOE PAGESBeta

    Radyushkin, Anatoly V.

    2016-03-08

    We describe basics of a new approach to transverse momentum dependence in hard exclusive processes. We develop it in application to the transition process γ*γ → π0 at the handbag level. Our starting point is coordinate representation for matrix elements of operators (in the simplest case, bilocal O(0, z)) describing a hadron with momentum p. Treated as functions of (pz) and z², they are parametrized through virtuality distribution amplitudes (VDA) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z². For intervals with z+ = 0, we introduce the transverse momentum distribution amplitude (TMDA) Ψ(x, k⊥), and write it in terms of VDA Φ(x, σ). The results of covariant calculations, written in terms of Φ(x, σ), are converted into expressions involving Ψ(x, k⊥). Starting with scalar toy models, we extend the analysis onto the case of spin-1/2 quarks and QCD. We propose simple models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BaBar and BELLE) data on the pion transition form factor. Furthermore, we discuss how one can generate high-k⊥ tails from primordial soft distributions.

  11. Protective Role of False Tendon in Subjects with Left Bundle Branch Block: A Virtual Population Study

    PubMed Central

    Lange, Matthias; Di Marco, Luigi Yuri; Lekadir, Karim; Lassila, Toni; Frangi, Alejandro F.

    2016-01-01

    False tendons (FTs) are fibrous or fibromuscular bands that can be found in both the normal and abnormal human heart in various anatomical forms depending on their attachment points, tissue types, and geometrical properties. While FTs are widely considered to affect the function of the heart, their specific roles remain largely unclear and unexplored. In this paper, we present an in silico study of the ventricular activation time of the human heart in the presence of FTs. This study presents the first computational model of the human heart that includes a FT, Purkinje network, and papillary muscles. Based on this model, we perform simulations to investigate the effect of different types of FTs on hearts with the electrical conduction abnormality of a left bundle branch block (LBBB). We employ a virtual population of 70 human hearts derived from a statistical atlas, and run a total of 560 simulations to assess ventricular activation time with different FT configurations. The obtained results indicate that, in the presence of a LBBB, the FT reduces the total activation time that is abnormally augmented due to a branch block, to such an extent that surgical implant of cardiac resynchronisation devices might not be recommended by international guidelines. Specifically, the simulation results show that FTs reduce the QRS duration by at least 10 ms in 80% of hearts, and up to 45 ms for FTs connecting to the ventricular free wall, suggesting a significant reduction of cardiovascular mortality risk. In further simulation studies we show that the reduction in the QRS duration is more sensitive to the shape of the heart than the size of the heart or the exact location of the FT. Finally, the model suggests that FTs may contribute to reducing the activation time difference between the left and right ventricles from 12 ms to 4 ms. We conclude that FTs may provide an alternative conduction pathway that compensates for the propagation delay caused by the LBBB. Further investigation is

  12. Constructing and Analyzing Spectral Energy Distributions with the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Laurino, Omar; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.; Norris, P.

    2013-01-01

    Spectral energy distributions (SEDs) are a common and useful means of assessing the relative contributions of different emission processes occurring within an object. Iris, the Virtual Astronomical Observatory (VAO) SED tool, seamlessly combines key features of several existing astronomical software applications to streamline and enhance the SED analysis process. With Iris, users may build and display SEDs, browse data and metadata and apply filters to them, fit models to SEDs, and calculate confidence limits on best-fit parameters. SED data may be built from a number of sources using the SED Builder. Iris supports the Simple Application Messaging Protocol for interoperability with other Virtual Observatory applications, like the VAO Data Discovery tool, and can directly fetch SEDs from the NASA Extragalactic Database SED service. Particular attention has been paid to the integration of user spectrophotometric data from files in several different formats. File readers for custom formats can be provided at runtime, as well as custom models to fit the data, as template libraries for template fitting or arbitrary python functions. New functionalities can be added by installing plugins, i.e. third party components that are developed using the Iris Software Development Kit. The VAO was established as a partnership of the Associated Universities, Inc. and the Association of Universities for Research in Astronomy, Inc. Individual Iris components have also been supported by the National Aeronautics and Space Administration (NASA) through the Chandra X-ray Center, which is operated by the Smithsonian Astrophysical Observatory for and on behalf of the NASA contract NAS8-03060, and by the Space Telescope Science Institute, operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555. This research has made use of the NASA/IPAC Extragalactic Database which is operated by the Jet Propulsion Laboratory, California Institute of

  13. Guest Editor's introduction: Special issue on distributed virtual environments

    NASA Astrophysics Data System (ADS)

    Lea, Rodger

    1998-09-01

    Distributed virtual environments (DVEs) combine technology from 3D graphics, virtual reality and distributed systems to provide an interactive 3D scene that supports multiple participants. Each participant has a representation in the scene, often known as an avatar, and is free to navigate through the scene and interact with both the scene and other viewers of the scene. Changes to the scene, for example, position changes of one avatar as the associated viewer navigates through the scene, or changes to objects in the scene via manipulation, are propagated in real time to all viewers. This ensures that all viewers of a shared scene `see' the same representation of it, allowing sensible reasoning about the scene. Early work on such environments was restricted to their use in simulation, in particular in military simulation. However, over recent years a number of interesting and potentially far-reaching attempts have been made to exploit the technology for a range of other uses, including: Social spaces. Such spaces can be seen as logical extensions of the familiar text chat space. In 3D social spaces avatars, representing participants, can meet in shared 3D scenes and in addition to text chat can use visual cues and even in some cases spatial audio. Collaborative working. A number of recent projects have attempted to explore the use of DVEs to facilitate computer-supported collaborative working (CSCW), where the 3D space provides a context and work space for collaboration. Gaming. The shared 3D space is already familiar, albeit in a constrained manner, to the gaming community. DVEs are a logical superset of existing 3D games and can provide a rich framework for advanced gaming applications. e-commerce. The ability to navigate through a virtual shopping mall and to look at, and even interact with, 3D representations of articles has appealed to the e-commerce community as it searches for the best method of presenting merchandise to electronic consumers. The technology

  14. Approximating the imbibition and absorption behavior of a distribution of matrix blocks by an equivalent spherical block

    SciTech Connect

    Zimmerman, R.W.; Bodvarsson, G.S.

    1994-03-01

    A theoretical study is presented of the effect of matrix block shape and matrix block size distribution on liquid imbibition and solute absorption in a fractured rock mass. It is shown that the behavior of an individual irregularly-shaped matrix block can be modeled with reasonable accuracy by using the results for a spherical matrix block, if one uses an effective radius ã = 3V/A, where V is the volume of the block and A is its surface area. In the early-time regime of matrix imbibition, it is shown that a collection of blocks of different sizes can be modeled by a single equivalent block, with an equivalent radius of ⟨a⁻¹⟩⁻¹, where the average is taken on a volumetrically-weighted basis. In an intermediate time regime, it is shown for the case where the radii are normally distributed that the equivalent radius is reasonably well approximated by the mean radius ⟨a⟩. In the long-time limit, where no equivalent radius can be rigorously defined, an asymptotic expression is derived for the cumulative diffusion as a function of the mean and the standard deviation of the radius distribution function.
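
    A small Python sketch of the two effective-radius rules quoted above: ã = 3V/A for a single irregular block, and the volumetrically weighted ⟨a⁻¹⟩⁻¹ for a collection of spherical blocks in the early-time regime. The block sizes are illustrative only.

      import numpy as np

      def effective_radius(volume, area):
          """Sphere radius mimicking an irregular block: a = 3V/A."""
          return 3.0 * volume / area

      def early_time_equivalent_radius(radii, volumes):
          """Volumetrically weighted <a^-1>^-1 for a set of spherical blocks."""
          w = volumes / volumes.sum()
          return 1.0 / np.sum(w / radii)

      radii = np.array([0.5, 1.0, 2.0])            # block radii, m (illustrative)
      volumes = 4.0 / 3.0 * np.pi * radii ** 3     # spherical block volumes
      print(early_time_equivalent_radius(radii, volumes))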

  15. Distributed Computing Software Building-Blocks for Ubiquitous Computing Societies

    NASA Astrophysics Data System (ADS)

    Kim, K. H. (Kane)

    The steady approach of advanced nations toward realization of ubiquitous computing societies has given birth to rapidly growing demands for new-generation distributed computing (DC) applications. Consequently, economic and reliable construction of new-generation DC applications is currently a major issue faced by the software technology research community. What is needed is a new-generation DC software engineering technology which is at least multiple times more effective in constructing new-generation DC applications than the currently practiced technologies are. In particular, this author believes that a new-generation building-block (BB), which is much more advanced than the current-generation DC object that is a small extension of the object model embedded in languages C++, Java, and C#, is needed. Such a BB should enable systematic and economic construction of DC applications that are capable of taking critical actions with 100-microsecond-level or even 10-microsecond-level timing accuracy, fault tolerance, and security enforcement while being easily expandable and taking advantage of all sorts of network connectivity. Some directions considered worth pursuing for finding such BBs are discussed.

  16. Deeply virtual Compton scattering and generalized parton distributions at CLAS

    SciTech Connect

    Niccolai, Silvia

    2008-11-01

    The exclusive electroproduction of real photons and mesons at high momentum transfer allows us to access the Generalized Parton Distributions (GPDs). The formalism of the GPDs provides a unified description of the hadronic structure in terms of quark and gluonic degrees of freedom. In particular, Deeply Virtual Compton Scattering (DVCS), ep → e′p′γ, is one of the key reactions to determine the GPDs experimentally, as it is the simplest process that can be described in terms of GPDs. A dedicated experiment to study DVCS has been carried out in Hall B at Jefferson Lab. Beam-spin asymmetries, resulting from the interference of the Bethe-Heitler process and DVCS, have been extracted over the widest kinematic range ever accessed for this reaction (1.2 < Q² < 3.7 (GeV/c)², 0.09 < -t < 1.3 (GeV/c)², 0.13 < x_B < 0.46). In this paper, the results obtained experimentally are shown and compared to GPD parametrizations.

  17. Distributed Data Mining System with Gateway for Virtual Observatories

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.

    2007-05-01

    Progress in space physics has always been strongly dependent on analysis of in situ spacecraft measurements. However, the vast majority of spacecraft data go unexplored, and with upcoming multi-spacecraft NASA missions (THEMIS, MMS, etc.) the growing size of data promises to outpace the ability of scientists to analyze them. There are several NASA funded initiatives, such as VSO, CoSEC, VHO, and VSPO, that use the Internet to develop a software environment for searching, obtaining and analyzing data from archives of data distributed at many sites around the world. A natural extension of the function of such portals is to provide sophisticated data mining capabilities. Accordingly we have combined the latest advances in the fields of distributed computing and data mining to develop a unique tool that serves as the "computational" engine for Virtual Observatories (VOs). This tool extends the capability of VOs from data portal to a science analysis center. As an initial application of this software, we have applied the algorithms to the study of flux transfer events. Results from analysis of CLUSTER data will be presented. Our customized data mining software can work as a stand alone or be integrated into existing and future space physics data assimilation infrastructures (e.g., VSPO, VHO). Finally, we note that the San Diego Supercomputer Center (SDSC) has agreed to host data as well as our software on one of their clusters and make it available at no cost to the scientific community. This will enable access to their CPU farm and will be particularly valuable to promote usage of our software.

  18. Virtual time and time warp on the JPL hypercube. [operating system implementation for distributed simulation

    NASA Technical Reports Server (NTRS)

    Jefferson, David; Beckman, Brian

    1986-01-01

    This paper describes the concept of virtual time and its implementation in the Time Warp Operating System at the Jet Propulsion Laboratory. Virtual time is a distributed synchronization paradigm that is appropriate for distributed simulation, database concurrency control, real time systems, and coordination of replicated processes. The Time Warp Operating System is targeted toward the distributed simulation application and runs on a 32-node JPL Mark II Hypercube.
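
    The rollback idea behind virtual time can be sketched in a few lines of Python; this is a toy, single-process illustration under assumed simplifications (no anti-messages, no re-execution of rolled-back events), not the JPL Time Warp Operating System itself.

      import copy

      class TimeWarpProcess:
          """Execute events optimistically in virtual time; roll back to a
          saved state when a message arrives with a timestamp in the past."""

          def __init__(self, state):
              self.lvt = 0                               # local virtual time
              self.state = state
              self.log = [(0, copy.deepcopy(state))]     # (virtual time, state)

          def handle(self, timestamp, update):
              if timestamp < self.lvt:                   # straggler message
                  while len(self.log) > 1 and self.log[-1][0] >= timestamp:
                      self.log.pop()
                  self.lvt, saved = self.log[-1]
                  self.state = copy.deepcopy(saved)      # restore earlier state
              update(self.state)
              self.lvt = timestamp
              self.log.append((timestamp, copy.deepcopy(self.state)))

      p = TimeWarpProcess({"count": 0})
      p.handle(10, lambda s: s.update(count=s["count"] + 1))
      p.handle(20, lambda s: s.update(count=s["count"] + 1))
      p.handle(15, lambda s: s.update(count=s["count"] + 1))  # rolls back to t=10
      print(p.lvt, p.state)                                   # 15 {'count': 2}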

  19. Guest Editor's introduction: Special issue on distributed virtual environments

    NASA Astrophysics Data System (ADS)

    Lea, Rodger

    1998-09-01

    Distributed virtual environments (DVEs) combine technology from 3D graphics, virtual reality and distributed systems to provide an interactive 3D scene that supports multiple participants. Each participant has a representation in the scene, often known as an avatar, and is free to navigate through the scene and interact with both the scene and other viewers of the scene. Changes to the scene, for example, position changes of one avatar as the associated viewer navigates through the scene, or changes to objects in the scene via manipulation, are propagated in real time to all viewers. This ensures that all viewers of a shared scene `see' the same representation of it, allowing sensible reasoning about the scene. Early work on such environments was restricted to their use in simulation, in particular in military simulation. However, over recent years a number of interesting and potentially far-reaching attempts have been made to exploit the technology for a range of other uses, including: Social spaces. Such spaces can be seen as logical extensions of the familiar text chat space. In 3D social spaces avatars, representing participants, can meet in shared 3D scenes and in addition to text chat can use visual cues and even in some cases spatial audio. Collaborative working. A number of recent projects have attempted to explore the use of DVEs to facilitate computer-supported collaborative working (CSCW), where the 3D space provides a context and work space for collaboration. Gaming. The shared 3D space is already familiar, albeit in a constrained manner, to the gaming community. DVEs are a logical superset of existing 3D games and can provide a rich framework for advanced gaming applications. e-commerce. The ability to navigate through a virtual shopping mall and to look at, and even interact with, 3D representations of articles has appealed to the e-commerce community as it searches for the best method of presenting merchandise to electronic consumers. The technology

  20. Collaborative Virtual Environments as Means to Increase the Level of Intersubjectivity in a Distributed Cognition System

    ERIC Educational Resources Information Center

    Ligorio, M. Beatrice; Cesareni, Donatella; Schwartz, Neil

    2008-01-01

    Virtual environments are able to extend the space of interaction beyond the classroom. In order to analyze how distributed cognition functions in such an extended space, we suggest focusing on the architecture of intersubjectivity. The Euroland project--a virtual land created and populated by seven classrooms supported by a team of…

  1. Experiential Virtual Scenarios With Real-Time Monitoring (Interreality) for the Management of Psychological Stress: A Block Randomized Controlled Trial

    PubMed Central

    Pallavicini, Federica; Morganti, Luca; Serino, Silvia; Scaratti, Chiara; Briguglio, Marilena; Crifaci, Giulia; Vetrano, Noemi; Giulintano, Annunziata; Bernava, Giuseppe; Tartarisco, Gennaro; Pioggia, Giovanni; Raspelli, Simona; Cipresso, Pietro; Vigna, Cinzia; Grassi, Alessandra; Baruffi, Margherita; Wiederhold, Brenda; Riva, Giuseppe

    2014-01-01

    Background The recent convergence between technology and medicine is offering innovative methods and tools for behavioral health care. Among these, an emerging approach is the use of virtual reality (VR) within exposure-based protocols for anxiety disorders, and in particular posttraumatic stress disorder. However, no systematically tested VR protocols are available for the management of psychological stress. Objective Our goal was to evaluate the efficacy of a new technological paradigm, Interreality, for the management and prevention of psychological stress. The main feature of Interreality is a twofold link between the virtual and the real world achieved through experiential virtual scenarios (fully controlled by the therapist, used to learn coping skills and improve self-efficacy) with real-time monitoring and support (identifying critical situations and assessing clinical change) using advanced technologies (virtual worlds, wearable biosensors, and smartphones). Methods The study was designed as a block randomized controlled trial involving 121 participants recruited from two different worker populations—teachers and nurses—that are highly exposed to psychological stress. Participants were a sample of teachers recruited in Milan (Block 1: n=61) and a sample of nurses recruited in Messina, Italy (Block 2: n=60). Participants within each block were randomly assigned to the (1) Experimental Group (EG): n=40; B1=20, B2=20, which received a 5-week treatment based on the Interreality paradigm; (2) Control Group (CG): n=42; B1=22, B2=20, which received a 5-week traditional stress management training based on cognitive behavioral therapy (CBT); and (3) the Wait-List group (WL): n=39, B1=19, B2=20, which was reassessed and compared with the two other groups 5 weeks after the initial evaluation. Results Although both treatments were able to significantly reduce perceived stress better than WL, only EG participants reported a significant reduction (EG=12% vs CG=0

  2. Live Virtual Constructive Distributed Test Environment Characterization Report

    NASA Technical Reports Server (NTRS)

    Murphy, Jim; Kim, Sam K.

    2013-01-01

    This report documents message latencies observed over various Live, Virtual, Constructive, (LVC) simulation environment configurations designed to emulate possible system architectures for the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project integrated tests. For each configuration, four scenarios with progressively increasing air traffic loads were used to determine system throughput and bandwidth impacts on message latency.

  3. Identifying IP Blocks with Spamming Bots by Spatial Distribution

    NASA Astrophysics Data System (ADS)

    Yun, Sangki; Kim, Byungseung; Bahk, Saewoong; Kim, Hyogon

    In this letter, we develop a behavioral metric with which spamming botnets can be quickly identified with respect to their residing IP blocks. Our method aims at line-speed operation without deep inspection, so only TCP/IP header fields of the passing packets are examined. However, the proposed metric yields high-quality receiver operating characteristics (ROC), with high detection rates and low false positive rates.

  4. Distributed data mining in the National Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Borne, Kirk D.

    2003-03-01

    The astronomy research community is about to become the beneficiary of huge multi-terabyte databases from a host of sky surveys. The rich and diverse information content within this "virtual sky" and the array of results to be derived therefrom will far exceed the current capacity of data search and research tools. The new digital surveys have the potential of facilitating a wide range of scientific discoveries about the Universe! To enable this to happen, the astronomical community is embarking on an ambitious endeavor, the creation of a National Virtual Observatory (NVO). This will in fact develop into a Global Virtual Observatory. To facilitate the new type of science enabled by the NVO, new techniques in data mining and knowledge discovery in large databases must be developed and deployed, and the next generation of astronomers must be trained in these techniques. This activity will benefit greatly from developments in the fields of information technology, computer science, and statistics. Aspects of the NVO initiative, including sample science user scenarios and user requirements will be presented. The value of scientific data mining and some early test case results will be discussed in the context of the speaker's research interests in colliding and merging galaxies.

  5. Enabling distributed simulation multilevel security using virtual machine and virtual private network technology

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    2007-04-01

    Increasing the accuracy of the portrayal of all of the elements of a simulation environment has long been a prime goal of the modeling and simulation community; a goal that has remained far out of reach for many reasons. One of the greatest hurdles facing simulation developers in the effort to increase simulation accuracy is the need to segregate information across the entire simulation environment according to access restrictions in order to insure the integrity, security, and reliability requirements imposed on the data. However, this need for segregation does not mean that those with the highest access permissions should be forced to use multiple computers and displays to integrate the information that they need or that intelligent agents should be restricted in their access to the information that they need in order to adequately assist their human operators. In this paper, we present a potential solution to the problem of integrating and segregating data, which is the use of virtual machine and virtual private network technology in order to maintain segregation of data, control access, and control intercommunication.

  6. Block distributions on the lunar surface: A comparison between measurements obtained from surface and orbital photography

    NASA Technical Reports Server (NTRS)

    Cintala, Mark J.; Mcbride, Kathleen M.

    1995-01-01

    Among the hazards that must be negotiated by lunar-landing spacecraft are blocks on the surface of the Moon. Unfortunately, few data exist that can be used to evaluate the threat posed by such blocks to landing spacecraft. Perhaps the best information is that obtained from Surveyor photographs, but those data do not extend to the dimensions of the large blocks that would pose the greatest hazards. Block distributions in the vicinities of the Surveyor 1, 3, 6, and 7 sites have been determined from Lunar Orbiter photography and are presented here. Only large (i.e., greater than or equal to 2.5 m) blocks are measurable in these pictures, resulting in a size gap between the Surveyor and Lunar Orbiter distributions. Nevertheless, the orbital data are self-consistent, a claim supported by the similarity in behavior between the subsets of data from the Surveyor 1, 3, and 6 sites and by the good agreement in position (if not slopes) between the data obtained from the Surveyor 3 photography and those derived from the Lunar Orbiter photographs. Confidence in the results is also justified by the well-behaved distribution of large blocks at the Surveyor site. Comparisons between the Surveyor distributions and those derived from the orbital photography permit these observations: (1) in all cases but that for Surveyor 3, the density of large blocks is overestimated by extrapolation of the Surveyor-derived trends; (2) the slopes of the Surveyor-derived distributions are consistently lower than those determined for the large blocks; and (3) these apparent disagreements could be mitigated if the overall shapes of the cumulative lunar block populations were nonlinear, allowing for different slopes over different size intervals. The relatively large gaps between the Surveyor-derived and Orbiter-derived data sets, however, do not permit a determination of those shapes.

  7. The Impact of Virtual Collaboration and Collaboration Technologies on Knowledge Transfer and Team Performance in Distributed Organizations

    ERIC Educational Resources Information Center

    Ngoma, Ngoma Sylvestre

    2013-01-01

    Virtual teams are increasingly viewed as a powerful determinant of competitive advantage in geographically distributed organizations. This study was designed to provide insights into the interdependencies between virtual collaboration, collaboration technologies, knowledge transfer, and virtual team performance in an effort to understand whether…

  8. Building Blocks for Distributed Information Systems in Hospitals

    PubMed Central

    Rutt, Thomas E.

    1987-01-01

    In order to provide a consistent view of work functions which can be supported by successive releases of hospital information systems, a hospital business model was developed. The business model describes potential system functions along with their data access needs. This model was used to determine an optimal set of mutually exclusive logical databases. These logical databases are seeds around which design of distributed systems should be based. The logical databases derived by this work can be used as a starting point for defining application protocol standards for hospital system interface transactions.

  9. Framework for bringing realistic virtual natural environments to distributed simulations

    NASA Astrophysics Data System (ADS)

    Whitney, David A.; Reynolds, Robert A.; Olson, Stephen H.; Sherer, Dana Z.; Driscoll, Mavis L.; Watman, K. L.

    1997-06-01

    One of the major new technical challenges for distributed simulations is the distribution and presentation of the natural atmosphere-ocean-space environment. The natural terrain environment has been a part of such simulations for a while, but the integration of atmosphere and ocean data and effects is quite new. The DARPA synthetic environments (SE) program has been developing and demonstrating advanced technologies for providing tactically significant atmosphere-ocean data and effects for a range of simulations. A general-purpose data collection, assimilation, management, and distribution system is being developed by the TAOS (Total Atmosphere-Ocean System) Project. This system is designed to support the new high level architecture (HLA)/run-time infrastructure (RTI) being developed by the Defense Modeling and Simulation Office (DMSO), as well as existing distributed interactive simulation (DIS) network protocols. This paper describes how synthetic natural environments are being integrated by TAOS to provide an increasingly rich dynamic synthetic natural environment. Architectural designs and implementations to accommodate a range of simulation applications are discussed. A number of enabling technologies are employed, such as the development of standards for gridded data distribution, and the inclusion of derived products and local environmental features within 4-dimensional data grids. The application of TAOS for training, analysis, and engineering simulations for sensor analysis is discussed.

  10. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size. PMID:26138574
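
    ChemScreener itself is a Java application; purely to illustrate the scaffold-extraction step described above, the Python sketch below pulls Bemis-Murcko scaffolds from a couple of hypothetical SMILES strings using the open-source RDKit toolkit (an assumed substitute, not ChemScreener's own code).

      from rdkit import Chem
      from rdkit.Chem.Scaffolds import MurckoScaffold

      smiles = ["CC(=O)Oc1ccccc1C(=O)O",        # aspirin (hypothetical input)
                "CCN(CC)C(=O)c1ccccc1"]         # N,N-diethylbenzamide
      scaffolds = set()
      for smi in smiles:
          mol = Chem.MolFromSmiles(smi)
          if mol is not None:
              core = MurckoScaffold.GetScaffoldForMol(mol)   # Bemis-Murcko core
              scaffolds.add(Chem.MolToSmiles(core))
      print(scaffolds)                           # distinct scaffold SMILES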

  11. Implementation and evaluation of a virtual learning center for distributed education.

    PubMed Central

    Caton, K. A.; Hersh, W.; Williams, J. B.

    1999-01-01

    A number of tools are required to support a distributed education program. This paper will relate experiences in the development and implementation of a web-based Virtual Learning Center. Initial evaluation offers direction for further development, necessary university support, and faculty and student preparation. PMID:10566407

  12. Integrating a distributed, agile, virtual enterprise in the TEAM program

    NASA Astrophysics Data System (ADS)

    Cobb, C. K.; Gray, W. Harvey; Hewgley, Robert E.; Klages, Edward J.; Neal, Richard E.

    1997-01-01

    The Technologies Enabling Agile Manufacturing (TEAM) program enhances industrial capability by advancing and deploying manufacturing technologies that promote agility. TEAM has developed a product realization process that features the integration of product design and manufacturing groups. TEAM uses the tools it collects, develops, and integrates in support of the product realization process to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, forming, and electromechanical assembly. In order to provide a proof-of-principle, the material removal process has been addressed first and has been successfully demonstrated in an 'interconnected' mode. An internet-accessible intersite file manager (IFM) application has been deployed to allow geographically distributed TEAM participants to share and distribute information as the product realization process is executed. An automated inspection planning application has been demonstrated, importing a solid model from the IFM, generating an inspection plan and a part program to be used in the inspection process, and then distributing the part program to the inspection site via the IFM. TEAM seeks to demonstrate the material removal process in an integrated mode in June 1997, complete with an object-oriented framework and infrastructure. The current status and future plans for this project are presented here.

  13. Spatial distribution of ice blocks on Enceladus and implications for their origin and emplacement

    NASA Astrophysics Data System (ADS)

    Martens, Hilary R.; Ingersoll, Andrew P.; Ewald, Shawn P.; Helfenstein, Paul; Giese, Bernd

    2015-01-01

    We have mapped the locations of over 100,000 ice blocks across the south polar region of Saturn's moon Enceladus, thus generating the first quantitative estimates of ice-block number density distribution in relation to major geological features. Ice blocks were manually identified and mapped from twenty of the highest resolution (4-25 m per pixel) Cassini Imaging Science Subsystem (ISS) narrow-angle images using ArcGIS software. The 10-100 m-diameter positive-relief features are marginally visible at the resolution of the images, making ice-block identifications difficult but not impossible. Our preliminary results reveal that ice blocks in the southern hemisphere are systematically most concentrated within the geologically active South Polar Terrain (SPT) and exhibit peak concentrations within 20 km of the tiger-stripe fractures as well as close to the south pole. We find that ice blocks are concentrated just as heavily between tiger-stripe fractures as on the directly adjacent margins; although significant local fluctuations in ice-block number density do occur, we observe no clear pattern with respect to the tiger stripes or jet sources. We examine possible roles of several mechanisms for ice-block origin, emplacement, and evolution: impact cratering, ejection from fissures during cryovolcanic eruptions, tectonic disruption of lithospheric ice, mass wasting, seismic disturbance, and vapor condensation around icy fumaroles. We conclude that impact cratering as well as mass wasting, perhaps triggered by seismic events, cannot account for a majority of ice-block features within the inner SPT. The pervasiveness of fracturing at many size scales, the ubiquity of ice blocks in the inner SPT, as well as the occurrence of linear block arrangements that parallel through-cutting crack networks along the flanks of tiger stripes indicate that tectonic deformation is an important source of blocky-ice features in the SPT. Ejection during catastrophic cryovolcanic eruptions

  14. Drift-insensitive distributed calibration of probe microscope scanner in nanometer range: Virtual mode

    NASA Astrophysics Data System (ADS)

    Lapshin, Rostislav V.

    2016-08-01

    A method of distributed calibration of a probe microscope scanner is suggested. The main idea consists in a search for a net of local calibration coefficients (LCCs) in the process of automatic measurement of a standard surface, whereby each point of the movement space of the scanner can be characterized by a unique set of scale factors. Feature-oriented scanning (FOS) methodology is used as a basis for implementation of the distributed calibration, permitting the in situ exclusion of the negative influence of thermal drift, creep and hysteresis on the obtained results. Possessing the calibration database enables correcting in one procedure all the spatial systematic distortions caused by nonlinearity, nonorthogonality and spurious crosstalk couplings of the microscope scanner piezomanipulators. To provide high precision of spatial measurements in the nanometer range, the calibration is carried out using natural standards - constants of the crystal lattice. One of the useful modes of the developed calibration method is a virtual mode. In the virtual mode, instead of measurement of a real surface of the standard, the calibration program makes a surface image "measurement" of the standard, which was obtained earlier using conventional raster scanning. The application of the virtual mode permits simulation of the calibration process and detailed analysis of raster distortions occurring in both conventional and counter surface scanning. Moreover, the mode allows estimation of the thermal drift and creep velocities acting during surface scanning. Virtual calibration makes possible automatic characterization of a surface by the method of scanning probe microscopy (SPM).

  15. Block distributions on the lunar surface: A comparison between measurements obtained from surface and orbital photography

    NASA Technical Reports Server (NTRS)

    Cintala, Mark J.; Mcbride, Kathleen M.

    1994-01-01

    Enlargements of Lunar-Orbiter photography were used in conjunction with a digitizing tablet to collect the locations and dimensions of blocks surrounding the Surveyor 1, 3, 6, and 7 landing sites. Data were reduced to the location and the major axis of the visible portion of each block. Shadows sometimes made it difficult to assess whether the visible major axis corresponded with the actual principal dimension. These data were then correlated with the locations of major craters in the study areas, thus subdividing the data set into blocks obviously associated with craters and those in intercrater areas. A block was arbitrarily defined to be associated with a crater when its location was within 1.1 crater radii of the crater's center. Since this study was commissioned for the ultimate purpose of determining hazards to landing spacecraft, such a definition was deemed appropriate in defining block-related hazards associated with craters. Size distributions of smaller fragments as determined from Surveyor photography were obtained as measurements from graphical data. Basic comparisons were performed through use of cumulative frequency distributions identical to those applied to studies of crater-count data.

  16. Design of a software framework to support live/virtual training on distributed terrain

    NASA Astrophysics Data System (ADS)

    Schiavone, Guy A.; Tracy, Judd; Woodruff, Eric; Dere, Troy

    2003-09-01

    In this paper we describe research and development on the concept and application of distributed terrain and distributed terrain servers to support live/virtual training operations. This includes design of a distributed, cluster-capable "Combat Server" for the virtual representation and simulation of live training exercises, and current work to support virtual representation and visualization of live indoor operations involving firefighters, SWAT teams and/or special operations forces. The Combat Server concept under development is an object-oriented, efficient and flexible distributed platform designed for simulation and training. It can operate on any compatible, high performance computer for which the software is compliant; however, it is explicitly designed for distribution and cooperation of relatively inexpensive clustered computers, together playing the role of a large independent system. The design of the Combat Server aims to be generic and encompass any situation that involves monitoring, tracking, assessment, visualization and, eventually, simulated interactivity to complement real-world training exercises. To accomplish such genericity, the design must incorporate techniques such as layering or abstraction to remove any dependencies on specific hardware, such as weapons, that are to eventually be employed by the system; this also includes entity tracking hardware interfaces, whether by GPS or Ultra-Wide Band technologies. The Combat Server is a framework. Its design is a foothold for building a specialized distributed system for modeling a particular style of exercise. The Combat Server can also be a software development framework, providing a platform for building specialized exercises while abstracting the developer from the minutia of building a real-time distributed system. In this paper we review preliminary experiments regarding basic line-of-sight (LOS) functions of the Combat Server, and its functionality and scalability in a cluster computing
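
    A minimal Python sketch of the kind of grid-based line-of-sight test such experiments exercise, assuming a simple 2-D heightfield terrain; the terrain, eye heights and sampling density are illustrative assumptions, not part of the Combat Server design.

      import numpy as np

      def line_of_sight(heights, a, b, samples=200):
          """a, b = (row, col, eye height); heights = 2-D terrain array."""
          (r0, c0, h0), (r1, c1, h1) = a, b
          t = np.linspace(0.0, 1.0, samples)
          rows = np.clip((r0 + t * (r1 - r0)).astype(int), 0, heights.shape[0] - 1)
          cols = np.clip((c0 + t * (c1 - c0)).astype(int), 0, heights.shape[1] - 1)
          start = heights[r0, c0] + h0
          end = heights[r1, c1] + h1
          ray = start + t * (end - start)             # straight sight line
          return bool(np.all(heights[rows, cols] <= ray))

      terrain = np.random.default_rng(1).uniform(0, 50, (256, 256))
      print(line_of_sight(terrain, (10, 10, 1.8), (200, 180, 1.8)))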

  17. Numerical Simulation of Current Distribution in Cathode Carbon Block of an Aluminum Reduction Cell

    NASA Astrophysics Data System (ADS)

    Tao, Wenju; Li, Tuofu; Wang, Zhaowen; Gao, Bingliang; Shi, Zhongning; Hu, Xianwei; Cui, Jianzhong

    2015-11-01

    Cathode carbon block wear is the main limiting factor for the lifetime of aluminum reduction cells. The wear rate is enhanced by current density. In this article, the current distribution at the surface of the carbon block was calculated using a thermoelectric coupled model. Then the effects of the effective length (l_e), the height of the cathode carbon block (h_c), and the width and height of the collector (w_b and h_b) on the current distribution were investigated. The results show that l_e has a great effect on the current distribution. As l_e decreases, the maximum current density increases rapidly and shifts toward the cell center. When l_e decreases from 1.67 m to 1.51 m, the maximum current density increases by 57.9%. Moreover, the maximum current density is reduced by increasing h_c or h_b × w_b. For h_b × w_b = 180 mm × 180 mm, the maximum current density is reduced by 27.8%. However, increasing h_c or h_b × w_b will decrease the temperature in the cathode carbon block. The results of this study may provide a database for the optimization of cell operation and design.

  18. Responses of European precipitation distributions and regimes to different blocking locations

    NASA Astrophysics Data System (ADS)

    Sousa, Pedro M.; Trigo, Ricardo M.; Barriopedro, David; Soares, Pedro M. M.; Ramos, Alexandre M.; Liberato, Margarida L. R.

    2016-04-01

    In this work we performed an analysis on the impacts of blocking episodes on seasonal and annual European precipitation and the associated physical mechanisms. Distinct domains were considered in detail taking into account different blocking center positions spanning between the Atlantic and western Russia. Significant positive precipitation anomalies are found for southernmost areas while generalized negative anomalies (up to 75 % in some areas) occur in large areas of central and northern Europe. This dipole of anomalies is reversed when compared to that observed during episodes of strong zonal flow conditions. We illustrate that the location of the maximum precipitation anomalies follows quite well the longitudinal positioning of the blocking centers and discuss regional and seasonal differences in the precipitation responses. To better understand the precipitation anomalies, we explore the blocking influence on cyclonic activity. The results indicate a split of the storm-tracks north and south of blocking systems, leading to an almost complete reduction of cyclonic centers in northern and central Europe and increases in southern areas, where cyclone frequency doubles during blocking episodes. However, the underlying processes conducive to the precipitation anomalies are distinct between northern and southern European regions, with a significant role of atmospheric instability in southern Europe, and moisture availability as the major driver at higher latitudes. This distinctive underlying process is coherent with the characteristic patterns of latent heat release from the ocean associated with blocked and strong zonal flow patterns. We also analyzed changes in the full range of the precipitation distribution of several regional sectors during blocked and zonal days. Results show that precipitation reductions in the areas under direct blocking influence are driven by a substantial drop in the frequency of moderate rainfall classes. Contrarily, southwards of

  19. Constraints on the H˜ generalized parton distribution from deep virtual Compton scattering measured at HERMES

    NASA Astrophysics Data System (ADS)

    Guidal, M.

    2010-09-01

    We have analyzed the longitudinally polarized proton target asymmetry data of the Deep Virtual Compton process recently published by the HERMES Collaboration in terms of Generalized Parton Distributions. We have fitted these new data in a largely model-independent fashion and the procedure results in numerical constraints on the H̃_Im Compton Form Factor. We present its t- and ξ-dependencies. We also find improvement on the determination of two other Compton Form Factors, H_Re and H_Im.

  20. Misattribution in Virtual Groups: The Effects of Member Distribution on Self-Serving Bias and Partner Blame

    ERIC Educational Resources Information Center

    Walther, Joseph B.; Bazarova, Natalya N.

    2007-01-01

    Interest in virtual groups has focused on attribution biases due to the collocation or distribution of partners. No previous research examines self-attributions in virtual groups, yet self-attributions--the acknowledgment of personal responsibility or its deflection--potentially determines learning and improvement. This study reviews research on…

  1. Solar System Modeler: A Distributed, Virtual Environment for Space Visualization and GPS Navigation

    NASA Astrophysics Data System (ADS)

    Williams, Gary E.

    1996-12-01

    The Solar System Modeler (SM) extends the Space Modeler developed in 1994. It provides a virtual environment enabling an explorer to dynamically investigate near Earth satellites, deep space probes, planets, moons, and other celestial phenomena. The explorer navigates the virtual environment via mouse selected options from menu panels while wearing a tracked, head mounted display (HMD). Alternatively, a monitor may replace the HMD and keyboard controls replace head tracking. The SM's functionality is extended by the ability to broadcast simulated GPS satellite transmissions in compliance with Distributed Interactive Simulation (DIS) protocol standards. The transmissions include information found in true GPS broadcasts that is required for a receiver to determine its location. The Virtual GPS Receiver (VGPSR) receives the GPS transmissions from the SM and computes the receiver's position with a realistic error based on numerous variables simulating those encountered in the real GPS system. The VGPSR is designed as a plug-in module for simulations requiring virtual navigation. The receiver's client application provides the VGPSR with the simulation time and the true position of the receiver. In return, the application receives a GPS indicated position.
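
    As a hedged illustration of what the Virtual GPS Receiver must do with the simulated broadcasts, the Python sketch below recovers a receiver position and clock bias from satellite positions and pseudoranges by iterative least squares; the satellite geometry and numbers are invented, and no DIS or GPS message parsing is shown.

      import numpy as np

      def solve_position(sat_pos, pseudoranges, iters=10):
          x = np.zeros(4)                             # (x, y, z, clock bias in m)
          for _ in range(iters):                      # Gauss-Newton iterations
              rho = np.linalg.norm(sat_pos - x[:3], axis=1)
              residual = pseudoranges - (rho + x[3])
              H = np.hstack([-(sat_pos - x[:3]) / rho[:, None],
                             np.ones((len(rho), 1))])
              x += np.linalg.lstsq(H, residual, rcond=None)[0]
          return x[:3], x[3]

      sats = np.array([[15e6, 10e6, 21e6], [-12e6, 18e6, 17e6],
                       [20e6, -5e6, 18e6], [1e6, 2e6, 26e6]])    # made-up positions
      truth = np.array([1.2e6, -0.8e6, 6.0e6])
      pr = np.linalg.norm(sats - truth, axis=1) + 300.0           # 300 m clock bias
      pos, bias = solve_position(sats, pr)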

  2. Counting Blocks or Keyboards? A Comparative Analysis of Concrete versus Virtual Manipulatives in Elementary School Mathematics Concepts

    ERIC Educational Resources Information Center

    Brown, Sonya E.

    2007-01-01

    This study was designed to investigate the impact of using computer-simulated (virtual) manipulatives and hands-on (concrete) manipulatives on elementary students' learning skills and concepts in equivalent fractions. The researcher's primary interest was whether or not students who used virtual manipulatives would out-perform students who used…

  3. Numerical Convergence of the Block-Maxima Approach to the Generalized Extreme Value Distribution

    NASA Astrophysics Data System (ADS)

    Faranda, Davide; Lucarini, Valerio; Turchetti, Giorgio; Vaienti, Sandro

    2011-12-01

    In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems. In this setting, recent works have shown how to get a statistics of extremes in agreement with the classical Extreme Value Theory. We pursue these investigations by giving analytical expressions of Extreme Value distribution parameters for maps that have an absolutely continuous invariant measure. We compare these analytical results with numerical experiments in which we study the convergence to limiting distributions using the so-called block-maxima approach, pointing out in which cases we obtain robust estimation of parameters. In regular maps for which mixing properties do not hold, we show that the fitting procedure to the classical Extreme Value Distribution fails, as expected. However, we obtain an empirical distribution that can be explained starting from a different observable function for which Nicolis et al. (Phys. Rev. Lett. 97(21): 210602, 2006) have found analytical results.
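
    As a minimal illustration of the block-maxima procedure itself (not of the dynamical-systems analysis in the paper), the Python sketch below splits a long series into blocks, takes each block's maximum, and fits a Generalized Extreme Value distribution with SciPy; the series here is plain i.i.d. noise rather than a map trajectory.

      import numpy as np
      from scipy.stats import genextreme

      series = np.random.default_rng(2).exponential(size=100_000)
      block = 1_000
      maxima = series[: len(series) // block * block].reshape(-1, block).max(axis=1)

      shape, loc, scale = genextreme.fit(maxima)   # note: SciPy's shape c equals -xi
      print(shape, loc, scale)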

  4. Rapid prototyping, astronaut training, and experiment control and supervision: distributed virtual worlds for COLUMBUS, the European Space Laboratory module

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Juergen

    2002-02-01

    In 2004, the European COLUMBUS Module is to be attached to the International Space Station. On the way to the successful planning, deployment and operation of the module, computer generated and animated models are being used to optimize performance. Under contract of the German Space Agency DLR, it has become IRF's task to provide a Projective Virtual Reality System: a virtual world, built after the planned layout of the COLUMBUS module, that lets astronauts and experimenters practice operational procedures and the handling of experiments. The key features of the system currently being realized comprise the possibility for distributed multi-user access to the virtual lab and the visualization of real-world experiment data. Through the capabilities to share the virtual world, cooperative operations can be practiced easily, but also trainers and trainees can work together more effectively sharing the virtual environment. The capability to visualize real-world data will be used to introduce measured data of experiments into the virtual world online in order to realistically interact with the science-reference model hardware: the user's actions in the virtual world are translated into corresponding changes of the inputs of the science reference model hardware; the measured data is then in turn fed back into the virtual world. During the operation of COLUMBUS, the capabilities for distributed access and the capabilities to visualize measured data through the use of metaphors and augmentations of the virtual world may be used to provide virtual access to the COLUMBUS module, e.g. via Internet. Currently, finishing touches are being put to the system. In November 2001 the virtual world shall be operational, so that besides the design and the key ideas, first experimental results can be presented.

  5. Iris: Constructing and Analyzing Spectral Energy Distributions with the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Laurino, O.; Budynkiewicz, J.; Busko, I.; Cresitello-Dittmar, M.; D'Abrusco, R.; Doe, S.; Evans, J.; Pevunova, O.

    2014-05-01

    We present Iris 2.0, the latest release of the Virtual Astronomical Observatory application for building and analyzing Spectral Energy Distributions (SEDs). With Iris, users may read in and display SEDs, inspect and edit any selection of SED data, fit models to SEDs in arbitrary spectral ranges, and calculate confidence limits on best-fit parameters. SED data may be loaded into the application from VOTable and FITS files compliant with the International Virtual Observatory Alliance interoperable data models, or retrieved directly from NED or the Italian Space Agency Science Data Center; data in non-standard formats may also be converted within the application. Users may seamlessly exchange data between Iris and other Virtual Observatory tools using the Simple Application Messaging Protocol. Iris 2.0 also provides a tool for redshifting, interpolating, and measuring integrated fluxes, and allows simple aperture corrections for individual points and SED segments. Custom Python functions, template models and template libraries may be imported into Iris for fitting SEDs. Iris may be extended through Java plugins; users can install third-party packages, or develop their own plugin using Iris' Software Development Kit. Iris 2.0 is available for Linux and Mac OS X systems.
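
    Iris itself is a Java application; purely as a hedged illustration of the kind of VOTable-based data access it builds on, the sketch below reads a (hypothetical) SED table with astropy. The file name and column names are assumptions, not the Iris data model.

        # Read a hypothetical SED stored as a VOTable -- illustrative only.
        from astropy.io.votable import parse_single_table

        table = parse_single_table("example_sed.vot").to_table()   # hypothetical file
        print(table.colnames)                                      # inspect available columns
        # A column named 'Flux' is an assumption for this sketch, not the Iris schema.
        flux = table["Flux"] if "Flux" in table.colnames else None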

  6. Distributed cluster testing using new virtualized framework for XRootD

    NASA Astrophysics Data System (ADS)

    Salmon, Justin L.; Janyst, Lukasz

    2014-06-01

    The Extended ROOT Daemon (XRootD) is a distributed, scalable system for low-latency clustered data access. XRootD is mature and widely used in HEP, both standalone and as a core server framework for the EOS system at CERN, and hence requires extensive testing to ensure general stability. However, there are many difficulties posed by distributed testing, such as cluster set up, synchronization, orchestration, inter-cluster communication and controlled failure handling. A three-layer master/hypervisor/slave model is presented to ameliorate these difficulties by utilizing libvirt and QEMU/KVM virtualization technologies to automate spawning of configurable virtual clusters and orchestrate multi-stage test suites. The framework also incorporates a user-friendly web interface for scheduling and monitoring tests. The prototype has been used successfully to build new test suites for XRootD and EOS with existing unit test integration. It is planned to generalize the framework sufficiently to encourage its use with potentially any distributed system.
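
    As a minimal sketch of the libvirt Python bindings that such a framework can build on (not the XRootD test framework itself), the example below starts and destroys a transient QEMU/KVM guest. The domain XML is deliberately stripped down and omits the disks and devices a bootable test slave would need; the guest name is hypothetical.

        # Spawn a transient QEMU/KVM guest with the libvirt Python bindings (sketch only).
        import libvirt

        DOMAIN_XML = """
        <domain type='kvm'>
          <name>xrootd-test-slave</name>      <!-- hypothetical guest name -->
          <memory unit='MiB'>512</memory>
          <vcpu>1</vcpu>
          <os><type arch='x86_64'>hvm</type></os>
        </domain>
        """

        conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
        dom = conn.createXML(DOMAIN_XML, 0)     # create and start a transient domain
        print("started:", dom.name())
        dom.destroy()                           # tear the guest down again
        conn.close()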

  7. A virtual species set for robust and reproducible species distribution modelling tests

    PubMed Central

    Garzon-Lopez, Carol X.; Bastin, Lucy; Foody, Giles M.; Rocchini, Duccio

    2016-01-01

    Predicting species potential and future distribution has become a relevant tool in biodiversity monitoring and conservation. In this data article we present the suitability map of a virtual species generated based on two bioclimatic variables, and a dataset containing more than 700,000 random observations at the extent of Europe. The dataset includes spatial attributes such as: distance to roads, protected areas, country codes, and the habitat suitability of two spatially clustered species (grassland and forest species) and a widespread species. PMID:27014734
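
    A toy sketch of how a virtual-species suitability surface can be built from two bioclimatic variables and then sampled at random locations is given below. The logistic responses, grids, and thresholds are arbitrary assumptions and are not the ones used to generate this dataset.

        # Toy virtual-species generator from two bioclimatic grids (illustrative only).
        import numpy as np

        rng = np.random.default_rng(42)
        ny, nx = 200, 300
        bio1 = rng.normal(10, 5, (ny, nx))       # stand-in for mean annual temperature
        bio12 = rng.normal(800, 200, (ny, nx))   # stand-in for annual precipitation

        # Arbitrary logistic responses combined into a suitability surface in [0, 1]
        suit = 1 / (1 + np.exp(-(bio1 - 8) / 2)) * 1 / (1 + np.exp(-(bio12 - 700) / 150))

        # Draw random "observations" whose presence probability follows suitability
        n_obs = 1000
        ys, xs = rng.integers(0, ny, n_obs), rng.integers(0, nx, n_obs)
        presence = rng.random(n_obs) < suit[ys, xs]
        print(f"{presence.sum()} presences out of {n_obs} random points")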

  8. A virtual species set for robust and reproducible species distribution modelling tests.

    PubMed

    Garzon-Lopez, Carol X; Bastin, Lucy; Foody, Giles M; Rocchini, Duccio

    2016-06-01

    Predicting species potential and future distribution has become a relevant tool in biodiversity monitoring and conservation. In this data article we present the suitability map of a virtual species generated based on two bioclimatic variables, and a dataset containing more than 700,000 random observations at the extent of Europe. The dataset includes spatial attributes such as: distance to roads, protected areas, country codes, and the habitat suitability of two spatially clustered species (grassland and forest species) and a widespread species. PMID:27014734

  9. A Study on Data Handling Mechanism of a Distributed Virtual Factory

    NASA Astrophysics Data System (ADS)

    Sashio, Kentarou; Fujii, Susumu; Kaihara, Toshiya

    To cope with diversified consumers' needs, recent manufacturing systems are required to be agile, for example by shortening lead times and reducing indirect costs. To evaluate the performance of the total manufacturing system, taking into consideration the complicated information flow as well as the material flow among areas, the Distributed Virtual Factory (DVF) has been proposed. To construct a DVF by integrating area-level simulators, a Communication Mechanism, a Synchronization Mechanism, and a Data Handling Mechanism are required. The Data Handling Mechanism supplies the external information required by area-level simulators. In this paper, a Database Interface is developed to integrate databases into a DVF, and the Data Handling Mechanism is implemented by utilizing the database.
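
    As a hedged sketch of the general idea of a database interface that supplies external information to area-level simulators (table names and fields are hypothetical, not the authors' design), one could write:

        # Minimal database interface supplying external data to a simulator (sketch).
        import sqlite3

        class DataHandler:
            """Hypothetical interface between a shared database and an area simulator."""
            def __init__(self, path=":memory:"):
                self.conn = sqlite3.connect(path)
                self.conn.execute(
                    "CREATE TABLE IF NOT EXISTS orders (part TEXT, qty INTEGER, due REAL)")

            def publish(self, part, qty, due):
                # An upstream area writes the information other areas will need.
                self.conn.execute("INSERT INTO orders VALUES (?, ?, ?)", (part, qty, due))
                self.conn.commit()

            def fetch_due_before(self, t):
                # A downstream simulator pulls only the external data it requires.
                cur = self.conn.execute("SELECT part, qty FROM orders WHERE due <= ?", (t,))
                return cur.fetchall()

        dh = DataHandler()
        dh.publish("gear-A", 50, due=120.0)
        print(dh.fetch_due_before(200.0))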

  10. A distributed parallel genetic algorithm of placement strategy for virtual machines deployment on cloud platform.

    PubMed

    Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong

    2014-01-01

    The cloud platform provides various services to users. More and more cloud centers provide infrastructure as the main way of operating. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to the requirements of users by sharding the resources with virtualization. Considering both QoS for users and cost saving for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) of placement strategy for virtual machines deployment on the cloud platform. In the first stage, it executes the genetic algorithm in parallel and in a distributed manner on several selected physical hosts. Then it continues to execute the genetic algorithm of the second stage with solutions obtained from the first stage as the initial population. The solution calculated by the genetic algorithm of the second stage is the optimal one of the proposed approach. The experimental results show that the proposed placement strategy of VM deployment can ensure QoS for users and that it is more effective and more energy efficient than other placement strategies on the cloud platform. PMID:25097872
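
    The following is a toy, single-process sketch of the two-stage idea described above: several independent genetic-algorithm runs stand in for the distributed first stage, and their best placements seed a second-stage run. The fitness function, operators, and parameters are assumptions for illustration, not the DPGA of the paper.

        # Toy two-stage genetic algorithm for VM-to-host placement (illustrative only;
        # the fitness, operators and parameters are assumptions, not the paper's DPGA).
        import random

        N_VMS, N_HOSTS, CAP = 20, 10, 4            # hypothetical problem size
        random.seed(1)

        def fitness(plan):
            load = [0] * N_HOSTS
            for h in plan:
                load[h] += 1
            hosts_used = sum(l > 0 for l in load)          # energy proxy: active hosts
            overload = sum(max(0, l - CAP) for l in load)  # QoS proxy: capacity violations
            return hosts_used + 10 * overload              # lower is better

        def evolve(pop, gens=100):
            for _ in range(gens):
                pop.sort(key=fitness)
                elite = pop[: len(pop) // 2]
                children = []
                while len(elite) + len(children) < len(pop):
                    a, b = random.sample(elite, 2)
                    cut = random.randrange(N_VMS)
                    child = a[:cut] + b[cut:]                  # one-point crossover
                    if random.random() < 0.3:                  # mutation
                        child[random.randrange(N_VMS)] = random.randrange(N_HOSTS)
                    children.append(child)
                pop = elite + children
            return pop

        # Stage 1: independent "islands", standing in for GAs run on separate hosts.
        islands = [[[random.randrange(N_HOSTS) for _ in range(N_VMS)] for _ in range(20)]
                   for _ in range(4)]
        seeds = [min(evolve(p), key=fitness) for p in islands]

        # Stage 2: a second GA seeded with the stage-1 winners.
        pop2 = seeds + [[random.randrange(N_HOSTS) for _ in range(N_VMS)] for _ in range(16)]
        best = min(evolve(pop2), key=fitness)
        print("best fitness:", fitness(best))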

  11. A Distributed Parallel Genetic Algorithm of Placement Strategy for Virtual Machines Deployment on Cloud Platform

    PubMed Central

    Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong

    2014-01-01

    The cloud platform provides various services to users. More and more cloud centers provide infrastructure as the main way of operating. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to the requirements of users by sharding the resources with virtualization. Considering both QoS for users and cost saving for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) of placement strategy for virtual machines deployment on the cloud platform. In the first stage, it executes the genetic algorithm in parallel and in a distributed manner on several selected physical hosts. Then it continues to execute the genetic algorithm of the second stage with solutions obtained from the first stage as the initial population. The solution calculated by the genetic algorithm of the second stage is the optimal one of the proposed approach. The experimental results show that the proposed placement strategy of VM deployment can ensure QoS for users and that it is more effective and more energy efficient than other placement strategies on the cloud platform. PMID:25097872

  12. A comparison of distributed memory and virtual shared memory parallel programming models

    SciTech Connect

    Keane, J.A.; Grant, A.J.; Xu, M.Q.

    1993-04-01

    The virtues of the different parallel programming models, shared memory and distributed memory, have been much debated. Conventionally the debate could be reduced to programming convenience on the one hand, and high scalability factors on the other. More recently the debate has become somewhat blurred with the provision of virtual shared memory models built on machines with physically distributed memory. The intention of such models/machines is to provide scalable shared memory, i.e. to provide both programmer convenience and high scalability. In this paper, the different models are considered from experiences gained with a number of systems ranging from applications in both commerce and science to languages and operating systems. Case studies are introduced as appropriate.

  13. Characterization of rock matrix block size distribution, dispersivity, and mass transfer coefficients in fractured porous media

    NASA Astrophysics Data System (ADS)

    Sharifi Haddad, Amin

    Fractured porous media are important structures in petroleum engineering and geohydrology. The accelerating global demand for energy has turned the focus to fractured formations. Fractured porous media are also found in conventional naturally fractured reservoirs and in the water supply from karst (carbonate) aquifers. Studying mass transfer processes allows us to explore the complexities and uncertainties encountered with fractured rocks. This dissertation develops an analytical methodology for the study of mass transfer in fractured reservoirs. The dissertation begins with two cases that demonstrate the importance of the rock matrix block size distribution and dispersivity through a transient mass exchange mechanism between rock matrix blocks and fractures. The first case assumes a medium with no surface adsorption, and the second case includes the surface adsorption variable. One of the main focuses of this work is the characterization of the rock matrix block size distribution in fractured porous media. Seismic surveying, well test analysis, well logging, and geomechanical tools are currently used to characterize this property, based on measurements of different variables. This study explores an innovative method of using solute transport to determine the fracture intensity. This methodology is applied to slab-shaped rock matrix blocks and can easily be extended to other geometries. Another focus of this dissertation is the characterization of dispersivity in field-scale studies. Improving our knowledge of dispersivity will enable more accurate mass transfer predictions and advance the study of transport processes. Field tracer tests demonstrated that dispersivity is scale-dependent. Proposed functions for the increasing trend of dispersivity include linear and asymptotic scale-dependence. This study investigated the linear dispersivity trend around the injection wellbore. An analysis of the tracer concentration in a monitoring well was used to

  14. LIDAR-based outcrop characterisation - joint classification, surface and block size distribution

    NASA Astrophysics Data System (ADS)

    Tanner, David C.; Dietrich, Patrick; Krawczyk, Charlotte M.

    2013-04-01

    Outcrops, in the first instance, only offer at best a 2-2.5D view of the available geological information, such as joints and fractures. In order to study geodynamic processes, it is necessary to calculate true values of, for example, fracture densities and block dimensions. We show how LIDAR-generated point-cloud data of outcrops can be used to delineate such geological surfaces. Our methods do not require the point-set to be meshed; instead we work with the original point cloud, thus avoiding meshing errors. In a first step we decompose the point-cloud into tiny volumes; in each volume we calculate the best-fitting plane. An expert can then decide which of the planes are important (in an interactive density pole diagram) and classify them. Actual block surfaces are identified by applying a clustering algorithm to the mini-planes. Subsequently, we calculate the size of these surfaces. Finally we estimate the block size distribution within the outcrop by projecting the block surfaces into the rock volume. To assess the reproducibility of our results, we show to what extent they depend on various parameters, such as the resolution of the LIDAR scan and algorithm parameters. In theory, the results can be calculated at the site of measurement to ensure that the LIDAR scan resolution is sufficient and, if necessary, the scan can be rerun with different parameters. We demonstrate our methods with LIDAR data that we produced in a sandstone quarry in Germany. The part of the outcrop which we measured with the LIDAR was out of reach for measurements with a geological compass, but our results correlate well with compass measurements from a different outcrop in the same quarry. Three main surfaces could be delineated from the point cloud: the bedding, and two major joint types. The three fabrics are almost orthogonal. Our statistical results suggest that blocks with a volume of several hundred liters can be expected regularly within the quarry. The results can be directly used to
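
    A minimal sketch of the core operation described above, fitting a best plane to a small neighbourhood of points via the singular value decomposition of the centred coordinates, is shown below on synthetic data; the real workflow operates on LIDAR point clouds and includes interactive classification.

        # Fit a best plane to one small volume of a point cloud (sketch on synthetic data).
        import numpy as np

        def fit_plane(points):
            """Return (unit normal, centroid) of the least-squares plane through points."""
            centroid = points.mean(axis=0)
            _, _, vt = np.linalg.svd(points - centroid)
            return vt[-1], centroid          # smallest singular vector = plane normal

        rng = np.random.default_rng(0)
        # Synthetic stand-in for one tiny volume of the outcrop point cloud
        cloud = np.column_stack([rng.uniform(0, 1, 500),
                                 rng.uniform(0, 1, 500),
                                 rng.normal(0, 0.01, 500)])
        normal, centroid = fit_plane(cloud)
        print("plane normal:", np.round(normal, 3))   # close to (0, 0, 1) for this example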

  15. Quantification of hepatic flow distribution using particle tracking for patient specific virtual Fontan surgery

    NASA Astrophysics Data System (ADS)

    Yang, Weiguang; Vignon-Clementel, Irene; Troianowski, Guillaume; Shadden, Shawn; Mohan Reddy, V.; Feinstein, Jeffrey; Marsden, Alison

    2010-11-01

    The Fontan surgery is the third and final stage in a palliative series to treat children with single ventricle heart defects. In the extracardiac Fontan procedure, the inferior vena cava (IVC) is connected to the pulmonary arteries via a tube-shaped Gore-tex graft. Clinical observations have shown that the absence of a hepatic factor, carried in the IVC flow, can cause pulmonary arteriovenous malformations. Although it is clear that hepatic flow distribution is an important determinant of Fontan performance, few studies have quantified its relation to Fontan design. In this study, we virtually implanted three types of grafts (T-junction, offset and Y-graft) into 5 patient specific models of the Glenn (stage 2) anatomy. We then performed 3D time-dependent simulations and systematically compared the IVC flow distribution, energy loss, and pressure levels in different surgical designs. A robustness test is performed to evaluate the sensitivity of hepatic distribution to pulmonary flow split. Results show that the Y-graft design effectively improves the IVC flow distribution, compared to traditional designs and that surgical designs could be customized on a patient-by-patient basis.

  16. Comparative molecular surface analysis (CoMSA) for virtual combinatorial library screening of styrylquinoline HIV-1 blocking agents.

    PubMed

    Niedbala, Halina; Polanski, Jaroslaw; Gieleciak, Rafal; Musiol, Robert; Tabak, Dominik; Podeszwa, Barbara; Bak, Andrzej; Palka, Anna; Mouscadet, Jean-Francois; Gasteiger, Johann; Le Bret, Marc

    2006-12-01

    We used comparative molecular surface analysis to design molecules for synthesis as part of the search for new HIV-1 integrase inhibitors. We analyzed the virtual combinatorial library (VCL) constituted from various moieties of styrylquinoline and styrylquinazoline inhibitors. Since imines can be applied in a strategy of dynamic combinatorial chemistry (DCC), we also tested similar compounds in which the -C=N- or -N=C- linker connected the heteroaromatic and aromatic moieties. We then used principal component analysis (PCA) or self-organizing maps (SOM), namely Kohonen neural networks, to obtain a clustering plot analyzing the diversity of the VCL formed. Previously synthesized compounds of known activity, used as molecular probes, were projected onto this plot, which provided a set of promising virtual drugs. Moreover, we further modified the above-mentioned VCL to include the single-bond linker -C-N- or -N-C-. This increased compound stability but also expanded the diversity between the available molecular probes and virtual targets. The application of the CoMSA with SOM indicated important differences between such compounds and active molecular probes. We synthesized such compounds to verify the computational predictions. PMID:17168681

  17. Non-Gaussian postselection and virtual photon subtraction in continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Li, Zhengyu; Zhang, Yichen; Wang, Xiangyu; Xu, Bingjie; Peng, Xiang; Guo, Hong

    2016-01-01

    Photon subtraction can enhance the performance of continuous-variable quantum key distribution (CV QKD). However, the enhancement effect will be reduced by the imperfections of practical devices, especially the limited efficiency of a single-photon detector. In this paper, we propose a non-Gaussian postselection method to emulate the photon subtraction used in coherent-state CV QKD protocols. The virtual photon subtraction not only can avoid the complexity and imperfections of a practical photon-subtraction operation, which extends the secure transmission distance as the ideal case does, but also can be adjusted flexibly according to the channel parameters to optimize the performance. Furthermore, our preliminary tests on the information reconciliation suggest that in the low signal-to-noise ratio regime, the performance of reconciling the postselected non-Gaussian data is better than that of the Gaussian data, which implies the feasibility of implementing this method practically.

  18. Near-exact distributions for the block equicorrelation and equivariance likelihood ratio test statistic

    NASA Astrophysics Data System (ADS)

    Coelho, Carlos A.; Marques, Filipe J.

    2013-09-01

    In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test or its single block version may find applications in many areas as in psychology, education, medicine, genetics and they are important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypotheses of independence of groups of variables and the hypothesis of equicorrelation and equivariance we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.

  19. Distributed computing as a virtual supercomputer: Tools to run and manage large-scale BOINC simulations

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni; Harvey, M. J.; de Fabritiis, Gianni

    2010-08-01

    Distributed computing (DC) projects tackle large computational problems by exploiting the donated processing power of thousands of volunteered computers, connected through the Internet. To efficiently employ the computational resources of one of the world's largest DC efforts, GPUGRID, the project scientists require tools that handle hundreds of thousands of tasks that run asynchronously and generate gigabytes of data every day. We describe RBoinc, an interface that allows computational scientists to embed the DC methodology into the daily work-flow of high-throughput experiments. By extending the Berkeley Open Infrastructure for Network Computing (BOINC), the leading open-source middleware for current DC projects, with mechanisms to submit and manage large-scale distributed computations from individual workstations, RBoinc turns distributed grids into cost-effective virtual resources that can be employed by researchers in work-flows similar to conventional supercomputers. The GPUGRID project is currently using RBoinc for all of its in silico experiments based on molecular dynamics methods, including the determination of binding free energies and free energy profiles in all-atom models of biomolecules.

  20. Interevent time distributions of human multi-level activity in a virtual world

    NASA Astrophysics Data System (ADS)

    Mryglod, O.; Fuchs, B.; Szell, M.; Holovatch, Yu.; Thurner, S.

    2015-02-01

    Studying human behavior in virtual environments provides extraordinary opportunities for a quantitative analysis of social phenomena with levels of accuracy that approach those of the natural sciences. In this paper we use records of player activities in the massive multiplayer online game Pardus over 1238 consecutive days, and analyze dynamical features of sequences of actions of players. We build on previous work where temporal structures of human actions of the same type were quantified, and provide an empirical understanding of human actions of different types. This study of multi-level human activity can be seen as a dynamic counterpart of static multiplex network analysis. We show that the interevent time distributions of actions in the Pardus universe follow highly non-trivial distribution functions, from which we extract action-type specific characteristic 'decay constants'. We discuss characteristic features of interevent time distributions, including periodic patterns on different time scales, bursty dynamics, and various functional forms on different time scales. We comment on gender differences of players in emotional actions, and find that while males and females act similarly when performing some positive actions, females are slightly faster for negative actions. We also observe effects of the age of players: more experienced players are generally faster in making decisions about engaging in and terminating enmity and friendship, respectively.
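
    As a small illustration of the basic quantity analysed in this record, the sketch below computes an interevent-time distribution from a list of action timestamps using logarithmic bins; the timestamps are synthetic, not Pardus data.

        # Interevent-time distribution from a list of action timestamps (toy sketch).
        import numpy as np

        rng = np.random.default_rng(3)
        timestamps = np.sort(rng.uniform(0, 1e6, 5000))   # hypothetical action times (s)
        dt = np.diff(timestamps)                          # interevent times
        dt = dt[dt > 0]                                   # drop exact ties, if any

        # Histogram on logarithmic bins, as is usual for heavy-tailed distributions
        bins = np.logspace(np.log10(dt.min()), np.log10(dt.max()), 30)
        counts, edges = np.histogram(dt, bins=bins, density=True)
        for lo, hi, c in zip(edges[:-1], edges[1:], counts):
            if c > 0:
                print(f"{lo:10.1f}-{hi:10.1f} s  density={c:.2e}")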

  1. Virtuality Distributions in application to gamma gamma* to pi^0 Transition Form Factor at Handbag Level

    SciTech Connect

    Radyushkin, Anatoly V.

    2014-07-01

    We outline the basics of a new approach to transverse momentum dependence in hard processes. As an illustration, we consider the hard exclusive transition process gamma* gamma -> pi^0 at the handbag level. Our starting point is the coordinate representation for matrix elements of operators (in the simplest case, the bilocal operator O(0,z)) describing a hadron with momentum p. Treated as functions of (pz) and z^2, they are parametrized through a virtuality distribution amplitude (VDA) Phi(x, sigma), with x being Fourier-conjugate to (pz) and sigma Laplace-conjugate to z^2. For intervals with z^+ = 0, we introduce the transverse momentum distribution amplitude (TMDA) Psi(x, k_perp), and write it in terms of the VDA Phi(x, sigma). The results of covariant calculations, written in terms of Phi(x, sigma), are converted into expressions involving Psi(x, k_perp). Starting with scalar toy models, we extend the analysis to the case of spin-1/2 quarks and QCD. We propose simple models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BaBar and BELLE) data on the pion transition form factor. We also discuss how one can generate high-k_perp tails from primordial soft distributions.

  2. Correlation between lifetime and blocking temperature distribution in spin-valve structures

    NASA Astrophysics Data System (ADS)

    Nozières, J. P.; Jaren, S.; Zhang, Y. B.; Pentek, K.; Zeltser, A.; Wills, P.; Speriosu, V. S.

    2000-05-01

    The blocking temperature distribution Tb(T) and the failure activation energy (as defined by a 10% drop in the magnetoresistance amplitude in a reverse field equivalent to the self-demagnetizing field of a micron size stripe height device) have been determined in spin-valve sheet films with FeMn, IrMn, PtMn, NiMn, and CrPdMn antiferromagnetic exchange biasing layers. We find a clear correlation between the expected lifetime and the fraction of loose (e.g., unblocked) antiferromagnetic grains, which we believe is due to pinned layer rotation being the main failure mechanism in these systems. For CrPdMn structures, a good agreement is found between the stability of sheet films and of finished sliders. From these data, only NiMn and PtMn appear to be suitable for disk-drive applications.

  3. The distribution of KIR-HLA functional blocks is different from north to south of Italy.

    PubMed

    Fasano, M E; Rendine, S; Pasi, A; Bontadini, A; Cosentini, E; Carcassi, C; Capittini, C; Cornacchini, G; Espadas de Arias, A; Garbarino, L; Carella, G; Mariotti, M L; Mele, L; Miotti, V; Moscetti, A; Nesci, S; Ozzella, G; Piancatelli, D; Porfirio, B; Riva, M R; Romeo, G; Tagliaferri, C; Lombardo, C; Testi, M; Amoroso, A; Martinetti, M

    2014-03-01

    The killer cell immunoglobulin-like receptor (KIR)-human leukocyte antigen (HLA) interaction represents an example of genetic epistasis, where the concomitant presence of specific genes or alleles encoding receptor-ligand units is necessary for the activity of natural killer (NK) cells. Although KIR and HLA genes segregate independently, they co-evolved under environmental pressures to maintain particular KIR-HLA functional blocks for species survival. We investigated, in 270 Italian healthy individuals, the distribution of KIR and HLA polymorphisms in three climatic areas (from cold north to warm south), to verify their possible geographical stratification. We analyzed the presence of 13 KIR genes and genotyped KIR ligands belonging to HLA class I: HLA-C, HLA-B and HLA-A. We did not observe any genetic stratification for KIR genes and HLA-C ligands in Italy. By contrast, in a north-to-south direction, we found a decreasing trend for the HLA-A3 and HLA-A11 ligands (P = 0.012) and an increasing trend for the HLA-B ligands carrying the Bw4 epitope (P = 0.0003) and the Bw4 Ile80 epitope (P = 0.0005). The HLA-A and HLA-B KIR ligands were in negative linkage disequilibrium (correlation coefficient -0.1211), possibly as a consequence of their similar function in inhibiting NK cells. The distribution of the KIR-HLA functional blocks was different across Italy, as we observed a north-to-south ascending trend for KIR3DL1, when coupled with HLA-B Bw4 ligands (P = 0.0067) and with HLA-B Bw4 Ile80 (P = 0.0027), and a descending trend for KIR3DL2 when coupled with HLA-A3 and HLA-A11 ligands (P = 0.0044). Overall, people from South Italy preferentially use the KIR3DL1-HLA-B Bw4 functional unit, while those from North Italy equally use both the KIR3DL2-HLA-A3/A11 and the KIR3DL1-HLA-B Bw4 functional units to fight infections. Thus, only KIR3DL receptors, which exert the unique role of microbial sensors through the specific D0 domain, and their cognate

  4. Distributed execution of recovery blocks - An approach for uniform treatment of hardware and software faults in real-time applications

    NASA Technical Reports Server (NTRS)

    Kim, K. H.; Welch, Howard O.

    1989-01-01

    The concept of distributed execution of recovery blocks is examined as an approach for uniform treatment of hardware and software faults. A useful characteristic of the approach is the relatively small time cost it requires. The approach is thus suitable for incorporation into real-time computer systems. A specific formulation of the approach that is aimed at minimizing the recovery time is presented, called the distributed recovery block (DRB) scheme. The DRB scheme is capable of effecting forward recovery while handling both hardware and software faults in a uniform manner. An approach to incorporating a multiprocessing capability into the scheme is also discussed. Two experiments aimed at testing the execution efficiency of the scheme in real-time applications have been conducted on two different multimicrocomputer networks. The results clearly indicate the feasibility of achieving tolerance of hardware and software faults in a broad range of real-time computer systems by use of the schemes for distributed execution of recovery blocks.
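
    A bare-bones sketch of the recovery-block idea, a primary and an alternate routine guarded by an acceptance test, is given below. It illustrates only the sequential concept; the distributed recovery block scheme additionally runs the alternate concurrently on a backup node, which is not shown here.

        # Recovery-block skeleton: primary and alternate routines guarded by an
        # acceptance test (illustrative sketch of the concept, not the DRB protocol).
        def recovery_block(primary, alternate, acceptance_test, *args):
            try:
                result = primary(*args)
                if acceptance_test(result):
                    return result
            except Exception:
                pass                       # treat a crash like a failed acceptance test
            result = alternate(*args)      # roll back and try the alternate version
            if acceptance_test(result):
                return result
            raise RuntimeError("both versions failed the acceptance test")

        # Hypothetical example: two square-root routines, one of them faulty.
        primary = lambda x: x * 0.1            # buggy "fast" version
        alternate = lambda x: x ** 0.5         # correct fallback
        accept = lambda r: abs(r * r - 25.0) < 1e-6
        print(recovery_block(primary, alternate, accept, 25.0))   # -> 5.0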

  5. Distributed attitude synchronization of formation flying via consensus-based virtual structure

    NASA Astrophysics Data System (ADS)

    Cong, Bing-Long; Liu, Xiang-Dong; Chen, Zhen

    2011-06-01

    This paper presents a general framework for synchronized multiple spacecraft rotations via consensus-based virtual structure. In this framework, attitude control systems for the formation spacecraft and the virtual structure are designed separately. Both parametric uncertainty and external disturbance are taken into account. A time-varying sliding mode control (TVSMC) algorithm is designed to improve the robustness of the actual attitude control system. As for the virtual attitude control system, a behavioral consensus algorithm is presented to accomplish the attitude maneuver of the entire formation and guarantee a consistent attitude among the local virtual structure counterparts during the attitude maneuver. A multiple virtual sub-structures (MVSSs) system is introduced to enhance the current virtual structure scheme when large numbers of spacecraft are involved in the formation. The attitude of each spacecraft is represented by the modified Rodrigues parameter (MRP) for its non-redundancy. Finally, a numerical simulation with three synchronization situations is employed to illustrate the effectiveness of the proposed strategy.

  6. Virtual Oregon: A Proof-of-Concept for Seamless Access to Distributed Environmental Information

    NASA Astrophysics Data System (ADS)

    Keon, D.; Pancake, C.; Wright, D. J.; Walsh, K.

    2002-12-01

    Virtual Oregon is a new data coordination center established at Oregon State University in order to: (1) archive environmental and other place-based data on Oregon and associated areas; (2) make those data accessible to a broad spectrum of agencies and individuals via innovative web interfaces; (3) identify key data sets that are not yet available and encourage their collection and dissemination; and (4) facilitate development of statewide standards for archiving, documenting, and disseminating data. Rather than co-locating researchers and data in a physical center, Virtual Oregon employs a distributed architecture that occupies multiple locations while users are presented with the illusion of a single, centralized facility. This approach was selected not just to maximize the impact on campus students, faculty, and staff but also to service broader interactions with extension agents and other members of Oregon State's statewide community. Virtual Oregon builds on regional GIS centers and databanks in a wide range of disciplines, providing decades of research data on topics as varied as coastal processes, climate, biodiversity, land ownership, water quality, wildfire, and agricultural production. There are four distributed nodes, each serving as a center and clearinghouse for distinct types of information and services: - Department of Geosciences (College of Science): geospatial coverages, digital aerial and ortho imagery and associated base data - Forestry Sciences Laboratory (USDA Forest Service and Oregon State's College of Forestry): ecological and resource management databases; data analyses; data from computational simulations - Northwest Alliance for Computational Science and Engineering (NACSE): databases based on specimen collections, field observation, images, or analysis of historical documents; user interface design - Valley Library: published maps, books and archival publications, gray literature, photographs and video Data are harvested from a variety

  7. Impact of distributed virtual reality on engineering knowledge retention and student engagement

    NASA Astrophysics Data System (ADS)

    Sulbaran, Tulio Alberto

    Engineering Education is facing many problems, one of which is poor knowledge retention among engineering students. This problem affects the Architecture, Engineering, and Construction (A/E/C) industry, because students are unprepared for many necessary job skills. This problem of poor knowledge retention is caused by many factors, one of which is the mismatch between student learning preferences and the media used to teach engineering. The purpose of this research is to assess the impact of Distributed Virtual Reality (DVR) as an engineering teaching tool. The implementation of DVR addresses the issue of poor knowledge retention by impacting the mismatch between learning and teaching style in the visual versus verbal spectrum. Using as a point of departure three knowledge domain areas (Learning and Instruction, Distributed Virtual Reality and Crane Selection as Part of Crane Lift Planning), a DVR engineering teaching tool is developed, deployed and assessed in engineering classrooms. The statistical analysis of the data indicates that: (1) most engineering students are visual learners; (2) most students would like more classes using DVR; (3) engineering students find DVR more engaging than traditional learning methods; (4) most students find the responsiveness of the DVR environments to be either good or very good; (5) all students are able to interact with DVR and most of the students found it easy or very easy to navigate (without previous formal training in how to use DVR); (6) students' knowledge regarding the subject (crane selection) is higher after the experiment; and, (7) students' using different instructional media do not demonstrate statistical difference in knowledge retained after the experiment. This inter-disciplinary research offers opportunities for direct and immediate application in education, research, and industry, due to the fact that the instructional module developed (on crane selection as part of construction crane lift planning) can be

  8. Effects of Sequence Distribution, Concentration and pH on Gradient and Block Copolymer Micelle Formation in Solution

    NASA Astrophysics Data System (ADS)

    Marrou, Stephen; Kim, Jungki; Wong, Christopher; Torkelson, John

    2011-03-01

    Gradient copolymers are a relatively new class of materials with a gradual change in comonomer composition along the copolymer chain length, which have exhibited unique material properties in comparison to random and block copolymers. Here we extend this architecture to amphiphilic systems that form micelles in solvent, as the effect of a nonuniform comonomer sequence distribution is expected to strongly influence critical aggregation phenomena. Utilizing pyrene as a fluorescence probe, we determined that gradient copolymers present an intermediate critical aggregation concentration in comparison to analogous block and random copolymers. The effect of gradient architecture on a pH-sensitive copolymer was also investigated, concluding that gradient sequencing significantly impacts the solubility and critical aggregation pH when compared to block and random copolymers of similar composition, providing further evidence that gradient architectures introduce a powerful means of tuning properties between block and random copolymers.

  9. On delay adjustment for dynamic load balancing in distributed virtual environments.

    PubMed

    Deng, Yunhua; Lau, Rynson W H

    2012-04-01

    Distributed virtual environments (DVEs) have become very popular in recent years, due to the rapid growth of applications such as massive multiplayer online games (MMOGs). As the number of concurrent users increases, scalability becomes one of the major challenges in designing an interactive DVE system. One solution to address this scalability problem is to adopt a multi-server architecture. While some methods focus on the quality of partitioning the load among the servers, others focus on the efficiency of the partitioning process itself. However, all these methods neglect the effect of network delay among the servers on the accuracy of the load balancing solutions. As we show in this paper, the change in the load of the servers due to network delay would affect the performance of the load balancing algorithm. In this work, we conduct a formal analysis of this problem and discuss two efficient delay adjustment schemes to address the problem. Our experimental results show that our proposed schemes can significantly improve the performance of the load balancing algorithm with negligible computation overhead. PMID:22402679

  10. Electrical Resistivity Investigation of Gas Hydrate Distribution in Mississippi Canyon Block 118, Gulf of Mexico

    SciTech Connect

    Dunbar, John

    2012-12-31

    Electrical methods offer a geophysical approach for determining the sub-bottom distribution of hydrate in deep marine environments. Methane hydrate is essentially non-conductive. Hence, sediments containing hydrate are more resistive than sediments without hydrates. To date, the controlled source electromagnetic (CSEM) method has been used in marine hydrates studies. This project evaluated an alternative electrical method, direct current resistivity (DCR), for detecting marine hydrates. DCR involves the injection of direct current between two source electrodes and the simultaneous measurement of the electric potential (voltage) between multiple receiver electrodes. The DCR method provides subsurface information comparable to that produced by the CSEM method, but with less sophisticated instrumentation. Because the receivers are simple electrodes, large numbers can be deployed to achieve higher spatial resolution. In this project a prototype seafloor DCR system was developed and used to conduct a reconnaissance survey at a site of known hydrate occurrence in Mississippi Canyon Block 118. The resulting images of sub-bottom resistivities indicate that high-concentration hydrates at the site occur only in the upper 50 m, where deep-seated faults intersect the seafloor. Overall, there was evidence for much less hydrate at the site than previously thought based on available seismic and CSEM data alone.

  11. Pore Size Distribution and Methane Equilibrium Conditions at Walker Ridge Block 313, Northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Bihani, A. D.; Daigle, H.; Cook, A.; Glosser, D.; Shushtarian, A.

    2015-12-01

    Coexistence of three methane phases (liquid (L), gas (G), hydrate (H)) in marine gas hydrate systems may occur according to in-situ pressure, temperature, salinity and pore size. In sediments with salinity close to seawater, a discrete zone of three-phase (3P) equilibrium may occur near the base of the regional hydrate stability zone (RHSZ) due to capillary effects. The existence of a 3P zone influences the location of the bottom-simulating reflection (BSR) and has implications for methane fluxes at the base of the RHSZ. We studied hydrate stability conditions in two wells, WR313-G and WR313-H, at Walker Ridge Block 313 in the northern Gulf of Mexico. We determined pore size distributions (PSD) by constructing a synthetic nuclear magnetic resonance (NMR) relaxation time distribution. Correlations were obtained by non-linear regression on NMR, gamma ray, and bulk density logs from well KC-151 at Keathley Canyon. The correlations enabled construction of relaxation time distributions for WR313-G and WR313-H, which were used to predict PSD through comparison with mercury injection capillary pressure measurements. With the computed PSD, L+H and L+G methane solubility was determined from in-situ pressure and temperature. The intersection of the L+G and L+H curves for various pore sizes allowed calculation of the depth range of the 3P equilibrium zone. As in previous studies at Blake Ridge and Hydrate Ridge, the top of the 3P zone moves upwards with increasing water depth and overlies the bulk 3P equilibrium depth. In clays at Walker Ridge, the predicted thickness of the 3P zone is approximately 35 m, but in coarse sands it is only a few meters due to the difference in absolute pore sizes and the width of the PSD. The thick 3P zone in the clays may explain in part why the BSR is only observed in the sand layers at Walker Ridge, although other factors may influence the presence or absence of a BSR.

  12. Pore size distribution and methane equilibrium conditions at Walker Ridge Block 313, northern Gulf of Mexico

    SciTech Connect

    Bihani, Abhishek; Daigle, Hugh; Cook, Ann; Glosser, Deborah; Shushtarian, Arash

    2015-12-15

    Coexistence of three methane phases (liquid (L), gas (G), hydrate (H)) in marine gas hydrate systems may occur according to in-situ pressure, temperature, salinity and pore size. In sediments with salinity close to seawater, a discrete zone of three-phase (3P) equilibrium may occur near the base of the regional hydrate stability zone (RHSZ) due to capillary effects. The existence of a 3P zone influences the location of the bottom-simulating reflection (BSR) and has implications for methane fluxes at the base of the RHSZ. We studied hydrate stability conditions in two wells, WR313-G and WR313-H, at Walker Ridge Block 313 in the northern Gulf of Mexico. We determined pore size distributions (PSD) by constructing a synthetic nuclear magnetic resonance (NMR) relaxation time distribution. Correlations were obtained by non-linear regression on NMR, gamma ray, and bulk density logs from well KC-151 at Keathley Canyon. The correlations enabled construction of relaxation time distributions for WR313-G and WR313-H, which were used to predict PSD through comparison with mercury injection capillary pressure measurements. With the computed PSD, L+H and L+G methane solubility was determined from in-situ pressure and temperature. The intersection of the L+G and L+H curves for various pore sizes allowed calculation of the depth range of the 3P equilibrium zone. As in previous studies at Blake Ridge and Hydrate Ridge, the top of the 3P zone moves upwards with increasing water depth and overlies the bulk 3P equilibrium depth. In clays at Walker Ridge, the predicted thickness of the 3P zone is approximately 35 m, but in coarse sands it is only a few meters due to the difference in absolute pore sizes and the width of the PSD. The thick 3P zone in the clays may explain in part why the BSR is only observed in the sand layers at Walker Ridge, although other factors may influence the presence or absence of a BSR.

  13. Application of tracer injection tests to characterize rock matrix block size distribution and dispersivity in fractured aquifers

    NASA Astrophysics Data System (ADS)

    Sharifi Haddad, Amin; Hassanzadeh, Hassan; Abedi, Jalal; Chen, Zhangxin

    2014-03-01

    The complexity of mass transfer processes between the mobile and immobile zones in geohydrologic settings and the limitations that currently exist in the characterization of contaminated sites demand the development of improved models. In this work, we present a model that describes the mass transfer in structured porous media. This model considers divergent radial advective-dispersive transport in fractures and diffusive mass transfer inside rock matrix blocks. The heterogeneous nature of fractured formations is included with the integration of various distributions of rock matrix block sizes into the transport model. Breakthrough curves generated based on the developed model are analyzed to investigate the effects of the rate of injection, dispersivity and the immobile to mobile porosity ratio on mass transfer between mobile and immobile zones. It is shown that the developed model, in conjunction with tracer data collected from a monitoring well, can be used to estimate the dispersivity and fracture intensity. Results reveal that the dispersivity is independent of the rock matrix block size distribution for dispersion-dominant transport in fractures. These findings are used to develop a methodology to characterize rock matrix block size distribution in fractured aquifers and to estimate dispersivity based on a tracer test, which will improve our decisions concerning the remediation of contaminated sites.

  14. Determination of molecular-weight distribution and average molecular weights of block copolymers by gel-permeation chromatography.

    PubMed

    Nesterov, V V; Kurenbin, O I; Krasikov, V D; Belenkii, B G

    1987-01-01

    The problem of preparation of a block copolymer of precise molecular-weight distribution (MWD) and with heterogeneous composition on the basis of gel-permeation chromatography (GPC) data has been investigated. It has been shown that in MWD calculations the distribution f(p) of the composition p in individual GPC fractions should be taken into account. The type of the f(p) functions can be simultaneously established by an independent method, such as use of adsorption-column or thin-layer chromatography sensitive to the composition of the copolymer. It has also been shown that the actual f(p) may be replaced by a corresponding piecewise distribution, of simple form, without decrease in the precision of calculation of the MWD and average molecular weights of most known block copolymers. PMID:18964273

  15. Size Distribution for Potentially Unstable Rock Masses and In Situ Rock Blocks Using LIDAR-Generated Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Mavrouli, O.; Corominas, J.; Jaboyedoff, M.

    2015-07-01

    In this paper, two analytical procedures that are independent of the existence of empirical data are presented for the calculation of (1) the size distribution of potentially unstable rock masses that expresses the potential rockfall size distribution, including big volumes corresponding to potential rare events with low susceptibility of failure and (2) the in situ block distribution on the slope face. Two approaches are, respectively, used. The first one involves the detection of kinematically unstable surfaces on a digital elevation model (DEM) and on orthophotos and the calculation of the volumes resting on them. For the second one, the in situ block volumes formed by the intersection of the existing discontinuity sets are calculated using a high-resolution DEM. The procedures are presented through an application example in Andorra, in particular at the chute of Forat Negre. The results from the first procedure indicate that it is kinematically possible to have mobilized volumes of some thousands of cubic meters; however, these are considered rare events with low susceptibility of failure. The size distribution of potentially unstable rock masses for big volume events was well fitted by a power law with an exponent of -0.5. The in situ block distribution on the slope face from the second procedure, assuming three types of intersection between the joints of the existing discontinuity sets and two extreme cases of discontinuity persistence, was also found to follow a power law, but with an exponent of -1.3. The comparison with the block volume distribution observed in the field on the slope face indicates that in reality discontinuities have a very high persistence and that considering only their visible trace length overestimates volumes, which is conservative.
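
    As a hedged illustration of fitting a power-law exponent to a block-volume distribution, the sketch below regresses the empirical complementary cumulative distribution in log-log space on synthetic heavy-tailed volumes; the exponents reported in the record come from DEM-derived data, not from this procedure.

        # Fit a power-law tail exponent to a block-volume distribution via the
        # empirical CCDF in log-log space (toy synthetic data, illustrative only).
        import numpy as np

        rng = np.random.default_rng(7)
        # Synthetic volumes with a heavy tail (Pareto), standing in for in-situ blocks
        volumes = (rng.pareto(1.3, 2000) + 1) * 0.01          # m^3, hypothetical

        v_sorted = np.sort(volumes)
        ccdf = 1.0 - np.arange(1, v_sorted.size + 1) / v_sorted.size
        mask = (ccdf > 0) & (v_sorted > np.percentile(v_sorted, 50))   # fit the tail only
        slope, intercept = np.polyfit(np.log10(v_sorted[mask]), np.log10(ccdf[mask]), 1)
        print(f"fitted tail exponent ~ {slope:.2f}")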

  16. The Effect of Audio and Visual Aids on Task Performance in Distributed Collaborative Virtual Environments

    NASA Astrophysics Data System (ADS)

    Ullah, Sehat; Richard, Paul; Otman, Samir; Mallem, Malik

    2009-03-01

    Collaborative virtual environments (CVEs) have recently gained the attention of many researchers due to their numerous potential application domains. Cooperative virtual environments, where users simultaneously manipulate objects, are one of the subfields of CVEs. In this paper we present a framework that enables two users to cooperatively manipulate objects in a virtual environment while sitting at two separate machines connected through a local network. In addition, the article presents the use of sensory feedback (audio and visual) and investigates its effects on cooperation and user performance. Six volunteer subjects had to cooperatively perform a peg-in-hole task. Results revealed that visual and auditory aids increase users' performance; however, the majority of users preferred visual feedback to audio. We hope this framework will greatly help in the development of CAD systems that allow designers to collaborate on designs while geographically distant. Other potential application domains include cooperative assembly, surgical training, and rehabilitation systems.

  17. Technophiles to Newbies: The Challenge of Supporting Distributed Teams to Maintain Engagement in Virtual Worlds

    NASA Technical Reports Server (NTRS)

    Griffith, Karen

    2011-01-01

    The purpose of this paper is to look for links in a virtual trainee's interest and self-efficacy in a simulated event as it relates to their previous self-reported technical skill level. Ultimately, the idea would be to provide the right amount of support at the right place at the right time to set the conditions for maximum transfer of the skill sets to the work place. An anecdotal recap of a recent experiment of a medium-scale training event produced in a virtual world will provide examples for discussion. In July 2010, a virtual training event was produced for the Air Force Research Lab's Games for Team Training (GaMeTT) at the Patriot Exercise at Volk Field in Wisconsin. There were 29 EMEDS participants who completed the simulated OCO event using the OLIVE gaming engine. Approximately 25 avatars were present at any given time; including role players, observers, coordinators and participants.

  18. Supporting Distributed Team Working in 3D Virtual Worlds: A Case Study in Second Life

    ERIC Educational Resources Information Center

    Minocha, Shailey; Morse, David R.

    2010-01-01

    Purpose: The purpose of this paper is to report on a study into how a three-dimensional (3D) virtual world (Second Life) can facilitate socialisation and team working among students working on a team project at a distance. This models the situation in many commercial sectors where work is increasingly being conducted across time zones and between…

  19. Distributed event-triggered consensus tracking of second-order multi-agent systems with a virtual leader

    NASA Astrophysics Data System (ADS)

    Jie, Cao; Zhi-Hai, Wu; Li, Peng

    2016-05-01

    This paper investigates the consensus tracking problems of second-order multi-agent systems with a virtual leader via event-triggered control. A novel distributed event-triggered transmission scheme is proposed, which is intermittently examined at constant sampling instants. Only partial neighbor information and local measurements are required for event detection. Then the corresponding event-triggered consensus tracking protocol is presented to guarantee second-order multi-agent systems to achieve consensus tracking. Numerical simulations are given to illustrate the effectiveness of the proposed strategy. Project supported by the National Natural Science Foundation of China (Grant Nos. 61203147, 61374047, and 61403168).
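
    The sketch below is a toy sampled-data simulation in the spirit of the scheme described above: double-integrator agents track a virtual leader using only the states last broadcast by their neighbours, and a broadcast is triggered at a sampling instant when the local state has drifted too far from its broadcast value. The gains, graph, and trigger threshold are assumptions, not the protocol or the stability conditions of the paper.

        # Toy event-triggered consensus tracking for double-integrator agents
        # following a virtual leader (illustrative assumptions throughout).
        import numpy as np

        n, dt, steps = 4, 0.01, 3000
        A = np.array([[0, 1, 0, 1],          # hypothetical undirected communication graph
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [1, 0, 1, 0]], float)
        b = np.array([1.0, 0, 0, 0])         # only agent 0 hears the virtual leader

        x = np.random.default_rng(0).uniform(-2, 2, n)   # positions
        v = np.zeros(n)                                  # velocities
        xb, vb = x.copy(), v.copy()                      # last broadcast states
        events = 0

        for k in range(steps):
            t = k * dt
            x0, v0 = t, 1.0                              # virtual leader: constant speed
            # Event detection at sampling instants: rebroadcast if the state drifted
            drift = np.abs(x - xb) + np.abs(v - vb)
            trig = drift > 0.05                          # arbitrary threshold
            xb[trig], vb[trig] = x[trig], v[trig]
            events += int(trig.sum())
            # Consensus-tracking control law built from broadcast information only
            u = (-(A * (xb[:, None] - xb[None, :])).sum(axis=1)
                 - (A * (vb[:, None] - vb[None, :])).sum(axis=1)
                 - b * ((xb - x0) + (vb - v0)))
            x, v = x + dt * v, v + dt * u

        print("broadcast events:", events, " final tracking errors:", np.round(x - x0, 3))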

  20. Effects of Burnup and Temperature Distributions to CANDLE Burnup of Block-Type High Temperature Gas Cooled Reactor

    SciTech Connect

    Yasunori Ohoka; Ismile; Hiroshi Sekimoto

    2004-07-01

    The CANDLE burnup strategy is a new reactor burnup concept, where the distributions of fuel nuclide densities, neutron flux, and power density move with the same constant speed along the core axis from bottom to top or from top to bottom of the core and without any change in their shapes. It can be applied easily to the block-type high temperature gas cooled reactor using an appropriate burnable poison mixed with uranium oxide fuel. In the present study, the burnup distribution and the temperature distribution in the core are investigated and their effects on the CANDLE burnup core characteristics are studied. In this study, natural gadolinium is used as the burnable poison. With a fuel enrichment of 15%, a natural gadolinium concentration of 3.0% and a fuel pin pitch of 6.6 cm, the CANDLE burnup is realized with a burning region moving speed of 29 cm/year and an axial half-width of the power density distribution of 1.5 m for the uniform group constant case at 900 K. When the effect of nuclide change by burnup is considered, the burning region speed becomes 25 cm/year and the axial half-width of the power density distribution becomes 1.25 m. When the temperature distribution effect is considered, the effects on the core characteristics are smaller than the burnup distribution effect. The maximum fuel temperature of the parallel flow case is higher than that of the counter flow case. (authors)

  1. MISR Center Block Time Tool

    Atmospheric Science Data Center

    2013-04-01

    The misr_time tool calculates the block center times for MISR Level 1B2 files. This is ... version of the IDL package or by using the IDL Virtual Machine application. The IDL Virtual Machine is bundled with IDL and is ...

  2. A transportable and easily configurable multi-projector display system for distributed virtual reality applications

    NASA Astrophysics Data System (ADS)

    Grimes, Holly; McMenemy, Karen R.; Ferguson, R. S.

    2008-02-01

    This paper details how simple PC software, a small network of consumer level PCs, some do-it-yourself hardware and four low cost video projectors can be combined to form an easily configurable and transportable projection display with applications in virtual reality training. This paper provides some observations on the practical difficulties of using such a system, its effectiveness in delivering a VE for training and what benefit may be offered through the deployment of a large number of these low cost environments.

  3. John H. Dillon Medal: Tapered Block Copolymers: Tuning Self-Assembly and Properties by Manipulating Monomer Segment Distributions

    NASA Astrophysics Data System (ADS)

    Epps, Thomas

    The self-assembly of block copolymers (BCPs) presents unique opportunities to design materials with attractive chemical and mechanical properties based on the ability of BCPs to form periodic structures with nanoscale domain spacings. One area of recent progress in our group focuses on the behavior of tapered BCPs in which the segment distribution at the interface between blocks is synthetically varied to tune morphology, domain density profiles, thermal transitions as well as mechanical and transport properties. Two application targets for these materials are lithium-ion conducting membranes for batteries and nanostructured thin films for nanotemplates and barrier membranes. In the first target area, we found that the taper volume fraction and composition allow us to manipulate the self-assembly of salt-doped BCPs in a well-defined manner that permits optimization of morphology and ion-content. Additionally, we found that the tapered interfaces influence the glass-transition behavior of the ion-conducting block leading to significant changes in lithium-ion transport (ion conductivity). In the second target area, we found the taper content alters the rate of self-assembly as well as the rate of island/hole formation (and ultimate island/hole size) upon thermal annealing. Additionally, using reflectivity techniques, we probed the domain density profiles as a function of taper composition and linked these profiles to changes in domain spacing and glass transition temperature. Overall, these studies show the versatility of tapering to provide a unique handle for simultaneously optimizing multiple materials properties.

  4. Practical quantum private query of blocks based on unbalanced-state Bennett-Brassard-1984 quantum-key-distribution protocol

    NASA Astrophysics Data System (ADS)

    Wei, Chun-Yan; Gao, Fei; Wen, Qiao-Yan; Wang, Tian-Yin

    2014-12-01

    Until now, the only kind of practical quantum private query (QPQ), quantum-key-distribution (QKD)-based QPQ, focuses on the retrieval of a single bit. In fact, a meaningful message is generally composed of multiple adjacent bits (i.e., a multi-bit block). To obtain a message from the database, the user Alice has to query l times to get each bit a_i. In this condition, the server Bob could gain Alice's privacy once he obtains the address she queried in any of the l queries, since each a_i contributes to the message Alice retrieves. Apparently, the longer the retrieved message is, the worse the user privacy becomes. To solve this problem, via an unbalanced-state technique and based on a variant of the multi-level BB84 protocol, we present a protocol for QPQ of blocks, which allows the user to retrieve a multi-bit block from the database in one query. Our protocol is somewhat like the high-dimensional version of the first QKD-based QPQ protocol proposed by Jacobi et al., but some nontrivial modifications are necessary.

  5. Practical quantum private query of blocks based on unbalanced-state Bennett-Brassard-1984 quantum-key-distribution protocol

    PubMed Central

    Wei, Chun-Yan; Gao, Fei; Wen, Qiao-Yan; Wang, Tian-Yin

    2014-01-01

    Until now, the only kind of practical quantum private query (QPQ), quantum-key-distribution (QKD)-based QPQ, has focused on the retrieval of a single bit. In practice, a meaningful message is generally composed of multiple adjacent bits (i.e., a multi-bit block). To obtain such a message from the database, the user Alice has to query l times, once for each bit ai. In this case, the server Bob can compromise Alice's privacy as soon as he learns the address she queried in any of the l queries, since each ai contributes to the message Alice retrieves. Clearly, the longer the retrieved message, the worse the user privacy becomes. To solve this problem, using an unbalanced-state technique and a variant of the multi-level BB84 protocol, we present a protocol for QPQ of blocks, which allows the user to retrieve a multi-bit block from the database in one query. Our protocol resembles a high-dimensional version of the first QKD-based QPQ protocol proposed by Jacobi et al., but some nontrivial modifications are necessary. PMID:25518810

  6. Scaling properties and fractality in the distribution of coding segments in eukaryotic genomes revealed through a block entropy approach

    NASA Astrophysics Data System (ADS)

    Athanasopoulou, Labrini; Athanasopoulos, Stavros; Karamanos, Kostas; Almirantis, Yannis

    2010-11-01

    Statistical methods, including block entropy based approaches, have already been used in the study of long-range features of genomic sequences seen as symbol series, either considering the full alphabet of the four nucleotides or the binary purine/pyrimidine character set. Here we explore the alternation of short protein-coding segments with long noncoding spacers in entire chromosomes, focusing on the scaling properties of block entropy. In previous studies, it has been shown that the sizes of noncoding spacers follow power-law-like distributions in most chromosomes of eukaryotic organisms from distant taxa. We have developed a simple evolutionary model based on well-known molecular events (segmental duplications followed by elimination of most of the duplicated genes) which reproduces the observed linearity in log-log plots. The scaling properties of block entropy H(n) have been studied in several works. Their findings suggest that linearity on a semilogarithmic scale characterizes symbol sequences which exhibit fractal properties and long-range order; such linearity has been shown, for example, for the logistic map at the Feigenbaum accumulation point. The present work starts with the observation that the block entropy of Cantor-like binary symbol series scales in a similar way. Then, we perform the same analysis for the full set of human chromosomes and for several chromosomes of other eukaryotes. A similar but less extended linearity on a semilogarithmic scale, indicating fractality, is observed, while randomly formed surrogate sequences clearly lack this type of scaling. Genomic sequences always present entropy values much lower than their random surrogates. Symbol sequences produced by the aforementioned evolutionary model follow the scaling found in genomic sequences, thus corroborating the conjecture that “segmental duplication-gene elimination” dynamics may have contributed to the observed long-range order in the coding/noncoding alternation in genomic sequences.
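
    For readers unfamiliar with the quantity discussed above, the sketch below (not from the paper) shows one conventional way to estimate the block entropy H(n) of a binary symbol series from the empirical frequencies of overlapping length-n words; the function names and toy series are illustrative only.

```python
# Minimal sketch (not the authors' code): block entropy H(n) of a symbol series,
# estimated from the empirical frequencies of overlapping words of length n.
from collections import Counter
from math import log2


def block_entropy(seq, n):
    """Shannon entropy (bits) of the distribution of length-n blocks in seq."""
    words = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = Counter(words)
    total = len(words)
    return -sum((c / total) * log2(c / total) for c in counts.values())


if __name__ == "__main__":
    import random
    random.seed(0)
    # Toy binary series: 1 = "coding-like" symbol, 0 = "noncoding-like" symbol.
    series = [random.choice((0, 1)) for _ in range(100_000)]
    for n in range(1, 9):
        print(n, round(block_entropy(series, n), 3))
    # For an i.i.d. random series H(n) grows roughly linearly with n (~n bits);
    # the fractal, Cantor-like sequences discussed above instead show H(n)
    # growing roughly linearly in log n (linearity on a semilogarithmic scale).
```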

  7. Blocking temperature distribution and long-term stability of spin-valve structures with Mn-based antiferromagnets

    NASA Astrophysics Data System (ADS)

    Nozières, J. P.; Jaren, S.; Zhang, Y. B.; Zeltser, A.; Pentek, K.; Speriosu, V. S.

    2000-04-01

    We have determined the blocking temperature distribution Tb(T) in spin-valve sheet films with FeMn, IrMn, PtMn, NiMn and CrPdMn antiferromagnetic layers (AFM). We find a clear dependence of Tb(T) on the field applied during the measurement, which we link to the reversal state of the pinned layer through the torque applied on the AFM. Using fields large enough to fully reverse the pinned layer, NiMn and PtMn show little or no components of the blocking temperature below 150 °C, whereas both IrMn and CrPdMn (the latter in a "synthetic" AFM design) exhibit important low-temperature trailing edges of the distribution. Accelerated annealing experiments in a low reversed field equivalent to the self-demagnetizing field in a micron-size head allow us to access the time to failure and the failure activation energy, from which the expected lifetime can be assessed. We find a general correlation between the expected lifetime and the fraction of loose (i.e., unblocked) AFM spins at any given temperature. Accordingly, only NiMn and PtMn are found to exhibit sufficient long-term stability for disk-drive operations.

  8. Extended Virtual Spring Mesh (EVSM): The Distributed Self-Organizing Mobile Ad Hoc Network for Area Exploration

    SciTech Connect

    Kurt Derr

    2011-12-01

    Mobile Ad hoc NETworks (MANETs) are distributed self-organizing networks that can change locations and configure themselves on the fly. This paper focuses on an algorithmic approach for the deployment of a MANET within an enclosed area, such as a building in a disaster scenario, which can provide a robust communication infrastructure for search and rescue operations. While a virtual spring mesh (VSM) algorithm provides the scalable, self-organizing, and fault-tolerant capabilities required by a MANET, the VSM lacks deployment mechanisms for blanket coverage of an area and does not provide an obstacle avoidance mechanism. This paper presents a new technique, an extended VSM (EVSM) algorithm, that provides the following novelties: (1) new control laws for exploration and expansion to provide blanket coverage, (2) virtual adaptive springs enabling the mesh to expand as necessary, (3) adaptation to communication disturbances by varying the density and movement of mobile nodes, and (4) new metrics to assess the performance of the EVSM algorithm. Simulation results show that EVSM provides up to 16% more coverage and is 3.5 times faster than VSM in environments with eight obstacles.
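
    The "virtual spring" idea above can be illustrated with a toy update rule in which each node feels Hooke-like forces from its neighbours toward a preferred separation and moves a small step along the net force. The sketch below is a generic VSM-style step under assumed parameters (L0, k, dt are invented), not the EVSM control laws of the paper.

```python
# Toy virtual-spring-mesh step (illustrative only, not the EVSM control laws):
# each node is pulled toward/pushed away from neighbours so that pairwise
# distances relax toward a natural spring length L0.
import math


def vsm_step(positions, neighbors, L0=10.0, k=0.1, dt=1.0):
    """One explicit update of node positions under pairwise virtual springs."""
    new_positions = {}
    for i, (xi, yi) in positions.items():
        fx = fy = 0.0
        for j in neighbors[i]:
            xj, yj = positions[j]
            dx, dy = xj - xi, yj - yi
            d = math.hypot(dx, dy) or 1e-9
            f = k * (d - L0)          # Hooke-like force along the link
            fx += f * dx / d
            fy += f * dy / d
        new_positions[i] = (xi + dt * fx, yi + dt * fy)
    return new_positions


# Example: three nodes that start too close spread out toward the rest length.
pos = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0)}
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
for _ in range(200):
    pos = vsm_step(pos, nbrs)
print(pos)  # pairwise distances approach L0 = 10
```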

  9. Education Block Grant Alters State Role and Provides Greater Local Discretion. Report to the Congress of the United States.

    ERIC Educational Resources Information Center

    Comptroller General of the U.S., Washington, DC.

    The Omnibus Budget Reconciliation Act of 1981 consolidated numerous federal programs into the education block grant and shifted primary administrative responsibility to states. States have to develop a formula to distribute 80 percent of their block grant funds to local education agencies, which have virtually complete discretion in deciding the…

  10. Madrigal - Lessons Learned from 25 years of Evolution from a Single-Instrument Database to a Distributed Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Holt, J. M.; Rideout, W.; van Eyken, T.

    2005-12-01

    Madrigal is a distributed, open source virtual observatory which has been operational for 25 years. During that time it has evolved from a simple database system for the Millstone Hill Incoherent Scatter Radar to a full-featured virtual observatory distributed among five major sites. Madrigal is interoperable with the CEDAR Database and, in addition to being the primary data repository for incoherent scatter radar data, contains data from many other ground-based space science instruments. Madrigal features a well-defined metadata standard, real-time capability, an interactive Web interface, provision for linking ancillary information such as html pages and figures to data, interactive plotting and a complete Web-services interface. A number of important lessons have been learned from the Madrigal project: systems such as Madrigal depend critically on robust data and metadata standards; they need to be a community project; they must permit user interface improvements to be shared across the community; they require a standard, robust interface; scientific efforts using systems such as Madrigal can lead directly to improvements in the system. An example of the last has been the development of several climatological models from Madrigal data. Several features of Madrigal, such as a global search capability, were added in response to requests from the model developers. The models have recently been incorporated into Madrigal and provide a powerful basis for event discovery based on deviations of data from the climatological average. Madrigal will never completely solve the VO problem, but it will make life much easier for future VO projects.

  11. Generalized parton distributions and Deeply Virtual Compton Scattering on proton at CLAS

    SciTech Connect

    R. De Masi

    2007-12-01

    Two measurements of target and beam spin asymmetries for the reaction ep→epγ were performed with CLAS at Jefferson Laboratory. Polarized 5.7 GeV electrons impinged on a longitudinally polarized ammonia target and a liquid hydrogen target, respectively. These measurements are sensitive to Generalized Parton Distributions. Sizable sin phi azimuthal angular dependences were observed in both experiments, indicating the dominance of leading-twist terms and the possibility of extracting combinations of Generalized Parton Distributions on the nucleon.

  12. A revolution in Distributed Virtual Globes creation with e-CORCE space program

    NASA Astrophysics Data System (ADS)

    Antikidis, Jean-Pierre

    2010-05-01

    Space applications now take part in our everyday life continuously and, most of the time, invisibly. Meteorology, telecommunications and, more recently, GPS-driven applications are fully woven into our modern and comfortable way of life. A new revolution is underway in which space remote sensing technology will make the whole Earth available in digital form. Present requirements for high-resolution digital Earth creation are pushing space technology toward a new frontier that could be called the "1 day to one week, 1 meter, 1 Earth" challenge. The e-CORCE vision (e-Constellation d'Observation Recurrente Cellulaire) relies on a completely new avenue for creating a full virtual Earth: a constellation of small satellites operated as sensors connected to a powerful Internet-based ground network. To handle this enormous quantity of information (10,000 billion metric pixels), the concept combines extensive use of psycho-visual compression, over-simplified platforms treated as space IP nodes, and a massive worldwide Grid-based system composed of more than 40 receiving and processing nodes. The presentation will introduce the technological hurdles and the way upcoming cyber-infrastructure technologies called WAG (Wide Area Grid) may open a practical and economically sound solution to this never-attempted challenge.

  13. Preliminary investigations on the determination of three-dimensional dose distributions using scintillator blocks and optical tomography

    SciTech Connect

    Kroll, Florian; Karsch, Leonhard; Pawelke, Jörg

    2013-08-15

    Purpose: Clinical QA in teletherapy as well as the characterization of experimental radiation sources for future medical applications requires effective methods for measuring three-dimensional (3D) dose distributions generated in a water-equivalent medium. Current dosimeters based on ionization chambers, diodes, thermoluminescence detectors, radiochromic films, or polymer gels exhibit various drawbacks: High quality 3D dose determination is either very sophisticated and expensive or requires high amounts of effort and time for the preparation or read out. New detectors based on scintillator blocks in combination with optical tomography are studied, since they have the potential to facilitate the desired cost-effective, transportable, and long-term stable dosimetry system that is able to determine 3D dose distributions with high spatial resolution in a short time.Methods: A portable detector prototype was set up based on a plastic scintillator block and four digital cameras. During irradiation the scintillator emits light, which is detected by the fixed cameras. The light distribution is then reconstructed by optical tomography, using maximum-likelihood expectation maximization. The result of the reconstruction approximates the 3D dose distribution. First performance tests of the prototype using laser light were carried out. Irradiation experiments were performed with ionizing radiation, i.e., bremsstrahlung (6 to 21 MV), electrons (6 to 21 MeV), and protons (68 MeV), provided by clinical and research accelerators.Results: Laser experiments show that the current imaging properties differ from the design specifications: The imaging scale of the optical systems is position dependent, ranging from 0.185 mm/pixel to 0.225 mm/pixel. Nevertheless, the developed dosimetry method is proven to be functional for electron and proton beams. Induced radiation doses of 50 mGy or more made 3D dose reconstructions possible. Taking the imaging properties into account, determined
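
    The abstract states that the light distribution is reconstructed by optical tomography using maximum-likelihood expectation maximization (MLEM). The sketch below shows a generic MLEM update for an emission distribution given a known system matrix; it illustrates the algorithm family only, is not the prototype's software, and the matrix and measurement values are invented.

```python
# Generic MLEM iteration (illustrative; not the prototype's reconstruction code).
# A: system matrix (n_measurements x n_voxels), y: measured projections,
# x: current estimate of the emitted light (dose-related) distribution.
import numpy as np


def mlem(A, y, n_iter=50):
    n_vox = A.shape[1]
    x = np.ones(n_vox)                      # flat initial estimate
    sens = A.sum(axis=0)                    # sensitivity of each voxel
    sens[sens == 0] = 1.0                   # avoid division by zero
    for _ in range(n_iter):
        forward = A @ x                     # expected projections
        forward[forward == 0] = 1e-12
        x *= (A.T @ (y / forward)) / sens   # multiplicative MLEM update
    return x


# Tiny synthetic example: 2 "voxels" seen by 3 "camera" measurements.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
true_x = np.array([2.0, 6.0])
y = A @ true_x
print(mlem(A, y))  # converges toward [2, 6]
```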

  14. Deviations from the Gutenberg-Richter law on account of a random distribution of block sizes

    NASA Astrophysics Data System (ADS)

    Sibiryakov, B. P.

    2015-10-01

    This paper studies the properties of a continuum with structure. Because of the characteristic size of the structure, difference relations do not automatically transform into differential ones. It is impossible to consider an infinitesimal volume of the body, to which the major conservation laws could be applied, because the minimum representative volume must contain at least a few elementary microstructures. The corresponding equations of motion are of infinite order, and their solutions include, along with the usual sound waves, unusual waves with abnormally low velocities that have no lower limit. It is shown that in such media weak perturbations can grow or decay without limit. The number of complex roots of the corresponding dispersion equation, which can be interpreted as the number of unstable solutions, depends on the specific surface of cracks almost linearly on a logarithmic scale, as in the seismological Gutenberg-Richter law. If the distance from one pore (crack) to another is a random variable with some distribution, another dispersion equation must be written and different scenarios examined depending on the statistical characteristics of that distribution. In this case there are significant deviations from the Gutenberg-Richter law, and this theoretical result corresponds to some field and laboratory observations.
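
    For reference, the Gutenberg-Richter law that the abstract compares against is usually written as a log-linear frequency-magnitude relation; the standard textbook form is reproduced below purely as background.

```latex
% Standard Gutenberg-Richter frequency-magnitude relation (background only):
% N(M) is the number of events with magnitude >= M; a and b are empirical constants.
\log_{10} N(M) = a - b\,M
```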

  15. Deviations from the Gutenberg–Richter law on account of a random distribution of block sizes

    SciTech Connect

    Sibiryakov, B. P.

    2015-10-27

    This paper studies the properties of a continuum with structure. Because of the characteristic size of the structure, difference relations do not automatically transform into differential ones. It is impossible to consider an infinitesimal volume of the body, to which the major conservation laws could be applied, because the minimum representative volume must contain at least a few elementary microstructures. The corresponding equations of motion are of infinite order, and their solutions include, along with the usual sound waves, unusual waves with abnormally low velocities that have no lower limit. It is shown that in such media weak perturbations can grow or decay without limit. The number of complex roots of the corresponding dispersion equation, which can be interpreted as the number of unstable solutions, depends on the specific surface of cracks almost linearly on a logarithmic scale, as in the seismological Gutenberg–Richter law. If the distance from one pore (crack) to another is a random variable with some distribution, another dispersion equation must be written and different scenarios examined depending on the statistical characteristics of that distribution. In this case there are significant deviations from the Gutenberg–Richter law, and this theoretical result corresponds to some field and laboratory observations.

  16. Light extraction improvement of blue light-emitting diodes with a Metal-distributed Bragg reflector current blocking layer

    NASA Astrophysics Data System (ADS)

    Liu, Na; Yi, Xiaoyan; Wang, Li; Sun, Xuejiao; Liu, Lei; Liu, Zhiqiang; Wang, Junxi; Li, Jinmin

    2015-03-01

    The p-electrode of blue light-emitting diode (LED) chips has a low transmittance in the blue part of the spectrum. The blue light emitted from the quantum wells under the p-electrode is severely absorbed by the p-electrode, which causes a decrease in blue light extraction efficiency (LEE). In this study, we first designed a current blocking layer (CBL) structure with high blue-light reflectivity using simulation software. The simulation results show that this structure can effectively improve the blue LEE, and the structure was then verified by experiment. Electroluminescence measurements show that the LED with a Metal-distributed Bragg reflector (M-DBR) CBL exhibited better optical performance than the LEDs with a SiO2 CBL and a DBR CBL. The M-DBR CBL effectively increases the blue light reflectivity and prevents light absorption by the metal p-electrode, improving the LED's blue LEE.

  17. Policy Building Blocks: Helping Policymakers Determine Policy Staging for the Development of Distributed PV Markets: Preprint

    SciTech Connect

    Doris, E.

    2012-04-01

    There is a growing body of qualitative and a limited body of quantitative literature supporting the common assertion that policy drives development of clean energy resources. Recent work in this area indicates that the impact of policy depends on policy type, length of time in place, and economic and social contexts of implementation. This work aims to inform policymakers about the impact of different policy types and to assist in the staging of those policies to maximize individual policy effectiveness and development of the market. To do so, this paper provides a framework for policy development to support the market for distributed photovoltaic systems. Next steps include mathematical validation of the framework and development of specific policy pathways given state economic and resource contexts.

  18. Reconstruction of temperature distribution in a steel block using an ultrasonic sensor array

    NASA Astrophysics Data System (ADS)

    Gajdacsi, A.; Jarvis, A. J. C.; Cegla, F. B.

    2013-01-01

    The variability of conventional sensors used for ultrasonic inspections can be greatly reduced by permanently installing them, thereby eliminating uncertainties caused by positional variations and coupling fluids. Much smaller changes then become measurable, so monitoring the onset of material degradation becomes feasible. One typical degradation mechanism affecting the petrochemical industry is high-temperature hydrogen attack, in which hydrogen diffuses into the material at high partial pressures and forms methane voids by reaction with the carbon in the steel. These methane voids cause a small drop in ultrasonic velocity which, it is hoped, can be monitored with an ultrasonic array. The accuracy of reconstructing a non-uniform ultrasonic propagation velocity distribution is therefore vital and is investigated by applying heat to the specimen to replicate the effects of material degradation. A number of proposed reconstruction algorithms are compared for both simulated and real experimental measurements, and the results are compared with our scattering model.

  19. Virtual libraries of tetrapyrrole macrocycles. Combinatorics, isomers, product distributions, and data mining.

    PubMed

    Taniguchi, Masahiko; Du, Hai; Lindsey, Jonathan S

    2011-09-26

    A software program (PorphyrinViLiGe) has been developed to enumerate the type and relative amounts of substituted tetrapyrrole macrocycles in a virtual library formed by one of four different classes of reactions. The classes include (1) 4-fold reaction of n disubstituted heterocycles (e.g., pyrroles or diiminoisoindolines) to form β-substituted porphyrins, β-substituted tetraazaporphyrins, or α- or β-substituted phthalocyanines; (2) combination of m aminoketones and n diones to form m × n pyrroles, which upon 4-fold reaction give β-substituted porphyrins; (3) derivatization of an 8-point tetrapyrrole scaffold with n reagents, and (4) 4-fold reaction of n aldehydes and pyrrole to form meso-substituted porphyrins. The program accommodates variable ratios of reactants, reversible or irreversible reaction (reaction classes 1 and 2), and degenerate modes of formation. Pólya's theorem (for enumeration of cyclic entities) has also been implemented and provides validation for reaction classes 3 and 4. The output includes the number and identity of distinct reaction-accessible substituent combinations, the number and identity of isomers thereof, and the theoretical mass spectrum. Provisions for data mining enable assessment of the number of products having a chosen pattern of substituents. Examples include derivatization of an octa-substituted phthalocyanine with eight reagents to afford a library of 2,099,728 members (yet only 6435 distinct substituent combinations) and reversible reaction of six distinct disubstituted pyrroles to afford 2649 members (yet only 126 distinct substituent combinations). In general, libraries of substituted tetrapyrrole macrocycles occupy a synthetically accessible region of chemical space that is rich in isomers (>99% or 95% for the two examples, respectively). PMID:21866949
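
    The bookkeeping of "degenerate modes of formation" in reaction class 4 amounts to counting substituent patterns up to the symmetry of the macrocycle. The sketch below is a generic Burnside/Pólya count for four meso positions under an assumed C4 rotational symmetry; it illustrates the counting idea only and is not the PorphyrinViLiGe program.

```python
# Illustrative Burnside/Polya count (not PorphyrinViLiGe): number of distinct
# meso-substitution patterns of a 4-fold symmetric macrocycle built from n
# aldehydes, assuming only C4 rotations relate equivalent products.
from itertools import product


def distinct_patterns_brute_force(n):
    """Count orbits of the n**4 substituent 4-tuples under cyclic rotation."""
    seen = set()
    for combo in product(range(n), repeat=4):
        rotations = [combo[i:] + combo[:i] for i in range(4)]
        seen.add(min(rotations))
    return len(seen)


def distinct_patterns_burnside(n):
    """Burnside's lemma for the cyclic group C4: (n^4 + n^2 + 2n) / 4."""
    return (n**4 + n**2 + 2 * n) // 4


for n in (1, 2, 3, 6):
    assert distinct_patterns_brute_force(n) == distinct_patterns_burnside(n)
    print(n, distinct_patterns_burnside(n))
```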

  20. Site-specific assessment of the rockfall and the rock block volume distribution relations, using a LIDAR generated DEM

    NASA Astrophysics Data System (ADS)

    Mavrouli, Olga; Corominas, Jordi; Jaboyedoff, Michel

    2014-05-01

    The quantification of the rockfall hazard and, in particular of the rockfall propagation, requires information on the expected probability or frequency of rockfalls of a given magnitude (size), usually in the form of magnitude-frequency M-F relations. Two kinds of relations are needed. The first one characterises the rockfall masses that can be potentially detached from the slope face giving information on the volume distribution of rockfalls. From now on, this will be referred to as potential rockfall volume distribution VDR. For fragmental rockfalls, the evaluation of the VDR can be a first step towards the temporal M-F, The second one characterises the volume distribution of the rock blocks that result from the disintegration of the previous rockfall masses due to impact with the ground. This one will be referred to as rock block volume distribution VDB. In this work we present two analytical procedures which are independent from the existence of empirical data, for: (i) The calculation of the potential VDR that refers to big volumes with low probability of occurrence. This is realised by detection of the kinematically unstable surfaces on a DEM and on orthophotos, and calculation of the volumes that correspond to them. The basic assumptions here describing a conservative scenario of very low probability are: (a) the rockfall mass is detached entirely at a single rockfall event, without taking into account that smaller successive failures are possible instead; (b) all discontinuity sets are present everywhere in the slope and have infinite persistence; and (c) big stepped-path failures are possible. (ii) The assessment of the in-situ rock blocks volume distribution on the slope face, VDB, by calculation of the volume of the prisms which are formed by the intersection of the existing discontinuity sets and are kinematically unstable. This is also based on data obtained by DEM analysis. A high-resolution DEM obtained by Lidar is used. Both procedures are presented

  1. Thermally driven asymmetric responses of grains versus spin-glass related distributions of blocking temperature in exchange biased Co/IrMn bilayers

    SciTech Connect

    Baltz, V.

    2013-02-11

    Controlling ferromagnetic/antiferromagnetic blocking temperatures in exchange-bias-based devices appears crucial for applications. The blocking temperature is ascribed to the ability of both antiferromagnetic grains and interfacial spin-glass-like phases to withstand ferromagnetic magnetization reversal. To better understand the respective contributions of grains versus spin-glass, blocking temperature distributions were measured after various thermal treatments for cobalt/iridium-manganese bilayers. The high-temperature contribution linked to antiferromagnetic grains shifts towards lower temperatures above a thermal annealing threshold. In contrast, the occurrence and evolution of training effects for the low-temperature contribution only agree with its inferred interfacial spin-glass-like origin.

  2. Virtuality Distributions and γγ* -> π0 Transition at Handbag Level

    SciTech Connect

    Radyushkin, Anatoly V.

    2015-09-01

    We outline a new approach to transverse momentum dependence in hard processes using as an example the exclusive transition γ*γ → π0 at the handbag level. We start with the coordinate representation for a matrix element ⟨p|O(0,z)|0⟩ of a bilocal operator O(0,z) describing a hadron with momentum p. Treated as a function of (pz) and z², it is parametrized through the virtuality distribution amplitude (VDA) Φ(x, σ), with x being Fourier-conjugate to (pz) and σ Laplace-conjugate to z². For intervals with z⁺ = 0, we introduce the transverse momentum distribution amplitude (TMDA) Ψ(x, k⊥), and write it in terms of the VDA Φ(x, σ). The results of covariant calculations, written in terms of Φ(x, σ), are converted into expressions involving Ψ(x, k⊥). We propose simple models for soft VDAs/TMDAs, and use them for comparison of handbag results with experimental (BaBar and BELLE) data on the pion transition form factor.

  3. Virtual colonoscopy

    MedlinePlus

    Colonoscopy - virtual; CT colonography; Computed tomographic colonography; Colography - virtual ... Virtual colonoscopy is different from regular colonoscopy . Regular colonoscopy uses a long, lighted tool called a colonoscope that is ...

  4. Grid-based Infrastructure and Distributed Data Mining for Virtual Observatories

    NASA Astrophysics Data System (ADS)

    Karimabadi, H.; Sipes, T.; Ferenci, S.; Fujimoto, R.; Olschanowsky, R.; Balac, N.; Roberts, A.

    2006-12-01

    Data access as well as analysis of geographically distributed data sets are challenges common to a wide variety of fields. To address this problem, we have been working on the development of two pieces of technology: a grid-based software called IDDAT that supports processing and remote data analysis of widely distributed data and RemoteMiner which is a parallel, distributed data mining software. IDDAT and RemoteMiner work seamlessly and provide the necessary backend functionalities hidden from the user. The user accesses the system through a single web portal where data selection is performed and data mining functions are planned. The data mining functions are prepared for execution by IDDat services. Preparation can include moving data to the processing location via services built over Storage Resource Broker (SRB), preprocessing data, and allocating computation and storage resources. IDDat services also initiate and monitor data mining functions and provide services to allow the results to be shared among other users. In this presentation, we illustrate a general user workflow and the provided functionalities. We will also provide an overview of the technical issues and design features such as storage scheduling, efficient network traffic management and resource selection.

  5. Distribution and textures of K-feldspar grains in the George Ashley Block layered-aplite pegmatite intrusive

    NASA Astrophysics Data System (ADS)

    Kleck, W. D.

    2013-12-01

    Both Jahns & Tuttle (1963) and London (2008) note that the distribution of potassium is neither uniform nor symmetrical in some pegmatite bodies. A detailed chemical and mineral analysis of the George Ashley Block (Kleck & Foord 1999) shows that the distribution of K-feldspar over the entire body is generally uniform, but not symmetrical. The amounts of quartz, plagioclase, muscovite, and garnet are neither uniform nor symmetrical. It is noted that layered-aplite pegmatite intrusives (terminology of Jahns & Tuttle 1963) are intruded horizontally, and it is suggested that these are the pegmatite bodies which have this non-uniform distribution of minerals. These types of pegmatite bodies are distinctly different from other pegmatite bodies in several ways. The core zone in these bodies is not centrally located and typically divides these bodies into a pegmatitic top and an aplitic bottom; the top and bottom appear to be contemporaneous. The features and content of the border- and core-zones are not included in this discussion. The texture of the K-feldspar in the top of these bodies is generally pegmatitic; some of the K-feldspar grains may exist as ultra-large grains which have a teardrop shape or minor-crystal surfaces. In the bottom of these bodies, the K-feldspar is commonly rounded grains approximately 1 mm in diameter; rarely, some K-feldspar grains here are centimeter-grained with crystal surfaces indicating growth. In the George Ashley Block, the concentration and distribution of K-feldspar is inversely symmetrical in the top vs. bottom. In the top it increases toward the core zone; K-feldspar--20 increasing to 60 vol% (all values one significant figure) and K2O--3 increasing to 8 wt%. In the bottom it decreases toward the core zone; K-feldspar--40 decreasing to 0 vol% and K2O--4 decreasing to 0 wt%. The two trends are approximately parallel and the total amounts are approximately constant. The suggested conditions and mechanisms (with the added condition of

  6. Distribution of Recombination Crossovers and the Origin of Haplotype Blocks: The Interplay of Population History, Recombination, and Mutation

    PubMed Central

    Wang, Ning; Akey, Joshua M.; Zhang, Kun; Chakraborty, Ranajit; Jin, Li

    2002-01-01

    Recent studies suggest that haplotypes are arranged into discrete blocklike structures throughout the human genome. Here, we present an alternative haplotype block definition that assumes no recombination within each block but allows for recombination between blocks, and we use it to study the combined effects of demographic history and various population genetic parameters on haplotype block characteristics. Through extensive coalescent simulations and analysis of published haplotype data on chromosome 21, we find that (1) the combined effects of population demographic history, recombination, and mutation dictate haplotype block characteristics and (2) haplotype blocks can arise in the absence of recombination hot spots. Finally, we provide practical guidelines for designing and interpreting studies investigating haplotype block structure. PMID:12384857
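
    One common way to operationalize a "no recombination within a block" definition (though not necessarily the exact procedure of this paper) is the four-gamete test: two SNPs show evidence of historical recombination if all four combinations of their alleles are observed among haplotypes. The sketch below greedily extends blocks until that test fails; the function names and toy haplotypes are illustrative.

```python
# Illustrative greedy block partition based on the four-gamete test
# (a common operationalization of "no recombination within a block";
# not necessarily the exact algorithm of the paper above).
def four_gametes(haplotypes, i, j):
    """True if all four allele combinations of SNPs i and j are observed."""
    return len({(h[i], h[j]) for h in haplotypes}) == 4


def partition_into_blocks(haplotypes, n_snps):
    blocks, start = [], 0
    for snp in range(1, n_snps):
        # Break the block as soon as any pair inside it fails the test.
        if any(four_gametes(haplotypes, k, snp) for k in range(start, snp)):
            blocks.append((start, snp - 1))
            start = snp
    blocks.append((start, n_snps - 1))
    return blocks


# Toy biallelic haplotypes (0/1 alleles) over 5 SNPs.
haps = [
    (0, 0, 0, 1, 1),
    (0, 0, 1, 1, 1),
    (1, 1, 0, 0, 0),
    (1, 1, 1, 0, 1),
]
print(partition_into_blocks(haps, 5))  # e.g. [(0, 1), (2, 2), (3, 4)]
```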

  7. Information assimilation and distribution challenges and goals for real and virtual journals.

    PubMed

    Modlin, Irvin M; Adler, Guido; Alexander, Kathey; Arnold, Rudolf; Brenner, David A; Corazziari, Enrico; Floch, Martin H; LaPorte, Ronald E; Peterson, Walter L; Quigley, Eamonn M; Shapiro, Michael D; Spechler, Stuart J; Spiller, Robin C; Tytgat, Guido N; Wiegers, Wolfram

    2005-03-01

    The distribution of biomedical information was transfigured over three centuries ago with the introduction of scientific journals. This enabled the widespread dissemination of data to global audiences and greatly facilitated not only the advance of science but amplified the interaction between investigators despite their different locations. This process continued to expand in a linear fashion prior to the emergence of the Internet. The latter system has prompted a phenomenal augmentation of information accessibility, and its ever-expanding use has resulted in an exponential increase in the demand for digital technology and online resources. This technology has achieved unprecedented acceptance in the scientific domain and enabled publishers to expeditiously produce and distribute journal contents online. Such unparalleled access to information has sparked incendiary debate within the scientific community and among journal publishers in regard to numerous issues. It is thus much debated as to who has the right to "own" or control intellectual property, whether information should be made freely available to the online global community, how to gauge the legitimacy and authenticity of published research, and the need to reexamine the feasibility and profitability of paper journals in consideration of the digital, online formats that continue to gain popularity. To assess the current status of the situation, a meeting of journal editors, research scientists, and publishing executives was held in Constance, Germany, on June 26, 2004, to discuss these issues and formulate strategies and recommendations for the future of biomedical publishing. Herewith we provide a summation (manifesto) of the meeting's proceedings and provide a consensus opinion with the aim of illuminating the subject and also proposing some putative solutions for the major challenges that currently confront the scientific and publishing community. PMID:15718859

  8. Molecular dynamics study of the encapsulation capability of a PCL-PEO based block copolymer for hydrophobic drugs with different spatial distributions of hydrogen bond donors and acceptors.

    PubMed

    Patel, Sarthak K; Lavasanifar, Afsaneh; Choi, Phillip

    2010-03-01

    Molecular dynamics simulation was used to study the potential of using a block copolymer containing three poly(epsilon-caprolactone) (PCL) blocks of equal length connected to one end of a poly(ethylene oxide) (PEO) block, designated as PEO-b-3PCL, to encapsulate two classes of hydrophobic drugs with distinctively different molecular structures. In particular, the first class of drugs consisted of two cucurbitacin drugs (CuB and CuI) that contain multiple hydrogen bond donors and acceptors evenly distributed over their molecules, while the other class of drugs (fenofibrate and nimodipine) contain essentially only clustered hydrogen bond acceptors. In the case of the cucurbitacin drugs, the results showed that PEO-b-3PCL lowered the Flory-Huggins interaction parameters (chi) considerably (i.e., increased the drug solubility) compared to the linear di-block copolymer PEO-b-PCL with the same PCL/PEO (w/w) ratio of 1.0. However, the opposite effect was observed for fenofibrate and nimodipine. Analysis of the intermolecular interactions indicates that the number of hydrogen bonds formed between the three PCL blocks and the cucurbitacin drugs is significantly higher than that for the linear di-block copolymer. On the other hand, owing to the absence of hydrogen bond donors and the clustering of the hydrogen bond acceptors on the fenofibrate and nimodipine molecules, the number of hydrogen bonds formed in the multi-PCL block environment is significantly reduced, leading to unfavourable chi values. The findings of the present work suggest that a multi-hydrophobic block architecture could potentially increase the drug loading for hydrophobic drugs with structures containing evenly distributed multiple hydrogen bond donors and acceptors. PMID:19962756
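
    For context, the Flory-Huggins interaction parameter chi referred to above enters the standard mixing free energy per lattice site; the expression below is the textbook form and is shown only as background, not as the specific estimator used in these simulations.

```latex
% Textbook Flory-Huggins free energy of mixing per lattice site (background only):
% phi_i are volume fractions, N_i degrees of polymerization, chi the interaction parameter.
\frac{\Delta G_{\mathrm{mix}}}{k_B T} =
  \frac{\phi_1}{N_1}\ln\phi_1 + \frac{\phi_2}{N_2}\ln\phi_2 + \chi\,\phi_1\phi_2
```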

  9. CROSS DRIVE: A Collaborative and Distributed Virtual Environment for Exploitation of Atmospherical and Geological Datasets of Mars

    NASA Astrophysics Data System (ADS)

    Cencetti, Michele

    2016-07-01

    European space exploration missions have produced huge data sets of potentially immense value for research as well as for planning and operating future missions. For instance, the Mars exploration programs comprise a series of missions, with launches ranging from the past to beyond the present, that are anticipated to produce exceptional volumes of data offering prospects for research breakthroughs and for advancing further activities in space. These collected data include a variety of information, such as imagery, topography, and atmospheric and geochemical datasets, which has resulted in, and still demands, databases, versatile visualisation tools and data reduction methods. Such a rate of valuable data acquisition requires scientists, researchers and computer scientists to coordinate storage, processing and the relevant tools to enable efficient data analysis. However, the current position is that the expert teams from various disciplines, the databases and the tools are fragmented, leaving little scope for unlocking the data's value through collaborative activities. The benefits of collaborative virtual environments have been demonstrated in various industrial fields, allowing real-time multi-user collaborative work among people from different disciplines, and exploiting advanced immersive virtual environments (IVE) has been recognized as an important interaction paradigm to facilitate future space exploration. The current work presents preliminary results from the CROSS DRIVE project, which received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 607177 and is aimed at implementing a distributed virtual workspace for collaborative scientific discovery, mission planning and operations. The purpose of the CROSS DRIVE project is to lay the foundations of collaborative European workspaces for space science. It will demonstrate the feasibility and

  10. Heart Block

    MedlinePlus

    ... Block Explore Heart Block What Is... Electrical System & EKG Results Types Causes Who Is at Risk Signs & ... heart block. Doctors use a test called an EKG (electrocardiogram) to help diagnose heart block. This test ...

  11. Virtual colonoscopy

    MedlinePlus

    Colonoscopy - virtual; CT colonography; Computed tomographic colonography; Colography - virtual ... standards for gastroenterologists for performing and interpreting diagnostic computed tomography colonography: 2011 update. Gastroenterology . 2011;141:2240-2266. ...

  12. Platinum-group element abundance and distribution in chromite deposits of the Acoje Block, Zambales Ophiolite Complex, Philippines

    USGS Publications Warehouse

    Bacuta, G.C., Jr.; Kay, R.W.; Gibbs, A.K.; Lipin, B.R.

    1990-01-01

    Platinum-group elements (PGE) occur in ore-grade concentration in some of the chromite deposits related to the ultramafic section of the Acoje Block of the Zambales Ophiolite Complex. The deposits are of three types: Type 1 - associated with cumulate peridotites at the base of the crust; Type 2 - in dunite pods from the top 1 km of mantle harzburgite; and Type 3 - like Type 2, but in deeper levels of the harzburgite. Most of the deposits have chromite compositions that are high in Cr with Cr/(Cr + Al) (expressed as chromium index, Cr#) > 0.6; high-Al (Cr# < 0.6) deposits also occur. The PGE patterns show either a negative slope (Ir > Pd, thought to be characteristic of PGE-barren deposits) or a positive slope (Ir < Pd, characteristic of PGE-rich deposits). Iridium, Ru and Os commonly occur as micron-size laurite (sulfide) inclusions in unfractured chromite. Platinum and Pd occur as alloy inclusions (and possibly as solid solution) in interstitial Ni-Cu sulfides and as tellurobismuthides in serpentine and altered sulfides. Variability of PGE distribution may be explained by alteration, crystal fractionation or partial melting processes. Alteration and metamorphism were ruled out, because PGE contents do not correlate with degree of serpentinization or the abundance and type (hydroxyl versus non-hydroxyl) of silicate inclusions in chromite. Preliminary Os isotopic data do not support crustal contamination as a source of the PGEs in the Acoje deposits. The anomalous PGE concentrations in Type 1 high-Cr chromite deposits are attributed to two stages of enrichment: an early enrichment of their mantle source from previous melting events and a later stage of sulfide segregation accompanying chromite crystallization. High-Al chromite deposits which crystallized from basalts derived from relatively low degrees of melting owe their low PGE content to partitioning of PGEs in sulfides and alloys that remain in the mantle. High-Cr deposits crystallized from melts that were

  13. Measuring Virtual Student-Student Cooperation: A Case Study on the Evaluation of Cooperative Learning in a Virtual Distributed Computer and Law Course

    ERIC Educational Resources Information Center

    Nett, Bernhard

    2005-01-01

    This article describes the evaluation of a German Computer and Law (C&L) seminar, which was conducted in an experimental, distributed manner with five cooperating institutes. The evaluation addressed the question of how the course supported cooperative learning among the students of the different participating institutes. As…

  14. Sedimentation and reservoir distribution related to a tilted block system in the Sardinia Oligocene-Miocene rift (Italy)

    SciTech Connect

    Tremolieres, P.; Cherchi, A.; Eschard, R.; De Graciansky, P.C.; Montadert, L.

    1988-08-01

    In the western Mediterranean basin lies a rift system about 250 km long and 50 km wide, whose infilling outcrops in central Sardinia. Seismic reflection surveys show its offshore extension. Block tilting started during the late Oligocene and lasted through Aquitanian-early Burdigalian time. Two main fault trends, with synthetic and antithetic throws, define the more-or-less collapsed blocks. This morphology guided the transit and trapping of sediments. Sedimentation started in a continental environment and then, from the Chattian onward, continued under marine conditions. In the central part, the series can reach a thickness of 2,000 m. The basement composition and the volcanic products related to the main fault motion controlled the nature of the synrift deposits. According to their location in the rift, the tilted blocks trap either continental deposits or marine siliciclastic or carbonate deposits. In the deeper part of the graben, sands were redeposited by gravity flows into the basinal marls. The younger prerift deposits are of Eocene to early Oligocene age and locally comprise thick coal layers. Postrift deposits, mainly marls, sealed the blocks and the synrift sedimentary bodies. In middle and late Miocene time some faults were reactivated during compressional events. Then a Quaternary extensional phase created the Campidano graben, filled with about 1,000 m of sediments superimposed on the Oligocene-Miocene rift.

  15. Study of Generalized Parton Distributions and Deeply Virtual Compton Scattering on the nucleon with the CLAS and CLAS12 detectors at the Jefferson Laboratory

    SciTech Connect

    Guegan, Baptiste

    2012-11-01

    The exclusive leptoproduction of a real photon is considered to be the "cleanest" way to access the Generalized Parton Distributions (GPDs). This process, called Deeply Virtual Compton Scattering (DVCS), lN → lNγ, is sensitive to all four GPDs. Measuring the DVCS cross section is one of the main goals of this thesis, in which we present the work performed to extract the DVCS cross section over a wide phase space from JLab data at a beam energy of 6 GeV.

  16. Water distributions in PS-b-P(S-g-PEO) block grafted copolymer system in aqueous solutions revealed by contrast variation SANS study

    SciTech Connect

    Chen, Wei-Ren; Hong, Kunlun; Li, Xin; Liu, Emily; Liu, Yun; Shew, Chwen-Yang; Smith, Gregory Scott

    2010-01-01

    In this report, we present a contrast variation small-angle neutron scattering (SANS) study of PS-b-P(S-g-PEO) block graft copolymers in aqueous media at a concentration of 10 mg/ml. By varying the solvent D2O/H2O ratio, the scattering contributions from the water molecules and from the micellar constituent components can be determined. Based on the commonly used core-shell model, a theoretical coherent scattering cross section incorporating the effect of water penetration is developed and used to analyze the SANS I(Q). We have successfully quantified the micellar structure and the number of water molecules associated with the core and the corona, and we find that the overall micellar hydration level increases with the molecular weight of the hydrophilic PEO side chains. This study provides key information for understanding solvent distributions within self-assembled amphiphilic block grafted copolymers.

  17. Fast Booting Many Similar Virtual Machines

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolin; Wang, Zhenlin; Liang, Shuang; Zhang, Zhengyi; Luo, Yingwei; Li, Xiaoming

    Virtual Machines have been commonly used for server consolidation in data centers, network classrooms, and cloud computing environments. Although booting up a virtual machine takes much less time than booting up a physical computer, booting up multiple virtual machines on a single physical server still takes a lot of time. We propose a method to speed up the booting process when a set of similar virtual machines share snapshot-enabled storage. Our method exploits the massive memory page sharing that stems from reads of common disk blocks by these virtual machines. Our experiments show that the second virtual machine may reduce the booting time by half.
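
    The page-sharing idea can be illustrated with a toy content-addressed cache in which identical disk blocks read by different virtual machines resolve to a single in-memory copy. The sketch below is schematic and uses invented names; it is not the authors' mechanism.

```python
# Toy content-based block sharing (schematic; not the authors' implementation):
# reads of identical disk blocks by similar VMs resolve to one shared buffer.
import hashlib


class SharedBlockCache:
    def __init__(self):
        self._by_digest = {}          # content digest -> single shared buffer
        self.shared_hits = 0

    def read_block(self, raw_bytes: bytes) -> bytes:
        digest = hashlib.sha256(raw_bytes).hexdigest()
        if digest in self._by_digest:
            self.shared_hits += 1     # another VM already loaded this block
        else:
            self._by_digest[digest] = raw_bytes
        return self._by_digest[digest]


cache = SharedBlockCache()
kernel_block = b"\x7fELF..." + b"\x00" * 4089   # pretend 4 KiB disk block
for vm in range(8):                              # eight similar VMs booting
    cache.read_block(kernel_block)
print(cache.shared_hits, "reads served from the shared copy")  # prints 7
```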

  18. Solar flux-density distribution due to partially shaded/blocked mirrors using the separation of variables/superposition technique with polynomial and Gaussian sunshapes

    SciTech Connect

    Elsayed, M.; Fathalah, K.A.

    1996-05-01

    In previous work, the separation of variables/superposition technique was used to predict the flux density distribution on the receiver surfaces of solar central receiver plants. In this paper, further developments of the technique are given. A numerical technique is derived to carry out the convolution of the sunshape and error density functions. Also, a simplified numerical procedure is presented to determine the basic flux density function on which the technique depends. The technique is used to predict the receiver solar flux distribution using two sunshapes, polynomial and Gaussian distributions. The results predicted with the technique are validated by comparison with experimental results from mirrors both with and without partial shading/blocking of their surfaces.
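
    The convolution step mentioned above can be illustrated numerically: a sunshape profile is smeared by an error density through discrete convolution. The snippet below uses NumPy's generic convolve on assumed Gaussian profiles with invented widths; it illustrates the operation only and is not the paper's numerical technique.

```python
# Illustrative discrete convolution of a Gaussian sunshape with a Gaussian
# error density (assumed profiles and widths; not the paper's procedure).
import numpy as np

theta = np.linspace(-20e-3, 20e-3, 2001)        # angular grid, rad
dtheta = theta[1] - theta[0]


def gaussian(x, sigma):
    return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))


sunshape = gaussian(theta, 2.3e-3)              # assumed sun half-width
errors = gaussian(theta, 1.5e-3)                # assumed slope/tracking errors

effective = np.convolve(sunshape, errors, mode="same") * dtheta
print(effective.sum() * dtheta)                 # stays normalized (close to 1)
```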

  19. Building blocks for Virtual Observatories in Heliophysics

    NASA Astrophysics Data System (ADS)

    Bentley, R. D.; Csillaghy, A.; Pierantoni, G.; Benson, K.; Soldati, M.; Le Blanc, A.

    2011-12-01

    HELIO has made considerable progress in establishing a set of services that can be used by the heliophysics community to identify interesting events and phenomena and then provide access to relevant observations. It has done this by adopting and adapting existing standards and protocols where suitable ones were available and developing new techniques where necessary. Initially we found that we were working in a partial vacuum. Many concepts and protocols were available, but in a number of cases they did not quite match what was needed and it was necessary to extend or adapt them; we had to make many pragmatic decisions in order to achieve our goals within the allowed time. Our experience highlights the need to establish a set of standards and protocols for metadata and interfaces that are better suited to the needs of our communities, but based on and developed in conjunction with existing standards bodies such as the IVOA, IPDA, etc. We will describe what measures we had to take and set them in the context of creating a general research environment that can be used as a framework in which new projects can be developed. We will also describe how the CASSIS project is trying to foster ideas of standards in metadata and interfaces in order to engender interoperability. The HELIO and CASSIS projects are funded under the European Commission's Seventh Framework Programme (FP7).

  20. Population Blocks.

    ERIC Educational Resources Information Center

    Smith, Martin H.

    1992-01-01

    Describes an educational game called "Population Blocks" that is designed to illustrate the concept of exponential growth of the human population and some potential effects of overpopulation. The game material consists of wooden blocks; 18 blocks are painted green (representing land), 7 are painted blue (representing water); and the remaining…

  1. Virtual Machine Language

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher; Page, Dennis; O'Reilly, Taifun; Fteichert, Ralph; Lock, Patricia; Lin, Imin; Naviaux, Keith; Sisino, John

    2005-01-01

    Virtual Machine Language (VML) is a mission-independent, reusable software system for programming spacecraft operations. Features of VML include a rich set of data types, named functions, parameters, IF and WHILE control structures, polymorphism, and on-the-fly creation of spacecraft commands from calculated values. Spacecraft functions can be abstracted into named blocks that reside in files aboard the spacecraft. These named blocks accept parameters and execute in a repeatable fashion. The sizes of uplink products are minimized by the ability to call blocks that implement most of the command steps. This block approach also enables some autonomous operations aboard the spacecraft, such as aerobraking, telemetry conditional monitoring, and anomaly response, without developing autonomous flight software. Operators on the ground write blocks and command sequences in a concise, high-level, human-readable programming language (also called VML). A compiler translates the human-readable blocks and command sequences into binary files (the operations products). The flight portion of VML interprets the uplinked binary files. The ground subsystem of VML also includes an interactive sequence-execution tool hosted on workstations, which runs sequences at several thousand times real-time speed, affords debugging, and generates reports. This tool enables iterative development of blocks and sequences within times of the order of seconds.
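
    To make the "named blocks with parameters" idea concrete, the sketch below mimics it in Python; it is not VML syntax, and the block names, commands, and interpreter are invented for illustration.

```python
# Python stand-in for the "named blocks with parameters" idea (not VML syntax).
# A block is a named, parameterized sequence of command steps that an
# interpreter can invoke repeatedly from a compact uplinked product.
BLOCKS = {}


def block(fn):
    """Register a named block in the on-board library."""
    BLOCKS[fn.__name__] = fn
    return fn


@block
def warm_up_camera(heater_id, minutes):
    yield ("HEATER_ON", heater_id)
    yield ("WAIT_MINUTES", minutes)
    yield ("HEATER_OFF", heater_id)


def run_sequence(calls):
    """Interpret a compact sequence that mostly just calls named blocks."""
    for name, args in calls:
        for command in BLOCKS[name](*args):
            print("ISSUE", command)          # stand-in for commanding hardware


# A small "uplink product": two block calls instead of six explicit steps.
run_sequence([("warm_up_camera", (1, 10)), ("warm_up_camera", (2, 5))])
```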

  2. Virtual Congresses

    PubMed Central

    Lecueder, Silvia; Manyari, Dante E.

    2000-01-01

    A new form of scientific medical meeting has emerged in the last few years—the virtual congress. This article describes the general role of computer technologies and the Internet in the development of this new means of scientific communication, by reviewing the history of “cyber sessions” in medical education and the rationale, methods, and initial results of the First Virtual Congress of Cardiology. Instructions on how to participate in this virtual congress, either actively or as an observer, are included. Current advantages and disadvantages of virtual congresses, their impact on the scientific community at large, and future developments and possibilities in this area are discussed. PMID:10641960

  3. System Management Software for Virtual Environments

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Scott, Stephen L

    2007-01-01

    Recently there has been an increased interest in the use of system-level virtualization using mature solutions such as Xen, QEMU, or VMWare. These virtualization platforms are being used in distributed and parallel environments including high performance computing. The use of virtual machines within such environments introduces new challenges to system management. These include tedious tasks such as deploying para-virtualized host operating systems to support virtual machine execution or virtual overlay networks to connect these virtual machines. Additionally, there is the problem of machine definition and deployment, which is complicated by differentiation in the underlying virtualization technology. This paper discusses tools for the deployment and management of both host operating systems and virtual machines in clusters. We begin with an overview of system-level virtualization and move on to a description of tools that we have developed to aid with these environments. These tools extend prior work in the area of cluster installation, configuration and management.

  4. Virtual Laboratories and Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Hut, Piet

    2008-05-01

    Since we cannot put stars in a laboratory, astrophysicists had to wait till the invention of computers before becoming laboratory scientists. For half a century now, we have been conducting experiments in our virtual laboratories. However, we ourselves have remained behind the keyboard, with the screen of the monitor separating us from the world we are simulating. Recently, 3D on-line technology, developed first for games but now deployed in virtual worlds like Second Life, is beginning to make it possible for astrophysicists to enter their virtual labs themselves, in virtual form as avatars. This has several advantages, from new possibilities to explore the results of the simulations to a shared presence in a virtual lab with remote collaborators on different continents. I will report my experiences with the use of Qwaq Forums, a virtual world developed by a new company (see http://www.qwaq.com).

  5. Microscopic variations in interstitial and intracellular structure modulate the distribution of conduction delays and block in cardiac tissue with source–load mismatch

    PubMed Central

    Hubbard, Marjorie Letitia; Henriquez, Craig S.

    2012-01-01

    Aims Reentrant activity in the heart is often correlated with heterogeneity in both the intracellular structure and the interstitial structure surrounding cells; however, the combined effect of cardiac microstructure and interstitial resistivity in regions of source–load mismatch is largely unknown. The aim of this study was to investigate how microstructural variations in cell arrangement and increased interstitial resistivity influence the spatial distribution of conduction delays and block in poorly coupled regions of tissue. Methods and results Two-dimensional 0.6 cm × 0.6 cm computer models with idealized and realistic cellular structure were used to represent a monolayer of ventricular myocytes. Gap junction connections were distributed around the periphery of each cell at 10 μm intervals. Regions of source–load mismatch were added to the models by increasing the gap junction and interstitial resistivity in one-half of the tissue. Heterogeneity in cell shape and cell arrangement along the boundary between well-coupled and poorly coupled tissue increased variability in longitudinal conduction delays to as much as 10 ms before the onset of conduction block, resulting in wavefront breakthroughs with pronounced curvature at distinct points along the boundary. Increasing the effective interstitial resistivity reduced source–load mismatch at the transition boundary, which caused a decrease in longitudinal conduction delay and an increase in the number of wavefront breakthroughs. Conclusion Microstructural variations in cardiac tissue facilitate the formation of isolated sites of wavefront breakthrough that may enable abnormal electrical activity in small regions of diseased tissue to develop into more widespread reentrant activity. PMID:23104912

  6. Virtual Labs and Virtual Worlds

    NASA Astrophysics Data System (ADS)

    Boehler, Ted

    2006-12-01

    Virtual Labs and Virtual Worlds Coastline Community College has under development several virtual lab simulations and activities that range from biology, to language labs, to virtual discussion environments. Imagine a virtual world that students enter online, by logging onto their computer from home or anywhere they have web access. Upon entering this world they select a personalized identity represented by a digitized character (avatar) that can freely move about, interact with the environment, and communicate with other characters. In these virtual worlds, buildings, gathering places, conference rooms, labs, science rooms, and a variety of other “real world” elements are evident. When characters move about and encounter other people (players) they may freely communicate. They can examine things, manipulate objects, read signs, watch video clips, hear sounds, and jump to other locations. Goals of critical thinking, social interaction, peer collaboration, group support, and enhanced learning can be achieved in surprising new ways with this innovative approach to peer-to-peer communication in a virtual discussion world. In this presentation, short demos will be given of several online learning environments including a virtual biology lab, a marine science module, a Spanish lab, and a virtual discussion world. Coastline College has been a leader in the development of distance learning and media-based education for nearly 30 years and currently offers courses through PDA, Internet, DVD, CD-ROM, TV, and Videoconferencing technologies. Its distance learning program serves over 20,000 students every year. sponsor Jerry Meisner

  7. Virtually Possible

    ERIC Educational Resources Information Center

    Mellon, Ericka

    2011-01-01

    Diane Lewis began building her popular virtual education program in a storage closet. The drab room, just big enough to squeeze in a tiny table, was her office at the headquarters of Seminole County (Florida) Public Schools. She had a computer and a small staff of temporary workers. Lewis, who managed to open two successful virtual schools for…

  8. Ionic Blocks

    ERIC Educational Resources Information Center

    Sevcik, Richard S.; Gamble, Rex; Martinez, Elizabet; Schultz, Linda D.; Alexander, Susan V.

    2008-01-01

    "Ionic Blocks" is a teaching tool designed to help middle school students visualize the concepts of ions, ionic compounds, and stoichiometry. It can also assist high school students in reviewing their subject mastery. Three dimensional blocks are used to represent cations and anions, with color indicating charge (positive or negative) and size…

  9. SkyQuery - A Prototype Distributed Query and Cross-Matching Web Service for the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Thakar, A. R.; Budavari, T.; Malik, T.; Szalay, A. S.; Fekete, G.; Nieto-Santisteban, M.; Haridas, V.; Gray, J.

    2002-12-01

    We have developed a prototype distributed query and cross-matching service for the VO community, called SkyQuery, which is implemented with hierarchical Web Services. SkyQuery enables astronomers to run combined queries on existing distributed heterogeneous astronomy archives. SkyQuery provides a simple, user-friendly interface to run distributed queries over the federation of registered astronomical archives in the VO. The SkyQuery client connects to the portal Web Service, which farms the query out to the individual archives, which are also Web Services called SkyNodes. The cross-matching algorithm is run recursively on each SkyNode. Each archive is a relational DBMS with an HTM index for fast spatial lookups. The results of the distributed query are returned as an XML DataSet that is automatically rendered by the client. SkyQuery also returns the image cutout corresponding to the query result. SkyQuery finds not only matches between the various catalogs, but also dropouts - objects that exist in some of the catalogs but not in others. This is often as important as finding matches. We demonstrate the utility of SkyQuery with a brown-dwarf search between SDSS and 2MASS, and a search for radio-quiet quasars in SDSS, 2MASS and FIRST. The importance of a service like SkyQuery for the worldwide astronomical community cannot be overstated: data on the same objects in various archives is mapped in different wavelength ranges and looks very different due to different errors, instrument sensitivities and other peculiarities of each archive. Our cross-matching algorithm performs a fuzzy spatial join across multiple catalogs. This type of cross-matching is currently often done by eye, one object at a time. A static cross-identification table for a set of archives would become obsolete by the time it was built - the exponential growth of astronomical data means that a dynamic cross-identification mechanism like SkyQuery is the only viable option. SkyQuery was funded by a
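
    The fuzzy spatial join that SkyQuery describes can be illustrated with a small, self-contained sketch: two toy catalogs are matched by angular separation within a fixed tolerance, and unmatched objects are reported as dropouts. The brute-force nearest-neighbor search, the 2-arcsecond tolerance, and the catalog fields below are illustrative assumptions only; SkyQuery itself runs the match recursively across SkyNodes using HTM indexes inside a relational DBMS.

      import math

      def angular_sep_arcsec(ra1, dec1, ra2, dec2):
          """Angular separation of two sky positions (input degrees, output arcseconds)."""
          ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
          # Haversine formula on the celestial sphere
          d = 2 * math.asin(math.sqrt(
              math.sin((dec2 - dec1) / 2) ** 2
              + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2))
          return math.degrees(d) * 3600.0

      def fuzzy_cross_match(cat_a, cat_b, tol_arcsec=2.0):
          """Return (matches, dropouts): pairs within the tolerance, plus cat_a objects
          that have no counterpart in cat_b."""
          matches, dropouts = [], []
          for a in cat_a:
              best = min(cat_b, default=None, key=lambda b: angular_sep_arcsec(
                  a["ra"], a["dec"], b["ra"], b["dec"]))
              if best is not None and angular_sep_arcsec(
                      a["ra"], a["dec"], best["ra"], best["dec"]) <= tol_arcsec:
                  matches.append((a["id"], best["id"]))
              else:
                  dropouts.append(a["id"])
          return matches, dropouts

      # Toy catalogs standing in for, e.g., SDSS and 2MASS sources
      cat_sdss = [{"id": "S1", "ra": 150.0010, "dec": 2.2000},
                  {"id": "S2", "ra": 150.1200, "dec": 2.3500}]
      cat_2mass = [{"id": "T1", "ra": 150.0012, "dec": 2.2001}]
      print(fuzzy_cross_match(cat_sdss, cat_2mass))  # ([('S1', 'T1')], ['S2'])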

  10. Engineering the Carrier Dynamics of InGaN Nanowire White Light-Emitting Diodes by Distributed p-AlGaN Electron Blocking Layers

    NASA Astrophysics Data System (ADS)

    Nguyen, Hieu Pham Trung; Djavid, Mehrdad; Woo, Steffi Y.; Liu, Xianhe; Connie, Ashfiqua T.; Sadaf, Sharif; Wang, Qi; Botton, Gianluigi A.; Shih, Ishiang; Mi, Zetian

    2015-01-01

    We report on the demonstration of a new type of axial nanowire LED heterostructure, with the use of self-organized InGaN/AlGaN dot-in-a-wire core-shell nanowire arrays. The large bandgap AlGaN shell is spontaneously formed on the sidewall of the nanowire during the growth of the AlGaN barrier of the quantum dot active region. As such, nonradiative surface recombination, which dominates the carrier dynamics of conventional axial nanowire LED structures, can be largely eliminated, leading to a significant increase in carrier lifetime, from ~0.3 ns to 4.5 ns. The luminescence emission is also enhanced by orders of magnitude. Moreover, the p-doped AlGaN barrier layers can function as distributed electron blocking layers (EBLs), which are found to be more effective in reducing electron overflow than the conventional AlGaN EBL. The device displays strong white-light emission, with a color rendering index of ~95. An output power of >5 mW is measured for a 1 mm × 1 mm device, which is more than 500 times stronger than the conventional InGaN axial nanowire LEDs without AlGaN distributed EBLs.

  11. Engineering the Carrier Dynamics of InGaN Nanowire White Light-Emitting Diodes by Distributed p-AlGaN Electron Blocking Layers

    PubMed Central

    Nguyen, Hieu Pham Trung; Djavid, Mehrdad; Woo, Steffi Y.; Liu, Xianhe; Connie, Ashfiqua T.; Sadaf, Sharif; Wang, Qi; Botton, Gianluigi A.; Shih, Ishiang; Mi, Zetian

    2015-01-01

    We report on the demonstration of a new type of axial nanowire LED heterostructure, with the use of self-organized InGaN/AlGaN dot-in-a-wire core-shell nanowire arrays. The large bandgap AlGaN shell is spontaneously formed on the sidewall of the nanowire during the growth of the AlGaN barrier of the quantum dot active region. As such, nonradiative surface recombination, which dominates the carrier dynamics of conventional axial nanowire LED structures, can be largely eliminated, leading to a significant increase in carrier lifetime, from ~0.3 ns to 4.5 ns. The luminescence emission is also enhanced by orders of magnitude. Moreover, the p-doped AlGaN barrier layers can function as distributed electron blocking layers (EBLs), which are found to be more effective in reducing electron overflow than the conventional AlGaN EBL. The device displays strong white-light emission, with a color rendering index of ~95. An output power of >5 mW is measured for a 1 mm × 1 mm device, which is more than 500 times stronger than the conventional InGaN axial nanowire LEDs without AlGaN distributed EBLs. PMID:25592057

  12. Deeply virtual Compton scattering

    NASA Astrophysics Data System (ADS)

    Marukyan, Hrachya

    2015-11-01

    This paper reviews the experimental measurements in the field of deeply virtual Compton scattering and related theoretical efforts aimed at the extraction of generalized parton distributions, objects that describe the three-dimensional structure of nucleons and nuclei. Future experiments and theoretical expectations are also considered.

  13. Virtual Satellite

    NASA Technical Reports Server (NTRS)

    Hammrs, Stephan R.

    2008-01-01

    Virtual Satellite (VirtualSat) is a computer program that creates an environment that facilitates the development, verification, and validation of flight software for a single spacecraft or for multiple spacecraft flying in formation. In this environment, enhanced functionality and autonomy of navigation, guidance, and control systems of a spacecraft are provided by a virtual satellite, that is, a computational model that simulates the dynamic behavior of the spacecraft. Within this environment, it is possible to execute any associated software, the development of which could benefit from knowledge of, and possible interaction (typically, exchange of data) with, the virtual satellite. Examples of associated software include programs for simulating spacecraft power and thermal-management systems. This environment is independent of the flight hardware that will eventually host the flight software, making it possible to develop the software while, or even before, the hardware is delivered. Optionally, by use of interfaces included in VirtualSat, real hardware can be used in place of simulations. The flight software, coded in the C or C++ programming language, is compilable and loadable into VirtualSat without any special modifications. Thus, VirtualSat can serve as a relatively inexpensive software test-bed for development, testing, integration, and post-launch maintenance of spacecraft flight software.

  14. Virtual seminars

    NASA Astrophysics Data System (ADS)

    Nelson, H. Roice

    1997-06-01

    A virtual seminar (SM) is an economical and effective instructional tool for teaching students who are at a distance from their instructor. Like conventional classroom teaching, a virtual seminar requires an instructor, a student, and a method of communication. Teleconferencing, video conferencing, intranets and the Internet give learners in a Virtual Seminar the ability to interact immediately with their mentors and receive real and relevant answers. This paper shows how industry and academia can benefit from using methods developed and experience gained in presenting the first virtual seminars to academic and petroleum industry participants in mid-1996. The information explosion in industry means that business or technical information is worthless until it is assimilated into a corporate knowledge management system. A search for specific information often turns into a filtering exercise or an attempt to find patterns and classify retrieved material. In the setting of an interactive corporate information system, virtual seminars meet the need for a productive new relationship between creative people and the flux of corporate knowledge. Experience shows that it is more efficient to circulate time-sensitive and confidential information electronically through a virtual seminar. Automating the classification of information and removing that task from the usual workload creates an electronic corporate memory and enhances the value of the knowledge to both users and a corporation. Catalogued benchmarks, best-practice standards, and Knowledge Maps (SM) of experience serve as key aids to communicating knowledge through virtual seminars and converting that knowledge into a profit-making asset.

  15. A Virtual Tour of Virtual Schools.

    ERIC Educational Resources Information Center

    Joiner, Lottie L.

    2002-01-01

    Briefly describes the eight virtual schools in the United States: Kentucky Virtual High School; Illinois Virtual High School; Florida Virtual School; CCS Web Academy in Fayetteville, North Carolina; The Virtual High School in Hudson, Massachusetts; Basehor-Linwood Virtual Charter School in Kansas; Monte Vista Online Academy in Colorado; and…

  16. Guidance of Block Needle Insertion by Electrical Nerve Stimulation: A Pilot Study of the Resulting Distribution of Injected Solution in Dogs

    PubMed Central

    Rigaud, Marcel; Filip, Patrick; Lirk, Philipp; Fuchs, Andreas; Gemes, Geza; Hogan, Quinn

    2009-01-01

    Background Little is known regarding the final needle tip location when various intensities of nerve stimulation are used to guide block needle insertion. Therefore, in control and hyperglycemic dogs, the authors examined whether lower-intensity stimulation results in injection closer to the sciatic nerve than higher-threshold stimulation. Methods During anesthesia, the sciatic nerve was approached with an insulated nerve block needle emitting either 1 mA (high-current group, n = 9) or 0.5 mA (low-current group, n = 9 in control dogs and n = 6 in hyperglycemic dogs). After positioning to obtain a distal motor response, the lowest current producing a response was identified, and ink (0.5 ml) was injected. Frozen sections of the tissue revealed whether the ink was in contact with the epineurium of the nerve, distant to it, or within it. Results In control dogs, the patterns of distribution using high-threshold (final current 0.99 ± 0.03 mA, mean ± SD) and low-threshold (final current 0.33 ± 0.08 mA) stimulation equally showed ink that was in contact with the epineurium or distant to it. One needle placement in the high-threshold group resulted in intraneural injection. In hyperglycemic dogs, all needle insertions used a low-threshold technique (n = 6, final threshold 0.35 ± 0.08 mA), and all resulted in intraneural injections. Conclusions In normal dogs, current stimulation levels in the range of 0.33–1.0 mA result in needle placement comparably close to the sciatic nerve but do not correlate with distance from the target nerve. In this experimental design, low-threshold electrical stimulation does not offer satisfactory protection against intraneural injection in the presence of hyperglycemia. PMID:18719445

  17. Incorporating Virtual Teamwork Training into MIS Curricula

    ERIC Educational Resources Information Center

    Chen, Fang; Sager, James; Corbitt, Gail; Gardiner, Stanley C.

    2008-01-01

    Due to increasing industry demand for personnel who work effectively in virtual/distributed teams, MIS students should undergo training to improve their awareness of and competence in virtual teamwork. This paper proposes a model for virtual teamwork training and describes the implementation of the model in a class where students were located in…

  18. Virtual Colonoscopy

    MedlinePlus

    Risks of virtual colonoscopy include exposure to radiation and perforation, a hole or tear in the lining of the colon.

  19. Virtual Worlds for Virtual Organizing

    NASA Astrophysics Data System (ADS)

    Rhoten, Diana; Lutters, Wayne

    The members and resources of a virtual organization are dispersed across time and space, yet they function as a coherent entity through the use of technologies, networks, and alliances. As virtual organizations proliferate and become increasingly important in society, many may exploit the technical architectures of virtual worlds, which are the confluence of computer-mediated communication, telepresence, and virtual reality originally created for gaming. A brief socio-technical history describes their early origins and the waves of progress followed by stasis that brought us to the current period of renewed enthusiasm. Examination of contemporary examples demonstrates how three genres of virtual worlds have enabled new arenas for virtual organizing: developer-defined closed worlds, user-modifiable quasi-open worlds, and user-generated open worlds. Among expected future trends are an increase in collaboration born virtually rather than imported from existing organizations, a tension between high-fidelity recreations of the physical world and hyper-stylized imaginations of fantasy worlds, and the growth of specialized worlds optimized for particular sectors, companies, or cultures.

  20. Virtual Telescopes in Education

    NASA Astrophysics Data System (ADS)

    Hoban, S.; Des Jardins, M.; Farrell, N.; Rathod, P.; Sachs, J.; Sansare, S.; Yesha, Y.; Keating, J.; Busschots, B.; Means, J.; Clark, G.; Mayo, L.; Smith, W.

    Virtual Telescopes in Education is providing the services required to operate a virtual observatory comprising distributed telescopes, including an interactive, constraint-based scheduling service, data and resource archive, proposal preparation and review environment, and a VTIE Journal. A major goal of VTIE is to elicit from learners questions about the nature of celestial objects and the physical processes that give rise to the spectacular imagery that catches their imaginations. Generation of constrained science questions will assist learners in the science process. To achieve interoperability with other NSDL resources, our approach follows the Open Archives Initiative and the W3C Semantic Web activity.

  1. Virtual memory

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    Virtual memory was conceived as a way to automate overlaying of program segments. Modern computers have very large main memories, but need automatic solutions to the relocation and protection problems. Virtual memory serves this need as well and is thus useful in computers of all sizes. The history of the idea is traced, showing how it has become a widespread, little noticed feature of computers today.

  2. Virtual polytopes

    NASA Astrophysics Data System (ADS)

    Panina, G. Yu; Streinu, I.

    2015-12-01

    Originating in diverse branches of mathematics, from polytope algebra and toric varieties to the theory of stressed graphs, virtual polytopes represent a natural algebraic generalization of convex polytopes. Introduced as elements of the Grothendieck group associated to the semigroup of convex polytopes, they admit a variety of geometrizations. The present survey connects the theory of virtual polytopes with other geometrical subjects, describes a series of geometrizations together with relations between them, and gives a selection of applications. Bibliography: 50 titles.

  3. Virtual Machine Logbook - Enabling virtualization for ATLAS

    NASA Astrophysics Data System (ADS)

    Yao, Yushu; Calafiura, Paolo; Poffet, Julien; Cavalli, Andrea; Leggett, Charles; Frédéric, Bapst

    2010-04-01

    ATLAS software has been developed mostly on the CERN Linux cluster lxplus or on similar facilities at the experiment Tier 1 centers. The fast rise of virtualization technology has the potential to change this model, turning every laptop or desktop into an ATLAS analysis platform. In the context of the CernVM project we are developing a suite of tools and CernVM plug-in extensions to promote the use of virtualization for ATLAS analysis and software development. The Virtual Machine Logbook (VML), in particular, is an application to organize the work of physicists on multiple projects, logging their progress, and speeding up "context switches" from one project to another. An important feature of VML is the ability to share with a single "click" the status of a given project with other colleagues. VML builds upon the save and restore capabilities of mainstream virtualization software like VMware, and provides a technology-independent client interface to them. A lot of emphasis in the design and implementation has gone into optimizing the save and restore process to make it practical to store many VML entries on a typical laptop disk or to share a VML entry over the network. At the same time, taking advantage of CernVM's plugin capabilities, we are extending the CernVM platform to help increase the usability of ATLAS software. For example, we added the ability to start the ATLAS event display on any computer running CernVM simply by clicking a button in a web browser. We want to integrate VML seamlessly with CernVM's unique file system design to distribute ATLAS software efficiently to every physicist's computer. The CernVM File System (CVMFS) downloads files on demand via HTTP and caches them locally for future use. This reduces download sizes by an order of magnitude, making it practical for a developer to work with multiple software releases on a virtual machine.
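
    The on-demand, cached delivery model that CVMFS provides can be sketched generically as follows; the cache directory, the URL hashing scheme, and the plain urllib download below are illustrative stand-ins and not the actual CVMFS client, which uses content-addressed storage, catalogs, and HTTP proxies.

      import hashlib
      import os
      import urllib.request

      CACHE_DIR = os.path.expanduser("~/.demo_cvmfs_cache")  # hypothetical cache location

      def fetch_on_demand(url):
          """Return a local path for `url`, downloading over HTTP only on a cache miss."""
          os.makedirs(CACHE_DIR, exist_ok=True)
          # Key the cache entry by a hash of the URL (CVMFS itself keys by content hash).
          cached = os.path.join(CACHE_DIR, hashlib.sha1(url.encode()).hexdigest())
          if not os.path.exists(cached):  # cache miss: fetch once over the network
              with urllib.request.urlopen(url) as resp, open(cached, "wb") as out:
                  out.write(resp.read())
          return cached  # subsequent calls are served from local disk

      # First call downloads; later calls for the same URL never touch the network.
      # local_path = fetch_on_demand("http://example.org/software/release/lib.so")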

  4. Improving hole injection and carrier distribution in InGaN light-emitting diodes by removing the electron blocking layer and including a unique last quantum barrier

    SciTech Connect

    Cheng, Liwen; Chen, Haitao; Wu, Shudong

    2015-08-28

    The effects of removing the AlGaN electron blocking layer (EBL), and using a last quantum barrier (LQB) with a unique design in conventional blue InGaN light-emitting diodes (LEDs), were investigated through simulations. Compared with the conventional LED design that contained a GaN LQB and an AlGaN EBL, the LED that contained an AlGaN LQB with a graded composition and no EBL exhibited enhanced optical performance and less efficiency droop. This effect was caused by enhanced electron confinement and hole injection efficiency. Furthermore, when the composition grading of the AlGaN LQB was made triangular, the performance improved further and the efficiency droop was lowered. The simulation results indicated that the enhanced hole injection efficiency and uniform distribution of carriers observed in the quantum wells were caused by the smoothing and thinning of the potential barrier for the holes. This allowed a greater number of holes to tunnel into the quantum wells from the p-type regions in the proposed LED structure.

  5. Virtual Tower

    SciTech Connect

    Wayne, R.A.

    1997-08-01

    The primary responsibility of an intrusion detection system (IDS) operator is to monitor the system, assess alarms, and summon and coordinate the response team when a threat is acknowledged. The tools currently provided to the operator are somewhat limited: monitors must be switched, keystrokes must be entered to call up intrusion sensor data, and communication with the response force must be maintained. The Virtual Tower is an operator interface assembled from low-cost commercial-off-the-shelf hardware and software; it enables large amounts of data to be displayed in a virtual manner that provides instant recognition for the operator and increases assessment accuracy in alarm annunciator and control systems. This is accomplished by correlating and fusing the data into a 360-degree visual representation that employs color, auxiliary attributes, video, and directional audio to prompt the operator. The Virtual Tower would be a valuable low-cost enhancement to existing systems.

  6. Virtual Violence.

    PubMed

    2016-08-01

    In the United States, exposure to media violence is becoming an inescapable component of children's lives. With the rise in new technologies, such as tablets and new gaming platforms, children and adolescents increasingly are exposed to what is known as "virtual violence." This form of violence is not experienced physically; rather, it is experienced in realistic ways via new technology and ever more intense and realistic games. The American Academy of Pediatrics continues to be concerned about children's exposure to virtual violence and the effect it has on their overall health and well-being. This policy statement aims to summarize the current state of scientific knowledge regarding the effects of virtual violence on children's attitudes and behaviors and to make specific recommendations for pediatricians, parents, industry, and policy makers. PMID:27432848

  7. DIstributed VIRtual System (DIVIRS) Project

    NASA Technical Reports Server (NTRS)

    Schorr, Herbert; Neuman, B. Clifford; Gaines, Stockton R.; Mizell, David

    1996-01-01

    The development of Prospero moved from the University of Washington to ISI, and several new versions of the software were released from ISI during the contract period. Changes in the first release from ISI included bug fixes and extensions to support the needs of specific users. Among these changes was a new option to directory queries that allows attributes to be returned for all files in a directory together with the directory listing. This change greatly improves the performance of their server and reduces the number of packets sent across their trans-Pacific connection to the rest of the Internet. Several new access methods were added to the Prospero file system. The Prospero Data Access Protocol was designed to support secure retrieval of data from systems running Prospero.

  8. Virtual sound for virtual reality

    SciTech Connect

    Blattner, M.M.; Papp, A.L. III

    1993-02-01

    The computational limitations of real-time interactive computing do not meet our requirements for producing realistic images for virtual reality in a convincing manner. Regardless of the real-time restrictions on virtual reality interfaces, the representations can be no better than the graphics. Computer graphics is still limited in its ability to generate complex objects such as landscapes and humans. Nevertheless, useful and convincing visualizations are made through a variety of techniques. The central theme of this article is that a similar situation is true with sound for virtual reality. It is beyond our ability to create interactive soundscapes that faithfully reproduce real-world sounds; however, by choosing one's application carefully and using sound to enhance a display rather than only mimic real-world scenes, a very effective use of sound can be made.

  9. Virtual sound for virtual reality

    SciTech Connect

    Blattner, M.M. (Cancer Center, Houston, TX, Dept. of Biomathematics; Lawrence Livermore National Lab., CA; California Univ., Davis, CA); Papp, A.L. III (Lawrence Livermore National Lab., CA)

    1993-02-01

    The computational limitations of real-time interactive computing do not meet our requirements for producing realistic images for virtual reality in a convincing manner. Regardless of the real-time restrictions on virtual reality interfaces, the representations can be no better than the graphics. Computer graphics is still limited in its ability to generate complex objects such as landscapes and humans. Nevertheless, useful and convincing visualizations are made through a variety of techniques. The central theme of this article is that a similar situation is true with sound for virtual reality. It is beyond our ability to create interactive soundscapes that faithfully reproduce real-world sounds; however, by choosing one's application carefully and using sound to enhance a display rather than only mimic real-world scenes, a very effective use of sound can be made.

  10. Virtualize Me!

    ERIC Educational Resources Information Center

    Waters, John K.

    2009-01-01

    John Abdelmalak, director of technology for the School District of the Chathams, was pretty sure it was time to jump on the virtualization bandwagon last year when he invited Dell to conduct a readiness assessment of his district's servers. When he saw just how little of their capacity was being used, he lost all doubt. Abdelmalak is one of many…

  11. Virtual Labs.

    ERIC Educational Resources Information Center

    Russo, Ruth

    1997-01-01

    Discusses the potential of computers in teaching laboratories to spare the lives of animals; however, it is felt that in areas of physiology education, virtual labs are not as desirable a learning experience for advanced students as live animal labs. (Author/AIM)

  12. Virtual Reality.

    ERIC Educational Resources Information Center

    Newby, Gregory B.

    1993-01-01

    Discusses the current state of the art in virtual reality (VR), its historical background, and future possibilities. Highlights include applications in medicine, art and entertainment, science, business, and telerobotics; and VR for information science, including graphical display of bibliographic data, libraries and books, and cyberspace.…

  13. Virtual Sensor Test Instrumentation

    NASA Technical Reports Server (NTRS)

    Wang, Roy

    2011-01-01

    Virtual Sensor Test Instrumentation is based on the concept of smart sensor technology for testing with intelligence needed to perform self-diagnosis of health, and to participate in a hierarchy of health determination at sensor, process, and system levels. A virtual sensor test instrumentation consists of five elements: (1) a common sensor interface, (2) microprocessor, (3) wireless interface, (4) signal conditioning and ADC/DAC (analog-to-digital conversion/digital-to-analog conversion), and (5) onboard EEPROM (electrically erasable programmable read-only memory) for metadata storage and executable software to create powerful, scalable, reconfigurable, and reliable embedded and distributed test instruments. In order to maximize efficient data conversion through the smart sensor node, plug-and-play functionality is required to interface with traditional sensors to enhance their identity and capabilities for data processing and communications. Virtual sensor test instrumentation can be accessed wirelessly via a Network Capable Application Processor (NCAP) or a Smart Transducer Interface Module (STIM) that may be managed under real-time rule engines for mission-critical applications. The transducer senses the physical quantity being measured and converts it into an electrical signal. The signal is fed to an A/D converter, and is ready for use by the processor to execute functional transformation based on the sensor characteristics stored in a Transducer Electronic Data Sheet (TEDS). Virtual sensor test instrumentation is built upon an open-system architecture with standardized protocol modules/stacks to interface with industry standards and commonly used software. One major benefit for deploying the virtual sensor test instrumentation is the ability, through a plug-and-play common interface, to convert raw sensor data in either analog or digital form, to an IEEE 1451 standard-based smart sensor, which has instructions to program sensors for a wide variety of
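
    The conversion step described above, raw transducer counts transformed into engineering units using calibration data held in a TEDS, can be sketched as follows; the CalibrationSheet fields and the example numbers are hypothetical and do not follow the IEEE 1451 binary TEDS layout.

      from dataclasses import dataclass

      @dataclass
      class CalibrationSheet:
          """Hypothetical stand-in for the calibration portion of a Transducer
          Electronic Data Sheet (TEDS)."""
          adc_bits: int    # resolution of the A/D converter
          v_ref: float     # ADC reference voltage, in volts
          slope: float     # engineering units per volt
          offset: float    # engineering-unit value at 0 V

      def counts_to_engineering_units(raw_counts, cal):
          """Convert a raw ADC reading to engineering units using the sensor's calibration."""
          volts = raw_counts / (2 ** cal.adc_bits - 1) * cal.v_ref
          return cal.slope * volts + cal.offset

      # Example: a 12-bit ADC with a 3.3 V reference reading a temperature sensor
      # scaled at 100 degrees C per volt with a -50 degree C offset (made-up numbers).
      teds = CalibrationSheet(adc_bits=12, v_ref=3.3, slope=100.0, offset=-50.0)
      print(counts_to_engineering_units(2048, teds))  # about 115 degrees C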

  14. Collaborative virtual experience based on reconfigurable simulation

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong

    2006-10-01

    Virtual Reality simulation enables an immersive 3D experience of a Virtual Environment. A simulation-based Virtual Environment can be used to map real-world phenomena onto virtual experience. With a reconfigurable simulation, users can reconfigure the parameters of the involved objects, so that they can see different effects from the different configurations. This concept is suitable for classroom learning of physics laws. This research studies the Virtual Reality simulation of Newton's laws applied to rigid-body objects. With network support, collaborative interaction is enabled so that people from different places can interact with the same set of objects in an immersive Collaborative Virtual Environment. The taxonomy of interaction at different levels of collaboration is described as distinct objects and same object; the latter comprises same object, sequentially; same object, concurrently, same attribute; and same object, concurrently, distinct attributes. The case studies examine user interaction in two scenarios: destroying and creating a set of arranged rigid bodies. In Virtual Domino, users can observe physics laws while applying force to the domino blocks in order to destroy the arrangements. In Virtual Dollhouse, users can observe physics laws while constructing a dollhouse from existing building blocks under the effects of gravity.

  15. A virtual crystallization furnace for solar silicon

    SciTech Connect

    Steinbach, I.; Franke, D.; Krumbe, W.; Liebermann, J.

    1994-12-31

    Blocks of silicon for photovoltaic applications are economically crystallized in large casting furnaces. The quality of the material is determined by the velocity of the crystallization front, the flatness of the liquid-solid interface and the thermal gradients in the solid during cooling. The process cycle time, which is determined by the rate of crystallization and cooling, has a large effect on the economic viability of the process. Traditionally, trial and error was used to determine the process control parameters, the success of which depended on the operator's experience and intuition. This paper presents a numerical model, which, when completed by a fitted data set, constitutes a virtual model of a real crystallization furnace, the Virtual Crystallization Furnace (VCF). The time-temperature distribution during the process cycle is the main output, which includes a display of the actual liquid-solid front position. Moreover, solidification velocity, temperature gradients and thermal stresses can be deduced from this output. The time needed to run a simulation on a modern workstation is approximately 1/6 of real process time, thereby allowing the user to make many process variations at very reasonable costs. Therefore the VCF is a powerful tool for optimizing the process in order to reduce cycle time and to increase product quality.

  16. Virtual impactor

    DOEpatents

    Yeh, Hsu-Chi; Chen, Bean T.; Cheng, Yung-Sung; Newton, George J.

    1988-08-30

    A virtual impactor having improved efficiency and low wall losses in which a core of clean air is inserted into the aerosol flow while aerosol flow is maintained adjacent inner wall surfaces of the focusing portion of the impactor. The flow rate of the core and the length of the throat of the impactor's collection probe, as well as the dimensional relationships of other components of the impactor adjacent the separation region of the impactor, are selected to optimize separation efficiency.

  17. Virtual anthropology.

    PubMed

    Weber, Gerhard W

    2015-02-01

    Comparative morphology, dealing with the diversity of form and shape, and functional morphology, the study of the relationship between the structure and the function of an organism's parts, are both important subdisciplines in biological research. Virtual anthropology (VA) contributes to comparative morphology by taking advantage of technological innovations, and it also offers new opportunities for functional analyses. It exploits digital technologies and pools experts from different domains such as anthropology, primatology, medicine, paleontology, mathematics, statistics, computer science, and engineering. VA as a technical term was coined in the late 1990s from the perspective of anthropologists with the intent of being mostly applied to biological questions concerning recent and fossil hominoids. More generally, however, there are advanced methods to study shape and size or to manipulate data digitally suitable for application to all kinds of primates, mammals, other vertebrates, and invertebrates or to issues regarding plants, tools, or other objects. In this sense, we could also call the field "virtual morphology." The approach yields permanently available virtual copies of specimens and data that comprehensively quantify geometry, including previously neglected anatomical regions. It applies advanced statistical methods, supports the reconstruction of specimens based on reproducible manipulations, and promotes the acquisition of larger samples by data sharing via electronic archives. Finally, it can help identify new, hidden traits, which is particularly important in paleoanthropology, where the scarcity of material demands extracting information from fragmentary remains. This contribution presents a current view of the six main work steps of VA: digitize, expose, compare, reconstruct, materialize, and share. The VA machinery has also been successfully used in biomechanical studies which simulate the stress and strains appearing in structures. Although

  18. Tuning magnetization, blocking temperature, cation distribution of nanosized Co0.2Zn0.8Fe2O4 by mechanical activation

    NASA Astrophysics Data System (ADS)

    Dey, S.; Mondal, R.; Dey, S. K.; Majumder, S.; Dasgupta, P.; Poddar, A.; Reddy, V. R.; Kumar, S.

    2015-09-01

    The study of the structural, microstructural, magnetic, and hyperfine properties of nanosized Co0.2Zn0.8Fe2O4 having particle size ˜18 nm (CZM), synthesized by high energy ball milling of Co0.2Zn0.8Fe2O4 nanoparticles of size ˜20 nm (CZ) produced by a flow-rate-controlled coprecipitation method, has revealed that the inclusion of strain-induced anisotropy produced by mechanical treatment, and the escalation of the oxygen-mediated intersublattice exchange interaction of spinel ferrites by properly tuning the cation distribution, can improve the magnetic quality of nanosized ferrites significantly. This upshot will be of immense help in promoting the technological application of nanostructured ferrites. The Rietveld refinement of the powder x-ray diffraction pattern and the analysis of transmission electron micrographs, energy dispersive x-ray spectrum, and FTIR spectrum of the sample have confirmed that CZM is a single-phase cubic nanometric spinel ferrite of Fd-3m symmetry and it possesses large microstrain within its crystal lattice. The dc magnetic and Mössbauer spectroscopic studies together indicate that the particles in the sample are composed of a ferrimagnetically aligned core and a spin-glass-like shell and the system behaves superparamagnetically at 300 K. The saturation magnetization (44 and 87 emu g-1 at 300 and 10 K) and hyperfine field of the sample are substantially higher than those of its counterparts reported earlier. In spite of its smaller size compared with CZ, the blocking temperature (˜220 K) of CZM is higher than that of CZ (70 K) and also that of its counterparts synthesized by chemical methods. The strengthening of the intersublattice A-O-B superexchange interaction because of migration of Fe3+ ions from octahedral [B] to tetrahedral (A) sites in lieu of the relocation of Zn2+ among (A) and [B] sites helps enhance the magnetization and hyperfine field of CZM. The giant coercivity (HC ˜ 5600 Oe at 10 K) of CZM is accounted for by the presence of spin-glass-like surface

  19. Role of the blocking capacitor in control of ion energy distributions in pulsed capacitively coupled plasmas sustained in Ar/CF4/O2

    SciTech Connect

    Song, Sang-Heon; Kushner, Mark J.

    2014-03-15

    In plasma etching for microelectronics fabrication, the quality of the process is in large part determined by the ability to control the ion energy distribution (IED) onto the wafer. To achieve this control, dual frequency capacitively coupled plasmas (DF-CCPs) have been developed with the goal of separately controlling the magnitude of the fluxes of ions and radicals with the high frequency (HF) and the shape of the IED with the low frequency (LF). In steady state operation, plasma properties are determined by a real time balance between electron sources and losses. As such, for a given geometry, pressure, and frequency of operation, the latitude for controlling the IED may be limited. Pulsed power is one technique being investigated to provide additional degrees of freedom to control the IED. In one configuration of a DF-CCP, the HF power is applied to the upper electrode and LF power is applied to the lower electrode, which is connected in series to a blocking capacitor (BC) that generates a dc self-bias. In the steady state, the value of the dc-bias is, in fact, constant. During pulsed operation, however, there may be time modulation of the dc-bias, which provides an additional means to control the IED. In this paper, IEDs to the wafer in pulsed DF-CCPs sustained in Ar/CF4/O2 are discussed with results from a two-dimensional plasma hydrodynamics model. The IED can be manipulated depending on whether the LF or HF power is pulsed. The dynamic range of the control can be tuned by the dc-bias generated on the substrate, whose time variation depends on the size of the BC during pulsed operation. It was found that high energy ions can be preferentially produced when pulsing the HF power and low energy ions are preferentially produced when pulsing the LF power. A smaller BC value, which allows the bias to follow the change in charged-particle fluxes, produces a larger dynamic range with which to control IEDs.
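
    The role of the blocking capacitor size can be illustrated with a zero-dimensional toy model: the self-bias obeys dV/dt = I_net/C_B, so a smaller capacitor lets the bias respond quickly when the charged-particle fluxes change as the pulse switches. The current waveform and component values below are made up for illustration and are unrelated to the two-dimensional hydrodynamics model used in the paper.

      def bias_response(c_block, i_net, dt=1e-6, steps=200):
          """Integrate dV/dt = I_net(t) / C for a toy net current onto the blocking capacitor."""
          v, trace = 0.0, []
          for k in range(steps):
              v += i_net(k * dt) / c_block * dt
              trace.append(v)
          return trace

      # Hypothetical flux imbalance that appears when the pulse switches on at 50 microseconds.
      i_net = lambda t: 1e-4 if t > 50e-6 else 0.0  # amperes, illustrative only

      fast = bias_response(c_block=1e-9, i_net=i_net)  # small blocking capacitor
      slow = bias_response(c_block=1e-7, i_net=i_net)  # large blocking capacitor
      print(f"bias after 200 us: small C -> {fast[-1]:.1f} V, large C -> {slow[-1]:.2f} V")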

  20. Virtual System Environments

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Ong, Hong Hoe; Tikotekar, Anand A; Engelmann, Christian; Bland, Wesley B; Aderholdt, Ferrol; Scott, Stephen L

    2008-01-01

    Distributed and parallel systems are typically managed with "static" settings: the operating system (OS) and the runtime environment (RTE) are specified at a given time and cannot be changed to fit an application's needs. This means that every time application developers want to use their application on a new execution platform, the application has to be ported to this new environment, which may be expensive in terms of application modifications and developer time. However, the science resides in the applications and not in the OS or the RTE. Therefore, it should be beneficial to adapt the OS and the RTE to the application instead of adapting the applications to the OS and the RTE. This document presents the concept of Virtual System Environments (VSE), which enables application developers to specify and create a virtual environment that properly fits their application's needs. For that four challenges have to be addressed: (i) definition of the VSE itself by the application developers, (ii) deployment of the VSE, (iii) system administration for the platform, and (iv) protection of the platform from the running VSE. We therefore present an integrated tool for the definition and deployment of VSEs on top of traditional and virtual (i.e., using system-level virtualization) execution platforms. This tool provides the capability to choose the degree of delegation for system administration tasks and the degree of protection from the application (e.g., using virtual machines). To summarize, the VSE concept enables the customization of the OS/RTE used for the execution of application by users without compromising local system administration rules and execution platform protection constraints.
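
    What a VSE specification might look like to an application developer can be sketched with a purely hypothetical data structure and policy check; the field names and the validation rule below are assumptions for illustration and are not the format used by the tool described in the report.

      from dataclasses import dataclass, field

      @dataclass
      class VSESpec:
          """Hypothetical Virtual System Environment description (illustrative only)."""
          name: str
          base_os: str                                 # OS image the VSE builds on
          runtime: list = field(default_factory=list)  # libraries/stacks the application needs
          admin_delegation: str = "none"               # how much system administration is delegated
          isolation: str = "virtual-machine"           # protection mechanism for the host platform

      def deployable(spec, platform_allows_vms):
          """Toy policy check: a VM-isolated VSE is only deployable where the site permits VMs."""
          return not (spec.isolation == "virtual-machine" and not platform_allows_vms)

      app_env = VSESpec(name="climate-app-env", base_os="linux-minimal",
                        runtime=["mpi", "hdf5"], admin_delegation="partial")
      print(deployable(app_env, platform_allows_vms=True))  # True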

  1. Virtual impactor

    DOEpatents

    Yeh, H.C.; Chen, B.T.; Cheng, Y.S.; Newton, G.J.

    1988-08-30

    A virtual impactor is described having improved efficiency and low wall losses in which a core of clean air is inserted into the aerosol flow while aerosol flow is maintained adjacent to the inner wall surfaces of the focusing portion of the impactor. The flow rate of the core and the length of the throat of the impactor's collection probe, as well as the dimensional relationships of other components of the impactor adjacent the separation region of the impactor, are selected to optimize separation efficiency. 4 figs.

  2. Virtual Observatories

    NASA Astrophysics Data System (ADS)

    Genova, Françoise

    2011-06-01

    Astronomy has been at the forefront among scientific disciplines for the sharing of data, and the advent of the World Wide Web has produced a revolution in the way astronomers do science. The recent development of the concept of Virtual Observatory builds on these foundations. This is one of the truly global endeavours of astronomy, aiming at providing astronomers with seamless access to data and tools, including theoretical data. Astronomy on-line resources provide a rare example of a world-wide, discipline-wide knowledge infrastructure, based on internationally agreed interoperability standards.

  3. Blocking farnesylation of the prelamin A variant in Hutchinson-Gilford progeria syndrome alters the distribution of A-type lamins.

    PubMed

    Wang, Yuexia; Ostlund, Cecilia; Choi, Jason C; Swayne, Theresa C; Gundersen, Gregg G; Worman, Howard J

    2012-01-01

    Mutations in the lamin A/C gene that cause Hutchinson-Gilford progeria syndrome lead to expression of a truncated, permanently farnesylated prelamin A variant called progerin. Blocking farnesylation leads to an improvement in the abnormal nuclear morphology observed in cells expressing progerin, which is associated with a re-localization of the variant protein from the nuclear envelope to the nuclear interior. We now show that a progerin construct that cannot be farnesylated is localized primarily in intranuclear foci and that its diffusional mobility is significantly greater than that of farnesylated progerin localized predominantly at the nuclear envelope. Expression of non-farnesylated progerin in transfected cells leads to a redistribution of lamin A and lamin C away from the nuclear envelope into intranuclear foci but does not significantly affect the localization of endogenous lamin B1 at the nuclear envelope. There is a similar redistribution of lamin A and lamin C into intranuclear foci in transfected cells expressing progerin in which protein farnesylation is blocked by treatment with a protein farnesyltransferase inhibitor. Blocking farnesylation of progerin can lead to a redistribution of normal A-type lamins away from the inner nuclear envelope. This may have implications for using drugs that block protein prenylation to treat children with Hutchinson-Gilford progeria syndrome. These findings also provide additional evidence that A-type and B-type lamins can form separate microdomains within the nucleus. PMID:22895092

  4. Large Block Test Final Report

    SciTech Connect

    Lin, W

    2001-12-01

    This report documents the Large-Block Test (LBT) conducted at Fran Ridge near Yucca Mountain, Nevada. The LBT was a thermal test conducted on an exposed block of middle non-lithophysal Topopah Spring tuff (Tptpmn) and was designed to assist in understanding the thermal-hydrological-mechanical-chemical (THMC) processes associated with heating and then cooling a partially saturated fractured rock mass. The LBT was unique in that it was a large (3 x 3 x 4.5 m) block with top and sides exposed. Because the block was exposed at the surface, boundary conditions on five of the six sides of the block were relatively well known and controlled, making this test both easier to model and easier to monitor. This report presents a detailed description of the test as well as analyses of the data and conclusions drawn from the test. The rock block that was tested during the LBT was exposed by excavation and removal of the surrounding rock. The block was characterized and instrumented, and the sides were sealed and insulated to inhibit moisture and heat loss. Temperature on the top of the block was also controlled. The block was heated for 13 months, during which time temperature, moisture distribution, and deformation were monitored. After the test was completed and the block cooled down, a series of boreholes were drilled, and one of the heater holes was over-cored to collect samples for post-test characterization of mineralogy and mechanical properties. Section 2 provides background on the test. Section 3 lists the test objectives and describes the block site, the site configuration, and measurements made during the test. Section 3 also presents a chronology of events associated with the LBT, characterization of the block, and the pre-heat analyses of the test. Section 4 describes the fracture network contained in the block. Section 5 describes the heating/cooling system used to control the temperature in the block and presents the thermal history of the block during the test

  5. Hillslope-derived blocks retard river incision

    NASA Astrophysics Data System (ADS)

    Shobe, Charles M.; Tucker, Gregory E.; Anderson, Robert S.

    2016-05-01

    The most common detachment-limited river incision models ignore the effects of sediment on fluvial erosion, yet steep reaches of mountain rivers often host clusters of large (>1 m) blocks. We argue that this distribution of blocks is a manifestation of an autogenic negative feedback in which fast vertical river incision steepens adjacent hillslopes, which deliver blocks to the channel. Blocks inhibit incision by shielding the bed and enhancing form drag. We explore this feedback with a 1-D channel-reach model in which block delivery by hillslopes depends on the river incision rate. Results indicate that incision-dependent block delivery can explain the block distribution in Boulder Creek, Colorado. The proposed negative feedback may significantly slow knickpoint retreat, channel adjustment, and landscape response compared to rates predicted by current theory. The influence of hillslope-derived blocks may complicate efforts to extract base level histories from river profiles.
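
    The proposed negative feedback can be caricatured with a zero-dimensional sketch in which the incision rate drives block delivery and block cover in turn suppresses incision; the linear cover term and parameter values below are ad hoc assumptions, not the 1-D channel-reach model of the paper.

      def incision_with_block_feedback(steps=1000, dt=1.0, k=1e-3, slope=0.05,
                                       delivery=200.0, decay=0.01):
          """Toy feedback: the incision rate drives block delivery, and block cover
          in turn suppresses incision by shielding the bed."""
          cover, rates = 0.0, []
          for _ in range(steps):
              rate = k * slope * (1.0 - min(cover, 1.0))       # blocks shield the bed
              cover += (delivery * rate - decay * cover) * dt  # delivery scales with incision
              rates.append(rate)
          return rates

      rates = incision_with_block_feedback()
      print(f"initial incision rate {rates[0]:.2e}, rate after blocks accumulate {rates[-1]:.2e}")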

  6. Two implementations of shared virtual space environments.

    SciTech Connect

    Disz, T. L.

    1998-01-13

    While many issues in the area of virtual reality (VR) research have been addressed in recent years, the constant leaps forward in technology continue to push the field forward. VR research no longer is focused only on computer graphics, but instead has become even more interdisciplinary, combining the fields of networking, distributed computing, and even artificial intelligence. In this article we discuss some of the issues associated with distributed, collaborative virtual reality, as well as lessons learned during the development of two distributed virtual reality applications.

  7. STALK : an interactive virtual molecular docking system.

    SciTech Connect

    Levine, D.; Facello, M.; Hallstrom, P.; Reeder, G.; Walenz, B.; Stevens, F.; Univ. of Illinois

    1997-04-01

    Several recent technologies (genetic algorithms, parallel and distributed computing, virtual reality, and high-speed networking) underlie a new approach to the computational study of how biomolecules interact or 'dock' together. With the Stalk system, a user in a virtual reality environment can interact with a genetic algorithm running on a parallel computer to help in the search for likely geometric configurations.
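
    A genetic algorithm of the general kind STALK couples to its virtual reality front end can be sketched in a few lines; the toy scoring function and the selection, crossover, and mutation choices below are illustrative only and are not STALK's docking energy function or search strategy.

      import random

      def toy_docking_score(pose):
          """Stand-in score: squared distance of an (x, y, z) pose from a hypothetical optimum."""
          target = (1.0, -2.0, 0.5)
          return sum((a - b) ** 2 for a, b in zip(pose, target))

      def genetic_search(pop_size=40, generations=60, mutation=0.2):
          """Minimal genetic algorithm: truncation selection, blend crossover, Gaussian mutation."""
          pop = [tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=toy_docking_score)
              parents = pop[: pop_size // 2]
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  children.append(tuple((x + y) / 2 + random.gauss(0, mutation)
                                        for x, y in zip(a, b)))
              pop = parents + children
          return min(pop, key=toy_docking_score)

      random.seed(1)
      print(genetic_search())  # converges near the hypothetical optimum (1.0, -2.0, 0.5)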

  8. Blocking of Goal-Location Learning Based on Shape

    ERIC Educational Resources Information Center

    Alexander, Tim; Wilson, Stuart P.; Wilson, Paul N.

    2009-01-01

    Using desktop, computer-simulated virtual environments (VEs), the authors conducted 5 experiments to investigate blocking of learning about a goal location based on Shape B as a consequence of preliminary training to locate that goal using Shape A. The shapes were large 2-dimensional horizontal figures on the ground. Blocking of spatial learning…

  9. Virtual Reality and the Virtual Library.

    ERIC Educational Resources Information Center

    Oppenheim, Charles

    1993-01-01

    Explains virtual reality, including proper and improper uses of the term, and suggests ways that libraries might be affected by it. Highlights include elements of virtual reality systems; possible virtual reality applications, including architecture, the chemical industry, transport planning, armed forces, and entertainment; and the virtual…

  10. The Impact of Multimodal Virtual Manipulatives on Young Children's Mathematics Learning

    ERIC Educational Resources Information Center

    Paek, Seungoh

    2012-01-01

    The purpose of this study is to demonstrate how virtual manipulatives, designed to provide multimodal interactions, support richer perceptual experiences that promote conceptual learning. To study this phenomenon, a virtual manipulative called, "Puzzle Blocks," was developed by the researcher. "Puzzle Blocks" introduces the…

  11. Virtual Worlds, Real Learning

    ERIC Educational Resources Information Center

    Meyers, Eric M.

    2009-01-01

    Many children between the ages of four and twelve log in to Web-based virtual play spaces each day, and these virtual worlds are quickly becoming an important aspect of their out-of-school lives. Consequently, educators' challenge is to see how they can leverage virtual spaces, such as the virtual play spaces, for learning and literacy. Over the…

  12. Rethinking Virtual School

    ERIC Educational Resources Information Center

    Schomburg, Gary; Rippeth, Michelle

    2009-01-01

    Virtual schooling has been touted as one of the best ways to meet the needs of at-risk students, but what happens when a district's virtual education program is unsuccessful? That was the problem in Eastern Local School District, a small rural district in Beaver, Ohio. The district contracted virtual school services and used the virtual school for…

  13. The Russian Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Dluzhnevskaya, O. B.; Malkov, O. Yu.; Kilpio, A. A.; Kilpio, E. Yu.; Kovaleva, D. A.; Sat, L. A.

    The Russian Virtual Observatory (RVO) will be an integral component of the International Virtual Observatory (IVO). The RVO has the main goal of integrating resources of astronomical data accumulated in Russian observatories and institutions (databases, archives, digitized glass libraries, bibliographic data, a remote access system to information and technical resources of telescopes etc.), and providing transparent access for scientific and educational purposes to the distributed information and data services that comprise its content. Another goal of the RVO is to provide Russian astronomers with on-line access to the rich volumes of data and metadata that have been, and will continue to be, produced by astronomical survey projects. Centre for Astronomical Data (CAD), among other Russian institutions, has had the greatest experience in collecting and distributing astronomical data for more than 20 years. Some hundreds of catalogs and journal tables are currently available from the CAD repository. More recently, mirrors of main astronomical data resources (VizieR, ADS, etc) are now maintained in CAD. Besides, CAD accumulates and makes available for the astronomical community information on principal Russian astronomical resources.

  14. Virtual courseware for geoscience education: Virtual Earthquake and Virtual Dating

    NASA Astrophysics Data System (ADS)

    Novak, Gary A.

    1999-05-01

    Virtual courseware developed for introductory-level, on-line geology labs is an interactive teaching/learning model that has an enormous pedagogical potential for making Web sites places where students learn by doing. Virtual Earthquake and Virtual Dating are modest examples of the 'virtual courseware' paradigm. Virtual Earthquake helps students explore the techniques of how an earthquake's epicenter is located and how its Richter magnitude is determined. Virtual Dating models the theory and techniques of the radiometric age determination of rocks and minerals. Virtual courseware applications offer several advantages over traditional floppy disk or CD ROM-based courseware, the most significant being the ease of dissemination. The author's experience with bringing these two virtual applications on-line suggests that there is a need for interactive geology labs on-line and that the approach will be received with enthusiasm by the educational community. The widespread implementation and adoption of virtual courseware can bring meaningful educational content and interactivity for the geosciences that goes beyond multimedia on the World Wide Web.
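
    The radiometric technique that Virtual Dating models rests on the standard decay relation t = (1/lambda) ln(1 + D/P); the short sketch below applies it to the Rb-Sr system as an example and is not taken from the courseware itself.

      import math

      def radiometric_age(daughter_to_parent, half_life_years):
          """Age from the standard decay relation t = (1/lambda) * ln(1 + D/P)."""
          decay_const = math.log(2) / half_life_years
          return math.log(1.0 + daughter_to_parent) / decay_const

      # Rb-87 -> Sr-87 example (half-life about 48.8 billion years): a radiogenic
      # daughter/parent ratio of 0.01 corresponds to an age of roughly 700 million years.
      print(f"{radiometric_age(0.01, 48.8e9):.2e} years")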

  15. Virtual button interface

    DOEpatents

    Jones, Jake S.

    1999-01-01

    An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch.

  16. Virtual button interface

    DOEpatents

    Jones, J.S.

    1999-01-12

    An apparatus and method of issuing commands to a computer by a user interfacing with a virtual reality environment are disclosed. To issue a command, the user directs gaze at a virtual button within the virtual reality environment, causing a perceptible change in the virtual button, which then sends a command corresponding to the virtual button to the computer, optionally after a confirming action is performed by the user, such as depressing a thumb switch. 4 figs.

  17. Virtual Goods Recommendations in Virtual Worlds

    PubMed Central

    Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren

    2015-01-01

    Virtual worlds (VWs) are computer-simulated environments that allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers rush to launch virtual goods and press users to buy them in order to increase sales revenue. However, this rapid development produces unrelated virtual items that are difficult to remarket. This not only wastes the companies' intelligence resources but also makes it difficult for users to find virtual goods that suit their virtual homes and daily virtual lives. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence result from interactions with social neighbors and influence users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, which considers social interactions and the social life circle, has better performance than existing recommendation methods. PMID:25834837
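
    The flavor of a social-interaction-based recommendation can be sketched by scoring candidate goods with the interaction-weighted holdings of a user's contacts; the weighting and toy data below are illustrative assumptions and not the contact-strength and contact-influence formulation of the paper.

      from collections import defaultdict

      def recommend(user, contact_strength, neighbor_items, top_n=3):
          """Score items by how strongly the user's contacts, weighted by interaction
          strength, own them; a stand-in for the contact-strength/influence model."""
          scores = defaultdict(float)
          for neighbor, strength in contact_strength.get(user, {}).items():
              for item in neighbor_items.get(neighbor, []):
                  scores[item] += strength
          return sorted(scores, key=scores.get, reverse=True)[:top_n]

      # Toy social circle: interaction-derived strengths and the virtual goods neighbors own.
      strengths = {"alice": {"bob": 0.9, "carol": 0.4}}
      owned = {"bob": ["garden_set", "fireplace"], "carol": ["fireplace", "aquarium"]}
      print(recommend("alice", strengths, owned))  # 'fireplace' first (score 0.9 + 0.4)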

  18. Virtual PCR

    SciTech Connect

    Gardner, S N; Clague, D S; Vandersall, J A; Hon, G; Williams, P L

    2006-02-23

    The polymerase chain reaction (PCR) stands among the keystone technologies for analysis of biological sequence data. PCR is used to amplify DNA, to generate many copies from as little as a single template. This is essential, for example, in processing forensic DNA samples, pathogen detection in clinical or biothreat surveillance applications, and medical genotyping for diagnosis and treatment of disease. It is used in virtually every laboratory doing molecular, cellular, genetic, ecologic, forensic, or medical research. Despite its ubiquity, we lack the precise predictive capability that would enable detailed optimization of PCR reaction dynamics. In this LDRD, we proposed to develop Virtual PCR (VPCR) software, a computational method to model the kinetic, thermodynamic, and biological processes of PCR reactions. Given a successful completion, these tools will allow us to predict both the sequences and concentrations of all species that are amplified during PCR. The ability to answer the following questions will allow us both to optimize the PCR process and interpret the PCR results: What products are amplified when sequence mixtures are present, containing multiple, closely related targets and multiplexed primers, which may hybridize with sequence mismatches? What are the effects of time, temperature, and DNA concentrations on the concentrations of products? A better understanding of these issues will improve the design and interpretation of PCR reactions. The status of the VPCR project after 1.5 years of funding is consistent with the goals of the overall project, which was scoped for 3 years of funding. Halfway through the projected timeline of the project, we have an early beta version of the VPCR code. We have begun investigating means to improve the robustness of the code, performed preliminary experiments to test the code, and begun drafting manuscripts for publication. Although an experimental protocol for testing the code was developed, the preliminary
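
    At its simplest, PCR amplification is geometric growth with a per-cycle efficiency, which the sketch below illustrates; the fixed 95% efficiency is an assumption for illustration and has none of the kinetic, thermodynamic, or mismatch-hybridization modeling that VPCR targets.

      def amplify(initial_copies, cycles, efficiency=0.95):
          """Toy PCR amplification: copy number grows by a factor (1 + efficiency) per cycle."""
          history = [float(initial_copies)]
          for _ in range(cycles):
              history.append(history[-1] * (1.0 + efficiency))
          return history

      # Starting from a single template, 30 cycles at a fixed 95% efficiency
      copies = amplify(initial_copies=1, cycles=30)
      print(f"{copies[-1]:.2e} copies after 30 cycles")  # roughly 5e8 under these assumptions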

  19. Types of Heart Block

    MedlinePlus

    ... the P and the R waves on the EKG (electrocardiogram). First-degree heart block may not cause ...

  20. Virtualization, virtual environments, and content-based retrieval of three-dimensional information for cultural applications

    NASA Astrophysics Data System (ADS)

    Paquet, Eric; Peters, Shawn; Beraldin, J. A.; Valzano, Virginia; Bandiera, Adriana

    2003-01-01

    The present paper proposes a virtual environment for visualizing virtualized cultural and historical sites. The proposed environment is based on a distributed asynchronous architecture and supports stereo vision and tiled wall display. The system is mobile and can run from two laptops. This virtual environment addresses the problems of intellectual property protection and multimedia information retrieval through encryption and content-based management, respectively. Experimental results with a fully textured 3D model of the Crypt of Santa Cristina in Italy are presented, evaluating the performance of the proposed virtual environment.

  1. Blocking of Spatial Learning between Enclosure Geometry and a Local Landmark

    ERIC Educational Resources Information Center

    Wilson, Paul N.; Alexander, Tim

    2008-01-01

    In a virtual environment, blocking of spatial learning to locate an invisible target was found reciprocally between a distinctively shaped enclosure and a local landmark within its walls. The blocking effect was significantly stronger when the shape of the enclosure rather than the landmark served as the blocking cue. However, the extent to which…

  2. Method and structure for skewed block-cyclic distribution of lower-dimensional data arrays in higher-dimensional processor grids

    DOEpatents

    Chatterjee, Siddhartha; Gunnels, John A.

    2011-11-08

    A method and structure for distributing elements of a data array in computer memory to specific processors of a multi-dimensional mesh of parallel processors. The method designates which processors in the mesh handle the elements of at least a portion of the array. The designation follows a cyclic, repetitive pattern over the processor mesh, modified by a skew in at least one dimension so that both a row and a column of the array map to contiguous groupings of processors whose dimension is greater than one.
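
    The abstract does not spell out the mapping function, but the flavor of a skewed block-cyclic assignment on a two-dimensional processor grid can be sketched as follows, assuming a block size b, a P x Q grid, and a skew of one processor row per block column (the skew rule here is an assumption for illustration, not the patented formula):

      def skewed_block_cyclic_owner(i, j, b, P, Q):
          """Map array element (i, j) to a processor (p, q) on a P x Q grid using a
          block-cyclic distribution with block size b, skewed so that rows as well as
          columns of the array spread across more than one processor-grid dimension."""
          bi, bj = i // b, j // b            # block coordinates of the element
          p = (bi + bj) % P                  # skew: shift the processor row by the block column
          q = bj % Q
          return p, q

      # Owners of one array row and one array column for a 16x16 array, b=2, 2x2 grid
      row_owners = {skewed_block_cyclic_owner(0, j, 2, 2, 2) for j in range(16)}
      col_owners = {skewed_block_cyclic_owner(i, 0, 2, 2, 2) for i in range(16)}
      print(sorted(row_owners))   # [(0, 0), (1, 1)]: row 0 reaches both processor rows
      print(sorted(col_owners))   # [(0, 0), (1, 0)]: column 0 also spans both processor rows

    Without the skew term, row 0 would stay entirely on processor row 0; the skew is what lets a single array row or column occupy a processor grouping with dimension greater than one.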

  3. Next Generation Virtual Observatories

    NASA Astrophysics Data System (ADS)

    Fox, P.; McGuinness, D. L.

    2008-12-01

    Virtual Observatories (VO) are now being established in a variety of geoscience disciplines beyond their origins in Astronomy and Solar Physics. Implementations range from hydrology and environmental sciences to solid earth sciences. Among the goals of VOs are to provide search/query, access, and use of distributed, heterogeneous data resources. With many of these goals being met and usage increasing, new demands and requirements are arising. Two are of immediate and pressing interest. The first is the use of VOs by non-specialists, especially for information products that go beyond the data, or data products, usually sought for scientific research. The second is the citation and attribution of artifacts generated by VOs. In a sense, VOs are re-publishing (re-packaging, or generating new synthetic) data and information products. At present only a few VOs address this need, and it is clear that a comprehensive solution that includes publishers is required. Our work in VOs and in related semantic data framework and integration areas has led to a view of the next generation of virtual observatories that addresses the two above-mentioned needs as well as others that are emerging. Both needs highlight a semantic gap: the meaning and use required by users beyond the original design intention is very often difficult or impossible to bridge. For example, VOs created for experts, with complex, arcane, or jargon-laden vocabularies, are not accessible to the non-specialist, and the information products the non-specialist may use are not created or considered for creation. In the second case, use of a (possibly virtual) data or information product (e.g., an image or map) as an intellectual artifact that can be accessed as part of the scientific publication and review procedure also introduces terminology gaps, as well as services that VOs may need to provide. Our supposition is that formalized methods in semantics and semantic web

  4. Accessing proton generalized parton distributions and pion distribution amplitudes with the exclusive pion-induced Drell-Yan process at J-PARC

    NASA Astrophysics Data System (ADS)

    Sawada, Takahiro; Chang, Wen-Chen; Kumano, Shunzo; Peng, Jen-Chieh; Sawada, Shinya; Tanaka, Kazuhiro

    2016-06-01

    Generalized parton distributions (GPDs) encoding multidimensional information of hadron partonic structure appear as the building blocks in a factorized description of hard exclusive reactions. The nucleon GPDs have been accessed by deeply virtual Compton scattering and deeply virtual meson production with lepton beams. A complementary probe with a hadron beam is the exclusive pion-induced Drell-Yan process. In this paper, we discuss recent theoretical advances on describing this process in terms of nucleon GPDs and pion distribution amplitudes. Furthermore, we address the feasibility of measuring the exclusive pion-induced Drell-Yan process π⁻p → μ⁺μ⁻n via a spectrometer at the High Momentum Beamline being constructed at J-PARC in Japan. Realization of such a measurement at J-PARC will provide a new test of perturbative QCD descriptions of a novel class of hard exclusive reactions. It will also offer the possibility of experimentally accessing nucleon GPDs at large timelike virtuality.

  5. Distribution of lithostratigraphic units within the central block of Yucca Mountain, Nevada: A three-dimensional computer-based model, Version YMP.R2.0

    SciTech Connect

    Buesch, D.C.; Nelson, J.E.; Dickerson, R.P.; Drake, R.M. II; San Juan, C.A.; Spengler, R.W.; Geslin, J.K.; Moyer, T.C.

    1996-09-01

    Yucca Mountain, Nevada is underlain by 14.0 to 11.6 Ma volcanic rocks tilted eastward 3° to 20° and cut by faults that were primarily active between 12.7 and 11.6 Ma. A three-dimensional computer-based model of the central block of the mountain consists of seven structural subblocks composed of six formations and the interstratified-bedded tuffaceous deposits. Rocks from the 12.7 Ma Tiva Canyon Tuff, which forms most of the exposed rocks on the mountain, to the 13.1 Ma Prow Pass Tuff are modeled with 13 surfaces. Modeled units represent single formations such as the Pah Canyon Tuff, grouped units such as the combination of the Yucca Mountain Tuff with the superjacent bedded tuff, and divisions of the Topopah Spring Tuff such as the crystal-poor vitrophyre interval. The model is based on data from 75 boreholes, from which a structure contour map at the base of the Tiva Canyon Tuff and isochore maps for each unit are constructed to serve as primary input. Modeling consists of an iterative cycle that begins with the primary structure contour map, from which isochore values of the subjacent model unit are subtracted to produce the structure contour map on the base of that unit. This new structure contour map forms the input for another cycle of isochore subtraction to produce the next structure contour map. In this method of solids modeling, the model units are represented by surfaces (structure contour maps), and all surfaces are stored in the model. Surfaces can be converted to form volumes of model units with additional effort. This lithostratigraphic and structural model can be used for (1) storing data from, and planning future, site characterization activities, (2) providing preliminary geometry of units for design of the Exploratory Studies Facility and potential repository, and (3) performance assessment evaluations.
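
    The iterative construction described above — subtracting each unit's isochore (thickness) grid from the current structure contour grid to obtain the base of the next unit down — can be sketched as follows (the use of numpy grids and the toy values are assumptions for illustration):

      import numpy as np

      def build_surfaces(top_surface, isochores):
          """Given a gridded structure contour map for the shallowest modeled horizon and
          an ordered list of isochore (thickness) grids, return the structure contour map
          of the base of each successively deeper model unit."""
          surfaces = [top_surface]
          for thickness in isochores:            # ordered from shallowest to deepest unit
              surfaces.append(surfaces[-1] - thickness)
          return surfaces

      # toy 2x2 grids of elevations (m) and unit thicknesses (m)
      top = np.array([[1500.0, 1480.0], [1470.0, 1450.0]])
      isochores = [np.full((2, 2), 30.0), np.full((2, 2), 55.0)]
      for s in build_surfaces(top, isochores):
          print(s)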

  6. Effects of chronic treatment with various neuromuscular blocking agents on the number and distribution of acetylcholine receptors in the rat diaphragm.

    PubMed Central

    Chang, C C; Chuang, S T; Huang, M C

    1975-01-01

    1. Acetylcholine receptors in the end-plate and non-end-plate areas of the rat diaphragm, after treating the animal with hemicholinium-3, alpha- or beta-bungarotoxin in vivo, were studied by their specific binding of labelled alpha-bungarotoxin. 2. Subcutaneous injection of maximum tolerable doses of hemicholinium-3 (50 μg/kg) twice daily for 7 days increased the number of extrajunctional receptors along the whole length of muscle fibre, the approximate density of receptor on muscle membrane being increased from 6/μm² in normal diaphragm to 38/μm². Junctional receptors were also increased in number from 2.2 × 10⁷ to 2.8 × 10⁷ per end-plate. 3. Five days after denervation, there were approximately 153/μm² extrajunctional receptors and the number of receptors on the end-plate was increased by 220%. 4. Intrathoracic injection of beta-bungarotoxin (50 μg/kg) also increased the density of extrajunctional receptors to approximately 104/μm², and the number of end-plate receptors by 140% in 5 days. The neuromuscular block was extensive and prolonged. 5. [3H]Diacetyl alpha-bungarotoxin (150 μg/kg) injected into thoracic cavity caused complete neuromuscular blockade for 12 hr. At 24 hr, the synaptic transmission was restored in 80% of the junctions with less than 10% end-plate receptors freed, whereas the safety factor for transmission in normal diaphragm was 3.5. Extrajunctional receptors appeared to increase within 24 hr. This increase continued despite the restoration of neuromuscular transmission, and the receptor density at 5 days was approximately 51/μm². The number of junctional receptors, however, was not increased. Repeated injection of the toxin gave the same result. 6. It is concluded that the numbers of junctional and extrajunctional acetylcholine receptors are regulated in different ways, and the possible role of acetylcholine is discussed. PMID:170397

  7. Open multi-agent control architecture to support virtual-reality-based man-machine interfaces

    NASA Astrophysics Data System (ADS)

    Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel

    2001-10-01

    Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for the commanding and supervision of complex automation systems. The user interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task deduction component, and automatic action planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems that are controlled by Virtual Reality-based man-machine interfaces. The architecture does not just provide a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations which make it easier for the user to command and supervise a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate sensor information from sensors of different levels of abstraction in real time helps to make the realized automation system very responsive to real-world changes. In this paper, the architecture will be described comprehensively, its main building blocks will be discussed, and one realization built on an open-source real-time operating system will be presented. The software design and the features of the architecture which make it generally applicable to the distributed control of automation agents in real-world applications will be explained. Furthermore, its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is only one example which will be described.

  8. On the decoder error probability of block codes

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    1992-01-01

    Using coding and combinatorial techniques, an explicit formula is derived which enumerates the complete weight distribution of decodable words of block codes using partially known weight distributions. An approximation formula for nonbinary block codes is also obtained. These results in turn give exact and approximate expressions for the decoder error probability PE(u) of block codes.
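
    For intuition, the decoder error probability given u symbol errors can be computed by brute force for a small code: enumerate every weight-u error pattern and count those that fall within the decoding radius of a wrong codeword. The sketch below does this for the binary extended (8,4) Hamming code under bounded-distance decoding with t = 1; the choice of code is only an example, not the (generally nonbinary) setting analysed in the paper.

      from itertools import combinations, product
      from math import comb

      # Extended (8,4) Hamming code: minimum distance 4, decoding radius t = 1.
      G = [[1,0,0,0,1,1,0,1],
           [0,1,0,0,1,0,1,1],
           [0,0,1,0,0,1,1,1],
           [0,0,0,1,1,1,1,0]]
      n, k, t = 8, 4, 1

      def encode(msg):
          return tuple(sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G))

      codewords = {encode(msg) for msg in product([0, 1], repeat=k)}
      nonzero = codewords - {(0,) * n}

      def error_and_failure_prob(u):
          """Assume the all-zero codeword was sent and exactly u bit errors occurred.
          P_E(u): fraction of weight-u patterns within distance t of a *wrong* codeword.
          P_F(u): fraction within distance t of no codeword at all (decoder failure)."""
          err = fail = 0
          for pos in combinations(range(n), u):
              e = tuple(1 if i in pos else 0 for i in range(n))
              if any(sum(a != b for a, b in zip(e, c)) <= t for c in nonzero):
                  err += 1
              elif u > t:                        # too many errors to correct: failure
                  fail += 1
          total = comb(n, u)
          return err / total, fail / total

      for u in range(2, 5):
          print(u, error_and_failure_prob(u))    # e.g. u = 4 gives P_E = 0.2, P_F = 0.8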

  9. Intelligent Virtual Station (IVS)

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The Intelligent Virtual Station (IVS) is enabling the integration of design, training, and operations capabilities into an intelligent virtual station for the International Space Station (ISS). A viewgraph of the IVS Remote Server is presented.

  10. Virtual Reality as Metaphor.

    ERIC Educational Resources Information Center

    Gozzi, Raymond, Jr.

    1996-01-01

    Suggests that virtual reality technology has become popular because it is a miniaturization, a model, of something that already exists. Compares virtual reality to the news media, which centers on the gory, the sensational, and the distorted. (PA)

  11. Postural Instability Detection: Aging and the Complexity of Spatial-Temporal Distributional Patterns for Virtually Contacting the Stability Boundary in Human Stance

    PubMed Central

    Kilby, Melissa C.; Slobounov, Semyon M.; Newell, Karl M.

    2014-01-01

    Falls among the older population can severely restrict functional mobility and even cause death. It is therefore crucial to understand the mechanisms and conditions that cause falls, for which a predictive model of falls is needed. One critical quantity for postural instability detection and prediction is the instantaneous stability of quiet upright stance based on motion data. However, well-established measures in the field of motor control that quantify overall postural stability using center-of-pressure (COP) or center-of-mass (COM) fluctuations are inadequate predictors of instantaneous stability. For this reason, 2D COP/COM virtual time-to-contact (VTC) is investigated to detect the postural stability deficits of healthy older people compared to young adults. VTC predicts the temporal safety margin to the functional stability boundary (the limits of the region of feasible COP or COM displacement) and therefore provides an index of the risk of losing postural stability. The spatial directions with increased instability were also determined using quantities of VTC that have not previously been considered. Further, Lempel-Ziv complexity (LZC), a measure suitable for on-line monitoring of stability/instability, was applied to explore the temporal structure, or complexity, of VTC and the predictability of future postural instability from previous behavior. These features were examined as a function of age, vision, and load weighting on the legs. The primary findings showed that for older adults the stability boundary was contracted and VTC was reduced. Furthermore, complexity decreased with aging, and the direction of highest postural instability differed between older and young adults. The findings reveal the sensitivity of the time-dependent properties of 2D VTC to the detection of postural instability in aging, the availability of visual information, and postural stance, and its potential applicability as a predictive
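
    The abstract does not reproduce the VTC computation itself; one common formulation extrapolates the current COP (or COM) position along its instantaneous velocity and takes the time at which that trajectory would cross the stability boundary. A minimal sketch under two simplifying assumptions — a circular boundary and constant-velocity extrapolation — is:

      import math

      def virtual_time_to_contact(pos, vel, radius):
          """Time for a point at `pos`, moving with constant velocity `vel`, to reach a
          circular stability boundary of the given radius centred at the origin.
          Returns math.inf if the current velocity never carries it to the boundary."""
          px, py = pos
          vx, vy = vel
          a = vx * vx + vy * vy
          if a == 0.0:
              return math.inf                    # not moving: no contact predicted
          b = 2.0 * (px * vx + py * vy)
          c = px * px + py * py - radius * radius
          disc = b * b - 4.0 * a * c
          if disc < 0.0:
              return math.inf
          tau = (-b + math.sqrt(disc)) / (2.0 * a)   # forward crossing of the boundary
          return tau if tau > 0.0 else math.inf

      # COP 2 cm from centre, drifting outward at 5 cm/s, boundary radius 10 cm
      print(virtual_time_to_contact((2.0, 0.0), (5.0, 0.0), 10.0))   # -> 1.6 s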

  12. Virtual Reference Services.

    ERIC Educational Resources Information Center

    Brewer, Sally

    2003-01-01

    As the need to access information increases, school librarians must create virtual libraries. Linked to reliable reference resources, the virtual library extends the physical collection and library hours and lets students learn to use Web-based resources in a protected learning environment. The growing number of virtual schools increases the need…

  13. Virtual trackballs revisited.

    PubMed

    Henriksen, Knud; Sporring, Jon; Hornbaek, Kasper

    2004-01-01

    Rotation of three-dimensional objects by a two-dimensional mouse is a typical task in computer-aided design, operation simulations, and desktop virtual reality. The most commonly used rotation technique is a virtual trackball surrounding the object and operated by the mouse pointer. This article reviews and provides a mathematical foundation for virtual trackballs. The first, but still popular, virtual trackball was described by Chen et al. We show that the virtual trackball by Chen et al. does not rotate the object along the intended great circular arc on the virtual trackball, and we give a correction. Another popular virtual trackball is Shoemake's quaternion implementation, which we show to be a special case of the virtual trackball by Chen et al. Shoemake extends the scope of the virtual trackball to the full screen. Unfortunately, Shoemake's virtual trackball is inhomogeneous and discontinuous, with consequences for usability. Finally, we review Bell's virtual trackball and discuss studies of the usability of virtual trackballs. PMID:15384645
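
    The construction these trackballs share — map each two-dimensional mouse position onto a virtual sphere and rotate by the quaternion taking one sphere point to the other — can be sketched as below. This is a Shoemake-style mapping with a clamp-to-equator rule outside the sphere's silhouette, given only as one common variant rather than any of the exact formulations analysed in the article.

      import math

      def to_sphere(x, y):
          """Map a mouse position in [-1, 1]^2 onto the unit virtual sphere; positions
          outside the sphere's silhouette are clamped to its equator."""
          d2 = x * x + y * y
          if d2 <= 1.0:
              return (x, y, math.sqrt(1.0 - d2))
          d = math.sqrt(d2)
          return (x / d, y / d, 0.0)

      def trackball_quaternion(p0, p1):
          """Unit quaternion (w, x, y, z) built as (p0 . p1, p0 x p1), which rotates by
          twice the angle between the two sphere points -- the behaviour characteristic
          of arcball-style trackballs."""
          w = sum(a * b for a, b in zip(p0, p1))
          x = p0[1] * p1[2] - p0[2] * p1[1]
          y = p0[2] * p1[0] - p0[0] * p1[2]
          z = p0[0] * p1[1] - p0[1] * p1[0]
          n = math.sqrt(w * w + x * x + y * y + z * z)
          return (w / n, x / n, y / n, z / n)

      # Drag from the centre of the screen slightly to the right
      q = trackball_quaternion(to_sphere(0.0, 0.0), to_sphere(0.2, 0.0))
      print(q)   # a small rotation about the screen's vertical (y) axis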

  14. Virtual Worlds? "Outlook Good"

    ERIC Educational Resources Information Center

    Kelton, AJ

    2008-01-01

    Many people believed that virtual worlds would end up like the eight-track audiotape: a memory of something no longer used (or useful). Yet today there are hundreds of higher education institutions represented in three-dimensional (3D) virtual worlds such as Active Worlds and Second Life. The movement toward the virtual realm as a viable teaching…

  15. Virtual Reality: An Overview.

    ERIC Educational Resources Information Center

    Franchi, Jorge

    1994-01-01

    Highlights of this overview of virtual reality include optics; interface devices; virtual worlds; potential applications, including medicine and archaeology; problems, including costs; current research and development; future possibilities; and a listing of vendors and suppliers of virtual reality products. (Contains 11 references.) (LRW)

  16. Testing block subdivision algorithms on block designs

    NASA Astrophysics Data System (ADS)

    Wiseman, Natalie; Patterson, Zachary

    2016-01-01

    Integrated land use-transportation models predict future transportation demand, taking into account how households and firms arrange themselves partly as a function of the transportation system. Recent integrated models require parcels as inputs and produce household and employment predictions at the parcel scale. Block subdivision algorithms automatically generate parcel patterns within blocks. Such algorithms are evaluated by generating parcels and comparing them to those in a parcel database. Three block subdivision algorithms are evaluated on how closely they reproduce parcels of different block types found in a parcel database from Montreal, Canada. While the authors who developed each of the algorithms have evaluated them, each used their own metrics and block types, which makes it difficult to compare the algorithms' strengths and weaknesses. The contribution of this paper is in resolving this difficulty, with the aim of finding the algorithm best suited to subdividing each block type. The hypothesis is that, given the different approaches that block subdivision algorithms take, different algorithms are likely better adapted to subdividing different block types. To test this, a standardized block-type classification is used that consists of mutually exclusive and comprehensive categories. A statistical method is used to identify the better algorithm for a given block type and the probability that it will perform well. Results suggest the oriented bounding box algorithm performs better for warped non-uniform sites, as well as gridiron and fragmented uniform sites; it also produces more similar parcel areas and widths. The Generalized Parcel Divider 1 algorithm performs better for gridiron non-uniform sites. The Straight Skeleton algorithm performs better for loop and lollipop networks as well as fragmented non-uniform and warped uniform sites; it also produces more similar parcel shapes and patterns.

  17. Simplified Virtualization in a HEP/NP Environment with Condor

    NASA Astrophysics Data System (ADS)

    Strecker-Kellogg, W.; Caramarcu, C.; Hollowell, C.; Wong, T.

    2012-12-01

    In this work we will address the development of a simple prototype virtualized worker node cluster, using Scientific Linux 6.x as a base OS, KVM and the libvirt API for virtualization, and the Condor batch software to manage virtual machines. The discussion in this paper provides details on our experience with building, configuring, and deploying the various components from bare metal, including the base OS, creation and distribution of the virtualized OS images and the integration of batch services with the virtual machines. Our focus was on simplicity and interoperability with our existing architecture.
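
    As a flavour of the libvirt side of such a setup, the sketch below uses the libvirt Python bindings to define and boot one KVM worker-node VM from a prepared image. The VM name, image path, and resource sizes are placeholders, and the Condor integration (advertising the VM as a batch slot) is not shown.

      import libvirt

      DOMAIN_XML = """
      <domain type='kvm'>
        <name>wn-prototype-01</name>                     <!-- placeholder VM name -->
        <memory unit='MiB'>2048</memory>
        <vcpu>2</vcpu>
        <os><type arch='x86_64'>hvm</type></os>
        <devices>
          <disk type='file' device='disk'>
            <driver name='qemu' type='qcow2'/>
            <source file='/srv/images/sl6-worker.qcow2'/>  <!-- placeholder image path -->
            <target dev='vda' bus='virtio'/>
          </disk>
          <interface type='network'>
            <source network='default'/>
          </interface>
        </devices>
      </domain>
      """

      conn = libvirt.open("qemu:///system")     # connect to the local KVM/libvirt hypervisor
      dom = conn.defineXML(DOMAIN_XML)          # register the worker-node domain
      dom.create()                              # boot it
      print(dom.name(), "is", "running" if dom.isActive() else "not running")
      conn.close()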

  18. Synchronizing Self and Object Movement: How Child and Adult Cyclists Intercept Moving Gaps in a Virtual Environment

    ERIC Educational Resources Information Center

    Chihak, Benjamin J.; Plumert, Jodie M.; Ziemer, Christine J.; Babu, Sabarish; Grechkin, Timofey; Cremer, James F.; Kearney, Joseph K.

    2010-01-01

    Two experiments examined how 10- and 12-year-old children and adults intercept moving gaps while bicycling in an immersive virtual environment. Participants rode an actual bicycle along a virtual roadway. At 12 test intersections, participants attempted to pass through a gap between 2 moving, car-sized blocks without stopping. The blocks were…

  19. Small iminium ions block gramicidin channels in lipid bilayers.

    PubMed Central

    Hemsley, G; Busath, D

    1991-01-01

    Guanidinium and acetamidinium, when added to the bathing solution at concentrations of approximately 0.1 M, cause brief blocks in the single-channel potassium currents of channels formed in planar lipid bilayers by gramicidin A. Single-channel lifetimes are not affected, indicating that the channel structure is not modified by the blockers. Guanidinium block durations and interblock times are approximately exponentially distributed. Block frequencies increase with guanidinium concentration, whereas block durations are unaffected. Increases in membrane potential cause an increase in block frequency, as expected for a positively charged blocker, but a decrease in block duration, suggesting that the block is relieved when the blocker passes through the channel. At low pH, urea, formamide, and acetamide cause similar blocks, suggesting that the protonated species of these molecules also block. Arginine and several amines do not block. This indicates that only iminium ions small enough to enter the channel can cause blocks in gramicidin channels. PMID:1712240

  20. Virtual reality applied to teletesting

    NASA Astrophysics Data System (ADS)

    van den Berg, Thomas J.; Smeenk, Roland J. M.; Mazy, Alain; Jacques, Patrick; Arguello, Luis; Mills, Simon

    2003-05-01

    The activity "Virtual Reality applied to Teletesting" is related to a wider European Space Agency (ESA) initiative of cost reduction, in particular the reduction of test costs. Reduction of costs of space related projects have to be performed on test centre operating costs and customer company costs. This can accomplished by increasing the automation and remote testing ("teletesting") capabilities of the test centre. Main problems related to teletesting are a lack of situational awareness and the separation of control over the test environment. The objective of the activity is to evaluate the use of distributed computing and Virtual Reality technology to support the teletesting of a payload under vacuum conditions, and to provide a unified man-machine interface for the monitoring and control of payload, vacuum chamber and robotics equipment. The activity includes the development and testing of a "Virtual Reality Teletesting System" (VRTS). The VRTS is deployed at one of the ESA certified test centres to perform an evaluation and test campaign using a real payload. The VRTS is entirely written in the Java programming language, using the J2EE application model. The Graphical User Interface runs as an applet in a Web browser, enabling easy access from virtually any place.

  1. Tharsis block tectonics on Mars

    NASA Technical Reports Server (NTRS)

    Raitala, Jouko T.

    1988-01-01

    The concept of block tectonics provides a framework for understanding many aspects of Tharsis and adjoining structures. Tharsis block tectonics on Mars is manifested partly by mantle-related doming and partly by the response to loading by subsequent volcanic construction. Although the origin of the volcanism beneath Tharsis remains controversial, explanations have to include inhomogeneities in Martian internal structure, energy distribution, and magma accumulation and motion below the lithosphere. Thermal convection can be seen as a necessary consequence of the transient initial phase of Martian cooling. This produced part of the elevated topography, with tensional stresses and graben systems radial to the main bulge. The linear grabens, radial to the Tharsis center, can be interpreted as indicating rift zones that define the crustal block boundaries. The load-induced stresses may then have contributed to further graben and ridge formation over an extended period of time.

  2. Block That Pain!

    MedlinePlus

    ... combination produces a unique effect, blocking pain-sensing neurons without impairing signals from other cells. In contrast, ... surgical procedures block activity in all types of neurons. This can cause numbness, paralysis, and other nervous ...

  3. The Block Scheduling Handbook.

    ERIC Educational Resources Information Center

    Queen, J. Allen

    Block scheduling encourages increased comprehensive immersion into subject matter, improved teacher-student relationships, and decreased disciplinary problems. While block scheduling may offer many advantages, moving to a block schedule from conventional scheduling can be a major adjustment for both students and teachers. This guide is intended to…

  4. Block Scheduling. Research Brief

    ERIC Educational Resources Information Center

    Muir, Mike

    2003-01-01

    What are the effects of block scheduling? Results of transitioning from traditional to block scheduling are mixed. Some studies indicate no change in achievement results and no change in teachers' opinions about instructional strategies. Other studies show that block scheduling doesn't work well for Advanced Placement or Music courses, that "hard to…

  5. Shared virtual environments for aerospace training

    NASA Technical Reports Server (NTRS)

    Loftin, R. Bowen; Voss, Mark

    1994-01-01

    Virtual environments have the potential to significantly enhance the training of NASA astronauts and ground-based personnel for a variety of activities. A critical requirement is the need to share virtual environments, in real or near real time, between remote sites. It has been hypothesized that the training of international astronaut crews could be done more cheaply and effectively by utilizing such shared virtual environments in the early stages of mission preparation. The Software Technology Branch at NASA's Johnson Space Center has developed the capability for multiple users to simultaneously share the same virtual environment. Each user generates the graphics needed to create the virtual environment. All changes of object position and state are communicated to all users so that each virtual environment maintains its 'currency.' Examples of these shared environments will be discussed and plans for the utilization of the Department of Defense's Distributed Interactive Simulation (DIS) protocols for shared virtual environments will be presented. Finally, the impact of this technology on training and education in general will be explored.
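
    As a toy illustration of the state-sharing idea — every site applies each position/state change it receives, so all copies of the environment stay current — a minimal UDP sketch follows. The message format, port, and peer addresses are invented for the example and are not the DIS protocols mentioned above.

      import socket
      import struct

      PEERS = [("198.51.100.10", 9999), ("198.51.100.11", 9999)]   # placeholder remote sites
      FMT = "!I3fI"   # object id, x, y, z, state flag

      def broadcast_update(sock, obj_id, x, y, z, state):
          """Send one object position/state change to every remote participant."""
          packet = struct.pack(FMT, obj_id, x, y, z, state)
          for peer in PEERS:
              sock.sendto(packet, peer)

      def apply_updates(sock, scene):
          """Apply any pending remote changes to the local copy of the scene."""
          sock.setblocking(False)
          while True:
              try:
                  packet, _ = sock.recvfrom(struct.calcsize(FMT))
              except BlockingIOError:
                  return
              obj_id, x, y, z, state = struct.unpack(FMT, packet)
              scene[obj_id] = {"pos": (x, y, z), "state": state}

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.bind(("", 9999))
      scene = {}
      broadcast_update(sock, 42, 1.0, 2.0, 0.5, 1)   # e.g. object 42 was moved locally
      apply_updates(sock, scene)                      # fold in whatever the other sites sent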

  6. Virtual Campus in the Context of an Educational Virtual City

    ERIC Educational Resources Information Center

    Fominykh, Mikhail; Prasolova-Forland, Ekaterina; Morozov, Mikhail; Gerasimov, Alexey

    2011-01-01

    This paper is focused on virtual campuses, i.e. virtual worlds representing real educational institutions that are based on the metaphor of a university and provide users with different learning tools. More specifically, the idea of integrating a virtual campus into the context of a virtual city is suggested. Such a virtual city, where students…

  7. Grids, virtualization, and clouds at Fermilab

    SciTech Connect

    Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.

    2014-06-11

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  8. Grids, virtualization, and clouds at Fermilab

    DOE PAGESBeta

    Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.

    2014-06-11

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  9. Grids, virtualization, and clouds at Fermilab

    NASA Astrophysics Data System (ADS)

    Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.

    2014-06-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  10. Blocking Delaunay triangulations.

    PubMed

    Aichholzer, Oswin; Fabila-Monroy, Ruy; Hackl, Thomas; van Kreveld, Marc; Pilz, Alexander; Ramos, Pedro; Vogtenhuber, Birgit

    2013-02-01

    Given a set B of n black points in general position, we say that a set of white points W blocks B if in the Delaunay triangulation of [Formula: see text] there is no edge connecting two black points. We give the following bounds for the size of the smallest set W blocking B: (i) [Formula: see text] white points are always sufficient to block a set of n black points, (ii) if B is in convex position, [Formula: see text] white points are always sufficient to block it, and (iii) at least [Formula: see text] white points are always necessary to block a set of n black points. PMID:23483043
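
    The blocking condition itself is straightforward to check computationally: build the Delaunay triangulation of the combined point set and test whether any edge joins two black points. A small sketch using scipy follows; the coordinates are arbitrary examples, with each white point placed just outside one side of the black triangle.

      import numpy as np
      from scipy.spatial import Delaunay

      def is_blocked(black, white):
          """Return True if, in the Delaunay triangulation of the union of black and
          white points, no edge connects two black points."""
          pts = np.vstack([black, white])
          n_black = len(black)
          tri = Delaunay(pts)
          for simplex in tri.simplices:              # each triangle contributes 3 edges
              for i in range(3):
                  a, b = simplex[i], simplex[(i + 1) % 3]
                  if a < n_black and b < n_black:    # both endpoints are black points
                      return False
          return True

      black = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
      white = np.array([[0.5, -0.05], [0.2, 0.525], [0.8, 0.525]])
      print(is_blocked(black, white))   # True: the white points block every black-black edge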