Laadan, Oren; Nieh, Jason; Phung, Dan
2012-10-02
Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
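A minimal runnable sketch of the suspend, save-network-state, save-process-state, recreate-connections, restart ordering the abstract describes. The in-memory "cluster" dict, its field names, and the checkpoint/restart helpers are illustrative assumptions, not the patented mechanism, which operates on virtualized operating system environments.

```python
"""Toy checkpoint/restart of a 'distributed application', loosely following
the order described in the abstract. All names are illustrative."""
from copy import deepcopy

cluster = {
    "procs": {"p1": {"counter": 41, "running": True},
              "p2": {"counter": 7, "running": True}},
    "conns": [("p1", "p2")],           # network connections among processes
}

def checkpoint(cluster):
    # 1. Suspend all processes so the global state stops changing.
    for proc in cluster["procs"].values():
        proc["running"] = False
    # 2. Save network state, then per-process state.
    return {"conns": list(cluster["conns"]),
            "procs": deepcopy(cluster["procs"])}

def restart(snapshot):
    # 3. Recreate connections before resuming, so no process observes
    #    a missing peer when it wakes up.
    restored = {"conns": list(snapshot["conns"]),
                "procs": deepcopy(snapshot["procs"])}
    for proc in restored["procs"].values():
        proc["running"] = True
    return restored

snap = checkpoint(cluster)
cluster["procs"]["p1"]["counter"] = 0      # divergence after the checkpoint
print(restart(snap)["procs"]["p1"])        # {'counter': 41, 'running': True}
```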
Information distribution in distributed microprocessor based flight control systems
NASA Technical Reports Server (NTRS)
Montgomery, R. C.; Lee, P. S.
1977-01-01
This paper presents an optimal control theory that accounts for variable time intervals in the information distribution to control effectors in a distributed microprocessor based flight control system. The theory is developed using a linear process model for the aircraft dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved that provides the control law that minimizes the expected value of a quadratic cost function. An example is presented where the theory is applied to the control of the longitudinal motions of the F8-DFBW aircraft. Theoretical and simulation results indicate that, for the example problem, the optimal cost obtained using a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained using a known uniform information update interval.
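In standard sampled-data LQR notation (ours, not necessarily the paper's), the controller holds the input constant over each random update interval and minimizes an expected quadratic cost:

```latex
\[
J \;=\; \mathbb{E}\!\left[\,\sum_{k=0}^{N-1}\int_{t_k}^{t_{k+1}}
  \big( x^{\mathsf T}(t)\,Q\,x(t) + u_k^{\mathsf T} R\,u_k \big)\,dt \right],
\qquad
\dot x(t) = A\,x(t) + B\,u_k, \quad t_k \le t < t_{k+1},
\]
```

where the update interval \(\tau_k = t_{k+1} - t_k\) is known to the control effectors only through its (Markov) distribution.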
NASA Astrophysics Data System (ADS)
Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi
2018-02-01
The distribution network is the part of the power grid closest to the customers of electric service providers such as PT PLN. The dispatching center of a power grid company is also the data center of the grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyze this large volume of data. The specific outputs of the online analytics information system resulting from data warehouse processing with OLAP are chart and query reporting. The chart reporting consists of load distribution charts over repeated time intervals, distribution charts by area, substation region charts, and electric load usage charts. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.
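A toy sketch of the kind of OLAP roll-up behind such chart reporting. The fact rows and their column meanings (substation, area, hour, load in MW) are invented for illustration:

```python
from collections import defaultdict

# Hypothetical fact rows from the data warehouse: (substation, area, hour, load_mw)
facts = [
    ("GI-01", "North", 9, 42.0), ("GI-01", "North", 10, 55.5),
    ("GI-02", "South", 9, 37.2), ("GI-02", "South", 10, 49.8),
]

def rollup(facts, dim):
    """Aggregate total load along one dimension (0=substation, 1=area, 2=hour)."""
    totals = defaultdict(float)
    for row in facts:
        totals[row[dim]] += row[3]
    return dict(totals)

print(rollup(facts, 2))   # load distribution by hour: {9: 79.2, 10: 105.3}
print(rollup(facts, 1))   # load distribution by area
```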
Forced guidance and distribution of practice in sequential information processing.
NASA Technical Reports Server (NTRS)
Decker, L. R.; Rogers, C. A., Jr.
1973-01-01
Distribution of practice and forced guidance were used in a sequential information-processing task in an attempt to increase the capacity of human information-processing mechanisms. A reaction time index of the psychological refractory period was used as the response measure. Massing of practice lengthened response times, while forced guidance shortened them. The results were interpreted in terms of load reduction upon the response-selection stage of the information-processing system.
The application of artificial intelligence techniques to large distributed networks
NASA Technical Reports Server (NTRS)
Dubyah, R.; Smith, T. R.; Star, J. L.
1985-01-01
Data accessibility and transfer of information, including the land resources information system pilot, are structured as large computer information networks. The goals of these pilot efforts include reducing the difficulty of finding and using data, reducing processing costs, and minimizing incompatibility between data sources. Artificial intelligence (AI) techniques were suggested as a means of achieving these goals. The applicability of certain AI techniques is explored in the context of distributed problem-solving systems and the pilot land data system (PLDS). The topics discussed include: PLDS and its data processing requirements, expert systems and PLDS, distributed problem-solving systems, AI problem-solving paradigms, query processing, and distributed databases.
An approach for heterogeneous and loosely coupled geospatial data distributed computing
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui
2010-07-01
Most GIS (Geographic Information System) applications involve heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. To make use of these local resources together to solve larger geospatial information processing problems related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a concept termed the Equivalent Distributed Program of a global geospatial query, to solve geospatial distributed computing problems in heterogeneous GIS environments. First, we present a geospatial query process schema for distributed computing, together with a method for the equivalent transformation of a global geospatial query into distributed local queries at the SQL (Structured Query Language) level, to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment consisting of autonomous geospatial information resources, so as to achieve decentralized and consistent synchronization among global geospatial resource directories and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
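A minimal sketch of the equivalent-transformation idea under stated assumptions: the global query's "equivalent distributed program" runs partial SQL aggregates on each autonomous peer (modeled here as in-memory SQLite databases) and combines them at the coordinator. The schema and data are invented:

```python
import sqlite3

def make_peer(rows):
    """Each peer holds an autonomous local table of (id, region, value)."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE features (id INTEGER, region TEXT, value REAL)")
    db.executemany("INSERT INTO features VALUES (?, ?, ?)", rows)
    return db

peers = [make_peer([(1, "north", 3.0), (2, "north", 5.0)]),
         make_peer([(3, "south", 4.0)])]

# Global query: average 'value' over all peers. Its 'equivalent distributed
# program' runs a partial aggregate locally, then combines at the coordinator.
partials = [db.execute("SELECT SUM(value), COUNT(*) FROM features").fetchone()
            for db in peers]
total = sum(s for s, _ in partials)
count = sum(c for _, c in partials)
print(total / count)   # 4.0, identical to AVG over a merged table
```

Decomposable aggregates such as SUM and COUNT make the local-then-global split exact, which is why the transformation can operate at the SQL level.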
Uncertainty estimation of the self-thinning process by Maximum-Entropy Principle
Shoufan Fang; George Z. Gertner
2000-01-01
When available information is scarce, the Maximum-Entropy Principle can estimate the distributions of parameters. In our case study, we estimated the distributions of the parameters of the forest self-thinning process based on literature information, and we derived the conditional distribution functions and estimated the 95 percent confidence interval (CI) of the self-...
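For reference, the generic form of a maximum-entropy estimate under moment constraints, in standard textbook notation (the paper's actual constraints come from the self-thinning literature):

```latex
\[
\max_{p}\; H[p] = -\int p(\theta)\,\ln p(\theta)\,d\theta
\quad\text{subject to}\quad
\int g_i(\theta)\,p(\theta)\,d\theta = \mu_i,\;\; i = 1,\dots,m,
\]
\[
\Rightarrow\quad
p(\theta) \;\propto\; \exp\!\Big(-\sum_{i=1}^{m}\lambda_i\,g_i(\theta)\Big),
\]
```

with the Lagrange multipliers \(\lambda_i\) fixed by the constraints; confidence intervals then follow from the estimated parameter distributions.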
Hierarchical species distribution models
Hefley, Trevor J.; Hooten, Mevin B.
2016-01-01
Determining the distribution pattern of a species is important to increase scientific knowledge, inform management decisions, and conserve biodiversity. To infer spatial and temporal patterns, species distribution models have been developed for use with many sampling designs and types of data. Recently, it has been shown that count, presence-absence, and presence-only data can be conceptualized as arising from a point process distribution. Therefore, it is important to understand properties of the point process distribution. We examine how the hierarchical species distribution modeling framework has been used to incorporate a wide array of regression and theory-based components while accounting for the data collection process and making use of auxiliary information. The hierarchical modeling framework allows us to demonstrate how several commonly used species distribution models can be derived from the point process distribution, highlight areas of potential overlap between different models, and suggest areas where further research is needed.
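For reference, a common parameterization of the underlying inhomogeneous Poisson point process (notation ours): a log-linear intensity driven by spatial covariates, with the likelihood over observed locations \(\mathbf{s}_1,\dots,\mathbf{s}_n\) in region \(\mathcal{S}\):

```latex
\[
\log \lambda(\mathbf{s}) = \mathbf{x}(\mathbf{s})^{\mathsf T}\boldsymbol{\beta},
\qquad
L(\boldsymbol{\beta}) \;=\;
\Big(\prod_{i=1}^{n}\lambda(\mathbf{s}_i)\Big)
\exp\!\Big(-\!\int_{\mathcal S}\lambda(\mathbf{s})\,d\mathbf{s}\Big),
\]
```

Count, presence-absence, and presence-only data then correspond to different observation models layered on this common intensity.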
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier information systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency-theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a fixed problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty and organizational information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes, and well researched in the domain of the organizational sciences, are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.
Methods and apparatuses for information analysis on shared and distributed computing systems
Bohn, Shawn J. (Richland, WA); Krishnan, Manoj Kumar (Richland, WA); Cowley, Wendy E. (Richland, WA); Nieplocha, Jarek (Richland, WA)
2011-02-22
Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
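A minimal sketch of the claimed local-to-global pattern, assuming toy documents and whitespace tokenization; the Counter-based merge stands in for the patent's "contributing the local sets of term statistics to a global set" step:

```python
"""Each worker computes term counts for its own distinct document set,
then the local counters are merged into a global set. Data are made up."""
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def local_term_stats(documents):
    """Per-process step: term statistics for one distinct document set."""
    counts = Counter()
    for doc in documents:
        counts.update(doc.lower().split())
    return counts

if __name__ == "__main__":
    partitions = [["the cat sat", "the dog ran"],      # process 1's documents
                  ["a cat ran", "the cat ran fast"]]   # process 2's documents
    with ProcessPoolExecutor() as pool:
        locals_ = list(pool.map(local_term_stats, partitions))
    global_stats = sum(locals_, Counter())       # contribute to the global set
    major_terms = [t for t, n in global_stats.items() if n >= 3]
    print(global_stats.most_common(3), major_terms)
```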
ERIC Educational Resources Information Center
Popyk, Marilyn K.
1986-01-01
Discusses the new automated office and its six major technologies (data processing, word processing, graphics, image, voice, and networking), the information processing cycle (input, processing, output, distribution/communication, and storage and retrieval), ergonomics, and ways to expand office education classes (versus class instruction). (CT)
2006-04-01
…and Scalability, (2) Sensors and Platforms, (3) Distributed Computing and Processing, (4) Information Management, (5) Fusion and Resource Management… use of the deployed system. 3.3 Distributed Computing and Processing Session: The Distributed Computing and Processing Session consisted of three…
Distributed Data Processing in a United States Naval Shipyard.
1979-12-01
[Table-of-contents excerpt: 1. Evolution; 2. Motivations for Distributed Processing (a. Extensibility…); B. Evolution; C. Concepts; D. Form and Structure of the…] …motivations for, and the characteristics of, distributed processing as they apply to management information systems. 1. Evolution: Prior to the advent of…
Distributed Information System Development: Review of Some Management Issues
NASA Astrophysics Data System (ADS)
Mishra, Deepti; Mishra, Alok
Due to the proliferation of the Internet and globalization, distributed information system development is becoming popular. In this paper we review some significant management issues, such as process management, project management, requirements management and knowledge management, which have received much attention from the distributed development perspective. In this literature review we found that areas like quality and risk management have received only scant attention in distributed information system development.
Analysis of haptic information in the cerebral cortex
2016-01-01
Haptic sensing of objects acquires information about a number of properties. This review summarizes current understanding about how these properties are processed in the cerebral cortex of macaques and humans. Nonnoxious somatosensory inputs, after initial processing in primary somatosensory cortex, are partially segregated into different pathways. A ventrally directed pathway carries information about surface texture into parietal opercular cortex and thence to medial occipital cortex. A dorsally directed pathway transmits information regarding the location of features on objects to the intraparietal sulcus and frontal eye fields. Shape processing occurs mainly in the intraparietal sulcus and lateral occipital complex, while orientation processing is distributed across primary somatosensory cortex, the parietal operculum, the anterior intraparietal sulcus, and a parieto-occipital region. For each of these properties, the respective areas outside primary somatosensory cortex also process corresponding visual information and are thus multisensory. Consistent with the distributed neural processing of haptic object properties, tactile spatial acuity depends on interaction between bottom-up tactile inputs and top-down attentional signals in a distributed neural network. Future work should clarify the roles of the various brain regions and how they interact at the network level. PMID:27440247
An object-oriented software approach for a distributed human tracking motion system
NASA Astrophysics Data System (ADS)
Micucci, Daniela L.
2003-06-01
Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and intra-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
Optimal regulation in systems with stochastic time sampling
NASA Technical Reports Server (NTRS)
Montgomery, R. C.; Lee, P. S.
1980-01-01
An optimal control theory that accounts for stochastic variable time sampling in a distributed microprocessor based flight control system is presented. The theory is developed by using a linear process model for the airplane dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved for the control law that minimizes the expected value of a quadratic cost function. The optimal cost obtained with a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained with a known and uniform information update interval.
Tariq, Amina; Georgiou, Andrew; Westbrook, Johanna
2013-05-01
Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. The existing literature offers limited insight into the gaps in the information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery, in order to identify gaps in medication-related information exchange that lead to medication errors in RACFs. The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May-September 2011). Triangulated analysis of the data focused primarily on examining the transformation and exchange of information between different media across the process. The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) the reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates the information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs. Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through the identification of gaps in information exchange. Understanding the dynamics of the cognitive process can inform the design of interventions to manage errors and improve residents' safety. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Shope, William G.; ,
1987-01-01
The US Geological Survey is utilizing a national network of more than 1000 satellite data-collection stations, four satellite-relay direct-readout ground stations, and more than 50 computers linked together in a private telecommunications network to acquire, process, and distribute hydrological data in near real-time. The four Survey offices operating a satellite direct-readout ground station provide near real-time hydrological data to computers located in other Survey offices through the Survey's Distributed Information System. The computerized distribution system permits automated data processing and distribution to be carried out in a timely manner under the control and operation of the Survey office responsible for the data-collection stations and for the dissemination of hydrological information to the water-data users.
Information Acquisition, Analysis and Integration
2016-08-03
…of sensing and processing, theory, applications, signal processing, image and video processing, machine learning, technology transfer. …Solved elegantly old problems like image and video deblurring, introducing new revolutionary approaches. Reference: …Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, “Deep learning with hierarchical convolution factor analysis,” IEEE…
Optimal Information Processing in Biochemical Networks
NASA Astrophysics Data System (ADS)
Wiggins, Chris
2012-02-01
A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrative cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
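For reference, the standard information-theoretic quantities involved (notation ours): the mutual information of the joint distribution p(x, y) over, e.g., two interacting copy numbers, and the capacity obtained by optimizing over input distributions:

```latex
\[
I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log_2\frac{p(x,y)}{p(x)\,p(y)},
\qquad
C \;=\; \max_{p(x)} I(X;Y),
\]
```

Optimizing information flow in a noisy regulatory network then amounts to maximizing \(I(X;Y)\) subject to the copy-number noise constraints that fix the conditional distribution \(p(y\,|\,x)\).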
ABR: TYPES, VOLUMES, GEOGRAPHICAL DISTRIBUTION AND DISPOSAL
The presentation provides information on the residuals produced by arsenic removal drinking water treatment processes. Although the presentation mainly discussed residuals from adsorptive media processes, it also provided information on other processes such as iron removal...
Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R
2014-01-01
Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provides a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.
Dynamic Reconfiguration of a RGBD Sensor Based on QoS and QoC Requirements in Distributed Systems.
Munera, Eduardo; Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Noguera, Juan Fco Blanes
2015-07-24
The inclusion of embedded sensors into a networked system provides useful information for many applications. A Distributed Control System (DCS) is one of the clearest examples where processing and communications are constrained by the client's requirements and the capacity of the system. An embedded sensor with advanced processing and communications capabilities supplies high-level information, abstracting away the data acquisition process and object recognition mechanisms. The implementation of an embedded sensor/actuator as a Smart Resource permits clients to access sensor information through distributed network services. Smart Resources can offer sensor services as well as computing, communications and peripheral access by implementing a self-aware adaptation mechanism which adapts the execution profile to the context. On the other hand, information integrity must be ensured when computing processes are dynamically adapted. Therefore, the processing must be adapted to perform tasks within a certain lapse of time while always ensuring a minimum process quality. In the same way, communications must try to reduce the data traffic without excluding relevant information. The main objective of this paper is to present a dynamic configuration mechanism to adapt the sensor processing and communication to the client's requirements in the DCS. This paper describes an implementation of a Smart Resource based on a Red, Green, Blue, and Depth (RGBD) sensor in order to test the dynamic configuration mechanism presented.
Development of Electro-Optical Standard Processes for Application
2011-11-01
[Standard report documentation page (OMB No. 0704-0188) boilerplate removed] Distribution A: approved for public release; distribution is unlimited. Abstract: Defines the process of…
Process evaluation distributed system
NASA Technical Reports Server (NTRS)
Moffatt, Christopher L. (Inventor)
2006-01-01
The distributed system includes a database server, an administration module, a process evaluation module, and a data display module. The administration module is in communication with the database server for providing observation criteria information to the database server. The process evaluation module is in communication with the database server for obtaining the observation criteria information from the database server and collecting process data based on the observation criteria information. The process evaluation module utilizes a personal digital assistant (PDA). A data display module in communication with the database server, including a website for viewing collected process data in a desired metrics form, the data display module also for providing desired editing and modification of the collected process data. The connectivity established by the database server to the administration module, the process evaluation module, and the data display module, minimizes the requirement for manual input of the collected process data.
The Land Processes Distributed Active Archive Center (LP DAAC)
Golon, Danielle K.
2016-10-03
The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.
Lifelong Learning and the Information Society.
ERIC Educational Resources Information Center
Boucouvalas, Marcie
Society is currently in the process of shifting its central focus from the production and distribution of material goods to the production and distribution of information. Indeed, data from the Bureau of Labor Statistics reveal that the majority of people are now involved in occupations that center around information rather than around industry.…
Historical Time-Domain: Data Archives, Processing, and Distribution
NASA Astrophysics Data System (ADS)
Grindlay, Jonathan E.; Griffin, R. Elizabeth
2012-04-01
The workshop on Historical Time-Domain Astronomy (TDA) was attended by a near-capacity gathering of ~30 people. From information provided in turn by those present, an up-to-date overview was created of available plate archives, progress in their digitization, the extent of actual processing of those data, and plans for data distribution. Several recommendations were made for prioritising the processing and distribution of historical TDA data.
A distributed computing model for telemetry data processing
NASA Astrophysics Data System (ADS)
Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.
1994-05-01
We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
A distributed computing model for telemetry data processing
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.
1994-01-01
We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
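A toy sketch of such a hybrid model under stated assumptions: each node serves its subscribing clients (the client-server half) while also forwarding to peer nodes (the peer-to-peer half). The class names, the msg_id loop guard, and the callback interface are invented, not the actual information sharing protocol:

```python
"""Toy hybrid distributed computing model: servers fan processed telemetry
out to subscribers and forward to peers. All names are illustrative."""

class Node:
    def __init__(self, name):
        self.name = name
        self.subscribers = []   # client callbacks (client-server half)
        self.peers = []         # other Nodes (peer-to-peer half)
        self.seen = set()       # suppress loops when peers forward

    def publish(self, msg_id, parameter, value):
        if msg_id in self.seen:
            return
        self.seen.add(msg_id)
        for callback in self.subscribers:
            callback(self.name, parameter, value)   # deliver to local clients
        for peer in self.peers:
            peer.publish(msg_id, parameter, value)  # forward to peer servers

a, b = Node("ops"), Node("training")
a.peers.append(b)
b.subscribers.append(lambda src, p, v: print(f"[{src}] {p} = {v}"))
a.publish(1, "cabin_pressure", 14.2)   # reaches the training node's client
```

The same fan-out path serves real-time monitoring, playback, and training consumers, which is the appeal of the hybrid model described above.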
NASA Astrophysics Data System (ADS)
Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.
1998-05-01
Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge, often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model, and relay processed data, trends and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first is a process that accounts for variability in each individual's normal physiologic response by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system orders the information for each user, including the subject, local company officers, medical personnel and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links, incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
Online catalog access and distribution of remotely sensed information
NASA Astrophysics Data System (ADS)
Lutton, Stephen M.
1997-09-01
Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.
Fluctuations in Wikipedia access-rate and edit-event data
NASA Astrophysics Data System (ADS)
Kämpf, Mirko; Tismer, Sebastian; Kantelhardt, Jan W.; Muchnik, Lev
2012-12-01
Internet-based social networks often reflect extreme events in nature and society by drastic increases in user activity. We study and compare the dynamics of the two major complex processes necessary for information spread via the online encyclopedia ‘Wikipedia’, i.e., article editing (information upload) and article access (information viewing) based on article edit-event time series and (hourly) user access-rate time series for all articles. Daily and weekly activity patterns occur in addition to fluctuations and bursting activity. The bursts (i.e., significant increases in activity for an extended period of time) are characterized by a power-law distribution of durations of increases and decreases. For describing the recurrence and clustering of bursts we investigate the statistics of the return intervals between them. We find stretched exponential distributions of return intervals in access-rate time series, while edit-event time series yield simple exponential distributions. To characterize the fluctuation behavior we apply detrended fluctuation analysis (DFA), finding that most article access-rate time series are characterized by strong long-term correlations with fluctuation exponents α≈0.9. The results indicate significant differences in the dynamics of information upload and access and help in understanding the complex process of collecting, processing, validating, and distributing information in self-organized social networks.
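A minimal DFA-1 sketch (our implementation of the standard algorithm, not the authors' code). The fluctuation exponent alpha is the slope of log F(n) versus log n, with alpha near 0.5 for uncorrelated noise and alpha near 0.9 indicating the strong long-term correlations reported above:

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis; returns the exponent alpha."""
    y = np.cumsum(x - np.mean(x))            # profile of the series
    F = []
    for n in scales:
        n_win = len(y) // n
        segs = y[: n_win * n].reshape(n_win, n)
        t = np.arange(n)
        # detrend each window with a linear fit, collect residual variance
        res = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.concatenate(res) ** 2)))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha

rng = np.random.default_rng(0)
print(dfa(rng.normal(size=10000), [16, 32, 64, 128, 256]))  # ~0.5 (white noise)
```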
Earthquake prediction: the interaction of public policy and science.
Jones, L M
1996-01-01
Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, those causally related to the physical processes governing failure on a fault, in order to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors into informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be obtained by assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake. PMID:11607656
Lognormal Infection Times of Online Information Spread
Doerr, Christian; Blenn, Norbert; Van Mieghem, Piet
2013-01-01
The infection times of individuals in online information spread such as the inter-arrival time of Twitter messages or the propagation time of news stories on a social media site can be explained through a convolution of lognormally distributed observation and reaction times of the individual participants. Experimental measurements support the lognormal shape of the individual contributing processes, and have resemblance to previously reported lognormal distributions of human behavior and contagious processes. PMID:23700473
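A sketch of the paper's generative idea by direct sampling, with assumed lognormal parameters (the real ones would be fitted to measurements): an individual's infection time is the sum of an observation time and a reaction time, so the resulting distribution is the convolution of the two lognormals:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
observation = rng.lognormal(mean=1.0, sigma=0.8, size=n)  # time to see the item
reaction = rng.lognormal(mean=0.5, sigma=1.0, size=n)     # time to act on it
infection = observation + reaction                        # convolution by sampling

print(np.median(infection), np.percentile(infection, 95))  # heavy right tail
```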
Kalvelage, T.; Willems, Jennifer
2003-01-01
The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community was discussed. Several integrated retrieval, processing and distribution capabilities were explained. The value of these functions to users was described, and potential future improvements were laid out. Users were interested in having the retrieval, processing and archiving systems integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.
NATIONAL WATER INFORMATION SYSTEM OF THE U. S. GEOLOGICAL SURVEY.
Edwards, Melvin D.
1985-01-01
The National Water Information System (NWIS) has been designed as an interactive, distributed data system. It will integrate the existing, diverse data-processing systems into a common system. It will also provide easier, more flexible use as well as more convenient access and expanded computing, dissemination, and data-analysis capabilities. The NWIS is being implemented as part of a Distributed Information System (DIS) being developed by the Survey's Water Resources Division. The NWIS will be implemented on each node of the distributed network for the local processing, storage, and dissemination of hydrologic data collected within the node's area of responsibility. The processor at each node will also be used to perform hydrologic modeling, statistical data analysis, text editing, and some administrative work.
Warner, D; Sale, J; Viirre, E
1996-01-01
Recent trends in healthcare informatics and telemedicine indicate that systems are being developed with a primary focus on technology and business, not on the process of medicine itself. Distributed Medical Intelligence promotes the development of an integrative medical communication system which addresses the process of providing expert medical knowledge to the point of need.
Studies of the General Parton Distributions.
NASA Astrophysics Data System (ADS)
Goloskokov, Sergey
2017-12-01
We discuss the possibility of studying processes induced by Generalized Parton Distributions (GPDs) using polarized beams at NICA. We show that important information on GPD structure can be obtained at NICA in exclusive meson production and in the Drell-Yan (D-Y) process, which is determined by the double GPD contribution.
Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution
Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.
1987-01-01
Over the last two years, the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS: the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.
NASA Astrophysics Data System (ADS)
Peng, Jia-Yin; Lei, Hong-Xuan; Mo, Zhi-Wen
2014-05-01
Previous protocols of remote quantum information concentration have focused on the reverse process of quantum telecloning of single-qubit states. Here we investigate the reverse process of optimal universal 1→2 telecloning of arbitrary two-qubit states. The aim of this telecloning is to distribute the quantum information, respectively, to two groups of spatially separated receivers from a group of two senders situated at two different locations. Our scheme shows that the distributed quantum information can be remotely concentrated back to a group of two different receivers with unit probability by utilizing a maximally entangled four-particle cluster state and a four-particle GHZ state as the quantum channel.
NASA Astrophysics Data System (ADS)
Yu, Z. P.; Yue, Z. F.; Liu, W.
2018-05-01
With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Therefore, based on a certain number of experimental data and expert judgments, we have divided reliability estimation based on a distribution hypothesis into a cognition process and a reliability calculation. Consequently, as an illustration of this modification, we take information fusion based on intuitionistic fuzzy belief functions as the diagnosis model of the cognition process, and complete the reliability estimation for the opening function of a cabin door affected by imprecise judgments corresponding to the distribution hypothesis.
Local active information storage as a tool to understand distributed neural information processing
Wibral, Michael; Lizier, Joseph T.; Vögler, Sebastian; Priesemann, Viola; Galuske, Ralf
2013-01-01
Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding. PMID:24501593
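For reference, the standard definition of local active information storage from this literature (notation ours), where \(x_n^{(k)} = (x_{n-k+1},\dots,x_n)\) is the length-k past of the process and the average \(A_X(k)\) recovers the non-local quantity:

```latex
\[
a_X(n+1,k) \;=\; \log_2
\frac{p\big(x_n^{(k)},\,x_{n+1}\big)}
     {p\big(x_n^{(k)}\big)\,p\big(x_{n+1}\big)},
\qquad
A_X(k) \;=\; \big\langle a_X(n+1,k) \big\rangle_n ,
\]
```

Because \(a_X\) is defined per variable and per time step, it yields exactly the space-time storage maps the study computes from voltage-sensitive dye imaging data.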
Friederici, A D
1995-09-01
This paper presents a model describing the temporal and neurotopological structure of syntactic processes during comprehension. It postulates three distinct phases of language comprehension, two of which are primarily syntactic in nature. During the first phase the parser assigns the initial syntactic structure on the basis of word category information. These early structural processes are assumed to be subserved by the anterior parts of the left hemisphere, as event-related brain potentials show this area to be maximally activated when phrase structure violations are processed and as circumscribed lesions in this area lead to an impairment of the on-line structural assignment. During the second phase lexical-semantic and verb-argument structure information is processed. This phase is neurophysiologically manifest in a negative component in the event-related brain potential around 400 ms after stimulus onset which is distributed over the left and right temporo-parietal areas when lexical-semantic information is processed and over left anterior areas when verb-argument structure information is processed. During the third phase the parser tries to map the initial syntactic structure onto the available lexical-semantic and verb-argument structure information. In case of an unsuccessful match between the two types of information reanalyses may become necessary. These processes of structural reanalysis are correlated with a centroparietally distributed late positive component in the event-related brain potential.(ABSTRACT TRUNCATED AT 250 WORDS)
DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)
NASA Technical Reports Server (NTRS)
Keith, B.
1994-01-01
Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components; the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer. All of DAMT's components are independent, asynchronously executing processes. The independent processes communicate with each other via UNIX sockets through a Virtual Path router, or Switcher. The Switcher maintains a routing table showing the host of each component process of the tool, eliminating the need for each process to do so. The Central Monitor Complex provides the single application program interface (API) to the user and coordinates the activities of DAMT. The Central Monitor Complex is itself divided into independent objects that perform its functions. The component objects are the Central Monitor, the Process Locator, the Circuit Locator, and the Traffic Reporter. Each of these objects is an independent, asynchronously executing process. User requests to the tool are interpreted by the Central Monitor. The Process Locator identifies whether a named process is running on a monitored host and which host that is. The circuit between any two processes in the distributed application is identified using the Circuit Locator. The Traffic Reporter handles communication with the LAN Analyzer and accumulates traffic updates until it must send a traffic report to the user. The Remote Monitor process is replicated on each monitored host. It serves the Central Monitor Complex processes with application process information. The Remote Monitor process provides access to operating systems information about currently executing processes. It allows the Process Locator to find processes and the Circuit Locator to identify circuits between processes. 
It also provides lifetime information about currently monitored processes. The LAN Analyzer consists of two processes. Low-level monitoring is handled by the Sniffer. The Sniffer analyzes the raw data on a single, physical LAN. It responds to commands from the Analyzer process, which maintains the interface to the Traffic Reporter and keeps track of which circuits to monitor. DAMT is written in C-language for HP-9000 series computers running HP-UX and Sun 3 and 4 series computers running SunOS. DAMT requires 1Mb of disk space and 4Mb of RAM for execution. This package requires MIT's X Window System, Version 11 Revision 4, with OSF/Motif 1.1. The HP-9000 version (GSC-13589) includes sample HP-9000/375 and HP-9000/730 executables which were compiled under HP-UX, and the Sun version (GSC-13559) includes sample Sun3 and Sun4 executables compiled under SunOS. The standard distribution medium for the HP version of DAMT is a .25 inch HP pre-formatted streaming magnetic tape cartridge in UNIX tar format. It is also available on a 4mm magnetic tape in UNIX tar format. The standard distribution medium for the Sun version of DAMT is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. DAMT was developed in 1992.
Groza, Tudor; Verspoor, Karin
2015-01-01
Concept recognition (CR) is a foundational task in the biomedical domain. It supports the important process of transforming unstructured resources into structured knowledge. To date, several CR approaches have been proposed, most of which focus on a particular set of biomedical ontologies. Their underlying mechanisms vary from shallow natural language processing and dictionary lookup to specialized machine learning modules. However, no prior approach considers the case sensitivity characteristics and the term distribution of the underlying ontology on the CR process. This article proposes a framework that models the CR process as an information retrieval task in which both case sensitivity and the information gain associated with tokens in lexical representations (e.g., term labels, synonyms) are central components of a strategy for generating term variants. The case sensitivity of a given ontology is assessed based on the distribution of so-called case sensitive tokens in its terms, while information gain is modelled using a combination of divergence from randomness and mutual information. An extensive evaluation has been carried out using the CRAFT corpus. Experimental results show that case sensitivity awareness leads to an increase of up to 0.07 F1 against a non-case sensitive baseline on the Protein Ontology and GO Cellular Component. Similarly, the use of information gain leads to an increase of up to 0.06 F1 against a standard baseline in the case of GO Biological Process and Molecular Function and GO Cellular Component. Overall, subject to the underlying token distribution, these methods lead to valid complementary strategies for augmenting term label sets to improve concept recognition.
Using Bayesian networks to support decision-focused information retrieval
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehner, P.; Elsaesser, C.; Seligman, L.
This paper has described an approach to controlling the process of pulling data/information from distributed databases in a way that is specific to a person's decision-making context. Our prototype implementation of this approach uses a knowledge-based planner to generate a plan; an automatically constructed Bayesian network to evaluate the plan; specialized processing of the network to derive key information items that would substantially impact the evaluation of the plan (e.g., determine that replanning is needed); and automated construction of Standing Requests for Information (SRIs), which are automated functions that monitor changes and trends in distributed databases that are relevant to the key information items. The emphasis of this paper is on how Bayesian networks are used.
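A tiny illustration of evaluating a plan with a Bayesian network by enumeration. The network structure, variable names, and probabilities are invented for illustration and are not from the paper:

```python
"""Re-score a plan's success probability as evidence arrives; a large drop
flags a key information item (e.g., that replanning is needed)."""
from itertools import product

# P(road_open), P(resupply | road_open), P(plan_ok | resupply)
p_road = {True: 0.7, False: 0.3}
p_resupply = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}
p_plan = {True: {True: 0.95, False: 0.05}, False: {True: 0.3, False: 0.7}}

def joint(road, resupply, plan_ok):
    return p_road[road] * p_resupply[road][resupply] * p_plan[resupply][plan_ok]

def p_plan_ok(evidence):
    """P(plan_ok=True | evidence); evidence maps variable name -> bool."""
    num = den = 0.0
    for road, resupply, plan_ok in product([True, False], repeat=3):
        world = {"road": road, "resupply": resupply, "plan_ok": plan_ok}
        if any(world[k] != v for k, v in evidence.items()):
            continue
        pr = joint(road, resupply, plan_ok)
        den += pr
        num += pr if plan_ok else 0.0
    return num / den

print(p_plan_ok({}))               # prior plan evaluation (~0.75)
print(p_plan_ok({"road": False}))  # evidence arrives: drops to ~0.43, replan
```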
Characterization of autoregressive processes using entropic quantifiers
NASA Astrophysics Data System (ADS)
Traversaro, Francisco; Redelico, Francisco O.
2018-01-01
The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow distinguishing between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
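A minimal Bandt-Pompe permutation entropy sketch (our implementation; the embedding dimension d = 3 and the test signals are assumed). This is the causal quantifier that, as noted above, is blind to the amplitude distribution, hence the need for the complementary amplitude axis:

```python
import math
import random
from collections import Counter

def permutation_entropy(x, d=3):
    """Normalized Shannon entropy of the ordinal patterns of length d."""
    patterns = Counter(
        tuple(sorted(range(d), key=lambda i: x[t + i]))
        for t in range(len(x) - d + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(d))   # 1.0 for fully random ordering

random.seed(1)
print(permutation_entropy([random.random() for _ in range(5000)]))   # ~1.0
print(permutation_entropy([math.sin(0.1 * t) for t in range(5000)]))  # well below 1
```

Note that rescaling the data leaves every ordinal pattern, and hence this entropy, unchanged, which is precisely why an amplitude-aware quantifier is needed alongside it.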
NASA Astrophysics Data System (ADS)
Yu, Bailang; Wu, Jianping
2006-10-01
Spatial Information Grid (SIG) is an infrastructure that provides services for spatial information according to users' needs by collecting, sharing, organizing and processing massive distributed spatial information resources. This paper presents the architecture, technologies and implementation of the Shanghai City Spatial Information Application and Service System, a SIG-based platform, which is an integrated platform serving the administration, planning, construction and development of the city. The System contains ten categories of spatial information resources, including city planning, land use, real estate, river system, transportation, municipal facility construction, environment protection, sanitation, urban afforestation and basic geographic information data. In addition, spatial information processing services are offered as GIS Web Services. The resources and services are all distributed across different web-based nodes. A single database is created to store the metadata of all the spatial information. A portal site is published as the main user interface of the System. The portal site has three main functions. First, users can search the metadata and consequently acquire the distributed data by using the search results. Second, spatial processing web applications developed with GIS Web Services, such as file format conversion, spatial coordinate transformation, cartographic generalization and spatial analysis, are offered. Third, GIS Web Services currently available in the System can be searched and new ones can be registered. The System has been working efficiently in the Shanghai Government Network since 2005.
Future electro-optical sensors and processing in urban operations
NASA Astrophysics Data System (ADS)
Grönwall, Christina; Schwering, Piet B.; Rantakokko, Jouni; Benoist, Koen W.; Kemp, Rob A. W.; Steinvall, Ove; Letalick, Dietmar; Björkert, Stefan
2013-10-01
In the electro-optical sensors and processing in urban operations (ESUO) study we pave the way for the European Defence Agency (EDA) group of Electro-Optics experts (IAP03) toward a common understanding of the optimal distribution of processing functions between the different platforms. Combinations of local, distributed and centralized processing are proposed; in this way one can match processing functionality to the required power and the available communication-system data rates, to obtain the desired reaction times. In the study, three priority scenarios were defined: camp protection, patrol and house search. For these scenarios, present-day and future sensors and signal processing technologies were studied. A method for analyzing information quality in single- and multi-sensor systems has been applied, and a method for estimating reaction times for transmission of data through the chain of command has been proposed and used. These methods are documented and can be used to modify the scenarios or be applied to other scenarios. Present-day data processing is organized mainly locally; very limited exchange of information with other platforms takes place, and mainly at a high information level. The main issues that arose from the analysis of present-day systems and methodology are the slow reaction time, due to the limited field of view of present-day sensors, and the lack of robust automated processing. Efficient handover schemes between wide and narrow field-of-view sensors may, however, reduce the delay times. The main effort in the study was forecasting the signal processing of EO sensors in the next ten to twenty years. Distributed processing is proposed between hand-held and vehicle-based sensors, which can be accompanied by cloud processing on board several vehicles. Additionally, to perform sensor fusion on sensor data originating from different platforms and to make full use of UAV imagery, a combination of distributed and centralized processing is essential; sensor fusion of heterogeneous sensors plays a central role in future processing. The changes these new technologies bring to future urban operations will be improved quality of information, shorter reaction times, and lower operator load.
Extracting Information from Folds in Rocks.
ERIC Educational Resources Information Center
Hudleston, Peter John
1986-01-01
Describes the three processes of folding in rocks: buckling, bending, and passive folding. Discusses how geometrical properties and strain distributions help to identify which processes produce natural folds, and also provides information about the mechanical properties of rocks, and the sense of shear in shear zones. (TW)
Land transportation model for supply chain manufacturing industries
NASA Astrophysics Data System (ADS)
Kurniawan, Fajar
2017-12-01
A supply chain is a system that integrates production, inventory, distribution and information processes to increase productivity and minimize costs. Transportation is an important part of the supply chain system, especially for supporting the distribution of materials, work-in-process products and final products. Jakarta serves as the distribution center of the manufacturing industries for the surrounding industrial area, and the transportation system has a large influence on the efficiency of the supply chain process. The main problem faced in Jakarta is traffic congestion, which affects distribution time. Based on a system dynamics model, there are several scenarios that can minimize distribution time, and thereby cost, such as the construction of ports near industrial areas other than Tanjung Priok, widening road facilities, development of the railway system, and the development of distribution centers.
Power System Information Delivering System Based on Distributed Object
NASA Astrophysics Data System (ADS)
Tanaka, Tatsuji; Tsuchiya, Takehiko; Tamura, Setsuo; Seki, Tomomichi; Kubota, Kenji
In recent years, improvements in computer performance and developments in computer network and distributed information processing technologies have been remarkable. Moreover, deregulation is starting and will spread in the electric power industry in Japan. Consequently, power suppliers are required to supply low-cost power with high-quality services to customers. Corresponding to these movements, the authors have proposed the SCOPE (System Configuration Of PowEr control system) architecture for distributed EMS/SCADA (Energy Management Systems / Supervisory Control and Data Acquisition) systems based on distributed object technology, which offers the flexibility and expandability to adapt to those movements. In this paper, the authors introduce a prototype of the power system information delivering system, developed based on the SCOPE architecture, and describe the architecture and the evaluation results of this prototype system. The power system information delivering system supplies useful power system information, such as electric power failures, to customers using the Internet and distributed object technology. It is a new type of SCADA system that monitors failures of the power transmission and distribution systems in a geographic-information-integrated way.
A Parallel and Distributed Processing Model of Joint Attention, Social-Cognition and Autism
Mundy, Peter; Sullivan, Lisa; Mastergeorge, Ann M.
2009-01-01
The impaired development of joint attention is a cardinal feature of autism; therefore, understanding the nature of joint attention is central to research on this disorder. Joint attention may be best defined in terms of an information processing system that begins to develop by 4–6 months of age. This system integrates the parallel processing of internal information about one's own visual attention with external information about the visual attention of other people. This type of joint encoding of information about self- and other-attention requires the activation of a distributed anterior and posterior cortical attention network. Genetic regulation, in conjunction with self-organizing behavioral activity, guides the development of functional connectivity in this network. With practice in infancy, the joint processing of self-other attention becomes automatically engaged as an executive function. It can be argued that this executive joint attention is fundamental to human learning, as well as to the development of symbolic thought, social cognition and social competence throughout the life span. One advantage of this parallel and distributed processing model of joint attention (PDPM) is that it directly connects theory on social pathology to a range of phenomena in autism associated with neural connectivity, constructivist and connectionist models of cognitive development, early intervention, activity-dependent gene expression, and atypical ocular motor control. PMID:19358304
NASA Astrophysics Data System (ADS)
Dafflon, B.; Barrash, W.; Cardiff, M.; Johnson, T. C.
2011-12-01
Reliable predictions of groundwater flow and solute transport require an estimation of the detailed distribution of the parameters (e.g., hydraulic conductivity, effective porosity) controlling these processes. However, such parameters are difficult to estimate because of the inaccessibility and complexity of the subsurface. In this regard, developments in parameter estimation techniques and investigations of field experiments are still challenging and necessary to improve our understanding and the prediction of hydrological processes. Here we analyze a conservative tracer test conducted at the Boise Hydrogeophysical Research Site in 2001 in a heterogeneous unconfined fluvial aquifer. Some relevant characteristics of this test include: variable-density (sinking) effects because of the injection concentration of the bromide tracer, the relatively small size of the experiment, and the availability of various sources of geophysical and hydrological information. The information contained in this experiment is evaluated through several parameter estimation approaches, including a grid-search-based strategy, stochastic simulation of hydrological property distributions, and deterministic inversion using regularization and pilot-point techniques. Doing this allows us to investigate hydraulic conductivity and effective porosity distributions and to compare the effects of assumptions from several methods and parameterizations. Our results provide new insights into the understanding of variable-density transport processes and the hydrological relevance of incorporating various sources of information in parameter estimation approaches. Among others, the variable-density effect and the effective porosity distribution, as well as their coupling with the hydraulic conductivity structure, are seen to be significant in the transport process. The results also show that assumed prior information can strongly influence the estimated distributions of hydrological properties.
Montana Curriculum Guidelines for Distributive Education. Revised.
ERIC Educational Resources Information Center
Harris, Ron, Ed.
These distributive education curriculum guidelines are intended to provide Montana teachers with teaching information for 11 units. Units cover introduction to marketing and distributive education, human relations and communications, operations and control, processes involved in buying for resale, merchandise handling, sales promotion, sales and…
WaveJava: Wavelet-based network computing
NASA Astrophysics Data System (ADS)
Ma, Kun; Jiao, Licheng; Shi, Zhuoer
1997-04-01
Wavelet theory is powerful, but its successful application still needs suitable programming tools. Java is a simple, object-oriented, distributed, interpreted, robust, secure, architecture-neutral, portable, high-performance, multi-threaded, dynamic language. This paper addresses the design and development of a cross-platform software environment for experimenting with and applying wavelet theory. WaveJava, a wavelet class library designed with object-oriented programming, is developed to take advantage of wavelet features such as multi-resolution analysis and parallel processing in network computing. A new application architecture is designed for the net-wide distributed client-server environment. The data are transmitted as multi-resolution packets; at distributed sites around the net, matching or recognition processing is performed on these packets in parallel, and the results are fed back to determine the next operation, so that more robust results can be reached quickly. WaveJava is easy to use and to extend for special applications. This paper gives a solution for a distributed fingerprint information processing system; the approach also suits other net-based multimedia information processing, such as network libraries, remote teaching, and filmless picture archiving and communication.
Investigation of Doppler spectra of laser radiation scattered inside hand skin during occlusion test
NASA Astrophysics Data System (ADS)
Kozlov, I. O.; Zherebtsov, E. A.; Zherebtsova, A. I.; Dremin, V. V.; Dunaev, A. V.
2017-11-01
Laser Doppler flowmetry (LDF) is a method widely used in the diagnosis of microcirculation diseases. It is well known that information about the frequency distribution of the Doppler spectrum of laser radiation scattered by moving red blood cells (RBCs) is usually lost in the signal processing procedure, yet the photocurrent's spectral distribution contains valuable diagnostic information about the velocity distribution of the RBCs. In this research we propose to compute the indexes of microcirculation in sub-ranges of the Doppler spectrum and to investigate the frequency distribution of the computed indexes.
Fiacco, P. A.; Rice, W. H.
1991-01-01
Computerized medical record systems require structured database architectures for information processing; however, the data must be transferable across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model, with a graphical user interface, into an outpatient medical record system known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment; Oracle is a powerful relational database management system that incorporates a client-server architecture, providing both a distributed database and distributed processing, which improves performance. PMID:1807732
Operating tool for a distributed data and information management system
NASA Astrophysics Data System (ADS)
Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.
2002-07-01
The German Remote Sensing Data Center has developed the Data Information and Management System DIMS which provides multi-mission ground system services for earth observation product processing, archiving, ordering and delivery. DIMS successfully uses newest technologies within its services. This paper presents the solution taken to simplify operation tasks for this large and distributed system.
A Case Study of Information Transfer by Social Change Organizations.
ERIC Educational Resources Information Center
Macfarlane, Ronald G.
1984-01-01
Examines the information needs of social change organizations using the peace and disarmament movement in Toronto as a case study. The process of information transfer, sources and places to obtain information, information distribution, and how libraries can better meet information needs of community organizations are discussed. Eight references…
NASA Astrophysics Data System (ADS)
Savorskiy, V.; Lupyan, E.; Balashov, I.; Burtsev, M.; Proshin, A.; Tolpin, V.; Ermakov, D.; Chernushich, A.; Panova, O.; Kuznetsov, O.; Vasilyev, V.
2014-04-01
Both the development and the application of remote sensing involve a considerable expenditure of material and intellectual resources. Therefore, it is important to use high-tech means of distributing remote sensing data and processing results in order to facilitate access for as many researchers as possible. This should be accompanied by the creation of capabilities for potentially more thorough and comprehensive, i.e. ultimately deeper, acquisition and complex analysis of information about the state of Earth's natural resources. An objective need for a higher degree of Earth observation (EO) data assimilation is also set by the conditions of satellite observations, in which the observed objects are in an uncontrolled state. Progress in addressing this problem is determined to a large extent by how the distributed EO information system (IS) functions; namely, it largely depends on reducing the cost of communication processes (data transfer) between spatially distributed IS nodes and data users. One of the most effective ways to improve the efficiency of data exchange processes is the creation of an integrated EO IS optimized for running distributed data processing procedures. An effective EO IS implementation should be based on a specific software architecture.
Empirical comparison of heuristic load distribution in point-to-point multicomputer networks
NASA Technical Reports Server (NTRS)
Grunwald, Dirk C.; Nazief, Bobby A. A.; Reed, Daniel A.
1990-01-01
The study compared several load placement algorithms using instrumented programs and synthetic program models. Salient characteristics of these program traces (total computation time, total number of messages sent, and average message time) span two orders of magnitude. Load distribution algorithms determine the initial placement for processes, a precursor to the more general problem of load redistribution. It is found that desirable workload distribution strategies will place new processes globally, rather than locally, to spread processes rapidly, but that local information should be used to refine global placement.
1998-05-01
distribution limitations recommended if public release is not approved. The ASD(PA) shall also process appeals when public release denial is based upon... Rules of Evidence, and all other applicable laws. An interlocutory appeal by the United States shall lie from a decision or order of a district court... limitations; document markings; document preparation; scientific and technical information; STINFO; information security; security training
NASA Technical Reports Server (NTRS)
Graves, Sara J.
1994-01-01
Work on this project focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAACs from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS and has contributed to the definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been the generation of value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders, were provided.
Neuronal Assemblies Evidence Distributed Interactions within a Tactile Discrimination Task in Rats
Deolindo, Camila S.; Kunicki, Ana C. B.; da Silva, Maria I.; Lima Brasil, Fabrício; Moioli, Renan C.
2018-01-01
Accumulating evidence suggests that neural interactions are distributed and relate to animal behavior, but many open questions remain. The neural assembly hypothesis, formulated by Hebb, states that synchronously active single neurons may transiently organize into functional neural circuits—neuronal assemblies (NAs)—which would constitute the fundamental unit of information processing in the brain. However, the formation, vanishing, and temporal evolution of NAs are not fully understood. In particular, characterizing NAs in multiple brain regions over the course of behavioral tasks is relevant to assess the highly distributed nature of brain processing. In the context of NA characterization, active tactile discrimination tasks with rats are elucidative because they engage several cortical areas in the processing of information that are otherwise masked in passive or anesthetized scenarios. In this work, we investigate the dynamic formation of NAs within and among four different cortical regions in long-range fronto-parieto-occipital networks (primary somatosensory, primary visual, prefrontal, and posterior parietal cortices), simultaneously recorded from seven rats engaged in an active tactile discrimination task. Our results first confirm that task-related neuronal firing rate dynamics in all four regions is significantly modulated. Notably, a support vector machine decoder reveals that neural populations contain more information about the tactile stimulus than the majority of single neurons alone. Then, over the course of the task, we identify the emergence and vanishing of NAs whose participating neurons are shown to contain more information about animal behavior than randomly chosen neurons. Taken together, our results further support the role of multiple and distributed neurons as the functional unit of information processing in the brain (NA hypothesis) and their link to active animal behavior. PMID:29375324
NASA Astrophysics Data System (ADS)
Tanaka, Ken-ichi; Ueno, Jun
2017-09-01
Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning, and is also crucial for carrying out decommissioning effectively and safely. The information is referred to in planning the decommissioning strategy and in applications to the regulator, and reliable radioactivity inventory information can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved the procedure for evaluating neutron-activated materials for a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes: a distribution calculation of the neutron flux, and an activation calculation of the materials. The distribution calculation of the neutron flux is performed with neutron transport calculation codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron-flux distribution, we perform distribution calculations of radioactivity concentration, and we also estimate a time-dependent distribution of radioactivity classification and a radioactive-waste classification. The information obtained from the evaluation is utilized by the other preparatory tasks to make the decommissioning plan and the activity safe and rational.
Distributed Load Shedding over Directed Communication Networks with Time Delays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Tao; Wu, Di
When generation is insufficient to support all loads under emergencies, effective and efficient load shedding needs to be deployed in order to maintain the supply-demand balance. This paper presents a distributed load shedding algorithm, which makes efficient decisions based on the discovered global information. In the global information discovery process, each load communicates only with its neighboring loads via directed communication links, possibly with arbitrarily large but bounded time-varying communication delays. We propose a novel distributed information discovery algorithm based on ratio consensus. Simulation results are used to validate the proposed method.
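To make the ratio-consensus idea concrete, here is a minimal Python sketch, assuming a static directed network with no communication delays (the paper additionally handles arbitrarily large but bounded time-varying delays); the four-node topology, weights, and load values are illustrative, not taken from the paper.

```python
# Minimal ratio-consensus sketch for distributed information discovery.
# Assumption: static directed ring 0 -> 1 -> 2 -> 3 -> 0 with self-loops.
import numpy as np

# Column-stochastic weight matrix: each node splits its value equally
# between itself and its single out-neighbor.
A = np.array([
    [0.5, 0.0, 0.0, 0.5],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.5],
])

y = np.array([30.0, 10.0, 25.0, 15.0])  # local load-shed requests (MW)
z = np.ones(4)                          # counting variable, initialized to 1

for _ in range(200):
    y = A @ y   # each node mixes the values received from in-neighbors
    z = A @ z

ratio = y / z   # converges at every node to sum(y0)/sum(z0)
print(ratio)    # -> [20. 20. 20. 20.], the global average discovered locally
```

Because the weight matrix is column-stochastic, the sums of y and z are preserved at every step, so the ratio y/z at each node converges to the network-wide average request, which each load can then use to decide how much to shed.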
Self-referenced processing, neurodevelopment and joint attention in autism.
Mundy, Peter; Gwaltney, Mary; Henderson, Heather
2010-09-01
This article describes a parallel and distributed processing model (PDPM) of joint attention, self-referenced processing and autism. According to this model, autism involves early impairments in the capacity for rapid, integrated processing of self-referenced (proprioceptive and interoceptive) and other-referenced (exteroceptive) information. Measures of joint attention have proven useful in research on autism because they are sensitive to the early development of the 'parallel' and integrated processing of self- and other-referenced stimuli. Moreover, joint attention behaviors are a consequence, but also an organizer of the functional development of a distal distributed cortical system involving anterior networks including the prefrontal and insula cortices, as well as posterior neural networks including the temporal and parietal cortices. Measures of joint attention provide early behavioral indicators of atypical development in this parallel and distributed processing system in autism. In addition it is proposed that an early, chronic disturbance in the capacity for integrating self- and other-referenced information may have cascading effects on the development of self awareness in autism. The assumptions, empirical support and future research implications of this model are discussed.
Land processes distributed active archive center product lifecycle plan
Daucsavage, John C.; Bennett, Stacie D.
2014-01-01
The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.
Mixed mechanisms of multi-site phosphorylation
Suwanmajo, Thapanar; Krishnan, J.
2015-01-01
Multi-site phosphorylation is ubiquitous in cell biology and has been widely studied experimentally and theoretically. The underlying chemical modification mechanisms are typically assumed to be distributive or processive. In this paper, we study the behaviour of mixed mechanisms that can arise either because phosphorylation and dephosphorylation involve different mechanisms or because phosphorylation and/or dephosphorylation can occur through a combination of mechanisms. We examine a hierarchy of models to assess chemical information processing through different mixed mechanisms, using simulations, bifurcation analysis and analytical work. We demonstrate how mixed mechanisms can show important and unintuitive differences from pure distributive and processive mechanisms, in some cases resulting in monostable behaviour with simple dose–response behaviour, while in other cases generating new behaviours such as oscillations. Our results also suggest patterns of information processing that are relevant as the number of modification sites increases. Overall, our work creates a framework to examine information processing arising from complexities of multi-site modification mechanisms and their impact on signal transduction. PMID:25972433
European Science Notes Information Bulletin Reports on Current European and Middle Eastern Science
1992-01-01
[Recoverable table-of-contents fragments: materials research and development (polymer processing and properties; corrosion and protection, J. Magill); development of powerful microprocessors and communication chips; software engineering and information processing; distributed-computing projects in Europe (partial differential equations, weather forecasting).]
Visual representation of spatiotemporal structure
NASA Astrophysics Data System (ADS)
Schill, Kerstin; Zetzsche, Christoph; Brauer, Wilfried; Eisenkolb, A.; Musto, A.
1998-07-01
The processing and representation of motion information is addressed from an integrated perspective comprising low-level signal processing properties as well as higher-level cognitive aspects. For the low-level processing of motion information we argue that a fundamental requirement is the existence of a spatio-temporal memory. Its key feature, the provision of an orthogonal relation between external time and its internal representation, is achieved by a mapping of temporal structure into a locally distributed activity distribution accessible in parallel by higher-level processing stages. This leads to a reinterpretation of the classical concept of 'iconic memory' and resolves inconsistencies concerning ultra-short-time processing and visual masking. The spatio-temporal memory is further investigated by experiments on the perception of spatio-temporal patterns. Results on the direction discrimination of motion paths provide evidence that information about direction and location is not processed and represented independently. This suggests a unified representation at an early level, in the sense that motion information is internally available in the form of a spatio-temporal compound. For the higher-level representation we have developed a formal framework for the qualitative description of courses of motion that may occur with moving objects.
14 CFR 1260.16 - Distribution.
Code of Federal Regulations, 2010 CFR
2010-01-01
Pre-Award Requirements § 1260.16 Distribution. (a) Copies of grants and supplements will be provided... when delegated; (4) The NASA Center for AeroSpace Information (CASI), Attn: Document Processing Section...
Information processing in the primate visual system - An integrated systems perspective
NASA Technical Reports Server (NTRS)
Van Essen, David C.; Anderson, Charles H.; Felleman, Daniel J.
1992-01-01
The primate visual system contains dozens of distinct areas in the cerebral cortex and several major subcortical structures. These subdivisions are extensively interconnected in a distributed hierarchical network that contains several intertwined processing streams. A number of strategies are used for efficient information processing within this hierarchy. These include linear and nonlinear filtering, passage through information bottlenecks, and coordinated use of multiple types of information. In addition, dynamic regulation of information flow within and between visual areas may provide the computational flexibility needed for the visual system to perform a broad spectrum of tasks accurately and at high resolution.
NASA Astrophysics Data System (ADS)
Gholibeigian, Hassan
The dimension of information, as the fifth dimension of the universe including packages of new information, is nested with space-time. The distributed density of information is matched to its corresponding distributed matter in space-time. A fundamental particle (string) such as the photon or graviton needs a package of information, including its exact quantum state and law, to process and travel a Planck length in a Planck time. This process is done via sub-particles (substrings), and the processed information is carried by the particle as the universe's history. My proposed formula for the Planck unit of information (I_P), also a fundamental physical (universal) constant, is I_P = l_P / (c t_P) = 1, where l_P is the Planck length, t_P the Planck time, and c the speed of light. My proposed formula for calculating the packages is I = t_P^{-1} · τ, in which I is the number of packages and τ is the lifetime of the particle. "Communication of information" as a "fundamental symmetry" leads to phenomena. Packages should always be kept up to date with new information for the evolution of the Universe. But where does the new information come from, or how is it created, the information which Hawking and his colleagues propose is brought inside the black hole and left behind the horizon in the form of soft hair?
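As a quick check, not part of the abstract: substituting the standard relation t_P = l_P / c shows that the proposed unit is indeed the dimensionless constant 1, and that the package count is simply the particle lifetime measured in Planck times:

```latex
% Worked check using t_P = l_P / c (equivalently l_P = c t_P):
\[
  I_P \;=\; \frac{l_P}{c\,t_P} \;=\; \frac{c\,t_P}{c\,t_P} \;=\; 1,
  \qquad
  I \;=\; t_P^{-1}\,\tau \;=\; \frac{\tau}{t_P}
  \quad\text{(number of Planck-time packages in a lifetime } \tau\text{)}.
\]
```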
A Technical Survey on Optimization of Processing Geo Distributed Data
NASA Astrophysics Data System (ADS)
Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.
2018-04-01
With growing cloud services and technology, there is growth in the number of geographically distributed data centers that store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, etc.; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation and communication; the key issues to be dealt with are time efficiency, cost minimization, and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, that use techniques including Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving quality of service and reducing computation and communication cost; and SAGE achieves performance improvement in processing geo-distributed data sets.
Mathematical model of whole-process calculation for bottom-blowing copper smelting
NASA Astrophysics Data System (ADS)
Li, Ming-zhou; Zhou, Jie-min; Tong, Chang-ren; Zhang, Wen-hai; Li, He-song
2017-11-01
The distribution law of materials in smelting products is key to cost accounting and contaminant control. However, the distribution law is difficult to determine quickly and accurately by sampling and analysis alone. Mathematical models for material and heat balance in bottom-blowing smelting, converting, anode furnace refining, and electrolytic refining were established based on the principles of material (element) conservation, energy conservation, and control index constraints in copper bottom-blowing smelting. A simulation of the entire bottom-blowing copper smelting process was implemented on the self-developed MetCal software platform, and a whole-process simulation for an enterprise in China was then conducted. Results indicated that the quantity and composition of unknown materials, as well as heat balance information, can be quickly calculated using the model. Comparison with production data revealed that the model can basically reflect the distribution law of the materials in bottom-blowing copper smelting. This finding provides theoretical guidance for mastering the performance of the entire process.
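To illustrate the element-conservation principle behind such models (a toy sketch with assumed mass fractions and stream names, not the authors' MetCal formulation), the balance can be written as a linear system A x = b and solved for the unknown product streams:

```python
# Toy element balance: columns are product streams (matte, slag, flue dust),
# rows are elements (Cu, Fe, S). Entries are assumed mass fractions.
import numpy as np

A = np.array([
    [0.60, 0.05, 0.25],   # Cu fraction in each product stream
    [0.15, 0.40, 0.10],   # Fe
    [0.20, 0.02, 0.05],   # S
])
b = np.array([41.0, 30.0, 13.5])   # tonnes of Cu, Fe, S charged to the furnace

x = np.linalg.solve(A, b)          # tonnes of each product stream
print(dict(zip(["matte", "slag", "flue dust"], x.round(2))))
# -> {'matte': 60.0, 'slag': 50.0, 'flue dust': 10.0}
```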
Distributed decision support for the 21st century mission space
NASA Astrophysics Data System (ADS)
McQuay, William K.
2002-07-01
The past decade has produced significant changes in the conduct of military operations: increased humanitarian missions, asymmetric warfare, reliance on coalitions and allies, stringent rules of engagement, concern about casualties, and the need for sustained air operations. Future mission commanders will need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Integral to this process is situational assessment: understanding the mission space, simulation to analyze alternative futures, current capabilities, planning assessments, course-of-action assessments, and a common operational picture keeping everyone on the same sheet of paper. Decision support tools in a distributed collaborative environment offer the capability of decomposing these complex multitask processes and distributing them over a dynamic set of execution assets. Decision support technologies can semi-automate activities, such as planning an operation, that have a reasonably well-defined process, and they provide machine-level interfaces to refine the myriad of information that is not currently fused. The marriage of information and simulation technologies provides the mission commander with a collaborative virtual environment for planning and decision support.
Construction of Green Tide Monitoring System and Research on its Key Techniques
NASA Astrophysics Data System (ADS)
Xing, B.; Li, J.; Zhu, H.; Wei, P.; Zhao, Y.
2018-04-01
As a kind of marine natural disaster, green tide has appeared every year along the Qingdao coast since the large-scale bloom in 2008, bringing great losses to the region. It is therefore of great value to obtain real-time dynamic information about green tide distribution. In this study, optical and microwave remote sensing methods are employed for green tide monitoring, and a specific remote sensing data processing flow and a green tide information extraction algorithm are designed according to the different characteristics of the optical and microwave data. For extracting green tide spatial distribution information, an automatic extraction algorithm for green tide distribution boundaries is designed based on the principle of mathematical-morphology dilation/erosion (see the sketch below). Key issues in information extraction, including the division of green tide regions, the derivation of basic distributions, the delimitation of distribution boundaries, and the elimination of islands, have been solved, and the automatic generation of green tide distribution boundaries from the remote sensing information extraction results is realized. Finally, a green tide monitoring system is built based on IDL/GIS secondary development in an integrated RS and GIS environment, achieving the integration of RS monitoring and information extraction.
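The sketch referenced above: a minimal illustration (not the authors' code) of boundary extraction from a binary green-tide mask via the morphological gradient, using scipy.ndimage; the mask here is a hypothetical stand-in for a thresholded satellite classification.

```python
# Morphological boundary extraction: dilation minus erosion leaves a
# one-pixel-wide boundary around each connected green-tide region.
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

# Hypothetical binary mask: True = green-tide pixel (e.g., NDVI threshold)
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:7] = True

boundary = binary_dilation(mask) & ~binary_erosion(mask)
print(boundary.astype(int))
```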
Challenges of Using CSCL in Open Distributed Learning.
ERIC Educational Resources Information Center
Nilsen, Anders Grov; Instefjord, Elen J.
As a compulsory part of the study in Pedagogical Information Science at the University of Bergen and Stord/Haugesund College (Norway) during the spring term of 1999, students participated in a distributed group activity that provided experience on distributed collaboration and use of online groupware systems. The group collaboration process was…
Raster Data Partitioning for Supporting Distributed GIS Processing
NASA Astrophysics Data System (ADS)
Nguyen Thai, B.; Olasz, A.
2015-08-01
In the geospatial sector, the big data concept has already had an impact. Several studies apply techniques originating in computer science to GIS processing of huge amounts of geospatial data; in other studies, geospatial data is treated as if it had always been big data (Lee and Kang, 2015). Data acquisition methods have improved substantially, in not only the amount of raw data but also its spectral, spatial and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly, by at least 20% every year (Dasgupta, 2013). The increasing volume of raw data, in different formats and representations and for different purposes, yields a wealth of derived information, but computing capability and processing speed face limitations, even when semi-automatic or automatic procedures are aimed at complex geospatial data (Kristóf et al., 2014). Recently, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing. Cloud computing, even more, requires appropriate processing algorithms to be distributed in order to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capabilities for processing non-GIS big data, but it is sometimes inconvenient or inefficient to rewrite existing algorithms for the Map-Reduce model, and GIS data cannot be partitioned like text-based data by lines or bytes. Hence, we seek an alternative solution for data partitioning, data distribution and execution of existing algorithms without rewriting, or with only minor modifications. This paper gives a technical overview of currently available distributed computing environments, as well as of GIS (raster) data partitioning, distribution and distributed processing of GIS algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution and processing, and the first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods depend heavily on application areas; therefore, data partitioning may be considered a preprocessing step before applying processing services to the data. As a proof of concept we have implemented a simple tile-based partitioning method that splits an image into smaller grids (NxM tiles), comparing the processing time to existing methods using NDVI calculation (see the sketch below). The concept is demonstrated using our own open-source processing framework.
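The sketch referenced above: a toy version of tile-based raster partitioning with per-tile NDVI computation, in Python with NumPy. The band arrays, tile counts, and serial tile loop are illustrative; in the paper's framework the tiles would be shipped to worker nodes rather than processed in one process.

```python
# Tile-based partitioning: split a raster into an N x M grid of tiles,
# compute NDVI independently per tile, and reassemble the result.
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - RED) / (NIR + RED), guarded against division by zero."""
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / np.maximum(denom, 1e-12))

def tile_slices(rows, cols, n, m):
    """Yield slice pairs partitioning a rows x cols raster into n x m tiles."""
    rs = np.linspace(0, rows, n + 1, dtype=int)
    cs = np.linspace(0, cols, m + 1, dtype=int)
    for i in range(n):
        for j in range(m):
            yield slice(rs[i], rs[i + 1]), slice(cs[j], cs[j + 1])

red = np.random.rand(1024, 1024).astype(np.float32)   # stand-in RED band
nir = np.random.rand(1024, 1024).astype(np.float32)   # stand-in NIR band

out = np.empty_like(red)
for r, c in tile_slices(1024, 1024, 4, 4):            # 4 x 4 = 16 tiles
    out[r, c] = ndvi(red[r, c], nir[r, c])            # independent per tile
```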
ERIC Educational Resources Information Center
Williams, Diane L.; Minshew, Nancy J.; Goldstein, Gerald
2015-01-01
More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism, in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher-order abilities. Subsequent behavioral work revealed a…
Information Delivery Options over Three Decades.
ERIC Educational Resources Information Center
Kennedy, H. E.
1986-01-01
Reviews the development of technological innovations in information delivery, including microforms, electronic processing, online distribution, full-text abstracts online, floppy disks, downloading, vertical integration, electronic publishing, and optical disks. The impact of technology on the information industry and the need to use technology…
Test results management and distributed cognition in electronic health record-enabled primary care.
Smith, Michael W; Hughes, Ashley M; Brown, Charnetta; Russo And, Elise; Giardina, Traber D; Mehta, Praveen; Singh, Hardeep
2018-06-01
Managing abnormal test results in primary care involves coordination across various settings. This study identifies how primary care teams manage test results in a large, computerized healthcare system in order to inform health information technology requirements for test results management and other distributed healthcare services. At five US Veterans Health Administration facilities, we interviewed 37 primary care team members, including 16 primary care providers, 12 registered nurses, and 9 licensed practical nurses. We performed content analysis using a distributed cognition approach, identifying patterns of information transmission across people and artifacts (e.g. electronic health records). Results illustrate challenges (e.g. information overload) as well as the strategies used to overcome them. Various communication paths were used; some team members served as intermediaries, processing information before relaying it, and artifacts were used as memory aids. Health information technology should address the risks of distributed work by supporting awareness of team and task status for reliable management of results.
Morphological evidence for parallel processing of information in rat macula.
Ross, M D
1988-01-01
Study of montages, tracings and reconstructions prepared from a series of 570 consecutive ultrathin sections shows that rat maculas are morphologically organized for parallel processing of linear acceleratory information. Type II cells of one terminal field distribute information to neighboring terminals as well. The findings are examined in light of physiological data which indicate that macular receptor fields have a preferred directional vector, and are interpreted by analogy to a computer technology known as an information network.
45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?
Code of Federal Regulations, 2013 CFR
2013-10-01
... Tribal financial management and expenditure information; (d) Distribute current support and arrearage..., process and monitor accounts receivable on all amounts owed, collected, and distributed with regard to: (1...
45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?
Code of Federal Regulations, 2012 CFR
2012-10-01
... Tribal financial management and expenditure information; (d) Distribute current support and arrearage..., process and monitor accounts receivable on all amounts owed, collected, and distributed with regard to: (1...
45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?
Code of Federal Regulations, 2014 CFR
2014-10-01
... Tribal financial management and expenditure information; (d) Distribute current support and arrearage..., process and monitor accounts receivable on all amounts owed, collected, and distributed with regard to: (1...
Distributed collaborative environments for predictive battlespace awareness
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process, and they provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.
Evaluation of Ultrasonic Fiber Structure Extraction Technique Using Autopsy Specimens of Liver
NASA Astrophysics Data System (ADS)
Yamaguchi, Tadashi; Hirai, Kazuki; Yamada, Hiroyuki; Ebara, Masaaki; Hachiya, Hiroyuki
2005-06-01
It is very important to diagnose liver cirrhosis noninvasively and correctly. In our previous studies, we proposed a processing technique to detect changes in liver tissue in vivo. In this paper, we evaluate the relationship between liver disease and echo information using autopsy specimens of a human liver in vitro. In vitro experiments make it possible to verify the function of a processing parameter clearly and to compare the processing result with the actual human liver tissue structure. Using our processing technique, information that does not obey a Rayleigh distribution was extracted from the echo signal of the autopsy liver specimens, depending on changes in a particular processing parameter. The fiber tissue structure of the same specimen was extracted from a number of histological images of stained tissue. We constructed 3D structures from the information extracted from the echo signal and from the fiber structure of the stained tissue, and compared the two. By comparing the 3D structures, it is possible to evaluate the relationship between the non-Rayleigh information in the echo signal and the fibrosis structure.
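A hedged illustration of the statistical idea involved (not the authors' method): fully developed speckle yields a Rayleigh-distributed echo envelope, so windows whose envelope fails a goodness-of-fit test against the Rayleigh model can be flagged as structure-bearing. The synthetic data and the choice of a Kolmogorov-Smirnov test below are assumptions.

```python
# Flag echo-envelope windows that deviate from the Rayleigh distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
speckle = stats.rayleigh.rvs(scale=1.0, size=2000, random_state=rng)   # homogeneous tissue
fibrous = stats.rice.rvs(b=3.0, scale=1.0, size=2000, random_state=rng)  # coherent component

for name, window in [("speckle-like", speckle), ("fiber-like", fibrous)]:
    scale = np.sqrt(np.mean(window**2) / 2.0)        # ML estimate of Rayleigh scale
    _, p = stats.kstest(window, "rayleigh", args=(0, scale))
    print(f"{name}: p = {p:.3g} ->", "Rayleigh" if p > 0.05 else "non-Rayleigh")
```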
A universal quantum information processor for scalable quantum communication and networks
Yang, Xihua; Xue, Bolin; Zhang, Junxiang; Zhu, Shiyao
2014-01-01
Entanglement provides an essential resource for quantum computation, quantum communication, and quantum networks. How to conveniently and efficiently realize the generation, distribution, storage, retrieval, and control of multipartite entanglement is the basic requirement for realistic quantum information processing. Here, we present a theoretical proposal to efficiently and conveniently achieve a universal quantum information processor (QIP) via atomic coherence in an atomic ensemble. The atomic coherence, produced through electromagnetically induced transparency (EIT) in the Λ-type configuration, acts as the QIP and has full functions of quantum beam splitter, quantum frequency converter, quantum entangler, and quantum repeater. By employing EIT-based nondegenerate four-wave mixing processes, the generation, exchange, distribution, and manipulation of light-light, atom-light, and atom-atom multipartite entanglement can be efficiently and flexibly achieved in a deterministic way with only coherent light fields. This method greatly facilitates the operations in quantum information processing, and holds promising applications in realistic scalable quantum communication and quantum networks. PMID:25316514
Factors Affecting Information Seeking and Evaluation in a Distributed Learning Environment
ERIC Educational Resources Information Center
Lee, Jae-Shin; Cho, Hichang
2011-01-01
The purpose of this study was to identify and analyze the processes of seeking information online and evaluating this information. We hypothesized that individuals' social network, in-out group categorization, and cultural proclivity would influence their online information-seeking behavior. Also, we tested whether individuals differentiated…
75 FR 56522 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... process improvement, the Department routinely conducts a review of the application data elements to... amended (HEA), mandates that the Secretary of Education "* * * shall produce, distribute, and process...
Informing Mexico's Distributed Generation Policy with System Advisor Model (SAM) Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aznar, Alexandra Y; Zinaman, Owen R; McCall, James D
The Government of Mexico recognizes the potential for clean distributed generation (DG) to meaningfully contribute to Mexico's clean energy and emissions reduction goals. However, important questions remain about how to fairly value DG and foster inclusive and equitable market growth that is beneficial to investors, electricity ratepayers, electricity distributors, and society. The U.S. National Renewable Energy Laboratory (NREL) has partnered with power sector institutions and stakeholders in Mexico to provide timely analytical support and expertise to help inform policymaking processes on clean DG. This document describes two technical assistance interventions that used the System Advisor Model (SAM) to inform Mexico's DG policymaking processes with a focus on rooftop solar regulation and policy.
Managing Learning for Performance.
ERIC Educational Resources Information Center
Kuchinke, K. Peter
1995-01-01
Presents findings of organizational learning literature that could substantiate claims of learning organization proponents. Examines four learning processes and their contribution to performance-based learning management: knowledge acquisition, information distribution, information interpretation, and organizational memory. (SK)
Classification of cognitive systems dedicated to data sharing
NASA Astrophysics Data System (ADS)
Ogiela, Lidia; Ogiela, Marek R.
2017-08-01
This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation are used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes are used to improve secure and efficient information management in such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms is also presented, along with possible applications of cognitive approaches to visual information management and encryption.
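As one concrete instance of a cryptographic threshold scheme of the kind mentioned above (our example; the paper does not specify Shamir's scheme), here is a minimal (k, n) secret-sharing sketch over a prime field:

```python
# Shamir (k, n) threshold sharing: split a secret into n shares so that
# any k reconstruct it and fewer than k reveal nothing.
import random

P = 2**127 - 1  # Mersenne prime used as the field modulus

def split(secret, k, n):
    """Split secret into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation of the polynomial
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, k=3, n=5)
print(reconstruct(shares[:3]))  # -> 123456789 from any 3 of the 5 shares
```

Any k shares reconstruct the secret exactly, while k-1 shares are information-theoretically useless, which is what makes threshold schemes attractive for distributing sensitive data across system nodes.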
Design distributed simulation platform for vehicle management system
NASA Astrophysics Data System (ADS)
Wen, Zhaodong; Wang, Zhanlin; Qiu, Lihua
2006-11-01
Next-generation military aircraft require high performance from the airborne management system. General modules, data integration, a high-speed data bus and so on are needed to share and manage the information of the subsystems efficiently. The subsystems include the flight control system, propulsion system, hydraulic power system, environmental control system, fuel management system, electrical power system and so on. The previously unattached or mixed architecture is changed to an integrated architecture: the whole airborne system is regarded as one system to manage, so the physical devices are distributed but the system information is integrated and shared. The processing functions of each subsystem are integrated (including general processing modules and dynamic reconfiguration); furthermore, the sensors and the signal processing functions are shared, which in turn is a foundation for shared power. A distributed vehicle management system is established using the 1553B bus and distributed processors, providing a validation platform for research on integrated airborne system management. This paper establishes the Vehicle Management System (VMS) simulation platform, discusses the software and hardware configuration, and analyzes the communication and fault-tolerance methods.
Distribution Functions of Sizes and Fluxes Determined from Supra-Arcade Downflows
NASA Technical Reports Server (NTRS)
McKenzie, D.; Savage, S.
2011-01-01
The frequency distributions of sizes and fluxes of supra-arcade downflows (SADs) provide information about the process of their creation. For example, a fractal creation process may be expected to yield a power-law distribution of sizes and/or fluxes. We examine 120 cross-sectional areas and magnetic flux estimates found by Savage & McKenzie for SADs, and find that (1) the areas are consistent with a log-normal distribution and (2) the fluxes are consistent with both a log-normal and an exponential distribution. Neither set of measurements is compatible with a power-law distribution nor a normal distribution. As a demonstration of the applicability of these findings to improved understanding of reconnection, we consider a simple SAD growth scenario with minimal assumptions, capable of producing a log-normal distribution.
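To illustrate the kind of model comparison described (a sketch on synthetic data, not the 120 measured SAD areas), one can fit candidate distributions with SciPy and compare Kolmogorov-Smirnov statistics:

```python
# Compare log-normal and power-law (Pareto) fits to a sample of "areas".
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
areas = stats.lognorm.rvs(s=0.8, scale=10.0, size=120, random_state=rng)

# Log-normal fit (location pinned to 0, as usual for positive areas)
s, loc, scale = stats.lognorm.fit(areas, floc=0)
_, p_lognorm = stats.kstest(areas, "lognorm", args=(s, loc, scale))

# Power-law fit above the sample minimum, for comparison
b, loc_p, scale_p = stats.pareto.fit(areas, floc=0, fscale=areas.min())
_, p_pareto = stats.kstest(areas, "pareto", args=(b, loc_p, scale_p))

print(f"log-normal KS p = {p_lognorm:.3f}, power-law KS p = {p_pareto:.3f}")
```

KS p-values computed with fitted parameters are only approximate, but they suffice to rank candidate models in a sketch like this.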
Distributed Visualization Project
NASA Technical Reports Server (NTRS)
Craig, Douglas; Conroy, Michael; Kickbusch, Tracey; Mazone, Rebecca
2016-01-01
Distributed Visualization allows anyone, anywhere to see any simulation at any time. Development focuses on algorithms, software, data formats, data systems and processes to enable sharing simulation-based information across temporal and spatial boundaries without requiring stakeholders to possess highly-specialized and very expensive display systems. It also introduces abstraction between the native and shared data, which allows teams to share results without giving away proprietary or sensitive data. The initial implementation of this capability is the Distributed Observer Network (DON) version 3.1. DON 3.1 is available for public release in the NASA Software Store (https://software.nasa.gov/software/KSC-13775) and works with version 3.0 of the Model Process Control specification (an XML Simulation Data Representation and Communication Language) to display complex graphical information and associated Meta-Data.
P300 event-related potentials in children with dyslexia.
Papagiannopoulou, Eleni A; Lagopoulos, Jim
2017-04-01
To elucidate the timing and nature of neural disturbances in dyslexia, and to further understand their topographical distribution, we examined entire brain regions employing the non-invasive auditory oddball P300 paradigm in children with dyslexia and neurotypical controls. Our findings revealed abnormalities for the dyslexia group in (i) P300 latency, globally, but greatest in frontal brain regions, and (ii) decreased P300 amplitude confined to the central brain regions. These findings reflect abnormalities associated with a diminished capacity to process mental workload as well as delayed processing of this information in children with dyslexia. Furthermore, the topographical distribution of these findings suggests a distinct spatial distribution for the observed P300 abnormalities. This information may be useful in future therapeutic or brain stimulation intervention trials.
Auditory Power-Law Activation Avalanches Exhibit a Fundamental Computational Ground State
NASA Astrophysics Data System (ADS)
Stoop, Ruedi; Gomez, Florian
2016-07-01
The cochlea provides a biological information-processing paradigm that we are only beginning to understand in its full complexity. Our work reveals an interacting network of strongly nonlinear dynamical nodes, on which even a simple sound input triggers subnetworks of activated elements that follow power-law size statistics ("avalanches"). From dynamical systems theory, power-law size distributions relate to a fundamental ground state of biological information processing. Learning destroys these power laws. These results strongly modify the models of mammalian sound processing and provide a novel methodological perspective for understanding how the brain processes information.
Distributed Processing of Projections of Large Datasets: A Preliminary Study
Maddox, Brian G.
2004-01-01
Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise, however, when trying to project these data with both reasonable speed and accuracy. Current single-threaded methods suffer from one of two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost and provide access to supercomputer-class performance. Combining these techniques may make it possible to use large amounts of geographic data in time-critical situations.
ERIC Educational Resources Information Center
Iaccino, James F.; Sowa, Stephen J.
A study examined the effects of sex, handedness, and instructions in the processing of verbal and spatial information presented tachistoscopically. Subjects, 48 volunteers from Illinois Benedictine College, were evenly distributed in terms of sex and handedness, and were further divided into two subgroups based on whether visual field attendance…
NASA Astrophysics Data System (ADS)
Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi
2014-12-01
Based on a phase retrieval algorithm and the QR code, a new optical encryption technology that needs to record only one intensity distribution is proposed. In this encryption process, the QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technology is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks, and suitable for harsh transmission conditions.
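A minimal numerical sketch of the double random phase encoding step in a 4-f system, with a random binary array standing in for the QR code; the phase-retrieval decryption step is omitted, and all sizes and seeds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random binary array standing in for the QR code in the input plane.
qr = rng.integers(0, 2, size=(64, 64)).astype(float)

# Two independent random phase masks: input plane and Fourier plane.
m1 = np.exp(2j * np.pi * rng.random(qr.shape))
m2 = np.exp(2j * np.pi * rng.random(qr.shape))

# 4-f double random phase encoding: FT, Fourier-plane mask, inverse FT.
field = np.fft.ifft2(np.fft.fft2(qr * m1) * m2)

# Only the output-plane intensity is recorded as the ciphertext.
ciphertext = np.abs(field) ** 2
```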
From the Research Laboratory to the Operating Company: How Information Travels.
ERIC Educational Resources Information Center
Coppin, Ann S.; Palmer, Linda L.
1980-01-01
Reviews transmission processes of Chevron Oil Field Research Company (COFRC) research results from laboratories to end-user operating companies worldwide. Information dissemination methods described included informal communication, intercompany meetings, visits by COFRC personnel to operating company offices, distribution of written reports,…
An Integration of a GIS with Peatland Management
NASA Technical Reports Server (NTRS)
Hoshal, J. C.; Johnson, R. L.
1982-01-01
The complexities of peatland management in Minnesota and the use of a geographic information system, the Minnesota Land Management Information System (MLMIS), in the management process are examined. General information on the nature of peat and its quantity and distribution in Minnesota is also presented.
Fault-Tolerant Signal Processing Architectures with Distributed Error Control.
1985-01-01
Zm, Revisited," Information and Control, Vol. 37, pp. 100-104, 1978. 13. J. Wakerly , Error Detecting Codes. SeIf-Checkino Circuits and Applications ...However, the newer results concerning applications of real codes are still in the publication process. Hence, two very detailed appendices are included to...significant entities to be protected. While the distributed finite field approach afforded adequate protection, its applicability was restricted and
Distributed Fusion in Sensor Networks with Information Genealogy
2011-06-28
Application areas include image processing, acoustic and speech recognition, multitarget tracking, distributed fusion, and Bayesian inference. Techniques widely used in speech recognition and other classification applications have seen only limited use in underwater mine classification, which this paper addresses.
The distributed agent-based approach in the e-manufacturing environment
NASA Astrophysics Data System (ADS)
Sękala, A.; Kost, G.; Dobrzańska-Danikiewicz, A.; Banaś, W.; Foit, K.
2015-11-01
The lack of a coherent flow of information from a production department causes unplanned downtime and failures of machines and their equipment, which in turn results in a production planning process based on incorrect and out-of-date information. All of these factors create additional difficulties for decision-making. They concern, among others, the coordination of components of a distributed system and the provision of access to the required information, thereby generating unnecessary costs. The use of agent technology significantly speeds up the flow of information within the virtual enterprise. This paper proposes a multi-agent approach for the integration of processes within the virtual enterprise concept. The concept was elaborated to investigate possible ways of transmitting information in the production system while taking into account the self-organization of its constituent components; this implied linking the concept of a multi-agent system with a system for managing production information, based on the idea of e-manufacturing. The paper presents the resulting scheme, which should form the basis for an informatics model of the target virtual system. The computer system itself is intended to be developed next.
Self-Referenced Processing, Neurodevelopment and Joint Attention in Autism
ERIC Educational Resources Information Center
Mundy, Peter; Gwaltney, Mary; Henderson, Heather
2010-01-01
This article describes a parallel and distributed processing model (PDPM) of joint attention, self-referenced processing and autism. According to this model, autism involves early impairments in the capacity for rapid, integrated processing of self-referenced (proprioceptive and interoceptive) and other-referenced (exteroceptive) information.…
Characterization of intermittency in renewal processes: Application to earthquakes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akimoto, Takuma; Hasumi, Tomohiro; Aizawa, Yoji
2010-03-15
We construct a one-dimensional piecewise linear intermittent map from the interevent time distribution for a given renewal process. Then, we characterize intermittency by the asymptotic behavior near the indifferent fixed point in the piecewise linear intermittent map. Thus, we provide a framework to understand a unified characterization of intermittency and also present the Lyapunov exponent for renewal processes. This method is applied to the occurrence of earthquakes using the Japan Meteorological Agency and the National Earthquake Information Center catalog. By analyzing the return map of interevent times, we find that interevent times are not independent and identically distributed random variables, but that the conditional probability distribution functions in the tail obey the Weibull distribution.
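A small sketch of the two empirical ingredients mentioned above, the return-map correlation of interevent times and a Weibull fit to the tail, using simulated interevent times rather than the JMA/NEIC catalogs; the Weibull parameters and the 10% tail cut are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Simulated interevent times; the real input would be differences of
# consecutive event times from the JMA or NEIC catalogs.
rng = np.random.default_rng(2)
dt = stats.weibull_min.rvs(0.8, scale=5.0, size=5000, random_state=rng)

# Return map (dt_i, dt_{i+1}): nonzero correlation is the signature of
# interevent times that are not independent.
r = np.corrcoef(dt[:-1], dt[1:])[0, 1]
print(f"lag-1 correlation: {r:.3f}")

# Weibull fit to the conditional tail (largest 10% of interevent times).
tail = np.sort(dt)[int(0.9 * dt.size):]
shape, loc, scale = stats.weibull_min.fit(tail, floc=0)
print(f"Weibull tail fit: shape = {shape:.2f}, scale = {scale:.2f}")
```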
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Derek Martin, C.; Lim, C. H.
2007-02-01
Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), which is capable of effectively handling large amounts of geospatial information relative to rockfall behaviors, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell plane basis. It uses inputs of distributed parameters in terms of raster and polygon features created in GIS. Two major components are included in RA: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results, 3D rockfall trajectories and their velocity features either for point seeders or polyline seeders are stored in 3D shape files. Distributed raster modeling, based on 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A distribution of rockfall hazard can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. The RA has been tested in ArcGIS 8.2, 8.3, 9.0 and 9.1.
NASA Astrophysics Data System (ADS)
Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur
In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
NASA Technical Reports Server (NTRS)
Mah, G. R.; Myers, J.
1993-01-01
The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAAC) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA-Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California, Cuprite, Nevada, and the Drum Mountains, Utah using a combination of AVIRIS, TIMS, and TMS data, and existing digital elevation models (DEM) for the topographic information.
Distributed representations in memory: Insights from functional brain imaging
Rissman, Jesse; Wagner, Anthony D.
2015-01-01
Forging new memories for facts and events, holding critical details in mind on a moment-to-moment basis, and retrieving knowledge in the service of current goals all depend on a complex interplay between neural ensembles throughout the brain. Over the past decade, researchers have increasingly leveraged powerful analytical tools (e.g., multi-voxel pattern analysis) to decode the information represented within distributed fMRI activity patterns. In this review, we discuss how these methods can sensitively index neural representations of perceptual and semantic content, and how leverage on the engagement of distributed representations provides unique insights into distinct aspects of memory-guided behavior. We emphasize that, in addition to characterizing the contents of memories, analyses of distributed patterns shed light on the processes that influence how information is encoded, maintained, or retrieved, and thus inform memory theory. We conclude by highlighting open questions about memory that can be addressed through distributed pattern analyses. PMID:21943171
The closed-mindedness that wasn't: need for structure and expectancy-inconsistent information.
Kemmelmeier, Markus
2015-01-01
Social-cognitive researchers have typically assumed that individuals high in need for structure or need for closure tend to be closed-minded: they are motivated to resist or ignore information that is inconsistent with existing beliefs but instead they rely on category-based expectancies. The present paper argues that this conclusion is not necessarily warranted because previous studies did not allow individual differences in categorical processing to emerge and did not consider different distributions of category-relevant information. Using a person memory paradigm, Experiments 1 and 2 show that, when categorical processing is optional, high need-for-structure individuals are especially likely to use this type of processing to reduce uncertainty, which results in superior recall for expectancy-inconsistent information. Experiment 2 demonstrates that such information is also more likely to be used in judgment making, leading to judgmental moderation among high need-for-structure individuals. Experiments 3 and 4 used a person memory paradigm which requires categorical processing regardless of levels of need for structure. Experiments 3 and 4 demonstrate that whether expectancy-consistent or -inconsistent information is recalled better is a function of whether the majority of available information is compatible or incompatible with an initial category-based expectancy. Experiment 4 confirmed that the extent to which high need-for-structure individuals attend to different types of information varies with their distribution. The discussion highlights that task affordances have a critical influence on the consequences of categorical processing for memory and social judgment. Thus, high need for structure does not necessarily equate to closed-mindedness.
The Information Sector: Definition and Measurement.
ERIC Educational Resources Information Center
Porat, Marc U.
In the last 20 years the U.S. economy has changed as a result of the increase in production, processing, and distribution of information goods and services. Three information sectors--the primary sector producing information goods and services, the private bureaucracy, and the public bureaucracy--are part of a six-sector economy. Today,…
Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation
NASA Astrophysics Data System (ADS)
Anisenkov, A. V.
2018-03-01
In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the worldwide LHC computing grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and provide a high degree of their accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in the development of a unified description of the computing resources provided by grid sites, supercomputer centers, and cloud computing into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and integrate the supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).
Schlegel, Alexander; Konuthula, Dedeepya; Alexander, Prescott; Blackwood, Ethan; Tse, Peter U
2016-08-01
The manipulation of mental representations in the human brain appears to share similarities with the physical manipulation of real-world objects. In particular, some neuroimaging studies have found increased activity in motor regions during mental rotation, suggesting that mental and physical operations may involve overlapping neural populations. Does the motor network contribute information processing to mental rotation? If so, does it play a similar computational role in both mental and manual rotation, and how does it communicate with the wider network of areas involved in the mental workspace? Here we used multivariate methods and fMRI to study 24 participants as they mentally rotated 3-D objects or manually rotated their hands in one of four directions. We find that information processing related to mental rotations is distributed widely among many cortical and subcortical regions, that the motor network becomes tightly integrated into a wider mental workspace network during mental rotation, and that motor network activity during mental rotation only partially resembles that involved in manual rotation. Additionally, these findings provide evidence that the mental workspace is organized as a distributed core network that dynamically recruits specialized subnetworks for specific tasks as needed.
ERIC Educational Resources Information Center
Ho, Jeanne Marie; Ng, David
2012-01-01
This study examined the process of Information Communication Technology reform in a Singapore school. The focus was on distributed leadership actions, and the factors which enabled and constrained the distribution of leadership. This study adopted a naturalistic inquiry approach, involving the case study of a school. The study found that…
Seaworthy Quantum Key Distribution Design and Validation (SEAKEY)
2016-03-10
Contractor address: 10 Moulton Street, Cambridge, MA 02138. Project title: Seaworthy Quantum Key Distribution Design and Validation (SEAKEY). Program manager: Kathryn Carson, Quantum Information Processing. Approved for public release. Work has continued on calculating the key rates achievable parametrically with receiver performance, and the initial designs are described.
Srinivasan, Narayanan; Mukherjee, Sumitava; Mishra, Maruti V.; Kesarwani, Smriti
2013-01-01
Attention is a key process used to conceptualize and define modes of thought, but we lack information about the role of specific attentional processes on preferential choice and memory in multi-attribute decision making. In this study, we examine the role of attention based on two dimensions, attentional scope and load, on choice preference strength and memory using a paradigm that arguably elicits unconscious thought. Scope of attention was manipulated by using global or local processing during distraction (Experiment 1) and before the information-encoding stage (Experiment 2). Load was manipulated by using the n-back task in Experiment 1. Results from Experiment 1 show that global processing or distributed attention during distraction results in stronger preference irrespective of load but better memory only at low cognitive load. Task difficulty or load did not have any effect on preference or memory. In Experiment 2, distributed attention before attribute encoding facilitated only memory but did not influence preference. Results show that attentional processes at different stages of processing like distraction and information-encoding influence decision making processes. Scope of attention not only influences preference and memory but the manner in which attentional scope influences them depends on both load and stage of information processing. The results indicate the important role of attention in processes critical for decision making and call for a re-evaluation of the unconscious thought theory (UTT) and the need for reconceptualizing the role of attention. PMID:23382726
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-15
... ``* * * shall produce, distribute, and process free of charge common financial reporting forms as described in... developed an application process to collect and process the data necessary to determine a student's eligibility to receive Title IV, HEA program assistance. The application process involves an applicant's...
Foraging patterns in online searches.
Wang, Xiangwen; Pleimling, Michel
2017-03-01
Nowadays online searches are undeniably the most common form of information gathering, as witnessed by billions of clicks generated each day on search engines. In this work we describe online searches as foraging processes that take place on the semi-infinite line. Using a variety of quantities like probability distributions and complementary cumulative distribution functions of step length and waiting time as well as mean square displacements and entropies, we analyze three different click-through logs that contain the detailed information of millions of queries submitted to search engines. Notable differences between the different logs reveal an increased efficiency of the search engines. In the language of foraging, the newer logs indicate that online searches overwhelmingly yield local searches (i.e., on one page of links provided by the search engines), whereas for the older logs the foraging processes are a combination of local searches and relocation phases that are power law distributed. Our investigation of click logs of search engines therefore highlights the presence of intermittent search processes (where phases of local explorations are separated by power law distributed relocation jumps) in online searches. It follows that good search engines enable the users to find the information they are looking for through a local exploration of a single page with search results, whereas with poor search engines users are often forced to do a broader exploration of different pages.
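The power-law relocation phases can be checked with a complementary cumulative distribution function (CCDF) of jump lengths; the sketch below uses a simulated Pareto-tailed sample in place of a real click-through log, with the tail exponent 1.5 as an assumed value.

```python
import numpy as np

def ccdf(x):
    """Empirical complementary cumulative distribution P(X >= x)."""
    xs = np.sort(np.asarray(x))
    return xs, 1.0 - np.arange(1, xs.size + 1) / xs.size

# Simulated Pareto-tailed relocation jumps, P(X > x) ~ x^(-1.5).
rng = np.random.default_rng(3)
jumps = (1.0 - rng.random(10_000)) ** (-1.0 / 1.5)

xs, p = ccdf(jumps)
# On log-log axes a power-law tail is a straight line of slope -alpha.
slope = np.polyfit(np.log(xs[:-1]), np.log(p[:-1]), 1)[0]
print(f"estimated tail exponent alpha = {-slope:.2f}")
```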
A Distributed Architecture for Tsunami Early Warning and Collaborative Decision-support in Crises
NASA Astrophysics Data System (ADS)
Moßgraber, J.; Middleton, S.; Hammitzsch, M.; Poslad, S.
2012-04-01
The presentation will describe work on the system architecture that is being developed in the EU FP7 project TRIDEC on "Collaborative, Complex and Critical Decision-Support in Evolving Crises". The challenges for a Tsunami Early Warning System (TEWS) are manifold and the success of a system depends crucially on the system's architecture. A modern warning system following a system-of-systems approach has to integrate various components and sub-systems such as different information sources, services and simulation systems. Furthermore, it has to take into account the distributed and collaborative nature of warning systems. In order to create an architecture that supports the whole spectrum of a modern, distributed and collaborative warning system, one must deal with multiple challenges. Obviously, one cannot expect to tackle these challenges adequately with a monolithic system or with a single technology. Therefore, a system architecture providing the blueprints to implement the system-of-systems approach has to combine multiple technologies and architectural styles. At the bottom layer it has to reliably integrate a large set of conventional sensors, such as seismic sensors and sensor networks, buoys and tide gauges, and also innovative and unconventional sensors, such as streams of messages from social media services. At the top layer it has to support collaboration on high-level decision processes and facilitate information sharing between organizations. In between, the system has to process all data and integrate information on a semantic level in a timely manner. This complex communication follows an event-driven mechanism allowing events to be published, detected and consumed by various applications within the architecture. Therefore, at the upper layer the event-driven architecture (EDA) aspects are combined with principles of service-oriented architectures (SOA) using standards for communication and data exchange. The most prominent challenges on this layer include providing a framework for information integration on a syntactic and semantic level, leveraging distributed processing resources for a scalable data processing platform, and automating data processing and decision support workflows.
Experimental demonstration of graph-state quantum secret sharing.
Bell, B A; Markham, D; Herrera-Martí, D A; Marin, A; Wadsworth, W J; Rarity, J G; Tame, M S
2014-11-21
Quantum communication and computing offer many new opportunities for information processing in a connected world. Networks using quantum resources with tailor-made entanglement structures have been proposed for a variety of tasks, including distributing, sharing and processing information. Recently, a class of states known as graph states has emerged, providing versatile quantum resources for such networking tasks. Here we report an experimental demonstration of graph state-based quantum secret sharing--an important primitive for a quantum network with applications ranging from secure money transfer to multiparty quantum computation. We use an all-optical setup, encoding quantum information into photons representing a five-qubit graph state. We find that one can reliably encode, distribute and share quantum information amongst four parties, with various access structures based on the complex connectivity of the graph. Our results show that graph states are a promising approach for realising sophisticated multi-layered communication protocols in quantum networks.
Hall, William
2017-01-14
As healthcare resources become increasingly scarce due to growing demand and stagnating budgets, the need for effective priority setting and resource allocation will become ever more critical to providing sustainable care to patients. While societal values should certainly play a part in guiding these processes, the methodology used to capture these values need not necessarily be limited to multi-criterion decision analysis (MCDA)-based processes, including 'evidence-informed deliberative processes.' However, if decision-makers intend not only to incorporate the values of the public they serve into decisions but to have the decisions enacted as well, consideration should be given to more direct involvement of stakeholders. Based on the examples provided by Baltussen et al, MCDA-based processes like 'evidence-informed deliberative processes' could be one way of achieving this laudable goal.
Descending motion of particles and its effect on ozone hole chemistry
NASA Technical Reports Server (NTRS)
Iwasaka, Y.
1988-01-01
Particle descending motion is one possible process which causes ozone loss near the tropopause in the Antarctic spring. However, the particle size distribution has not yet been measured. Particle settling is an important redistribution process for the chemical constituents contained in the particles. To understand particle settling effects on the Ozone Hole, information on the size distribution and the chemical composition of the particles is necessary.
A Disk-Based System for Producing and Distributing Science Products from MODIS
NASA Technical Reports Server (NTRS)
Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael
2007-01-01
Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.
Understanding Local Structure Globally in Earth Science Remote Sensing Data Sets
NASA Technical Reports Server (NTRS)
Braverman, Amy; Fetzer, Eric
2007-01-01
Empirical probability distributions derived from data are signatures of the physical processes generating the data. Distributions defined on different space-time windows can be compared, and differences or changes can be attributed to physical processes. This presentation discusses ways to reduce remote sensing data while preserving information, focusing on rate-distortion theory and the entropy-constrained vector quantization algorithm.
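A toy version of entropy-constrained vector quantization, sketched as Lagrangian k-means in which the assignment cost is squared distortion plus a multiplier times the codeword's code length; the data, codebook size, and multiplier are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ecvq(data, k=8, lam=0.05, iters=50, seed=0):
    """Entropy-constrained vector quantization (Lagrangian k-means).

    Points go to the codeword minimizing squared distortion plus
    lam times the code length -log2(p_j) under current usage p.
    """
    rng = np.random.default_rng(seed)
    code = data[rng.choice(len(data), k, replace=False)].copy()
    p = np.full(k, 1.0 / k)
    for _ in range(iters):
        d2 = ((data[:, None, :] - code[None, :, :]) ** 2).sum(axis=-1)
        assign = (d2 - lam * np.log2(p + 1e-12)).argmin(axis=1)
        for j in range(k):
            members = data[assign == j]
            if len(members):
                code[j] = members.mean(axis=0)
            p[j] = len(members) / len(data)
    return code, p, assign

# Example: summarize one space-time window of two-channel retrievals.
window = np.random.default_rng(1).normal(size=(2000, 2))
code, p, assign = ecvq(window)
```

The multiplier `lam` trades distortion against entropy: larger values concentrate usage on fewer codewords, shrinking the effective code length of the summary.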
A Novel College Network Resource Management Method using Cloud Computing
NASA Astrophysics Data System (ADS)
Lin, Chen
At present, information construction in colleges mainly involves building college networks and management information systems, and many problems arise in the process. Cloud computing is a development of distributed processing, parallel processing and grid computing in which data are stored in the cloud and software and services are placed in the cloud, built on top of various standards and protocols so that they can be accessed from many kinds of devices. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; cloud computing technology and methods are then applied to the construction of a college information sharing platform.
Managed traffic evacuation using distributed sensor processing
NASA Astrophysics Data System (ADS)
Ramuhalli, Pradeep; Biswas, Subir
2005-05-01
This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fire and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature, humidity, and detecting unusual events such as smoke, structural failures, vibration, biological/chemical or nuclear agents. Distributed event processing algorithms will be executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken for maximizing the evacuation speed and minimizing unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators are used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and their simulation models for the proposed sensor networking and the distributed event processing framework. Also, efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind our proposed distributed event processing algorithms. Results obtained through simulation in this paper are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.
ERIC Educational Resources Information Center
Yang, Fang-Ying
2017-01-01
The main goal of this study was to investigate how readers' visual attention distribution during reading of conflicting science information is related to their scientific reasoning behavior. A total of 25 university students voluntarily participated in the study. They were given conflicting science information about earthquake predictions to read…
Information Fusion for Situational Awareness
2003-01-01
Defense Technical Information Center compilation part notice ADP021704 (component of compilation report ADP021634 through ADP021736): Information Fusion for Situational Awareness, Dr. John… The paper considers how Situation Assessment (fusion level 2), the processing of knowledge of objects and their goals, can be applied to address situational awareness.
Navigating the Decision Space: Shared Medical Decision Making as Distributed Cognition.
Lippa, Katherine D; Feufel, Markus A; Robinson, F Eric; Shalin, Valerie L
2017-06-01
Despite increasing prominence, little is known about the cognitive processes underlying shared decision making. To investigate these processes, we conceptualize shared decision making as a form of distributed cognition. We introduce a Decision Space Model to identify physical and social influences on decision making. Using field observations and interviews, we demonstrate that patients and physicians in both acute and chronic care consider these influences when identifying the need for a decision, searching for decision parameters, and making actionable decisions. Based on the distribution of access to information and actions, we then identify four related patterns: physician dominated; physician-defined, patient-made; patient-defined, physician-made; and patient-dominated decisions. Results suggest that (a) decision making is necessarily distributed between physicians and patients, (b) differential access to information and action over time requires participants to transform a distributed task into a shared decision, and (c) adverse outcomes may result from failures to integrate physician and patient reasoning. Our analysis unifies disparate findings in the medical decision-making literature and has implications for improving care and medical training.
Secure distribution for high resolution remote sensing images
NASA Astrophysics Data System (ADS)
Liu, Jin; Sun, Jing; Xu, Zheng Q.
2010-09-01
The use of remote sensing images collected by space platforms is becoming more and more widespread. The increasing value of space data and its use in critical scenarios call for adoption of proper security measures to protect these data against unauthorized access and fraudulent use. In this paper, based on the characteristics of remote sensing image data and application requirements for secure distribution, a secure distribution method is proposed, comprising user and region classification, hierarchical control and key generation, and region-based multi-level encryption. The combination of the three parts allows the same multi-level-encrypted remote sensing images to be distributed to users of different permission levels through multicast, while each user obtains only the degree of information permitted by his own decryption key. This meets the access control and security needs of high resolution remote sensing image distribution well. The experimental results prove the effectiveness of the proposed method, which is suitable for practical use in the secure transmission of remote sensing images containing confidential information over the internet.
A Methodology for Distributing the Corporate Database.
ERIC Educational Resources Information Center
McFadden, Fred R.
The trend to distributed processing is being fueled by numerous forces, including advances in technology, corporate downsizing, increasing user sophistication, and acquisitions and mergers. Increasingly, the trend in corporate information systems (IS) departments is toward sharing resources over a network of multiple types of processors, operating…
Research on distributed optical fiber sensing data processing method based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing
2018-01-01
The pipeline leak detection and leak location problems have received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline, and the data processing method for distributed optical fiber sensing based on LabVIEW is studied. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card and computer. The software system, developed in LabVIEW, adopts a wavelet denoising method to process the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and measurement signal storage and inquiry. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measuring system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
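A minimal sketch of the wavelet-denoising step described above, using the PyWavelets package rather than LabVIEW; the wavelet choice ('db4'), decomposition level, and universal threshold are assumptions for illustration.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold wavelet denoising of a 1-D temperature trace."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Universal threshold from the noise level of the finest details.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    t = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, t, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Example: noisy temperature trace with a localized leak-induced step.
x = np.linspace(0, 1, 2048)
temp = 20 + 3 * (x > 0.6) + np.random.default_rng(0).normal(0, 0.5, x.size)
clean = wavelet_denoise(temp)
```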
Surveillance of industrial processes with correlated parameters
White, Andrew M.; Gross, Kenny C.; Kubic, William L.; Wigeland, Roald A.
1996-01-01
A system and method for surveillance of an industrial process. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions.
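The Mahalanobis-distance core of such a surveillance scheme can be sketched as follows; the serial-correlation removal and the sequential probability ratio test of the patented method are not reproduced here, and the threshold and simulated sensor data are placeholders.

```python
import numpy as np

def mahalanobis_alarm(train, new, threshold=3.5):
    """Flag multivariate observations far from the training distribution
    of correlated process parameters (Mahalanobis distance)."""
    mu = train.mean(axis=0)
    vi = np.linalg.inv(np.cov(train, rowvar=False))
    diff = new - mu
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, vi, diff))
    return d > threshold, d

# Example: 5 correlated sensor channels, training data plus new readings.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5))
train = rng.normal(size=(2000, 5)) @ A          # correlated channels
new = rng.normal(size=(10, 5)) @ A
new[0] += 5.0                                   # inject a fault
alarms, dists = mahalanobis_alarm(train, new)
print(alarms)
```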
Quantum decision-maker theory and simulation
NASA Astrophysics Data System (ADS)
Zak, Michail; Meyers, Ronald E.; Deacon, Keith S.
2000-07-01
A quantum device simulating the human decision making process is introduced. It consists of quantum recurrent nets generating stochastic processes which represent the motor dynamics, and of classical neural nets describing the evolution of probabilities of these processes which represent the mental dynamics. The autonomy of the decision making process is achieved by a feedback from the mental to motor dynamics which changes the stochastic matrix based upon the probability distribution. This feedback replaces unavailable external information by an internal knowledge base stored in the mental model in the form of probability distributions. As a result, the coupled motor-mental dynamics is described by a nonlinear version of Markov chains which can decrease entropy without an external source of information. Applications to common sense based decisions as well as to evolutionary games are discussed. An example exhibiting self-organization is computed using quantum computer simulation. Force on force and mutual aircraft engagements using the quantum decision maker dynamics are considered.
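A toy two-state version of the nonlinear Markov chain described above, in which the stochastic matrix is a function of the current probability distribution; the quadratic reinforcement rule is an illustrative assumption, chosen so that the entropy of the distribution decreases without any external source of information.

```python
import numpy as np

def entropy(p):
    q = p[p > 0]
    return float(-(q * np.log(q)).sum())

# Two-state chain; feedback from the mental model (the probability
# distribution p) reshapes the motor dynamics (the stochastic matrix).
p = np.array([0.55, 0.45])
for step in range(8):
    w = p ** 2 / (p ** 2).sum()   # quadratic reinforcement (assumed rule)
    T = np.tile(w, (2, 1))        # stochastic matrix depends on p
    p = p @ T                     # nonlinear Markov update
    print(step, p.round(4), f"H = {entropy(p):.4f}")
```

Running the loop shows the distribution collapsing onto the initially more probable state while H falls toward zero, the entropy-decreasing behavior the abstract attributes to the mental-to-motor feedback.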
The differential role of phonological and distributional cues in grammatical categorisation.
Monaghan, Padraic; Chater, Nick; Christiansen, Morten H
2005-06-01
Recognising the grammatical categories of words is a necessary skill for the acquisition of syntax and for on-line sentence processing. The syntactic and semantic context of the word contribute as cues for grammatical category assignment, but phonological cues, too, have been implicated as important sources of information. The value of phonological and distributional cues has not, with very few exceptions, been empirically assessed. This paper presents a series of analyses of phonological cues and distributional cues and their potential for distinguishing grammatical categories of words in corpus analyses. The corpus analyses indicated that phonological cues were more reliable for less frequent words, whereas distributional information was most valuable for high frequency words. We tested this prediction in an artificial language learning experiment, where the distributional and phonological cues of categories of nonsense words were varied. The results corroborated the corpus analyses. For high-frequency nonwords, distributional information was more useful, whereas for low-frequency words there was more reliance on phonological cues. The results indicate that phonological and distributional cues contribute differentially towards grammatical categorisation.
Seaworthy Quantum Key Distribution Design and Validation (SEAKEY)
2015-05-27
Contractor address: 10 Moulton Street, Cambridge, MA 02138. Project title: Seaworthy Quantum Key Distribution Design and Validation (SEAKEY). Program manager: Kathryn Carson, Quantum Information Processing.
Distributed parameterization of complex terrain
NASA Astrophysics Data System (ADS)
Band, Lawrence E.
1991-03-01
This paper addresses the incorporation of high resolution topography, soils and vegetation information into the simulation of land surface processes in atmospheric circulation models (ACM). Recent work has concentrated on detailed representation of one-dimensional exchange processes, implicitly assuming surface homogeneity over the atmospheric grid cell. Two approaches that could be taken to incorporate heterogeneity are the integration of a surface model over distributed, discrete portions of the landscape, or over a distribution function of the model parameters. However, the computational burden and parameter intensive nature of current land surface models in ACM limits the number of independent model runs and parameterizations that are feasible to accomplish for operational purposes. Therefore, simplifications in the representation of the vertical exchange processes may be necessary to incorporate the effects of landscape variability and horizontal divergence of energy and water. The strategy is then to trade off the detail and rigor of point exchange calculations for the ability to repeat those calculations over extensive, complex terrain. It is clear the parameterization process for this approach must be automated such that large spatial databases collected from remotely sensed images, digital terrain models and digital maps can be efficiently summarized and transformed into the appropriate parameter sets. Ideally, the landscape should be partitioned into surface units that maximize between unit variance while minimizing within unit variance, although it is recognized that some level of surface heterogeneity will be retained at all scales. Therefore, the geographic data processing necessary to automate the distributed parameterization should be able to estimate or predict parameter distributional information within each surface unit.
ERIC Educational Resources Information Center
Jared, Debra; Jouravlev, Olessia; Joanisse, Marc F.
2017-01-01
Decomposition theories of morphological processing in visual word recognition posit an early morpho-orthographic parser that is blind to semantic information, whereas parallel distributed processing (PDP) theories assume that the transparency of orthographic-semantic relationships influences processing from the beginning. To test these…
A 3-Component Mixture of Rayleigh Distributions: Properties and Estimation in Bayesian Framework
Aslam, Muhammad; Tahir, Muhammad; Hussain, Zawar; Al-Zahrani, Bander
2015-01-01
To study lifetimes of certain engineering processes, a lifetime model which can accommodate the nature of such processes is desired. The mixture models of underlying lifetime distributions are intuitively more appropriate and appealing for modeling the heterogeneous nature of a process as compared to simple models. This paper is about studying a 3-component mixture of Rayleigh distributions in a Bayesian perspective. The censored sampling environment is considered due to its popularity in reliability theory and survival analysis. The expressions for the Bayes estimators and their posterior risks are derived under different scenarios. In case no or little prior information is available, elicitation of hyperparameters is given. To examine, numerically, the performance of the Bayes estimators using non-informative and informative priors under different loss functions, we have simulated their statistical properties for different sample sizes and test termination times. In addition, to highlight the practical significance, an illustrative example based on a real-life engineering data is also given. PMID:25993475
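A short sketch of the 3-component Rayleigh mixture with type-I censoring at a fixed test termination time; the weights, scale parameters, sample size, and censoring time are assumed values for illustration, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Assumed mixing weights and Rayleigh scale parameters.
w = np.array([0.5, 0.3, 0.2])
s = np.array([1.0, 2.5, 5.0])

# Lifetimes drawn from the 3-component mixture.
comp = rng.choice(3, size=1000, p=w)
x = stats.rayleigh.rvs(scale=s[comp], random_state=rng)

# Type-I censoring at a fixed test termination time.
t0 = 6.0
censored = x > t0
print(f"{censored.sum()} of {x.size} lifetimes censored at t0 = {t0}")

def mixture_pdf(t):
    """Density of the 3-component Rayleigh mixture at t."""
    return float((w * stats.rayleigh.pdf(t, scale=s)).sum())
```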
A Domain-Specific Language for Aviation Domain Interoperability
ERIC Educational Resources Information Center
Comitz, Paul
2013-01-01
Modern information systems require a flexible, scalable, and upgradeable infrastructure that allows communication and collaboration between heterogeneous information processing and computing environments. Aviation systems from different organizations often use differing representations and distribution policies for the same data and messages,…
NASA Astrophysics Data System (ADS)
Antamoshkin, O. A.; Kilochitskaya, T. R.; Ontuzheva, G. A.; Stupina, A. A.; Tynchenko, V. S.
2018-05-01
This study reviews the problem of allocation of resources in heterogeneous distributed information processing systems, which may be formalized as a multicriterion multi-index problem with linear constraints of the transport type. Algorithms for solving this problem search for the entire set of Pareto-optimal solutions. For some classes of hierarchical systems, it is possible to significantly speed up the verification of a system of linear algebraic inequalities for consistency, owing to their reducibility to stream models or the applicability of other solution schemes (for strongly connected structures) that take into account the specifics of the hierarchies under consideration.
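Consistency of a transport-type system of linear constraints can be checked directly as a linear-programming feasibility problem. The sketch below is a generic check with invented capacities, demands, and an allowed-assignment mask; it does not reproduce the specialized stream-model speed-ups the abstract refers to.

```python
import numpy as np
from scipy.optimize import linprog

# Can 3 processing nodes (capacities b) cover 4 task classes (demands d)
# given an allowed-assignment mask?
b = np.array([30.0, 25.0, 20.0])
d = np.array([15.0, 20.0, 10.0, 25.0])
allowed = np.array([[1, 1, 0, 1],
                    [0, 1, 1, 1],
                    [1, 0, 1, 0]], dtype=float)

m, n = allowed.shape
A_ub = np.kron(np.eye(m), np.ones(n))   # row sums <= node capacities
A_eq = np.kron(np.ones(m), np.eye(n))   # column sums == task demands
# x[i, j] flattened; forbidden assignments are pinned to zero.
bounds = [(0, None) if allowed.flat[t] else (0, 0) for t in range(m * n)]

res = linprog(c=np.zeros(m * n), A_ub=A_ub, b_ub=b,
              A_eq=A_eq, b_eq=d, bounds=bounds)
print("consistent" if res.success else "inconsistent")
```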
The Information Environment. Education and Curriculum Series No. 3. Syllabus for IST 501.
ERIC Educational Resources Information Center
Taylor, Robert S.
This syllabus outlines a graduate-level introductory overview of the agencies, industries, and services whose primary concerns are the creation, processing, storage, distribution, and use of information; also considered are questions relating to technological impact, the role of the information professional, and cost-benefits. The course is…
A Planning Process Addresses an Organizational and Support Crisis in Information Technology.
ERIC Educational Resources Information Center
Nelson, Keith R.; Davenport, Richard W.
1996-01-01
An institutionwide strategic planning effort at Central Michigan University, in response to a need for rapid and significant changes in its information technology infrastructure, is outlined. The effort resulted in a matrix governance structure for information technology that acknowledges the value of both distributed support and a strong central…
Determining informative priors for cognitive models.
Lee, Michael D; Vanpaemel, Wolf
2018-02-01
The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. In the Bayesian approach to implementing, testing, and using cognitive models, assumptions can influence both the likelihood function of the model, usually corresponding to assumptions about psychological processes, and the prior distribution over model parameters, usually corresponding to assumptions about the psychological variables that influence those processes. The specification of the prior is unique to the Bayesian context, but often raises concerns that lead to the use of vague or non-informative priors in cognitive modeling. Sometimes the concerns stem from philosophical objections, but more often practical difficulties with how priors should be determined are the stumbling block. We survey several sources of information that can help to specify priors for cognitive models, discuss some of the methods by which this information can be formalized in a prior distribution, and identify a number of benefits of including informative priors in cognitive modeling. Our discussion is based on three illustrative cognitive models, involving memory retention, categorization, and decision making.
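A grid-based sketch of how an informative prior enters one of the illustrative settings, memory retention; the exponential retention model, the Gamma prior, and the recall data are all assumptions for illustration, not the authors' models.

```python
import numpy as np
from scipy import stats

# Exponential retention model: P(recall at lag t) = exp(-theta * t).
theta = np.linspace(0.001, 1.0, 1000)

# Informative prior: earlier retention studies suggesting theta near 0.1
# are encoded (as an assumption) by a Gamma(4, scale=1/40) density.
prior = stats.gamma.pdf(theta, a=4, scale=1 / 40)

# Hypothetical data: recall counts out of 20 trials at four lags.
lags = np.array([1, 2, 5, 10])
recalled = np.array([17, 14, 9, 4])

prob = np.exp(-np.outer(theta, lags))              # (grid, lags)
like = stats.binom.pmf(recalled, 20, prob).prod(axis=1)

post = prior * like
post /= np.trapz(post, theta)                      # normalize on grid
print(f"posterior mean theta = {np.trapz(theta * post, theta):.3f}")
```

Swapping the Gamma density for a flat prior on the same grid shows directly how much the informative prior constrains the posterior when data are sparse.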
ESUSA: US endangered species distribution file
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagy, J.; Calef, C.E.
1979-10-01
This report describes a file containing distribution data on endangered species of the United States of Federal concern pursuant to the Endangered Species Act of 1973. Included for each species are (a) the common name, (b) the scientific name, (c) the family, (d) the group (mammal, bird, etc.), (e) Fish and Wildlife Service (FWS) listing and recovery priorities, (f) the Federal legal status, (g) the geographic distribution by counties or islands, (h) Federal Register citations and (i) the sources of the information on distribution of the species. Status types are endangered, threatened, proposed, formally under review, candidate, deleted, and rejected. Distribution is by Federal Information Processing Standard (FIPS) county code and is of four types: designated critical habitat, present range, potential range, and historic range.
Method for removing atomic-model bias in macromolecular crystallography
Terwilliger, Thomas C. [Santa Fe, NM]
2006-08-01
Structure factor bias in an electron density map for an unknown crystallographic structure is minimized by using information in a first electron density map to elicit expected structure factor information. Observed structure factor amplitudes are combined with a starting set of crystallographic phases to form a first set of structure factors. A first electron density map is then derived and features of the first electron density map are identified to obtain expected distributions of electron density. Crystallographic phase probability distributions are established for possible crystallographic phases of reflection k, and the process is repeated as k is indexed through all of the plurality of reflections. An updated electron density map is derived from the crystallographic phase probability distributions for each one of the reflections. The entire process is then iterated to obtain a final set of crystallographic phases with minimum bias from known electron density maps.
How to retrieve additional information from the multiplicity distributions
NASA Astrophysics Data System (ADS)
Wilk, Grzegorz; Włodarczyk, Zbigniew
2017-01-01
Multiplicity distributions (MDs) P(N) measured in multiparticle production processes are most frequently described by the negative binomial distribution (NBD). However, with increasing collision energy some systematic discrepancies have become more and more apparent. They are usually attributed to the possible multi-source structure of the production process and described using a multi-NBD form of the MD. We investigate the possibility of keeping a single NBD but with its parameters depending on the multiplicity N. This is done by modifying the widely known clan model of particle production leading to the NBD form of P(N). This is then confronted with the approach based on the so-called cascade-stochastic formalism which is based on different types of recurrence relations defining P(N). We demonstrate that a combination of both approaches allows the retrieval of additional valuable information from the MDs, namely the oscillatory behavior of the counting statistics apparently visible in the high energy data.
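One way to see the recurrence-relation view mentioned here: for a pure NBD, the ratio g(N) = (N+1) P(N+1) / P(N) is exactly linear in N, so structure beyond a single NBD shows up as deviations (e.g. oscillations) of g(N) from a straight line. The sketch below verifies the identity for illustrative parameter values.

```python
# For a pure NBD with shape k and success probability p,
# (N+1) P(N+1) / P(N) = (k + N) (1 - p), linear in N.
import numpy as np
from scipy.stats import nbinom

k, p = 3.0, 0.4            # illustrative NBD parameters
N = np.arange(0, 20)
P = nbinom.pmf(N, k, p)

g = (N[:-1] + 1) * P[1:] / P[:-1]
print(np.allclose(g, (k + N[:-1]) * (1 - p)))   # True: exact linearity
```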
A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories
NASA Astrophysics Data System (ADS)
Brown, Christa L.
National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.
Surveillance of industrial processes with correlated parameters
White, A.M.; Gross, K.C.; Kubic, W.L.; Wigeland, R.A.
1996-12-17
A system and method for surveillance of an industrial process are disclosed. The system and method include a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer-compatible information, and a computer which executes software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The software first removes serial correlation information and then calculates Mahalanobis distribution data to carry out a probability ratio test that determines alarm conditions. 10 figs.
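The alarm logic described here, learning the normal-operation distribution and then scoring each multivariate sensor reading by its Mahalanobis distance, can be illustrated as below; the synthetic data and chi-square threshold are assumptions for illustration, not the patented procedure.

```python
# Sketch: Mahalanobis-distance alarming against a learned normal-operation
# distribution (serial correlation assumed already removed).
import numpy as np

rng = np.random.default_rng(0)
normal = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=500)

mu = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis_sq(x):
    d = x - mu
    return d @ cov_inv @ d

# Chi-square 99th percentile with 2 degrees of freedom is about 9.21.
ALARM_THRESHOLD = 9.21
reading = np.array([3.0, -2.5])
print("alarm" if mahalanobis_sq(reading) > ALARM_THRESHOLD else "normal")
```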
IRM in the Federal Government: Opinions and Reflections.
ERIC Educational Resources Information Center
Haney, Glenn P.
1989-01-01
Evaluates various aspects of federal information resources management and reviews technological changes within the Department of Agriculture to illustrate current issues and future trends in information resources management. Topics discussed include telecommunications and networking; distributed processing and field office automation; the role of…
NASA Technical Reports Server (NTRS)
Craddock, Robert A.; Golombek, Matthew; Howard, Alan D.
2000-01-01
Both the size-frequency distribution and morphometry of rock populations emplaced by a variety of geologic processes in Hawaii indicate that such information may be useful in planning future landing sites on Mars and interpreting the surface geology.
Gender differences in justice evaluations: Evidence from fMRI.
Dulebohn, James H; Davison, Robert B; Lee, Seungcheol Austin; Conlon, Donald E; McNamara, Gerry; Sarinopoulos, Issidoros C
2016-02-01
Justice research examining gender differences has yielded contrasting findings. This study enlists advanced techniques in cognitive neuroscience (fMRI) to examine gender differences in brain activation patterns in response to procedural and distributive justice manipulations. We integrate social role, information processing, justice, and neuroscience literature to posit and test for gender differences in 2 neural subsystems known to be involved in the appraisal of self-relevant events. Results indicate that the relationship between justice information processing and neural activity in areas representing these subsystems is significantly influenced by gender, with greater activation for females than males during consideration of both procedural and distributive justice information. In addition, we find evidence that gender and distributive injustice interact to influence bargaining behavior, with females rejecting ultimatum game offers more frequently than males. Results also demonstrate that activation in the ventromedial prefrontal cortex (vmPFC) and ventral striatum brain regions during procedural justice evaluation is associated with offer rejection in females, but not in males. Managerial implications based on the study's support for gender differences in justice perceptions are discussed. (c) 2016 APA, all rights reserved.
CO2 laser cutting of MDF. 2. Estimation of power distribution
NASA Astrophysics Data System (ADS)
Ng, S. L.; Lum, K. C. P.; Black, I.
2000-02-01
Part 2 of this paper details an experimentally-based method to evaluate the power distribution for both CW and PM cutting. Variations in power distribution with different cutting speeds, material thickness and pulse ratios are presented. The paper also provides information on both the cutting efficiency and absorptivity index for MDF, and comments on the beam dispersion characteristics after the cutting process.
Design considerations, architecture, and use of the Mini-Sentinel distributed data system.
Curtis, Lesley H; Weiner, Mark G; Boudreau, Denise M; Cooper, William O; Daniel, Gregory W; Nair, Vinit P; Raebel, Marsha A; Beaulieu, Nicolas U; Rosofsky, Robert; Woodworth, Tiffany S; Brown, Jeffrey S
2012-01-01
We describe the design, implementation, and use of a large, multiorganizational distributed database developed to support the Mini-Sentinel Pilot Program of the US Food and Drug Administration (FDA). As envisioned by the US FDA, this implementation will inform and facilitate the development of an active surveillance system for monitoring the safety of medical products (drugs, biologics, and devices) in the USA. A common data model was designed to address the priorities of the Mini-Sentinel Pilot and to leverage the experience and data of participating organizations and data partners. A review of existing common data models informed the process. Each participating organization designed a process to extract, transform, and load its source data, applying the common data model to create the Mini-Sentinel Distributed Database. Transformed data were characterized and evaluated using a series of programs developed centrally and executed locally by participating organizations. A secure communications portal was designed to facilitate queries of the Mini-Sentinel Distributed Database and transfer of confidential data, analytic tools were developed to facilitate rapid response to common questions, and distributed querying software was implemented to facilitate rapid querying of summary data. As of July 2011, information on 99,260,976 health plan members was included in the Mini-Sentinel Distributed Database. The database includes 316,009,067 person-years of observation time, with members contributing, on average, 27.0 months of observation time. All data partners have successfully executed distributed code and returned findings to the Mini-Sentinel Operations Center. This work demonstrates the feasibility of building a large, multiorganizational distributed data system in which organizations retain possession of their data that are used in an active surveillance system. Copyright © 2012 John Wiley & Sons, Ltd.
Design and development of a medical big data processing system based on Hadoop.
Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song
2015-03-01
Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and is of great significance for hospital information systems that are designing and expanding their services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data on standalone machines. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel with large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
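The MapReduce style of processing described here can be illustrated with a minimal Hadoop Streaming job: a mapper and reducer counting actions per hospital-system user from log lines. The log format, field layout, and job are assumptions for illustration, not the paper's actual analytics.

```python
# Sketch: Hadoop Streaming mapper + reducer in one file.
import sys

def mapper():
    for line in sys.stdin:                 # e.g. "2015-01-07\tuser42\tlogin"
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 2:
            print(f"{fields[1]}\t1")       # emit (user, 1)

def reducer():
    current, count = None, 0
    for line in sys.stdin:                 # Hadoop delivers keys sorted
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    (mapper if sys.argv[1] == "map" else reducer)()
```

A job of this shape would be launched with Hadoop Streaming, e.g. `hadoop jar hadoop-streaming.jar -input logs/ -output counts/ -mapper "python job.py map" -reducer "python job.py reduce"` (paths and file names illustrative).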
A Framework for Distributed Problem Solving
NASA Astrophysics Data System (ADS)
Leone, Joseph; Shin, Don G.
1989-03-01
This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability to make associations between vast amounts of related concepts, sort out the combined results, and promote the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.
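As a toy rendering of the two information-flow mechanisms, the sketch below amplifies an agenda into subtasks and aggregates the reported results into a resolution; the concrete task and scoring are invented for illustration.

```python
# Sketch: amplification (push subtasks down) and aggregation (promote the
# most plausible result back up) in a two-level hierarchy.
def amplify(agenda, units):
    """Expand an agenda into one more-specific subtask per processing unit."""
    return [(unit, f"{agenda}/{unit}") for unit in units]

def aggregate(reports):
    """Combine subtask results into a resolution: keep the most plausible."""
    return max(reports, key=lambda r: r[1])

units = ["unit-a", "unit-b", "unit-c"]
subtasks = amplify("recall:birds-that-cannot-fly", units)

# Each unit would normally work independently; here we fake plausibility scores.
reports = [(task, hash(task) % 100) for _, task in subtasks]
print("resolution:", aggregate(reports))
```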
DOE Office of Scientific and Technical Information (OSTI.GOV)
ROOT, R.W.
1999-05-18
This guide provides the Tank Waste Remediation System Privatization Infrastructure Program management with processes and requirements to appropriately control information and documents in accordance with the Tank Waste Remediation System Configuration Management Plan (Vann 1998b). This includes documents and information created by the program, as well as non-program generated materials submitted to the project. It provides appropriate approval/control, distribution and filing systems.
Lyceum: A Multi-Protocol Digital Library Gateway
NASA Technical Reports Server (NTRS)
Maa, Ming-Hokng; Nelson, Michael L.; Esler, Sandra L.
1997-01-01
Lyceum is a prototype scalable query gateway that provides a logically central interface to multi-protocol and physically distributed, digital libraries of scientific and technical information. Lyceum processes queries to multiple syntactically distinct search engines used by various distributed information servers from a single logically central interface without modification of the remote search engines. A working prototype (http://www.larc.nasa.gov/lyceum/) demonstrates the capabilities, potentials, and advantages of this type of meta-search engine by providing access to over 50 servers covering over 20 disciplines.
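The meta-search pattern that Lyceum exemplifies, a single logically central interface fanning a query out to syntactically distinct backends without modifying them, might be sketched like this; the backend names and query syntaxes are hypothetical.

```python
# Sketch: translate one query per backend syntax and fan out concurrently.
from concurrent.futures import ThreadPoolExecutor

BACKENDS = {
    "wais-style":  lambda q: f"find {q}",
    "z3950-style": lambda q: f'@attr 1=4 "{q}"',
    "http-style":  lambda q: f"?q={q.replace(' ', '+')}",
}

def query_backend(name, translated):
    # A real gateway would contact the remote search engine here.
    return name, f"results for [{translated}]"

def meta_search(query):
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(query_backend, name, translate(query))
                   for name, translate in BACKENDS.items()]
        return [f.result() for f in futures]

print(meta_search("hypersonic boundary layers"))
```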
Coherent-state information concentration and purification in atomic memory
NASA Astrophysics Data System (ADS)
Herec, Jiří; Filip, Radim
2006-12-01
We propose a feasible method of coherent-state information concentration and purification utilizing quantum memory. The method allows us to optimally concentrate and purify information carried by many noisy copies of an unknown coherent state (randomly distributed in time) to a single copy. Thus nonclassical resources and operations can be saved, if we compare information processing with many noisy copies and a single copy with concentrated and purified information.
Study on parallel and distributed management of RS data based on spatial database
NASA Astrophysics Data System (ADS)
Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin
2009-10-01
With the rapid development of earth-observing technology, RS image data storage, management and information publication have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, a background server can hardly handle the heavy processing of the great volume of RS data stored at different nodes in a distributed environment; a tough burden is placed on the background server. Second, there is no unique, standard and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. Facing these two problems, the paper puts forward a framework for parallel and distributed RS image data management and storage. The system aims at an RS data information system based on a parallel background server and a distributed data management system. Toward these two goals, this paper studies the following key techniques and draws some instructive conclusions. The paper puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands and periods is achieved. For data storage, RS data are not divided into binary large objects stored in a relational database system; instead, the data are reconstructed through the above solid index mechanism, and a logical image database for the RS image data files is constructed. For the system architecture, this paper sets up a framework based on a parallel server of several commodity computers. Under this framework, the background process is divided into two parts, the common Web process and the parallel process.
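A minimal rendering of such a solid index might look like the following; the dataclass fields mirror the four index dimensions named above, while the catalog mapping and file-naming scheme are purely illustrative assumptions.

```python
# Sketch: a "Pyramid, Block, Layer, Epoch" solid index mapping an index tuple
# to the file holding that tile of a multi-resolution, multi-band RS image.
from dataclasses import dataclass

@dataclass(frozen=True)
class SolidIndex:
    pyramid: int   # resolution level in the image pyramid
    block: int     # spatial tile number at that level
    layer: int     # spectral band
    epoch: int     # acquisition period

catalog: dict[SolidIndex, str] = {}
catalog[SolidIndex(pyramid=2, block=17, layer=3, epoch=200907)] = "tiles/p2_b17_l3_200907.img"

key = SolidIndex(2, 17, 3, 200907)
print(catalog.get(key, "tile not ingested"))
```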
NASA Astrophysics Data System (ADS)
Jiang, Xue-Qin; Huang, Peng; Huang, Duan; Lin, Dakai; Zeng, Guihua
2017-02-01
Achieving information-theoretic security with practical complexity is of great interest to continuous-variable quantum key distribution in the postprocessing procedure. In this paper, we propose a reconciliation scheme based on punctured low-density parity-check (LDPC) codes. Compared to the well-known multidimensional reconciliation scheme, the present scheme has lower time complexity. Especially when the chosen punctured LDPC code achieves the Shannon capacity, the proposed reconciliation scheme can remove the information that has been leaked to an eavesdropper in the quantum transmission phase. Therefore, there is no information leaked to the eavesdropper after the reconciliation stage. This indicates that the privacy amplification algorithm of the postprocessing procedure is no longer needed after the reconciliation process. These features lead to a higher secret key rate, optimal performance, and availability for the involved quantum key distribution scheme.
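Not the punctured-LDPC construction itself, but the elementary operation underneath any LDPC reconciliation step can be shown in a few lines: the sender discloses the syndrome of her key string against a parity-check matrix, and the receiver uses it to correct his copy. The toy matrix and key below are illustrative.

```python
# Sketch: syndrome computation, the basic disclosure step of LDPC reconciliation.
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],      # toy parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
x = np.array([1, 0, 1, 1, 0, 1])       # Alice's sifted key bits

syndrome = H @ x % 2
print("disclosed syndrome:", syndrome)  # each disclosed bit leaks information,
                                        # which puncturing is designed to offset
```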
Inferring epidemiological parameters from phylogenetic information for the HIV-1 epidemic among MSM
NASA Astrophysics Data System (ADS)
Quax, Rick; van de Vijver, David A. M. C.; Frentz, Dineke; Sloot, Peter M. A.
2013-09-01
The HIV-1 epidemic in Europe is primarily sustained by a dynamic topology of sexual interactions among MSM, who have individual immune systems and behavior. This epidemiological process shapes the phylogeny of the virus population. Both epidemic modeling and phylogenetics have a long history; however, it remains difficult to use phylogenetic data to infer epidemiological parameters such as the structure of the sexual network and the per-act infectiousness, because phylogenetic data are necessarily incomplete and ambiguous. Here we show, using detailed numerical experiments, that the cluster-size distribution indeed contains information about epidemiological parameters. We simulate the HIV epidemic among MSM many times using the Monte Carlo method, with all parameter values and their ranges taken from the literature. For each simulation and the corresponding set of parameter values we calculate the likelihood of reproducing an observed cluster-size distribution. The result is an estimated likelihood distribution of all parameters from the phylogenetic data, in particular the structure of the sexual network, the per-act infectiousness, and the risk-behavior reduction upon diagnosis. These likelihood distributions encode the knowledge provided by the observed cluster-size distribution, which we quantify using information theory. Our work suggests that the growing body of genetic data of patients can be exploited to understand the underlying epidemiological process.
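The inference loop described here (simulate, compare cluster-size distributions, score parameters) can be caricatured in a few lines. The geometric "simulator" stub and the smoothed multinomial likelihood below are placeholder assumptions standing in for the detailed MSM network simulation.

```python
# Sketch: grid-based Monte Carlo likelihood of a cluster-size distribution.
import numpy as np

rng = np.random.default_rng(1)

def simulate_cluster_sizes(infectiousness, n_clusters=200):
    # Stand-in for the detailed epidemic simulation.
    return rng.geometric(p=1.0 - infectiousness, size=n_clusters)

observed = np.bincount(rng.geometric(p=0.6, size=200), minlength=12)[:12]

def log_likelihood(infectiousness):
    sim = np.bincount(simulate_cluster_sizes(infectiousness), minlength=12)[:12]
    probs = (sim + 1) / (sim + 1).sum()          # smoothed simulated distribution
    return float((observed * np.log(probs)).sum())

grid = np.linspace(0.1, 0.9, 9)
scores = [log_likelihood(v) for v in grid]
print("best infectiousness on grid:", grid[int(np.argmax(scores))])
```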
Digital Image Processing in Private Industry.
ERIC Educational Resources Information Center
Moore, Connie
1986-01-01
Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…
A study on building data warehouse of hospital information system.
Li, Ping; Wu, Tao; Chen, Mu; Zhou, Bin; Xu, Wei-guo
2011-08-01
Existing hospital information systems with simple statistical functions cannot meet current management needs. It is well known that hospital resources are distributed with private property rights among hospitals, such as in the case of the regional coordination of medical services. In this study, to integrate and make full use of medical data effectively, we propose a data warehouse modeling method for the hospital information system. The method can also be employed for a distributed-hospital medical service system. To ensure that hospital information supports the diverse needs of health care, the framework of the hospital information system has three layers: datacenter layer, system-function layer, and user-interface layer. This paper discusses the role of a data warehouse management system in handling hospital information, from the establishment of the data theme to the design of a data model to the establishment of a data warehouse. Online analytical processing tools assist user-friendly multidimensional analysis from a number of different angles to extract the required data and information. Use of the data warehouse improves online analytical processing and mitigates deficiencies in the decision support system. The hospital information system based on a data warehouse effectively employs statistical analysis and data mining technology to handle massive quantities of historical data, and summarizes clinical and hospital information for decision making. This paper proposes the use of a data warehouse for a hospital information system, specifically a data warehouse built around hospital information themes, covering the determination of dimensions, data modeling, and so on. The processing of patient information is given as an example that demonstrates the usefulness of this method in the case of hospital information management. Data warehouse technology is an evolving technology, and more and more decision support information extracted by data mining and with decision-making technology is required for further research.
Åhlfeldt, Rose-Mharie; Persson, Anne; Rexhepi, Hanife; Wåhlander, Kalle
2016-12-01
This article presents and illustrates the main features of a proposed process-oriented approach for patient information distribution in future health care information systems, by using a prototype of a process support system. The development of the prototype was based on the Visuera method, which includes five defined steps. The results indicate that a visualized prototype is a suitable tool for illustrating both the opportunities and constraints of future ideas and solutions in e-Health. The main challenges for developing and implementing a fully functional process support system concern both technical and organizational/management aspects. © The Author(s) 2015.
ATLAS EventIndex monitoring system using the Kibana analytics and visualization platform
NASA Astrophysics Data System (ADS)
Barberis, D.; Cárdenas Zárate, S. E.; Favareto, A.; Fernandez Casani, A.; Gallas, E. J.; Garcia Montoro, C.; Gonzalez de la Hoz, S.; Hrivnac, J.; Malon, D.; Prokoshin, F.; Salt, J.; Sanchez, J.; Toebbicke, R.; Yuan, R.; ATLAS Collaboration
2016-10-01
The ATLAS EventIndex is a data catalogue system that stores event-related metadata for all (real and simulated) ATLAS events, on all processing stages. As it consists of different components that depend on other applications (such as distributed storage, and different sources of information) we need to monitor the conditions of many heterogeneous subsystems, to make sure everything is working correctly. This paper describes how we gather information about the EventIndex components and related subsystems: the Producer-Consumer architecture for data collection, health parameters from the servers that run EventIndex components, EventIndex web interface status, and the Hadoop infrastructure that stores EventIndex data. This information is collected, processed, and then displayed using CERN service monitoring software based on the Kibana analytic and visualization package, provided by CERN IT Department. EventIndex monitoring is used both by the EventIndex team and ATLAS Distributed Computing shifts crew.
Shannon information entropy in heavy-ion collisions
NASA Astrophysics Data System (ADS)
Ma, Chun-Wang; Ma, Yu-Gang
2018-03-01
The general idea of information entropy provided by C.E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information entropy essentially quantifies the information carried by a quantity with a specific distribution, and information-entropy-based methods have been developed extensively in many scientific areas including physics. The dynamical properties of the heavy-ion collision (HIC) process make it difficult and complex to study nuclear matter and its evolution, for which Shannon information entropy theory can provide new methods and observables to understand the physical phenomena both theoretically and experimentally. To better understand HIC processes, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamics models, and statistical models, are briefly introduced. Typical applications of Shannon information theory in HICs are collected, covering the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate-mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on the key questions being pursued. It is suggested to further develop information entropy methods in nuclear reaction models, as well as new analysis methods to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
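The basic quantity behind all of these applications is easy to state concretely. The sketch below computes the Shannon entropy of a synthetic multiplicity distribution; the bin counts are invented for illustration.

```python
# Sketch: Shannon entropy of an observed (synthetic) multiplicity distribution.
import numpy as np

counts = np.array([5, 30, 120, 260, 310, 180, 70, 20, 5])  # events per bin
p = counts / counts.sum()

nz = p[p > 0]                       # convention: 0 * log(0) := 0
H = -np.sum(nz * np.log(nz))        # natural-log (nats) entropy
print(f"S = {H:.3f} nats")
```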
NASA Astrophysics Data System (ADS)
Klus, Jakub; Pořízka, Pavel; Prochazka, David; Mikysek, Petr; Novotný, Jan; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef
2017-05-01
This paper presents a novel approach for processing the spectral information obtained from high-resolution elemental mapping performed by means of Laser-Induced Breakdown Spectroscopy. The proposed methodology is aimed at the description of possible elemental associations within a heterogeneous sample. High-resolution elemental mapping provides a large number of measurements. Moreover, a typical laser-induced plasma spectrum consists of several thousand spectral variables. Analysis of heterogeneous samples, where valuable information is hidden in a limited fraction of the sample mass, requires special treatment. The sample under study is a sandstone-hosted uranium ore that shows irregular distribution of ore elements such as zirconium, titanium, uranium and niobium. The presented processing methodology shows how to reduce the dimensionality of the data while retaining the spectral information by utilizing self-organizing maps (SOM). The spectral information from SOM is processed further to detect either simultaneous or isolated presence of elements. Conclusions suggested by SOM are in good agreement with geological studies of mineralization phases performed at the deposit. Even deeper investigation of the SOM results enables discrimination of interesting measurements and reveals new possibilities in the visualization of chemical mapping information. The suggested approach improves the description of elemental associations in mineral phases, which is crucial for the mining industry.
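For intuition, a compact self-organizing map loop of the general kind used for such dimensionality reduction is sketched below; the grid size, learning schedule, and random stand-in "spectra" are illustrative assumptions.

```python
# Sketch: training a small SOM so each spectrum maps to a best-matching unit.
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.random((1000, 64))            # 1000 spectra x 64 spectral variables

grid = 6                                    # 6x6 SOM
weights = rng.random((grid, grid, 64))
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"), axis=-1)

for t, x in enumerate(spectra):
    lr = 0.5 * (1 - t / len(spectra))                 # decaying learning rate
    sigma = 3.0 * (1 - t / len(spectra)) + 0.5        # decaying neighborhood width
    bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (grid, grid))
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))             # neighborhood function
    weights += lr * h[..., None] * (x - weights)

# Each measurement is then represented by its best-matching unit on the map.
print("BMU of first spectrum:", np.unravel_index(
    np.argmin(((weights - spectra[0]) ** 2).sum(-1)), (grid, grid)))
```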
ERIC Educational Resources Information Center
Flotsam and Jetsam: A Newsletter for Massachusetts Marine Educators, 1985
1985-01-01
Presents factual information on penguins using an outline format. Includes descriptions of physical characteristics, behavioral mechanisms, geographical distribution, and physiological processes. Provides separate bibliographies for teachers and students. (ML)
Distribution and Aggregate Thickness of Salt Deposits of the United States
The map shows the distribution and aggregate thickness of salt deposits of the United States. This information is from contour map sheets, scanned and processed for use in a global mineral resource assessment, produced by the U.S. Geological Survey. It is used here to provide a geospatial context to the distribution of rock-salt deposits in the US. It is useful in illustrating sources of chlorides.
Multiattribute Fixed-State Utility Assessment.
1981-03-27
Approximations to the moments are obtained using the companion distribution discussed in Section 3; these approximations, together with the theory of conditional expected utility and the modelling of the utilities, are presented in Appendix A. [Remainder of this record is OCR-garbled; the legible reference is Slovic, P., "From Shakespeare to Simon: Speculation -- and some evidence -- about man's ability to process information."]
The Use of Model-Driven Methodologies and Processes in Aegis Development
2011-05-17
Jamie.Durbin@lmco.com; Christopher.M.Thompson@lmco.com. May 17, 2011. Distribution Statement A: Approved for public release; distribution is unlimited. Presented at the 23rd Systems and Software…
The scaling of geographic ranges: implications for species distribution models
Yackulic, Charles B.; Ginsberg, Joshua R.
2016-01-01
There is a need for timely science to inform policy and management decisions; however, we must also strive to provide predictions that best reflect our understanding of ecological systems. Species distributions evolve through time and reflect responses to environmental conditions that are mediated through individual and population processes. Species distribution models that reflect this understanding, and explicitly model dynamics, are likely to give more accurate predictions.
Foo, Brian; van der Schaar, Mihaela
2010-11-01
In this paper, we discuss distributed optimization techniques for configuring classifiers in a real-time, informationally-distributed stream mining system. Due to the large volume of streaming data, stream mining systems must often cope with overload, which can lead to poor performance and intolerable processing delay for real-time applications. Furthermore, optimizing over an entire system of classifiers is a difficult task since changing the filtering process at one classifier can impact both the feature values of data arriving at classifiers further downstream and thus, the classification performance achieved by an ensemble of classifiers, as well as the end-to-end processing delay. To address this problem, this paper makes three main contributions: 1) Based on classification and queuing theoretic models, we propose a utility metric that captures both the performance and the delay of a binary filtering classifier system. 2) We introduce a low-complexity framework for estimating the system utility by observing, estimating, and/or exchanging parameters between the inter-related classifiers deployed across the system. 3) We provide distributed algorithms to reconfigure the system, and analyze the algorithms based on their convergence properties, optimality, information exchange overhead, and rate of adaptation to non-stationary data sources. We provide results using different video classifier systems.
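The kind of performance/delay utility described can be made concrete with a toy operating-point sweep. In the sketch below the M/M/1 mean-sojourn delay term, the linear quality term, and all numbers are simplifying assumptions for illustration, not the paper's actual metric.

```python
# Sketch: utility of a binary filtering classifier = detection quality minus a
# weighted queuing-delay penalty at a given operating point (tpr, fpr).
def classifier_utility(tpr, fpr, pos_rate, arrival_rate, service_rate, delay_weight=0.1):
    # Detection quality of the binary filter at this operating point.
    quality = pos_rate * tpr - (1 - pos_rate) * fpr
    # Only forwarded traffic loads the downstream queue.
    forwarded = arrival_rate * (pos_rate * tpr + (1 - pos_rate) * fpr)
    if forwarded >= service_rate:
        return float("-inf")                      # overload: unbounded delay
    delay = 1.0 / (service_rate - forwarded)      # M/M/1 mean sojourn time
    return quality - delay_weight * delay

# Sweeping operating points trades off accuracy against congestion.
for tpr, fpr in [(0.95, 0.30), (0.85, 0.10), (0.70, 0.02)]:
    print(tpr, fpr, round(classifier_utility(tpr, fpr, 0.2, 80.0, 50.0), 3))
```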
How Leadership for an ICT Reform Is Distributed within a School
ERIC Educational Resources Information Center
Seong, David Ng Foo; Ho, Jeanne Marie
2012-01-01
Purpose: The purpose of this paper is to examine the process of information communication technology (ICT) reform in a government school in Singapore. The focus is on the distributed leadership actions performed by various individuals, and how the multiple leaders and their leadership practices interacted with one another.…
Very Large Scale Distributed Information Processing Systems
1991-09-27
USENIX Conference Proceedings, pp. 31-43. USENIX, February 1988. [KLA90] Michael L. Kazar, Bruce W. Leverett, Owen T. Anderson, Vasilis Apostolides, Beth… [Remainder of this record is OCR-garbled; Figure 2 apparently depicted a distributed database system.]
An Investigation of Data Overload in Team-Based Distributed Cognition Systems
ERIC Educational Resources Information Center
Hellar, David Benjamin
2009-01-01
The modern military command center is a hybrid system of computer automated surveillance and human oriented decision making. In these distributed cognition systems, data overload refers simultaneously to the glut of raw data processed by information technology systems and the dearth of actionable knowledge useful to human decision makers.…
Technology, Privacy and the Democratic Process.
ERIC Educational Resources Information Center
Gandy, Oscar H., Jr.; Simmons, Charles E.
Through a review of two accelerating trends in the technology of producing and distributing information and entertainment, this paper argues that the promises of "the information economy" and the "television of abundance" bring not the emancipation of diversity and access, but the rapid disintegration of an already weakened…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
DEPARTMENT OF HEALTH AND HUMAN SERVICES, Food and Drug Administration [Docket No. FDA-2013-N-1119]. Agency Information Collection Activities; Proposed Collection; Comment Request; Food Canning… The collection concerns reports of process deviations or contamination with microorganisms where any lot of the food has entered distribution…
Integrating Conceptual Knowledge within and across Representational Modalities
ERIC Educational Resources Information Center
McNorgan, Chris; Reid, Jackie; McRae, Ken
2011-01-01
Research suggests that concepts are distributed across brain regions specialized for processing information from different sensorimotor modalities. Multimodal semantic models fall into one of two broad classes differentiated by the assumed hierarchy of convergence zones over which information is integrated. In shallow models, communication within-…
Multi-Connection Pattern Analysis: Decoding the representational content of neural communication.
Li, Yuanning; Richardson, Robert Mark; Ghuman, Avniel Singh
2017-11-15
The lack of multivariate methods for decoding the representational content of interregional neural communication has left it difficult to know what information is represented in distributed brain circuit interactions. Here we present Multi-Connection Pattern Analysis (MCPA), which works by learning mappings between the activity patterns of the populations as a factor of the information being processed. These maps are used to predict the activity from one neural population based on the activity from the other population. Successful MCPA-based decoding indicates the involvement of distributed computational processing and provides a framework for probing the representational structure of the interaction. Simulations demonstrate the efficacy of MCPA in realistic circumstances. In addition, we demonstrate that MCPA can be applied to different signal modalities to evaluate a variety of hypotheses associated with information coding in neural communications. We apply MCPA to fMRI and human intracranial electrophysiological data to provide a proof-of-concept of the utility of this method for decoding individual natural images and faces in functional connectivity data. We further use a MCPA-based representational similarity analysis to illustrate how MCPA may be used to test computational models of information transfer among regions of the visual processing stream. Thus, MCPA can be used to assess the information represented in the coupled activity of interacting neural circuits and probe the underlying principles of information transformation between regions. Copyright © 2017 Elsevier Inc. All rights reserved.
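Schematically, the core MCPA idea is to learn condition-dependent mappings between two populations' activity patterns and to decode by which mapping best predicts the coupled activity. The synthetic data and the ridge-regression map below are assumptions for illustration; the published method defines its own mapping and validation procedure.

```python
# Sketch: condition-specific inter-population maps, then map-based decoding.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_trials, dim_a, dim_b = 200, 30, 25

W_faces = rng.normal(size=(dim_a, dim_b))      # condition-specific couplings
W_houses = rng.normal(size=(dim_a, dim_b))

A = rng.normal(size=(n_trials, dim_a))
labels = rng.integers(0, 2, n_trials)          # 0 = face trial, 1 = house trial
B = np.where(labels[:, None] == 0, A @ W_faces, A @ W_houses)
B += 0.5 * rng.normal(size=B.shape)

# Fit one inter-regional map per condition on the first half of trials.
train, test = slice(0, 100), slice(100, 200)
maps = {c: Ridge(alpha=1.0).fit(A[train][labels[train] == c],
                                B[train][labels[train] == c]) for c in (0, 1)}

# Decode each test trial by which condition's map predicts B best.
pred = [min((0, 1), key=lambda c: ((maps[c].predict(a[None]) - b) ** 2).sum())
        for a, b in zip(A[test], B[test])]
print("decoding accuracy:", np.mean(np.array(pred) == labels[test]))
```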
Potential for Remotely Sensed Soil Moisture Data in Hydrologic Modeling
NASA Technical Reports Server (NTRS)
Engman, Edwin T.
1997-01-01
Many hydrologic processes display a unique signature that is detectable with microwave remote sensing. These signatures are in the form of the spatial and temporal distributions of surface soil moisture and portray the spatial heterogeneity of hydrologic processes and properties that one encounters in drainage basins. The hydrologic processes that may be detected include ground water recharge and discharge zones, storm runoff contributing areas, regions of potential and less than potential ET, and information about the hydrologic properties of soils and heterogeneity of hydrologic parameters. Microwave remote sensing has the potential to detect these signatures within a basin in the form of volumetric soil moisture measurements in the top few cm. These signatures should provide information on how and where to apply soil physical parameters in distributed and lumped parameter models and how to subdivide drainage basins into hydrologically similar sub-basins.
A parallel-processing approach to computing for the geographic sciences
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Haga, Jim; Maddox, Brian; Feller, Mark
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting research into various areas, such as advanced computer architecture, algorithms to meet the processing needs for real-time image and data processing, the creation of custom datasets from seamless source data, rapid turn-around of products for emergency response, and support for computationally intense spatial and temporal modeling.
Temporal coding of reward-guided choice in the posterior parietal cortex
Hawellek, David J.; Wong, Yan T.; Pesaran, Bijan
2016-01-01
Making a decision involves computations across distributed cortical and subcortical networks. How such distributed processing is performed remains unclear. We test how the encoding of choice in a key decision-making node, the posterior parietal cortex (PPC), depends on the temporal structure of the surrounding population activity. We recorded spiking and local field potential (LFP) activity in the PPC while two rhesus macaques performed a decision-making task. We quantified the mutual information that neurons carried about an upcoming choice and its dependence on LFP activity. The spiking of PPC neurons was correlated with LFP phases at three distinct time scales in the theta, beta, and gamma frequency bands. Importantly, activity at these time scales encoded upcoming decisions differently. Choice information contained in neural firing varied with the phase of beta and gamma activity. For gamma activity, maximum choice information occurred at the same phase as the maximum spike count. However, for beta activity, choice information and spike count were greatest at different phases. In contrast, theta activity did not modulate the encoding properties of PPC units directly but was correlated with beta and gamma activity through cross-frequency coupling. We propose that the relative timing of local spiking and choice information reveals temporal reference frames for computations in either local or large-scale decision networks. Differences between the timing of task information and activity patterns may be a general signature of distributed processing across large-scale networks. PMID:27821752
Tesoriero, Ricardo; Gallud Lazaro, Jose A; Altalhi, Abdulrahman H
2017-02-01
Improve the quantity and quality of information obtained from traditional Loewenstein Occupational Therapy Cognitive Assessment Battery systems to monitor the evolution of patients' rehabilitation process as well as to compare different rehabilitation therapies. The system replaces traditional artefacts with virtual versions of them to take advantage of cutting edge interaction technology. The system is defined as a Distributed User Interface (DUI) supported by a display ecosystem, including mobile devices as well as multi-touch surfaces. Due to the heterogeneity of the devices involved in the system, the software technology is based on a client-server architecture using the Web as the software platform. The system provides therapists with information that is not available (or it is very difficult to gather) using traditional technologies (i.e. response time measurements, object tracking, information storage and retrieval facilities, etc.). The use of DUIs allows therapists to gather information that is unavailable using traditional assessment methods as well as adapt the system to patients' profile to increase the range of patients that are able to take this assessment. Implications for Rehabilitation Using a Distributed User Interface environment to carry out LOTCAs improves the quality of the information gathered during the rehabilitation assessment. This system captures physical data regarding patient's interaction during the assessment to improve the rehabilitation process analysis. Allows professionals to adapt the assessment procedure to create different versions according to patients' profile. Improves the availability of patients' profile information to therapists to adapt the assessment procedure.
Analysis of fault-tolerant neurocontrol architectures
NASA Technical Reports Server (NTRS)
Troudet, T.; Merrill, W.
1992-01-01
The fault-tolerance of analog parallel distributed implementations of a multivariable aircraft neurocontroller is analyzed by simulating weight and neuron failures in a simplified scheme of analog processing based on the functional architecture of the ETANN chip (Electrically Trainable Artificial Neural Network). The neural information processing is found to be only partially distributed throughout the set of weights of the neurocontroller synthesized with the backpropagation algorithm. Although the degree of distribution of the neural processing, and consequently the fault-tolerance of the neurocontroller, could be enhanced using Locally Distributed Weight and Neuron Approaches, a satisfactory level of fault-tolerance could only be obtained by retraining the degraded VLSI neurocontroller. The possibility of maintaining neurocontrol performance and stability in the presence of single weight or neuron failures was demonstrated through an automated retraining procedure of the neurocontroller based on a pre-programmed choice and sequence of the training parameters.
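A small-scale analogue of this failure analysis can be sketched as follows: train a toy network, zero out randomly chosen weights to emulate stuck-at faults, then retrain with the failed weights clamped. The network size, task, and training details are illustrative assumptions, not the ETANN-based setup of the paper.

```python
# Sketch: weight-failure degradation and recovery by retraining.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 4))
y = np.sin(X.sum(axis=1, keepdims=True))       # toy plant response

def forward(W1, W2, X):
    return np.tanh(X @ W1) @ W2

def train(W1, W2, epochs=300, lr=0.05, dead=None):
    for _ in range(epochs):                    # plain backprop on squared error
        h = np.tanh(X @ W1)
        err = h @ W2 - y
        W2 -= lr * h.T @ err / len(X)
        W1 -= lr * X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(X)
        if dead is not None:
            W1.flat[dead] = 0.0                # failed weights stay failed
    return W1, W2

W1, W2 = train(rng.normal(0, 0.5, (4, 16)), rng.normal(0, 0.5, (16, 1)))
baseline = np.mean((forward(W1, W2, X) - y) ** 2)

dead = rng.choice(W1.size, size=6, replace=False)
W1_faulty = W1.copy()
W1_faulty.flat[dead] = 0.0                     # simulate stuck-at-zero weights
degraded = np.mean((forward(W1_faulty, W2, X) - y) ** 2)

W1_r, W2_r = train(W1_faulty.copy(), W2.copy(), dead=dead)
recovered = np.mean((forward(W1_r, W2_r, X) - y) ** 2)
print(f"mse: baseline={baseline:.4f} degraded={degraded:.4f} retrained={recovered:.4f}")
```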
NASA Astrophysics Data System (ADS)
Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.
2014-12-01
The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real-time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time stamped storm report photographs utilizing a mobile phone application to NWS social media weather forecast office pages has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, processing and distribution system and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) a processing and distribution software and hardware system, and iii) AWIPS-II data, exchange and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network bandwidth manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times and directions by hour, similar to surface observations. Hovering on individual PSRs would reveal photo thumbnails and clicking on them would display the full resolution photograph. Here, we present initial NWS forecaster feedback received from social media posted PSRs, motivating the possible advantages of PSRs within AWIPS-II, the details of developing and implementing a PSR system, and possible future applications beyond severe weather reports and AWIPS-II.
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
Enhanced intelligence through optimized TCPED concepts for airborne ISR
NASA Astrophysics Data System (ADS)
Spitzer, M.; Kappes, E.; Böker, D.
2012-06-01
Current multinational operations show an increased demand for high-quality actionable intelligence for different operational levels and users. In order to achieve sufficient availability, quality and reliability of information, various ISR assets are orchestrated within operational theatres. Airborne Intelligence, Surveillance and Reconnaissance (ISR) assets in particular provide, due to their endurance, non-intrusiveness, robustness, wide spectrum of sensors and flexibility to mission changes, significant intelligence coverage of areas of interest. An efficient and balanced utilization of airborne ISR assets calls for advanced concepts for the entire ISR process framework, including Tasking, Collection, Processing, Exploitation and Dissemination (TCPED). Beyond this, the employment of current visualization concepts, shared information bases and information customer profiles, as well as an adequate combination of ISR sensors with different information ages and dynamic (online) retasking process elements, enables the optimization of interlinked TCPED processes towards higher process robustness, shorter process duration, more flexibility between ISR missions and, finally, adequate "entry points" for information requirements from operational users and commands. In addition, relevant trade-offs of distributed and dynamic TCPED processes are examined and future trends are depicted.
NASA Astrophysics Data System (ADS)
Kirst, Christoph
It is astonishing how the sub-parts of a brain co-act to produce coherent behavior. What are the mechanisms that coordinate information processing and communication, and how can they be changed flexibly in order to cope with variable contexts? Here we show that when information is encoded in the deviations around a collective dynamical reference state of a recurrent network, the propagation of these fluctuations depends strongly on precisely this underlying reference. Information here 'surfs' on top of the collective dynamics, and switching between states enables fast and flexible rerouting of information. This in turn affects local processing and consequently the global reference dynamics, which re-regulate the distribution of information. This provides a generic mechanism for self-organized information processing, as we demonstrate with an oscillatory Hopfield network that performs contextual pattern recognition. Deep neural networks have proven to be very successful recently. Here we show that generating information channels via collective reference dynamics can effectively compress a deep multi-layer architecture into a single layer, making this mechanism a promising candidate for the organization of information processing in biological neuronal networks.
Spectrum Management Guidelines for National and Service Test and Training Ranges
2017-07-12
Acronyms defined in the document include: GPS (Global Positioning System), ISM (Installation Spectrum Manager), JTIDS (Joint Tactical Information Distribution System), KMR (Kwajalein Missile Range)…, UAV (unmanned aerial vehicle), US&P (United States and Possessions). … The guidelines cover frequency deconfliction processes. The AFC will inform the range or center commander and the Installation Spectrum Manager (ISM) at the
Infant Joint Attention, Neural Networks and Social Cognition
Mundy, Peter; Jarrold, William
2010-01-01
Neural network models of attention can provide a unifying approach to the study of human cognitive and emotional development (Posner & Rothbart, 2007). In this paper we argue that a neural networks approach to the infant development of joint attention can inform our understanding of the nature of human social learning, symbolic thought processes and social cognition. At its most basic, joint attention involves the capacity to coordinate one's own visual attention with that of another person. We propose that joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one's own attention and the attention of other people. Infant practice with joint attention is both a consequence and organizer of the development of a distributed and integrated brain network involving frontal and parietal cortical systems. This executive distributed network first serves to regulate the capacity of infants to respond to and direct the overt behavior of other people in order to share experience with others through the social coordination of visual attention. In this paper we describe this parallel and distributed neural network model of joint attention development and discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. We also propose that with development joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs the executive joint attention network makes vital contributions to the development of human symbolic thinking and social cognition. PMID:20884172
Universities--Drivers for Regional Innovation Culture and Competitiveness
ERIC Educational Resources Information Center
Muresan, Mihaela; Gogu, Emilia
2010-01-01
The actual infrastructure of the information society sustains the globalization trend and increases the importance of the information and knowledge. The development of the knowledge society is the direct consequence of the mix of economic, social and cultural processes, which involve the knowledge creation and its equitable distribution, access…
A gossip based information fusion protocol for distributed frequent itemset mining
NASA Astrophysics Data System (ADS)
Sohrabi, Mohammad Karim
2018-07-01
The computational complexity, huge memory requirement, and time-consuming nature of the frequent pattern mining process are the most important motivations for distributing and parallelizing this mining process. On the other hand, the emergence of distributed computational and operational environments, in which data are produced and maintained at different distributed data sources, makes the parallelization and distribution of the knowledge discovery process inevitable. In this paper, a gossip based distributed itemset mining (GDIM) algorithm is proposed to extract frequent itemsets, which are special types of frequent patterns, in a wireless sensor network environment. In this algorithm, local frequent itemsets of each sensor are extracted using a bit-wise horizontal approach (LHPM) from nodes that are clustered using a LEACH-based protocol. Heads of clusters exploit a gossip based protocol to communicate with each other and find the itemsets whose global support is equal to or greater than the specified support threshold. Experimental results show that the proposed algorithm outperforms the best existing gossip based algorithm in terms of execution time.
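The gossip step underlying such cluster-head communication can be sketched as repeated pairwise averaging of local support estimates, which converges to the network-wide mean support; the topology, counts, and threshold below are illustrative assumptions, not the GDIM protocol itself.

```python
# Sketch: pairwise gossip averaging of local itemset supports at cluster heads.
import numpy as np

rng = np.random.default_rng(0)
itemsets = ["{a}", "{b}", "{a,b}"]
# Local relative supports at 5 cluster heads (rows).
support = np.array([[0.9, 0.2, 0.15],
                    [0.7, 0.4, 0.30],
                    [0.8, 0.1, 0.05],
                    [0.6, 0.5, 0.40],
                    [0.9, 0.3, 0.20]])
true_mean = support.mean(axis=0)

for _ in range(200):                       # repeated pairwise averaging
    i, j = rng.choice(len(support), size=2, replace=False)
    avg = (support[i] + support[j]) / 2
    support[i] = support[j] = avg

threshold = 0.3
for name, s in zip(itemsets, support[0]):
    print(name, round(s, 3), "frequent" if s >= threshold else "infrequent")
print("exact means:", np.round(true_mean, 3))
```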
Neuropsychological constraints to human data production on a global scale
NASA Astrophysics Data System (ADS)
Gros, C.; Kaczor, G.; Marković, D.
2012-01-01
Which factors underlie human information production on a global level? To gain insight into this question we study a corpus of 252-633 million publicly available data files on the Internet, corresponding to an overall storage volume of 284-675 terabytes. Analyzing the file size distribution for several distinct data types, we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining qualia.
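The distributional comparison reported here can be reproduced in outline: fit a log-normal to file sizes and estimate a power-law tail exponent. The sizes below are synthetic; in the study they came from crawled Internet files.

```python
# Sketch: log-normal fit plus a Hill estimate of the power-law tail exponent.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=13.0, sigma=1.8, size=50_000)   # bytes, synthetic

shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
print(f"log-normal fit: sigma={shape:.2f}, median={scale / 1e3:.0f} kB")

# Power-law tail exponent via the Hill estimator over the top 1% of files.
tail = np.sort(sizes)[-500:]
alpha = 1 + len(tail) / np.sum(np.log(tail / tail.min()))
print(f"Hill tail exponent: {alpha:.2f}")
```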
McComb, Sara; Kennedy, Deanna; Perryman, Rebecca; Warner, Norman; Letsky, Michael
2010-04-01
Our objective is to capture temporal patterns in mental model convergence processes and differences in these patterns between distributed teams using an electronic collaboration space and face-to-face teams with no interface. Distributed teams, as sociotechnical systems, collaborate via technology to work on their task. The way in which they process information to inform their mental models may be examined via team communication and may unfold differently than it does in face-to-face teams. We conducted our analysis on 32 three-member teams working on a planning task. Half of the teams worked as distributed teams in an electronic collaboration space, and the other half worked face-to-face without an interface. Using event history analysis, we found temporal interdependencies among the initial convergence points of the multiple mental models we examined. Furthermore, the timing of mental model convergence and the onset of task work discussions were related to team performance. Differences existed in the temporal patterns of convergence and task work discussions across conditions. Distributed teams interacting via an electronic interface and face-to-face teams with no interface converged on multiple mental models, but their communication patterns differed. In particular, distributed teams with an electronic interface required less overall communication, converged on all mental models later in their life cycles, and exhibited more linear cognitive processes than did face-to-face teams interacting verbally. Managers need unique strategies for facilitating communication and mental model convergence depending on teams' degrees of collocation and access to an interface, which in turn will enhance team performance.
NASA Astrophysics Data System (ADS)
Davies, D.; Murphy, K. J.; Michael, K.
2013-12-01
NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules to relax the requirements for ancillary data, thus reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.
Strategic Positioning of the Web in a Multi-Channel Market Approach.
ERIC Educational Resources Information Center
Simons, Luuk P. A.; Steinfield, Charles; Bouwman, Harry
2002-01-01
Discusses channel economics in retail activities and trends toward unbundling due to the emergence of the Web channel. Highlights include sales processes and physical distribution processes; transaction costs; hybrid electronic commerce strategies; channel management and customer support; information economics, thing economics, and service…
Supplemental Information for New York State Standardized Interconnection Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingram, Michael; Narang, David J.; Mather, Barry A.
This document is intended to aid in the understanding and application of the New York State Standardized Interconnection Requirements (SIR) and Application Process for New Distributed Generators 5 MW or Less Connected in Parallel with Utility Distribution Systems, and it aims to provide supplemental information and discussion on selected topics relevant to the SIR. This guide focuses on technical issues that have to date resulted in the majority of utility findings within the context of interconnecting photovoltaic (PV) inverters. This guide provides background on the overall issue and related mitigation measures for selected topics, including substation backfeeding, anti-islanding and considerations for monitoring and controlling distributed energy resources (DER).
Redundancy and reduction: Speakers manage syntactic information density
Florian Jaeger, T.
2010-01-01
A principle of efficient language production based on information theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal. This prediction is tested against data from syntactic reduction. A single multilevel logit model analysis of naturally distributed data from a corpus of spontaneous speech is used to assess the effect of information density on complementizer that-mentioning, while simultaneously evaluating the predictions of several influential alternative accounts: availability, ambiguity avoidance, and dependency processing accounts. Information density emerges as an important predictor of speakers’ preferences during production. As information is defined in terms of probabilities, it follows that production is probability-sensitive, in that speakers’ preferences are affected by the contextual probability of syntactic structures. The merits of a corpus-based approach to the study of language production are discussed as well. PMID:20434141
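The information density in question is standard surprisal; a conventional formulation (a reconstruction from information theory, not quoted from the paper) is:

```latex
% Surprisal (information) of the n-th word given its sentence context:
I(w_n) = -\log_2 P\left(w_n \mid w_1, \ldots, w_{n-1}\right)
```

Uniform Information Density then predicts a preference for variants that keep I(w_n) roughly constant across the utterance — for instance, mentioning the complementizer "that" before otherwise high-surprisal material spreads the same information over more words.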
Estimation of the Scatterer Distribution of the Cirrhotic Liver using Ultrasonic Image
NASA Astrophysics Data System (ADS)
Yamaguchi, Tadashi; Hachiya, Hiroyuki
1998-05-01
In the B-mode image of the liver obtained by an ultrasonic imaging system, the speckled pattern changes with the progression of diseases such as liver cirrhosis. In this paper we present the statistical characteristics of the echo envelope of the liver, and a technique to extract information on the scatterer distribution from normal and cirrhotic liver images using constant false alarm rate (CFAR) processing. We analyze the relationship between the extracted scatterer distribution and the stage of liver cirrhosis. The ratio of the area in which the amplitude of the processed signal exceeds the threshold to the entire processed image area is related quantitatively to the stage of liver cirrhosis. It is found that the proposed technique is valid for the quantitative diagnosis of liver cirrhosis.
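The abstract does not specify which CFAR variant the paper uses; as a rough illustration of the idea, here is a generic one-dimensional cell-averaging CFAR over an echo envelope, with all parameter values chosen arbitrarily:

```python
import numpy as np

def ca_cfar(envelope, guard=2, train=8, scale=3.0):
    """Cell-averaging CFAR: flag samples whose amplitude exceeds `scale`
    times the mean of the surrounding training cells, skipping a guard
    band around the cell under test."""
    n = len(envelope)
    flags = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        left = envelope[i - guard - train : i - guard]
        right = envelope[i + guard + 1 : i + guard + 1 + train]
        noise = np.concatenate([left, right]).mean()
        flags[i] = envelope[i] > scale * noise
    return flags

# Ratio of over-threshold area to total area -- the kind of quantity the
# paper relates to the stage of cirrhosis (here on synthetic speckle).
envelope = np.random.default_rng(1).rayleigh(1.0, 4096)
ratio = ca_cfar(envelope).mean()
```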
Tools for studying dry-cured ham processing by using computed tomography.
Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena
2012-01-11
Accurate knowledge and optimization of dry-cured ham elaboration processes could help reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
Computer Sciences and Data Systems, volume 2
NASA Technical Reports Server (NTRS)
1987-01-01
Topics addressed include: data storage; information network architecture; VHSIC technology; fiber optics; laser applications; distributed processing; spaceborne optical disk controller; massively parallel processors; and advanced digital SAR processors.
Security and privacy issues of personal health.
Blobel, Bernd; Pharow, Peter
2007-01-01
While health systems in developed countries, and increasingly also in developing countries, are moving from organisation-centred to person-centred health service delivery, the supporting communication and information technology is faced with new risks regarding the security and privacy of the stakeholders involved. The comprehensively distributed environment places a special burden on guaranteeing communication security services, and an even greater one on guaranteeing application security services dealing with privilege management, access control and audit, given the social implications and the sensitivity of personal information recorded, processed, communicated and stored in an internationally distributed environment.
Electrophysiologically dissociating episodic preretrieval processing.
Bridger, Emma K; Mecklinger, Axel
2012-06-01
Contrasts between ERPs elicited by new items from tests with distinct episodic retrieval requirements index preretrieval processing. Preretrieval operations are thought to facilitate the recovery of task-relevant information because they have been shown to correlate with response accuracy in tasks in which prioritizing the retrieval of this information could be a useful strategy. This claim was tested here by contrasting new item ERPs from two retrieval tasks, each designed to explicitly require the recovery of a different kind of mnemonic information. New item ERPs differed from 400 msec poststimulus, but the distribution of these effects varied markedly, depending upon participants' response accuracy: A protracted posteriorly located effect was present for higher performing participants, whereas an anteriorly distributed effect occurred for lower performing participants. The magnitude of the posterior effect from 400 to 800 msec correlated with response accuracy, supporting the claim that preretrieval processes facilitate the recovery of task-relevant information. Additional contrasts between ERPs from these tasks and an old/new recognition task operating as a relative baseline revealed task-specific effects with nonoverlapping scalp topographies, in line with the assumption that these new item ERP effects reflect qualitatively distinct retrieval operations. Similarities in these effects were also used to reason about preretrieval processes related to the general requirement to recover contextual details. These insights, alongside the distinct pattern of effects for the two accuracy groups, reveal the multifarious nature of preretrieval processing while indicating that only some of these classes of operation are systematically related to response accuracy in recognition memory tasks.
Earth Observation Services (Image Processing Software)
NASA Technical Reports Server (NTRS)
1992-01-01
San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.
Data Integration in Computer Distributed Systems
NASA Astrophysics Data System (ADS)
Kwiecień, Błażej
In this article the author analyzes the problem of data integration in computer distributed systems. The exchange of information between different levels in the integrated pyramid of enterprise processes is fundamental to efficient enterprise operation. Communication and data exchange between levels are not always uniform because of the need for different network protocols, communication media, system response times, etc.
NASA Technical Reports Server (NTRS)
Bogomolov, E. A.; Yevstafev, Y. Y.; Karakadko, V. K.; Lubyanaya, N. D.; Romanov, V. A.; Totubalina, M. G.; Yamshchikov, M. A.
1975-01-01
A system for the recording and processing of telescope data is considered for measurements of EW asymmetry. The information is recorded by 45 channels on a continuously moving 35-mm film. The dead time of the recorder is about 0.1 sec. A sorting electronic circuit is used to reduce the errors when the statistical time distribution of the pulses is recorded. The recorded information is read out by means of photoresistors. The phototransmitter signals are fed either to the mechanical recorder unit for preliminary processing, or to a logical circuit which controls the operation of the punching device. The punched tape is processed by an electronic computer.
NASA Astrophysics Data System (ADS)
Lacasa, Lucas
2014-09-01
Dynamical processes can be transformed into graphs through a family of mappings called visibility algorithms, enabling the possibility of (i) making empirical time series analysis and signal processing and (ii) characterizing classes of dynamical systems and stochastic processes using the tools of graph theory. Recent works show that the degree distribution of these graphs encapsulates much information on the signals' variability, and therefore constitutes a fundamental feature for statistical learning purposes. However, exact solutions for the degree distributions are only known in a few cases, such as for uncorrelated random processes. Here we analytically explore these distributions in a list of situations. We present a diagrammatic formalism which computes for all degrees their corresponding probability as a series expansion in a coupling constant which is the number of hidden variables. We offer a constructive solution for general Markovian stochastic processes and deterministic maps. As case tests we focus on Ornstein-Uhlenbeck processes, fully chaotic and quasiperiodic maps. Whereas only for certain degree probabilities can all diagrams be summed exactly, in the general case we show that the perturbation theory converges. In a second part, we make use of a variational technique to predict the complete degree distribution for special classes of Markovian dynamics with fast-decaying correlations. In every case we compare the theory with numerical experiments.
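For concreteness, the natural visibility criterion can be stated in a few lines: samples (a, x[a]) and (b, x[b]) are connected whenever every intermediate sample lies strictly below the straight line joining them. A brute-force Python sketch follows (illustrative only; the paper's diagrammatic and variational machinery is not reproduced):

```python
import numpy as np

def visibility_degrees(x):
    """Degree sequence of the natural visibility graph of series x:
    samples (a, x[a]) and (b, x[b]) are linked iff every intermediate
    sample lies strictly below the straight line joining them."""
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            if all(x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                deg[a] += 1
                deg[b] += 1
    return deg

# Empirical degree distribution for an uncorrelated random series,
# the case for which exact results are cited in the abstract.
series = np.random.default_rng(2).normal(size=300)
degree_counts = np.bincount(visibility_degrees(series))
```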
Pazos, Diego; Giannasi, Pauline; Rossy, Quentin; Esseiva, Pierre
2013-07-10
The Internet is becoming more and more popular among drug users. The use of websites and forums to obtain illicit drugs and relevant information about the means of consumption is a growing phenomenon mainly for new synthetic drugs. Gamma Butyrolactone (GBL), a chemical precursor of Gamma Hydroxy Butyric acid (GHB), is used as a "club drug" and also in drug facilitated sexual assaults. Its market takes place mainly on the Internet through online websites but the structure of the market remains unknown. This research aims to combine digital, physical and chemical information to help understand the distribution routes and the structure of the GBL market. Based on an Internet monitoring process, thirty-nine websites selling GBL, mainly in the Netherlands, were detected between January 2010 and December 2011. Seventeen websites were categorized into six groups based on digital traces (e.g. IP addresses and contact information). In parallel, twenty-five bulk GBL specimens were purchased from sixteen websites for packaging comparisons and carbon isotopic measurements. Packaging information showed a high correlation with digital data confirming the links previously established whereas chemical information revealed undetected links and provided complementary information. Indeed, while digital and packaging data give relevant information about the retailers, the supply routes and the distribution close to the consumer, the carbon isotopic data provides upstream information about the production level and in particular the synthesis pathways and the chemical precursors. A three-level structured market has been thereby identified with a production level mainly located in China and in Germany, an online distribution level mainly hosted in the Netherlands and the customers who order on the Internet. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
A Geospatial Information Grid Framework for Geological Survey.
Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong
2015-01-01
The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... Records by Persons Who Manufacture, Process, Pack, Transport, Distribute, Receive, Hold, or Import Food... ``Questions and Answers Regarding Establishment and Maintenance of Records by Persons Who Manufacture, Process... updated information pertaining to the establishment and maintenance of records by persons who manufacture...
21 CFR 172.177 - Sodium nitrite used in processing smoked chub.
Code of Federal Regulations, 2011 CFR
2011-04-01
... be heated by a controlled heat process which provides a monitoring system positioned in as many... subsequent storage and distribution. All shipping containers, retail packages, and shipping records shall...) The label and labeling of the additive container shall bear, in addition to the other information...
21 CFR 172.177 - Sodium nitrite used in processing smoked chub.
Code of Federal Regulations, 2012 CFR
2012-04-01
... be heated by a controlled heat process which provides a monitoring system positioned in as many... subsequent storage and distribution. All shipping containers, retail packages, and shipping records shall...) The label and labeling of the additive container shall bear, in addition to the other information...
21 CFR 172.177 - Sodium nitrite used in processing smoked chub.
Code of Federal Regulations, 2013 CFR
2013-04-01
... be heated by a controlled heat process which provides a monitoring system positioned in as many... subsequent storage and distribution. All shipping containers, retail packages, and shipping records shall...) The label and labeling of the additive container shall bear, in addition to the other information...
Coordinating complex problem-solving among distributed intelligent agents
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1992-01-01
A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.
An Architecture for Integrated Regional Health Telematics Networks
2001-10-25
that enables informed citizens to have an impact on the healthcare system and to be more concerned and care for their own health. The current...resource, educational, integrated electronic health record (I-EHR), and added value services [2]. These classes of telematic services are applica...cally distributed clinical information systems. 5) Finally, added-value services (e.g. image processing, information indexing, data pre-fetching
The design and implementation of multi-source application middleware based on service bus
NASA Astrophysics Data System (ADS)
Li, Yichun; Jiang, Ningkang
2017-06-01
With the rapid development of the Internet of Things (IoT), real-time monitoring data are increasing in both variety and volume. To take full advantage of these data, we designed and implemented an application middleware which not only supports the three-layer architecture of IoT information systems but also enables the flexible configuration of multiple resource accesses and other additional modules. The middleware platform is lightweight, secure, AOP-based (aspect-oriented programming), distributed and real-time, which lets application developers construct information processing systems for related areas in a short period. Its functions include, but are not limited to: pre-processing of data formats, definition of data entities, calling and handling of distributed services, and massive data processing. Experimental results show that the middleware outperforms some message queue constructions to some degree, and that its throughput scales well as the number of distributed nodes increases, while the code remains simple. Currently, the middleware is applied in the system of the Shanghai Pudong environmental protection agency and has achieved great success.
Distributed software framework and continuous integration in hydroinformatics systems
NASA Astrophysics Data System (ADS)
Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao
2017-08-01
When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, is established.
The biogeochemical distribution of trace elements in the Indian Ocean
NASA Astrophysics Data System (ADS)
Saager, Paul M.
1994-06-01
The present review deals with the distributions of dissolved trace metals in the Indian Ocean in relation to biological, chemical and hydrographic processes. The literature database is extremely limited and almost no information is available on particle processes and the input and output processes of trace metals in the Indian Ocean basin; therefore much research is needed to expand our understanding of the marine chemistries of most trace metals. An area of special interest for future research is the Arabian Sea. The local conditions (upwelling-induced productivity, restricted bottom water circulation and suboxic intermediate waters) create a natural laboratory for studying trace metal chemistry.
Deterministic quantum state transfer and remote entanglement using microwave photons.
Kurpiers, P; Magnard, P; Walter, T; Royer, B; Pechal, M; Heinsoo, J; Salathé, Y; Akin, A; Storz, S; Besse, J-C; Gasparinetti, S; Blais, A; Wallraff, A
2018-06-01
Sharing information coherently between nodes of a quantum network is fundamental to distributed quantum information processing. In this scheme, the computation is divided into subroutines and performed on several smaller quantum registers that are connected by classical and quantum channels [1]. A direct quantum channel, which connects nodes deterministically rather than probabilistically, achieves larger entanglement rates between nodes and is advantageous for distributed fault-tolerant quantum computation [2]. Here we implement deterministic state-transfer and entanglement protocols between two superconducting qubits fabricated on separate chips. Superconducting circuits [3] constitute a universal quantum node [4] that is capable of sending, receiving, storing and processing quantum information [5-8]. Our implementation is based on an all-microwave cavity-assisted Raman process [9], which entangles or transfers the qubit state of a transmon-type artificial atom [10] with a time-symmetric itinerant single photon. We transfer qubit states by absorbing these itinerant photons at the receiving node, with a probability of 98.1 ± 0.1 per cent, achieving a transfer-process fidelity of 80.02 ± 0.07 per cent for a protocol duration of only 180 nanoseconds. We also prepare remote entanglement on demand with a fidelity as high as 78.9 ± 0.1 per cent at a rate of 50 kilohertz. Our results are in excellent agreement with numerical simulations based on a master-equation description of the system. This deterministic protocol has the potential to be used for quantum computing distributed across different nodes of a cryogenic network.
ERIC Educational Resources Information Center
Hathaway, Walter E.
Efficient and convenient comprehensive information systems, long kept from coming into being by a variety of obstacles, are now made possible by the concept of distributive processing and the technology of micro- and mini-computer networks. Such systems can individualize instruction, group students efficiently, cut administrative costs, streamline…
Use of MCIDAS as an earth science information systems tool
NASA Technical Reports Server (NTRS)
Goodman, H. Michael; Karitani, Shogo; Parker, Karen G.; Stooksbury, Laura M.; Wilson, Gregory S.
1988-01-01
The application of the man computer interactive data access system (MCIDAS) to information processing is examined. The computer systems that interface with the MCIDAS are discussed. Consideration is given to the computer networking of MCIDAS, data base archival, and the collection and distribution of real-time special sensor microwave/imager data.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-20
... manufacture, process or distribute industrial chemicals. Since other entities may also be interested, the..., 6, and 8 of the Toxic Substances Control Act (TSCA). Some of the information may be claimed or... evaluating the exposure of new chemical substances, including microorganisms and nanomaterials. They will...
ERIC Educational Resources Information Center
Lewis, Wiley B.
A review and analysis of Educational Resources Information Center (ERIC) publications and non-ERIC publications was made to assess availability and identify major findings, promising developments, strategies, and methodological strengths and weaknesses which exist in curricula designed for preparing food industry workers. Project national figures…
Distributed information system on molecular spectroscopy
NASA Astrophysics Data System (ADS)
Bykov, A. D.; Fazliev, A. Z.; Kozodoev, A. V.; Privezentsev, A. I.; Sinitsa, L. N.; Tonkov, M. V.; Filippov, N. N.; Tretyakov, M. Yu.
2006-12-01
The urgency of creating information-computational systems (ICS) on molecular spectroscopy follows from the circumstance that for some molecules the number of calculated energy levels runs into the hundreds of thousands, and the number of spectral lines sometimes reaches hundreds of millions. Publication of such data volumes in regular journals is inappropriate. Comparison of different calculated spectral characteristics, or their comparison with experimental data, is hopeless without computer processing. We find information systems to be an adequate form for holding such data volumes and a toolkit for handling them. Correct digital data processing requires appropriate sets of metadata arranged in the form of an ontology of molecular spectroscopy. Our information system provides data on spectral line parameters, water molecule energy levels, and absorption coefficients. Within this distributed IS one can solve two types of problems: manipulation of data and calculation of spectral functions. Among the latest experimental data in the IS are data obtained at the Institute of Applied Physics RAS. To calculate the absorption coefficients for carbon dioxide molecules, we take spectral line interference into consideration.
Multifractal Approach to Time Clustering of Earthquakes. Application to Mt. Vesuvio Seismicity
NASA Astrophysics Data System (ADS)
Codano, C.; Alonzo, M. L.; Vilardo, G.
The clustering structure of Vesuvian earthquakes is investigated by means of statistical tools: the inter-event time distribution, the running mean and multifractal analysis. The first cannot clearly distinguish a Poissonian process from a clustered one, owing to the difficulty of separating an exponential distribution from a power law. The running mean test reveals the clustering of the earthquakes, but loses information about the structure of the distribution at global scales. The multifractal approach can reveal the clustering at small scales, while the global behaviour remains Poissonian. Subsequently, the clustering of the events is interpreted in terms of diffusive processes of stress in the earth's crust.
An experimental paradigm for team decision processes
NASA Technical Reports Server (NTRS)
Serfaty, D.; Kleinman, D. L.
1986-01-01
The study of distributed information processing and decision making is presently hampered by two factors: (1) The inherent complexity of the mathematical formulation of decentralized problems has prevented the development of models that could be used to predict performance in a distributed environment; and (2) The lack of comprehensive scientific empirical data on human team decision making has hindered the development of significant descriptive models. As a part of a comprehensive effort to find a new framework for multihuman decision making problems, a novel experimental research paradigm was developed involving human teams in decision making tasks. Attempts to construct parts of an integrated model with ideas from queueing networks, team theory, distributed estimation and decentralized resource management are described.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.; White, Terry F.
1991-01-01
Phase 1 of a four part study was undertaken to investigate the use of scientific and technical information (STI) by U.S. aerospace engineers and scientists. Specific attention was paid to institutional and sociometric variables and to the step-by-step process of information gathering used by the respondents. Data were collected by means of three self-administered mail-back questionnaires. The approximately 34,000 members of the American Institute of Aeronautics and Astronautics served as the study population. More than 65 percent of the randomly selected respondents returned the questionnaires in each of the three groups. Respondents relied more heavily on informal sources of information than formal sources and turned to librarians and other technical information specialists only when they did not obtain results via informal means or their own formal searches. The report includes frequency distributions for the questions.
Krishnan, Shaji; Verheij, Elwin E R; Bas, Richard C; Hendriks, Margriet W B; Hankemeier, Thomas; Thissen, Uwe; Coulier, Leon
2013-05-15
Mass spectra obtained by deconvolution of liquid chromatography/high-resolution mass spectrometry (LC/HRMS) data can be impaired by non-informative mass-to-charge (m/z) channels. This impairment of mass spectra can have a significant negative influence on further post-processing, such as quantification and identification. A metric derived from knowledge of the errors in isotopic distribution patterns, and of the quality of the signal within a pre-defined mass chromatogram block, has been developed to pre-select all informative m/z channels. This procedure results in the clean-up of deconvoluted mass spectra by maintaining the intensity counts from m/z channels that originate from a specific compound/molecular ion (for example, the molecular ion, adducts, (13)C-isotopes and multiply charged ions) and removing all m/z channels that are not related to the specific peak. The methodology has been successfully demonstrated for two sets of high-resolution LC/MS data. The approach described is therefore thought to be a useful tool in the automatic processing of LC/HRMS data. It clearly shows advantages compared to other approaches such as peak picking and de-isotoping, in the sense that all information is retained while non-informative data are removed automatically. Copyright © 2013 John Wiley & Sons, Ltd.
A distributed, hierarchical and recurrent framework for reward-based choice
Hunt, Laurence T.; Hayden, Benjamin Y.
2017-01-01
Many accounts of reward-based choice argue for distinct component processes that are serial and functionally localized. In this article, we argue for an alternative viewpoint, in which choices emerge from repeated computations that are distributed across many brain regions. We emphasize how several features of neuroanatomy may support the implementation of choice, including mutual inhibition in recurrent neural networks and the hierarchical organisation of timescales for information processing across the cortex. This account also suggests that certain correlates of value may be emergent rather than represented explicitly in the brain. PMID:28209978
Dynamics of Biofilm Regrowth in Drinking Water Distribution Systems.
Douterelo, I; Husband, S; Loza, V; Boxall, J
2016-07-15
The majority of biomass within water distribution systems is in the form of attached biofilm. This is known to be central to drinking water quality degradation following treatment, yet little understanding of the dynamics of these highly heterogeneous communities exists. This paper presents original information on such dynamics, with findings demonstrating patterns of material accumulation, seasonality, and influential factors. Rigorous flushing operations repeated over a 1-year period on an operational chlorinated system in the United Kingdom are presented here. Intensive monitoring and sampling were undertaken, including time-series turbidity and detailed microbial analysis using 16S rRNA Illumina MiSeq sequencing. The results show that bacterial dynamics were influenced by differences in the supplied water and by the material remaining attached to the pipe wall following flushing. Turbidity, metals, and phosphate were the main factors correlated with the distribution of bacteria in the samples. Coupled with the lack of inhibition of biofilm development due to residual chlorine, this suggests that limiting inorganic nutrients, rather than organic carbon, might be a viable component in treatment strategies to manage biofilms. The research also showed that repeat flushing exerted beneficial selective pressure, giving another reason for flushing being a viable advantageous biofilm management option. This work advances our understanding of microbiological processes in drinking water distribution systems and helps inform strategies to optimize asset performance. This research provides novel information regarding the dynamics of biofilm formation in real drinking water distribution systems made of different materials. This new knowledge on microbiological process in water supply systems can be used to optimize the performance of the distribution network and to guarantee safe and good-quality drinking water to consumers. Copyright © 2016 Douterelo et al.
Business logic for geoprocessing of distributed geodata
NASA Astrophysics Data System (ADS)
Kiehle, Christian
2006-12-01
This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).
Phase space gradient of dissipated work and information: A role of relative Fisher information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamano, Takuya, E-mail: yamano@amy.hi-ho.ne.jp
2013-11-15
We show that an information theoretic distance measured by the relative Fisher information between canonical equilibrium phase densities corresponding to forward and backward processes is intimately related to the gradient of the dissipated work in phase space. We present a universal constraint on it via the logarithmic Sobolev inequality. Furthermore, we point out that a possible expression of the lower bound indicates a deep connection in terms of the relative entropy and the Fisher information of the canonical distributions.
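For reference, the standard definitions at play (a reconstruction from the usual literature, not copied from the paper) are the relative Fisher information between phase-space densities p and q, and the logarithmic Sobolev inequality that bounds the relative entropy by it:

```latex
% Relative Fisher information between phase-space densities p and q:
I(p \,\|\, q) = \int p(\Gamma)\,
    \left| \nabla \ln \frac{p(\Gamma)}{q(\Gamma)} \right|^{2} d\Gamma
% Logarithmic Sobolev inequality bounding the relative entropy:
D_{\mathrm{KL}}(p \,\|\, q) \le \frac{c}{2}\, I(p \,\|\, q)
```

Here the constant c depends on the reference density q (for a Gaussian reference it is set by the covariance).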
Advanced Information Processing System (AIPS)
NASA Technical Reports Server (NTRS)
Pitts, Felix L.
1993-01-01
Advanced Information Processing System (AIPS) is a computer systems philosophy, a set of validated hardware building blocks, and a set of validated services as embodied in system software. The goal of AIPS is to provide the knowledge base which will allow achievement of validated fault-tolerant distributed computer system architectures, suitable for a broad range of applications, having failure probability requirements of 10^-9 at 10 hours. A background and description are given, followed by program accomplishments, the current focus, applications, technology transfer, FY92 accomplishments, and funding.
NASA Astrophysics Data System (ADS)
Antonov, I. P.; Goroshkov, A. V.; Kalyunov, V. N.; Markhvida, I. V.; Rubanov, A. S.; Tanin, L. V.
1983-12-01
Investigation of the vitality state of peripheral nerve fibers is of great importance when elucidating the mechanism of the stimulating influence of low-energy laser radiation, which is widely applied in practice, for example, for curing lumbar osteochondrosis (1) and trigeminal nerve radiculitis, and in studying the processes of transmission and processing of the information required for sustaining organism homeostasis. Using both electrophysiologic and holographic methods simultaneously can increase the total information and authenticity of these investigations.
Ying, Ko Chung; Browne, Graeme; Hutchinson, Marie; Cashin, Andrew; Binh, Bui Vu
2012-05-01
Autism is not generally well understood by the community in the West or in Asia. A diagnosis of autism is distressing for all families. When families receive the diagnosis they are often not able to fully appreciate what it means or process the information given to them. Booklets exist in English that contain relevant autism related information but few have been evaluated. In Vietnam, parents do not have ready access to autism related information. This paper makes the case for offering a Vietnamese language information resource/booklet for parents to be distributed at the beginning of the diagnostic process and evaluating its usefulness. In developed countries autism has been recognised since the 1940s (Kanner, 1943). More recently it is being increasingly recognised in children with average and above intelligence. In Vietnam, a Western view of autism is just developing. Consequently community resources are undeveloped. The community, in general, and health services for children, in particular, have a rudimentary understanding of autism. This paper discusses a Western understanding of autism, autism in Vietnam, and suggests one possible strategy for addressing the educational needs around autism in Vietnam.
The Dynamics of Learning and the Emergence of Distributed Adaption
2006-05-01
regular access to experts in a wide range of disciplines—such as biology, economics, cognitive science, and sociology—that historically have...organized a successful workshop on "Collective Cognition: Mathematical Foundations of Distributed Intelligence," bringing together workers in...processing and cognition. (For a complete list of participants, talk titles and abstracts, and other information on the workshop, see http
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Manner, David B.; Dolce, James L.; Mellor, Pamela A.
1993-01-01
Expert systems are widely used in health monitoring and fault detection applications. One of the key features of an expert system is that it possesses a large body of knowledge about the application for which it was designed. When the user consults this knowledge base, it is essential that the expert system's reasoning process and its conclusions be as concise as possible. If, in addition, an expert system is part of a process monitoring system, the expert system's conclusions must be combined with current events of the process. Under these circumstances, it is difficult for a user to absorb and respond to all the available information. For example, a user can become distracted and confused if two or more unrelated devices in different parts of the system require attention. A human interface designed to integrate expert system diagnoses with process data and to focus the user's attention to the important matters provides a solution to the 'information overload' problem. This paper will discuss a user interface to the power distribution expert system for Space Station Freedom. The importance of features which simplify assessing system status and which minimize navigating through layers of information will be discussed. Design rationale and implementation choices will also be presented.
A resilient and secure software platform and architecture for distributed spacecraft
NASA Astrophysics Data System (ADS)
Otte, William R.; Dubey, Abhishek; Karsai, Gabor
2014-06-01
A distributed spacecraft is a cluster of independent satellite modules flying in formation that communicate via ad-hoc wireless networks. This system in space is a cloud platform that facilitates sharing sensors and other computing and communication resources across multiple applications, potentially developed and maintained by different organizations. Effectively, such an architecture can realize the functions of monolithic satellites at a reduced cost and with improved adaptivity and robustness. Openness of these architectures poses special challenges because the distributed software platform has to support applications from different security domains and organizations, and information flows have to be carefully managed and compartmentalized. If the platform is used as a robust shared resource, its management, configuration, and resilience become a challenge in themselves. We have designed and prototyped a distributed software platform for such architectures. The core element of the platform is a new operating system whose services were designed to restrict access to the network and the file system, and to enforce resource management constraints for all non-privileged processes. Mixed-criticality applications operating at different security labels are deployed and controlled by a privileged management process that also pre-configures all information flows. This paper describes the design and objective of this layer.
Informational technologies in modern educational structure
NASA Astrophysics Data System (ADS)
Fedyanin, A. B.
2017-01-01
The article presents the structure of the complex of information technologies applied in modern school education, describes the most important educational methods, and shows the results of their implementation. It presents the forms and methods of informational support for the educational process, examined with respect to different aspects of their use, including the psychological features of students. The article also points out a range of worrying facts and dangerous trends connected with the use and spread of information technologies that must be taken into account in the informatization of the educational process. The materials of the article are based on many years of experience in the operation and development of the informational educational sphere at a secondary school specializing in physics and mathematics.
Buffered coscheduling for parallel programming and enhanced fault tolerance
Petrini, Fabrizio [Los Alamos, NM; Feng, Wu-chun [Los Alamos, NM
2006-01-31
A computer-implemented method schedules processor jobs on a network of parallel machine processors or distributed system processors. Control information communications generated by each process performed by each processor during a defined time interval are accumulated in buffers, where adjacent time intervals are separated by strobe intervals for a global exchange of control information. A global exchange of the control information communications at the end of each defined time interval is performed during an intervening strobe interval, so that each processor is informed by all of the other processors of the number of incoming jobs to be received in a subsequent time interval. The buffered coscheduling method of this invention also enhances the fault tolerance of a network of parallel machine processors or distributed system processors.
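The pattern can be caricatured in a few lines of Python: within a time interval, processors only buffer control records, and the global exchange happens at the strobe. This is a toy sketch of the scheduling idea, not the patented implementation; the `Processor` class and its behavior are invented for illustration.

```python
import random
from collections import defaultdict

class Processor:
    """Toy processor: buffers control records instead of sending them."""
    def __init__(self, pid, npeers):
        self.pid, self.npeers = pid, npeers
        self.expected = 0
    def generate_control_records(self):
        # (destination, njobs) pairs produced during the current interval
        return [(random.randrange(self.npeers), 1)
                for _ in range(random.randint(0, 3))]
    def schedule_next_interval(self, njobs):
        self.expected = njobs   # load is known before the jobs arrive

def run(nproc=4, intervals=10):
    procs = [Processor(i, nproc) for i in range(nproc)]
    for _ in range(intervals):
        buffers = []                        # accumulate, do not send
        for p in procs:
            buffers.extend(p.generate_control_records())
        incoming = defaultdict(int)         # strobe: global exchange
        for dest, njobs in buffers:
            incoming[dest] += njobs
        for p in procs:                     # everyone learns its load
            p.schedule_next_interval(incoming[p.pid])
    return [p.expected for p in procs]
```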
Method and apparatus for filtering visual documents
NASA Technical Reports Server (NTRS)
Rorvig, Mark E. (Inventor); Shelton, Robert O. (Inventor)
1993-01-01
A method and apparatus for producing an abstract or condensed version of a visual document is presented. The frames comprising the visual document are first sampled to reduce the number of frames required for processing. The frames are then subjected to a structural decomposition process that reduces all information in each frame to a set of values. These values are in turn normalized and further combined to produce only one information content value per frame. The information content values of these frames are then compared to a selected distribution cutoff point. This effectively selects those values at the tails of a normal distribution, thus filtering key frames from their surrounding frames. The value for each frame is then compared with the value from the previous frame, and the respective frame is finally stored only if the values are significantly different. The method filters or compresses a visual document with a reduction in digital storage at ratios of up to 700 to 1 or more, depending on the content of the visual document being filtered.
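Read as an algorithm, the claim describes a five-step pipeline: sample, reduce each frame to one value, normalize, threshold at the distribution tails, and suppress near-duplicates of the last kept frame. A toy Python rendering follows (the per-frame mean stands in for the patent's structural decomposition; all thresholds are arbitrary):

```python
import numpy as np

def filter_key_frames(frames, sample_step=5, z_cut=1.5, min_diff=0.2):
    """Condense a visual document: sample frames, reduce each to one
    content value, keep only values in the tails of the distribution,
    then drop frames too similar to the previously kept one."""
    sampled = frames[::sample_step]                  # 1) temporal sampling
    values = np.array([f.mean() for f in sampled])   # 2) one value per frame
    z = (values - values.mean()) / values.std()      # 3) normalization
    in_tails = np.abs(z) > z_cut                     # 4) distribution cutoff
    kept, last = [], None
    for frame, val, keep in zip(sampled, values, in_tails):
        if keep and (last is None or abs(val - last) > min_diff):
            kept.append(frame)                       # 5) differs from previous
            last = val
    return kept

rng = np.random.default_rng(3)
frames = [rng.random((64, 64)) for _ in range(200)]  # stand-in "video"
condensed = filter_key_frames(frames)
```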
Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.
Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon
2016-01-01
Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with the inter-spike interval distribution model that incorporates spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to both heavy tailed or narrow coincidence distribution. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.
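The Monte Carlo procedure sketched in the abstract can be imitated with gamma-renewal trains, whose shape parameter introduces the refractoriness that a Poisson model lacks. The following Python sketch (illustrative parameter choices, not the authors' code) estimates a coincidence-count distribution:

```python
import numpy as np

rng = np.random.default_rng(4)

def gamma_spike_train(rate, shape, duration):
    """Gamma-renewal spike train: shape=1 recovers a Poisson process,
    shape>1 gives more regular intervals, mimicking refractoriness."""
    isis = rng.gamma(shape, 1.0 / (rate * shape),
                     size=int(2 * rate * duration))
    times = np.cumsum(isis)
    return times[times < duration]

def count_coincidences(t1, t2, window=0.005):
    """Joint spike events: spikes of train 1 with a partner in train 2
    within `window` seconds."""
    idx = np.searchsorted(t2, t1)
    nearest = np.full(len(t1), np.inf)
    for k, (i, t) in enumerate(zip(idx, t1)):
        if i > 0:
            nearest[k] = min(nearest[k], t - t2[i - 1])
        if i < len(t2):
            nearest[k] = min(nearest[k], t2[i] - t)
    return int((nearest < window).sum())

# Monte Carlo estimate of the full coincidence-count distribution.
counts = [count_coincidences(gamma_spike_train(20, 4, 10.0),
                             gamma_spike_train(20, 4, 10.0))
          for _ in range(1000)]
distribution = np.bincount(counts) / len(counts)
```

Re-running this with shape=1 versus shape>1 shows exactly the effect the paper warns about: the width of the coincidence distribution shifts with the autostructure even when the firing rates match.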
Infant joint attention, neural networks and social cognition.
Mundy, Peter; Jarrold, William
2010-01-01
Neural network models of attention can provide a unifying approach to the study of human cognitive and emotional development (Posner & Rothbart, 2007). In this paper we argue that a neural network approach to the infant development of joint attention can inform our understanding of the nature of human social learning, symbolic thought process and social cognition. At its most basic, joint attention involves the capacity to coordinate one's own visual attention with that of another person. We propose that joint attention development involves increments in the capacity to engage in simultaneous or parallel processing of information about one's own attention and the attention of other people. Infant practice with joint attention is both a consequence and an organizer of the development of a distributed and integrated brain network involving frontal and parietal cortical systems. This executive distributed network first serves to regulate the capacity of infants to respond to and direct the overt behavior of other people in order to share experience with others through the social coordination of visual attention. In this paper we describe this parallel and distributed neural network model of joint attention development and discuss two hypotheses that stem from this model. One is that activation of this distributed network during coordinated attention enhances the depth of information processing and encoding beginning in the first year of life. We also propose that with development, joint attention becomes internalized as the capacity to socially coordinate mental attention to internal representations. As this occurs the executive joint attention network makes vital contributions to the development of human symbolic thinking and social cognition. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Raj, Rahul; Hamm, Nicholas Alexander Samuel; van der Tol, Christiaan; Stein, Alfred
2016-03-01
Gross primary production (GPP) can be separated from flux tower measurements of net ecosystem exchange (NEE) of CO2. This is used increasingly to validate process-based simulators and remote-sensing-derived estimates of simulated GPP at various time steps. Proper validation includes the uncertainty associated with this separation. In this study, uncertainty assessment was done in a Bayesian framework. It was applied to data from the Speulderbos forest site, The Netherlands. We estimated the uncertainty in GPP at half-hourly time steps, using a non-rectangular hyperbola (NRH) model for its separation from the flux tower measurements. The NRH model provides a robust empirical relationship between radiation and GPP. It includes the degree of curvature of the light response curve, radiation and temperature. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. We defined the prior distribution of each NRH parameter and used Markov chain Monte Carlo (MCMC) simulation to estimate the uncertainty in the separated GPP from the posterior distribution at half-hourly time steps. This time series also allowed us to estimate the uncertainty at daily time steps. We compared the informative with the non-informative prior distributions of the NRH parameters and found that both choices produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
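A hedged sketch of this kind of analysis (not the authors' implementation): a Metropolis sampler draws NRH parameters given synthetic half-hourly NEE, and the posterior is propagated to GPP at a single half hour. The NRH parameterization below is one common form; the priors, noise model, and data are illustrative assumptions.

```python
# Metropolis sampling of non-rectangular hyperbola (NRH) parameters from
# half-hourly NEE, with posterior propagation to GPP.
import numpy as np

rng = np.random.default_rng(1)

def nrh_gpp(I, alpha, pmax, theta):
    # one common NRH form; theta in (0, 1) is the curvature of the light response
    s = alpha * I + pmax
    return (s - np.sqrt(s**2 - 4.0 * theta * alpha * I * pmax)) / (2.0 * theta)

# synthetic half-hourly data: radiation I, NEE = Rd - GPP + noise
I = rng.uniform(0, 1000, 480)
true = dict(alpha=0.03, pmax=25.0, theta=0.7, rd=3.0)
nee = true["rd"] - nrh_gpp(I, true["alpha"], true["pmax"], true["theta"]) \
      + rng.normal(0, 1.0, I.size)

def log_post(p):
    a, pm, th, rd = p
    if not (0 < a < 0.2 and 0 < pm < 60 and 0 < th < 1 and 0 < rd < 10):
        return -np.inf                      # flat (non-informative) priors
    resid = nee - (rd - nrh_gpp(I, a, pm, th))
    return -0.5 * np.sum(resid**2)          # unit-variance Gaussian likelihood

chain, p = [], np.array([0.05, 20.0, 0.5, 2.0])
lp = log_post(p)
for _ in range(20000):
    q = p + rng.normal(0, [0.002, 0.5, 0.02, 0.1])
    lq = log_post(q)
    if np.log(rng.random()) < lq - lp:
        p, lp = q, lq
    chain.append(p)
chain = np.array(chain[5000:])              # discard burn-in

# posterior of GPP at one half hour (I = 600): mean and 95% interval
gpp = nrh_gpp(600.0, chain[:, 0], chain[:, 1], chain[:, 2])
print(np.mean(gpp), np.percentile(gpp, [2.5, 97.5]))
```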
Design and Implementation of Distributed Crawler System Based on Scrapy
NASA Astrophysics Data System (ADS)
Fan, Yuhao
2018-01-01
At present, some large-scale search engines at home and abroad only provide users with non-custom search services, and a single-machine web crawler cannot solve large-scale crawling tasks. In this paper, through study of the original Scrapy framework, the framework is improved by combining Scrapy and Redis: a distributed crawler system for Web information based on the Scrapy framework is designed and implemented, and a Bloom filter algorithm is applied to the dupefilter module to reduce memory consumption. The movie information captured from Douban is stored in MongoDB, so that the data can be processed and analyzed. The results show that the distributed crawler system based on the Scrapy framework is more efficient and stable than a single-machine web crawler system.
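A hedged sketch of the paper's central idea (not its actual code): a Scrapy duplicate filter whose request fingerprints live in a Redis-backed Bloom filter, so several crawl machines share one "seen" set at a fraction of the memory of an exact set. The key name, filter sizes, and hash scheme are illustrative assumptions; the class would be enabled via the DUPEFILTER_CLASS setting.

```python
# Redis-backed Bloom-filter dupefilter for Scrapy (illustrative sketch).
import hashlib

import redis
from scrapy.dupefilters import BaseDupeFilter
from scrapy.utils.request import request_fingerprint

class RedisBloomDupeFilter(BaseDupeFilter):
    def __init__(self, server, key="crawler:bloom", bits=1 << 27, hashes=6):
        self.server, self.key = server, key
        self.bits, self.hashes = bits, hashes

    @classmethod
    def from_settings(cls, settings):
        server = redis.Redis.from_url(
            settings.get("REDIS_URL", "redis://localhost:6379"))
        return cls(server)

    def _offsets(self, fp):
        # derive k bit positions from salted SHA-1 digests of the fingerprint
        for i in range(self.hashes):
            h = hashlib.sha1(f"{i}:{fp}".encode()).hexdigest()
            yield int(h, 16) % self.bits

    def request_seen(self, request):
        fp = request_fingerprint(request)
        offsets = list(self._offsets(fp))
        if all(self.server.getbit(self.key, o) for o in offsets):
            return True               # probably seen (false positives possible)
        for o in offsets:
            self.server.setbit(self.key, o, 1)
        return False
```

Because all crawler instances consult the same Redis key, requests deduplicate across the whole cluster; the Bloom filter trades a small false-positive rate (skipping a few unseen URLs) for bounded memory.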
High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering
NASA Technical Reports Server (NTRS)
Maly, K.
1998-01-01
Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by system components during execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and may be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event-filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism is an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system and thereby reduce the intrusiveness of the monitoring process. We are developing an event-filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event-filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event-filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event-filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss the limitations of existing event-filtering mechanisms and outline how our architecture improves key aspects of event filtering.
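A minimal sketch of the subscription-based filtering idea described above (not the IRI code): subscribers register predicates, and a filter placed near the event sources forwards only matching events, cutting monitoring traffic at its origin. Event fields and predicate forms are illustrative assumptions.

```python
# Predicate-based event filter with run-time (re)configurable subscriptions.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    source: str
    kind: str
    severity: int
    payload: dict = field(default_factory=dict)

class EventFilter:
    def __init__(self):
        self.subscriptions: list[tuple[Callable[[Event], bool], Callable]] = []

    def subscribe(self, predicate, handler):
        # dynamic reconfiguration: subscriptions can be added or removed at run time
        self.subscriptions.append((predicate, handler))

    def publish(self, event):
        for predicate, handler in self.subscriptions:
            if predicate(event):
                handler(event)

f = EventFilter()
f.subscribe(lambda e: e.kind == "latency" and e.severity >= 3,
            lambda e: print("alert:", e.source, e.payload))
f.publish(Event("node-17", "latency", 4, {"ms": 870}))   # forwarded
f.publish(Event("node-02", "latency", 1, {"ms": 35}))    # filtered out
```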
NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.
2011-12-01
Data stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations make a commitment to both the current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, there is additional information from the product generation and science teams needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and the production history developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIP Federation meeting, NASA is developing a set of Earth science data and information content requirements for long-term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now. For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, useable, and reproducible. Our experience suggests data centers know what to preserve in most cases; i.e., the processing algorithms, along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products, will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we will describe the application of information content requirements, guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and the Science Investigator-led Processing System to the Distributed Active Archive Center.
2017-04-05
Information Technology at Nationwide. [Front-matter fragment; recoverable table of contents: Abstract; 1 Business Imperatives (1.1 Deliver the Right Work; 1.2 Deliver the Right Way; 1.3 Deliver with an Engaged Workforce); 2 Challenges and Opportunities (2.1 Responding to Demand; 2.2 Standards and Capabilities; 2.3 Information Technology).] Approved for public release and unlimited distribution. Nationwide Information Technology (IT) is comprised of seven offices
NASA Astrophysics Data System (ADS)
Wright, Willie E.
2003-05-01
As Military Medical Information Assurance organizations face modern pressures to downsize and outsource, they risk losing knowledgeable people who leave and take what they know with them. This knowledge is increasingly being recognized as an important resource, and organizations are now taking steps to manage it. In addition, as the pressures for globalization (Castells, 1998) increase, collaboration and cooperation are becoming more distributed and international. Knowledge sharing in a distributed international environment is becoming an essential part of Knowledge Management. This is a major shortfall in the current approach to capturing and sharing knowledge in Military Medical Information Assurance. This paper addresses this challenge by exploring the Risk Information Management Resource (RIMR) as a tool for sharing knowledge using the concept of Communities of Practice. RIMR is based on a framework of sharing and using knowledge built from three major components: people, process and technology. The people aspect enables remote collaboration, supports communities of practice, and rewards and recognizes knowledge sharing while encouraging storytelling. The process aspect enhances knowledge capture and manages information. The technology aspect enhances system integration and data mining, utilizes intelligent agents and exploits expert systems. These, coupled with the supporting activities of education and training, technology infrastructure and information security, enable effective information assurance collaboration.
Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California
Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.
2006-01-01
The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geologic units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes, such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory-determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape, tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics of the database. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data; a database 'readme' file, which describes the database contents; and FGDC metadata for the spatial map information. Spatial data are distributed as Arc/Info coverages in ESRI interchange (e00) format, or as tabular data in DBF3 (.DBF) format. Map graphics files are distributed as Postscript and Adobe Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.
Technologies for network-centric C4ISR
NASA Astrophysics Data System (ADS)
Dunkelberger, Kirk A.
2003-07-01
Three technologies form the heart of any network-centric command, control, communication, intelligence, surveillance, and reconnaissance (C4ISR) system: distributed processing, reconfigurable networking, and distributed resource management. Distributed processing, enabled by automated federation, mobile code, intelligent process allocation, dynamic multiprocessing groups, checkpointing, and other capabilities, creates a virtual peer-to-peer computing network across the force. Reconfigurable networking, consisting of content-based information exchange, dynamic ad-hoc routing, information operations (perception management) and other component technologies, forms the interconnect fabric for fault-tolerant interprocessor and node communication. Distributed resource management, which provides the means for distributed cooperative sensor management, foe sensor utilization, opportunistic collection, symbiotic inductive/deductive reasoning and other applications, provides the canonical algorithms for network-centric enterprises and warfare. This paper introduces these three core technologies and briefly discusses a sampling of their component technologies and their individual contributions to network-centric enterprises and warfare. Based on the implied requirements, two new algorithms are defined and characterized which provide critical building blocks for network centricity: distributed asynchronous auctioning and predictive dynamic source routing. The first provides a reliable, efficient and effective approach to near-optimal assignment problems; the algorithm has been demonstrated to be a viable implementation for ad-hoc command and control, object/sensor pairing, and weapon/target assignment. The second is founded on traditional dynamic source routing (from mobile ad-hoc networking), but leverages the results of ad-hoc command and control (from the contributed auctioning algorithm) into significant increases in connection reliability through forward prediction. Emphasis is placed on the advantages gained from the closed-loop interaction of the multiple technologies in the network-centric application environment.
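A hedged sketch of the first building block (a Bertsekas-style assignment auction, shown serially rather than distributed and asynchronous): bidders repeatedly bid their best net value minus their second best, plus a small increment, until every object is assigned. The benefit matrix is an illustrative assumption; with a small enough increment the result is near-optimal.

```python
# Auction algorithm for the assignment problem (e.g., sensors bidding for targets).
import numpy as np

def auction_assignment(benefit, eps=0.01):
    n = benefit.shape[0]
    prices = np.zeros(n)
    owner = -np.ones(n, dtype=int)           # owner[j] = bidder currently holding object j
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        values = benefit[i] - prices          # net value of each object to bidder i
        j = int(np.argmax(values))
        v1 = values[j]
        v2 = np.max(np.delete(values, j)) if n > 1 else v1
        prices[j] += v1 - v2 + eps            # raise the price by the bid increment
        if owner[j] >= 0:
            unassigned.append(owner[j])       # previous holder is outbid
        owner[j] = i
    return owner, prices

benefit = np.array([[8.0, 2.0, 5.0],
                    [6.0, 7.0, 1.0],
                    [3.0, 9.0, 4.0]])
owner, prices = auction_assignment(benefit)
print(owner)   # owner[j] is the bidder assigned to object j
```

In a distributed, asynchronous setting each bidder computes its own bid locally and prices are gossiped over the network, which is what makes the approach attractive for ad-hoc command and control.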
NASA Astrophysics Data System (ADS)
Evans, M. E.; Merow, C.; Record, S.; Menlove, J.; Gray, A.; Cundiff, J.; McMahon, S.; Enquist, B. J.
2013-12-01
Current attempts to forecast how species' distributions will change in response to climate change suffer from a fundamental trade-off between modeling many species superficially and modeling few species in detail (correlative vs. mechanistic models). The goals of this talk are two-fold: first, we present a Bayesian multilevel modeling framework, dynamic range modeling (DRM), for building process-based forecasts of many species' distributions at a time, designed to address the trade-off between detail and the number of distribution forecasts. In contrast to 'species distribution modeling' or 'niche modeling', which uses only species' occurrence data and environmental data, DRMs draw upon demographic data, abundance data, trait data, occurrence data, and GIS layers of climate in a single framework to account for two processes known to influence range dynamics - demography and dispersal. The vision is to use extensive databases on plant demography, distributions, and traits - in the Botanical Information and Ecology Network, the Forest Inventory and Analysis database (FIA), and the International Tree Ring Data Bank - to develop DRMs for North American trees. Second, we present preliminary results from building the core submodel of a DRM - an integral projection model (IPM) - for a sample of dominant tree species in western North America. IPMs are used to infer demographic niches - i.e., the set of environmental conditions under which population growth rate is positive - and to project population dynamics through time. Based on >550,000 data points derived from FIA for nine tree species in western North America, we show IPM-based models of their current and future distributions, and discuss how IPMs can be used to forecast future forest productivity and mortality patterns, and to inform efforts at assisted migration.
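A hedged illustration of the core submodel (not the authors' fitted IPM): assumed survival, growth, and fecundity functions are discretized on a size mesh by the midpoint rule, and the dominant eigenvalue of the kernel gives the population growth rate; values above 1 indicate conditions inside the demographic niche. All vital rates here are invented; in a DRM they would be fitted to FIA-type data and made climate-dependent.

```python
# Integral projection model (IPM): midpoint-rule discretization of the kernel
# K(z', z) = g(z'|z) s(z) + f(z', z), with lambda = dominant eigenvalue.
import numpy as np

n, L, U = 100, 0.0, 10.0              # mesh size and size limits (e.g., log diameter)
h = (U - L) / n
z = L + h * (np.arange(n) + 0.5)

def survival(z):                      # probability a size-z tree survives a year
    return 1.0 / (1.0 + np.exp(-(0.2 + 0.4 * z)))

def growth(z_new, z):                 # size-transition density g(z'|z)
    mu, sd = 0.9 * z + 0.8, 0.6
    return np.exp(-0.5 * ((z_new - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def fecundity(z_new, z):              # recruits per size-z tree times recruit-size density
    recruits = 0.05 * np.exp(0.3 * z)
    mu, sd = 1.0, 0.5
    return recruits * np.exp(-0.5 * ((z_new - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

ZN, Z = np.meshgrid(z, z, indexing="ij")   # rows: next size z', cols: current size z
K = h * (growth(ZN, Z) * survival(Z) + fecundity(ZN, Z))

lam = np.max(np.abs(np.linalg.eigvals(K)))
print(f"population growth rate lambda = {lam:.3f}")  # > 1: inside the demographic niche
```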
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Services has been developed. XML Web Service is a new distributed processing system built on standard internet technologies. With the seamless remote method invocation of XML Web Services, users are able to get the latest disease-code master information from their rich desktop applications or internet web sites that refer to this service. PMID:14728364
BIO-Plex Information System Concept
NASA Technical Reports Server (NTRS)
Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)
1999-01-01
This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.
NASA Technical Reports Server (NTRS)
1986-01-01
The past, present, and future status of space technology in Berlin is discussed, including raw material processing, transportation, energy, and information generation and distribution. How Berlin can contribute toward further advancement in this field, individually or in collaboration with international partners is indicated.
NASA Technical Reports Server (NTRS)
Byrne, F.
1981-01-01
Time-shared interface speeds data processing in distributed computer network. Two-level high-speed scanning approach routes information to buffer, portion of which is reserved for series of "first-in, first-out" memory stacks. Buffer address structure and memory are protected from noise or failed components by error correcting code. System is applicable to any computer or processing language.
ERIC Educational Resources Information Center
Schmidtke, Daniel; Matsuki, Kazunaga; Kuperman, Victor
2017-01-01
The current study addresses a discrepancy in the psycholinguistic literature about the chronology of information processing during the visual recognition of morphologically complex words. "Form-then-meaning" accounts of complex word recognition claim that morphemes are processed as units of form prior to any influence of their meanings,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taheri, M; Teslich, N; Lu, J P
An in situ method for studying the role of laser energy on the microstructural evolution of polycrystalline Si is presented. By monitoring both laser energy and microstructural evolution simultaneously in the dynamic transmission electron microscope, information on grain size and defect concentration can be correlated directly with processing conditions. This proof-of-principle study provides fundamental scientific information on the crystallization process that has technological importance for the development of thin film transistors. In conclusion, we successfully developed a method for studying UV laser processing of Si films in situ on nanosecond time scales, with ultimate implications for TFT application improvements. In addition to grain size distribution as a function of laser energy density, we found that grain size scaled with laser energy in general. We showed that nanosecond time resolution allowed us to see the nucleation and growth front during processing, which will help further the understanding of microstructural evolution of poly-Si films for electronic applications. Future studies, coupled with high-resolution TEM, will be performed to study grain boundary migration, intergranular defects, and grain size distribution with respect to laser energy and absorption depth.
Serious games for elderly continuous monitoring.
Lemus-Zúñiga, Lenin-G; Navarro-Pardo, Esperanza; Moret-Tatay, Carmen; Pocinho, Ricardo
2015-01-01
Information technology (IT) and serious games allow the older population to remain independent for longer. Hence, when designing technology for this population, developmental changes, such as attention and/or perception, should be considered. For instance, a crucial developmental change has been related to cognitive speed in terms of reaction time (RT). However, this variable presents a skewed distribution that complicates data analysis. An alternative strategy is to fit the data to an ex-Gaussian function. Furthermore, this procedure provides different parameters that have been related to underlying cognitive processes in the literature. Another issue to be considered is optimal data recording, storage and processing. For that purpose, mobile devices (smartphones and tablets) are a good option for targeting serious games where valuable information can be stored (time spent in the application, reaction time, frequency of use, and so on). The data stored on smartphones and tablets can be sent to a central computer (cloud storage), not only to fit the distribution of reaction times to mathematical functions, but also to estimate parameters that may reflect the cognitive processes underlying language, aging, and decision making.
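A minimal sketch of the ex-Gaussian characterization described above, using scipy's exponnorm (the exponentially modified Gaussian); the simulated reaction times and their parameters are illustrative assumptions.

```python
# Fitting an ex-Gaussian to reaction times: mu/sigma (Gaussian) + tau (exponential).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mu, sigma, tau = 450.0, 60.0, 150.0            # milliseconds
rts = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

# exponnorm's shape parameter is K = tau / sigma; fit returns (K, loc=mu, scale=sigma)
K, loc, scale = stats.exponnorm.fit(rts)
print(f"mu ~ {loc:.0f} ms, sigma ~ {scale:.0f} ms, tau ~ {K * scale:.0f} ms")
```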
Idle waves in high-performance computing
NASA Astrophysics Data System (ADS)
Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre
2015-01-01
The vast majority of parallel scientific applications distribute computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through the processes of scientific applications with local information exchange between pairs of processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work is also a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
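A toy reconstruction of the mechanism (not the paper's code): each process computes for a fixed busy time and then waits for its two neighbours' previous iteration, so one injected delay at rank 0 propagates outward at one rank per iteration, i.e., with a phase velocity inversely proportional to the busy time, as described above. The process count, iteration count, and delay are illustrative.

```python
# Idle-wave propagation in a 1D chain of processes with neighbour data dependencies.
import numpy as np

P, ITER, BUSY = 32, 40, 1.0
finish = np.zeros((ITER + 1, P))
finish[0, 0] = 5.0                       # injected delay on rank 0

for k in range(1, ITER + 1):
    for p in range(P):
        deps = [finish[k - 1, p]]
        if p > 0:
            deps.append(finish[k - 1, p - 1])
        if p < P - 1:
            deps.append(finish[k - 1, p + 1])
        finish[k, p] = max(deps) + BUSY  # idle until the slowest dependency, then compute

# rank reached by the idle wave at each iteration: where finish exceeds k * BUSY
front = [int(np.nonzero(finish[k] > k * BUSY + 1e-9)[0].max())
         for k in range(1, ITER)]
print(front[:10])   # advances ~1 rank per iteration => velocity ~ 1 / BUSY
```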
NASA Technical Reports Server (NTRS)
Spinhirne, J. D.; Welton, E. J.; Campbell, J. R.; Berkoff, T. A.; Starr, David OC. (Technical Monitor)
2002-01-01
The NASA MPL-net project goal is consistent data products of the vertical distribution of clouds and aerosol from globally distributed lidar observation sites. The four ARM micro pulse lidars form a basis of the network, which is to consist of over twelve sites. The science objective is ground truth for global satellite retrievals and accurate vertical distribution information, in combination with surface radiation measurements, for aerosol and cloud models. The project involves improvements in instruments and data processing and cooperation with ARM and other partners.
Entropy Methods For Univariate Distributions in Decision Analysis
NASA Astrophysics Data System (ADS)
Abbas, Ali E.
2003-03-01
One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision-makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution if, in addition to its fractiles, we also know it is continuous, and work through full examples to illustrate the approach.
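A worked sketch of the FMED discussed above: on a bounded support with fractile constraints, the maximum entropy density is piecewise uniform, flat over each fractile interval and discontinuous at the fractiles. The elicited fractiles below are illustrative assumptions.

```python
# Maximum entropy distribution under fractile constraints (FMED): the density is
# constant over each fractile interval, with height = probability mass / width.
import numpy as np

# support [0, 100]; elicited (value, cumulative probability) pairs
x = np.array([0.0, 20.0, 35.0, 60.0, 100.0])
F = np.array([0.0, 0.25, 0.50, 0.75, 1.0])

density = np.diff(F) / np.diff(x)     # piecewise-uniform heights
for lo, hi, d in zip(x[:-1], x[1:], density):
    print(f"[{lo:5.1f}, {hi:5.1f}): f(x) = {d:.4f}")

# differential entropy of the piecewise-uniform FMED
H = -np.sum(np.diff(F) * np.log(density))
print(f"entropy = {H:.3f} nats")
```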
NASA Astrophysics Data System (ADS)
Voloshynovskiy, Sviatoslav V.; Koval, Oleksiy; Deguillaume, Frederic; Pun, Thierry
2004-06-01
In this paper we address visual communications via printing channels from an information-theoretic point of view as communications with side information. The solution to this problem addresses important aspects of multimedia data processing, security and management, since printed documents are still the most common form of visual information representation. Two practical approaches to side information communications for printed documents are analyzed in the paper. The first approach represents a layered joint source-channel coding for printed documents. This approach is based on a self-embedding concept where information is first encoded assuming a Wyner-Ziv set-up and then embedded into the original data using a Gel'fand-Pinsker construction and taking into account properties of printing channels. The second approach is based on Wyner-Ziv and Berger-Flynn-Gray set-ups and assumes two separated communications channels where an appropriate distributed coding should be elaborated. The first printing channel is considered to be a direct visual channel for images ("analog" channel with degradations). The second "digital channel" with constrained capacity is considered to be an appropriate auxiliary channel. We demonstrate both theoretically and practically how one can benefit from this sort of "distributed paper communications".
Total Coliform Rule Distribution System Advisory Committee (TCRDSAC) Document
This document provides information about the TCRDSAC, including its charter, processes and recommendations. The Agency used the Advisory Committee's recommendations to develop the proposed and final rules that revised the Total Coliform Rule.
How neuroscience can inform the study of individual differences in cognitive abilities
McFarland, Dennis J.
2018-01-01
Theories of human mental abilities should be consistent with what is known in neuroscience. Currently, tests of human mental abilities are modeled by cognitive constructs such as attention, working memory, and speed of information processing. These constructs are in turn related to a single general ability. However, brains are very complex systems, and whether most of the variability between the operations of different brains can be ascribed to a single factor is questionable. Research in neuroscience suggests that psychological processes such as perception, attention, decision and executive control are emergent properties of interacting distributed networks. The modules that make up these networks use similar computational processes that involve multiple forms of neural plasticity, each having different time constants. Accordingly, these networks might best be characterized in terms of the information they process rather than in terms of abstract psychological processes such as working memory and executive control. PMID:28195556
Discrete mathematics for spatial data classification and understanding
NASA Astrophysics Data System (ADS)
Mussio, Luigi; Nocera, Rossella; Poli, Daniela
1998-12-01
Data processing, in the field of information technology, requires new tools involving discrete mathematics, such as data compression, signal enhancement, data classification and understanding, and hypertexts and multimedia (considering educational aspects too), because the mass of data demands automatic data management and does not permit any a priori knowledge. The methodologies and procedures used in this class of problems concern different kinds of segmentation techniques and relational strategies, such as clustering, parsing, vectorization, formalization, fitting and matching. On the other hand, the complexity of this approach makes it necessary to perform optimal sampling and outlier detection at the very beginning, in order to define the set of data to be processed: rough data supply very poor information. For these reasons, no hypotheses about the distribution behavior of the data can generally be made, and judgments should be obtained by distribution-free inference only.
NASA Astrophysics Data System (ADS)
Bu, Xianye; Dong, Hongli; Han, Fei; Li, Gongfa
2018-07-01
This paper is concerned with the distributed filtering problem for a class of time-varying systems subject to deception attacks and event-triggering protocols. Due to bandwidth limitations, an event-triggered communication strategy is adopted to alleviate the data transmission pressure in the algorithm implementation process. The partial-nodes-based filtering problem is considered, where only a subset of nodes can measure the information of the plant. Meanwhile, the measurement information may suffer deception attacks in the transmission process. Sufficient conditions are established such that the error dynamics satisfies the prescribed average H∞ performance constraints. The parameters of the designed filters can be calculated by solving a series of recursive linear matrix inequalities. A simulation example is presented to demonstrate the effectiveness of the proposed filtering method.
Redundant Disk Arrays in Transaction Processing Systems. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Mourad, Antoine Nagib
1994-01-01
We address various issues dealing with the use of disk arrays in transaction processing environments. We look at the problem of transaction undo recovery and propose a scheme for using the redundancy in disk arrays to support undo recovery. The scheme uses twin-page storage for the parity information in the array. It speeds up transaction processing by eliminating the need for undo logging for most transactions. The use of redundant arrays of distributed disks to provide recovery from disasters as well as temporary site failures and disk crashes is also studied. We investigate the problem of assigning the sites of a distributed storage system to redundant arrays in such a way that the cost of maintaining the redundant parity information is minimized. Heuristic algorithms for solving the site partitioning problem are proposed and their performance is evaluated using simulation. We also develop a heuristic for which an upper bound on the deviation from the optimal solution can be established.
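A small sketch of the parity idea the scheme builds on (illustrative, not the thesis code): with XOR parity, the before-image of an updated block is recoverable from the old and new parity pages, which is what twin-page parity storage exploits to avoid separate undo logging.

```python
# XOR parity and twin-page undo recovery, in miniature.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

BLOCK = 16
stripe = [secrets.token_bytes(BLOCK) for _ in range(4)]   # data blocks D0..D3
parity_old = bytes(BLOCK)
for d in stripe:
    parity_old = xor(parity_old, d)                       # P = D0 ^ D1 ^ D2 ^ D3

# a transaction overwrites D2; twin pages retain both the old and the new parity
old_d2, new_d2 = stripe[2], secrets.token_bytes(BLOCK)
parity_new = xor(xor(parity_old, old_d2), new_d2)         # small-write parity update
stripe[2] = new_d2

# undo recovery: reconstruct the before-image from the two parity versions
recovered = xor(xor(parity_new, parity_old), stripe[2])
assert recovered == old_d2
print("before-image of D2 recovered from twin parity pages")
```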
Distributed Data Collection for the ATLAS EventIndex
NASA Astrophysics Data System (ADS)
Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.
2015-12-01
The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then unpacked, sorted and reformatted in order to be stored and catalogued in a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture that conveys this information from the running jobs through the messaging system to the Hadoop server.
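A hedged sketch of this Producer/Consumer pattern using the stomp.py client against an ActiveMQ broker (this is not the ATLAS code; the queue name, broker address, credentials, and payload are illustrative assumptions, and the listener API follows recent stomp.py versions).

```python
# Producer/Consumer over ActiveMQ via STOMP: jobs send per-file snippets,
# a consumer receives them for downstream sorting and cataloguing.
import json
import time

import stomp

QUEUE = "/queue/eventindex.snippets"

class SnippetConsumer(stomp.ConnectionListener):
    def on_message(self, frame):
        snippet = json.loads(frame.body)
        # unpack, sort and reformat here before loading into Hadoop
        print("got", snippet["guid"], len(snippet["events"]), "events")

conn = stomp.Connection([("localhost", 61613)])
conn.set_listener("", SnippetConsumer())
conn.connect("user", "password", wait=True)
conn.subscribe(destination=QUEUE, id="1", ack="auto")

# producer side: one snippet per permanent output file of a job
snippet = {"guid": "A1B2-C3D4", "events": [{"run": 1, "event": 42, "offset": 7}]}
conn.send(body=json.dumps(snippet), destination=QUEUE)

time.sleep(1)
conn.disconnect()
```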
A Hierarchical Bayesian Model for Calibrating Estimates of Species Divergence Times
Heath, Tracy A.
2012-01-01
In Bayesian divergence time estimation methods, incorporating calibrating information from the fossil record is commonly done by assigning prior densities to ancestral nodes in the tree. Calibration prior densities are typically parametric distributions offset by minimum age estimates provided by the fossil record. Specification of the parameters of calibration densities requires the user to quantify his or her prior knowledge of the age of the ancestral node relative to the age of its calibrating fossil. The values of these parameters can, potentially, result in biased estimates of node ages if they lead to overly informative prior distributions. Accordingly, determining parameter values that lead to adequate prior densities is not straightforward. In this study, I present a hierarchical Bayesian model for calibrating divergence time analyses with multiple fossil age constraints. This approach applies a Dirichlet process prior as a hyperprior on the parameters of calibration prior densities. Specifically, this model assumes that the rate parameters of exponential prior distributions on calibrated nodes are distributed according to a Dirichlet process, whereby the rate parameters are clustered into distinct parameter categories. Both simulated and biological data are analyzed to evaluate the performance of the Dirichlet process hyperprior. Compared with fixed exponential prior densities, the hierarchical Bayesian approach results in more accurate and precise estimates of internal node ages. When this hyperprior is applied using Markov chain Monte Carlo methods, the ages of calibrated nodes are sampled from mixtures of exponential distributions and uncertainty in the values of calibration density parameters is taken into account. PMID:22334343
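A minimal illustration of the hyperprior idea (not the paper's implementation): the exponential-rate parameters of k calibrated nodes are drawn from a Dirichlet process via the Chinese restaurant construction, so they cluster into a few shared values. The concentration parameter and base distribution are illustrative assumptions.

```python
# Dirichlet process draw of calibration-density rate parameters via the
# Chinese restaurant process: new values open with probability ~ alpha.
import numpy as np

rng = np.random.default_rng(3)

def dp_rates(k, alpha=1.0, base=lambda: rng.gamma(2.0, 1.0)):
    rates, counts, values = [], [], []
    for _ in range(k):
        probs = np.array(counts + [alpha], dtype=float)
        choice = rng.choice(len(probs), p=probs / probs.sum())
        if choice == len(values):            # open a new cluster
            values.append(base())
            counts.append(1)
        else:
            counts[choice] += 1
        rates.append(values[choice])
    return np.array(rates), len(values)

rates, n_clusters = dp_rates(k=12)
print(n_clusters, "distinct rate values among 12 calibrations")
# each calibrated node then gets an Exp(rate) prior offset by its fossil age
```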
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
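A minimal sketch of the conditional statistic the framework starts from: transitional probabilities between adjacent syllables in a continuous stream, which are high within words and dip at word boundaries. The toy lexicon is an illustrative assumption.

```python
# Transitional probabilities P(Y|X) over a continuous syllable stream.
import random
from collections import Counter

random.seed(0)
words = ["bidaku", "padoti", "golabu"]
syllables = [w[i:i + 2] for w in random.choices(words, k=400) for i in range(0, 6, 2)]

pair_counts = Counter(zip(syllables, syllables[1:]))
first_counts = Counter(syllables[:-1])

def tp(x, y):
    return pair_counts[(x, y)] / first_counts[x]

print(f"within word  TP(bi->da) = {tp('bi', 'da'):.2f}")   # ~1.0
print(f"across words TP(ku->pa) = {tp('ku', 'pa'):.2f}")   # ~1/3: a likely boundary
```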
NASA Astrophysics Data System (ADS)
Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.
2016-06-01
We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.
Astro-WISE: Chaining to the Universe
NASA Astrophysics Data System (ADS)
Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.
2007-10-01
The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power over the high-energy physical community, while the international astronomical community is building an Internet geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution (Valentijn & Kuijken 2004) to the problem initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This break-through is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature and thus is the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by forcing full backward and forward chaining in the data modelling.
Distributed digital signal processors for multi-body flexible structures
NASA Technical Reports Server (NTRS)
Lee, Gordon K. F.
1992-01-01
Multi-body flexible structures, such as those currently under investigation in spacecraft design, are large-scale (high-order) systems. Controlling and filtering such structures is a computationally complex problem. This is particularly important when many sensors and actuators are located along the structure and need to be processed in real time. This report summarizes research activity focused on solving the signal processing (that is, information processing) issues of multi-body structures. A distributed architecture is developed in which single-loop processors are employed for local filtering and control. By implementing such a philosophy with an embedded controller configuration, a supervising controller may be used to process global data and make global decisions while the local devices are processing local information. A hardware testbed, a position controller system for a servo motor, is employed to illustrate the capabilities of the embedded controller structure. Several filtering and control structures that can be modeled as rational functions can be implemented on the system developed in this research effort. Thus the results of the study provide a support tool for many Control/Structure Interaction (CSI) NASA testbeds such as the Evolutionary model and the nine-bay truss structure.
NASA Astrophysics Data System (ADS)
Straub, K. M.; Ganti, V. K.; Paola, C.; Foufoula-Georgiou, E.
2010-12-01
Stratigraphy preserved in alluvial basins houses the most complete record of information necessary to reconstruct past environmental conditions. Indeed, the character of the sedimentary record is inextricably related to the surface processes that formed it. In this presentation we explore how the signals of surface processes are recorded in stratigraphy through the use of physical and numerical experiments. We focus on linking surface processes to stratigraphy in 1D by relating the probability distributions of the processes that govern the evolution of depositional systems to the probability distribution of preserved bed thicknesses. In this study we define a bed as a package of sediment bounded above and below by erosional surfaces. In a companion presentation we document heavy-tailed statistics of erosion and deposition from high-resolution temporal elevation data recorded during a controlled physical experiment. However, the heavy tails in the magnitudes of erosional and depositional events are not preserved in the experimental stratigraphy. Similar to many bed thickness distributions reported in field studies, we find that an exponential distribution adequately describes the thicknesses of beds preserved in our experiment. We explore the generation of exponential bed thickness distributions from heavy-tailed surface statistics using 1D numerical models. These models indicate that when the full distribution of elevation fluctuations (both erosional and depositional events) is symmetrical, the resulting distribution of bed thicknesses is exponential in form. Finally, we illustrate that a predictable relationship exists between the coefficient of variation of surface elevation fluctuations and the scale parameter of the resulting exponential distribution of bed thicknesses.
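A hedged 1D toy of the linkage described above (not the authors' model): the surface takes symmetric, heavy-tailed elevation increments with a small net aggradation, the stratigraphic filter keeps only horizons that are never subsequently eroded, and the preserved bed thicknesses are summarized; per the abstract, symmetric fluctuations should yield approximately exponential thicknesses (coefficient of variation near 1). The step distribution and drift are illustrative choices.

```python
# Surface evolution -> stratigraphic filter -> preserved bed thicknesses.
import numpy as np

rng = np.random.default_rng(4)
steps = rng.standard_cauchy(200000) * 0.1    # symmetric, heavy-tailed fluctuations
steps += 0.002                               # small net aggradation so strata accumulate
elevation = np.cumsum(steps)

# stratigraphic filter: a horizon is preserved only if never eroded below afterwards
preserved = np.minimum.accumulate(elevation[::-1])[::-1]
surfaces = np.unique(preserved)              # erosion-bounded preserved horizons
thickness = np.diff(surfaces)
thickness = thickness[thickness > 0]

cv = thickness.std() / thickness.mean()
print(f"beds: {thickness.size}, CV = {cv:.2f}")
# compare with the exponential signature (CV ~ 1) reported in the abstract
```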
Application of ideal pressure distribution in development process of automobile seats.
Kilincsoy, U; Wagner, A; Vink, P; Bubb, H
2016-07-19
In designing a car seat, the ideal pressure distribution is important, as the seat is the largest contact surface between the human and the car. Because obstacles hinder a more general application of the ideal pressure distribution in seating design, multidimensional measuring techniques combined with extensive user tests are necessary. The objective of this study is to apply and integrate knowledge about the ideal pressure distribution in the seat design process of a car manufacturer in an efficient way. Ideal pressure distribution was combined with pressure measurement, in this case pressure mats. In order to integrate this theoretical knowledge of seating comfort into the seat development process of a car manufacturer, a special user interface was defined and developed. Mapping the measured pressure distribution in real time, accurately scaled to the actual seats during test setups, led directly to design implications for the seats, even during the test situation. Detailed analysis of the subjects' feedback was correlated with objective measurements of the subjects' pressure distributions in real time. Existing seating characteristics were therefore taken into account as well. A user interface can incorporate theoretical and validated 'state of the art' models of comfort. Consequently, this information can reduce extensive testing and lead to more detailed results in a shorter time period.
Application of Advanced Multi-Core Processor Technologies to Oceanographic Research
2013-09-30
[Table fragment; recoverable content: candidate embedded processor families (STM32, NXP LPC series, Microchip PIC32/dsPIC, ARM Cortex, TI OMAP, TI Sitara, Broadcom BCM2835, FPGAs) with columns for proprietary status and power class (> 500 mW; < 5 W; varies).] DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Application of Advanced Multi-Core Processor Technologies...state-of-the-art information processing architectures. OBJECTIVES: Next-generation processor architectures (multi-core, multi-threaded) hold the
NASA Technical Reports Server (NTRS)
Jenkins, George
1986-01-01
Prelaunch, launch, mission, and landing distribution of RF and hardline uplink/downlink information between Space Shuttle Orbiter/cargo elements, tracking antennas, and control centers at JSC, KSC, MSFC, GSFC, ESMC/RCC, and Sunnyvale are presented as functional block diagrams. Typical mismatch problems encountered during spacecraft-to-project control center telemetry transmissions are listed along with new items for future support enhancement.
ERIC Educational Resources Information Center
Li, Jianyi; Nie, Lanying; Li, Zeyu; Lin, Lijun; Tang, Lei; Ouyang, Jun
2012-01-01
Anatomical corrosion casts of human specimens are useful teaching aids. However, their use is limited due to ethical dilemmas associated with their production, their lack of perfect reproducibility, and their consumption of original specimens in the process of casting. In this study, new approaches with modern distribution of complex anatomical…
The Role of Metaphors in Fostering Macrocognitive Processes in Distributed Teams
2012-07-30
temporal dynamics, and storytelling towards the goal of improving team coordination and performance in distributed decision-making teams. Specifically...better reflect the context of organizational and military teams and 3) to investigate how storytelling (a complex form of metaphor) can be used as a... Information Sharing, Situation Awareness, Storytelling, Metaphors, Reflexivity, Team Simulation, NeoCITIES
Mark Coleman
2007-01-01
In forest trees, roots mediate such significant carbon fluxes as primary production and soil CO2 efflux. Despite the central role of roots in these critical processes, information on root distribution during stand establishment is limited, yet must be described to accurately predict how various forest types, which are growing with a range of...
Semantics-driven modelling of user preferences for information retrieval in the biomedical domain.
Gladun, Anatoly; Rogushina, Julia; Valencia-García, Rafael; Béjar, Rodrigo Martínez
2013-03-01
A large amount of biomedical and genomic data are currently available on the Internet. However, data are distributed into heterogeneous biological information sources, with little or even no organization. Semantic technologies provide a consistent and reliable basis with which to confront the challenges involved in the organization, manipulation and visualization of data and knowledge. One of the knowledge representation techniques used in semantic processing is the ontology, which is commonly defined as a formal and explicit specification of a shared conceptualization of a domain of interest. The work presented here introduces a set of interoperable algorithms that can use domain and ontological information to improve information-retrieval processes. This work presents an ontology-based information-retrieval system for the biomedical domain. This system, with which some experiments have been carried out that are described in this paper, is based on the use of domain ontologies for the creation and normalization of lightweight ontologies that represent user preferences in a determined domain in order to improve information-retrieval processes.
Jaeger, Johannes; Irons, David; Monk, Nick
2008-10-01
Positional specification by morphogen gradients is traditionally viewed as a two-step process. A gradient is formed and then interpreted, providing a spatial metric independent of the target tissue, similar to the concept of space in classical mechanics. However, the formation and interpretation of gradients are coupled, dynamic processes. We introduce a conceptual framework for positional specification in which cellular activity feeds back on positional information encoded by gradients, analogous to the feedback between mass-energy distribution and the geometry of space-time in Einstein's general theory of relativity. We discuss how such general relativistic positional information (GRPI) can guide systems-level approaches to pattern formation.
Mechanical trapping of particles in granular media
NASA Astrophysics Data System (ADS)
Kerimov, Abdulla; Mavko, Gary; Mukerji, Tapan; Al Ibrahim, Mustafa A.
2018-02-01
Mechanical trapping of fine particles in the pores of granular materials is an essential mechanism in a wide variety of natural and industrial filtration processes. The progress of invading particles is primarily limited by the network of pore throats and connected pathways encountered by the particles during their motion through the porous medium. Trapping of invading particles is limited to a depth defined by the size, shape, and distribution of the invading particles with respect to the size, shape, and distribution of the host porous matrix. Therefore, the trapping process, in principle, can be used to obtain information about geometrical properties, such as pore throat and particle size, of the underlying host matrix. A numerical framework is developed to simulate the mechanical trapping of fine particles in porous granular media with prescribed host particle size, shape, and distribution. The trapping of invading particles is systematically modeled in host packings with different host particle distributions: monodisperse, bidisperse, and polydisperse distributions of host particle sizes. Our simulation results show quantitatively and qualitatively to what extent trapping behavior is different in the generated monodisperse, bidisperse, and polydisperse packings of spherical particles. Depending on host particle size and distribution, the information about extreme estimates of minimal pore throat sizes of the connected pathways in the underlying host matrix can be inferred from trapping features, such as the fraction of trapped particles as a function of invading particle size. The presence of connected pathways with minimum and maximum of minimal pore throat diameters can be directly obtained from trapping features. This limited information about the extreme estimates of pore throat sizes of the connected pathways in the host granular media inferred from our numerical simulations is consistent with simple geometrical estimates of extreme value of pore and throat sizes of the densest structural arrangements of spherical particles and geometrical Delaunay tessellation analysis of the pore space of host granular media. Our results suggest simple relations between the host particle size and trapping features. These relationships can be potentially used to describe both the dynamics of the mechanical trapping process and the geometrical properties of the host granular media.
Business Intelligence Applied to the ALMA Software Integration Process
NASA Astrophysics Data System (ADS)
Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.
2012-09-01
Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you can collect. One way to receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during testing of the software, during the integration of the different components, or, even worse, problems that occur during production. Usually little time is spent on analyzing these reports, but with some multidimensional processing you can extract valuable information from them that helps with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data were processed. The main goal is to assess the software process and get insights from this information.
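As a rough illustration of the multidimensional processing this abstract describes, the sketch below tallies hypothetical issue-tracker reports along two dimensions with pandas; the column names and severity values are invented stand-ins, not ALMA's actual schema.

```python
# Hypothetical sketch of OLAP-style consolidation of problem reports with
# pandas. Column names and values are invented stand-ins, not ALMA's schema.
import pandas as pd

reports = pd.DataFrame([
    {"subsystem": "correlator", "phase": "integration", "severity": "major"},
    {"subsystem": "correlator", "phase": "production",  "severity": "critical"},
    {"subsystem": "archive",    "phase": "testing",     "severity": "minor"},
    {"subsystem": "archive",    "phase": "integration", "severity": "major"},
])

# Two-dimensional slice of the cube: problem counts per subsystem and phase.
print(pd.crosstab(reports["subsystem"], reports["phase"]))
```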
NASA Astrophysics Data System (ADS)
Raj, R.; Hamm, N. A. S.; van der Tol, C.; Stein, A.
2015-08-01
Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is used increasingly to validate process-based simulators and remote sensing-derived estimates of simulated GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps. This can be achieved by using a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature. In addition, the NRH model provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter. Markov chain Monte Carlo (MCMC) simulation was used to update the prior distribution of each NRH parameter. This allowed us to estimate the uncertainty in the separated GPP at half-hourly time steps. This yielded the posterior distribution of GPP at each half hour and allowed the quantification of uncertainty. The time series of posterior distributions thus obtained allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters. The results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
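The following is a minimal, hedged sketch of the kind of separation the abstract describes: a non-rectangular hyperbola (NRH) light-response model fitted to synthetic half-hourly NEE with a random-walk Metropolis sampler. The parameterization (alpha, pmax, theta, reco), the flat box prior, and all numerical values are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def nrh_gpp(par, rad):
    # Non-rectangular hyperbola light response; theta bends the curve.
    alpha, pmax, theta, _ = par
    s = alpha * rad + pmax
    return (s - np.sqrt(s**2 - 4.0 * theta * alpha * rad * pmax)) / (2.0 * theta)

def log_post(par, rad, nee, sigma=1.0):
    alpha, pmax, theta, reco = par
    if not (0 < alpha and 0 < pmax and 0 < theta < 1 and 0 < reco):
        return -np.inf                      # flat prior on a plausible box
    resid = nee - (reco - nrh_gpp(par, rad))  # NEE = respiration - GPP
    return -0.5 * np.sum(resid**2) / sigma**2

# Synthetic half-hourly data standing in for one 10-day fitting window.
rad = rng.uniform(0, 1000, 480)
nee = 3.0 - nrh_gpp((0.05, 20.0, 0.7, 3.0), rad) + rng.normal(0, 1.0, 480)

par, chain = np.array([0.1, 10.0, 0.5, 1.0]), []
for _ in range(20000):                      # random-walk Metropolis
    prop = par + rng.normal(0, [0.005, 0.5, 0.02, 0.1])
    if np.log(rng.uniform()) < log_post(prop, rad, nee) - log_post(par, rad, nee):
        par = prop
    chain.append(par.copy())

# Posterior of GPP at one half hour: push parameter draws through the model.
gpp_draws = [nrh_gpp(p, 600.0) for p in chain[10000:]]
print(np.percentile(gpp_draws, [2.5, 50, 97.5]))
```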
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate: H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
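For reference, the degenerate functional form quoted above is straightforward to evaluate for an ordinary multinomial process; the sketch below does so. For the history-dependent processes treated in the paper (Pólya urns, SSR processes), the three entropy concepts would no longer collapse onto this single form.

```python
# Minimal evaluation of H(p) = -sum_i p_i log p_i for a given distribution.
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # log 4 ~ 1.386 (maximal)
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0: no uncertainty
```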
NASA Astrophysics Data System (ADS)
Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.
2018-05-01
This paper presents statistical results, and their consolidation, obtained in a study of the security of various web-applications against cross-site request forgery attacks. Some of the results were obtained in a study carried out within the framework of certification for compliance with information security requirements. The paper provides the results of consolidating information about the attack and the protection measures currently used by the developers of web-applications. It presents results of the study that demonstrate several distributions: the distribution of identified vulnerabilities by developer type (Russian and foreign), the distribution of the security measures used in web-applications, the distribution of the identified vulnerabilities by programming language, and data on the number of security measures used in the studied web-applications. The results of the study show that in most cases the developers of web-applications do not pay due attention to protection against cross-site request forgery attacks. The authors give recommendations to developers that are planning to undergo a certification process for their software applications.
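As background for the protection measures discussed above, here is a minimal, hedged sketch of one widely used defense, the synchronizer token pattern; the function names and the dict-backed session are illustrative, and production frameworks ship their own vetted implementations.

```python
import hmac
import secrets

def issue_token(session):
    """Bind a fresh random token to the server-side session."""
    session["csrf_token"] = secrets.token_urlsafe(32)
    return session["csrf_token"]

def verify_token(session, submitted):
    """Reject state-changing requests whose token does not match."""
    expected = session.get("csrf_token", "")
    return bool(submitted) and hmac.compare_digest(expected, submitted)

session = {}
tok = issue_token(session)
assert verify_token(session, tok)            # legitimate form submission
assert not verify_token(session, "forged")   # cross-site forged request
```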
Advanced algorithms for distributed fusion
NASA Astrophysics Data System (ADS)
Gelfand, A.; Smith, C.; Colony, M.; Bowman, C.; Pei, R.; Huynh, T.; Brown, C.
2008-03-01
The US Military has been undergoing a radical transition from a traditional "platform-centric" force to one capable of performing in a "Network-Centric" environment. This transformation will place all of the data needed to efficiently meet tactical and strategic goals at the warfighter's fingertips. With access to this information, the challenge of fusing data from across the battlespace into an operational picture for real-time Situational Awareness emerges. In such an environment, centralized fusion approaches will have limited application due to the constraints of real-time communications networks and computational resources. To overcome these limitations, we are developing a formalized architecture for fusion and track adjudication that allows the distribution of fusion processes over a dynamically created and managed information network. This network will support the incorporation and utilization of low-level tracking information within the Army Distributed Common Ground System (DCGS-A) or Future Combat System (FCS). The framework is based on Bowman's Dual Node Network (DNN) architecture that utilizes a distributed network of interlaced fusion and track adjudication nodes to build and maintain a globally consistent picture across all assets.
Next Generation Multimedia Distributed Data Base Systems
NASA Technical Reports Server (NTRS)
Pendleton, Stuart E.
1997-01-01
The paradigm of client/server computing is changing. The model of a server running a monolithic application and supporting clients at the desktop is giving way to a different model that blurs the line between client and server. We are on the verge of plunging into the next generation of computing technology--distributed object-oriented computing. This is not only a change in requirements but a change in opportunities, and it requires a new way of thinking for Information System (IS) developers. The information system demands caused by global competition are requiring even more access to decision-making tools. Simply put, object-oriented technology has been developed to supersede the current design process of information systems, which is not capable of handling next-generation multimedia.
NASA Astrophysics Data System (ADS)
Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.
2013-12-01
A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect information to the target variable. MAD# uses two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers and submits serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster shows how MAD# reduces the execution time of the characterization of random fields using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (10 million would take approximately 1200 hours). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor thus reduced the processing time for uncertainty characterization by a factor of 20 (1200 hours reduced to 60 hours).
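A minimal sketch of the single-machine, multi-core profile mentioned above, assuming an embarrassingly parallel sweep of forward-model runs; run_forward_model is a hypothetical stand-in for a HYDRUS-style simulation call, not MAD#'s actual interface.

```python
from multiprocessing import Pool

def run_forward_model(params):
    # Placeholder for one forward simulation; returns a misfit score.
    return sum((p - 0.5) ** 2 for p in params)

if __name__ == "__main__":
    draws = [[i / 1000.0, (i % 7) / 7.0] for i in range(100_000)]
    with Pool(processes=8) as pool:          # one worker per local core
        misfits = pool.map(run_forward_model, draws, chunksize=1000)
    print(min(misfits))
```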
Colen, Hadewig B; Neef, Cees; Schuring, Roel W
2003-06-01
Worldwide patient safety has become a major social policy problem for healthcare organisations. As in other organisations, the patients in our hospital also suffer from an inadequate distribution process, as becomes clear from incident reports involving medication errors. Medisch Spectrum Twente is a top primary-care, clinical, teaching hospital. The hospital pharmacy takes care of 1070 internal beds and 1120 beds in an affiliated psychiatric hospital and nursing homes. In the beginning of 1999, our pharmacy group started a large interdisciplinary research project to develop a safe, effective and efficient drug distribution system by using systematic process redesign. The process redesign includes both organisational and technological components. This article describes the identification and verification of critical performance dimensions for the design of drug distribution processes in hospitals (phase 1 of the systematic process redesign of drug distribution). Based on reported errors and related causes, we suggested six generic performance domains. To assess the role of the performance dimensions, we used three approaches: flowcharts, interviews with stakeholders and review of the existing performance using time studies and medication error studies. We were able to set targets for costs, quality of information, responsiveness, employee satisfaction, and degree of innovation. We still have to establish what drug distribution system, in respect of quality and cost-effectiveness, represents the best and most cost-effective way of preventing medication errors. We intend to develop an evaluation model, using the critical performance dimensions as a starting point. This model can be used as a simulation template to compare different drug distribution concepts in order to define the differences in quality and cost-effectiveness.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos... Pennsylvania Ave., NW., Washington, DC 20460, ATTENTION: Asbestos Exemption. For information regarding the...
Bioactive phytochemicals in wheat: Extraction, analysis, processing, and functional properties
USDA-ARS?s Scientific Manuscript database
Whole wheat provides a rich source of bioactive phytochemicals namely, phenolic acids, carotenoids, tocopherols, alkylresorcinols, arabinoxylans, benzoxazinoids, phytosterols, and lignans. This review provides information on the distribution, extractability, analysis, and nutraceutical properties of...
Partial information decomposition as a spatiotemporal filter.
Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D
2011-09-01
Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
Dorazio, Robert; Karanth, K. Ullas
2017-01-01
Motivation: Several spatial capture-recapture (SCR) models have been developed to estimate animal abundance by analyzing the detections of individuals in a spatial array of traps. Most of these models do not use the actual dates and times of detection, even though this information is readily available when using continuous-time recorders, such as microphones or motion-activated cameras. Instead most SCR models either partition the period of trap operation into a set of subjectively chosen discrete intervals and ignore multiple detections of the same individual within each interval, or they simply use the frequency of detections during the period of trap operation and ignore the observed times of detection. Both practices make inefficient use of potentially important information in the data. Model and data analysis: We developed a hierarchical SCR model to estimate the spatial distribution and abundance of animals detected with continuous-time recorders. Our model includes two kinds of point processes: a spatial process to specify the distribution of latent activity centers of individuals within the region of sampling and a temporal process to specify temporal patterns in the detections of individuals. We illustrated this SCR model by analyzing spatial and temporal patterns evident in the camera-trap detections of tigers living in and around the Nagarahole Tiger Reserve in India. We also conducted a simulation study to examine the performance of our model when analyzing data sets of greater complexity than the tiger data. Benefits: Our approach provides three important benefits: First, it exploits all of the information in SCR data obtained using continuous-time recorders. Second, it is sufficiently versatile to allow the effects of both space use and behavior of animals to be specified as functions of covariates that vary over space and time. Third, it allows both the spatial distribution and abundance of individuals to be estimated, effectively providing a species distribution model, even in cases where spatial covariates of abundance are unknown or unavailable. We illustrated these benefits in the analysis of our data, which allowed us to quantify differences between nocturnal and diurnal activities of tigers and to estimate their spatial distribution and abundance across the study area. Our continuous-time SCR model allows an analyst to specify many of the ecological processes thought to be involved in the distribution, movement, and behavior of animals detected in a spatial trapping array of continuous-time recorders. We plan to extend this model to estimate the population dynamics of animals detected during multiple years of SCR surveys.
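A hedged sketch of the generative side of such a model is given below: latent activity centers scattered uniformly over a region, with Poisson detection times at each trap at a rate that decays with distance from the center (a half-normal form is assumed here); all parameter values are illustrative, not those fitted to the tiger data.

```python
import numpy as np

rng = np.random.default_rng(1)
region = 10.0                       # side of a square study area
traps = rng.uniform(0, region, size=(25, 2))
centers = rng.uniform(0, region, size=(40, 2))   # latent activity centers

lam0, sigma, T = 0.5, 1.2, 30.0     # baseline rate, spatial scale, days

for i, s in enumerate(centers[:3]):
    d2 = np.sum((traps - s) ** 2, axis=1)
    rate = lam0 * np.exp(-d2 / (2 * sigma**2))   # detections/day per trap
    counts = rng.poisson(rate * T)
    # Continuous detection times: uniform on [0, T] given the Poisson counts.
    times = [np.sort(rng.uniform(0, T, n)) for n in counts]
    print(f"individual {i}: {counts.sum()} detections "
          f"at {np.count_nonzero(counts)} traps")
```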
Suboptimal distributed control and estimation: application to a four coupled tanks system
NASA Astrophysics Data System (ADS)
Orihuela, Luis; Millán, Pablo; Vivas, Carlos; Rubio, Francisco R.
2016-06-01
The paper proposes an innovative estimation and control scheme that enables the distributed monitoring and control of large-scale processes. The proposed approach considers a discrete linear time-invariant process controlled by a network of agents that may both collect information about the evolution of the plant and apply control actions to drive its behaviour. The problem makes full sense when local observability/controllability is not assumed and the communication between agents can be exploited to reach system-wide goals. Additionally, to reduce agents bandwidth requirements and power consumption, an event-based communication policy is studied. The design procedure guarantees system stability, allowing the designer to trade-off performance, control effort and communication requirements. The obtained controllers and observers are implemented in a fully distributed fashion. To illustrate the performance of the proposed technique, experimental results on a quadruple-tank process are provided.
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
NASA Astrophysics Data System (ADS)
Liang, D.; Liu, X.
2017-12-01
3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the overall uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrated data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the overall uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle works well for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtained a posterior distribution that evaluates the overall uncertainty of the geological model and represents the combined impact of all the uncertain factors on its spatial structure. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and an approach to study the mechanism of uncertainty propagation in geological modeling.
Research of Ancient Architectures in Jin-Fen Area Based on GIS&BIM Technology
NASA Astrophysics Data System (ADS)
Jia, Jing; Zheng, Qiuhong; Gao, Huiying; Sun, Hai
2017-05-01
Shanxi Province holds the largest share of well-preserved ancient buildings in China, about 18418, of which 9053 are wood-frame structures. The value of applying BIM (Building Information Modeling) and GIS (Geographic Information System) is gradually being probed and demonstrated in the corresponding fields of ancient architecture: spatial distribution information management, routine maintenance and special conservation & restoration, and the evaluation and simulation of related disasters, such as earthquakes. The research objects are the ancient architectures in the Jin-Fen area that were first investigated by Sicheng LIANG and recorded in his "Chinese ancient architectures survey report", with further adjustments made through the authors' on-site investigation and literature search and collection. During this research, a Geodatabase of the spatial distribution of the research objects was established using GIS. A BIM components library for ancient buildings was formed by combining on-site investigation data with precedent classic works, such as "Yingzao Fashi", a treatise on architectural methods of the Song Dynasty, the "Yongle Encyclopedia", and "Gongcheng Zuofa Zeli", case collections of engineering practice by the Ministry of Construction of the Qing Dynasty. A building of Guangsheng temple in Hongtong county is selected as an example to elaborate the BIM model construction process based on this library. Building on the spatial distribution data, feature attribute data, 3D graphic information, and parametric building information model, an information management system for ancient architectures in the Jin-Fen area, utilizing GIS & BIM technology, could be constructed to support further research on seismic disaster analysis and seismic performance simulation.
Identifying patients for clinical trials using fuzzy ternary logic expressions on HL7 messages.
Majeed, Raphael W; Röhrig, Rainer
2011-01-01
Identifying eligible patients is one of the most critical parts of any clinical trial. The process of recruiting patients for the third phase of a clinical trial is usually done manually, by informing relevant physicians or putting notes on bulletin boards. While most of the necessary information is already available in electronic hospital information systems, the required data still have to be looked up individually. Most university hospitals make use of a dedicated communication server to distribute information from independent information systems, e.g. laboratory information systems, electronic health records, and surgery planning systems. Thus, a theoretical model is developed to formally describe the inclusion and exclusion criteria of each clinical trial using fuzzy ternary logic expressions. These expressions are then used to process HL7 messages from a communication server in order to identify eligible patients.
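A minimal sketch of the ternary-logic idea, assuming Kleene three-valued connectives with None standing for "unknown", so that a criterion can be evaluated while some HL7-derived values are still missing; the eligibility rule itself is a made-up example, not one from the paper.

```python
def t_and(a, b):
    # Kleene AND: False dominates, unknown propagates otherwise.
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def t_or(a, b):
    # Kleene OR: True dominates, unknown propagates otherwise.
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

# Example criterion: age >= 18 AND (creatinine normal OR on dialysis).
age_ok, creat_ok, dialysis = True, None, False
eligible = t_and(age_ok, t_or(creat_ok, dialysis))
print(eligible)   # None -> flag for manual review rather than exclude
```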
Parameter Variability and Distributional Assumptions in the Diffusion Model
ERIC Educational Resources Information Center
Ratcliff, Roger
2013-01-01
If the diffusion model (Ratcliff & McKoon, 2008) is to account for the relative speeds of correct responses and errors, it is necessary that the components of processing identified by the model vary across the trials of a task. In standard applications, the rate at which information is accumulated by the diffusion process is assumed to be normally…
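A toy simulation makes the idea concrete: in the sketch below, the drift rate is redrawn from a normal distribution on every trial of a simple diffusion race, which is one standard way such across-trial variability lets the model produce error responses slower than correct ones. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
a, z, dt, s = 0.1, 0.05, 0.001, 0.1   # boundary, start point, step, noise

def one_trial(v):
    # Accumulate evidence until the upper (a) or lower (0) boundary is hit.
    x, t = z, 0.0
    while 0.0 < x < a:
        x += v * dt + s * np.sqrt(dt) * rng.normal()
        t += dt
    return t, x >= a                   # (decision time, correct?)

drifts = rng.normal(0.25, 0.10, 2000)  # trial-to-trial drift variability
rts, correct = zip(*(one_trial(v) for v in drifts))
rts, correct = np.array(rts), np.array(correct)
# With drift variability, mean error RT tends to exceed mean correct RT.
print(rts[correct].mean(), rts[~correct].mean())
```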
Boiret, Mathieu; de Juan, Anna; Gorretta, Nathalie; Ginot, Yves-Michel; Roger, Jean-Michel
2015-01-25
In this work, Raman hyperspectral images and multivariate curve resolution-alternating least squares (MCR-ALS) are used to study the distribution of actives and excipients within a pharmaceutical drug product. This article is mainly focused on the distribution of a low-dose constituent. Different approaches are compared, using initially filtered or non-filtered data, or using a column-wise augmented dataset before starting the MCR-ALS iterative process including appended information on the low-dose component. In the studied formulation, magnesium stearate is used as a lubricant to improve powder flowability. With a theoretical concentration of 0.5% (w/w) in the drug product, the spectral variance contained in the data is weak. By using a principal component analysis (PCA) filtered dataset as a first step of the MCR-ALS approach, the lubricant information is lost in the non-explained variance and its associated distribution in the tablet cannot be highlighted. A sufficient number of components to generate the PCA noise-filtered matrix has to be used in order to keep the lubricant variability within the data set analyzed or, otherwise, work with the raw non-filtered data. Different models are built using an increasing number of components to perform the PCA reduction. It is shown that the magnesium stearate information can be extracted from a PCA model using a minimum of 20 components. In the last part, a column-wise augmented matrix, including a reference spectrum of the lubricant, is used before starting the MCR-ALS process. PCA reduction is performed on the augmented matrix, so the magnesium stearate contribution is included within the MCR-ALS calculations. By using an appropriate PCA reduction, with a sufficient number of components, or by using an augmented dataset including appended information on the low-dose component, the distributions of the two actives, the two main excipients and the low-dose lubricant are correctly recovered.
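The sketch below is a much-simplified NumPy rendition of the workflow described (PCA noise-filtering of the unfolded image matrix followed by a plain alternating-least-squares loop with non-negativity enforced by clipping); dimensions, initial estimates, and the random stand-in data are assumptions, and real MCR-ALS implementations add further constraints and convergence checks.

```python
import numpy as np

rng = np.random.default_rng(3)
D = np.abs(rng.normal(size=(500, 200)))      # stand-in for pixels x wavenumbers

def pca_filter(D, n_comp):
    # Rank-n_comp reconstruction: keeps only the leading variance directions.
    Dm = D - D.mean(axis=0)
    U, s, Vt = np.linalg.svd(Dm, full_matrices=False)
    return (U[:, :n_comp] * s[:n_comp]) @ Vt[:n_comp] + D.mean(axis=0)

# Keep enough components (>= 20 in the paper) so a 0.5% w/w constituent
# is not discarded with the "noise".
Df = pca_filter(D, 20)

K = 5                                        # number of resolved components
S = np.abs(rng.normal(size=(K, D.shape[1]))) # initial spectra estimates
for _ in range(50):                          # alternating least squares
    C = np.clip(Df @ np.linalg.pinv(S), 0, None)   # concentrations >= 0
    S = np.clip(np.linalg.pinv(C) @ Df, 0, None)   # spectra >= 0
print(C.shape, S.shape)
```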
A fault-tolerant information processing concept for space vehicles.
NASA Technical Reports Server (NTRS)
Hopkins, A. L., Jr.
1971-01-01
A distributed fault-tolerant information processing system is proposed, comprising a central multiprocessor, dedicated local processors, and multiplexed input-output buses connecting them together. The processors in the multiprocessor are duplicated for error detection, which is felt to be less expensive than using coded redundancy of comparable effectiveness. Error recovery is made possible by a triplicated scratchpad memory in each processor. The main multiprocessor memory uses replicated memory for error detection and correction. Local processors use any of three conventional redundancy techniques: voting, duplex pairs with backup, and duplex pairs in independent subsystems.
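As a concrete illustration of the voting redundancy mentioned for the local processors, a bit-wise two-of-three majority voter can mask any single faulty copy; the sketch below is generic, not the flight hardware's actual logic.

```python
def majority3(a: int, b: int, c: int) -> int:
    """Bit-wise 2-of-3 majority of three equal-width words."""
    return (a & b) | (a & c) | (b & c)

good = 0b1011_0110
assert majority3(good, good, good) == good
assert majority3(good, good, good ^ 0b0100) == good   # one corrupted copy
```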
Molecular-beam Studies of Primary Photochemical Processes
DOE R&D Accomplishments Database
Lee, Y. T.
1982-12-01
Application of the method of molecular-beam photofragmentation translational spectroscopy to the investigation of primary photochemical processes of polyatomic molecules is described. Examples will be given to illustrate how information concerning the energetics, dynamics, and mechanism of dissociation processes can be obtained from the precise measurements of angular and velocity distributions of products in an experiment in which a well-defined beam of molecules is crossed with a laser.
Flexible and fast: linguistic shortcut affects both shallow and deep conceptual processing.
Connell, Louise; Lynott, Dermot
2013-06-01
Previous research has shown that people use linguistic distributional information during conceptual processing, and that it is especially useful for shallow tasks and rapid responding. Using two conceptual combination tasks, we showed that this linguistic shortcut extends to the processing of novel stimuli, is used in both successful and unsuccessful conceptual processing, and is evident in both shallow and deep conceptual tasks. Specifically, as predicted by the ECCo theory of conceptual combination, people use the linguistic shortcut as a "quick-and-dirty" guide to whether the concepts are likely to combine into a coherent conceptual representation, in both shallow sensibility judgment and deep interpretation generation tasks. Linguistic distributional frequency predicts both the likelihood and the time course of rejecting a novel word compound as nonsensical or uninterpretable. However, it predicts the time course of successful processing only in shallow sensibility judgment, because the deeper conceptual process of interpretation generation does not allow the linguistic shortcut to suffice. Furthermore, the effects of linguistic distributional frequency are independent of any effects of conventional word frequency. We discuss the utility of the linguistic shortcut as a cognitive triage mechanism that can optimize processing in a limited-resource conceptual system.
NASA Astrophysics Data System (ADS)
Wang, C.; Rubin, Y.
2014-12-01
The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian-maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated by the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and Bayesian methods. The differences between single CPT samplings with a normal distribution and the simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and more informative estimations are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.
Extracting Message Inter-Departure Time Distributions from the Human Electroencephalogram
Mišić, Bratislav; Vakorin, Vasily A.; Kovačević, Nataša; Paus, Tomáš; McIntosh, Anthony R.
2011-01-01
The complex connectivity of the cerebral cortex is a topic of much study, yet the link between structure and function is still unclear. The processing capacity and throughput of information at individual brain regions remain an open question and one that could potentially bridge these two aspects of neural organization. The rate at which information is emitted from different nodes in the network and how this output process changes under different external conditions are general questions that are not unique to neuroscience, but are of interest in multiple classes of telecommunication networks. In the present study we show how some of these questions may be addressed using tools from telecommunications research. An important system statistic for modeling and performance evaluation of distributed communication systems is the time between successive departures of units of information at each node in the network. We describe a method to extract and fully characterize the distribution of such inter-departure times from the resting-state electroencephalogram (EEG). We show that inter-departure times are well fitted by the two-parameter Gamma distribution. Moreover, they are not spatially or neurophysiologically trivial and instead are regionally specific and sensitive to the presence of sensory input. In both the eyes-closed and eyes-open conditions, inter-departure time distributions were more dispersed over posterior parietal channels, close to regions which are known to have the densest structural connectivity. The biggest differences between the two conditions were observed at occipital sites, where inter-departure times were significantly more variable in the eyes-open condition. Together, these results suggest that message departure times are indicative of network traffic and capture a novel facet of neural activity. PMID:21673866
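The fitting step described above can be reproduced in outline with SciPy: a maximum-likelihood fit of the two-parameter Gamma distribution with the location pinned at zero, since inter-departure times are non-negative. The synthetic data here merely stand in for times extracted from an EEG channel.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Stand-in for inter-departure times extracted from one EEG channel.
idts = rng.gamma(shape=2.0, scale=0.015, size=5000)

shape, loc, scale = stats.gamma.fit(idts, floc=0)   # two-parameter fit
print(f"shape={shape:.2f}, scale={scale:.4f}")      # ~2.0 and ~0.015

# Dispersion comparisons across channels/conditions could then use, e.g.,
# the coefficient of variation, which is 1/sqrt(shape) for a Gamma law.
print("CV:", 1 / np.sqrt(shape))
```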
Qin, Changbo; Jia, Yangwen; Su, Z; Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen
2008-07-29
This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote sensing retrieval basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optical hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems.
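For orientation, the sketch below shows the generic EKF measurement update that such an assimilation scheme applies when a remote-sensing ET retrieval arrives as the observation; the two-element state and all matrices are illustrative toys, whereas the real system's state is spatially distributed over the whole basin.

```python
import numpy as np

x = np.array([0.30, 2.1])            # state: e.g. soil moisture, LAI proxy
P = np.diag([0.04, 0.25])            # state error covariance
H = np.array([[3.0, 0.5]])           # linearized observation operator
R = np.array([[0.36]])               # ET retrieval error variance
z = np.array([2.4])                  # SEBS-derived evapotranspiration

y = z - H @ x                                   # innovation
S = H @ P @ H.T + R                             # innovation covariance
K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
x = x + K @ y                                   # corrected state
P = (np.eye(2) - K @ H) @ P                     # updated covariance
print(x, np.diag(P))
```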
Marginally specified priors for non-parametric Bayesian estimation
Kessler, David C.; Hoff, Peter D.; Dunson, David B.
2014-01-01
Summary Prior specification for non-parametric Bayesian inference involves the difficult task of quantifying prior knowledge about a parameter of high, often infinite, dimension. A statistician is unlikely to have informed opinions about all aspects of such a parameter but will have real information about functionals of the parameter, such as the population mean or variance. The paper proposes a new framework for non-parametric Bayes inference in which the prior distribution for a possibly infinite dimensional parameter is decomposed into two parts: an informative prior on a finite set of functionals, and a non-parametric conditional prior for the parameter given the functionals. Such priors can be easily constructed from standard non-parametric prior distributions in common use and inherit the large support of the standard priors on which they are based. Additionally, posterior approximations under these informative priors can generally be made via minor adjustments to existing Markov chain approximation algorithms for standard non-parametric prior distributions. We illustrate the use of such priors in the context of multivariate density estimation using Dirichlet process mixture models, and in the modelling of high dimensional sparse contingency tables. PMID:25663813
2002-01-01
electronics, systems integration and information technology company.39 Northrop Grumman no longer seeks a position as a prime contractor/integrator of fixed...of the spares procurement and distribution processes. Finally, they recognize that excellence in Information Technology (IT) is a strategic advantage...business in export dollars, the industry has been forced to look for new markets as worldwide aircraft sales have dropped. Because the U.S. national
Applications integration in a hybrid cloud computing environment: modelling and platform
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang
2013-08-01
With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and in intra-enterprise ISs. A run-time platform is developed and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.
Signal processing in urodynamics: towards high definition urethral pressure profilometry.
Klünder, Mario; Sawodny, Oliver; Amend, Bastian; Ederer, Michael; Kelp, Alexandra; Sievert, Karl-Dietrich; Stenzl, Arnulf; Feuer, Ronny
2016-03-22
Urethral pressure profilometry (UPP) is used in the diagnosis of stress urinary incontinence (SUI) which is a significant medical, social, and economic problem. Low spatial pressure resolution, common occurrence of artifacts, and uncertainties in data location limit the diagnostic value of UPP. To overcome these limitations, high definition urethral pressure profilometry (HD-UPP) combining enhanced UPP hardware and signal processing algorithms has been developed. In this work, we present the different signal processing steps in HD-UPP and show experimental results from female minipigs. We use a special microtip catheter with high angular pressure resolution and an integrated inclination sensor. Signals from the catheter are filtered and time-correlated artifacts removed. A signal reconstruction algorithm processes pressure data into a detailed pressure image on the urethra's inside. Finally, the pressure distribution on the urethra's outside is calculated through deconvolution. A mathematical model of the urethra is contained in a point-spread-function (PSF) which is identified depending on geometric and material properties of the urethra. We additionally investigate the PSF's frequency response to determine the relevant frequency band for pressure information on the urinary sphincter. Experimental pressure data are spatially located and processed into high resolution pressure images. Artifacts are successfully removed from data without blurring other details. The pressure distribution on the urethra's outside is reconstructed and compared to the one on the inside. Finally, the pressure images are mapped onto the urethral geometry calculated from inclination and position data to provide an integrated image of pressure distribution, anatomical shape, and location. With its advanced sensing capabilities, the novel microtip catheter collects an unprecedented amount of urethral pressure data. Through sequential signal processing steps, physicians are provided with detailed information on the pressure distribution in and around the urethra. Therefore, HD-UPP overcomes many current limitations of conventional UPP and offers the opportunity to evaluate urethral structures, especially the sphincter, in context of the correct anatomical location. This could enable the development of focal therapy approaches in the treatment of SUI.
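A hedged sketch of the final reconstruction step described above: recovering an "outside" pressure distribution from a blurred, noisy "inside" measurement by FFT-based Wiener deconvolution, with a Gaussian stand-in for the urethral point-spread function; the PSF width, noise level, and SNR are assumptions.

```python
import numpy as np

n = 256
x = np.arange(n)
true_outside = np.exp(-0.5 * ((x - 128) / 6.0) ** 2)        # sphincter peak
psf = np.exp(-0.5 * ((x - n // 2) / 10.0) ** 2)
psf /= psf.sum()
psf = np.roll(psf, -n // 2)                                  # center at index 0

# Forward model: circular convolution with the PSF plus sensor noise.
measured = np.real(np.fft.ifft(np.fft.fft(true_outside) * np.fft.fft(psf)))
measured += np.random.default_rng(5).normal(0, 0.002, n)

Hf = np.fft.fft(psf)
snr = 1e3                                                    # assumed SNR
wiener = np.conj(Hf) / (np.abs(Hf) ** 2 + 1.0 / snr)         # regularized inverse
recovered = np.real(np.fft.ifft(np.fft.fft(measured) * wiener))
print(np.argmax(recovered), np.argmax(true_outside))         # peaks align
```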
Seagrasses and Protective Criteria: A Review and Assessment of Research Status
WED scientists conducted a literature review of scientific knowledge of the two most broadly distributed U.S. seagrass species in order to inform the process of developing protective criteria for these important coastal resources.
Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin
2003-09-01
Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill the needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adapts the approach of "4+1" architectural view. In this new architecture, PACS and RIS become one while the user interaction can be automated by customized workflow process. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuevas, F.A.; Curilef, S., E-mail: scurilef@ucn.cl; Plastino, A.R., E-mail: arplastino@ugr.es
The spread of a wave-packet (or its deformation) is a very important topic in quantum mechanics. Understanding this phenomenon is relevant in connection with the study of diverse physical systems. In this paper we apply various 'spreading measures' to characterize the evolution of an initially localized wave-packet in a tight-binding lattice, with special emphasis on information-theoretical measures. We investigate the behavior of both the probability distribution associated with the wave packet and the concomitant probability current. Complexity measures based upon Renyi entropies appear to be particularly good descriptors of the details of the delocalization process. - Highlights: > Spread of a highly localized wave-packet in the tight-binding lattice. > Entropic and information-theoretical characterization is used to understand the delocalization. > The behavior of both the probability distribution and the concomitant probability current is investigated. > Renyi entropies appear to be good descriptors of the details of the delocalization process.
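A minimal numerical sketch of this setting, assuming a finite open chain diagonalized directly: a wave packet starts on one site, and Rényi entropies of |psi_n|^2 track the delocalization; chain length, hopping strength, and times are illustrative.

```python
import numpy as np

N, J = 201, 1.0
H = -J * (np.eye(N, k=1) + np.eye(N, k=-1))        # tight-binding Hamiltonian
evals, evecs = np.linalg.eigh(H)

psi0 = np.zeros(N, dtype=complex)
psi0[N // 2] = 1.0                                  # localized initial state

def renyi(p, q):
    p = p[p > 0]
    if q == 1.0:
        return -np.sum(p * np.log(p))               # Shannon limit q -> 1
    return np.log(np.sum(p ** q)) / (1.0 - q)

for t in (0.0, 5.0, 20.0):
    phase = np.exp(-1j * evals * t)                 # evolve in the eigenbasis
    psi_t = evecs @ (phase * (evecs.conj().T @ psi0))
    p = np.abs(psi_t) ** 2
    print(f"t={t:5.1f}  R1={renyi(p, 1.0):.3f}  R2={renyi(p, 2.0):.3f}")
```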
Hoelzer, Simon; Schweiger, Ralf K; Rieger, Joerg; Meyer, Michael
2006-01-01
The organizational structures of web contents and electronic information resources must adapt to the demands of a growing volume of information and user requirements. Otherwise the information society will be threatened by disinformation. The biomedical sciences are especially vulnerable in this regard, since they are strongly oriented toward text-based knowledge sources. Here sustainable improvement can only be achieved by using a comprehensive, integrated approach that not only includes data management but also specifically incorporates the editorial processes, including structuring information sources and publication. The technical resources needed to effectively master these tasks are already available in the form of the data standards and tools of the Semantic Web. They include Rich Site Summaries (RSS), which have become an established means of distributing and syndicating conventional news messages and blogs. They can also provide access to the contents of the previously mentioned information sources, which are conventionally classified as 'deep web' content.
Network-Capable Application Process and Wireless Intelligent Sensors for ISHM
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray
2011-01-01
Intelligent sensor technology and systems are increasingly becoming attractive means to serve as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS) in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where the embedded memory is not available. An HTML-based user interface provides a visual tool to interface with those distributed sensors that a TEDS is associated with, to automate the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can achieve support for intelligent process in Integrated Systems Health Management (ISHM) for the purpose of monitoring, detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage. This invention enables wide-area sensing and employs numerous globally distributed sensing devices that observe the physical world through the existing sensor network. This innovation enables distributed storage, distributed processing, distributed intelligence, and the availability of DiaK (Data, Information, and Knowledge) to any element as needed. It also enables the simultaneous execution of multiple processes, and represents models that contribute to the determination of the condition and health of each element in the system. The NCAP (intelligent process) can configure data-collection and filtering processes in reaction to sensed data, allowing it to decide when and how to adapt collection and processing with regard to sophisticated analysis of data derived from multiple sensors. The user will be able to view the sensing device network as a single unit that supports a high-level query language. Each query would be able to operate over data collected from across the global sensor network just as a search query encompasses millions of Web pages. The sensor web can preserve ubiquitous information access between the querier and the queried data. Pervasive monitoring of the physical world raises significant data and privacy concerns. This innovation enables different authorities to control portions of the sensing infrastructure, and sensor service authors may wish to compose services across authority boundaries.
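Purely as an illustration of the TEDS idea described above, the sketch below models a simplified TEDS-like record with a lookup that falls back to a "virtual TEDS" file for legacy sensors; the field names are invented, and the real IEEE 1451 TEDS is a binary template, not JSON.

```python
import json
from dataclasses import dataclass
from typing import Optional

@dataclass
class Teds:
    manufacturer_id: int
    model: str
    serial: int
    units: str
    cal_slope: float      # engineering units per volt
    cal_offset: float

def load_teds(embedded: Optional[bytes], virtual_path: str) -> Teds:
    if embedded is not None:
        return Teds(**json.loads(embedded))          # from sensor memory
    with open(virtual_path) as f:                    # legacy sensor: file
        return Teds(**json.load(f))

def to_engineering(raw_volts: float, teds: Teds) -> float:
    # Apply the calibration carried by the (virtual) TEDS record.
    return teds.cal_slope * raw_volts + teds.cal_offset
```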
ERIC Educational Resources Information Center
Wilde, Carroll O.
The Poisson probability distribution is seen to provide a mathematical model from which useful information can be obtained in practical applications. The distribution and some situations to which it applies are studied, and ways to find answers to practical questions are noted. The unit includes exercises and a model exam, and provides answers to…
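A short worked example in the spirit of such a unit: Poisson probabilities P(X = k) = λ^k e^(−λ) / k! for an average of three arrivals per interval.

```python
from math import exp, factorial

lam = 3.0
pmf = [lam**k * exp(-lam) / factorial(k) for k in range(10)]
print(f"P(X=0)  = {pmf[0]:.4f}")          # 0.0498
print(f"P(X<=2) = {sum(pmf[:3]):.4f}")    # 0.4232
```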
ERIC Educational Resources Information Center
Clubb, Deborah, Ed.; Ligon, Polly C., Ed.
Colloquium participants were asked to make informed guesses about whether developing countries can grow and equitably distribute the food they need over the next decade, what the international development community should do to help in both production and distribution, and what role the United States should play in the development process. The 17…
A distributed reasoning engine ecosystem for semantic context-management in smart environments.
Almeida, Aitor; López-de-Ipiña, Diego
2012-01-01
To be able to react adequately, a smart environment must be aware of the context and its changes. Modeling the context allows applications to better understand it and to adapt to its changes. In order to do this, an appropriate formal representation method is needed. Ontologies have proven themselves to be one of the best tools for this. Semantic inference provides a powerful framework for reasoning over context data, but there are some problems with this approach. Inference over semantic context information can be cumbersome when working with a large amount of data. This situation has become more common in modern smart environments, where a lot of sensors and devices are available. In order to tackle this problem we have developed a mechanism to divide the context reasoning problem into smaller parts in order to reduce the inference time. In this paper we describe a distributed peer-to-peer agent architecture of context consumers and context providers. We explain how this inference sharing process works, partitioning the context information according to the interests of the agents, location, and a certainty factor. We also discuss the system architecture, analyzing the negotiation process between the agents. Finally we compare the distributed reasoning with the centralized approach, analyzing the situations in which each is more suitable.
NASA Astrophysics Data System (ADS)
Bednar, Earl; Drager, Steven L.
2007-04-01
The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard, computationally challenging problems. Our computationally challenging problems of interest include rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that quantum computers are difficult to realize due to poor scalability and the prevalence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators that run on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
Three-dimensional distribution of cortical synapses: a replicated point pattern-based analysis
Anton-Sanchez, Laura; Bielza, Concha; Merchán-Pérez, Angel; Rodríguez, José-Rodrigo; DeFelipe, Javier; Larrañaga, Pedro
2014-01-01
The biggest problem when analyzing the brain is that its synaptic connections are extremely complex. Generally, the billions of neurons making up the brain exchange information through two types of highly specialized structures: chemical synapses (the vast majority) and so-called gap junctions (a substrate of one class of electrical synapse). Here we are interested in exploring the three-dimensional spatial distribution of chemical synapses in the cerebral cortex. Recent research has showed that the three-dimensional spatial distribution of synapses in layer III of the neocortex can be modeled by a random sequential adsorption (RSA) point process, i.e., synapses are distributed in space almost randomly, with the only constraint that they cannot overlap. In this study we hypothesize that RSA processes can also explain the distribution of synapses in all cortical layers. We also investigate whether there are differences in both the synaptic density and spatial distribution of synapses between layers. Using combined focused ion beam milling and scanning electron microscopy (FIB/SEM), we obtained three-dimensional samples from the six layers of the rat somatosensory cortex and identified and reconstructed the synaptic junctions. A total volume of tissue of approximately 4500μm3 and around 4000 synapses from three different animals were analyzed. Different samples, layers and/or animals were aggregated and compared using RSA replicated spatial point processes. The results showed no significant differences in the synaptic distribution across the different rats used in the study. We found that RSA processes described the spatial distribution of synapses in all samples of each layer. We also found that the synaptic distribution in layers II to VI conforms to a common underlying RSA process with different densities per layer. Interestingly, the results showed that synapses in layer I had a slightly different spatial distribution from the other layers. PMID:25206325
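A hedged sketch of the RSA null model described above: fixed-radius spheres placed at uniformly random positions, with any placement that would overlap an already-accepted sphere rejected; the box size, radius, and attempt budget are illustrative rather than the paper's values.

```python
import numpy as np

rng = np.random.default_rng(6)
box, r = 10.0, 0.35                      # cube side (um), synapse radius (um)
accepted = []
for _ in range(20000):                   # placement attempts
    c = rng.uniform(r, box - r, size=3)
    # Random sequential adsorption: keep only non-overlapping placements.
    if all(np.sum((c - a) ** 2) > (2 * r) ** 2 for a in accepted):
        accepted.append(c)

density = len(accepted) / box**3
print(f"{len(accepted)} synapses placed, {density:.3f} per um^3")
```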
NASA Astrophysics Data System (ADS)
Prapavat, Viravuth; Schuetz, Rijk; Runge, Wolfram; Beuthan, Juergen; Mueller, Gerhard J.
1995-12-01
This paper presents in vitro studies using the scattered intensity distribution obtained by cw transillumination to examine the condition of rheumatic disorders of interphalangeal joints. Inflammation of joints, due to rheumatic diseases, leads to changes in the synovial membrane, synovia composition and content, and anatomic geometrical variations. Measurements have shown that these rheumatism-induced inflammation processes result in a variation in the optical properties of joint systems. With a scanning system, the interphalangeal joint is transilluminated with diode lasers (670 nm, 905 nm) perpendicular to the joint cavity. The entire distribution of the transmitted radiation intensity was detected with a CCD camera. As a function of the structure and optical properties of the transilluminated volume, we obtained distributions of scattered radiation that show characteristic variations in intensity and shape. Using signal and image processing procedures, we evaluated the measured scattered distributions with regard to their information weight, shape and scale features. Mathematical methods were used to find classification criteria to determine variations of the joint condition.
A universal quantum module for quantum communication, computation, and metrology
NASA Astrophysics Data System (ADS)
Hanks, Michael; Lo Piparo, Nicolò; Trupke, Michael; Schmiedmayer, Jorg; Munro, William J.; Nemoto, Kae
2017-08-01
In this work, we describe a simple module that could be ubiquitous for quantum-information-based applications. The basic module comprises a single NV⁻ center in diamond embedded in an optical cavity, where the cavity mediates interactions between photons and the electron spin (enabling entanglement distribution and efficient readout), while the nuclear spins constitute long-lived quantum memories capable of storing and processing quantum information. We discuss how a network of connected modules can be used for distributed metrology, communication and computation applications. Finally, we investigate the possible use of alternative diamond centers (SiV/GeV) within the module and illustrate potential advantages.
NASA Technical Reports Server (NTRS)
Sidik, S. M.
1972-01-01
The error variance of the process, prior multivariate normal distributions of the parameters of the models, and prior probabilities of the models being correct are assumed to be specified. A rule for termination of sampling is proposed. Upon termination, the model with the largest posterior probability is chosen as correct. If sampling is not terminated, posterior probabilities of the models and posterior distributions of the parameters are computed. Each experiment was chosen to maximize the expected Kullback-Leibler information function. Monte Carlo simulation experiments were performed to investigate the large and small sample behavior of the sequential adaptive procedure.
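A hedged sketch of the sequential procedure outlined above, assuming Gaussian observations whose candidate models differ only in their means; the model set, noise level, prior, and 0.95 stopping probability are illustrative choices, not values from the report.

```python
import numpy as np
from scipy.stats import norm

def sequential_model_choice(data_stream, means, sigma, prior, stop_p=0.95):
    """After each observation, update the posterior probabilities of the
    candidate models; terminate sampling once one model exceeds stop_p."""
    post = np.asarray(prior, dtype=float)
    n = 0
    for y in data_stream:
        n += 1
        post = post * norm.pdf(y, loc=np.asarray(means), scale=sigma)
        post /= post.sum()
        if post.max() >= stop_p:
            break
    return int(post.argmax()), n, post

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=200)          # truth: the model with mean 0
print(sequential_model_choice(data, means=[0.0, 0.5], sigma=1.0,
                              prior=[0.5, 0.5]))
```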
NASA Astrophysics Data System (ADS)
Grenn, Michael W.
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. H_R may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on H_R, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in H_R and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from the current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimating ΔE.
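The abstract says quantum statistics are used to count the possibilities P for arranging R requirements among N quality levels. A minimal sketch of one such counting is below, assuming Bose-Einstein-style occupancy (P = C(R+N-1, R)) and entropy taken as log₂ of the count; both assumptions are ours for illustration, and the dissertation's exact counting may differ.

```python
from math import comb, log2

def requirements_entropy(R, N):
    """Entropy of R indistinguishable requirements distributed over N
    quality levels, counting arrangements Bose-Einstein style (assumed)."""
    P = comb(R + N - 1, R)   # number of possible arrangements
    return log2(P)

# Fewer reachable arrangements means lower entropy left to resolve.
print(requirements_entropy(100, 5))   # ~22.1 bits: early, disordered state
print(requirements_entropy(100, 2))   # ~6.7 bits: more constrained state
```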
Troshin, Petr V; Morris, Chris; Prince, Stephen M; Papiz, Miroslav Z
2008-12-01
Membrane Protein Structure Initiative (MPSI) exploits laboratory competencies to work collaboratively and distribute work among the different sites. This is possible as protein structure determination requires a series of steps, starting with target selection, through cloning, expression, purification, crystallization and finally structure determination. Distributed sites create a unique set of challenges for integrating and passing on information on the progress of targets. This role is played by the Protein Information Management System (PIMS), which is a laboratory information management system (LIMS), serving as a hub for MPSI, allowing collaborative structural proteomics to be carried out in a distributed fashion. It holds key information on the progress of cloning, expression, purification and crystallization of proteins. PIMS is employed to track the status of protein targets and to manage constructs, primers, experiments, protocols, sample locations and their detailed histories: thus playing a key role in MPSI data exchange. It also serves as the centre of a federation of interoperable information resources such as local laboratory information systems and international archival resources, like PDB or NCBI. During the challenging task of PIMS integration, within the MPSI, we discovered a number of prerequisites for successful PIMS integration. In this article we share our experiences and provide invaluable insights into the process of LIMS adaptation. This information should be of interest to partners who are thinking about using LIMS as a data centre for their collaborative efforts.
Representation control increases task efficiency in complex graphical representations.
Moritz, Julia; Meyerhoff, Hauke S; Meyer-Dernbecher, Claudia; Schwan, Stephan
2018-01-01
In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such a representation control (i.e. the users' option to decide how information should be displayed) in order to accomplish an information extraction task in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients. PMID:29698443
Distributed computer taxonomy based on O/S structure
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.
1985-01-01
The taxonomy considers the resource structure at the operating system level. It compares a communication based taxonomy with the new taxonomy to illustrate how the latter does a better job when related to the client's view of the distributed computer. The results illustrate the fundamental features and what is required to construct fully distributed processing systems. The problem of using network computers on the space station is addressed. A detailed discussion of the taxonomy is not given here. Information is given in the form of charts and diagrams that were used to illustrate a talk.
Fission meter and neutron detection using poisson distribution comparison
Rowland, Mark S; Snyderman, Neal J
2014-11-18
A neutron detector system and method for discriminating fissile material from non-fissile material, wherein a digital data acquisition unit collects data at a high rate and, in real time, processes large volumes of data directly into information that a first responder can use to discriminate materials. The system comprises counting neutrons from the unknown source and detecting excess grouped neutrons to identify fission in the unknown source. Comparison of the observed neutron count distribution with a Poisson distribution is performed to distinguish fissile material from non-fissile material.
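A minimal sketch of the Poisson comparison, assuming a gated counting experiment and an illustrative compound-Poisson stand-in for fission chains (all rates chosen arbitrarily): the variance-to-mean ratio of gate counts is ≈1 for a Poisson source and >1 when correlated fission neutrons arrive in groups.

```python
import numpy as np

def variance_to_mean(counts):
    """Excess-variance statistic: ~1 for a Poisson (non-fissile) source,
    >1 when neutrons arrive in correlated groups (fission chains)."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(0)
background = rng.poisson(4.0, size=10_000)        # uncorrelated source
chains = rng.poisson(1.5, size=10_000)            # fission chains per gate
fissile = np.array([rng.poisson(3.0, size=m).sum() for m in chains])
print(variance_to_mean(background))               # ~1.0
print(variance_to_mean(fissile))                  # ~4: excess grouped neutrons
```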
Hadoop-based implementation of processing medical diagnostic records for visual patient system
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo
2018-03-01
We introduced the Visual Patient (VP) concept and method to visually represent and index patient imaging diagnostic records (IDR) at last year's SPIE Medical Imaging conference (SPIE MI 2017); VP enables a doctor to review a large amount of IDR for a patient within a limited appointment slot. Here, we present a new approach to the data processing architecture of the VP system (VPS) that acquires, processes and stores various kinds of IDR to build a VP instance for each patient in a hospital environment, based on a Hadoop distributed processing structure. This system architecture, called the Medical Information Processing System (MIPS), combines the Hadoop batch processing architecture with the Storm stream processing architecture. MIPS implements parallel processing of various kinds of clinical data with high efficiency, drawing on disparate hospital information systems such as PACS, RIS, LIS and HIS.
2006-09-01
expected advancements in information technology and library science offer the best hope of resolving the above concerns. ... An EWA will be ... information technology and library science must be utilized to accomplish this. Some DOD research investment may be required to resolve DOD-specific ... distributed assessment process that exploits the documentation of all of the CEST issues, advances in information technology and library science, and the
Intra-enterprise telecommunication satellites
NASA Astrophysics Data System (ADS)
Henry, A. J.
1981-11-01
Information transfer in the mid-1980s is sketched. The use of geostationary satellites for the internal requirements of businesses is an important factor in the growth of information transfer. Protection of transferred information is achieved through encryption. The companies which use satellites are those whose telecommunication costs are already significant; who have large computing capabilities, including distributed data processing; who use national and international leased circuits; and whose establishments are dispersed. Uses include teleconferencing, voice and data transmission, and text and facsimile communication.
Sokhey, Taegh; Gaebler-Spira, Deborah; Kording, Konrad P.
2017-01-01
Background: It is important to understand the motor deficits of children with Cerebral Palsy (CP). Our understanding of this motor disorder can be enriched by computational models of motor control. One crucial stage in generating movement involves combining uncertain information from different sources, and deficits in this process could contribute to reduced motor function in children with CP. Healthy adults can integrate previously-learned information (prior) with incoming sensory information (likelihood) in a close-to-optimal way when estimating object location, consistent with the use of Bayesian statistics. However, there are few studies investigating how children with CP perform sensorimotor integration. We compare sensorimotor estimation in children with CP and age-matched controls using a model-based analysis to understand the process. Methods and findings: We examined Bayesian sensorimotor integration in children with CP, aged between 5 and 12 years old, with Gross Motor Function Classification System (GMFCS) levels 1–3, and compared their estimation behavior with that of age-matched typically-developing (TD) children. We used a simple sensorimotor estimation task which requires participants to combine probabilistic information from different sources: a likelihood distribution (current sensory information) with a prior distribution (learned target information). In order to examine sensorimotor integration, we quantified how participants weighted statistical information from the two sources (prior and likelihood) and compared this to the statistically optimal weighting. We found that the weighting of statistical information in children with CP was as statistically efficient as that of TD children. Conclusions: We conclude that Bayesian sensorimotor integration is not impaired in children with CP and, therefore, does not contribute to their motor deficits. Future research has the potential to enrich our understanding of motor disorders by investigating the stages of motor processing set out by computational models. Therapeutic interventions should exploit the ability of children with CP to use statistical information. PMID:29186196
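For Gaussian prior and likelihood, the close-to-optimal integration described above reduces to a precision-weighted average. A minimal sketch (all numbers illustrative):

```python
def bayes_estimate(x_obs, sigma_like, mu_prior, sigma_prior):
    """Precision-weighted combination of a learned prior and current
    sensory evidence -- the statistically optimal Gaussian estimate."""
    w_prior = sigma_like**2 / (sigma_like**2 + sigma_prior**2)
    return w_prior * mu_prior + (1.0 - w_prior) * x_obs

# Noisier sensory information (larger sigma_like) shifts the estimate
# toward the learned target location.
print(bayes_estimate(x_obs=2.0, sigma_like=0.5, mu_prior=0.0, sigma_prior=1.0))
print(bayes_estimate(x_obs=2.0, sigma_like=2.0, mu_prior=0.0, sigma_prior=1.0))
```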
Kannampallil, Thomas G; Franklin, Amy; Mishra, Rashmi; Almoosa, Khalid F; Cohen, Trevor; Patel, Vimla L
2013-01-01
Information in critical care environments is distributed across multiple sources, such as paper charts, electronic records, and support personnel. For decision-making tasks, physicians have to seek, gather, filter and organize information from various sources in a timely manner. The objective of this research is to characterize the nature of physicians' information seeking process, and the content and structure of clinical information retrieved during this process. Eight medical intensive care unit physicians provided a verbal think-aloud as they performed a clinical diagnosis task. Verbal descriptions of physicians' activities, sources of information they used, time spent on each information source, and interactions with other clinicians were captured for analysis. The data were analyzed using qualitative and quantitative approaches. We found that the information seeking process was exploratory and iterative and driven by the contextual organization of information. While there were no significant differences in the overall time spent on paper or electronic records, there was marginally greater relative information gain (i.e., more unique information retrieved per unit time) from electronic records (t(6)=1.89, p=0.1). Additionally, information retrieved from electronic records was at a higher level (i.e., observations and findings) in the knowledge structure than that from paper records, reflecting differences in the nature of knowledge utilization across resources. A process of local optimization drove the information seeking process: physicians utilized information that maximized their information gain even though it required significantly more cognitive effort. Implications for the design of health information technology solutions that seamlessly integrate information seeking activities within the workflow, such as enriching the clinical information space and supporting efficient clinical reasoning and decision-making, are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dejoie, Catherine; Tamura, Nobumichi; Kunz, Martin
2015-09-20
Archaeological artefacts are often heterogeneous materials where several phases coexist in a wide grain size distribution. Most of the time, retrieving structure information at the micrometre scale is of great importance for these materials. Particularly, the organization of different phases at the micrometre scale is closely related to optical or mechanical properties, manufacturing processes, functionalities in ancient times and long-term conservation. Between classic X-ray powder diffraction with a millimetre beam and transmission electron microscopy, a gap exists and structure and phase information at the micrometre scale are missing. Using a micrometre-size synchrotron X-ray beam, a hybrid approach combining both monochromatic powder micro-diffraction and Laue single-crystal micro-diffraction was deployed to obtain information from nanometre- and micrometre-size phases, respectively. Therefore providing a way to bridge the aforementioned gap, this unique methodology was applied to three different types of ancient materials that all show a strong heterogeneity. In Roman terra sigillata, the specific distribution of nanocrystalline hematite is mainly responsible for the deep-red tone of the slip, while the distribution of micrometre-size quartz in ceramic bodies reflects the change of manufacturing process between pre-sigillata and high-quality sigillata periods. In the second example, we investigated the modifications occurring in Neolithic and geological flints after a heating process. By separating the diffracted signal coming from the nano- and the micrometre scale, we observed a domain size increase for nanocrystalline quartz in geological flints and a relaxation of the residual strain in larger detritic quartz. In conclusion, through the study of a Roman iron nail, we showed that the carburation process to strengthen the steel was mainly a surface process that formed 10–20 µm size domains of single-crystal ferrite and nanocrystalline cementite.
Bayesian analysis of multimodal data and brain imaging
NASA Astrophysics Data System (ADS)
Assadi, Amir H.; Eghbalnia, Hamid; Backonja, Miroslav; Wakai, Ronald T.; Rutecki, Paul; Haughton, Victor
2000-06-01
It is often the case that information about a process can be obtained using a variety of methods. Each method is employed because of specific advantages over the competing alternatives. An example in medical neuro-imaging is the choice between fMRI and MEG modes, where fMRI can provide high spatial resolution in comparison to the superior temporal resolution of MEG. The combination of data from varying modes provides the opportunity to infer results that may not be possible by means of any one mode alone. We discuss a Bayesian and learning theoretic framework for enhanced feature extraction that is particularly suited to multi-modal investigations of massive data sets from multiple experiments. In the following Bayesian approach, acquired knowledge (information) regarding various aspects of the process is directly incorporated into the formulation. This information can come from a variety of sources. In our case, it represents statistical information obtained from other modes of data collection. The information is used to train a learning machine to estimate a probability distribution, which is used in turn by a second machine as a prior, in order to produce a more refined estimation of the distribution of events. The computational demand of the algorithm is handled by a proposed distributed parallel implementation on a cluster of workstations that can be scaled to address real-time needs if required. We provide a simulation of these methods on a set of synthetically generated MEG and EEG data. We show how spatial and temporal resolutions improve by using prior distributions. The method on fMRI signals permits one to construct the probability distribution of the non-linear hemodynamics of the human brain (real data). These computational results are in agreement with biologically based measurements of other labs, as reported to us by researchers from the UK. We also provide preliminary analysis involving multi-electrode cortical recording that accompanies behavioral data in pain experiments on freely moving mice subjected to moderate heat delivered by an electric bulb. Summary of new or breakthrough ideas: (1) A new method to estimate the probability distribution for measurement of the nonlinear hemodynamics of the brain from multi-modal neuronal data; this is the first time that such an idea has been tried, to our knowledge. (2) A breakthrough improvement in the time resolution of fMRI signals using (1) above.
Microelectromechanical Systems
NASA Technical Reports Server (NTRS)
Gabriel, Kaigham J.
1995-01-01
Micro-electromechanical systems (MEMS) is an enabling technology that merges computation and communication with sensing and actuation to change the way people and machines interact with the physical world. MEMS is a manufacturing technology that will impact widespread applications including: miniature inertial measurement units for competent munitions and personal navigation; distributed unattended sensors; mass data storage devices; miniature analytical instruments; embedded pressure sensors; non-invasive biomedical sensors; fiber-optics components and networks; distributed aerodynamic control; and on-demand structural strength. The long term goal of ARPA's MEMS program is to merge information processing with sensing and actuation to realize new systems and strategies for both perceiving and controlling systems, processes, and the environment. The MEMS program has three major thrusts: advanced devices and processes, system design, and infrastructure.
Mobile agent location in distributed environments
NASA Astrophysics Data System (ADS)
Fountoukis, S. G.; Argyropoulos, I. P.
2012-12-01
An agent is a small program acting on behalf of a user or an application which plays the role of a user. Artificial intelligence can be encapsulated in agents so that they can be capable of both behaving autonomously and showing an elementary decision ability regarding movement and some specific actions. Therefore they are often called autonomous mobile agents. In a distributed system, they can move themselves from one processing node to another through the interconnecting network infrastructure. Their purpose is to collect useful information and to carry it back to their user. Also, agents are used to start, monitor and stop processes running on the individual interconnected processing nodes of computer cluster systems. An agent has a unique id to discriminate itself from other agents and a current position. The position can be expressed as the address of the processing node which currently hosts the agent. Very often, it is necessary for a user, a processing node or another agent to know the current position of an agent in a distributed system. Several procedures and algorithms have been proposed for the purpose of position location of mobile agents. The most basic of all employs a fixed computing node, which acts as agent position repository, receiving messages from all the moving agents and keeping records of their current positions. The fixed node, responds to position queries and informs users, other nodes and other agents about the position of an agent. Herein, a model is proposed that considers pairs and triples of agents instead of single ones. A location method, which is investigated in this paper, attempts to exploit this model.
Analysis of Distribution of Vector-Borne Diseases Using Geographic Information Systems.
Nihei, Naoko
2017-01-01
The distribution of vector-borne diseases is changing on a global scale owing to issues involving natural environments, socioeconomic conditions and border disputes among others. Geographic information systems (GIS) provide an important method of establishing a prompt and precise understanding of local data on disease outbreaks, from which disease eradication programs can be established. Having first defined GIS as a combination of GPS, RS and GIS, we showed the processes through which these technologies were being introduced into our research. GIS-derived geographical information attributes were interpreted in terms of point, area, line, spatial epidemiology, risk and development for generating the vector dynamic models associated with the spread of the disease. The need for interdisciplinary scientific and administrative collaboration in the use of GIS to control infectious diseases is highly warranted.
40 CFR 763.179 - Confidential business information claims.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) TOXIC SUBSTANCES CONTROL ACT ASBESTOS Prohibition of the Manufacture, Importation, Processing, and Distribution in Commerce of Certain Asbestos-Containing Products; Labeling Requirements § 763.179 Confidential... asbestos on human health and the environment? If your answer is yes, explain. ...
Replication in Mobile Environments
2007-12-01
Data Replication Over Disadvantaged ... Communication, Information Processing, and Ergonomics (KIE). What is the Problem? Data replication among distributed databases occurring over disadvantaged
NASA Astrophysics Data System (ADS)
Terahama, Yukinori; Takahashi, Yoshiyasu; Suzuki, Shigeru; Kinukawa, Hiroshi
In recent years, maintaining corporate soundness and complying with the law and corporate ethics have become more significant in the insurance industry, life insurance included. At the same time, the division of production and distribution is increasing, so the problem of compliance at agencies is growing more significant. We propose a contract procedure for agencies that separates product explanation from the application process in order to protect personal information. The proposed procedure protects the personal information of the contractor and supports compliance for contracts by means of background-texture watermarks and a redactable signature. We have developed a prototype system of the solution to check its feasibility.
Towards a service bus for distributed manufacturing
NASA Astrophysics Data System (ADS)
Delgado-Gomes, Vasco; Oliveira-Lima, José A.; Martins, João F.; Jardim-Gonçalves, Ricardo
2013-10-01
The electronic exchange of data between industrial equipment and the manufacturing and information systems of companies is becoming increasingly important with the current trend of shrinking product life cycles, a wide range of diversified products, and the need to answer the specific needs of each consumer. In this context, the quality, time and costs involved in integrating information over a company's internal processes, and in the interaction of these processes with customers, suppliers and other business partners, are in many sectors far beyond what current technology and communications solutions enable. This paper presents a communication infrastructure to integrate several companies from different sectors of the supply chain, allowing them to exchange their heterogeneous information using a data model composed of different standards.
Automated information and control complex of hydro-gas endogenous mine processes
NASA Astrophysics Data System (ADS)
Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.
2017-09-01
An automated information and control complex designed to prevent accidents related to the aerological situation in underground workings, account for individual devices issued and handed in, transmit and display measurement data, and form preemptive solutions is considered. Examples are given for the automated workplace of an air-and-gas control operator using individual means. Statistical characteristics of field data characterizing the aerological situation in the mine are obtained. These studies of statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas control points. An adaptive (multivariant) algorithm for processing measurement information from continuous multidimensional quantities and influencing factors has been developed.
Experimental extraction of an entangled photon pair from two identically decohered pairs.
Yamamoto, Takashi; Koashi, Masato; Ozdemir, Sahin Kaya; Imoto, Nobuyuki
2003-01-23
Entanglement is considered to be one of the most important resources in quantum information processing schemes, including teleportation, dense coding and entanglement-based quantum key distribution. Because entanglement cannot be generated by classical communication between distant parties, distribution of entangled particles between them is necessary. During the distribution process, entanglement between the particles is degraded by the decoherence and dissipation processes that result from unavoidable coupling with the environment. Entanglement distillation and concentration schemes are therefore needed to extract pairs with a higher degree of entanglement from these less-entangled pairs; this is accomplished using local operations and classical communication. Here we report an experimental demonstration of extraction of a polarization-entangled photon pair from two decohered photon pairs. Two polarization-entangled photon pairs are generated by spontaneous parametric down-conversion and then distributed through a channel that induces identical phase fluctuations to both pairs; this ensures that no entanglement is available as long as each pair is manipulated individually. Then, through collective local operations and classical communication we extract from the two decohered pairs a photon pair that is observed to be polarization-entangled.
Gossip and Distributed Kalman Filtering: Weak Consensus Under Weak Detectability
NASA Astrophysics Data System (ADS)
Kar, Soummya; Moura, José M. F.
2011-04-01
The paper presents the gossip interactive Kalman filter (GIKF) for distributed Kalman filtering in networked systems and sensor networks, where inter-sensor communication and observations occur at the same time-scale. The communication among sensors is random; each sensor occasionally exchanges its filtering state information with a neighbor depending on the availability of the appropriate network link. We show that under a weak distributed detectability condition: 1) the GIKF error process remains stochastically bounded, irrespective of the instability properties of the random process dynamics; and 2) the network achieves weak consensus, i.e., the conditional estimation error covariance at a (uniformly) randomly selected sensor converges in distribution to a unique invariant measure on the space of positive semi-definite matrices (independent of the initial state). To prove these results, we interpret the filtered states (estimates and error covariances) at each node in the GIKF as stochastic particles with local interactions. We analyze the asymptotic properties of the error process by studying the associated switched (random) Riccati equation as a random dynamical system, with the switching dictated by a non-stationary Markov chain on the network graph.
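A toy scalar sketch of the gossip mechanism, assuming two sensors, mildly unstable dynamics, and only sensor 0 able to observe (all constants are illustrative, and this simplifies the GIKF considerably): random state swaps let the non-observing sensor inherit filtered information, keeping its error variance bounded.

```python
import numpy as np

rng = np.random.default_rng(1)
a, q, r = 1.02, 0.1, 0.5       # unstable dynamics, process/observation noise
x = 0.0                        # true scalar state
est = np.zeros(2)              # per-sensor state estimates
P = np.ones(2)                 # per-sensor conditional error variances

for t in range(500):
    x = a * x + rng.normal(0.0, np.sqrt(q))
    for i in range(2):
        est[i], P[i] = a * est[i], a * a * P[i] + q      # predict
        if i == 0:                                       # weak detectability:
            z = x + rng.normal(0.0, np.sqrt(r))          # only sensor 0 observes
            K = P[i] / (P[i] + r)
            est[i] += K * (z - est[i])                   # update
            P[i] *= 1.0 - K
    if rng.random() < 0.3:                               # random gossip link:
        est, P = est[::-1].copy(), P[::-1].copy()        # swap filtering states

print("error variances:", P)   # both remain bounded thanks to the gossip swaps
```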
NASA Astrophysics Data System (ADS)
KIM, J.; Bastidas, L. A.
2011-12-01
We evaluate, calibrate and diagnose the performance of the National Weather Service RDHM distributed model over the Durango River Basin in Colorado, simultaneously using in situ and remotely sensed information from different discharge gaging stations (USGS), in situ snow cover (SCV) and snow water equivalent (SWE) information from several SNOTEL sites, and snow information distributed over the catchment from remote sensing (NOAA-NASA). In the process of evaluation we attempt to establish, by calibration, the optimal degree of parameter distribution over the catchment. A multi-criteria approach based on traditional measures (RMSE) and similarity-based pattern comparisons using the Hausdorff and Earth Mover's Distance (EMD) approaches is used for the overall evaluation of the model performance. These pattern-based approaches (shape matching) are found to be extremely relevant for accounting for the relatively large degree of inaccuracy in the remotely sensed SWE (judged inaccurate in terms of the value but reliable in terms of the distribution pattern) and the high reliability of the SCV (a yes/no situation), while at the same time allowing for an evaluation that quantifies the accuracy of the model over the entire catchment considering the different types of observations. The Hausdorff norm, due to its intrinsically multi-dimensional nature, allows for the incorporation of variables such as the terrain elevation among the evaluation variables. The EMD, because of its extremely high computational burden, requires mapping the set of evaluation variables into a two-dimensional matrix for computation.
S.R. Pezeshki; R.D. DeLaune
2000-01-01
Characterization of hydric soils and the relationship between soil oxidation-reduction processes and wetland plant distribution are critical to the identification and delineation of wetlands and to our understanding of soil processes and plant functioning in wetland ecosystems. However, the information on the relationship between flood response of wetland plants and...
Atomic Energy Division plant capacity manual Savannah River Plant and Dana Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1960-05-01
This report is a summary of plant service capacities at the Savannah River Plant and the Dana Plant. The report is divided into different areas of the plants, and includes information on services such as process steam, clarified water, deionized water, electric distribution systems, electric generating capacity, filtered water, process water, river water, well water, etc.
Putting It All Together: A Unified Account of Word Recognition and Reaction-Time Distributions
ERIC Educational Resources Information Center
Norris, Dennis
2009-01-01
R. Ratcliff, P. Gomez, and G. McKoon (2004) suggested much of what goes on in lexical decision is attributable to decision processes and may not be particularly informative about word recognition. They proposed that lexical decision should be characterized by a decision process, taking the form of a drift-diffusion model (R. Ratcliff, 1978), that…
Near-surface wind speed statistical distribution: comparison between ECMWF System 4 and ERA-Interim
NASA Astrophysics Data System (ADS)
Marcos, Raül; Gonzalez-Reviriego, Nube; Torralba, Verónica; Cortesi, Nicola; Young, Doo; Doblas-Reyes, Francisco J.
2017-04-01
In the framework of seasonal forecast verification, knowing whether the characteristics of the climatological wind speed distribution simulated by the forecasting systems are similar to the observed ones is essential to guide the subsequent process of bias adjustment. To shed some light on this topic, this work assesses the properties of the statistical distributions of 10 m wind speed from both the ERA-Interim reanalysis and seasonal forecasts of ECMWF System 4. The 10 m wind speed distribution has been characterized in terms of the four main moments of the probability distribution (mean, standard deviation, skewness and kurtosis) together with the coefficient of variation and the Shapiro-Wilk goodness-of-fit test, allowing the identification of regions with higher wind variability and non-Gaussian behaviour at monthly time-scales. The comparison of the predicted and observed 10 m wind speed distributions has also been assessed considering both inter-annual and intra-seasonal variability. Such a comparison is important in both the climate research and climate services communities because it provides useful climate information for decision-making processes and wind industry applications.
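A minimal sketch of the distribution characterization above, assuming a NumPy/SciPy stack and a synthetic Weibull wind-speed sample standing in for real ERA-Interim or System 4 data:

```python
import numpy as np
from scipy import stats

def characterize(ws):
    """Four moments, coefficient of variation, and Shapiro-Wilk normality
    test for a sample of 10 m wind speeds."""
    ws = np.asarray(ws, dtype=float)
    return {
        "mean": ws.mean(),
        "std": ws.std(ddof=1),
        "skewness": stats.skew(ws),
        "kurtosis": stats.kurtosis(ws),          # excess kurtosis (Fisher)
        "coef_variation": ws.std(ddof=1) / ws.mean(),
        "shapiro_p": stats.shapiro(ws).pvalue,   # small p: non-Gaussian
    }

sample = stats.weibull_min.rvs(2.0, scale=7.0, size=300, random_state=0)
print(characterize(sample))
```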
Texas Solar Collaboration Action Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winland, Chris
2013-02-14
Texas Solar Collaboration Permitting and Interconnection Process Improvement Action Plan. San Antonio-specific; Investigate feasibility of using electronic signatures; Investigate feasibility of enabling other online permitting processes (e.g., commercial); Assess need for future document management and workflow/notification IT improvements; Update Information Bulletin 153 regarding City requirements and processes for PV; Educate contractors and public on CPS Energy’s new 2013 solar program processes; Continue to discuss “downtown grid” interconnection issues and identify potential solutions; Consider renaming Distributed Energy Resources (DER); and Continue to participate in collaborative actions.
Karapiperis, Christos; Kempf, Stefan J; Quintens, Roel; Azimzadeh, Omid; Vidal, Victoria Linares; Pazzaglia, Simonetta; Bazyka, Dimitry; Mastroberardino, Pier G; Scouras, Zacharias G; Tapio, Soile; Benotmane, Mohammed Abderrafi; Ouzounis, Christos A
2016-05-11
The underlying molecular processes representing stress responses to low-dose ionising radiation (LDIR) in mammals are just beginning to be understood. In particular, LDIR effects on the brain and their possible association with neurodegenerative disease are currently being explored using omics technologies. We describe a light-weight approach for the storage, analysis and distribution of relevant LDIR omics datasets. The data integration platform, called BRIDE, contains information from the literature as well as experimental information from transcriptomics and proteomics studies. It deploys a hybrid, distributed solution using both local storage and cloud technology. BRIDE can act as a knowledge broker for LDIR researchers, to facilitate molecular research on the systems biology of LDIR response in mammals. Its flexible design can capture a range of experimental information for genomics, epigenomics, transcriptomics, and proteomics. The data collection is available at:
Integrated resource scheduling in a distributed scheduling environment
NASA Technical Reports Server (NTRS)
Zoch, David; Hall, Gardiner
1988-01-01
The Space Station era presents a highly-complex multi-mission planning and scheduling environment exercised over a highly distributed system. In order to automate the scheduling process, customers require a mechanism for communicating their scheduling requirements to NASA. A request language that a remotely-located customer can use to specify his scheduling requirements to a NASA scheduler, thus automating the customer-scheduler interface, is described. This notation, Flexible Envelope-Request Notation (FERN), allows the user to completely specify his scheduling requirements such as resource usage, temporal constraints, and scheduling preferences and options. The FERN also contains mechanisms for representing schedule and resource availability information, which are used in the inter-scheduler inconsistency resolution process. Additionally, a scheduler is described that can accept these requests, process them, generate schedules, and return schedule and resource availability information to the requester. The Request-Oriented Scheduling Engine (ROSE) was designed to function either as an independent scheduler or as a scheduling element in a network of schedulers. When used in a network of schedulers, each ROSE communicates schedule and resource usage information to other schedulers via the FERN notation, enabling inconsistencies to be resolved between schedulers. Individual ROSE schedules are created by viewing the problem as a constraint satisfaction problem with a heuristically guided search strategy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-07-01
The report is divided into the following sections: (1) Introduction; (2) Conclusions and Recommendations; (3) Existing Conditions and Facilities for a Fuel Distribution Center; (4) Pacific Ocean Regional Tuna Fisheries and Resources; (5) Fishing Effort in the FSM EEZ 1992-1994; (6) Current Transshipping Operations in the Western Pacific Ocean; (7) Current and Probable Bunkering Practices of United States, Japanese, Korean, and Taiwanese Offshore-Based Vessels Operating in FSM and Adjacent Waters; (8) Shore-Based Fish Handling/Processing; (9) Fuels Forecast; (10) Fuel Supply, Storage and Distribution; (11) Cost Estimates; (12) Economic Evaluation of Fuel Supply, Storage and Distribution.
Model-checking techniques based on cumulative residuals.
Lin, D Y; Wei, L J; Ying, Z
2002-03-01
Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
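A hedged sketch of the cumulative-residual check, assuming a fitted model's residuals and a covariate are at hand; sign-randomization stands in here for the paper's zero-mean Gaussian-process simulation, so the p-value is illustrative rather than the published procedure.

```python
import numpy as np

def cumsum_process(x, resid):
    """Cumulative sum of residuals ordered by a covariate; under a
    correctly specified model this is approximately a zero-mean process."""
    x, resid = np.asarray(x), np.asarray(resid)
    return np.cumsum(resid[np.argsort(x)]) / np.sqrt(len(x))

def sup_test(x, resid, n_sim=1000, seed=0):
    """Compare sup|W| of the observed process with resampled realizations;
    a small return value suggests model misspecification."""
    rng = np.random.default_rng(seed)
    resid = np.asarray(resid)
    obs = np.abs(cumsum_process(x, resid)).max()
    hits = 0
    for _ in range(n_sim):
        w = rng.choice([-1.0, 1.0], size=len(resid)) * resid
        hits += np.abs(cumsum_process(x, w)).max() >= obs
    return hits / n_sim   # approximate p-value

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 200)
resid = rng.normal(0, 1, 200)     # residuals from a well-specified fit
print(sup_test(x, resid))         # large p: no misspecification signal
```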
WPS mediation: An approach to process geospatial data on different computing backends
NASA Astrophysics Data System (ADS)
Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas
2012-10-01
The OGC Web Processing Service (WPS) specification allows generating information by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities and various problems emerge when trying to use them in data and computing-intensive domains such as environmental sciences. These problems are usually not or only partially solvable using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed computing (e.g., Grids, Clouds) related methods and technologies. Currently, there is no commonly agreed approach to grid-enable WPS. No implementation allows one to seamlessly execute a geoprocessing calculation following user requirements on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept by mediating different geospatial and Grid software packages, and by proposing an extension of WPS specification through two optional parameters. The applicability of this approach will be demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits, and issues that need to be further investigated to improve performances.
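The NDVI process used to demonstrate the mediated WPS reduces, per pixel, to simple band arithmetic; a minimal sketch of that computation is below (the band arrays are assumed NumPy arrays, e.g. read from a raster — this is the calculation a WPS process would wrap, not the WPS protocol itself).

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel Normalized Difference Vegetation Index:
    (NIR - RED) / (NIR + RED)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)   # eps guards zero denominators

# Toy 2x2 scene: vegetation reflects strongly in the near-infrared.
print(ndvi([[0.5, 0.6], [0.2, 0.3]], [[0.1, 0.1], [0.2, 0.25]]))
```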
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1972-01-01
The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions which are adjusted to simulate the information flow being studied.
Psychophysiology of dissociated consciousness.
Bob, Petr
2014-01-01
Recent study of consciousness provides evidence that there is a limit of consciousness, which presents a barrier between conscious and unconscious processes. This barrier is likely manifested as a disturbance of the neural mechanisms of consciousness that, through distributed brain processing, attentional mechanisms and memory processes, make integrative conscious experience possible. According to recent findings, the level of conscious integration may change under certain conditions related to experimental cognitive manipulations, hypnosis, or stressful experiences, which can lead to dissociation of consciousness. In psychopathological research, the term dissociation was proposed by Pierre Janet to explain processes related to the splitting of consciousness due to traumatic events or during hypnosis. According to several recent findings, dissociation of consciousness is likely related to deficits in the global distribution of information and may lead to heightened levels of "neural complexity", which reflects brain integration or differentiation based on the number of independent neural processes in the brain and may be specifically related to various mental disorders.
Ontologies and Information Systems: A Literature Survey
2011-06-01
Science and Technology Organisation DSTO–TN–1002 ABSTRACT An ontology captures in a computer-processable language the important concepts in a ... knowledge sharability, reusability and scalability, and that support collaborative and distributed construction of ontologies, the DOGMA and DILIGENT ... and assemble the received information). In the second stage, the designers determine how ontologies should be used in the process of adding
Unmanned Underwater Vehicle (UUV) Information Study
2014-11-28
Maritime Unmanned System; NATO: North Atlantic Treaty Organization ... Unmanned Aerial System; UDA: Underwater Domain Awareness; UNISIPS: Unified Sonar Image Processing System; USV: Unmanned Surface Vehicle; UUV: Unmanned Underwater ... data distribution to ashore systems, such as the delay, its impact and the benefits to the overall MDA and required metadata for efficient search and
Distributed heterogeneous inspecting system and its middleware-based solution.
Huang, Li-can; Wu, Zhao-hui; Pan, Yun-he
2003-01-01
There are many cases when an organization needs to monitor the data and operations of its supervised departments, especially departments which are not owned by the organization and are managed by their own information systems. A Distributed Heterogeneous Inspecting System (DHIS) is the system an organization uses to monitor its supervised departments by inspecting their information systems. In a DHIS, the inspected systems are generally distributed, heterogeneous, and constructed by different companies. A DHIS has three key processes: abstracting core data sets and core operation sets, collecting these sets, and inspecting the collected sets. In this paper, we present the concept and mathematical definition of DHIS, a metadata method for solving interoperability, a security strategy for data transfer, and a middleware-based solution for DHIS. We also describe an example of the inspecting system at Wenzhou Customs.
Rapid Decisions From Experience
Zeigenfuse, Matthew D.; Pleskac, Timothy J.; Liu, Taosheng
2014-01-01
In many everyday decisions, people quickly integrate noisy samples of information to form a preference among alternatives that offer uncertain rewards. Here, we investigated this decision process using the Flash Gambling Task (FGT), in which participants made a series of choices between a certain payoff and an uncertain alternative that produced a normal distribution of payoffs. For each choice, participants experienced the distribution of payoffs via rapid samples updated every 50 ms. We show that people can make these rapid decisions from experience and that the decision process is consistent with a sequential sampling process. Results also reveal a dissociation between these preferential decisions and equivalent perceptual decisions where participants had to determine which alternatives contained more dots on average. To account for this dissociation, we developed a sequential sampling rank-dependent utility model, which showed that participants in the FGT attended more to larger potential payoffs than participants in the perceptual task despite being given equivalent information. We discuss the implications of these findings in terms of computational models of preferential choice and a more complete understanding of experience-based decision making. PMID:24549141
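A minimal sequential-sampling sketch of an FGT-style trial, assuming evidence is the running sum of (sampled payoff − certain payoff) with a symmetric decision bound; the threshold and payoff parameters are illustrative, and the rank-dependent utility weighting from the paper's model is omitted.

```python
import numpy as np

def fgt_choice(certain, mu, sigma, threshold=20.0, max_samples=1000, seed=0):
    """Accumulate evidence from rapid payoff samples (one per 50 ms frame)
    until a bound is crossed; returns the choice and samples used."""
    rng = np.random.default_rng(seed)
    evidence = 0.0
    for t in range(1, max_samples + 1):
        evidence += rng.normal(mu, sigma) - certain   # momentary evidence
        if abs(evidence) >= threshold:
            return ("risky" if evidence > 0 else "certain"), t
    return ("risky" if evidence > 0 else "certain"), max_samples

# Risky mean above the certain payoff: drift favors the risky option.
print(fgt_choice(certain=4.0, mu=5.0, sigma=10.0))
```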
TERRA/MODIS Data Products and Data Management at the GES-DAAC
NASA Astrophysics Data System (ADS)
Sharma, A. K.; Ahmad, S.; Eaton, P.; Koziana, J.; Leptoukh, G.; Ouzounov, D.; Savtchenko, A.; Serafino, G.; Sikder, M.; Zhou, B.
2001-05-01
Since February 2000, the Earth Sciences Distributed Active Archive Center (GES-DAAC) at the NASA/Goddard Space Flight Center has been successfully ingesting, processing, archiving, and distributing the Moderate Resolution Imaging Spectroradiometer (MODIS) data. MODIS is the key instrument aboard the Terra satellite, viewing the entire Earth's surface every 1 to 2 days and acquiring data in 36 channels in the visible and infrared spectral bands (0.4 to 14.4 microns). Higher resolution (250 m, 500 m, and 1 km pixel) data are improving our understanding of global dynamics and processes occurring on the land, in the oceans, and in the lower atmosphere, and will play a vital role in the future development of validated, global, interactive Earth-system models. MODIS calibrated and uncalibrated radiances and geolocation products were released to the public in April 2000, and a suite of oceans products and an entire suite of atmospheric products were released by early January 2001. The suite of ocean products is grouped into three categories: Ocean Color, SST and Primary Productivity. The suite of atmospheric products includes Aerosol, Total Precipitable Water, Cloud Optical and Physical properties, Atmospheric Profiles and Cloud Mask. The MODIS Data Support Team (MDST) at the GES-DAAC has been supporting basic scientific research and assisting the Earth Science user community in accessing the scientific data and information. Support is also provided for data formats (HDF-EOS), information on visualization tools, documentation for data products, information on the scientific content of products, and metadata. Visit the MDST website at http://daac.gsfc.nasa.gov/CAMPAIGN_DOCS/MODIS/index.html The task of processing, archiving and distributing enormous volumes of MODIS data to users (more than 0.5 TB a day) has led to the development of a unique web-based GES DAAC Search and Order system (http://acdisx.gsfc.nasa.gov/data/), data handling software and tools, as well as an FTP site that contains samples of browse images and MODIS data products. This paper is intended to inform the user community about the data system and services available at the GES-DAAC in support of these information-rich data products. The MDST provides support to MODIS data users to access and process data and information for research, applications and educational purposes. This paper will present an overview of the MODIS data products released to the public, including the suite of atmosphere and oceans data products that can be ordered from the GES-DAAC. Different mechanisms for searching and ordering the data, determining data product sizes, the data distribution policy, the User Assistance System (UAS), and data subscription services will be described.
Advanced information processing system
NASA Technical Reports Server (NTRS)
Lala, J. H.
1984-01-01
Design and performance details of the advanced information processing system (AIPS) for fault and damage tolerant data processing on aircraft and spacecraft are presented. AIPS comprises several computers distributed throughout the vehicle and linked by a damage tolerant data bus. Most I/O functions are available to all the computers, which run in a TDMA mode. Each computer performs separate specific tasks in normal operation and assumes other tasks in degraded modes. Redundant software assures that all fault monitoring, logging and reporting are automated, together with control functions. Redundant duplex links and damage-spread limitation provide the fault tolerance. Details of an advanced design of a laboratory-scale proof-of-concept system are described, including functional operations.
Advanced information processing system for advanced launch system: Avionics architecture synthesis
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.
1991-01-01
The Advanced Information Processing System (AIPS) is a fault-tolerant distributed computer system architecture that was developed to meet the real time computational needs of advanced aerospace vehicles. One such vehicle is the Advanced Launch System (ALS) being developed jointly by NASA and the Department of Defense to launch heavy payloads into low earth orbit at one tenth the cost (per pound of payload) of the current launch vehicles. An avionics architecture that utilizes the AIPS hardware and software building blocks was synthesized for ALS. The AIPS for ALS architecture synthesis process starting with the ALS mission requirements and ending with an analysis of the candidate ALS avionics architecture is described.
Smart Networked Elements in Support of ISHM
NASA Technical Reports Server (NTRS)
Oostdyk, Rebecca; Mata, Carlos; Perotti, Jose M.
2008-01-01
At the core of ISHM is the ability to extract information and knowledge from raw data. Conventional data acquisition systems sample and convert physical measurements to engineering units, from which higher-level systems derive health information about processes and systems. Although health management is essential at the top level, there are considerable advantages to implementing health-related functions at the sensor level. Distributing processing to lower levels reduces bandwidth requirements, enhances data fusion, and improves the resolution for detection and isolation of failures in a system, subsystem, component, or process. The Smart Networked Element (SNE) has been developed to implement intelligent functions and algorithms at the sensor level in support of ISHM.
Telemedicine and distributed medical intelligence.
Warner, D; Tichenor, J M; Balch, D C
1996-01-01
Recent trends in health care informatics and telemedicine indicate that systems are being developed with a primary focus on technology and business, not on the process of medicine itself. The authors present a new model of health care information, distributed medical intelligence, which promotes the development of an integrative medical communication system addressing the process of providing expert medical knowledge to the point of need. The model incorporates audio, video, high-resolution still images, and virtual reality applications into an integrated medical communications network. Three components of the model (care portals, Docking Station, and the bridge) are described. The implementation of this model at the East Carolina University School of Medicine is also outlined.
NASA Astrophysics Data System (ADS)
Zhang, Yan; Wang, Xiaorui; Zhe Zhang, Yun
2018-07-01
By employing the different topological charges of a Laguerre–Gaussian beam as a qubit, we experimentally demonstrate a controlled-NOT (CNOT) gate with light beams carrying orbital angular momentum via a photonic band gap structure in a hot atomic ensemble. Through a degenerate four-wave mixing process, the spatial distribution of the CNOT gate including splitting and spatial shift can be affected by the Kerr nonlinear effect in multilevel atomic systems. Moreover, the intensity variations of the CNOT gate can be controlled by the relative phase modulation. This research can be useful for applications in quantum information processing.
Numerical modelling of distributed vibration sensor based on phase-sensitive OTDR
NASA Astrophysics Data System (ADS)
Masoudi, A.; Newson, T. P.
2017-04-01
A distributed vibration sensor based on phase-sensitive OTDR is numerically modeled. The advantage of modeling the building blocks of the sensor individually and then combining the blocks to analyse the behavior of the sensing system is discussed. It is shown that the numerical model can accurately imitate the response of the experimental setup to dynamic perturbations, using a signal processing procedure similar to that used to extract the phase information from the sensing setup.
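For readers unfamiliar with how such building-block models are assembled, the following minimal sketch simulates the coherent Rayleigh backscatter trace at the heart of a phase-sensitive OTDR. The fiber parameters, scatterer statistics, and perturbation model are illustrative assumptions, not the parameters of the modeled sensor.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed fiber and pulse parameters (illustrative only)
L = 1000.0              # fiber length, m
n_scat = 20000          # number of Rayleigh scattering centers
pulse = 10.0            # pulse width expressed as fiber length, m
wavelength = 1.55e-6
n_eff = 1.468
k = 2 * np.pi * n_eff / wavelength      # propagation constant

z = np.sort(rng.uniform(0, L, n_scat))  # scatterer positions
a = rng.rayleigh(1.0, n_scat)           # scatterer amplitudes
phi0 = rng.uniform(0, 2 * np.pi, n_scat)  # intrinsic phases

def trace(strain_at=None):
    """Coherent phi-OTDR intensity trace: at each resolution cell, sum the
    fields of all scatterers inside the pulse, including two-way path
    phase; a perturbation at z0 shifts the phase of everything beyond it."""
    phase = 2 * k * z + phi0
    if strain_at is not None:
        z0, dphi = strain_at
        phase = phase + np.where(z > z0, dphi, 0.0)
    cells = np.arange(0, L, pulse / 2)
    out = np.empty(len(cells))
    for i, zc in enumerate(cells):
        sel = (z >= zc) & (z < zc + pulse)
        out[i] = np.abs(np.sum(a[sel] * np.exp(1j * phase[sel]))) ** 2
    return out

baseline = trace()
perturbed = trace(strain_at=(500.0, 0.3))  # 0.3 rad phase step at 500 m
diff = perturbed - baseline                # the change localizes the event
print("largest change near:", 5.0 * np.argmax(np.abs(diff)), "m")
```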
Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum
NASA Astrophysics Data System (ADS)
Weitzel, Nils; Hense, Andreas; Ohlwein, Christian
2017-04-01
Spatio-temporal reconstructions of past climate are important for understanding the long-term behavior of the climate system and its sensitivity to forcing changes. Unfortunately, they are subject to large uncertainties, have to deal with a complex proxy-climate structure, and a physically reasonable interpolation between the sparse proxy observations is difficult. Bayesian Hierarchical Models (BHMs) are a class of statistical models well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and are combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often named the transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen to a Bayesian framework. In addition, we can include Gaussian-distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records, which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ~21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were performed in the PMIP3 project. The proxy data syntheses consist either of raw pollen data or of normally distributed climate data from preprocessed proxy records. Future extensions of our method include other proxy types (transfer functions), the implementation of other spatial interpolation techniques, the use of age uncertainties, and the extension to spatio-temporal reconstructions of the last deglaciation. Our work is part of the PalMod project funded by the German Federal Ministry of Education and Research (BMBF).
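To make the three-stage structure concrete, here is a deliberately tiny sketch for a single grid cell in which the data, process, and prior stages all collapse to conjugate normals. The numbers are invented; a real reconstruction would use spatial fields and MCMC rather than a closed-form update.

```python
# Assumed toy numbers for one grid cell (all invented)
prior_mean, prior_var = -10.0, 4.0  # prior stage: from climate-simulation output
obs, obs_var = -12.5, 1.0           # data stage: transfer-function output for a proxy

# With a single cell and Gaussian stages, the process stage reduces to a
# conjugate normal-normal update (precisions add, means are precision-weighted)
post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
print(f"posterior: N({post_mean:.2f}, {post_var:.2f})")
```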
Antonsson, E; Langer, B; Halfpap, I; Gottwald, J; Rühl, E
2017-06-28
In order to gain quantitative information on the surface composition of nanoparticles from X-ray photoelectron spectroscopy, a detailed understanding of photoelectron transport phenomena in these samples is needed. Theoretical results on elastic and inelastic scattering have been reported, but a rigorous experimental verification is lacking. We report in this work on the photoelectron angular distribution from free SiO2 nanoparticles (d = 122 ± 9 nm) after ionization by soft X-rays above the Si 2p and O 1s absorption edges, which gives insight into the relative importance of elastic and inelastic scattering channels in the sample particles. The photoelectron angular anisotropy is found to be lower for photoemission from SiO2 nanoparticles than expected from theoretical values for isolated Si and O atoms in the photoelectron kinetic energy range 20-380 eV. The reduced angular anisotropy is explained by elastic scattering of the outgoing photoelectrons from neighboring atoms, which smears out the atomic angular distribution. Photoelectron angular distributions yield detailed information on photoelectron elastic scattering processes, allowing quantification of the number of elastic scattering events the photoelectrons have undergone prior to leaving the sample. The interpretation of the experimental photoelectron angular distributions is complemented by Monte Carlo simulations, which take inelastic and elastic photoelectron scattering into account using theoretical values for the scattering cross sections. The results of the simulations reproduce the experimental photoelectron angular distributions and provide further support for the assignment that elastic and inelastic electron scattering processes need to be considered.
Nitrification in Water and Wastewater Treatment
This chapter discusses available information on the occurrence of nitrification in water treatment plants and its potential impact on distribution system water quality. Nitrification as part of the water treatment process can occur whenever ammonia is present in or added to the s...
An Airborne Onboard Parallel Processing Testbed
NASA Technical Reports Server (NTRS)
Mandl, Daniel J.
2014-01-01
This presentation provides information on the progress of the Intelligent Payload Module (IPM) development effort. In addition, a vision is presented for integrating the IPM architecture with the GeoSocial Application Program Interface (API) architecture to enable efficient distribution of satellite data products.
Systems Suitable for Information Professionals.
ERIC Educational Resources Information Center
Blair, John C., Jr.
1983-01-01
Describes computer operating systems applicable to microcomputers, noting hardware components, advantages and disadvantages of each system, local area networks, distributed processing, and a fully configured system. Lists of hardware components (disk drives, solid state disk emulators, input/output and memory components, and processors) and…
2011-09-01
[Front-matter fragment: table-of-contents entries (experimental design; primary research question; objective achievement) and an acronym list including CPT (Cognitive Processing Therapy), DISE (Distributed Information Systems Experimentation), and EBT (Evidence-Based Treatment).]
The AgESGUI geospatial simulation system for environmental model application and evaluation
USDA-ARS?s Scientific Manuscript database
Practical decision making in spatially-distributed environmental assessment and management is increasingly being based on environmental process-based models linked to geographical information systems (GIS). Furthermore, powerful computers and Internet-accessible assessment tools are providing much g...
NASA Astrophysics Data System (ADS)
Zheng, Yan
2015-03-01
The Internet of things (IoT), which focuses on providing users with information exchange and intelligent control, has attracted a great deal of attention from researchers all over the world since the beginning of this century. The IoT consists of a large number of sensor nodes and data processing units, and its most important characteristics are energy constraints, efficient communication, and high redundancy. As sensor nodes multiply, communication efficiency and the available communication bandwidth become bottlenecks. Much existing research addresses cases in which the number of joins is small; such approaches do not scale to the growing multi-join queries across the whole Internet of things. To improve the communication efficiency between parallel units in a distributed sensor network, this paper proposes a parallel query optimization algorithm based on a cost graph of distribution attributes. The algorithm takes into account the storage relations of the information and the network communication cost, and establishes an optimized information-exchange rule. Experimental results show that the algorithm performs well and makes effective use of the resources of each node in the distributed sensor network, so the execution efficiency of multi-join queries across different nodes can be improved.
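The paper's exact algorithm is not reproduced here, but the flavor of cost-graph-driven join ordering can be sketched with a greedy heuristic over a hypothetical cost graph; the relation names, sizes, and edge costs below are all invented for illustration.

```python
# Hypothetical cost model: sizes are relation cardinalities and comm_cost
# holds pairwise network-communication costs between the nodes storing them.
sizes = {"R": 1000, "S": 5000, "T": 200, "U": 800}
comm_cost = {frozenset(p): c for p, c in [
    (("R", "S"), 3.0), (("R", "T"), 1.0), (("R", "U"), 2.0),
    (("S", "T"), 4.0), (("S", "U"), 1.5), (("T", "U"), 2.5)]}

def greedy_order(relations):
    """Greedy join ordering: start from the smallest relation, then always
    add the relation whose cheapest (comm cost x size) edge into the
    partial join is minimal."""
    rels = set(relations)
    order = [min(rels, key=sizes.get)]
    rels.discard(order[0])
    while rels:
        nxt = min(rels, key=lambda r: min(
            comm_cost[frozenset((r, j))] * sizes[r] for j in order))
        order.append(nxt)
        rels.discard(nxt)
    return order

print(greedy_order(["R", "S", "T", "U"]))  # e.g. ['T', 'R', 'U', 'S']
```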
The distributed neural system for top-down letter processing: an fMRI study
NASA Astrophysics Data System (ADS)
Liu, Jiangang; Feng, Lu; Li, Ling; Tian, Jie
2011-03-01
This fMRI study used psychophysiological interaction (PPI) analysis to investigate top-down letter processing with an illusory letter detection task. After initial training that became increasingly difficult, participants were instructed to detect a letter in pure-noise images that actually contained no letter. This experimental paradigm isolates the top-down components of letter processing and minimizes the influence of bottom-up perceptual input. A distributed cortical network for top-down letter processing was identified by analyzing the functional connectivity patterns of the letter-preferential area (LA) within the left fusiform gyrus. This network extends from the visual cortex to high-level cognitive cortices, including the left middle frontal gyrus, left medial frontal gyrus, left superior parietal gyrus, bilateral precuneus, and left inferior occipital gyrus. These findings suggest that top-down letter processing engages not only regions for processing letter phonology and appearance, but also those involved in internal information generation and maintenance, and in attention and memory processing.
Detection of global state predicates
NASA Technical Reports Server (NTRS)
Marzullo, Keith; Neiger, Gil
1991-01-01
The problem addressed here arises in the context of Meta: how can a set of processes monitor the state of a distributed application in a consistent manner? For example, consider a simple distributed application in which each of three processes has a light, and the control processes would each like to take an action when some specified subset of the lights are on. The application processes are instrumented with stubs that determine when the process turns its light on or off. This information is disseminated to the control processes, each of which then determines when its condition of interest is met. Meta is built on top of the ISIS toolkit, and so we first built the sensor dissemination mechanism using atomic broadcast. Atomic broadcast guarantees that all recipients receive the messages in the same order and that this order is consistent with causality. Unfortunately, the control processes are somewhat limited in what they can deduce when they find that their condition of interest holds.
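A minimal sketch of the dissemination-and-detection loop, assuming the atomic broadcast layer has already produced a single, totally ordered event stream (as ISIS ABCAST would); the event data are invented.

```python
def monitor(events, subset):
    """Consume a totally ordered stream of (process_id, light_on) events
    and yield a snapshot whenever every light in `subset` is on. Because
    atomic broadcast delivers the same order everywhere, all control
    processes evaluate the predicate over the same state sequence."""
    lights = {}
    for pid, on in events:
        lights[pid] = on
        if all(lights.get(p, False) for p in subset):
            yield dict(lights)

stream = [(1, True), (2, True), (1, False), (3, True), (1, True)]
for snap in monitor(stream, subset={1, 2}):
    print("predicate holds:", snap)
```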
Fuselets: an agent based architecture for fusion of heterogeneous information and data
NASA Astrophysics Data System (ADS)
Beyerer, Jürgen; Heizmann, Michael; Sander, Jennifer
2006-04-01
A new architecture for fusing information and data from heterogeneous sources is proposed. The approach takes criminalistics as a model: in analogy to the work of detectives who investigate crimes, software agents are initiated that pursue clues and try to consolidate or dismiss hypotheses. Like their human counterparts, they can consult expert agents if questions beyond their competence arise. Within the context of a certain task, region, and time interval, specialized operations are applied to each relevant information source (e.g., IMINT, SIGINT, ACINT, ..., HUMINT, databases, etc.) in order to establish hit lists of first clues. Each clue is described by its pertaining facts, uncertainties, and dependencies in the form of a local degree-of-belief (DoB) distribution in a Bayesian sense. For each clue an agent is initiated that cooperates with other agents and experts. Expert agents help to make use of different information sources; consultations of experts capable of accessing certain information sources result in changes of the DoB of the pertaining clue. Depending on how significantly their DoB distributions are concentrated, clues are abandoned or pursued further to formulate task-specific hypotheses. Communication between the agents serves to find out whether different clues belong to the same cause and thus can be combined. At the end of the investigation process, the different hypotheses are evaluated by a jury and a final report is created that constitutes the fusion result. The proposed approach avoids calculating global DoB distributions by adopting a local Bayesian approximation, and thus substantially reduces the complexity of the exact problem. Different information sources are transformed into DoB distributions using the maximum-entropy paradigm, with known facts treated as constraints; nominal, ordinal, and cardinal quantities can all be treated equally within this framework. The architecture is scalable, tailoring the number of agents to the available computer resources, the priority of tasks, and the maximum duration of the fusion process. Furthermore, the architecture allows cooperative work of human and automated agents and experts, as long as not all subtasks can be accomplished automatically.
Combined X-ray CT and mass spectrometry for biomedical imaging applications
NASA Astrophysics Data System (ADS)
Schioppa, E., Jr.; Ellis, S.; Bruinen, A. L.; Visser, J.; Heeren, R. M. A.; Uher, J.; Koffeman, E.
2014-04-01
Imaging technologies play a key role in many branches of science, especially in biology and medicine. They provide an invaluable insight into both the internal structure and the processes within a broad range of samples. There are many techniques that allow one to obtain images of an object; different techniques are based on the analysis of a particular sample property by means of a dedicated imaging system, and as such, each imaging modality provides the researcher with different information. The use of multimodal imaging (imaging with several different techniques) can provide additional and complementary information that is not accessible when employing a single imaging technique alone. In this study, we present for the first time a multimodal imaging technique in which X-ray computerized tomography (CT) is combined with mass spectrometry imaging (MSI). While X-ray CT provides 3-dimensional information on the internal structure of the sample based on X-ray absorption coefficients, MSI of thin sections acquired from the same sample allows the spatial distribution of many elements/molecules, each distinguished by its unique mass-to-charge ratio (m/z), to be determined within a single measurement and with a spatial resolution as low as 1 μm or even less. The aim of the work is to demonstrate how molecular information from MSI can be spatially correlated with 3D structural information acquired from X-ray CT. In these experiments, frozen samples are imaged in an X-ray CT setup using Medipix-based detectors equipped with a CO2-cooled sample holder. Single projections are pre-processed before tomographic reconstruction using a signal-to-thickness calibration. In the second step, the object is sliced into thin sections (circa 20 μm) that are then imaged using both matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) and secondary ion mass spectrometry (SIMS), in which the spatial distribution of specific molecules within the sample is determined. The combination of two vastly different imaging approaches provides complementary information (i.e., anatomical and molecular distributions) that allows the correlation of distinct structural features with specific molecular distributions, leading to unique insights into disease development.
A Bayesian Alternative for Multi-objective Ecohydrological Model Specification
NASA Astrophysics Data System (ADS)
Tang, Y.; Marshall, L. A.; Sharma, A.; Ajami, H.
2015-12-01
Process-based ecohydrological models combine the study of hydrological, physical, biogeochemical, and ecological processes in catchments, and are usually more complex and more heavily parameterized than conceptual hydrological models. Appropriate calibration objectives and model uncertainty analysis are therefore essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying uncertainty in hydrological modeling, aided by the development of Markov chain Monte Carlo (MCMC) techniques. Our study aims to develop appropriate prior distributions and likelihood functions that minimize model uncertainty and bias within a Bayesian ecohydrological framework. A formal Bayesian approach is implemented in an ecohydrological model that combines a hydrological model (HyMOD) and a dynamic vegetation model (DVM). Simulations based on a single-objective likelihood (streamflow or LAI) and on multi-objective likelihoods (streamflow and LAI) with different weights are compared. Uniform, weakly informative, and strongly informative prior distributions are used in different simulations. The Kullback-Leibler divergence (KLD) is used to measure the (dis)similarity between priors and the corresponding posterior distributions in order to examine parameter sensitivity. Results show that different prior distributions can strongly influence the posterior distributions of parameters, especially when the available data are limited or parameters are insensitive to the available data. We demonstrate differences in optimized parameters and uncertainty limits between cases based on multi-objective versus single-objective likelihoods, and we demonstrate the importance of appropriately defining the weights of objectives in multi-objective calibration according to the different data types.
The timing of sequences of saccades in visual search.
Van Loon, E M; Hooge, I Th C; Van den Berg, A V
2002-01-01
According to the LATER model (linear approach to thresholds with ergodic rate), the latency of a single saccade in response to target appearance can be understood as a decision process, which is subject to (i) variations in the rate of (visual) information processing; and (ii) the threshold for the decision. We tested whether the LATER model can also be applied to the sequences of saccades in a multiple fixation search, during which latencies of second and subsequent saccades are typically shorter than that of the initial saccade. We found that the distributions of the reciprocal latencies for later saccades, unlike those of the first saccade, are highly asymmetrical, much like a gamma distribution. This suggests that the normal distribution of the rate r, which the LATER model assumes, is not appropriate to describe the rate distributions of subsequent saccades in a scanning sequence. By contrast, the gamma distribution is also appropriate to describe the distribution of reciprocal latencies for the first saccade. The change of the gamma distribution parameters as a function of the ordinal number of the saccade suggests a lowering of the threshold for second and later saccades, as well as a reduction in the number of target elements analysed. PMID:12184827
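The model comparison described above can be reproduced in outline by fitting both a normal distribution (the LATER assumption) and a gamma distribution to reciprocal latencies and comparing log-likelihoods. The synthetic latencies below are placeholders for real saccade data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
latencies_ms = rng.gamma(shape=8.0, scale=25.0, size=500)  # stand-in data
r = 1.0 / latencies_ms  # reciprocal latency, the "rate" in the LATER model

# Fit both candidate distributions to the reciprocal latencies
mu, sd = stats.norm.fit(r)                   # LATER's recinormal assumption
a, loc, scale = stats.gamma.fit(r, floc=0)   # gamma alternative

# Higher total log-likelihood favors that distribution
print("normal logL:", stats.norm.logpdf(r, mu, sd).sum())
print("gamma  logL:", stats.gamma.logpdf(r, a, loc, scale).sum())
```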
Flotation and flocculation chemistry of coal and oxidized coals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somasundaran, P.
1990-01-01
The objective of this research project is to understand the fundamentals involved in the flotation and flocculation of coal and oxidized coals and to elucidate the mechanisms by which surface interactions between coal and various reagents enhance coal beneficiation. An understanding of the nature of the heterogeneity of coal surfaces, arising from the intrinsic distribution of chemical moieties, is fundamental to elucidating the mechanism of coal surface modification and its role in interfacial processes such as flotation, flocculation, and agglomeration. A new approach for determining the distribution of surface properties of coal particles was developed in this study, and various techniques capable of providing such information were identified. Distributions in surface energy, contact angle, and wettability were obtained using novel techniques such as centrifugal immersion and film flotation. Changes in these distributions upon oxidation and surface modification were monitored and discussed. An approach to modelling coal surface site distributions based on thermodynamic information obtained from gas adsorption and immersion calorimetry is proposed. Polyacrylamide and dodecane were used to alter the coal surface. Methanol adsorption was also studied. 62 figs.
Federated Giovanni: A Distributed Web Service for Analysis and Visualization of Remote Sensing Data
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2014-01-01
The Geospatial Interactive Online Visualization and Analysis Interface (Giovanni) is a popular tool for users of the Goddard Earth Sciences Data and Information Services Center (GES DISC) and has been in use for over a decade. It provides a wide variety of algorithms and visualizations to explore large remote sensing datasets without having to download the data and without having to write readers and visualizers for it. Giovanni is now being extended to enable its capabilities at other data centers within the Earth Observing System Data and Information System (EOSDIS). This Federated Giovanni will allow four other data centers to add and maintain their data within Giovanni on behalf of their user community. Those data centers are the Physical Oceanography Distributed Active Archive Center (PO.DAAC), MODIS Adaptive Processing System (MODAPS), Ocean Biology Processing Group (OBPG), and Land Processes Distributed Active Archive Center (LP DAAC). Three tiers are supported: Tier 1 (GES DISC-hosted) gives the remote data center a data management interface to add and maintain data, which are provided through the Giovanni instance at the GES DISC. Tier 2 packages Giovanni up as a virtual machine for distribution to and deployment by the other data centers. Data variables are shared among data centers by sharing documents from the Solr database that underpins Giovanni's data management capabilities. However, each data center maintains their own instance of Giovanni, exposing the variables of most interest to their user community. Tier 3 is a Shared Source model, in which the data centers cooperate to extend the infrastructure by contributing source code.
Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection
Haefner, A.; Gunter, D.; Plimley, B.; ...
2014-11-03
Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While this method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to obtain energy and direction in gas-based systems that suffer from limited efficiency.
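The kinematic core of such single-detector reconstruction is the Compton relation linking the measured electron energy to the photon scattering and electron emission angles. The sketch below computes the cone half-angle for an assumed incident energy; it is a simplified kinematics fragment, not the filtered back-projection algorithm of the paper.

```python
import numpy as np

ME_C2 = 511.0  # electron rest energy in keV

def compton_cone(T_e, E0):
    """For a measured electron kinetic energy T_e and an assumed incident
    photon energy E0 (both keV), return the photon scattering angle theta
    and the electron emission angle phi (degrees). The incident photon
    direction lies on a cone of half-angle phi about the electron track."""
    E_sc = E0 - T_e                     # scattered-photon energy
    if E_sc <= 0:
        return None                     # electron cannot carry more than E0
    cos_theta = 1.0 - ME_C2 * (1.0 / E_sc - 1.0 / E0)  # Compton relation
    if not -1.0 <= cos_theta <= 1.0:
        return None                     # kinematically forbidden for this E0
    theta = np.arccos(cos_theta)
    phi = np.arctan(1.0 / ((1.0 + E0 / ME_C2) * np.tan(theta / 2.0)))
    return np.degrees(theta), np.degrees(phi)

print(compton_cone(T_e=200.0, E0=662.0))  # test with a Cs-137 hypothesis
```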
Design and Verification of a Distributed Communication Protocol
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
NASA Astrophysics Data System (ADS)
Song, Y.; Gui, Z.; Wu, H.; Wei, Y.
2017-09-01
Analysing the spatiotemporal distribution patterns of different industries and their dynamics can reveal macro-level development trends in those industries and, in turn, provide references for industrial spatial planning. However, this analysis is a challenging task that requires an easy-to-understand information presentation mechanism and powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to meet this visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and the shifting routes of gravity centers to show the spatial distribution and yearly development trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use an enterprise registration dataset for Mainland China from 1960 to 2015 that contains fine-grained location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of the framework. The experimental results show that the developed visual analytics method is helpful for understanding the multi-level patterns and development trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime and disease.
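For reference, one common formulation of the standard deviational ellipse can be computed from the eigen-decomposition of the coordinate covariance, as in the sketch below; synthetic points stand in for enterprise coordinates, and the paper's Spark parallelization is omitted.

```python
import numpy as np

def standard_deviational_ellipse(xy):
    """One common SDE formulation: the gravity center, the semi-axes
    (standard deviations along the principal directions), and the
    orientation of the major axis from the x-axis, in degrees."""
    centre = xy.mean(axis=0)
    cov = np.cov((xy - centre).T)
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    semi_axes = np.sqrt(vals)[::-1]    # major axis first
    angle = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1]))
    return centre, semi_axes, angle

rng = np.random.default_rng(3)
pts = rng.multivariate_normal([120.0, 30.0], [[2.0, 0.8], [0.8, 0.5]], 5000)
print(standard_deviational_ellipse(pts))
```

Because the mean and covariance are built from distributive sums, the same computation parallelizes naturally over data partitions, which is presumably what makes Spark a good fit for the per-year, per-industry ellipses.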
Web-GIS platform for monitoring and forecasting of regional climate and ecological changes
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.
2012-12-01
The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the support of integrated scientific research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, reducing the reliability of analysis results. However, modern geophysical data processing techniques allow different technological solutions to be combined in organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches for the development of internet-accessible thematic information-computational systems, and arranging data and knowledge interchange between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. We present an experimental software and hardware platform supporting the operation of a web-oriented production and research center for regional climate change investigations, which combines a modern Web 2.0 approach, GIS functionality, and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of undergraduate and postgraduate students. The platform software developed (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, execution, and visualization of results for the WRF and «Planet Simulator» models integrated into the platform are also provided. All functions of the center are accessible to users through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and data-layer manipulation (ordering, enabling/disabling, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes within different multidisciplinary studies (Shulgina et al., 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological, and satellite monitoring datasets through a unified graphical web interface.
2003-09-01
[Front-matter fragment: an acronym list (ABC, Activity Based Costing; ADO, ActiveX Data Object; ASP, Application Server Page; BPR, Business Process Re-engineering) and text fragments noting that business processes use people and systems (hardware, software, machinery, etc.) holding the "corporate" knowledge, that the server architecture was a high-maintenance item, and that data was no longer contained on one mainframe but distributed throughout the enterprise.]
Biogeochemical Processes in Microbial Ecosystems
NASA Technical Reports Server (NTRS)
DesMarais, David J.
2001-01-01
The hierarchical organization of microbial ecosystems determines process rates that shape Earth's environment, create the biomarker sedimentary and atmospheric signatures of life, and define the stage upon which major evolutionary events occurred. In order to understand how microorganisms have shaped the global environment of Earth and, potentially, other worlds, we must develop an experimental paradigm that links biogeochemical processes with ever-changing temporal and spatial distributions of microbial populations and their metabolic properties. Additional information is contained in the original extended abstract.
NASA Astrophysics Data System (ADS)
Ntarlagiannis, D.; Ustra, A.; Slater, L. D.; Zhang, C.; Mendonça, C. A.
2015-12-01
In this work we present an alternative formulation of the Debye decomposition (DD) of complex conductivity spectra, with a new set of parameters that are directly related to the continuous Debye relaxation model. The procedure determines the relaxation time distribution (RTD) and two frequency-independent parameters that modulate the induced polarization spectra. The distribution of relaxation times quantifies the contribution of each distinct relaxation process, which can in turn be associated with specific polarization processes and characterized in terms of electrochemical and interfacial parameters derived from mechanistic models. Synthetic tests show that the procedure can successfully fit spectral induced polarization (SIP) data and accurately recover the RTD. The procedure was applied to different data sets with a focus on environmental applications, in particular sand-clay mixtures artificially contaminated with toluene and crude-oil-contaminated sands undergoing biodegradation. The results identify characteristic relaxation times that can be associated with distinct polarization processes resulting either from the contaminant itself or from transformations associated with biodegradation. The inversion results provide information on the relative strength and dominant relaxation time of these polarization processes.
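A minimal version of a Debye decomposition can be written as a non-negative least-squares fit of Debye kernels on a fixed, log-spaced relaxation-time grid. This sketch uses a plain DC term plus a Debye sum with synthetic data; it omits the regularization and the specific reformulated parameters of the authors' procedure.

```python
import numpy as np
from scipy.optimize import nnls

def debye_decomposition(omega, sigma, n_tau=30):
    """Decompose a complex-conductivity spectrum into a DC term plus a
    non-negative distribution of Debye relaxations on a log tau grid;
    NNLS keeps the recovered RTD non-negative."""
    taus = np.logspace(np.log10(0.01 / omega.max()),
                       np.log10(100.0 / omega.min()), n_tau)
    wt = omega[:, None] * taus[None, :]
    K = 1j * wt / (1.0 + 1j * wt)      # Debye kernel, one column per tau
    A = np.vstack([np.hstack([np.ones((len(omega), 1)), K.real]),
                   np.hstack([np.zeros((len(omega), 1)), K.imag])])
    b = np.concatenate([sigma.real, sigma.imag])
    x, _ = nnls(A, b)
    return taus, x[0], x[1:]           # tau grid, sigma_0, RTD weights

# Synthetic check: two relaxations at 0.01 s and 1 s
w = 2 * np.pi * np.logspace(-2, 3, 60)
true = (0.05 + 0.002 * (1j*w*0.01) / (1 + 1j*w*0.01)
             + 0.001 * (1j*w*1.0) / (1 + 1j*w*1.0))
taus, s0, m = debye_decomposition(w, true)
print("sigma_0:", s0, "dominant tau:", taus[m.argmax()])
```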
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olsen, C.R.; Larsen, I.L.; Lowry, P.D.
Radionuclides released into the Susquehanna-Chesapeake system from the Three Mile Island, Peach Bottom, and Calvert Cliffs nuclear power plants are partitioned among dissolved, particulate, and biological phases and may thus exist in a number of physical and chemical forms. In this project, we have measured the dissolved and particulate distributions of fallout 137Cs; reactor-released 137Cs, 134Cs, 65Zn, 60Co, and 58Co; and naturally occurring 7Be and 210Pb in the lower Susquehanna River and upper Chesapeake Bay. In addition, we chemically leached suspended particles and bottom sediments in the laboratory to determine radionuclide partitioning among different particulate-sorbing phases, to complement the site-specific field data. This information has been used to document the important geochemical processes that affect the transport, sorption, distribution, and fate of reactor-released radionuclides (and, by analogy, other trace contaminants) in this river-estuarine system. Knowledge of the mechanisms, kinetic factors, and processes that affect radionuclide distributions is crucial for predicting their biological availability, toxicity, chemical behavior, physical transport, and accumulation in aquatic systems. The results from this project provide the information necessary for developing accurate radionuclide-transport and biological-uptake models. 76 refs., 12 figs.
Web Monitoring of EOS Front-End Ground Operations, Science Downlinks and Level 0 Processing
NASA Technical Reports Server (NTRS)
Cordier, Guy R.; Wilkinson, Chris; McLemore, Bruce
2008-01-01
This paper addresses the efforts undertaken and the technology deployed to aggregate and distribute the metadata characterizing the real-time operations associated with NASA Earth Observing Systems (EOS) high-rate front-end systems and the science data collected at multiple ground stations and forwarded to the Goddard Space Flight Center for level 0 processing. Station operators, mission project management personnel, spacecraft flight operations personnel and data end-users for various EOS missions can retrieve the information at any time from any location having access to the internet. The users are distributed and the EOS systems are distributed but the centralized metadata accessed via an external web server provide an effective global and detailed view of the enterprise-wide events as they are happening. The data-driven architecture and the implementation of applied middleware technology, open source database, open source monitoring tools, and external web server converge nicely to fulfill the various needs of the enterprise. The timeliness and content of the information provided are key to making timely and correct decisions which reduce project risk and enhance overall customer satisfaction. The authors discuss security measures employed to limit access of data to authorized users only.
Detection and Distribution of Natural Gaps in Tropical Rainforest
NASA Astrophysics Data System (ADS)
Goulamoussène, Y.; Linguet, L.; Hérault, B.
2014-12-01
Forest management is important for assessing biodiversity and ecological processes, and the scientific community has articulated a need for disturbance information. Understanding and monitoring the frequency distribution of treefall gaps is therefore relevant to understanding and predicting the carbon budget in response to global change and land-use change. In this work we characterize and quantify the frequency distribution of natural canopy gaps, and we examine the interaction between environmental variables and gap formation across the tropical rainforest of French Guiana using high-resolution airborne Light Detection and Ranging (LiDAR). We mapped gaps from the canopy height model over 40,000 ha of forest. We used a Bayesian modelling framework to estimate and select useful covariate model parameters, and topographic variables are included in a model to predict the gap size distribution. We discuss results on the interaction between the environment, mainly topographic indices, and the gap size distribution. The use of both airborne and space-based techniques has improved our ability to supply needed disturbance information. This work is an approach at the plot scale; the use of satellite data will allow us to work at the forest scale, and the inclusion of climate variables in our model will let us assess the impact of global change on tropical rainforest.
ERIC Educational Resources Information Center
Dominey, Peter Ford; Inui, Toshio; Hoen, Michel
2009-01-01
A central issue in cognitive neuroscience today concerns how distributed neural networks in the brain that are used in language learning and processing can be involved in non-linguistic cognitive sequence learning. This issue is informed by a wealth of functional neurophysiology studies of sentence comprehension, along with a number of recent…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, T.; Ke, J.; Sathaye, J.
2011-04-20
This User's Manual summarizes the background information for the Benchmarking and Energy/water-Saving Tool (BEST) for the Dairy Processing Industry (Version 1.2, 2011), including the 'Read Me' portion of the tool, the Introduction, and the Instructions for the BEST-Dairy tool developed and distributed by Lawrence Berkeley National Laboratory (LBNL).
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US states, and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications, and established community-oriented infrastructures were to: develop access to distributed data (surface and satellite); build Web infrastructure to support data access, processing, and analysis; create tools for data processing and analysis; and foster air quality community collaboration and interoperability.
Review of cost versus scale: water and wastewater treatment and reuse processes.
Guo, Tianjiao; Englehardt, James; Wu, Tingting
2014-01-01
The US National Research Council recently recommended direct potable water reuse (DPR), or potable water reuse without an environmental buffer, for consideration to address US water demand. However, conveyance of wastewater and water to and from centralized treatment plants consumes on average four times the energy of treatment in the USA, and centralized DPR would further require upgradient distribution of treated water. Therefore, information on the cost of unit treatment processes potentially useful for DPR versus system capacity was reviewed, converted to constant 2012 US dollars, and synthesized in this work. A logarithmic variant of the Williams Law cost function was found applicable over orders of magnitude of system capacity for the subject processes: activated sludge, membrane bioreactor, coagulation/flocculation, reverse osmosis, ultrafiltration, peroxone, and granular activated carbon. Results are demonstrated against 10 DPR case studies. Because the economies of scale found for capital equipment are counterbalanced by distribution/collection network costs, further study of the optimal scale of distributed DPR systems is suggested.
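The cited cost-capacity scaling is conventionally captured by a power law (the Williams law form), cost = a * capacity**b, fitted in log space; the data points below are invented placeholders, not values from the review.

```python
import numpy as np

# Hypothetical unit-process data: capacity (m3/day) vs capital cost (2012 USD)
capacity = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cost = np.array([8e4, 4e5, 2.2e6, 1.1e7, 6e7])

# Fit the power form cost = a * capacity**b as a line in log-log space
b, log_a = np.polyfit(np.log(capacity), np.log(cost), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.3g} * capacity**{b:.2f}")  # b < 1: economies of scale

# Interpolated prediction for an intermediate plant size
print("predicted cost at 5e3 m3/day:", a * 5e3 ** b)
```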
Examining the relationship between comprehension and production processes in code-switched language
Guzzardo Tamargo, Rosa E.; Valdés Kroff, Jorge R.; Dussias, Paola E.
2016-01-01
We employ code-switching (the alternation of two languages in bilingual communication) to test the hypothesis, derived from experience-based models of processing (e.g., Boland, Tanenhaus, Carlson, & Garnsey, 1989; Gennari & MacDonald, 2009), that bilinguals are sensitive to the combinatorial distributional patterns derived from production and that they use this information to guide processing during the comprehension of code-switched sentences. An analysis of spontaneous bilingual speech confirmed the existence of production asymmetries involving two auxiliary + participle phrases in Spanish–English code-switches. A subsequent eye-tracking study with two groups of bilingual code-switchers examined the consequences of the differences in distributional patterns found in the corpus study for comprehension. Participants’ comprehension costs mirrored the production patterns found in the corpus study. Findings are discussed in terms of the constraints that may be responsible for the distributional patterns in code-switching production and are situated within recent proposals of the links between production and comprehension. PMID:28670049
Pareja, Lucía; Colazzo, Marcos; Pérez-Parada, Andrés; Besil, Natalia; Heinzen, Horacio; Böcking, Bernardo; Cesio, Verónica; Fernández-Alba, Amadeo R
2012-05-09
The results of an experiment to study the occurrence and distribution of pesticide residues during rice cropping and processing are reported. Four herbicides, nine fungicides, and two insecticides (azoxystrobin, bispyribac-sodium, carbendazim, clomazone, difenoconazole, epoxiconazole, isoprothiolane, kresoxim-methyl, propanil, quinclorac, tebuconazole, thiamethoxam, tricyclazole, trifloxystrobin, λ-cyhalothrin) were applied to an isolated rice-crop plot under controlled conditions during the 2009-2010 cropping season in Uruguay. Paddy rice was harvested and industrially processed into brown rice, white rice, and rice bran, which were analyzed for pesticide residues using the original QuEChERS methodology and its citrate variation by LC-MS/MS and GC-MS. The distribution of pesticide residues was uneven among the different matrices: ten different pesticide residues were found in paddy rice, seven in brown rice, and eight in rice bran, with the highest concentrations detected in paddy rice. These results provide information regarding the fate of pesticides in the rice food chain and its safety for consumers.
Wonodi, C B; Privor-Dumm, L; Aina, M; Pate, A M; Reis, R; Gadhoke, P; Levine, O S
2012-05-01
The decision-making process to introduce new vaccines into national immunization programmes is often complex, involving many stakeholders who provide technical information, mobilize finance, implement programmes and garner political support. Stakeholders may have different levels of interest, knowledge and motivations to introduce new vaccines. Lack of consensus on the priority, public health value or feasibility of adding a new vaccine can delay policy decisions. Efforts to support country-level decision-making have largely focused on establishing global policies and equipping policy makers with the information to support decision-making on new vaccine introduction (NVI). Less attention has been given to understanding the interactions of policy actors and how the distribution of influence affects the policy process and decision-making. Social network analysis (SNA) is a social science technique concerned with explaining social phenomena using the structural and relational features of the network of actors involved. This approach can be used to identify how information is exchanged and who is included or excluded from the process. For this SNA of vaccine decision-making in Nigeria, we interviewed federal and state-level government officials, officers of bilateral and multilateral partner organizations, and other stakeholders such as health providers and the media. Using data culled from those interviews, we performed an SNA in order to map formal and informal relationships and the distribution of influence among vaccine decision-makers, as well as to explore linkages and pathways to stakeholders who can influence critical decisions in the policy process. Our findings indicate a relatively robust engagement of key stakeholders in Nigeria. We hypothesized that economic stakeholders and implementers would be important to ensure sustainable financing and strengthen programme implementation, but some economic and implementation stakeholders did not appear centrally on the map; this may suggest a need to strengthen the decision-making processes by engaging these stakeholders more centrally and earlier.
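As a toy illustration of the SNA machinery involved, the sketch below computes normalized degree centrality over an invented stakeholder edge list; the node names are illustrative placeholders, and a real analysis would add betweenness and other influence measures.

```python
from collections import Counter

# Invented stakeholder edge list; the organisation names are placeholders,
# not the study's actual network.
edges = [("FMOH", "NPHCDA"), ("NPHCDA", "WHO"), ("NPHCDA", "UNICEF"),
         ("WHO", "StateEPI"), ("UNICEF", "StateEPI"), ("Media", "FMOH")]

degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

n = len(degree)  # number of actors in the network
centrality = {v: d / (n - 1) for v, d in degree.items()}  # normalized degree
print(sorted(centrality.items(), key=lambda kv: -kv[1]))  # most central first
```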
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimenez-Arguello, Alejandro Marti
The study of the inner structure of hadrons allows us to understand the nature of the interactions between partons (quarks and gluons) described by quantum chromodynamics. Elastic scattering reactions, which have been studied in order to measure the nucleon form factors, fall within this framework, as do inelastic scattering reactions, which provide information about nucleon structure through the parton distribution functions (PDFs). While elastic scattering yields information about the charge distribution of the nucleon, and hence about the spatial distribution of the partons, inelastic scattering yields information about the momentum distributions of partons through the PDFs. Exclusive inelastic scattering reactions, such as deeply virtual Compton scattering (DVCS), give access to the spatial and momentum distributions simultaneously, thanks to the generalized parton distributions (GPDs), which correlate the two types of distributions. DVCS is the simplest process giving access to the GPDs: an electron scatters off a proton by means of a virtual photon, resulting in the scattered initial particles plus a real photon. A process competing with DVCS is Bethe-Heitler (BH), in which the real photon is radiated by the lepton rather than the quark. Because of the small cross section of DVCS, of the order of nanobarns, such experiments require facilities capable of providing high beam intensities. One such facility is the Thomas Jefferson National Accelerator Facility, where experiment JLab E07-007, "Complete Separation of Virtual Photon and π⁰ Electroproduction Observables of Unpolarized Protons," took place from October to December 2010. The main goal of this experiment is to isolate the DVCS contribution from the interference term arising from the BH contribution; this isolation is known as a "Rosenbluth separation." The work presented in this thesis focuses on the analysis of the data recorded by the electromagnetic calorimeter used for the detection of real photons. A theoretical introduction to the study of nucleon structure is also given, reviewing the concepts of form factors and parton distributions through elastic and inelastic processes. The computation of the photon leptoproduction cross section is described in detail, as are the goals of experiment E07-007. The thesis then describes the analysis of the calorimeter data to obtain the kinematic variables of the real photons produced in DVCS reactions. Finally, it describes the selection of events from the recorded data, the cuts applied to the kinematic variables, and the background subtraction, along with the extraction of the observables needed to compute the photon leptoproduction cross section and the main steps of the Monte Carlo simulation used in this computation. The resulting cross sections are presented at the end of the thesis.
Zhang, Jiwei; Di, Jianglei; Li, Ying; Xi, Teli; Zhao, Jianlin
2015-10-19
We present a method for dynamically measuring refractive index distributions over a large range, based on the combination of digital holographic interferometry and total internal reflection. A series of holograms, carrying the index information of mixed liquids adhering to the surface of a total-reflection prism, is recorded with a CCD during the diffusion process. Phase-shift differences of the reflected light are reconstructed using the principle of double-exposure holographic interferometry. From the relationship between the reflection phase-shift difference and the liquid index, two-dimensional index distributions can be determined directly, assuming that the index of the air near the prism surface is constant. The proposed method can also be applied to measure the index of solid media and to monitor index variations during chemical reaction processes.
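The physics that makes the inversion possible is the polarization-dependent phase shift acquired on total internal reflection, which depends on the sample index beyond the critical angle. Below is a sketch of the s-polarization case, with an assumed prism index and incidence angle; inverting the measured phase-shift difference for the sample index follows the same relation.

```python
import numpy as np

def tir_phase_s(n1, n2, theta):
    """Phase of the totally internally reflected s-polarized wave at a
    prism (n1) / sample (n2) interface, for incidence angle theta (rad)
    beyond the critical angle."""
    s = n1 * np.sin(theta)
    if s <= n2:
        raise ValueError("below critical angle: no total internal reflection")
    return -2.0 * np.arctan(np.sqrt(s**2 - n2**2) / (n1 * np.cos(theta)))

# Assumed BK7-like prism and a fixed incidence angle (illustrative values)
n_prism, theta = 1.5163, np.radians(75.0)
d_air = tir_phase_s(n_prism, 1.0, theta)      # reference: air on the prism
d_liq = tir_phase_s(n_prism, 1.36, theta)     # e.g. an ethanol-like liquid
print("phase-shift difference (rad):", d_liq - d_air)
```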
ERIC Educational Resources Information Center
Collinge, Brian; And Others
1995-01-01
Four conference presenters involved in consumer online services present information on new products both under development and in the process of implementation, commenting on technological, content, distribution, and consumer service issues. Products and companies discussed are eWorld (Apple Computer Europe); Olivetti Telemedia; CompuServe; and…
Code of Federal Regulations, 2010 CFR
2010-07-01
... and maintained primarily for the convenience of the Agency employee, and that are not distributed to... created before entering Government service; private materials brought into, created, or received in the... National Technical Information Service (NTIS), or the Internet normally need not be processed as FOIA...
Code of Federal Regulations, 2011 CFR
2011-07-01
... and maintained primarily for the convenience of the Agency employee, and that are not distributed to... created before entering Government service; private materials brought into, created, or received in the... National Technical Information Service (NTIS), or the Internet normally need not be processed as FOIA...
Technology and the Online Catalog.
ERIC Educational Resources Information Center
Graham, Peter S.
1983-01-01
Discusses trends in computer technology and their use for library catalogs, noting the concept of bandwidth (describes quantity of information transmitted per given unit of time); computer hardware differences (micros, minis, maxis); distributed processing systems and databases; optical disk storage; networks; transmission media; and terminals.…
Code of Federal Regulations, 2012 CFR
2012-01-01
... Commission's rule governing the Privacy of Consumer Financial Information, 16 CFR part 313. (b) Customer... customer of a financial institution, whether in paper, electronic, or other form, that is handled or... administrative, technical, or physical safeguards you use to access, collect, distribute, process, protect, store...
SEA ARCHER Distributed Aviation Platform
2001-12-01
manual processes, but should also improve decision support functions through advanced modeling and simulation. SEA ARCHER’s information architecture...this payload model was the SH-60 for which accurate weights were attained. Weights for the Marine STOVL version of the JSF were also attained, and
Code of Federal Regulations, 2011 CFR
2011-01-01
... CONCERNING USE OF THE NOAA SPACE-BASED DATA COLLECTION SYSTEMS § 911.3 Definitions. For purposes of this part... data from fixed and moving platforms and provides platform location data. This system consists of... Data Processing and Distribution for the National Environmental Satellite, Data, and Information...
Characterization of process air emissions in automotive production plants.
D'Arcy, J B; Dasch, J M; Gundrum, A B; Rivera, J L; Johnson, J H; Carlson, D H; Sutherland, J W
2016-01-01
During manufacturing, particles produced from industrial processes become airborne. These airborne emissions represent a challenge from an industrial hygiene and environmental standpoint. A study was undertaken to characterize the particles associated with a variety of manufacturing processes found in the auto industry. Air particulates were collected in five automotive plants covering ten manufacturing processes in the areas of casting, machining, heat treatment and assembly. Collection procedures provided information on air concentration, size distribution, and chemical composition of the airborne particulate matter for each process and insight into the physical and chemical processes that created those particles.
A comparison of decentralized, distributed, and centralized vibro-acoustic control.
Frampton, Kenneth D; Baumann, Oliver N; Gardonio, Paolo
2010-11-01
Direct velocity feedback control of structures is well known to increase structural damping and thus reduce vibration. In multi-channel systems the way in which the velocity signals are used to inform the actuators ranges from decentralized control, through distributed or clustered control to fully centralized control. The objective of distributed controllers is to exploit the anticipated performance advantage of the centralized control while maintaining the scalability, ease of implementation, and robustness of decentralized control. However, and in seeming contradiction, some investigations have concluded that decentralized control performs as well as distributed and centralized control, while other results have indicated that distributed control has significant performance advantages over decentralized control. The purpose of this work is to explain this seeming contradiction in results, to explore the effectiveness of decentralized, distributed, and centralized vibro-acoustic control, and to expand the concept of distributed control to include the distribution of the optimization process and the cost function employed.
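To make the decentralized/distributed/centralized distinction concrete, velocity feedback laws are often written in the following form (the notation here is an illustrative assumption, not taken from the paper):

\[
\mathbf{u} = -\mathbf{G}\,\dot{\mathbf{x}},
\]

where a diagonal gain matrix \(\mathbf{G} = \mathrm{diag}(g_1,\dots,g_n)\) corresponds to decentralized control (each actuator uses only its collocated velocity sensor), a block-diagonal \(\mathbf{G}\) to distributed or clustered control (signals shared within clusters), and a full \(\mathbf{G}\) to centralized control (every actuator uses every sensor).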
NASA Technical Reports Server (NTRS)
2002-01-01
TRMM has acquired more than four years of data since its launch in November 1997. All TRMM standard products are processed by the TRMM Science Data and Information System (TSDIS) and archived and distributed to general users by the GES DAAC. Table 1 shows the total archive and distribution as of February 28, 2002. The Utilization Ratio (UR), defined as the ratio of the number of distributed files to the number of archived files, of the TRMM standard products has been steadily increasing since 1998 and is currently at 6.98.
Emotion-attention interactions in recognition memory for distractor faces.
Srinivasan, Narayanan; Gupta, Rashmi
2010-04-01
Effective filtering of distractor information has been shown to be dependent on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we wanted to explore the recognition memory for emotional distractors especially as a function of focused attention and distributed attention by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for distractor neutral and emotional faces. Participants performed a color discrimination task (low-load) or letter identification task (high-load) with a letter string display in Experiment 1 and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under conditions of less focused or distributed attention. When attention is more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information like sad or happy is associated with focused or distributed attention respectively. Distractor processing with emotional information also has implications for theories of attention. Copyright 2010 APA, all rights reserved.
USDA-ARS?s Scientific Manuscript database
Whereas soil test information on the fertility and chemistry of soils has been important to elaborate safe and sound agricultural practices, microscopic information can give a whole extra dimension to understand the chemical processes occurring in soils. The objective of this study was to evaluate t...
Receptive Fields and the Reconstruction of Visual Information.
1985-09-01
depending on the noise. Thus our model would suggest that the interpolation filters for deblurring are playing a role in hyperacuity. This is novel... of additional precision in the information can be obtained by a process of deblurring, which could be relevant to hyperacuity. It also provides an... impulse of heat diffuses into increasingly larger Gaussian distributions as time proceeds. Mathematically, let f(x) denote the initial temperature
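The truncated passage above appears to invoke the classical heat-equation solution, in which an initial temperature profile is smoothed by convolution with a Gaussian of growing width. For context, the standard statement (supplied here, with diffusivity \(\kappa\) assumed; it is not quoted from the report) is

\[
u(x,t) = \frac{1}{\sqrt{4\pi\kappa t}} \int_{-\infty}^{\infty} f(y)\, e^{-(x-y)^2/(4\kappa t)}\, dy,
\]

so a point impulse of heat indeed spreads into Gaussians of standard deviation \(\sqrt{2\kappa t}\), exactly the blurring process that deblurring would attempt to invert.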
Huttin, Christine C; Liebman, Michael N
2013-01-01
This paper aims to discuss the economics of biobanking. Among the critical issues in evaluating the potential ROI for creation of a biobank are: scale (e.g. local, national, international), centralized versus virtual/distributed operation, degree of sample annotation/QC procedures, targeted end-users and uses, types of samples, and potential characterization of both samples and annotations. The paper presents a review of cost models for an economic analysis of biobanking at its different steps: data collection (e.g. biospecimens in different types of sites), storage, transport and distribution, and information management for the different types of information (e.g. biological information such as cell, gene, and protein data). It also provides additional concepts for moving biospecimens from laboratory to clinical practice and will help to identify how changing paradigms in translational medicine affect economic modeling.
Bayesian hierarchical functional data analysis via contaminated informative priors.
Scarpa, Bruno; Dunson, David B
2009-09-01
A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.
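A generic form of such a contaminated informative prior, written in assumed notation rather than the authors' own, places each subject-specific curve \(f_i\) in a two-component mixture:

\[
f_i \sim w\,\Pi_\theta + (1-w)\,F, \qquad F \sim \mathrm{DP}(\alpha F_0),
\]

where \(\Pi_\theta\) is the parametric hierarchical model chosen from prior knowledge, \(F\) is the (functional) Dirichlet process contamination that absorbs unanticipated curve shapes, and the weight \(w \in [0,1]\) controls how much mass the informative component receives.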
Semantic characteristics of NLP-extracted concepts in clinical notes vs. biomedical literature.
Wu, Stephen; Liu, Hongfang
2011-01-01
Natural language processing (NLP) has become crucial in unlocking information stored in free text, from both clinical notes and biomedical literature. Clinical notes convey clinical information related to individual patient health care, while biomedical literature communicates scientific findings. This work focuses on semantic characterization of texts at an enterprise scale, comparing and contrasting the two domains and their NLP approaches. We analyzed the empirical distributional characteristics of NLP-discovered named entities in Mayo Clinic clinical notes from 2001-2010, and in the 2011 MetaMapped Medline Baseline. We give qualitative and quantitative measures of domain similarity and point to the feasibility of transferring resources and techniques. An important by-product for this study is the development of a weighted ontology for each domain, which gives distributional semantic information that may be used to improve NLP applications.
Cellular complexity in subcortical white matter: a distributed control circuit?
Colombo, Jorge A
2018-03-01
The subcortical white matter (SWM) has been traditionally considered a site for passive-neutral-information transfer through cerebral cortex association and projection fibers. Yet the presence of subcortical neuronal and glial "interstitial" cells expressing immunolabelled neurotransmitters/neuromodulators and synaptic vesicular proteins, recent immunohistochemical and electrophysiological observations on the rat visual cortex, and the interactive regulation of myelinating processes support the possibility that the SWM nests subcortical, regionally variable, distributed neuronal-glial circuits that could influence information transfer. Their hypothetical involvement in regulating the timing and signal transfer probability of the SWM axonal components ought to be considered and experimentally analysed. Thus, the "interstitial" neuronal cells associated with local glial cells, traditionally considered vestigial and functionally inert under normal conditions, may well turn out to be critical in regulating information transfer in the SWM.
Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei
2015-01-01
In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we proposed a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units were defined and analyzed throughout the cattle/beef chain. Second, we described the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain in detail, and explained a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. Then, the traceability system was implemented based on the Fosstrak and FreePastry software packages, and animal ear tag codes and electronic product codes (EPC) were employed to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business, and a sales outlet was used as a case study to evaluate the beef supply chain traceability system. The results demonstrated that the major advantages of the traceability system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain. PMID:26431340
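As a rough illustration of how a traceability unit's movements could be captured as EPCIS-style events, the sketch below uses hypothetical field names and plain Python; it does not use the actual Fosstrak or EPCIS APIs described in the paper.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ObjectEvent:
    """One EPCIS-style observation of a traceability unit."""
    epc: str        # electronic product code of the unit
    biz_step: str   # e.g. "breeding", "slaughtering", "distributing"
    location: str   # identifier of the business/site observing the unit
    when: datetime

# A gapless trace is the time-ordered list of events for one EPC.
trace = [
    ObjectEvent("urn:epc:id:sgtin:0614141.107346.2017", "breeding",
                "farm-042", datetime(2015, 3, 1, tzinfo=timezone.utc)),
    ObjectEvent("urn:epc:id:sgtin:0614141.107346.2017", "slaughtering",
                "plant-007", datetime(2015, 9, 12, tzinfo=timezone.utc)),
]
```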
Music score watermarking by clef modifications
NASA Astrophysics Data System (ADS)
Schmucker, Martin; Yan, Hongning
2003-06-01
In this paper we present a new method for hiding data in music scores. In contrast to previously published algorithms, we investigate the possibilities of embedding information in clefs. Using the clef as information carrier has two advantages: first, a clef is present in each staff line, which guarantees a fixed capacity; second, the clef defines the reference system for musical symbols, so the meaning-carrying symbols, e.g. the notes and the rests, are not degraded by the manipulations. Music scores must be robust against greyscale-to-binary conversion. As a consequence, the information is embedded by modifying the black-and-white distribution of pixels in certain areas. We evaluate simple image processing mechanisms based on erosion and dilation for embedding the information. For retrieving the watermark, the b/w distribution is extracted from the given clef. To solve the synchronization problem, the watermarked clef is normalized in a pre-processing step. The normalization is based on moments. The areas used for watermarking are calculated by image segmentation techniques which consider the features of a clef. We analyze the capacity and robustness of the proposed method under different parameter settings. The method can be combined with other music score watermarking methods to increase the capacity of existing watermarking techniques.
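A minimal sketch of the erosion/dilation embedding idea, assuming a binarized clef region and one bit per region; the paper's moment-based normalization and segmentation steps are omitted here.

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

def embed_bit(region: np.ndarray, bit: int) -> np.ndarray:
    """Embed one bit by nudging the region's black-pixel ratio:
    dilation increases black coverage (bit 1), erosion decreases it
    (bit 0). `region` is a boolean array with True = black ink."""
    return binary_dilation(region) if bit else binary_erosion(region)

def read_bit(region: np.ndarray, reference_ratio: float) -> int:
    """Recover the bit by comparing the region's black-pixel ratio
    against that of an unmarked reference clef."""
    return int(region.mean() > reference_ratio)
```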
Effect of users' opinion evolution on information diffusion in online social networks
NASA Astrophysics Data System (ADS)
Zhu, Hengmin; Kong, Yuehan; Wei, Jing; Ma, Jing
2018-02-01
The process of topic propagation always interweaves information diffusion and opinion evolution, but most previous works have studied models of information diffusion and opinion evolution separately, and seldom focused on their interaction with each other. To shed light on the effect of users' opinion evolution on information diffusion in online social networks, we proposed a model which incorporates opinion evolution into the process of topic propagation. Several real topics propagating on Sina Microblog were collected to analyze individuals' propagation intentions, and different propagation intentions were considered in the model. The topic propagation was simulated to explore the impact of different opinion distributions and of intervention with an opposing opinion on information diffusion. Results show that topics with one-sided opinions spread faster and more widely, and that intervention with an opposing opinion is an effective measure to guide topic propagation. The earlier the intervention, the more effectively the topic propagation is guided.
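A toy version of such a coupled model (not the authors' exact formulation; all parameters here are illustrative) lets the forwarding probability grow with opinion agreement and lets adoption pull the receiver's opinion toward the sender's:

```python
import random

def simulate(n=1000, k=8, steps=50, lam=0.3, shift=0.1, seed=0):
    """Toy coupled diffusion/opinion model: informed users forward a topic
    to neighbours with probability lam scaled by opinion agreement, and a
    user who adopts the topic moves their opinion toward the sender's."""
    rng = random.Random(seed)
    nbrs = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    opinion = [rng.uniform(-1, 1) for _ in range(n)]
    informed = {0}
    for _ in range(steps):
        newly = set()
        for i in informed:
            for j in nbrs[i]:
                if j in informed or j in newly:
                    continue
                agree = 1 - abs(opinion[i] - opinion[j]) / 2  # 1 = identical views
                if rng.random() < lam * agree:
                    newly.add(j)
                    opinion[j] += shift * (opinion[i] - opinion[j])
        informed |= newly  # opinions of already-informed users stay fixed here
    return len(informed), opinion
```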
A highly scalable information system as extendable framework solution for medical R&D projects.
Holzmüller-Laue, Silke; Göde, Bernd; Stoll, Regina; Thurow, Kerstin
2009-01-01
For research projects in preventive medicine, flexible information management is needed that offers free planning and documentation of project-specific examinations. The system should allow simple, preferably automated, data acquisition from several distributed sources (e.g., mobile sensors, stationary diagnostic systems, questionnaires, manual inputs) as well as effective data management, data use, and analysis. An information system fulfilling these requirements has been developed at the Center for Life Science Automation (celisca). This system combines data from multiple investigations and multiple devices and displays them on a single screen. The integration of mobile sensor systems for comfortable, location-independent capture of time-based physiological parameters, and the possibility of observing these measurements directly in the system, enable new scenarios. The web-based information system presented in this paper is configurable through user interfaces. It covers medical process descriptions, operative process data visualization, user-friendly process data handling, modern online interfaces (databases, web services, XML), and comfortable support of extended data analysis with third-party applications.
Emergency healthcare process automation using mobile computing and cloud services.
Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G
2012-10-01
Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services, and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time raises new challenges, including the specification of a common information format, interoperability among heterogeneous institutional information systems, and the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.
Co-evolution of electric and telecommunications networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivkin, S.R.
1998-05-01
There are potentially significant societal benefits in co-evolution between electricity and telecommunications in the areas of common infrastructure, accelerated deployment of distributed energy, tighter integration of information flow for energy management and distribution, and improved customer care. With due regard for natural processes that are more potent than any regulation and more real than any ideology, the gains from co-evolution would far outweigh the attenuated and speculative savings from restructuring of electricity that is too simplistic.
Exact Solution of the Markov Propagator for the Voter Model on the Complete Graph
2014-07-01
distribution of the random walk. This process can also be applied to other models, incomplete graphs, or to multiple dimensions. An advantage of this... since any multiple of an eigenvector remains an eigenvector. Without any loss, let b_k = 1. Now we can ascertain the explicit solution for b_j when k < j... this bound is valid for all initial probability distributions. However, without detailed information about the eigenvectors, we cannot extract more
A local approach for focussed Bayesian fusion
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael; Goussev, Igor; Beyerer, Jürgen
2009-04-01
Local Bayesian fusion approaches aim to reduce the high storage and computational costs of Bayesian fusion while remaining free of fixed modeling assumptions. Using the small-world formalism, we argue why this approach conforms to Bayesian theory. Then, we concentrate on the realization of local Bayesian fusion by focussing the fusion process solely on local regions that are task relevant with a high probability. The resulting local models then correspond to restricted versions of the original one. In a previous publication, we used bounds for the probability of misleading evidence to show the validity of the pre-evaluation of task-specific knowledge and prior information which we perform to build local models. In this paper, we prove the validity of this procedure using information-theoretic arguments. For additional efficiency, local Bayesian fusion can be realized in a distributed manner, in which several local Bayesian fusion tasks are evaluated and unified after the actual fusion process. Software agents are well suited for the practical realization of distributed local Bayesian fusion. There is a natural analogy between the resulting agent-based architecture and criminal investigations in real life. We show how this analogy can be used to further improve the efficiency of distributed local Bayesian fusion. Using a landscape model, we present an experimental study of distributed local Bayesian fusion in the field of reconnaissance, which highlights its high potential.
Understanding the distributed cognitive processes of intensive care patient discharge.
Lin, Frances; Chaboyer, Wendy; Wallis, Marianne
2014-03-01
To better understand and identify vulnerabilities and risks in the ICU patient discharge process, which provides evidence for service improvement. Previous studies have identified that 'after hours' discharge and 'premature' discharge from ICU are associated with increased mortality. However, some of these studies have largely been retrospective reviews of various administrative databases, while others have focused on specific aspects of the process, which may miss crucial components of the discharge process. This is an ethnographic exploratory study. Distributed cognition and activity theory were used as theoretical frameworks. Ethnographic data collection techniques including informal interviews, direct observations and collecting existing documents were used. A total of 56 one-to-one interviews were conducted with 46 participants; 28 discharges were observed; and numerous documents were collected during a five-month period. A triangulated technique was used in both data collection and data analysis to ensure the research rigour. Under the guidance of activity theory and distributed cognition theoretical frameworks, five themes emerged: hierarchical power and authority, competing priorities, ineffective communication, failing to enact the organisational processes and working collaboratively to optimise the discharge process. Issues with teamwork, cognitive processes and team members' interaction with cognitive artefacts influenced the discharge process. Strategies to improve shared situational awareness are needed to improve teamwork, patient flow and resource efficiency. Tools need to be evaluated regularly to ensure their continuous usefulness. Health care professionals need to be aware of the impact of their competing priorities and ensure discharges occur in a timely manner. Activity theory and distributed cognition are useful theoretical frameworks to support healthcare organisational research. © 2013 John Wiley & Sons Ltd.
Research on information security system of waste terminal disposal process
NASA Astrophysics Data System (ADS)
Zhou, Chao; Wang, Ziying; Guo, Jing; Guo, Yajuan; Huang, Wei
2017-05-01
Informatization has penetrated the whole process of production and operation of electric power enterprises. It not only improves the level of lean management and quality of service, but also brings severe security risks. Internal network terminals form the outermost layer, and the most vulnerable nodes, of the internal network boundary. They are widely distributed, deeply deployed, and numerous, and the technical skill and security awareness of users and of operation and maintenance personnel are uneven, which makes the internal network terminal the weakest link in information security. Through the implementation of managerial, technical, and physical security measures, we should establish an internal network terminal security protection system, so as to fully protect internal network terminal information security.
A broadband multimedia TeleLearning system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Ruiping; Karmouch, A.
1996-12-31
In this paper we discuss a broadband multimedia TeleLearning system under development in the Multimedia Information Research Laboratory at the University of Ottawa. The system aims at providing a seamless environment for TeleLearning using the latest telecommunication and multimedia information processing technology. It basically consists of a media production center, a courseware author site, a courseware database, a courseware user site, and an on-line facilitator site. All these components are distributed over an ATM network and work together to offer a multimedia interactive courseware service. An MHEG-based model is exploited in designing the system architecture to achieve real-time, interactive, and reusable information interchange across heterogeneous platforms. The system architecture, courseware processing strategies, and courseware document models are presented.
Use of communication technologies in document exchange for the management of construction projects
NASA Astrophysics Data System (ADS)
Mesároš, Peter; Mandičák, Tomáš
2016-06-01
Information and communication technologies represent a set of people, processes, and technical and software tools providing the collection, transport, storage, and processing of data for the distribution and presentation of information. Communication systems in particular are the main tool for information exchange. These technologies also have a broad range of uses; one of them is the exchange of documents in the management of construction projects. The paper discusses the level of exploitation of communication technologies in construction project management. Its main objective is to analyze this exploitation level; a further aim is to compare the rate of document exchange via electronic communication devices with that of face-to-face communication.
Beyond mind-reading: multi-voxel pattern analysis of fMRI data.
Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V
2006-09-01
A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
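In practice, MVPA decoding often reduces to training an off-the-shelf classifier on rows of voxel patterns; a minimal sketch with synthetic stand-in data (the dimensions, labels, and classifier choice are assumptions, not taken from the review) is:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 500))   # 120 timepoints x 500 voxels (stand-in for real fMRI data)
y = np.repeat([0, 1], 60)         # two conditions, e.g. faces vs. houses

clf = LogisticRegression(max_iter=1000)    # regularized linear decoder
scores = cross_val_score(clf, X, y, cv=5)  # real studies typically use leave-runs-out CV
print(f"mean decoding accuracy: {scores.mean():.2f}")  # ~0.5 here, since the data are noise
```

Above-chance accuracy on held-out data is then taken as evidence that the voxel pattern carries information about the experimental condition.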
Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing
da Rocha, Armando Freitas; Foz, Flávia Benevides; Pereira, Alfredo
2015-01-01
Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism that proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using electroencephalogram while volunteers were listening or reading small texts and had to select pictures that translate meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (s_i) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(e_i) provided by each electrode of the 10/20 system about the identified s_i. H(e_i) Principal Component Analysis (PCA) was used to study the temporal and spatial activation of these sources s_i. This analysis evidenced 4 different patterns of H(e_i) covariation that are generated by neurons located at different cortical locations. These results clearly show that the distributed character of language processing is clearly evidenced by combining available EEG technologies. PMID:26713089
A multi-agent system for coordinating international shipping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsmith, S.Y.; Phillips, L.R.; Spires, S.V.
1998-05-01
Moving commercial cargo across the US-Mexico border is currently a complex, paper-based, error-prone process that incurs expensive inspections and delays at several ports of entry in the Southwestern US. Improved information handling will dramatically reduce border dwell time, variation in delivery time, and inventories, and will give better control of the shipment process. The Border Trade Facilitation System (BTFS) is an agent-based collaborative work environment that assists geographically distributed commercial and government users with transshipment of goods across the US-Mexico border. Software agents mediate the creation, validation and secure sharing of shipment information and regulatory documentation over the Internet, using the World Wide Web to interface with human actors. Agents are organized into agencies, each representing a commercial or government agency. Agents perform four specific functions on behalf of their user organizations: (1) agents with domain knowledge elicit commercial and regulatory information from human specialists through forms presented via web browsers; (2) agents mediate information from forms with diverse ontologies, copying invariant data from one form to another, thereby eliminating the need for duplicate data entry; (3) cohorts of distributed agents coordinate the work flow among the various information providers and monitor overall progress of the documentation and the location of the shipment to ensure that all regulatory requirements are met prior to arrival at the border; (4) agents provide status information to human actors and attempt to influence them when problems are predicted.
Reducing Interpolation Artifacts for Mutual Information Based Image Registration
Soleimani, H.; Khosravifard, M.A.
2011-01-01
Medical image registration methods which use mutual information as a similarity measure have been improved in recent decades. Mutual information is a basic concept of information theory which indicates the dependency of two random variables (or two images). In order to evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, such as Partial Volume (PV) and bilinear, are used to estimate the joint probability distribution. Both of these methods introduce artifacts into the mutual information function. The Partial Volume-Hanning window (PVH) and Generalized Partial Volume (GPV) methods were introduced to remove such artifacts. In this paper we show that the acceptable performance of these methods is not due to their kernel function; it is due to the number of pixels which participate in the interpolation. Since using more pixels requires a more complex and time-consuming interpolation process, we propose a new interpolation method which uses only four pixels (the same as PV and bilinear interpolation) and removes most of the artifacts. Experimental results on the registration of Computed Tomography (CT) images show the superiority of the proposed scheme. PMID:22606673
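For reference, the mutual information of two images \(A\) and \(B\) is computed from the estimated joint distribution as

\[
I(A;B) = \sum_{a,b} p_{AB}(a,b)\,\log\frac{p_{AB}(a,b)}{p_A(a)\,p_B(b)},
\]

which is why the quality of the joint-histogram estimate, and hence the choice of interpolation scheme, directly shapes the registration objective.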
Correlation between information diffusion and opinion evolution on social media
NASA Astrophysics Data System (ADS)
Xiong, Fei; Liu, Yun; Zhang, Zhenjiang
2014-12-01
Information diffusion and opinion evolution are often treated as two independent processes. Opinion models assume the topic reaches each agent and that agents initially have their own ideas. In fact, the processes of information diffusion and opinion evolution often intertwine with each other. Whether the influence between these two processes plays a role in the system state is unclear. In this paper, we collected more than one million records of real data from a well-known social platform, and analysed large-scale user diffusion behaviour and opinion formation. We found that user inter-event times follow a two-scaling power-law distribution with two different power exponents. Public opinion stabilizes quickly and evolves toward convergence, but the consensus state is prevented by a few opponents. We propose a three-state opinion model accompanied by information diffusion. Agents form and exchange their opinions during information diffusion. Conversely, agents' opinions also influence their diffusion actions. Simulations show that the model with a correlation between the two processes produces statistical characteristics similar to the empirical results. A fast epidemic process drives individual opinions to converge more markedly. Unlike in previous epidemic models, the number of infected agents does not always increase with the update rate, but peaks at an intermediate value of the rate.
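A two-scaling power law of the kind reported can be written (with assumed notation for the exponents and the crossover point) as

\[
P(\tau) \propto
\begin{cases}
\tau^{-\alpha_1}, & \tau < \tau_c,\\[2pt]
\tau^{-\alpha_2}, & \tau \ge \tau_c,
\end{cases}
\]

where \(\tau\) is the user inter-event time and \(\tau_c\) marks the crossover between the two scaling regimes.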
MODIS land data at the EROS data center DAAC
Jenkerson, Calli B.; Reed, B.C.
2001-01-01
The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) land data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services support the use of MODIS land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.
Distribution of high-dimensional entanglement via an intra-city free-space link
Steinlechner, Fabian; Ecker, Sebastian; Fink, Matthias; Liu, Bo; Bavaresco, Jessica; Huber, Marcus; Scheidl, Thomas; Ursin, Rupert
2017-01-01
Quantum entanglement is a fundamental resource in quantum information processing and its distribution between distant parties is a key challenge in quantum communications. Increasing the dimensionality of entanglement has been shown to improve robustness and channel capacities in secure quantum communications. Here we report on the distribution of genuine high-dimensional entanglement via a 1.2-km-long free-space link across Vienna. We exploit hyperentanglement, that is, simultaneous entanglement in polarization and energy-time bases, to encode quantum information, and observe high-visibility interference for successive correlation measurements in each degree of freedom. These visibilities impose lower bounds on entanglement in each subspace individually and certify four-dimensional entanglement for the hyperentangled system. The high-fidelity transmission of high-dimensional entanglement under real-world atmospheric link conditions represents an important step towards long-distance quantum communications with more complex quantum systems and the implementation of advanced quantum experiments with satellite links. PMID:28737168
An adaptive grid algorithm for 3-D GIS landform optimization based on improved ant algorithm
NASA Astrophysics Data System (ADS)
Wu, Chenhan; Meng, Lingkui; Deng, Shijun
2005-07-01
The key challenge in 3-D GIS is to realize fast, high-quality 3-D visualization, in which 3-D roaming systems based on landform play an important role. How to increase the efficiency of the 3-D roaming engine and process a large amount of landform data is a central problem in a 3-D landform roaming system, and handling it improperly results in tremendous consumption of system resources. A key design question is therefore how to realize high-speed processing of distributed landform DEM (Digital Elevation Model) data and high-speed distributed modulation of various 3-D landform data resources. In this paper we improve the basic ant algorithm and design a modulation strategy for 3-D GIS landform resources based on the improved algorithm. By introducing initial hypothetical road weights σ_i, the update of the information (pheromone) factors in the original algorithm transforms from τ̃_j to Δτ_j + σ_i, where the weights are determined by the 3-D computational capacity of the nodes in the network environment. During the initial phase of task assignment, increasing the resource information factors of nodes with a high task-completion rate and decreasing those of nodes with a low completion rate drives the load-completion rates toward a common value as quickly as possible; in the later stages of task assignment, the load-balancing ability of the system is then further improved. Experimental results show that by improving the ant algorithm, our system not only removes many disadvantages of the traditional ant algorithm but also, like ants foraging for food, effectively distributes the complicated landform computation across many computers for cooperative processing, yielding satisfying search results.
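A highly simplified sketch of the modified update idea follows: each node's pheromone is seeded with a capacity-derived weight (the σ_i term) and reinforced in proportion to its task-completion rate, so fast nodes attract more of the remaining work. Parameter names and the completion-rate proxy are illustrative assumptions, not the paper's.

```python
import random

def assign_tasks(tasks, capacity, rho=0.5, seed=0):
    """Toy ant-style load balancing: tasks are drawn to nodes in proportion
    to pheromone tau, which starts from capacity-derived weights and is
    updated by evaporation plus a completion-rate reinforcement."""
    rng = random.Random(seed)
    n = len(capacity)
    tau = [c / sum(capacity) for c in capacity]  # initial weights (sigma_i)
    load = [0.0] * n
    for t in tasks:
        node = rng.choices(range(n), weights=tau)[0]
        load[node] += t
        rate = capacity[node] / (1.0 + load[node])      # completion-rate proxy
        tau[node] = (1 - rho) * tau[node] + rho * rate  # evaporation + deposit
    return load
```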
Lendínez, Cristina; Pelegrina, Santiago; Lechuga, M Teresa
2014-01-01
The present study investigates the process of updating representations in working memory (WM) and how similarity between the information involved influences this process. In WM updating tasks, the similarity in terms of numerical distance between the number to be substituted and the new one facilitates the updating process. We aimed to disentangle the possible effect of two dimensions of similarity that may contribute to this numerical effect: numerical distance itself and common digits shared between the numbers involved. Three experiments were conducted in which different ranges of distances and the coincidence between the digits of the two numbers involved in updating were manipulated. Results showed that the two dimensions of similarity had an effect on updating times. The greater the similarity between the information maintained in memory and the new information that substituted it, the faster the updating. This is consistent both with the idea of distributed representations based on features, and with a selective updating process based on a feature overwriting mechanism. Thus, updating in WM can be understood as a selective substitution process influenced by similarity in which only certain parts of the representation stored in memory are changed.
Some intriguing aspects of multiparticle production processes
NASA Astrophysics Data System (ADS)
Wilk, Grzegorz; Włodarczyk, Zbigniew
2018-04-01
Multiparticle production processes provide valuable information about the mechanism of the conversion of the initial energy of projectiles into a number of secondaries by measuring their multiplicity distributions and their distributions in phase space. They therefore serve as a reference point for more involved measurements. Distributions in phase space are usually investigated using the statistical approach, very successful in general but failing in cases of small colliding systems, small multiplicities, and at the edges of the allowed phase space, in which cases the underlying dynamical effects competing with the statistical distributions take over. We discuss an alternative approach, which applies to the whole phase space without detailed knowledge of dynamics. It is based on a modification of the usual statistics by generalizing it to a superstatistical form. We stress particularly the scaling and self-similar properties of such an approach manifesting themselves as the phenomena of the log-periodic oscillations and oscillations of temperature caused by sound waves in hadronic matter. Concerning the multiplicity distributions we discuss in detail the phenomenon of the oscillatory behavior of the modified combinants apparently observed in experimental data.
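The abstract does not spell out the superstatistical form used; the customary example in this line of work (stated here as background, not as the authors' definition) is the Tsallis q-exponential, which replaces the Boltzmann factor:

\[
f(E) \propto \left[1 - (1-q)\,\frac{E}{T}\right]^{\frac{1}{1-q}} \;\xrightarrow{\,q\to 1\,}\; e^{-E/T},
\]

so that \(q \neq 1\) encodes the fluctuations that a purely exponential statistical distribution misses.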
Dynamic Singularity Spectrum Distribution of Sea Clutter
NASA Astrophysics Data System (ADS)
Xiong, Gang; Yu, Wenxian; Zhang, Shuning
2015-12-01
Fractal and multifractal theory have provided new approaches for radar signal processing and target detection against ocean backgrounds. However, related research has mainly focused on the fractal dimension or the multifractal spectrum (MFS) of sea clutter. In this paper, a new dynamic singularity analysis method for sea clutter using the MFS distribution is developed, based on multifractal detrending moving-average analysis (DMA-MFSD). Theoretically, we introduce time information by using the cyclic auto-correlation of sea clutter. For the transient correlation series, the instantaneous singularity spectrum based on the multifractal detrending moving average (MF-DMA) algorithm is calculated, and the dynamic singularity spectrum distribution of sea clutter is acquired. In addition, we analyze the time-varying singularity exponent ranges and the maximum-position function in DMA-MFSD of sea clutter. For real sea clutter data recorded in sea state level III, we analyze the dynamic singularity spectrum distribution and conclude that radar sea clutter is non-stationary with time-varying scaling characteristics, exhibiting a time-varying singularity spectrum distribution under the proposed DMA-MFSD method. The DMA-MFSD method should also serve as a reference for nonlinear dynamics and multifractal signal processing.
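For background, in the standard MF-DMA construction (summarized here from the general method; this paper's dynamic variant adds the time dimension) the series is detrended by a moving average \(\tilde{x}(i)\) of window size \(n\), residual fluctuations are collected per segment \(\nu\), and a \(q\)-th order fluctuation function is formed:

\[
F_\nu^2(n) = \frac{1}{n}\sum_{i=1}^{n}\bigl[x_\nu(i) - \tilde{x}_\nu(i)\bigr]^2,
\qquad
F_q(n) = \left\{\frac{1}{N_n}\sum_{\nu=1}^{N_n}\bigl[F_\nu^2(n)\bigr]^{q/2}\right\}^{1/q} \sim n^{h(q)},
\]

and the generalized Hurst exponents \(h(q)\) yield the singularity spectrum via a Legendre transform.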
High efficiency coherent optical memory with warm rubidium vapour
Hosseini, M.; Sparkes, B.M.; Campbell, G.; Lam, P.K.; Buchler, B.C.
2011-01-01
By harnessing aspects of quantum mechanics, communication and information processing could be radically transformed. Promising forms of quantum information technology include optical quantum cryptographic systems and computing using photons for quantum logic operations. As with current information processing systems, some form of memory will be required. Quantum repeaters, which are required for long distance quantum key distribution, require quantum optical memory as do deterministic logic gates for optical quantum computing. Here, we present results from a coherent optical memory based on warm rubidium vapour and show 87% efficient recall of light pulses, the highest efficiency measured to date for any coherent optical memory suitable for quantum information applications. We also show storage and recall of up to 20 pulses from our system. These results show that simple warm atomic vapour systems have clear potential as a platform for quantum memory. PMID:21285952
Links, Amanda E.; Draper, David; Lee, Elizabeth; Guzman, Jessica; Valivullah, Zaheer; Maduro, Valerie; Lebedev, Vlad; Didenko, Maxim; Tomlin, Garrick; Brudno, Michael; Girdea, Marta; Dumitriu, Sergiu; Haendel, Melissa A.; Mungall, Christopher J.; Smedley, Damian; Hochheiser, Harry; Arnold, Andrew M.; Coessens, Bert; Verhoeven, Steven; Bone, William; Adams, David; Boerkoel, Cornelius F.; Gahl, William A.; Sincan, Murat
2016-01-01
The National Institutes of Health Undiagnosed Diseases Program (NIH UDP) applies translational research systematically to diagnose patients with undiagnosed diseases. The challenge is to implement an information system enabling scalable translational research. The authors hypothesized that similar complex problems are resolvable through process management and the distributed cognition of communities. The team, therefore, built the NIH UDP integrated collaboration system (UDPICS) to form virtual collaborative multidisciplinary research networks or communities. UDPICS supports these communities through integrated process management, ontology-based phenotyping, biospecimen management, cloud-based genomic analysis, and an electronic laboratory notebook. UDPICS provided a mechanism for efficient, transparent, and scalable translational research and thereby addressed many of the complex and diverse research and logistical problems of the NIH UDP. Full definition of the strengths and deficiencies of UDPICS will require formal qualitative and quantitative usability and process improvement measurement. PMID:27785453
Random walks on activity-driven networks with attractiveness
NASA Astrophysics Data System (ADS)
Alessandretti, Laura; Sun, Kaiyuan; Baronchelli, Andrea; Perra, Nicola
2017-05-01
Virtually all real-world networks are dynamical entities. In social networks, the propensity of nodes to engage in social interactions (activity) and their chances to be selected by active nodes (attractiveness) are heterogeneously distributed. Here, we present a time-varying network model where each node and the dynamical formation of ties are characterized by these two features. We study how these properties affect random-walk processes unfolding on the network when the time scales describing the process and the network evolution are comparable. We derive analytical solutions for the stationary state and the mean first-passage time of the process, and we study cases informed by empirical observations of social networks. Our work shows that previously disregarded properties of real social systems, such as heterogeneous distributions of activity and attractiveness as well as the correlations between them, substantially affect the dynamical process unfolding on the network.
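A toy simulation of a walker on such a time-varying network is sketched below; the Pareto choices for activity and attractiveness, and all parameters, are illustrative assumptions rather than the paper's setup.

```python
import random
from collections import Counter

def walk_on_activity_driven(n=500, m=3, steps=2000, seed=1):
    """Each step: active nodes fire and wire m temporary links to targets
    drawn in proportion to attractiveness; the walker hops along one
    incident link (if any); the network is then discarded and rebuilt."""
    rng = random.Random(seed)
    act = [rng.paretovariate(2.1) for _ in range(n)]    # activities a_i
    attr = [rng.paretovariate(2.1) for _ in range(n)]   # attractiveness b_i
    a_max = max(act)
    walker, visits = 0, Counter()
    for _ in range(steps):
        edges = []
        for i in range(n):
            if rng.random() < act[i] / a_max:           # node i becomes active
                for j in rng.choices(range(n), weights=attr, k=m):
                    if j != i:
                        edges.append((i, j))
        incident = [e for e in edges if walker in e]
        if incident:
            u, v = rng.choice(incident)
            walker = v if walker == u else u
        visits[walker] += 1
    return visits  # occupation statistics approximate the stationary state
```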
Semantics-based distributed I/O with the ParaMEDIC framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, P.; Feng, W.; Lin, H.
2008-01-01
Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing' which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.
Scheduling based on a dynamic resource connection
NASA Astrophysics Data System (ADS)
Nagiyev, A. E.; Botygin, I. A.; Shersntneva, A. I.; Konyaev, P. A.
2017-02-01
The practical use of distributed computing systems is associated with many problems, including organizing effective interaction between the agents located at the nodes of the system, configuring each node of the system to perform a certain task, distributing the available information and computational resources of the system effectively, and controlling the multithreading that implements the logic of solving research problems. The article describes a method of computing load balancing in distributed automatic systems, focused on multi-agent and multi-threaded data processing. A scheme for controlling the processing of requests from terminal devices is offered, providing effective dynamic scaling of computing power under peak load. Results of model experiments with the developed load-scheduling algorithm are set out; they show that the algorithm remains effective even as the number of connected nodes and the scale of the distributed computing system's architecture grow significantly.
Neural bases of orthographic long-term memory and working memory in dysgraphia
Purcell, Jeremy; Hillis, Argye E.; Capasso, Rita; Miceli, Gabriele
2016-01-01
Spelling a word involves the retrieval of information about the word’s letters and their order from long-term memory as well as the maintenance and processing of this information by working memory in preparation for serial production by the motor system. While it is known that brain lesions may selectively affect orthographic long-term memory and working memory processes, relatively little is known about the neurotopographic distribution of the substrates that support these cognitive processes, or the lesions that give rise to the distinct forms of dysgraphia that affect these cognitive processes. To examine these issues, this study uses a voxel-based mapping approach to analyse the lesion distribution of 27 individuals with dysgraphia subsequent to stroke, who were identified on the basis of their behavioural profiles alone, as suffering from deficits only affecting either orthographic long-term or working memory, as well as six other individuals with deficits affecting both sets of processes. The findings provide, for the first time, clear evidence of substrates that selectively support orthographic long-term and working memory processes, with orthographic long-term memory deficits centred in either the left posterior inferior frontal region or left ventral temporal cortex, and orthographic working memory deficits primarily arising from lesions of the left parietal cortex centred on the intraparietal sulcus. These findings also contribute to our understanding of the relationship between the neural instantiation of written language processes and spoken language, working memory and other cognitive skills. PMID:26685156
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of bruise healing. By applying Monte Carlo simulation of laser energy deposition and simulating the corresponding PPTR signal, quantitative analysis of the underlying healing processes becomes possible. Objective fitting enables a direct comparison between simulated and experimental PPTR signals; in this manner, we avoid reconstructing laser-induced depth profiles and the inherent loss of information that entails. This approach enables us to determine the value of hemoglobin mass diffusivity, which is disputed in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
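As a rough illustration of what objective fitting means here, the sketch below performs a nonlinear least-squares fit of a forward model to a noisy synthetic signal. The real forward model (Monte Carlo laser energy deposition plus PPTR signal simulation) is replaced by a stand-in exponential so the example runs; the parameter names are assumptions.

import numpy as np
from scipy.optimize import least_squares

def forward_model(params, t):
    # Stand-in for: simulate_pptr_signal(depth_profile(params))
    amplitude, diffusivity = params
    return amplitude * np.exp(-diffusivity * t)

def residuals(params, t, measured):
    # Objective fitting: compare simulated and measured signals directly.
    return forward_model(params, t) - measured

t = np.linspace(0.0, 1.0, 200)
true_params = [1.5, 3.0]
measured = forward_model(true_params, t) \
    + 0.01 * np.random.default_rng(0).normal(size=t.size)

fit = least_squares(residuals, x0=[1.0, 1.0], args=(t, measured))
print(fit.x)   # recovered amplitude and (stand-in) diffusivity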
The Dynamics of Curriculum Revision.
ERIC Educational Resources Information Center
LaPorte, Diane Howard; LaPorte, Ronald E.
This research study was undertaken in order to understand the dynamics of curriculum revision. The study examines reasons for change, persons involved in revision, frequency of revision, ways of evaluating a revised curriculum, and consistency of revision processes across school districts. Information was obtained through surveys distributed to…
A Debugger for Computational Grid Applications
NASA Technical Reports Server (NTRS)
Hood, Robert; Jost, Gabriele; Biegel, Bryan (Technical Monitor)
2001-01-01
This viewgraph presentation gives an overview of a debugger for computational grid applications. Details are given on the NAS parallel tools group (including parallelization support tools, evaluation of various parallelization strategies, and distributed and aggregated computing), debugger dependencies, scalability, the initial implementation, the process grid, and information on Globus.
2005-04-01
process to promptly move supplies from the United States to a customer. GAO found that conflicting doctrinal responsibilities for distribution ... management, improperly packed shipments, insufficient transportation personnel and equipment, and inadequate information systems prevented the timely
Associations between Temperament and Social Responsiveness in Young Children
ERIC Educational Resources Information Center
Salley, Brenda; Miller, Angela; Bell, Martha Ann
2013-01-01
Recent research has demonstrated that social responsiveness (comprised of social awareness, social information processing, reciprocal social communication, social motivation, and repetitive/restricted interests) is continuously distributed within the general population. In the present study, we consider temperament as a co-occurring source of…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-27
... care professional in an inpatient or outpatient setting, dialysis unit, oncology setting, or operating... of stakeholders, including interested prescribers, pharmacists, other health care professionals... to health care delivery processes (e.g., medical practice, pharmacy practice)? Are there situations...
32 CFR 651.35 - Decision process.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) ENVIRONMENTAL QUALITY... further information or receipt of public comments (see § 651.47). (d) The FNSI is normally no more than... public for review through traditional publication and distribution, and through the World Wide Web (WWW...
NASA Integrated Services Environment
NASA Technical Reports Server (NTRS)
Ing, Sharon
2005-01-01
This slide presentation will begin with a discussion on NASA's current distributed environment for directories, identity management and account management. We will follow with information concerning the drivers, design, reviews and implementation of the NISE Project. The final component of the presentation discusses processes used, status and conclusions.
Innovations in clinical trials informatics.
Summers, Ron; Vyas, Hiten; Dudhal, Nilesh; Doherty, Neil F; Coombs, Crispin R; Hepworth, Mark
2008-01-01
This paper investigates innovations in information management for use in clinical trials. The application typifies a complex, adaptive, distributed and information-rich environment for which continuous innovation is necessary. Organisational innovation is highlighted, as well as technical innovations in workflow processes and their representation as an integrated set of web services. Benefits realization uncovers further innovations in the business strand of the work. Following the description of the development of this information management system, the semantic web is postulated as a possible means to tame the complexity of information management within clinical trials support systems.
A Scientific Data Provenance API for Distributed Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.
Data provenance has been an active area of research as a means to standardize how the origin of data, the history of process events, and what or who influenced the results are explained. There are two approaches to capturing provenance information. The first is to collect observed evidence produced by an executing application, using log files, event listeners, and temporary files used by the application or application developer; the provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed, because the application provides a firsthand account of the provenance based on the anticipated questions about data flow, process flow, and responsible agents. Most observed-provenance collection systems gather large amounts of provenance information during an application run or workflow execution; the common trend is to collect all possible information and then attempt to find the relevant parts, which is not efficient. Existing disclosed-provenance system APIs do not work well in distributed environments and have trouble determining where individual pieces of provenance information fit. This work focuses on more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API called the Producer API (PAPI) was developed, which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification. The disclosure approach adds metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support provenance integration across disclosure sources. The API also gives the user the flexibility to decide what to do with the collected provenance: it can be sent to a triple store using REST services, or it can be logged to a file.
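As an illustration of the disclosed-provenance style described above, here is a hypothetical minimal API in Python; none of these names are the real PAPI interface, and the endpoint URL is invented. Each object carries a globally unique identifier so pieces disclosed from different nodes of a distributed run can later be connected, and the caller chooses whether the triples are logged locally or sent to a triple store over REST.

import json
import uuid
from urllib import request

TRIPLE_STORE_URL = "http://example.org/provenance"   # assumed endpoint

def new_entity(label):
    # Unique IDs let provenance pieces from different nodes be linked.
    return {"id": f"urn:uuid:{uuid.uuid4()}", "label": label}

def disclose(activity, used, generated, sink=None):
    """Disclose that `activity` used `used` and generated `generated`."""
    triples = [
        (activity["id"], "prov:used", used["id"]),
        (generated["id"], "prov:wasGeneratedBy", activity["id"]),
    ]
    if sink is None:
        print(json.dumps(triples))       # user's choice: log locally...
    else:
        body = json.dumps(triples).encode()
        req = request.Request(sink, data=body,
                              headers={"Content-Type": "application/json"})
        request.urlopen(req)             # ...or POST to a triple store

raw = new_entity("raw sensor file")
step = new_entity("calibration step")
out = new_entity("calibrated data")
disclose(step, used=raw, generated=out)  # logged locally by default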
Distributed collaborative environments for virtual capability-based planning
NASA Astrophysics Data System (ADS)
McQuay, William K.
2003-09-01
Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on implementing it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, identification of primary risk drivers, and creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.
NASA Technical Reports Server (NTRS)
Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen;
2012-01-01
For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.
Mathematical model of information process of protection of the social sector
NASA Astrophysics Data System (ADS)
Novikov, D. A.; Tsarkova, E. G.; Dubrovin, A. S.; Soloviev, A. S.
2018-03-01
This work investigates a mathematical model of protecting society against the spread of extremist sentiment, based on influencing mass consciousness through information placed in the media. Internal and external channels of information dissemination are identified. An optimization problem is solved: finding the strategy that uses the media most effectively to disseminate counter-terrorist information at minimum financial cost. An algorithm for the numerical solution of the optimization problem is constructed, and the results of a computational experiment are analyzed.
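The abstract does not give the model's form, so the toy below only mirrors its spirit: a linear program that chooses spending per media channel to minimize total cost while meeting an audience-coverage requirement. The reach coefficients and coverage target are invented numbers.

from scipy.optimize import linprog

cost = [1.0, 1.0, 1.0]    # spend per channel, measured directly in cost
reach = [0.8, 0.5, 0.3]   # assumed audience reached per unit spend
required_reach = 10.0     # assumed coverage target

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so the coverage
# requirement reach @ x >= required_reach is expressed with a sign flip.
res = linprog(c=cost,
              A_ub=[[-r for r in reach]],
              b_ub=[-required_reach],
              bounds=[(0, None)] * 3)
print(res.x, res.fun)     # optimal spend per channel and total cost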
A convergent model for distributed processing of Big Sensor Data in urban engineering networks
NASA Astrophysics Data System (ADS)
Parygin, D. S.; Finogeev, A. G.; Kamaev, V. A.; Finogeev, A. A.; Gnedkova, E. P.; Tyukov, A. P.
2017-01-01
The development and study of a convergent model combining grid, cloud, fog and mobile computing for analytical processing of Big Sensor Data are reviewed. The model is intended for building monitoring systems for spatially distributed objects and processes of urban engineering networks. The proposed approach is a convergence model for organizing distributed data processing. The fog computing model is used to process and aggregate sensor data at network nodes and/or industrial controllers, onto which program agents are loaded to perform the primary processing and data aggregation tasks. The grid and cloud computing models are used to mine and accumulate integral indicators. The computing cluster has a three-tier architecture: the main server at the first level, a cluster of SCADA system servers at the second level, and a set of GPU video cards supporting the Compute Unified Device Architecture at the third level. The mobile computing model is applied to visualize the results of the analysis with elements of augmented reality and geo-information technologies. The integral indicators are transferred to the data center and accumulated in a multidimensional storage for data mining and knowledge discovery.
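A minimal sketch of the fog tier's role as described above (the window size and indicator set are assumptions): raw sensor samples are reduced to compact integral indicators at the node, and only those travel upward to the cluster and cloud tiers.

from statistics import mean

WINDOW = 60   # assumed: samples aggregated per indicator record

def fog_aggregate(samples):
    """Reduce a window of raw readings to integral indicators."""
    return {
        "count": len(samples),
        "mean": mean(samples),
        "min": min(samples),
        "max": max(samples),
    }

def fog_agent(stream, upload):
    # Runs at the network node / industrial controller.
    buffer = []
    for sample in stream:
        buffer.append(sample)
        if len(buffer) == WINDOW:
            upload(fog_aggregate(buffer))   # to the SCADA/cloud tier
            buffer.clear()

# Toy run: 180 raw readings become just 3 compact indicator records.
fog_agent(iter(range(180)), upload=print)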
NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2015-01-01
NASA's Earth Science Data Systems (ESDS) Program has evolved over the last two decades, and currently has several core and community components. Core components provide the basic operational capabilities to process, archive, manage and distribute data from NASA missions. Community components provide a path for peer-reviewed research in Earth Science Informatics to feed into the evolution of the core components. The Earth Observing System Data and Information System (EOSDIS) is a core component consisting of twelve Distributed Active Archive Centers (DAACs) and eight Science Investigator-led Processing Systems spread across the U.S. The presentation covers how the ESDS Program continues to evolve and benefits from as well as contributes to advances in Earth Science Informatics.
Virtual detector theory for strong-field atomic ionization
NASA Astrophysics Data System (ADS)
Wang, Xu; Tian, Justin; Eberly, J. H.
2018-04-01
A virtual detector (VD) is an imaginary device located at a fixed position in space that extracts information from the wave packet passing through it. By recording the particle momentum and the corresponding probability current at each time, the VDs can accumulate and build the differential momentum distribution of the particle, in a way that resembles real experiments. A mathematical proof is given for the equivalence of the differential momentum distribution obtained by the VD method and by Fourier transforming the wave function. In addition to being a tool for reducing the computational load, VDs have also been found useful in interpreting the ultrafast strong-field ionization process, especially the controversial quantum tunneling process.
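A minimal one-dimensional illustration of the VD method (atomic units, hbar = m = 1; the free Gaussian wave packet is just a stand-in for a real strong-field simulation): at a fixed grid point, the local momentum k = Im(psi* dpsi/dx)/|psi|^2 and the probability current j are recorded at each time step, and a flux-weighted histogram of k approximates the momentum distribution.

import numpy as np

# Gaussian wave packet with mean momentum k0, normalized on the grid.
x = np.linspace(-50, 150, 4096)
dx = x[1] - x[0]
k0, sigma = 2.0, 5.0
psi = np.exp(-x**2 / (2 * sigma**2) + 1j * k0 * x)
psi = psi / np.sqrt(np.sum(np.abs(psi)**2) * dx)

k_grid = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)

def propagate(psi, dt):
    # Free-particle propagation, exact in momentum space.
    return np.fft.ifft(np.exp(-1j * k_grid**2 * dt / 2) * np.fft.fft(psi))

det = np.argmin(np.abs(x - 60.0))   # virtual detector at x_d = 60
dt, momenta, weights = 0.05, [], []
for _ in range(1200):
    psi = propagate(psi, dt)
    dpsi = (psi[det + 1] - psi[det - 1]) / (2 * dx)   # central difference
    density = abs(psi[det])**2
    current = (np.conj(psi[det]) * dpsi).imag          # j = Im(psi* psi')
    if density > 1e-12 and current > 0:                # outgoing flux only
        momenta.append(current / density)              # local momentum
        weights.append(current * dt)                   # flux weight

hist, edges = np.histogram(momenta, bins=50, weights=weights)
print(edges[np.argmax(hist)])   # peaks near the packet momentum k0 = 2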
User Manual for SAHM package for VisTrails
Talbert, C.B.; Talbert, M.K.
2012-01-01
The Software for Assisted Habitat Modeling (SAHM) has been created to expedite habitat modeling and to help maintain a record of the various input data, pre- and post-processing steps, and modeling options incorporated in the construction of a species distribution model. The four main advantages of using the combined VisTrails:SAHM package for species distribution modeling are: (1) formalization and tractable recording of the entire modeling process; (2) easier collaboration through a common modeling framework; (3) a user-friendly graphical interface to manage file input, model runs, and output; and (4) extensibility to incorporate future and additional modeling routines and tools. This user manual provides detailed information on each module within the SAHM package, its inputs, outputs, common connections, optional arguments, and default settings. This information can also be accessed for individual modules by right-clicking on the documentation button for any module in VisTrails, or by right-clicking on any input or output of a module and selecting "view documentation". This user manual is intended to accompany the user guide, which provides detailed instructions on how to install the SAHM package within VisTrails and presents information on the use of the package.
Risk analysis for biological hazards: What we need to know about invasive species
Stohlgren, T.J.; Schnase, J.L.
2006-01-01
Risk analysis for biological invasions is similar to other types of natural and human hazards. For example, risk analysis for chemical spills requires the evaluation of basic information on where a spill occurs; exposure level and toxicity of the chemical agent; knowledge of the physical processes involved in its rate and direction of spread; and potential impacts to the environment, economy, and human health relative to containment costs. Unlike typical chemical spills, biological invasions can have long lag times from introduction and establishment to successful invasion, they reproduce, and they can spread rapidly by physical and biological processes. We use a risk analysis framework to suggest a general strategy for risk analysis for invasive species and invaded habitats. It requires: (1) problem formulation (scoping the problem, defining assessment endpoints); (2) analysis (information on species traits, matching species traits to suitable habitats, estimating exposure, surveys of current distribution and abundance); (3) risk characterization (understanding of data completeness, estimates of the “potential” distribution and abundance; estimates of the potential rate of spread; and probable risks, impacts, and costs); and (4) risk management (containment potential, costs, and opportunity costs; legal mandates and social considerations and information science and technology needs).
Kalvelage, Thomas A.; Willems, Jennifer
2005-01-01
The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.