Sample records for information processing time

  1. Process connectivity in a naturally prograding river delta

    NASA Astrophysics Data System (ADS)

    Sendrowski, Alicia; Passalacqua, Paola

    2017-03-01

    River deltas are lowland systems that can display high hydrological connectivity. This connectivity can be structural (morphological connections), functional (control of fluxes), and process connectivity (information flow from system drivers to sinks). In this work, we quantify hydrological process connectivity in Wax Lake Delta, coastal Louisiana, by analyzing couplings among external drivers (discharge, tides, and wind) and water levels recorded at five islands and one channel over summer 2014. We quantify process connections with information theory, a branch of mathematics concerned with the communication of information. We represent process connections as a network; variables serve as network nodes and couplings as network links describing the strength, direction, and time scale of information flow. Comparing process connections at long (105 days) and short (10 days) time scales, we show that tides exhibit daily synchronization with water level, with decreasing strength from downstream to upstream, and that tides transfer information as tides transition from spring to neap. Discharge synchronizes with water level and the time scale of its information transfer compares well to physical travel times through the system, computed with a hydrodynamic model. Information transfer and physical transport show similar spatial patterns, although information transfer time scales are larger than physical travel times. Wind events associated with water level setup lead to increased process connectivity with highly variable information transfer time scales. We discuss the information theory results in the context of the hydrologic behavior of the delta, the role of vegetation as a connector/disconnector on islands, and the applicability of process networks as tools for delta modeling results.
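
    The couplings described here are typically quantified with information-theoretic measures such as transfer entropy. As a purely illustrative sketch (not the authors' code or data), the following estimates a histogram-based transfer entropy from a driver series (e.g., discharge) to a response series (e.g., water level); the bin count, lag, and toy data are assumptions.

    ```python
    import numpy as np

    def transfer_entropy(source, target, lag=1, bins=8):
        """Histogram estimate of T(source -> target) in bits:
        I(target_t ; source_{t-lag} | target_{t-lag})."""
        x = np.digitize(source, np.histogram_bin_edges(source, bins))
        y = np.digitize(target, np.histogram_bin_edges(target, bins))
        yt, ytp, xtp = y[lag:], y[:-lag], x[:-lag]

        def H(*cols):  # joint Shannon entropy of the given symbol columns
            _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        # TE = H(y_t, y_past) + H(y_past, x_past) - H(y_past) - H(y_t, y_past, x_past)
        return H(yt, ytp) + H(ytp, xtp) - H(ytp) - H(yt, ytp, xtp)

    # Toy data: water level follows discharge with a 2-step delay (assumed, for illustration)
    rng = np.random.default_rng(0)
    discharge = rng.normal(size=5000)
    water_level = 0.8 * np.roll(discharge, 2) + 0.2 * rng.normal(size=5000)
    print(transfer_entropy(discharge, water_level, lag=2))
    ```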

  2. Optimal regulation in systems with stochastic time sampling

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Lee, P. S.

    1980-01-01

    An optimal control theory that accounts for stochastic variable time sampling in a distributed microprocessor based flight control system is presented. The theory is developed by using a linear process model for the airplane dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved for the control law that minimizes the expected value of a quadratic cost function. The optimal cost obtained with a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained with a known and uniform information update interval.

  3. Knowledge, as the Result of the Processed Information by Human's Sub-particles (substrings)/Mind in our Brain

    NASA Astrophysics Data System (ADS)

    Gholibeigian, Hassan

    In my vision, there are four animated sub-particles (matter, plant, animal, and human sub-particles) as the origin of life and the creator of momentum in each fundamental particle (string). They communicate with the dimension of information, which is nested with space-time, to get a package of information in each Planck time. They are the link-point between the dimension of information and space-time. The sub-particle, which identifies its fundamental particle, processes the package of information to find its next step. Processed information is always carried by fundamental particles as the history of the universe and enhances its entropy. My proposed formula for calculating the number of packages is I = t_P^-1 · τ, where t_P is the Planck time and τ is the fundamental particle's lifetime. For example, a photon needs to process 1.8 × 10^43 packages of information to find its path in one second. Each process takes place faster than the speed of light. In our bodies, human sub-particles (sub-strings) communicate with the dimension of information and get packages of information, including standard ethics, to process and find their next step. The processed information is transformed into knowledge in our mind. This knowledge is always carried by us.
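
    As a quick plausibility check of the figure quoted above (an editorial addition, not part of the original abstract), inserting the standard value of the Planck time into I = t_P^-1 · τ with τ = 1 s gives

    ```latex
    I \;=\; \frac{\tau}{t_P} \;=\; \frac{1\ \mathrm{s}}{5.39 \times 10^{-44}\ \mathrm{s}} \;\approx\; 1.85 \times 10^{43},
    ```

    which is consistent with the 1.8 × 10^43 packages per second cited in the abstract.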

  4. Cerebro-cerebellar interactions underlying temporal information processing.

    PubMed

    Aso, Kenji; Hanakawa, Takashi; Aso, Toshihiko; Fukuyama, Hidenao

    2010-12-01

    The neural basis of temporal information processing remains unclear, but it is proposed that the cerebellum plays an important role through its internal clock or feed-forward computation functions. In this study, fMRI was used to investigate the brain networks engaged in perceptual and motor aspects of subsecond temporal processing without accompanying coprocessing of spatial information. Direct comparison between perceptual and motor aspects of time processing was made with a categorical-design analysis. The right lateral cerebellum (lobule VI) was active during a time discrimination task, whereas the left cerebellar lobule VI was activated during a timed movement generation task. These findings were consistent with the idea that the cerebellum contributed to subsecond time processing in both perceptual and motor aspects. The feed-forward computational theory of the cerebellum predicted increased cerebro-cerebellar interactions during time information processing. In fact, a psychophysiological interaction analysis identified the supplementary motor and dorsal premotor areas, which had a significant functional connectivity with the right cerebellar region during a time discrimination task and with the left lateral cerebellum during a timed movement generation task. The involvement of cerebro-cerebellar interactions may provide supportive evidence that temporal information processing relies on the simulation of timing information through feed-forward computation in the cerebellum.

  5. GEOTAIL Spacecraft historical data report

    NASA Technical Reports Server (NTRS)

    Boersig, George R.; Kruse, Lawrence F.

    1993-01-01

    The purpose of this GEOTAIL Historical Report is to document ground processing operations information gathered on the GEOTAIL mission during processing activities at the Cape Canaveral Air Force Station (CCAFS). It is hoped that this report may aid management analysis, improve integration processing and forecasting of processing trends, and reduce real-time schedule changes. The GEOTAIL payload is the third Delta 2 Expendable Launch Vehicle (ELV) mission to document historical data. Comparisons of planned versus as-run schedule information are displayed. Information will generally fall into the following categories: (1) payload stay times (payload processing facility/hazardous processing facility/launch complex-17A); (2) payload processing times (planned, actual); (3) schedule delays; (4) integrated test times (experiments/launch vehicle); (5) unique customer support requirements; (6) modifications performed at facilities; (7) other appropriate information (Appendices A & B); and (8) lessons learned (reference Appendix C).

  6. Forced guidance and distribution of practice in sequential information processing.

    NASA Technical Reports Server (NTRS)

    Decker, L. R.; Rogers, C. A., Jr.

    1973-01-01

    Distribution of practice and forced guidance were used in a sequential information-processing task in an attempt to increase the capacity of human information-processing mechanisms. A reaction time index of the psychological refractory period was used as the response measure. Massing of practice lengthened response times while forced guidance shortened them. Interpretation was in terms of load reduction upon the response-selection stage of the information-processing system.

  7. Quantum process tomography with informational incomplete data of two J-coupled heterogeneous spins relaxation in a time window much greater than T1

    NASA Astrophysics Data System (ADS)

    Maciel, Thiago O.; Vianna, Reinaldo O.; Sarthour, Roberto S.; Oliveira, Ivan S.

    2015-11-01

    We reconstruct the time-dependent quantum map corresponding to the relaxation process of a two-spin system in liquid-state NMR at room temperature. By means of quantum tomography techniques that handle informationally incomplete data, we show how to properly post-process and normalize the measurement data for the simulation of quantum information processing, overcoming the unknown number of molecules prepared in a non-equilibrium magnetization state (Nj) by an initial sequence of radiofrequency pulses. From the reconstructed quantum map, we infer both the longitudinal (T1) and transversal (T2) relaxation times, and introduce the J-coupling relaxation times (T1J, T2J), which are relevant for quantum information processing simulations. We show that the map associated with the relaxation process cannot be assumed to be approximately unital and trace-preserving for times greater than T2J.

  8. 22 CFR 706.30 - Timing of responses to requests.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 706.30 Foreign Relations OVERSEAS PRIVATE INVESTMENT CORPORATION ADMINISTRATIVE PROVISIONS INFORMATION DISCLOSURE UNDER THE FREEDOM OF INFORMATION ACT Processing of Requests § 706.30 Timing of responses to... information; (2) A request for expedited processing may be made at any time. (3) A requester who seeks...

  9. Information distribution in distributed microprocessor based flight control systems

    NASA Technical Reports Server (NTRS)

    Montgomery, R. C.; Lee, P. S.

    1977-01-01

    This paper presents an optimal control theory that accounts for variable time intervals in the information distribution to control effectors in a distributed microprocessor based flight control system. The theory is developed using a linear process model for the aircraft dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved that provides the control law that minimizes the expected value of a quadratic cost function. An example is presented where the theory is applied to the control of the longitudinal motions of the F8-DFBW aircraft. Theoretical and simulation results indicate that, for the example problem, the optimal cost obtained using a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained using a known uniform information update interval.
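
    As an illustration of the setting only (not the paper's F8-DFBW model or its optimal control law), the sketch below simulates a double-integrator plant under zero-order-hold control whose update intervals are drawn at random, using a fixed discrete LQR gain designed for the mean interval; the plant, interval distribution, and cost weights are all assumptions.

    ```python
    import numpy as np
    from scipy.linalg import solve_discrete_are

    def discretize(dt):
        # Exact zero-order-hold discretization of a double integrator x' = [[0,1],[0,0]]x + [[0],[1]]u
        Ad = np.array([[1.0, dt], [0.0, 1.0]])
        Bd = np.array([[0.5 * dt**2], [dt]])
        return Ad, Bd

    # Fixed LQR gain designed for the *mean* sampling interval (a deliberate simplification)
    dt_mean, Q, R = 0.1, np.eye(2), np.array([[0.1]])
    Ad, Bd = discretize(dt_mean)
    P = solve_discrete_are(Ad, Bd, Q, R)
    K = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)

    # Simulate with stochastic update intervals and accumulate the quadratic cost
    rng = np.random.default_rng(1)
    x, cost = np.array([[1.0], [0.0]]), 0.0
    for _ in range(500):
        dt = rng.uniform(0.05, 0.15)      # the next update time is known only statistically
        Ad, Bd = discretize(dt)
        u = -K @ x
        cost += (x.T @ Q @ x + u.T @ R @ u).item()
        x = Ad @ x + Bd @ u
    print("final state:", x.ravel(), " accumulated cost:", round(cost, 3))
    ```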

  10. How does information congruence influence diagnosis performance?

    PubMed

    Chen, Kejin; Li, Zhizhong

    2015-01-01

    Diagnosis performance is critical for the safety of high-consequence industrial systems. It depends highly on the information provided, perceived, interpreted and integrated by operators. This article examines the influence of information congruence (congruent information vs. conflicting information vs. missing information) and its interaction with time pressure (high vs. low) on diagnosis performance on a simulated platform. The experimental results reveal that the participants confronted with conflicting information spent significantly more time generating correct hypotheses and rated the results with lower probability values than when confronted with the other two levels of information congruence and were more prone to arrive at a wrong diagnosis result than when they were provided with congruent information. This finding stresses the importance of the proper processing of non-congruent information in safety-critical systems. Time pressure significantly influenced display switching frequency and completion time. This result indicates the decisive role of time pressure. Practitioner Summary: This article examines the influence of information congruence and its interaction with time pressure on human diagnosis performance on a simulated platform. For complex systems in the process control industry, the results stress the importance of the proper processing of non-congruent information in safety-critical systems.

  11. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
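
    A minimal numerical sketch of the idea (not the authors' state-space framework): for jointly Gaussian processes, the linear transfer entropy equals half the log-ratio of prediction-error variances, and rescaling is done here by simple non-overlapping window averaging; the VAR coefficients, model order, and scales are assumptions.

    ```python
    import numpy as np

    def rescale(x, scale):
        # Coarse-grain by averaging non-overlapping windows of length `scale`
        n = (len(x) // scale) * scale
        return x[:n].reshape(-1, scale).mean(axis=1)

    def lagged(x, p):
        # Design matrix whose row for time t holds x[t-1], ..., x[t-p], for t = p .. len(x)-1
        n = len(x)
        return np.column_stack([x[p - j:n - j] for j in range(1, p + 1)])

    def resid_var(y, X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.var(y - X @ beta)

    def linear_te(source, target, p=2):
        # Gaussian transfer entropy source -> target: 0.5 * ln(restricted / full error variance)
        y = target[p:]
        Xr = lagged(target, p)                       # target past only
        Xf = np.hstack([Xr, lagged(source, p)])      # target past + source past
        return 0.5 * np.log(resid_var(y, Xr) / resid_var(y, Xf))

    # Unidirectionally coupled VAR(1): x drives y (coefficients are assumptions)
    rng = np.random.default_rng(2)
    n = 20000
    x, y = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        x[t] = 0.7 * x[t - 1] + rng.normal()
        y[t] = 0.5 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()

    for scale in (1, 2, 4, 8):
        xs, ys = rescale(x, scale), rescale(y, scale)
        print(f"scale {scale}:  TE x->y = {linear_te(xs, ys):.3f}   TE y->x = {linear_te(ys, xs):.3f}")
    ```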

  12. 12 CFR 404.5 - Time for processing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Time for processing. 404.5 Section 404.5 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES INFORMATION DISCLOSURE Procedures for Disclosure of Records Under the Freedom of Information Act. § 404.5 Time for processing. (a) General. Ex-Im Bank...

  13. Vortex information display system program description manual. [data acquisition from laser Doppler velocimeters and real time operation

    NASA Technical Reports Server (NTRS)

    Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.

    1975-01-01

    A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.

  14. Text Processing: The Role of Reader Expectations and Background Knowledge.

    DTIC Science & Technology

    1987-08-01

    essay test were expected, but spend more time processing lower-level information than if a recognition test were expected. Furthermore, processing ... shifts in the amount of time devoted to reading information at various levels in a text structure, rather than dramatic differences in processing patt... structures (Craik & Lockhart, 1972; Goetz, Schallert, Reynolds, & Radin, 1983). If new information is compatible with existing memory structures, it is

  15. Reconceptualizing perceptual load as a rate problem: The role of time in the allocation of selective attention.

    PubMed

    Li, Zhi; Xin, Keyun; Li, Wei; Li, Yanzhe

    2018-04-30

    In the literature about allocation of selective attention, a widely studied question is when will attention be allocated to information that is clearly irrelevant to the task at hand. The present study, by using convergent evidence, demonstrated that there is a trade-off between quantity of information present in a display and the time allowed to process it. Specifically, whether or not there is interference from irrelevant distractors depends not only on the amount of information present, but also on the amount of time allowed to process that information. When processing time is calibrated to the amount of information present, irrelevant distractors can be selectively ignored successfully. These results suggest that the perceptual load in the load theory of selective attention (i.e., Lavie, 2005) should be thought about as a dynamic rate problem rather than a static capacity limitation. The authors thus propose that rather than conceiving of perceptual load as a quantity of information, they should consider it as a quantity of information per unit of time. In other words, it is the relationship between the quantity of information in the task and the time for processing the information that determines the allocation of selective attention. Thus, the present findings extended load theory, allowing it to explain findings that were previously considered as counter evidence of load theory. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. 75 FR 80516 - Notice of Submission of Proposed Information Collection to OMB; Lender Qualifications for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... Proposed Information Collection to OMB; Lender Qualifications for Multifamily Accelerated Processing (MAP) AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY: The proposed information...-processing plan that will take substantially less processing time than traditional processing. DATES...

  17. Understanding the nature of information seeking behavior in critical care: implications for the design of health information technology.

    PubMed

    Kannampallil, Thomas G; Franklin, Amy; Mishra, Rashmi; Almoosa, Khalid F; Cohen, Trevor; Patel, Vimla L

    2013-01-01

    Information in critical care environments is distributed across multiple sources, such as paper charts, electronic records, and support personnel. For decision-making tasks, physicians have to seek, gather, filter and organize information from various sources in a timely manner. The objective of this research is to characterize the nature of physicians' information seeking process, and the content and structure of clinical information retrieved during this process. Eight medical intensive care unit physicians provided a verbal think-aloud as they performed a clinical diagnosis task. Verbal descriptions of physicians' activities, sources of information they used, time spent on each information source, and interactions with other clinicians were captured for analysis. The data were analyzed using qualitative and quantitative approaches. We found that the information seeking process was exploratory and iterative and driven by the contextual organization of information. While there were no significant differences in the overall time spent on paper versus electronic records, there was marginally greater relative information gain (i.e., more unique information retrieved per unit time) from electronic records (t(6)=1.89, p=0.1). Additionally, information retrieved from electronic records was at a higher level (i.e., observations and findings) in the knowledge structure than paper records, reflecting differences in the nature of knowledge utilization across resources. A process of local optimization drove the information seeking process: physicians utilized information that maximized their information gain even though it required significantly more cognitive effort. Implications for the design of health information technology solutions that seamlessly integrate information seeking activities within the workflow, such as enriching the clinical information space and supporting efficient clinical reasoning and decision-making, are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Challenges in Extracting Information From Large Hydrogeophysical-monitoring Datasets

    NASA Astrophysics Data System (ADS)

    Day-Lewis, F. D.; Slater, L. D.; Johnson, T.

    2012-12-01

    Over the last decade, new automated geophysical data-acquisition systems have enabled collection of increasingly large and information-rich geophysical datasets. Concurrent advances in field instrumentation, web services, and high-performance computing have made real-time processing, inversion, and visualization of large three-dimensional tomographic datasets practical. Geophysical-monitoring datasets have provided high-resolution insights into diverse hydrologic processes including groundwater/surface-water exchange, infiltration, solute transport, and bioremediation. Despite the high information content of such datasets, extraction of quantitative or diagnostic hydrologic information is challenging. Visual inspection and interpretation for specific hydrologic processes is difficult for datasets that are large, complex, and (or) affected by forcings (e.g., seasonal variations) unrelated to the target hydrologic process. New strategies are needed to identify salient features in spatially distributed time-series data and to relate temporal changes in geophysical properties to hydrologic processes of interest while effectively filtering unrelated changes. Here, we review recent work using time-series and digital-signal-processing approaches in hydrogeophysics. Examples include applications of cross-correlation, spectral, and time-frequency (e.g., wavelet and Stockwell transforms) approaches to (1) identify salient features in large geophysical time series; (2) examine correlation or coherence between geophysical and hydrologic signals, even in the presence of non-stationarity; and (3) condense large datasets while preserving information of interest. Examples demonstrate analysis of large time-lapse electrical tomography and fiber-optic temperature datasets to extract information about groundwater/surface-water exchange and contaminant transport.
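
    A minimal sketch of the kind of screening described (cross-correlation to estimate a lag, and a spectrogram for a time-frequency view) using scipy.signal; the synthetic stage and resistivity series, sampling rate, and lag are assumptions, not the datasets discussed above.

    ```python
    import numpy as np
    from scipy import signal

    fs = 24                                   # samples per day (assumed)
    t = np.arange(0, 120, 1 / fs)             # 120 days of synthetic data
    rng = np.random.default_rng(3)

    stage = np.sin(2 * np.pi * t / 14) + 0.3 * rng.normal(size=t.size)            # hydrologic forcing
    resistivity = -0.8 * np.roll(stage, 2 * fs) + 0.3 * rng.normal(size=t.size)   # lagged geophysical response

    # 1) Cross-correlation between forcing and response to estimate the time lag
    a = (stage - stage.mean()) / stage.std()
    b = (resistivity - resistivity.mean()) / resistivity.std()
    xc = signal.correlate(b, a, mode="full") / len(a)
    lags = signal.correlation_lags(len(b), len(a), mode="full")
    print("lag magnitude at extremum (days):", abs(lags[np.argmax(np.abs(xc))]) / fs)

    # 2) Spectrogram to track (possibly non-stationary) periodic components over time
    f, tt, Sxx = signal.spectrogram(resistivity, fs=fs, nperseg=28 * fs)
    print("dominant period (days):", round(1 / f[1:][np.argmax(Sxx[1:].mean(axis=1))], 1))
    ```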

  19. The speed of information processing of 9- to 13-year-old intellectually gifted children.

    PubMed

    Duan, Xiaoju; Dan, Zhou; Shi, Jiannong

    2013-02-01

    In general, intellectually gifted children perform better than non-gifted children across many domains. The present validation study investigated the speed with which intellectually gifted children process information. 184 children, ages 9 to 13 years old (91 gifted, M age = 10.9 yr., SD = 1.8; 93 non-gifted children, M age = 11.0 yr., SD = 1.7) were tested individually on three information processing tasks: an inspection time task, a choice reaction time task, and an abstract matching task. Intellectually gifted children outperformed their non-gifted peers on all three tasks, obtaining shorter reaction times and doing so with greater accuracy. The findings supported the validity of information processing speed in identifying intellectually gifted children.

  20. Information Processing Techniques Program. Volume II. Communications- Adaptive Internetting

    DTIC Science & Technology

    1977-09-30

    LABORATORY INFORMATION PROCESSING TECHNIQUES PROGRAM VOLUME II: COMMUNICATIONS-ADAPTIVE INTERNETTING I SEMIANNUAL TECHNICAL SUMMARY REPORT TO THE...MASSACHUSETTS ABSTRACT This report describes work performed on the Communications-Adaptive Internetting program sponsored by the Information ... information processing techniques; network speech terminal; communications-adaptive internetting; links; digital voice communications; time-varying

  1. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

    In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established, and this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, the local to the details and the subnet model of different levels of object-oriented Petri Nets is established. The communication problem between Petri subnets is solved by using message database, and it reduces the complexity of system modeling effectively. Finally, the modeling process is presented, and a five layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
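
    A minimal, generic timed Petri net sketch (an illustration of the formalism only; the abstract does not give the actual places, transitions, or durations of the engine-compartment hoisting model, so the ones below are invented):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Transition:
        name: str
        inputs: dict       # place -> tokens consumed
        outputs: dict      # place -> tokens produced
        duration: float    # time annotation of the transition

    @dataclass
    class TimedPetriNet:
        marking: dict      # place -> current token count
        clock: float = 0.0

        def enabled(self, t):
            return all(self.marking.get(p, 0) >= n for p, n in t.inputs.items())

        def fire(self, t):
            assert self.enabled(t), f"{t.name} is not enabled"
            for p, n in t.inputs.items():
                self.marking[p] -= n
            for p, n in t.outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n
            self.clock += t.duration

    # Hypothetical fragment of an assembly sequence (place names and times are assumptions)
    net = TimedPetriNet(marking={"compartment_staged": 1, "crane_free": 1})
    hoist = Transition("hoist", {"compartment_staged": 1, "crane_free": 1},
                       {"compartment_positioned": 1}, duration=12.0)
    fasten = Transition("fasten", {"compartment_positioned": 1},
                        {"compartment_installed": 1, "crane_free": 1}, duration=30.0)
    for t in (hoist, fasten):
        net.fire(t)
    print(net.marking, "elapsed time:", net.clock)
    ```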

  2. Poor sleep quality predicts deficient emotion information processing over time in early adolescence.

    PubMed

    Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran

    2011-11-01

    There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. Participants were 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.076, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.

  3. The usability axiom of medical information systems.

    PubMed

    Pantazi, Stefan V; Kushniruk, Andre; Moehr, Jochen R

    2006-12-01

    In this article we begin by connecting the concept of simplicity of user interfaces of information systems with that of usability, and the concept of complexity of the problem-solving in information systems with the concept of usefulness. We continue by stating "the usability axiom" of medical information technology: information systems must be, at the same time, usable and useful. We then try to show why, given existing technology, the axiom is a paradox and we continue with analysing and reformulating it several times, from more fundamental information processing perspectives. We underline the importance of the concept of representation and demonstrate the need for context-dependent representations. By means of thought experiments and examples, we advocate the need for context-dependent information processing and argue for the relevance of algorithmic information theory and case-based reasoning in this context. Further, we introduce the notion of concept spaces and offer a pragmatic perspective on context-dependent representations. We conclude that the efficient management of concept spaces may help with the solution to the medical information technology paradox. Finally, we propose a view of informatics centred on the concepts of context-dependent information processing and management of concept spaces that aligns well with existing knowledge centric definitions of informatics in general and medical informatics in particular. In effect, our view extends M. Musen's proposal and proposes a definition of Medical Informatics as context-dependent medical information processing. The axiom that medical information systems must be, at the same time, useful and usable, is a paradox and its investigation by means of examples and thought experiments leads to the recognition of the crucial importance of context-dependent information processing. On the premise that context-dependent information processing equates to knowledge processing, this view defines Medical Informatics as a context-dependent medical information processing which aligns well with existing knowledge centric definitions of our field.

  4. How visual timing and form information affect speech and non-speech processing.

    PubMed

    Kim, Jeesun; Davis, Chris

    2014-10-01

    Auditory speech processing is facilitated when the talker's face/head movements are seen. This effect is typically explained in terms of visual speech providing form and/or timing information. We determined the effect of both types of information on a speech/non-speech task (non-speech stimuli were spectrally rotated speech). All stimuli were presented paired with the talker's static or moving face. Two types of moving face stimuli were used: full-face versions (both spoken form and timing information available) and modified face versions (only timing information provided by peri-oral motion available). The results showed that the peri-oral timing information facilitated response time for speech and non-speech stimuli compared to a static face. An additional facilitatory effect was found for full-face versions compared to the timing condition; this effect only occurred for speech stimuli. We propose the timing effect was due to cross-modal phase resetting; the form effect to cross-modal priming. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Age differences in decision making: a process methodology for examining strategic information processing.

    PubMed

    Johnson, M M

    1990-03-01

    This study explored the use of process tracing techniques in examining the decision-making processes of older and younger adults. Thirty-six college-age and thirty-six retirement-age participants decided which one of six cars they would purchase on the basis of computer-accessed data. They provided information search protocols. Results indicate that total time to reach a decision did not differ according to age. However, retirement-age participants used less information, spent more time viewing, and re-viewed fewer bits of information than college-age participants. Information search patterns differed markedly between age groups. Patterns of retirement-age adults indicated their use of noncompensatory decision rules which, according to decision-making literature (Payne, 1976), reduce cognitive processing demands. The patterns of the college-age adults indicated their use of compensatory decision rules, which have higher processing demands.
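
    A small sketch contrasting a compensatory (weighted-additive) rule with a noncompensatory (cutoff/elimination-style) rule of the kind the reported search patterns suggest; the attribute names, weights, scores, and cutoff are invented for illustration.

    ```python
    # Each car is a dict of attribute scores on a common 0-10 scale (invented data)
    cars = {
        "A": {"price": 8, "reliability": 6, "safety": 5, "mpg": 7},
        "B": {"price": 5, "reliability": 9, "safety": 8, "mpg": 4},
        "C": {"price": 7, "reliability": 7, "safety": 7, "mpg": 6},
    }

    def weighted_additive(cars, weights):
        """Compensatory: integrate all information; a weak attribute can be offset by a strong one."""
        score = lambda attrs: sum(weights[a] * v for a, v in attrs.items())
        return max(cars, key=lambda name: score(cars[name]))

    def cutoff_elimination(cars, priority, cutoff=7):
        """Noncompensatory: inspect attributes in priority order and keep only the survivors."""
        remaining = dict(cars)
        for attr in priority:
            survivors = {n: a for n, a in remaining.items() if a[attr] >= cutoff}
            if len(survivors) == 1:
                return next(iter(survivors))
            if survivors:
                remaining = survivors
        return max(remaining, key=lambda n: sum(remaining[n].values()))   # fallback tie-break

    print(weighted_additive(cars, {"price": 0.2, "reliability": 0.4, "safety": 0.3, "mpg": 0.1}))
    print(cutoff_elimination(cars, ["reliability", "safety", "price"]))
    ```

    With these invented numbers the two rules pick different cars, which is the kind of divergence the search-pattern analysis is designed to reveal.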

  6. Click trains and the rate of information processing: does "speeding up" subjective time make other psychological processes run faster?

    PubMed

    Jones, Luke A; Allely, Clare S; Wearden, John H

    2011-02-01

    A series of experiments demonstrated that a 5-s train of clicks that have been shown in previous studies to increase the subjective duration of tones they precede (in a manner consistent with "speeding up" timing processes) could also have an effect on information-processing rate. Experiments used studies of simple and choice reaction time (Experiment 1), or mental arithmetic (Experiment 2). In general, preceding trials by clicks made response times significantly shorter than those for trials without clicks, but white noise had no effects on response times. Experiments 3 and 4 investigated the effects of clicks on performance on memory tasks, using variants of two classic experiments of cognitive psychology: Sperling's (1960) iconic memory task and Loftus, Johnson, and Shimamura's (1985) iconic masking task. In both experiments participants were able to recall or recognize significantly more information from stimuli preceded by clicks than those preceded by silence.

  7. A situation-response model for intelligent pilot aiding

    NASA Technical Reports Server (NTRS)

    Schudy, Robert; Corker, Kevin

    1987-01-01

    An intelligent pilot aiding system needs models of the pilot's information processing to provide the computational basis for successful cooperation between the pilot and the aiding system. By combining artificial intelligence concepts with the human information processing model of Rasmussen, an abstraction hierarchy of states of knowledge, processing functions, and shortcuts is developed, which is useful for characterizing the information processing both of the pilot and of the aiding system. This approach is used in the conceptual design of a real-time intelligent aiding system for flight crews of transport aircraft. One promising result was the tentative identification of a particular class of information processing shortcuts, from situation characterizations to appropriate responses, as the most important reliable pathway for dealing with complex, time-critical situations.

  8. Design and Evaluation of Wood Processing Facilities Using Object-Oriented Simulation

    Treesearch

    D. Earl Kline; Philip A. Araman

    1992-01-01

    Managers of hardwood processing facilities need timely information on which to base important decisions such as when to add costly equipment or how to improve profitability subject to time-varying demands. The overall purpose of this paper is to introduce a tool that can effectively provide such timely information. A simulation/animation modeling procedure is described...

  9. Lognormal Infection Times of Online Information Spread

    PubMed Central

    Doerr, Christian; Blenn, Norbert; Van Mieghem, Piet

    2013-01-01

    The infection times of individuals in online information spread such as the inter-arrival time of Twitter messages or the propagation time of news stories on a social media site can be explained through a convolution of lognormally distributed observation and reaction times of the individual participants. Experimental measurements support the lognormal shape of the individual contributing processes, and have resemblance to previously reported lognormal distributions of human behavior and contagious processes. PMID:23700473
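
    A quick simulation sketch of the stated mechanism (not the Twitter or news datasets themselves): the sum of a lognormal observation time and a lognormal reaction time is itself closely approximated by a lognormal, which can be checked numerically; the parameters below are assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    observation = rng.lognormal(mean=1.0, sigma=0.6, size=100_000)   # time until a post is noticed
    reaction = rng.lognormal(mean=0.5, sigma=0.8, size=100_000)      # time until it is passed on
    infection_time = observation + reaction                          # convolution of the two stages

    # Fit a lognormal to the summed times and compare quantiles with the empirical distribution
    shape, loc, scale = stats.lognorm.fit(infection_time, floc=0)
    fitted = stats.lognorm(shape, loc=loc, scale=scale)
    for q in (0.25, 0.50, 0.75, 0.95):
        print(f"q={q:.2f}  empirical={np.quantile(infection_time, q):6.2f}  fitted={fitted.ppf(q):6.2f}")
    ```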

  10. A Package of Information as the Planck Unit of Information and Also as a Fundamental Physical (Universal) Constant

    NASA Astrophysics Data System (ADS)

    Gholibeigian, Hassan

    The dimension of information, as the fifth dimension of the universe including packages of new information, is nested with space-time. The distributed density of information is matched to its corresponding distributed matter in space-time. A fundamental particle (string), such as a photon or graviton, needs a package of information, including its exact quantum state and law, to process and travel a Planck length in a Planck time. This process is done via sub-particles (sub-strings). Processed information is carried by the particle as the universe's history. My proposed formula for the Planck unit of information (I_P), also a Fundamental Physical (Universal) Constant, is I_P = l_P/(c·t_P) = 1, where l_P is the Planck length, t_P is the Planck time, and c is the speed of light. My proposed formula for calculating the packages is I = t_P^-1 · τ, in which I is the number of packages and τ is the lifetime of the particle. "Communication of information" as a "fundamental symmetry" leads to phenomena. Packages should always be up to date, including new information, for the evolution of the Universe. But where does the new information come from, or how is it created, which Hawking and his colleagues forgot to bring inside the black hole and left behind the horizon in the form of soft hair?
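
    Written out (an editorial restatement of the two formulas as quoted, using the convention that the Planck length is the distance light travels in one Planck time, l_P = c·t_P):

    ```latex
    I_P \;=\; \frac{l_P}{c\,t_P} \;=\; 1,
    \qquad
    I \;=\; t_P^{-1}\,\tau \;=\; \frac{\tau}{t_P}.
    ```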

  11. Remote Sensing: A valuable tool in the Forest Service decision making process. [in Utah

    NASA Technical Reports Server (NTRS)

    Stanton, F. L.

    1975-01-01

    Forest Service studies for integrating remotely sensed data into existing information systems highlight a need to: (1) re-examine present methods of collecting and organizing data, (2) develop an integrated information system for rapidly processing and interpreting data, (3) apply existing technological tools in new ways, and (4) provide accurate and timely information for making the right management decisions. The Forest Service developed an integrated information system using remote sensors, microdensitometers, computer hardware and software, and interactive accessories. Their efforts substantially reduce the time it takes to collect and process data.

  12. Information Fusion for Feature Extraction and the Development of Geospatial Information

    DTIC Science & Technology

    2004-07-01

    of automated processing. 2. Requirements for Geospatial Information: Accurate, timely geospatial information is critical for many military... this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of

  13. Expansion and Compression of Time Correlate with Information Processing in an Enumeration Task.

    PubMed

    Wutz, Andreas; Shukla, Anuj; Bapi, Raju S; Melcher, David

    2015-01-01

    Perception of temporal duration is subjective and is influenced by factors such as attention and context. For example, unexpected or emotional events are often experienced as if time subjectively expands, suggesting that the amount of information processed in a unit of time can be increased. Time dilation effects have been measured with an oddball paradigm in which an infrequent stimulus is perceived to last longer than standard stimuli in the rest of the sequence. Likewise, time compression for the oddball occurs when the duration of the standard items is relatively brief. Here, we investigated whether the amount of information processing changes when time is perceived as distorted. On each trial, an oddball stimulus of varying numerosity (1-14 items) and duration was presented along with standard items that were either short (70 ms) or long (1050 ms). Observers were instructed to count the number of dots within the oddball stimulus and to judge its relative duration with respect to the standards on that trial. Consistent with previous results, oddballs were reliably perceived as temporally distorted: expanded for longer standard stimuli blocks and compressed for shorter standards. The occurrence of these distortions of time perception correlated with perceptual processing; i.e. enumeration accuracy increased when time was perceived as expanded and decreased with temporal compression. These results suggest that subjective time distortions are not epiphenomenal, but reflect real changes in sensory processing. Such short-term plasticity in information processing rate could be evolutionarily advantageous in optimizing perception and action during critical moments.

  14. The Effect of Highlighting on Processing and Memory of Central and Peripheral Text Information: Evidence from Eye Movements

    ERIC Educational Resources Information Center

    Yeari, Menahem; Oudega, Marja; van den Broek, Paul

    2017-01-01

    The present study investigated the effect of text highlighting on online processing and memory of central and peripheral information. We compared processing time (using eye-tracking methodology) and recall of central and peripheral information for three types of highlighting: (a) highlighting of central information, (b) highlighting of peripheral…

  15. Local active information storage as a tool to understand distributed neural information processing

    PubMed Central

    Wibral, Michael; Lizier, Joseph T.; Vögler, Sebastian; Priesemann, Viola; Galuske, Ralf

    2013-01-01

    Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding. PMID:24501593
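
    A minimal sketch of local active information storage for a discretized series (an illustration of the definition only, not the voltage-sensitive dye pipeline used in the study): the local storage at time t is the log-ratio of the probability of the current sample conditioned on its k-sample past to its unconditional probability; the history length, binary alphabet, and toy signal are assumptions.

    ```python
    import numpy as np
    from collections import Counter

    def local_active_information_storage(x, k=2):
        """Local AIS a(t) = log2[ p(x_t | x_{t-k..t-1}) / p(x_t) ] for a discrete-valued series x."""
        n = len(x)
        pasts = [tuple(x[t - k:t]) for t in range(k, n)]
        joint = Counter(zip(pasts, x[k:]))      # (past, current) counts
        past_counts = Counter(pasts)
        marginal = Counter(x[k:])
        m = n - k
        ais = [np.log2((joint[(past, cur)] / past_counts[past]) / (marginal[cur] / m))
               for past, cur in zip(pasts, x[k:])]
        return np.array(ais)

    # Toy binary signal: a predictable alternating segment followed by a random segment
    rng = np.random.default_rng(5)
    x = np.concatenate([np.arange(500) % 2, rng.integers(0, 2, 500)])
    ais = local_active_information_storage(x, k=2)
    print("mean local AIS, predictable half:", round(ais[:450].mean(), 2))
    print("mean local AIS, random half:     ", round(ais[-450:].mean(), 2))
    ```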

  16. Basic disturbances of information processing in psychosis prediction.

    PubMed

    Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan

    2013-01-01

    The basic symptoms (BS) approach provides a valid instrument in predicting psychosis onset and moreover represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography, as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction in allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigation. Thereby, a possible time sequence in the prodromal phase might be of particular interest.

  17. Using eye-tracking to study the on-line processing of case-marking information among intermediate L2 learners of German

    PubMed Central

    Jackson, Carrie N.; Dussias, Paola E.; Hristova, Adelina

    2012-01-01

    This study uses eye-tracking to examine the processing of case-marking information in ambiguous subject- and object-first wh-questions in German. The position of the lexical verb was also manipulated via verb tense to investigate whether verb location influences how intermediate L2 learners process L2 sentences. Results show that intermediate L2 German learners were sensitive to case-marking information, exhibiting longer processing times on subject-first than object-first sentences, regardless of verb location. German native speakers exhibited the opposite word order preference, with longer processing times on object-first than subject-first sentences, replicating previous findings. These results are discussed in light of current L2 processing research, highlighting how methodological constraints influence researchers’ abilities to measure the on-line processing of morphosyntactic information among intermediate L2 learners. PMID:23493761

  18. Using Informance to Educate Parents and Demonstrate the Music Learning Process

    ERIC Educational Resources Information Center

    Nowmos, Christine M.

    2010-01-01

    Informances, informal and informative presentations of student learning that emphasize the learning process, provide an alternative to traditional concerts or programs, which may take general music classroom time away from activities not geared toward a specific performance. Informances are an excellent means of communicating educational…

  19. Impaired information processing triggers altered states of consciousness.

    PubMed

    Fritzsche, M

    2002-04-01

    Schizophrenia, intoxication with tetrahydrocannabinol (Delta-THC), and cannabis psychosis induce characteristic time and space distortions, suggesting a common psychotic dysfunction. Since genetic research into schizophrenia has led to disappointing dead ends, the present study focuses on this phenotype. It is shown that information theory can account for the dynamical basis of higher sensorimotor information processing and consciousness under physiologic as well as pathologic conditions. If Kolmogorov entropy (inherent in the processing of action and time) breaks down in acute psychosis, it is predicted that Shannon entropy (inherent in the processing of higher dimensional perception) will increase, provoking positive symptoms and altered states of consciousness. In the search for candidate genes and the protection of vulnerable individuals from cannabis abuse, non-linear EEG analysis of Kolmogorov information could thus present us with a novel diagnostic tool to directly assess the breakdown of information processing in schizophrenia. Copyright 2002 Elsevier Science Ltd. All rights reserved.
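
    A minimal sketch of the kind of signal-entropy estimates alluded to: Shannon entropy of the amplitude distribution, plus a crude sample-entropy variant as a practical stand-in for a Kolmogorov-type (dynamical) entropy rate. This is an illustration only, not the proposed diagnostic tool; the synthetic signals, bin count, embedding dimension, and tolerance are assumptions.

    ```python
    import numpy as np

    def shannon_entropy(x, bins=32):
        """Shannon entropy (bits) of the amplitude distribution."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    def sample_entropy(x, m=2, r=0.2):
        """Crude sample-entropy variant (stand-in for a dynamical/Kolmogorov-type rate)."""
        x = np.asarray(x, float)
        tol = r * x.std()

        def matches(m):
            emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
            d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            return (np.sum(d <= tol) - len(emb)) / 2      # unordered pairs, self-matches excluded

        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(6)
    t = np.arange(2000) / 250.0                           # 8 s at 250 Hz (assumed)
    rhythmic = np.sin(2 * np.pi * 10 * t)                 # highly regular oscillation
    irregular = rhythmic + rng.normal(size=t.size)        # same oscillation plus broadband noise
    for name, sig in [("rhythmic", rhythmic), ("irregular", irregular)]:
        print(f"{name:9s} Shannon={shannon_entropy(sig):.2f} bits  SampEn={sample_entropy(sig[:600]):.2f}")
    ```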

  20. Performance-informed EEG analysis reveals mixed evidence for EEG signatures unique to the processing of time.

    PubMed

    Schlichting, Nadine; de Jong, Ritske; van Rijn, Hedderik

    2018-06-20

    Certain EEG components (e.g., the contingent negative variation, CNV, or beta oscillations) have been linked to the perception of temporal magnitudes specifically. However, it is as yet unclear whether these EEG components are really unique to time perception or reflect the perception of magnitudes in general. In the current study we recorded EEG while participants had to make judgments about duration (time condition) or numerosity (number condition) in a comparison task. This design allowed us to directly compare EEG signals between the processing of time and number. Stimuli consisted of a series of blue dots appearing and disappearing dynamically on a black screen. Each stimulus was characterized by its duration and the total number of dots that it consisted of. Because tasks like these are known to elicit perceptual interference effects, we used a maximum-likelihood estimation (MLE) procedure in an extensive post hoc analysis to determine, for each participant and dimension separately, to what extent time and numerosity information were taken into account when making a judgement. This approach enabled us to capture individual differences in behavioral performance and, based on the MLE estimates, to select a subset of participants who suppressed task-irrelevant information. Even for this subset of participants, who showed no or only small interference effects and thus were thought to truly process temporal information in the time condition and numerosity information in the number condition, we found CNV patterns in the time-domain EEG signals for both tasks that were more pronounced in the time task. We found no substantial evidence for differences between the processing of temporal and numerical information in the time-frequency domain.
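
    A minimal sketch of an MLE-style weighting analysis in that spirit (not the authors' actual procedure): a logistic choice model is fit to simulated responses, and the fitted coefficients indicate how strongly duration and numerosity differences each drove the judgments; the simulated observer and its weights are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_likelihood(params, d_time, d_number, resp):
        """Logistic choice model: P('test > standard') = sigmoid(b0 + bT*d_time + bN*d_number)."""
        b0, bT, bN = params
        p = 1.0 / (1.0 + np.exp(-(b0 + bT * d_time + bN * d_number)))
        eps = 1e-9
        return -np.sum(resp * np.log(p + eps) + (1 - resp) * np.log(1 - p + eps))

    # Simulated observer in the *time* task who leaks some numerosity information (assumed weights)
    rng = np.random.default_rng(7)
    n = 2000
    d_time = rng.normal(size=n)        # normalized duration difference (comparison - standard)
    d_number = rng.normal(size=n)      # normalized numerosity difference
    p_true = 1.0 / (1.0 + np.exp(-(2.0 * d_time + 0.5 * d_number)))
    resp = (rng.random(n) < p_true).astype(float)

    fit = minimize(neg_log_likelihood, x0=[0.0, 1.0, 0.0],
                   args=(d_time, d_number, resp), method="BFGS")
    b0, bT, bN = fit.x
    print(f"weight on time: {bT:.2f}   weight on number: {bN:.2f}   "
          f"relative reliance on number: {abs(bN) / (abs(bT) + abs(bN)):.2f}")
    ```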

  1. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process set up well. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on a wide range of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring spatial variability under control in real time with our SPC (Statistical Process Control) system. This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in the waferfab.
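
    A minimal sketch of putting a spatial-dispersion metric under SPC (an illustration only, not the production system described above): the within-wafer standard deviation of a critical dimension is tracked on an S chart, and wafers whose intra-wafer non-uniformity drifts outside the control limits are flagged; the site count, baseline data, and non-uniformity signature are assumptions.

    ```python
    import math
    import numpy as np

    rng = np.random.default_rng(8)
    SITES = 17                                            # CD measurement sites per wafer (assumed)
    baseline = rng.normal(50.0, 0.8, size=(30, SITES))    # 30 in-control wafers (assumed data)

    # S-chart limits for the within-wafer standard deviation (monitors spatial dispersion)
    s_bar = baseline.std(axis=1, ddof=1).mean()
    c4 = math.sqrt(2 / (SITES - 1)) * math.gamma(SITES / 2) / math.gamma((SITES - 1) / 2)
    B3 = max(0.0, 1 - 3 * math.sqrt(1 - c4 ** 2) / c4)
    B4 = 1 + 3 * math.sqrt(1 - c4 ** 2) / c4
    lcl, ucl = B3 * s_bar, B4 * s_bar

    def check_wafer(readings):
        """Flag a wafer whose intra-wafer dispersion falls outside the control limits."""
        s = np.std(readings, ddof=1)
        return s, ("OUT OF CONTROL" if not (lcl <= s <= ucl) else "ok")

    uniform = rng.normal(50.0, 0.8, SITES)                                        # spatially uniform wafer
    tilted = 50.0 + np.linspace(-2.5, 2.5, SITES) + rng.normal(0.0, 0.3, SITES)   # strong across-wafer tilt
    for name, wafer in [("uniform wafer", uniform), ("tilted wafer", tilted)]:
        s, status = check_wafer(wafer)
        print(f"{name}: s = {s:.2f} ({status}), limits = [{lcl:.2f}, {ucl:.2f}]")
    ```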

  2. 32 CFR 701.120 - Processing requests that cite or imply PA, Freedom of Information (FOIA), or PA/FOIA.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Privacy Program § 701.120 Processing requests that cite or imply PA, Freedom of Information (FOIA), or PA... maximum release of information allowed under the Acts. (d) Processing time limits. DON activities shall... 32 National Defense 5 2010-07-01 2010-07-01 false Processing requests that cite or imply PA...

  3. An Information Filtering and Control System to Improve the Decision Making Process Within Future Command Information Centres

    DTIC Science & Technology

    2001-04-01

    part of the following report: TITLE: New Information Processing Techniques for Military Systems [les Nouvelles techniques de traitement de l'information... The rapidly developing information technology has until now not yet resulted in a substantial... an increasing amount of time is needed for gathering and... Information Processing Techniques for Military Systems", held in Istanbul, Turkey, 9-11 October 2000, and published in RTO MP-049. ...organisations. The

  4. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, with the largest, an M6.6 event, in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. The real-time earthquake locations and magnitudes are determined within 2-5 minutes of the event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, ANSS website via QDDS submissions, through e-mail, cell phone and pager notifications, via fax broadcasts and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0. This information is posted on the public website. ShakeMaps are being calculated in real-time with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted for the critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  5. Developmental Steps in Metaphorical Language Abilities: The Influence of Age, Gender, Cognitive Flexibility, Information Processing Speed, and Analogical Reasoning.

    PubMed

    Willinger, Ulrike; Deckert, Matthias; Schmöger, Michaela; Schaunig-Busch, Ines; Formann, Anton K; Auff, Eduard

    2017-12-01

    Metaphor is a specific type of figurative language that is used in various important fields such as in the work with children in clinical or teaching contexts. The aim of the study was to investigate the developmental course, developmental steps, and possible cognitive predictors regarding metaphor processing in childhood and early adolescence. One hundred sixty-four typically developing children (7-year-olds, 9-year-olds) and early adolescents (11-year-olds) were tested for metaphor identification, comprehension, comprehension quality, and preference by the Metaphoric Triads Task as well as for analogical reasoning, information processing speed, cognitive flexibility under time pressure, and cognitive flexibility without time pressure. Metaphor identification and comprehension consecutively increased with age. Eleven-year-olds showed significantly higher metaphor comprehension quality and preference scores than seven- and nine-year-olds, whilst these younger age groups did not differ. Age, cognitive flexibility under time pressure, information processing speed, analogical reasoning, and cognitive flexibility without time pressure significantly predicted metaphor comprehension. Metaphorical language ability shows an ongoing development and seemingly changes qualitatively at the beginning of early adolescence. These results can possibly be explained by a greater synaptic reorganization in early adolescents. Furthermore, cognitive flexibility under time pressure and information processing speed possibly facilitate the ability to adapt metaphor processing strategies in a flexible, quick, and appropriate way.

  6. Research on robot mobile obstacle avoidance control based on visual information

    NASA Astrophysics Data System (ADS)

    Jin, Jiang

    2018-03-01

    Detecting obstacles and controlling the robot to avoid them has long been a key topic in robot control research. In this paper, a scheme for visual information acquisition is proposed. By interpreting the visual information, it is transformed into an information source for path processing. While following the established route, the algorithm adjusts the trajectory in real time when obstacles are encountered, achieving intelligent control of the mobile robot. Simulation results show that, through the integration of visual sensing information, the obstacle information is fully obtained, while the real-time performance and accuracy of the robot's motion control are guaranteed.

  7. The study of features of the structural organization of the au-tomated information processing system of the collective type

    NASA Astrophysics Data System (ADS)

    Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.

    2018-05-01

    A comparative assessment is made of the channel capacity achieved by different variants of the structural organization of automated information processing systems. A model for assessing information processing time as a function of the type of standard elements and their structural organization is developed.

  8. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.
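
    To illustrate the classical memory cost the authors refer to, the minimal Python sketch below simulates a continuous-time renewal process: between emissions the simulator must carry the elapsed waiting time as a real-valued state, which is what drives the unbounded memory requirement at high precision. It is a toy classical simulator under illustrative assumptions, not the quantum construction of the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_renewal(inter_event_sampler, t_max):
          """Classically simulate a continuous-time renewal process up to t_max.

          Between events the simulator's only state is the elapsed time since
          the last event -- a real number, which is why classical memory must
          grow with the precision demanded of the simulation.
          """
          t, events = 0.0, []
          while True:
              t += inter_event_sampler()               # draw the next waiting time
              if t > t_max:
                  return np.array(events)
              events.append(t)

      # Renewal process with gamma-distributed waiting times (illustrative only).
      events = simulate_renewal(lambda: rng.gamma(shape=2.0, scale=0.5), t_max=20.0)
      print(len(events), events[:5])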

  9. Analysis of E-marketplace Attributes: Assessing The NATO Logistics Stock Exchange

    DTIC Science & Technology

    2008-01-01

    [Table fragments listing e-marketplace attributes with paired scores: customer satisfaction (4.02, 0.151), reduction of order processing time (4.27, 0.317), reduction of stock levels (3.87, 0.484); further attributes include reduction of payment processing time, reduction of excessive stocks, reduction of maverick buying, and information exchange with partners in the supply chain, classified under efficiency as basic or important.]

  10. The Importance of Reaction Times for Developmental Science: What a Difference Milliseconds Make

    ERIC Educational Resources Information Center

    Lange-Küttner, Christiane

    2012-01-01

    Reaction times are still rarely reported in developmental psychology although they are an indicator of the neural maturity of children's information processing system. Competence and capacity are confounded in development, where children may be able to reason, or remember, but are unable to cope with information processing load. Furthermore, there…

  11. Sensory Mode and "Information Load": Examining the Effects of Timing on Multisensory Processing.

    ERIC Educational Resources Information Center

    Tiene, Drew

    2000-01-01

    Discussion of the development of instructional multimedia materials focuses on a study of undergraduates that examined how the use of visual icons affected learning, differences in the instructional effectiveness of visual versus auditory processing of the same information, and timing (whether simultaneous or sequential presentation is more…

  12. Optimal protocol for maximum work extraction in a feedback process with a time-varying potential

    NASA Astrophysics Data System (ADS)

    Kwon, Chulan

    2017-12-01

    The nonequilibrium nature of information thermodynamics is characterized by the inequality or non-negativity of the total entropy change of the system, memory, and reservoir. Mutual information change plays a crucial role in the inequality, in particular if work is extracted and the paradox of Maxwell's demon is raised. We consider the Brownian information engine where the protocol set of the harmonic potential is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work both for the finite-time and the quasi-static process.
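
    For reference, the bounds referred to above can be written, in one standard convention of information thermodynamics (notation may differ from the paper's), as:

      \Delta S_{\mathrm{tot}}
        = \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{mem}} + \Delta S_{\mathrm{res}} \ge 0 ,
      \qquad
      W_{\mathrm{ext}} \le -\Delta F + k_{B} T \, I ,

    where \Delta F is the free-energy change of the system and I the mutual information acquired by the measurement; feedback can therefore extract more work than -\Delta F alone, which is how the Maxwell's demon paradox is resolved once the memory is included in the entropy balance.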

  13. Technology and application of 3D tunnel information monitoring

    NASA Astrophysics Data System (ADS)

    Li, Changqing; Deng, Hongliang; Chen, Ge; Wang, Simiao; Guo, Yang; Wu, Shenglin

    2015-12-01

    Information monitoring and dynamic construction are essential in tunnel construction because of the complex geological environment and the lack of basic information. The monitoring results show that 3D laser scanning technology and an information management system have important theoretical significance and application value: they help ensure the safety of tunnel construction and enrich construction theory and technology. The deformation and construction information near the tunnel working face and across the whole tunnel section can be obtained in real time. At the same time, the deformation regularity during tunnel excavation can be identified, and early warning and forecasting can be given in graphical and numerical form, in order to determine the appropriate timing and provide a basis for support parameters and lining.

  14. Towards an automated intelligence product generation capability

    NASA Astrophysics Data System (ADS)

    Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.

    2015-05-01

    Creating intelligence information products is a time-consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament in which analysts find themselves: information retrieval and management consumes huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase the overall coverage. Through evaluation with a domain expert, APOGEE has been shown to have the potential to reduce product generation time by a factor of 20. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.
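
    As a hedged illustration of ranking candidate content against an information requirement with a statistical text model, the Python sketch below scores documents by TF-IDF cosine similarity to a requirement statement. It is not APOGEE's actual model; the example texts, function names, and scoring choice are assumptions for illustration only.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      def rank_for_requirement(requirement, candidates):
          """Rank candidate documents by TF-IDF cosine similarity to a requirement."""
          vectorizer = TfidfVectorizer(stop_words="english")
          matrix = vectorizer.fit_transform([requirement] + candidates)
          scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
          return sorted(zip(scores, candidates), reverse=True)

      candidates = ["flood damage report for the river delta",
                    "annual budget spreadsheet",
                    "satellite imagery analysis of delta flooding"]
      for score, text in rank_for_requirement("flooding in the delta region", candidates):
          print(f"{score:.2f}  {text}")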

  15. The influence of anticipatory processing on attentional biases in social anxiety.

    PubMed

    Mills, Adam C; Grant, DeMond M; Judah, Matt R; White, Evan J

    2014-09-01

    Research on cognitive theories of social anxiety disorder (SAD) has identified individual processes that influence this condition (e.g., cognitive biases, repetitive negative thinking), but few studies have attempted to examine the interaction between these processes. For example, attentional biases and anticipatory processing are theoretically related and have been found to influence symptoms of SAD, but they have rarely been studied together (i.e., Clark & Wells, 1995). Therefore, the goal of the current study was to examine the effect of anticipatory processing on attentional bias for internal (i.e., heart rate feedback) and external (i.e., emotional faces) threat information. A sample of 59 participants high (HSA) and low (LSA) in social anxiety symptoms engaged in a modified dot-probe task prior to (Time 1) and after (Time 2) an anticipatory processing or distraction task. HSAs who engaged in anticipatory processing showed an increase in attentional bias for internal information from Time 1 to Time 2, whereas HSAs in the distraction condition and LSAs in either condition showed no changes. No changes in external biases were found for HSAs, but LSAs who engaged in the distraction task became less avoidant of emotional faces from Time 1 to Time 2. This suggests that anticipatory processing results in an activation of attentional biases for physiological information, as suggested by Clark and Wells. Copyright © 2014. Published by Elsevier Ltd.

  16. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires performance metrics that management can monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data and to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies that would need to be designed, prototyped, and evaluated.

  17. Logistics Control Facility: A Normative Model for Total Asset Visibility in the Air Force Logistics System

    DTIC Science & Technology

    1994-09-01

    Issue: Computers, information systems, and communication systems are being increasingly used in transportation, warehousing, order processing, materials ... inventory levels, reduced order processing times, reduced order processing costs, and increased customer satisfaction. While purchasing and transportation ... process, the speed at which orders are processed would increase significantly. Lowering the order processing time in turn lowers the lead time, which in ...

  18. Economy with the time delay of information flow—The stock market case

    NASA Astrophysics Data System (ADS)

    Miśkiewicz, Janusz

    2012-02-01

    Any decision process requires information about the past and present state of the system, but in an economy acquiring and processing data is an expensive and time-consuming task. Therefore, the state of the system is often measured over some legal interval, analysed after the end of well-defined time periods, and the results are announced much later, before any strategic decision is envisaged. The roles of the various time delays therefore have to be examined carefully. Here, a model of a stock market coupled with an economy is investigated to emphasise the effect of the time delay span on the information flow. It is shown that the larger the time delay, the more pronounced the collective behaviour of agents becomes, since one observes time oscillations in the absolute log-return autocorrelations.
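
    The oscillating quantity referred to above, the autocorrelation of absolute log-returns, can be estimated from a price series as in the short Python sketch below. This is a generic estimator on synthetic data, not the paper's agent-based model.

      import numpy as np

      def abs_logreturn_autocorr(prices, max_lag=50):
          """Autocorrelation of absolute log-returns for lags 1..max_lag."""
          r = np.abs(np.diff(np.log(prices)))
          r = r - r.mean()
          denom = np.dot(r, r)
          return np.array([np.dot(r[:-lag], r[lag:]) / denom
                           for lag in range(1, max_lag + 1)])

      # Synthetic random-walk prices, for illustration only.
      rng = np.random.default_rng(1)
      prices = np.exp(np.cumsum(rng.normal(0.0, 0.01, size=2000)))
      print(abs_logreturn_autocorr(prices, max_lag=5))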

  19. Parallel photonic information processing at gigabyte per second data rates using transient states

    NASA Astrophysics Data System (ADS)

    Brunner, Daniel; Soriano, Miguel C.; Mirasso, Claudio R.; Fischer, Ingo

    2013-01-01

    The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches never exceeded a marginal existence. While the application of optics in super-computing receives reawakened interest, new concepts, partly neuro-inspired, are being considered and developed. Here we experimentally demonstrate the potential of a simple photonic architecture to process information at unprecedented data rates, implementing a learning-based approach. A semiconductor laser subject to delayed self-feedback and optical data injection is employed to solve computationally hard tasks. We demonstrate simultaneous spoken digit and speaker recognition and chaotic time-series prediction at data rates beyond 1Gbyte/s. We identify all digits with very low classification errors and perform chaotic time-series prediction with 10% error. Our approach bridges the areas of photonic information processing, cognitive and information science.
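
    The learning-based approach described here is a form of reservoir computing, in which only a linear readout is trained on the transient states of a driven dynamical system. The Python sketch below is a hedged software analogue: a small random recurrent reservoir with a ridge-regression readout predicting a time series one step ahead. It is not the delayed-feedback laser hardware, and all parameters are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def reservoir_states(u, n_nodes=200, spectral_radius=0.9, leak=0.3):
          """Drive a random recurrent reservoir with input u; return the node states."""
          w_in = rng.uniform(-0.5, 0.5, size=n_nodes)
          w = rng.normal(0.0, 1.0, size=(n_nodes, n_nodes))
          w *= spectral_radius / np.max(np.abs(np.linalg.eigvals(w)))
          x = np.zeros(n_nodes)
          states = []
          for u_t in u:
              x = (1 - leak) * x + leak * np.tanh(w @ x + w_in * u_t)
              states.append(x.copy())
          return np.array(states)

      # One-step-ahead prediction of a noisy sine with a ridge-regression readout.
      t = np.arange(1200)
      u = np.sin(0.1 * t) + 0.05 * rng.normal(size=t.size)
      X, y = reservoir_states(u[:-1]), u[1:]
      w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)
      print("training MSE:", np.mean((X @ w_out - y) ** 2))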

  20. Expectation, information processing, and subjective duration.

    PubMed

    Simchy-Gross, Rhimmon; Margulis, Elizabeth Hellmuth

    2018-01-01

    In research on psychological time, it is important to examine the subjective duration of entire stimulus sequences, such as those produced by music (Teki, Frontiers in Neuroscience, 10, 2016). Yet research on the temporal oddball illusion (according to which oddball stimuli seem longer than standard stimuli of the same duration) has examined only the subjective duration of single events contained within sequences, not the subjective duration of sequences themselves. Does the finding that oddballs seem longer than standards translate to entire sequences, such that entire sequences that contain oddballs seem longer than those that do not? Is this potential translation influenced by the mode of information processing-whether people are engaged in direct or indirect temporal processing? Two experiments aimed to answer both questions using different manipulations of information processing. In both experiments, musical sequences either did or did not contain oddballs (auditory sliding tones). To manipulate information processing, we varied the task (Experiment 1), the sequence event structure (Experiments 1 and 2), and the sequence familiarity (Experiment 2) independently within subjects. Overall, in both experiments, the sequences that contained oddballs seemed shorter than those that did not when people were engaged in direct temporal processing, but longer when people were engaged in indirect temporal processing. These findings support the dual-process contingency model of time estimation (Zakay, Attention, Perception & Psychophysics, 54, 656-664, 1993). Theoretical implications for attention-based and memory-based models of time estimation, the pacemaker accumulator and coding efficiency hypotheses of time perception, and dynamic attending theory are discussed.

  1. Temporal Information Processing as a Basis for Auditory Comprehension: Clinical Evidence from Aphasic Patients

    ERIC Educational Resources Information Center

    Oron, Anna; Szymaszek, Aneta; Szelag, Elzbieta

    2015-01-01

    Background: Temporal information processing (TIP) underlies many aspects of cognitive functions like language, motor control, learning, memory, attention, etc. Millisecond timing may be assessed by sequencing abilities, e.g. the perception of event order. It may be measured with auditory temporal-order-threshold (TOT), i.e. a minimum time gap…

  2. Time perception impairs sensory-motor integration in Parkinson’s disease

    PubMed Central

    2013-01-01

    It is well known that perception and estimation of time are fundamental for the relationship between humans and their environment. However, this temporal information processing is inefficient in patients with Parkinson's disease (PD), resulting in temporal judgment deficits. In general, the pathophysiology of PD has been described as a dysfunction in the basal ganglia, which is a multisensory integration station. Thus, a deficit in the sensorimotor integration process could explain many of the Parkinson symptoms, such as changes in time perception. This physiological distortion may be better understood if we analyze the neurobiological model of interval timing, expressed within the conceptual framework of a traditional information-processing model called “Scalar Expectancy Theory”. Therefore, in this review we discuss the pathophysiology and sensorimotor integration process in PD, the theories and basic neural mechanisms involved in temporal processing, and the main clinical findings about the impact of time perception in PD. PMID:24131660

  3. Fluctuations in Wikipedia access-rate and edit-event data

    NASA Astrophysics Data System (ADS)

    Kämpf, Mirko; Tismer, Sebastian; Kantelhardt, Jan W.; Muchnik, Lev

    2012-12-01

    Internet-based social networks often reflect extreme events in nature and society by drastic increases in user activity. We study and compare the dynamics of the two major complex processes necessary for information spread via the online encyclopedia ‘Wikipedia’, i.e., article editing (information upload) and article access (information viewing) based on article edit-event time series and (hourly) user access-rate time series for all articles. Daily and weekly activity patterns occur in addition to fluctuations and bursting activity. The bursts (i.e., significant increases in activity for an extended period of time) are characterized by a power-law distribution of durations of increases and decreases. For describing the recurrence and clustering of bursts we investigate the statistics of the return intervals between them. We find stretched exponential distributions of return intervals in access-rate time series, while edit-event time series yield simple exponential distributions. To characterize the fluctuation behavior we apply detrended fluctuation analysis (DFA), finding that most article access-rate time series are characterized by strong long-term correlations with fluctuation exponents α≈0.9. The results indicate significant differences in the dynamics of information upload and access and help in understanding the complex process of collecting, processing, validating, and distributing information in self-organized social networks.
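
    Detrended fluctuation analysis, used above to estimate the fluctuation exponent α, can be sketched in a few lines of Python (first-order detrending, with the scale list left to the caller). This is a minimal illustration on white noise, not the authors' implementation or data.

      import numpy as np

      def dfa_exponent(signal, scales):
          """First-order detrended fluctuation analysis; returns the exponent alpha."""
          profile = np.cumsum(signal - np.mean(signal))
          flucts = []
          for s in scales:
              n_seg = len(profile) // s
              x = np.arange(s)
              sq = []
              for i in range(n_seg):
                  seg = profile[i * s:(i + 1) * s]
                  trend = np.polyval(np.polyfit(x, seg, 1), x)   # detrend each segment
                  sq.append(np.mean((seg - trend) ** 2))
              flucts.append(np.sqrt(np.mean(sq)))
          return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

      rng = np.random.default_rng(2)
      white_noise = rng.normal(size=10000)
      print(dfa_exponent(white_noise, scales=[16, 32, 64, 128, 256, 512]))  # ~0.5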

  4. An Analysis of the Order Cycle at Coast Guard Supply Center Curtis Bay, Maryland. How to Measure Customer Service.

    DTIC Science & Technology

    1994-12-01

    [Excerpt fragments, including table-of-contents entries (the order cycle; order processing and the information system; the order cycle at SCCB) and text: ... order transmittal time, order processing time, order assembly time, stock availability, production time, and delivery time ... methods, inventory stocking policies, order processing procedures, transport modes, and scheduling methods [Ref. 15].]

  5. Dynamic combination of sensory and reward information under time pressure

    PubMed Central

    Farashahi, Shiva; Kao, Chang-Hao

    2018-01-01

    When making choices, collecting more information is beneficial but comes at the cost of sacrificing time that could be allocated to making other potentially rewarding decisions. To investigate how the brain balances these costs and benefits, we conducted a series of novel experiments in humans and simulated various computational models. Under six levels of time pressure, subjects made decisions either by integrating sensory information over time or by dynamically combining sensory and reward information over time. We found that during sensory integration, time pressure reduced performance as the deadline approached, and choice was more strongly influenced by the most recent sensory evidence. By fitting performance and reaction time with various models we found that our experimental results are more compatible with leaky integration of sensory information with an urgency signal or a decision process based on stochastic transitions between discrete states modulated by an urgency signal. When combining sensory and reward information, subjects spent less time on integration than optimally prescribed when reward decreased slowly over time, and the most recent evidence did not have the maximal influence on choice. The suboptimal pattern of reaction time was partially mitigated in an equivalent control experiment in which sensory integration over time was not required, indicating that the suboptimal response time was influenced by the perception of imperfect sensory integration. Meanwhile, during combination of sensory and reward information, performance did not drop as the deadline approached, and response time was not different between correct and incorrect trials. These results indicate a decision process different from what is involved in the integration of sensory information over time. Together, our results not only reveal limitations in sensory integration over time but also illustrate how these limitations influence dynamic combination of sensory and reward information. PMID:29584717
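
    A minimal sketch of one candidate mechanism mentioned above, leaky integration of evidence with an urgency signal (implemented here as a collapsing decision bound), is given below in Python. The parameters and the specific form of the urgency signal are illustrative assumptions, not the models fitted in the study.

      import numpy as np

      rng = np.random.default_rng(3)

      def leaky_urgency_trial(drift=0.2, leak=0.1, noise=1.0, bound0=2.0,
                              collapse=0.5, dt=0.01, deadline=3.0):
          """One trial of leaky evidence accumulation with a collapsing bound.

          Returns (choice, reaction_time); choice is +1/-1, or 0 if the deadline
          passes before either bound is reached.
          """
          x, t = 0.0, 0.0
          while t < deadline:
              x += dt * (drift - leak * x) + np.sqrt(dt) * noise * rng.normal()
              t += dt
              bound = max(bound0 - collapse * t, 0.1)  # urgency: bound shrinks over time
              if abs(x) >= bound:
                  return (1 if x > 0 else -1), t
          return 0, deadline

      choices, rts = zip(*(leaky_urgency_trial() for _ in range(500)))
      print("proportion correct:", np.mean(np.array(choices) == 1),
            "mean RT:", round(float(np.mean(rts)), 3))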

  6. Can correlations among receptors affect the information about the stimulus?

    NASA Astrophysics Data System (ADS)

    Singh, Vijay; Tchernookov, Martin; Nemenman, Ilya

    2014-03-01

    In the context of neural information processing, it has been observed that, compared to the case of independent receptors, correlated receptors can often carry more information about the stimulus. We explore similar ideas in the context of molecular information processing, analyzing a cell with receptors whose activity is intrinsically negatively correlated because they compete for the same ligand molecules. We show analytically that, in case the involved biochemical interactions are linear, the information between the number of molecules captured by the receptors and the ligand concentration does not depend on correlations among the receptors. For a nonlinear kinetic network, correlations similarly do not change the amount of information for observation times much shorter or much longer than the characteristic time scale of ligand molecule binding and unbinding. However, at intermediate times, correlations can increase the amount of available information. This work has been supported by the James S McDonnell foundation.

  7. The effect of manipulating context-specific information on perceptual-cognitive processes during a simulated anticipation task.

    PubMed

    McRobert, Allistair P; Ward, Paul; Eccles, David W; Williams, A Mark

    2011-08-01

    We manipulated contextual information in order to examine the perceptual-cognitive processes that support anticipation using a simulated cricket-batting task. Skilled (N= 10) and less skilled (N= 10) cricket batters responded to video simulations of opponents bowling a cricket ball under high and low contextual information conditions. Skilled batters were more accurate, demonstrated more effective search behaviours, and provided more detailed verbal reports of thinking. Moreover, when they viewed their opponent multiple times (high context), they reduced their mean fixation time. All batters improved performance and altered thought processes when in the high context, compared to when they responded to their opponent without previously seeing them bowl (low context). Findings illustrate how context influences performance and the search for relevant information when engaging in a dynamic, time-constrained task. ©2011 The British Psychological Society.

  8. Capturing and Modeling Domain Knowledge Using Natural Language Processing Techniques

    DTIC Science & Technology

    2005-06-01

    Intelligence Artificielle, France, May 2001, pp. 109-118 [Barrière, 2001] -----. “Investigating the Causal Relation in Informative Texts”. Terminology, 7:2 ... out of the flood of information, the military have to create new ways of processing sensor and intelligence information, and of providing the results to commanders who must take timely operational ...

  9. Real-Time Nonlinear Optical Information Processing.

    DTIC Science & Technology

    1979-06-01

    operations are presented. One approach realizes the halftone method of nonlinear optical processing in real time by replacing the conventional ... photographic recording medium with a real-time image transducer. In the second approach, halftoning is eliminated and the real-time device is used directly.

  10. The effect of time perspectives on mental health information processing and help-seeking attitudes and intentions in younger versus older adults.

    PubMed

    Erickson, Julie; Mackenzie, Corey S; Menec, Verena H; Bailis, Daniel S

    2017-03-01

    Socioemotional selectivity theory posits that changes in time perspective over the lifespan are associated with distinct goals and motivations. Time perspectives and their associated socioemotional motivations have been shown to influence information processing and memory, such that motivation-consistent information is more likely to be remembered and evaluated more positively. The aim of this study was to examine the effect of motivation-consistent mental health information on memory for and evaluations of this information, as well as help-seeking attitudes and intentions to seek mental health services. We randomly assigned an Internet-based sample of 160 younger (18-25) and 175 older (60-89) adults to read a mental health information pamphlet that emphasized time perspectives and motivations relevant to either young adulthood (future-focused) or late adulthood (present-focused). Participants completed measures assessing their time perspective, memory for and subjective evaluation of the pamphlet, and help-seeking attitudes and intentions. The time perspective manipulation had no effect on memory for pamphlet information or help-seeking attitudes and intentions. There was, however, a significant interaction between time perspective and pamphlet version on the rated liking of the pamphlet. Although motivation-consistent information only affected perceptions of that information for present-focused (mostly older) individuals, this finding has important implications for enhancing older adults' mental health literacy.

  11. Research of Manufacture Time Management System Based on PLM

    NASA Astrophysics Data System (ADS)

    Jing, Ni; Juan, Zhu; Liangwei, Zhong

    This system targets the machine shops of manufacturing enterprises. It analyzes their business needs and builds a plant management information system for manufacture time and manufacture-time information management of the manufacturing process. Combining web technology with an Excel VBA-based development approach, it constructs a hybrid, PLM-based framework for a workshop manufacture time management information system and discusses the functionality of the system architecture and the database structure.

  12. Mutual information identifies spurious Hurst phenomena in resting state EEG and fMRI data

    NASA Astrophysics Data System (ADS)

    von Wegner, Frederic; Laufs, Helmut; Tagliazucchi, Enzo

    2018-02-01

    Long-range memory in time series is often quantified by the Hurst exponent H , a measure of the signal's variance across several time scales. We analyze neurophysiological time series from electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) resting state experiments with two standard Hurst exponent estimators and with the time-lagged mutual information function applied to discretized versions of the signals. A confidence interval for the mutual information function is obtained from surrogate Markov processes with equilibrium distribution and transition matrix identical to the underlying signal. For EEG signals, we construct an additional mutual information confidence interval from a short-range correlated, tenth-order autoregressive model. We reproduce the previously described Hurst phenomenon (H >0.5 ) in the analytical amplitude of alpha frequency band oscillations, in EEG microstate sequences, and in fMRI signals, but we show that the Hurst phenomenon occurs without long-range memory in the information-theoretical sense. We find that the mutual information function of neurophysiological data behaves differently from fractional Gaussian noise (fGn), for which the Hurst phenomenon is a sufficient condition to prove long-range memory. Two other well-characterized, short-range correlated stochastic processes (Ornstein-Uhlenbeck, Cox-Ingersoll-Ross) also yield H >0.5 , whereas their mutual information functions lie within the Markovian confidence intervals, similar to neural signals. In these processes, which do not have long-range memory by construction, a spurious Hurst phenomenon occurs due to slow relaxation times and heteroscedasticity (time-varying conditional variance). In summary, we find that mutual information correctly distinguishes long-range from short-range dependence in the theoretical and experimental cases discussed. Our results also suggest that the stationary fGn process is not sufficient to describe neural data, which seem to belong to a more general class of stochastic processes, in which multiscale variance effects produce Hurst phenomena without long-range dependence. In our experimental data, the Hurst phenomenon and long-range memory appear as different system properties that should be estimated and interpreted independently.
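
    A hedged sketch of the time-lagged mutual information estimator used alongside the Hurst estimators is given below: the signal is discretized by binning, and the mutual information between x(t) and x(t+lag) is computed from the joint histogram. The binning choice and the AR(1) test signal are illustrative assumptions, not the authors' pipeline.

      import numpy as np

      def lagged_mutual_information(x, lag, n_bins=8):
          """Mutual information (bits) between x(t) and x(t+lag) after binning."""
          joint, _, _ = np.histogram2d(x[:-lag], x[lag:], bins=n_bins)
          p = joint / joint.sum()
          px = p.sum(axis=1, keepdims=True)
          py = p.sum(axis=0, keepdims=True)
          nz = p > 0
          return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

      # Short-range correlated AR(1) test signal (phi = 0.8), for illustration.
      rng = np.random.default_rng(4)
      ar1 = np.zeros(20000)
      for t in range(1, ar1.size):
          ar1[t] = 0.8 * ar1[t - 1] + rng.normal()
      print([round(lagged_mutual_information(ar1, lag), 3) for lag in (1, 10, 100)])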

  13. Characterizing Information Processing With a Mobile Device: Measurement of Simple and Choice Reaction Time.

    PubMed

    Burke, Daniel; Linder, Susan; Hirsch, Joshua; Dey, Tanujit; Kana, Daniel; Ringenbach, Shannon; Schindler, David; Alberts, Jay

    2017-10-01

    Information processing is typically evaluated using simple reaction time (SRT) and choice reaction time (CRT) paradigms in which a specific response is initiated following a given stimulus. The measurement of reaction time (RT) has evolved from monitoring the timing of mechanical switches to computerized paradigms. The proliferation of mobile devices with touch screens makes them a natural next technological approach to assess information processing. The aims of this study were to determine the validity and reliability of using of a mobile device (Apple iPad or iTouch) to accurately measure RT. Sixty healthy young adults completed SRT and CRT tasks using a traditional test platform and mobile platforms on two occasions. The SRT was similar across test modality: 300, 287, and 280 milliseconds (ms) for the traditional, iPad, and iTouch, respectively. The CRT was similar within mobile devices, though slightly faster on the traditional: 359, 408, and 384 ms for traditional, iPad, and iTouch, respectively. Intraclass correlation coefficients ranged from 0.79 to 0.85 for SRT and from 0.75 to 0.83 for CRT. The similarity and reliability of SRT across platforms and consistency of SRT and CRT across test conditions indicate that mobile devices provide the next generation of assessment platforms for information processing.

  14. Use of Referential Discourse Contexts in L2 Offline and Online Sentence Processing.

    PubMed

    Yang, Pi-Lan

    2016-10-01

    The present study aimed to investigate (a) the extent to which Chinese-speaking learners of English in Taiwan use referential noun phrase (NP) information contained in discourse contexts to complete ambiguous noun/verb fragments in a sentence completion task, and (b) whether and when they use the contexts to disambiguate main verb versus reduced relative clause (MV/RRC) ambiguities in real time. Results showed that, unlike native English speakers, English learners did not show a marked increase in RRC completions in biasing two-NP-referent discourse contexts, except for advanced learners. Nevertheless, like native speakers, the learners at elementary, intermediate, and advanced English proficiency levels all used the information in a later stage of resolving the MV/RRC ambiguities in real time. The observed delayed effect of referential context information suggests that L2 learners, like native speakers, are able to construct syntax-to-discourse mappings in real time. It also suggests that processing of syntactic information takes precedence over integration of syntactic information with discourse information during L1 and L2 online sentence processing.

  15. Processing short-term and long-term information with a combination of polynomial approximation techniques and time-delay neural networks.

    PubMed

    Fuchs, Erich; Gruber, Christian; Reitmaier, Tobias; Sick, Bernhard

    2009-09-01

    Neural networks are often used to process temporal information, i.e., any kind of information related to time series. In many cases, time series contain short-term and long-term trends or behavior. This paper presents a new approach to capture temporal information with various reference periods simultaneously. A least squares approximation of the time series with orthogonal polynomials is used to describe short-term trends contained in a signal (average, increase, curvature, etc.). Long-term behavior is modeled with the tapped delay lines of a time-delay neural network (TDNN). This network takes the coefficients of the orthogonal expansion of the approximating polynomial as inputs, thus taking short-term and long-term information into account efficiently. The advantages of the method are demonstrated by means of artificial data and two real-world application examples, the prediction of the number of users in a computer network and online tool wear classification in turning.
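
    The short-term feature extraction described above can be sketched as follows: fit orthogonal polynomials to each sliding window by least squares and keep the low-order coefficients as inputs for the time-delay network (omitted here). The sketch uses NumPy's Legendre basis as a stand-in for the paper's orthogonal polynomials; the window length and degree are assumptions.

      import numpy as np
      from numpy.polynomial import legendre

      def window_coefficients(series, window=32, degree=3):
          """Least-squares Legendre coefficients for each sliding window.

          The low-order coefficients summarize short-term trends (level, slope,
          curvature, ...); they would be fed to a time-delay neural network,
          which is omitted here.
          """
          x = np.linspace(-1.0, 1.0, window)           # orthogonality interval
          return np.array([legendre.legfit(x, series[i:i + window], degree)
                           for i in range(len(series) - window + 1)])

      rng = np.random.default_rng(5)
      signal = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.normal(size=500)
      print(window_coefficients(signal).shape)         # (n_windows, degree + 1)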

  16. Adult Spinal Deformity Patients Recall Fewer Than 50% of the Risks Discussed in the Informed Consent Process Preoperatively and the Recall Rate Worsens Significantly in the Postoperative Period.

    PubMed

    Saigal, Rajiv; Clark, Aaron J; Scheer, Justin K; Smith, Justin S; Bess, Shay; Mummaneni, Praveen V; McCarthy, Ian M; Hart, Robert A; Kebaish, Khaled M; Klineberg, Eric O; Deviren, Vedat; Schwab, Frank; Shaffrey, Christopher I; Ames, Christopher P

    2015-07-15

    Recall of the informed consent process in patients undergoing adult spinal deformity surgery and their family members was investigated prospectively. The objectives were to quantify the percentage recall of the most common complications discussed during the informed consent process in adult spinal deformity surgery, to assess for differences between patients and family members, and to correlate recall with mental status. Given the high rates of complications in adult spinal deformity surgery, it is critical to shared decision making that patients are adequately informed about risks and are able to recall the preoperative discussion of possible complications, which also mitigates medical-legal risk. Patients undergoing adult spinal deformity surgery underwent an augmented informed consent process involving both verbal and video explanations. Recall of the 11 most common complications was scored. Mental status was assessed with the mini-mental status examination-brief version. Patients subjectively scored the informed consent process and video. After surgery, the recall test and mini-mental status examination-brief version were readministered at 5 additional time points: hospital discharge, 6 to 8 weeks, 3 months, 6 months, and 1 year postoperatively. Family members were assessed at the first 3 time points for comparison. Fifty-six patients enrolled. Despite ranking the consent process as important (median overall score: 10/10; video score: 9/10), median patient recall was only 45% immediately after the discussion and video reinforcement, and it subsequently declined to 18% at 6 to 8 weeks and 1 year postoperatively. Median family recall trended higher, at 55% immediately and 36% at 6 to 8 weeks postoperatively. The perception of the severity of complications differs significantly between patient and surgeon. Mental status scores showed a transient, significant decrease from the preoperative assessment to discharge but were significantly higher at 1 year. Despite being well informed in an optimized informed consent process, patients cannot recall most of the surgical risks discussed, and recall declines over time. Significant progress remains to be made in improving informed consent retention. Level of Evidence: 3.

  17. Information Processing in Memory Tasks.

    ERIC Educational Resources Information Center

    Johnston, William A.

    The intensity of information processing engendered in different phases of standard memory tasks was examined in six experiments. Processing intensity was conceptualized as system capacity consumed, and was measured via a divided-attention procedure in which subjects performed a memory task and a simple reaction-time (RT) task concurrently. The…

  18. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    NASA Astrophysics Data System (ADS)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  19. ADHD performance reflects inefficient but not impulsive information processing: a diffusion model analysis.

    PubMed

    Metin, Baris; Roeyers, Herbert; Wiersema, Jan R; van der Meere, Jaap J; Thompson, Margaret; Sonuga-Barke, Edmund

    2013-03-01

    Attention-deficit/hyperactivity disorder (ADHD) is associated with performance deficits across a broad range of tasks. Although individual tasks are designed to tap specific cognitive functions (e.g., memory, inhibition, planning, etc.), these deficits could also reflect general effects related to either inefficient or impulsive information processing or both. These two components cannot be isolated from each other on the basis of classical analysis in which mean reaction time (RT) and mean accuracy are handled separately. Seventy children with a diagnosis of combined type ADHD and 50 healthy controls (between 6 and 17 years) performed two tasks: a simple two-choice RT (2-CRT) task and a conflict control task (CCT) that required higher levels of executive control. RT and errors were analyzed using the Ratcliff diffusion model, which divides decisional time into separate estimates of information processing efficiency (called "drift rate") and speed-accuracy tradeoff (SATO, called "boundary"). The model also provides an estimate of general nondecisional time. Results were the same for both tasks independent of executive load. ADHD was associated with lower drift rate and less nondecisional time. The groups did not differ in terms of boundary parameter estimates. RT and accuracy performance in ADHD appears to reflect inefficient rather than impulsive information processing, an effect independent of executive function load. The results are consistent with models in which basic information processing deficits make an important contribution to the ADHD cognitive phenotype. PsycINFO Database Record (c) 2013 APA, all rights reserved.
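
    To illustrate how the diffusion-model decomposition separates inefficient from impulsive processing, the Python sketch below simulates a simple drift-diffusion process: lowering the drift rate slows responses and reduces accuracy, whereas lowering the boundary speeds responses at the cost of accuracy. The parameters are illustrative, and this is a simulation rather than the Ratcliff fitting procedure used in the study.

      import numpy as np

      rng = np.random.default_rng(6)

      def ddm_trial(drift, boundary, nondecision=0.3, dt=0.001, noise=1.0):
          """Simulate one drift-diffusion trial; returns (correct, reaction_time)."""
          x, t = 0.0, 0.0
          while abs(x) < boundary / 2:                 # symmetric bounds at +/- boundary/2
              x += drift * dt + noise * np.sqrt(dt) * rng.normal()
              t += dt
          return x > 0, t + nondecision

      def summarize(drift, boundary, n=2000):
          trials = [ddm_trial(drift, boundary) for _ in range(n)]
          accuracy = np.mean([c for c, _ in trials])
          mean_rt = np.mean([t for _, t in trials])
          return round(float(accuracy), 3), round(float(mean_rt), 3)

      print("reference      (v=0.30, a=1.0):", summarize(0.30, 1.0))
      print("lower drift    (v=0.15, a=1.0):", summarize(0.15, 1.0))  # slower, less accurate
      print("lower boundary (v=0.30, a=0.6):", summarize(0.30, 0.6))  # faster, less accurate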

  20. Soft sensor for real-time cement fineness estimation.

    PubMed

    Stanišić, Darko; Jorgovanović, Nikola; Popov, Nikola; Čongradac, Velimir

    2015-03-01

    This paper describes the design and implementation of soft sensors to estimate cement fineness. Soft sensors are mathematical models that use available data to provide real-time information on process variables when that information, for whatever reason, is not available by direct measurement. In this application, soft sensors are used to provide information on a process variable normally obtained from off-line laboratory tests performed at large time intervals. Cement fineness is one of the crucial parameters that define the quality of produced cement. Providing real-time information on cement fineness using soft sensors can overcome the limitations and problems that originate from the lack of information between two laboratory tests. The model inputs were selected from candidate process variables using an information-theoretic approach. Models based on multi-layer perceptrons were developed, and their ability to estimate the cement fineness of laboratory samples was analyzed. The models that had the best performance and the capacity to adapt to changes in the cement grinding circuit were selected to implement the soft sensors. The soft sensors were tested using data from continuous cement production to demonstrate their use in real-time fineness estimation. Their performance was highly satisfactory, and the sensors proved to be capable of providing valuable information on cement grinding circuit performance. After successful off-line tests, the soft sensors were implemented and installed in the control room of a cement factory. Results on site confirm the results obtained by tests conducted during soft sensor development. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
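
    A hedged sketch of such a soft sensor, a multilayer perceptron mapping routinely measured process variables to the laboratory-measured fineness, is shown below using synthetic stand-in data and scikit-learn. The input variables, model size, and data are assumptions for illustration, not the plant's actual configuration.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic stand-in data: each row is a snapshot of mill variables
      # (e.g., separator speed, mill power, feed rate, ...); the target is the
      # laboratory-measured fineness. Real variable names and data would differ.
      rng = np.random.default_rng(7)
      X = rng.normal(size=(500, 6))
      y = 3500 + 120 * X[:, 0] - 80 * X[:, 2] + 20 * rng.normal(size=500)

      soft_sensor = make_pipeline(
          StandardScaler(),
          MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
      )
      soft_sensor.fit(X[:400], y[:400])                # train on historical data
      print("held-out R^2:", round(soft_sensor.score(X[400:], y[400:]), 3))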

  1. The use of information theory in evolutionary biology.

    PubMed

    Adami, Christoph

    2012-05-01

    Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task. © 2012 New York Academy of Sciences.

  2. MO-F-CAMPUS-T-02: An Electronic Whiteboard Platform to Manage Treatment Planning Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCostanzo, D; Woollard, J; Gupta, N

    2015-06-15

    Purpose: In an effort to improve patient safety and streamline the radiotherapy treatment planning (TP) process, a software-based whiteboard has been developed and put into use in our facility. Methods: The electronic whiteboard, developed using a SQL database (DB) and a PHP/JavaScript-based web interface, is published via the department intranet with login credentials. The DB stores data for each TP process such as patient information, plan type, simulation/start dates, physician, dosimetrist, QA, and the current status in the planning process. Users interact with the DB per plan and perform status updates in real time as the planning process progresses. All user interactions with the DB are recorded with timestamps so as to calculate statistical information for TP process management, such as contouring times, planning and review times, and dosimetry, physics, and therapist QA times. External beam and brachytherapy plans are categorized according to complexity (e.g., IMRT, 3D, HDR, LDR) and treatment types and applicators. Each plan category is assigned specific timelines for each planning process. When a plan approaches or passes the predetermined timeline, users are alerted via color-coded graphical cues. When certain process items are not completed in time, pre-determined actions are triggered, such as a delay in the treatment start date. Results: Our institution has been using the electronic whiteboard for two years. Implementation of pre-determined actions based on the statistical information collected by the whiteboard improved our TP process. For example, the average time for normal tissue contouring decreased from 0.73±1.37 to 0.24±0.33 days. The average time for target volume contouring decreased from 3.2±2.84 to 2.37±2.54 days. This increase in efficiency allows more time for quality assurance processes, improving patient safety. Conclusion: The electronic whiteboard has been an invaluable tool for streamlining our TP processes. It facilitates timely and accurate communication between all parties involved in the TP process, increasing patient safety.

  3. Enhancing Quality of Orthotic Services with Process and Outcome Information

    DTIC Science & Technology

    2017-10-01

    AWARD NUMBER: W81XWH-16-1-0788. TITLE: Enhancing Quality of Orthotic Services with Process and Outcome Information.

  4. Time-dependent probability density functions and information geometry in stochastic logistic and Gompertz models

    NASA Astrophysics Data System (ADS)

    Tenkès, Lucille-Marie; Hollerbach, Rainer; Kim, Eun-jin

    2017-12-01

    A probabilistic description is essential for understanding growth processes in non-stationary states. In this paper, we compute time-dependent probability density functions (PDFs) in order to investigate stochastic logistic and Gompertz models, which are two of the most popular growth models. We consider different types of short-correlated multiplicative and additive noise sources and compare the time-dependent PDFs in the two models, elucidating the effects of the additive and multiplicative noises on the form of PDFs. We demonstrate an interesting transition from a unimodal to a bimodal PDF as the multiplicative noise increases for a fixed value of the additive noise. A much weaker (leaky) attractor in the Gompertz model leads to a significant (singular) growth of the population of a very small size. We point out the limitation of using stationary PDFs, mean value and variance in understanding statistical properties of the growth in non-stationary states, highlighting the importance of time-dependent PDFs. We further compare these two models from the perspective of information change that occurs during the growth process. Specifically, we define an infinitesimal distance at any time by comparing two PDFs at times infinitesimally apart and sum these distances in time. The total distance along the trajectory quantifies the total number of different states that the system undergoes in time, and is called the information length. We show that the time-evolution of the two models become more similar when measured in units of the information length and point out the merit of using the information length in unifying and understanding the dynamic evolution of different growth processes.
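
    The information length described above can be written, in one common formulation (notation may differ from the paper's), as:

      \mathcal{E}(t) = \int \frac{1}{p(x,t)}
        \left( \frac{\partial p(x,t)}{\partial t} \right)^{2} dx ,
      \qquad
      \mathcal{L}(t) = \int_{0}^{t} \sqrt{\mathcal{E}(t')} \, dt' ,

    so that \mathcal{L}(t) accumulates the infinitesimal statistical distances between PDFs at neighbouring times and counts the number of statistically distinguishable states the system passes through during the growth process.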

  5. Optical information-processing systems and architectures II; Proceedings of the Meeting, San Diego, CA, July 9-13, 1990

    NASA Astrophysics Data System (ADS)

    Javidi, Bahram

    The present conference discusses topics in the fields of neural networks, acoustooptic signal processing, pattern recognition, phase-only processing, nonlinear signal processing, image processing, optical computing, and optical information processing. Attention is given to the optical implementation of an inner-product neural associative memory, optoelectronic associative recall via motionless-head/parallel-readout optical disk, a compact real-time acoustooptic image correlator, a multidimensional synthetic estimation filter, and a light-efficient joint transform optical correlator. Also discussed are a high-resolution spatial light modulator, compact real-time interferometric Fourier-transform processors, a fast decorrelation algorithm for permutation arrays, the optical interconnection of optical modules, and carry-free optical binary adders.

  6. Dynamic frontotemporal systems process space and time in working memory

    PubMed Central

    Adams, Jenna N.; Solbakk, Anne-Kristin; Endestad, Tor; Larsson, Pål G.; Ivanovic, Jugoslav; Meling, Torstein R.; Lin, Jack J.; Knight, Robert T.

    2018-01-01

    How do we rapidly process incoming streams of information in working memory, a cognitive mechanism central to human behavior? Dominant views of working memory focus on the prefrontal cortex (PFC), but human hippocampal recordings provide a neurophysiological signature distinct from the PFC. Are these regions independent, or do they interact in the service of working memory? We addressed this core issue in behavior by recording directly from frontotemporal sites in humans performing a visuospatial working memory task that operationalizes the types of identity and spatiotemporal information we encounter every day. Theta band oscillations drove bidirectional interactions between the PFC and medial temporal lobe (MTL; including the hippocampus). MTL theta oscillations directed the PFC preferentially during the processing of spatiotemporal information, while PFC theta oscillations directed the MTL for all types of information being processed in working memory. These findings reveal an MTL theta mechanism for processing space and time and a domain-general PFC theta mechanism, providing evidence that rapid, dynamic MTL–PFC interactions underlie working memory for everyday experiences. PMID:29601574

  7. Using stochastic language models (SLM) to map lexical, syntactic, and phonological information processing in the brain.

    PubMed

    Lopopolo, Alessandro; Frank, Stefan L; van den Bosch, Antal; Willems, Roel M

    2017-01-01

    Language comprehension involves the simultaneous processing of information at the phonological, syntactic, and lexical level. We track these three distinct streams of information in the brain by using stochastic measures derived from computational language models to detect neural correlates of phoneme, part-of-speech, and word processing in an fMRI experiment. Probabilistic language models have proven to be useful tools for studying how language is processed as a sequence of symbols unfolding in time. Conditional probabilities between sequences of words are at the basis of probabilistic measures such as surprisal and perplexity which have been successfully used as predictors of several behavioural and neural correlates of sentence processing. Here we computed perplexity from sequences of words and their parts of speech, and their phonemic transcriptions. Brain activity time-locked to each word is regressed on the three model-derived measures. We observe that the brain keeps track of the statistical structure of lexical, syntactic and phonological information in distinct areas.
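
    A hedged illustration of the model-derived quantities, per-word surprisal and sequence perplexity computed from conditional probabilities, is given below using a toy add-alpha-smoothed bigram model in Python. The corpus and smoothing scheme are stand-ins for illustration, not the stochastic language models used in the study.

      import math
      from collections import Counter

      def bigram_model(tokens, alpha=0.1):
          """Add-alpha smoothed bigram model; returns P(word | previous word)."""
          unigrams = Counter(tokens)
          bigrams = Counter(zip(tokens, tokens[1:]))
          vocab = len(unigrams)
          def prob(prev, word):
              return (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * vocab)
          return prob

      corpus = "the dog chased the cat and the cat chased the dog".split()
      prob = bigram_model(corpus)

      sentence = "the cat chased the dog".split()
      surprisals = [-math.log2(prob(prev, w)) for prev, w in zip(sentence, sentence[1:])]
      perplexity = 2 ** (sum(surprisals) / len(surprisals))
      print("per-word surprisal (bits):", [round(s, 2) for s in surprisals])
      print("perplexity:", round(perplexity, 2))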

  8. Real-time lexical comprehension in young children learning American Sign Language.

    PubMed

    MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A; Fernald, Anne

    2018-04-16

    When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are important for language acquisition regardless of language modality. © 2018 John Wiley & Sons Ltd.

  9. Digital Image Processing Overview For Helmet Mounted Displays

    NASA Astrophysics Data System (ADS)

    Parise, Michael J.

    1989-09-01

    Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.

  10. Breaking down barriers in cooperative fault management: Temporal and functional information displays

    NASA Technical Reports Server (NTRS)

    Potter, Scott S.; Woods, David D.

    1994-01-01

    At the highest level, the fundamental question addressed by this research is how to aid human operators engaged in dynamic fault management. In dynamic fault management there is some underlying dynamic process (an engineered or physiological process referred to as the monitored process - MP) whose state changes over time and whose behavior must be monitored and controlled. In these types of applications (dynamic, real-time systems), a vast array of sensor data is available to provide information on the state of the MP. Faults disturb the MP and diagnosis must be performed in parallel with responses to maintain process integrity and to correct the underlying problem. These situations frequently involve time pressure, multiple interacting goals, high consequences of failure, and multiple interleaved tasks.

  11. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimizing production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring the key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  12. Hierarchical process memory: memory as an integral component of information processing

    PubMed Central

    Hasson, Uri; Chen, Janice; Honey, Christopher J.

    2015-01-01

    Models of working memory commonly focus on how information is encoded into and retrieved from storage at specific moments. However, in the majority of real-life processes, past information is used continuously to process incoming information across multiple timescales. Considering single unit, electrocorticography, and functional imaging data, we argue that (i) virtually all cortical circuits can accumulate information over time, and (ii) the timescales of accumulation vary hierarchically, from early sensory areas with short processing timescales (tens to hundreds of milliseconds) to higher-order areas with long processing timescales (many seconds to minutes). In this hierarchical systems perspective, memory is not restricted to a few localized stores, but is intrinsic to information processing that unfolds throughout the brain on multiple timescales. “The present contains nothing more than the past, and what is found in the effect was already in the cause.”Henri L Bergson PMID:25980649

  13. Future electro-optical sensors and processing in urban operations

    NASA Astrophysics Data System (ADS)

    Grönwall, Christina; Schwering, Piet B.; Rantakokko, Jouni; Benoist, Koen W.; Kemp, Rob A. W.; Steinvall, Ove; Letalick, Dietmar; Björkert, Stefan

    2013-10-01

    In the electro-optical sensors and processing in urban operations (ESUO) study we pave the way for the European Defence Agency (EDA) group of Electro-Optics experts (IAP03) for a common understanding of the optimal distribution of processing functions between the different platforms. Combinations of local, distributed and centralized processing are proposed. In this way one can match processing functionality to the required power, and available communication systems data rates, to obtain the desired reaction times. In the study, three priority scenarios were defined. For these scenarios, present-day and future sensors and signal processing technologies were studied. The priority scenarios were camp protection, patrol and house search. A method for analyzing information quality in single and multi-sensor systems has been applied. A method for estimating reaction times for transmission of data through the chain of command has been proposed and used. These methods are documented and can be used to modify scenarios, or be applied to other scenarios. Present day data processing is organized mainly locally. Very limited exchange of information with other platforms is present; this is performed mainly at a high information level. Main issues that arose from the analysis of present-day systems and methodology are the slow reaction time due to the limited field of view of present-day sensors and the lack of robust automated processing. Efficient handover schemes between wide and narrow field of view sensors may however reduce the delay times. The main effort in the study was in forecasting the signal processing of EO-sensors in the next ten to twenty years. Distributed processing is proposed between hand-held and vehicle based sensors. This can be accompanied by cloud processing on board several vehicles. Additionally, to perform sensor fusion on sensor data originating from different platforms, and making full use of UAV imagery, a combination of distributed and centralized processing is essential. There is a central role for sensor fusion of heterogeneous sensors in future processing. The changes that occur in the urban operations of the future due to the application of these new technologies will be the improved quality of information, with shorter reaction time, and with lower operator load.

  14. Disabled vs nondisabled readers: perceptual vs higher-order processing of one vs three letters.

    PubMed

    Allegretti, C L; Puglisi, J T

    1986-10-01

    12 disabled and 12 nondisabled readers (mean age, 11 yr.) were compared on a letter-search task which separated perceptual processing from higher-order processing. Participants were presented a first stimulus (for 200 msec. to minimize eye movements) followed by a second stimulus either immediately (to estimate the amount of information initially perceived) or after a 3000-msec. interval (to examine information more permanently stored). Participants were required to decide whether any letter present in the first stimulus was also present in the second. Two processing loads (1 and 3 letters) were examined. Disabled readers showed more pronounced deficits when they were given very little time to process information or more information to process.

  15. The impact of storage on processing: how is information maintained in working memory?

    PubMed

    Vergauwe, Evie; Camos, Valérie; Barrouillet, Pierre

    2014-07-01

    Working memory is typically defined as a system devoted to the simultaneous maintenance and processing of information. However, the interplay between these 2 functions is still a matter of debate in the literature, with views ranging from complete independence to complete dependence. The time-based resource-sharing model assumes that a central bottleneck constrains the 2 functions to alternate in such a way that maintenance activities postpone concurrent processing, with each additional piece of information to be maintained resulting in an additional postponement. Using different kinds of memoranda, we examined in a series of 7 experiments the effect of increasing memory load on different processing tasks. The results reveal that, insofar as attention is needed for maintenance, processing times linearly increase at a rate of about 50 ms per verbal or visuospatial memory item, suggesting a very fast refresh rate in working memory. Our results also show an asymmetry between verbal and spatial information, in that spatial information can solely rely on attention for its maintenance while verbal information can also rely on a domain-specific maintenance mechanism independent from attention. The implications for the functioning of working memory are discussed, with a specific focus on how information is maintained in working memory. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  16. Recovery of speed of information processing in closed-head-injury patients.

    PubMed

    Zwaagstra, R; Schmidt, I; Vanier, M

    1996-06-01

    After severe traumatic brain injury, patients almost invariably demonstrate a slowing of reaction time, reflecting a slowing of central information processing. Methodological problems associated with the traditional method for the analysis of longitudinal data (MANOVA) severely complicate studies on cognitive recovery. It is argued that multilevel models are often better suited for the analysis of improvement over time in clinical settings. Multilevel models take into account individual differences in both overall performance level and recovery. These models enable individual predictions for the recovery of speed of information processing. Recovery is modelled in a group of closed-head-injury patients (N = 24). Recovery was predicted by age and severity of injury, as indicated by coma duration. Over a period up to 44 months post trauma, reaction times were found to decrease faster for patients with longer coma duration.

  17. Real-time physiological monitoring with distributed networks of sensors and object-oriented programming techniques

    NASA Astrophysics Data System (ADS)

    Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.

    1998-05-01

    Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge. This is often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model and relay processed data, trends and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first includes a process to account for variability in vital parameter values for each individual's normal physiologic response by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system orders the information for each user including the subject, local company officers, medical personnel and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links along with incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
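
    As a rough illustration of the adaptive, per-individual baselining described above, the Python sketch below updates an exponentially weighted estimate of each worker's normal vital-sign level and flags large deviations. The EWMA merely stands in for the adaptive network mentioned in the abstract, and the parameter names, threshold, and sample values are illustrative assumptions.

      # Minimal sketch: per-individual adaptive baseline with deviation alerts.
      # An EWMA stands in for the adaptive network described in the abstract;
      # alpha, the alert threshold, and the sample values are illustrative.
      from dataclasses import dataclass

      @dataclass
      class VitalBaseline:
          alpha: float = 0.05      # adaptation rate of the individual baseline
          mean: float = 0.0
          var: float = 1.0
          initialized: bool = False

          def update(self, x: float) -> float:
              """Update the baseline with a new sample and return a z-like deviation score."""
              if not self.initialized:
                  self.mean, self.initialized = x, True
                  return 0.0
              d = x - self.mean
              self.mean += self.alpha * d
              self.var = (1 - self.alpha) * (self.var + self.alpha * d * d)
              return d / (self.var ** 0.5 + 1e-9)

      baseline = VitalBaseline()
      for hr in [72, 74, 71, 75, 73, 120]:   # heart-rate samples; 120 is a sudden insult
          if abs(baseline.update(hr)) > 3.0:
              print(f"alert: heart rate {hr} deviates strongly from this individual's baseline")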

  18. Temporo-parietal junction activity in theory-of-mind tasks: falseness, beliefs, or attention.

    PubMed

    Aichhorn, Markus; Perner, Josef; Weiss, Benjamin; Kronbichler, Martin; Staffen, Wolfgang; Ladurner, Gunther

    2009-06-01

    By combining the false belief (FB) and photo (PH) vignettes to identify theory-of-mind areas with the false sign (FS) vignettes, we re-establish the functional asymmetry between the left and right temporo-parietal junction (TPJ). The right TPJ (TPJ-R) is specially sensitive to processing belief information, whereas the left TPJ (TPJ-L) is equally responsible for FBs as well as FSs. Measuring BOLD at two time points in each vignette, at the time the FB-inducing information (or lack of information) is presented and at the time the test question is processed, made clear that the FB is processed spontaneously as soon as the relevant information is presented and not on demand for answering the question in contrast to extant behavioral data. Finally, a fourth, true belief vignette (TB) required teleological reasoning, that is, prediction of a rational action without any doubts being raised about the adequacy of the actor's information about reality. Activation by this vignette supported claims that the TPJ-R is activated by TBs as well as FBs.

  19. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measure of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.
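
    As a rough illustration of how unobtrusively collected location events can be turned into process-level timing information, the Python sketch below computes per-stage durations from a patient's trace. The event format, zone names, and timestamps are illustrative assumptions; a real deployment would feed such traces into a process-mining algorithm rather than this simple aggregation.

      # Minimal sketch: per-stage durations of a surgical process derived from
      # indoor location events. Event format and zone names are illustrative.
      from collections import defaultdict
      from datetime import datetime

      events = [  # (patient_id, zone, timestamp) as emitted by the location system
          ("p1", "waiting",   "2015-02-03 08:00"),
          ("p1", "operating", "2015-02-03 08:40"),
          ("p1", "recovery",  "2015-02-03 10:10"),
          ("p1", "discharge", "2015-02-03 12:00"),
      ]

      def stage_durations(evts):
          """Return minutes spent in each zone, per patient, from time-ordered events."""
          by_patient = defaultdict(list)
          for pid, zone, ts in evts:
              by_patient[pid].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), zone))
          durations = defaultdict(dict)
          for pid, trace in by_patient.items():
              trace.sort()
              for (t0, z0), (t1, _) in zip(trace, trace[1:]):
                  durations[pid][z0] = (t1 - t0).total_seconds() / 60
          return dict(durations)

      print(stage_durations(events))
      # expected: {'p1': {'waiting': 40.0, 'operating': 90.0, 'recovery': 110.0}}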

  20. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measure of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  1. A system for multichannel recording and automatic reading of information. [for onboard cosmic ray counter

    NASA Technical Reports Server (NTRS)

    Bogomolov, E. A.; Yevstafev, Y. Y.; Karakadko, V. K.; Lubyanaya, N. D.; Romanov, V. A.; Totubalina, M. G.; Yamshchikov, M. A.

    1975-01-01

    A system for the recording and processing of telescope data is considered for measurements of EW asymmetry. The information is recorded by 45 channels on a continuously moving 35-mm film. The dead time of the recorder is about 0.1 sec. A sorting electronic circuit is used to reduce the errors when the statistical time distribution of the pulses is recorded. The recorded information is read out by means of photoresistors. The phototransmitter signals are fed either to the mechanical recorder unit for preliminary processing, or to a logical circuit which controls the operation of the punching device. The punched tape is processed by an electronic computer.

  2. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    This report offers an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, suitable processing of them yields the information needed to create cryptographic keys. Processing is based on reconstructing a mathematical model that generates time series diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics, but also for protecting transmitted data in telemedicine complexes.
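
    As a rough illustration of deriving a shared symmetric key from a reconstructed biosignal model, the sketch below fits an autoregressive (AR) model to a signal and hashes its quantized coefficients into a 256-bit key. The AR fit merely stands in for the model reconstruction described in the report; the model order, quantization step, and key-derivation choice are illustrative assumptions.

      # Minimal sketch: symmetric key derived from reconstructed biosignal-model
      # parameters. An AR(p) least-squares fit stands in for the model
      # reconstruction; quantization and hashing choices are illustrative.
      import hashlib
      import numpy as np

      def ar_coefficients(signal: np.ndarray, order: int = 4) -> np.ndarray:
          """Least-squares fit of an AR(order) model to a 1-D signal."""
          X = np.column_stack([signal[order - k - 1 : len(signal) - k - 1]
                               for k in range(order)])
          y = signal[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coeffs

      def derive_key(signal: np.ndarray, order: int = 4, step: float = 0.01) -> bytes:
          """Quantize the model coefficients and hash them into a 256-bit key."""
          q = np.round(ar_coefficients(signal, order) / step).astype(np.int64)
          return hashlib.sha256(q.tobytes()).digest()

      rng = np.random.default_rng(0)
      ecg_like = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)
      print(derive_key(ecg_like).hex())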

  3. Process-in-Network: A Comprehensive Network Processing Approach

    PubMed Central

    Urzaiz, Gabriel; Villa, David; Villanueva, Felix; Lopez, Juan Carlos

    2012-01-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network. PMID:22969390

  4. Temporal Expectation and Information Processing: A Model-Based Analysis

    ERIC Educational Resources Information Center

    Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander

    2012-01-01

    People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…

  5. Information Technology Project Processes: Understanding the Barriers to Improvement and Adoption

    ERIC Educational Resources Information Center

    Williams, Bernard L.

    2009-01-01

    Every year, organizations lose millions of dollars due to IT (Information Technology) project failures. Over time, organizations have developed processes and procedures to help reduce the incidence of challenged IT projects. Research has shown that IT project processes can work to help reduce the number of challenged projects. The research in this…

  6. Neural Correlates of Individual Differences in Strategic Retrieval Processing

    ERIC Educational Resources Information Center

    Bridger, Emma K.; Herron, Jane E.; Elward, Rachael L.; Wilding, Edward L.

    2009-01-01

    Processes engaged when information is encoded into memory are an important determinant of whether that information will be recovered subsequently. Also influential, however, are processes engaged at the time of retrieval, and these were investigated here by using event-related potentials (ERPs) to measure a specific class of retrieval operations.…

  7. Business Intelligence Applied to the ALMA Software Integration Process

    NASA Astrophysics Data System (ADS)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports on software bugs captured during testing, during the integration of the different components or, even worse, problems that occur in production. Usually, little time is spent on analyzing these reports, but with some multidimensional processing you can extract valuable information from them that can help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data were processed. The main goal is to assess a software process and get insights from this information.

  8. Direct Thermodynamic Measurements of the Energetics of Information Processing

    DTIC Science & Technology

    2017-08-08


  9. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, the evolution of a system under investigation can be explained in terms of the information stored within that system and the information transferred to it from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate for the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over the traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
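
    The following sketch illustrates the kind of information-transfer quantity decomposed in such a framework, using a deliberately coarse, binned estimate of transfer entropy with a single lag. It is not the paper's estimator: the actual approach uses nonuniform embedding and bias-compensated nearest-neighbor estimators; the bin count, lag, and simulated test system here are illustrative assumptions.

      # Minimal sketch: binned estimate of transfer entropy TE(X -> Y) with one lag,
      # i.e. I(Y_t ; X_{t-1} | Y_{t-1}). Bin count and test system are illustrative.
      import numpy as np

      def transfer_entropy(x, y, bins=8):
          yt, yp, xp = y[1:], y[:-1], x[:-1]
          joint, _ = np.histogramdd(np.column_stack([yt, yp, xp]), bins=bins)
          p = joint / joint.sum()
          p_yp_xp = p.sum(axis=0)        # p(y_past, x_past)
          p_yt_yp = p.sum(axis=2)        # p(y_t, y_past)
          p_yp = p.sum(axis=(0, 2))      # p(y_past)
          te = 0.0
          for i, j, k in zip(*np.nonzero(p)):
              num = p[i, j, k] * p_yp[j]
              den = p_yt_yp[i, j] * p_yp_xp[j, k]
              te += p[i, j, k] * np.log2(num / den)
          return te

      rng = np.random.default_rng(1)
      x = rng.standard_normal(5000)
      y = np.zeros_like(x)
      for t in range(1, len(x)):         # y is driven by the past of x
          y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
      print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits")
      print(f"TE(y->x) = {transfer_entropy(y, x):.3f} bits")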

  10. Foundations of Swarm Intelligence: From Principles to Practice

    DTIC Science & Technology

    2003-01-01

    through the use of chemical substances known as pheromones which have a scent that decays over time through the process of evaporation [6, p. 26...These pheromones form the basis of what amounts to a clever, and apparently simple, communications and information storage and retrieval system. Since... pheromone strength or intensity decays over time, it also provides a very simple information processing mechanism that can implement forms of positive

  11. Dynamic information processing states revealed through neurocognitive models of object semantics

    PubMed Central

    Clarke, Alex

    2015-01-01

    Recognising objects relies on highly dynamic, interactive brain networks to process multiple aspects of object information. To fully understand how different forms of information about objects are represented and processed in the brain requires a neurocognitive account of visual object recognition that combines a detailed cognitive model of semantic knowledge with a neurobiological model of visual object processing. Here we ask how specific cognitive factors are instantiated in our mental processes and how they dynamically evolve over time. We suggest that coarse semantic information, based on generic shared semantic knowledge, is rapidly extracted from visual inputs and is sufficient to drive rapid category decisions. Subsequent recurrent neural activity between the anterior temporal lobe and posterior fusiform supports the formation of object-specific semantic representations – a conjunctive process primarily driven by the perirhinal cortex. These object-specific representations require the integration of shared and distinguishing object properties and support the unique recognition of objects. We conclude that a valuable way of understanding the cognitive activity of the brain is through testing the relationship between specific cognitive measures and dynamic neural activity. This kind of approach allows us to move towards uncovering the information processing states of the brain and how they evolve over time. PMID:25745632

  12. Processing of acoustic and phonological information of lexical tones in Mandarin Chinese revealed by mismatch negativity.

    PubMed

    Yu, Keke; Wang, Ruiming; Li, Li; Li, Ping

    2014-01-01

    The accurate perception of lexical tones in tonal languages involves the processing of both acoustic information and phonological information carried by the tonal signal. In this study we evaluated the relative role of the two types of information in native Chinese speakers' processing of tones at a preattentive stage with event-related potentials (ERPs), particularly the mismatch negativity (MMN). Specifically, we distinguished the acoustic from the phonological information by manipulating phonological category and acoustic interval of the stimulus materials. We found a significant main effect of phonological category for the peak latency of MMN, but a main effect of both phonological category and acoustic interval for the mean amplitude of MMN. The results indicated that the two types of information, acoustic and phonological, play different roles in the processing of Chinese lexical tones: acoustic information only impacts the extent of tonal processing, while phonological information affects both the extent and the time course of tonal processing. Implications of these findings are discussed in light of neurocognitive processes of phonological processing.

  13. An Ontological Informatics Framework for Pharmaceutical Product Development: Milling as a Case Study

    ERIC Educational Resources Information Center

    Akkisetty, Venkata Sai Pavan Kumar

    2009-01-01

    Pharmaceutical product development is an expensive, time consuming and information intensive process. Providing the right information at the right time is of great importance in pharmaceutical industry. To achieve this, knowledge management is the approach to deal with the humongous quantity of information. Ontological approach proposed in Venkat…

  14. The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP).

    PubMed

    Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra

    2007-01-01

    In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.

  15. [Effects of punctuation on the processing of syntactically ambiguous Japanese sentences with a semantic bias].

    PubMed

    Niikuni, Keiyu; Muramoto, Toshiaki

    2014-06-01

    This study explored the effects of a comma on the processing of structurally ambiguous Japanese sentences with a semantic bias. A previous study has shown that a comma which is incompatible with an ambiguous sentence's semantic bias affects the processing of the sentence, but the effects of a comma that is compatible with the bias are unclear. In the present study, we examined the role of a comma compatible with the sentence's semantic bias using the self-paced reading method, which enabled us to determine the reading times for the region of the sentence where readers would be expected to solve the ambiguity using semantic information (the "target region"). The results show that a comma significantly increases the reading time of the punctuated word but decreases the reading time in the target region. We concluded that even if the semantic information provided might be sufficient for disambiguation, the insertion of a comma would affect the processing cost of the ambiguity, indicating that readers use both the comma and semantic information in parallel for sentence processing.

  16. Temporally selective attention modulates early perceptual processing: event-related potential evidence.

    PubMed

    Sanders, Lisa D; Astheimer, Lori B

    2008-05-01

    Some of the most important information we encounter changes so rapidly that our perceptual systems cannot process all of it in detail. Spatially selective attention is critical for perception when more information than can be processed in detail is presented simultaneously at distinct locations. When presented with complex, rapidly changing information, listeners may need to selectively attend to specific times rather than to locations. We present evidence that listeners can direct selective attention to time points that differ by as little as 500 msec, and that doing so improves target detection, affects baseline neural activity preceding stimulus presentation, and modulates auditory evoked potentials at a perceptually early stage. These data demonstrate that attentional modulation of early perceptual processing is temporally precise and that listeners can flexibly allocate temporally selective attention over short intervals, making it a viable mechanism for preferentially processing the most relevant segments in rapidly changing streams.

  17. Informal Learning with Technology: The Effects of Self-Constructing Externalizations

    ERIC Educational Resources Information Center

    Damnik, Gregor; Proske, Antje; Narciss, Susanne; Körndle, Hermann

    2013-01-01

    Especially in the context of technology-enhanced informal learning, it is crucial to understand how to design information sources in such a way that learners are not overwhelmed by the demands of the learning process, but at the same time are engaged in higher order thinking processes. Guidance aids learners in dealing with the demands of a…

  18. Keeping up appearances: Strategic information exchange by disidentified group members

    PubMed Central

    Matschke, Christina

    2017-01-01

    Information exchange is a crucial process in groups, but to date, no one has systematically examined how a group member’s relationship with a group can undermine this process. The current research examined whether disidentified group members (i.e., members who have a negative relationship with their group) strategically undermine the group outcome in information exchange. Disidentification has been found to predict negative group-directed behaviour, but at the same time disidentified members run the risk of being punished or excluded from the group when displaying destructive behaviour. In three studies we expected and found that disidentified group members subtly act against the interest of the group by withholding important private information, while at the same time they keep up appearances by sharing important information that is already known by the other group members. These findings stress the importance of taking a group member’s relationship with a group into account when considering the process of information exchange. PMID:28384322

  19. A real-time dashboard for managing pathology processes.

    PubMed

    Halwani, Fawaz; Li, Wei Chen; Banerjee, Diponkar; Lessard, Lysanne; Amyot, Daniel; Michalowski, Wojtek; Giffen, Randy

    2016-01-01

    The Eastern Ontario Regional Laboratory Association (EORLA) is a newly established association of all the laboratory and pathology departments of Eastern Ontario that currently includes facilities from eight hospitals. All surgical specimens for EORLA are processed in one central location, the Department of Pathology and Laboratory Medicine (DPLM) at The Ottawa Hospital (TOH), where the rapid growth and influx of surgical and cytology specimens has created many challenges in ensuring the timely processing of cases and reports. Although the entire process is maintained and tracked in a clinical information system, this system lacks pre-emptive warnings that can help management address issues as they arise. Dashboard technology provides automated, real-time visual clues that could be used to alert management when a case or specimen is not being processed within predefined time frames. We describe the development of a dashboard helping pathology clinical management to make informed decisions on specimen allocation and tracking. The dashboard was designed and developed in two phases, following a prototyping approach. The first prototype of the dashboard helped monitor and manage pathology processes at the DPLM. The use of this dashboard helped to uncover operational inefficiencies and contributed to an improvement of turn-around time within The Ottawa Hospital's DPLM. It also allowed the discovery of additional requirements, leading to a second prototype that provides finer-grained, real-time information about individual cases and specimens. We successfully developed a dashboard that enables managers to address delays and bottlenecks in specimen allocation and tracking. This support ensures that pathology reports are provided within time frame standards required for high-quality patient care. Given the importance of rapid diagnostics for a number of diseases, the use of real-time dashboards within pathology departments could contribute to improving the quality of patient care beyond EORLA.
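
    A minimal sketch of the kind of pre-emptive warning such a dashboard surfaces is shown below: specimens whose current processing stage has exceeded a predefined time frame are flagged. The stage names, threshold hours, and case identifiers are illustrative assumptions, not EORLA's actual standards.

      # Minimal sketch: turnaround-time warnings for pathology cases.
      # Stage names and threshold hours are illustrative assumptions.
      from datetime import datetime, timedelta

      THRESHOLD_HOURS = {"accessioning": 4, "grossing": 24, "embedding": 24, "reporting": 72}

      def overdue_cases(cases, now):
          """Return (case_id, stage, hours overdue) for cases past their stage limit."""
          alerts = []
          for case_id, stage, entered_stage_at in cases:
              limit = timedelta(hours=THRESHOLD_HOURS[stage])
              elapsed = now - entered_stage_at
              if elapsed > limit:
                  alerts.append((case_id, stage, (elapsed - limit).total_seconds() / 3600))
          return alerts

      now = datetime(2016, 1, 15, 9, 0)
      cases = [("S16-0042", "grossing",  datetime(2016, 1, 13, 8, 0)),
               ("S16-0051", "reporting", datetime(2016, 1, 14, 16, 0))]
      for case_id, stage, hrs in overdue_cases(cases, now):
          print(f"{case_id} is overdue in {stage} by {hrs:.1f} h")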

  20. DART system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
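
    The role of backward iteration and the rework fraction can be illustrated with a small expected-time calculation for a single process step, shown below. This generic geometric-iteration model and its parameter values are illustrative assumptions; it does not reproduce the DART community model or the report's quantitative findings.

      # Minimal sketch: expected time of one process step with backward iteration.
      # The step takes t_once hours the first time; with probability p_iter it is
      # revisited, and each rework pass costs rework_fraction * t_once.
      def expected_step_time(t_once, p_iter, rework_fraction):
          expected_reworks = p_iter / (1.0 - p_iter)   # mean extra passes of a geometric loop
          return t_once * (1.0 + rework_fraction * expected_reworks)

      baseline = expected_step_time(t_once=40, p_iter=0.3, rework_fraction=0.6)
      variants = {
          "halve once-through time": dict(t_once=20, p_iter=0.3, rework_fraction=0.6),
          "halve iteration probability": dict(t_once=40, p_iter=0.15, rework_fraction=0.6),
          "halve rework fraction": dict(t_once=40, p_iter=0.3, rework_fraction=0.3),
      }
      print(f"baseline: {baseline:.1f} h")
      for label, kwargs in variants.items():
          print(f"{label}: {expected_step_time(**kwargs):.1f} h")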

  1. Energy Survey of Machine Tools: Separating Power Information of the Main Transmission System During Machining Process

    NASA Astrophysics Data System (ADS)

    Liu, Shuang; Liu, Fei; Hu, Shaohua; Yin, Zhenbiao

    The major power information of the main transmission system in machine tools (MTSMT) during the machining process includes the effective output power (i.e., cutting power), the input power and power loss of the mechanical transmission system, and the main motor power loss. This information is easy to obtain in the lab but difficult to evaluate during a manufacturing process. To solve this problem, a separation method is proposed here to extract the MTSMT power information during the machining process. In this method, the energy flow and the mathematical models of the major MTSMT power information during machining are set up first. Based on these mathematical models and basic data tables obtained from experiments, the above-mentioned power information can be separated during machining just by measuring the real-time total input power of the spindle motor. The operation program of this method is also given.
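
    A minimal sketch of the separation idea follows: the cutting (effective output) power is recovered from the measured total spindle-motor input power by subtracting motor and transmission losses interpolated from pre-measured tables. The table values, spindle speeds, and interpolation scheme are illustrative assumptions standing in for the paper's basic data tables and models.

      # Minimal sketch: separating cutting power from measured total input power.
      # Loss tables (nominally from idle-run experiments) are illustrative only.
      import numpy as np

      SPEEDS_RPM          = np.array([500, 1000, 2000, 3000])
      MOTOR_LOSS_W        = np.array([150, 220, 380, 560])   # main motor power loss
      TRANSMISSION_LOSS_W = np.array([80, 140, 260, 400])    # mechanical transmission loss

      def cutting_power(p_input_total_w: float, spindle_rpm: float) -> float:
          """Effective output power = total input power - motor loss - transmission loss."""
          p_motor = np.interp(spindle_rpm, SPEEDS_RPM, MOTOR_LOSS_W)
          p_trans = np.interp(spindle_rpm, SPEEDS_RPM, TRANSMISSION_LOSS_W)
          return p_input_total_w - p_motor - p_trans

      # real-time measurement of total spindle-motor input power during machining
      print(f"estimated cutting power: {cutting_power(2400.0, 1500.0):.0f} W")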

  2. Problems and Processes in Medical Encounters: The CASES method of dialogue analysis

    PubMed Central

    Laws, M. Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B.

    2013-01-01

    Objective To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Methods Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The Comprehensive Analysis of the Structure of Encounters System (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads – the problems or issues addressed; and processes within threads –basic tasks of clinical care labeled Presentation, Information, Resolution (decision making) and Engagement (interpersonal exchange). Threads are also coded for the nature of resolution. Results 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. Conclusions In this data, resolution is provider centered; more time for patient participation in resolution, or interpersonal engagement, would have to come from presentation. Practice Implications Awareness of the use of time in clinical encounters, and the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. PMID:23391684

  3. Problems and processes in medical encounters: the cases method of dialogue analysis.

    PubMed

    Laws, M Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B

    2013-05-01

    To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The comprehensive analysis of the structure of encounters system (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads--the problems or issues addressed; and processes within threads--basic tasks of clinical care labeled presentation, information, resolution (decision making) and Engagement (interpersonal exchange). Threads are also coded for the nature of resolution. 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. In this data, resolution is provider centered; more time for patient participation in resolution, or interpersonal engagement, would have to come from presentation. Awareness of the use of time in clinical encounters, and the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  4. Two paths to blame: Intentionality directs moral information processing along two distinct tracks.

    PubMed

    Monroe, Andrew E; Malle, Bertram F

    2017-01-01

    There is broad consensus that features such as causality, mental states, and preventability are key inputs to moral judgments of blame. What is not clear is exactly how people process these inputs to arrive at such judgments. Three studies provide evidence that early judgments of whether or not a norm violation is intentional direct information processing along 1 of 2 tracks: if the violation is deemed intentional, blame processing relies on information about the agent's reasons for committing the violation; if the violation is deemed unintentional, blame processing relies on information about how preventable the violation was. Owing to these processing commitments, when new information requires perceivers to switch tracks, they must reconfigure their judgments, which results in measurable processing costs indicated by reaction time (RT) delays. These findings offer support for a new theory of moral judgment (the Path Model of Blame) and advance the study of moral cognition as hierarchical information processing. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Waking and scrambling in holographic heating up

    NASA Astrophysics Data System (ADS)

    Ageev, D. S.; Aref'eva, I. Ya.

    2017-10-01

    Using holographic methods, we study the heating up process in quantum field theory. As a holographic dual of this process, we use absorption of a thin shell on a black brane. We find the explicit form of the time evolution of the quantum mutual information during heating up from the temperature T_i to the temperature T_f in a system of two intervals in two-dimensional space-time. We determine the geometric characteristics of the system under which the time dependence of the mutual information has a bell shape: it is equal to zero at the initial instant, becomes positive at some subsequent instant, further attains its maximum, and again decreases to zero. Such a behavior of the mutual information occurs in the process of photosynthesis. We show that if the distance x between the intervals is less than log 2/(2πT_i), then the evolution of the holographic mutual information has a bell shape only for intervals whose lengths are bounded from above and below. For sufficiently large x, i.e., for x > log 2/(2πT_i), the bell-like shape of the time dependence of the quantum mutual information is present only for sufficiently large intervals. Moreover, the zone narrows as T_i increases and widens as T_f increases.

  6. Affective Primacy vs. Cognitive Primacy: Dissolving the Debate.

    PubMed

    Lai, Vicky Tzuyin; Hagoort, Peter; Casasanto, Daniel

    2012-01-01

    When people see a snake, they are likely to activate both affective information (e.g., dangerous) and non-affective information about its ontological category (e.g., animal). According to the Affective Primacy Hypothesis, the affective information has priority, and its activation can precede identification of the ontological category of a stimulus. Alternatively, according to the Cognitive Primacy Hypothesis, perceivers must know what they are looking at before they can make an affective judgment about it. We propose that neither hypothesis holds at all times. Here we show that the relative speed with which affective and non-affective information gets activated by pictures and words depends upon the contexts in which stimuli are processed. Results illustrate that the question of whether affective information has processing priority over ontological information (or vice versa) is ill-posed. Rather than seeking to resolve the debate over Cognitive vs. Affective Primacy in favor of one hypothesis or the other, a more productive goal may be to determine the factors that cause affective information to have processing priority in some circumstances and ontological information in others. Our findings support a view of the mind according to which words and pictures activate different neurocognitive representations every time they are processed, the specifics of which are co-determined by the stimuli themselves and the contexts in which they occur.

  7. The Practice of Information Processing Model in the Teaching of Cognitive Strategies

    ERIC Educational Resources Information Center

    Ozel, Ali

    2009-01-01

    This research seeks to determine how the teaching of learning strategies differs with the amount of time that first-grade primary school teachers spend forming an information-processing framework in students. The process, which draws on the efforts of 260 teachers in this direction, examines whether the adequate…

  8. Base Stock Policy in a Join-Type Production Line with Advanced Demand Information

    NASA Astrophysics Data System (ADS)

    Hiraiwa, Mikihiko; Tsubouchi, Satoshi; Nakade, Koichi

    Production control schemes such as the base stock policy, the kanban policy and the constant work-in-process policy in a serial production line have been studied by many researchers. Production lines, however, usually have fork-type, join-type or network-type configurations. In addition, in most previous studies on production control, a finished product is required at the same time as demand arrives at the system, whereas in practice demand information is provided before the due date. In this paper, a join-type (assembly) production line under base stock control with advanced demand information in discrete time is analyzed. The recursive equations for the work-in-process are derived. A heuristic algorithm for quickly finding appropriate base stock levels for all machines is proposed, and the effect of advanced demand information is examined by simulation with the proposed algorithm. It is shown that the inventory cost can decrease with few backlogs by using the appropriate amount of demand information and setting appropriate base stock levels.
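
    The effect of advanced demand information can be illustrated with the single-stage, discrete-time sketch below: demands are announced a fixed number of periods before their due date, and production is released against the announced demand so that the inventory position stays at the base stock level. This simplification ignores the join-type (assembly) structure and the paper's heuristic for setting levels; all parameter values are illustrative assumptions.

      # Minimal sketch: single-stage base stock control with advance demand
      # information (demands announced H periods before they are due) and a
      # one-period production lead time. Parameters and demands are illustrative.
      import random

      def simulate(base_stock=4, horizon_H=2, periods=200, seed=0):
          random.seed(seed)
          demands = [random.randint(0, 2) for _ in range(periods + horizon_H)]
          inventory, backlog = base_stock, 0
          pipeline = [0]                               # one period of work in process
          hold, back = 0, 0
          for t in range(periods):
              inventory += pipeline.pop(0)             # production completes
              pipeline.append(demands[t + horizon_H])  # release = demand announced for t+H
              due = demands[t]
              served = min(inventory, due + backlog)
              backlog += due - served
              inventory -= served
              hold += inventory
              back += backlog
          return hold / periods, back / periods

      for s in range(6):
          avg_inv, avg_back = simulate(base_stock=s)
          print(f"base stock {s}: avg inventory {avg_inv:.2f}, avg backlog {avg_back:.2f}")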

  9. Slow Cortical Dynamics and the Accumulation of Information over Long Timescales

    PubMed Central

    Honey, Christopher J.; Thesen, Thomas; Donner, Tobias H.; Silbert, Lauren J.; Carlson, Chad E.; Devinsky, Orrin; Doyle, Werner K.; Rubin, Nava; Heeger, David J.; Hasson, Uri

    2012-01-01

    Making sense of the world requires us to process information over multiple timescales. We sought to identify brain regions that accumulate information over short and long timescales and to characterize the distinguishing features of their dynamics. We recorded electrocorticographic (ECoG) signals from individuals watching intact and scrambled movies. Within sensory regions, fluctuations of high-frequency (64–200 Hz) power reliably tracked instantaneous low-level properties of the intact and scrambled movies. Within higher order regions, the power fluctuations were more reliable for the intact movie than the scrambled movie, indicating that these regions accumulate information over relatively long time periods (several seconds or longer). Slow (<0.1 Hz) fluctuations of high-frequency power with time courses locked to the movies were observed throughout the cortex. Slow fluctuations were relatively larger in regions that accumulated information over longer time periods, suggesting a connection between slow neuronal population dynamics and temporally extended information processing. PMID:23083743

  10. Visual information processing in the lion-tailed macaque (Macaca silenus): mental rotation or rotational invariance?

    PubMed

    Burmann, Britta; Dehnhardt, Guido; Mauck, Björn

    2005-01-01

    Mental rotation is a widely accepted concept indicating an image-like mental representation of visual information and an analogue mode of information processing in certain visuospatial tasks. In the task of discriminating between image and mirror-image of rotated figures, human reaction times increase with the angular disparity between the figures. In animals, tests of this kind yield inconsistent results. Pigeons were found to use a time-independent rotational invariance, possibly indicating a non-analogue information processing system that evolved in response to the horizontal plane of reference birds perceive during flight. Despite similar ecological demands concerning the visual reference plane, a sea lion was found to use mental rotation in similar tasks, but its processing speed while rotating three-dimensional stimuli seemed to depend on the axis of rotation in a different way than found for humans in similar tasks. If ecological demands influence the way information processing systems evolve, hominids might have secondarily lost the ability of rotational invariance while retreating from arboreal living and evolving an upright gait in which the vertical reference plane is more important. We therefore conducted mental rotation experiments with an arboreal living primate species, the lion-tailed macaque. Performing a two-alternative matching-to-sample procedure, the animal had to decide between rotated figures representing image and mirror-image of a previously shown upright sample. Although non-rotated stimuli were recognized faster than rotated ones, the animal's mean reaction times did not clearly increase with the angle of rotation. These results are inconsistent with the mental rotation concept but also cannot be explained assuming a mere rotational invariance. Our study thus seems to support the idea of information processing systems evolving gradually in response to specific ecological demands.

  11. On-Line Real-Time Management Information Systems and Their Impact Upon User Personnel and Organizational Structure in Aviation Maintenance Activities.

    DTIC Science & Technology

    1979-12-01

    the functional management level, a real-time production control system and an order processing system at the operational level. SIDMS was designed...at any one time. An overview of the major software systems in operation is listed below: a. Major Software Systems: order processing system; order ... processing for the supply support center/AWP locker; order processing for the airwing squadron material controls; order processing for the IMA

  12. Increased attention but more efficient disengagement: neuroscientific evidence for defensive processing of threatening health information.

    PubMed

    Kessels, Loes T E; Ruiter, Robert A C; Jansma, Bernadette M

    2010-07-01

    Previous studies indicate that people respond defensively to threatening health information, especially when the information challenges self-relevant goals. The authors investigated whether reduced acceptance of self-relevant health risk information is already visible in early attention processes, that is, attention disengagement processes. In a randomized, controlled trial with 29 smoking and nonsmoking students, a variant of Posner's cueing task was used in combination with the high-temporal resolution method of event-related brain potentials (ERPs). Reaction times and P300 ERP. Smokers showed lower P300 amplitudes in response to high- as opposed to low-threat invalid trials when moving their attention to a target in the opposite visual field, indicating more efficient attention disengagement processes. Furthermore, both smokers and nonsmokers showed increased P300 amplitudes in response to the presentation of high- as opposed to low-threat valid trials, indicating threat-induced attention-capturing processes. Reaction time measures did not support the ERP data, indicating that the ERP measure can be extremely informative to measure low-level attention biases in health communication. The findings provide the first neuroscientific support for the hypothesis that threatening health information causes more efficient disengagement among those for whom the health threat is self-relevant. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  13. Library Information System Time-Sharing (LISTS) Project. Final Report.

    ERIC Educational Resources Information Center

    Black, Donald V.

    The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…

  14. Management of Wood Products Manufacturing Using Simulation/Animation

    Treesearch

    D. Earl Kline; J.K. Wiedenbeck; Philip A. Araman

    1992-01-01

    Managers of hardwood processing facilities need timely information on which to base important decisions such as when to add costly equipment or how to improve profitability subject to time-varying demands. The overall purpose of this paper is to introduce a method that can effectively provide such timely information. A simulation/animation modeling procedure is...

  15. Identifying differences in biased affective information processing in major depression.

    PubMed

    Gollan, Jackie K; Pane, Heather T; McCloskey, Michael S; Coccaro, Emil F

    2008-05-30

    This study investigates the extent to which participants with major depression differ from healthy comparison participants in the irregularities in affective information processing, characterized by deficits in facial expression recognition, intensity categorization, and reaction time to identifying emotionally salient and neutral information. Data on diagnoses, symptom severity, and affective information processing using a facial recognition task were collected from 66 participants, male and female between ages 18 and 54 years, grouped by major depressive disorder (N=37) or healthy non-psychiatric (N=29) status. Findings from MANCOVAs revealed that major depression was associated with a significantly longer reaction time to sad facial expressions compared with healthy status. Also, depressed participants demonstrated a negative bias towards interpreting neutral facial expressions as sad significantly more often than healthy participants. In turn, healthy participants interpreted neutral faces as happy significantly more often than depressed participants. No group differences were observed for facial expression recognition and intensity categorization. The observed effects suggest that depression has significant effects on the perception of the intensity of negative affective stimuli, delayed speed of processing sad affective information, and biases towards interpreting neutral faces as sad.

  16. US GEOLOGICAL SURVEY'S NATIONAL SYSTEM FOR PROCESSING AND DISTRIBUTION OF NEAR REAL-TIME HYDROLOGICAL DATA.

    USGS Publications Warehouse

    Shope, William G.; ,

    1987-01-01

    The US Geological Survey is utilizing a national network of more than 1000 satellite data-collection stations, four satellite-relay direct-readout ground stations, and more than 50 computers linked together in a private telecommunications network to acquire, process, and distribute hydrological data in near real-time. The four Survey offices operating a satellite direct-readout ground station provide near real-time hydrological data to computers located in other Survey offices through the Survey's Distributed Information System. The computerized distribution system permits automated data processing and distribution to be carried out in a timely manner under the control and operation of the Survey office responsible for the data-collection stations and for the dissemination of hydrological information to the water-data users.

  17. Perception and satisfaction with the information received during the medical care process in patients with prostate cancer.

    PubMed

    Miñana López, B; Cánovas Tomás, M A; Cantalapiedra Escolar, A

    2016-03-01

    To assess the perception and degree of satisfaction of Spanish patients with prostate cancer (PC) concerning the information received during the medical care process. We analysed information on the perception of the medical care process of 591 patients with PC who attended a consultation. We also studied their degree of participation in decision making and the association between perceived satisfaction and the demographic and clinical variables, both of patients and specialists. Some 90.2% of the patients stated that they had received, mainly from the urologist, an appropriate amount of information about the disease. More than 80% of the patients were satisfied with the information received at the time of diagnosis. Some 70.3% of the patients stated that they better accepted the disease thanks to the information provided, and 60.5% believed that they had a better ability to resolve problems. Some 90.4% of the patients considered that the time provided by the specialist was appropriate. Some 62.5% of the patients participated in making decisions about their disease and treatment. Age (both of the patient and specialist), the extent of the disease, the time dedicated by the specialist and the type of centre were factors that had a significant association (P<.05) with the satisfaction achieved. The perception and degree of satisfaction that Spanish patients with PC have of the information received during the medical care process is good and is paralleled by a high degree of active participation in the therapeutic decision making process. Copyright © 2015 AEU. Publicado por Elsevier España, S.L.U. All rights reserved.

  18. Modeling information diffusion in time-varying community networks

    NASA Astrophysics Data System (ADS)

    Cui, Xuelian; Zhao, Narisa

    2017-12-01

    Social networks are rarely static, and they typically have time-varying network topologies. A great number of studies have modeled temporal networks and explored social contagion processes within these models; however, few of these studies have considered community structure variations. In this paper, we present a study of how the time-varying property of a modular structure influences the information dissemination. First, we propose a continuous-time Markov model of information diffusion where two parameters, mobility rate and community attractiveness, are introduced to address the time-varying nature of the community structure. The basic reproduction number is derived, and the accuracy of this model is evaluated by comparing the simulation and theoretical results. Furthermore, numerical results illustrate that generally both the mobility rate and community attractiveness significantly promote the information diffusion process, especially in the initial outbreak stage. Moreover, the strength of this promotion effect is much stronger when the modularity is higher. Counterintuitively, it is found that when all communities have the same attractiveness, social mobility no longer accelerates the diffusion process. In addition, we show that the local spreading in the advantage group has been greatly enhanced due to the agglomeration effect caused by the social mobility and community attractiveness difference, which thus increases the global spreading.
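
    The continuous-time Markov formulation above lends itself to event-driven simulation. Below is a minimal Gillespie-style sketch of information spreading over two communities with node mobility; the transmission rate, mobility rate, and attractiveness values are illustrative stand-ins, not the paper's actual model or parameters.

```python
import random

# Illustrative continuous-time Markov information-diffusion process on two
# communities with node mobility (all parameters are hypothetical).
random.seed(0)

BETA = 0.4            # within-community transmission rate per informed-susceptible pair
MOBILITY = 0.05       # per-node rate of attempting to switch community
ATTRACT = [0.7, 0.3]  # relative attractiveness of communities 0 and 1

N = 100
community = [i % 2 for i in range(N)]   # initial community membership
informed = [False] * N
informed[0] = True

t, t_end = 0.0, 50.0
while t < t_end and not all(informed):
    # enumerate possible spreading events (informed -> susceptible, same community)
    pairs = [(i, j) for i in range(N) for j in range(N)
             if informed[i] and not informed[j] and community[i] == community[j]]
    rate_spread = BETA * len(pairs)
    rate_move = MOBILITY * N
    total = rate_spread + rate_move
    t += random.expovariate(total)                  # time to the next event
    if random.random() < rate_spread / total:
        _, j = random.choice(pairs)                 # a within-community contact informs j
        informed[j] = True
    else:
        k = random.randrange(N)                     # a node relocates, biased by attractiveness
        community[k] = 0 if random.random() < ATTRACT[0] else 1

print(f"informed fraction at t={t:.1f}: {sum(informed)/N:.2f}")
```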

  19. A further test of sequential-sampling models that account for payoff effects on response bias in perceptual decision tasks.

    PubMed

    Diederich, Adele

    2008-02-01

    Recently, Diederich and Busemeyer (2006) evaluated three hypotheses formulated as particular versions of a sequential-sampling model to account for the effects of payoffs in a perceptual decision task with time constraints. The bound-change hypothesis states that payoffs affect the distance of the starting position of the decision process to each decision bound. The drift-rate-change hypothesis states that payoffs affect the drift rate of the decision process. The two-stage-processing hypothesis assumes two processes, one for processing payoffs and another for processing stimulus information, and that on a given trial, attention switches from one process to the other. The latter hypothesis gave the best account of their data. The present study investigated two questions: (1) Does the experimental setting influence decisions, and consequently affect the fits of the hypotheses? A task was conducted in two experimental settings--either the time limit or the payoff matrix was held constant within a given block of trials, using three different payoff matrices and four different time limits--in order to answer this question. (2) Could it be that participants neglect payoffs on some trials and stimulus information on others? To investigate this idea, a further hypothesis was considered, the mixture-of-processes hypothesis. Like the two-stage-processing hypothesis, it postulates two processes, one for payoffs and another for stimulus information. However, it differs from the previous hypothesis in assuming that on a given trial exactly one of the processes operates, never both. The present design had no effect on choice probability but may have affected choice response times (RTs). Overall, the two-stage-processing hypothesis gave the best account, with respect both to choice probabilities and to observed mean RTs and mean RT patterns within a choice pair.
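
    As a rough illustration of the two-stage-processing idea (not Diederich and Busemeyer's fitted model), the sketch below simulates a bounded random walk whose drift reflects the payoff matrix until a hypothetical attention switch, after which it reflects the stimulus; all parameters are made up.

```python
import random

# Rough illustration of two-stage processing: evidence accumulation that first
# reflects the payoff matrix and, after an attention switch, the stimulus.
# Every parameter below is hypothetical, not a fitted value from the paper.
random.seed(1)

def simulate_trial(payoff_drift, stimulus_drift, switch_time=30,
                   bound=10.0, noise=1.0, dt=1.0, max_steps=500):
    """Return (+1/-1 choice, response time in steps) for one trial."""
    x, step = 0.0, 0
    while abs(x) < bound and step < max_steps:
        drift = payoff_drift if step < switch_time else stimulus_drift
        x += drift * dt + random.gauss(0.0, noise)
        step += 1
    return (1 if x > 0 else -1), step

trials = [simulate_trial(payoff_drift=0.15, stimulus_drift=0.30) for _ in range(2000)]
p_upper = sum(1 for c, _ in trials if c == 1) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"P(upper bound) = {p_upper:.2f}, mean RT = {mean_rt:.1f} steps")
```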

  20. The role of shared visual information for joint action coordination.

    PubMed

    Vesper, Cordula; Schmitz, Laura; Safra, Lou; Sebanz, Natalie; Knoblich, Günther

    2016-08-01

    Previous research has identified a number of coordination processes that enable people to perform joint actions. But what determines which coordination processes joint action partners rely on in a given situation? The present study tested whether varying the shared visual information available to co-actors can trigger a shift in coordination processes. Pairs of participants performed a movement task that required them to synchronously arrive at a target from separate starting locations. When participants in a pair received only auditory feedback about the time their partner reached the target they held their movement duration constant to facilitate coordination. When they received additional visual information about each other's movements they switched to a fundamentally different coordination process, exaggerating the curvature of their movements to communicate their arrival time. These findings indicate that the availability of shared perceptual information is a major factor in determining how individuals coordinate their actions to obtain joint outcomes. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  1. Decoding the time-course of object recognition in the human brain: From visual features to categorical decisions.

    PubMed

    Contini, Erika W; Wardle, Susan G; Carlson, Thomas A

    2017-10-01

    Visual object recognition is a complex, dynamic process. Multivariate pattern analysis methods, such as decoding, have begun to reveal how the brain processes complex visual information. Recently, temporal decoding methods for EEG and MEG have offered the potential to evaluate the temporal dynamics of object recognition. Here we review the contribution of M/EEG time-series decoding methods to understanding visual object recognition in the human brain. Consistent with the current understanding of the visual processing hierarchy, low-level visual features dominate decodable object representations early in the time-course, with more abstract representations related to object category emerging later. A key finding is that the time-course of object processing is highly dynamic and rapidly evolving, with limited temporal generalisation of decodable information. Several studies have examined the emergence of object category structure, and we consider to what degree category decoding can be explained by sensitivity to low-level visual features. Finally, we evaluate recent work attempting to link human behaviour to the neural time-course of object processing. Copyright © 2017 Elsevier Ltd. All rights reserved.
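
    A minimal sketch of time-resolved decoding, assuming synthetic data in place of real M/EEG epochs: a separate classifier is cross-validated at each time sample, and the injected category signal is a hypothetical stand-in for a real effect.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Time-resolved decoding on synthetic "M/EEG" epochs: one classifier per time point.
# Shapes and the injected effect are hypothetical stand-ins for real recordings.
rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 200, 32, 100
X = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)          # two object categories

# inject a category-dependent signal that emerges ~60 samples after stimulus onset
X[y == 1, :5, 60:] += 0.5

accuracy = np.empty(n_times)
for t in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

print(f"peak decoding accuracy {accuracy.max():.2f} at time sample {accuracy.argmax()}")
```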

  2. Real-Time Sentence Processing in Children with Specific Language Impairment: The Contribution of Lexicosemantic, Syntactic, and World-Knowledge Information

    ERIC Educational Resources Information Center

    Pizzioli, Fabrizio; Schelstraete, Marie-Anne

    2013-01-01

    The present study investigated how lexicosemantic information, syntactic information, and world knowledge are integrated in the course of oral sentence processing in children with specific language impairment (SLI) as compared to children with typical language development. A primed lexical-decision task was used where participants had to make a…

  3. Elaborative rehearsal of nontemporal information interferes with temporal processing of durations in the range of seconds but not milliseconds.

    PubMed

    Rammsayer, Thomas; Ulrich, Rolf

    2011-05-01

    The distinct timing hypothesis suggests a sensory mechanism for processing of durations in the range of milliseconds and a cognitively controlled mechanism for processing of longer durations. To test this hypothesis, we employed a dual-task approach to investigate the effects of maintenance and elaborative rehearsal on temporal processing of brief and long durations. Unlike mere maintenance rehearsal, elaborative rehearsal as a secondary task involved transfer of information from working to long-term memory and elaboration of information to enhance storage in long-term memory. Duration discrimination of brief intervals was not affected by a secondary cognitive task that required either maintenance or elaborative rehearsal. Concurrent elaborative rehearsal, however, impaired discrimination of longer durations as compared to maintenance rehearsal and a control condition with no secondary task. These findings endorse the distinct timing hypothesis and are in line with the notion that executive functions, such as continuous memory updating and active transfer of information into long-term memory, interfere with temporal processing of durations in the range of seconds but not milliseconds. 2011 Elsevier B.V. All rights reserved.

  4. Brain response during the M170 time interval is sensitive to socially relevant information.

    PubMed

    Arviv, Oshrit; Goldstein, Abraham; Weeting, Janine C; Becker, Eni S; Lange, Wolf-Gero; Gilboa-Schechtman, Eva

    2015-11-01

    Deciphering the social meaning of facial displays is a highly complex neurological process. The M170, an event related field component of MEG recording, like its EEG counterpart N170, was repeatedly shown to be associated with structural encoding of faces. However, the scope of information encoded during the M170 time window is still being debated. We investigated the neuronal origin of facial processing of integrated social rank cues (SRCs) and emotional facial expressions (EFEs) during the M170 time interval. Participants viewed integrated facial displays of emotion (happy, angry, neutral) and SRCs (indicated by upward, downward, or straight head tilts). We found that the activity during the M170 time window is sensitive to both EFEs and SRCs. Specifically, highly prominent activation was observed in response to SRC connoting dominance as compared to submissive or egalitarian head cues. Interestingly, the processing of EFEs and SRCs appeared to rely on different circuitry. Our findings suggest that vertical head tilts are processed not only for their sheer structural variance, but as social information. Exploring the temporal unfolding and brain localization of non-verbal cues processing may assist in understanding the functioning of the social rank biobehavioral system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Research on moving object detection based on frog's eyes

    NASA Astrophysics Data System (ADS)

    Fu, Hongwei; Li, Dongguang; Zhang, Xinyuan

    2008-12-01

    Based on the mechanism by which frogs' eyes process information about objects, this paper discusses a bionic detection technology suitable for object information processing modeled on frog vision. First, a bionic detection theory imitating frog vision is established; it is a parallel processing mechanism comprising the pick-up and pre-treatment of object information, parallel separation of the digital image, parallel processing, and information synthesis. A computer vision detection system is described that detects moving objects of a specific color and shape; experiments indicate that it can produce detection results even against a cluttered background. A moving-object detection electronic model imitating biological vision based on frogs' eyes is also established. In this system the analog video signal is first digitized, and the digital signal is then separated into parallel streams by an FPGA. In the parallel processing stage, the video information can be captured, processed, and displayed at the same time; information fusion is performed through the DSP HPI ports in order to transmit the data processed by the DSP. This system can observe a larger visual field and obtain higher image resolution than ordinary monitoring systems. In summary, simulation experiments on edge detection of moving objects with the Canny algorithm indicate that the system can detect the edges of moving objects in real time; the feasibility of the bionic model was fully demonstrated in the engineering system, laying a solid foundation for future studies of detection technology that imitates biological vision.
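
    A toy version of the motion-edge step mentioned above, assuming OpenCV is available: two synthetic frames stand in for real video, the frame difference isolates the moving region, and the Canny detector traces its outline.

```python
import numpy as np
import cv2

# Toy motion-edge detection: difference two frames to isolate the moving object,
# then trace its outline with the Canny detector. The synthetic frames below
# stand in for real video input.
frame1 = np.zeros((240, 320), dtype=np.uint8)
frame2 = frame1.copy()
cv2.rectangle(frame1, (50, 60), (110, 120), 200, -1)   # object at first position
cv2.rectangle(frame2, (70, 60), (130, 120), 200, -1)   # object shifted to the right

motion = cv2.absdiff(frame2, frame1)                   # pixels that changed
motion = cv2.GaussianBlur(motion, (5, 5), 0)
edges = cv2.Canny(motion, 50, 150)                     # edges of the moving region
print("moving-edge pixels:", int(np.count_nonzero(edges)))
```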

  6. 75 FR 68702 - Regulation SHO

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-09

    ... extended compliance period will give industry participants additional time for programming and testing for... time for programming and testing for compliance with the Rule's requirements. We have been informed that there have been some delays in the programming process, due in part to certain information, which...

  7. "Just-in-time" clinical information.

    PubMed

    Chueh, H; Barnett, G O

    1997-06-01

    The just-in-time (JIT) model originated in the manufacturing industry as a way to manage parts inventories process so that specific components could be made available at the appropriate times (that is, "just in time"). This JIT model can be applied to the management of clinical information inventories, so that clinicians can have more immediate access to the most current and relevant information at the time they most need it--when making clinical care decisions. The authors discuss traditional modes of managing clinical information, and then describe how a new, JIT model may be developed and implemented. They describe three modes of clinician-information interactions that a JIT model might employ, the scope of information that may be made available in a JIT model (global information or local, case-specific information), and the challenges posed by the implementation of such an information-access model. Finally, they discuss how JIT information access may change how physicians practice medicine, various ways JIT information may be delivered, and concerns about the trustworthiness of electronically published and accessed information resources.

  8. Information-processing under incremental levels of physical loads: comparing racquet to combat sports.

    PubMed

    Mouelhi Guizani, S; Tenenbaum, G; Bouzaouach, I; Ben Kheder, A; Feki, Y; Bouaziz, M

    2006-06-01

    Skillful performance in combat and racquet sports consists of proficient technique accompanied by efficient information-processing while engaged in moderate to high physical effort. This study examined information processing and decision-making using simple reaction time (SRT) and choice reaction time (CRT) paradigms in athletes of combat and racquet sports while undergoing incrementally increasing physical effort ranging from low to high intensities. Forty experienced national-level athletes from the sports of tennis, table tennis, fencing, and boxing were selected for this study. Each subject performed both simple (SRT) and four-choice reaction time (4-CRT) tasks at rest, and while pedaling on a cycle ergometer at 20%, 40%, 60%, and 80% of their own maximal aerobic power (Pmax). RM MANCOVA revealed a significant sport-type by physical-load interaction effect, mainly on CRT. Least significant difference (LSD) post hoc contrasts indicated that fencers and tennis players processed information faster with incrementally increasing workload, while different patterns were obtained for boxers and table-tennis players. The error rate remained stable for each sport type over all conditions. Between-sport differences in SRT and CRT among the athletes were also noted. Findings provide evidence that the 4-CRT is a task that more closely corresponds to the original task athletes are familiar with and utilize in their practices and competitions. However, additional tests that mimic the real-world experiences of each sport must be developed and used to capture the nature of information processing and response-selection in specific sports.

  9. Decentralized modal identification using sparse blind source separation

    NASA Astrophysics Data System (ADS)

    Sadhu, A.; Hazra, B.; Narasimhan, S.; Pandey, M. D.

    2011-12-01

    Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level, instead. The information at the individual sensor level can then be concatenated to obtain the global structure characteristics. A novel decentralized algorithm based on wavelet transforms to infer global structure mode information using measurements obtained using a small group of sensors at a time is proposed in this paper. The focus of the paper is on algorithmic development, while the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation invoking transformations of measurements to the time-frequency domain resulting in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure.
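
    A simplified sketch of the sparse-BSS step, assuming synthetic two-sensor data: PyWavelets' stationary wavelet transform (pywt.swt) is used as a stand-in for the paper's stationary wavelet packet transform, and PCA on the coefficients of one detail band estimates a column of the mixing (partial mode-shape) matrix under the sparsity assumption that a single mode dominates that band.

```python
import numpy as np
import pywt  # PyWavelets

# Sketch of the sparse-BSS idea: transform a small sensor group to a sparse
# wavelet representation, then apply PCA to the coefficients of one band to
# estimate a column of the mixing (mode-shape) matrix. The stationary wavelet
# transform stands in for the paper's SWPT; the two-mode data are hypothetical.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 4096)                       # ~410 Hz sampling
sources = np.vstack([np.sin(2 * np.pi * 1.2 * t),  # low-frequency mode
                     np.sin(2 * np.pi * 40.0 * t)])  # higher-frequency mode
mixing = np.array([[1.0, 0.3],
                   [0.6, -0.8]])                   # "partial mode shapes" for 2 sensors
X = mixing @ sources + 0.05 * rng.normal(size=(2, t.size))

# level-3 detail coefficients (~26-51 Hz band) where the 40 Hz mode dominates
coeffs = [pywt.swt(x, "db4", level=3)[0][1] for x in X]
C = np.vstack(coeffs)

# PCA via SVD: the leading left singular vector approximates one mixing column
U, s, _ = np.linalg.svd(C - C.mean(axis=1, keepdims=True), full_matrices=False)
estimated_column = U[:, 0] / np.abs(U[:, 0]).max()
print("estimated (normalized) partial mode shape:", np.round(estimated_column, 2))
```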

  10. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    PubMed Central

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. The measure of information transfer, transfer entropy, in particular has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489
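
    The ensemble idea can be illustrated with a deliberately simple plug-in (histogram) estimator that pools observations over trials at a fixed time point rather than over time; this is a stand-in for the nearest-neighbour estimator used in the paper, and the coupled toy processes below are hypothetical.

```python
import numpy as np

# Plug-in (histogram) transfer entropy estimated across an ensemble of trials
# at one time point, instead of pooling over time within a single realization.
def transfer_entropy_ensemble(x, y, t, n_bins=4):
    """TE from x to y at time t; x, y have shape (n_trials, n_times)."""
    def binned(v):
        edges = np.quantile(v, np.linspace(0, 1, n_bins + 1)[1:-1])
        return np.digitize(v, edges)

    yt1, yt, xt = binned(y[:, t + 1]), binned(y[:, t]), binned(x[:, t])
    joint = np.zeros((n_bins, n_bins, n_bins))
    for a, b, c in zip(yt1, yt, xt):
        joint[a, b, c] += 1
    joint /= joint.sum()

    p_yt_xt = joint.sum(axis=0)        # p(y_t, x_t)
    p_yt = joint.sum(axis=(0, 2))      # p(y_t)
    p_yt1_yt = joint.sum(axis=2)       # p(y_{t+1}, y_t)
    te = 0.0
    for a in range(n_bins):
        for b in range(n_bins):
            for c in range(n_bins):
                p = joint[a, b, c]
                if p > 0:
                    te += p * np.log2(p * p_yt[b] / (p_yt1_yt[a, b] * p_yt_xt[b, c]))
    return te

# toy ensemble: y is driven by x with a one-sample lag, so TE(x->y) > TE(y->x)
rng = np.random.default_rng(0)
n_trials, n_times = 500, 50
x = rng.normal(size=(n_trials, n_times))
y = np.zeros_like(x)
y[:, 1:] = 0.8 * x[:, :-1] + 0.5 * rng.normal(size=(n_trials, n_times - 1))
print("TE(x->y) =", round(transfer_entropy_ensemble(x, y, t=25), 3))
print("TE(y->x) =", round(transfer_entropy_ensemble(y, x, t=25), 3))
```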

  11. Conditioning from an information processing perspective.

    PubMed

    Gallistel, C R.

    2003-04-28

    The framework provided by Claude Shannon's [Bell Syst. Technol. J. 27 (1948) 623] theory of information leads to a quantitatively oriented reconceptualization of the processes that mediate conditioning. The focus shifts from processes set in motion by individual events to processes sensitive to the information carried by the flow of events. The conception of what properties of the conditioned and unconditioned stimuli are important shifts from the tangible properties to the intangible properties of number, duration, frequency and contingency. In this view, a stimulus becomes a CS if its onset substantially reduces the subject's uncertainty about the time of occurrence of the next US. One way to represent the subject's knowledge of that time of occurrence is by the cumulative probability function, which has two limiting forms: (1) The state of maximal uncertainty (minimal knowledge) is represented by the inverse exponential function for the random rate condition, in which the US is equally likely at any moment. (2) The limit to the subject's attainable certainty is represented by the cumulative normal function, whose momentary expectation is the CS-US latency minus the time elapsed since CS onset. Its standard deviation is the Weber fraction times the CS-US latency.
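
    In symbols, one way to write the two limiting forms described above, with $\lambda$ the random US rate, $T$ the CS-US latency, $w$ the Weber fraction and $\Phi$ the standard normal distribution function, is

    $$F_{\text{random}}(t) = 1 - e^{-\lambda t}, \qquad F_{\text{certain}}(t) = \Phi\!\left(\frac{t - T}{wT}\right),$$

    where $t$ is the time elapsed since CS onset; in the second form the expected remaining wait at time $t$ is $T - t$ and the standard deviation is $wT$, matching the description in the abstract.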

  12. Interferometer with Continuously Varying Path Length Measured in Wavelengths to the Reference Mirror

    NASA Technical Reports Server (NTRS)

    Ohara, Tetsuo (Inventor)

    2016-01-01

    An interferometer in which the path length of the reference beam, measured in wavelengths, changes continuously in sinusoidal fashion, and in which the interference signal created by combining the measurement beam and the reference beam is processed in real time to obtain the physical distance along the measurement beam between the measured surface and a spatial reference frame such as the beam splitter. The processing involves analyzing the Fourier series of the intensity signal at one or more optical detectors in real time, using the time-domain multi-frequency harmonic signals to extract the phase information independently at each pixel position of the detectors, and converting the phase information to distance information.
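
    A toy numerical demonstration of recovering phase from the harmonics of a sinusoidally modulated interference signal, assuming scipy is available; the modulation depth, frequency, and test phase are illustrative, and the generic lock-in demodulation shown here is a stand-in for, not a reproduction of, the patented processing.

```python
import numpy as np
from scipy.special import jv

# Recover the interferometric phase from the first and second harmonics of the
# detector signal when the reference path length is modulated sinusoidally.
# All parameters (modulation depth z, frequency, true phase) are illustrative.
fs, f_mod, n_periods = 100_000.0, 1_000.0, 50
t = np.arange(int(fs / f_mod * n_periods)) / fs
z, phi_true, A, B = 2.63, 1.2, 1.0, 0.5

intensity = A + B * np.cos(z * np.sin(2 * np.pi * f_mod * t) + phi_true)

# lock-in style demodulation at the first and second harmonics
s1 = 2 * np.mean(intensity * np.sin(2 * np.pi * f_mod * t))   # proportional to -sin(phi)*J1(z)
c2 = 2 * np.mean(intensity * np.cos(4 * np.pi * f_mod * t))   # proportional to  cos(phi)*J2(z)
phi_est = np.arctan2(-s1 / jv(1, z), c2 / jv(2, z))
print(f"true phase {phi_true:.3f} rad, recovered {phi_est:.3f} rad")
```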

  13. Audit of the informed consent process as a part of a clinical research quality assurance program.

    PubMed

    Lad, Pramod M; Dahl, Rebecca

    2014-06-01

    Audits of the informed consent process are a key element of a clinical research quality assurance program. A systematic approach to such audits has not been described in the literature. In this paper we describe two components of the audit. The first is the audit of the informed consent document to verify adherence with federal regulations. The second component is comprised of the audit of the informed consent conference, with emphasis on a real time review of the appropriate communication of the key elements of the informed consent. Quality measures may include preparation of an informed consent history log, notes to accompany the informed consent, the use of an informed consent feedback tool, and the use of institutional surveys to assess comprehension of the informed consent process.

  14. Emergency healthcare process automation using mobile computing and cloud services.

    PubMed

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2012-10-01

    Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services, and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time poses new challenges, including the specification of a common information format, the interoperability among heterogeneous institutional information systems, or the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.

  15. Efficacy of Cognitive Processes in Young People with High-Functioning Autism Spectrum Disorder Using a Novel Visual Information-Processing Task

    ERIC Educational Resources Information Center

    Speirs, Samantha J.; Rinehart, Nicole J.; Robinson, Stephen R.; Tonge, Bruce J.; Yelland, Gregory W.

    2014-01-01

    Autism spectrum disorders (ASD) are characterised by a unique pattern of preserved abilities and deficits within and across cognitive domains. The Complex Information Processing Theory proposes this pattern reflects an altered capacity to respond to cognitive demands. This study compared how complexity induced by time constraints on processing…

  16. Smart Camera Technology Increases Quality

    NASA Technical Reports Server (NTRS)

    2004-01-01

    When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates of more than a few frames per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full, and subsequent information is then lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.

  17. Visual representation of spatiotemporal structure

    NASA Astrophysics Data System (ADS)

    Schill, Kerstin; Zetzsche, Christoph; Brauer, Wilfried; Eisenkolb, A.; Musto, A.

    1998-07-01

    The processing and representation of motion information is addressed from an integrated perspective comprising low-level signal processing properties as well as higher-level cognitive aspects. For the low-level processing of motion information we argue that a fundamental requirement is the existence of a spatio-temporal memory. Its key feature, the provision of an orthogonal relation between external time and its internal representation, is achieved by a mapping of temporal structure into a locally distributed activity distribution accessible in parallel by higher-level processing stages. This leads to a reinterpretation of the classical concept of `iconic memory' and resolves inconsistencies regarding ultra-short-time processing and visual masking. The spatio-temporal memory is further investigated by experiments on the perception of spatio-temporal patterns. Results on the direction discrimination of motion paths provide evidence that information about direction and location is not processed and represented independently of each other. This suggests a unified representation on an early level, in the sense that motion information is internally available in the form of a spatio-temporal compound. For the higher-level representation we have developed a formal framework for the qualitative description of courses of motion that may occur with moving objects.

  18. A 3D THz image processing methodology for a fully integrated, semi-automatic and near real-time operational system

    NASA Astrophysics Data System (ADS)

    Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.

    2012-05-01

    The present study proposes a fully integrated, semi-automatic and near real-time mode-operated image processing methodology developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with the center frequencies around: 100 GHz and 300 GHz. The quality control of aeronautics composite multi-layered materials and structures using Non-Destructive Testing is the main focus of this work. Image processing is applied on the 3-D images to extract useful information. The data is processed by extracting areas of interest. The detected areas are subjected to image analysis for more particular investigation managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.

  19. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
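
    A minimal numpy sketch of the single-node, delayed-feedback idea: the delay line is approximated as a ring of time-multiplexed virtual nodes driven through a random input mask, and a linear (ridge) readout is trained on a toy memory task. The tanh nonlinearity and all parameters are illustrative, not the electronic implementation reported in the paper.

```python
import numpy as np

# Minimal delay-based reservoir: the delay loop is approximated as a ring of
# n_virtual time-multiplexed virtual nodes; a random mask multiplexes the input.
rng = np.random.default_rng(0)
n_virtual, eta, gamma = 50, 0.5, 0.05      # virtual nodes, feedback and input scaling
mask = rng.uniform(-1, 1, n_virtual)

def run_reservoir(u):
    """Drive the delay loop with scalar input sequence u; return the state matrix."""
    states = np.zeros((len(u), n_virtual))
    x = np.zeros(n_virtual)
    for k, u_k in enumerate(u):
        x = np.tanh(eta * np.roll(x, 1) + gamma * mask * u_k)  # ring-coupled update
        states[k] = x
    return states

# toy task: reconstruct the input delayed by 3 steps from the reservoir state
u = rng.uniform(-1, 1, 2000)
target = np.roll(u, 3)
S = run_reservoir(u)
S_train, S_test = S[100:1500], S[1500:]
y_train, y_test = target[100:1500], target[1500:]

ridge = 1e-6                                # ridge-regression readout
W = np.linalg.solve(S_train.T @ S_train + ridge * np.eye(n_virtual), S_train.T @ y_train)
nmse = np.mean((S_test @ W - y_test) ** 2) / np.var(y_test)
print(f"test NMSE on the 3-step memory task: {nmse:.3f}")
```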

  20. Regulation of health information processing in an outsourcing environment.

    PubMed

    2004-06-01

    Policy makers must consider the work force, technology, cost, and legal implications of their legislative proposals. AHIMA, AAMT, CHIA, and MTIA urge lawmakers to craft regulatory solutions that enforce HIPAA and support advancements in modern health information processing practices that improve the quality and cost of healthcare. We also urge increased investment in health information work force development and implementation of new technologies to advance critical healthcare outcomes--timely, accurate, accessible, and secure information to support patient care. It is essential that state legislatures reinforce the importance of improving information processing solutions for healthcare and not take actions that will produce unintended and detrimental consequences.

  1. Survey Satisficing Inflates Stereotypical Responses in Online Experiment: The Case of Immigration Study

    PubMed Central

    Miura, Asako; Kobayashi, Tetsuro

    2016-01-01

    Though survey satisficing, the withholding of the cognitive effort required to provide optimal answers in the survey response process, poses a serious threat to the validity of online experiments, a detailed explanation of the mechanism has yet to be established. Focusing on attitudes toward immigrants, we examined the mechanism by which survey satisficing distorts treatment effect estimates in online experiments. We hypothesized that satisficers would display more stereotypical responses than non-satisficers would when presented with stereotype-disconfirming information about an immigrant. Results of two experiments largely supported our hypotheses. Satisficers, whom we identified through an instructional manipulation check (IMC), processed information about immigrants' personality traits congruently with the stereotype activated by information provided about nationality. The significantly shorter vignette reading time of satisficers corroborates their time-efficient impression formation based on stereotyping. However, the shallow information processing of satisficers can be rectified by alerting them to their inattentiveness through the use of a repeated IMC. PMID:27803680

  2. Separable Processes Before, During, and After the N400 Elicited by Previously Inferred and New Information: Evidence from Time-Frequency Decompositions

    PubMed Central

    Steele, Vaughn R.; Bernat, Edward M.; van den Broek, Paul; Collins, Paul F.; Patrick, Christopher J.; Marsolek, Chad J.

    2012-01-01

    Successful comprehension during reading often requires inferring information not explicitly presented. This information is readily accessible when subsequently encountered, and a neural correlate of this is an attenuation of the N400 event-related potential (ERP). We used ERPs and time-frequency (TF) analysis to investigate neural correlates of processing inferred information after a causal coherence inference had been generated during text comprehension. Participants read short texts, some of which promoted inference generation. After each text, they performed lexical decisions to target words that were unrelated or inference-related to the preceding text. Consistent with previous findings, inference-related words elicited an attenuated N400 relative to unrelated words. TF analyses revealed unique contributions to the N400 from activity occurring at 1–6 Hz (theta) and 0–2 Hz (delta), supporting the view that multiple, sequential processes underlie the N400. PMID:23165117

  3. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ -machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
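
    As a reminder of the two quantities in their standard discrete-time form (the paper's contribution is their calculation for continuous-time, discrete-event processes), the entropy rate and statistical complexity are

    $$h_\mu = \lim_{\ell \to \infty} \frac{H[X_{1:\ell}]}{\ell}, \qquad C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\log_2 \Pr(\sigma),$$

    where $X_{1:\ell}$ is a length-$\ell$ block of the process and $\mathcal{S}$ is the set of causal states of its $\epsilon$-machine.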

  4. The neural processing of taste

    PubMed Central

    Lemon, Christian H; Katz, Donald B

    2007-01-01

    Although there have been many recent advances in the field of gustatory neurobiology, our knowledge of how the nervous system is organized to process information about taste is still far from complete. Many studies on this topic have focused on understanding how gustatory neural circuits are spatially organized to represent information about taste quality (e.g., "sweet", "salty", "bitter", etc.). Arguments pertaining to this issue have largely centered on whether taste is carried by dedicated neural channels or a pattern of activity across a neural population. But there is now mounting evidence that the timing of neural events may also importantly contribute to the representation of taste. In this review, we attempt to summarize recent findings in the field that pertain to these issues. Both space and time are variables likely related to the mechanism of the gustatory neural code: information about taste appears to reside in spatial and temporal patterns of activation in gustatory neurons. What is more, the organization of the taste network in the brain would suggest that the parameters of space and time extend to the neural processing of gustatory information on a much grander scale. PMID:17903281

  5. 77 FR 77133 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-31

    ... month fee if they elect to subscribe to a service that provides real-time series information data. OCC... and processes to accommodate real-time feeds of Series Information data to Subscribers; however... these costs, OCC plans to charge a $250 per month fee to Subscribers receiving real-time Series...

  6. Time Processing Impairments in Preschoolers at Risk of Developing Difficulties in Mathematics

    ERIC Educational Resources Information Center

    Tobia, Valentina; Rinaldi, Luca; Marzocchi, Gian Marco

    2018-01-01

    The occurrence of time processing problems in individuals with Development Dyscalculia (DD) has favored the view of a general magnitude system devoted to both numerical and temporal information. Yet, this scenario has been partially challenged by studies indicating that time difficulties can be attributed to poor calculation or counting skills,…

  7. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    NASA Astrophysics Data System (ADS)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information currently grow rapidly in varying amounts and media. This growth eventually produces the very large collections of data better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that important information can be obtained and used to support the decision-making process. In practice, a process that integrates existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which of them is more effective and efficient in terms of time, cost, and power may become a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process carried out using Microsoft SQL Server Integration Services (SSIS) and using Pentaho Data Integration (PDI).
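
    A deliberately small illustration of the three ETL steps that tools such as SSIS and PDI automate at scale; the records, cleaning rule, and in-memory SQLite warehouse below are hypothetical.

```python
import sqlite3

# Extract: in a real pipeline these rows would come from source files or systems;
# here a small in-memory list stands in for the extract step.
rows = [
    {"region": " east ", "date": "2018-02-01T09:30", "amount": "120.5"},
    {"region": "West",   "date": "2018-02-01T11:05", "amount": ""},
    {"region": "east",   "date": "2018-02-02T14:20", "amount": "80"},
]

# Transform: drop rows with missing amounts, normalise region names and dates.
clean = [
    {"region": r["region"].strip().upper(),
     "sale_date": r["date"][:10],
     "amount": float(r["amount"])}
    for r in rows if r.get("amount")
]

# Load: write the cleaned rows into the warehouse table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, sale_date TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (:region, :sale_date, :amount)", clean)
con.commit()
print(con.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
con.close()
```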

  8. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    PubMed Central

    Honeine, Jean-Louis; Schieppati, Marco

    2014-01-01

    Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, on which basic information is fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from allocentric to egocentric reference or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, in the order of 1–2 s for different postural conditions, modalities and deliberate or passive shift. They are just longer for haptic than visual shift, just shorter on withdrawal than on addition of stabilizing input, and on deliberate than unexpected mode. The delays are the shortest (for haptic shift) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion training devices. PMID:25339872

  9. Information as the Fifth Dimension of the Universe which Fundamental Particles (strings), Dark Matter/Energy and Space-time are Floating in it While they are Listening to its Whispering for Getting Order

    NASA Astrophysics Data System (ADS)

    Gholibeigian, Hassan; Gholibeigian, Ghasem; Amirshahkarami, Azim; Gholibeigian, Kazem

    2017-01-01

    Four animated sub-particles (sub-strings), as the origin of life and the generators of the momentum (vibration) of elementary particles (strings), communicate to transfer information for processing and to prepare fundamental particles for the next step. This means that information may be a ``dimension'' of nature in which fundamental particles, dark matter/energy and space-time are floating, listening to its whispering and receiving quantum information packages about their conditions and laws. Communication of information, which began before the spark to B.B. (Convection Bang), may thus be a ``Fundamental symmetry'' in nature, because it leads to other symmetries and supersymmetry as well as to other phenomena. The processed information is always carried by fundamental particles as the preserved history and entropy of the Universe. Information therefore would not be destroyed, lost or released by a black hole. Rather, the fundamental particles involved in thermal radiation and in electromagnetic and gravitational fields carry the processed information as they are emitted from the black hole, while they are communicated with from the fifth dimension for their new movement. AmirKabir University of Technology, Tehran, Iran.

  10. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  11. Towards Ontology-Driven Information Systems: Guidelines to the Creation of New Methodologies to Build Ontologies

    ERIC Educational Resources Information Center

    Soares, Andrey

    2009-01-01

    This research targeted the area of Ontology-Driven Information Systems, where ontology plays a central role both at development time and at run time of Information Systems (IS). In particular, the research focused on the process of building domain ontologies for IS modeling. The motivation behind the research was the fact that researchers have…

  12. Satisficing in split-second decision making is characterized by strategic cue discounting.

    PubMed

    Oh, Hanna; Beck, Jeffrey M; Zhu, Pingping; Sommer, Marc A; Ferrari, Silvia; Egner, Tobias

    2016-12-01

    Much of our real-life decision making is bounded by uncertain information, limitations in cognitive resources, and a lack of time to allocate to the decision process. It is thought that humans overcome these limitations through satisficing, fast but "good-enough" heuristic decision making that prioritizes some sources of information (cues) while ignoring others. However, the decision-making strategies we adopt under uncertainty and time pressure, for example during emergencies that demand split-second choices, are presently unknown. To characterize these decision strategies quantitatively, the present study examined how people solve a novel multicue probabilistic classification task under varying time pressure, by tracking shifts in decision strategies using variational Bayesian inference. We found that under low time pressure, participants correctly weighted and integrated all available cues to arrive at near-optimal decisions. With increasingly demanding, subsecond time pressures, however, participants systematically discounted a subset of the cue information by dropping the least informative cue(s) from their decision making process. Thus, the human cognitive apparatus copes with uncertainty and severe time pressure by adopting a "drop-the-worst" cue decision making strategy that minimizes cognitive time and effort investment while preserving the consideration of the most diagnostic cue information, thus maintaining "good-enough" accuracy. This advance in our understanding of satisficing strategies could form the basis of predicting human choices in high time pressure scenarios. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
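
    An illustrative simulation of why a "drop-the-worst" strategy can remain "good enough": discarding the least valid cue costs little accuracy compared with discarding the most valid one. The cue validities and log-odds weights are hypothetical, not the study's fitted values.

```python
import numpy as np

# Cue integration vs. a "drop-the-worst" strategy: under time pressure the least
# valid cue is discarded before the weighted decision. Validities are hypothetical.
rng = np.random.default_rng(0)
n_trials = 10_000
validities = np.array([0.9, 0.75, 0.6])            # diagnosticity of cues 1..3
weights = np.log(validities / (1 - validities))    # log-odds weights

category = rng.integers(0, 2, n_trials) * 2 - 1    # true class in {-1, +1}
# each cue points to the true class with probability equal to its validity
cues = np.where(rng.random((n_trials, 3)) < validities,
                category[:, None], -category[:, None])

def accuracy(use_cues):
    score = cues[:, use_cues] @ weights[use_cues]
    return np.mean(np.sign(score) == category)

print("all cues integrated :", round(accuracy([0, 1, 2]), 3))
print("drop the worst cue  :", round(accuracy([0, 1]), 3))
print("drop the best cue   :", round(accuracy([1, 2]), 3))
```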

  13. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise, however, when trying to project these data with reasonable speed and accuracy. Current single-threaded methods can suffer from one of two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost and thereby provide access to supercomputer-class processing. Combining these techniques may make it possible to use large amounts of geographic data in time-critical situations.
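
    A minimal sketch of the distributed-projection idea using Python's multiprocessing: a large coordinate list is split into chunks that worker processes reproject in parallel. The spherical Web Mercator formula stands in for a full projection library, and the grid of points is hypothetical.

```python
import math
from multiprocessing import Pool

# Split a coordinate list into chunks and reproject the chunks in parallel.
# The spherical Web Mercator formula is a simple stand-in for a projection library.
R = 6378137.0  # sphere radius in metres

def to_mercator(chunk):
    out = []
    for lon, lat in chunk:
        x = R * math.radians(lon)
        y = R * math.log(math.tan(math.pi / 4 + math.radians(lat) / 2))
        out.append((x, y))
    return out

if __name__ == "__main__":
    # hypothetical dataset: a 1-degree grid of lon/lat points (poles excluded)
    points = [(lon, lat) for lon in range(-180, 180) for lat in range(-85, 86)]
    chunks = [points[i::8] for i in range(8)]        # 8 roughly equal chunks
    with Pool(processes=8) as pool:
        projected = [p for part in pool.map(to_mercator, chunks) for p in part]
    print(len(projected), "points projected; first:", projected[0])
```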

  14. Unraveling the sub-processes of selective attention: insights from dynamic modeling and continuous behavior.

    PubMed

    Frisch, Simon; Dshemuchadse, Maja; Görner, Max; Goschke, Thomas; Scherbaum, Stefan

    2015-11-01

    Selective attention biases information processing toward stimuli that are relevant for achieving our goals. However, the nature of this bias is under debate: Does it solely rely on the amplification of goal-relevant information or is there a need for additional inhibitory processes that selectively suppress currently distracting information? Here, we explored the processes underlying selective attention with a dynamic, modeling-based approach that focuses on the continuous evolution of behavior over time. We present two dynamic neural field models incorporating the diverging theoretical assumptions. Simulations with both models showed that they make similar predictions with regard to response times but differ markedly with regard to their continuous behavior. Human data observed via mouse tracking as a continuous measure of performance revealed evidence for the model solely based on amplification but no indication of persisting selective distracter inhibition.

  15. Optimal nonlinear information processing capacity in delay-based reservoir computers

    NASA Astrophysics Data System (ADS)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.

  16. Optimal nonlinear information processing capacity in delay-based reservoir computers.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-11

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.

  17. Optimal nonlinear information processing capacity in delay-based reservoir computers

    PubMed Central

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-01-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of the associated training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature. PMID:26358528

  18. Information security of Smart Factories

    NASA Astrophysics Data System (ADS)

    Iureva, R. A.; Andreev, Y. S.; Iuvshin, A. M.; Timko, A. S.

    2018-05-01

    Within a few years, technologies and systems based on the Internet of Things (IoT) will be widely used in all smart factories. When a huge array of unstructured data is processed, its filtering and adequate interpretation are a priority for enterprises. In this context, the correct representation of information in a user-friendly form acquires special importance, and for this the market today offers advanced analytical platforms designed to collect, store and analyze data on technological processes and events in real time. The main idea of the paper is to state the problem of information security in the IoT and of the integrity of the processed information.

  19. Affective-cognitive meta-bases versus structural bases of attitudes predict processing interest versus efficiency.

    PubMed

    See, Ya Hui Michelle; Petty, Richard E; Fabrigar, Leandre R

    2013-08-01

    We proposed that (a) processing interest for affective over cognitive information is captured by meta-bases (i.e., the extent to which people subjectively perceive themselves to rely on affect or cognition in their attitudes) and (b) processing efficiency for affective over cognitive information is captured by structural bases (i.e., the extent to which attitudes are more evaluatively congruent with affect or cognition). Because processing speed can disentangle interest from efficiency by being manifest as longer or shorter reading times, we hypothesized and found that more affective meta-bases predicted longer affective than cognitive reading time when processing efficiency was held constant (Study 1). In contrast, more affective structural bases predicted shorter affective than cognitive reading time when participants were constrained in their ability to allocate resources deliberatively (Study 2). When deliberation was neither encouraged nor constrained, effects for meta-bases and structural bases emerged (Study 3). Implications for affective-cognitive processing and other attitudes-relevant constructs are discussed.

  20. 32 CFR 93.6 - Fees.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... SERVICE OF PROCESS; RELEASE OF OFFICIAL INFORMATION IN LITIGATION; AND TESTIMONY BY NSA PERSONNEL AS WITNESSES § 93.6 Fees. Consistent with the guidelines in § 93.1(e), NSA may charge reasonable fees to... providing such information, and may include: (a) The costs of time expended by NSA employees to process and...

  1. 32 CFR 93.6 - Fees.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... SERVICE OF PROCESS; RELEASE OF OFFICIAL INFORMATION IN LITIGATION; AND TESTIMONY BY NSA PERSONNEL AS WITNESSES § 93.6 Fees. Consistent with the guidelines in § 93.1(e), NSA may charge reasonable fees to... providing such information, and may include: (a) The costs of time expended by NSA employees to process and...

  2. 32 CFR 93.6 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... SERVICE OF PROCESS; RELEASE OF OFFICIAL INFORMATION IN LITIGATION; AND TESTIMONY BY NSA PERSONNEL AS WITNESSES § 93.6 Fees. Consistent with the guidelines in § 93.1(e), NSA may charge reasonable fees to... providing such information, and may include: (a) The costs of time expended by NSA employees to process and...

  3. 32 CFR 93.6 - Fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SERVICE OF PROCESS; RELEASE OF OFFICIAL INFORMATION IN LITIGATION; AND TESTIMONY BY NSA PERSONNEL AS WITNESSES § 93.6 Fees. Consistent with the guidelines in § 93.1(e), NSA may charge reasonable fees to... providing such information, and may include: (a) The costs of time expended by NSA employees to process and...

  4. 32 CFR 93.6 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... SERVICE OF PROCESS; RELEASE OF OFFICIAL INFORMATION IN LITIGATION; AND TESTIMONY BY NSA PERSONNEL AS WITNESSES § 93.6 Fees. Consistent with the guidelines in § 93.1(e), NSA may charge reasonable fees to... providing such information, and may include: (a) The costs of time expended by NSA employees to process and...

  5. Analysis of Patent Activity in the Field of Quantum Information Processing

    NASA Astrophysics Data System (ADS)

    Winiarczyk, Ryszard; Gawron, Piotr; Miszczak, Jarosław Adam; Pawela, Łukasz; Puchała, Zbigniew

    2013-03-01

    This paper provides an analysis of patent activity in the field of quantum information processing. Data from the PatentScope database from the years 1993-2011 was used. In order to predict future trends in the number of filed patents, time series models were used.

  6. The Influence of Temporal Resolution Power and Working Memory Capacity on Psychometric Intelligence

    ERIC Educational Resources Information Center

    Troche, Stefan J.; Rammsayer, Thomas H.

    2009-01-01

    According to the temporal resolution power (TRP) hypothesis, higher TRP as reflected by better performance on psychophysical timing tasks accounts for faster speed of information processing and increased efficiency of information processing leading to better performance on tests of psychometric intelligence. An alternative explanation of…

  7. The effects of cognitive style and emotional trade-off difficulty on information processing in decision-making.

    PubMed

    Wang, Dawei; Hao, Leilei; Maguire, Phil; Hu, Yixin

    2016-12-01

    This study investigated the effects of cognitive style and emotional trade-off difficulty (ETOD) on information processing in decision-making. Eighty undergraduates (73.75% female, M = 21.90), grouped according to their cognitive style (field-dependent or field-independent), conducted an Information Display Board (IDB) task, through which search time, search depth and search pattern were measured. Participants' emotional states were assessed both before and after the IDB task. The results showed that participants experienced significantly more negative emotion under high ETOD compared to those under low ETOD. While both cognitive style and ETOD had significant effects on search time and search depth, only ETOD significantly influenced search pattern; individuals in both cognitive style groups tended to use attribute-based processing under high ETOD and to use alternative-based processing under low ETOD. There was also a significant interaction between cognitive style and ETOD for search time and search depth. We propose that these results are best accounted for by the coping behaviour framework under high ETOD, and by the negative emotion hypothesis under low ETOD. © 2016 International Union of Psychological Science.

  8. Information processing capacity in psychopathy: Effects of anomalous attention.

    PubMed

    Hamilton, Rachel K B; Newman, Joseph P

    2018-03-01

    Hamilton and colleagues (2015) recently proposed that an integrative deficit in psychopathy restricts simultaneous processing, thereby leaving fewer resources available for information encoding, narrowing the scope of attention, and undermining associative processing. The current study evaluated this parallel processing deficit proposal using the Simultaneous-Sequential paradigm. This investigation marks the first a priori test of Hamilton et al.'s theoretical framework. We predicted that psychopathy would be associated with inferior performance (as indexed by lower accuracy and longer response time) on trials requiring simultaneous processing of visual information relative to trials necessitating sequential processing. Results were consistent with these predictions, supporting the proposal that psychopathy is characterized by a reduced capacity to process multicomponent perceptual information concurrently. We discuss the potential implications of impaired simultaneous processing for the conceptualization of the psychopathic deficit. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. Risk perception and information processing: the development and validation of a questionnaire to assess self-reported information processing.

    PubMed

    Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K

    2012-01-01

    The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition after which they completed a manipulation check and the initial 15-item questionnaire and again two weeks later. The questionnaire was subjected to factor reliability and validity analyses on both measurement times for purposes of cross-validation of the results. A two-factor solution was observed representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.

  10. [Effects of an implicit internal working model on attachment in information processing assessed using Go/No-Go Association Task].

    PubMed

    Fujii, Tsutomu; Uebuchi, Hisashi; Yamada, Kotono; Saito, Masahiro; Ito, Eriko; Tonegawa, Akiko; Uebuchi, Marie

    2015-06-01

    The purposes of the present study were (a) to use both a relational-anxiety Go/No-Go Association Task (GNAT) and an avoidance-of-intimacy GNAT in order to assess an implicit Internal Working Model (IWM) of attachment; and (b) to verify the effects of both measured implicit relational anxiety and implicit avoidance of intimacy on information processing. The implicit IWM measured by the GNAT differed from the explicit IWM measured by questionnaires in terms of its effects on information processing. In particular, in subliminal priming tasks involving others, implicit avoidance of intimacy predicted accelerated response times to negative stimulus words about attachment. Moreover, after subliminal priming with stimulus words about the self, implicit relational anxiety predicted delayed response times to negative stimulus words about attachment.

  11. 48 CFR 39.106 - Year 2000 compliance.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CATEGORIES OF CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY General 39.106 Year 2000 compliance. When acquiring information technology that will be required to perform date/time processing involving dates... information technology to be Year 2000 compliant; or (2) Require that non-compliant information technology be...

  12. Educational Management Information Systems: Progress and Prospectives.

    ERIC Educational Resources Information Center

    Evans, John A.

    An educational management information system is a network of communication channels, information sources, computer storage and retrieval devices, and processing routines that provide data to educational managers at different levels, places, and times to facilitate decisionmaking. Management information systems should be differentiated from…

  13. Parametric models to relate spike train and LFP dynamics with neural information processing.

    PubMed

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.

  14. Prediction of collision events: an EEG coherence analysis.

    PubMed

    Spapé, Michiel M; Serrien, Deborah J

    2011-05-01

    A common daily-life task is the interaction with moving objects, for which prediction of collision events is required. To evaluate the sources of information used in this process, this EEG study required participants to judge whether two moving objects would collide with one another or not. In addition, the effect of a distractor object was evaluated. The measurements included the behavioural decision time and accuracy, eye movement fixation times, and the neural dynamics, which were determined by means of EEG coherence, expressing functional connectivity between brain areas. Collision judgment involved widespread information processing across both hemispheres. When a distractor object was present, task-related activity was increased, whereas distractor activity induced modulation of local sensory processing. Also relevant were the parietal regions communicating with bilateral occipital and midline areas and a left-sided sensorimotor circuit. Besides visual cues, cognitive and strategic processes are used to establish a decision about events in time. When distracting information is introduced into the collision judgment process, it is managed at different processing levels and supported by distinct neural correlates. These data shed light on the processing mechanisms that support judgment of collision events, an ability that implicates higher-order decision-making. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
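
    The coherence measure referenced in this abstract is a standard spectral quantity that can be estimated with off-the-shelf tools. The following is a minimal sketch, not the authors' pipeline: it estimates magnitude-squared coherence between two channels with Welch's method via scipy.signal.coherence; the sampling rate, segment length, and the synthetic alpha-band example are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

def channel_coherence(x, y, fs=256.0, nperseg=512):
    """Magnitude-squared coherence between two EEG channels (Welch's method).

    Returns the frequency axis and coherence values in [0, 1]; values near 1
    indicate strong functional coupling between the signals at that frequency.
    """
    f, cxy = coherence(x, y, fs=fs, nperseg=nperseg)
    return f, cxy

# Illustrative use: two noisy channels sharing a 10 Hz (alpha-band) component
fs = 256.0
t = np.arange(0, 30, 1 / fs)
shared = np.sin(2 * np.pi * 10 * t)
ch1 = shared + 0.8 * np.random.randn(t.size)
ch2 = 0.7 * shared + 0.8 * np.random.randn(t.size)
f, cxy = channel_coherence(ch1, ch2, fs=fs)
print(f"coherence at 10 Hz ~ {cxy[np.argmin(np.abs(f - 10))]:.2f}")
```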

  15. I spy with my little eye: cognitive processing of framed physical activity messages.

    PubMed

    Bassett-Gunter, Rebecca L; Latimer-Cheung, Amy E; Martin Ginis, Kathleen A; Castelhano, Monica

    2014-01-01

    The primary purpose was to examine the relative cognitive processing of gain-framed versus loss-framed physical activity messages following exposure to health risk information. Guided by the Extended Parallel Process Model, the secondary purpose was to examine the relation between dwell time, message recall, and message-relevant thoughts, as well as perceived risk, personal relevance, and fear arousal. Baseline measures of perceived risk for inactivity-related disease and health problems were administered to 77 undergraduate students. Participants read population-specific health risk information while wearing a head-mounted eye tracker, which measured dwell time on message content. Perceived risk was then reassessed. Next, participants read PA messages while the eye tracker measured dwell time on message content. Immediately following message exposure, recall, thought-listing, fear arousal, and personal relevance were measured. Dwell time on gain-framed messages was significantly greater than loss-framed messages. However, message recall and thought-listing did not differ by message frame. Dwell time was not significantly related to recall or thought-listing. Consistent with the Extended Parallel Process Model, fear arousal was significantly related to recall, thought-listing, and personal relevance. In conclusion, gain-framed messages may evoke greater dwell time than loss-framed messages. However, dwell time alone may be insufficient for evoking further cognitive processing.

  16. Disambiguating past events: Accurate source memory for time and context depends on different retrieval processes.

    PubMed

    Persson, Bjorn M; Ainge, James A; O'Connor, Akira R

    2016-07-01

    Current animal models of episodic memory are usually based on demonstrating integrated memory for what happened, where it happened, and when an event took place. These models aim to capture the testable features of the definition of human episodic memory which stresses the temporal component of the memory as a unique piece of source information that allows us to disambiguate one memory from another. Recently though, it has been suggested that a more accurate model of human episodic memory would include contextual rather than temporal source information, as humans' memory for time is relatively poor. Here, two experiments were carried out investigating human memory for temporal and contextual source information, along with the underlying dual process retrieval processes, using an immersive virtual environment paired with a 'Remember-Know' memory task. Experiment 1 (n=28) showed that contextual information could only be retrieved accurately using recollection, while temporal information could be retrieved using either recollection or familiarity. Experiment 2 (n=24), which used a more difficult task, resulting in reduced item recognition rates and therefore less potential for contamination by ceiling effects, replicated the pattern of results from Experiment 1. Dual process theory predicts that it should only be possible to retrieve source context from an event using recollection, and our results are consistent with this prediction. That temporal information can be retrieved using familiarity alone suggests that it may be incorrect to view temporal context as analogous to other typically used source contexts. This latter finding supports the alternative proposal that time since presentation may simply be reflected in the strength of memory trace at retrieval - a measure ideally suited to trace strength interrogation using familiarity, as is typically conceptualised within the dual process framework. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Working memory capacity and redundant information processing efficiency.

    PubMed

    Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R

    2015-01-01

    Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
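
    The Linear Ballistic Accumulator (LBA) model mentioned above has a simple generative form that can be simulated directly. The sketch below is illustrative only and does not use the parameters fitted in the study; the function name, default values, and the two-accumulator example are assumptions.

```python
import numpy as np

def simulate_lba(n_trials, v, A=0.5, b=1.0, s=0.25, t0=0.2, rng=None):
    """Simulate choices and response times from a Linear Ballistic Accumulator.

    v  : mean drift rates, one per accumulator (e.g. target vs. distractor)
    A  : upper bound of the uniform start-point distribution
    b  : response threshold (b >= A)
    s  : between-trial standard deviation of drift rates
    t0 : non-decision time (encoding + motor), in seconds
    """
    rng = np.random.default_rng(rng)
    v = np.asarray(v, dtype=float)
    n_acc = v.size

    starts = rng.uniform(0.0, A, size=(n_trials, n_acc))
    drifts = rng.normal(v, s, size=(n_trials, n_acc))
    drifts = np.where(drifts > 0, drifts, 1e-6)      # simplification: keep drifts positive

    finish = (b - starts) / drifts                   # time for each accumulator to reach b
    choice = finish.argmin(axis=1)                   # fastest accumulator wins
    rt = finish.min(axis=1) + t0
    return choice, rt

# Illustrative use: a "target" accumulator with a higher drift than a "distractor"
choice, rt = simulate_lba(10_000, v=[1.2, 0.8])
print(f"P(target wins) = {(choice == 0).mean():.3f}, mean RT = {rt.mean():.3f} s")
```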

  18. The Processing of Somatosensory Information Shifts from an Early Parallel into a Serial Processing Mode: A Combined fMRI/MEG Study.

    PubMed

    Klingner, Carsten M; Brodoehl, Stefan; Huonker, Ralph; Witte, Otto W

    2016-01-01

    The question regarding whether somatosensory inputs are processed in parallel or in series has not been clearly answered. Several studies that have applied dynamic causal modeling (DCM) to fMRI data have arrived at seemingly divergent conclusions. However, these divergent results could be explained by the hypothesis that the processing route of somatosensory information changes with time. Specifically, we suggest that somatosensory stimuli are processed in parallel only during the early stage, whereas the processing is later dominated by serial processing. This hypothesis was revisited in the present study based on fMRI analyses of tactile stimuli and the application of DCM to magnetoencephalographic (MEG) data collected during sustained (260 ms) tactile stimulation. Bayesian model comparisons were used to infer the processing stream. We demonstrated that the favored processing stream changes over time. We found that the neural activity elicited in the first 100 ms following somatosensory stimuli is best explained by models that support a parallel processing route, whereas a serial processing route is subsequently favored. These results suggest that the secondary somatosensory area (SII) receives information regarding a new stimulus in parallel with the primary somatosensory area (SI), whereas later processing in the SII is dominated by the preprocessed input from the SI.

  19. The Processing of Somatosensory Information Shifts from an Early Parallel into a Serial Processing Mode: A Combined fMRI/MEG Study

    PubMed Central

    Klingner, Carsten M.; Brodoehl, Stefan; Huonker, Ralph; Witte, Otto W.

    2016-01-01

    The question regarding whether somatosensory inputs are processed in parallel or in series has not been clearly answered. Several studies that have applied dynamic causal modeling (DCM) to fMRI data have arrived at seemingly divergent conclusions. However, these divergent results could be explained by the hypothesis that the processing route of somatosensory information changes with time. Specifically, we suggest that somatosensory stimuli are processed in parallel only during the early stage, whereas the processing is later dominated by serial processing. This hypothesis was revisited in the present study based on fMRI analyses of tactile stimuli and the application of DCM to magnetoencephalographic (MEG) data collected during sustained (260 ms) tactile stimulation. Bayesian model comparisons were used to infer the processing stream. We demonstrated that the favored processing stream changes over time. We found that the neural activity elicited in the first 100 ms following somatosensory stimuli is best explained by models that support a parallel processing route, whereas a serial processing route is subsequently favored. These results suggest that the secondary somatosensory area (SII) receives information regarding a new stimulus in parallel with the primary somatosensory area (SI), whereas later processing in the SII is dominated by the preprocessed input from the SI. PMID:28066197

  20. Intensity-based segmentation and visualization of cells in 3D microscopic images using the GPU

    NASA Astrophysics Data System (ADS)

    Kang, Mi-Sun; Lee, Jeong-Eom; Jeon, Woong-ki; Choi, Heung-Kook; Kim, Myoung-Hee

    2013-02-01

    3D microscopy images contain an enormous amount of data, rendering 3D microscopy image processing time-consuming and laborious on a central processing unit (CPU). To solve these problems, many people crop a region of interest (ROI) of the input image to a small size. Although this reduces cost and time, there are drawbacks at the image processing level, e.g., the selected ROI strongly depends on the user and there is a loss of original image information. To mitigate these problems, we developed a 3D microscopy image processing tool on a graphics processing unit (GPU). Our tool provides efficient and varied automatic thresholding methods to achieve intensity-based segmentation of 3D microscopy images. Users can select the algorithm to be applied. Further, the image processing tool provides visualization of segmented volume data and can set the scale, translation, etc. using a keyboard and mouse. However, the rapidly visualized 3D objects still need to be analyzed to obtain information for biologists. To analyze 3D microscopic images, we need quantitative data from the images. Therefore, we label the segmented 3D objects within all 3D microscopic images and obtain quantitative information on each labeled object. This information can be used as features for classification. A user can select the object to be analyzed. Our tool allows the selected object to be displayed in a new window, and hence, more details of the object can be observed. Finally, we validate the effectiveness of our tool by comparing the CPU and GPU processing times by matching the specification and configuration.
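
    The pipeline described here (automatic intensity thresholding, labeling of 3D objects, per-object quantitative measures) can be sketched on the CPU with NumPy/SciPy. The authors' tool is GPU-based, so the code below only illustrates the underlying steps, with Otsu's method standing in for the selectable thresholding algorithms; all function names and the synthetic volume are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(volume, n_bins=256):
    """Return the intensity threshold that maximizes between-class variance (Otsu)."""
    hist, edges = np.histogram(volume.ravel(), bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    weights = hist / hist.sum()

    w0 = np.cumsum(weights)                    # class probability of the "background" class
    w1 = 1.0 - w0
    m = np.cumsum(weights * centers)           # cumulative mean intensity
    m_total = m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (m_total * w0 - m) ** 2 / (w0 * w1)
    return centers[np.nanargmax(between)]

def segment_and_measure(volume):
    """Threshold a 3D volume, label connected objects, and report per-object stats."""
    mask = volume > otsu_threshold(volume)
    labels, n_objects = ndimage.label(mask)
    index = np.arange(1, n_objects + 1)
    sizes = ndimage.sum(mask, labels, index=index)           # voxel count per object
    centroids = ndimage.center_of_mass(mask, labels, index=index)
    return labels, sizes, centroids

# Illustrative use on a synthetic volume with two bright blobs in noise
vol = np.random.normal(0.1, 0.05, (64, 64, 64))
vol[10:20, 10:20, 10:20] += 1.0
vol[40:50, 40:50, 40:50] += 1.0
labels, sizes, centroids = segment_and_measure(vol)
print(f"{sizes.size} objects, voxel counts: {sizes.astype(int)}")
```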

  1. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of the contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
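
    The Bandt and Pompe methodology cited here estimates entropy from the distribution of ordinal patterns in the time series. Below is a minimal sketch of that estimator (the entropy axis of such informational planes); the causal-amplitude plane itself is the paper's contribution and is not reproduced, and the AR(1)-versus-noise comparison is an illustrative assumption.

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy of a 1-D time series.

    Each length-`order` window is mapped to the ordinal pattern of its ranks;
    the Shannon entropy of the pattern distribution is normalized by log(order!).
    """
    x = np.asarray(x)
    counts = {}
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + order * delay:delay]
        pattern = tuple(np.argsort(window))       # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float)
    probs /= probs.sum()
    H = -(probs * np.log(probs)).sum()
    return H / log(factorial(order))              # 0 = fully ordered, 1 = white noise

# Illustrative use: an AR(1) process is less "entropic" than white noise
rng = np.random.default_rng(0)
noise = rng.normal(size=5000)
ar1 = np.zeros(5000)
for t in range(1, 5000):
    ar1[t] = 0.9 * ar1[t - 1] + noise[t]
print(permutation_entropy(noise, order=4), permutation_entropy(ar1, order=4))
```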

  2. Intra-individual variability in information processing speed reflects white matter microstructure in multiple sclerosis.

    PubMed

    Mazerolle, Erin L; Wojtowicz, Magdalena A; Omisade, Antonina; Fisk, John D

    2013-01-01

    Slowed information processing speed is commonly reported in persons with multiple sclerosis (MS), and is typically investigated using clinical neuropsychological tests, which provide sensitive indices of mean-level information processing speed. However, recent studies have demonstrated that within-person variability or intra-individual variability (IIV) in information processing speed may be a more sensitive indicator of neurologic status than mean-level performance on clinical tests. We evaluated the neural basis of increased IIV in mildly affected relapsing-remitting MS patients by characterizing the relation between IIV (controlling for mean-level performance) and white matter integrity using diffusion tensor imaging (DTI). Twenty women with relapsing-remitting MS and 20 matched control participants completed the Computerized Test of Information Processing (CTIP), from which both mean response time and IIV were calculated. Other clinical measures of information processing speed were also collected. Relations between IIV on the CTIP and DTI metrics of white matter microstructure were evaluated using tract-based spatial statistics. We observed slower and more variable responses on the CTIP in MS patients relative to controls. Significant relations between white matter microstructure and IIV were observed for MS patients. Increased IIV was associated with reduced integrity in more white matter tracts than was slowed information processing speed as measured by either mean CTIP response time or other neuropsychological test scores. Thus, despite the common use of mean-level performance as an index of cognitive dysfunction in MS, IIV may be more sensitive to the overall burden of white matter disease at the microstructural level. Furthermore, our study highlights the potential value of considering within-person fluctuations, in addition to mean-level performance, for uncovering brain-behavior relationships in neurologic disorders with widespread white matter pathology.

  3. Dragon Stream Cipher for Secure Blackbox Cockpit Voice Recorder

    NASA Astrophysics Data System (ADS)

    Akmal, Fadira; Michrandi Nasution, Surya; Azmi, Fairuz

    2017-11-01

    An aircraft blackbox is a device used to record all aircraft information, consisting of a Flight Data Recorder (FDR) and a Cockpit Voice Recorder (CVR). The Cockpit Voice Recorder contains conversations in the aircraft during the flight. Investigations of aircraft crashes usually take a long time, because it is difficult to find the aircraft blackbox; the blackbox should therefore have the ability to send information to other places. An aircraft blackbox must also have a data security system, since data security is a very important part of the information exchange process. The system in this research performs encryption and decryption of Cockpit Voice Recorder data, accessible only to authorized people, using the Dragon stream cipher algorithm. The tests performed measure the time of data encryption and decryption, and the avalanche effect. Results show encryption and decryption times of 0.85 and 1.84 seconds, respectively, for 30 seconds of Cockpit Voice Recorder data, with an avalanche effect of 48.67%.
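
    The avalanche effect reported above measures how much of the ciphertext changes in response to a minimal input change; one common definition flips a single key bit and counts the fraction of ciphertext bits that change. A minimal sketch of that measurement is given below. The Dragon cipher itself is not implemented here, so `encrypt` is a placeholder, and the toy repeating-XOR example exists only to make the code runnable.

```python
import os

def bit_difference_ratio(a: bytes, b: bytes) -> float:
    """Fraction of bits that differ between two equal-length byte strings."""
    assert len(a) == len(b)
    differing = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return differing / (8 * len(a))

def avalanche_effect(encrypt, key: bytes, plaintext: bytes) -> float:
    """Average ciphertext bit-change ratio when single key bits are flipped.

    `encrypt(key, plaintext) -> ciphertext` is a placeholder for the stream
    cipher under test (e.g. a Dragon implementation); it is not defined here.
    A strong cipher should give a ratio close to 0.5 (50%).
    """
    reference = encrypt(key, plaintext)
    ratios = []
    for byte_idx in range(len(key)):
        for bit in range(8):
            flipped = bytearray(key)
            flipped[byte_idx] ^= 1 << bit          # flip exactly one key bit
            ratios.append(bit_difference_ratio(reference, encrypt(bytes(flipped), plaintext)))
    return sum(ratios) / len(ratios)

# Illustrative use with a toy repeating-XOR "cipher" (NOT secure, purely a stand-in);
# its tiny avalanche value is exactly why a real stream cipher is needed.
toy_encrypt = lambda key, pt: bytes(p ^ key[i % len(key)] for i, p in enumerate(pt))
print(f"{avalanche_effect(toy_encrypt, os.urandom(16), os.urandom(32)):.2%}")
```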

  4. Development of Targeting UAVs Using Electric Helicopters and Yamaha RMAX

    DTIC Science & Technology

    2007-05-17

    including the QNX real-time operating system. The video overlay board is useful to display the onboard camera's image with important information such as... real-time operating system. Fully utilizing the built-in multi-processing architecture with inter-process synchronization and communication

  5. Monitoring groundwater-surface water interaction using time-series and time-frequency analysis of transient three-dimensional electrical resistivity changes

    USGS Publications Warehouse

    Johnson, Timothy C.; Slater, Lee D.; Ntarlagiannis, Dimitris; Day-Lewis, Frederick D.; Elwaseif, Mehrez

    2012-01-01

    Time-lapse resistivity imaging is increasingly used to monitor hydrologic processes. Compared to conventional hydrologic measurements, surface time-lapse resistivity provides superior spatial coverage in two or three dimensions, potentially high-resolution information in time, and information in the absence of wells. However, interpretation of time-lapse electrical tomograms is complicated by the ever-increasing size and complexity of long-term, three-dimensional (3-D) time series conductivity data sets. Here we use 3-D surface time-lapse electrical imaging to monitor subsurface electrical conductivity variations associated with stage-driven groundwater-surface water interactions along a stretch of the Columbia River adjacent to the Hanford 300 near Richland, Washington, USA. We reduce the resulting 3-D conductivity time series using both time-series and time-frequency analyses to isolate a paleochannel causing enhanced groundwater-surface water interactions. Correlation analysis on the time-lapse imaging results concisely represents enhanced groundwater-surface water interactions within the paleochannel, and provides information concerning groundwater flow velocities. Time-frequency analysis using the Stockwell (S) transform provides additional information by identifying the stage periodicities driving groundwater-surface water interactions due to upstream dam operations, and identifying segments in time-frequency space when these interactions are most active. These results provide new insight into the distribution and timing of river water intrusion into the Hanford 300 Area, which has a governing influence on the behavior of a uranium plume left over from historical nuclear fuel processing operations.
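
    The Stockwell (S) transform used here for time-frequency analysis has a standard FFT-based discrete form. Below is a minimal sketch of that form under the usual Gaussian-window definition; it is not the authors' processing chain, and the example signal and indices are illustrative assumptions.

```python
import numpy as np

def stockwell_transform(x):
    """Discrete Stockwell (S-) transform of a real 1-D signal.

    Returns an array S of shape (N//2 + 1, N): row n holds the complex
    time-frequency content at frequency n/N cycles per sample.
    """
    x = np.asarray(x, dtype=float)
    N = x.size
    X = np.fft.fft(x)
    m = np.arange(N)
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0] = x.mean()                                   # zero-frequency row = signal mean
    for n in range(1, N // 2 + 1):
        # Gaussian localisation window in the frequency domain (wrapped copy added
        # so that negative frequency offsets are also weighted)
        window = (np.exp(-2.0 * np.pi ** 2 * m ** 2 / n ** 2)
                  + np.exp(-2.0 * np.pi ** 2 * (m - N) ** 2 / n ** 2))
        S[n] = np.fft.ifft(np.roll(X, -n) * window)   # shifted spectrum, windowed, inverted
    return S

# Illustrative use: a signal whose dominant period changes halfway through
t = np.arange(1024)
sig = np.where(t < 512, np.sin(2 * np.pi * t / 64), np.sin(2 * np.pi * t / 16))
S = stockwell_transform(sig)
print(np.abs(S[1024 // 64, 100]), np.abs(S[1024 // 64, 900]))  # strong early, weak late
```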

  6. Information gathering, management and transfering for geospacial intelligence

    NASA Astrophysics Data System (ADS)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-07-01

    Information is a key asset in modern organizational operations. The success of joint and combined operations with partner organizations depends on the accurate flow of information and knowledge concerning the theatre of operations: provision of resources, evolution of the environment, location of markets, and where and when an event occurred. Today, as in the past, we cannot conceive of modern operations without maps and geo-spatial information (GI). Information and knowledge management is fundamental to the success of organizational decisions in an uncertain environment. Georeferenced information management is a knowledge management process: it begins with raw data and ends with the generation of knowledge. GI and intelligence systems allow us to integrate all other forms of intelligence and can be a main platform to process and display geo-spatial, time-referenced events. Combining explicit knowledge with people's know-how to generate a continuous learning cycle that supports real-time decisions mitigates the fog of everyday competition and provides knowledge supremacy. Extending the preliminary analysis done in [1], this work applies exploratory factor analysis to a questionnaire about GI and intelligence management in a company, allowing the identification of future lines of action to improve information sharing and the exploitation of the full potential of this important resource.

  7. The neural time course of art perception: an ERP study on the processing of style versus content in art.

    PubMed

    Augustin, M Dorothee; Defranceschi, Birgit; Fuchs, Helene K; Carbon, Claus-Christian; Hutzler, Florian

    2011-06-01

    A central prerequisite to understand the phenomenon of art in psychological terms is to investigate the nature of the underlying perceptual and cognitive processes. Building on a study by Augustin, Leder, Hutzler, and Carbon (2008) the current ERP study examined the neural time course of two central aspects of representational art, one of which is closely related to object- and scene perception, the other of which is art-specific: content and style. We adapted a paradigm that has repeatedly been employed in psycholinguistics and that allows one to examine the neural time course of two processes in terms of when sufficient information is available to allow successful classification. Twenty-two participants viewed pictures that systematically varied in style and content and conducted a combined go/nogo dual choice task. The dependent variables of interest were the Lateralised Readiness Potential (LRP) and the N200 effect. Analyses of both measures support the notion that in the processing of art style follows content, with style-related information being available at around 224 ms or between 40 and 94 ms later than content-related information. The paradigm used here offers a promising approach to further explore the time course of art perception, thus helping to unravel the perceptual and cognitive processes that underlie the phenomenon of art and the fascination it exerts. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. PMIS Project. Planning & Management Information System. A Project To Develop a Data Processing System for Support of the Planning and Management Needs of Local School Districts. Final Report, Year 2.

    ERIC Educational Resources Information Center

    Council of the Great City Schools, Washington, DC.

    This document examines the design and structure of PMIS (Planning and Management Information System), an information system that supports the decisionmaking process of executive management in local school districts. The system is designed around a comprehensive, longitudinal, and interrelated data base. It utilizes a powerful real-time,…

  9. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  10. A real-time dashboard for managing pathology processes

    PubMed Central

    Halwani, Fawaz; Li, Wei Chen; Banerjee, Diponkar; Lessard, Lysanne; Amyot, Daniel; Michalowski, Wojtek; Giffen, Randy

    2016-01-01

    Context: The Eastern Ontario Regional Laboratory Association (EORLA) is a newly established association of all the laboratory and pathology departments of Eastern Ontario that currently includes facilities from eight hospitals. All surgical specimens for EORLA are processed in one central location, the Department of Pathology and Laboratory Medicine (DPLM) at The Ottawa Hospital (TOH), where the rapid growth and influx of surgical and cytology specimens has created many challenges in ensuring the timely processing of cases and reports. Although the entire process is maintained and tracked in a clinical information system, this system lacks pre-emptive warnings that can help management address issues as they arise. Aims: Dashboard technology provides automated, real-time visual clues that could be used to alert management when a case or specimen is not being processed within predefined time frames. We describe the development of a dashboard helping pathology clinical management to make informed decisions on specimen allocation and tracking. Methods: The dashboard was designed and developed in two phases, following a prototyping approach. The first prototype of the dashboard helped monitor and manage pathology processes at the DPLM. Results: The use of this dashboard helped to uncover operational inefficiencies and contributed to an improvement of turn-around time within The Ottawa Hospital's DPLM. It also allowed the discovery of additional requirements, leading to a second prototype that provides finer-grained, real-time information about individual cases and specimens. Conclusion: We successfully developed a dashboard that enables managers to address delays and bottlenecks in specimen allocation and tracking. This support ensures that pathology reports are provided within time frame standards required for high-quality patient care. Given the importance of rapid diagnostics for a number of diseases, the use of real-time dashboards within pathology departments could contribute to improving the quality of patient care beyond EORLA's. PMID:27217974

  11. Environmental Compliance Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1981-02-01

    The Guide is intended to assist Department of Energy personnel by providing information on the NEPA process, the processes of other environmental statutes that bear on the NEPA process, the timing relationships between the NEPA process and these other processes, as well as timing relationships between the NEPA process and the development process for policies, programs, and projects. This information should be helpful not only in formulating environmental compliance plans but also in achieving compliance with NEPA and various other environmental statutes. The Guide is divided into three parts with related appendices: Part I provides guidance for developing environmental compliance plans for DOE actions; Part II is devoted to NEPA with detailed flowcharts depicting the compliance procedures required by CEQ regulations and Department of Energy NEPA Guidelines; and Part III contains a series of flowcharts for other Federal environmental requirements that may apply to DOE projects.

  12. The right parietal cortex and time perception: back to Critchley and the Zeitraffer phenomenon.

    PubMed

    Alexander, Iona; Cowey, Alan; Walsh, Vincent

    2005-05-01

    We investigated the involvement of the posterior parietal cortex in time perception by temporarily disrupting normal functioning in this region, in subjects making prospective judgements of time or pitch. Disruption of the right posterior parietal cortex significantly slowed reaction times when making time, but not pitch, judgements. Similar interference with the left parietal cortex and control stimulation over the vertex did not significantly change performance on either pitch or time tasks. The results show that the information processing necessary for temporal judgements involves the parietal cortex, probably to optimise spatiotemporal accuracy in voluntary action. The results are in agreement with a recent neuroimaging study and are discussed with regard to a psychological model of temporal processing and a recent proposal that time is part of a parietal cortex system for encoding magnitude information relevant for action.

  13. Modelling spatiotemporal change using multidimensional arrays

    NASA Astrophysics Data System (ADS)

    Lu, Meng; Appel, Marius; Pebesma, Edzer

    2017-04-01

    The large variety of remote sensors, model simulations, and in-situ records provides great opportunities to model environmental change. The massive amount of high-dimensional data calls for methods to integrate data from various sources and to analyse spatiotemporal and thematic information jointly. An array is a collection of elements ordered and indexed in arbitrary dimensions, which naturally represents spatiotemporal phenomena that are identified by their geographic locations and recording time. In addition, array regridding (e.g., resampling, down-/up-scaling), dimension reduction, and spatiotemporal statistical algorithms are readily applicable to arrays. However, the role of arrays in big geoscientific data analysis has not been systematically studied: How can arrays discretise continuous spatiotemporal phenomena? How can arrays facilitate the extraction of multidimensional information? How can arrays provide a clean, scalable and reproducible change modelling process that is communicable between mathematicians, computer scientists, Earth system scientists and stakeholders? This study focuses on detecting spatiotemporal change using satellite image time series. Current change detection methods using satellite image time series commonly analyse data in separate steps: 1) forming a vegetation index, 2) conducting time series analysis on each pixel, and 3) post-processing and mapping time series analysis results, which does not consider spatiotemporal correlations and ignores much of the spectral information. Multidimensional information can be better extracted by jointly considering spatial, spectral, and temporal information. To approach this goal, we use principal component analysis to extract multispectral information and spatial autoregressive models to account for spatial correlation in residual-based time series structural change modelling. We also discuss the potential of multivariate non-parametric time series structural change methods, hierarchical modelling, and extreme event detection methods to model spatiotemporal change. We show how array operations can facilitate expressing these methods, and how the open-source array data management and analytics software SciDB and R can be used to scale the process and make it easily reproducible.
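
    As a small illustration of the first step mentioned above (extracting multispectral information with principal component analysis from an array-shaped image time series), the following sketch reduces the band dimension of a (time, band, y, x) array. The array layout, function name, and random example are assumptions; the spatial autoregressive and structural change steps are not shown.

```python
import numpy as np

def spectral_pca(cube, n_components=1):
    """Reduce the spectral dimension of an image time-series array with PCA.

    `cube` has shape (time, band, y, x); the result has shape
    (time, n_components, y, x), replacing the bands with principal components
    computed over all pixels and time steps jointly.
    """
    t, b, ny, nx = cube.shape
    flat = cube.transpose(1, 0, 2, 3).reshape(b, -1)      # bands x observations
    flat = flat - flat.mean(axis=1, keepdims=True)
    cov = flat @ flat.T / (flat.shape[1] - 1)             # band covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]      # largest eigenvalues first
    scores = eigvecs[:, order].T @ flat                   # components x observations
    return scores.reshape(n_components, t, ny, nx).transpose(1, 0, 2, 3)

# Illustrative use on a random 4-band, 12-step, 32x32 cube
cube = np.random.rand(12, 4, 32, 32)
pc1 = spectral_pca(cube)[:, 0]        # (12, 32, 32): one value per pixel per time step
print(pc1.shape)
```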

  14. FPGA-based real time processing of the Plenoptic Wavefront Sensor

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, L. F.; Marín, Y.; Díaz, J. J.; Piqueras, J.; García-Jiménez, J.; Rodríguez-Ramos, J. M.

    The plenoptic wavefront sensor combines measurements at the pupil and image planes in order to obtain wavefront information simultaneously from different points of view, and is capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. The advantages of this sensor are presented elsewhere at this conference (José M. Rodríguez-Ramos et al). This paper concentrates on the processing required for pupil plane phase recovery, and its computation in real time using FPGAs (Field Programmable Gate Arrays). This technology eases the implementation of massively parallel processing and allows the system to be tailored to the requirements, maintaining flexibility, speed and cost.

  15. Listeners modulate temporally selective attention during natural speech processing

    PubMed Central

    Astheimer, Lori B.; Sanders, Lisa D.

    2009-01-01

    Spatially selective attention allows for the preferential processing of relevant stimuli when more information than can be processed in detail is presented simultaneously at distinct locations. Temporally selective attention may serve a similar function during speech perception by allowing listeners to allocate attentional resources to time windows that contain highly relevant acoustic information. To test this hypothesis, event-related potentials were compared in response to attention probes presented in six conditions during a narrative: concurrently with word onsets, beginning 50 and 100 ms before and after word onsets, and at random control intervals. Times for probe presentation were selected such that the acoustic environments of the narrative were matched for all conditions. Linguistic attention probes presented at and immediately following word onsets elicited larger amplitude N1s than control probes over medial and anterior regions. These results indicate that native speakers selectively process sounds presented at specific times during normal speech perception. PMID:18395316

  16. Neural network for processing both spatial and temporal data with time based back-propagation

    NASA Technical Reports Server (NTRS)

    Villarreal, James A. (Inventor); Shelton, Robert O. (Inventor)

    1993-01-01

    Neural networks are computing systems modeled after the paradigm of the biological brain. For years, researchers using various forms of neural networks have attempted to model the brain's information processing and decision-making capabilities. Neural network algorithms have impressively demonstrated the capability of modeling spatial information. On the other hand, the application of parallel distributed models to the processing of temporal data has been severely restricted. The invention introduces a novel technique which adds the dimension of time to the well-known back-propagation neural network algorithm. In the space-time neural network disclosed herein, the synaptic weights between two artificial neurons (processing elements) are replaced with an adaptable-adjustable filter. Instead of a single synaptic weight, the invention provides a plurality of weights representing not only association, but also temporal dependencies. In this case, the synaptic weights are the coefficients to the adaptable digital filters. Novelty is believed to lie in the disclosure of a processing element and a network of the processing elements which are capable of processing temporal as well as spatial data.
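
    The core idea of the disclosed network, replacing each scalar synaptic weight with an adaptable filter so that connections encode temporal as well as spatial dependencies, can be sketched as follows. This is an illustrative reading of the abstract, not the patented algorithm, and the time-based back-propagation training rule is not shown; class and parameter names are assumptions.

```python
import numpy as np

class SpaceTimeLayer:
    """A layer in which each input-output connection is a small FIR filter.

    Instead of a single scalar weight, connection (i, j) holds `n_taps`
    coefficients applied to the current and previous inputs, so the layer
    responds to temporal structure as well as to the spatial input pattern.
    """

    def __init__(self, n_in, n_out, n_taps=4, rng=None):
        rng = np.random.default_rng(rng)
        self.w = rng.normal(0, 0.1, size=(n_out, n_in, n_taps))  # filter coefficients
        self.history = np.zeros((n_in, n_taps))                  # sliding input buffer

    def step(self, x):
        """Process one time step of the input sequence."""
        self.history = np.roll(self.history, 1, axis=1)
        self.history[:, 0] = x                                   # newest sample first
        # Each output sums its FIR-filtered inputs, then passes through tanh
        pre = np.einsum("oit,it->o", self.w, self.history)
        return np.tanh(pre)

# Illustrative use: feed a short two-channel sequence sample by sample
layer = SpaceTimeLayer(n_in=2, n_out=3, n_taps=4, rng=0)
sequence = np.sin(np.linspace(0, 2 * np.pi, 20))[:, None] * np.array([1.0, 0.5])
outputs = np.array([layer.step(x_t) for x_t in sequence])
print(outputs.shape)   # (20, 3): one 3-unit response per time step
```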

  17. The effects of prior knowledge on study-time allocation and free recall: investigating the discrepancy reduction model.

    PubMed

    Verkoeijen, Peter P J L; Rikers, Remy M J P; Schmidt, Henk G

    2005-01-01

    In this study, the authors examined the influence of prior knowledge activation on information processing by means of a prior knowledge activation procedure adopted from the read-generate paradigm. On the basis of cue-target pairs, participants in the experimental groups generated two different sets of items before studying a relevant list. Subsequently, participants were informed that they had to study the items in the list and that they should try to remember as many items as possible. The authors assessed the processing time allocated to the items in the list and free recall of those items. The results revealed that the experimental groups spent less time on items that had already been activated. In addition, the experimental groups outperformed the control group in overall free recall and in free recall of the activated items. Between-group comparisons did not demonstrate significant effects with respect to the processing time and free recall of nonactivated items. The authors interpreted these results in terms of the discrepancy reduction model of regulating the amount of processing time allocated to different parts of the list.

  18. Semantic Elaboration: ERPs Reveal Rapid Transition from Novel to Known

    ERIC Educational Resources Information Center

    Bauer, Patricia J.; Jackson, Felicia L.

    2015-01-01

    Like language, semantic memory is productive: It extends itself through self-derivation of new information through logical processes such as analogy, deduction, and induction, for example. Though it is clear these productive processes occur, little is known about the time course over which newly self-derived information becomes incorporated into…

  19. Using 2H and 18O in assessing evaporation and water residence time of lakes in EPA’s National Lakes Assessment.

    EPA Science Inventory

    Stable isotopes of water and organic material can be very useful in monitoring programs because stable isotopes integrate information about ecological processes and record this information. Most ecological processes of interest for water quality (i.e. denitrification) require si...

  20. Social Information Processing as a Mediator between Cognitive Schemas and Aggressive Behavior in Adolescents

    ERIC Educational Resources Information Center

    Calvete, Esther; Orue, Izaskun

    2012-01-01

    This longitudinal investigation assessed whether cognitive schemas of justification of violence, mistrust, and narcissism predicted social information processing (SIP), and SIP in turn predicted aggressive behavior in adolescents. A total of 650 adolescents completed measures of cognitive schemas at Time 1, SIP in ambiguous social scenarios at…

  1. The Time Course of Recovery for Grammatical Category Information During Lexical Processing for Syntactic Construction

    ERIC Educational Resources Information Center

    Pechmann, Thomas; Garrett, Merrill; Zerbst, Dieter

    2004-01-01

    In the experiments outlined in this article, the authors investigate lexical access processes in language production. In their earlier work, T. Pechmann and D. Zerbst (2002) reported evidence for grammatical category constraints in a picture-word interference task. Although grammatical category information was not activated when subjects produced…

  2. Pre-Service Teachers' Material Development Process Based on the ADDIE Model: E-Book Design

    ERIC Educational Resources Information Center

    Usta, Necla Dönmez; Güntepe, Ebru Turan

    2017-01-01

    With the developments in information and communication technologies, books which are fundamental information sources for students throughout their education and training process are being transformed into electronic book (e-book) formats. E-books provide interactive environments, and they are also updateable materials, which shows that, in time,…

  3. Signal digitizing system and method based on amplitude-to-time optical mapping

    DOEpatents

    Chou, Jason; Bennett, Corey V; Hernandez, Vince

    2015-01-13

    A signal digitizing system and method based on analog-to-time optical mapping optically maps amplitude information of an analog signal of interest first into wavelength information, using an amplitude tunable filter (ATF) to impress spectral changes induced by the amplitude of the analog signal onto a carrier signal, i.e. a train of optical pulses, and next from wavelength information to temporal information, using a dispersive element, so that temporal information representing the amplitude information is encoded in the time domain in the carrier signal. Optical-to-electrical conversion of the optical pulses into voltage waveforms and subsequent digitizing of the voltage waveforms into a digital image enables the temporal information to be resolved and quantized in the time domain. The digital image may then be digitally signal processed to reconstruct the analog signal based on the temporal information with high fidelity.
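
    The signal chain described in this abstract (amplitude to wavelength via a tunable filter, wavelength to arrival time via dispersion, then quantization in the time domain) can be simulated numerically. The sketch below is such a simulation under made-up coefficients; none of the constants, units, or names come from the patent.

```python
import numpy as np

def amplitude_to_time_digitizer(analog, wavelength_per_volt=0.5e-9,
                                dispersion=0.1, n_bits=8, t_full_scale=50e-12):
    """Simulate amplitude-to-time mapping and time-domain quantization.

    analog              : array of instantaneous amplitudes (one per carrier pulse)
    wavelength_per_volt : spectral shift impressed per unit amplitude (m/V), illustrative
    dispersion          : group delay per unit wavelength (s/m), illustrative (~100 ps/nm)
    n_bits              : resolution of the time-domain quantizer
    t_full_scale        : full-scale time window mapped onto the quantizer
    """
    analog = np.asarray(analog, dtype=float)
    delta_lambda = wavelength_per_volt * analog            # amplitude -> wavelength shift
    delta_t = dispersion * delta_lambda                    # wavelength -> time delay
    max_code = 2 ** n_bits - 1
    codes = np.clip(np.round(delta_t / t_full_scale * max_code), 0, max_code).astype(int)
    # Invert the mapping to recover amplitudes from the quantized arrival times
    reconstructed = codes / max_code * t_full_scale / (dispersion * wavelength_per_volt)
    return codes, reconstructed

# Illustrative use: digitize one period of a sine and check reconstruction error
signal = 0.5 * (1 + np.sin(np.linspace(0, 2 * np.pi, 64)))    # amplitudes in [0, 1]
codes, rec = amplitude_to_time_digitizer(signal)
print(f"max reconstruction error: {np.abs(rec - signal).max():.4f}")
```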

  4. Synchronization of optical photons for quantum information processing.

    PubMed

    Makino, Kenzo; Hashimoto, Yosuke; Yoshikawa, Jun-Ichi; Ohdan, Hideaki; Toyama, Takeshi; van Loock, Peter; Furusawa, Akira

    2016-05-01

    A fundamental element of quantum information processing with photonic qubits is the nonclassical quantum interference between two photons when they bunch together via the Hong-Ou-Mandel (HOM) effect. Ultimately, many such photons must be processed in complex interferometric networks. For this purpose, it is essential to synchronize the arrival times of the flying photons and to keep their purities high. On the basis of the recent experimental success of single-photon storage with high purity, we demonstrate for the first time the HOM interference of two heralded, nearly pure optical photons synchronized through two independent quantum memories. Controlled storage times of up to 1.8 μs for about 90 events per second were achieved with purities that were sufficiently high for a negative Wigner function confirmed with homodyne measurements.

  5. Synchronization of optical photons for quantum information processing

    PubMed Central

    Makino, Kenzo; Hashimoto, Yosuke; Yoshikawa, Jun-ichi; Ohdan, Hideaki; Toyama, Takeshi; van Loock, Peter; Furusawa, Akira

    2016-01-01

    A fundamental element of quantum information processing with photonic qubits is the nonclassical quantum interference between two photons when they bunch together via the Hong-Ou-Mandel (HOM) effect. Ultimately, many such photons must be processed in complex interferometric networks. For this purpose, it is essential to synchronize the arrival times of the flying photons and to keep their purities high. On the basis of the recent experimental success of single-photon storage with high purity, we demonstrate for the first time the HOM interference of two heralded, nearly pure optical photons synchronized through two independent quantum memories. Controlled storage times of up to 1.8 μs for about 90 events per second were achieved with purities that were sufficiently high for a negative Wigner function confirmed with homodyne measurements. PMID:27386536

  6. A complexity basis for phenomenology: How information states at criticality offer a new approach to understanding experience of self, being and time.

    PubMed

    Hankey, Alex

    2015-12-01

    In the late 19th century Husserl studied our internal sense of time passing, maintaining that its deep connections into experience represent prima facie evidence for it as the basis for all investigations in the sciences: Phenomenology was born. Merleau-Ponty focused on perception pointing out that any theory of experience must accord with established aspects of biology i.e. be embodied. Recent analyses suggest that theories of experience require non-reductive, integrative information, together with a specific property connecting them to experience. Here we elucidate a new class of information states with just such properties found at the loci of control of complex biological systems, including nervous systems. Complexity biology concerns states satisfying self-organized criticality. Such states are located at critical instabilities, commonly observed in biological systems, and thought to maximize information diversity and processing, and hence to optimize regulation. Major results for biology follow: why organisms have unusually low entropies; and why they are not merely mechanical. Criticality states form singular self-observing systems, which reduce wave packets by processes of perfect self-observation associated with feedback gain g = 1. Analysis of their information properties leads to identification of a new kind of information state with high levels of internal coherence, and feedback loops integrated into their structure. The major idea presented here is that the integrated feedback loops are responsible for our 'sense of self', and also the feeling of continuity in our sense of time passing. Long-range internal correlations guarantee a unique kind of non-reductive, integrative information structure enabling such states to naturally support phenomenal experience. Being founded in complexity biology, they are 'embodied'; they also fulfill the statement that 'The self is a process', a singular process. High internal correlations and René Thom-style catastrophes support non-digital forms of information, gestalt cognition, and information transfer via quantum teleportation. Criticality in complexity biology can 'embody' cognitive states supporting gestalts, and phenomenology's senses of 'self,' time passing, existence and being. Copyright © 2015. Published by Elsevier Ltd.

  7. Information processing capacity while wearing personal protective eyewear.

    PubMed

    Wade, Chip; Davis, Jerry; Marzilli, Thomas S; Weimar, Wendi H

    2006-08-15

    It is difficult to overemphasize the function vision plays in information processing, specifically in maintaining postural control. Vision appears to be an immediate, effortless event; suggesting that eyes need only to be open to employ the visual information provided by the environment. This study is focused on investigating the effect of Occupational Safety and Health Administration regulated personal protective eyewear (29 CFR 1910.133) on physiological and cognitive factors associated with information processing capabilities. Twenty-one college students between the ages of 19 and 25 years were randomly tested in each of three eyewear conditions (control, new and artificially aged) on an inclined and horizontal support surface for auditory and visual stimulus reaction time. Data collection trials consisted of 50 randomly selected (25 auditory, 25 visual) stimuli over a 10-min surface-eyewear condition trial. Auditory stimulus reaction time was significantly affected by the surface by eyewear interaction (F2,40 = 7.4; p < 0.05). Similarly, analysis revealed a significant surface by eyewear interaction in reaction time following the visual stimulus (F2,40 = 21.7; p < 0.05). The current findings do not trivialize the importance of personal protective eyewear usage in an occupational setting; rather, they suggest the value of future research focused on the effect that personal protective eyewear has on the physiological, cognitive and biomechanical contributions to postural control. These findings suggest that while personal protective eyewear may serve to protect an individual from eye injury, an individual's use of such personal protective eyewear may have deleterious effects on sensory information associated with information processing and postural control.

  8. DOD Acquisition Information Management

    DTIC Science & Technology

    1994-09-30

    ...instead of on a real-time management information flow. The process of identifying risks and implementing corrective actions is lengthened by using the current system; performance measurement and reporting are impeded.

  9. Real-time nondestructive monitoring of the gas tungsten arc welding (GTAW) process by combined airborne acoustic emission and non-contact ultrasonics

    NASA Astrophysics Data System (ADS)

    Zhang, Lu; Basantes-Defaz, Alexandra-Del-Carmen; Abbasi, Zeynab; Yuhas, Donald; Ozevin, Didem; Indacochea, Ernesto

    2018-03-01

    Welding is a key manufacturing process for many industries and may introduce defects into the welded parts, causing significant negative impacts and potentially ruining high-cost pieces. Therefore, implementing a real-time process monitoring method is important for avoiding low-quality welds. Due to the high surface temperature and the possible contamination of the surface by contact transducers, the welding process should be monitored via non-contact transducers. In this paper, airborne acoustic emission (AE) transducers tuned at 60 kHz and non-contact ultrasonic testing (UT) transducers tuned at 500 kHz are implemented for real-time weld monitoring. AE is a passive nondestructive evaluation method that listens to the process noise and provides information about the uniformity of the manufacturing process. UT provides more quantitative information about weld defects. One of the most common weld defects, burn-through, is investigated. The influences of weld defects on AE signatures (time-driven data) and UT signals (received signal energy, change in peak frequency) are presented. The level of burn-through damage is defined by using a single method or the combined AE/UT methods.
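
    The UT features named here (received signal energy and a change in peak frequency) are simple to compute from a sampled waveform. Below is a minimal Python sketch of that computation; the sampling rate, the synthetic tone burst and the noise level are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def ut_features(waveform, fs):
        """Received signal energy and peak frequency of a sampled UT waveform."""
        energy = np.sum(np.asarray(waveform, dtype=float) ** 2) / fs   # integral of squared amplitude
        spectrum = np.abs(np.fft.rfft(waveform))
        freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
        peak_freq = freqs[np.argmax(spectrum)]                         # dominant spectral component
        return energy, peak_freq

    # Illustrative use: a 500 kHz tone burst sampled at 10 MHz, with additive noise.
    fs = 10e6
    t = np.arange(0, 200e-6, 1.0 / fs)
    rng = np.random.default_rng(0)
    burst = np.sin(2 * np.pi * 500e3 * t) * np.exp(-((t - 50e-6) / 20e-6) ** 2)
    energy, peak = ut_features(burst + 0.05 * rng.standard_normal(t.size), fs)
    print(f"energy = {energy:.3e}, peak frequency = {peak / 1e3:.0f} kHz")
    ```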

  10. Neural processing of visual information under interocular suppression: a critical review

    PubMed Central

    Sterzer, Philipp; Stein, Timo; Ludwig, Karin; Rothkirch, Marcus; Hesselmann, Guido

    2014-01-01

    When dissimilar stimuli are presented to the two eyes, only one stimulus dominates at a time while the other stimulus is invisible due to interocular suppression. When both stimuli are equally potent in competing for awareness, perception alternates spontaneously between the two stimuli, a phenomenon called binocular rivalry. However, when one stimulus is much stronger, e.g., due to higher contrast, the weaker stimulus can be suppressed for prolonged periods of time. A technique that has recently become very popular for the investigation of unconscious visual processing is continuous flash suppression (CFS): High-contrast dynamic patterns shown to one eye can render a low-contrast stimulus shown to the other eye invisible for up to minutes. Studies using CFS have produced new insights but also controversies regarding the types of visual information that can be processed unconsciously as well as the neural sites and the relevance of such unconscious processing. Here, we review the current state of knowledge in regard to neural processing of interocularly suppressed information. Focusing on recent neuroimaging findings, we discuss whether and to what degree such suppressed visual information is processed at early and more advanced levels of the visual processing hierarchy. We review controversial findings related to the influence of attention on early visual processing under interocular suppression, the putative differential roles of dorsal and ventral areas in unconscious object processing, and evidence suggesting privileged unconscious processing of emotional and other socially relevant information. On a more general note, we discuss methodological and conceptual issues, from practical issues of how unawareness of a stimulus is assessed to the overarching question of what constitutes an adequate operational definition of unawareness. Finally, we propose approaches for future research to resolve current controversies in this exciting research area. PMID:24904469

  11. An Exploratory Study on the Information Needs of Prostate Cancer Patients and Their Partners

    PubMed Central

    Kassianos, Angelos P.; Raats, Monique M.; Gage, Heather

    2016-01-01

    The aim of this study is to explore the information needs of men with prostate cancer and their partners retrospectively at various points in the treatment process. An online questionnaire was used to collect information from men with prostate cancer and their partners about information needs, and when these developed. Readers of a Prostate Care Cookbook and members of a Prostate Cancer Charity were invited to participate: 73 men with prostate cancer and 25 partners completed the questionnaire. Responses showed that participants develop their information needs close to diagnosis. Less educated men with prostate cancer and their partners developed their needs closer to the time after diagnosis than those with higher education. Partners develop an interest in information related to treatment and interaction earlier than patients. Patients prioritised treatment and disease-specific information. Patients and partners differ in how their information needs develop. Medical information is prioritised by patients as opposed to practical information by partners. Health care provision can be tailored to meet the different needs of prostate cancer patients and their partners at different times in the treatment process. PMID:27403460

  12. The ticking time bomb: Using eye-tracking methodology to capture attentional processing during gradual time constraints.

    PubMed

    Franco-Watkins, Ana M; Davis, Matthew E; Johnson, Joseph G

    2016-11-01

    Many decisions are made under suboptimal circumstances, such as time constraints. We examined how different experiences of time constraints affected decision strategies on a probabilistic inference task and whether individual differences in working memory accounted for complex strategy use across different levels of time. To examine information search and attentional processing, we used an interactive eye-tracking paradigm where task information was occluded and only revealed by an eye fixation to a given cell. Our results indicate that although participants change search strategies during the most restricted times, the occurrence of the shift in strategies depends both on how the constraints are applied as well as individual differences in working memory. This suggests that, in situations that require making decisions under time constraints, one can influence performance by being sensitive to working memory and, potentially, by acclimating people to the task time gradually.

  13. The Role of ICT in Home Care.

    PubMed

    Wass, Sofie; Vimarlund, Vivian

    2017-01-01

    With an ageing population and limited resources, ICT is often mentioned as a solution to support elderly people in maintaining an independent and healthy lifestyle. In this paper, we describe how ICT can support access to information and rationalization of work processes in a home care context. We do this by modelling the workflow and identifying the possible impact of ICT. The results show a complex process and indicate that the available resources are not used in the best possible way. The introduction of ICT could increase patient safety by reducing the risk of misplacing information about the care recipients and at the same time provide real time information about the care recipients' needs and health at the point of care. However, to rationalize the work processes there is a need to combine ICT with a changed procedure for handling keys.

  14. Dimension Reduction of Multivariable Optical Emission Spectrometer Datasets for Industrial Plasma Processes

    PubMed Central

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example, in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates; however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions), and full-spectrum samples are taken at frequent time points so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, the Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction, and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453
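
    The published IIRR procedure is not reproduced here, but the two ingredients described above (working directly in the original wavelength space, keeping peak emissions and discarding channels whose time series are nearly redundant) can be sketched as follows. The peak criterion, the correlation threshold and the synthetic dataset are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def reduce_oes(spectra, corr_threshold=0.98):
        """Reduce an OES dataset (time samples x wavelength channels) by keeping
        peak wavelengths whose time series are not redundant with channels
        already retained."""
        mean_spectrum = spectra.mean(axis=0)
        peaks, _ = find_peaks(mean_spectrum, prominence=mean_spectrum.std())
        kept = []
        for p in peaks:                     # greedy selection in the original variable space
            series = spectra[:, p]
            if all(abs(np.corrcoef(series, spectra[:, q])[0, 1]) < corr_threshold for q in kept):
                kept.append(p)
        return kept                         # indices of retained wavelength channels

    # Illustrative use with a synthetic dataset: 200 time samples x 1024 channels.
    rng = np.random.default_rng(0)
    spectra = rng.random((200, 1024))
    print("retained channels:", reduce_oes(spectra)[:10])
    ```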

  15. A Simulation of the Base Civil Engineering Work Request/Work Order System.

    DTIC Science & Technology

    1981-09-01

    ...with better information with which to make a decision. For example, if the Chief of R&R wanted to know the effect on work order processing time of...work order processing times for the system. The Q-GERT Analysis Program developed by Pritsker (11) was used to simulate the generation of work...several factors affecting the mean work order processing time.

  16. Does a time constraint modify results from rating-based conjoint analysis? Case study with orange/pomegranate juice bottles.

    PubMed

    Reis, Felipe; Machín, Leandro; Rosenthal, Amauri; Deliza, Rosires; Ares, Gastón

    2016-12-01

    People do not usually process all the available information on packages when making their food choices and instead rely on heuristics, particularly when time is limited. However, most consumer studies encourage participants to invest a lot of time in making their choices. Therefore, imposing a time constraint in consumer studies may increase their ecological validity. In this context, the aim of the present work was to evaluate the influence of a time constraint on consumer evaluation of pomegranate/orange juice bottles using a rating-based conjoint task. A consumer study with 100 participants was carried out, in which they had to evaluate 16 pomegranate/orange fruit juice bottles, differing in bottle design, front-of-pack nutritional information, nutrition claim and processing claim, and to rate their intention to purchase. Half of the participants evaluated the bottle images without a time constraint and the other half had a time constraint of 3 s for evaluating each image. Eye movements were recorded during the evaluation. Results showed that the time constraint did not largely modify the way in which consumers visually processed the bottle images when evaluating intention to purchase. Regardless of the experimental condition (with or without time constraint), they tended to evaluate the same product characteristics and to give them the same relative importance. However, a trend towards a more superficial evaluation of the bottles that skipped complex information was observed. Regarding the influence of product characteristics on consumer intention to purchase, bottle design was the variable with the largest relative importance in both conditions, overriding the influence of nutritional or processing characteristics, which stresses the importance of graphic design in shaping consumer perception. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Reported consent processes and demographics: a substudy of the INSIGHT Strategic Timing of AntiRetroviral Treatment trial

    PubMed Central

    Denning, Eileen; Sharma, Shweta; Smolskis, Mary; Touloumi, Giota; Walker, Sarah; Babiker, Abdel; Clewett, Megan; Emanuel, Ezekiel; Florence, Eric; Papadopoulos, Antonios; Sánchez, Adriana; Tavel, Jorge; Grady, Christine

    2014-01-01

    Objectives Efforts are needed to improve informed consent of participants in research. The Strategic Timing of AntiRetroviral Therapy (START) study provides a unique opportunity to study the effect of length and complexity of informed consent documents on understanding and satisfaction among geographically diverse participants. Methods Interested START sites were randomised to use either the standard consent form or the concise consent form for all of the site’s participants. Results A total of 4473 HIV-positive participants at 154 sites worldwide took part in the Informed Consent Substudy, with consent given in 11 primary languages. Most sites sent written information to potential participants in advance of clinic visits, usually including the consent form. At about half the sites, staff reported spending less than an hour per participant in the consent process. The vast majority of sites assessed participant understanding using informal nonspecific questions or clinical judgment. Conclusions These data reflect the interest of START research staff in evaluating the consent process and improving informed consent. The START Informed Consent Substudy is by far the largest study of informed consent intervention ever conducted. Its results have the potential to impact how consent forms are written around the world. PMID:25711320

  18. Reported consent processes and demographics: a substudy of the INSIGHT Strategic Timing of AntiRetroviral Treatment (START) trial.

    PubMed

    Denning, E; Sharma, S; Smolskis, M; Touloumi, G; Walker, S; Babiker, A; Clewett, M; Emanuel, E; Florence, E; Papadopoulos, A; Sánchez, A; Tavel, J; Grady, C

    2015-04-01

    Efforts are needed to improve informed consent of participants in research. The Strategic Timing of AntiRetroviral Therapy (START) study provides a unique opportunity to study the effect of length and complexity of informed consent documents on understanding and satisfaction among geographically diverse participants. Interested START sites were randomized to use either the standard consent form or the concise consent form for all of the site's participants. A total of 4473 HIV-positive participants at 154 sites world-wide took part in the Informed Consent Substudy, with consent given in 11 primary languages. Most sites sent written information to potential participants in advance of clinic visits, usually including the consent form. At about half the sites, staff reported spending less than an hour per participant in the consent process. The vast majority of sites assessed participant understanding using informal nonspecific questions or clinical judgment. These data reflect the interest of START research staff in evaluating the consent process and improving informed consent. The START Informed Consent Substudy is by far the largest study of informed consent intervention ever conducted. Its results have the potential to impact how consent forms are written around the world. © 2015 British HIV Association.

  19. Differential Effects of Motor Efference Copies and Proprioceptive Information on Response Evaluation Processes

    PubMed Central

    Stock, Ann-Kathrin; Wascher, Edmund; Beste, Christian

    2013-01-01

    It is well-known that sensory information influences the way we execute motor responses. However, less is known about whether and how sensory and motor information are integrated in the subsequent process of response evaluation. We used a modified Simon Task to investigate how these streams of information are integrated in response evaluation processes, applying an in-depth neurophysiological analysis of event-related potentials (ERPs), time-frequency decomposition and sLORETA. The results show that response evaluation processes are differentially modulated by afferent proprioceptive information and efference copies. While the influence of proprioceptive information is mediated via oscillations in different frequency bands, efference copy based information about the motor execution is specifically mediated via oscillations in the theta frequency band. Stages of visual perception and attention were not modulated by the interaction of proprioception and motor efference copies. Brain areas modulated by the interactive effects of proprioceptive and efference copy based information included the middle frontal gyrus and the supplementary motor area (SMA), suggesting that these areas integrate sensory information for the purpose of response evaluation. The results show how motor response evaluation processes are modulated by information about both the execution and the location of a response. PMID:23658624

  20. 32 CFR 651.40 - Introduction.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., allowing public review and comment on the proposal and providing a basis for informed decision-making. (b) The NEPA process should support sound, informed, and timely (early) decision-making; not produce...

  1. An ontology model for nursing narratives with natural language generation technology.

    PubMed

    Min, Yul Ha; Park, Hyeoun-Ae; Jeon, Eunjoo; Lee, Joo Yun; Jo, Soo Jung

    2013-01-01

    The purpose of this study was to develop an ontology model to generate nursing narratives that are as natural as human language from the entity-attribute-value triplets of a detailed clinical model using natural language generation technology. The model was based on the types of information and the documentation time of the information along the nursing process. The types of information are data characterizing the patient status, inferences made by the nurse from the patient data, and nursing actions selected by the nurse to change the patient status. This information was linked to the nursing process based on the time of documentation. We describe a case study illustrating the application of this model in an acute-care setting. The proposed model provides a strategy for designing an electronic nursing record system.

  2. Influencing Eating Choices: Biological Food Cues in Advertising and Packaging Alter Trajectories of Decision Making and Behavior.

    PubMed

    Bailey, Rachel L

    2017-10-01

    From an ecological perception perspective (Gibson, 1977), the availability of perceptual information alters what behaviors are more and less likely at different times. This study examines how perceptual information delivered in food advertisements and packaging alters the time course of information processing and decision making. Participants categorized images of food that varied in information delivered in terms of color, glossiness, and texture (e.g., food cues) before and after being exposed to a set of advertisements that also varied in this way. In general, items with more direct cues enhanced appetitive motivational processes, especially if they were also advertised with direct food cues. Individuals also chose to eat products that were packaged with more available direct food cues compared to opaque packaging.

  3. A Mechanism for Graded, Dynamically Routable Current Propagation in Pulse-Gated Synfire Chains and Implications for Information Coding

    PubMed Central

    Sornborger, Andrew T.; Wang, Zhuo; Tao, Louis

    2015-01-01

    Neural oscillations can enhance feature recognition [1], modulate interactions between neurons [2], and improve learning and memory [3]. Numerical studies have shown that coherent spiking can give rise to windows in time during which information transfer can be enhanced in neuronal networks [4–6]. Unanswered questions are: 1) What is the transfer mechanism? And 2) how well can a transfer be executed? Here, we present a pulse-based mechanism by which a graded current amplitude may be exactly propagated from one neuronal population to another. The mechanism relies on the downstream gating of mean synaptic current amplitude from one population of neurons to another via a pulse. Because transfer is pulse-based, information may be dynamically routed through a neural circuit with fixed connectivity. We demonstrate the transfer mechanism in a realistic network of spiking neurons and show that it is robust to noise in the form of pulse timing inaccuracies, random synaptic strengths and finite size effects. We also show that the mechanism is structurally robust in that it may be implemented using biologically realistic pulses. The transfer mechanism may be used as a building block for fast, complex information processing in neural circuits. We show that the mechanism naturally leads to a framework wherein neural information coding and processing can be considered as a product of linear maps under the active control of a pulse generator. Distinct control and processing components combine to form the basis for the binding, propagation, and processing of dynamically routed information within neural pathways. Using our framework, we construct example neural circuits to 1) maintain a short-term memory, 2) compute time-windowed Fourier transforms, and 3) perform spatial rotations. We postulate that such circuits, with automatic and stereotyped control and processing of information, are the neural correlates of Crick and Koch’s zombie modes. PMID:26227067

  4. Incorporating Edge Information into Best Merge Region-Growing Segmentation

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Pasolli, Edoardo

    2014-01-01

    We have previously developed a best merge region-growing approach that integrates nonadjacent region object aggregation with the neighboring region merge process usually employed in region growing segmentation approaches. This approach has been named HSeg, because it provides a hierarchical set of image segmentation results. Up to this point, HSeg considered only global region feature information in the region growing decision process. We present here three new versions of HSeg that incorporate local edge information into the region growing decision process at different levels of rigor. We then compare the effectiveness and processing times of these new versions of HSeg with each other and with the original version of HSeg.

  5. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  6. Transforming information from silicon testing and design characterization into numerical data sets for yield learning

    NASA Astrophysics Data System (ADS)

    Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    Silicon testing results are regularly collected for a particular lot of wafers to study yield loss from test result diagnostics. Product engineers will analyze the diagnostic results and perform a number of physical failure analyses to detect systematic defects which cause yield loss for these sets of wafers, in order to feed the information back to process engineers for process improvements. Most of the time, the systematic defects that are detected are major issues or just one of the causes of the overall yield loss. This paper presents a workflow for using design analysis techniques combined with diagnostic methods to systematically transform silicon testing information into physical layout information. A new set of testing results is received from a new lot of wafers for the same product. We can then correlate all the diagnostic results from different periods of time to check which blocks or nets have been highlighted or have stopped occurring in the failure reports, in order to monitor process changes which impact the yield. The design characteristic analysis flow is also implemented to find 1) the block connections on a design that have failed electrical test or 2) frequently used cells that have been highlighted multiple times.

  7. Computing algebraic transfer entropy and coupling directions via transcripts

    NASA Astrophysics Data System (ADS)

    Amigó, José M.; Monetti, Roberto; Graff, Beata; Graff, Grzegorz

    2016-11-01

    Most random processes studied in nonlinear time series analysis take values on sets endowed with a group structure, e.g., the real and rational numbers, and the integers. This fact makes it possible to associate with each pair of group elements a third element, called their transcript, which is defined as the product of the second element in the pair times the inverse of the first one. The transfer entropy of two such processes is called algebraic transfer entropy. It measures the information transferred between two coupled processes whose values belong to a group. In this paper, we show that, subject to one constraint, the algebraic transfer entropy matches the (in general, conditional) mutual information of certain transcripts with one variable less. This property has interesting practical applications, especially to the analysis of short time series. We also derive weak conditions for the 3-dimensional algebraic transfer entropy to yield the same coupling direction as the corresponding mutual information of transcripts. A related issue concerns the use of mutual information of transcripts to determine coupling directions in cases where the conditions just mentioned are not fulfilled. We checked the latter possibility in the lowest dimensional case with numerical simulations and cardiovascular data, and obtained positive results.
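
    As a toy illustration of the transcript idea, consider symbol sequences taking values in the cyclic group Z_m, where the transcript of a pair (x, y) is y * x^{-1}, i.e. (y - x) mod m in additive notation. The plug-in mutual-information estimator and the coupled sequences below are illustrative choices, not the estimators or data used in the paper.

    ```python
    import numpy as np

    def transcripts(x, y, m):
        """Transcripts of paired symbols from the cyclic group Z_m: (y - x) mod m."""
        return (np.asarray(y) - np.asarray(x)) % m

    def mutual_information(a, b, m):
        """Plug-in estimate of I(A;B) in bits for two symbol sequences over Z_m."""
        joint = np.zeros((m, m))
        for ai, bi in zip(a, b):
            joint[ai, bi] += 1
        joint /= joint.sum()
        pa, pb = joint.sum(axis=1), joint.sum(axis=0)
        nz = joint > 0
        return float(np.sum(joint[nz] * np.log2(joint[nz] / np.outer(pa, pb)[nz])))

    # Illustrative use: y copies x 70% of the time, so transcripts carry information about x.
    rng = np.random.default_rng(1)
    m = 4
    x = rng.integers(0, m, 5000)
    y = np.where(rng.random(5000) < 0.7, x, rng.integers(0, m, 5000))
    t_xy = transcripts(x[:-1], y[1:], m)          # transcripts of x_t against y_{t+1}
    print("I(transcript; x) =", round(mutual_information(t_xy, x[:-1], m), 3), "bits")
    ```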

  8. Speed of feedforward and recurrent processing in multilayer networks of integrate-and-fire neurons.

    PubMed

    Panzeri, S; Rolls, E T; Battaglia, F; Lavis, R

    2001-11-01

    The speed of processing in the visual cortical areas can be fast, with, for example, the latency of neuronal responses increasing by only approximately 10 ms per area in the ventral visual system sequence V1 to V2 to V4 to inferior temporal visual cortex. This has led to the suggestion that rapid visual processing can only be based on the feedforward connections between cortical areas. To test this idea, we investigated the dynamics of information retrieval in multiple layer networks using a four-stage feedforward network of integrate-and-fire neurons modelled with continuous dynamics, with associative synaptic connections between stages and a synaptic time constant of 10 ms. Through the implementation of continuous dynamics, we found latency differences in information retrieval of only 5 ms per layer when local excitation was absent and processing was purely feedforward. However, information latency differences increased significantly when non-associative local excitation was included. We also found that local recurrent excitation through associatively modified synapses can contribute significantly to processing in as little as 15 ms per layer, including the feedforward and local feedback processing. Moreover, and in contrast to purely feedforward processing, the contribution of local recurrent feedback was useful and approximately this rapid even when retrieval was made difficult by noise. These findings suggest that cortical information processing can benefit from recurrent circuits when the allowed processing time per cortical area is at least 15 ms long.
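
    A toy version of such a simulation is sketched below: two feedforward layers of leaky integrate-and-fire neurons with a 10 ms synaptic time constant, with the latency difference read off as the gap between the first spikes in each layer. The network size, parameter values and noise model are illustrative assumptions, not the authors' four-stage network.

    ```python
    import numpy as np

    def two_layer_lif(n=200, T=0.15, dt=1e-4, tau_m=0.020, tau_syn=0.010,
                      v_th=1.0, i_ext=1.3, w=0.03, seed=0):
        """Two feedforward layers of leaky integrate-and-fire neurons. Layer 1 is
        driven by a step current; layer 2 only by layer-1 spikes filtered through
        a 10 ms synaptic time constant. Returns the first spike time per layer."""
        rng = np.random.default_rng(seed)
        v1, v2 = rng.uniform(0, 0.5, n), rng.uniform(0, 0.5, n)
        syn = 0.0                                   # synaptic current onto layer 2
        first = [None, None]
        for step in range(int(T / dt)):
            t = step * dt
            s1, s2 = v1 >= v_th, v2 >= v_th
            v1[s1] = 0.0                            # reset after a spike
            v2[s2] = 0.0
            syn = syn * (1 - dt / tau_syn) + w * s1.sum()   # each spike kicks the current
            v1 += dt / tau_m * (i_ext - v1) + 0.01 * rng.standard_normal(n)
            v2 += dt / tau_m * (syn - v2) + 0.01 * rng.standard_normal(n)
            for i, s in enumerate((s1, s2)):
                if first[i] is None and s.any():
                    first[i] = t
        return first

    lat1, lat2 = two_layer_lif()
    fmt = lambda t: "none" if t is None else f"{t * 1e3:.1f} ms"
    print("first spike: layer 1 at", fmt(lat1), "- layer 2 at", fmt(lat2))
    ```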

  9. 48 CFR 39.002 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONTRACTING ACQUISITION OF INFORMATION TECHNOLOGY 39.002 Definitions. As used in this part— Modular contracting means use of one or more contracts to acquire information technology systems in successive... technology, means that the information technology accurately processes date/time data (including, but not...

  10. Characteristics of the transmission of autoregressive sub-patterns in financial time series

    NASA Astrophysics Data System (ADS)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong

    2014-09-01

    There are many types of autoregressive patterns in financial time series, and they form a transmission process. Here, we define autoregressive patterns quantitatively through an econometrical regression model. We present a computational algorithm that sets the autoregressive patterns as nodes and transmissions between patterns as edges, and then converts the transmission process of autoregressive patterns in a time series into a network. We utilised daily Shanghai (securities) composite index time series to study the transmission characteristics of autoregressive patterns. We found statistically significant evidence that the financial market is not random and that there are similar characteristics between parts and whole time series. A few types of autoregressive sub-patterns and transmission patterns drive the oscillations of the financial market. A clustering effect on fluctuations appears in the transmission process, and certain non-major autoregressive sub-patterns have high media capabilities in the financial time series. Different stock indexes exhibit similar characteristics in the transmission of fluctuation information. This work not only proposes a distinctive perspective for analysing financial time series but also provides important information for investors.
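
    The paper defines autoregressive patterns econometrically; the sketch below uses a much cruder stand-in (the signs of windowed AR(2) coefficients) purely to show the mechanics of turning a time series into a pattern-transmission network. The window length, AR order and synthetic return series are illustrative assumptions.

    ```python
    import numpy as np
    from collections import Counter

    def ar2_pattern(window):
        """Fit x_t = a1*x_{t-1} + a2*x_{t-2} + e_t by least squares and label the
        window by the signs of (a1, a2) as a stand-in for an 'AR pattern'."""
        x = np.asarray(window, dtype=float)
        X = np.column_stack([x[1:-1], x[:-2]])
        a1, a2 = np.linalg.lstsq(X, x[2:], rcond=None)[0]
        return ("+" if a1 >= 0 else "-") + ("+" if a2 >= 0 else "-")

    def transmission_network(series, window=30):
        """Directed edge counts between the AR patterns of consecutive windows."""
        labels = [ar2_pattern(series[i:i + window])
                  for i in range(0, len(series) - window, window)]
        return Counter(zip(labels[:-1], labels[1:]))

    # Illustrative use on a synthetic 'index return' series.
    rng = np.random.default_rng(2)
    returns = 0.01 * rng.standard_normal(3000)
    for (src, dst), count in transmission_network(returns).most_common(5):
        print(f"{src} -> {dst}: {count}")
    ```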

  11. Characteristics of the transmission of autoregressive sub-patterns in financial time series

    PubMed Central

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong

    2014-01-01

    There are many types of autoregressive patterns in financial time series, and they form a transmission process. Here, we define autoregressive patterns quantitatively through an econometrical regression model. We present a computational algorithm that sets the autoregressive patterns as nodes and transmissions between patterns as edges, and then converts the transmission process of autoregressive patterns in a time series into a network. We utilised daily Shanghai (securities) composite index time series to study the transmission characteristics of autoregressive patterns. We found statistically significant evidence that the financial market is not random and that there are similar characteristics between parts and whole time series. A few types of autoregressive sub-patterns and transmission patterns drive the oscillations of the financial market. A clustering effect on fluctuations appears in the transmission process, and certain non-major autoregressive sub-patterns have high media capabilities in the financial time series. Different stock indexes exhibit similar characteristics in the transmission of fluctuation information. This work not only proposes a distinctive perspective for analysing financial time series but also provides important information for investors. PMID:25189200

  12. Using Mars's Sulfur Cycle to Constrain the Duration and Timing of Fluvial Processes

    NASA Technical Reports Server (NTRS)

    Blaney, D. L.

    2002-01-01

    Sulfur exists in high abundances at diverse locations on Mars. This work uses knowledge of the Martian sulfate system to discriminate between leading hypotheses and discusses the implications for duration and timing of fluvial processes. Additional information is contained in the original extended abstract.

  13. Information Technologies for the 1980's: Lasers and Microprocessors.

    ERIC Educational Resources Information Center

    Mathews, William D.

    This discussion of the development and application of lasers and microprocessors to information processing stresses laser communication in relation to capacity, reliability, and cost and the advantages of this technology to real-time information access and information storage. The increased capabilities of microprocessors are reviewed, and a…

  14. Understanding price discovery in interconnected markets: Generalized Langevin process approach and simulation

    NASA Astrophysics Data System (ADS)

    Schenck, Natalya A.; Horvath, Philip A.; Sinha, Amit K.

    2018-02-01

    While the literature on the price discovery process and information flow between dominant and satellite markets is exhaustive, most studies have applied an approach that can be traced back to Hasbrouck (1995) or Gonzalo and Granger (1995). In this paper, however, we propose a Generalized Langevin process with an asymmetric double-well potential function, with co-integrated time series and interconnected diffusion processes, to model the information flow and price discovery process in two interconnected markets, a dominant one and a satellite. A simulated illustration of the model is also provided.
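
    The full model involves co-integrated series and a generalized (memory-bearing) Langevin process; the sketch below only illustrates the asymmetric double-well ingredient with a plain overdamped Langevin simulation, so the potential, noise level and time step are illustrative assumptions rather than the authors' specification.

    ```python
    import numpy as np

    def double_well_langevin(n_steps=50_000, dt=1e-3, sigma=0.6, c=0.15, seed=3):
        """Euler-Maruyama simulation of dX = -V'(X) dt + sigma dW with the
        asymmetric double-well potential V(x) = x^4/4 - x^2/2 + c*x."""
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = -1.0                                   # start in the left well
        for t in range(1, n_steps):
            drift = -(x[t - 1] ** 3 - x[t - 1] + c)   # -dV/dx
            x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        return x

    path = double_well_langevin()
    switches = np.sum(np.diff(np.sign(path)) != 0)    # rough count of well-to-well transitions
    print("approximate well switches:", switches)
    ```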

  15. Influence of aggression on information processing in the emotional stroop task--an event-related potential study.

    PubMed

    Bertsch, Katja; Böhnke, Robina; Kruk, Menno R; Naumann, Ewald

    2009-01-01

    Aggression is a common behavior which has frequently been explained as involving changes in higher level information processing patterns. Although researchers have started only recently to investigate information processing in healthy individuals while engaged in aggressive behavior, the impact of aggression on information processing beyond an aggressive encounter remains unclear. In an event-related potential study, we investigated the processing of facial expressions (happy, angry, fearful, and neutral) in an emotional Stroop task after experimentally provoking aggressive behavior in healthy participants. Compared to a non-provoked group, these individuals showed increased early (P2) and late (P3) positive amplitudes for all facial expressions. For the P2 amplitude, the effect of provocation was greatest for threat-related expressions. Beyond this, a bias for emotional expressions, i.e., slower reaction times to all emotional expressions, was found in provoked participants with a high level of trait anger. These results indicate significant effects of aggression on information processing, which last beyond the aggressive encounter even in healthy participants.

  16. Exploring the Notion of Context in Medical Data.

    PubMed

    Mylonas, Phivos

    2017-01-01

    Scientific and technological knowledge and skills are becoming crucial for most data analysis activities. Two rather distinct, but at the same time collaborating, domains are the ones of computer science and medicine; the former offers significant aid towards a more efficient understanding of the latter's research trends. Still, the process of meaningfully analyzing and understanding medical information and data is a tedious one, bound to several challenges. One of them is the efficient utilization of contextual information in the process leading to optimized, context-aware data analysis results. Nowadays, researchers are provided with tools and opportunities to analytically study medical data, but at the same time significant and rather complex computational challenges are yet to be tackled, among others due to the humanistic nature and increased rate of new content and information production imposed by related hardware and applications. So, the ultimate goal of this position paper is to provide interested parties an overview of major contextual information types to be identified within the medical data processing framework.

  17. Separable processes before, during, and after the N400 elicited by previously inferred and new information: evidence from time-frequency decompositions.

    PubMed

    Steele, Vaughn R; Bernat, Edward M; van den Broek, Paul; Collins, Paul F; Patrick, Christopher J; Marsolek, Chad J

    2013-01-25

    Successful comprehension during reading often requires inferring information not explicitly presented. This information is readily accessible when subsequently encountered, and a neural correlate of this is an attenuation of the N400 event-related potential (ERP). We used ERPs and time-frequency (TF) analysis to investigate neural correlates of processing inferred information after a causal coherence inference had been generated during text comprehension. Participants read short texts, some of which promoted inference generation. After each text, they performed lexical decisions to target words that were unrelated or inference-related to the preceding text. Consistent with previous findings, inference-related words elicited an attenuated N400 relative to unrelated words. TF analyses revealed unique contributions to the N400 from activity occurring at 1-6 Hz (theta) and 0-2 Hz (delta), supporting the view that multiple, sequential processes underlie the N400. Copyright © 2012 Elsevier B.V. All rights reserved.
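
    One simple way to approximate the delta-range and theta-range contributions referred to above is zero-phase band-limited filtering; the authors used a time-frequency decomposition, so the sketch below (filter design, sampling rate and the synthetic N400-like waveform) is only an illustrative approximation.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def band_contributions(erp, fs=250.0):
        """Split an ERP waveform into a delta (0-2 Hz) and a theta-range (1-6 Hz)
        component using zero-phase Butterworth filters."""
        b_d, a_d = butter(4, 2.0 / (fs / 2), btype="low")
        b_t, a_t = butter(4, [1.0 / (fs / 2), 6.0 / (fs / 2)], btype="band")
        return filtfilt(b_d, a_d, erp), filtfilt(b_t, a_t, erp)

    # Illustrative use: a synthetic N400-like negative deflection around 400 ms.
    fs = 250.0
    t = np.arange(-0.2, 1.0, 1.0 / fs)
    rng = np.random.default_rng(4)
    erp = -3.0 * np.exp(-((t - 0.4) / 0.12) ** 2) + 0.5 * rng.standard_normal(t.size)
    delta, theta = band_contributions(erp, fs)
    print("delta peak-to-peak:", round(np.ptp(delta), 2), "| theta peak-to-peak:", round(np.ptp(theta), 2))
    ```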

  18. Re-Evaluating the Time Course of Gender and Phonological Encoding during Silent Monitoring Tasks Estimated by ERP: Serial or Parallel Processing?

    ERIC Educational Resources Information Center

    Camen, Christian; Morand, Stephanie; Laganaro, Marina

    2010-01-01

    Neurolinguistic and psycholinguistic studies suggest that grammatical (gender) and phonological information are retrieved independently and that gender can be accessed before phonological information. This study investigated the relative time courses of gender and phonological encoding using topographic evoked potentials mapping methods.…

  19. Noun-phrase anaphors and focus: the informational load hypothesis.

    PubMed

    Almor, A

    1999-10-01

    The processing of noun-phrase (NP) anaphors in discourse is argued to reflect constraints on the activation and processing of semantic information in working memory. The proposed theory views NP anaphor processing as an optimization process that is based on the principle that processing cost, defined in terms of activating semantic information, should serve some discourse function--identifying the antecedent, adding new information, or both. In a series of 5 self-paced reading experiments, anaphors' functionality was manipulated by changing the discourse focus, and their cost was manipulated by changing the semantic relation between the anaphors and their antecedents. The results show that reading times of NP anaphors reflect their functional justification: Anaphors were read faster when their cost had a better functional justification. These results are incompatible with any theory that treats NP anaphors as one homogeneous class regardless of discourse function and processing cost.

  20. An image-processing strategy to extract important information suitable for a low-size stimulus pattern in a retinal prosthesis.

    PubMed

    Chen, Yili; Fu, Jixiang; Chu, Dawei; Li, Rongmao; Xie, Yaoqin

    2017-11-27

    A retinal prosthesis is designed to help the blind to obtain some sight. It consists of an external part and an internal part. The external part is made up of a camera, an image processor and an RF transmitter. The internal part is made up of an RF receiver, an implant chip and microelectrodes. Currently, the number of microelectrodes is in the hundreds, and we do not know the mechanism for using an electrode to stimulate the optic nerve. A simple hypothesis is that the pixels in an image correspond to the electrodes. The images captured by the camera should therefore be processed by suitable strategies so that they correspond to the stimulation delivered by the electrodes. Thus, the question is how to obtain the important information from the captured image. Here, we use a region-of-interest (ROI) extraction algorithm to retain the important information and to remove the redundant information. This paper explains the details of the principles and functions of the ROI algorithm. Because we are investigating a real-time system, we need a fast ROI-extraction algorithm. Thus, we simplified the ROI algorithm and used it in the external image-processing digital signal processing (DSP) system of the retinal prosthesis. The results show that our image-processing strategies are suitable for a real-time retinal prosthesis and can eliminate redundant information and provide useful information for expression in a low-size image.
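
    The paper's DSP implementation is not reproduced here; the sketch below only illustrates the general idea of keeping the most informative patch and reducing it to a low-size stimulus pattern, using a contrast-based window search and average pooling onto a hypothetical 10 x 10 electrode grid.

    ```python
    import numpy as np

    def roi_to_electrodes(image, grid=(10, 10), window=80):
        """Pick the window with the highest local contrast as a crude ROI, then
        average-pool it down to the electrode grid (window must be divisible by
        the grid dimensions)."""
        h, w = image.shape
        best, best_score = (0, 0), -1.0
        for r in range(0, h - window + 1, window // 2):          # coarse sliding-window search
            for c in range(0, w - window + 1, window // 2):
                score = image[r:r + window, c:c + window].std()  # contrast as a saliency cue
                if score > best_score:
                    best, best_score = (r, c), score
        r, c = best
        roi = image[r:r + window, c:c + window]
        gh, gw = grid
        return roi.reshape(gh, window // gh, gw, window // gw).mean(axis=(1, 3))

    rng = np.random.default_rng(5)
    frame = rng.random((240, 320))                  # stand-in for a camera frame
    print(roi_to_electrodes(frame).shape)           # (10, 10): one value per electrode
    ```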

  1. A new framework for modeling decisions about changing information: The Piecewise Linear Ballistic Accumulator model

    PubMed Central

    Heathcote, Andrew

    2016-01-01

    In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
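
    A toy version of the PLBA's core mechanism is sketched below: two linear ballistic accumulators whose drift rates switch only after a delay following the stimulus change, mirroring the delayed integration reported above. The thresholds, rate means, delay and single switch point are illustrative assumptions, not fitted parameter values.

    ```python
    import numpy as np

    def plba_trial(v_pre=(1.2, 0.6), v_post=(0.6, 1.2), switch=0.3, delay=0.15,
                   b=1.0, A=0.5, sv=0.3, t0=0.2, rng=None):
        """One trial of a two-accumulator piecewise LBA: rates start at v_pre and
        change to v_post only at switch+delay seconds after stimulus onset."""
        if rng is None:
            rng = np.random.default_rng()
        start = rng.uniform(0, A, 2)
        r_pre, r_post = rng.normal(v_pre, sv), rng.normal(v_post, sv)
        tc = switch + delay                         # moment the rates actually change
        finish = np.empty(2)
        for i in range(2):
            t_first = (b - start[i]) / r_pre[i] if r_pre[i] > 0 else np.inf
            if t_first <= tc:                       # threshold reached before the rate change
                finish[i] = t_first
            else:
                remaining = b - start[i] - r_pre[i] * tc
                finish[i] = tc + (remaining / r_post[i] if r_post[i] > 0 else np.inf)
        choice = int(np.argmin(finish))
        return choice, finish[choice] + t0          # choice index and response time

    rng = np.random.default_rng(6)
    trials = [plba_trial(rng=rng) for _ in range(2000)]
    post_switch_rate = np.mean([c == 1 for c, _ in trials])
    print(f"proportion choosing the post-switch direction: {post_switch_rate:.2f}")
    ```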

  2. Real-Time Monitoring of Scada Based Control System for Filling Process

    NASA Astrophysics Data System (ADS)

    Soe, Aung Kyaw; Myint, Aung Naing; Latt, Maung Maung; Theingi

    2008-10-01

    This paper presents a design for real-time monitoring of a filling system using Supervisory Control and Data Acquisition (SCADA). The monitoring of the production process is implemented in real time using Visual Basic .NET programming under Visual Studio 2005, without dedicated SCADA software. The software integrators are programmed to get the required information for the configuration screens. Simulation of the components is displayed on the computer screen, with a parallel port linking the computers and the filling devices. Programs for real-time simulation of the filling process from the pure drinking water industry are provided.

  3. A Real-Time System for Lane Detection Based on FPGA and DSP

    NASA Astrophysics Data System (ADS)

    Xiao, Jing; Li, Shutao; Sun, Bin

    2016-12-01

    This paper presents a real-time lane detection system, comprising an edge detection and improved Hough Transform based lane detection algorithm and its hardware implementation on a field programmable gate array (FPGA) and a digital signal processor (DSP). Firstly, gradient amplitude and direction information are combined to extract lane edge information. Then, this information is used to determine the region of interest. Finally, the lanes are extracted by using the improved Hough Transform. The image processing module of the system consists of the FPGA and the DSP. In particular, the algorithms implemented in the FPGA work in a pipeline and process data in parallel so that the system can run in real time. In addition, the DSP performs lane line extraction and display using the improved Hough Transform. The experimental results show that the proposed system is able to detect lanes under different road situations efficiently and effectively.
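
    A software reference for this kind of pipeline (edge extraction, a road-region mask, then a probabilistic Hough transform) takes only a few lines with OpenCV; this is an illustrative stand-in, since the paper's improved Hough Transform and the FPGA/DSP partitioning are not reproduced, and the synthetic test frame is an assumption.

    ```python
    import cv2
    import numpy as np

    def detect_lanes(gray):
        """Edge detection, lower-image region of interest, then a probabilistic
        Hough transform to extract candidate lane segments."""
        edges = cv2.Canny(gray, 50, 150)
        h, _ = edges.shape
        roi = np.zeros_like(edges)
        roi[h // 2:, :] = edges[h // 2:, :]          # keep only the road half of the frame
        lines = cv2.HoughLinesP(roi, rho=1, theta=np.pi / 180, threshold=40,
                                minLineLength=60, maxLineGap=10)
        return [] if lines is None else [tuple(l[0]) for l in lines]

    # Illustrative use on a synthetic frame with two painted lane lines.
    frame = np.zeros((240, 320), dtype=np.uint8)
    cv2.line(frame, (60, 239), (150, 120), 255, 3)
    cv2.line(frame, (260, 239), (170, 120), 255, 3)
    print(len(detect_lanes(frame)), "candidate lane segments")
    ```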

  4. Educating anesthesia residents to obtain and document informed consent for epidural labor analgesia: does simulation play a role?

    PubMed

    Antoniou, A; Marmai, K; Qasem, F; Cherry, R; Jones, P M; Singh, S

    2018-05-01

    Informed consent is required before placing an epidural. At our hospital, teaching of residents about this is done informally at the bedside. This study aimed to assess the ability of anesthesia residents to acquire and retain knowledge required when seeking informed consent for epidural labor analgesia. It assessed how well this knowledge was translated to clinical ability, by assessing the verbal consent process during an interaction with a standardized patient. Twenty anesthesia residents were randomized to a 'didactic group' or a 'simulation group'. Each resident was presented with a written scenario and asked to document the informed consent process, as they normally would do (pre-test). The didactic group then had a presentation about informed consent, while the simulation group members interviewed a simulated patient, the scenarios focusing on different aspects of consent. All residents then read a scenario and documented their informed consent process (post-test). Six weeks later all residents interviewed a standardized patient in labor and documented the consent from this interaction (six-week test). There was no significant difference in the baseline performance of the two groups. Both groups showed significant improvement in their written consent documentation at the immediate time point, the improvement in the didactic group being greater. The didactic group performed better at both the immediate time point and the six-week time point. In this small study, a didactic teaching method proved better than simulation-based teaching in helping residents to gain knowledge needed to obtain informed consent for epidural labor analgesia. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Multiscale Granger causality

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Nollo, Giandomenico; Stramaglia, Sebastiano; Marinazzo, Daniele

    2017-10-01

    In the study of complex physical and biological systems represented by multivariate stochastic processes, an issue of great relevance is the description of the system dynamics spanning multiple temporal scales. While methods to assess the dynamic complexity of individual processes at different time scales are well established, multiscale analysis of directed interactions has never been formalized theoretically, and empirical evaluations are complicated by practical issues such as filtering and downsampling. Here we extend the very popular measure of Granger causality (GC), a prominent tool for assessing directed lagged interactions between joint processes, to quantify information transfer across multiple time scales. We show that the multiscale processing of a vector autoregressive (AR) process introduces a moving average (MA) component, and describe how to represent the resulting ARMA process using state space (SS) models and to combine the SS model parameters for computing exact GC values at arbitrarily large time scales. We exploit the theoretical formulation to identify peculiar features of multiscale GC in basic AR processes, and demonstrate with numerical simulations the much larger estimation accuracy of the SS approach compared to pure AR modeling of filtered and downsampled data. The improved computational reliability is exploited to disclose meaningful multiscale patterns of information transfer between global temperature and carbon dioxide concentration time series, both in paleoclimate and in recent years.
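
    To make the quantity concrete, the sketch below computes ordinary Granger causality as the log ratio of restricted-to-full residual variances and applies it after naive block averaging of the series. This is exactly the filtering-and-downsampling shortcut whose limitations motivate the paper's state-space formulation; the AR order, coupling and averaging scheme are illustrative assumptions.

    ```python
    import numpy as np

    def lagmat(v, order, n):
        """Columns v[t-1], ..., v[t-order] for t = order, ..., n-1."""
        return np.column_stack([v[order - k:n - k] for k in range(1, order + 1)])

    def granger_causality(x, y, order=2):
        """GC from y to x: log ratio of residual variances of the restricted
        (own past of x) and full (own past of x plus past of y) linear models."""
        n, target = len(x), x[order:]
        restricted = lagmat(x, order, n)
        full = np.hstack([restricted, lagmat(y, order, n)])
        res_r = target - restricted @ np.linalg.lstsq(restricted, target, rcond=None)[0]
        res_f = target - full @ np.linalg.lstsq(full, target, rcond=None)[0]
        return float(np.log(res_r.var() / res_f.var()))

    def coarse_grain(v, scale):
        """Average non-overlapping blocks: the naive multiscale step that the
        state-space approach is designed to improve on."""
        m = (len(v) // scale) * scale
        return v[:m].reshape(-1, scale).mean(axis=1)

    # Illustrative use: y drives x with a lag of one sample.
    rng = np.random.default_rng(7)
    y = rng.standard_normal(4000)
    x = np.concatenate([[0.0], 0.8 * y[:-1]]) + 0.5 * rng.standard_normal(4000)
    for scale in (1, 2, 4):
        print(f"scale {scale}: GC(y->x) = {granger_causality(coarse_grain(x, scale), coarse_grain(y, scale)):.3f}")
    ```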

  6. The Impact of Storage on Processing: How Is Information Maintained in Working Memory?

    ERIC Educational Resources Information Center

    Vergauwe, Evie; Camos, Valérie; Barrouillet, Pierre

    2014-01-01

    Working memory is typically defined as a system devoted to the simultaneous maintenance and processing of information. However, the interplay between these 2 functions is still a matter of debate in the literature, with views ranging from complete independence to complete dependence. The time-based resource-sharing model assumes that a central…

  7. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools based on the Bayesian approach, entropy, information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
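
    Two of the quantities reviewed in the tutorial are easy to illustrate numerically: a discrete Bayes update and the Kullback-Leibler divergence between the resulting posterior and the prior. The prior and likelihood values below are invented for the example.

    ```python
    import numpy as np

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p || q) in nats for discrete distributions."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    prior = np.array([0.5, 0.3, 0.2])              # three candidate parameter values
    likelihood = np.array([0.10, 0.40, 0.70])      # p(data | parameter)
    posterior = likelihood * prior / np.sum(likelihood * prior)   # Bayes rule
    print("posterior:", np.round(posterior, 3))
    print("information gained D(posterior || prior) =", round(kl_divergence(posterior, prior), 3), "nats")
    ```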

  8. Temporal factors affecting somatosensory–auditory interactions in speech processing

    PubMed Central

    Ito, Takayuki; Gracco, Vincent L.; Ostry, David J.

    2014-01-01

    Speech perception is known to rely on both auditory and visual information. However, sound-specific somatosensory input has been shown also to influence speech perceptual processing (Ito et al., 2009). In the present study, we further examined the relationship between somatosensory information and speech perceptual processing by testing the hypothesis that the temporal relationship between orofacial movement and sound processing contributes to somatosensory–auditory interaction in speech perception. We examined the changes in event-related potentials (ERPs) in response to multisensory synchronous (simultaneous) and asynchronous (90 ms lag and lead) somatosensory and auditory stimulation compared to individual unisensory auditory and somatosensory stimulation alone. We used a robotic device to apply facial skin somatosensory deformations that were similar in timing and duration to those experienced in speech production. Following synchronous multisensory stimulation the amplitude of the ERP was reliably different from the two unisensory potentials. More importantly, the magnitude of the ERP difference varied as a function of the relative timing of the somatosensory–auditory stimulation. Event-related activity change due to stimulus timing was seen between 160 and 220 ms following somatosensory onset, mostly around the parietal area. The results demonstrate a dynamic modulation of somatosensory–auditory convergence and suggest that the contribution of somatosensory information to speech processing is dependent on the specific temporal order of sensory inputs in speech production. PMID:25452733

  9. Knowledge-Based Integrated Information Systems Engineering: Highlights and Bibliography. Volume 1.

    DTIC Science & Technology

    1987-12-01

    ...of database technology, communication technology and expert systems technology. Organizational issues cover the process of making controlled...process of linking strategic goals, technical issues, and organizational aspects can be depicted as shown in Figure 2.2. At the top level, strategic...an integrated information system design and implementation in a short period of time [4].

  10. General Recommendations on Fatigue Risk Management for the Canadian Forces

    DTIC Science & Technology

    2010-04-01

    ...missions performed in aviation require an individual(s) to process a large amount of information in a short period of time and to do this on a continuous...information processing required during sustained operations can deteriorate an individual's ability to perform a task. Given the high operational tempo...memory, which, in turn, is utilized to perform human thought processes (Baddeley, 2003). While various versions of this theory exist, they all share...

  11. Designing for Temporal Awareness: The Role of Temporality in Time-Critical Medical Teamwork

    PubMed Central

    Kusunoki, Diana S.; Sarcevic, Aleksandra

    2016-01-01

    This paper describes the role of temporal information in emergency medical teamwork and how time-based features can be designed to support the temporal awareness of clinicians in this fast-paced and dynamic environment. Engagement in iterative design activities with clinicians over the course of two years revealed a strong need for time-based features and mechanisms, including timestamps for tasks based on absolute time and automatic stopclocks measuring time by counting up since task performance. We describe in detail the aspects of temporal awareness central to clinicians’ awareness needs and then provide examples of how we addressed these needs through the design of a shared information display. As an outcome of this process, we define four types of time representation techniques to facilitate the design of time-based features: (1) timestamps based on absolute time, (2) timestamps relative to the process start time, (3) time since task performance, and (4) time until the next required task. PMID:27478880
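
    The four time representation techniques listed above map directly onto simple timestamp arithmetic; the sketch below uses hypothetical event times purely to show how each view of the same moment would be derived for a shared display.

    ```python
    from datetime import datetime, timedelta

    def time_views(now, process_start, last_task_done, next_task_due):
        """The four representations: absolute, relative to start, since last task,
        and until the next required task."""
        return {
            "absolute timestamp": now.strftime("%H:%M:%S"),
            "relative to process start": str(now - process_start),
            "time since last task": str(now - last_task_done),
            "time until next task": str(next_task_due - now),
        }

    start = datetime(2016, 7, 1, 14, 0, 0)          # hypothetical resuscitation start time
    now = start + timedelta(minutes=12, seconds=30)
    print(time_views(now, start,
                     last_task_done=start + timedelta(minutes=9),
                     next_task_due=start + timedelta(minutes=15)))
    ```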

  12. Foveal analysis and peripheral selection during active visual sampling

    PubMed Central

    Ludwig, Casimir J. H.; Davies, J. Rhys; Eckstein, Miguel P.

    2014-01-01

    Human vision is an active process in which information is sampled during brief periods of stable fixation in between gaze shifts. Foveal analysis serves to identify the currently fixated object and has to be coordinated with a peripheral selection process of the next fixation location. Models of visual search and scene perception typically focus on the latter, without considering foveal processing requirements. We developed a dual-task noise classification technique that enables identification of the information uptake for foveal analysis and peripheral selection within a single fixation. Human observers had to use foveal vision to extract visual feature information (orientation) from different locations for a psychophysical comparison. The selection of to-be-fixated locations was guided by a different feature (luminance contrast). We inserted noise in both visual features and identified the uptake of information by looking at correlations between the noise at different points in time and behavior. Our data show that foveal analysis and peripheral selection proceeded completely in parallel. Peripheral processing stopped some time before the onset of an eye movement, but foveal analysis continued during this period. Variations in the difficulty of foveal processing did not influence the uptake of peripheral information and the efficacy of peripheral selection, suggesting that foveal analysis and peripheral selection operated independently. These results provide important theoretical constraints on how to model target selection in conjunction with foveal object identification: in parallel and independently. PMID:24385588

  13. Whisper: Tracing the Spatiotemporal Process of Information Diffusion in Real Time.

    PubMed

    Cao, Nan; Lin, Yu-Ru; Sun, Xiaohua; Lazer, D; Liu, Shixia; Qu, Huamin

    2012-12-01

    When and where is an idea dispersed? Social media, like Twitter, has been increasingly used for exchanging information, opinions and emotions about events that are happening across the world. Here we propose a novel visualization design, "Whisper", for tracing the process of information diffusion in social media in real time. Our design highlights three major characteristics of diffusion processes in social media: the temporal trend, social-spatial extent, and community response of a topic of interest. Such social, spatiotemporal processes are conveyed based on a sunflower metaphor whose seeds are often dispersed far away. In Whisper, we summarize the collective responses of communities on a given topic based on how tweets were retweeted by groups of users, through representing the sentiments extracted from the tweets, and tracing the pathways of retweets on a spatial hierarchical layout. We use an efficient flux line-drawing algorithm to trace multiple pathways so the temporal and spatial patterns can be identified even for a bursty event. A focused diffusion series highlights key roles such as opinion leaders in the diffusion process. We demonstrate how our design facilitates the understanding of when and where a piece of information is dispersed and what the social responses of the crowd are, for large-scale events including political campaigns and natural disasters. Initial feedback from domain experts suggests promising use for today's information consumption and dispersion in the wild.

  14. Health Information Needs and Health Seeking Behavior During the 2014-2016 Ebola Outbreak: A Twitter Content Analysis.

    PubMed

    Odlum, Michelle; Yoon, Sunmoo

    2018-03-23

    For effective public communication during major disease outbreaks like the 2014-2016 Ebola epidemic, health information needs of the population must be adequately assessed. Through content analysis of social media data, like tweets, public health information needs can be effectively assessed and, in turn, appropriate health information can be provided to address such needs. The aim of the current study was to assess health information needs about Ebola, at distinct epidemic time points, through longitudinal tracking. Natural language processing was applied to explore public response to Ebola over time from July 2014 to March 2015. A total of 155,647 tweets (68,736 unique, 86,911 retweets) mentioning Ebola were analyzed and visualized with infographics. Public fear, frustration, and health information seeking regarding Ebola-related global priorities were observed across time. Our longitudinal content analysis revealed that due to ongoing health information deficiencies, resulting in fear and frustration, social media was at times an impediment and not a vehicle to support health information needs. Content analysis of tweets effectively assessed Ebola information needs. Our study also demonstrates the use of Twitter as a method for capturing real-time data to assess ongoing information needs, fear, and frustration over time.

  15. The influence of age, muscle strength and speed of information processing on recovery responses to external perturbations in gait.

    PubMed

    Senden, R; Savelberg, H H C M; Adam, J; Grimm, B; Heyligers, I C; Meijer, K

    2014-01-01

    Dynamic imbalance caused by external perturbations to gait can successfully be counteracted by adequate recovery responses. The current study investigated how the recovery response is moderated by age, walking speed, muscle strength and speed of information processing. The gait pattern of 50 young and 45 elderly subjects was repeatedly perturbed at 20% and 80% of the first half of the swing phase using the Timed Rapid impact Perturbation (TRiP) set-up. Recovery responses were identified using 2D cameras. Muscular factors (dynamometer) and speed of information processing parameters (computer-based reaction time task) were determined. The stronger, faster reacting and faster walking young subjects recovered more often by an elevating strategy than elderly subjects. Twenty three per cent of the differences in recovery responses were explained by a combination of walking speed (B=-13.85), reaction time (B=-0.82), maximum extension strength (B=0.01) and rate of extension moment development (B=0.19). The recovery response that subjects employed when gait was perturbed by the TRiP set-up was modified by several factors; the individual contribution of walking speed, muscle strength and speed of information processing was small. Insight into remaining modifying factors is needed to assist and optimise fall prevention programmes. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Smith predictor-based multiple periodic disturbance compensation for long dead-time processes

    NASA Astrophysics Data System (ADS)

    Tan, Fang; Li, Han-Xiong; Shen, Ping

    2018-05-01

    Many disturbance rejection methods have been proposed for processes with dead-time, but these existing methods may not work well under multiple periodic disturbances. In this paper, a multiple periodic disturbance rejection method is proposed under the Smith predictor configuration for processes with long dead-time. One feedback loop is added to compensate for periodic disturbances while retaining the advantage of the Smith predictor. With information on the disturbance spectrum, the added feedback loop can remove multiple periodic disturbances effectively. Robust stability can be maintained, as shown by rigorous analysis. Finally, simulation examples demonstrate the effectiveness and robustness of the proposed method for processes with long dead-time.
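
    As a rough sketch of the basic Smith predictor configuration that the paper builds on, the following simulates a first-order plant with dead time under PI control; the plant parameters and controller gains are assumptions, and the added feedback loop for multiple periodic disturbances described in the abstract is not reproduced here.

      import numpy as np

      # Discrete first-order plant with dead time: y[k+1] = a*y[k] + b*u[k-d]
      a, b, d = 0.95, 0.05, 20      # assumed plant parameters (d = dead time in samples)
      Kp, Ki = 2.0, 0.02            # assumed PI controller gains
      n, setpoint = 800, 1.0

      y  = np.zeros(n)              # measured plant output
      ym = np.zeros(n)              # delay-free internal model output
      u  = np.zeros(n)
      integ = 0.0

      for k in range(n - 1):
          ym_delayed = ym[k - d] if k >= d else 0.0
          # Smith predictor: control acts on the delay-free model output,
          # corrected by the mismatch between plant and delayed model output
          e = setpoint - (ym[k] + (y[k] - ym_delayed))
          integ += Ki * e
          u[k] = Kp * e + integ
          u_delayed = u[k - d] if k >= d else 0.0
          y[k + 1]  = a * y[k]  + b * u_delayed
          ym[k + 1] = a * ym[k] + b * u[k]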

  17. Measuring information processing in a client with extreme agitation following traumatic brain injury using the Perceive, Recall, Plan and Perform System of Task Analysis.

    PubMed

    Nott, Melissa T; Chapparo, Christine

    2008-09-01

    Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance was assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships of task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.

  18. Linear and Non-linear Information Flows In Rainfall Field

    NASA Astrophysics Data System (ADS)

    Molini, A.; La Barbera, P.; Lanza, L. G.

    The rainfall process is the result of a complex framework of non-linear dynamical interactions between the different components of the atmosphere. It preserves the complexity and the intermittent features of the generating system in space and time as well as the strong dependence of these properties on the scale of observations. The understanding and quantification of how the non-linearity of the generating process comes to influence single rain events constitute relevant research issues in the field of hydro-meteorology, especially in those applications where a timely and effective forecasting of heavy rain events is able to reduce the risk of failure. This work focuses on the characterization of the non-linear properties of the observed rain process and on the influence of these features on hydrological models. Among the goals of such a survey are the search for regular structures in the rainfall phenomenon and the study of the information flows within the rain field. The research focuses on three basic evolution directions for the system: in time, in space and between the different scales. In fact, the information flows that force the system to evolve represent in general a connection between the different locations in space, the different instants in time and, unless the hypothesis of scale invariance is verified a priori, the different characteristic scales. A first phase of the analysis is carried out by means of classic statistical methods, then a survey of the information flows within the field is developed by means of techniques borrowed from Information Theory, and finally an analysis of the rain signal in the time and frequency domains is performed, with particular reference to its intermittent structure. The methods adopted in this last part of the work are both the classic techniques of statistical inference and a few procedures for the detection of non-linear and non-stationary features within the process starting from measured data.

  19. A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories

    PubMed Central

    Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.

    2012-01-01

    We propose a regression-based hot deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event, and hence, the duration from onset to the final event. Simple approaches such as jumping gap times or dropping cases with gaps have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information in the gap from a matched individual with a completely recorded history in the corresponding interval. Predictive Mean Matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886
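
    A simplified sketch of regression-based hot-deck imputation with Predictive Mean Matching for a single partially observed variable; the paper imputes whole gaps in recurrent event histories, so this only illustrates the donor-matching idea, and the variable names and parameters are assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def pmm_impute(x, y, m=5, k=5):
          """Return m hot-deck completions of y (NaN = missing), matching on
          predicted means from a simple regression of y on x."""
          obs = ~np.isnan(y)
          completions = []
          for _ in range(m):
              # refit on a bootstrap sample of complete cases to propagate uncertainty
              idx = rng.choice(np.flatnonzero(obs), size=obs.sum(), replace=True)
              beta = np.polyfit(x[idx], y[idx], deg=1)
              pred = np.polyval(beta, x)
              y_new = y.copy()
              for i in np.flatnonzero(~obs):
                  # donors: observed cases whose predicted means are closest
                  donors = np.flatnonzero(obs)[np.argsort(np.abs(pred[obs] - pred[i]))[:k]]
                  y_new[i] = y[rng.choice(donors)]   # hot-deck draw of an observed value
              completions.append(y_new)
          return completions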

  20. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    NASA Astrophysics Data System (ADS)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on past error correction data. We find that, using these estimated error rates, the probability of error correction failures can be significantly reduced by a factor that increases with the code distance.
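
    A rough sketch, using scikit-learn, of the estimation step described above: fit a Gaussian process to a history of estimated error rates and predict the current rate with uncertainty. The kernel, the time units, and the synthetic observations are assumptions for illustration, not the protocol's actual choices.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Hypothetical error-rate estimates extracted from past error-correction rounds
      t_obs = np.linspace(0, 1000, 40)[:, None]            # time in QEC cycles
      p_obs = (0.01 + 0.004 * np.sin(t_obs.ravel() / 150.0)
               + 0.0005 * np.random.default_rng(0).normal(size=40))

      kernel = 1.0 * RBF(length_scale=200.0) + WhiteKernel(noise_level=1e-6)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t_obs, p_obs)

      # Predicted error rate (with uncertainty) for upcoming cycles, which a
      # decoder could use as an updated prior without pausing error correction
      p_mean, p_std = gp.predict(np.array([[1050.0], [1100.0]]), return_std=True)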

  1. Graded, Dynamically Routable Information Processing with Synfire-Gated Synfire Chains.

    PubMed

    Wang, Zhuo; Sornborger, Andrew T; Tao, Louis

    2016-06-01

    Coherent neural spiking and local field potentials are believed to be signatures of the binding and transfer of information in the brain. Coherent activity has now been measured experimentally in many regions of mammalian cortex. Recently experimental evidence has been presented suggesting that neural information is encoded and transferred in packets, i.e., in stereotypical, correlated spiking patterns of neural activity. Due to their relevance to coherent spiking, synfire chains are one of the main theoretical constructs that have been appealed to in order to describe coherent spiking and information transfer phenomena. However, for some time, it has been known that synchronous activity in feedforward networks asymptotically either approaches an attractor with fixed waveform and amplitude, or fails to propagate. This has limited the classical synfire chain's ability to explain graded neuronal responses. Recently, we have shown that pulse-gated synfire chains are capable of propagating graded information coded in mean population current or firing rate amplitudes. In particular, we showed that it is possible to use one synfire chain to provide gating pulses and a second, pulse-gated synfire chain to propagate graded information. We called these circuits synfire-gated synfire chains (SGSCs). Here, we present SGSCs in which graded information can rapidly cascade through a neural circuit, and show a correspondence between this type of transfer and a mean-field model in which gating pulses overlap in time. We show that SGSCs are robust in the presence of variability in population size, pulse timing and synaptic strength. Finally, we demonstrate the computational capabilities of SGSC-based information coding by implementing a self-contained, spike-based, modular neural circuit that is triggered by streaming input, processes the input, then makes a decision based on the processed information and shuts itself down.

  2. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    DTIC Science & Technology

    2016-05-12

    valued time series from a sample. (A practical algorithm to compute the estimator is a work in progress.) Third, finitely-valued spatial processes...mathematical statistics; time series; Markov chains; random...proved. Second, a statistical method is developed to estimate the memory depth of discrete-time and continuously-valued time series from a sample. (A

  3. Coarse-grained stochastic processes and kinetic Monte Carlo simulators for the diffusion of interacting particles

    NASA Astrophysics Data System (ADS)

    Katsoulakis, Markos A.; Vlachos, Dionisios G.

    2003-11-01

    We derive a hierarchy of successively coarse-grained stochastic processes and associated coarse-grained Monte Carlo (CGMC) algorithms directly from the microscopic processes as approximations in larger length scales for the case of diffusion of interacting particles on a lattice. This hierarchy of models spans length scales between microscopic and mesoscopic, satisfies detailed balance, and gives self-consistent fluctuation mechanisms whose noise is asymptotically identical to the microscopic MC. Rigorous, detailed asymptotics justify and clarify these connections. Gradient continuous time microscopic MC and CGMC simulations are compared under far from equilibrium conditions to illustrate the validity of our theory and delineate the errors obtained by rigorous asymptotics. Information theory estimates are employed for the first time to provide rigorous error estimates between the solutions of microscopic MC and CGMC, describing the loss of information during the coarse-graining process. Simulations under periodic boundary conditions are used to verify the information theory error estimates. It is shown that coarse-graining in space leads also to coarse-graining in time by a factor of q², where q is the level of coarse-graining, and overcomes in part the hydrodynamic slowdown. Operation counting and CGMC simulations demonstrate significant CPU savings in continuous time MC simulations that vary from q³ for short potentials to q⁴ for long potentials. Finally, connections of the new coarse-grained stochastic processes to stochastic mesoscopic and Cahn-Hilliard-Cook models are made.

  4. Temporal Information Partitioning Networks (TIPNets): A process network approach to infer ecohydrologic shifts

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    In an ecohydrologic system, components of atmospheric, vegetation, and root-soil subsystems participate in forcing and feedback interactions at varying time scales and intensities. The structure of this network of complex interactions varies in terms of connectivity, strength, and time scale due to perturbations or changing conditions such as rainfall, drought, or land use. However, characterization of these interactions is difficult due to multivariate and weak dependencies in the presence of noise, nonlinearities, and limited data. We introduce a framework for Temporal Information Partitioning Networks (TIPNets), in which time-series variables are viewed as nodes, and lagged multivariate mutual information measures are links. These links are partitioned into synergistic, unique, and redundant information components, where synergy is information provided only jointly, unique information is only provided by a single source, and redundancy is overlapping information. We construct TIPNets from 1 min weather station data over several hour time windows. From a comparison of dry, wet, and rainy conditions, we find that information strengths increase when solar radiation and surface moisture are present, and surface moisture and wind variability are redundant and synergistic influences, respectively. Over a growing season, network trends reveal patterns that vary with vegetation and rainfall patterns. The framework presented here enables us to interpret process connectivity in a multivariate context, which can lead to better inference of behavioral shifts due to perturbations in ecohydrologic systems. This work contributes to more holistic characterizations of system behavior, and can benefit a wide variety of studies of complex systems.
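
    The links in such a network are built from lagged mutual information between time-series variables. The sketch below shows only the simplest ingredient, a histogram estimate of lagged pairwise mutual information; the partitioning into synergistic, unique, and redundant components is not shown, and the bin count and example variable names are assumptions.

      import numpy as np

      def lagged_mutual_info(source, target, lag, bins=10):
          """Histogram estimate of I(source(t - lag); target(t)) in bits."""
          s = source[:-lag] if lag > 0 else source
          x = target[lag:] if lag > 0 else target
          pxy, _, _ = np.histogram2d(s, x, bins=bins)
          pxy /= pxy.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

      # e.g. information transferred from wind speed to latent heat flux at a 5-minute lag
      # mi = lagged_mutual_info(wind_speed, latent_heat_flux, lag=5)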

  5. Different Electrophysiological Responses to Informative Value of Feedback Between Children and Adults.

    PubMed

    Du, Bin; Cao, Bihua; He, Weiqi; Li, Fuhong

    2018-01-01

    The ability to learn from feedback is important for children's adaptive behavior and school learning. Feedback has two main components, informative value and valence. How to disentangle these two components, and what the developmental neural correlates of using the informative value of feedback are, is still an open question. In this study, 23 children (7-10 years old) and 19 adults (19-22 years old) were asked to perform a rule induction task, in which they were required to find a rule based on the informative value of feedback. Behavioral results indicated that the likelihood of correct searching behavior under negative feedback was low for children. Event-related potentials showed that (1) the effect of valence was processed in a wide time window, particularly in the N2 component; (2) the encoding process of the informative value of negative feedback began later for children than for adults; (3) a clear P300 was observed for adults; for children, however, the P300 was absent in the frontal region; and (4) children processed the informative value of feedback chiefly at left sites during the P300 time window, whereas adults did not show this laterality. These results suggested that children were less sensitive to the informative value of negative feedback, possibly because of brain immaturity.

  6. The Characteristics and Limits of Rapid Visual Categorization

    PubMed Central

    Fabre-Thorpe, Michèle

    2011-01-01

    Visual categorization appears both effortless and virtually instantaneous. The study by Thorpe et al. (1996) was the first to estimate the processing time necessary to perform fast visual categorization of animals in briefly flashed (20 ms) natural photographs. They observed a large differential EEG activity between target and distracter correct trials that developed from 150 ms after stimulus onset, a value that was later shown to be even shorter in monkeys! With such strong processing time constraints, it was difficult to escape the conclusion that rapid visual categorization was relying on massively parallel, essentially feed-forward processing of visual information. Since 1996, we have conducted a large number of studies to determine the characteristics and limits of fast visual categorization. The present chapter will review some of the main results obtained. I will argue that rapid object categorizations in natural scenes can be done without focused attention and are most likely based on coarse and unconscious visual representations activated with the first available (magnocellular) visual information. Fast visual processing proved efficient for the categorization of large superordinate object or scene categories, but shows its limits when more detailed basic representations are required. The representations for basic objects (dogs, cars) or scenes (mountain or sea landscapes) need additional processing time to be activated. This finding is at odds with the widely accepted idea that such basic representations are at the entry level of the system. Interestingly, focused attention is still not required to perform these time consuming basic categorizations. Finally we will show that object and context processing can interact very early in an ascending wave of visual information processing. We will discuss how such data could result from our experience with a highly structured and predictable surrounding world that shaped neuronal visual selectivity. PMID:22007180

  7. Real-time EEG-based detection of fatigue driving danger for accident prediction.

    PubMed

    Wang, Hong; Zhang, Chi; Shi, Tianwei; Wang, Fuwang; Ma, Shujun

    2015-03-01

    This paper proposes a real-time electroencephalogram (EEG)-based detection method for potential danger during fatigue driving. To determine driver fatigue in real time, wavelet entropy with a sliding window and a pulse coupled neural network (PCNN) were used to process the EEG signals in the visual area (the main information input route). To detect fatigue-related danger, the neural mechanism of driver fatigue was analyzed. Functional brain networks were employed to track the impact of fatigue on the processing capacity of the brain. The results show that the overall functional connectivity of the subjects is weakened after long driving tasks. This regularity is summarized as the fatigue convergence phenomenon. Based on the fatigue convergence phenomenon, we combined the input and global synchronization of the brain to calculate the residual information processing capacity of the brain and obtain the dangerous points in real time. Finally, the danger detection system for driver fatigue based on the neural mechanism was validated using accident EEG. The time distributions of the danger points output by the system agree well with those of the real accident points.
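
    A minimal sketch of one plausible reading of the sliding-window wavelet entropy feature mentioned above, using PyWavelets; the wavelet family, decomposition level, and window length are assumptions, and the PCNN stage is not included.

      import numpy as np
      import pywt

      def sliding_wavelet_entropy(eeg, fs, win_s=2.0, wavelet="db4", level=4):
          """Entropy of the relative wavelet energy per decomposition level,
          computed in non-overlapping windows of win_s seconds."""
          win = int(win_s * fs)
          entropies = []
          for start in range(0, len(eeg) - win + 1, win):
              coeffs = pywt.wavedec(eeg[start:start + win], wavelet, level=level)
              energy = np.array([np.sum(c ** 2) for c in coeffs])
              p = energy / energy.sum()
              entropies.append(float(-np.sum(p * np.log(p + 1e-12))))
          return np.array(entropies)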

  8. Vocal acoustic analysis as a biometric indicator of information processing: implications for neurological and psychiatric disorders.

    PubMed

    Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C

    2015-03-30

    Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, vocal expression is a potentially important biometric index of information processing, not only across but also within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual attention experimental task where participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and noninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Temporal information processing in short- and long-term memory of patients with schizophrenia.

    PubMed

    Landgraf, Steffen; Steingen, Joerg; Eppert, Yvonne; Niedermeyer, Ulrich; van der Meer, Elke; Krueger, Frank

    2011-01-01

    Cognitive deficits of patients with schizophrenia have been largely recognized as core symptoms of the disorder. One neglected factor that contributes to these deficits is the comprehension of time. In the present study, we assessed temporal information processing and manipulation from short- and long-term memory in 34 patients with chronic schizophrenia and 34 matched healthy controls. On the short-term memory temporal-order reconstruction task, an incidental or intentional learning strategy was deployed. Patients showed worse overall performance than healthy controls. The intentional learning strategy led to dissociable performance improvement in both groups. Whereas healthy controls improved on a performance measure (serial organization), patients improved on an error measure (inappropriate semantic clustering) when using the intentional instead of the incidental learning strategy. On the long-term memory script-generation task, routine and non-routine events of everyday activities (e.g., buying groceries) had to be generated in either chronological or inverted temporal order. Patients were slower than controls at generating events in the chronological routine condition only. They also committed more sequencing and boundary errors in the inverted conditions. The number of irrelevant events was higher in patients in the chronological, non-routine condition. These results suggest that patients with schizophrenia imprecisely access temporal information from short- and long-term memory. In short-term memory, processing of temporal information led to a reduction in errors rather than, as was the case in healthy controls, to an improvement in temporal-order recall. When accessing temporal information from long-term memory, patients were slower and committed more sequencing, boundary, and intrusion errors. Together, these results suggest that time information can be accessed and processed only imprecisely by patients who provide evidence for impaired time comprehension. This could contribute to symptomatic cognitive deficits and strategic inefficiency in schizophrenia.

  10. Preparing routine health information systems for immediate health responses to disasters

    PubMed Central

    Aung, Eindra; Whittaker, Maxine

    2013-01-01

    During disaster times, we need specific information to rapidly plan a disaster response, especially in sudden-onset disasters. Due to the inadequate capacity of Routine Health Information Systems (RHIS), many developing countries face a lack of quality pre-disaster health-related data and of efficient post-disaster data processes in the immediate aftermath of a disaster. Considering the significance of local capacity during the early stages of disaster response, RHIS at local, provincial/state and national levels need to be strengthened so that they provide relief personnel with up-to-date information to plan, organize and monitor immediate relief activities. RHIS professionals should be aware of specific information needs in disaster response (according to the Sphere Project’s Humanitarian Minimum Standards) and requirements in data processes to fulfil those information needs. Preparing RHIS for disasters can be guided by key RHIS-strengthening frameworks, and disaster preparedness must be incorporated into countries’ RHIS. Mechanisms must be established in non-disaster times and maintained between RHIS and information systems of non-health sectors for exchanging disaster-related information and sharing technologies and cost. PMID:23002249

  11. 37 CFR 102.6 - Time limits and expedited processing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... requests. Refusal to reasonably modify the scope of a request or arrange an alternate time frame may affect... imminent threat to the life or physical safety of an individual; (ii) The loss of substantial due process... the Government's integrity that affect public confidence; or (iv) An urgency to inform the public...

  12. Serious games for elderly continuous monitoring.

    PubMed

    Lemus-Zúñiga, Lenin-G; Navarro-Pardo, Esperanza; Moret-Tatay, Carmen; Pocinho, Ricardo

    2015-01-01

    Information technology (IT) and serious games allow the older population to remain independent for longer. Hence, when designing technology for this population, developmental changes, such as attention and/or perception, should be considered. For instance, a crucial developmental change has been related to cognitive speed in terms of reaction time (RT). However, this variable presents a skewed distribution that makes data analysis difficult. An alternative strategy is to fit the data to an ex-Gaussian function. Furthermore, this procedure provides different parameters that have been related to underlying cognitive processes in the literature. Another issue to be considered is optimal data recording, storage and processing. For that purpose mobile devices (smartphones and tablets) are a good option for targeting serious games where valuable information can be stored (time spent in the application, reaction time, frequency of use, and a long etcetera). The data stored inside the smartphones and tablets can be sent to a central computer (cloud storage) to store the collected data, not only to fit the distribution of reaction times to mathematical functions, but also to estimate parameters which may reflect cognitive processes underlying language, aging, and decisional processes.
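
    A small sketch of characterizing a reaction-time distribution with an ex-Gaussian function using SciPy's exponnorm distribution; the synthetic reaction times are illustrative, and the conversion to the conventional (mu, sigma, tau) parameters follows SciPy's parameterization.

      import numpy as np
      from scipy.stats import exponnorm

      # Hypothetical reaction times (seconds) logged by a serious game
      rng = np.random.default_rng(1)
      rt = rng.normal(0.45, 0.05, 500) + rng.exponential(0.15, 500)

      # SciPy parameterizes the ex-Gaussian as exponnorm(K, loc, scale), with
      # mu = loc, sigma = scale, and tau = K * scale
      K, loc, scale = exponnorm.fit(rt)
      mu, sigma, tau = loc, scale, K * scale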

  13. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    PubMed

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods have not taken into account the processing of uncertain information. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which has the ability to deal with uncertain information in the process of specific emitter identification. In this paper, radars each generate a group of evidence based on the information they obtain, and our main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between bodies of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and produce a reasonable recognition result.
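
    A minimal sketch of Dempster's rule of combination, the core operation behind the evidence fusion described above; the emitter hypotheses and mass values are invented, and the paper's correlation-coefficient weighting and quantum mechanical treatment of radar parameters are not reproduced here.

      def dempster_combine(m1, m2):
          """Combine two mass functions (dicts mapping frozensets of emitter
          hypotheses to belief mass) with Dempster's rule."""
          combined, conflict = {}, 0.0
          for a, ma in m1.items():
              for b, mb in m2.items():
                  inter = a & b
                  if inter:
                      combined[inter] = combined.get(inter, 0.0) + ma * mb
                  else:
                      conflict += ma * mb
          return {h: v / (1.0 - conflict) for h, v in combined.items()}

      # two radars reporting evidence over candidate emitters {A, B, C}
      m_radar1 = {frozenset("A"): 0.6, frozenset("AB"): 0.3, frozenset("ABC"): 0.1}
      m_radar2 = {frozenset("A"): 0.5, frozenset("B"): 0.3, frozenset("ABC"): 0.2}
      print(dempster_combine(m_radar1, m_radar2))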

  14. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research. How to make full use of subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method with subjective and objective information is proposed. We use a combination weighting method to determine the index weights. The analytic hierarchy process method is applied to handle the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. The linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
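
    A rough sketch of the overall computation the abstract implies: combine subjective and objective criterion weights, apply time weights that favor the present over the past, and aggregate with a linear weighted average. The weight vectors, the combination coefficient, and the particular time-weighting scheme are assumptions for illustration.

      import numpy as np

      # Hypothetical scores[t, i, j]: alternative i on criterion j at time period t
      scores = np.random.default_rng(2).random((4, 3, 5))

      w_subjective = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # e.g. from AHP judgments
      w_objective  = np.array([0.22, 0.18, 0.20, 0.25, 0.15])  # e.g. from CRITIC
      alpha = 0.5                                              # assumed combination coefficient
      w_index = alpha * w_subjective + (1 - alpha) * w_objective
      w_index /= w_index.sum()

      # Time weights "esteeming the present over the past": later periods weigh more
      t = np.arange(1, scores.shape[0] + 1, dtype=float)
      w_time = t / t.sum()

      per_period = scores @ w_index      # linear weighted average within each period
      overall    = w_time @ per_period   # dynamic evaluation score per alternative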

  15. A comparison of visuomotor cue integration strategies for object placement and prehension.

    PubMed

    Greenwald, Hal S; Knill, David C

    2009-01-01

    Visual cue integration strategies are known to depend on cue reliability and how rapidly the visual system processes incoming information. We investigated whether these strategies also depend on differences in the information demands for different natural tasks. Using two common goal-oriented tasks, prehension and object placement, we determined whether monocular and binocular information influence estimates of three-dimensional (3D) orientation differently depending on task demands. Both tasks rely on accurate 3D orientation estimates, but 3D position is potentially more important for grasping. Subjects placed an object on or picked up a disc in a virtual environment. On some trials, the monocular cues (aspect ratio and texture compression) and binocular cues (e.g., binocular disparity) suggested slightly different 3D orientations for the disc; these conflicts either were present upon initial stimulus presentation or were introduced after movement initiation, which allowed us to quantify how information from the cues accumulated over time. We analyzed the time-varying orientations of subjects' fingers in the grasping task and those of the object in the object placement task to quantify how different visual cues influenced motor control. In the first experiment, different subjects performed each task, and those performing the grasping task relied on binocular information more when orienting their hands than those performing the object placement task. When subjects in the second experiment performed both tasks in interleaved sessions, binocular cues were still more influential during grasping than object placement, and the different cue integration strategies observed for each task in isolation were maintained. In both experiments, the temporal analyses showed that subjects processed binocular information faster than monocular information, but task demands did not affect the time course of cue processing. How one uses visual cues for motor control depends on the task being performed, although how quickly the information is processed appears to be task invariant.

  16. FPGA implementation of sparse matrix algorithm for information retrieval

    NASA Astrophysics Data System (ADS)

    Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio

    2005-06-01

    Text information retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing may accommodate frequent hardware modifications through its tailorable hardware and exploits parallelism for a given application through reconfigurable and flexible hardware units. The degree of parallelism can be tuned for the data. In this work we implemented the standard BLAS (basic linear algebra subprogram) sparse matrix algorithm named Compressed Sparse Row (CSR), which is shown to be more efficient in terms of storage space requirements and query-processing time than other sparse matrix algorithms for information retrieval applications. Although the inverted index algorithm has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure has gained attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency gains over the sequential inverted index. Parallel implementations of the information retrieval kernel are presented in this work targeting the Xilinx Virtex-II Field Programmable Gate Array (FPGA) board. A recent development in scientific applications is the use of FPGAs to achieve high performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
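
    A toy illustration of the query-processing kernel described above, expressed as a sparse matrix-vector multiplication over a CSR term-document matrix (here with SciPy on a CPU rather than on the FPGA); the matrix entries and the query weights are made up.

      import numpy as np
      from scipy.sparse import csr_matrix

      # Term-document matrix in CSR form (rows: documents, columns: terms),
      # with entries such as tf-idf weights
      docs_terms = csr_matrix(np.array([
          [0.0, 1.2, 0.0, 0.4],
          [0.7, 0.0, 0.0, 0.0],
          [0.0, 0.3, 0.9, 0.0],
      ]))

      # Scoring every document against a query vector over the same vocabulary
      # is a single sparse matrix-vector multiplication
      query = np.array([0.0, 1.0, 0.5, 0.0])
      scores = docs_terms.dot(query)
      ranking = np.argsort(-scores)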

  17. High School Students' Use of Paper-Based and Internet-Based Information Sources in the Engineering Design Process

    ERIC Educational Resources Information Center

    Pieper, Jon; Mentzer, Nathan

    2013-01-01

    Mentzer and Becker (2011) and Becker and Mentzer (2012) demonstrated that high school students engaged in engineering design problems spent more time accessing information and spent more time designing when provided with Internet access. They studied high school students engaged in an engineering design challenge. The two studies attempted to…

  18. 42 CFR 447.45 - Timely claims payment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... implement an automated claims processing and information retrieval system. (2) The agency's request for a... additional information from the provider of the service or from a third party. It includes a claim with...

  19. 42 CFR 447.45 - Timely claims payment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... implement an automated claims processing and information retrieval system. (2) The agency's request for a... additional information from the provider of the service or from a third party. It includes a claim with...

  20. 42 CFR 447.45 - Timely claims payment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... implement an automated claims processing and information retrieval system. (2) The agency's request for a... additional information from the provider of the service or from a third party. It includes a claim with...

  1. 42 CFR 447.45 - Timely claims payment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... implement an automated claims processing and information retrieval system. (2) The agency's request for a... additional information from the provider of the service or from a third party. It includes a claim with...

  2. Bottlenecks of Motion Processing during a Visual Glance: The Leaky Flask Model

    PubMed Central

    Öğmen, Haluk; Ekiz, Onur; Huynh, Duong; Bedell, Harold E.; Tripathy, Srimant P.

    2013-01-01

    Where do the bottlenecks for information and attention lie when our visual system processes incoming stimuli? The human visual system encodes the incoming stimulus and transfers its contents into three major memory systems with increasing time scales, viz., sensory (or iconic) memory, visual short-term memory (VSTM), and long-term memory (LTM). It is commonly believed that the major bottleneck of information processing resides in VSTM. In contrast to this view, we show major bottlenecks for motion processing prior to VSTM. In the first experiment, we examined bottlenecks at the stimulus encoding stage through a partial-report technique by delivering the cue immediately at the end of the stimulus presentation. In the second experiment, we varied the cue delay to investigate sensory memory and VSTM. Performance decayed exponentially as a function of cue delay and we used the time-constant of the exponential-decay to demarcate sensory memory from VSTM. We then decomposed performance in terms of quality and quantity measures to analyze bottlenecks along these dimensions. In terms of the quality of information, two thirds to three quarters of the motion-processing bottleneck occurs in stimulus encoding rather than memory stages. In terms of the quantity of information, the motion-processing bottleneck is distributed, with the stimulus-encoding stage accounting for one third of the bottleneck. The bottleneck for the stimulus-encoding stage is dominated by the selection compared to the filtering function of attention. We also found that the filtering function of attention is operating mainly at the sensory memory stage in a specific manner, i.e., influencing only quantity and sparing quality. These results provide a novel and more complete understanding of information processing and storage bottlenecks for motion processing. PMID:24391806
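
    A small sketch of the analysis described above for demarcating sensory memory from VSTM: fit an exponential decay to partial-report performance as a function of cue delay and read off its time constant. The delays, performance values, and starting guesses below are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      delays = np.array([0, 50, 100, 200, 400, 800, 1600], dtype=float)  # cue delay (ms)
      perf   = np.array([0.92, 0.85, 0.78, 0.66, 0.55, 0.49, 0.47])      # proportion correct

      def decay(t, a, tau, b):
          # a: sensory-memory contribution, tau: its time constant (ms),
          # b: asymptote attributed to VSTM
          return a * np.exp(-t / tau) + b

      (a, tau, b), _ = curve_fit(decay, delays, perf, p0=(0.5, 300.0, 0.5))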

  3. Bottlenecks of motion processing during a visual glance: the leaky flask model.

    PubMed

    Öğmen, Haluk; Ekiz, Onur; Huynh, Duong; Bedell, Harold E; Tripathy, Srimant P

    2013-01-01

    Where do the bottlenecks for information and attention lie when our visual system processes incoming stimuli? The human visual system encodes the incoming stimulus and transfers its contents into three major memory systems with increasing time scales, viz., sensory (or iconic) memory, visual short-term memory (VSTM), and long-term memory (LTM). It is commonly believed that the major bottleneck of information processing resides in VSTM. In contrast to this view, we show major bottlenecks for motion processing prior to VSTM. In the first experiment, we examined bottlenecks at the stimulus encoding stage through a partial-report technique by delivering the cue immediately at the end of the stimulus presentation. In the second experiment, we varied the cue delay to investigate sensory memory and VSTM. Performance decayed exponentially as a function of cue delay and we used the time-constant of the exponential-decay to demarcate sensory memory from VSTM. We then decomposed performance in terms of quality and quantity measures to analyze bottlenecks along these dimensions. In terms of the quality of information, two thirds to three quarters of the motion-processing bottleneck occurs in stimulus encoding rather than memory stages. In terms of the quantity of information, the motion-processing bottleneck is distributed, with the stimulus-encoding stage accounting for one third of the bottleneck. The bottleneck for the stimulus-encoding stage is dominated by the selection compared to the filtering function of attention. We also found that the filtering function of attention is operating mainly at the sensory memory stage in a specific manner, i.e., influencing only quantity and sparing quality. These results provide a novel and more complete understanding of information processing and storage bottlenecks for motion processing.

  4. Effects of Emotional Valence and Arousal on Time Perception

    PubMed Central

    Van Volkinburg, Heather; Balsam, Peter

    2016-01-01

    We examined the influence of emotional arousal and valence on estimating time intervals. A reproduction task was used in which images from the International Affective Picture System served as the stimuli to be timed. Experiment 1 assessed the effects of positive and negative valence at a moderate arousal level and Experiment 2 replicated Experiment 1 with the addition of a high arousal condition. Overestimation increased as a function of arousal during encoding of times regardless of valence. For images presented during reproduction, overestimation occurred at the moderate arousal level for positive and negative valence but underestimation occurred in the negative valence high arousal condition. The overestimation of time intervals produced by emotional arousal during encoding and during reproduction suggests that emotional stimuli affect temporal information processing in a qualitatively different way during different phases of temporal information processing. PMID:27110491

  5. Parallel program debugging with flowback analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Jongdeok.

    1989-01-01

    This thesis describes the design and implementation of an integrated debugging system for parallel programs running on shared memory multi-processors. The goal of the debugging system is to present to the programmer a graphical view of the dynamic program dependences while keeping the execution-time overhead low. The author first describes the use of flowback analysis to provide information on causal relationships between events in a program's execution without re-executing the program for debugging. Execution time overhead is kept low by recording only a small amount of trace during a program's execution. He uses semantic analysis and a technique called incremental tracing to keep the time and space overhead low. As part of the semantic analysis, he uses a static program dependence graph structure that reduces the amount of work done at compile time and takes advantage of the dynamic information produced during execution time. The cornerstone of the incremental tracing concept is to generate a coarse trace during execution and fill incrementally, during the interactive portion of the debugging session, the gap between the information gathered in the coarse trace and the information needed to do the flowback analysis using the coarse trace. Then, he describes how to extend the flowback analysis to parallel programs. The flowback analysis can span process boundaries; i.e., the most recent modification to a shared variable might be traced to a different process than the one that contains the current reference. The static and dynamic program dependence graphs of the individual processes are tied together with synchronization and data dependence information to form complete graphs that represent the entire program.

  6. Tracing the time course of picture--word processing.

    PubMed

    Smith, M C; Magee, L E

    1980-12-01

    A number of independent lines of research have suggested that semantic and articulatory information become available differentially from pictures and words. The first of the experiments reported here sought to clarify the time course by which information about pictures and words becomes available by considering the pattern of interference generated when incongruent pictures and words are presented simultaneously in a Stroop-like situation. Previous investigators report that picture naming is easily disrupted by the presence of a distracting word but that word naming is relatively immune to interference from an incongruent picture. Under the assumption that information available from a completed process may disrupt an ongoing process, these results suggest that words access articulatory information more rapidly than do pictures. Experiment 1 extended this paradigm by requiring subjects to verify the category of the target stimulus. In accordance with the hypothesis that pictures access the semantic code more rapidly than words, there was a reversal in the interference pattern: Word categorization suffered considerable disruption, whereas picture categorization was minimally affected by the presence of an incongruent word. Experiment 2 sought to further test the hypothesis that access to semantic and articulatory codes is different for pictures and words by examining memory for those items following naming or categorization. Categorized words were better recognized than named words, whereas the reverse was true for pictures, a result which suggests that picture naming involves more extensive processing than picture categorization. Experiment 3 replicated this result under conditions in which viewing time was held constant. The last experiment extended the investigation of memory differences to a situation in which subjects were required to generate the superordinate category name. Here, memory for categorized pictures was as good as memory for named pictures. Category generation also influenced memory for words, memory performance being superior to that following a yes--no verification of category membership. These experiments suggest a model of information access whereby pictures access semantic information more readily than name information, with the reverse being true for words. Memory for both pictures and words was a function of the amount of processing required to access a particular type of information as well as the extent of response differentiation necessitated by the task.

  7. Construction and updating of event models in auditory event processing.

    PubMed

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Current theories propose that changes in the sensory information trigger updating of the present event model. Increased encoding effort ultimately leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. Speed in Information Processing with a Computer Driven Visual Display in a Real-time Digital Simulation. M.S. Thesis - Virginia Polytechnic Inst.

    NASA Technical Reports Server (NTRS)

    Kyle, R. G.

    1972-01-01

    Information transfer between the operator and computer-generated display systems is an area where the human factors engineer discovers little useful design data relating human performance to system effectiveness. This study utilized a computer-driven, cathode-ray-tube graphic display to quantify human response speed in a sequential information processing task. The performance criterion was response time to sixteen cell elements of a square matrix display. A stimulus signal instruction specified selected cell locations by both row and column identification. An equally probable number code, from one to four, was assigned at random to the sixteen cells of the matrix and correspondingly required one of four matched keyed-response alternatives. The display format corresponded to a sequence of diagnostic system maintenance events that enabled the operator to verify prime system status, engage backup redundancy for failed subsystem components, and exercise alternate decision-making judgements. The experimental task bypassed the skilled decision-making element and computer processing time, in order to determine a lower bound on the basic response speed for the given stimulus/response hardware arrangement.

  9. Which University? A Study of the Influence of Cost and Information Factors on Scottish Undergraduate Choice

    ERIC Educational Resources Information Center

    Briggs, Senga; Wilson, Alex

    2007-01-01

    At a time when higher education institutions (HEIs) around the globe face declining student numbers and decreasing funding grants, it becomes imperative for those involved in the recruitment process to understand the factors utilized by students in the search process. This paper explores the influence of two such factors: Information Supplied by…

  10. Examining Factors Associated with (In)Stability in Social Information Processing among Urban School Children: A Latent Transition Analytic Approach

    ERIC Educational Resources Information Center

    Goldweber, Asha; Bradshaw, Catherine P.; Goodman, Kimberly; Monahan, Kathryn; Cooley-Strickland, Michele

    2011-01-01

    There is compelling evidence for the role of social information processing (SIP) in aggressive behavior. However, less is known about factors that influence stability versus instability in patterns of SIP over time. Latent transition analysis was used to identify SIP patterns over one year and examine how community violence exposure, aggressive…

  11. The Influence of Teacher Perceived Administration of Self-Regulated Learning on Students' Motivation and Information-Processing

    ERIC Educational Resources Information Center

    Rozendaal, J.S.; Minnaert, A.; Boekaerts, M.

    2005-01-01

    This study investigates the influence of teacher perceived administration of self-regulated learning on students' motivation and information-processing over time. This was done in the context of the Interactive Learning group System (ILS®): a large-scale innovation program in Dutch vocational schools. A total of 185 students were grouped post…

  12. Integrated Information Support System (IISS). Volume 8. User Interface Subsystem. Part 3. User Interface Services Product Specification.

    DTIC Science & Technology

    1985-11-01

    User Interface that consists of a set of callable execution time routines available to an application program for form processing. IISS Function Screen...provisions for test consists of the normal testing techniques that are accomplished during the construction process. They consist of design and code...application presents a form to the user which must be filled in with information for processing by that application. The application then

  13. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  14. Effects of Sequences of Cognitions on Group Performance Over Time

    PubMed Central

    Molenaar, Inge; Chiu, Ming Ming

    2017-01-01

    Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions. PMID:28490854

  15. Effects of Sequences of Cognitions on Group Performance Over Time.

    PubMed

    Molenaar, Inge; Chiu, Ming Ming

    2017-04-01

    Extending past research showing that sequences of low cognitions (low-level processing of information) and high cognitions (high-level processing of information through questions and elaborations) influence the likelihoods of subsequent high and low cognitions, this study examines whether sequences of cognitions are related to group performance over time; 54 primary school students (18 triads) discussed and wrote an essay about living in another country (32,375 turns of talk). Content analysis and statistical discourse analysis showed that within each lesson, groups with more low cognitions or more sequences of low cognition followed by high cognition added more essay words. Groups with more high cognitions, sequences of low cognition followed by low cognition, or sequences of high cognition followed by an action followed by low cognition, showed different words and sequences, suggestive of new ideas. The links between cognition sequences and group performance over time can inform facilitation and assessment of student discussions.

  16. Hardware for dynamic quantum computing.

    PubMed

    Ryan, Colm A; Johnson, Blake R; Ristè, Diego; Donovan, Brian; Ohki, Thomas A

    2017-10-01

    We describe the hardware, gateware, and software developed at Raytheon BBN Technologies for dynamic quantum information processing experiments on superconducting qubits. In dynamic experiments, real-time qubit state information is fed back or fed forward within a fraction of the qubits' coherence time to dynamically change the implemented sequence. The hardware presented here covers both control and readout of superconducting qubits. For readout, we created a custom signal processing gateware and software stack on commercial hardware to convert pulses in a heterodyne receiver into qubit state assignments with minimal latency, alongside data taking capability. For control, we developed custom hardware with gateware and software for pulse sequencing and steering information distribution that is capable of arbitrary control flow in a fraction of superconducting qubit coherence times. Both readout and control platforms make extensive use of field programmable gate arrays to enable tailored qubit control systems in a reconfigurable fabric suitable for iterative development.
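
    The readout path described above reduces, in essence, to digital heterodyne demodulation followed by a state-assignment threshold. The sketch below is a software illustration of that signal chain only, not the BBN gateware; the sample rate, intermediate frequency, integration kernel, and threshold are assumed values.

```python
# Minimal sketch (not the BBN gateware): a software model of the readout chain the
# abstract describes: demodulate a heterodyne pulse record to a single I/Q point and
# threshold it into a qubit state assignment. Sample rate, IF, and threshold are assumed.
import numpy as np

FS = 1.0e9          # sample rate in Hz (assumed)
F_IF = 50.0e6       # intermediate frequency in Hz (assumed)

def demodulate(record: np.ndarray) -> complex:
    """Digitally mix the record down to baseband and integrate to a single I/Q point."""
    t = np.arange(record.size) / FS
    lo = np.exp(-2j * np.pi * F_IF * t)      # digital local oscillator
    return np.mean(record * lo)              # boxcar integration kernel

def assign_state(iq: complex, threshold: float = 0.0) -> int:
    """Threshold the I quadrature into a 0/1 state assignment."""
    return int(iq.real > threshold)

# Toy usage: a noisy IF tone whose amplitude encodes the qubit state.
rng = np.random.default_rng(0)
t = np.arange(2048) / FS
pulse = 0.8 * np.cos(2 * np.pi * F_IF * t) + 0.1 * rng.standard_normal(t.size)
print(assign_state(demodulate(pulse), threshold=0.2))
```

    In the hardware described by the authors this chain runs in FPGA fabric so that the latency stays well below the qubit coherence time; the Python version is meant only to make the processing steps explicit.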

  17. Development of yarn breakage detection software system based on machine vision

    NASA Astrophysics Data System (ADS)

    Wang, Wenyuan; Zhou, Ping; Lin, Xiangyu

    2017-10-01

    Spinning mills cannot always detect yarn breakage in a timely manner, which raises costs for textile enterprises. This paper presents a software system based on computer vision for real-time detection of yarn breakage. The system uses a Windows 8.1 tablet PC and a cloud server to detect and manage yarn breakage. The software running on the tablet PC collects yarn images and location information for analysis and processing. The processed information is then sent over Wi-Fi via the HTTP protocol to the cloud server and stored in a Microsoft SQL Server 2008 database for subsequent querying and management of yarn-break information. Finally, the results are sent to a local display and shown in real time to remind the operator to deal with the broken yarn. The experimental results show that the system has a missed-detection rate of no more than 5‰ and no false detections.
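
    As a rough illustration of the detection-and-reporting loop described above (not the authors' implementation), the sketch below flags a yarn break from a simple intensity test on a fixed image region and posts the event to a cloud endpoint as JSON; the endpoint URL, region of interest, threshold, and spindle identifier are all hypothetical.

```python
# Minimal sketch, not the authors' system: detect a missing yarn in a fixed image
# region by simple intensity thresholding and report the event to a cloud endpoint
# over HTTP. The URL, spindle id, ROI, and threshold are hypothetical placeholders.
import json
import urllib.request
import numpy as np

CLOUD_URL = "http://example.com/api/yarn-breaks"   # hypothetical endpoint

def yarn_present(frame: np.ndarray, roi: tuple, threshold: float = 30.0) -> bool:
    """A bright yarn against a dark background raises the mean intensity of its ROI."""
    y0, y1, x0, x1 = roi
    return float(frame[y0:y1, x0:x1].mean()) > threshold

def report_break(spindle_id: int) -> None:
    """Send a break event as JSON so the server can store and display it."""
    body = json.dumps({"spindle": spindle_id, "event": "yarn_break"}).encode()
    req = urllib.request.Request(CLOUD_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

# Toy usage with a synthetic grayscale frame.
frame = np.zeros((480, 640), dtype=np.uint8)        # dark frame: yarn absent
if not yarn_present(frame, roi=(100, 120, 0, 640)):
    print("yarn break detected on spindle 3")        # report_break(3) in a live system
```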

  18. Historical Time-Domain: Data Archives, Processing, and Distribution

    NASA Astrophysics Data System (ADS)

    Grindlay, Jonathan E.; Griffin, R. Elizabeth

    2012-04-01

    The workshop on Historical Time-Domain Astronomy (TDA) was attended by a near-capacity gathering of ~30 people. From information provided in turn by those present, an up-to-date overview was created of available plate archives, progress in their digitization, the extent of actual processing of those data, and plans for data distribution. Several recommendations were made for prioritising the processing and distribution of historical TDA data.

  19. The timing and sources of information for the adoption and implementation of production innovations

    NASA Technical Reports Server (NTRS)

    Ettlie, J. E.

    1976-01-01

    Two dimensions (personal-impersonal and internal-external) are used to characterize information sources as they become important during the interorganizational transfer of production innovations. The results of three studies are reviewed for the purpose of deriving a model of the timing and importance of different information sources and the utilization of new technology. Based on the findings of two retrospective studies, it was concluded that the pattern of information seeking behavior in user organizations during the awareness stage of adoption is not a reliable predictor of the eventual utilization rate. Using the additional findings of a real-time study, an empirical model of the relative importance of information sources for successful user organizations is presented. These results are extended and integrated into a theoretical model consisting of a time-profile of successful implementations and the relative importance of four types of information sources during seven stages of the adoption-implementation process.

  20. Stochastic and information-thermodynamic structures of population dynamics in a fluctuating environment

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuya J.; Sughiyama, Yuki

    2017-07-01

    Adaptation in a fluctuating environment is a process of fueling environmental information to gain fitness. Living systems have gradually developed strategies for adaptation from random and passive diversification of the phenotype to more proactive decision making, in which environmental information is sensed and exploited more actively and effectively. Understanding the fundamental relation between fitness and information is therefore crucial to clarify the limits and universal properties of adaptation. In this work, we elucidate the underlying stochastic and information-thermodynamic structure in this process, by deriving causal fluctuation relations (FRs) of fitness and information. Combined with a duality between phenotypic and environmental dynamics, the FRs reveal the limit of fitness gain, the relation of time reversibility with the achievability of the limit, and the possibility and condition for gaining excess fitness due to environmental fluctuation. The loss of fitness due to causal constraints and the limited capacity of real organisms is shown to be the difference between time-forward and time-backward path probabilities of phenotypic and environmental dynamics. Furthermore, the FRs generalize the concept of the evolutionary stable state (ESS) for fluctuating environment by giving the probability that the optimal strategy on average can be invaded by a suboptimal one owing to rare environmental fluctuation. These results clarify the information-thermodynamic structures in adaptation and evolution.
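
    The abstract does not reproduce the causal fluctuation relations themselves. Purely as a reminder of the generic form such relations take, and not as the authors' specific result, an integral fluctuation theorem constrains a path-wise quantity Σ as follows, with the second-law-like inequality following from Jensen's inequality:

```latex
% Generic integral fluctuation theorem (illustrative template only; the specific
% causal relations for fitness and information are not reproduced here).
\begin{align}
  \bigl\langle e^{-\Sigma} \bigr\rangle = 1
  \qquad\Longrightarrow\qquad
  \langle \Sigma \rangle \ge 0 .
\end{align}
```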

  1. Mental Status Documentation: Information Quality and Data Processes

    PubMed Central

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses’ assessment, documentation, decision-making and communication regarding patients’ mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk identification, prevention, communication and mitigation of harm. PMID:28269919

  2. Mental Status Documentation: Information Quality and Data Processes.

    PubMed

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk identification, prevention, communication and mitigation of harm.

  3. Enhancement of TIMS images for photointerpretation

    NASA Technical Reports Server (NTRS)

    Gillespie, A. R.

    1986-01-01

    The Thermal Infrared Multispectral Scanner (TIMS) images consist of six channels of data acquired in bands between 8 and 12 microns, thus they contain information about both temperature and emittance. Scene temperatures are controlled by reflectivity of the surface, but also by its geometry with respect to the Sun, time of day, and other factors unrelated to composition. Emittance is dependent upon composition alone. Thus the photointerpreter may wish to enhance emittance information selectively. Because thermal emittances in real scenes vary but little, image data tend to be highly correlated along channels. Special image processing is required to make this information available for the photointerpreter. Processing includes noise removal, construction of model emittance images, and construction of false-color pictures enhanced by decorrelation techniques.
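
    The decorrelation enhancement mentioned above can be illustrated with a principal-component (whitening) stretch of a multiband image. The sketch below is a generic decorrelation stretch on synthetic data, not the TIMS processing chain; the band count and synthetic statistics are assumed.

```python
# Minimal sketch of a decorrelation stretch of the kind the abstract refers to,
# applied to a synthetic 6-band image. Not the TIMS processing chain itself.
import numpy as np

def decorrelation_stretch(cube: np.ndarray) -> np.ndarray:
    """Rotate bands to principal components, equalize their variances, rotate back."""
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(float)
    xc = x - x.mean(axis=0)
    cov = np.cov(xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)             # principal axes of the band space
    scale = 1.0 / np.sqrt(np.maximum(evals, 1e-12))
    stretch = evecs @ np.diag(scale) @ evecs.T     # whitening in the original band space
    return (xc @ stretch).reshape(h, w, b)

# Toy usage: highly correlated bands, as thermal emittance data tend to be.
rng = np.random.default_rng(1)
base = rng.normal(size=(64, 64, 1))
cube = base + 0.05 * rng.normal(size=(64, 64, 6))
print(decorrelation_stretch(cube).std(axis=(0, 1)))   # near-equal variance per band
```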

  4. Information gathering, management and transferring for geospatial intelligence - A conceptual approach to create a spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Nunes, Paulo; Correia, Anacleto; Teodoro, M. Filomena

    2017-06-01

    Information has long been a key factor for military organizations. In the military context, the success of joint and combined operations depends on accurate information and knowledge flow concerning the operational theatre: provision of resources, environment evolution, targets' location, and where and when an event will occur. Modern military operations cannot be conceived without maps and geospatial information. Staffs and forces in the field request large volumes of information during the planning and execution process, and horizontal and vertical geospatial information integration is critical for the decision cycle. Information and knowledge management are fundamental to clarify an environment full of uncertainty. Geospatial information (GI) management rises as a branch of information and knowledge management, responsible for the conversion process from raw data collected by human or electronic sensors to knowledge. Geospatial information and intelligence systems allow us to integrate all other forms of intelligence and act as a main platform to process and display geospatial, time-referenced events. Combining explicit knowledge with personal know-how generates a continuous learning cycle that supports real-time decisions, mitigates the influence of the fog of war, and provides knowledge supremacy. This paper presents the analysis done after applying a questionnaire and interviews about GI and intelligence management in a military organization. The study intended to identify the stakeholders' requirements for a military spatial data infrastructure as well as the requirements for future software system development.

  5. Knowledge-driven information mining in remote-sensing image archives

    NASA Astrophysics Data System (ADS)

    Datcu, M.; Seidel, K.; D'Elia, S.; Marchetti, P. G.

    2002-05-01

    Users in all domains require information or information-related services that are focused, concise, reliable, low cost and timely and which are provided in forms and formats compatible with the user's own activities. In the current Earth Observation (EO) scenario, the archiving centres generally only offer data, images and other "low level" products. The user's needs are being only partially satisfied by a number of, usually small, value-adding companies applying time-consuming (mostly manual) and expensive processes relying on the knowledge of experts to extract information from those data or images.

  6. A computational model of visual marking using an inter-connected network of spiking neurons: the spiking search over time & space model (sSoTS).

    PubMed

    Mavritsaki, Eirini; Heinke, Dietmar; Humphreys, Glyn W; Deco, Gustavo

    2006-01-01

    In the real world, visual information is selected over time as well as space, when we prioritise new stimuli for attention. Watson and Humphreys [Watson, D., Humphreys, G.W., 1997. Visual marking: prioritizing selection for new objects by top-down attentional inhibition of old objects. Psychological Review 104, 90-122] presented evidence that new information in search tasks is prioritised by (amongst other processes) active ignoring of old items - a process they termed visual marking. In this paper we present, for the first time, an explicit computational model of visual marking using biologically plausible activation functions. The "spiking search over time and space" model (sSoTS) incorporates different synaptic components (NMDA, AMPA, GABA) and a frequency adaptation mechanism based on [Ca(2+)] sensitive K(+) current. This frequency adaptation current can act as a mechanism that suppresses the previously attended items. We show that, when coupled with a process of active inhibition applied to old items, frequency adaptation leads to old items being de-prioritised (and new items prioritised) across time in search. Furthermore, the time course of these processes mimics the time course of the preview effect in human search. The results indicate that the sSoTS model can provide a biologically plausible account of human search over time as well as space.
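
    The frequency-adaptation mechanism described above can be caricatured with a leaky integrate-and-fire neuron carrying an adaptation current that grows at each spike and decays slowly, standing in for the [Ca(2+)]-sensitive K(+) current. The sketch below is only such a caricature, not the sSoTS model, and all parameter values are invented.

```python
# Minimal sketch, not the sSoTS model: a leaky integrate-and-fire neuron with a
# spike-frequency adaptation current. All parameters are illustrative.
import numpy as np

def simulate(i_ext=1.5, t_max=1.0, dt=1e-4):
    """Return spike times; the adaptation variable grows with each spike and slows firing."""
    v, adapt = 0.0, 0.0
    tau_m, tau_a = 0.02, 0.3          # membrane and adaptation time constants (s)
    v_th, dv_adapt = 1.0, 0.3         # threshold and per-spike adaptation increment
    spikes = []
    for step in range(int(t_max / dt)):
        v += ((-v - adapt + i_ext) / tau_m) * dt
        adapt += (-adapt / tau_a) * dt
        if v >= v_th:                  # spike: reset voltage, increment adaptation
            spikes.append(step * dt)
            v = 0.0
            adapt += dv_adapt
    return spikes

spikes = simulate()
isi = np.diff(spikes)
print(f"first ISI {isi[0]*1e3:.1f} ms, last ISI {isi[-1]*1e3:.1f} ms")  # ISIs lengthen
```

    Running it shows interspike intervals lengthening over time, the qualitative behavior through which previously attended items can be suppressed.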

  7. Hardware-software complex of informing passengers of forecasted route transport arrival at stop

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, V. Yu; Pushkarev, M. I.; Fadeev, A. S.

    2017-02-01

    The paper presents a hardware-software complex for informing passengers of forecasted route transport arrivals. The client-server architecture of the forecasting information system is presented and an electronic information board prototype is described. The scheme of information transfer and processing is illustrated and described, from the receipt of navigation telemetry data from a transport vehicle to the forecast of the public transport arrival time at the stop and its representation on the electronic board. Methods and algorithms for determining the current location of a transport vehicle in the city route network are considered in detail. The proposed model for forecasting vehicle arrival time at the stop is described. The result is applied in Tomsk for forecasting and displaying arrival time information at stops.
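
    At its simplest, the forecasting step amounts to summing expected travel times over the route segments remaining between the vehicle's current position and the stop. The sketch below shows only that minimal idea with invented segment lengths and historical speeds; the authors' model is more elaborate.

```python
# Minimal sketch of the forecasting idea (not the authors' model): estimate arrival
# time at a stop from remaining route distance and historical mean speeds per segment.
from datetime import datetime, timedelta

# Remaining segments to the stop: (length in km, historical mean speed in km/h); invented.
remaining_segments = [(0.6, 18.0), (1.1, 25.0), (0.4, 12.0)]

def forecast_arrival(now: datetime, segments) -> datetime:
    """Sum per-segment travel times computed as length / mean speed."""
    hours = sum(length / speed for length, speed in segments)
    return now + timedelta(hours=hours)

eta = forecast_arrival(datetime(2017, 2, 1, 8, 30), remaining_segments)
print(eta.strftime("%H:%M"))   # forecast shown on the electronic board, e.g. 08:36
```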

  8. LICENSING HYDROPOWER PROJECTS: Better Time and Cost Data Needed to Reach Informed Decisions About Process Reforms

    DTIC Science & Technology

    2001-05-01

    GAO, United States General Accounting Office, Report to Congressional Requesters, May 2001. Licensing Hydropower Projects: Better Time and Cost Data Needed to Reach Informed Decisions About Process Reforms. Performing organization: General Accounting Office, PO Box 37050, Washington, DC 20013. Report number GAO-01-499.

  9. The added value of time-variable microgravimetry to the understanding of how volcanoes work

    USGS Publications Warehouse

    Carbone, Daniele; Poland, Michael; Greco, Filippo; Diament, Michel

    2017-01-01

    During the past few decades, time-variable volcano gravimetry has shown great potential for imaging subsurface processes at active volcanoes (including some processes that might otherwise remain “hidden”), especially when combined with other methods (e.g., ground deformation, seismicity, and gas emissions). By supplying information on changes in the distribution of bulk mass over time, gravimetry can provide information regarding processes such as magma accumulation in void space, gas segregation at shallow depths, and mechanisms driving volcanic uplift and subsidence. Despite its potential, time-variable volcano gravimetry is an underexploited method, not widely adopted by volcano researchers or observatories. The cost of instrumentation and the difficulty in using it under harsh environmental conditions is a significant impediment to the exploitation of gravimetry at many volcanoes. In addition, retrieving useful information from gravity changes in noisy volcanic environments is a major challenge. While these difficulties are not trivial, neither are they insurmountable; indeed, creative efforts in a variety of volcanic settings highlight the value of time-variable gravimetry for understanding hazards as well as revealing fundamental insights into how volcanoes work. Building on previous work, we provide a comprehensive review of time-variable volcano gravimetry, including discussions of instrumentation, modeling and analysis techniques, and case studies that emphasize what can be learned from campaign, continuous, and hybrid gravity observations. We are hopeful that this exploration of time-variable volcano gravimetry will excite more scientists about the potential of the method, spurring further application, development, and innovation.

  10. Access of emotional information to visual awareness in patients with major depressive disorder.

    PubMed

    Sterzer, P; Hilgenfeldt, T; Freudenberg, P; Bermpohl, F; Adli, M

    2011-08-01

    According to cognitive theories of depression, negative biases affect most cognitive processes including perception. Such depressive perception may result not only from biased cognitive appraisal but also from automatic processing biases that influence the access of sensory information to awareness. Twenty patients with major depressive disorder (MDD) and 20 healthy control participants underwent behavioural testing with a variant of binocular rivalry, continuous flash suppression (CFS), to investigate the potency of emotional visual stimuli to gain access to awareness. While a neutral, fearful, happy or sad emotional face was presented to one eye, high-contrast dynamic patterns were presented to the other eye, resulting in initial suppression of the face from awareness. Participants indicated the location of the face with a key press as soon as it became visible. The modulation of suppression time by emotional expression was taken as an index of unconscious emotion processing. We found a significant difference in the emotional modulation of suppression time between MDD patients and controls. This difference was due to relatively shorter suppression of sad faces and, to a lesser degree, to longer suppression of happy faces in MDD. Suppression time modulation by sad expression correlated with change in self-reported severity of depression after 4 weeks. Our finding of preferential access to awareness for mood-congruent stimuli supports the notion that depressive perception may be related to altered sensory information processing even at automatic processing stages. Such perceptual biases towards mood-congruent information may reinforce depressed mood and contribute to negative cognitive biases. © Cambridge University Press 2011

  11. A conceptual framework for intelligent real-time information processing

    NASA Technical Reports Server (NTRS)

    Schudy, Robert

    1987-01-01

    By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on the description of system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory motor control, rule based behavior, and satisficing. This approach was used in the design of a real time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.
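
    One way to make the triangular organization concrete is to list the input-side and response-side states of knowledge and record shortcut paths as links between them. The sketch below is an illustrative encoding only, with invented state and shortcut names; it is not the framework's actual implementation.

```python
# Illustrative sketch only (not the framework's implementation): the abstraction
# hierarchy as ordered knowledge states, with shortcut paths linking an incomplete
# analysis level to a partially defined response level. Names are invented.
INPUT_STATES = ["sensor_data", "features", "situation_assessment"]
RESPONSE_STATES = ["course_of_action_set", "selected_action", "effector_command"]

# Shortcuts bypass full evaluation at the apex of the triangle.
SHORTCUTS = {
    "reflex": ("sensor_data", "effector_command"),
    "rule_based_behavior": ("features", "selected_action"),
    "satisficing": ("situation_assessment", "selected_action"),
}

def processing_depth(shortcut: str) -> int:
    """Number of analysis levels traversed before the shortcut hands off to a response."""
    src, _ = SHORTCUTS[shortcut]
    return INPUT_STATES.index(src) + 1

for name in SHORTCUTS:
    print(name, "hands off after", processing_depth(name), "analysis level(s)")
```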

  12. Involvement and Influence of Healthcare Providers, Family Members, and Other Mutation Carriers in the Cancer Risk Management Decision-Making Process of BRCA1 and BRCA2 Mutation Carriers.

    PubMed

    Puski, Athena; Hovick, Shelly; Senter, Leigha; Toland, Amanda Ewart

    2018-03-29

    Deciding between increased cancer screening or prophylactic surgery and the timing of such procedures can be a difficult and complex process for women with BRCA mutations. There are gaps in our understanding of involvement of others in the decision-making process for women with BRCA mutations. This study evaluated the management decision-making process of women with BRCA mutations, focusing on the involvement of others. Grounded theory was used to analyze and code risk management decision-making information from interviews with 20 BRCA mutation carriers. Unaffected at-risk participants with a BRCA mutation, those under age 40, and those with no children described having a difficult time making risk management decisions. Physicians were an integral part of the decision-making process by providing decisional support and management recommendations. Family members and other mutation carriers filled similar yet distinct roles by providing experiential information as well as decisional and emotional support for carriers. Participants described genetic counselors as short-term providers of risk information and management recommendations. The study findings suggest that unaffected at-risk women, women under 40, and those who do not have children may benefit from additional support and information during the decision-making process. Genetic counselors are well trained to help women through this process and connect them with resources, and may be under-utilized in long-term follow-up for women with a BRCA mutation.

  13. 77 FR 58585 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... action to submit an information collection request to the Office of Management and Budget (OMB) and... time from each applicant or individual to enable the Department of the Treasury to process electronic...

  14. Does incentivising pill-taking 'crowd out' risk-information processing? Evidence from a web-based experiment.

    PubMed

    Mantzari, Eleni; Vogt, Florian; Marteau, Theresa M

    2014-04-01

    The use of financial incentives for changing health-related behaviours raises concerns regarding their potential to undermine the processing of risks associated with incentivised behaviours. Uncertainty remains about the validity of such concerns. This web-based experiment assessed the impact of financial incentives on i) willingness to take a pill with side-effects; ii) the time spent viewing risk-information and iii) risk-information processing, assessed by perceived-risk of taking the pill and knowledge of its side-effects. It further assesses whether effects are moderated by limiting cognitive capacity. Two-hundred and seventy-five UK-based university staff and students were recruited online under the pretext of being screened for a fictitious drug-trial. Participants were randomised to the offer of different compensation levels for taking a fictitious pill (£0; £25; £1000) and the presence or absence of a cognitive load task (presentation of five digits for later recall). Willingness to take the pill increased with the offer of £1000 (84% vs. 67%; OR 3.66, CI 95% 1.27-10.6), but not with the offer of £25 (79% vs. 67%; OR 1.68, CI 95% 0.71-4.01). Risk-information processing was unaffected by the offer of incentives. The time spent viewing the risk-information was affected by the offer of incentives, an effect moderated by cognitive load: Without load, time increased with the value of incentives (£1000: M = 304.4sec vs. £0: M = 37.8sec, p < 0.001; £25: M = 66.6sec vs. £0: M = 37.8sec, p < 0.001). Under load, time decreased with the offer of incentives (£1000: M = 48.9sec vs. £0: M = 132.7sec, p < 0.001; £25: M = 60.9sec vs. £0: M = 132.7sec, p < 0.001), but did not differ between the two incentivised groups (p = 1.00). This study finds no evidence to suggest incentives "crowd out" risk-information processing. On the contrary, incentives appear to signal risk, an effect, however, which disappears under cognitive load. Although these findings require replication, they highlight the need to maximise cognitive capacity when presenting information about incentivised health-related behaviours. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Practical UXO Classification: Enhanced Data Processing Strategies for Technology Transition - Fort Ord: Dynamic and Cued Metalmapper Processing and Classification

    DTIC Science & Technology

    2017-06-06

    Report documentation page (standard OMB No. 0704-0188 reporting-burden statement). Subject terms: Geophysical Mapping, Electromagnetic Induction, Instrument Verification Strip, Time Domain Electromagnetic, Unexploded Ordnance. Acronym list (partial): Munitions Response; QA, Quality Assurance; QC, Quality Control; ROC, Receiver Operating Characteristic; RTK, Real-Time Kinematic; s, second; SNR.

  16. Multiple Cues in Social Perception: The Time Course of Processing Race and Facial Expression

    PubMed Central

    Kubota, Jennifer T.; Ito, Tiffany A.

    2007-01-01

    The purpose of the present study was to examine the time course of race and expression processing to determine how these cues influence early perceptual as well as explicit categorization judgments. Despite their importance in social perception, little research has examined how social category information and emotional expression are processed over time. Moreover, although models of face processing suggest that the two cues should be processed independently, this has rarely been directly examined. Event-related brain potentials were recorded as participants made race and emotion categorization judgments of Black and White men posing either happy, angry, or neutral expressions. Our findings support that processing of race and emotion cues occur independently and in parallel, relatively early in processing. PMID:17940587

  17. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    NASA Astrophysics Data System (ADS)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customer and is operated by electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a great deal for power grid operating management. The technique of data warehousing with online analytical processing (OLAP) has been used to manage and analyze this large volume of data. The specific outputs of the online analytical information system resulting from data warehouse processing with OLAP are chart reporting and query reporting. The chart reporting consists of the load distribution chart over time, the distribution chart by area, the substation region chart, and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.
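
    The kind of chart and query reporting described above amounts to rolling a fact table up along time and area dimensions. A minimal sketch of such a roll-up is shown below using pandas; the column names and sample records are invented, and a real deployment would query the warehouse rather than an in-memory frame.

```python
# Minimal sketch of OLAP-style roll-up for chart/query reporting. Column names and
# sample data are invented stand-ins for warehouse fact records.
import pandas as pd

records = pd.DataFrame({
    "area":       ["North", "North", "South", "South", "North", "South"],
    "substation": ["N1",    "N2",    "S1",    "S1",    "N1",    "S2"],
    "hour":       [8,        8,       8,       19,      19,      19],
    "load_mw":    [12.5,     9.1,     7.3,     11.0,    15.2,    9.8],
})

# Slice and dice: total load per area for each hour of day, the kind of cube view
# behind a load-distribution chart.
cube = records.pivot_table(values="load_mw", index="area", columns="hour", aggfunc="sum")
print(cube)
```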

  18. How multiple social networks affect user awareness: The information diffusion process in multiplex networks

    NASA Astrophysics Data System (ADS)

    Li, Weihua; Tang, Shaoting; Fang, Wenyi; Guo, Quantong; Zhang, Xiao; Zheng, Zhiming

    2015-10-01

    The information diffusion process in single complex networks has been extensively studied, especially for modeling the spreading activities in online social networks. However, individuals usually use multiple social networks at the same time, and can share the information they have learned from one social network to another. This phenomenon gives rise to a new diffusion process on multiplex networks with more than one network layer. In this paper we account for this multiplex network spreading by proposing a model of information diffusion in two-layer multiplex networks. We develop a theoretical framework using bond percolation and cascading failure to describe the intralayer and interlayer diffusion. This allows us to obtain analytical solutions for the fraction of informed individuals as a function of transmissibility T and the interlayer transmission rate θ . Simulation results show that interaction between layers can greatly enhance the information diffusion process. And explosive diffusion can occur even if the transmissibility of the focal layer is under the critical threshold, due to interlayer transmission.
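
    The two-layer spreading process described above is straightforward to simulate: informed nodes transmit across intralayer edges with probability T and share to their own counterpart in the other layer with probability θ. The sketch below is such a Monte Carlo illustration on two random layers, not the paper's analytical bond-percolation framework; the network sizes and rates are arbitrary.

```python
# Minimal Monte Carlo sketch of two-layer information diffusion (not the paper's
# analytical framework). Network sizes, mean degree, T, and theta are illustrative.
import random

def er_layer(n, mean_degree, rng):
    """Erdos-Renyi adjacency lists with the given mean degree."""
    p = mean_degree / (n - 1)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def spread(layers, T, theta, seed_node, rng):
    """Stack-based propagation: informed nodes transmit across intralayer edges with
    probability T and to their counterpart in the other layer with probability theta."""
    n = len(layers[0])
    informed = [[False] * n for _ in layers]
    informed[0][seed_node] = True
    frontier = [(0, seed_node)]
    while frontier:
        layer, node = frontier.pop()
        for nb in layers[layer][node]:
            if not informed[layer][nb] and rng.random() < T:
                informed[layer][nb] = True
                frontier.append((layer, nb))
        other = 1 - layer
        if not informed[other][node] and rng.random() < theta:
            informed[other][node] = True
            frontier.append((other, node))
    return sum(map(sum, informed)) / (2 * n)

rng = random.Random(7)
layers = [er_layer(500, 4.0, rng), er_layer(500, 4.0, rng)]
print(f"informed fraction: {spread(layers, 0.3, 0.5, 0, rng):.2f}")
```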

  19. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted. PMID:28287448

  20. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities are targeted.
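
    As a hedged illustration of what a tensorial kernel on matrix-shaped tactile frames can look like, and not necessarily the kernel the authors implemented in FPGA fabric, the sketch below scores the similarity of two pressure maps through the principal angles between their dominant column subspaces; frame sizes and the subspace rank are assumed.

```python
# Hedged illustration of a tensorial kernel on matrix-shaped tactile frames; not
# necessarily the kernel implemented in the paper. Frame size and rank are assumed.
import numpy as np

def subspace_basis(frame: np.ndarray, rank: int = 3) -> np.ndarray:
    """Orthonormal basis of the dominant column subspace via truncated SVD."""
    u, _, _ = np.linalg.svd(frame, full_matrices=False)
    return u[:, :rank]

def tensorial_kernel(a: np.ndarray, b: np.ndarray, rank: int = 3) -> float:
    """k(A, B) = product of squared cosines of principal angles between the subspaces."""
    ua, ub = subspace_basis(a, rank), subspace_basis(b, rank)
    cosines = np.linalg.svd(ua.T @ ub, compute_uv=False)
    return float(np.prod(cosines ** 2))

# Toy usage: two similar synthetic pressure maps score higher than dissimilar ones.
rng = np.random.default_rng(3)
touch = rng.random((16, 16))
print(tensorial_kernel(touch, touch + 0.01 * rng.random((16, 16))))  # close to 1
print(tensorial_kernel(touch, rng.random((16, 16))))                  # smaller
```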

  1. Mental Rotation of Tactical Instruction Displays Affects Information Processing Demand and Execution Accuracy in Basketball.

    PubMed

    Koopmann, Till; Steggemann-Weinrich, Yvonne; Baumeister, Jochen; Krause, Daniel

    2017-09-01

    In sports games, coaches often use tactic boards to present tactical instructions during time-outs (e.g., 20 s to 60 s in basketball). Instructions should be presented in a way that enables fast and errorless information processing for the players. The aim of this study was to test the effect of different orientations of visual tactical displays on observation time and execution performance. High affordances in visual-spatial transformation (e.g., mental rotation processes) might impede information processing and might decrease execution performance with regard to the instructed playing patterns. In a within-subjects design with 1 factor, 10 novice students were instructed with visual tactical instructions of basketball playing patterns with different orientations either showing the playing pattern with low spatial disparity to the players' on-court perspective (basket on top) or upside down (basket on bottom). The self-chosen time for watching the pattern before execution was significantly shorter and spatial accuracy in pattern execution was significantly higher when the instructional perspective and the real perspective on the basketball court had a congruent orientation. The effects might be explained by interfering mental rotation processes that are necessary to transform the instructional perspective into the players' actual perspective while standing on the court or imagining themselves standing on the court. According to these results, coaches should align their tactic boards to their players' on-court viewing perspective.

  2. Cooperative processing in primary somatosensory cortex and posterior parietal cortex during tactile working memory.

    PubMed

    Ku, Yixuan; Zhao, Di; Bodner, Mark; Zhou, Yong-Di

    2015-08-01

    In the present study, causal roles of both the primary somatosensory cortex (SI) and the posterior parietal cortex (PPC) were investigated in a tactile unimodal working memory (WM) task. Individual magnetic resonance imaging-based single-pulse transcranial magnetic stimulation (spTMS) was applied, respectively, to the left SI (ipsilateral to tactile stimuli), right SI (contralateral to tactile stimuli) and right PPC (contralateral to tactile stimuli), while human participants were performing a tactile-tactile unimodal delayed matching-to-sample task. The time points of spTMS were 300, 600 and 900 ms after the onset of the tactile sample stimulus (duration: 200 ms). Compared with ipsilateral SI, application of spTMS over either contralateral SI or contralateral PPC at those time points significantly impaired the accuracy of task performance. Meanwhile, the deterioration in accuracy did not vary with the stimulating time points. Together, these results indicate that the tactile information is processed cooperatively by SI and PPC in the same hemisphere, starting from the early delay of the tactile unimodal WM task. This pattern of processing of tactile information is different from the pattern in tactile-visual cross-modal WM. In a tactile-visual cross-modal WM task, SI and PPC contribute to the processing sequentially, suggesting a process of sensory information transfer during the early delay between modalities. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  3. Erroneous knowledge of results affects decision and memory processes on timing tasks.

    PubMed

    Ryan, Lawrence J; Fritz, Matthew S

    2007-12-01

    On mental timing tasks, erroneous knowledge of results (KR) leads to incorrect performance accompanied by the subjective judgment of accurate performance. Using the start-stop technique (an analogue of the peak interval procedure) with both reproduction and production timing tasks, the authors analyze what processes erroneous KR alters. KR provides guidance (performance error information) that lowers decision thresholds. Erroneous KR also provides targeting information that alters response durations proportionately to the magnitude of the feedback error. On the production task, this shift results from changes in the reference memory, whereas on the reproduction task this shift results from changes in the decision threshold for responding. The idea that erroneous KR can alter different cognitive processes on related tasks is supported by the authors' demonstration that the learned strategies can transfer from the reproduction task to the production task but not vice versa. Thus, the effects of KR are both task and context dependent.

  4. Flexibility within working memory and the focus of attention for sequential verbal information does not depend on active maintenance.

    PubMed

    Sandry, Joshua; Schwark, Jeremy D; MacDonald, Justin

    2014-10-01

    The focus of attention seems to be a static element within working memory when verbal information is serially presented, unless additional time is available for processing or active maintenance. Experiment 1 manipulated the reward associated with early and medial list positions in a probe recognition paradigm and found evidence that these nonterminal list positions could be retrieved faster and more accurately if participants were appropriately motivated-without additional time for processing or active maintenance. Experiment 2 used articulatory suppression and demonstrated that the underlying maintenance mechanism cannot be attributed to rehearsal, leaving attentional refreshing as the more likely mechanism. These findings suggest that the focus of attention within working memory can flexibly maintain nonterminal early and medial list representations at the expense of other list representations even when there is not additional time for processing or active maintenance. Maintenance seems to be accomplished through an attentional refreshing mechanism.

  5. GPU-based real-time trinocular stereo vision

    NASA Astrophysics Data System (ADS)

    Yao, Yuanbin; Linton, R. J.; Padir, Taskin

    2013-01-01

    Most stereovision applications are binocular, using information from a 2-camera array to perform stereo matching and compute the depth image. Trinocular stereovision with a 3-camera array has been proved to provide higher accuracy in stereo matching, which can benefit applications like distance finding, object recognition, and detection. This paper presents a real-time stereovision algorithm implemented on a GPGPU (general-purpose graphics processing unit) using a trinocular stereovision camera array. The algorithm employs a winner-take-all method to fuse disparities computed in different directions, following various image processing techniques, to obtain the depth information. The goal of the algorithm is to achieve real-time processing speed with the help of a GPGPU, using the Open Source Computer Vision Library (OpenCV) in C++ and the NVIDIA CUDA GPGPU solution. The results are compared in accuracy and speed to verify the improvement.
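
    The winner-take-all fusion step named above can be written down in a few lines: sum the matching-cost volumes contributed by the different camera-pair directions and take, at every pixel, the disparity with the lowest fused cost. The sketch below does exactly that on synthetic cost volumes; it is a CPU illustration of the rule, not the GPU implementation.

```python
# Minimal sketch of winner-take-all disparity fusion (CPU illustration, not the
# GPU/CUDA implementation). Cost volumes are synthetic stand-ins for matching costs.
import numpy as np

def winner_take_all(cost_volumes):
    """cost_volumes: list of (H, W, D) arrays, one per camera pair. Sum costs across
    pairs, then take the argmin over the disparity axis at every pixel."""
    fused = np.sum(cost_volumes, axis=0)
    return np.argmin(fused, axis=2)

# Toy usage: two 4x4 cost volumes with 8 disparity hypotheses.
rng = np.random.default_rng(5)
pair_costs = [rng.random((4, 4, 8)), rng.random((4, 4, 8))]
print(winner_take_all(pair_costs))
```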

  6. Improving data management practices in the Portuguese HIV/AIDS surveillance system during a time of public sector austerity.

    PubMed

    Shivaji, Tara; Cortes Martins, Helena

    2015-01-01

    In a climate of public sector austerity, the demand for accurate information about disease epidemiology rises as health program managers try to align spending to health needs. A policy of case re-notification to improve HIV information quality resulted in a nine-fold increase in the number of case reports received in 2013 by the Portuguese HIV surveillance office. We used value stream mapping to introduce improvements to data processing practices, identify and reduce waste. Two cycles of improvement were trialled. Before intervention, processing time was nine minutes and 28 seconds (95%CI 8:53-10:58) per report. Two months post intervention, it was six minutes and 34 seconds (95% CI 6:25-6:43). One year after the start of the project, processing time was five minutes and 20 seconds (95% CI 1:46-8:52).

  7. Improving data management practices in the Portuguese HIV/AIDS surveillance system during a time of public sector austerity

    PubMed Central

    Shivaji, Tara; Cortes Martins, Helena

    2015-01-01

    In a climate of public sector austerity, the demand for accurate information about disease epidemiology rises as health program managers try to align spending to health needs. A policy of case re-notification to improve HIV information quality resulted in a nine-fold increase in the number of case reports received in 2013 by the Portuguese HIV surveillance office. We used value stream mapping to introduce improvements to data processing practices, identify and reduce waste. Two cycles of improvement were trialled. Before intervention, processing time was nine minutes and 28 seconds (95%CI 8:53–10:58) per report. Two months post intervention, it was six minutes and 34 seconds (95% CI 6:25–6:43). One year after the start of the project, processing time was five minutes and 20 seconds (95% CI 1:46–8:52). PMID:26734448

  8. Influence of Aggression on Information Processing in the Emotional Stroop Task – an Event-Related Potential Study

    PubMed Central

    Bertsch, Katja; Böhnke, Robina; Kruk, Menno R.; Naumann, Ewald

    2009-01-01

    Aggression is a common behavior which has frequently been explained as involving changes in higher level information processing patterns. Although researchers have started only recently to investigate information processing in healthy individuals while engaged in aggressive behavior, the impact of aggression on information processing beyond an aggressive encounter remains unclear. In an event-related potential study, we investigated the processing of facial expressions (happy, angry, fearful, and neutral) in an emotional Stroop task after experimentally provoking aggressive behavior in healthy participants. Compared to a non-provoked group, these individuals showed increased early (P2) and late (P3) positive amplitudes for all facial expressions. For the P2 amplitude, the effect of provocation was greatest for threat-related expressions. Beyond this, a bias for emotional expressions, i.e., slower reaction times to all emotional expressions, was found in provoked participants with a high level of trait anger. These results indicate significant effects of aggression on information processing, which last beyond the aggressive encounter even in healthy participants. PMID:19826616

  9. Continuous information flow fluctuations

    NASA Astrophysics Data System (ADS)

    Rosinberg, Martin Luc; Horowitz, Jordan M.

    2016-10-01

    Information plays a pivotal role in the thermodynamics of nonequilibrium processes with feedback. However, much remains to be learned about the nature of information fluctuations in small-scale devices and their relation with fluctuations in other thermodynamics quantities, like heat and work. Here we derive a series of fluctuation theorems for information flow and partial entropy production in a Brownian particle model of feedback cooling and extend them to arbitrary driven diffusion processes. We then analyze the long-time behavior of the feedback-cooling model in detail. Our results provide insights into the structure and origin of large deviations of information and thermodynamic quantities in autonomous Maxwell's demons.

  10. Comparing the information seeking strategies of residents, nurse practitioners, and physician assistants in critical care settings

    PubMed Central

    Kannampallil, Thomas G; Jones, Laura K; Patel, Vimla L; Buchman, Timothy G; Franklin, Amy

    2014-01-01

    Objective Critical care environments are information-intensive environments where effective decisions are predicated on successfully finding and using the ‘right information at the right time’. We characterize the differences in processes and strategies of information seeking between residents, nurse practitioners (NPs), and physician assistants (PAs). Method We conducted an exploratory study in the cardiothoracic intensive care units of two large academic hospitals within the same healthcare system. Clinicians (residents (n=5), NPs (n=5), and PAs (n=5)) were shadowed as they gathered information on patients in preparation for clinical rounds. Information seeking activities on 96 patients were collected over a period of 3 months (NRes=37, NNP=24, NPA=35 patients). The sources of information and time spent gathering the information at each source were recorded. Exploratory data analysis using probabilistic sequential approaches was used to analyze the data. Results Residents predominantly used a patient-based information seeking strategy in which all relevant information was aggregated for one patient at a time. In contrast, NPs and PAs primarily utilized a source-based information seeking strategy in which similar (or equivalent) information was aggregated for multiple patients at a time (eg, X-rays for all patients). Conclusions The differences in the information seeking strategies are potentially a result of the differences in clinical training, strategies of managing cognitive load, and the nature of the use of available health IT tools. Further research is needed to investigate the effects of these differences on clinical and process outcomes. PMID:24619926
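
    The probabilistic sequential analysis mentioned in the Results can be illustrated by estimating a first-order Markov transition matrix over information sources from an observed visit sequence. The sketch below does this for an invented sequence and invented source names; it is not the authors' exact analysis.

```python
# Minimal sketch of a probabilistic sequential analysis (not the authors' method):
# estimate P(next source | current source) from adjacent pairs in a visit sequence.
from collections import Counter, defaultdict

visits = ["notes", "labs", "labs", "imaging", "notes", "labs", "imaging", "imaging"]

def transition_matrix(sequence):
    """First-order Markov transition probabilities estimated from adjacent pairs."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    return {src: {dst: c / sum(nxts.values()) for dst, c in nxts.items()}
            for src, nxts in counts.items()}

for src, dsts in transition_matrix(visits).items():
    print(src, dsts)
```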

  11. Rapid visual grouping and figure-ground processing using temporally structured displays.

    PubMed

    Cheadle, Samuel; Usher, Marius; Müller, Hermann J

    2010-08-23

    We examine the time course of visual grouping and figure-ground processing. Figure (contour) and ground (random-texture) elements were flickered with different phases (i.e., contour and background are alternated), requiring the observer to group information within a pre-specified time window. It was found that this grouping has a high temporal resolution: less than 20 ms for smooth contours, and less than 50 ms for line conjunctions with sharp angles. Furthermore, the grouping process takes place without explicit knowledge of the phase of the elements, and it requires a cumulative build-up of information. The results are discussed in relation to the neural mechanisms for visual grouping and figure-ground segregation. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. 36 CFR 810.5 - Fees.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this section unless it is determined that the requested information will be of primary benefit to the... processed. The time limits for processing the request under § 810.3 shall not begin to run until the...

  13. Mobile mammography: An evaluation of organizational, process, and information systems challenges.

    PubMed

    Browder, Casey; Eberth, Jan M; Schooley, Benjamin; Porter, Nancy R

    2015-03-01

    The purpose of this case study was to evaluate the information systems, personnel, and processes involved in mobile mammography settings, and to offer recommendations to improve efficiency and satisfaction among patients and staff. Data include on-site observations, interviews, and an electronic medical record review at a hospital that offers both mobile and fixed-facility mammography services to its community. The optimal expectations for the process of mobile mammography from multiple perspectives were defined as (1) the patient receives a mammogram the day of their visit, (2) the patient has an efficient intake process with little wait time, (3) follow-up is completed and timely, (4) the site contact and van staff are satisfied with the van visit and choose to schedule future visits, and (5) the MMU is able to assess its performance and set goals for improvement. Challenges that prevent the realization of those expectations include a low patient pre-registration rate, difficulty obtaining required physician orders, frequent information system downtime/Internet connectivity issues, ill-defined organizational communication/roles, insufficient site host/patient education, and disparate organizational and information systems. Our recommendations include employing a dedicated mobile mammography team for end-to-end oversight, mitigating system connectivity issues, allowing patient self-referrals, integrating scheduling and registration processes, and a focused approach to educating site hosts and their patients about expectations for the day of the visit. The MMU is an important community resource; we recommend simple process and information flow improvements to further enable the MMU's goals. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. An RFID-Based Manufacturing Control Framework for Loosely Coupled Distributed Manufacturing System Supporting Mass Customization

    NASA Astrophysics Data System (ADS)

    Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur

    In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
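
    The tracking idea sketched above reduces to appending a timestamped process-step event to a part's pedigree every time its RFID tag is read at a station. The code below is a minimal stand-in for that middleware behavior, with invented station names and tag identifiers; it is not the authors' system.

```python
# Minimal sketch, not the authors' middleware: build a production pedigree by
# appending a timestamped process-step record each time a tagged part is read at a
# station. Station names and tag ids are invented.
from datetime import datetime, timezone

pedigree = {}   # tag id -> ordered list of process-step events

def on_tag_read(tag_id: str, station: str) -> None:
    """Middleware callback: record which station saw the part and when."""
    event = {"station": station, "time": datetime.now(timezone.utc).isoformat()}
    pedigree.setdefault(tag_id, []).append(event)

# Toy usage: one customized bicycle frame moving through three stations.
for station in ["frame_welding", "custom_paint", "final_assembly"]:
    on_tag_read("TAG-00017", station)

print([e["station"] for e in pedigree["TAG-00017"]])
```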

  15. Health Information Needs and Health Seeking Behavior During the 2014-2016 Ebola Outbreak: A Twitter Content Analysis

    PubMed Central

    Odlum, Michelle; Yoon, Sunmoo

    2018-01-01

    Introduction: For effective public communication during major disease outbreaks like the 2014-2016 Ebola epidemic, health information needs of the population must be adequately assessed. Through content analysis of social media data, like tweets, public health information needs can be effectively assessed and in turn provide appropriate health information to address such needs. The aim of the current study was to assess health information needs about Ebola, at distinct epidemic time points, through longitudinal tracking. Methods: Natural language processing was applied to explore public response to Ebola over time from July 2014 to March 2015. A total 155,647 tweets (unique 68,736, retweet 86,911) mentioning Ebola were analyzed and visualized with infographics. Results: Public fear, frustration, and health information seeking regarding Ebola-related global priorities were observed across time. Our longitudinal content analysis revealed that due to ongoing health information deficiencies, resulting in fear and frustration, social media was at times an impediment and not a vehicle to support health information needs.  Discussion: Content analysis of tweets effectively assessed Ebola information needs. Our study also demonstrates the use of Twitter as a method for capturing real-time data to assess ongoing information needs, fear, and frustration over time.  PMID:29707416
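
    A minimal version of the longitudinal content tracking described above is to bucket tweets by month and count mentions of a few theme keywords. The sketch below does this with invented tweets and hand-picked keyword lists; the authors' natural language processing pipeline is considerably richer.

```python
# Minimal sketch of longitudinal theme counting (not the authors' NLP pipeline).
# Tweets, theme names, and keyword lists are invented for illustration.
from collections import Counter, defaultdict

tweets = [
    ("2014-08-03", "How does Ebola actually spread? Need facts not fear"),
    ("2014-10-12", "So frustrated, still no clear guidance on Ebola screening"),
    ("2015-02-20", "Vaccine trial news gives some hope on Ebola"),
]

THEMES = {"information_seeking": ["how", "facts", "guidance"],
          "fear_frustration": ["fear", "frustrated", "scared"]}

monthly = defaultdict(Counter)
for date, text in tweets:
    month = date[:7]
    lowered = text.lower()
    for theme, words in THEMES.items():
        if any(w in lowered for w in words):
            monthly[month][theme] += 1

for month in sorted(monthly):
    print(month, dict(monthly[month]))
```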

  16. The Effect of Aging on the Stages of Processing in a Choice Reaction Time Task

    ERIC Educational Resources Information Center

    Simon, J. Richard; Pouraghabagher, A. Reza

    1978-01-01

    Two experiments were conducted to determine the effect of aging on encoding and response selection stages of a choice reaction time task. Results suggested reducing stimulus discriminability may affect information processing prior to the encoding stage, but the encoding stage is the primary locus of the slowing which accompanied aging. (Author)

  17. Image Understanding Architecture

    DTIC Science & Technology

    1991-09-01

    architecture to support real-time, knowledge-based image understanding, and develop the software support environment that will be needed to utilize...Keywords: Image Understanding Architecture, Knowledge-Based Vision, AI, Real-Time Computer Vision, Software Simulator, Parallel Processor...information. In addition to sensory and knowledge-based processing it is useful to introduce a level of symbolic processing. Thus, vision researchers

  18. Information Processing in Nursing Information Systems: An Evaluation Study from a Developing Country.

    PubMed

    Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi

    2017-01-01

    In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end-user group in Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in many university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study made only limited use of paper-based (2.02) and computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses had, the more they utilized computer tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated by the participating nurses as the most important expectation and problem regarding the HIS, respectively. The nurses in the present study used paper and computerized information processing tools together to perform nursing practices. Therefore, it is recommended that the nursing process be redesigned in parallel with NIS implementation in health care centers.

  19. During running in place, grid cells integrate elapsed time and distance run

    PubMed Central

    Kraus, Benjamin J.; Brandon, Mark P.; Robinson, Robert J.; Connerney, Michael A.; Hasselmo, Michael E.; Eichenbaum, Howard

    2015-01-01

    The spatial scale of grid cells may be provided by self-generated motion information or by external sensory information from environmental cues. To determine whether grid cell activity reflects distance traveled or elapsed time independent of external information, we recorded grid cells as animals ran in place on a treadmill. Grid cell activity was only weakly influenced by location but most grid cells and other neurons recorded from the same electrodes strongly signaled a combination of distance and time, with some signaling only distance or time. Grid cells were more sharply tuned to time and distance than non-grid cells. Many grid cells exhibited multiple firing fields during treadmill running, parallel to the periodic firing fields observed in open fields, suggesting a common mode of information processing. These observations indicate that, in the absence of external dynamic cues, grid cells integrate self-generated distance and time information to encode a representation of experience. PMID:26539893

  20. The Effects of Conflict, Quality and Time on Small Group Information Use and Behavior in Evaluative Decision-Making Situations.

    ERIC Educational Resources Information Center

    Pflum, Glenn D.; Brown, Robert D.

    This study investigated information needs and use by groups in decision-making processes. Problem contexts were varied by conflict, quality, and time conditions and presented to 89 graduate level education students who simulated school board members making decisions about educational programs. The research hypotheses were: (1) there are no…

  1. 76 FR 78083 - Agency Information Collection (Appeal to Board of Veterans' Appeals) Activities Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ... Human Resources and Housing Branch, New Executive Office Building, Room 10235, Washington, DC 20503 (202... appellant and the BVA must be informed so that the appellant's rights may be adequately protected and so... required by basic Constitutional due-process and by Title 38 U.S.C. 7107(b). From time to time, hearing...

  2. A Rapid Information Dissemination System--A Follow-Up Report.

    ERIC Educational Resources Information Center

    Miner, Lynn E.; Niederjohn, Russel J.

    1980-01-01

    A rapid information dissemination system at Marquette University which uses an audio-based technique for quickly transmitting time-dependent information to research faculty is described. The system uses a tape recorder, a special purpose speech processing system, and a telephone auto-answer recorder. Present uses and proposed future modifications…

  3. Design and Development of a Prototype Organizational Effectiveness Information System

    DTIC Science & Technology

    1984-11-01

    information from a large number of people. The existing survey support process for the GOQ is not satisfactory. Most OESOs elect not to use it, because... reporting process uses screen queries and menus to simplify data entry, it is estimated that only 4-6 hours of data entry time would be required for... description for the file named EVEDIR. The Resource System allows users of the Event Directory to select from the following processing options: Add a new...

  4. Real-Time Joint Streaming Data Processing from Social and Physical Sensors

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M.

    2014-12-01

    The technological breakthroughs in computing that have taken place over the last few decades make it possible to achieve emergency management objectives that focus on saving human lives and decreasing economic effects. In particular, the integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better real-time seismic hazard analysis through distributed computing networks. The main goal of this work is to utilize innovative computational algorithms for better real-time seismic risk analysis by integrating different data sources and processing tools into streaming and cloud computing applications. The Geological Survey of Canada operates the Canadian National Seismograph Network (CNSN) with over 100 high-gain instruments and 60 low-gain or strong motion seismographs. The processing of the continuous data streams from each station of the CNSN provides the opportunity to detect possible earthquakes in near real-time. The information from physical sources is combined to calculate a location and magnitude for an earthquake. The automatically calculated results are not always sufficiently precise and prompt to significantly reduce the response time to a felt or damaging earthquake. Social sensors, here represented as Twitter users, can provide information earlier to the general public and more rapidly to the emergency planning and disaster relief agencies. We introduce joint streaming data processing from social and physical sensors in real-time based on the idea that social media observations serve as proxies for physical sensors. By using the streams of data in the form of Twitter messages, each of which has an associated time and location, we can extract information related to a target event and perform enhanced analysis by combining it with physical sensor data. Results of this work suggest that the use of data from social media, in conjunction with innovative computing algorithms and physical sensor data, can provide a new paradigm for real-time earthquake detection in order to facilitate rapid and inexpensive natural risk reduction.

  5. Granger causal time-dependent source connectivity in the somatosensory network

    NASA Astrophysics Data System (ADS)

    Gao, Lin; Sommerlade, Linda; Coffman, Brian; Zhang, Tongsheng; Stephen, Julia M.; Li, Dichen; Wang, Jue; Grebogi, Celso; Schelter, Bjoern

    2015-05-01

    Exploration of transient Granger causal interactions in neural sources of electrophysiological activities provides deeper insights into brain information processing mechanisms. However, the underlying neural patterns are confounded by time-dependent dynamics, non-stationarity and observational noise contamination. Here we investigate transient Granger causal interactions using source time-series of somatosensory evoked magnetoencephalographic (MEG) responses elicited by air puff stimulation of the right index finger and recorded using 306-channel MEG from 21 healthy subjects. A new time-varying connectivity approach, combining renormalised partial directed coherence with state space modelling, is employed to estimate fast-changing information flow among the sources. Source analysis confirmed that the somatosensory evoked MEG was mainly generated from the contralateral primary somatosensory cortex (SI) and bilateral secondary somatosensory cortices (SII). Transient Granger causality shows a serial processing of somatosensory information, 1) from contralateral SI to contralateral SII, 2) from contralateral SI to ipsilateral SII, 3) from contralateral SII to contralateral SI, and 4) from contralateral SII to ipsilateral SII. These results are consistent with established anatomical connectivity between somatosensory regions and previous source modeling results, thereby providing empirical validation of the time-varying connectivity analysis. We argue that the suggested approach provides novel information regarding transient cortical dynamic connectivity, which previous approaches could not assess.
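
    The study estimates time-varying renormalised partial directed coherence; as a much simpler illustration of the underlying notion of directed (Granger-causal) influence between source time-series, a stationary pairwise test can be run with statsmodels. The synthetic SI/SII series and the lag below are assumptions for illustration only, not the paper's method.

        # Illustrative pairwise Granger-causality test on two synthetic source
        # time-series (stand-ins for SI and SII); the paper uses a time-varying
        # renormalised partial directed coherence estimator instead.
        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        n = 2000
        si = rng.standard_normal(n)              # stand-in for contralateral SI
        sii = np.zeros(n)                        # stand-in for contralateral SII
        for t in range(2, n):                    # SII driven by SI at a 2-sample lag
            sii[t] = 0.6 * si[t - 2] + 0.2 * sii[t - 1] + rng.standard_normal()

        # statsmodels tests whether the SECOND column Granger-causes the FIRST.
        res = grangercausalitytests(np.column_stack([sii, si]), maxlag=4)
        print(res[2][0]["ssr_ftest"])            # (F statistic, p-value, df_denom, df_num) at lag 2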

  6. Conditional Use of Social and Private Information Guides House-Hunting Ants

    PubMed Central

    Cronin, Adam L.

    2013-01-01

    Social animals can use both social and private information to guide decision making. While social information can be relatively economical to acquire, it can lead to maladaptive information cascades if attention to environmental cues is supplanted by unconditional copying. Ants frequently employ pheromone trails, a form of social information, to guide collective processes, and this can include consensus decisions made when choosing a place to live. In this study, I examine how house-hunting ants balance social and private information when these information sources conflict to different degrees. Social information, in the form of pre-established pheromone trails, strongly influenced the decision process in choices between equivalent nests, and led to a reduced relocation time. When trails led to non-preferred types of nest, however, social information had less influence when this preference was weak and no influence when the preference was strong. These results suggest that social information is vetted against private information during the house-hunting process in this species. Private information is favoured in cases of conflict and this may help insure colonies against costly wrong decisions. PMID:23741364

  7. Conditional use of social and private information guides house-hunting ants.

    PubMed

    Cronin, Adam L

    2013-01-01

    Social animals can use both social and private information to guide decision making. While social information can be relatively economical to acquire, it can lead to maladaptive information cascades if attention to environmental cues is supplanted by unconditional copying. Ants frequently employ pheromone trails, a form of social information, to guide collective processes, and this can include consensus decisions made when choosing a place to live. In this study, I examine how house-hunting ants balance social and private information when these information sources conflict to different degrees. Social information, in the form of pre-established pheromone trails, strongly influenced the decision process in choices between equivalent nests, and led to a reduced relocation time. When trails led to non-preferred types of nest, however, social information had less influence when this preference was weak and no influence when the preference was strong. These results suggest that social information is vetted against private information during the house-hunting process in this species. Private information is favoured in cases of conflict and this may help insure colonies against costly wrong decisions.

  8. Quantum information processing with trapped ions

    NASA Astrophysics Data System (ADS)

    Gaebler, John

    2013-03-01

    Trapped ions are one promising architecture for scalable quantum information processing. Ion qubits are held in multizone traps created from segmented arrays of electrodes and transported between trap zones using time varying electric potentials applied to the electrodes. Quantum information is stored in the ions' internal hyperfine states and quantum gates to manipulate the internal states and create entanglement are performed with laser beams and microwaves. Recently we have made progress in speeding up the ion transport and cooling processes that were the limiting tasks for the operation speed in previous experiments. We are also exploring improved two-qubit gates and new methods for creating ion entanglement. This work was supported by IARPA, ARO contract No. EAO139840, ONR and the NIST Quantum Information Program

  9. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
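
    The tool described above is a Perl script; a rough Python analogue of its collection step might look like the sketch below. The CSV fields chosen here are assumptions rather than the script's actual output format.

        # Rough Python analogue of the directory-tree collection step described
        # above: walk a tree to a given depth and write per-file metadata to a
        # text file suitable for import into a spreadsheet. Field choices are
        # illustrative assumptions.
        import csv, os, sys, time

        def collect(root: str, out_path: str, max_depth: int) -> None:
            """Walk `root` up to `max_depth` subtree levels, writing file metadata as CSV."""
            root = os.path.abspath(root)
            with open(out_path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(["path", "size_bytes", "uid", "gid", "mtime", "ctime", "atime"])
                for dirpath, dirnames, filenames in os.walk(root):
                    depth = dirpath[len(root):].count(os.sep)
                    if depth >= max_depth:
                        dirnames[:] = []      # stop descending below the requested level
                    for name in filenames:
                        p = os.path.join(dirpath, name)
                        try:
                            st = os.lstat(p)
                        except OSError:
                            continue          # unreadable entries are skipped
                        writer.writerow([p, st.st_size, st.st_uid, st.st_gid,
                                         time.ctime(st.st_mtime), time.ctime(st.st_ctime),
                                         time.ctime(st.st_atime)])

        if __name__ == "__main__":
            collect(sys.argv[1], sys.argv[2], int(sys.argv[3]))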

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choong, W. -S.; Abu-Nimeh, F.; Moses, W. W.

    Here, we present a 16-channel front-end readout board for the OpenPET electronics system. A major task in developing a nuclear medical imaging system, such as a positron emission computed tomograph (PET) or a single-photon emission computed tomograph (SPECT), is the electronics system. While there are a wide variety of detector and camera design concepts, the relatively simple nature of the acquired data allows for a common set of electronics requirements that can be met by a flexible, scalable, and high-performance OpenPET electronics system. The analog signals from the different types of detectors used in medical imaging share similar characteristics, which allows for a common analog signal processing. The OpenPET electronics processes the analog signals with Detector Boards. Here we report on the development of a 16-channel Detector Board. Each signal is digitized by a continuously sampled analog-to-digital converter (ADC), which is processed by a field programmable gate array (FPGA) to extract pulse height information. A leading edge discriminator creates a timing edge that is "time stamped" by a time-to-digital converter (TDC) implemented inside the FPGA. In conclusion, this digital information from each channel is sent to an FPGA that services 16 analog channels, and then information from multiple channels is processed by this FPGA to perform logic for crystal lookup, DOI calculation, calibration, etc.
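
    The pulse-height and timing extraction described above is implemented in the board's FPGA and TDC; the sketch below only emulates that per-channel logic in software on a toy digitized pulse. The threshold, sample period and pulse shape are illustrative assumptions.

        # Software emulation of per-channel processing: a leading-edge
        # discriminator timestamp (with linear interpolation between samples)
        # plus pulse-height extraction from ADC samples.
        import numpy as np

        def process_pulse(samples: np.ndarray, threshold: float, dt_ns: float = 12.5):
            """Return (timestamp_ns, pulse_height), or None if the threshold is never crossed."""
            above = np.nonzero(samples >= threshold)[0]
            if above.size == 0 or above[0] == 0:
                return None
            i = above[0]
            # Interpolate between the two samples straddling the threshold.
            frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + frac) * dt_ns, float(samples.max())

        t = np.arange(64)
        pulse = 100.0 * np.exp(-(t - 20) ** 2 / 30.0)   # toy detector pulse
        print(process_pulse(pulse, threshold=20.0))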

  11. Mental Rotation of Tactical Instruction Displays Affects Information Processing Demand and Execution Accuracy in Basketball

    ERIC Educational Resources Information Center

    Koopmann, Till; Steggemann-Weinrich, Yvonne; Baumeister, Jochen; Krause, Daniel

    2017-01-01

    Purpose: In sports games, coaches often use tactic boards to present tactical instructions during time-outs (e.g., 20 s to 60 s in basketball). Instructions should be presented in a way that enables fast and errorless information processing for the players. The aim of this study was to test the effect of different orientations of visual tactical…

  12. The Strategic Thinking Process: Efficient Mobilization of Human Resources for System Definition

    PubMed Central

    Covvey, H. D.

    1987-01-01

    This paper describes the application of several group management techniques to the creation of needs specifications and information systems strategic plans in health care institutions. The overall process is called the “Strategic Thinking Process”. It is a formal methodology that can reduce the time and cost of creating key documents essential for the successful implementation of health care information systems.

  13. Detection and characterization of lightning-based sources using continuous wavelet transform: application to audio-magnetotellurics

    NASA Astrophysics Data System (ADS)

    Larnier, H.; Sailhac, P.; Chambodut, A.

    2018-01-01

    Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely, atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, an overseas French territory. Most of the analysed atmospherics and slow tails display linear polarization, whereas the analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio-magnetotelluric time-series, providing the means to assess the quality of response functions obtained through processing.
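
    As a rough, self-contained illustration of the time-frequency detection idea (not the paper's actual wavelet family, frequency bands or thresholds), the sketch below convolves a toy signal with complex Morlet wavelets and flags high-magnitude time-frequency cells.

        # Toy continuous-wavelet-transform detector: build complex Morlet
        # wavelets, compute the time-frequency magnitude by convolution, and
        # flag cells exceeding a crude threshold. Sampling rate, band and
        # threshold are illustrative assumptions.
        import numpy as np

        def morlet(fs, f0, n_cycles=6):
            """Complex Morlet wavelet centred at f0 Hz, sampled at fs Hz."""
            sigma_t = n_cycles / (2 * np.pi * f0)
            t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
            return np.exp(2j * np.pi * f0 * t) * np.exp(-t**2 / (2 * sigma_t**2))

        def cwt_magnitude(x, fs, freqs):
            return np.array([np.abs(np.convolve(x, morlet(fs, f), mode="same"))
                             for f in freqs])

        fs = 40_000.0                                   # 40 kHz sampling (assumed)
        t = np.arange(0, 0.1, 1 / fs)
        burst = np.exp(-((t - 0.05) / 0.001) ** 2) * np.sin(2 * np.pi * 5000 * t)
        x = burst + 0.1 * np.random.default_rng(1).standard_normal(t.size)

        mag = cwt_magnitude(x, fs, freqs=np.logspace(3, 4, 24))   # 1-10 kHz (assumed band)
        detections = mag > 5 * np.median(mag)
        print("time-frequency cells flagged:", int(detections.sum()))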

  14. Demodulation processes in auditory perception

    NASA Astrophysics Data System (ADS)

    Feth, Lawrence L.

    1994-08-01

    The long range goal of this project is the understanding of human auditory processing of information conveyed by complex, time-varying signals such as speech, music or important environmental sounds. Our work is guided by the assumption that human auditory communication is a 'modulation - demodulation' process. That is, we assume that sound sources produce a complex stream of sound pressure waves with information encoded as variations (modulations) of the signal amplitude and frequency. The listener's task then is one of demodulation. Much of past psychoacoustics work has been based on what we characterize as 'spectrum picture processing.' Complex sounds are Fourier analyzed to produce an amplitude-by-frequency 'picture' and the perception process is modeled as if the listener were analyzing the spectral picture. This approach leads to studies such as 'profile analysis' and the power-spectrum model of masking. Our approach leads us to investigate time-varying, complex sounds. We refer to them as dynamic signals and we have developed auditory signal processing models to help guide our experimental work.

  15. Control and monitoring method and system for electromagnetic forming process

    DOEpatents

    Kunerth, Dennis C.; Lassahn, Gordon D.

    1990-01-01

    A process, system, and improvement for a process for electromagnetic forming of a workpiece in which characteristics of the workpiece such as its geometry, electrical conductivity, quality, and magnetic permeability can be determined by monitoring the current and voltage in the workcoil. In an electromagnetic forming process in which a power supply provides current to a workcoil and the electromagnetic field produced by the workcoil acts to form the workpiece, the dynamic interaction of the electromagnetic fields produced by the workcoil with the geometry, electrical conductivity, and magnetic permeability of the workpiece provides information pertinent to the physical condition of the workpiece that is available for determination of quality and for process control. This information can be obtained by deriving in real time the first several time derivatives of the current and voltage in the workcoil. In addition, the process can be extended by injecting test signals into the workcoil during the electromagnetic forming and monitoring the response to the test signals in the workcoil.
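
    A minimal numerical sketch of the monitoring idea, assuming sampled workcoil waveforms and illustrative parameters (a real system would compute the derivatives in real time on the streaming signals).

        # Compute the first several time derivatives of sampled workcoil current
        # and voltage; the damped ring-down waveforms and sample rate below are
        # illustrative stand-ins for measured signals.
        import numpy as np

        fs = 1.0e6                                  # 1 MHz sampling (assumed)
        t = np.arange(0, 2e-3, 1 / fs)
        current = 50e3 * np.exp(-t / 4e-4) * np.sin(2 * np.pi * 5e3 * t)
        voltage = 5e3 * np.exp(-t / 4e-4) * np.cos(2 * np.pi * 5e3 * t)

        def derivatives(signal, dt, order=3):
            """Return [signal, d/dt, d2/dt2, ...] up to the requested order."""
            out = [signal]
            for _ in range(order):
                out.append(np.gradient(out[-1], dt))
            return out

        i_derivs = derivatives(current, 1 / fs)
        v_derivs = derivatives(voltage, 1 / fs)
        print([float(d[100]) for d in i_derivs])    # I, dI/dt, d2I/dt2, d3I/dt3 at one sample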

  16. Multimedia patient education to assist the informed consent process for knee arthroscopy.

    PubMed

    Cornoiu, Andrei; Beischer, Andrew D; Donnan, Leo; Graves, Stephen; de Steiger, Richard

    2011-03-01

    In contemporary clinical practice, the ability for orthopaedic surgeons to obtain true 'informed consent' is becoming increasingly difficult. This problem has been driven by factors including increased expectations of surgical outcome by patients and increasing complexity of surgical procedures. Surgical pamphlets and computer presentations have been advocated as ways of improving patient education, but evidence of their efficacy is limited. The aim of this study was to compare the efficacy of a computer-based multimedia (MM) presentation against standardized verbal consent and information pamphlets for patients considering knee arthroscopy surgery. A randomized, controlled prospective trial was conducted, comparing the efficacy of three methods of providing preoperative informed consent information to patients. Sixty-one patients were randomly allocated into MM, verbal consent or pamphlet groups 3-6 weeks prior to knee arthroscopy surgery. Information recall after the initial consent process was assessed by questionnaire. Retention of this information was again assessed by questionnaire at the time of surgery and 6 weeks after surgery. The MM group demonstrated a significantly greater proportion of correct responses, 98%, in the questionnaire at the time of consent, in comparison with 88% for verbal and 76% for pamphlet groups, with no difference in anxiety levels. Information was also better retained by the MM group up to 6 weeks after surgery. Patient satisfaction with information delivery was higher in the MM group. MM is an effective tool for aiding in the provision and retention of information during the informed consent process. © 2010 The Authors. ANZ Journal of Surgery © 2010 Royal Australasian College of Surgeons.

  17. Transfer entropy in physical systems and the arrow of time

    NASA Astrophysics Data System (ADS)

    Spinney, Richard E.; Lizier, Joseph T.; Prokopenko, Mikhail

    2016-08-01

    Recent developments have cemented the realization that many concepts and quantities in thermodynamics and information theory are shared. In this paper, we consider a highly relevant quantity in information theory and complex systems, the transfer entropy, and explore its thermodynamic role by considering the implications of time reversal upon it. By doing so we highlight the role of information dynamics on the nuanced question of observer perspective within thermodynamics by relating the temporal irreversibility in the information dynamics to the configurational (or spatial) resolution of the thermodynamics. We then highlight its role in perhaps the most enduring paradox in modern physics, the manifestation of a (thermodynamic) arrow of time. We find that for systems that process information such as those undergoing feedback, a robust arrow of time can be formulated by considering both the apparent physical behavior which leads to conventional entropy production and the information dynamics which leads to a quantity we call the information theoretic arrow of time. We also offer an interpretation in terms of optimal encoding of observed physical behavior.
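
    Transfer entropy is straightforward to estimate for discrete series; the sketch below uses a plug-in (histogram) estimator of T_{Y->X} on binary toy data. Embedding choices and bias correction, which matter in practice, are glossed over here.

        # Plug-in estimate of transfer entropy T_{Y->X} for two binary series:
        # T_{Y->X} = sum p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1}|x_t,y_t) / p(x_{t+1}|x_t) ].
        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y):
            triples = Counter(zip(x[1:], x[:-1], y[:-1]))
            pairs_xy = Counter(zip(x[:-1], y[:-1]))
            pairs_xx = Counter(zip(x[1:], x[:-1]))
            singles_x = Counter(x[:-1])
            n = len(x) - 1
            te = 0.0
            for (x1, x0, y0), c in triples.items():
                p_joint = c / n
                p_cond_xy = c / pairs_xy[(x0, y0)]
                p_cond_x = pairs_xx[(x1, x0)] / singles_x[x0]
                te += p_joint * np.log2(p_cond_xy / p_cond_x)
            return te

        rng = np.random.default_rng(0)
        y = rng.integers(0, 2, 10_000)
        x = np.roll(y, 1); x[0] = 0            # X copies Y with a one-step delay
        print(transfer_entropy(x, y), transfer_entropy(y, x))   # ~1 bit vs ~0 bits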

  18. Provenance information as a tool for addressing engineered nanoparticle reproducibility challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Donald R.; Munusamy, Prabhakaran; Thrall, Brian D.

    Nanoparticles of various types are of increasing research and technological importance in biological and other applications. Difficulties in the production and delivery of nanoparticles with consistent and well defined properties appear in many forms and have a variety of causes. Among several issues are those associated with incomplete information about the history of particles involved in research studies including the synthesis method, sample history after synthesis including time and nature of storage and the detailed nature of any sample processing or modification. In addition, the tendency of particles to change with time or environmental condition suggests that the time betweenmore » analysis and application is important and some type of consistency or verification process can be important. The essential history of a set of particles can be identified as provenance information tells the origin or source of a batch of nano-objects along with information related to handling and any changes that may have taken place since it was originated. A record of sample provenance information for a set of particles can play a useful role in identifying some of the sources and decreasing the extent of particle variability and the observed lack of reproducibility observed by many researchers.« less

  19. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  20. Buffered coscheduling for parallel programming and enhanced fault tolerance

    DOEpatents

    Petrini, Fabrizio [Los Alamos, NM; Feng, Wu-chun [Los Alamos, NM

    2006-01-31

    A computer implemented method schedules processor jobs on a network of parallel machine processors or distributed system processors. Control information communications generated by each process performed by each processor during a defined time interval are accumulated in buffers, where adjacent time intervals are separated by strobe intervals for a global exchange of control information. A global exchange of the control information communications at the end of each defined time interval is performed during an intervening strobe interval so that each processor is informed by all of the other processors of the number of incoming jobs to be received by each processor in a subsequent time interval. The buffered coscheduling method of this invention also enhances the fault tolerance of a network of parallel machine processors or distributed system processors.
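
    A toy simulation of the buffered exchange described in the claim, with illustrative processor counts and job rates (the patent covers the general method, not this particular sketch).

        # Each processor accumulates outgoing job notices in a buffer during a
        # time interval; at the strobe, a global exchange tells every processor
        # how many jobs it will receive in the next interval.
        import random

        N_PROCS, N_INTERVALS = 4, 3
        random.seed(0)

        for interval in range(N_INTERVALS):
            buffers = [[] for _ in range(N_PROCS)]       # per-processor outgoing notices
            for src in range(N_PROCS):
                for _ in range(random.randint(0, 3)):    # jobs generated this interval
                    buffers[src].append(random.randrange(N_PROCS))   # destination processor
            # --- strobe interval: global exchange of control information ---
            incoming_next = [sum(buf.count(dst) for buf in buffers)
                             for dst in range(N_PROCS)]
            print(f"interval {interval}: incoming jobs next interval = {incoming_next}")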

  1. Tracking neural coding of perceptual and semantic features of concrete nouns

    PubMed Central

    Sudre, Gustavo; Pomerleau, Dean; Palatucci, Mark; Wehbe, Leila; Fyshe, Alona; Salmelin, Riitta; Mitchell, Tom

    2015-01-01

    We present a methodological approach employing magnetoencephalography (MEG) and machine learning techniques to investigate the flow of perceptual and semantic information decodable from neural activity in the half second during which the brain comprehends the meaning of a concrete noun. Important information about the cortical location of neural activity related to the representation of nouns in the human brain has been revealed by past studies using fMRI. However, the temporal sequence of processing from sensory input to concept comprehension remains unclear, in part because of the poor time resolution provided by fMRI. In this study, subjects answered 20 questions (e.g. is it alive?) about the properties of 60 different nouns prompted by simultaneous presentation of a pictured item and its written name. Our results show that the neural activity observed with MEG encodes a variety of perceptual and semantic features of stimuli at different times relative to stimulus onset, and in different cortical locations. By decoding these features, our MEG-based classifier was able to reliably distinguish between two different concrete nouns that it had never seen before. The results demonstrate that there are clear differences between the time course of the magnitude of MEG activity and that of decodable semantic information. Perceptual features were decoded from MEG activity earlier in time than semantic features, and features related to animacy, size, and manipulability were decoded consistently across subjects. We also observed that regions commonly associated with semantic processing in the fMRI literature may not show high decoding results in MEG. We believe that this type of approach and the accompanying machine learning methods can form the basis for further modeling of the flow of neural information during language processing and a variety of other cognitive processes. PMID:22565201
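
    A schematic of time-resolved decoding in the spirit of this approach: train and cross-validate a classifier at each time point and note when a stimulus property becomes decodable. The synthetic data, single-time-point features and logistic-regression classifier are assumptions standing in for the study's MEG features and methods.

        # Time-resolved decoding on synthetic sensor data: accuracy rises above
        # chance only in the window where an effect is injected (samples 20-35).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_trials, n_sensors, n_times = 120, 30, 50
        X = rng.standard_normal((n_trials, n_sensors, n_times))
        y = rng.integers(0, 2, n_trials)          # e.g. a binary semantic feature
        X[y == 1, :5, 20:35] += 0.8               # injected class difference

        accuracy = []
        for t in range(n_times):
            scores = cross_val_score(LogisticRegression(max_iter=1000), X[:, :, t], y, cv=5)
            accuracy.append(scores.mean())
        print(np.round(accuracy, 2))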

  2. Exploring laterality and memory effects in the haptic discrimination of verbal and non-verbal shapes.

    PubMed

    Stoycheva, Polina; Tiippana, Kaisa

    2018-03-14

    The brain's left hemisphere often displays advantages in processing verbal information, while the right hemisphere favours processing non-verbal information. In the haptic domain due to contra-lateral innervations, this functional lateralization is reflected in a hand advantage during certain functions. Findings regarding the hand-hemisphere advantage for haptic information remain contradictory, however. This study addressed these laterality effects and their interaction with memory retention times in the haptic modality. Participants performed haptic discrimination of letters, geometric shapes and nonsense shapes at memory retention times of 5, 15 and 30 s with the left and right hand separately, and we measured the discriminability index d'. The d' values were significantly higher for letters and geometric shapes than for nonsense shapes. This might result from dual coding (naming + spatial) or/and from a low stimulus complexity. There was no stimulus-specific laterality effect. However, we found a time-dependent laterality effect, which revealed that the performance of the left hand-right hemisphere was sustained up to 15 s, while the performance of the right-hand-left hemisphere decreased progressively throughout all retention times. This suggests that haptic memory traces are more robust to decay when they are processed by the left hand-right hemisphere.

  3. People-oriented Information Visualization Design

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyong; Zhang, Bolun

    2018-04-01

    In the 21st century, with the rapid development and continuous progress of science and technology, human society has entered the information era and the era of big data, and lifestyles and aesthetic systems have changed accordingly, so the emerging field of information visualization is increasingly popular. Information visualization design is the process of visualizing all kinds of tedious information data so that information can be taken in quickly, saving time and cost. With the development of information visualization, information design has also become increasingly prominent, and emotional, people-oriented design is an indispensable part of information design. This paper probes information visualization design through an emotional analysis of information design, based on the social context of people-oriented experience and from the perspective of art design. The discussion explores information visualization design based on the three levels of emotional information design: the instinct level, the behavior level and the reflective level.

  4. Staff experiences within the implementation of computer-based nursing records in residential aged care facilities: a systematic review and synthesis of qualitative research.

    PubMed

    Meißner, Anne; Schnepp, Wilfried

    2014-06-20

    Since the introduction of electronic nursing documentation systems, their implementation has increased rapidly in Germany in recent years. The objectives of such systems are to save time, to improve information handling and to improve quality. To integrate IT into daily working processes, the employee is the pivotal element. Therefore it is important to understand nurses' experiences with IT implementation. At present the literature shows a lack of understanding of staff experiences within the implementation process. A systematic review and meta-ethnographic synthesis of primary studies using qualitative methods was conducted in PubMed, CINAHL, and Cochrane. It adheres to the principles of the PRISMA statement. The studies were original, peer-reviewed articles from 2000 to 2013, focusing on computer-based nursing documentation in Residential Aged Care Facilities. The use of IT requires a different form of information processing. Some experience this new form of information processing as a benefit while others do not. The latter find it more difficult to enter data, and this results in poor clinical documentation. Improvement in the quality of residents' records leads to an overall improvement in the quality of care. However, if the quality of those records is poor, some residents do not receive the necessary care. Furthermore, the length of time necessary to complete the documentation is a prominent theme within that process. Those who are more efficient with the electronic documentation demonstrate improved time management. For those who are less efficient with electronic documentation, the information processing is perceived as time consuming. Normally, it is possible to experience benefits when using IT, but this depends on either promoting or hindering factors, e.g. ease of use and ability to use it, equipment availability and technical functionality, as well as attitude. In summary, the findings showed that members of staff experience IT as a benefit when it simplifies their daily working routines and as a burden when it complicates their working processes. Whether IT complicates or simplifies their routines depends on influencing factors. The line between benefit and burden is semipermeable. The experiences differ according to duties and responsibilities.

  5. A neuro-inspired spike-based PID motor controller for multi-motor robots with low cost FPGAs.

    PubMed

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

    In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. This controller is focused on controlling DC motor speed, using only spikes for information representation, processing and DC motor driving. It could be applied to other motors with proper driver adaptation. This controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time and implementing a massively parallel information processing system through specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform, designed in our labs, that allows direct control of DC motors: the AER-Robot. Experimental results show the viability of spike-based controllers, and hardware synthesis indicates low hardware requirements that allow replicating this controller in a large number of parallel controllers working together to achieve real-time robot control.

  6. A Neuro-Inspired Spike-Based PID Motor Controller for Multi-Motor Robots with Low Cost FPGAs

    PubMed Central

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J.; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

    In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. This controller is focused on controlling DC motor speed, using only spikes for information representation, processing and DC motor driving. It could be applied to other motors with proper driver adaptation. This controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), implementing a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time and implementing a massively parallel information processing system through specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform, designed in our labs, that allows direct control of DC motors: the AER-Robot. Experimental results show the viability of spike-based controllers, and hardware synthesis indicates low hardware requirements that allow replicating this controller in a large number of parallel controllers working together to achieve real-time robot control. PMID:22666004

  7. The integration of familiarity and recollection information in short-term recognition: modeling speed-accuracy trade-off functions.

    PubMed

    Göthe, Katrin; Oberauer, Klaus

    2008-05-01

    Dual process models postulate familiarity and recollection as the basis of the recognition process. We investigated the time-course of integration of the two information sources to one recognition judgment in a working memory task. We tested 24 subjects with a response signal variant of the modified Sternberg recognition task (Oberauer, 2001) to isolate the time course of three different probe types indicating different combinations of familiarity and source information. We compared two mathematical models implementing different ways of integrating familiarity and recollection. Within each model, we tested three assumptions about the nature of the familiarity signal, with familiarity having (a) only positive values, indicating similarity of the probe with the memory list, (b) only negative values, indicating novelty, or (c) both positive and negative values. Both models provided good fits to the data. A model combining the outputs of both processes additively (Integration Model) gave an overall better fit to the data than a model based on a continuous familiarity signal and a probabilistic all-or-none recollection process (Dominance Model).

  8. Complete information acquisition in scanning probe microscopy

    DOE PAGES

    Belianinov, Alex; Kalinin, Sergei V.; Jesse, Stephen

    2015-03-13

    In the last three decades, scanning probe microscopy (SPM) has emerged as a primary tool for exploring and controlling the nanoworld. A critical part of the SPM measurements is the information transfer from the tip-surface junction to a macroscopic measurement system. This process reduces the many degrees of freedom of a vibrating cantilever to relatively few parameters recorded as images. Similarly, the details of dynamic cantilever response at sub-microsecond time scales of transients, higher-order eigenmodes and harmonics are averaged out by transitioning to the millisecond time scale of pixel acquisition. Hence, the amount of information available to the external observer is severely limited, and its selection is biased by the chosen data processing method. Here, we report a fundamentally new approach for SPM imaging based on information theory-type analysis of the data stream from the detector. This approach allows full exploration of complex tip-surface interactions, spatial mapping of multidimensional variability of materials' properties and their mutual interactions, and SPM imaging at the information channel capacity limit.

  9. 2D first break tomographic processing of data measured for celebration profiles: CEL01, CEL04, CEL05, CEL06, CEL09, CEL11

    NASA Astrophysics Data System (ADS)

    Bielik, M.; Vozar, J.; Hegedus, E.; Celebration Working Group

    2003-04-01

    The contribution reports preliminary results relating to the first-arrival P-wave seismic tomographic processing of data measured along the profiles CEL01, CEL04, CEL05, CEL06, CEL09 and CEL11. These profiles were measured in the framework of the seismic project CELEBRATION 2000. Data acquisition and geometric parameters of the processed profiles, the principle of the tomographic processing, the particular processing steps and the program parameters are described. Characteristic data of the observation profiles (shot points, geophone points, total profile lengths, sampling, sensors and record lengths) are given. The fast program package developed by C. Zelt was applied for the tomographic velocity inversion. This process consists of several steps. The first step is the creation of the starting velocity field, for which the calculated arrival times are modelled by the method of finite differences. The next step is minimization of the differences between the measured and modelled arrival times until the deviation is small. Elimination of the equivalency problem by including a priori information in the starting velocity field was carried out as well. The a priori information consists of the depth to the pre-Tertiary basement, estimation of the overlying sedimentary velocity from well logging and/or other seismic velocity data, etc. After checking the reciprocal times, the picks were corrected. The final result of the processing is a reliable travel-time curve set consistent with the reciprocal times. Picking of the travel-time curves and enhancement of the signal-to-noise ratio on the seismograms were carried out using the PROMAX program system. Tomographic inversion was carried out by a so-called 3D/2D procedure taking into account 3D wave propagation: a corridor along the profile, which also contains the outlying shot points and geophone points, was defined and 3D processing was carried out within this corridor. The preliminary results indicate seismically anomalous zones within the crust and the uppermost part of the upper mantle in the area comprising the Western Carpathians, the North European Platform, the Pannonian Basin and the Bohemian Massif.

  10. Watching diagnoses develop: Eye movements reveal symptom processing during diagnostic reasoning.

    PubMed

    Scholz, Agnes; Krems, Josef F; Jahn, Georg

    2017-10-01

    Finding a probable explanation for observed symptoms is a highly complex task that draws on information retrieval from memory. Recent research suggests that observed symptoms are interpreted in a way that maximizes coherence for a single likely explanation. This becomes particularly clear if symptom sequences support more than one explanation. However, there are no existing process data available that allow coherence maximization to be traced in ambiguous diagnostic situations, where critical information has to be retrieved from memory. In this experiment, we applied memory indexing, an eye-tracking method that affords rich time-course information concerning memory-based cognitive processing during higher order thinking, to reveal symptom processing and the preferred interpretation of symptom sequences. Participants first learned information about causes and symptoms presented in spatial frames. Gaze allocation to emptied spatial frames during symptom processing and during the diagnostic response reflected the subjective status of hypotheses held in memory and the preferred interpretation of ambiguous symptoms. Memory indexing traced how the diagnostic decision developed and revealed instances of hypothesis change and biases in symptom processing. Memory indexing thus provided direct online evidence for coherence maximization in processing ambiguous information.

  11. 29 CFR 1401.34 - Time for processing requests.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... denial shall include an estimate of the volume of records or information withheld, in numbers of pages or... qualify for faster processing. (f) Requests and appeals will be taken out of order and given expedited...

  12. Information processing by networks of quantum decision makers

    NASA Astrophysics Data System (ADS)

    Yukalov, V. I.; Yukalova, E. P.; Sornette, D.

    2018-02-01

    We suggest a model of a multi-agent society of decision makers taking decisions being based on two criteria, one is the utility of the prospects and the other is the attractiveness of the considered prospects. The model is the generalization of quantum decision theory, developed earlier for single decision makers realizing one-step decisions, in two principal aspects. First, several decision makers are considered simultaneously, who interact with each other through information exchange. Second, a multistep procedure is treated, when the agents exchange information many times. Several decision makers exchanging information and forming their judgment, using quantum rules, form a kind of a quantum information network, where collective decisions develop in time as a result of information exchange. In addition to characterizing collective decisions that arise in human societies, such networks can describe dynamical processes occurring in artificial quantum intelligence composed of several parts or in a cluster of quantum computers. The practical usage of the theory is illustrated on the dynamic disjunction effect for which three quantitative predictions are made: (i) the probabilistic behavior of decision makers at the initial stage of the process is described; (ii) the decrease of the difference between the initial prospect probabilities and the related utility factors is proved; (iii) the existence of a common consensus after multiple exchange of information is predicted. The predicted numerical values are in very good agreement with empirical data.

  13. Fisher information in a quantum-critical environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun Zhe; Ma Jian; Lu Xiaoming

    2010-08-15

    We consider a process of parameter estimation in a spin-j system surrounded by a quantum-critical spin chain. Quantum Fisher information lies at the heart of the estimation task. We employ an Ising spin chain in a transverse field as the environment, which exhibits a quantum phase transition. Fisher information decays with time almost monotonically when the environment reaches the critical point. By choosing a fixed time or taking the time average, one can see that the quantum Fisher information exhibits a sudden drop at the critical point. Different initial states of the environment are considered. The phenomenon that the quantum Fisher information, namely, the precision of estimation, changes dramatically can be used to detect the quantum criticality of the environment. We also introduce a general method to obtain the maximal Fisher information for a given state.

  14. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    NASA Astrophysics Data System (ADS)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

    NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of the LANCE and outlines the modifications made to achieve the 3-hour latency requirement with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or approximation of navigational data (depending on platform). Level 0 data are processed in to higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules to relax the requirements for ancillary data so reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on satellite and ground system architectures and on how the delivery of NRT products from other NASA missions might be achieved.

  15. Discovering the influential users oriented to viral marketing based on online social networks

    NASA Astrophysics Data System (ADS)

    Zhu, Zhiguo

    2013-08-01

    The target of viral marketing on the platform of popular online social networks is to rapidly propagate marketing information at lower cost and increase sales, in which a key problem is how to precisely discover the most influential users in the process of information diffusion. A novel method is proposed in this paper for helping companies identify such users as seeds to maximize information diffusion in viral marketing. Firstly, the user trust network oriented to viral marketing and the users' combined interest degree in the network, including isolated users, are defined. Next, we construct a model considering the time factor to simulate the process of information diffusion in viral marketing and propose a dynamic algorithm description. Finally, experiments are conducted with a real dataset extracted from the famous SNS website Epinions. The experimental results indicate that the proposed algorithm has better scalability, is less time-consuming, and outperforms the classical method on both network coverage rate and time consumption in our four sub-datasets.
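
    The paper's model weights diffusion by trust, combined interest degree and time; as a simplified stand-in for the seed-selection step, the sketch below runs greedy influence maximization under a plain independent-cascade model on a random directed graph, with an assumed uniform propagation probability.

        # Greedy seed selection under a simple independent-cascade spread model
        # (Monte Carlo estimate of spread); the graph, probability and seed count
        # are illustrative assumptions, not the paper's trust-weighted model.
        import random
        import networkx as nx

        def simulate_spread(graph, seeds, p=0.1, runs=200):
            total = 0
            for _ in range(runs):
                active, frontier = set(seeds), list(seeds)
                while frontier:
                    nxt = []
                    for u in frontier:
                        for v in graph.successors(u):
                            if v not in active and random.random() < p:
                                active.add(v)
                                nxt.append(v)
                    frontier = nxt
                total += len(active)
            return total / runs

        def greedy_seeds(graph, k, p=0.1):
            seeds = []
            for _ in range(k):
                best = max((n for n in graph if n not in seeds),
                           key=lambda n: simulate_spread(graph, seeds + [n], p))
                seeds.append(best)
            return seeds

        random.seed(0)
        g = nx.gnp_random_graph(100, 0.05, seed=1, directed=True)
        print(greedy_seeds(g, k=3))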

  16. Human-Assisted Machine Information Exploitation: a crowdsourced investigation of information-based problem solving

    NASA Astrophysics Data System (ADS)

    Kase, Sue E.; Vanni, Michelle; Caylor, Justine; Hoye, Jeff

    2017-05-01

    The Human-Assisted Machine Information Exploitation (HAMIE) investigation utilizes large-scale online data collection for developing models of information-based problem solving (IBPS) behavior in a simulated time-critical operational environment. These types of environments are characteristic of intelligence workflow processes conducted during human-geo-political unrest situations when the ability to make the best decision at the right time ensures strategic overmatch. The project takes a systems approach to Human Information Interaction (HII) by harnessing the expertise of crowds to model the interaction of the information consumer and the information required to solve a problem at different levels of system restrictiveness and decisional guidance. The design variables derived from Decision Support Systems (DSS) research represent the experimental conditions in this online single-player against-the-clock game where the player, acting in the role of an intelligence analyst, is tasked with a Commander's Critical Information Requirement (CCIR) in an information overload scenario. The player performs a sequence of three information processing tasks (annotation, relation identification, and link diagram formation) with the assistance of `HAMIE the robot' who offers varying levels of information understanding dependent on question complexity. We provide preliminary results from a pilot study conducted with Amazon Mechanical Turk (AMT) participants on the Volunteer Science scientific research platform.

  17. Laboratory testing in primary care: A systematic review of health IT impacts.

    PubMed

    Maillet, Éric; Paré, Guy; Currie, Leanne M; Raymond, Louis; Ortiz de Guinea, Ana; Trudel, Marie-Claude; Marsan, Josianne

    2018-08-01

    Laboratory testing in primary care is a fundamental process that supports patient management and care. Any breakdown in the process may alter clinical information gathering and decision-making activities and can lead to medical errors and potential adverse outcomes for patients. Various information technologies are being used in primary care with the goal to support the process, maximize patient benefits and reduce medical errors. However, the overall impact of health information technologies on laboratory testing processes has not been evaluated. To synthesize the positive and negative impacts resulting from the use of health information technology in each phase of the laboratory 'total testing process' in primary care. We conducted a systematic review. Databases including Medline, PubMed, CINAHL, Web of Science and Google Scholar were searched. Studies eligible for inclusion reported empirical data on: 1) the use of a specific IT system, 2) the impacts of the systems to support the laboratory testing process, and were conducted in 3) primary care settings (including ambulatory care and primary care offices). Our final sample consisted of 22 empirical studies which were mapped to a framework that outlines the phases of the laboratory total testing process, focusing on phases where medical errors may occur. Health information technology systems support several phases of the laboratory testing process, from ordering the test to following-up with patients. This is a growing field of research with most studies focusing on the use of information technology during the final phases of the laboratory total testing process. The findings were largely positive. Positive impacts included easier access to test results by primary care providers, reduced turnaround times, and increased prescribed tests based on best practice guidelines. Negative impacts were reported in several studies: paper-based processes employed in parallel to the electronic process increased the potential for medical errors due to clinicians' cognitive overload; systems deemed not reliable or user-friendly hampered clinicians' performance; and organizational issues arose when results tracking relied on the prescribers' memory. The potential of health information technology lies not only in the exchange of health information, but also in knowledge sharing among clinicians. This review has underscored the important role played by cognitive factors, which are critical in the clinician's decision-making, the selection of the most appropriate tests, correct interpretation of the results and efficient interventions. By providing the right information, at the right time to the right clinician, many IT solutions adequately support the laboratory testing process and help primary care clinicians make better decisions. However, several technological and organizational barriers require more attention to fully support the highly fragmented and error-prone process of laboratory testing. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. 32 CFR 806.22 - Time limits.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Time limits. 806.22 Section 806.22 National... INFORMATION ACT PROGRAM § 806.22 Time limits. Any FOIA appeals received after the 60-day time limit are not processed, unless the requester provides adequate justification for failing to comply with the time limit...

  19. 32 CFR 806.22 - Time limits.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Time limits. 806.22 Section 806.22 National... INFORMATION ACT PROGRAM § 806.22 Time limits. Any FOIA appeals received after the 60-day time limit are not processed, unless the requester provides adequate justification for failing to comply with the time limit...

  20. 32 CFR 806.22 - Time limits.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Time limits. 806.22 Section 806.22 National... INFORMATION ACT PROGRAM § 806.22 Time limits. Any FOIA appeals received after the 60-day time limit are not processed, unless the requester provides adequate justification for failing to comply with the time limit...

  1. 32 CFR 806.22 - Time limits.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Time limits. 806.22 Section 806.22 National... INFORMATION ACT PROGRAM § 806.22 Time limits. Any FOIA appeals received after the 60-day time limit are not processed, unless the requester provides adequate justification for failing to comply with the time limit...

  2. Predict or classify: The deceptive role of time-locking in brain signal classification

    NASA Astrophysics Data System (ADS)

    Rusconi, Marco; Valleriani, Angelo

    2016-06-01

    Several experimental studies claim to be able to predict the outcome of simple decisions from brain signals measured before subjects are aware of their decision. Often, these studies use multivariate pattern recognition methods with the underlying assumption that the ability to classify the brain signal is equivalent to predicting the decision itself. Here we show instead that it is possible to correctly classify a signal even if it does not contain any predictive information about the decision. We first define a simple stochastic model that mimics the random decision process between two equivalent alternatives, and generate a large number of independent trials that contain no choice-predictive information. The trials are first time-locked to the time point of the final event and then classified using standard machine-learning techniques. The resulting classification accuracy is above chance level long before the time point of time-locking. We then analyze the same trials using information theory. We demonstrate that the high classification accuracy is a consequence of time-locking and that its time behavior is simply related to the large relaxation time of the process. We conclude that when time-locking is a crucial step in the analysis of neural activity patterns, both the emergence and the timing of the classification accuracy are affected by structural properties of the network that generates the signal.
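
    The time-locking artifact described above is easy to reproduce. The following sketch (not the authors' code; it assumes NumPy and scikit-learn) simulates an unbiased random walk whose "decision" is whichever bound it happens to hit, time-locks each trial to the bound crossing, and classifies the decision from single pre-event samples; accuracy stays above chance well before the event purely because of time-locking and the walk's autocorrelation.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)

      def simulate_trial(n_pre=200, sigma=0.1, bound=1.0, max_steps=20000):
          """Unbiased random walk; returns the n_pre samples preceding the bound
          crossing (time-locked to the 'decision') and which bound was hit."""
          x, traj = 0.0, []
          for _ in range(max_steps):
              x += sigma * rng.standard_normal()
              traj.append(x)
              if abs(x) >= bound:
                  break
          choice = int(traj[-1] > 0)
          pre = np.full(n_pre, np.nan)
          seg = np.asarray(traj[:-1][-n_pre:])      # samples strictly before the crossing
          if seg.size:
              pre[-seg.size:] = seg
          return pre, choice

      trials = [simulate_trial() for _ in range(500)]
      X = np.nan_to_num(np.array([t[0] for t in trials]))   # short trials padded with 0
      y = np.array([t[1] for t in trials])

      # Accuracy from a single sample taken at increasing lead times before the event
      for lead in (1, 50, 100, 150):
          acc = cross_val_score(LogisticRegression(), X[:, [-lead]], y, cv=5).mean()
          print(f"{lead:4d} samples before time-locking point: accuracy = {acc:.2f}")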

  3. Effects of reverberation time on the cognitive load in speech communication: theoretical considerations.

    PubMed

    Kjellberg, A

    2004-01-01

    The paper presents a theoretical analysis of possible effects of reverberation time on the cognitive load in speech communication. Speech comprehension requires not only phonological processing of the spoken words. Simultaneously, this information must be further processed and stored. All this processing takes place in the working memory, which has a limited processing capacity. The more resources that are allocated to word identification, the fewer resources are therefore left for the further processing and storing of the information. Reverberation conditions that allow the identification of almost all words may therefore still interfere with speech comprehension and memory storing. These problems are likely to be especially serious in situations where speech has to be followed continuously for a long time. An unfavourable reverberation time (RT) then could contribute to the development of cognitive fatigue, which means that working memory resources are gradually reduced. RT may also affect the cognitive load in two other ways: RT may change the distracting effects of a sound and a person's mood. Both effects could influence the cognitive load of a listener. It is argued that we need studies of RT effects in realistic long-lasting listening situations to better understand the effect of RT on speech communication. Furthermore, the effect of RT on distraction and mood need to be better understood.

  4. Spatial hearing benefits demonstrated with presentation of acoustic temporal fine structure cues in bilateral cochlear implant listeners.

    PubMed

    Churchill, Tyler H; Kan, Alan; Goupell, Matthew J; Litovsky, Ruth Y

    2014-09-01

    Most contemporary cochlear implant (CI) processing strategies discard acoustic temporal fine structure (TFS) information, and this may contribute to the observed deficits in bilateral CI listeners' ability to localize sounds when compared to normal hearing listeners. Additionally, for best speech envelope representation, most contemporary speech processing strategies use high-rate carriers (≥900 Hz) that exceed the limit for interaural pulse timing to provide useful binaural information. Many bilateral CI listeners are sensitive to interaural time differences (ITDs) in low-rate (<300 Hz) constant-amplitude pulse trains. This study explored the trade-off between superior speech temporal envelope representation with high-rate carriers and binaural pulse timing sensitivity with low-rate carriers. The effects of carrier pulse rate and pulse timing on ITD discrimination, ITD lateralization, and speech recognition in quiet were examined in eight bilateral CI listeners. Stimuli consisted of speech tokens processed at different electrical stimulation rates, and pulse timings that either preserved or did not preserve acoustic TFS cues. Results showed that CI listeners were able to use low-rate pulse timing cues derived from acoustic TFS when presented redundantly on multiple electrodes for ITD discrimination and lateralization of speech stimuli.

  5. Semantic integration of differently asynchronous audio-visual information in videos of real-world events in cognitive processing: an ERP study.

    PubMed

    Liu, Baolin; Wu, Guangning; Wang, Zhongning; Ji, Xiang

    2011-07-01

    In the real world, some of the auditory and visual information received by the human brain are temporally asynchronous. How is such information integrated in cognitive processing in the brain? In this paper, we aimed to study the semantic integration of differently asynchronous audio-visual information in cognitive processing using ERP (event-related potential) method. Subjects were presented with videos of real world events, in which the auditory and visual information are temporally asynchronous. When the critical action was prior to the sound, sounds incongruous with the preceding critical actions elicited a N400 effect when compared to congruous condition. This result demonstrates that semantic contextual integration indexed by N400 also applies to cognitive processing of multisensory information. In addition, the N400 effect is early in latency when contrasted with other visually induced N400 studies. It is shown that cross modal information is facilitated in time when contrasted with visual information in isolation. When the sound was prior to the critical action, a larger late positive wave was observed under the incongruous condition compared to congruous condition. P600 might represent a reanalysis process, in which the mismatch between the critical action and the preceding sound was evaluated. It is shown that environmental sound may affect the cognitive processing of a visual event. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  6. 40 CFR 57.203 - Contents of the application.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... application shall also contain the following information: (1) A process flow diagram of the smelter, including current process and instrumentation diagrams for all processes or equipment which may emit or affect the... equipment (flow rates, temperature, volumes, compositions, and variations over time); and a list of all...

  7. 40 CFR 57.203 - Contents of the application.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... application shall also contain the following information: (1) A process flow diagram of the smelter, including current process and instrumentation diagrams for all processes or equipment which may emit or affect the... equipment (flow rates, temperature, volumes, compositions, and variations over time); and a list of all...

  8. 40 CFR 57.203 - Contents of the application.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... application shall also contain the following information: (1) A process flow diagram of the smelter, including current process and instrumentation diagrams for all processes or equipment which may emit or affect the... equipment (flow rates, temperature, volumes, compositions, and variations over time); and a list of all...

  9. 40 CFR 57.203 - Contents of the application.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... application shall also contain the following information: (1) A process flow diagram of the smelter, including current process and instrumentation diagrams for all processes or equipment which may emit or affect the... equipment (flow rates, temperature, volumes, compositions, and variations over time); and a list of all...

  10. A discrimination-association model for decomposing component processes of the implicit association test.

    PubMed

    Stefanutti, Luca; Robusto, Egidio; Vianello, Michelangelo; Anselmi, Pasquale

    2013-06-01

    A formal model is proposed that decomposes the implicit association test (IAT) effect into three process components: stimuli discrimination, automatic association, and termination criterion. Both response accuracy and reaction time are considered. Four independent and parallel Poisson processes, one for each of the four label categories of the IAT, are assumed. The model parameters are the rate at which information accrues on the counter of each process and the amount of information that is needed before a response is given. The aim of this study is to present the model and an illustrative application in which the process components of a Coca-Pepsi IAT are decomposed.
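
    A minimal simulation of this kind of Poisson race is sketched below (illustrative only; the parameter values, the two-counter simplification, and the compatible/incompatible labels are assumptions, not estimates from the paper). Each counter accrues evidence at its own Poisson rate, and the first counter to reach the response threshold determines both the response and its latency, so accuracy and reaction time emerge from the same process parameters.

      import numpy as np

      rng = np.random.default_rng(1)

      def poisson_race(rate_target, rate_other, threshold, base_time=0.3):
          """One trial of a two-counter Poisson race: counts accrue with exponential
          inter-arrival times, and the first counter to reach `threshold` determines
          the response and its latency."""
          t_target = rng.exponential(1.0 / rate_target, threshold).sum()
          t_other = rng.exponential(1.0 / rate_other, threshold).sum()
          correct = t_target < t_other
          return correct, base_time + min(t_target, t_other)

      # Illustrative parameters: information accrues faster in the "compatible"
      # block than in the "incompatible" block, producing the usual IAT effect.
      for label, rate in [("compatible", 12.0), ("incompatible", 8.0)]:
          trials = [poisson_race(rate, rate_other=3.0, threshold=5) for _ in range(10_000)]
          acc = np.mean([c for c, _ in trials])
          rt = np.mean([t for _, t in trials])
          print(f"{label:12s}: accuracy = {acc:.3f}, mean RT = {rt:.3f} s")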

  11. [Digitalization of radiological imaging information and consequences for patient care in the hospital ].

    PubMed

    den Heeten, G J; Barneveld Binkhuysen, F H

    2001-08-25

    Determining the rate at which radiology must be digitalised has been a controversial issue for many years. Much radiological information is still obtained from the film-screen combination (X-rays) with all of its known inherent restrictions. The importance of imaging information in the healthcare process continues to increase for both radiologists and referring physicians, and the ongoing developments in information technology means that it is possible to integrate imaging information and electronic patient files. The healthcare process can only become more effective and efficient when the appropriate information is in the right place at the right time, something that conventional methods, using photos that need to be physically moved, can scarcely satisfy. There is also a desire for integration with information obtained from nuclear medicine, pathology and endoscopy, and eventually of all stand-alone data systems with relevance for the individually oriented hospital healthcare. The transition from a conventional to a digital process is complex; it is accompanied by the transition from a data-oriented to a process-oriented system. Many years have already been invested in the integration of information systems and the development of digital systems within radiology, the current performance of which is such that many hospitals are considering the digitalisation process or are already implementing parts of it.

  12. Judging nursing information on the WWW: a theoretical understanding.

    PubMed

    Cader, Raffik; Campbell, Steve; Watson, Don

    2009-09-01

    This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.

  13. Temporal characteristics of audiovisual information processing.

    PubMed

    Fuhrmann Alpert, Galit; Hein, Grit; Tsai, Nancy; Naumer, Marcus J; Knight, Robert T

    2008-05-14

    In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency in which the blood oxygenation level-dependent signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, which is known as a region involved in object-related AV integration. Informative activity in the anterior portion of superior temporal gyrus, middle temporal gyrus, right occipital cortex, and inferior frontal cortex was found at a later latency. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
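
    The latency-of-maximal-information idea can be illustrated with a small sketch (assuming NumPy and scikit-learn; the simulated voxel, lag grid, and binning are placeholders for the authors' fMRI pipeline, not a reproduction of it): for each candidate latency, compute the mutual information between the stimulus label and the discretized response, and take the lag with the highest value as that voxel's informative latency.

      import numpy as np
      from sklearn.metrics import mutual_info_score

      rng = np.random.default_rng(2)

      n_trials, n_lags = 400, 12
      labels = rng.integers(0, 2, n_trials)      # stimulus condition on each trial
      true_lag = 5                               # lag at which this voxel is informative

      # Simulated single-voxel responses across lags: informative only at `true_lag`.
      responses = rng.standard_normal((n_trials, n_lags))
      responses[:, true_lag] += 1.5 * labels

      def mi_at_lag(signal, y, bins=8):
          """Mutual information between condition labels and the binned signal at one lag."""
          binned = np.digitize(signal, np.histogram_bin_edges(signal, bins=bins))
          return mutual_info_score(y, binned)

      mi = np.array([mi_at_lag(responses[:, lag], labels) for lag in range(n_lags)])
      print("estimated informative latency (lag index):", int(mi.argmax()))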

  14. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  15. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

    Large-scale combustion processes are complex and pose challenges for optimizing their performance. Traditional approaches based on thermal dynamics have limitations in finding optimal operational regions due to the time-shift nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science that finds patterns or models in large data sets. It has found many successful applications in business marketing, medical and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing combustion performance. However, the philosophy, methods and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process presents two major challenges. One is that the underlying process model changes over time, so obtaining an accurate process model is nontrivial. The other is that a high-fidelity process model is usually highly nonlinear, so solving the optimization problem requires efficient heuristics. This dissertation sets out to solve these two challenges. The major contribution of this four-year research is a data-driven solution for optimizing the combustion process, in which a process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.
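
    A compact sketch of that two-step, data-driven loop is shown below (synthetic data and an off-the-shelf regressor stand in for the identified combustion model, and a simple evolution strategy stands in for the dissertation's evolutionary algorithms): first a process model is fitted from operational data, then the model's input space is searched for high-performing operating regions.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(3)

      # Step 1: identify a process model from (synthetic) operational data.
      X = rng.uniform(0, 1, (2000, 4))                      # controllable settings
      y = -(X[:, 0] - 0.7) ** 2 - (X[:, 1] - 0.3) ** 2 + 0.05 * rng.standard_normal(2000)
      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

      # Step 2: search the model's input space with a simple (mu + lambda) evolution strategy.
      pop = rng.uniform(0, 1, (30, 4))
      for _ in range(50):
          children = np.clip(pop + 0.05 * rng.standard_normal(pop.shape), 0, 1)
          candidates = np.vstack([pop, children])
          pop = candidates[np.argsort(model.predict(candidates))[-30:]]   # keep the best 30

      best = pop[np.argmax(model.predict(pop))]
      print("suggested operating point:", np.round(best, 2))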

  16. [Neural Mechanisms Underlying the Processing of Temporal Information in Episodic Memory and Its Disturbance].

    PubMed

    Iwata, Saeko; Tsukiura, Takashi

    2017-11-01

    Episodic memory is defined as memory for personally experienced events, and includes memory content and contextual information of time and space. Previous neuroimaging and neuropsychological studies have demonstrated three possible roles of the temporal context in episodic memory. First, temporal information contributes to the arrangement of temporal order for sequential events in episodic memory, and this process is involved in the lateral prefrontal cortex. The second possible role of temporal information in episodic memory is the segregation between memories of multiple events, which are segregated by cues of different time information. The role of segregation is associated with the orbitofrontal regions including the orbitofrontal cortex and basal forebrain region. Third, temporal information in episodic memory plays an important role in the integration of multiple components into a coherent episodic memory, in which episodic components in the different modalities are combined by temporal information as an index. The role of integration is mediated by the medial temporal lobe including the hippocampus and parahippocampal gyrus. Thus, temporal information in episodic memory could be represented in multiple stages, which are involved in a network of the lateral prefrontal, orbitofrontal, and medial temporal lobe regions.

  17. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

    In everyday environments, people must ensure that all of their work is completed within a given time and to an acceptable quality. To make full use of process mining, these daily processes first need to be understood in detail. Personal information and communication tools capture people's daily schedules, locations, surroundings and, more generally, social media activity, making data available in the form of event logs that can support both data analysis and process analysis combining environmental and location information. Process mining can exploit these real-life processes through the event logs already available in such datasets, whether the data are user-sensed or user-labelled. The resulting models can be used to redesign a person's daily flow and to understand these processes in greater detail: improving the quality of everyday processes requires looking closely at each of them and, after analysis, making changes to obtain better results. In this work, we applied process mining techniques to a single dataset collected in Korea covering seven different subjects. The paper comments on the efficiency of the processes recorded in the event logs from a time-management perspective.
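
    As a concrete illustration of the kind of analysis involved (the event-log schema, activities, and timestamps below are invented, not taken from the Korean dataset), a basic process-mining step counts directly-follows relations between activities and the time spent between them, which is the raw material for a discovered process map:

      from collections import Counter, defaultdict
      from datetime import datetime

      # A toy event log of (case id, activity, timestamp); the schema is assumed.
      log = [
          ("day1", "wake up", "2018-04-01 07:00"),
          ("day1", "commute", "2018-04-01 08:00"),
          ("day1", "work",    "2018-04-01 09:00"),
          ("day2", "wake up", "2018-04-02 07:30"),
          ("day2", "work",    "2018-04-02 09:00"),
      ]

      cases = defaultdict(list)
      for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
          cases[case].append((activity, datetime.strptime(ts, "%Y-%m-%d %H:%M")))

      # Directly-follows counts and mean transition times: the raw material of a process map.
      follows, gaps = Counter(), defaultdict(list)
      for events in cases.values():
          for (a, ta), (b, tb) in zip(events, events[1:]):
              follows[(a, b)] += 1
              gaps[(a, b)].append((tb - ta).total_seconds() / 3600)

      for (a, b), n in follows.items():
          print(f"{a} -> {b}: {n} time(s), mean gap {sum(gaps[(a, b)]) / n:.1f} h")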

  18. Accessing Information in Working Memory: Can the Focus of Attention Grasp Two Elements at the Same Time?

    ERIC Educational Resources Information Center

    Oberauer, Klaus; Bialkova, Svetlana

    2009-01-01

    Processing information in working memory requires selective access to a subset of working-memory contents by a focus of attention. Complex cognition often requires joint access to 2 items in working memory. How does the focus select 2 items? Two experiments with an arithmetic task and 1 with a spatial task investigate time demands for successive…

  19. Improving women's knowledge of prostaglandin induction of labour through the use of information brochures: a quasi-experimental study.

    PubMed

    Cooper, Megan; Warland, Jane

    2011-12-01

    To gain a better understanding of women's baseline level of knowledge of induction of labour (IOL) and determine whether giving written information at the time IOL is decided, results in significant differences in knowledge and understanding of the process. Fifty pregnant women undergoing antenatal care at a small maternity hospital were recruited. A quasi experimental trial was conducted with non random selection of participants, 25 selected to act as the control group and 25 selected as the intervention group. The study was conducted to determine women's knowledge of IOL both before (non-intervention) and after (intervention) the introduction of a written information brochure. Statistically significant increases in knowledge were evident in the intervention group for knowledge about action (p=0.002) and timing of prostaglandins (p=0.03), the number of side effects known (p<0.0001) as well as time to birth (p=0.001) indicating an increased understanding of the process as a result of reading an information brochure. These results suggest that those in the non-intervention group lacked knowledge pertinent to IOL, even though they have consented to and actually arrived at the hospital prepared to undergo the IOL procedure. The most significant disparity noted between the intervention and non-intervention groups was women's knowledge of side effects of prostaglandin. Further to this, many women in the non-intervention group had unrealistic expectations of both the time for drug action and likely time from prostaglandin administration to birth. In contrast women in the intervention group knew about the common side effects of prostaglandin and possessed a more realistic understanding of the likely time to birth following this procedure. The results indicate that a specifically designed information brochure explaining the process of IOL in plain language has the effect of enhancing women's knowledge. This area of study warrants further investigation, especially research into the role of written information to improve women's understanding across other areas of maternity care education provision. Copyright © 2010 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  20. Unattended real-time re-establishment of visibility in high dynamic range video and stills

    NASA Astrophysics Data System (ADS)

    Abidi, B.

    2014-05-01

    We describe a portable unattended persistent surveillance system that corrects for harsh illumination conditions, where bright sunlight creates mixed contrast effects, i.e., heavy shadows and washouts. These effects result in high dynamic range scenes, where illuminance can vary from a few lux to six-figure values. When using regular monitors and cameras, such a wide span of illumination can only be visualized if the actual range of values is compressed, leading to the creation of saturated and/or dark noisy areas and a loss of information in these areas. Images containing extreme mixed contrast cannot be fully enhanced from a single exposure, simply because not all of the information is present in the original data. Active intervention in the acquisition process is therefore required. A software package, capable of integrating multiple types of COTS and custom cameras, ranging from Unmanned Aerial Systems (UAS) data links to digital single-lens reflex cameras (DSLR), is described. Hardware and software are integrated via a novel smart data acquisition algorithm, which communicates to the camera the parameters that would maximize information content in the final processed scene. A fusion mechanism is then applied to the smartly acquired data, resulting in an enhanced scene where information in both dark and bright areas is revealed. Multi-threading and parallel processing are exploited to produce automatic, real-time, full-motion corrected video. A novel enhancement algorithm was also devised to process data from legacy and non-controllable cameras. The software accepts and processes pre-recorded sequences and stills, enhances visible, night vision, and infrared data, and successfully applies to nighttime and dark scenes. Various user options are available, integrating custom functionalities of the application into intuitive and easy-to-use graphical interfaces. The ensuing increase in visibility in surveillance video and intelligence imagery will expand the performance and timely decision making of the human analyst, as well as that of unmanned systems performing automatic data exploitation, such as target detection and identification.
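
    The fusion step can be illustrated with a highly simplified per-pixel sketch (assuming NumPy; this is not the authors' algorithm, which operates on smartly acquired full-motion video): each differently exposed frame is weighted by how well exposed its pixels are, so detail from both shadowed and washed-out regions survives in the fused result.

      import numpy as np

      def well_exposedness(img, sigma=0.2):
          """Weight pixels near mid-gray highly; saturated or very dark pixels get low weight."""
          return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))

      def fuse_exposures(frames):
          """Per-pixel weighted average of differently exposed frames (values in [0, 1])."""
          frames = np.stack(frames).astype(np.float64)
          weights = well_exposedness(frames) + 1e-8
          return (weights * frames).sum(axis=0) / weights.sum(axis=0)

      # Synthetic example: a short and a long exposure of the same wide-range scene.
      scene = np.linspace(0, 4, 256).reshape(16, 16)     # "true" radiance
      short = np.clip(scene * 0.25, 0, 1)                # preserves highlights
      long_ = np.clip(scene * 1.5, 0, 1)                 # preserves shadows
      fused = fuse_exposures([short, long_])
      print("fused image range:", float(fused.min()), "to", float(fused.max()))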

  1. The Mediation of Mothers’ Self-Fulfilling Effects on Their Children’s Alcohol Use: Self-Verification, Informational Conformity and Modeling Processes

    PubMed Central

    Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard

    2010-01-01

    This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother – child dyads (N1 = 487; N2 = 287). Children’s alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers’ beliefs on children’s alcohol use through children’s self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers’ self-fulfilling effects. The potential for self-fulfilling prophecies to produce long lasting changes in targets’ behavior via self-verification processes are discussed. PMID:18665708

  2. The mediation of mothers' self-fulfilling effects on their children's alcohol use: self-verification, informational conformity, and modeling processes.

    PubMed

    Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard

    2008-08-01

    This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N-sub-1 = 486; N-sub-2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes are discussed. (c) 2008 APA, all rights reserved

  3. Strategies to Reduce the Negative Effects of Spoken Explanatory Text on Integrated Tasks

    ERIC Educational Resources Information Center

    Singh, Anne-Marie; Marcus, Nadine; Ayres, Paul

    2017-01-01

    Two experiments involving 125 grade-10 students learning about commerce investigated strategies to overcome the transient information effect caused by explanatory spoken text. The transient information effect occurs when learning is reduced as a result of information disappearing before the learner has time to adequately process it, or link it…

  4. Improving Memory after Interruption: Exploiting Soft Constraints and Manipulating Information Access Cost

    ERIC Educational Resources Information Center

    Morgan, Phillip L.; Patrick, John; Waldron, Samuel M.; King, Sophia L.; Patrick, Tanya

    2009-01-01

    Forgetting what one was doing prior to interruption is an everyday problem. The recent soft constraints hypothesis (Gray, Sims, Fu, & Schoelles, 2006) emphasizes the strategic adaptation of information processing strategy to the task environment. It predicts that increasing information access cost (IAC: the time, and physical and mental effort…

  5. Methods for evaluating information in managing the enterprise on the basis of a hybrid three-tier system

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-01-01

    The article presents data on the influence of information upon the functioning of complex systems in the process of ensuring their effective management. Ways and methods are proposed for evaluating multidimensional information that reduce the time and resources required and improve the validity of management decisions for the system under study.

  6. Non-local correlations via Wigner-Yanase skew information in two SC-qubit having mutual interaction under phase decoherence

    NASA Astrophysics Data System (ADS)

    Mohamed, Abdel-Baset A.

    2017-10-01

    An analytical solution of the master equation that describes a superconducting cavity containing two coupled superconducting charge qubits is obtained. Quantum-mechanical correlations based on Wigner-Yanase skew information, namely local quantum uncertainty and uncertainty-induced quantum non-locality, are compared to the concurrence under the effects of phase decoherence. Local quantum uncertainty exhibits sudden changes during its time evolution and revival process. Sudden death and sudden birth occur only for entanglement, depending on the initial state of the two coupled charge qubits, while the skew-information correlations do not vanish. The quantum correlations of skew information are found to be sensitive to the dephasing rate, the photon number in the cavity, the interaction strength between the two qubits, and the qubit distribution angle of the initial state. With a proper initial state, the skew-information correlation retains a non-zero stationary value over a long time interval under phase decoherence, which may be useful in quantum information and computation processes.
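
    For reference, the skew-information quantities used here are built on the standard Wigner-Yanase definition; writing ρ for the two-qubit state and K^A for a local observable with a nondegenerate spectrum, the local quantum uncertainty is obtained by minimizing the skew information over such local observables:

      I(\rho, K) \;=\; -\tfrac{1}{2}\,\mathrm{Tr}\!\left( \left[ \sqrt{\rho},\, K \right]^{2} \right),
      \qquad
      \mathcal{U}(\rho) \;=\; \min_{K^{A}} \, I\!\left( \rho,\; K^{A} \otimes \mathbb{1}^{B} \right)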

  7. Air Force Journal of Logistics, Volume 32, Number 2, Summer 2008

    DTIC Science & Technology

    2008-01-01

    inventory, overproduction, waiting time, motion, transportation, and over-processing waste. Waste is often placed into the following categories (D-O-W-N-T-I-M... simpler tools would be sufficient. * Transportation: moving product between processes is a cost that adds no value to the product. * Intellect: human... Transportation. This is the unnecessary movement of information or materials. Examples include physical hand-off of information and moving

  8. The time course of syntactic activation during language processing: a model based on neuropsychological and neurophysiological data.

    PubMed

    Friederici, A D

    1995-09-01

    This paper presents a model describing the temporal and neurotopological structure of syntactic processes during comprehension. It postulates three distinct phases of language comprehension, two of which are primarily syntactic in nature. During the first phase the parser assigns the initial syntactic structure on the basis of word category information. These early structural processes are assumed to be subserved by the anterior parts of the left hemisphere, as event-related brain potentials show this area to be maximally activated when phrase structure violations are processed and as circumscribed lesions in this area lead to an impairment of the on-line structural assignment. During the second phase lexical-semantic and verb-argument structure information is processed. This phase is neurophysiologically manifest in a negative component in the event-related brain potential around 400 ms after stimulus onset which is distributed over the left and right temporo-parietal areas when lexical-semantic information is processed and over left anterior areas when verb-argument structure information is processed. During the third phase the parser tries to map the initial syntactic structure onto the available lexical-semantic and verb-argument structure information. In case of an unsuccessful match between the two types of information reanalyses may become necessary. These processes of structural reanalysis are correlated with a centroparietally distributed late positive component in the event-related brain potential.(ABSTRACT TRUNCATED AT 250 WORDS)

  9. Some Behavioral and Neurobiological Constraints on Theories of Audiovisual Speech Integration: A Review and Suggestions for New Directions

    PubMed Central

    Altieri, Nicholas; Pisoni, David B.; Townsend, James T.

    2012-01-01

    Summerfield (1987) proposed several accounts of audiovisual speech perception, a field of research that has burgeoned in recent years. The proposed accounts included the integration of discrete phonetic features, vectors describing the values of independent acoustical and optical parameters, the filter function of the vocal tract, and articulatory dynamics of the vocal tract. The latter two accounts assume that the representations of audiovisual speech perception are based on abstract gestures, while the former two assume that the representations consist of symbolic or featural information obtained from visual and auditory modalities. Recent converging evidence from several different disciplines reveals that the general framework of Summerfield’s feature-based theories should be expanded. An updated framework building upon the feature-based theories is presented. We propose a processing model arguing that auditory and visual brain circuits provide facilitatory information when the inputs are correctly timed, and that auditory and visual speech representations do not necessarily undergo translation into a common code during information processing. Future research on multisensory processing in speech perception should investigate the connections between auditory and visual brain regions, and utilize dynamic modeling tools to further understand the timing and information processing mechanisms involved in audiovisual speech integration. PMID:21968081

  10. Nurses' Experiences of an Initial and Reimplemented Electronic Health Record Use.

    PubMed

    Chang, Chi-Ping; Lee, Ting-Ting; Liu, Chia-Hui; Mills, Mary Etta

    2016-04-01

    The electronic health record is a key component of healthcare information systems. Currently, numerous hospitals have adopted electronic health records to replace paper-based records, document care processes and improve care quality. Integrating a healthcare information system into traditional daily nursing operations requires time and effort for nurses to become familiar with the new technology. In the stages of electronic health record implementation, smooth adoption can streamline clinical nursing activities. To explore the adoption process, a descriptive qualitative study with focus group interviews was conducted 3 months after and 2 years after electronic health record system implementation (the system was aborted for 1 year in between) in one hospital located in southern Taiwan. Content analysis was performed on the interview data, and six main themes were derived. In the first stage: (1) liability, work stress, and anticipation regarding the electronic health record; (2) slow network speed and user-unfriendly design hindering the learning process; (3) insufficient information technology/organizational support. In the second stage: (4) getting used to the electronic health record and identifying further system requirements; (5) benefits of the electronic health record in time saving and documentation; (6) unrealistic expectations of information technology competence and future use. It was concluded that user-friendly design, together with support from informatics technology and manpower backup, would facilitate the adoption process.

  11. Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

    PubMed

    Altieri, Nicholas; Pisoni, David B; Townsend, James T

    2011-01-01

    Summerfield (1987) proposed several accounts of audiovisual speech perception, a field of research that has burgeoned in recent years. The proposed accounts included the integration of discrete phonetic features, vectors describing the values of independent acoustical and optical parameters, the filter function of the vocal tract, and articulatory dynamics of the vocal tract. The latter two accounts assume that the representations of audiovisual speech perception are based on abstract gestures, while the former two assume that the representations consist of symbolic or featural information obtained from visual and auditory modalities. Recent converging evidence from several different disciplines reveals that the general framework of Summerfield's feature-based theories should be expanded. An updated framework building upon the feature-based theories is presented. We propose a processing model arguing that auditory and visual brain circuits provide facilitatory information when the inputs are correctly timed, and that auditory and visual speech representations do not necessarily undergo translation into a common code during information processing. Future research on multisensory processing in speech perception should investigate the connections between auditory and visual brain regions, and utilize dynamic modeling tools to further understand the timing and information processing mechanisms involved in audiovisual speech integration.

  12. Two-dimensional thermal video analysis of offshore bird and bat flight

    DOE PAGES

    Matzner, Shari; Cullinan, Valerie I.; Duberstein, Corey A.

    2015-09-11

    Thermal infrared video can provide essential information about bird and bat presence and activity for risk assessment studies, but the analysis of recorded video can be time-consuming and may not extract all of the available information. Automated processing makes continuous monitoring over extended periods of time feasible, and maximizes the information provided by video. This is especially important for collecting data in remote locations that are difficult for human observers to access, such as proposed offshore wind turbine sites. We present guidelines for selecting an appropriate thermal camera based on environmental conditions and the physical characteristics of the target animals. We developed new video image processing algorithms that automate the extraction of bird and bat flight tracks from thermal video, and that characterize the extracted tracks to support animal identification and behavior inference. The algorithms use a video peak store process followed by background masking and perceptual grouping to extract flight tracks. The extracted tracks are automatically quantified in terms that could then be used to infer animal type and possibly behavior. The developed automated processing generates results that are reproducible and verifiable, and reduces the total amount of video data that must be retained and reviewed by human experts. Finally, we suggest models for interpreting thermal imaging information.
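
    The first two stages named here (video peak store, then background masking) can be sketched in a few lines (assuming NumPy; the synthetic clip, threshold, and array sizes are illustrative, and the perceptual-grouping step that strings candidate pixels into tracks is omitted):

      import numpy as np

      rng = np.random.default_rng(4)

      # Synthetic thermal clip (frames, height, width): a warm target crossing a cool background.
      frames = rng.normal(20.0, 0.3, (60, 64, 64))
      for t in range(60):
          frames[t, 32, min(2 + t, 63)] += 8.0          # the moving "bird"

      # Video peak store: keep, per pixel, the maximum value observed over the clip.
      peak = frames.max(axis=0)

      # Background masking: keep only pixels whose peak stands well above the background.
      background = np.median(frames, axis=0)
      mask = (peak - background) > 5.0                   # threshold chosen for illustration

      rows, cols = np.nonzero(mask)
      print(f"{mask.sum()} candidate track pixels, spanning columns {cols.min()} to {cols.max()}")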

  13. Two-dimensional thermal video analysis of offshore bird and bat flight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Cullinan, Valerie I.; Duberstein, Corey A.

    Thermal infrared video can provide essential information about bird and bat presence and activity for risk assessment studies, but the analysis of recorded video can be time-consuming and may not extract all of the available information. Automated processing makes continuous monitoring over extended periods of time feasible, and maximizes the information provided by video. This is especially important for collecting data in remote locations that are difficult for human observers to access, such as proposed offshore wind turbine sites. We present guidelines for selecting an appropriate thermal camera based on environmental conditions and the physical characteristics of the target animals. We developed new video image processing algorithms that automate the extraction of bird and bat flight tracks from thermal video, and that characterize the extracted tracks to support animal identification and behavior inference. The algorithms use a video peak store process followed by background masking and perceptual grouping to extract flight tracks. The extracted tracks are automatically quantified in terms that could then be used to infer animal type and possibly behavior. The developed automated processing generates results that are reproducible and verifiable, and reduces the total amount of video data that must be retained and reviewed by human experts. Finally, we suggest models for interpreting thermal imaging information.

  14. Activity-based costing via an information system: an application created for a breast imaging center.

    PubMed

    Hawkins, H; Langer, J; Padua, E; Reaves, J

    2001-06-01

    Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis.
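
    The underlying ABC arithmetic is simple: the cost of a study is the sum over activities of the cost-driver rate times the driver quantity the study consumes. The sketch below uses invented activities and rates (not figures from the breast center) to show the roll-up:

      # Activity-based costing sketch: cost-driver rates and the driver quantities
      # consumed by one imaging study (all figures invented for illustration).
      driver_rates = {
          "scheduling (per call)": 4.00,
          "image acquisition (per minute)": 6.50,
          "radiologist reading (per minute)": 12.00,
          "reporting (per report)": 9.00,
      }
      consumption = {
          "scheduling (per call)": 1,
          "image acquisition (per minute)": 20,
          "radiologist reading (per minute)": 10,
          "reporting (per report)": 1,
      }

      activity_costs = {a: driver_rates[a] * consumption[a] for a in driver_rates}
      for activity, cost in activity_costs.items():
          print(f"{activity:35s} ${cost:7.2f}")
      print(f"{'total cost per study':35s} ${sum(activity_costs.values()):7.2f}")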

  15. Episodic Memory Retrieval Functionally Relies on Very Rapid Reactivation of Sensory Information.

    PubMed

    Waldhauser, Gerd T; Braun, Verena; Hanslmayr, Simon

    2016-01-06

    Episodic memory retrieval is assumed to rely on the rapid reactivation of sensory information that was present during encoding, a process termed "ecphory." We investigated the functional relevance of this scarcely understood process in two experiments in human participants. We presented stimuli to the left or right of fixation at encoding, followed by an episodic memory test with centrally presented retrieval cues. This allowed us to track the reactivation of lateralized sensory memory traces during retrieval. Successful episodic retrieval led to a very early (∼100-200 ms) reactivation of lateralized alpha/beta (10-25 Hz) electroencephalographic (EEG) power decreases in the visual cortex contralateral to the visual field at encoding. Applying rhythmic transcranial magnetic stimulation to interfere with early retrieval processing in the visual cortex led to decreased episodic memory performance specifically for items encoded in the visual field contralateral to the site of stimulation. These results demonstrate, for the first time, that episodic memory functionally relies on very rapid reactivation of sensory information. Remembering personal experiences requires a "mental time travel" to revisit sensory information perceived in the past. This process is typically described as a controlled, relatively slow process. However, by using electroencephalography to measure neural activity with a high time resolution, we show that such episodic retrieval entails a very rapid reactivation of sensory brain areas. Using transcranial magnetic stimulation to alter brain function during retrieval revealed that this early sensory reactivation is causally relevant for conscious remembering. These results give first neural evidence for a functional, preconscious component of episodic remembering. This provides new insight into the nature of human memory and may help in the understanding of psychiatric conditions that involve the automatic intrusion of unwanted memories. Copyright © 2016 the authors 0270-6474/16/360251-10$15.00/0.

  16. Contributions of familiarity and recollection rejection to recognition: Evidence from the time course of false recognition for semantic and conjunction lures

    PubMed Central

    Matzen, Laura E.; Taylor, Eric G.; Benjamin, Aaron S.

    2010-01-01

    It has been suggested that both familiarity and recollection contribute to the recognition decision process. In this paper, we leverage the form of false alarm rate functions—in which false-alarm rates describe an inverted U-shaped function as the time between study and test increases—to assess how these processes support retention of semantic and surface form information from previously studied words. We directly compare the maxima of these functions for lures that are semantically related and lures that are related by surface form to previously studied material. This analysis reveals a more rapid loss of access to surface form than to semantic information. To separate the contributions of item familiarity and reminding-induced recollection rejection to this effect, we use a simple multinomial process model; this analysis reveals that this loss of access reflects both a more rapid loss of familiarity and lower rates of recollection for surface form information. PMID:21240745

  17. Emergence of Space-Time Localization and Cosmic Decoherence:. More on Irreversible Time, Dark Energy, Anti-Matter and Black-Holes

    NASA Astrophysics Data System (ADS)

    Magnon, Anne

    2005-04-01

    A non geometric cosmology is presented, based on logic of observability, where logical categories of our perception set frontiers to comprehensibility. The Big-Bang singularity finds here a substitute (comparable to a "quantum jump"): a logical process (tied to self-referent and divisible totality) by which information emerges, focalizes on events and recycles, providing a transition from incoherence to causal coherence. This jump manufactures causal order and space-time localization, as exact solutions to Einstein's equation, where the last step of the process disentangles complex Riemann spheres into real null-cones (a geometric overturning imposed by self-reference, reminding us of our ability to project the cosmos within our mental sphere). Concepts such as antimatter and dark energy (dual entities tied to bifurcations or broken symmetries, and their compensation), are presented as hidden in the virtual potentialities, while irreversible time appears with the recycling of information and related flow. Logical bifurcations (such as the "part-totality" category, a quantum of information which owes its recycling to non localizable logical separations, as anticipated by unstability or horizon dependence of the quantum vacuum) induce broken symmetries, at the (complex or real) geometric level [eg. the antiselfdual complex non linear graviton solutions, which break duality symmetry, provide a model for (hidden) anti-matter, itself compensated with dark-energy, and providing, with space-time localization, the radiative gravitational energy (Bondi flux and related bifurcations of the peeling off type), as well as mass of isolated bodies]. These bifurcations are compensated by inertial effects (non geometric precursors of the Coriolis forces) able to explain (on logical grounds) the cosmic expansion (a repulsion?) and critical equilibrium of the cosmic tissue. Space-time environment, itself, emerges through the jump, as a censor to totality, a screen to incoherence (as anticipated by black-hole event horizons, cosmic censors able to shelter causal geometry). In analogy with black-hole singularities, the Big-Bang can be viewed as a geometric hint that a transition from incoherence to (causal space-time) localization and related coherence (comprehensibility), is taking place (space-time demolition, a reverse process towards incoherence or information recycling, is expected in the vicinity of singularities, as hinted by black-holes and related "time-machines"). A theory of the emergence of perception (and life?), in connection with observability and the function of partition (able to screen totality), is on its way [interface incoherence-coherence, sleeping and awaking states of localization, horizons of perception etc, are anticipated by black-hole event horizons, beyond which a non causal, dimensionless incoherent regime or memorization process, presents itself with the loss of localization, suggesting a unifying regime (ultimate energies?) hidden in cosmic potentialities]. The decoherence process presented here, suggests an ultimate interaction, expression of the logical relation of subsystems to totality, and to be identified to the flow of information or its recycling through cosmic jump (this is anticipated by the dissipation of distance or hierarchies on null-cones, themselves recycled with information and events). The geometric projection of this unified irreversible dynamics is expressed by unified Yang-Mills field equations (coupled to Einsteinian gravity). 
An ultimate form of action ("set"-volumes of information) presents itself, whose extrema can be achieved through extremal transfer of information and related partition of cells of information (thus anticipating the mitosis of living cells, possibly triggered at the non localizable level, as imposed by the logical regime of cosmic decoherence: participating subsystems ?). The matching of the objective and subjective facets of (information and) decoherences is perceived as contact with a reality.

  18. A Cartesian reflex assessment of face processing.

    PubMed

    Polewan, Robert J; Vigorito, Christopher M; Nason, Christopher D; Block, Richard A; Moore, John W

    2006-03-01

    Commands to blink were embedded within pictures of faces and simple geometric shapes or forms. The faces and shapes were conditioned stimuli (CSs), and the required responses were conditioned responses, or more properly, Cartesian reflexes (CRs). As in classical conditioning protocols, response times (RTs) were measured from CS onset. RTs provided a measure of the processing cost (PC) of attending to a CS. A PC is the extra time required to respond relative to RTs to unconditioned stimulus (US) commands presented alone. PCs reflect the interplay between attentional processing of the informational content of a CS and its signaling function with respect to the US command. This interplay resulted in longer RTs to embedded commands. Differences between PCs of faces and geometric shapes represent a starting place for a new mental chronometry based on the traditional idea that differences in RT reflect differences in information processing.

  19. All-IP-Ethernet architecture for real-time sensor-fusion processing

    NASA Astrophysics Data System (ADS)

    Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya

    2016-03-01

    Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing the information processing system for a Serendipter. The information processing system for Serendipter is a kind of sensor-fusion system, but with much greater difficulties. To fulfill its requirements, we adopt an all-IP architecture: the all-IP-Ethernet based data processing system consists of (1) sensors/detectors that directly output data as an IP-Ethernet packet stream, (2) aggregation into single Ethernet/TCP/IP streams by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and to a Xeon-based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves the existing problems in constructing large-scale sensor-fusion systems.
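
    As a toy illustration of the packet-stream idea only (the real system uses a 100 Gbps FPGA data path; the port number, record layout, and batch size below are assumptions, not the system's specification), a collector can receive fixed-size sensor datagrams and hand them to a processing stage in batches:

      import socket
      import struct

      # Each sensor datagram is assumed to carry a (sensor id, sample value) record.
      PORT = 9000
      RECORD = struct.Struct("!Id")        # 4-byte unsigned id + 8-byte float sample

      def collect(batch_size=1000):
          """Receive fixed-size UDP records and return one batch for downstream processing."""
          sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
          sock.bind(("0.0.0.0", PORT))
          batch = []
          while len(batch) < batch_size:
              data, _ = sock.recvfrom(RECORD.size)
              batch.append(RECORD.unpack(data))
          return batch

      if __name__ == "__main__":
          print(f"waiting for sensor records on UDP port {PORT} ...")
          print("received", len(collect()), "records")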

  20. A general dead-time correction method based on live-time stamping. Application to the measurement of short-lived radionuclides.

    PubMed

    Chauvenet, B; Bobin, C; Bouchard, J

    2017-12-01

    Dead-time correction formulae are established in the general case of superimposed non-homogeneous Poisson processes. Based on the same principles as conventional live-timed counting, this method exploits the additional information made available using digital signal processing systems, and especially the possibility to store the time stamps of live-time intervals. No approximation needs to be made to obtain those formulae. Estimates of the variances of corrected rates are also presented. This method is applied to the activity measurement of short-lived radionuclides. Copyright © 2017 Elsevier Ltd. All rights reserved.
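
    The conventional live-timed counting principle that the method builds on can be illustrated numerically (a simplified sketch with a fixed, non-extending dead time, not the paper's general formulae for superimposed non-homogeneous Poisson processes): the corrected rate is the number of recorded counts divided by the accumulated live time.

      import numpy as np

      rng = np.random.default_rng(5)

      # Simulated acquisition: a true event rate and a fixed (non-extending) dead time
      # after every recorded event.
      true_rate, dead_time, duration = 5_000.0, 5e-6, 10.0      # Hz, s, s

      arrivals = np.cumsum(rng.exponential(1.0 / true_rate, int(true_rate * duration * 2)))
      arrivals = arrivals[arrivals < duration]

      recorded, busy_until = 0, -np.inf
      for t in arrivals:
          if t >= busy_until:            # event accepted only while the system is live
              recorded += 1
              busy_until = t + dead_time

      live_time = duration - recorded * dead_time     # accumulated live time
      print(f"raw rate       {recorded / duration:8.1f} /s")
      print(f"corrected rate {recorded / live_time:8.1f} /s   (true rate {true_rate:.0f} /s)")
      print(f"relative counting uncertainty ~ {1 / np.sqrt(recorded):.4f}")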

  1. Coherent-state information concentration and purification in atomic memory

    NASA Astrophysics Data System (ADS)

    Herec, Jiří; Filip, Radim

    2006-12-01

    We propose a feasible method of coherent-state information concentration and purification utilizing quantum memory. The method allows us to optimally concentrate and purify information carried by many noisy copies of an unknown coherent state (randomly distributed in time) to a single copy. Thus nonclassical resources and operations can be saved, if we compare information processing with many noisy copies and a single copy with concentrated and purified information.

  2. Problem Areas in Data Warehousing and Data Mining in a Surgical Clinic

    PubMed Central

    Tusch, Guenter; Mueller, Margarete; Rohwer-Mensching, Katrin; Heiringhoff, Karlheinz; Klempnauer, Juergen

    2001-01-01

    Hospitals and clinics have taken advantage of information systems to streamline many clinical and administrative processes. However, the potential of health care information technology as a source of data for clinical and administrative decision support has not been fully explored. In response to pressure for timely information, many hospitals are developing clinical data warehouses. This paper attempts to identify problem areas in the process of developing a data warehouse to support data mining in surgery. Based on the experience from a data warehouse in surgery several solutions are discussed.

  3. The persistence of a visual dominance effect in a telemanipulator task: A comparison between visual and electrotactile feedback

    NASA Technical Reports Server (NTRS)

    Gaillard, J. P.

    1981-01-01

    The possibility of using electrotactile stimulation in teleoperation, and of observing how such information is interpreted as feedback to the operator, was investigated. It is proposed that visual feedback is more informative than electrotactile feedback, and that complex electrotactile feedback slows down both the motor decision and motor response processes, is processed as an all-or-nothing signal, and bypasses the receptive structure to access a working memory directly, where information is sequentially processed and treatment capacity is limited. The electrotactile stimulation is used as an alerting signal. It is suggested that the visual dominance effect results from the advantage of both a transfer function and a sensory memory register where information is pretreated and memorized for a short time. It is found that dividing attention affects the acquisition of the information but not the subsequent decision processes.

  4. An Estimate of the Total DNA in the Biosphere

    PubMed Central

    Landenmark, Hanna K. E.; Forgan, Duncan H.; Cockell, Charles S.

    2015-01-01

    Modern whole-organism genome analysis, in combination with biomass estimates, allows us to estimate a lower bound on the total information content in the biosphere: 5.3 × 10^31 (±3.6 × 10^31) megabases (Mb) of DNA. Given conservative estimates regarding DNA transcription rates, this information content suggests biosphere processing speeds exceeding yottaNOPS values (10^24 Nucleotide Operations Per Second). Although prokaryotes evolved at least 3 billion years before plants and animals, we find that the information content of prokaryotes is similar to plants and animals at the present day. This information-based approach offers a new way to quantify anthropogenic and natural processes in the biosphere and its information diversity over time. PMID:26066900

  5. An Estimate of the Total DNA in the Biosphere.

    PubMed

    Landenmark, Hanna K E; Forgan, Duncan H; Cockell, Charles S

    2015-06-01

    Modern whole-organism genome analysis, in combination with biomass estimates, allows us to estimate a lower bound on the total information content in the biosphere: 5.3 × 10^31 (±3.6 × 10^31) megabases (Mb) of DNA. Given conservative estimates regarding DNA transcription rates, this information content suggests biosphere processing speeds exceeding yottaNOPS values (10^24 Nucleotide Operations Per Second). Although prokaryotes evolved at least 3 billion years before plants and animals, we find that the information content of prokaryotes is similar to plants and animals at the present day. This information-based approach offers a new way to quantify anthropogenic and natural processes in the biosphere and its information diversity over time.
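
    The order of magnitude of the claim can be reproduced with simple unit arithmetic. The sketch below uses the abstract's DNA total together with an assumed active fraction and per-base transcription rate; those two rates are illustrative assumptions, not values taken from the paper.

      # Order-of-magnitude check of the "yottaNOPS" claim (assumed rates, illustration only).
      total_mb = 5.3e31              # total DNA in the biosphere, megabases (from the abstract)
      total_bases = total_mb * 1e6   # roughly 5.3e37 nucleotide bases

      active_fraction = 1e-6         # assumed fraction of bases being transcribed at any instant
      elongation_rate = 20.0         # assumed nucleotides per second per active polymerase

      nops = total_bases * active_fraction * elongation_rate   # nucleotide operations per second
      print(f"total bases ~ {total_bases:.2e}")
      print(f"estimated processing ~ {nops:.2e} NOPS ({nops / 1e24:.2e} yottaNOPS)")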

  6. Preparing for a decision support system.

    PubMed

    Callan, K

    2000-08-01

    The increasing pressure to reduce costs and improve outcomes is driving the health care industry to view information as a competitive advantage. Timely information is required to help reduce inefficiencies and improve patient care. Numerous disparate operational or transactional information systems with inconsistent and often conflicting data are no longer adequate to meet the information needs of integrated care delivery systems and networks in competitive managed care environments. This article reviews decision support system characteristics and describes a process to assess the preparedness of an organization to implement and use decision support systems to achieve a more effective, information-based decision process. Decision support tools included in this article range from reports to data mining.

  7. Information Diffusion in Facebook-Like Social Networks Under Information Overload

    NASA Astrophysics Data System (ADS)

    Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui

    2013-07-01

    Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to the information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper takes Facebook-like social networks into account, and models the process of information diffusion under information overload. The term view scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify the results, we conduct simulations; the simulation results are perfectly consistent with the theoretical analysis.

  8. An innovative nanophotonic information processing concept implementing cogent micro/nanosensors for space robotics

    NASA Astrophysics Data System (ADS)

    Santoli, Salvatore

    2013-02-01

    Cogent sensors, defined as sensors that are capable of performing the transformation of raw data into information, are shown to be of the essence for realization of the long sought-after autonomous robots for space applications. A strongly miniaturized integration of sensing and information processing systems is needed for cogent sensors designed for autonomous sensing—information processing (IP)—actuating behavior. It is shown that the recently developed field of quantum holography (QH), stemming from geometric quantization of any holographic processes through the Heisenberg Group (G) and deeply different, as stressed in detail, from other meanings of "quantum holography" in the literature, supplies the nanophotonic tools for designing and assembling an associative memory (AM) as the brain implementing such strong cogency. An AM is designed through a free-space interconnected large planar multilayer architecture of quantum well-based two-port neurons implementing a shift register on the manifold of G, and whose input consists of photonic holograms from high frequency pulsed microlasers in the infrared band of em or em-transduced outside signals. The optoelectronics as relative, integrated into a hybrid chip involving photonic detectors, microlasers and electronic components for the clock control system, would allow cycle times as short as 30 ns with the large spatial bandwidth available in photonics. IP through QH concerns the encoding and decoding of holographic interference patterns, not of mere binary digital logical (syntactic) information. Accordingly, QH defines on the G's manifold an IP paradigm where information as experimental knowledge is processed; i.e., IP concerns both syntax and semantics. It is shown that such QH-neural brain would cogently deal with spurious signals as random noise that would be caused to die out on the way to the intended target through parallel massive and real-time IP.

  9. Enhancing The Army Operations Process Through The Incorportation of Holography

    DTIC Science & Technology

    2017-06-09

    the process and gives the user the sense of a noninvasive enhancement to quickly make decisions. Processes and information no longer create... mentally overlaying it onto the process. Data now augments reality and is a noninvasive process to decision making... environment, augmented on top of reality, decreases the amount of time needed to make decisions

  10. Temporal coding of reward-guided choice in the posterior parietal cortex

    PubMed Central

    Hawellek, David J.; Wong, Yan T.; Pesaran, Bijan

    2016-01-01

    Making a decision involves computations across distributed cortical and subcortical networks. How such distributed processing is performed remains unclear. We test how the encoding of choice in a key decision-making node, the posterior parietal cortex (PPC), depends on the temporal structure of the surrounding population activity. We recorded spiking and local field potential (LFP) activity in the PPC while two rhesus macaques performed a decision-making task. We quantified the mutual information that neurons carried about an upcoming choice and its dependence on LFP activity. The spiking of PPC neurons was correlated with LFP phases at three distinct time scales in the theta, beta, and gamma frequency bands. Importantly, activity at these time scales encoded upcoming decisions differently. Choice information contained in neural firing varied with the phase of beta and gamma activity. For gamma activity, maximum choice information occurred at the same phase as the maximum spike count. However, for beta activity, choice information and spike count were greatest at different phases. In contrast, theta activity did not modulate the encoding properties of PPC units directly but was correlated with beta and gamma activity through cross-frequency coupling. We propose that the relative timing of local spiking and choice information reveals temporal reference frames for computations in either local or large-scale decision networks. Differences between the timing of task information and activity patterns may be a general signature of distributed processing across large-scale networks. PMID:27821752
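
    One way to express "choice information varies with LFP phase" is to bin trials by the instantaneous LFP phase and estimate the mutual information between spike count and choice within each phase bin. The sketch below does this with a plug-in estimator on synthetic data; the data, coupling model and bin counts are invented for illustration and are not the study's analysis pipeline.

      import numpy as np

      rng = np.random.default_rng(0)

      def mutual_information(counts, choice):
          """Plug-in estimate of I(count; choice) in bits for small discrete arrays."""
          cbins = np.arange(counts.max() + 2) - 0.5
          ybins = np.arange(choice.max() + 2) - 0.5
          pxy, _, _ = np.histogram2d(counts, choice, bins=[cbins, ybins])
          pxy /= pxy.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float((pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])).sum())

      n_trials = 2000
      choice = rng.integers(0, 2, n_trials)             # two alternative choices
      phase = rng.uniform(-np.pi, np.pi, n_trials)      # LFP phase of the spike-count window

      # Synthetic spike counts whose dependence on choice is strongest near phase 0.
      gain = 0.5 * (1 + np.cos(phase))                  # phase-dependent coupling strength
      counts = rng.poisson(2.0 + gain * choice)

      # Mutual information between spike count and choice within each phase bin.
      edges = np.linspace(-np.pi, np.pi, 9)
      for lo, hi in zip(edges[:-1], edges[1:]):
          sel = (phase >= lo) & (phase < hi)
          mi = mutual_information(counts[sel], choice[sel])
          print(f"phase [{lo:+.2f}, {hi:+.2f}) rad: I(count; choice) = {mi:.3f} bits")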

  11. Systems Factorial Technology provides new insights on global-local information processing in autism spectrum disorders.

    PubMed

    Johnson, Shannon A; Blaha, Leslie M; Houpt, Joseph W; Townsend, James T

    2010-02-01

    Previous studies of global-local processing in autism spectrum disorders (ASDs) have indicated mixed findings, with some evidence of a local processing bias, or preference for detail-level information, and other results suggesting typical global advantage, or preference for the whole or gestalt. Findings resulting from this paradigm have been used to argue for or against a detail focused processing bias in ASDs, and thus have important theoretical implications. We applied Systems Factorial Technology, and the associated Double Factorial Paradigm (both defined in the text), to examine information processing characteristics during a divided attention global-local task in high-functioning individuals with an ASD and typically developing controls. Group data revealed global advantage for both groups, contrary to some current theories of ASDs. Information processing models applied to each participant revealed that task performance, although showing no differences at the group level, was supported by different cognitive mechanisms in ASD participants compared to controls. All control participants demonstrated inhibitory parallel processing and the majority demonstrated a minimum-time stopping rule. In contrast, ASD participants showed exhaustive parallel processing with mild facilitatory interactions between global and local information. Thus our results indicate fundamental differences in the stopping rules and channel dependencies in individuals with an ASD.

  12. Estimating costs and performance of systems for machine processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Ballard, R. J.; Eastwood, L. F., Jr.

    1977-01-01

    This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.
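
    The estimation idea reduces to simple arithmetic: total computation is roughly pixels times operations per pixel, divided by the machine's throughput, inflated by an overhead factor for I/O and scheduling; cost follows from the machine's hourly rate. All figures below are assumptions chosen for illustration, not values from the paper.

      # Back-of-the-envelope processing time and cost estimate (illustrative numbers only).
      pixels = 5 * 2.3e8          # assumed pixel count for a five-state region
      ops_per_pixel = 400         # assumed classification operations per pixel
      throughput = 2.0e6          # assumed machine throughput, operations per second
      overhead_factor = 1.5       # assumed I/O, tape handling and scheduling overhead
      rate_per_hour = 300.0       # assumed machine cost, dollars per CPU hour

      cpu_seconds = pixels * ops_per_pixel / throughput * overhead_factor
      cpu_hours = cpu_seconds / 3600.0
      cost = cpu_hours * rate_per_hour

      print(f"CPU time ~ {cpu_hours:,.0f} hours, cost ~ ${cost:,.0f}")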

  13. 75 FR 42296 - Safe, Efficient Use and Preservation of the Navigable Airspace

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... facilitates the aeronautical study process and has reduced the overall processing time for these cases. The... cases to be processed, particularly if additional information, via public comment period, was necessary... the permit application is not necessary. There are cases where circulating the proposal for public...

  14. 10 CFR 1703.108 - Processing of FOIA requests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Processing of FOIA requests. 1703.108 Section 1703.108 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.108 Processing of... section to provide access to requested records shall be taken within twenty working days. This time period...

  15. 10 CFR 1703.108 - Processing of FOIA requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing of FOIA requests. 1703.108 Section 1703.108 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.108 Processing of... section to provide access to requested records shall be taken within twenty working days. This time period...

  16. 10 CFR 1703.108 - Processing of FOIA requests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Processing of FOIA requests. 1703.108 Section 1703.108 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.108 Processing of... section to provide access to requested records shall be taken within twenty working days. This time period...

  17. 10 CFR 1703.108 - Processing of FOIA requests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Processing of FOIA requests. 1703.108 Section 1703.108 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.108 Processing of... section to provide access to requested records shall be taken within twenty working days. This time period...

  18. 10 CFR 1703.108 - Processing of FOIA requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Processing of FOIA requests. 1703.108 Section 1703.108 Energy DEFENSE NUCLEAR FACILITIES SAFETY BOARD PUBLIC INFORMATION AND REQUESTS § 1703.108 Processing of... section to provide access to requested records shall be taken within twenty working days. This time period...

  19. Priority-setting and hospital strategic planning: a qualitative case study.

    PubMed

    Martin, Douglas; Shulman, Ken; Santiago-Sorrell, Patricia; Singer, Peter

    2003-10-01

    To describe and evaluate the priority-setting element of a hospital's strategic planning process. Qualitative case study and evaluation against the conditions of 'accountability for reasonableness' of a strategic planning process at a large urban university-affiliated hospital. The hospital's strategic planning process met the conditions of 'accountability for reasonableness' in large part. Specifically: the hospital based its decisions on reasons (both information and criteria) that the participants felt were relevant to the hospital; the number and type of participants were very extensive; the process, decisions and reasons were well communicated throughout the organization, using multiple communication vehicles; and the process included an ethical framework linked to an effort to evaluate and improve the process. However, there were opportunities to improve the process, particularly by giving participants more time to absorb the information relevant to priority-setting decisions, more time to take difficult decisions and some means to appeal or revise decisions. A case study linked to an evaluation using 'accountability for reasonableness' can serve to improve priority-setting in the context of hospital strategic planning.

  20. A vector scanning processing technique for pulsed laser velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Edwards, Robert V.

    1989-01-01

    Pulsed laser sheet velocimetry yields nonintrusive measurements of two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high precision (1%) velocity estimates, but can require several hours of processing time on specialized array processors. Under some circumstances, a simple, fast, less accurate (approximately 5%) data reduction technique that also gives unambiguous velocity vector information is acceptable. A direct space domain processing technique was examined and found to be far superior to the other known techniques in achieving these objectives. It employs a new data coding and reduction technique in which the particle time history information is used directly. Further, it has no 180 deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 minutes on an 80386 based PC, producing a 2-D velocity vector map of the flow field. Hence, using this new space domain vector scanning (VS) technique, pulsed laser velocimetry data can be reduced quickly and reasonably accurately, without specialized array processing hardware.

  1. Deciding treatment for miscarriage--experiences of women and healthcare professionals.

    PubMed

    Olesen, Mette Linnet; Graungaard, Anette H; Husted, Gitte R

    2015-06-01

    Women experiencing miscarriage are offered a choice of different treatments to terminate their wanted pregnancy at a time when they are often shocked and distressed. Women's and healthcare professionals' experiences of the decision-making process are not well described. We aimed to gain insight into this process and the circumstances that may affect it. A qualitative study using a grounded theory approach. Data were obtained through semi-structured interviews with six women who had chosen and completed either surgical, medical or expectant treatment for miscarriage and five healthcare professionals involved in the decision-making at an emergency gynaecological department in Denmark. An inductive explorative method was chosen due to limited knowledge about the decision-making process, and a theoretical perspective was not applied until the final analysis. Despite information and pretreatment counselling, choice of treatment was often determined by unspoken emotional considerations, including fear of seeing the foetus or fear of anaesthesia. These considerations were not discussed during the decision-making process, which was a time when the women were under time pressure and experienced emotional distress. Healthcare professionals did not explore women's considerations for choosing a particular treatment and prioritised information differently. We found theory about coping and decision-making in stressful situations useful in increasing our understanding of the women's reactions. In relation to theory about informed consent, our findings suggest that women need more understanding of the treatments before making a decision. This study is limited due to a small sample size, but it generates important findings that need to be examined in a larger sample. Frequently, women did not use information provided about treatment pros and cons in their decision-making process. Because of unspoken thoughts, and women's needs being unexplored by healthcare professionals, information did not target women's needs and their reasoning remained unapparent. © 2014 Nordic College of Caring Science.

  2. Parallel constraint satisfaction in memory-based decisions.

    PubMed

    Glöckner, Andreas; Hodges, Sara D

    2011-01-01

    Three studies sought to investigate decision strategies in memory-based decisions and to test the predictions of the parallel constraint satisfaction (PCS) model for decision making (Glöckner & Betsch, 2008). Time pressure was manipulated and the model was compared against simple heuristics (take the best and equal weight) and a weighted additive strategy. From PCS we predicted that fast intuitive decision making is based on compensatory information integration and that decision time increases and confidence decreases with increasing inconsistency in the decision task. In line with these predictions we observed a predominant usage of compensatory strategies under all time-pressure conditions and even with decision times as short as 1.7 s. For a substantial number of participants, choices and decision times were best explained by PCS, but there was also evidence for use of simple heuristics. The time-pressure manipulation did not significantly affect decision strategies. Overall, the results highlight intuitive, automatic processes in decision making and support the idea that human information-processing capabilities are less severely bounded than often assumed.

  3. Impact of nowcasting on the production and processing of agricultural crops. [in the US

    NASA Technical Reports Server (NTRS)

    Dancer, W. S.; Tibbitts, T. W.

    1973-01-01

    The value was studied of improved weather information and weather forecasting to farmers, growers, and agricultural processing industries in the United States. The study was undertaken to identify the production and processing operations that could be improved with accurate and timely information on changing weather patterns. Estimates were then made of the potential savings that could be realized with accurate information about the prevailing weather and short term forecasts for up to 12 hours. This weather information has been termed nowcasting. The growing, marketing, and processing operations of the twenty most valuable crops in the United States were studied to determine those operations that are sensitive to short-term weather forecasting. Agricultural extension specialists, research scientists, growers, and representatives of processing industries were consulted and interviewed. The value of the crops included in this survey and their production levels are given. The total value for crops surveyed exceeds 24 billion dollars and represents more than 92 percent of total U.S. crop value.

  4. Cybernetic Basis and System Practice of Remote Sensing and Spatial Information Science

    NASA Astrophysics Data System (ADS)

    Tan, X.; Jing, X.; Chen, R.; Ming, Z.; He, L.; Sun, Y.; Sun, X.; Yan, L.

    2017-09-01

    Cybernetics provides a new set of ideas and methods for the study of modern science, and it has been applied extensively in many areas. However, few researchers have introduced cybernetics into the field of remote sensing. This paper builds on the imaging process of remote sensing systems, introduces cybernetics into the field of remote sensing, and establishes a space-time closed-loop control theory for the actual operation of remote sensing. The paper makes the flow of spatial information coherent and improves the overall efficiency of spatial information from acquisition and processing to transformation and application. We describe the application of cybernetics not only to remote sensing platform control, sensor control, and data processing control, but also to control of the whole remote sensing imaging process. Feeding the output information back to the input allows the efficient operation of the entire system to be controlled. This combination of cybernetics and remote sensing science will raise remote sensing science to a higher level.

  5. 77 FR 65708 - Agency Information Collection Activities: Petition To Remove the Conditions on Residence, Form...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-30

    ... technological collection techniques or other forms of information technology, e.g., permitting electronic... form I-751 and an estimated time burden per response of 1.17 hours for the biometric processing. (6) An...

  6. Results of research on development of an intellectual information system of bankruptcy risk assessment of the enterprise

    NASA Astrophysics Data System (ADS)

    Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.

    2015-10-01

    The article presents research results on knowledge base development for an intellectual information system for enterprise bankruptcy risk assessment. The process analysis of the knowledge base development is described; the main process stages, some problems and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprise financial accounting. The basis for this connectionist model is a three-layer perceptron trained with the error back-propagation algorithm. The knowledge base for the intellectual information system consists of processed information and the processing method represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. The paper gives mean values of 10 indicators for industrial enterprises; with their help it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are given on neural network testing using data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
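
    A connectionist model of this kind can be sketched as a three-layer perceptron with 10 financial indicators as inputs, one hidden layer, and a single bankruptcy-risk output, trained by error back-propagation. The numpy sketch below uses synthetic data and hypothetical layer sizes and learning rate; it shows the structure of such a model, not the authors' trained network.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic training set: 10 financial indicators per enterprise, label 1 = bankrupt.
      X = rng.normal(size=(200, 10))
      y = (X[:, :3].sum(axis=1) + 0.3 * rng.normal(size=200) > 0).astype(float).reshape(-1, 1)

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      # Three-layer perceptron: 10 inputs -> 8 hidden units -> 1 output (sizes assumed).
      W1 = rng.normal(scale=0.5, size=(10, 8)); b1 = np.zeros(8)
      W2 = rng.normal(scale=0.5, size=(8, 1));  b2 = np.zeros(1)
      lr = 0.5

      for epoch in range(2000):
          # Forward pass.
          h = sigmoid(X @ W1 + b1)
          out = sigmoid(h @ W2 + b2)
          # Back-propagation of error (squared-error loss, sigmoid derivatives).
          d_out = (out - y) * out * (1 - out)
          d_h = (d_out @ W2.T) * h * (1 - h)
          W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
          W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

      accuracy = ((out > 0.5) == y).mean()
      print(f"training accuracy: {accuracy:.2f}")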

  7. EOS Data Products Latency and Reprocessing Evaluation

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H. K.; Wanchoo, L.

    2012-12-01

    NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) program has been processing, archiving, and distributing EOS data since the launch of the Terra platform in 1999. The EOSDIS Distributed Active Archive Centers (DAACs) and Science-Investigator-led Processing Systems (SIPSs) are generating over 5000 unique products with a daily average volume of 1.7 Petabytes. Initially, EOSDIS had requirements to process data products within 24 hours of receiving all inputs needed for generating them. Thus, generally, the latency would be slightly over 24 and 48 hours after satellite data acquisition, respectively, for Level 1 and Level 2 products. Due to budgetary constraints these requirements were relaxed, with the requirement being to avoid a growing backlog of unprocessed data. However, the data providers have been generating these products in as timely a manner as possible. The reduction in costs of computing hardware has helped considerably. It is of interest to analyze the actual latencies achieved over the past several years in processing and inserting the data products into the EOSDIS archives for the users to support various scientific studies such as land processes, oceanography, hydrology, atmospheric science, cryospheric science, etc. The instrument science teams have continuously evaluated the data products since the launches of EOS satellites and improved the science algorithms to provide high quality products. Data providers have periodically reprocessed the previously acquired data with these improved algorithms. The reprocessing campaigns run for an extended time period in parallel with forward processing, since all data starting from the beginning of the mission need to be reprocessed. Each reprocessing activity involves more data than the previous reprocessing. The historical record of the reprocessing times would be of interest to future missions, especially those involving large volumes of data and/or computational loads due to complexity of algorithms. Evaluation of latency and reprocessing times requires some of the product metadata information, such as the beginning and ending time of data acquisition, processing date, and version number. This information for each product is made available by data providers to the ESDIS Metrics System (EMS). The EMS replaced the earlier ESDIS Data Gathering and Reporting System (EDGRS) in FY2005. Since then it has collected information about data products' ingest, archive, and distribution. The analysis of latencies and reprocessing times will provide insight into the data provider process and identify potential areas of weakness in providing timely data to the user community. Delays may be caused by events such as system unavailability, disk failures, delay in Level 0 data delivery, availability of input data, network problems, and power failures. Analysis of metrics will highlight areas for focused examination of root causes for delays. The purposes of this study are to: 1) perform a detailed analysis of latency of selected instrument products for the last 6 years; 2) analyze the reprocessed data from various data providers to determine the times taken for reprocessing campaigns; and 3) identify potential reasons for any anomalies in these metrics.

  8. Badge Office Process Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  9. Monitoring landscape level processes using remote sensing of large plots

    Treesearch

    Raymond L. Czaplewski

    1991-01-01

    Global and regional assessments require timely information on landscape level status (e.g., areal extent of different ecosystems) and processes (e.g., changes in land use and land cover). To measure and understand these processes at the regional level, and model their impacts, remote sensing is often necessary. However, processing massive volumes of remotely sensing...

  10. When a Dog Has a Pen for a Tail: The Time Course of Creative Object Processing

    ERIC Educational Resources Information Center

    Wang, Botao; Duan, Haijun; Qi, Senqing; Hu, Weiping; Zhang, Huan

    2017-01-01

    Creative objects differ from ordinary objects in that they are created by human beings to contain novel, creative information. Previous research has demonstrated that ordinary object processing involves both a perceptual process for analyzing different features of the visual input and a higher-order process for evaluating the relevance of this…

  11. Image processing operations achievable with the Microchannel Spatial Light Modulator

    NASA Astrophysics Data System (ADS)

    Warde, C.; Fisher, A. D.; Thackara, J. I.; Weiss, A. M.

    1980-01-01

    The Microchannel Spatial Light Modulator (MSLM) is a versatile, optically-addressed, highly-sensitive device that is well suited for low-light-level, real-time, optical information processing. It consists of a photocathode, a microchannel plate (MCP), a planar acceleration grid, and an electro-optic plate in proximity focus. A framing rate of 20 Hz with full modulation depth, and 100 Hz with 20% modulation depth has been achieved in a vacuum-demountable LiTaO3 device. A halfwave exposure sensitivity of 2.2 mJ/sq cm and an optical information storage time of more than 2 months have been achieved in a similar gridless LiTaO3 device employing a visible photocathode. Image processing operations such as analog and digital thresholding, real-time image hard clipping, contrast reversal, contrast enhancement, image addition and subtraction, and binary-level logic operations such as AND, OR, XOR, and NOR can be achieved with this device. This collection of achievable image processing characteristics makes the MSLM potentially useful for a number of smart sensor applications.
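
    The operations listed above are performed optically by the MSLM, but their digital analogues are easy to state. The numpy sketch below shows thresholding (hard clipping), contrast reversal, image subtraction and a binary XOR on synthetic images, purely to illustrate the operations named in the abstract; it is not a model of the device itself.

      import numpy as np

      rng = np.random.default_rng(2)

      # Two synthetic 8-bit grayscale images.
      a = rng.integers(0, 256, size=(64, 64)).astype(np.int16)
      b = rng.integers(0, 256, size=(64, 64)).astype(np.int16)

      thresholded  = (a > 128).astype(np.uint8)   # thresholding / hard clipping
      contrast_rev = 255 - a                      # contrast reversal
      difference   = np.clip(a - b, 0, 255)       # image subtraction
      binary_xor   = (a > 128) ^ (b > 128)        # binary-level XOR logic

      print(thresholded.mean(), contrast_rev.mean(), difference.mean(), binary_xor.mean())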

  12. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  13. Multifractal analysis of information processing in hippocampal neural ensembles during working memory under Δ9-tetrahydrocannabinol administration

    PubMed Central

    Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.

    2014-01-01

    Background Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297

  14. Homo Heuristicus: Less-is-More Effects in Adaptive Cognition

    PubMed Central

    Brighton, Henry; Gigerenzer, Gerd

    2012-01-01

    Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We discuss some of the major progress made so far, focusing on the discovery of less-is-more effects and the study of the ecological rationality of heuristics which examines in which environments a given strategy succeeds or fails, and why. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. PMID:23613644

  15. High-end clinical domain information systems for effective healthcare delivery.

    PubMed

    Mangalampalli, Ashish; Rama, Chakravarthy; Muthiyalian, Raja; Jain, Ajeet K

    2007-01-01

    The Electronic Health Record (EHR) provides doctors with a quick, reliable, secure, real-time and user-friendly source of all relevant patient data. The latest information system technologies, such as Clinical Data Warehouses (CDW), Clinical Decision-Support (CDS) systems and data-mining techniques (Online Analytical Processing (OLAP) and Online Transactional Processing (OLTP)), are used to maintain and utilise patient data intelligently, based on the users' requirements. Moreover, clinical trial reports for new drug approvals are now being submitted electronically for faster and easier processing. Also, information systems are used in educating patients about the latest developments in medical science through the internet and specially configured kiosks in hospitals and clinics.

  16. Exploitation and Benefits of BIM in Construction Project Management

    NASA Astrophysics Data System (ADS)

    Mesároš, Peter; Mandičák, Tomáš

    2017-10-01

    BIM is gaining increasing awareness in the construction industry. BIM is the process of creating and managing data about a building during its life cycle, and it has become part of the management toolset of modern construction companies. Construction projects have a large number of participants, which makes construction project management difficult and creates a serious requirement for processing huge amounts of information, including design, construction, time and cost parameters, economic efficiency and sustainability. Progressive information and communication technologies support cost management and the management of construction projects; one of them is Building Information Modelling. The aim of the paper is to examine the impact of BIM exploitation and its benefits on construction project management in Slovak companies.

  17. ITOHealth: a multimodal middleware-oriented integrated architecture for discovering medical entities.

    PubMed

    Alor-Hernández, Giner; Sánchez-Cervantes, José Luis; Juárez-Martínez, Ulises; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Aguilar-Laserre, Alberto

    2012-03-01

    Emergency healthcare is one of the emerging application domains for information services, and one that requires highly multimodal information services. The time consumed by the pre-hospital emergency process is critical. Therefore, minimizing the time required to provide primary care and consultation to patients is one of the crucial factors in improving healthcare delivery in emergency situations. In this sense, dynamic location of medical entities is a complex process that takes time, and it can be critical when a person requires medical attention. This work presents ITOHealth, a multimodal location-based system for locating and assigning medical entities. ITOHealth provides a multimodal middleware-oriented integrated architecture built on a service-oriented architecture in order to deliver information about medical entities to mobile devices and web browsers through enriched interfaces with multimodality support. ITOHealth's multimodality is based on the use of Microsoft Agent Characters, the integration of natural-language voice into the characters, and multi-language and multi-character support, providing an advantage for users with visual impairments.

  18. A Theory of Information Quality and a Framework for its Implementation in the Requirements Engineering Process

    NASA Astrophysics Data System (ADS)

    Grenn, Michael W.

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy HR is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as HR and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. The HR may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on HR, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in HR and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of ΔE.
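
    The core quantity can be illustrated with a Shannon-style entropy over the distribution of R requirements among N quality levels: as requirements accumulate quality attributes and migrate toward the highest level, the entropy HR falls and the information gained approaches the amount needed to reach the desired end state. The sketch below is a simplified illustration of that idea with invented snapshot counts; it is not the dissertation's exact formulation, which counts arrangements using quantum statistics.

      import math

      def requirements_entropy(counts):
          """Shannon entropy (bits) of the distribution of requirements over quality levels."""
          total = sum(counts)
          return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

      # Distribution of R = 100 requirements over N = 5 quality levels at three points in time.
      snapshots = {
          "early":  [40, 30, 15, 10, 5],   # most requirements still at low quality levels
          "midway": [10, 20, 30, 25, 15],
          "late":   [2, 3, 10, 25, 60],    # approaching the desired (highest-quality) state
      }

      h_desired = 0.0                       # all requirements in the top level: zero entropy
      for name, counts in snapshots.items():
          h = requirements_entropy(counts)
          print(f"{name:>6}: HR = {h:.3f} bits, remaining information to desired state = {h - h_desired:.3f} bits")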

  19. What is the impact of an electronic test result acknowledgement system on Emergency Department physicians' work processes? A mixed-method pre-post observational study.

    PubMed

    Georgiou, Andrew; McCaughey, Euan J; Tariq, Amina; Walter, Scott R; Li, Julie; Callen, Joanne; Paoloni, Richard; Runciman, William B; Westbrook, Johanna I

    2017-03-01

    To examine the impact of an electronic Results Acknowledgement (eRA) system on emergency physicians' test result management work processes and the time taken to acknowledge microbiology and radiology test results for patients discharged from an Emergency Department (ED). The impact of the eRA system was assessed in an Australian ED using: a) semi-structured interviews with senior emergency physicians; and b) a time and motion direct observational study of senior emergency physicians completing test acknowledgment pre and post the implementation of the eRA system. The eRA system led to changes in the way results and actions were collated, stored, documented and communicated. Although there was a non-significant increase in the average time taken to acknowledge results in the post period, most types of acknowledgements (other than simple acknowledgements) took less time to complete. The number of acknowledgements where physicians sought additional information from the Electronic Medical Record (EMR) rose from 12% pre to 20% post implementation of eRA. Given that the type of results are unlikely to have changed significantly across the pre and post implementation periods, the increase in the time physicians spent accessing additional clinical information in the post period likely reflects the greater access to clinical information provided by the integrated electronic system. Easier access to clinical information may improve clinical decision making and enhance the quality of patient care. For instance, in situations where a senior clinician, not initially involved in the care process, is required to deal with the follow-up of non-normal results. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Extraction of Data from a Hospital Information System to Perform Process Mining.

    PubMed

    Neira, Ricardo Alfredo Quintano; de Vries, Gert-Jan; Caffarel, Jennifer; Stretton, Erin

    2017-01-01

    The aim of this work is to share our experience in relevant data extraction from a hospital information system in preparation for a research study using process mining techniques. The steps performed were: research definition, mapping the normative processes, identification of table and field names in the database, and extraction of data. We then offer lessons learned during the data extraction phase. Any errors made in the extraction phase will propagate and have implications for subsequent analyses. Thus, it is essential to take the time needed and devote sufficient attention to detail to perform all activities with the goal of ensuring high quality of the extracted data. We hope this work will be informative for other researchers to plan and execute extraction of data for process mining research studies.

  1. QPA-CLIPS: A language and representation for process control

    NASA Technical Reports Server (NTRS)

    Freund, Thomas G.

    1994-01-01

    QPA-CLIPS is an extension of CLIPS oriented towards process control applications. Its constructs define a dependency network of process actions driven by sensor information. The language consists of three basic constructs: TASK, SENSOR, and FILTER. TASKs define the dependency network describing alternative state transitions for a process. SENSORs and FILTERs define sensor information sources used to activate state transitions within the network. Deftemplates define these constructs, and their run-time environment is an interpreter knowledge base that performs pattern matching on sensor information and so activates TASKs in the dependency network. The pattern-matching technique is based on the repeatable occurrence of a sensor data pattern. QPA-CLIPS has been successfully tested on a SPARCStation providing supervisory control to an Allen-Bradley PLC 5 controller driving molding equipment.
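
    The construct set can be mimicked outside CLIPS: sensors supply readings, filters look for a repeatable occurrence of a data pattern, and tasks in a dependency network are activated when their sensor conditions hold. The Python sketch below is a loose analogue of the TASK/SENSOR/FILTER idea, with invented names, readings and thresholds; it is not QPA-CLIPS syntax.

      # Loose Python analogue of a QPA-CLIPS-style dependency network (names are hypothetical).
      from collections import deque

      class Sensor:
          """Holds recent readings from one sensor information source."""
          def __init__(self, history=5):
              self.readings = deque(maxlen=history)
          def push(self, value):
              self.readings.append(value)

      class Filter:
          """Fires when a data pattern repeats: the last `repeats` readings satisfy `predicate`."""
          def __init__(self, sensor, predicate, repeats=3):
              self.sensor, self.predicate, self.repeats = sensor, predicate, repeats
          def matches(self):
              recent = list(self.sensor.readings)[-self.repeats:]
              return len(recent) == self.repeats and all(self.predicate(v) for v in recent)

      class Task:
          """A state transition that activates when its filter matches; next_tasks define dependencies."""
          def __init__(self, name, trigger, next_tasks=()):
              self.name, self.trigger, self.next_tasks = name, trigger, list(next_tasks)

      # Example: a mold temperature sensor drives a two-step dependency network.
      temp = Sensor()
      cool_down = Task("cool_down", Filter(temp, lambda v: v > 180.0))
      resume = Task("resume_molding", Filter(temp, lambda v: v < 150.0))
      cool_down.next_tasks.append(resume)

      active = [cool_down]
      for reading in [170, 183, 185, 190, 160, 148, 147, 145]:
          temp.push(reading)
          for task in list(active):
              if task.trigger.matches():
                  print(f"activating {task.name} at reading {reading}")
                  active.remove(task)
                  active.extend(task.next_tasks)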

  2. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach.

    PubMed

    Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-07-20

    Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.

  3. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. Conclusions We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites. PMID:26194787

  4. Algorithmic and heuristic processing of information by the nervous system.

    PubMed

    Restian, A

    1980-01-01

    Starting from the fact that the nervous system must discover the information it needs, the author describes the way it decodes the received message. The logical circuits of the nervous system, submitting the received signals to a process by means of which information brought is discovered step by step, participates in decoding the message. The received signals, as information, can be algorithmically or heuristically processed. Algorithmic processing is done according to precise rules, which must be fulfilled step by step. By algorithmic processing, one develops somatic and vegetative reflexes as blood pressure, heart frequency or water metabolism control. When it does not dispose of precise rules of information processing or when algorithmic processing needs a very long time, the nervous system must use heuristic processing. This is the feature that differentiates the human brain from the electronic computer that can work only according to some extremely precise rules. The human brain can work according to less precise rules because it can resort to trial and error operations, and because it works according to a form of logic. Working with superior order signals which represent the class of all inferior type signals from which they begin, the human brain need not perform all the operations that it would have to perform by superior type of signals. Therefore the brain tries to submit the received signals to intensive as possible superization. All informational processing, and especially heuristical processing, is accompanied by a certain affective color and the brain cannot operate without it. Emotions, passions and sentiments usually complete the lack of precision of the heuristical programmes. Finally, the author shows that informational and especially heuristical processes study can contribute to a better understanding of the transition from neurological to psychological activity.

  5. Video image processing to create a speed sensor

    DOT National Transportation Integrated Search

    1999-11-01

    Image processing has been applied to traffic analysis in recent years, with different goals. In the report, a new approach is presented for extracting vehicular speed information, given a sequence of real-time traffic images. We extract moving edges ...

  6. Multiple neural states of representation in short-term memory? It's a matter of attention.

    PubMed

    Larocque, Joshua J; Lewis-Peacock, Jarrod A; Postle, Bradley R

    2014-01-01

    Short-term memory (STM) refers to the capacity-limited retention of information over a brief period of time, and working memory (WM) refers to the manipulation and use of that information to guide behavior. In recent years it has become apparent that STM and WM interact and overlap with other cognitive processes, including attention (the selection of a subset of information for further processing) and long-term memory (LTM-the encoding and retention of an effectively unlimited amount of information for a much longer period of time). Broadly speaking, there have been two classes of memory models: systems models, which posit distinct stores for STM and LTM (Atkinson and Shiffrin, 1968; Baddeley and Hitch, 1974); and state-based models, which posit a common store with different activation states corresponding to STM and LTM (Cowan, 1995; McElree, 1996; Oberauer, 2002). In this paper, we will focus on state-based accounts of STM. First, we will consider several theoretical models that postulate, based on considerable behavioral evidence, that information in STM can exist in multiple representational states. We will then consider how neural data from recent studies of STM can inform and constrain these theoretical models. In the process we will highlight the inferential advantage of multivariate, information-based analyses of neuroimaging data (fMRI and electroencephalography (EEG)) over conventional activation-based analysis approaches (Postle, in press). We will conclude by addressing lingering questions regarding the fractionation of STM, highlighting differences between the attention to information vs. the retention of information during brief memory delays.

  7. Iterative Neighbour-Information Gathering for Ranking Nodes in Complex Networks

    NASA Astrophysics Data System (ADS)

    Xu, Shuang; Wang, Pei; Lü, Jinhu

    2017-01-01

    Designing node influence ranking algorithms can provide insights into network dynamics, functions and structures. Increasing evidence reveals that a node’s spreading ability largely depends on its neighbours. We introduce an iterative neighbour-information gathering (Ing) process with three parameters: a transformation matrix, a priori information, and an iteration time. The Ing process iteratively combines a priori information from neighbours via the transformation matrix and iteratively assigns an Ing score to each node to evaluate its influence. The algorithm is applicable to any type of network and includes some traditional centralities as special cases, such as degree, semi-local, and LeaderRank. The Ing process converges in strongly connected networks at a speed that depends on the two largest eigenvalues of the transformation matrix. Interestingly, the eigenvector centrality corresponds to a limit case of the algorithm. Comparisons with eight well-known centralities, based on simulations of the susceptible-infected-removed (SIR) model on real-world networks, reveal that the Ing can offer more accurate rankings, even without a priori information. We also observe that an optimal iteration time always exists that best characterizes node influence. The proposed algorithms bridge the gaps among some existing measures, and may have potential applications in infectious disease control and the design of optimal information spreading strategies.
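
    The iteration can be sketched as repeated neighbour aggregation starting from the a priori information; the sketch below uses the adjacency matrix as the transformation matrix and is a simplified reading of the abstract, not the authors' exact formulation.

      import numpy as np

      def ing_scores(adjacency, prior, iterations):
          """Iteratively gather neighbour information into a score per node."""
          scores = np.asarray(prior, dtype=float)
          for _ in range(iterations):
              scores = adjacency @ scores           # combine neighbours' scores
              scores /= np.linalg.norm(scores)      # keep the values bounded
          return scores

      # With a uniform prior, one iteration reproduces a (normalised) degree
      # ranking, while many iterations approach the leading eigenvector of the
      # adjacency matrix, consistent with eigenvector centrality as a limit case.
      A = np.array([[0, 1, 1],
                    [1, 0, 0],
                    [1, 0, 0]], dtype=float)
      print(ing_scores(A, prior=np.ones(3), iterations=1))
      print(ing_scores(A, prior=np.ones(3), iterations=50))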

  8. Channelling information flows from observation to decision; or how to increase certainty

    NASA Astrophysics Data System (ADS)

    Weijs, S. V.

    2015-12-01

    To make adequate decisions in an uncertain world, information needs to reach the decision problem, so that the full consequences of each possible decision can be overseen. On its way from the physical world to a decision problem, information is transferred through the physical processes that influence the sensor, then through processes that happen within the sensor, and then through wires or electromagnetic waves. For the last decade, most information has become digitized at some point. From the moment of digitization, information can in principle be transferred losslessly. Information about the physical world is often also stored, sometimes in compressed form, such as physical laws, concepts, or models of specific hydrological systems. It is important to note, however, that all information about a physical system eventually has to originate from observation (although it is inevitably coloured by some prior assumptions). This colouring makes the compression lossy, but is effectively the only way to make use of similarities in time and space that enable predictions while measuring only a few macro-states of a complex hydrological system. Adding physical process knowledge to a hydrological model can thus be seen as a convenient way to transfer information from observations made at a different time or place, in order to make predictions about another situation, assuming the same dynamics are at work. The key challenge in achieving more certainty in hydrological prediction can therefore be formulated as a challenge to tap and channel information flows from the environment. For tapping more information flows, new measurement techniques, large-scale campaigns, historical data sets, and large-sample hydrology and regionalization efforts can bring progress. For channelling the information flows with minimum loss, model calibration and model formulation techniques should be critically investigated. Some experience from research in a Swiss high-alpine catchment is used as an illustration.

  9. Physically-enhanced data visualisation: towards real time solution of Partial Differential Equations in 3D domains

    NASA Astrophysics Data System (ADS)

    Zlotnik, Sergio

    2017-04-01

    Information provided by visualisation environments can be greatly increased if the data shown are combined with some relevant physical processes and the user is allowed to interact with those processes. This is particularly interesting in VR environments, where the user has a deep interplay with the data. For example, a geological seismic line in a 3D "cave" shows information on the geological structure of the subsoil. The available information could be enhanced with the thermal state of the region under study, with water-flow patterns in porous rocks, or with rock displacements under some stress conditions. The information added by the physical processes is usually the output of some numerical technique applied to solve a Partial Differential Equation (PDE) that describes the underlying physics. Many techniques are available to obtain numerical solutions of PDEs (e.g., Finite Elements, Finite Volumes, Finite Differences). However, all these traditional techniques require very large computational resources (particularly in 3D), making them unusable in a real-time visualization environment, such as VR, because the time required to compute a solution is measured in minutes or even hours. We present here a novel alternative for the resolution of PDE-based problems that is able to provide 3D solutions for a very large family of problems in real time. That is, the solution is evaluated in about one thousandth of a second, making the solver ideal for embedding in VR environments. Based on Model Order Reduction ideas, the proposed technique divides the computational work into a computationally intensive "offline" phase, which is run only once, and an "online" phase that allows the real-time evaluation of any solution within a family of problems. Preliminary examples of real-time solutions of complex PDE-based problems will be presented, including thermal problems, flow problems, wave problems and some simple coupled problems.
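
    The offline/online split can be illustrated with a generic projection-based reduced-order model; the sketch below (a POD basis from snapshots and Galerkin projection of a linear system) is only one simple instance of Model Order Reduction and is not the author's specific formulation.

      import numpy as np

      def offline_phase(snapshots, n_modes):
          """Expensive, run once: extract a reduced basis from solution snapshots."""
          U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
          return U[:, :n_modes]                      # basis of shape (n_dof, n_modes)

      def online_phase(basis, A, b):
          """Cheap, run per query: solve the small projected system in real time."""
          A_reduced = basis.T @ A @ basis            # (n_modes, n_modes)
          b_reduced = basis.T @ b
          return basis @ np.linalg.solve(A_reduced, b_reduced)

    Because the online phase only manipulates matrices of size n_modes, its cost is essentially independent of the full mesh resolution, which is what makes millisecond-scale evaluation plausible.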

  10. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.
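
    A hypothetical fragment of such a design, in which an entity carries level ("what"), topology ("where"), and time ("when") attributes while an inter-level association records the abstraction process, is sketched below; the class and attribute names are invented for illustration and do not come from the paper.

      from dataclasses import dataclass, field
      from typing import List, Optional

      @dataclass
      class TrackedEntity:
          level: str                                       # abstraction level ("what")
          position: tuple                                  # topological model ("where")
          timestamp: float                                 # temporal framework ("when")
          derived_from: Optional["TrackedEntity"] = None   # inter-level association

      @dataclass
      class AbstractionLevel:
          name: str
          entities: List[TrackedEntity] = field(default_factory=list)

          def lift(self, lower: TrackedEntity) -> TrackedEntity:
              """Inter-level conversion: build a higher-level entity from a lower one."""
              higher = TrackedEntity(self.name, lower.position, lower.timestamp,
                                     derived_from=lower)
              self.entities.append(higher)
              return higher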

  11. Real-Time On-Board Processing Validation of MSPI Ground Camera Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

    2010-01-01

    The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA, which includes PowerPC440 processors, we have implemented a least squares fitting algorithm that extracts intensity and polarimetric parameters in real time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.
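
    The data-reduction idea can be sketched as a per-pixel linear least-squares fit: many raw modulated samples are reduced to a handful of fitted coefficients that carry the intensity and polarimetric information. The basis functions and parameter names below are placeholders for illustration, not MSPI's actual modulation model or flight code.

      import numpy as np

      def fit_polarimetry(samples, times, modulation_freq_hz):
          """Least-squares fit of a sample stream to a [constant, cos, sin] basis."""
          design = np.column_stack([
              np.ones_like(times),                              # intensity-like term
              np.cos(2 * np.pi * modulation_freq_hz * times),   # polarimetric terms
              np.sin(2 * np.pi * modulation_freq_hz * times),
          ])
          coeffs, *_ = np.linalg.lstsq(design, samples, rcond=None)
          return coeffs   # a few coefficients replace many raw samples per pixel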

  12. Development of a Paradigm to Assess Nutritive and Biochemical Substances in Humans: A Preliminary Report on the Effects of Tyrosine upon Altitude- and Cold-Induced Stress Responses

    DTIC Science & Technology

    1987-03-01

    3/4 hours. Performance tests evaluated simple and choice reaction time to visual stimuli, vigilance, and processing of symbolic, numerical, verbal... minimize the adverse consequences of these stressors. Tyrosine enhanced performance (e.g. complex information processing, vigilance, and reaction time... processes inherent in many real-world tasks. For example, Map Compass requires association of direction and degree

  13. Coherent optical pulse sequencer for quantum applications.

    PubMed

    Hosseini, Mahdi; Sparkes, Ben M; Hétet, Gabriel; Longdell, Jevon J; Lam, Ping Koy; Buchler, Ben C

    2009-09-10

    The bandwidth and versatility of optical devices have revolutionized information technology systems and communication networks. Precise and arbitrary control of an optical field that preserves optical coherence is an important requisite for many proposed photonic technologies. For quantum information applications, a device that allows storage and on-demand retrieval of arbitrary quantum states of light would form an ideal quantum optical memory. Recently, significant progress has been made in implementing atomic quantum memories using electromagnetically induced transparency, photon echo spectroscopy, off-resonance Raman spectroscopy and other atom-light interaction processes. Single-photon and bright-optical-field storage with quantum states have both been successfully demonstrated. Here we present a coherent optical memory based on photon echoes induced through controlled reversible inhomogeneous broadening. Our scheme allows storage of multiple pulses of light within a chosen frequency bandwidth, and stored pulses can be recalled in arbitrary order with any chosen delay between each recalled pulse. Furthermore, pulses can be time-compressed, time-stretched or split into multiple smaller pulses and recalled in several pieces at chosen times. Although our experimental results are so far limited to classical light pulses, our technique should enable the construction of an optical random-access memory for time-bin quantum information, and have potential applications in quantum information processing.

  14. A front-end readout Detector Board for the OpenPET electronics system

    NASA Astrophysics Data System (ADS)

    Choong, W.-S.; Abu-Nimeh, F.; Moses, W. W.; Peng, Q.; Vu, C. Q.; Wu, J.-Y.

    2015-08-01

    We present a 16-channel front-end readout board for the OpenPET electronics system. A major task in developing a nuclear medical imaging system, such as a positron emission computed tomograph (PET) or a single-photon emission computed tomograph (SPECT), is the electronics system. While there are a wide variety of detector and camera design concepts, the relatively simple nature of the acquired data allows for a common set of electronics requirements that can be met by a flexible, scalable, and high-performance OpenPET electronics system. The analog signals from the different types of detectors used in medical imaging share similar characteristics, which allows for a common analog signal processing. The OpenPET electronics processes the analog signals with Detector Boards. Here we report on the development of a 16-channel Detector Board. Each signal is digitized by a continuously sampled analog-to-digital converter (ADC), which is processed by a field programmable gate array (FPGA) to extract pulse height information. A leading edge discriminator creates a timing edge that is "time stamped" by a time-to-digital converter (TDC) implemented inside the FPGA. This digital information from each channel is sent to an FPGA that services 16 analog channels, and then information from multiple channels is processed by this FPGA to perform logic for crystal lookup, DOI calculation, calibration, etc.
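
    A simplified software analogue of the two per-channel quantities described (pulse height from the sampled waveform and a leading-edge time stamp) is sketched below; it is not the board's firmware, and the baseline window, threshold, and linear interpolation standing in for the FPGA's TDC are assumptions.

      import numpy as np

      def process_pulse(samples, sample_period_ns, threshold):
          """Return (pulse_height, leading_edge_time_ns) for one digitized pulse."""
          samples = np.asarray(samples, dtype=float)
          baseline = samples[:4].mean()                 # assume pre-pulse baseline
          pulse_height = samples.max() - baseline
          above = np.where(samples - baseline > threshold)[0]
          if above.size == 0 or above[0] == 0:
              return pulse_height, None                 # no clean leading edge found
          i = above[0]
          # Interpolate between samples i-1 and i for a finer timing edge.
          frac = (threshold - (samples[i - 1] - baseline)) / (samples[i] - samples[i - 1])
          return pulse_height, (i - 1 + frac) * sample_period_ns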

  15. A front-end readout Detector Board for the OpenPET electronics system

    DOE PAGES

    Choong, W. -S.; Abu-Nimeh, F.; Moses, W. W.; ...

    2015-08-12

    Here, we present a 16-channel front-end readout board for the OpenPET electronics system. A major task in developing a nuclear medical imaging system, such as a positron emission computed tomograph (PET) or a single-photon emission computed tomograph (SPECT), is the electronics system. While there are a wide variety of detector and camera design concepts, the relatively simple nature of the acquired data allows for a common set of electronics requirements that can be met by a flexible, scalable, and high-performance OpenPET electronics system. The analog signals from the different types of detectors used in medical imaging share similar characteristics, which allows for a common analog signal processing. The OpenPET electronics processes the analog signals with Detector Boards. Here we report on the development of a 16-channel Detector Board. Each signal is digitized by a continuously sampled analog-to-digital converter (ADC), which is processed by a field programmable gate array (FPGA) to extract pulse height information. A leading edge discriminator creates a timing edge that is "time stamped" by a time-to-digital converter (TDC) implemented inside the FPGA. In conclusion, this digital information from each channel is sent to an FPGA that services 16 analog channels, and then information from multiple channels is processed by this FPGA to perform logic for crystal lookup, DOI calculation, calibration, etc.

  16. An accuracy assessment of real-time GNSS time series toward semi-real-time seafloor geodetic observation

    NASA Astrophysics Data System (ADS)

    Osada, Y.; Ohta, Y.; Demachi, T.; Kido, M.; Fujimoto, H.; Azuma, R.; Hino, R.

    2013-12-01

    Large interplate earthquakes have repeatedly occurred along the Japan Trench. Recently, detailed crustal deformation has been revealed by GSI's nationwide inland GPS network, GEONET. However, the region of maximum displacement for interplate earthquakes is mainly located offshore. GPS/Acoustic seafloor geodetic observation (hereafter GPS/A) is therefore important and useful for understanding the shallower part of the interplate coupling between the subducting and overriding plates. We typically conduct GPS/A in specific ocean areas in a repeated-campaign style using a research vessel or buoy, so we cannot monitor the temporal variation of seafloor crustal deformation in real time. One technical issue for real-time observation is kinematic GPS analysis, because kinematic analysis relies on both reference and rover data. If precise kinematic GPS analysis becomes possible in the offshore region, it should be a promising method for real-time GPS/A with a USV (Unmanned Surface Vehicle) and a moored buoy. We assessed the stability, precision, and accuracy of the StarFire(TM) global satellite-based augmentation system. We first tested StarFire in the static condition. In order to assess coordinate precision and accuracy, we compared 1 Hz StarFire time series with post-processed precise point positioning (PPP) 1 Hz time series computed by the GIPSY-OASIS II processing software Ver. 6.1.2 with three different product types (ultra-rapid, rapid, and final orbits). We also used different clock-information intervals (30 and 300 seconds) for the post-processed PPP. The standard deviation of the real-time StarFire time series is less than 30 mm (horizontal components) and 60 mm (vertical component) based on one month of continuous processing. We also assessed the noise spectra of the time series estimated by StarFire and by the post-processed GIPSY PPP. We found that the noise spectrum of the StarFire time series shows a similar pattern to the GIPSY-OASIS II result based on JPL rapid orbit products with 300-second clock information. Finally, we report the stability, precision, and accuracy of StarFire under moving conditions.
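
    The precision figures quoted above amount to per-component scatter statistics of the real-time series and of its residuals against a post-processed reference; a minimal sketch of that comparison is given below, with placeholder array names rather than the study's data.

      import numpy as np

      def timeseries_precision(realtime_enu, postprocessed_enu):
          """Per-component standard deviations of a real-time position series
          and of its residuals against a post-processed reference series."""
          realtime_enu = np.asarray(realtime_enu, dtype=float)
          residuals = realtime_enu - np.asarray(postprocessed_enu, dtype=float)
          return {
              "realtime_std_m": realtime_enu.std(axis=0),    # e.g. [east, north, up]
              "residual_std_m": residuals.std(axis=0),
          }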

  17. How does information influence hope in family members of traumatic coma patients in intensive care unit?

    PubMed

    Verhaeghe, Sofie T L; van Zuuren, Florence J; Defloor, Tom; Duijnstee, Mia S H; Grypdonck, Mieke H F

    2007-08-01

    To assess the interplay between hope and the information provided by health care professionals. Earlier research has shown that hope is crucial for relatives of traumatic coma patients. It has also been reported that the need for information is extremely important for relatives of critically ill patients. A qualitative approach according to the 'grounded theory' method with constant comparison was used. We held 24 in-depth interviews with 22 family members of 16 patients with traumatic coma. Data processing and data analysis took place in a cyclic process wherein the induction of themes alternated with confrontation with new material. Family members of traumatic coma patients want information that is as accurate as possible, provided by doctors and nurses in an understandable manner and leaving room for hope. At first, family members can do no more than passively absorb the information they receive. After some time, they actively start working with information and learn what to build their hope on. In this way, concrete hope evolves and seems to be strongly determined by information. Information that is more positive than warranted is not appreciated at all. It leads to false hope and, once its real nature becomes apparent, to increased distress and loss of trust in the professionals. The process of hope is crucial in coping with traumatic coma, and information can facilitate this process. If professionals, especially nurses, keep in mind the process that family members go through in handling information, they can not only facilitate this process but also help family members to establish realistic hope.

  18. USSR Report, Consumer Goods and Domestic Trade, No. 65

    DTIC Science & Technology

    1983-06-08

    types of oil-bearing raw materials: grape seeds, fruit and tree-fruit pits, corn germs and others. In recent years the ties of the workers of oils and... brief, indicate how the original information was processed. Where no processing indicator is given, the information was summarized or extracted... good crop. More than 2 million tons of grapes were harvested for the first time. Public education, science and culture underwent further development

  19. Surviving an Information Systems Conversion.

    ERIC Educational Resources Information Center

    Neel, Don

    1999-01-01

    Prompted by the "millennium bug," many school districts are in the process of replacing non-Y2K-compliant information systems. Planners should establish a committee to develop performance criteria and select the winning proposal, estimate time requirements, and schedule retraining during low-activity periods. (MLH)

  20. Clear as glass: transparent financial reporting.

    PubMed

    Valletta, Robert M

    2005-08-01

    To be transparent, financial information needs to be easily accessible, timely, content-rich, and narrative. Not-for-profit hospitals and health systems should report detailed financial information quarterly. They need internal controls to reduce the level of complexity throughout the organization by creating standardized processes.
