Sample records for complex information processing

  1. Simplifying the complexity surrounding ICU work processes--identifying the scope for information management in ICU settings.

    PubMed

    Munir, Samina K; Kay, Stephen

    2005-08-01

    A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Analysis of these diagrams showed that intensive care work processes consist of 'simplified-complexity'; these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading: in reality, it is not the work processes that cause the complexity; rather, the complexity is rooted in the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that integrate all clinical information requirements is not only immense but also very plausible.

  2. Integrated Information Increases with Fitness in the Evolution of Animats

    PubMed Central

    Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph

    2011-01-01

    One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing and integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639
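
    As a toy illustration (not the authors' code; the samples below are hypothetical) of the simpler information-processing measures the study compares against: mutual information between an animat's sensor and motor states can be estimated from sampled trajectories with a plug-in estimator.

    ```python
    # Plug-in estimate of I(X;Y) in bits from (sensor, motor) samples.
    # Hypothetical data; the paper's animats use evolved logic brains.
    from collections import Counter
    import math

    def mutual_information(pairs):
        """I(X;Y) = sum p(x,y) log2[p(x,y) / (p(x)p(y))]."""
        n = len(pairs)
        pxy = Counter(pairs)
        px = Counter(x for x, _ in pairs)
        py = Counter(y for _, y in pairs)
        return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
                   for (x, y), c in pxy.items())

    # Hypothetical binary sensor/motor readings along one trajectory.
    samples = [(0, 0), (0, 0), (1, 1), (1, 1), (1, 0), (0, 1), (1, 1), (0, 0)]
    print(f"I(sensor; motor) = {mutual_information(samples):.3f} bits")
    ```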

  3. Processing of spatial and non-spatial information in rats with lesions of the medial and lateral entorhinal cortex: Environmental complexity matters.

    PubMed

    Rodo, Christophe; Sargolini, Francesca; Save, Etienne

    2017-03-01

    The entorhinal-hippocampal circuitry has been suggested to play an important role in episodic memory, but the contribution of the entorhinal cortex remains elusive. Predominant theories propose that the medial entorhinal cortex (MEC) processes spatial information whereas the lateral entorhinal cortex (LEC) processes non-spatial information. A recent study using an object exploration task has suggested that the involvement of the MEC and LEC in spatial and non-spatial information processing could be modulated by the amount of information to be processed, i.e. environmental complexity. To address this hypothesis we used an object exploration task in which rats with excitotoxic lesions of the MEC and LEC had to detect spatial and non-spatial novelty among a set of objects, and we varied environmental complexity by decreasing the number of objects or the amount of object diversity. Reducing diversity restored the ability to process spatial and non-spatial information in the MEC and LEC groups, respectively. Reducing the number of objects restored the ability to process non-spatial information in the LEC group but not the ability to process spatial information in the MEC group. The findings indicate that the MEC and LEC are not strictly necessary for spatial and non-spatial processing but that their involvement depends on the complexity of the information to be processed.

  4. Processing power limits social group size: computational evidence for the cognitive costs of sociality

    PubMed Central

    Dávid-Barrett, T.; Dunbar, R. I. M.

    2013-01-01

    Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623

  5. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
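
    As a numerical aside (a standard textbook quantity, not the authors' meromorphic calculus): for an ordinary Markov chain, the asymptotic entropy rate that the paper obtains in closed form reduces to h = -sum_i pi_i sum_j T_ij log2 T_ij, which is easy to check directly. The transition matrix below is hypothetical.

    ```python
    # Entropy rate of a two-state Markov chain via its stationary
    # distribution; a sanity check on the simplest case of the
    # closed-form expressions discussed in the abstract.
    import numpy as np

    T = np.array([[0.9, 0.1],   # hypothetical transition matrix
                  [0.4, 0.6]])

    # Stationary distribution = left eigenvector of T for eigenvalue 1.
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()

    h = -sum(pi[i] * T[i, j] * np.log2(T[i, j])
             for i in range(2) for j in range(2) if T[i, j] > 0)
    print(f"entropy rate: {h:.4f} bits per symbol")   # ~0.5694 here
    ```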

  6. Minimized state complexity of quantum-encoded cryptic processes

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  7. The Effects of Using Multimedia Presentations and Modular Worked-Out Examples as Instructional Methodologies to Manage the Cognitive Processing Associated with Information Literacy Instruction at the Graduate and Undergraduate Levels of Nursing Education

    ERIC Educational Resources Information Center

    Calhoun, Shawn P.

    2012-01-01

    Information literacy is a complex knowledge domain. Cognitive processing theory describes the effects an instructional subject and the learning environment have on working memory. Essential processing is one component of cognitive processing theory that explains the inherent complexity of knowledge domains such as information literacy. Prior…

  8. Information Processing by Schizophrenics When Task Complexity Increases

    ERIC Educational Resources Information Center

    Hirt, Michael; And Others

    1977-01-01

    The performance of hospitalized paranoid schizophrenics, nonparanoids, and hospitalized controls was compared on motor, perceptual, and cognitive tasks of increasing complexity. The data were examined within the context of comparing differential predictions made by input and central processing theories of information-processing deficit. (Editor)

  9. Recoding Numerics to Geometrics for Complex Discrimination Tasks; A Feasibility Study of Coding Strategy.

    ERIC Educational Resources Information Center

    Simpkins, John D.

    Processing complex multivariate information effectively when relational properties of information sub-groups are ambiguous is difficult for man and man-machine systems. However, the information processing task is made easier through code study, cybernetic planning, and accurate display mechanisms. An exploratory laboratory study designed for the…

  10. Social Information Processing and Emotional Understanding in Children with LD

    ERIC Educational Resources Information Center

    Bauminger, Nirit; Edelsztein, Hany Schorr; Morash, Janice

    2005-01-01

    The present study aimed to comprehensively examine social cognition processes in children with and without learning disabilities (LD), focusing on social information processing (SIP) and complex emotional understanding capabilities such as understanding complex, mixed, and hidden emotions. Participants were 50 children with LD (age range 9.4-12.7;…

  11. The use of information theory in evolutionary biology.

    PubMed

    Adami, Christoph

    2012-05-01

    Information is a key concept in evolutionary biology. Information stored in a biological organism's genome is used to generate the organism and to maintain and control it. Information is also that which evolves. When a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here, I review applications of information theory to the evolution of proteins and to the evolution of information processing in simulated agents that adapt to perform a complex task.
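
    A minimal sketch of one measure this literature reviews (Adami-style per-site information; the toy alignment below is hypothetical): the information a population stores about its environment can be estimated as log2(4) minus the observed nucleotide entropy at each genomic site, summed over sites.

    ```python
    # Per-site information content of a DNA alignment, in bits:
    # I = sum over sites of (log2(4) - H(site)).
    import math
    from collections import Counter

    def site_entropy(column):
        counts = Counter(column)
        n = len(column)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    alignment = ["ACGTA", "ACGTT", "ACGAA", "ACGTA"]   # toy population
    info = sum(2.0 - site_entropy(col) for col in zip(*alignment))
    print(f"~{info:.2f} bits stored across {len(alignment[0])} sites")
    ```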

  12. Enhanced and diminished visuo-spatial information processing in autism depends on stimulus complexity.

    PubMed

    Bertone, Armando; Mottron, Laurent; Jelenic, Patricia; Faubert, Jocelyn

    2005-10-01

    Visuo-perceptual processing in autism is characterized by intact or enhanced performance on static spatial tasks and inferior performance on dynamic tasks, suggesting a deficit of dorsal visual stream processing in autism. However, previous findings by Bertone et al. indicate that neuro-integrative mechanisms used to detect complex motion, rather than motion perception per se, may be impaired in autism. We present here the first demonstration of concurrent enhanced and decreased performance in autism on the same visuo-spatial static task, wherein the only factor dichotomizing performance was the neural complexity required to discriminate grating orientation. The ability of persons with autism was found to be superior for identifying the orientation of simple, luminance-defined (or first-order) gratings but inferior for complex, texture-defined (or second-order) gratings. Using a flicker contrast sensitivity task, we demonstrated that this finding is probably not due to abnormal information processing at a sub-cortical level (magnocellular and parvocellular functioning). Together, these findings are interpreted as a clear indication of altered low-level perceptual information processing in autism, and confirm that the deficits and assets observed in autistic visual perception are contingent on the complexity of the neural network required to process a given type of visual stimulus. We suggest that atypical neural connectivity, resulting in enhanced lateral inhibition, may account for both enhanced and decreased low-level information processing in autism.

  13. Efficacy of Cognitive Processes in Young People with High-Functioning Autism Spectrum Disorder Using a Novel Visual Information-Processing Task

    ERIC Educational Resources Information Center

    Speirs, Samantha J.; Rinehart, Nicole J.; Robinson, Stephen R.; Tonge, Bruce J.; Yelland, Gregory W.

    2014-01-01

    Autism spectrum disorders (ASD) are characterised by a unique pattern of preserved abilities and deficits within and across cognitive domains. The Complex Information Processing Theory proposes this pattern reflects an altered capacity to respond to cognitive demands. This study compared how complexity induced by time constraints on processing…

  14. Mixture and odorant processing in the olfactory systems of insects: a comparative perspective.

    PubMed

    Clifford, Marie R; Riffell, Jeffrey A

    2013-11-01

    Natural olfactory stimuli are often complex mixtures of volatiles, of which the identities and ratios of constituents are important for odor-mediated behaviors. Despite this importance, the mechanism by which the olfactory system processes this complex information remains an area of active study. In this review, we describe recent progress in how odorants and mixtures are processed in the brain of insects. We use a comparative approach toward contrasting olfactory coding and the behavioral efficacy of mixtures in different insect species, and organize these topics around four sections: (1) Examples of the behavioral efficacy of odor mixtures and the olfactory environment; (2) mixture processing in the periphery; (3) mixture coding in the antennal lobe; and (4) evolutionary implications and adaptations for olfactory processing. We also include pertinent background information about the processing of individual odorants and comparative differences in wiring and anatomy, as these topics have been richly investigated and inform the processing of mixtures in the insect olfactory system. Finally, we describe exciting studies that have begun to elucidate the role of the processing of complex olfactory information in evolution and speciation.

  15. Presentation Media, Information Complexity, and Learning Outcomes

    ERIC Educational Resources Information Center

    Andres, Hayward P.; Petersen, Candice

    2002-01-01

    Cognitive processing limitations restrict the number of complex information items held and processed in human working memory. To overcome such limitations, a verbal working memory channel is used to construct an if-then proposition representation of facts and a visual working memory channel is used to construct a visual imagery of geometric…

  16. Towards the understanding of network information processing in biology

    NASA Astrophysics Data System (ADS)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can instead be identified using coarse-graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse-graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global-scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with different temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single-unit signal processing.

  17. Thinking graphically: Connecting vision and cognition during graph comprehension.

    PubMed

    Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A

    2008-03-01

    Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described.

  18. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
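
    A minimal sketch of the delay-based reservoir idea in the abstract (parameters hypothetical; real delay systems also couple consecutive virtual nodes through the node's finite response time, which is omitted here): one nonlinear node is time-multiplexed into N "virtual nodes" whose states feed a trained linear readout.

    ```python
    # Single nonlinear node with delayed feedback, time-multiplexed
    # into N virtual nodes; a ridge-regression readout is trained on
    # the collected states for one-step-ahead prediction.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 50                        # virtual nodes per delay interval
    mask = rng.uniform(-1, 1, N)  # fixed random input mask
    eta, gamma = 0.5, 0.05        # feedback and input scaling

    def reservoir_states(u):
        x = np.zeros(N)           # virtual-node states, one delay ago
        states = []
        for u_t in u:
            x = np.tanh(eta * x + gamma * mask * u_t)
            states.append(x.copy())
        return np.array(states)

    u = np.sin(0.2 * np.arange(300))          # toy input signal
    S, y = reservoir_states(u[:-1]), u[1:]    # predict next sample
    W = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
    print("train MSE:", np.mean((S @ W - y) ** 2))
    ```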

  19. Further Understanding of Complex Information Processing in Verbal Adolescents and Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Williams, Diane L.; Minshew, Nancy J.; Goldstein, Gerald

    2015-01-01

    More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher order abilities. Subsequent behavioral work revealed a…

  20. Linguistic Complexity and Information Structure in Korean: Evidence from Eye-Tracking during Reading

    ERIC Educational Resources Information Center

    Lee, Yoonhyoung; Lee, Hanjung; Gordon, Peter C.

    2007-01-01

    The nature of the memory processes that support language comprehension and the manner in which information packaging influences online sentence processing were investigated in three experiments that used eye-tracking during reading to measure the ease of understanding complex sentences in Korean. All three experiments examined reading of embedded…

  1. BIM Automation: Advanced Modeling Generative Process for Complex Structures

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms, morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Splines (NURBS) with multiple levels of detail (Mixed and ReverseLoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of modern structure for the courtyard of West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  2. Neurophysiological Factors in Human Information Processing Capacity

    ERIC Educational Resources Information Center

    Ramsey, Nick F.; Jansma, J. M.; Jager, G.; Van Raalten, T.; Kahn, R. S.

    2004-01-01

    What determines how well an individual can manage the complexity of information processing demands when several tasks have to be executed simultaneously? Various theoretical frameworks address the mechanisms of information processing and the changes that take place when processes become automated, and brain regions involved in various types of…

  3. Network complexity as a measure of information processing across resting-state networks: evidence from the Human Connectome Project

    PubMed Central

    McDonough, Ian M.; Nashiro, Kaoru

    2014-01-01

    An emerging field of research focused on fluctuations in brain signals has provided evidence that the complexity of those signals, as measured by entropy, conveys important information about network dynamics (e.g., local and distributed processing). While much research has focused on how neural complexity differs in populations with different age groups or clinical disorders, substantially less research has focused on the basic understanding of neural complexity in populations with young and healthy brain states. The present study used resting-state fMRI data from the Human Connectome Project (Van Essen et al., 2013) to test the extent to which neural complexity in the BOLD signal, as measured by multiscale entropy, (1) would differ from random noise, (2) would differ between four major resting-state networks previously associated with higher-order cognition, and (3) would be associated with the strength and extent of functional connectivity, a complementary method of estimating information processing. We found that complexity in the BOLD signal exhibited different patterns of complexity from white, pink, and red noise and that neural complexity was differentially expressed between resting-state networks, including the default mode, cingulo-opercular, and left and right frontoparietal networks. Lastly, neural complexity across all networks was negatively associated with functional connectivity at fine scales, but was positively associated with functional connectivity at coarse scales. The present study is the first to characterize neural complexity in BOLD signals at a high temporal resolution and across different networks and might help clarify the inconsistencies between neural complexity and functional connectivity, thus informing the mechanisms underlying neural complexity. PMID:24959130
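
    A compact sketch of the complexity measure named here (multiscale entropy in the style of Costa et al.; this is a plain plug-in estimator with the tolerance taken per coarse-grained series, not the study's pipeline): sample entropy computed on successively coarse-grained copies of the signal.

    ```python
    # Multiscale entropy: SampEn(m, r) of non-overlapping-window
    # averages of the signal at scales 1..5.
    import numpy as np

    def sample_entropy(x, m=2, r=0.15):
        x = np.asarray(x, float)
        tol = r * x.std()
        def matches(mm):
            t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
            c = 0
            for i in range(len(t)):
                d = np.max(np.abs(t - t[i]), axis=1)
                c += np.sum(d <= tol) - 1      # exclude self-match
            return c
        b, a = matches(m), matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def mse(x, scales=range(1, 6)):
        x = np.asarray(x, float)
        out = []
        for s in scales:
            n = len(x) // s
            coarse = x[:n * s].reshape(n, s).mean(axis=1)
            out.append(sample_entropy(coarse))
        return out

    print(mse(np.random.default_rng(1).standard_normal(1000)))
    ```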

  4. Neurophysiological Basis of Multi-Scale Entropy of Brain Complexity and Its Relationship With Functional Connectivity.

    PubMed

    Wang, Danny J J; Jann, Kay; Fan, Chang; Qiao, Yang; Zang, Yu-Feng; Lu, Hanbing; Yang, Yihong

    2018-01-01

    Recently, non-linear statistical measures such as multi-scale entropy (MSE) have been introduced as indices of the complexity of electrophysiology and fMRI time-series across multiple time scales. In this work, we investigated the neurophysiological underpinnings of complexity (MSE) of electrophysiology and fMRI signals and their relations to functional connectivity (FC). MSE and FC analyses were performed on simulated data using a neural-mass-model-based brain network model built with the Brain Dynamics Toolbox, on animal models with concurrent recording of fMRI and electrophysiology in conjunction with pharmacological manipulations, and on resting-state fMRI data from the Human Connectome Project. Our results show that the complexity of regional electrophysiology and fMRI signals is positively correlated with network FC. The associations between MSE and FC are dependent on the temporal scales or frequencies, with higher associations between MSE and FC at lower temporal frequencies. Our results from theoretical modeling, animal experiments and human fMRI indicate that (1) regional neural complexity and network FC may be two related aspects of the brain's information processing: the more complex the regional neural activity, the higher the FC this region has with other brain regions; and (2) MSE at high and low frequencies may represent local and distributed information processing across brain regions. Based on the literature and our data, we propose that the complexity of regional neural signals may serve as an index of the brain's capacity for information processing: increased complexity may indicate greater transition or exploration between different states of brain networks, and thereby a greater propensity for information processing.

  5. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
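
    A minimal simulation of the paper's starting point (coefficients hypothetical): coarse-graining a bivariate VAR(1) process by averaging non-overlapping windows of length s, the rescaling step that the theory shows turns the VAR into a VARMA process. Comparing lag-1 autocorrelations illustrates how temporal structure changes across scales.

    ```python
    # Simulate a bivariate VAR(1), then rescale by window averaging.
    import numpy as np

    rng = np.random.default_rng(2)
    A = np.array([[0.5, 0.2],    # hypothetical VAR(1) coefficients:
                  [0.0, 0.7]])   # unidirectional coupling x2 -> x1

    n, s = 4000, 3
    X = np.zeros((n, 2))
    for t in range(1, n):
        X[t] = A @ X[t - 1] + rng.standard_normal(2)

    m = n // s                   # non-overlapping averages of length s
    Y = X[:m * s].reshape(m, s, 2).mean(axis=1)
    print("scale 1 lag-1 autocorr:", np.corrcoef(X[:-1, 0], X[1:, 0])[0, 1])
    print(f"scale {s} lag-1 autocorr:", np.corrcoef(Y[:-1, 0], Y[1:, 0])[0, 1])
    ```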

  6. Effects of emotional tone and visual complexity on processing health information in prescription drug advertising.

    PubMed

    Norris, Rebecca L; Bailey, Rachel L; Bolls, Paul D; Wise, Kevin R

    2012-01-01

    This experiment explored how the emotional tone and visual complexity of direct-to-consumer (DTC) drug advertisements affect the encoding and storage of specific risk and benefit statements about each of the drugs in question. Results are interpreted under the limited capacity model of motivated mediated message processing framework. Findings suggest that DTC drug ads should be pleasantly toned and high in visual complexity in order to maximize encoding and storage of risk and benefit information.

  7. An information transfer based novel framework for fault root cause tracing of complex electromechanical systems in the processing industry

    NASA Astrophysics Data System (ADS)

    Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani

    2018-02-01

    As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. Focused on the problems arising from the lack of systematic and comprehensive integration, a novel information-transfer-based data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the disciplines of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application of a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and it can form the foundation of system vulnerability analysis and condition prediction, as well as other engineering applications.
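
    A hedged sketch of the kind of pairwise quantity such a framework builds on (plain transfer entropy with a crude binning symbolization, not the authors' improved symbolic variant; the coupled toy signals are hypothetical): T(Y→X) measures how much the past of Y helps predict the next value of X beyond X's own past.

    ```python
    # Plug-in transfer entropy T(Y->X) =
    #   sum p(x', x, y) * log2[ p(x'|x, y) / p(x'|x) ]
    # over symbolized (binned) signal values.
    from collections import Counter
    import math
    import random

    def transfer_entropy(x, y, bins=4):
        def symbolize(z):
            lo, hi = min(z), max(z)
            return [min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1)
                    for v in z]
        xs, ys = symbolize(x), symbolize(y)
        n = len(xs) - 1
        triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))   # (x', x, y)
        pairs_xx = Counter(zip(xs[1:], xs[:-1]))
        pairs_xy = Counter(zip(xs[:-1], ys[:-1]))
        singles = Counter(xs[:-1])
        return sum((c / n) * math.log2(c * singles[x0] /
                   (pairs_xx[(x1, x0)] * pairs_xy[(x0, y0)]))
                   for (x1, x0, y0), c in triples.items())

    random.seed(0)                      # toy pair: y drives x at lag 1
    y = [random.gauss(0, 1) for _ in range(5000)]
    x = [0.0] + [0.8 * y[t - 1] + 0.2 * random.gauss(0, 1)
                 for t in range(1, 5000)]
    print(transfer_entropy(x, y), ">", transfer_entropy(y, x))
    ```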

  8. The Robust Beauty of Ordinary Information

    ERIC Educational Resources Information Center

    Katsikopoulos, Konstantinos V.; Schooler, Lael J.; Hertwig, Ralph

    2010-01-01

    Heuristics embodying limited information search and noncompensatory processing of information can yield robust performance relative to computationally more complex models. One criticism raised against heuristics is the argument that complexity is hidden in the calculation of the cue order used to make predictions. We discuss ways to order cues…

  9. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state-space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors, using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation processes for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
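
    A minimal sketch of SISR for a population state-space model in the spirit of the study (model, priors, and parameter values hypothetical): particles carry a latent abundance and a growth rate, are weighted by a Poisson observation model, resampled, jittered (a crude stand-in for the kernel smoothing mentioned above), and propagated.

    ```python
    # Sequential importance sampling/resampling for a stochastic
    # exponential-growth population model with Poisson counts.
    import numpy as np

    rng = np.random.default_rng(3)

    true_lam, T = 1.03, 25                  # simulate 25 years of data
    N = [200.0]
    for _ in range(T - 1):
        N.append(N[-1] * true_lam * np.exp(rng.normal(0, 0.05)))
    counts = rng.poisson(N)

    P = 10000                               # particles: (abundance, rate)
    lam = rng.normal(1.0, 0.1, P)           # informative prior on growth
    Np = rng.uniform(100, 300, P)           # prior on initial abundance
    for y in counts:
        logw = y * np.log(Np) - Np          # Poisson log-likelihood (+const)
        w = np.exp(logw - logw.max()); w /= w.sum()
        idx = rng.choice(P, P, p=w)         # multinomial resampling
        Np, lam = Np[idx], lam[idx]
        lam += rng.normal(0, 0.005, P)      # jitter against depletion
        Np *= lam * np.exp(rng.normal(0, 0.05, P))   # propagate state
    print("posterior mean growth rate:", lam.mean(), "(true:", true_lam, ")")
    ```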

  10. Abstraction of information in repository performance assessments. Examples from the SKI project Site-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dverstorp, B.; Andersson, J.

    1995-12-01

    Performance assessment of a nuclear waste repository implies an analysis of a complex system with many interacting processes. Even if some of these processes are known in great detail, problems arise when combining all information, and means of abstracting information from complex detailed models into models that couple different processes are needed. Clearly, one of the major objectives of performance assessment, to calculate doses or other performance indicators, implies an enormous abstraction of information compared to all the information that is used as input. Other problems are that the knowledge of different parts or processes is strongly variable, and adjustments and interpretations are needed when combining models from different disciplines. In addition, people as well as computers, even today, have a limited capacity to process information, and choices have to be made. However, because abstraction of information is clearly unavoidable in performance assessment, the validity of the choices made always needs to be scrutinized, and the judgements made need to be updated in an iterative process.

  11. Shifts in information processing level: the speed theory of intelligence revisited.

    PubMed

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  12. Demodulation processes in auditory perception

    NASA Astrophysics Data System (ADS)

    Feth, Lawrence L.

    1994-08-01

    The long-range goal of this project is the understanding of human auditory processing of information conveyed by complex, time-varying signals such as speech, music or important environmental sounds. Our work is guided by the assumption that human auditory communication is a 'modulation-demodulation' process. That is, we assume that sound sources produce a complex stream of sound pressure waves with information encoded as variations (modulations) of the signal amplitude and frequency. The listener's task is then one of demodulation. Much of past psychoacoustics work has been based on what we characterize as 'spectrum picture processing.' Complex sounds are Fourier analyzed to produce an amplitude-by-frequency 'picture' and the perception process is modeled as if the listener were analyzing the spectral picture. This approach leads to studies such as 'profile analysis' and the power-spectrum model of masking. Our approach leads us to investigate time-varying, complex sounds. We refer to them as dynamic signals, and we have developed auditory signal processing models to help guide our experimental work.

  13. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  14. Communication Network Integration and Group Uniformity in a Complex Organization.

    ERIC Educational Resources Information Center

    Danowski, James A.; Farace, Richard V.

    This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…

  15. Surviving Blind Decomposition: A Distributional Analysis of the Time-Course of Complex Word Recognition

    ERIC Educational Resources Information Center

    Schmidtke, Daniel; Matsuki, Kazunaga; Kuperman, Victor

    2017-01-01

    The current study addresses a discrepancy in the psycholinguistic literature about the chronology of information processing during the visual recognition of morphologically complex words. "Form-then-meaning" accounts of complex word recognition claim that morphemes are processed as units of form prior to any influence of their meanings,…

  16. Complex vestibular macular anatomical relationships need a synthetic approach

    NASA Technical Reports Server (NTRS)

    Ross, M. D.

    2001-01-01

    Mammalian vestibular maculae are anatomically organized for complex parallel processing of linear acceleration information. Anatomical findings in rat maculae are provided in order to underscore this complexity, which is little understood functionally. This report emphasizes that a synthetic approach is critical to understanding how maculae function and the kind of information they conduct to the brain.

  17. Spintronic characteristics of self-assembled neurotransmitter acetylcholine molecular complexes enable quantum information processing in neural networks and brain

    NASA Astrophysics Data System (ADS)

    Tamulis, Arvydas; Majauskaite, Kristina; Kairys, Visvaldas; Zborowski, Krzysztof; Adhikari, Kapil; Krisciukaitis, Sarunas

    2016-09-01

    Implementation of liquid-state quantum information processing based on spatially localized electronic spin in the neurotransmitter stable acetylcholine (ACh) neutral molecular radical is discussed. Using DFT quantum calculations, we proved that this molecule possesses a stable localized electron spin, which may represent a qubit in quantum information processing. The necessary operating conditions for the ACh molecule are formulated in self-assembled dimer and more complex systems. The main quantum mechanical result of this paper is that the proposed neurotransmitter ACh systems include the use of quantum molecular spintronic arrays to control neurotransmission in neural networks.

  18. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

    In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established, and this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole to the local to the details, and subnet models of the different levels of object-oriented Petri Nets are established. The communication problem between Petri subnets is solved by using a message database, which reduces the complexity of system modeling effectively. Finally, the modeling process is presented, and a five-layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
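
    A minimal sketch of the underlying formalism (a toy timed Petri net, not the paper's five-layer object-oriented model; place names hypothetical): places hold tokens, and a transition fires only when its input places are sufficiently marked, consuming and producing tokens and advancing a clock by its duration.

    ```python
    # Two sequential assembly steps modeled as timed transitions.
    class TimedPetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place -> token count
            self.clock = 0.0

        def fire(self, inputs, outputs, duration):
            """Fire one transition if enabled; return success."""
            if any(self.marking.get(p, 0) < n for p, n in inputs.items()):
                return False
            for p, n in inputs.items():
                self.marking[p] -= n
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n
            self.clock += duration
            return True

    net = TimedPetriNet({"parts": 2, "fixture_free": 1})
    net.fire({"parts": 2, "fixture_free": 1}, {"subassembly": 1}, 4.0)
    net.fire({"subassembly": 1}, {"inspected": 1, "fixture_free": 1}, 1.5)
    print(net.marking, "elapsed:", net.clock)
    ```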

  19. Algorithm of cardio complex detection and sorting for processing the data of continuous cardio signal monitoring.

    PubMed

    Krasichkov, A S; Grigoriev, E B; Nifontov, E M; Shapovalov, V V

    The paper presents an algorithm of cardio complex classification as part of processing the data of continuous cardiac monitoring. R-wave detection concurrently with cardio complex sorting is discussed. The core of this approach is the use of prior information about cardio complex forms, segmental structure, and degree of kindness. Results of the sorting algorithm testing are provided.

  20. Brain signal complexity rises with repetition suppression in visual learning.

    PubMed

    Lafontaine, Marc Philippe; Lacourse, Karine; Lina, Jean-Marc; McIntosh, Anthony R; Gosselin, Frédéric; Théoret, Hugo; Lippé, Sarah

    2016-06-21

    Neuronal activity associated with visual processing of an unfamiliar face gradually diminishes when it is viewed repeatedly. This process, known as repetition suppression (RS), is involved in the acquisition of familiarity. Current models suggest that RS results from interactions between visual information processing areas located in the occipito-temporal cortex and higher order areas, such as the dorsolateral prefrontal cortex (DLPFC). Brain signal complexity, which reflects information dynamics of cortical networks, has been shown to increase as unfamiliar faces become familiar. However, the complementarity of RS and increases in brain signal complexity have yet to be demonstrated within the same measurements. We hypothesized that RS and brain signal complexity increase occur simultaneously during learning of unfamiliar faces. Further, we expected alteration of DLPFC function by transcranial direct current stimulation (tDCS) to modulate RS and brain signal complexity over the occipito-temporal cortex. Participants underwent three tDCS conditions in random order: right anodal/left cathodal, right cathodal/left anodal and sham. Following tDCS, participants learned unfamiliar faces, while an electroencephalogram (EEG) was recorded. Results revealed RS over occipito-temporal electrode sites during learning, reflected by a decrease in signal energy, a measure of amplitude. Simultaneously, as signal energy decreased, brain signal complexity, as estimated with multiscale entropy (MSE), increased. In addition, prefrontal tDCS modulated brain signal complexity over the right occipito-temporal cortex during the first presentation of faces. These results suggest that although RS may reflect a brain mechanism essential to learning, complementary processes reflected by increases in brain signal complexity may be instrumental in the acquisition of novel visual information. Such processes likely involve long-range coordinated activity between prefrontal and lower order visual areas.

  1. Information Network Model Query Processing

    NASA Astrophysics Data System (ADS)

    Song, Xiaopu

    Information Networking Model (INM) [31] is a novel database model for the management of real-world objects and relationships. It naturally and directly supports various kinds of static and dynamic relationships between objects. In INM, objects are networked through various natural and complex relationships. INM Query Language (INM-QL) [30] is designed to explore such information networks, retrieve information about schema, instances, their attributes, relationships, and context-dependent information, and process query results in the user-specified form. The INM database management system has been implemented using Berkeley DB, and it supports INM-QL. This thesis is mainly focused on the implementation of the subsystem that effectively and efficiently processes INM-QL. The subsystem provides a lexical and syntactical analyzer of INM-QL, and it is able to choose appropriate evaluation strategies and index mechanisms to process queries in INM-QL without the user's intervention. It also uses an intermediate result structure to hold intermediate query results, and other helping structures to reduce the complexity of query processing.

  2. Theoretical aspects of cellular decision-making and information-processing.

    PubMed

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, which enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since a number of cellular behaviors can be regarded as processes of making specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. The efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.

  3. Information entropy to measure the spatial and temporal complexity of solute transport in heterogeneous porous media

    NASA Astrophysics Data System (ADS)

    Li, Weiyao; Huang, Guanhua; Xiong, Yunwu

    2016-04-01

    The complexity of the spatial structure of porous media and the randomness of groundwater recharge and discharge (rainfall, runoff, etc.) make groundwater movement complex, and the physical and chemical interactions between groundwater and porous media make solute transport in the medium more complicated still. An appropriate method to describe this complexity is essential when studying solute transport and conversion in porous media. Information entropy can measure uncertainty and disorder; we therefore attempted to investigate the complexity of solute transport in heterogeneous porous media and to explore the connection between information entropy and that complexity using information entropy theory. Based on Markov theory, a two-dimensional stochastic field of hydraulic conductivity (K) was generated by transition probability. Flow and solute transport models were established under four conditions (instantaneous point source, continuous point source, instantaneous line source and continuous line source). The spatial and temporal complexity of the solute transport process was characterized and evaluated using spatial moments and information entropy. Results indicated that entropy increased with the complexity of the solute transport process. For the point source, the one-dimensional entropy of solute concentration increased at first and then decreased along the X and Y directions. As time increased, the entropy peak value remained basically unchanged, while the peak position migrated along the flow direction (X direction) and approximately coincided with the centroid position. With increasing time, the spatial variability and complexity of solute concentration increased, resulting in increases of the second-order spatial moment and the two-dimensional entropy. The information entropy of the line source was higher than that of the point source, and the solute entropy obtained from continuous input was higher than that from instantaneous input. As the average length of lithofacies increased, media continuity increased, the complexity of flow and solute transport weakened, and the corresponding information entropy also decreased. Longitudinal macrodispersivity declined slightly at early times and then rose. The spatial and temporal distribution of solute had significant impacts on the information entropy, and information entropy could reflect changes in the solute distribution. Information entropy thus appears to be a tool to characterize the spatial and temporal complexity of solute migration and provides a reference for future research.
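
    A hedged sketch of the core measure (grid, diffusivity, and plume below are hypothetical): treating a normalized concentration field as a probability distribution, its Shannon entropy grows as the plume spreads, matching the trend reported above.

    ```python
    # Shannon entropy (bits) of a normalized solute-concentration
    # field, evaluated for a spreading 1-D Gaussian plume.
    import numpy as np

    def concentration_entropy(C):
        p = np.asarray(C, float).ravel()
        p = p / p.sum()                   # treat the field as a pmf
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    x = np.linspace(-10, 10, 200)
    D = 0.5                               # hypothetical dispersivity
    for t in (1.0, 4.0, 16.0):            # instantaneous point source
        C = np.exp(-x**2 / (4 * D * t))
        print(f"t = {t:5.1f}   H = {concentration_entropy(C):.3f} bits")
    ```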

  4. Lost in the crowd? Using eye-tracking to investigate the effect of complexity on attribute non-attendance in discrete choice experiments.

    PubMed

    Spinks, Jean; Mortimer, Duncan

    2016-02-03

    The provision of additional information is often assumed to improve consumption decisions, allowing consumers to more accurately weigh the costs and benefits of alternatives. However, increasing the complexity of decision problems may prompt changes in information processing. This is particularly relevant for experimental methods such as discrete choice experiments (DCEs), where the researcher can manipulate the complexity of the decision problem. The primary aims of this study are (i) to test whether consumers actually process additional information in an already complex decision problem, and (ii) to consider the implications of any such 'complexity-driven' changes in information processing for the design and analysis of DCEs. A discrete choice experiment (DCE) is used to simulate a complex decision problem; here, the choice between complementary and conventional medicine for different health conditions. Eye-tracking technology is used to capture the number of times and the duration that a participant looks at any part of a computer screen during completion of DCE choice sets. From this we can analyse what has become known in the DCE literature as 'attribute non-attendance' (ANA). Using data from 32 participants, we model the likelihood of ANA as a function of choice set complexity and respondent characteristics using fixed and random effects models to account for repeated choice set completion. We also model whether participants are consistent with regard to which characteristics (attributes) they consider across choice sets. We find that complexity is the strongest predictor of ANA when other possible influences, such as time pressure, ordering effects, survey-specific effects and socio-demographic variables (including proxies for prior experience with the decision problem), are considered. We also find that most participants do not apply a consistent information processing strategy across choice sets. Eye-tracking technology shows promise as a way of obtaining additional information from consumer research, improving DCE design, and informing the design of policy measures. With regard to DCE design, results from the present study suggest that eye-tracking data can identify the point at which adding complexity (and realism) to DCE choice scenarios becomes self-defeating due to unacceptable increases in ANA. Eye-tracking data therefore has clear application in the construction of guidelines for DCE design and during piloting of DCE choice scenarios. With regard to the design of policy measures such as labelling requirements for CAM and conventional medicines, the provision of additional information has the potential to make difficult decisions even harder and may not have the desired effect on decision-making.

  5. Locating the source of diffusion in complex networks by time-reversal backward spreading.

    PubMed

    Shen, Zhesi; Cao, Shinan; Wang, Wen-Xu; Di, Zengru; Stanley, H Eugene

    2016-03-01

    Locating the source that triggers a dynamical process is a fundamental but challenging problem in complex networks, ranging from epidemic spreading in society and on the Internet to cancer metastasis in the human body. An accurate localization of the source is inherently limited by our ability to simultaneously access the information of all nodes in a large-scale complex network. This thus raises two critical questions: how do we locate the source from incomplete information and can we achieve full localization of sources at any possible location from a given set of observable nodes. Here we develop a time-reversal backward spreading algorithm to locate the source of a diffusion-like process efficiently and propose a general locatability condition. We test the algorithm by employing epidemic spreading and consensus dynamics as typical dynamical processes and apply it to the H1N1 pandemic in China. We find that the sources can be precisely located in arbitrary networks insofar as the locatability condition is assured. Our tools greatly improve our ability to locate the source of diffusion in complex networks based on limited accessibility of nodal information. Moreover, they have implications for controlling a variety of dynamical processes taking place on complex networks, such as inhibiting epidemics, slowing the spread of rumors, pollution control, and environmental protection.
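
    A toy illustration of the consistency principle behind such localization (not the authors' algorithm; the graph, observers, and timings are hypothetical, and networkx is assumed available): each candidate source is scored by how well its shortest-path distances to a few observers explain the observed arrival times, and the most consistent node is returned.

    ```python
    # Pick the node whose distances best explain observer arrival
    # times: minimize the variance of (arrival time - distance).
    import networkx as nx

    def locate_source(G, arrivals):
        best, best_score = None, float("inf")
        for s in G.nodes:
            d = nx.single_source_shortest_path_length(G, s)
            if any(o not in d for o in arrivals):   # unreachable observer
                continue
            offs = [t - d[o] for o, t in arrivals.items()]
            mean = sum(offs) / len(offs)
            score = sum((v - mean) ** 2 for v in offs)
            if score < best_score:
                best, best_score = s, score
        return best

    G = nx.erdos_renyi_graph(50, 0.1, seed=4)
    true_src = 7                     # pretend node 7 starts the spread
    d = nx.single_source_shortest_path_length(G, true_src)
    arrivals = {o: d[o] for o in (3, 12, 25, 40) if o in d}
    print("estimated source:", locate_source(G, arrivals))
    ```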

  6. Locating the source of diffusion in complex networks by time-reversal backward spreading

    NASA Astrophysics Data System (ADS)

    Shen, Zhesi; Cao, Shinan; Wang, Wen-Xu; Di, Zengru; Stanley, H. Eugene

    2016-03-01

    Locating the source that triggers a dynamical process is a fundamental but challenging problem in complex networks, ranging from epidemic spreading in society and on the Internet to cancer metastasis in the human body. Accurate localization of the source is inherently limited by our ability to simultaneously access the information of all nodes in a large-scale complex network. This raises two critical questions: how do we locate the source from incomplete information, and can we achieve full localization of sources at any possible location from a given set of observable nodes? Here we develop a time-reversal backward spreading algorithm to locate the source of a diffusion-like process efficiently and propose a general locatability condition. We test the algorithm by employing epidemic spreading and consensus dynamics as typical dynamical processes and apply it to the H1N1 pandemic in China. We find that the sources can be precisely located in arbitrary networks insofar as the locatability condition is assured. Our tools greatly improve our ability to locate the source of diffusion in complex networks based on limited accessibility of nodal information. Moreover, they have implications for controlling a variety of dynamical processes taking place on complex networks, such as inhibiting epidemics, slowing the spread of rumors, pollution control, and environmental protection.

  7. Understanding Teacher Collaboration Processes from a Complexity Theory Perspective: A Case Study of a Chinese Secondary School

    ERIC Educational Resources Information Center

    Yuan, Rui; Zhang, Jia; Yu, Shulin

    2018-01-01

    Although research on teacher collaboration has proliferated in the last few decades, scant attention has been paid to the development of teacher collaboration in school contexts. Informed by the perspective of complexity theory, this study investigates the complex process of teacher collaboration through qualitative interviews in an English…

  8. Granular computing with multiple granular layers for brain big data processing.

    PubMed

    Wang, Guoyin; Xu, Ji

    2014-12-01

    Big data is the term for collections of datasets so huge and complex that they are difficult to process with existing theoretical models and technical tools. Brain big data, one of the most typical and important examples, is collected using powerful equipment such as functional magnetic resonance imaging, multichannel electroencephalography, magnetoencephalography, positron emission tomography, near-infrared spectroscopic imaging, and various other devices. Granular computing with multiple granular layers, referred to as multi-granular computing (MGrC) for short hereafter, is an emerging computing paradigm of information processing that simulates the multi-granular intelligent thinking model of the human brain. It concerns the processing of complex information entities called information granules, which arise in the process of data abstraction and the derivation of information, and even knowledge, from data. This paper analyzes three basic mechanisms of MGrC, namely granularity optimization, granularity conversion, and multi-granularity joint computation, and discusses the potential of introducing MGrC into the intelligent processing of brain big data.

  9. Coding principles of the canonical cortical microcircuit in the avian brain

    PubMed Central

    Calabrese, Ana; Woolley, Sarah M. N.

    2015-01-01

    Mammalian neocortex is characterized by a layered architecture and a common or “canonical” microcircuit governing information flow among layers. This microcircuit is thought to underlie the computations required for complex behavior. Despite the absence of a six-layered cortex, birds are capable of complex cognition and behavior. In addition, the avian auditory pallium is composed of adjacent information-processing regions with genetically identified neuron types and projections among regions comparable with those found in the neocortex. Here, we show that the avian auditory pallium exhibits the same information-processing principles that define the canonical cortical microcircuit, long thought to have evolved only in mammals. These results suggest that the canonical cortical microcircuit evolved in a common ancestor of mammals and birds and provide a physiological explanation for the evolution of neural processes that give rise to complex behavior in the absence of cortical lamination. PMID:25691736

  10. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    PubMed

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of a kind with the archaeal systems and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that the evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  11. Speed Isn't Everything: Complex Processing Speed Measures Mask Individual Differences and Developmental Changes in Executive Control

    ERIC Educational Resources Information Center

    Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko

    2013-01-01

    The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of "processing speed" may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive…

  12. Identifying the Complexities within Clients' Thinking and Decision Making.

    ERIC Educational Resources Information Center

    Heppner, P. Paul

    1989-01-01

    Responds to Gelatt's conception of decision making in counseling. Concurs with need for a broader view of human reasoning that includes complex processes, both rational and intuitive. Advocates examination of how clients think, feel, and behave as they process information during counseling. (Author/TE)

  13. A complexity basis for phenomenology: How information states at criticality offer a new approach to understanding experience of self, being and time.

    PubMed

    Hankey, Alex

    2015-12-01

    In the late 19th century Husserl studied our internal sense of time passing, maintaining that its deep connections into experience represent prima facie evidence for it as the basis for all investigations in the sciences: Phenomenology was born. Merleau-Ponty focused on perception, pointing out that any theory of experience must accord with established aspects of biology, i.e. be embodied. Recent analyses suggest that theories of experience require non-reductive, integrative information, together with a specific property connecting them to experience. Here we elucidate a new class of information states with just such properties found at the loci of control of complex biological systems, including nervous systems. Complexity biology concerns states satisfying self-organized criticality. Such states are located at critical instabilities, commonly observed in biological systems, and thought to maximize information diversity and processing, and hence to optimize regulation. Major results for biology follow: why organisms have unusually low entropies, and why they are not merely mechanical. Criticality states form singular self-observing systems, which reduce wave packets by processes of perfect self-observation associated with feedback gain g = 1. Analysis of their information properties leads to identification of a new kind of information state with high levels of internal coherence, and feedback loops integrated into their structure. The major idea presented here is that the integrated feedback loops are responsible for our 'sense of self', and also for the feeling of continuity in our sense of time passing. Long-range internal correlations guarantee a unique kind of non-reductive, integrative information structure enabling such states to naturally support phenomenal experience. Being founded in complexity biology, they are 'embodied'; they also fulfill the statement that 'The self is a process', a singular process. High internal correlations and René Thom-style catastrophes support non-digital forms of information, gestalt cognition, and information transfer via quantum teleportation. Criticality in complexity biology can 'embody' cognitive states supporting gestalts, and phenomenology's senses of 'self,' time passing, existence and being. Copyright © 2015. Published by Elsevier Ltd.

  14. Process Mining-Based Method of Designing and Optimizing the Layouts of Emergency Departments in Hospitals.

    PubMed

    Rismanchian, Farhood; Lee, Young Hoon

    2017-07-01

    This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as for renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, the high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method reduced the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.

  15. Neural Markers of Responsiveness to the Environment in Human Sleep.

    PubMed

    Andrillon, Thomas; Poulsen, Andreas Trier; Hansen, Lars Kai; Léger, Damien; Kouider, Sid

    2016-06-15

    Sleep is characterized by a loss of behavioral responsiveness. However, recent research has shown that the sleeping brain is not completely disconnected from its environment. How neural activity constrains the ability to process sensory information while asleep remains unclear. Here, we instructed human volunteers to classify words with lateralized hand responses while falling asleep. Using an electroencephalographic (EEG) marker of motor preparation, we show how responsiveness is modulated across sleep. These modulations are tracked using classic event-related potential analyses complemented by Lempel-Ziv complexity (LZc), a measure shown to track arousal in sleep and anesthesia. Neural activity related to the semantic content of stimuli was conserved in light non-rapid eye movement (NREM) sleep. However, these processes were suppressed in deep NREM sleep and, importantly, also in REM sleep, despite the recovery of wake-like neural activity in the latter. In NREM sleep, sensory activations were counterbalanced by evoked down states, which, when present, blocked further processing of external information. In addition, responsiveness markers correlated positively with baseline complexity, which could be related to modulation in sleep depth. In REM sleep, however, this relationship was reversed. We therefore propose that, in REM sleep, endogenously generated processes compete with the processing of external input. Sleep can thus be seen as a self-regulated process in which external information can be processed in lighter stages but suppressed in deeper stages. Last, our results suggest drastically different gating mechanisms in NREM and REM sleep. Previous research has tempered the notion that sleepers are isolated from their environment. Here, we pushed this idea forward and examined, across all sleep stages, the brain's ability to flexibly process sensory information, up to the decision level. We extracted an EEG marker of motor preparation to determine the completion of the sensory processing chain and explored how it is constrained by baseline and evoked neural activity. In NREM sleep, slow waves elicited by stimuli appeared to block response preparation. We also used a novel analytic approach (Lempel-Ziv complexity) and showed that the ability to process external information correlates with neural complexity. A reversal of the correlation between complexity and motor indices in REM sleep suggests drastically different gating mechanisms across sleep stages. Copyright © 2016 the authors.
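
    As an aside on the method, Lempel-Ziv complexity is typically computed by binarizing a signal around its median and counting distinct phrases in a Lempel-Ziv parsing. The sketch below uses a simple LZ78-style phrase count on illustrative signals; it is a stand-in for, not a reproduction of, the estimator used in the study.

        import numpy as np

        def lz_complexity(s):
            """Number of distinct phrases in an LZ78-style parsing of a string."""
            phrases, current = set(), ""
            for ch in s:
                current += ch
                if current not in phrases:
                    phrases.add(current)
                    current = ""
            return len(phrases) + (1 if current else 0)

        def binarize(x):
            x = np.asarray(x, dtype=float)
            return "".join("1" if v > np.median(x) else "0" for v in x)

        rng = np.random.default_rng(0)
        noise = rng.standard_normal(1000)                 # irregular signal
        tone = np.sin(np.linspace(0, 40 * np.pi, 1000))   # regular signal
        print(lz_complexity(binarize(noise)), lz_complexity(binarize(tone)))
        # The irregular signal yields the larger complexity count.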

  16. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  17. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
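
    For orientation, the discrete-time analogues of these two quantities are standard in computational mechanics (the paper's contribution is their continuous-time, discrete-event extension). In the usual notation, given here as background rather than the paper's new formulas:

        h_\mu = \lim_{L \to \infty} \frac{H[X_0 X_1 \cdots X_{L-1}]}{L},
        \qquad
        C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma),

    where H[·] denotes Shannon entropy and \mathcal{S} is the set of causal states of the process's ε-machine.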

  18. Environmental Uncertainty and Communication Network Complexity: A Cross-System, Cross-Cultural Test.

    ERIC Educational Resources Information Center

    Danowski, James

    An infographic model is proposed to account for the operation of systems within their information environments. Infographics is a communication paradigm used to indicate the clustering of information processing variables in communication systems. Four propositions concerning environmental uncertainty and internal communication network complexity,…

  19. Measuring information processing in a client with extreme agitation following traumatic brain injury using the Perceive, Recall, Plan and Perform System of Task Analysis.

    PubMed

    Nott, Melissa T; Chapparo, Christine

    2008-09-01

    Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance was assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.

  20. Low-complexity video encoding method for wireless image transmission in capsule endoscope.

    PubMed

    Takizawa, Kenichi; Hamaguchi, Kiyoshi

    2010-01-01

    This paper presents a low-complexity video encoding method applicable to wireless image transmission in capsule endoscopes. The method is based on Wyner-Ziv theory, in which information that a conventional encoder would exploit at the transmitter is instead treated as side information at the receiver. Complex processes in video encoding, such as motion-vector estimation, are therefore moved to the receiver side, which has a larger-capacity battery. As a result, encoding reduces to decimating the channel-coded original data. We provide a performance evaluation for a low-density parity-check (LDPC) coding method in the AWGN channel.
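
    For background on the Wyner-Ziv framing (standard information theory rather than anything specific to this paper): with side information Y available only at the decoder, lossless compression of the source X is possible at any rate R satisfying the Slepian-Wolf bound, and the lossy case is governed by the Wyner-Ziv rate-distortion function:

        R \ge H(X \mid Y),
        \qquad
        R^{\mathrm{WZ}}_{X|Y}(D) = \min_{p(u|x),\; \hat{x} = f(u, y)} I(X; U \mid Y)
        \ \text{ subject to } \ \mathbb{E}\, d(X, \hat{X}) \le D.

    This is why the capsule-side encoder can remain simple: the burden of exploiting the correlation between successive frames is shifted to the receiver.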

  1. A trade-off between local and distributed information processing associated with remote episodic versus semantic memory.

    PubMed

    Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R

    2014-01-01

    Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provide a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.

  2. Gathering Information from Transport Systems for Processing in Supply Chains

    NASA Astrophysics Data System (ADS)

    Kodym, Oldřich; Unucka, Jakub

    2016-12-01

    The paper deals with a complex system for processing information from means of transport acting as parts of a train (rail or road). It focuses on automated information gathering using AutoID technology, information transmission via Internet of Things networks, and information usage in the information systems of logistics firms to support selected processes at the MES and ERP levels. Different kinds of information gathered from the whole transport chain are discussed, and compliance with existing standards is addressed. Security of information over its full life cycle is an integral part of the presented system. The design of a fully equipped system based on synthesized functional nodes is presented.

  3. What can one sample tell us? Stable isotopes can assess complex processes in national assessments of lakes, rivers and streams.

    EPA Science Inventory

    Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...

  4. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, $H(p) = -\sum_i p_i \log p_i$. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as $S_{\mathrm{EXT}}$ for extensive entropy, $S_{\mathrm{IT}}$ for the source information rate in information theory, and $S_{\mathrm{MEP}}$ for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
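
    Of the three examples, the sample-space-reducing process is the simplest to see concretely. A minimal sketch with illustrative parameters (not the paper's calculations): a walker starts at state N, repeatedly jumps uniformly to a strictly lower state, and restarts at N upon reaching state 1; the visit distribution approaches Zipf's law, p(i) ∝ 1/i.

        import numpy as np

        rng = np.random.default_rng(0)

        def ssr_visit_counts(N=100, restarts=20000):
            counts = np.zeros(N + 1)
            for _ in range(restarts):
                x = N
                while x > 1:
                    x = rng.integers(1, x)   # the sample space shrinks at every step
                    counts[x] += 1
            return counts[1:]                # visit counts for states 1..N

        c = ssr_visit_counts()
        print(c[:5] / c.sum())               # compare with the Zipf prediction p(i) ~ 1/i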

  5. STUDY OF TURBULENT ENERGY OVER COMPLEX TERRAIN: STATE, 1978

    EPA Science Inventory

    The complex structure of the earth's surface influenced atmospheric parameters pertinent to modeling the diffusion process during the 1978 'STATE' field study. The Information Theory approach of statistics proved useful for analyzing the complex structures observed in the radiome...

  6. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming at solving problems of difficult management and poor traceability for discrete assembly process quality data, a data collection and management method is proposed which take the assembly process and BOM as the core. Data collection method base on workflow technology, data model base on BOM and quality traceability of assembly process is included in the method. Finally, assembly process quality data management system is developed and effective control and management of quality information for complex product assembly process is realized.

  7. How Analysts Cognitively “Connect the Dots”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradel, Lauren; Self, Jessica S.; Endert, Alexander

    2013-06-04

    As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to “connect the dots” between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for the cognitive connections and physical structures used when trying to “connect the dots” between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data-mining algorithm and corresponding tools to combine the power of data mining and the complex cognitive processing of analysts.

  8. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method reduces the dimensionality of the informational indicators of situations, brings them to relative units, calculates generalized information indicators on their basis, ranks them by characteristic levels, and calculates an efficiency criterion for a system functioning in real time. A design for an information evaluation system has been developed on this basis that allows analyzing, processing and assessing information about an object, which can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.
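
    One plausible reading of the pipeline, as a minimal sketch (the article does not publish its formulas, so the ranges, weights and level thresholds below are hypothetical):

        import numpy as np

        def evaluate_situation(indicators, ranges, weights, level_thresholds):
            x = np.asarray(indicators, dtype=float)
            lo, hi = np.asarray(ranges, dtype=float).T
            rel = np.clip((x - lo) / (hi - lo), 0.0, 1.0)           # relative units
            generalized = float(np.average(rel, weights=weights))   # generalized indicator
            level = int(np.searchsorted(level_thresholds, generalized))  # characteristic level
            return generalized, level

        g, level = evaluate_situation(
            indicators=[3.2, 140.0, 0.7],
            ranges=[(0, 10), (0, 250), (0, 1)],   # nominal range of each indicator
            weights=[0.5, 0.3, 0.2],
            level_thresholds=[0.33, 0.66],        # boundaries between levels
        )
        print(g, level)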

  9. On the Concept of Information and Its Role in Nature

    NASA Astrophysics Data System (ADS)

    Roederer, Juan G.

    2003-03-01

    In this article we address some fundamental questions concerning information: Can the existing laws of physics adequately deal with the most striking property of information, namely to cause specific changes in the structure and energy flows of a complex system, without the information in itself representing fields, forces or energy in any of their characteristic forms? Or is information irreducible to the laws of physics and chemistry? Are information and complexity related concepts? Does the Universe, in its evolution, constantly generate new information? Or are information and information-processing exclusive attributes of living systems, related to the very definition of life? If that were the case, what happens with the physical meanings of entropy in statistical mechanics or wave function in quantum mechanics? How many distinct classes of information and information processing exist in the biological world? How does information appear in Darwinian evolution? Does the human brain have unique properties or capabilities in terms of information processing? In what ways does information processing bring about human self-consciousness? We shall introduce the meaning of "information" in a way that is detached from human technological systems and related algorithms and semantics, and that is not based on any mathematical formula. To accomplish this we turn to the concept of interaction as the basic point of departure, and identify two fundamentally different classes, with information and information-processing appearing as the key discriminator: force-field driven interactions between elementary particles and ensembles of particles in the macroscopic physical domain, and information-based interactions between certain kinds of complex systems that form the biological domain. We shall show that in an abiotic world, information plays no role; physical interactions just happen, they are driven by energy exchange between the interacting parts and do not require any operations of information processing. Information only enters the non-living physical world when a living thing interacts with it, and when a scientist extracts information through observation and measurement. But for living organisms, information is the very essence of their existence: to maintain a long-term state of unstable thermodynamic equilibrium with its surroundings, consistently increase its organization and reproduce, an organism has to rely on information-based interactions in which form or pattern, not energy, is the controlling factor. This latter class comprises biomolecular information processes controlling the metabolism, growth, multiplication and differentiation of cells, and neural information processes controlling animal behavior and intelligence. The only way new information can appear is through the process of biological evolution and, in the short term, through sensory acquisition and the manipulation of images in the nervous system. Non-living informational systems such as books, computers, AI systems and other artifacts, as well as living organisms that are the result of breeding or cloning, are planned by human beings and will not be considered here.

  10. Could a neuroscientist understand a microprocessor?

    DOE PAGES

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    2017-01-12

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  11. Could a Neuroscientist Understand a Microprocessor?

    PubMed Central

    Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods. PMID:28081141

  12. Could a Neuroscientist Understand a Microprocessor?

    PubMed

    Jonas, Eric; Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  13. Could a neuroscientist understand a microprocessor?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  14. A novel approach to characterize information radiation in complex networks

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoyang; Wang, Ying; Zhu, Lin; Li, Chao

    2016-06-01

    Traditional research on information dissemination is mostly based on virus spreading models, in which information spreads with some probability; this does not match reality very well, because the information that we receive is always more or less than what was sent. In order to quantitatively describe variations in the amount of information during the spreading process, this article proposes a safety information radiation model on the basis of communication theory, combined with relevant theories of complex networks. This model comprehensively considers the various influence factors when safety information radiates in the network, and introduces some concepts from the communication theory perspective, such as the radiation gain function, receiving gain function, information retaining capacity and information second reception capacity, to describe the safety information radiation process between nodes and dynamically investigate the states of network nodes. On a micro level, this article analyzes the influence of various initial conditions and parameters on safety information radiation through simulations of the new model. The simulations reveal that this novel approach can reflect the variation of safety information quantity of each node in the complex network, and that the scale-free network has better “radiation explosive power”, while the small-world network has better “radiation staying power”. The results also show that it is efficient to improve the overall performance of network security by selecting nodes with high degrees as the information source, refining and simplifying the information, increasing the information second reception capacity and decreasing noise. In a word, this article lays the foundation for further research on the interactions of information and energy between internal components within complex systems.

  15. Analyzing complex networks evolution through Information Theory quantifiers

    NASA Astrophysics Data System (ADS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed: the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease, and a climate network for the Tropical Pacific region to study El Niño/Southern Oscillation (ENSO) dynamics. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.
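
    A minimal sketch of the first quantifier with hypothetical distributions (scipy's jensenshannon already returns the square root of the divergence); the MPR Statistical Complexity additionally multiplies a normalized Jensen-Shannon disequilibrium against the uniform distribution by the normalized Shannon entropy, with its normalization constant omitted here:

        import numpy as np
        from scipy.spatial.distance import jensenshannon

        # Hypothetical degree distributions of a network at two evolutionary stages.
        p = np.array([0.10, 0.40, 0.30, 0.20])
        q = np.array([0.25, 0.25, 0.25, 0.25])

        print(jensenshannon(p, q, base=2))   # square root of the JS divergence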

  16. A Decision-Oriented Investigation of Air Force Civil Engineering’s Operations Branch and the Implications for a Decision Support System.

    DTIC Science & Technology

    1984-09-01

    …information when making a decision (Szilagyi and Wallace, 1983:320). Driver and Mock used cognitive complexity ideas to develop this two-dimensional… [Figure 6, Cognitive Complexity Model (Szilagyi and Wallace, 1983:321): amount of information used plotted against focus, distinguishing the decisive, flexible, hierarchic and integrative styles.] Decisive Style. The… large amount of information. However, he processes this information with a multiple-focus approach (Szilagyi and Wallace, 1983:320-321). McKenney…

  17. Representation control increases task efficiency in complex graphical representations.

    PubMed

    Moritz, Julia; Meyerhoff, Hauke S; Meyer-Dernbecher, Claudia; Schwan, Stephan

    2018-01-01

    In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such a representation control (i.e. the users' option to decide how information should be displayed) in order to accomplish an information extraction task, in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients.

  18. Representation control increases task efficiency in complex graphical representations

    PubMed Central

    Meyerhoff, Hauke S.; Meyer-Dernbecher, Claudia; Schwan, Stephan

    2018-01-01

    In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such a representation control (i.e. the users' option to decide how information should be displayed) in order to accomplish an information extraction task, in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients. PMID:29698443

  19. Air Traffic Complexity Measurement Environment (ACME): Software User's Guide

    NASA Technical Reports Server (NTRS)

    1996-01-01

    A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.

  20. Managing the Process of Protection Level Assessment of the Complex Organization and Technical Industrial Enterprises

    NASA Astrophysics Data System (ADS)

    Gorlov, A. P.; Averchenkov, V. I.; Rytov, M. Yu; Eryomenko, V. T.

    2017-01-01

    The article is concerned with mathematical simulation of the protection level assessment of complex organizational and technical systems of industrial enterprises through the creation of an automated system whose main functions are: information security (IS) auditing, forming a model of threats to the enterprise, and generating recommendations for creating an information protection system together with a set of organizational-administrative documentation.

  1. Exploring Use of Climate Information in Wildland Fire Management: A Decision Calendar Study

    Treesearch

    Thomas W. Corringham; Anthony L. Westerling; Barbara J. Morehouse

    2006-01-01

    Wildfire management is an institutionally complex process involving a complex budget and appropriations cycle, a variety of objectives, and a set of internal and external political constraints. Significant potential exists for enhancing the use of climate information and long-range climate forecasts in wildland fire management in the Western U.S. Written surveys and...

  2. Organization and post-transcriptional processing of the psb B operon from chloroplasts of Populus deltoides.

    PubMed

    Dixit, R; Trivedi, P K; Nath, P; Sane, P V

    1999-09-01

    Chloroplast genes are typically organized into polycistronic transcription units that give rise to complex sets of mono- and oligo-cistronic overlapping RNAs through a series of processing steps. The psbB operon contains genes for the PSII (psbB, psbT, psbH) and cytochrome b(6)f (petB and petD) complexes, which are needed in different amounts during chloroplast biogenesis. The functional significance of gene organization in this polycistronic unit, containing information for two different complexes, is not known and is of interest. To determine the organization and expression of these complexes, studies have been carried out on crop plants by different groups, but little information is available for trees. We present the nucleotide sequences of PSII genes and RNA profiles of the genes located in the psbB operon from Populus deltoides, a tree species. Although the gene organization of this operon in P. deltoides is similar to that in other species, a few variations have been observed in the processing scheme.

  3. Effects of trial complexity on decision making.

    PubMed

    Horowitz, I A; ForsterLee, L; Brolly, I

    1996-12-01

    The ability of a civil jury to render fair and rational decisions in complex trials has been questioned. However, the nature, dimensions, and effects of trial complexity on decision making have rarely been addressed. In this research, jury-eligible adults saw a videotape of a complex civil trial that varied in information load and in the complexity of the witnesses' language. Information load and complexity differentially affected liability and compensatory decisions. An increase in the number of plaintiffs decreased the blameworthiness assigned to the defendant despite contrary evidence and the amount of probative evidence processed. Complex language did not affect memory but did affect jurors' ability to appropriately compensate differentially worthy plaintiffs. Jurors assigned compensatory awards commensurate with the plaintiffs' injuries only under low-load and less complex language conditions.

  4. Different Types of Laughter Modulate Connectivity within Distinct Parts of the Laughter Perception Network

    PubMed Central

    Ethofer, Thomas; Brück, Carolin; Alter, Kai; Grodd, Wolfgang; Kreifelts, Benjamin

    2013-01-01

    Laughter is an ancient signal of social communication among humans and non-human primates. Laughter types with complex social functions (e.g., taunt and joy) presumably evolved from the unequivocal and reflex-like social bonding signal of tickling laughter already present in non-human primates. Here, we investigated the modulations of cerebral connectivity associated with different laughter types as well as the effects of attention shifts between implicit and explicit processing of social information conveyed by laughter using functional magnetic resonance imaging (fMRI). Complex social laughter types and tickling laughter were found to modulate connectivity in two distinguishable but partially overlapping parts of the laughter perception network irrespective of task instructions. Connectivity changes, presumably related to the higher acoustic complexity of tickling laughter, occurred between areas in the prefrontal cortex and the auditory association cortex, potentially reflecting higher demands on acoustic analysis associated with increased information load on auditory attention, working memory, evaluation and response selection processes. In contrast, the higher degree of socio-relational information in complex social laughter types was linked to increases of connectivity between auditory association cortices, the right dorsolateral prefrontal cortex and brain areas associated with mentalizing as well as areas in the visual associative cortex. These modulations might reflect automatic analysis of acoustic features, attention direction to informative aspects of the laughter signal and the retention of those in working memory during evaluation processes. These processes may be associated with visual imagery supporting the formation of inferences on the intentions of our social counterparts. Here, the right dorsolateral precentral cortex appears as a network node potentially linking the functions of auditory and visual associative sensory cortices with those of the mentalizing-associated anterior mediofrontal cortex during the decoding of social information in laughter. PMID:23667619

  5. Different types of laughter modulate connectivity within distinct parts of the laughter perception network.

    PubMed

    Wildgruber, Dirk; Szameitat, Diana P; Ethofer, Thomas; Brück, Carolin; Alter, Kai; Grodd, Wolfgang; Kreifelts, Benjamin

    2013-01-01

    Laughter is an ancient signal of social communication among humans and non-human primates. Laughter types with complex social functions (e.g., taunt and joy) presumably evolved from the unequivocal and reflex-like social bonding signal of tickling laughter already present in non-human primates. Here, we investigated the modulations of cerebral connectivity associated with different laughter types as well as the effects of attention shifts between implicit and explicit processing of social information conveyed by laughter using functional magnetic resonance imaging (fMRI). Complex social laughter types and tickling laughter were found to modulate connectivity in two distinguishable but partially overlapping parts of the laughter perception network irrespective of task instructions. Connectivity changes, presumably related to the higher acoustic complexity of tickling laughter, occurred between areas in the prefrontal cortex and the auditory association cortex, potentially reflecting higher demands on acoustic analysis associated with increased information load on auditory attention, working memory, evaluation and response selection processes. In contrast, the higher degree of socio-relational information in complex social laughter types was linked to increases of connectivity between auditory association cortices, the right dorsolateral prefrontal cortex and brain areas associated with mentalizing as well as areas in the visual associative cortex. These modulations might reflect automatic analysis of acoustic features, attention direction to informative aspects of the laughter signal and the retention of those in working memory during evaluation processes. These processes may be associated with visual imagery supporting the formation of inferences on the intentions of our social counterparts. Here, the right dorsolateral precentral cortex appears as a network node potentially linking the functions of auditory and visual associative sensory cortices with those of the mentalizing-associated anterior mediofrontal cortex during the decoding of social information in laughter.

  6. On the definition of the concepts thinking, consciousness, and conscience.

    PubMed Central

    Monin, A S

    1992-01-01

    A complex system (CS) is defined as a set of elements, with connections between them, singled out of the environment, capable of getting information from the environment, capable of making decisions (i.e., of choosing between alternatives), and having purposefulness (i.e., an urge towards preferable states or other goals). Thinking is a process that takes place (or which can take place) in some of the CS and consists of (i) receiving information from the environment (and from itself), (ii) memorizing the information, (iii) the subconscious, and (iv) consciousness. Life is a process that takes place in some CS and consists of functions i and ii, as well as (v) reproduction with passing of hereditary information to progeny, and (vi) oriented energy and matter exchange with the environment sufficient for the maintenance of all life processes. Memory is a complex of processes of placing information in memory banks, keeping it there, and producing it according to prescriptions available in the system or to inquiries arising in it. Consciousness is a process of realization by the thinking CS of some set of algorithms consisting of the comparison of its knowledge, intentions, decisions, and actions with reality--i.e., with accumulated and continuously received internal and external information. Conscience is a realization of an algorithm of good and evil pattern recognition. PMID:1631060

  7. Stalking the IQ Quark.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    1979-01-01

    An information-processing framework is presented for understanding intelligence. Two levels of processing are discussed: the steps involved in solving a complex intellectual task, and higher-order processes used to decide how to solve the problem. (MH)

  8. The Public Life of Information

    ERIC Educational Resources Information Center

    Rowe, Josh

    2011-01-01

    The mid-twentieth century marked a shift in Americans' fundamental orientation toward information. Rather than news or knowledge, information became a disembodied quantum--strings of ones and zeros processed, increasingly, by complex machines. This dissertation examines how Americans became acquainted with "information", as newly conceived by…

  9. Perceptual conflict during sensorimotor integration processes - a neurophysiological study in response inhibition.

    PubMed

    Chmielewski, Witold X; Beste, Christian

    2016-05-25

    A multitude of sensory inputs needs to be processed during sensorimotor integration. A crucial factor for detecting relevant information is its complexity, since information content can be conflicting at a perceptual level. This may be central to executive control processes, such as response inhibition. This EEG study aims to investigate the system-neurophysiological mechanisms behind effects of perceptual conflict on response inhibition. We systematically modulated perceptual conflict by integrating a Global-local task with a Go/Nogo paradigm. The results show that conflicting perceptual information, in comparison to non-conflicting perceptual information, impairs response inhibition performance. This effect was evident regardless of whether the information relevant for response inhibition was displayed on the global or the local perceptual level. The neurophysiological data suggest that early perceptual/attentional processing stages do not underlie these modulations. Rather, processes at the response selection level (P3) play a role in the changed response inhibition performance. This conflict-related impairment of inhibitory processes is associated with activation differences in (inferior) parietal areas (BA7 and BA40) and not, as commonly found, in medial prefrontal areas. This suggests that various functional neuroanatomical structures may mediate response inhibition and that the structures involved depend on the complexity of sensory integration processes.

  10. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.
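
    For a sense of the mechanics, here is a minimal sketch of knowledge-based alarm prioritization in the spirit of AFS. All tags, causal links, thresholds, and weights below are invented for illustration; the patented system encodes plant-specific heuristics in objects and rules.

```python
# Sketch of rule-based alarm prioritization (illustrative, not AFS itself).
from dataclasses import dataclass

@dataclass
class Alarm:
    tag: str          # instrument identifier (hypothetical)
    value: float      # current reading
    setpoint: float   # nominal value

def priority(alarm, active_tags):
    """Heuristic priority: relative deviation, damped when a known
    upstream 'cause' alarm is already active (alarm-flood filtering)."""
    deviation = abs(alarm.value - alarm.setpoint) / max(abs(alarm.setpoint), 1e-9)
    upstream = {"FLOW-101": {"PRES-102", "TEMP-103"}}  # hypothetical causal links
    for cause, consequences in upstream.items():
        if cause in active_tags and alarm.tag in consequences:
            deviation *= 0.2   # suppress consequences of an announced root cause
    return deviation

alarms = [Alarm("FLOW-101", 40.0, 100.0),
          Alarm("PRES-102", 9.0, 10.0),
          Alarm("TEMP-103", 305.0, 300.0)]
active = {a.tag for a in alarms}
for a in sorted(alarms, key=lambda a: -priority(a, active)):
    print(f"{a.tag}: priority {priority(a, active):.2f}")
```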

  11. Adaptive Correction from Virtually Complex Dynamic Libraries: The Role of Noncovalent Interactions in Structural Selection and Folding.

    PubMed

    Lafuente, Maria; Atcher, Joan; Solà, Jordi; Alfonso, Ignacio

    2015-11-16

    The hierarchical self-assembly of complex molecular systems is dictated by the chemical and structural information stored in their components. This information can be expressed through an adaptive process that determines the structurally fittest assembly under given environmental conditions. We have set up complex disulfide-based dynamic covalent libraries of chemically and topologically diverse pseudopeptidic compounds. We show how the reaction evolves from very complex mixtures at short reaction times to the almost exclusive formation of a major compound, through the establishment of intramolecular noncovalent interactions. Our experiments demonstrate that the systems evolve through error-check and error-correction processes. The nature of these interactions, the importance of folding, and the effects of the environment are also discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Eye Movement Analysis of Information Processing under Different Testing Conditions.

    ERIC Educational Resources Information Center

    Dillon, Ronna F.

    1985-01-01

    Undergraduates were given complex figural analogies items, and eye movements were observed under three types of feedback: (1) elaborate feedback; (2) subjects verbalized their thinking and application of rules; and (3) no feedback. Both feedback conditions enhanced the rule-governed information processing during inductive reasoning. (Author/GDC)

  13. Information-Processing Modules and Their Relative Modality Specificity

    ERIC Educational Resources Information Center

    Anderson, John R.; Qin, Yulin; Jung, Kwan-Jin; Carter, Cameron S.

    2007-01-01

    This research uses fMRI to understand the role of eight cortical regions in a relatively complex information-processing task. Modality of input (visual versus auditory) and modality of output (manual versus vocal) are manipulated. Two perceptual regions (auditory cortex and fusiform gyrus) only reflected perceptual encoding. Two motor regions were…

  14. How Students Learn: Information Processing, Intellectual Development and Confrontation

    ERIC Educational Resources Information Center

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…

  15. Understanding Self-Assessment as an Informed Process: Residents' Use of External Information for Self-Assessment of Performance in Simulated Resuscitations

    ERIC Educational Resources Information Center

    Plant, Jennifer L.; Corden, Mark; Mourad, Michelle; O'Brien, Bridget C.; van Schaik, Sandrijn M.

    2013-01-01

    Self-directed learning requires self-assessment of learning needs and performance, a complex process that requires collecting and interpreting data from various sources. Learners' approaches to self-assessment likely vary depending on the learner and the context. The aim of this study was to gain insight into how learners process external…

  16. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues that health professionals and consumers encounter while using systems, as well as on potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into the redesign and optimization of improved healthcare information systems.

  17. Process mining is an underutilized clinical research tool in transfusion medicine.

    PubMed

    Quinn, Jason G; Conrad, David M; Cheng, Calvino K

    2017-03-01

    To understand inventory performance, transfusion services commonly use key performance indicators (KPIs) as summary descriptors of inventory efficiency that are graphed, trended, and used to benchmark institutions. Here, we summarize current limitations in KPI-based evaluation of blood bank inventory efficiency and propose process mining as an ideal methodology for application to inventory management research to improve inventory flows and performance. The transit of a blood product from inventory receipt to final disposition is complex and relates to many internal and external influences, and KPIs may be inadequate to fully understand the complexity of the blood supply chain and how units interact with its processes. Process mining lends itself well to analysis of blood bank inventories, and modern laboratory information systems can track nearly all of the complex processes that occur in the blood bank. Process mining is an analytical tool already used in other industries and can be applied to blood bank inventory management and research through laboratory information systems data using commercial applications. Although the current understanding of real blood bank inventories is value-centric through KPIs, it potentially can be understood from a process-centric lens using process mining. © 2017 AABB.
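
    The core counting step of most process-mining algorithms is simple to illustrate. The sketch below builds a directly-follows graph from a hypothetical blood-unit event log; real studies would apply dedicated mining tools to laboratory information system exports, and the activities and unit IDs here are made up.

```python
# Sketch: directly-follows graph from a hypothetical blood-bank event log
# (case = blood unit). Only the core counting step of process discovery.
from collections import Counter, defaultdict

# (unit_id, activity) events, already time-ordered per unit -- hypothetical data
log = [
    ("U1", "receipt"), ("U1", "crossmatch"), ("U1", "issue"), ("U1", "transfuse"),
    ("U2", "receipt"), ("U2", "crossmatch"), ("U2", "return"), ("U2", "issue"),
    ("U3", "receipt"), ("U3", "expire"),
]

traces = defaultdict(list)
for unit, activity in log:
    traces[unit].append(activity)

dfg = Counter()                      # counts of "activity a directly followed by b"
for activities in traces.values():
    for a, b in zip(activities, activities[1:]):
        dfg[(a, b)] += 1

for (a, b), n in dfg.most_common():
    print(f"{a} -> {b}: {n}")
```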

  18. Automated information and control complex of hydro-gas endogenous mine processes

    NASA Astrophysics Data System (ADS)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex is considered that is designed to prevent accidents related to the aerological situation in underground workings; it supports the accounting of individual devices issued and returned, the transmission and display of measurement data, and the formation of preemptive control actions. Examples are given for the automated workstation of an air-gas control operator using individual devices. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained, and they confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas-control points. An adaptive (multivariant) algorithm has been developed for processing the measurement information of continuous multidimensional quantities and influencing factors.

  19. Improving protein complex classification accuracy using amino acid composition profile.

    PubMed

    Huang, Chien-Hung; Chou, Szu-Yu; Ng, Ka-Lok

    2013-09-01

    Protein complex prediction approaches are based on the assumptions that complexes have dense protein-protein interactions and high functional similarity between their subunits. We investigated those assumptions by studying the subunits' interaction topology, sequence similarity and molecular function for human and yeast protein complexes. Inclusion of amino acids' physicochemical properties can provide a better understanding of protein complex properties. Principal component analysis is carried out to determine the major features. Adopting amino acid composition profile information with the SVM classifier serves as an effective post-processing step for complex classification. The improvement is based on primary sequence information only, which is easy to obtain. Copyright © 2013 Elsevier Ltd. All rights reserved.
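
    A minimal sketch of that post-processing step, using scikit-learn, might look as follows. The sequences and labels are invented stand-ins; the study's actual feature set and training data are richer.

```python
# Sketch: amino acid composition profile + PCA + SVM, loosely following the
# paper's post-processing idea. Sequences and labels below are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """20-dimensional amino acid composition profile (fractions)."""
    seq = seq.upper()
    return np.array([seq.count(a) for a in AA], dtype=float) / max(len(seq), 1)

# Hypothetical concatenated subunit sequences: 1 = true complex, 0 = not
seqs = ["MKTLLVAADE" * 8, "GGGSGGGSGG" * 8, "MKVLHADEIK" * 8, "PPPGPPGPPG" * 8]
labels = [1, 0, 1, 0]

X = np.vstack([composition(s) for s in seqs])
model = make_pipeline(PCA(n_components=2), SVC(kernel="rbf")).fit(X, labels)
print(model.predict(composition("MKTLAVHDEI" * 8).reshape(1, -1)))
```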

  20. The Evolution of Biological Complexity in Digital Organisms

    NASA Astrophysics Data System (ADS)

    Ofria, Charles

    2013-03-01

    When Darwin first proposed his theory of evolution by natural selection, he realized that it had a problem explaining the origins of traits of ``extreme perfection and complication'' such as the vertebrate eye. Critics of Darwin's theory have latched onto this perceived flaw as a proof that Darwinian evolution is impossible. In anticipation of this issue, Darwin described the perfect data needed to understand this process, but lamented that such data are ``scarcely ever possible'' to obtain. In this talk, I will discuss research where we use populations of digital organisms (self-replicating and evolving computer programs) to elucidate the genetic and evolutionary processes by which new, highly complex traits arise, drawing inspiration directly from Darwin's wistful thinking and hypotheses. During the process of evolution in these fully transparent computational environments we can measure the incorporation of new information into the genome, a process akin to a natural Maxwell's demon, and identify the original source of any such information. We show that, as Darwin predicted, much of the information used to encode a complex trait was already in the genome as part of simpler evolved traits, and that many routes must be possible for a new complex trait to have a high probability of successfully evolving. In even more extreme examples of the evolution of complexity, we are now using these same principles to examine the evolutionary dynamics that drive major transitions in evolution, that is, transitions to higher levels of organization, which are some of the most complex evolutionary events to occur in nature. Finally, I will explore some of the implications of this research for other aspects of evolutionary biology, as well as ways that these evolutionary principles can be applied toward solving computational and engineering problems.

  1. Informed consent process: A step further towards making it meaningful!

    PubMed Central

    Kadam, Rashmi Ashish

    2017-01-01

    Informed consent process is the cornerstone of ethics in clinical research. Obtaining informed consent from patients participating in clinical research is an important legal and ethical imperative for clinical trial researchers. Although informed consent is an important process in clinical research, its effectiveness and validity are always a concern. Issues related to understanding, comprehension, competence, and voluntariness of clinical trial participants may adversely affect the informed consent process. Communication of highly technical, complex, and specialized clinical trial information to participants with limited literacy, diverse sociocultural background, diminished autonomy, and debilitating diseases is a difficult task for clinical researchers. It is therefore essential to investigate and adopt innovative communication strategies to enhance understanding of clinical trial information among participants. This review article visits the challenges that affect the informed consent process and explores various innovative strategies to enhance the consent process. PMID:28828304

  2. Software-hardware complex for the input of telemetric information obtained from rocket studies of the radiation of the earth's upper atmosphere

    NASA Astrophysics Data System (ADS)

    Bazdrov, I. I.; Bortkevich, V. S.; Khokhlov, V. N.

    2004-10-01

    This paper describes a software-hardware complex for inputting into a personal computer the telemetric information obtained by the telemetry stations TRAL KR28, RTS-8, and TRAL K2N. Structural and functional diagrams are given of the input device and the hardware complex. Results characterizing the input process, together with selected optical measurements of atmospheric radiation, are presented.

  3. Communication in diagnostic radiology: meeting the challenges of complexity.

    PubMed

    Larson, David B; Froehle, Craig M; Johnson, Neil D; Towbin, Alexander J

    2014-11-01

    As patients and information flow through the imaging process, value is added step-by-step when information is acquired, interpreted, and communicated back to the referring clinician. However, radiology information systems are often plagued with communication errors and delays. This article presents theories and recommends strategies to continuously improve communication in the complex environment of modern radiology. Communication theories, methods, and systems that have proven their effectiveness in other environments can serve as models for radiology.

  4. Markov and non-Markov processes in complex systems by the dynamical information entropy

    NASA Astrophysics Data System (ADS)

    Yulmetyev, R. M.; Gafarov, F. M.

    1999-12-01

    We consider the Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of two mutually dependent channels of entropy alternation--correlation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation)--are discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium, psychology (short-term numeral and pattern human memory, and the effect of stress on a dynamical tapping test), the random dynamics of RR intervals in the human ECG (the problem of diagnosing various diseases of the human cardiovascular system), and the chaotic dynamics of the parameters of financial markets and ecological systems.
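
    As a much-simplified stand-in for the DISE analysis, the sketch below computes windowed Shannon entropy of a synthetic RR-interval series; the paper's formalism tracks the creation and annihilation of correlations in more detail, and all data here are simulated.

```python
# Sketch: windowed Shannon entropy of a time series (synthetic RR intervals).
import numpy as np

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(2000)      # synthetic RR intervals, seconds

def shannon_entropy(x, bins=16):
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))               # bits

window = 200
H = [shannon_entropy(rr[i:i + window]) for i in range(0, len(rr) - window, window)]
print([f"{h:.2f}" for h in H])                   # entropy trajectory over time
```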

  5. Not Merely Experiential: Unconscious Thought Can Be Rational

    PubMed Central

    Garrison, Katie E.; Handley, Ian M.

    2017-01-01

    Individuals often form more reasonable judgments from complex information after a period of distraction vs. deliberation. This phenomenon has been attributed to sophisticated unconscious thought during the distraction period that integrates and organizes the information (Unconscious Thought Theory; Dijksterhuis and Nordgren, 2006). Yet, other research suggests that experiential processes are strengthened during the distraction (relative to deliberation) period, accounting for the judgment and decision benefit. We tested between these possibilities, hypothesizing that unconscious thought is distinct from experiential processes, and independently contributes to judgments and decisions during a distraction period. Using an established paradigm, Experiment 1 (N = 319) randomly induced participants into an experiential or rational mindset, after which participants received complex information describing three roommates to then consider consciously (i.e., deliberation) or unconsciously (i.e., distraction). Results revealed superior roommate judgments (but not choices) following distraction vs. deliberation, consistent with Unconscious Thought Theory. Mindset did not have an influence on roommate judgments. However, planned tests revealed a significant advantage of distraction only within the rational-mindset condition, which is contrary to the idea that experiential processing alone facilitates complex decision-making during periods of distraction. In a second experiment (N = 136), we tested whether effects of unconscious thought manifest for a complex analytical reasoning task for which experiential processing would offer no advantage. As predicted, participants in an unconscious thought condition outperformed participants in a control condition, suggesting that unconscious thought can be analytical. In sum, the current results support the existence of unconscious thinking processes that are distinct from experiential processes, and can be rational. Thus, the experiential vs. rational nature of a process might not cleanly delineate conscious and unconscious thought. PMID:28729844

  6. Task Complexity, Epistemological Beliefs and Metacognitive Calibration: An Exploratory Study

    ERIC Educational Resources Information Center

    Stahl, Elmar; Pieschl, Stephanie; Bromme, Rainer

    2006-01-01

    This article presents an explorative study, which is part of a comprehensive project to examine the impact of epistemological beliefs on metacognitive calibration during learning processes within a complex hypermedia information system. More specifically, this study investigates: 1) if learners differentiate between tasks of different complexity,…

  7. Curcumin complexation with cyclodextrins by the autoclave process: Method development and characterization of complex formation.

    PubMed

    Hagbani, Turki Al; Nazzal, Sami

    2017-03-30

    One approach to enhance curcumin (CUR) aqueous solubility is to use cyclodextrins (CDs) to form inclusion complexes where CUR is encapsulated as a guest molecule within the internal cavity of the water-soluble CD. Several methods have been reported for the complexation of CUR with CDs. Limited information, however, is available on the use of the autoclave process (AU) in complex formation. The aims of this work were therefore to (1) investigate and evaluate the AU cycle as a complex formation method to enhance CUR solubility; (2) compare the efficacy of the AU process with the freeze-drying (FD) and evaporation (EV) processes in complex formation; and (3) confirm CUR stability by characterizing CUR:CD complexes by NMR, Raman spectroscopy, DSC, and XRD. Significant differences were found in the saturation solubility of CUR from its complexes with CD when prepared by the three complexation methods. The AU yielded a complex with the expected chemical and physical fingerprints for a CUR:CD inclusion complex that maintained the chemical integrity and stability of CUR and provided the highest solubility of CUR in water. Physical and chemical characterizations of the AU complexes confirmed the encapsulation of CUR inside the CD cavity and the transformation of the crystalline CUR:CD inclusion complex to an amorphous form. It was concluded that the autoclave process, with its short processing time, could be used as an alternate and efficient method for drug:CD complexation. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. A system model of the processing of heterogeneous sensory information in a robotized complex

    NASA Astrophysics Data System (ADS)

    Nikolaev, V.; Titov, V.; Syryamkin, V.

    2018-05-01

    The scope and types of robotic systems containing subsystems of the form "heterogeneous sensor data processing subsystem" are analyzed. On the basis of queuing theory, a model is developed that takes into account the unevenness of the intensity of the information flow from the sensors to the information processing subsystem. An analytical solution is given to assess the relationship between subsystem performance and uneven flows. The obtained solution is investigated over the range of parameter values of practical interest.
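
    The cost of uneven flow intensity can be illustrated with the simplest queueing model. The sketch below is an M/M/1 back-of-envelope calculation, not the paper's model: for the same mean arrival rate, bursty traffic yields a longer average delay because the delay curve is convex (Jensen's inequality). All rates are arbitrary.

```python
# Sketch: M/M/1 mean sojourn time W = 1/(mu - lambda); a bursty mix of
# arrival rates with the same mean rate produces a larger average delay.
mu = 100.0                      # service rate, messages/s (illustrative)

def sojourn(lam, mu=mu):
    assert lam < mu, "queue must be stable"
    return 1.0 / (mu - lam)     # mean time in system for an M/M/1 queue

steady = sojourn(60.0)
bursty = 0.5 * sojourn(30.0) + 0.5 * sojourn(90.0)   # same mean rate: 60/s
print(f"steady lambda=60:  W = {steady * 1e3:.1f} ms")
print(f"bursty 30/90 mix:  W = {bursty * 1e3:.1f} ms")
```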

  9. Dynamic control and information processing in chemical reaction systems by tuning self-organization behavior

    NASA Astrophysics Data System (ADS)

    Lebiedz, Dirk; Brandt-Pollmann, Ulrich

    2004-09-01

    Specific external control of chemical reaction systems and both dynamic control and signal processing as central functions in biochemical reaction systems are important issues of modern nonlinear science. For example, nonlinear input-output behavior and its regulation are crucial for the maintenance of the life process, which requires extensive communication between cells and their environment. An important question is how the dynamical behavior of biochemical systems is controlled and how they process information transmitted by incoming signals. From a general point of view, external forcing of complex chemical reaction processes is also important in many application areas ranging from chemical engineering to biomedicine. In order to study such control issues numerically, here, we choose a well characterized chemical system, the CO oxidation on Pt(110), which is interesting per se as an externally forced chemical oscillator model. We show numerically that tuning of temporal self-organization by input signals in this simple nonlinear chemical reaction exhibiting oscillatory behavior can in principle be exploited for both specific external control of dynamical system behavior and processing of complex information.
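
    The flavor of such control can be shown on a generic chemical oscillator. The sketch below forces a Brusselator model, not the paper's Pt(110) CO-oxidation model, switching oscillations on and off with a step input; parameters and integration scheme are chosen only for illustration.

```python
# Sketch: external forcing of a generic chemical oscillator (Brusselator).
import numpy as np

a = 1.0  # Brusselator parameter; oscillations require b > 1 + a**2 = 2

def rhs(state, b):
    x, y = state
    return np.array([a - (b + 1.0) * x + x * x * y, b * x - x * x * y])

dt = 0.01
t = np.arange(0.0, 60.0, dt)
b_input = np.where(t < 30.0, 2.5, 1.2)   # step input: oscillatory, then quiescent

state = np.array([1.0, 1.0])
xs = np.empty(t.size)
for i, b in enumerate(b_input):          # simple forward-Euler integration
    state = state + dt * rhs(state, b)
    xs[i] = state[0]

n = t.size
print(f"x range while forced (t in [20,30)): {np.ptp(xs[n//3:n//2]):.2f}")
print(f"x range after switch (t in [50,60)): {np.ptp(xs[5*n//6:]):.2f}")
```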

  10. Securing Information with Complex Optical Encryption Networks

    DTIC Science & Technology

    2015-08-11

    Keywords: network security, network vulnerability, multi-dimensional processing, optoelectronic devices. ...optoelectronic devices and systems should be analyzed before the retrieval; any hostile hacker will need to possess multi-disciplinary scientific... sophisticated optoelectronic principles and systems where he/she needs to process the information. However, in military applications, most military...

  11. Changes in Information Processing with Aging: Implications for Teaching Motor Skills.

    ERIC Educational Resources Information Center

    Anshel, Mark H.

    Although there are marked individual differences in the effect of aging on learning and performing motor skills, there is agreement that humans process information less efficiently with advanced age. Significant decrements have been found specifically with motor tasks that are characterized as externally-paced, rapid, complex, and requiring rapid…

  12. Electronic construction collaboration system -- final phase : [tech transfer summary].

    DOT National Transportation Integrated Search

    2014-07-01

    Construction projects have been growing more complex in terms of project team composition, design aspects, and construction processes. To help manage the shop/working drawings and requests for information (RFIs) for its large, complex projects,...

  13. High rate information systems - Architectural trends in support of the interdisciplinary investigator

    NASA Technical Reports Server (NTRS)

    Handley, Thomas H., Jr.; Preheim, Larry E.

    1990-01-01

    Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity, and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived to be a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory and collaboration services.

  14. Modern Techniques in Acoustical Signal and Image Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J V

    2002-04-04

    Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the required processing even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same: to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme that achieves this goal. In this paper, we briefly discuss this underlying philosophy from a ''bottom-up'' approach, enabling the problem to dictate the solution rather than vice versa.
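
    The simple end of that spectrum, transforming data to another domain, is easy to illustrate: a tone that is invisible in noisy time-domain samples stands out in the frequency domain. All signal parameters below are arbitrary.

```python
# Sketch: a tone buried in noise is obvious after a transform to the
# frequency domain (illustrative parameters only).
import numpy as np

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
x = 0.5 * np.sin(2 * np.pi * 73.0 * t) + rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak = freqs[1:][np.argmax(spectrum[1:])]     # skip the DC bin
print(f"time-domain std: {x.std():.2f}; spectral peak at {peak:.0f} Hz")
```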

  15. Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel.

    PubMed

    Mahoney, John R; Aghamohammadi, Cina; Crutchfield, James P

    2016-02-15

    A stochastic process' statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process' cryptic order--a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage, and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.

  16. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
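
    The firing rule that underlies Petri nets (and, with richer tokens, XML nets) fits in a few lines. The toy approval net below is illustrative only and is not one of the chapter's models.

```python
# Minimal Petri net sketch: places, transitions, marking, and the firing rule.
pre  = {"approve": {"submitted": 1}, "archive": {"approved": 1}}
post = {"approve": {"approved": 1},  "archive": {"done": 1}}
marking = {"submitted": 1, "approved": 0, "done": 0}

def enabled(t, m):
    """A transition is enabled if every input place holds enough tokens."""
    return all(m.get(p, 0) >= n for p, n in pre[t].items())

def fire(t, m):
    """Firing consumes input tokens and produces output tokens."""
    assert enabled(t, m), f"{t} not enabled"
    for p, n in pre[t].items():
        m[p] -= n
    for p, n in post[t].items():
        m[p] = m.get(p, 0) + n

fire("approve", marking)
fire("archive", marking)
print(marking)   # {'submitted': 0, 'approved': 0, 'done': 1}
```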

  17. Using a biased qubit to probe complex systems

    NASA Astrophysics Data System (ADS)

    Pollock, Felix A.; Checińska, Agata; Pascazio, Saverio; Modi, Kavan

    2016-09-01

    Complex mesoscopic systems play increasingly important roles in modern science, from understanding biological functions at the molecular level to designing solid-state information processing devices. The operation of these systems typically depends on their energetic structure, yet probing their energy landscape can be extremely challenging; they have many degrees of freedom, which may be hard to isolate and measure independently. Here, we show that a qubit (a two-level quantum system) with a biased energy splitting can directly probe the spectral properties of a complex system, without knowledge of how they couple. Our work is based on the completely positive and trace-preserving map formalism, which treats any unknown dynamics as a "black-box" process. This black box contains information about the system with which the probe interacts, which we access by measuring the survival probability of the initial state of the probe as function of the energy splitting and the process time. Fourier transforming the results yields the energy spectrum of the complex system. Without making assumptions about the strength or form of its coupling, our probe could determine aspects of a complex molecule's energy landscape as well as, in many cases, test for coherent superposition of its energy eigenstates.
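
    A stripped-down illustration of the idea: the survival probability of a state evolving under an unknown Hamiltonian oscillates at that Hamiltonian's energy gaps, so a Fourier transform of the survival signal exposes spectral structure. The sketch below omits the probe-system coupling and the map formalism of the paper, and uses a random Hamiltonian as a stand-in for the complex system.

```python
# Sketch: survival probability of a state under a random "black-box"
# Hamiltonian; its Fourier transform peaks at energy gaps of H.
import numpy as np

rng = np.random.default_rng(2)
d = 4                                         # dimension of the unknown system
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2                      # random Hermitian Hamiltonian

E, V = np.linalg.eigh(H)
psi0 = np.ones(d) / np.sqrt(d)                # initial state
c2 = np.abs(V.conj().T @ psi0) ** 2           # weights on the eigenstates

t = np.linspace(0.0, 200.0, 4096)
amp = (c2 * np.exp(-1j * np.outer(t, E))).sum(axis=1)
P = np.abs(amp) ** 2                          # survival probability vs. time

freqs = 2 * np.pi * np.fft.rfftfreq(t.size, t[1] - t[0])   # angular frequency
peak = freqs[1:][np.argmax(np.abs(np.fft.rfft(P - P.mean()))[1:])]
gaps = sorted(abs(E[i] - E[j]) for i in range(d) for j in range(i))
print(f"strongest oscillation at {peak:.3f}; true energy gaps: "
      + ", ".join(f"{g:.3f}" for g in gaps))
```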

  18. Acetylcholine molecular arrays enable quantum information processing

    NASA Astrophysics Data System (ADS)

    Tamulis, Arvydas; Majauskaite, Kristina; Talaikis, Martynas; Zborowski, Krzysztof; Kairys, Visvaldas

    2017-09-01

    We have found self-assembly of four neurotransmitter acetylcholine (ACh) molecular complexes in a water-molecule environment by using geometry optimization with the DFT B97D method. These complexes organize into regular arrays of ACh molecules possessing electronic spins, i.e., quantum information bits. These spin arrays could potentially be controlled by the application of a non-uniform external magnetic field. The proper sequence of resonant electromagnetic pulses would then drive all the spin groups into the 3-spin entangled state and enable large-scale processing of quantum information bits.

  1. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  2. Workflow computing. Improving management and efficiency of pathology diagnostic services.

    PubMed

    Buffone, G J; Moreau, D; Beck, J R

    1996-04-01

    Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health-care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery with special reference to diagnostic pathology.

  3. A complex adaptive systems perspective of health information technology implementation.

    PubMed

    Keshavjee, Karim; Kuziemsky, Craig; Vassanji, Karim; Ghany, Ahmad

    2013-01-01

    Implementing health information technology (HIT) is a challenge because of the complexity and multiple interactions that define HIT implementation. Much of the research on HIT implementation is descriptive in nature and has focused on distinct processes such as order entry or decision support. These studies fail to take into account the underlying complexity of the processes, people and settings that are typical of HIT implementations. Complex adaptive systems (CAS) is a promising field that could elucidate the complexity and non-linear interacting issues that are typical in HIT implementation. Initially we sought new models that would enable us to better understand the complex nature of HIT implementation, to proactively identify problem issues that could be a precursor to unintended consequences and to develop new models and new approaches to successful HIT implementations. Our investigation demonstrates that CAS does not provide prediction, but forces us to rethink our HIT implementation paradigms and question what we think we know. CAS provides new ways to conceptualize HIT implementation and suggests new approaches to increasing HIT implementation successes.

  4. Information processing for aerospace structural health monitoring

    NASA Astrophysics Data System (ADS)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life-cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the diagnostics of current structural integrity and the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.

  5. Encoding techniques for complex information structures in connectionist systems

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Two general information encoding techniques called relative position encoding and pattern similarity association are presented. They are claimed to be a convenient basis for the connectionist implementation of complex, short term information processing of the sort needed in common sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail. The rich inter-relationships of these other connectionist and computer methods are also clarified. The particular, simple forms that the relative position encoding and pattern similarity association techniques take in the authors' own connectionist system, called Conposit, are then discussed, in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.

  6. Methodology in the measurement of complex human performance : two-dimensional compensatory tracking.

    DOT National Transportation Integrated Search

    1972-05-01

    Nineteen subjects were tested on two successive days on a complex performance device designed to measure functions of relevance to aircrew performance; included were measures of monitoring, information processing, pattern discrimination, and group pr...

  7. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated control systems for technological processes are complex systems characterized by elements with an overall focus, the systemic nature of the implemented algorithms for the exchange and processing of information, and a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing parallels between them by identifying their strengths and weaknesses, and proposes a non-standard control system for a technological process.

  8. Complex Networks - A Key to Understanding Brain Function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sporns, Olaf

    2008-01-23

    The brain is a complex network of neurons, engaging in spontaneous and evoked activity that is thought to be the main substrate of mental life.  How this complex system works together to process information and generate coherent cognitive states, even consciousness, is not yet well understood.  In my talk I will review recent studies that have revealed characteristic structural and functional attributes of brain networks, and discuss efforts to build computational models of the brain that are informed by our growing knowledge of brain anatomy and physiology.

  9. The Difference between Uncertainty and Information, and Why This Matters

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2016-12-01

    Earth science investigation and arbitration (for decision making) is very often organized around a concept of uncertainty. It seems relatively straightforward that the purpose of our science is to reduce uncertainty about how environmental systems will react and evolve under different conditions. I propose here that approaching a science of complex systems as a process of quantifying and reducing uncertainty is a mistake, and specifically a mistake that is rooted in certain rather historic logical errors. Instead, I propose that we should be asking questions about information. I argue here that an information-based perspective facilitates almost trivial answers to environmental science questions that are either difficult or theoretically impossible to answer when posed as questions about uncertainty. In particular, I propose that an information-centric perspective leads to: (1) coherent and non-subjective hypothesis tests for complex system models; (2) process-level diagnostics for complex systems models; (3) methods for building complex systems models that allow for inductive inference without the need for a priori specification of likelihood functions or ad hoc error metrics; and (4) asymptotically correct quantification of epistemic uncertainty. To put this in slightly more basic terms, I propose that an information-theoretic philosophy of science has the potential to resolve certain important aspects of the Demarcation Problem and the Duhem-Quine Problem, and that Hydrology and other Earth Systems Sciences can immediately capitalize on this to address some of our most difficult and persistent problems.

  10. Bridging the Operational Divide: An Information-Processing Model of Internal Supply Chain Integration

    ERIC Educational Resources Information Center

    Rosado Feger, Ana L.

    2009-01-01

    Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…

  11. Auditory Power-Law Activation Avalanches Exhibit a Fundamental Computational Ground State

    NASA Astrophysics Data System (ADS)

    Stoop, Ruedi; Gomez, Florian

    2016-07-01

    The cochlea provides a biological information-processing paradigm that we are only beginning to understand in its full complexity. Our work reveals an interacting network of strongly nonlinear dynamical nodes, on which even a simple sound input triggers subnetworks of activated elements that follow power-law size statistics ("avalanches"). From dynamical systems theory, power-law size distributions relate to a fundamental ground state of biological information processing. Learning destroys these power laws. These results strongly modify the models of mammalian sound processing and provide a novel methodological perspective for understanding how the brain processes information.
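
    Power-law avalanche statistics of this kind are conventionally illustrated with a critical branching process, whose avalanche sizes follow an exponent of about 3/2. The sketch below generates such avalanches and estimates the exponent with the standard maximum-likelihood formula; it is a generic toy, not the authors' cochlea model.

```python
# Sketch: avalanche sizes from a critical branching process and a
# Clauset-style maximum-likelihood estimate of the tail exponent.
import numpy as np

rng = np.random.default_rng(3)

def avalanche_size(p=0.5, cap=10_000):
    """Total activity of a branching process; two potential offspring per
    node with p=0.5 gives mean offspring 1, i.e., criticality."""
    active, size = 1, 0
    while active and size < cap:
        size += active
        active = rng.binomial(2 * active, p)   # offspring of this generation
    return size

sizes = np.array([avalanche_size() for _ in range(20_000)])
xmin = 10
tail = sizes[sizes >= xmin]
alpha = 1.0 + tail.size / np.log(tail / xmin).sum()   # continuous MLE
print(f"estimated tail exponent: {alpha:.2f} (branching-process theory: 3/2)")
```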

  12. Understanding Complexity and Self-Organization in a Defense Program Management Organization (Experimental Design)

    DTIC Science & Technology

    2016-03-18

    The experiment will examine the decision-making process within the program office and the self-organization of key program office personnel based upon formal... and informal communications links. Additionally, we are interested in the effects of this self-organizing process on the organization's shared...

  13. The effects of mild and severe traumatic brain injury on speed of information processing as measured by the computerized tests of information processing (CTIP).

    PubMed

    Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra

    2007-01-01

    In spite of the fact that reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy to administer and sensitive measure of information processing speed.

  14. An Integration of a GIS with Peatland Management

    NASA Technical Reports Server (NTRS)

    Hoshal, J. C.; Johnson, R. L.

    1982-01-01

    The complexities of peatland management in Minnesota and the use of a geographic information system, the Minnesota Land Management Information System (MLMIS), in the management process are examined. General information on the nature of peat and its quantity and distribution in Minnesota is also presented.

  15. Predicting protein complexes using a supervised learning method combined with local structural information.

    PubMed

    Dong, Yadong; Sun, Yongqi; Qin, Chao

    2018-01-01

    The existing protein complex detection methods can be broadly divided into two categories: unsupervised and supervised learning methods. Most of the unsupervised learning methods assume that protein complexes are in dense regions of protein-protein interaction (PPI) networks even though many true complexes are not dense subgraphs. Supervised learning methods utilize the informative properties of known complexes; they often extract features from existing complexes and then use the features to train a classification model. The trained model is used to guide the search process for new complexes. However, insufficient extracted features, noise in the PPI data and the incompleteness of complex data make the classification model imprecise. Consequently, the classification model is not sufficient for guiding the detection of complexes. Therefore, we propose a new robust score function that combines the classification model with local structural information. Based on the score function, we provide a search method that works both forwards and backwards. The results from experiments on six benchmark PPI datasets and three protein complex datasets show that our approach can achieve better performance compared with the state-of-the-art supervised, semi-supervised and unsupervised methods for protein complex detection, occasionally significantly outperforming such methods.
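
    The combined score-and-search idea can be sketched compactly: grow or shrink a candidate subgraph as long as a score mixing a classifier output with local density improves. The graph, the stand-in classifier, and the weights below are all invented for illustration; the paper's score and features are richer.

```python
# Sketch: forward/backward greedy search guided by a score that combines a
# (stand-in) classifier with local structural information (density).
import itertools, math

edges = {("a","b"), ("a","c"), ("b","c"), ("c","d"), ("d","e"), ("e","f"), ("d","f")}
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def density(nodes):
    if len(nodes) < 2:
        return 0.0
    inside = sum(1 for u, v in itertools.combinations(nodes, 2) if v in adj[u])
    return inside / (len(nodes) * (len(nodes) - 1) / 2)

def classifier_score(nodes):
    # Stand-in for a trained model: logistic in density and size.
    z = 1.5 * density(nodes) + 0.2 * len(nodes) - 1.0
    return 1.0 / (1.0 + math.exp(-z))

def score(nodes, w=0.5):
    return w * classifier_score(nodes) + (1 - w) * density(nodes)

def search(seed):
    nodes, improved = {seed}, True
    while improved:
        improved = False
        candidates = [nodes | {n} for m in nodes for n in adj[m] - nodes]  # forward
        candidates += [nodes - {m} for m in nodes if len(nodes) > 1]       # backward
        best = max(candidates, key=score, default=nodes)
        if score(best) > score(nodes):
            nodes, improved = best, True
    return nodes

print(search("a"))   # converges to the dense triangle {a, b, c}
```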

  16. Innovations in clinical trials informatics.

    PubMed

    Summers, Ron; Vyas, Hiten; Dudhal, Nilesh; Doherty, Neil F; Coombs, Crispin R; Hepworth, Mark

    2008-01-01

    This paper will investigate innovations in information management for use in clinical trials. The application typifies a complex, adaptive, distributed and information-rich environment for which continuous innovation is necessary. Organisational innovation is highlighted as well as the technical innovations in workflow processes and their representation as an integrated set of web services. Benefits realization uncovers further innovations in the business strand of the work undertaken. Following the description of the development of this information management system, the semantic web is postulated as a possible solution to tame the complexity related to information management issues found within clinical trials support systems.

  17. Ontology-Driven Information Integration

    NASA Technical Reports Server (NTRS)

    Tissot, Florence; Menzel, Chris

    2005-01-01

    Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.

  18. A situation-response model for intelligent pilot aiding

    NASA Technical Reports Server (NTRS)

    Schudy, Robert; Corker, Kevin

    1987-01-01

    An intelligent pilot aiding system needs models of the pilot's information processing to provide the computational basis for successful cooperation between the pilot and the aiding system. By combining artificial intelligence concepts with the human information processing model of Rasmussen, an abstraction hierarchy of states of knowledge, processing functions, and shortcuts is developed, which is useful for characterizing the information processing both of the pilot and of the aiding system. This approach is used in the conceptual design of a real-time intelligent aiding system for flight crews of transport aircraft. One promising result was the tentative identification of a particular class of information processing shortcuts, from situation characterizations to appropriate responses, as the most important reliable pathway for dealing with complex time-critical situations.

  19. Altered Topology in Information Processing of a Narrated Story in Older Adults with Mild Cognitive Impairment.

    PubMed

    Yogev-Seligmann, Galit; Oren, Noga; Ash, Elissa L; Hendler, Talma; Giladi, Nir; Lerner, Yulia

    2016-05-03

    The ability to store, integrate, and manipulate information declines with aging. These changes occur earlier, faster, and to a greater degree as a result of neurodegeneration. One of the most common and early characteristics of cognitive decline is difficulty with comprehension of information. The neural mechanisms underlying this breakdown of information processing are poorly understood. Using functional MRI and natural stimuli (e.g., stories), we mapped the neural mechanisms by which the human brain accumulates and processes information with increasing duration and complexity in participants with amnestic mild cognitive impairment (aMCI) and healthy older adults. To explore the mechanisms of information processing, we measured the reliability of brain responses elicited by listening to different versions of a narrated story created by segmenting the story into words, sentences, and paragraphs and then scrambling the segments. Comparing healthy older adults and participants with aMCI revealed that in both groups, all types of stimuli similarly recruited primary auditory areas. However, prominent differences between groups were found at the level of processing long and complex stimuli. In healthy older adults, parietal and frontal regions demonstrated highly synchronized responses in both the paragraph and full story conditions, as has been previously reported in young adults. Participants with aMCI, however, exhibited a robust functional shift of long time scale processing to the pre- and post-central sulci. Our results suggest that participants with aMCI experienced a functional shift of higher order auditory information processing, possibly reflecting a functional response to concurrent or impending neuronal or synaptic loss. This observation might assist in understanding mechanisms of cognitive decline in aMCI.

  1. An Analysis of Informal Reasoning Fallacy and Critical Thinking Dispositions among Malaysian Undergraduates

    ERIC Educational Resources Information Center

    Ramasamy, Shamala

    2011-01-01

    In this information age, the amount of complex information available due to technological advancement would require undergraduates to be extremely competent in processing information systematically. Critical thinking ability of undergraduates has been the focal point among educators, employers and the public at large. One of the dimensions of…

  2. Measuring Spontaneous and Instructed Evaluation Processes during Web Search: Integrating Concurrent Thinking-Aloud Protocols and Eye-Tracking Data

    ERIC Educational Resources Information Center

    Gerjets, Peter; Kammerer, Yvonne; Werner, Benita

    2011-01-01

    Web searching for complex information requires searchers to appropriately evaluate diverse sources of information. Information science studies identified different criteria applied by searchers to evaluate Web information. However, the explicit evaluation instructions used in these studies might have resulted in a distortion of spontaneous evaluation…

  3. The architecture of the management system of complex steganographic information

    NASA Astrophysics Data System (ADS)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide area information system that allows one to control the processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic support of the system, classic steganographic methods are used to embed information, and methods of mathematical statistics and computational intelligence are used to detect the embedded information. The main result of the paper is the development of the architecture of the management system of complex steganographic information. The suggested architecture utilizes cloud technology to provide service as a web service via the Internet. It is designed to process streams of multimedia data arriving from many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; and prevention of information leakage caused by insiders.

  4. Complexity in electronic negotiation support systems.

    PubMed

    Griessmair, Michele; Strunk, Guido; Vetschera, Rudolf; Koeszegi, Sabine T

    2011-10-01

    It is generally acknowledged that the medium influences the way we communicate, and negotiation research directs considerable attention to the impact of different electronic communication modes on the negotiation process and outcomes. Complexity theories offer models and methods that allow the investigation of how patterns and temporal sequences unfold over time in negotiation interactions. By focusing on the dynamic and interactive quality of negotiations as well as the information, choice, and uncertainty contained in the negotiation process, the complexity perspective addresses several issues of central interest in classical negotiation research. In the present study we compare the complexity of the negotiation communication process across synchronous and asynchronous negotiations (IM vs. e-mail) as well as an electronic negotiation support system including a decision support system (DSS). For this purpose, transcripts of 145 negotiations were coded and analyzed with the Shannon entropy and the grammar complexity. Our results show that negotiating asynchronously via e-mail, as well as including a DSS, significantly reduces the complexity of the negotiation process. Furthermore, a reduction of the complexity increases the probability of reaching an agreement.
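
    As a rough illustration of the Shannon entropy measure applied to coded transcripts (the coding scheme and symbols below are hypothetical, not taken from the study):

    ```python
    from collections import Counter
    from math import log2

    def shannon_entropy(symbols):
        """Shannon entropy (bits per symbol) of a coded transcript."""
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # Hypothetical coded negotiation transcript: each letter is one
    # communication category (e.g., O = offer, A = argument, Q = question).
    transcript = list("OAQOOAAQOQAAOO")
    print(f"{shannon_entropy(transcript):.3f} bits per utterance")
    ```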

  5. Possible disruption of remote viewing by complex weak magnetic fields around the stimulus site and the possibility of accessing real phase space: a pilot study.

    PubMed

    Koren, S A; Persinger, M A

    2002-12-01

    In 2002 Persinger, Roll, Tiller, Koren, and Cook considered whether there are physical processes by which recondite information exists within the space and time of objects or events. The stimuli that compose this information might be directly detected within the whole brain without being processed by the typical sensory modalities. We tested the artist Ingo Swann who can reliably draw and describe randomly selected photographs sealed in envelopes in another room. In the present experiment the photographs were immersed continuously in repeated presentations (5 times per sec.) of one of two types of computer-generated complex magnetic field patterns whose intensities were less than 20 nT over most of the area. WINDOWS-generated but not DOS-generated patterns were associated with a marked decrease in Mr. Swann's accuracy. Whereas the DOS software generated exactly the same pattern, WINDOWS software phase-modulated the actual wave form resulting in an infinite bandwidth and complexity. We suggest that information obtained by processes attributed to "paranormal" phenomena have physical correlates that can be masked by weak, infinitely variable magnetic fields.

  6. A user-system interface agent

    NASA Technical Reports Server (NTRS)

    Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua

    1995-01-01

    Agent-based technologies address several challenges posed by additional information processing requirements in today's computing environments. In particular, (1) users desire interaction with computing devices in a mode which is similar to that used between people, (2) the efficiency and successful completion of information processing tasks often require a high level of expertise in complex and multiple domains, (3) information processing tasks often require handling of large volumes of data and, therefore, continuous and endless processing activities. The concept of an agent is an attempt to address these new challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and, therefore, it qualifies to be trusted to perform tasks independent of the human user, and (3) an agent is an entity that is continuously active performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an on-going effort to build a model for developing agent-based information systems. Such systems will be highly applicable to environments which require a high degree of automation, such as flight control operations and/or processing of large volumes of data in complex domains, such as the EOSDIS environment and other multidisciplinary, scientific data systems. The concept of an agent as an information processing entity is fully described with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are discussed in this paper. Based on a definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an IPE that is agent-oriented and whose resources are likely to be distributed and heterogeneous in nature. The architecture of USIA is outlined in two main components: (1) the user interface, which is concerned with issues such as user dialog and interaction, user modeling, and adaptation to the user profile, and (2) the system interface, which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    The four-dimensional scattering function S(Q,ω) obtained by inelastic neutron scattering measurements provides unique "dynamical fingerprints" of the spin state and interactions present in complex magnetic materials. Extracting this information, however, is currently a slow and complex process that may take an expert, depending on the complexity of the system, up to several weeks of painstaking work to complete. Spin Wave Genie was created to abstract and automate this process. It strives both to reduce the time to complete this analysis and to make these calculations more accessible to a broader group of scientists and engineers.
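
    For context, in one common textbook convention (a standard definition, not taken from this record), the magnetic scattering function measured in such experiments is the space-time Fourier transform of the spin-spin correlation function:

    ```latex
    S^{\alpha\beta}(\mathbf{Q},\omega)
      = \frac{1}{2\pi\hbar}\sum_{j,j'}
        e^{i\mathbf{Q}\cdot(\mathbf{r}_j-\mathbf{r}_{j'})}
        \int_{-\infty}^{\infty}
        \langle S_j^{\alpha}(0)\, S_{j'}^{\beta}(t)\rangle\,
        e^{-i\omega t}\, dt
    ```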

  8. Learning To Live with Complexity.

    ERIC Educational Resources Information Center

    Dosa, Marta

    Neither the design of information systems and networks nor the delivery of library services can claim true user centricity without an understanding of the multifaceted psychological environment of users and potential users. The complexity of the political process, social problems, challenges to scientific inquiry, entrepreneurship, and…

  9. Complexity and Entropy Analysis of DNMT1 Gene

    USDA-ARS?s Scientific Manuscript database

    Background: The application of complexity information to DNA sequences and proteins in biological processes is well established in this study. Available sequences for the DNMT1 gene, which is a maintenance methyltransferase responsible for copying DNA methylation patterns to the daughter strands durin...

  10. The Role of Simple Semantics in the Process of Artificial Grammar Learning.

    PubMed

    Öttl, Birgit; Jäger, Gerhard; Kaup, Barbara

    2017-10-01

    This study investigated the effect of semantic information on artificial grammar learning (AGL). Recursive grammars of different complexity levels (regular language, mirror language, copy language) were investigated in a series of AGL experiments. In the with-semantics condition, participants acquired semantic information prior to the AGL experiment; in the without-semantics control condition, participants did not receive semantic information. It was hypothesized that semantics would generally facilitate grammar acquisition and that the learning benefit in the with-semantics conditions would increase with increasing grammar complexity. Experiment 1 showed learning effects for all grammars but no performance difference between conditions. Experiment 2 replicated the absence of a semantic benefit for all grammars even though semantic information was more prominent during grammar acquisition as compared to Experiment 1. Thus, we did not find evidence for the idea that semantics facilitates grammar acquisition, which seems to support the view of an independent syntactic processing component.

  11. Design and implementation of spatial knowledge grid for integrated spatial analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xiangnan; Guan, Li; Wang, Ping

    2006-10-01

    Supported by a spatial information grid (SIG), the spatial knowledge grid (SKG) for integrated spatial analysis utilizes middleware technology to construct the spatial information grid computation environment and spatial information service system, develops spatial-entity-oriented spatial data organization technology, and carries out deep computation of spatial structure and spatial process patterns on the basis of the Grid GIS infrastructure, the spatial data grid, and the spatial information grid (in its specialized definition). At the same time, it realizes complex spatial pattern expression and spatial function process simulation by taking the spatial intelligent agent as the core of spatially proactive computation. Moreover, through the establishment of a virtual geographical environment with man-machine interactivity and blending, complex spatial modeling, networked cooperative work, and knowledge-driven spatial community decision making are achieved. The framework of SKG is discussed systematically in this paper, and its implementation flow and key technologies are presented with examples of overlay analysis.

  12. Data Processing Center of Radioastron Project: 3 years of operation.

    NASA Astrophysics Data System (ADS)

    Shatskaya, Marina

    The ASC Data Processing Center (DPC) of the Radioastron Project is a fail-safe, centralized complex of interconnected software/hardware components along with organizational procedures. The tasks facing the scientific data processing center are the organization of service information exchange, the collection of scientific data, the storage of all scientific data, and science-oriented data processing. The DPC takes part in the information exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters, and the session scheduling center. Enormous flows of information go to the Astro Space Center. To handle these enormous data volumes, we developed a specialized network infrastructure, Internet channels, and storage. The computer complex has been designed at the Astro Space Center (ASC) of the Lebedev Physical Institute and includes: 800 TB of on-line storage; a 2000 TB hard-drive archive; a backup system on magnetic tapes (2000 TB); 24 TB of redundant storage at the Pushchino Radio Astronomy Observatory; Web and FTP servers; and DPC management and data transmission networks. The structure and functions of the ASC Data Processing Center are fully adequate to the data processing requirements of the Radioastron Mission and have been successfully confirmed during the Fringe Search, the Early Science Program, and the first year of the Key Science Program.

  13. GEOTAIL Spacecraft historical data report

    NASA Technical Reports Server (NTRS)

    Boersig, George R.; Kruse, Lawrence F.

    1993-01-01

    The purpose of this GEOTAIL Historical Report is to document ground processing operations information gathered on the GEOTAIL mission during processing activities at the Cape Canaveral Air Force Station (CCAFS). It is hoped that this report may aid management analysis, improve integration processing and forecasting of processing trends, and reduce real-time schedule changes. The GEOTAIL payload is the third Delta 2 Expendable Launch Vehicle (ELV) mission to document historical data. Comparisons of planned versus as-run schedule information are displayed. Information will generally fall into the following categories: (1) payload stay times (payload processing facility/hazardous processing facility/launch complex-17A); (2) payload processing times (planned, actual); (3) schedule delays; (4) integrated test times (experiments/launch vehicle); (5) unique customer support requirements; (6) modifications performed at facilities; (7) other appropriate information (Appendices A & B); and (8) lessons learned (reference Appendix C).

  14. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    PubMed

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE, while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand

    ERIC Educational Resources Information Center

    James, Ryan Gregory

    2013-01-01

    How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called ε-machines. Then, utilizing this framework, a variety of measures are defined and…

  16. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  17. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure

    PubMed Central

    2016-01-01

    Background Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. Objective The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. Methods We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. Results We identified 5 high-level macrocognitive processes affecting medication management—sensemaking, planning, coordination, monitoring, and decision making—and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Conclusions Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation. PMID:27733331

  18. Medication Management: The Macrocognitive Workflow of Older Adults With Heart Failure.

    PubMed

    Mickelson, Robin S; Unertl, Kim M; Holden, Richard J

    2016-10-12

    Older adults with chronic disease struggle to manage complex medication regimens. Health information technology has the potential to improve medication management, but only if it is based on a thorough understanding of the complexity of medication management workflow as it occurs in natural settings. Prior research reveals that patient work related to medication management is complex, cognitive, and collaborative. Macrocognitive processes are theorized as how people individually and collaboratively think in complex, adaptive, and messy nonlaboratory settings supported by artifacts. The objective of this research was to describe and analyze the work of medication management by older adults with heart failure, using a macrocognitive workflow framework. We interviewed and observed 61 older patients along with 30 informal caregivers about self-care practices including medication management. Descriptive qualitative content analysis methods were used to develop categories, subcategories, and themes about macrocognitive processes used in medication management workflow. We identified 5 high-level macrocognitive processes affecting medication management (sensemaking, planning, coordination, monitoring, and decision making) and 15 subprocesses. Data revealed workflow as occurring in a highly collaborative, fragile system of interacting people, artifacts, time, and space. Process breakdowns were common and patients had little support for macrocognitive workflow from current tools. Macrocognitive processes affected medication management performance. Describing and analyzing this performance produced recommendations for technology supporting collaboration and sensemaking, decision making and problem detection, and planning and implementation.

  19. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate the feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence, and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent-engineering, model-based design environment.

  20. Consciousness: a unique way of processing information.

    PubMed

    Marchetti, Giorgio

    2018-02-08

    In this article, I argue that consciousness is a unique way of processing information, in that: it produces information, rather than purely transmitting it; the information it produces is meaningful for us; the meaning it has is always individuated. This uniqueness allows us to process information on the basis of our personal needs and ever-changing interactions with the environment, and consequently to act autonomously. Three main basic cognitive processes contribute to realize this unique way of information processing: the self, attention and working memory. The self, which is primarily expressed via the central and peripheral nervous systems, maps our body, the environment, and our relations with the environment. It is the primary means by which the complexity inherent to our composite structure is reduced into the "single voice" of a unique individual. It provides a reference system that (albeit evolving) is sufficiently stable to define the variations that will be used as the raw material for the construction of conscious information. Attention allows for the selection of those variations in the state of the self that are most relevant in the given situation. Attention originates and is deployed from a single locus inside our body, which represents the center of the self, around which all our conscious experiences are organized. Whatever is focused by attention appears in our consciousness as possessing a spatial quality defined by this center and the direction toward which attention is focused. In addition, attention determines two other features of conscious experience: periodicity and phenomenal quality. Self and attention are necessary but not sufficient for conscious information to be produced. Complex forms of conscious experiences, such as the various modes of givenness of conscious experience and the stream of consciousness, need a working memory mechanism to assemble the basic pieces of information selected by attention.

  1. Biochemistry of the Envenomation Response--A Generator Theme for Interdisciplinary Integration

    ERIC Educational Resources Information Center

    Montagna, Erik; Guerreiro, Juliano R.; Torres, Bayardo B.

    2010-01-01

    The understanding of complex physiological processes requires information from many different areas of knowledge. To meet this interdisciplinary scenario, the ability of integrating and articulating information is demanded. The difficulty of such approach arises because, more often than not, information is fragmented through under graduation…

  2. Teaching Scientific Metaphors through Informational Text Read-Alouds

    ERIC Educational Resources Information Center

    Barnes, Erica M.; Oliveira, Alandeom W.

    2018-01-01

    Elementary students are expected to use various features of informational texts to build knowledge in the content areas. In science informational texts, scientific metaphors are commonly used to make sense of complex and invisible processes. Although elementary students may be familiar with literary metaphors as used in narratives, they may be…

  3. How children aged 2;6 tailor verbal expressions to interlocutor informational needs.

    PubMed

    Abbot-Smith, Kirsten; Nurmsoo, Erika; Croll, Rebecca; Ferguson, Heather; Forrester, Michael

    2016-11-01

    Although preschoolers are pervasively underinformative in their actual usage of verbal reference, a number of studies have shown that they nonetheless demonstrate sensitivity to listener informational needs, at least when environmental cues to this are obvious. We investigated two issues. The first concerned the types of visual cues to interlocutor informational needs which children aged 2;6 can process whilst producing complex referring expressions. The second was whether performance in experimental tasks related to naturalistic conversational proficiency. We found that 2;6-year-olds used fewer complex expressions when the objects were dissimilar compared to highly similar objects, indicating that they tailor their verbal expressions to the informational needs of another person, even when the cue to the informational need is relatively opaque. We also found a correlation between conversational skills as rated by the parents and the degree to which 2;6-year-olds could learn from feedback to produce complex referring expressions.

  4. A probabilistic process model for pelagic marine ecosystems informed by Bayesian inverse analysis

    EPA Science Inventory

    Marine ecosystems are complex systems with multiple pathways that produce feedback cycles, which may lead to unanticipated effects. Models abstract this complexity and allow us to predict, understand, and hypothesize. In ecological models, however, the paucity of empirical data...

  5. Complex Decision-Making in Heart Failure: A Systematic Review and Thematic Analysis.

    PubMed

    Hamel, Aimee V; Gaugler, Joseph E; Porta, Carolyn M; Hadidi, Niloufar Niakosari

    Heart failure follows a highly variable and difficult course. Patients face complex decisions, including treatment with implantable cardiac defibrillators, mechanical circulatory support, and heart transplantation. The course of decision-making across multiple treatments is unclear yet integral to providing informed and shared decision-making. Recognizing commonalities across treatment decisions could help nurses and physicians to identify opportunities to introduce discussions and support shared decision-making. The specific aims of this review are to examine complex treatment decision-making, specifically implantable cardiac defibrillators, ventricular assist device, and cardiac transplantation, and to recognize commonalities and key points in the decisional process. MEDLINE, CINAHL, PsycINFO, and Web of Science were searched for English-language studies that included qualitative findings reflecting the complexity of heart failure decision-making. Using a 3-step process, findings were synthesized into themes and subthemes. Twelve articles met criteria for inclusion. Participants included patients, caregivers, and clinicians and included decisions to undergo and decline treatment. Emergent themes were "processing the decision," "timing and prognostication," and "considering the future." Subthemes described how participants received and understood information about the therapy, making and changing a treatment decision, timing their decision and gauging health status outcomes in the context of their decision, the influence of a life or death decision, and the future as a factor in their decisional process. Commonalities were present across therapies, which involved the timing of discussions, the delivery of information, and considerations of the future. Exploring this further could help support patient-centered care and optimize shared decision-making interventions.

  6. The Capabilities of Chaos and Complexity

    PubMed Central

    Abel, David L.

    2009-01-01

    To what degree could chaos and complexity have organized a Peptide or RNA World of crude yet necessarily integrated protometabolism? How far could such protolife evolve in the absence of a heritable linear digital symbol system that could mutate, instruct, regulate, optimize and maintain metabolic homeostasis? To address these questions, chaos, complexity, self-ordered states, and organization must all be carefully defined and distinguished. In addition their cause-and-effect relationships and mechanisms of action must be delineated. Are there any formal (non physical, abstract, conceptual, algorithmic) components to chaos, complexity, self-ordering and organization, or are they entirely physicodynamic (physical, mass/energy interaction alone)? Chaos and complexity can produce some fascinating self-ordered phenomena. But can spontaneous chaos and complexity steer events and processes toward pragmatic benefit, select function over non function, optimize algorithms, integrate circuits, produce computational halting, organize processes into formal systems, control and regulate existing systems toward greater efficiency? The question is pursued of whether there might be some yet-to-be discovered new law of biology that will elucidate the derivation of prescriptive information and control. “System” will be rigorously defined. Can a low-informational rapid succession of Prigogine’s dissipative structures self-order into bona fide organization? PMID:19333445

  7. Surviving blind decomposition: A distributional analysis of the time-course of complex word recognition.

    PubMed

    Schmidtke, Daniel; Matsuki, Kazunaga; Kuperman, Victor

    2017-11-01

    The current study addresses a discrepancy in the psycholinguistic literature about the chronology of information processing during the visual recognition of morphologically complex words. Form-then-meaning accounts of complex word recognition claim that morphemes are processed as units of form prior to any influence of their meanings, whereas form-and-meaning models posit that recognition of complex word forms involves the simultaneous access of morphological and semantic information. The study reported here addresses this theoretical discrepancy by applying a nonparametric distributional technique of survival analysis (Reingold & Sheridan, 2014) to 2 behavioral measures of complex word processing. Across 7 experiments reported here, this technique is employed to estimate the point in time at which orthographic, morphological, and semantic variables exert their earliest discernible influence on lexical decision RTs and eye movement fixation durations. Contrary to form-then-meaning predictions, Experiments 1-4 reveal that surface frequency is the earliest lexical variable to exert a demonstrable influence on lexical decision RTs for English and Dutch derived words (e.g., badness; bad + ness), English pseudoderived words (e.g., wander; wand + er) and morphologically simple control words (e.g., ballad; ball + ad). Furthermore, for derived word processing across lexical decision and eye-tracking paradigms (Experiments 1-2; 5-7), semantic effects emerge early in the time-course of word recognition, and their effects either precede or emerge simultaneously with morphological effects. These results are not consistent with the premises of the form-then-meaning view of complex word recognition, but are convergent with a form-and-meaning account of complex word recognition. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
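
    A simplified sketch of a survival-based divergence point estimate in this spirit (this is not Reingold and Sheridan's exact procedure; the time grid, bootstrap scheme, and significance criterion here are illustrative assumptions):

    ```python
    import numpy as np

    def survival(rts, grid):
        """Proportion of RTs longer than each time point (survival curve)."""
        rts = np.asarray(rts)
        return np.array([(rts > t).mean() for t in grid])

    def divergence_point(rts_a, rts_b, grid, n_boot=500, alpha=0.05, seed=0):
        """Earliest time at which the bootstrap CI of the survival-curve
        difference excludes zero (simplified divergence point estimate)."""
        rng = np.random.default_rng(seed)
        diffs = np.empty((n_boot, len(grid)))
        for b in range(n_boot):
            a = rng.choice(rts_a, size=len(rts_a))  # resample condition A
            c = rng.choice(rts_b, size=len(rts_b))  # resample condition B
            diffs[b] = survival(a, grid) - survival(c, grid)
        lo = np.quantile(diffs, alpha / 2, axis=0)
        hi = np.quantile(diffs, 1 - alpha / 2, axis=0)
        sig = (lo > 0) | (hi < 0)                   # CI excludes zero
        return grid[np.argmax(sig)] if sig.any() else None

    # Simulated RTs for two hypothetical conditions, in milliseconds.
    rng = np.random.default_rng(1)
    rts_a = rng.normal(600, 80, 300)
    rts_b = rng.normal(640, 80, 300)
    print("divergence point:", divergence_point(rts_a, rts_b,
                                                np.arange(300, 1000, 10)))
    ```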

  8. Auditory connections and functions of prefrontal cortex

    PubMed Central

    Plakke, Bethany; Romanski, Lizabeth M.

    2014-01-01

    The functional auditory system extends from the ears to the frontal lobes with successively more complex functions occurring as one ascends the hierarchy of the nervous system. Several areas of the frontal lobe receive afferents from both early and late auditory processing regions within the temporal lobe. Afferents from the early part of the cortical auditory system, the auditory belt cortex, which are presumed to carry information regarding auditory features of sounds, project to only a few prefrontal regions and are most dense in the ventrolateral prefrontal cortex (VLPFC). In contrast, projections from the parabelt and the rostral superior temporal gyrus (STG) most likely convey more complex information and target a larger, widespread region of the prefrontal cortex. Neuronal responses reflect these anatomical projections as some prefrontal neurons exhibit responses to features in acoustic stimuli, while other neurons display task-related responses. For example, recording studies in non-human primates indicate that VLPFC is responsive to complex sounds including vocalizations and that VLPFC neurons in area 12/47 respond to sounds with similar acoustic morphology. In contrast, neuronal responses during auditory working memory involve a wider region of the prefrontal cortex. In humans, the frontal lobe is involved in auditory detection, discrimination, and working memory. Past research suggests that dorsal and ventral subregions of the prefrontal cortex process different types of information with dorsal cortex processing spatial/visual information and ventral cortex processing non-spatial/auditory information. While this is apparent in the non-human primate and in some neuroimaging studies, most research in humans indicates that specific task conditions, stimuli or previous experience may bias the recruitment of specific prefrontal regions, suggesting a more flexible role for the frontal lobe during auditory cognition. PMID:25100931

  9. Patterns of patient safety culture: a complexity and arts-informed project of knowledge translation.

    PubMed

    Mitchell, Gail J; Tregunno, Deborah; Gray, Julia; Ginsberg, Liane

    2011-01-01

    The purpose of this paper is to describe patterns of patient safety culture that emerged from an innovative collaboration among health services researchers and fine arts colleagues. The group engaged in an arts-informed knowledge translation project to produce a dramatic expression of patient safety culture research for inclusion in a symposium. Scholars have called for a deeper understanding of the complex interrelationships among structure, process and outcomes relating to patient safety. Four patterns of patient safety culture--blinding familiarity, unyielding determination, illusion of control and dismissive urgency--are described with respect to how they informed creation of an arts-informed project for knowledge translation.

  10. The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Wolf, David R.

    2004-01-01

    The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to the description of the emergence of complex behaviors due to many-body or many-agent processes. They are particularly well suited to the quantification of the decomposition of the information carried among a set of variables or agents, and its subsets. In more graphical language, they provide the information theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a forceful toolkit for the analysis of the complexity structure of complex many agent systems. The information correlation functions are the natural generalization to an arbitrary number of sets of variables of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
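
    For reference, the first members of this sequence can be written as follows (a standard formulation; the paper's own notation is not reproduced here):

    ```latex
    \begin{align*}
    H(X)     &= -\sum_{x} p(x)\,\log_2 p(x)\\
    I(X;Y)   &= H(X) + H(Y) - H(X,Y)\\
    I(X;Y;Z) &= H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
    \end{align*}
    ```

    Each member combines the joint entropies of all subsets of the variables with alternating sign (an inclusion-exclusion pattern), which is one standard way to extend entropy and mutual information to more than two sets of variables.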

  11. Further understanding of complex information processing in verbal adolescents and adults with autism spectrum disorders.

    PubMed

    Williams, Diane L; Minshew, Nancy J; Goldstein, Gerald

    2015-10-01

    More than 20 years ago, Minshew and colleagues proposed the Complex Information Processing model of autism in which the impairment is characterized as a generalized deficit involving multiple modalities and cognitive domains that depend on distributed cortical systems responsible for higher order abilities. Subsequent behavioral work revealed a related dissociation between concept formation and concept identification in autism suggesting the lack of an underlying organizational structure to manage increases in processing loads. The results of a recent study supported the impact of this relative weakness in conceptual reasoning on adaptive functioning in children and adults with autism. In this study, we provide further evidence of the difficulty relatively able older adolescents and adults with autism have with conceptual reasoning and provide evidence that this characterizes their difference from age- and ability-matched controls with typical development better than their differences in language. For verbal adults with autism, language may serve as a bootstrap or compensatory mechanism for learning but cannot overcome an inherent weakness in concept formation that makes information processing challenging as task demands increase. © The Author(s) 2015.

  12. Image processing mini manual

    NASA Technical Reports Server (NTRS)

    Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill

    1992-01-01

    The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

  13. Is the destabilization of the cournot equilibrium a good business strategy in cournot-puu duopoly?

    PubMed

    Canovas, Jose S

    2011-10-01


  14. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
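
    A minimal sketch of the kind of fixation-curve analysis described here (the data layout, bin size, and threshold rule are illustrative assumptions, not the study's actual procedure):

    ```python
    import numpy as np

    def decision_moment(fix_target, threshold=0.75):
        """fix_target: trials x time-bins boolean array, True when the gaze
        is on the target picture. Returns the average fixation curve and the
        first bin from which the curve stays above `threshold`."""
        curve = fix_target.mean(axis=0)     # average fixation tendency per bin
        above = curve >= threshold
        for t in range(len(curve)):
            if above[t:].all():             # earliest persistent crossing
                return curve, t
        return curve, None

    # 40 simulated trials, 100 time bins; target preference grows over time.
    rng = np.random.default_rng(0)
    p = np.linspace(0.5, 0.95, 100)
    trials = rng.random((40, 100)) < p
    curve, t = decision_moment(trials)
    print("decision moment at bin", t)
    ```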

  15. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  16. Increasing complexity with quantum physics.

    PubMed

    Anders, Janet; Wiesner, Karoline

    2011-09-01

    We argue that complex systems science and the rules of quantum physics are intricately related. We discuss a range of quantum phenomena, such as cryptography, computation and quantum phases, and the rules responsible for their complexity. We identify correlations as a central concept connecting quantum information and complex systems science. We present two examples for the power of correlations: using quantum resources to simulate the correlations of a stochastic process and to implement a classically impossible computational task.

  17. Semantic-Based Knowledge Management in E-Government: Modeling Attention for Proactive Information Delivery

    NASA Astrophysics Data System (ADS)

    Samiotis, Konstantinos; Stojanovic, Nenad

    E-government has become almost synonymous with a consumer-led revolution of government services inspired and made possible by the Internet. With technology being the least of the worries for government organizations nowadays, attention is shifting towards managing complexity as one of the basic antecedents of operational and decision-making inefficiency. Complexity has traditionally preoccupied public administrations and owes its origins to several sources, primarily the cross-functional nature and the degree of legal structuring of administrative work. Both rely strongly on the underlying process and information infrastructure of public organizations. Managing public administration work thus implies managing its processes and information. Knowledge management (KM) and business process reengineering (BPR) have already been deployed successfully by private organizations for the same purposes and certainly comprise improvement practices that are worth investigating. Our contribution in this paper concerns the utilization of KM for e-government.

  18. Management Information Systems.

    ERIC Educational Resources Information Center

    Finlayson, Jean, Ed.

    1989-01-01

    This collection of papers addresses key questions facing college managers and others choosing, introducing, and living with big, complex computer-based systems. "What Use the User Requirement?" (Tony Coles) stresses the importance of an information strategy driven by corporate objectives, not technology. "Process of Selecting a…

  19. Synthetic Analog and Digital Circuits for Cellular Computation and Memory

    PubMed Central

    Purcell, Oliver; Lu, Timothy K.

    2014-01-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536

  20. Methods for evaluating information in managing the enterprise on the basis of a hybrid three-tier system

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-01-01

    The article presents data on the influence of information upon the functioning of complex systems in the process of ensuring their effective management. Ways and methods for evaluating multidimensional information are proposed that reduce the time and resources required and improve the validity of management decisions for the studied system.

  1. The Referential Function of Internal Communication Groups in Complex Organizations: An Empirical Analysis.

    ERIC Educational Resources Information Center

    Taylor, James A.; Farace, Richard V.

    This paper argues that people who interact regularly and repetitively among themselves create a conjoint information space wherein common values, attitudes, and beliefs arise through the process of information transmission among the members in the space. Three major hypotheses concerning informal communication groups in organizations were tested…

  2. Visuo-Spatial Processing and Executive Functions in Children with Specific Language Impairment

    ERIC Educational Resources Information Center

    Marton, Klara

    2008-01-01

    Background: Individual differences in complex working memory tasks reflect simultaneous processing, executive functions, and attention control. Children with specific language impairment (SLI) show a deficit in verbal working memory tasks that involve simultaneous processing of information. Aims: The purpose of the study was to examine executive…

  3. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  4. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  5. DigiMemo: Facilitating the Note Taking Process

    ERIC Educational Resources Information Center

    Kurt, Serhat

    2009-01-01

    Everyone takes notes daily for various reasons. Note taking is very popular in school settings and generally recognized as an effective learning strategy. Further, note taking is a complex process because it requires understanding, selection of information and writing. Some new technological tools may facilitate the note taking process. Among such…

  6. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherent sophisticated information processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  7. Three-Dimensional Computer Simulation as an Important Competence Based Aspect of a Modern Mining Professional

    NASA Astrophysics Data System (ADS)

    Aksenova, Olesya; Pachkina, Anna

    2017-11-01

    The article deals with the necessity of transforming the educational process to meet the requirements of the modern mining industry, including the cooperative development of new educational programs and the implementation of an educational process that takes modern manufacturing technology into account. The paper argues for introducing into the training of mining professionals the study of three-dimensional models of the surface technological complex, ore reserves, and the underground digging complex, as well as the creation of these models in different graphic editors and work with the information analysis models obtained on the basis of these three-dimensional models. The technological process of manless coal mining at the Polysaevskaya mine is covered; it is controlled by information analysis models built on the basis of three-dimensional models of individual objects and of the technological process as a whole, and at the same time it requires staff able to use programs for three-dimensional positioning of miners and equipment in a global frame of reference.

  8. Influence of using challenging tasks in biology classrooms on students' cognitive knowledge structure: an empirical video study

    NASA Astrophysics Data System (ADS)

    Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.

    2016-08-01

    Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons pertaining to the topic 'blood and circulatory system'. Two fundamental characteristics used to analyze tasks include: (1) required cognitive level of processing (e.g. low-level information processing: repetition, summary, define, classify; and high-level information processing: interpret-analyze data, formulate hypotheses, etc.) and (2) complexity of task content (e.g. whether tasks require use of factual, linking, or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. The presented findings encourage the use of high-level cognitive processing tasks in biology instruction.
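
    As a sketch of how such a two-level model might be specified (synthetic stand-in data and a simplified variable set; not the study's actual dataset or coding):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic data: class-level task predictors repeated across the
    # students of each class, plus student-level control covariates.
    rng = np.random.default_rng(0)
    n_classes, n_students = 38, 25
    df = pd.DataFrame({
        "class_id": np.repeat(np.arange(n_classes), n_students),
        "high_level_tasks": np.repeat(rng.random(n_classes), n_students),
        "content_complexity": np.repeat(rng.random(n_classes), n_students),
        "prior_knowledge": rng.normal(size=n_classes * n_students),
        "interest_biology": rng.normal(size=n_classes * n_students),
    })
    df["knowledge_structure"] = (
        0.07 * df["high_level_tasks"] + 0.3 * df["prior_knowledge"]
        + rng.normal(scale=0.5, size=len(df))
    )

    # Students nested in classes: random intercept per class.
    model = smf.mixedlm(
        "knowledge_structure ~ high_level_tasks + content_complexity"
        " + prior_knowledge + interest_biology",
        data=df, groups=df["class_id"],
    )
    print(model.fit().summary())
    ```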

  9. The Effect of Visual Information on the Manual Approach and Landing

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1982-01-01

    The effect of visual information in combination with basic display information on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.

  10. Moving Students to Deeper Learning in Leadership

    ERIC Educational Resources Information Center

    Stover, Sheri; Seemiller, Corey

    2017-01-01

    The world is a volatile, uncertain, complex, and ambiguous (VUCA) environment (Carvan, 2015) that calls for leaders who can effectively navigate the complexity of leadership today. Students of leadership studies must not only learn leadership information content, but also be able to effectively implement the content and process, requiring deep…

  11. COMPLEX HOST-PARASITE SYSTEMS IN MARTES: IMPLICATIONS FOR CONSERVATION BIOLOGY OF ENDEMIC FAUNAS.

    USDA-ARS?s Scientific Manuscript database

    Complex assemblages of hosts and parasites reveal insights about biogeography and ecology and inform us about processes which serve to structure faunal diversity and the biosphere in space and time. Exploring aspects of parasite diversity among martens (species of Martes) and other mustelids reveal...

  12. Ranking streamflow model performance based on Information theory metrics

    NASA Astrophysics Data System (ADS)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow, yielding strings over a fixed symbol alphabet. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: watersheds served as information filters, and streamflow time series were less random and more complex than those of precipitation. This reflects the fact that the watershed acts as an information filter in the hydrologic conversion process from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as model complexity increased, but in many cases several models had efficiency values that were not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based parameters in simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
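
    The symbolization and mean-information-gain steps can be sketched compactly. This is a minimal sketch assuming a quantile-based four-symbol alphabet and first-order blocks; the function names are illustrative rather than taken from the paper.

    ```python
    import numpy as np
    from collections import Counter

    def symbolize(series, n_symbols=4):
        """Map each value to the index of its quantile bin (0..n_symbols-1)."""
        edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
        return np.digitize(series, edges)

    def mean_information_gain(symbols, order=1):
        """H(next symbol | previous `order` symbols), in bits: H_{L+1} - H_L."""
        def block_entropy(length):
            counts = Counter(tuple(symbols[i:i + length])
                             for i in range(len(symbols) - length + 1))
            n = sum(counts.values())
            return -sum((c / n) * np.log2(c / n) for c in counts.values())
        return block_entropy(order + 1) - block_entropy(order)

    flow = np.random.lognormal(size=3650)            # stand-in for a 10-year daily series
    print(mean_information_gain(symbolize(flow)))    # near log2(4) = 2 for an iid series
    ```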

  13. A computational framework for modeling targets as complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge, as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, the framework provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  14. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources

    PubMed Central

    Marenco, Luis N.; Wang, Rixin; Bandrowski, Anita E.; Grethe, Jeffrey S.; Shepherd, Gordon M.; Miller, Perry L.

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF’s data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO’s current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation. PMID:25018728

  15. Extending the NIF DISCO framework to automate complex workflow: coordinating the harvest and integration of data from diverse neuroscience information resources.

    PubMed

    Marenco, Luis N; Wang, Rixin; Bandrowski, Anita E; Grethe, Jeffrey S; Shepherd, Gordon M; Miller, Perry L

    2014-01-01

    This paper describes how DISCO, the data aggregator that supports the Neuroscience Information Framework (NIF), has been extended to play a central role in automating the complex workflow required to support and coordinate the NIF's data integration capabilities. The NIF is an NIH Neuroscience Blueprint initiative designed to help researchers access the wealth of data related to the neurosciences available via the Internet. A central component is the NIF Federation, a searchable database that currently contains data from 231 data and information resources regularly harvested, updated, and warehoused in the DISCO system. In the past several years, DISCO has greatly extended its functionality and has evolved to play a central role in automating the complex, ongoing process of harvesting, validating, integrating, and displaying neuroscience data from a growing set of participating resources. This paper provides an overview of DISCO's current capabilities and discusses a number of the challenges and future directions related to the process of coordinating the integration of neuroscience data within the NIF Federation.

  16. Multiple-reason decision making based on automatic processing.

    PubMed

    Glöckner, Andreas; Betsch, Tilmann

    2008-09-01

    It has been repeatedly shown that in decisions under time constraints, individuals predominantly use noncompensatory strategies rather than complex compensatory ones. The authors argue that these findings might be due not to limitations of cognitive capacity but instead to limitations of information search imposed by the commonly used experimental tool Mouselab (J. W. Payne, J. R. Bettman, & E. J. Johnson, 1988). The authors tested this assumption in 3 experiments. In the 1st experiment, information was openly presented, whereas in the 2nd experiment, the standard Mouselab program was used under different time limits. The results indicate that individuals are able to compute weighted additive decision strategies extremely quickly if information search is not restricted by the experimental procedure. In a 3rd experiment, these results were replicated using more complex decision tasks, and the major alternative explanations that individuals use more complex heuristics or that they merely encode the constellation of cues were ruled out. In sum, the findings challenge the fundaments of bounded rationality and highlight the importance of automatic processes in decision making. (c) 2008 APA, all rights reserved.
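
    For readers unfamiliar with the strategy terminology, here is a hedged sketch contrasting the compensatory weighted additive (WADD) rule with a noncompensatory take-the-best rule on made-up cue data; the validities and cue values are illustrative only.

    ```python
    import numpy as np

    # Cue validities (weights) and binary cue values for two options, as in
    # Mouselab-style tasks; the numbers are invented for illustration.
    weights = np.array([0.9, 0.8, 0.7, 0.6])
    option_a = np.array([+1, -1, +1, -1])
    option_b = np.array([-1, +1, +1, +1])

    def weighted_additive(weights, *options):
        """Compensatory WADD rule: choose the option with the largest weighted sum."""
        scores = [float(weights @ o) for o in options]
        return int(np.argmax(scores)), scores

    def take_the_best(weights, *options):
        """Noncompensatory rule: decide on the single most valid discriminating cue."""
        for cue in np.argsort(-weights):           # inspect cues from most valid down
            values = [o[cue] for o in options]
            if len(set(values)) > 1:               # first cue that discriminates wins
                return int(np.argmax(values)), int(cue)
        return 0, None                             # no cue discriminates: guess

    print(weighted_additive(weights, option_a, option_b))  # option B wins on the sum
    print(take_the_best(weights, option_a, option_b))      # option A wins on cue 0
    ```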

  17. Using quantum theory to simplify input-output processes

    NASA Astrophysics Data System (ADS)

    Thompson, Jayne; Garner, Andrew J. P.; Vedral, Vlatko; Gu, Mile

    2017-02-01

    All natural things process and transform information. They receive environmental information as input and transform it into appropriate output responses. Much of science is dedicated to building models of such systems: algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency: they store past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them.
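
    For orientation, the quantities at stake here are standard in computational mechanics, though not spelled out in the abstract: the memory of the best classical predictive model is the statistical complexity, the unavoidable lower bound is the excess entropy, and the quantum memory cost sits between the two.

    ```latex
    % Statistical complexity C_mu: Shannon entropy over the causal states S of
    % the best classical predictive model. The excess entropy E (past-future
    % mutual information) lower-bounds any model's memory; the quantum cost
    % C_q lies between the two bounds.
    \begin{align*}
      C_\mu &= -\sum_{s \in \mathcal{S}} \Pr(s)\,\log_2 \Pr(s), \\
      E &= I\bigl(\overleftarrow{X};\overrightarrow{X}\bigr),
      \qquad E \le C_q \le C_\mu .
    \end{align*}
    ```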

  18. GEOSPATIAL IT/IM QA CHECKLIST

    EPA Science Inventory

    Quality assurance (QA) of information technology (IT) and information management (IM) systems helps to ensure that the end product is of known quality and integrity. As the complexity of IT & IM processes increases, so does the need for regular QA evaluation.

    The areas revi...

  19. Clear as glass: transparent financial reporting.

    PubMed

    Valletta, Robert M

    2005-08-01

    To be transparent, financial information needs to be easily accessible, timely, content-rich, and narrative. Not-for-profit hospitals and health systems should report detailed financial information quarterly. They need internal controls to reduce the level of complexity throughout the organization by creating standardized processes.

  20. Social behavior of bacteria: from physics to complex organization

    NASA Astrophysics Data System (ADS)

    Ben-Jacob, E.

    2008-10-01

    I describe how bacteria develop complex colonial patterns by utilizing intricate communication capabilities, such as quorum sensing, chemotactic signaling, and exchange of genetic information (plasmids). Bacteria do not store genetically all the information required for generating the patterns for all possible environments. Instead, additional information is cooperatively generated as required for the colonial organization to proceed. Each bacterium is, by itself, a biotic autonomous system with its own internal cellular informatics capabilities (storage, processing, and assessment of information). These afford the cell a certain plasticity to select its response to biochemical messages it receives, including self-alteration and the broadcasting of messages to initiate alterations in other bacteria. Hence, new features can collectively emerge during self-organization from the intra-cellular level to the whole colony. Collectively, bacteria store information, make decisions (e.g. to sporulate), and even learn from past experience (e.g. exposure to antibiotics), features we begin to associate with bacterial social behavior and even rudimentary intelligence. I also take Schrödinger's "feeding on negative entropy" criterion further and propose that, in addition, organisms have to extract latent information embedded in the environment. By latent information we refer to the non-arbitrary spatio-temporal patterns of regularities and variations that characterize the environmental dynamics. In other words, bacteria must be able to sense the environment and perform internal information processing to thrive on the latent information embedded in the complexity of their environment. I then propose that by acting together, bacteria can perform this most elementary cognitive function more efficiently, as illustrated by their cooperative behavior.

  1. The feasibility of using natural language processing to extract clinical information from breast pathology reports.

    PubMed

    Buckley, Julliette M; Coopey, Suzanne B; Sharko, John; Polubriaginof, Fernanda; Drohan, Brian; Belli, Ahmet K; Kim, Elizabeth M H; Garber, Judy E; Smith, Barbara L; Gadd, Michele A; Specht, Michelle C; Roche, Constance A; Gudewicz, Thomas M; Hughes, Kevin S

    2012-01-01

    The opportunity to integrate clinical decision support systems into clinical practice is limited due to the lack of structured, machine readable data in the current format of the electronic health record. Natural language processing has been designed to convert free text into machine readable data. The aim of the current study was to ascertain the feasibility of using natural language processing to extract clinical information from >76,000 breast pathology reports. APPROACH AND PROCEDURE: Breast pathology reports from three institutions were analyzed using natural language processing software (Clearforest, Waltham, MA) to extract information on a variety of pathologic diagnoses of interest. Data tables were created from the extracted information according to date of surgery, side of surgery, and medical record number. The variety of ways in which each diagnosis could be represented was recorded, as a means of demonstrating the complexity of machine interpretation of free text. There was widespread variation in how pathologists reported common pathologic diagnoses. We report, for example, 124 ways of saying invasive ductal carcinoma and 95 ways of saying invasive lobular carcinoma. There were >4000 ways of saying invasive ductal carcinoma was not present. Natural language processor sensitivity and specificity were 99.1% and 96.5% when compared to expert human coders. We have demonstrated how a large body of free-text medical information, such as that seen in breast pathology reports, can be converted to a machine readable format using natural language processing, and have described the inherent complexities of the task.
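
    A heavily simplified sketch of the extraction problem follows: keyword patterns with naive negation handling. The study's actual software (Clearforest) is proprietary, so the report snippets, patterns, and helper names below are hypothetical.

    ```python
    import re

    # Hypothetical report snippets and toy phrase patterns; real phrase
    # inventories are far larger (124 variants for invasive ductal carcinoma).
    REPORTS = [
        "Final diagnosis: INVASIVE DUCTAL CARCINOMA, grade 2.",
        "No evidence of invasive ductal carcinoma. Benign breast tissue.",
    ]
    IDC_PATTERNS = [
        r"invasive\s+ductal\s+carcinoma",
        r"\bidc\b",
        r"infiltrating\s+duct(al)?\s+carcinoma",
    ]
    NEGATION = r"(no\s+evidence\s+of|negative\s+for|without)\s+[^.]*"

    def has_idc(report: str) -> bool:
        """Flag a report as IDC-positive unless the phrase sits in a negated span."""
        text = report.lower()
        return any(re.search(pat, text) and not re.search(NEGATION + pat, text)
                   for pat in IDC_PATTERNS)

    print([has_idc(r) for r in REPORTS])   # [True, False]
    ```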

  2. Interactive visualization of public health indicators to support policymaking: An exploratory study

    PubMed Central

    Zakkar, Moutasem; Sedig, Kamran

    2017-01-01

    Purpose: The purpose of this study is to examine the use of interactive visualizations to represent data/information related to social determinants of health and public health indicators, and to investigate the benefits of such visualizations for health policymaking. Methods: The study developed a prototype for an online interactive visualization tool that represents the social determinants of health. The study participants explored and used the tool. The tool was evaluated using an informal user experience evaluation method, in which prospective users use and play with the tool and their feedback is collected through interviews. Results: Using visualizations to represent and interact with health indicators has advantages over traditional representation techniques that do not allow users to interact with the information. Communicating healthcare indicators to policymakers is a complex task because of the complexity of the indicators, the diversity of audiences, and different audience needs. This complexity can lead to misinterpretation, which occurs when users of the health data ignore or do not know why, where, and how the data have been produced, or where and how they can be used. Conclusions: Public health policymaking is a complex process, and data is only one element among others needed in this complex process. Researchers and healthcare organizations should conduct a strategic evaluation to assess the usability of interactive visualizations and decision support tools before investing in these tools. Such an evaluation should take into consideration the cost, ease of use, learnability, and efficiency of those tools, and the factors that influence policymaking. PMID:29026455

  3. HMI conventions for process control graphics.

    PubMed

    Pikaar, Ruud N

    2012-01-01

    Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, usually a 1:1 conversion of existing graphics is proposed. This paper suggests another approach, efficiently leading to a reduced number of new, powerful process graphics, supported by permanent process overview displays. In addition, a road map for structuring content (process information), together with conventions for the presentation of objects, symbols, and so on, has been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.

  4. Amyloid and the origin of life: self-replicating catalytic amyloids as prebiotic informational and protometabolic entities.

    PubMed

    Maury, Carl Peter J

    2018-05-01

    A crucial stage in the origin of life was the emergence of the first molecular entity that was able to replicate, transmit information, and evolve on the early Earth. The amyloid world hypothesis posits that in the pre-RNA era, information processing was based on catalytic amyloids. The self-assembly of short peptides into β-sheet amyloid conformers leads to extraordinary structural stability and novel multifunctionality that cannot be achieved by the corresponding nonaggregated peptides. The new functions include self-replication, catalytic activities, and information transfer. The environmentally sensitive template-assisted replication cycles generate a variety of amyloid polymorphs on which evolutive forces can act, and the fibrillar assemblies can serve as scaffolds for the amyloids themselves and for ribonucleotides, proteins, and lipids. The role of amyloid in the putative transition process from an amyloid world to an amyloid-RNA-protein world is not limited to scaffolding and protection: the interactions between amyloid, RNA, and protein are both complex and cooperative, and the amyloid assemblages can function as protometabolic entities catalyzing the formation of simple metabolite precursors. The emergence of a pristine amyloid-based, input-sensitive, chiroselective, and error-correcting information-processing system, and the evolution of mutualistic networks, were, arguably, of essential importance in the dynamic processes that led to increased complexity, organization, compartmentalization, and, eventually, the origin of life.

  5. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    PubMed

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  6. Information processing characteristics in subtypes of multiple sclerosis.

    PubMed

    De Sonneville, L M J; Boringa, J B; Reuling, I E W; Lazeron, R H C; Adèr, H J; Polman, C H

    2002-01-01

    The purpose of this study was to evaluate information processing characteristics in patients with multiple sclerosis (MS). We selected 53 patients with MS and 58 matched healthy controls. Using computerized tests, we investigated focused, divided, and sustained attention, and executive function, and attempted to pinpoint deficits in attentional control to peripheral or central processing stages. The results substantiate the hypothesis that the slowing of attention-demanding (controlled) information processing underlying more complex cognitive skills is general, i.e. irrespective of the type of controlled processing, with MS patients being 40% slower than controls. MS patients may suffer from focused, divided, and sustained attention deficits, as well as from compromised central processing stages, with secondary progressive (SP) patients showing the most extensive range of deficits, closely followed by primary progressive (PP) patients, while relapsing-remitting (RR) patients appear to be much less affected. General slowing appears to be highest in PP and SP type MS patients (50% slower) versus relapsing-remitting MS (24% slower). In contrast to most previous results, (complex) processing speed appeared to be robustly correlated with severity of MS as measured by the expanded disability status scale and with disease duration. Patients differed from controls much less in accuracy of processing, suggesting the importance of using time strategies in planning everyday life and job activities to compensate for or alleviate MS-related speed handicaps. Copyright 2002 Elsevier Science Ltd.

  7. High-performance technology for indexing of high volumes of Earth remote sensing data

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Kolesenkov, Aleksandr N.; Kostrov, Boris V.

    2017-10-01

    The present paper suggests a technology for the search, indexing, cataloging, and distribution of aerospace images on the basis of a geo-information approach combined with cluster and spectral analysis, and considers the information and algorithmic support of the system. A functional scheme of the system and the structure of the geographical database have been developed on the basis of geographical online portal technology. Taking into account the heterogeneity of information obtained from various sources, it is reasonable to apply a geoinformation platform that allows analyzing the spatial location of objects and territories and executing complex processing of information. The geoinformation platform is based on a cartographic foundation with a uniform coordinate system, a geographical database, and a set of algorithms and program modules for various tasks. A technology is suggested whereby individual users and companies can add images, taken with professional and amateur devices and processed by various software tools, to the system's archive. Complex usage of visual and instrumental approaches significantly expands the application area of Earth remote sensing data. The development and implementation of new algorithms based on the complex usage of new methods for processing structured and unstructured data of high volumes will increase the periodicity and rate of data updating. The paper shows that the application of original algorithms for the search, indexing, and cataloging of aerospace images provides easy access to information spread across hundreds of suppliers and increases the access rate to aerospace images by up to 5 times in comparison with current analogues.

  8. How understanding the neurobiology of complex post-traumatic stress disorder can inform clinical practice: a social cognitive and affective neuroscience approach.

    PubMed

    Lanius, R A; Bluhm, R L; Frewen, P A

    2011-11-01

    In this review, we examine the relevance of the social cognitive and affective neuroscience (SCAN) paradigm for an understanding of the psychology and neurobiology of complex post-traumatic stress disorder (PTSD) and its effective treatment. The relevant literature pertaining to SCAN and PTSD was reviewed. We suggest that SCAN offers a novel theoretical paradigm for understanding psychological trauma and its numerous clinical outcomes, most notably problems in emotional/self-awareness, emotion regulation, social emotional processing and self-referential processing. A core set of brain regions appear to mediate these collective psychological functions, most notably the cortical midline structures, the amygdala, the insula, posterior parietal cortex and temporal poles, suggesting that problems in one area (e.g. emotional awareness) may relate to difficulties in another (e.g. self-referential processing). We further propose, drawing on clinical research, that the experiences of individuals with PTSD related to chronic trauma often reflect impairments in multiple social cognitive and affective functions. It is important that the assessment and treatment of individuals with complex PTSD not only addresses traumatic memories but also takes a SCAN-informed approach that focuses on the underlying deficits in emotional/self-awareness, emotion regulation, social emotional processing and self-referential processing. © 2011 John Wiley & Sons A/S.

  9. The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information.

    PubMed

    Gable, Philip A; Harmon-Jones, Eddie

    2010-08-01

    Emotions influence attention and processes involved in memory. Although some research has suggested that positive affect categorically influences these processes differently than neutral affect, recent research suggests that motivational intensity of positive affective states influences these processes. The present experiments examined memory for centrally or peripherally presented information after the evocation of approach-motivated positive affect. Experiment 1 found that, relative to neutral conditions, pregoal, approach-motivated positive affect (caused by a monetary incentives task) enhanced memory for centrally presented information, whereas postgoal, low approach-motivated positive affect enhanced memory for peripherally presented information. Experiment 2 found that, relative to a neutral condition, high approach-motivated positive affect (caused by appetitive pictures) enhanced memory for centrally presented information but hindered memory for peripheral information. These results suggest a more complex relationship between positive affect and memory processes and highlight the importance of considering the motivational intensity of positive affects in cognitive processes. Copyright 2010 APA

  10. Reversal of alcohol-induced effects on response control due to changes in proprioceptive information processing.

    PubMed

    Stock, Ann-Kathrin; Mückschel, Moritz; Beste, Christian

    2017-01-01

    Recent research has drawn interest to the effects of binge drinking on response selection. However, choosing an appropriate response is a complex endeavor that usually requires us to process and integrate several streams of information. One of them is proprioceptive information about the position of limbs. Until now, however, it has remained elusive how binge drinking affects the processing of proprioceptive information during response selection and control in healthy individuals. We investigated this question using neurophysiological (EEG) techniques in a response selection task in which we manipulated proprioceptive information. The results show a reversal of alcohol-induced effects on response control due to changes in proprioceptive information processing. The most likely explanation for this finding is that proprioceptive information does not seem to be properly integrated into response selection processes during acute alcohol intoxication as found in binge drinking. The neurophysiological data suggest that processes related to the preparation and execution of the motor response, but not upstream processes related to conflict monitoring and spatial attentional orienting, underlie these binge drinking-dependent modulations. Taken together, the results show that even high doses of alcohol have very specific effects within the cascade of neurophysiological processes underlying response control and the integration of proprioceptive information during this process. © 2015 Society for the Study of Addiction.

  11. Reactive immunization on complex networks

    NASA Astrophysics Data System (ADS)

    Alfinito, Eleonora; Beccaria, Matteo; Fachechi, Alberto; Macorini, Guido

    2017-01-01

    Epidemic spreading on complex networks depends on the topological structure as well as on the dynamical properties of the infection itself. Generally speaking, highly connected individuals play the role of hubs and are crucial to channel information across the network. On the other hand, static topological quantities measuring the connectivity structure are independent of the dynamical mechanisms of the infection. A natural question is therefore how to improve the topological analysis by some kind of dynamical information that may be extracted from the ongoing infection itself. In this spirit, we propose a novel vaccination scheme that exploits information from the details of the infection pattern at the moment when the vaccination strategy is applied. Numerical simulations of the infection process show that the proposed immunization strategy is effective and robust on a wide class of complex networks.
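
    As an illustration of what "reactive" can mean in practice, here is a hedged sketch of an SIR simulation in which each step's vaccination budget is spent on susceptible neighbors of currently infected nodes; the paper's exact scoring rule may differ, and all parameters are arbitrary.

    ```python
    import random
    import networkx as nx

    def sir_reactive(G, beta=0.1, gamma=0.05, vax_budget=5, steps=200, seed=1):
        """SIR dynamics with reactive vaccination at the infection front."""
        rng = random.Random(seed)
        state = {n: "S" for n in G}
        state[rng.choice(list(G))] = "I"                  # single seed infection
        for _ in range(steps):
            infected = [n for n, s in state.items() if s == "I"]
            if not infected:
                break
            # Reactive step: immunize high-degree susceptible contacts of
            # infected nodes, exploiting the observed infection pattern.
            frontier = {v for u in infected for v in G[u] if state[v] == "S"}
            for v in sorted(frontier, key=G.degree, reverse=True)[:vax_budget]:
                state[v] = "V"
            for u in infected:                             # transmission + recovery
                for v in G[u]:
                    if state[v] == "S" and rng.random() < beta:
                        state[v] = "I"
                if rng.random() < gamma:
                    state[u] = "R"
        return sum(s in ("I", "R") for s in state.values())

    G = nx.barabasi_albert_graph(2000, 3, seed=42)         # scale-free test network
    print("final outbreak size:", sir_reactive(G))
    ```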

  12. Heuristics in Managing Complex Clinical Decision Tasks in Experts’ Decision Making

    PubMed Central

    Islam, Roosan; Weir, Charlene; Del Fiol, Guilherme

    2016-01-01

    Background: Clinical decision support is a tool to help experts make optimal and efficient decisions. However, little is known about the high-level abstractions in experts' thinking processes. Objective: The objective of the study is to understand how clinicians manage complexity while dealing with complex clinical decision tasks. Method: After approval from the Institutional Review Board (IRB), three clinical experts were interviewed, and the transcripts from these interviews were analyzed. Results: We found five broad categories of strategies used by experts for managing complex clinical decision tasks: decision conflict, mental projection, decision trade-offs, managing uncertainty, and generating rules of thumb. Conclusion: Complexity is created by decision conflicts, mental projection, limited options, and treatment uncertainty. Experts cope with complexity in a variety of ways, including using efficient and fast decision strategies to simplify complex decision tasks, mentally simulating outcomes, and focusing on only the most relevant information. Application: Understanding complex decision-making processes can help in designing clinical decision support allocated according to task complexity. PMID:27275019

  13. Heuristics in Managing Complex Clinical Decision Tasks in Experts' Decision Making.

    PubMed

    Islam, Roosan; Weir, Charlene; Del Fiol, Guilherme

    2014-09-01

    Clinical decision support is a tool to help experts make optimal and efficient decisions. However, little is known about the high-level abstractions in experts' thinking processes. The objective of the study is to understand how clinicians manage complexity while dealing with complex clinical decision tasks. After approval from the Institutional Review Board (IRB), three clinical experts were interviewed, and the transcripts from these interviews were analyzed. We found five broad categories of strategies used by experts for managing complex clinical decision tasks: decision conflict, mental projection, decision trade-offs, managing uncertainty, and generating rules of thumb. Complexity is created by decision conflicts, mental projection, limited options, and treatment uncertainty. Experts cope with complexity in a variety of ways, including using efficient and fast decision strategies to simplify complex decision tasks, mentally simulating outcomes, and focusing on only the most relevant information. Understanding complex decision-making processes can help in designing clinical decision support allocated according to task complexity.

  14. A visual approach to providing prognostic information to parents of children with retinoblastoma.

    PubMed

    Panton, Rachel L; Downie, Robert; Truong, Tran; Mackeen, Leslie; Kabene, Stefane; Yi, Qi-Long; Chan, Helen S L; Gallie, Brenda L

    2009-03-01

    Parents must rapidly assimilate complex information when a child is diagnosed with cancer. Education correlates with the ability to process and use medical information. Graphic tools aid reasoning and communicate complex ideas with precision and efficiency. We developed a graphic tool, DePICT (Disease-specific electronic Patient Illustrated Clinical Timeline), to visually display entire retinoblastoma treatment courses from real-time clinical data. We report retrospective evaluation of the effectiveness of DePICT to communicate risk and complexity of treatment to parents. We assembled DePICT graphics from multiple children on cards representing each stage of intraocular retinoblastoma. Forty-four parents completed a 14-item questionnaire to evaluate the understanding of retinoblastoma treatment and outcomes acquired from DePICT. As a proposed tool for informed consent, DePICT effectively communicated knowledge of complex medical treatment and risks, regardless of the education level. We identified multiple potential factors affecting parent comprehension of treatment complexity and risk. These include language proficiency (p=0.005) and age-related experience, as younger parents had higher education (p=0.021) but lower comprehension scores (p=0.011), regardless of first language. Provision of information at diagnosis concerning long-term treatment complexity helps parents of children with cancer. DePICT effectively transfers knowledge of treatments, risks, and prognosis in a manner that offsets parental educational disadvantages.

  15. Virtual Construction of Space Habitats: Connecting Building Information Models (BIM) and SysML

    NASA Technical Reports Server (NTRS)

    Polit-Casillas, Raul; Howe, A. Scott

    2013-01-01

    Current trends in the design, construction, and management of complex projects make use of Building Information Models (BIM), which connect different types of data to geometrical models. This information model allows types of analysis that go beyond pure graphical representation. Space habitats, regardless of their size, are also complex systems that require the synchronization of many types of information and disciplines beyond mass, volume, power, or other basic volumetric parameters. For this, state-of-the-art model-based systems engineering languages and processes, for instance SysML, represent a solid way to tackle the problem from a programmatic point of view. Nevertheless, integrating this with a powerful architectural design tool with BIM capabilities could change the workflow and paradigm of space habitat design, in a way applicable to other complex aerospace systems. This paper presents general findings and overall conclusions from ongoing research to create a design protocol and method that practically connects a systems engineering approach with BIM-based architectural and engineering design as a complete Model-Based Engineering approach. A hypothetical example is created and followed through the design process. To make this possible, the research also tackles the application of IFC categories and parameters in the aerospace field, starting with their application to space habitat design as a way to understand the information flow between disciplines and tools. By building virtual space habitats, we can potentially improve the way more complex designs are developed, from concept to manufacturing, in the near future.

  16. Using evaluation to adapt health information outreach to the complex environments of community-based organizations.

    PubMed

    Olney, Cynthia A

    2005-10-01

    After arguing that most community-based organizations (CBOs) function as complex adaptive systems, this white paper describes the evaluation goals, questions, indicators, and methods most important at different stages of community-based health information outreach. The paper presents the basic characteristics of complex adaptive systems and argues that the typical CBO can be considered this type of system. It then presents evaluation as a tool for helping outreach teams adapt their outreach efforts to the CBO environment and thus maximize success. Finally, it describes the goals, questions, indicators, and methods most important or helpful at each stage of evaluation (community assessment, needs assessment and planning, process evaluation, and outcomes assessment). Literature on complex adaptive systems as applied to health care, business, and evaluation settings is presented. Evaluation models and applications, particularly those based on participatory approaches, are presented as methods for maximizing the effectiveness of evaluation in dynamic CBO environments. If one accepts that CBOs function as complex adaptive systems, characterized by dynamic relationships among many agents, influences, and forces, then effective evaluation at the stages of community assessment, needs assessment and planning, process evaluation, and outcomes assessment is critical to outreach success.

  17. Hardware-software complex of informing passengers of forecasted route transport arrival at stop

    NASA Astrophysics Data System (ADS)

    Pogrebnoy, V. Yu; Pushkarev, M. I.; Fadeev, A. S.

    2017-02-01

    The paper presents a hardware-software complex for informing passengers of forecasted route transport arrival times. The client-server architecture of the forecasting information system is presented, and an electronic information board prototype is described. The scheme of information transfer and processing is illustrated and described, from the reception of navigation telemetry data from a transport vehicle, through the computation of the arrival time of passenger public transport at the stop, to the presentation of the information on the electronic board. Methods and algorithms for determining the current location of a transport vehicle in the city route network are considered in detail. A description of the proposed model for forecasting the arrival time of a transport vehicle at a stop is given. The result has been applied in Tomsk for forecasting and displaying arrival time information at stops.
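
    The abstract does not publish the forecasting model itself, so the following is only a minimal sketch of one common approach: summing expected travel times over the remaining route segments, using mean segment speeds learned from historical telemetry. All names and numbers are illustrative.

    ```python
    from dataclasses import dataclass

    @dataclass
    class SegmentStats:
        length_m: float          # segment length along the route
        mean_speed_mps: float    # historical mean speed from GPS telemetry

    def eta_seconds(route: list[SegmentStats], vehicle_segment: int,
                    fraction_done: float, stop_segment: int) -> float:
        """Sum expected travel times from the vehicle's position to the stop."""
        seg = route[vehicle_segment]
        t = (1.0 - fraction_done) * seg.length_m / seg.mean_speed_mps
        for nxt in route[vehicle_segment + 1 : stop_segment + 1]:
            t += nxt.length_m / nxt.mean_speed_mps
        return t

    route = [SegmentStats(400, 7.0), SegmentStats(650, 5.5), SegmentStats(300, 6.0)]
    # Vehicle 25% through segment 0, forecasting arrival at the stop on segment 2.
    print(f"ETA: {eta_seconds(route, 0, 0.25, 2):.0f} s")
    ```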

  18. Games and Simulation.

    ERIC Educational Resources Information Center

    Abt, Clark C.

    Educational games present the complex realities of simultaneous interactive processes more accurately and effectively than serial processes such as lecturing and reading. Objectives of educational gaming are to motivate students by presenting relevant and realistic problems and to induce more efficient and active understanding of information.…

  19. A multistage motion vector processing method for motion-compensated frame interpolation.

    PubMed

    Huang, Ai-Mei; Nguyen, Truong Q

    2008-05-01

    In this paper, a novel, low-complexity motion vector processing algorithm at the decoder is proposed for motion-compensated frame interpolation or frame rate up-conversion. We address the problems of broken edges and deformed structures in an interpolated frame by hierarchically refining motion vectors on different block sizes. Our method explicitly considers the reliability of each received motion vector and has the capability of preserving structure information. This is achieved by analyzing the distribution of residual energies and effectively merging blocks that have unreliable motion vectors. The motion vector reliability information is also used as prior knowledge in motion vector refinement, using a constrained vector median filter to avoid choosing identical unreliable ones. We also propose using chrominance information in our method. Experimental results show that the proposed scheme has better visual quality and is also robust, even in video sequences with complex scenes and fast motion.
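
    The vector median filter at the core of the refinement step can be sketched as follows; the boolean mask stands in for the paper's residual-energy-based reliability test, and the sample vectors are invented.

    ```python
    import numpy as np

    def vector_median(neighborhood: np.ndarray, reliable: np.ndarray) -> np.ndarray:
        """Return the candidate MV minimizing total L1 distance to its reliable peers."""
        candidates = neighborhood[reliable]          # constrain to reliable vectors
        costs = [np.abs(candidates - c).sum() for c in candidates]
        return candidates[int(np.argmin(costs))]

    mvs = np.array([[1, 1], [1, 2], [9, -7], [2, 1], [1, 1]])   # one outlier MV
    mask = np.array([True, True, False, True, True])            # outlier marked unreliable
    print(vector_median(mvs, mask))                             # -> [1 1]
    ```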

  20. Learning the wrong lessons? Science and fisheries management in the Chesapeake Bay blue crab fishery.

    PubMed

    Beem, Betsi

    2012-05-01

    This paper argues that information produced and then taken up for policy decision making is a function of a complex interplay within the scientific community and between scientists and the broader policy network who are all grappling with issues in a complex environment with a high degree of scientific uncertainty. The dynamics of forming and re-forming the scientific community are shaped by political processes, as are the directions and questions scientists attend to in their roles as policy advisors. Three factors: 1) social construction of scientific communities, 2) the indeterminacy of science, and 3) demands by policy makers to have concrete information for decision making; are intertwined in the production and dissemination of information that may serve as the basis for policy learning. Through this process, however, what gets learned may not be what is needed to mitigate the problem, be complete in terms of addressing multiple causations, or be correct.

  1. Learning from catchments to understand hydrological drought (HS Division Outstanding ECS Award Lecture)

    NASA Astrophysics Data System (ADS)

    Van Loon, Anne

    2017-04-01

    Drought is a global challenge. To be able to manage drought effectively on global or national scales without losing smaller scale variability and local context, we need to understand what the important hydrological drought processes are at different scales. Global scale models and satellite data are providing a global overview and catchment scale studies provide detailed site-specific information. I am interested in bridging these two scale levels by learning from catchments from around the world. Much information from local case studies is currently underused on larger scales because there is too much complexity. However, some of this complexity might be crucial on the level where people are facing the consequences of drought. In this talk, I will take you on a journey around the world to unlock catchment scale information and see if the comparison of many catchments gives us additional understanding of hydrological drought processes on the global scale. I will focus on the role of storage in different compartments of the terrestrial hydrological cycle, and how we as humans interact with that storage. I will discuss aspects of spatial and temporal variability in storage that are crucial for hydrological drought development and persistence, drawing from examples of catchments with storage in groundwater, lakes and wetlands, and snow and ice. The added complexity of human activities shifts the focus from natural catchments to catchments with anthropogenic increases in storage (reservoirs), decreases in storage (groundwater abstraction), and changes in hydrological processes (urbanisation). We learn how local information is providing valuable insights, in some cases challenging theoretical understanding or model outcomes. Despite the challenges of working across countries, with a high number of collaborators, in a multitude of languages, under data-scarce conditions, the scientific advantages of bridging scales are substantial. The comparison of catchments around the world can inform global scale models, give the needed spatial variability to satellite data, and help us make steps in understanding and managing the complex challenge of drought, now and in the future.

  2. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation.

    PubMed

    Fiore, Vincenzo G; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of actions. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation.

  3. In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation

    PubMed Central

    Fiore, Vincenzo G.; Kottler, Benjamin; Gu, Xiaosi; Hirth, Frank

    2017-01-01

    The central complex in the insect brain is a composite of midline neuropils involved in processing sensory cues and mediating behavioral outputs to orchestrate spatial navigation. Despite recent advances, however, the neural mechanisms underlying sensory integration and motor action selections have remained largely elusive. In particular, it is not yet understood how the central complex exploits sensory inputs to realize motor functions associated with spatial navigation. Here we report an in silico interrogation of central complex-mediated spatial navigation with a special emphasis on the ellipsoid body. Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs. Our simulations show integration of multiple sensory sources can be effectively performed in the ellipsoid body. This processed information is used to trigger continuous sequences of action selections resulting in self-motion, obstacle avoidance and the navigation of simulated environments of varying complexity. The motor responses to perceived sensory stimuli can be stored in the neural structure of the central complex to simulate navigation relying on a collective of guidance cues, akin to sensory-driven innate or habitual behaviors. By comparing behaviors under different conditions of accessible sources of input information, we show the simulated insect computes visual inputs and body posture to estimate its position in space. Finally, we tested whether the local connectome of the central complex might also allow the flexibility required to recall an intentional behavioral sequence, among different courses of actions. Our simulations suggest that the central complex can encode combined representations of motor and spatial information to pursue a goal and thus successfully guide orientation behavior. Together, the observed computational features identify central complex circuitry, and especially the ellipsoid body, as a key neural correlate involved in spatial navigation. PMID:28824390

  4. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    NASA Astrophysics Data System (ADS)

    Long, Nicholas James

    This thesis develops a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk, which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is first reduced to its essential components. Three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. A set of methods is then developed with which to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating in cybernetic theory, is suggested for interpreting complexity in variety. Secondly, a means to aggregate the results of each analysis into a holistic measurement of static complexity is proposed, using the Simple Multi-Attribute Rating Technique (SMART). Each method of static complexity analysis, and the aggregation technique, is demonstrated using notional data for four lunar oxygen production processes.
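
    A hedged sketch of the SMART-style aggregation step follows: normalize each dimension's raw score with a linear value function, then combine with weights. The weights, ranges, and process scores below are invented for illustration; the thesis's notional data are not reproduced here.

    ```python
    # Per-dimension weights for the three static-complexity dimensions (invented).
    DIMENSIONS = ["interconnective", "strength", "variety"]
    weights = {"interconnective": 0.5, "strength": 0.3, "variety": 0.2}

    def smart_score(raw: dict, lo: dict, hi: dict) -> float:
        """Aggregate per-dimension scores into one static-complexity number."""
        total = 0.0
        for d in DIMENSIONS:
            norm = (raw[d] - lo[d]) / (hi[d] - lo[d])   # linear value function to 0..1
            total += weights[d] * norm
        return total

    # Two of the notional oxygen-production processes, scored on the dimensions.
    processes = {
        "hydrogen reduction":  {"interconnective": 12, "strength": 7, "variety": 3},
        "molten electrolysis": {"interconnective": 20, "strength": 9, "variety": 5},
    }
    lo = {"interconnective": 0, "strength": 0, "variety": 0}
    hi = {"interconnective": 30, "strength": 10, "variety": 8}
    for name, raw in processes.items():
        print(name, round(smart_score(raw, lo, hi), 3))
    ```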

  5. Mass Spectrometry: A Technique of Many Faces

    PubMed Central

    Olshina, Maya A.; Sharon, Michal

    2016-01-01

    Protein complexes form the critical foundation for a wide range of biological processes; however, understanding the intricate details of their activities is often challenging. In this review we describe how mass spectrometry plays a key role in the analysis of protein assemblies and the cellular pathways in which they are involved. Specifically, we discuss how the versatility of mass spectrometric approaches provides unprecedented information on multiple levels. We demonstrate this on the ubiquitin-proteasome proteolytic pathway, a process that is responsible for protein turnover. We follow the various steps of this degradation route and illustrate the different mass spectrometry workflows that were applied for elucidating molecular information. Overall, this review aims to stimulate the integrated use of multiple mass spectrometry approaches for analyzing complex biological systems. PMID:28100928

  6. Synthetic analog and digital circuits for cellular computation and memory.

    PubMed

    Purcell, Oliver; Lu, Timothy K

    2014-10-01

    Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
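
    As a concrete example of the digital building blocks reviewed here, the following is a minimal sketch of a transcriptional NOT gate (a repressor inverter) modeled with a Hill repression function; the parameters are arbitrary and not taken from any particular published circuit.

    ```python
    from scipy.integrate import solve_ivp

    def inverter(t, y, input_level, beta=10.0, K=1.0, n=2.0, gamma=1.0):
        """One-state ODE: output protein produced under Hill repression by the input."""
        protein, = y
        production = beta / (1.0 + (input_level / K) ** n)   # repressed synthesis
        return [production - gamma * protein]                # minus dilution/degradation

    for u in (0.0, 5.0):   # logical 0 and logical 1 at the input
        sol = solve_ivp(inverter, (0, 20), [0.0], args=(u,))
        print(f"input={u}: steady output ~= {sol.y[0, -1]:.2f}")
    # High input represses output (logical 0); low input lets output rise (logical 1).
    ```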

  7. Complexity Level Analysis Revisited: What Can 30 Years of Hindsight Tell Us about How the Brain Might Represent Visual Information?

    PubMed Central

    Tsotsos, John K.

    2017-01-01

    Much has been written about how the biological brain might represent and process visual information, and how this might inspire and inform machine vision systems. Indeed, tremendous progress has been made, especially during the last decade in the latter area. However, a key question seems too often, if not mostly, to be ignored. This question is simply: do proposed solutions scale with the reality of the brain's resources? This scaling question applies equally to brain and to machine solutions. A number of papers have examined the inherent computational difficulty of visual information processing using theoretical and empirical methods. The main goal of this activity had three components: to understand the deep nature of the computational problem of visual information processing; to discover how well the computational difficulty of vision matches to the fixed resources of biological seeing systems; and, to abstract from the matching exercise the key principles that lead to the observed characteristics of biological visual performance. This set of components was termed complexity level analysis in Tsotsos (1987) and was proposed as an important complement to Marr's three levels of analysis. This paper revisits that work with the advantage that decades of hindsight can provide. PMID:28848458

  8. Fluctuations in Wikipedia access-rate and edit-event data

    NASA Astrophysics Data System (ADS)

    Kämpf, Mirko; Tismer, Sebastian; Kantelhardt, Jan W.; Muchnik, Lev

    2012-12-01

    Internet-based social networks often reflect extreme events in nature and society by drastic increases in user activity. We study and compare the dynamics of the two major complex processes necessary for information spread via the online encyclopedia ‘Wikipedia’, i.e., article editing (information upload) and article access (information viewing) based on article edit-event time series and (hourly) user access-rate time series for all articles. Daily and weekly activity patterns occur in addition to fluctuations and bursting activity. The bursts (i.e., significant increases in activity for an extended period of time) are characterized by a power-law distribution of durations of increases and decreases. For describing the recurrence and clustering of bursts we investigate the statistics of the return intervals between them. We find stretched exponential distributions of return intervals in access-rate time series, while edit-event time series yield simple exponential distributions. To characterize the fluctuation behavior we apply detrended fluctuation analysis (DFA), finding that most article access-rate time series are characterized by strong long-term correlations with fluctuation exponents α≈0.9. The results indicate significant differences in the dynamics of information upload and access and help in understanding the complex process of collecting, processing, validating, and distributing information in self-organized social networks.
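
    Detrended fluctuation analysis itself is compact enough to sketch: integrate the series, detrend it window by window, and read the exponent alpha off the log-log scaling of the residual fluctuations. The sketch below uses white noise (alpha near 0.5) as a stand-in for an access-rate series; long-term correlated series like those reported here would yield alpha near 0.9.

    ```python
    import numpy as np

    def dfa(x, scales):
        """Detrended fluctuation analysis; returns the fluctuation exponent alpha."""
        profile = np.cumsum(x - np.mean(x))          # integrated, mean-centered series
        flucts = []
        for s in scales:
            n_seg = len(profile) // s
            segs = profile[: n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            rms = []
            for seg in segs:                         # remove a linear trend per window
                coef = np.polyfit(t, seg, 1)
                rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
            flucts.append(np.mean(rms))
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha

    hourly_rates = np.random.rand(10000)             # stand-in for hourly access counts
    print(dfa(hourly_rates, scales=[16, 32, 64, 128, 256]))   # ~0.5 for white noise
    ```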

  9. Complexity Level Analysis Revisited: What Can 30 Years of Hindsight Tell Us about How the Brain Might Represent Visual Information?

    PubMed

    Tsotsos, John K

    2017-01-01

    Much has been written about how the biological brain might represent and process visual information, and how this might inspire and inform machine vision systems. Indeed, tremendous progress has been made, especially during the last decade, in the latter area. However, a key question seems too often, if not mostly, to be ignored. This question is simply: do proposed solutions scale with the reality of the brain's resources? This scaling question applies equally to brain and to machine solutions. A number of papers have examined the inherent computational difficulty of visual information processing using theoretical and empirical methods. The main goal of this activity had three components: to understand the deep nature of the computational problem of visual information processing; to discover how well the computational difficulty of vision matches the fixed resources of biological seeing systems; and to abstract from the matching exercise the key principles that lead to the observed characteristics of biological visual performance. This set of components was termed complexity level analysis in Tsotsos (1987) and was proposed as an important complement to Marr's three levels of analysis. This paper revisits that work with the advantage that decades of hindsight can provide.

  10. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to actualize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information for physics and engineering. In the model of SCV, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, the currently proposed SCV has difficulty describing complex scientific perceptions. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV can describe more complex scientific perceptions.

  11. Evolution of natural agents: preservation, advance, and emergence of functional information.

    PubMed

    Sharov, Alexei A

    2016-04-01

    Biological evolution is often viewed narrowly as a change of morphology or allele frequency in a sequence of generations. Here I pursue an alternative informational concept of evolution, as preservation, advance, and emergence of functional information in natural agents. Functional information is a network of signs (e.g., memory, transient messengers, and external signs) that are used by agents to preserve and regulate their functions. Functional information is preserved in evolution via complex interplay of copying and construction processes: the digital components are copied, whereas interpreting subagents together with scaffolds, tools, and resources, are constructed. Some of these processes are simple and invariant, whereas others are complex and contextual. Advance of functional information includes improvement and modification of already existing functions. Although the genome information may change passively and randomly, the interpretation is active and guided by the logic of agent behavior and embryonic development. Emergence of new functions is based on the reinterpretation of already existing information, when old tools, resources, and control algorithms are adopted for novel functions. Evolution of functional information progressed from protosemiosis, where signs correspond directly to actions, to eusemiosis, where agents associate signs with objects. Language is the most advanced form of eusemiosis, where the knowledge of objects and models is communicated between agents.

  12. Evolution of natural agents: preservation, advance, and emergence of functional information

    PubMed Central

    Sharov, Alexei A.

    2016-01-01

    Biological evolution is often viewed narrowly as a change of morphology or allele frequency in a sequence of generations. Here I pursue an alternative informational concept of evolution, as preservation, advance, and emergence of functional information in natural agents. Functional information is a network of signs (e.g., memory, transient messengers, and external signs) that are used by agents to preserve and regulate their functions. Functional information is preserved in evolution via complex interplay of copying and construction processes: the digital components are copied, whereas interpreting subagents together with scaffolds, tools, and resources, are constructed. Some of these processes are simple and invariant, whereas others are complex and contextual. Advance of functional information includes improvement and modification of already existing functions. Although the genome information may change passively and randomly, the interpretation is active and guided by the logic of agent behavior and embryonic development. Emergence of new functions is based on the reinterpretation of already existing information, when old tools, resources, and control algorithms are adopted for novel functions. Evolution of functional information progressed from protosemiosis, where signs correspond directly to actions, to eusemiosis, where agents associate signs with objects. Language is the most advanced form of eusemiosis, where the knowledge of objects and models is communicated between agents. PMID:27525048

  13. Neuropsychological study of FASD in a sample of American Indian children: processing simple versus complex information.

    PubMed

    Aragón, Alfredo S; Kalberg, Wendy O; Buckley, David; Barela-Scott, Lindsey M; Tabachnick, Barbara G; May, Philip A

    2008-12-01

    Although a large body of literature exists on cognitive functioning in alcohol-exposed children, it is unclear if there is a signature neuropsychological profile in children with Fetal Alcohol Spectrum Disorders (FASD). This study assesses cognitive functioning in children with FASD from several American Indian reservations in the Northern Plains States, and it applies a hierarchical model of simple versus complex information processing to further examine cognitive function. We hypothesized that complex tests would discriminate between children with FASD and culturally similar controls, while children with FASD would perform similarly to controls on relatively simple tests. Our sample includes 32 control children and 24 children with a form of FASD [fetal alcohol syndrome (FAS) = 10, partial fetal alcohol syndrome (PFAS) = 14]. The test battery measures general cognitive ability, verbal fluency, executive functioning, memory, and fine-motor skills. Many of the neuropsychological tests produced results consistent with a hierarchical model of simple versus complex processing. The complexity of the tests was determined a priori based on the number of cognitive processes involved in them. Multidimensional scaling was used to statistically analyze the accuracy of classifying the neurocognitive tests into a simple versus complex dichotomy. Hierarchical logistic regression models were then used to define the contribution made by complex versus simple tests in predicting the significant differences between children with FASD and controls. Complex test items discriminated better than simple test items. The tests that conformed well to the model were the Verbal Fluency test, the Progressive Planning Test (PPT), the Lhermitte memory tasks, and the Grooved Pegboard Test (GPT). Compared with controls, the children with FASD demonstrated impaired performance on letter fluency, while their performance was similar on category fluency. On the more complex PPT trials (problems 5 to 8), as well as the Lhermitte logical tasks, the FASD group performed worst. The differential performance between children with FASD and controls was evident across various neuropsychological measures. The children with FASD performed significantly more poorly on the complex tasks than did the controls. The identification of a neurobehavioral profile in children with prenatal alcohol exposure will help clinicians identify and diagnose children with FASD.

  14. “Gestaltomics”: Systems Biology Schemes for the Study of Neuropsychiatric Diseases

    PubMed Central

    Gutierrez Najera, Nora A.; Resendis-Antonio, Osbaldo; Nicolini, Humberto

    2017-01-01

    The integration of different sources of biological information about what defines a behavioral phenotype is difficult to achieve in an entity that reflects the arithmetic sum of its individual parts. In this sense, the challenge of Systems Biology for understanding the "psychiatric phenotype" is to provide an improved vision of the shape of the phenotype as it is visualized by "Gestalt" psychology, whose fundamental axiom is that the observed phenotype (behavior or mental disorder) will be the result of the integrative composition of every part. Therefore, we propose "Gestaltomics" as a Systems Biology term for integrating data coming from different sources of information (such as the genome, transcriptome, proteome, epigenome, metabolome, phenome, and microbiome). In addition to this biological complexity, the mind is integrated through multiple brain functions that receive and process complex information through channels and perception networks (i.e., sight, hearing, smell, memory, and attention) that in turn are programmed by genes and influenced by environmental processes (epigenetics). Today, the approach of medical research in human diseases is to isolate one disease for study; however, the presence of an additional disease (co-morbidity) or more than one disease (multimorbidity) adds complexity to the study of these conditions. This review will present the challenge of integrating psychiatric disorders at different levels of information (Gestaltomics). The implications of increasing the level of complexity, for example, studying the co-morbidity with another disease such as cancer, will also be discussed. PMID:28536537

  15. Cooperative outcome interdependence, task reflexivity, and team effectiveness: a motivated information processing perspective.

    PubMed

    De Dreu, Carsten K W

    2007-05-01

    A motivated information processing perspective (C. K. W. De Dreu & P. J. D. Carnevale, 2003; see also V. B. Hinsz, R. S. Tindale, & D. A. Vollrath, 1997) was used to predict that perceived cooperative outcome interdependence interacts with team-level reflexivity to predict information sharing, learning, and team effectiveness. A cross-sectional field study involving management and cross-functional teams (N = 46) performing nonroutine, complex tasks corroborated predictions: The more team members perceived cooperative outcome interdependence, the better they shared information, the more they learned and the more effective they were, especially when task reflexivity was high. When task reflexivity was low, no significant relationship was found between cooperative outcome interdependence and team processes and performance. The author concludes that the motivated information processing perspective is valid outside the confines of the laboratory and can be extended toward teamwork in organizations. 2007 APA, all rights reserved

  16. Assessment of Operational Barriers and Impediments to Transit Use: Transit Information and Scheduling for Major Activity Centers

    DOT National Transportation Integrated Search

    2001-12-01

    The decision to use public transit as a means of alternative transportation is a somewhat complex process. The potential rider must know that a public transportation system is available, how to contact the public transportation system for information...

  17. Information processing of earth resources data

    NASA Technical Reports Server (NTRS)

    Zobrist, A. L.; Bryant, N. A.

    1982-01-01

    Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.

  18. A Computerized Hospital Patient Information Management System

    PubMed Central

    Wig, Eldon D.

    1982-01-01

    The information processing needs of a hospital are many, with varying degrees of complexity. The prime concern in providing an integrated hospital information management system lies in the ability to process the data relating to the single entity for which every hospital functions - the patient. This paper examines the PRIMIS computer system developed to accommodate hospital needs with respect to a central patient registry, inpatients (i.e., Admission/Transfer/Discharge), and outpatients. Finally, the potential for expansion to permit the incorporation of more hospital functions within PRIMIS is examined.

  19. The role of NASA for aerospace information

    NASA Technical Reports Server (NTRS)

    Chandler, G. P., Jr.

    1980-01-01

    The NASA Scientific and Technical Information Program operations are performed by two contractor-operated facilities. The NASA STI Facility, located near Baltimore, Maryland, employs about 210 people who process report literature, operate the computer complex, and provide support for software maintenance and development. A second contractor, the Technical Information Services of the American Institute of Aeronautics and Astronautics, employs approximately 80 people in New York City and processes the open literature such as journals, magazines, and books. Features of these programs include online access via RECON, announcement services, and international document exchange.

  20. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
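
    The flavor of the object-oriented approach can be shown with a toy model in which receptor objects interact with a shared environment and a population-level property emerges from their individual updates. The class name, rates, and binding rule below are invented for illustration, not taken from the paper.

```python
import random

class Receptor:
    """Toy cell-surface receptor that stochastically binds ligand."""
    def __init__(self, p_bind=0.005):
        self.p_bind = p_bind
        self.bound = False

    def step(self, ligand_conc):
        if not self.bound and random.random() < self.p_bind * ligand_conc:
            self.bound = True
        return self.bound

random.seed(1)
receptors = [Receptor() for _ in range(1000)]
for _ in range(100):                      # simulated time steps
    bound = sum(r.step(ligand_conc=5.0) for r in receptors)
print(bound / len(receptors))             # emergent fraction bound
```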

  1. The Goddard Profiling Algorithm (GPROF): Description and Current Applications

    NASA Technical Reports Server (NTRS)

    Olson, William S.; Yang, Song; Stout, John E.; Grecu, Mircea

    2004-01-01

    Atmospheric scientists use different methods for interpreting satellite data. In the early days of satellite meteorology, the analysis of cloud pictures from satellites was primarily subjective. As computer technology improved, satellite pictures could be processed digitally, and mathematical algorithms were developed and applied to the digital images in different wavelength bands to extract information about the atmosphere in an objective way. The kind of mathematical algorithm one applies to satellite data may depend on the complexity of the physical processes that lead to the observed image, and how much information is contained in the satellite images both spatially and at different wavelengths. Imagery from satellite-borne passive microwave radiometers has limited horizontal resolution, and the observed microwave radiances are the result of complex physical processes that are not easily modeled. For this reason, a type of algorithm called a Bayesian estimation method is utilized to interpret passive microwave imagery in an objective, yet computationally efficient manner.
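
    A toy version of such a Bayesian estimation method is sketched below: the retrieved quantity is a database average weighted by a Gaussian radiance-error likelihood. The channel count, error model, and synthetic database are assumptions for illustration, not the operational GPROF implementation.

```python
import numpy as np

def bayesian_retrieval(obs_tb, db_tb, db_rain, sigma=5.0):
    """Weight each database profile by how well its simulated
    brightness temperatures match the observation."""
    d2 = np.sum((db_tb - obs_tb) ** 2, axis=1) / sigma**2
    w = np.exp(-0.5 * d2)                      # Gaussian likelihood
    return np.sum(w * db_rain) / np.sum(w)     # posterior-mean rain rate

rng = np.random.default_rng(0)
db_tb = rng.normal(250.0, 20.0, (1000, 5))     # 1000 profiles, 5 channels
db_rain = rng.gamma(2.0, 1.5, 1000)            # associated rain rates
print(bayesian_retrieval(db_tb[0], db_tb, db_rain))
```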

  2. The Relationship Between Cognitive Ability and the Iconic Processing of Spatial and Identity Information

    DTIC Science & Technology

    1989-02-01

    Doctor of Philosophy in the Department of Experimental and Clinical Psychology in the Graduate School of the University of Alabama. ... In contrast to this popular single buffer account, other authors have offered more complex descriptions of the mechanisms involved in the processing...experiment exploring sex differences in retention of verbal and spatial information in short-term memory. Males remembered letter identity and letter

  3. Assessing Neurophysiologic Markers for Training and Simulation to Develop Expertise in Complex Cognitive Tasks

    DTIC Science & Technology

    2010-09-01

    analysis process is to categorize the goal according to Gagné's (2005) domains of learning. These domains are: verbal information, intellectual...to terrain features. The ability to provide a clear verbal description of a unique feature is a learned task that may be separate from the...and experts differently. The process of verbally encoding information on location and providing this description may detract from the primary task of

  4. Communication Challenges in Neonatal Encephalopathy.

    PubMed

    Lemmon, Monica E; Donohue, Pamela K; Parkinson, Charlamaine; Northington, Frances J; Boss, Renee D

    2016-09-01

    Families must process complex information related to neonatal encephalopathy and therapeutic hypothermia. In this mixed methods study, semi-structured interviews were performed with parents whose infants were enrolled in an existing longitudinal cohort study of therapeutic hypothermia between 2011 and 2014. Thematic saturation was achieved after 20 interviews. Parental experience of communicating with clinicians was characterized by 3 principal themes. Theme 1 highlighted that a fragmented communication process mirrored the chaotic maternal and neonatal course. Parents often received key information about neonatal encephalopathy and therapeutic hypothermia from maternal clinicians. Infant medical information was often given to 1 family member (60%), who felt burdened by the responsibility to relay that information to others. Families universally valued the role of the bedside nurse, who was perceived as the primary source of communication for most (75%) families. Theme 2 encompassed the challenges of discussing the complex therapy of therapeutic hypothermia: families appreciated clinicians who used lay language and provided written material, and they often felt overwhelmed by technical information that made it hard to understand the "big picture" of their infant's medical course. Theme 3 involved the uncertain prognosis after neonatal encephalopathy. Parents appreciated specific expectations about their infant's long-term development, and experienced long-term distress about prognostic uncertainty. Communicating complex and large volumes of information in the midst of perinatal crisis presents inherent challenges for both clinicians and families. We identified an actionable set of communication challenges that can be addressed with targeted interventions. Copyright © 2016 by the American Academy of Pediatrics.

  5. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and the Kano model, have been applied in many fields by many enterprises over the past several decades, limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or the Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  6. 77 FR 21991 - Federal Housing Administration (FHA): Multifamily Accelerated Processing (MAP)-Lender and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-12

    ... Administration (FHA): Multifamily Accelerated Processing (MAP)--Lender and Underwriter Eligibility Criteria and....gov . FOR FURTHER INFORMATION CONTACT: Terry W. Clark, Office of Multifamily Development, Office of... qualifications could underwrite loans involving more complex multifamily housing programs and transactions. II...

  7. A business process modeling experience in a complex information system re-engineering.

    PubMed

    Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis

    2013-01-01

    This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.

  8. Keeping Signals Straight: How Cells Process Information and Make Decisions

    PubMed Central

    Laub, Michael T.

    2016-01-01

    As we become increasingly dependent on electronic information-processing systems at home and work, it’s easy to lose sight of the fact that our very survival depends on highly complex biological information-processing systems. Each of the trillions of cells that form the human body has the ability to detect and respond to a wide range of stimuli and inputs, using an extraordinary set of signaling proteins to process this information and make decisions accordingly. Indeed, cells in all organisms rely on these signaling proteins to survive and proliferate in unpredictable and sometimes rapidly changing environments. But how exactly do these proteins relay information within cells, and how do they keep a multitude of incoming signals straight? Here, I describe recent efforts to understand the fidelity of information flow inside cells. This work is providing fundamental insight into how cells function. Additionally, it may lead to the design of novel antibiotics that disrupt the signaling of pathogenic bacteria or it could help to guide the treatment of cancer, which often involves information-processing gone awry inside human cells. PMID:27427909

  9. Environmental hazard mapping using GIS and AHP - A case study of Dong Trieu District in Quang Ninh Province, Vietnam

    NASA Astrophysics Data System (ADS)

    Anh, N. K.; Phonekeo, V.; My, V. C.; Duong, N. D.; Dat, P. T.

    2014-02-01

    In recent years, the Vietnamese economy has been growing rapidly, and this growth has caused a serious decline in environmental quality, especially in industrial and mining areas. It poses an enormous threat to socially sustainable development and to human health. Environmental quality assessment and protection are complex and dynamic processes, since they involve spatial information from multi-sector, multi-region and multi-field sources and require complicated data processing. Therefore, an effective environmental protection information system is needed, in which the relevant factors hidden in complex relationships become clear and visible. In this paper, the authors present a methodology for generating environmental hazard maps based on the integration of the Analytic Hierarchy Process (AHP) and Geographic Information Systems (GIS). We demonstrate the results obtained for the study area in Dong Trieu district. This research contributes an overall perspective on environmental quality and identifies the most degraded areas, where the administration urgently needs to establish appropriate policies to improve and protect the environment.
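
    For readers unfamiliar with AHP, the core computation is compact: pairwise comparison judgments form a reciprocal matrix, the principal eigenvector gives the factor weights, and a consistency ratio checks the judgments. The three factors and comparison values below are hypothetical, not those elicited in the Dong Trieu study; the resulting weights would then score classified raster layers in the GIS overlay.

```python
import numpy as np

# Assumed pairwise comparisons for three hazard factors
# (e.g. mining proximity vs. land use vs. slope).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                       # AHP priority weights

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)  # consistency index
print(w, ci / 0.58)                # consistency ratio; RI = 0.58 for n = 3
```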

  10. Multifractal analysis of information processing in hippocampal neural ensembles during working memory under Δ9-tetrahydrocannabinol administration

    PubMed Central

    Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.

    2014-01-01

    Background Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297
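
    Wavelet-leaders analysis itself is involved, but the signature of multifractality can be illustrated with the simpler multifractal DFA, substituted here for brevity (it is not the WLMA method the paper uses): if the generalized exponent h(q) varies with q, the series is multifractal. The toy interspike-interval series and scale choices are assumptions.

```python
import numpy as np

def mfdfa_h(x, scales, qs):
    """h(q) from multifractal DFA; a q-dependent h(q) indicates
    multifractality, a constant h(q) a monofractal signal."""
    y = np.cumsum(x - np.mean(x))
    F = np.zeros((len(qs), len(scales)))
    for j, s in enumerate(scales):
        t = np.arange(s)
        f2 = np.array([np.mean((y[i*s:(i+1)*s]
                        - np.polyval(np.polyfit(t, y[i*s:(i+1)*s], 1), t)) ** 2)
                       for i in range(len(y) // s)])
        for i, q in enumerate(qs):
            if q == 0:
                F[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
            else:
                F[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    return [np.polyfit(np.log(scales), np.log(F[i]), 1)[0]
            for i in range(len(qs))]

isi = np.abs(np.random.randn(20000))       # toy interspike intervals
print(mfdfa_h(isi, [16, 32, 64, 128], [-3, -1, 1, 3]))
```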

  11. Designing the Regional College Management Information System.

    ERIC Educational Resources Information Center

    Kin Maung Kywe; And Others

    Beginning in 1976, Regional Colleges were formed in Burma to implement career and technical education at the post-secondary level. This paper describes the Regional Colleges and explores the possible use of a systemic management information process that could assist in the complex planning required to develop second-year vocational and technical…

  12. Field Day: A Case Study examining scientists’ oral performance skills

    USDA-ARS?s Scientific Manuscript database

    Communication is a complex cyclic process wherein senders and receivers encode and decode information in an effort to reach a state of mutuality or mutual understanding. When the communication of scientific or technical information occurs in a public space, effective speakers follow a formula for co...

  13. Impact of Information Incongruity and Authors Group Membership on Assimilation and Accommodation

    ERIC Educational Resources Information Center

    Moskaliuk, J.; Matschke, C.

    2018-01-01

    Learning is a complex process that can be differentiated into assimilation and accommodation. The Internet enables both types of learning through collaboration. There is, however, little research investigating the specific impact of social and information incongruity on assimilation and accommodation. The current research investigates how the…

  14. Information Robots and Manipulators.

    ERIC Educational Resources Information Center

    Katys, G. P.; And Others

    In the modern conception, a robot is a complex automatic cybernetic system capable of executing various operations in the sphere of human activity and, in various respects, imitating the physical and mental activity of man. Robots are a class of automatic information systems intended for search, collection, processing, and…

  15. The Role of Simple Semantics in the Process of Artificial Grammar Learning

    ERIC Educational Resources Information Center

    Öttl, Birgit; Jäger, Gerhard; Kaup, Barbara

    2017-01-01

    This study investigated the effect of semantic information on artificial grammar learning (AGL). Recursive grammars of different complexity levels (regular language, mirror language, copy language) were investigated in a series of AGL experiments. In the with-semantics condition, participants acquired semantic information prior to the AGL…

  16. Common Ground: An Interactive Visual Exploration and Discovery for Complex Health Data

    DTIC Science & Technology

    2015-04-01

    working with Intermountain Healthcare on a new rich dataset extracted directly from medical notes using natural language processing (NLP) algorithms...probabilities based on state-of-the-art NLP classifiers. At that stage the data did not include geographic information or temporal information but we

  17. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer strings, resulting in greater compression. With code-words of sufficient length, the statistical complexity becomes time-symmetric, a feature apparently novel to this quantum representation. This result has ramifications for the compression of classical information in quantum computing and quantum communication technology.
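
    The statistical complexity Cμ referred to here is simply the Shannon entropy of the stationary distribution over the machine's causal states. A minimal sketch follows; the two-state example machine is an assumption for illustration, not taken from the abstract.

```python
import numpy as np

def statistical_complexity(T):
    """C_mu in bits; T[i, j] = Pr(next state j | current state i)."""
    vals, vecs = np.linalg.eig(T.T)
    pi = vecs[:, np.argmin(np.abs(vals - 1.0))].real
    pi = np.abs(pi) / np.abs(pi).sum()          # stationary distribution
    pi = pi[pi > 0]
    return float(-np.sum(pi * np.log2(pi)))

T = np.array([[0.5, 0.5],                       # toy two-state machine
              [1.0, 0.0]])
print(statistical_complexity(T))                # ~0.918 bits
```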

  18. Emotional Picture and Word Processing: An fMRI Study on Effects of Stimulus Complexity

    PubMed Central

    Schlochtermeier, Lorna H.; Kuchinke, Lars; Pehrs, Corinna; Urton, Karolina; Kappelhoff, Hermann; Jacobs, Arthur M.

    2013-01-01

    Neuroscientific investigations regarding aspects of emotional experiences usually focus on one stimulus modality (e.g., pictorial or verbal). Similarities and differences in the processing between the different modalities have rarely been studied directly. The comparison of verbal and pictorial emotional stimuli often reveals a processing advantage of emotional pictures in terms of larger or more pronounced emotion effects evoked by pictorial stimuli. In this study, we examined whether this picture advantage refers to general processing differences or whether it might partly be attributed to differences in visual complexity between pictures and words. We first developed a new stimulus database comprising valence and arousal ratings for more than 200 concrete objects representable in different modalities including different levels of complexity: words, phrases, pictograms, and photographs. Using fMRI we then studied the neural correlates of the processing of these emotional stimuli in a valence judgment task, in which the stimulus material was controlled for differences in emotional arousal. No superiority for the pictorial stimuli was found in terms of emotional information processing with differences between modalities being revealed mainly in perceptual processing regions. While visual complexity might partly account for previously found differences in emotional stimulus processing, the main existing processing differences are probably due to enhanced processing in modality specific perceptual regions. We would suggest that both pictures and words elicit emotional responses with no general superiority for either stimulus modality, while emotional responses to pictures are modulated by perceptual stimulus features, such as picture complexity. PMID:23409009

  19. Structural Information Inference from Lanthanoid Complexing Systems: Photoluminescence Studies on Isolated Ions

    NASA Astrophysics Data System (ADS)

    Greisch, Jean Francois; Harding, Michael E.; Chmela, Jiri; Klopper, Willem M.; Schooss, Detlef; Kappes, Manfred M.

    2016-06-01

    The application of lanthanoid complexes ranges from photovoltaics and light-emitting diodes to quantum memories and biological assays. Rationalization of their design requires a thorough understanding of intramolecular processes such as energy transfer, charge transfer, and non-radiative decay involving their subunits. Characterization of the excited states of such complexes benefits considerably from mass spectrometric methods, since the associated optical transitions and processes are strongly affected by stoichiometry, symmetry, and overall charge state. We report herein spectroscopic measurements on ensembles of ions trapped in the gas phase and soft-landed in neon matrices. Their interpretation is considerably facilitated by direct comparison with computations. The combination of energy- and time-resolved measurements on isolated species with density functional as well as ligand-field and Franck-Condon computations enables us to infer structural as well as dynamical information about the species studied. The approach is first illustrated for sets of model lanthanoid complexes whose structure and electronic properties are systematically varied via the substitution of one component (lanthanoid or alkali/alkaline-earth ion): (i) the systematic dependence of ligand-centered phosphorescence on the lanthanoid(III) promotion energy and its impact on sensitization, and (ii) structural changes induced by the substitution of alkali or alkaline-earth ions, in relation to structures inferred using ion mobility spectroscopy. The temperature dependence of sensitization is briefly discussed. The focus then shifts to measurements involving europium complexes with doxycycline, an antibiotic of the tetracycline family. Besides discussing the complexes' structural and electronic features, we report on their use to monitor enzymatic processes involving hydrogen peroxide or biologically relevant molecules such as adenosine triphosphate (ATP).

  20. [Change in the event-related skin conductivity: an indicator of the immediate importance of elaborate information processing?].

    PubMed

    Zimmer, H

    1992-01-01

    In recent psychophysiological conceptualizations of the orienting response (OR) within the framework of information processing, the OR is increasingly considered a "call for processing resources", something which is especially inferred from variations in the event-related skin conductance response (SCR). The present study, therefore, was concerned with certain implications arising from this framework or perspective, particularly in regard to the question of whether stimuli eliciting skin conductance responses obligatorily receive/evoke processing priority or not. In order to examine whether these electrodermal responses denote a capturing of attention or merely a call for processing resources, short (1 s) pure sine tones of 65 dB with sudden onset (commonly used as orienting stimuli) were inserted in a reaction time paradigm with an additional memory load. This demand was primarily given because memory processes play a key role in theories of orienting and habituation. The task was run under two different conditions of complexity, factorially combined with a novelty variation of the added auditory stimuli. The results revealed a substantial deterioration of task performance subsequent to the occurrence of the tones, which, however, was dependent on task complexity and on novelty of the tones. The task impairment is particularly remarkable as subjects were asked to avoid distractions by paying attention to the task and as the tones were introduced as subsidiary and task-irrelevant. Together with the missing effects of task complexity on phasic and tonic electrodermal activity, results suggest that information-processing conceptualizations of the OR can only be a meaningful heuristic contribution to theoretical developments about human orienting and its habituation if the setting of processing priority, its conditions, as well as its implications are adequately taken into account. In addition, it seems to be promising to consider the strength of the SCR as an index of urgency of elaborate, attention-demanding processing and not as a peripheral physiological manifestation of the OR, or, respectively, of a call for unspecific processing resources. Such a view would also do justice to the aspect of prioritization. The sufficient conditions for an OR's occurrence could, in this context, be equated with, among others, some of those which activate a mechanism subserving selective attention and, as a possible result, which lead to further and more elaborate processing of potentially important information.

  1. Striatal and Hippocampal Entropy and Recognition Signals in Category Learning: Simultaneous Processes Revealed by Model-Based fMRI

    ERIC Educational Resources Information Center

    Davis, Tyler; Love, Bradley C.; Preston, Alison R.

    2012-01-01

    Category learning is a complex phenomenon that engages multiple cognitive processes, many of which occur simultaneously and unfold dynamically over time. For example, as people encounter objects in the world, they simultaneously engage processes to determine their fit with current knowledge structures, gather new information about the objects, and…

  2. How can systems engineering inform the methods of programme evaluation in health professions education?

    PubMed

    Rojas, David; Grierson, Lawrence; Mylopoulos, Maria; Trbovich, Patricia; Bagli, Darius; Brydges, Ryan

    2018-04-01

    We evaluate programmes in health professions education (HPE) to determine their effectiveness and value. Programme evaluation has evolved from use of reductionist frameworks to those addressing the complex interactions between programme factors. Researchers in HPE have recently suggested a 'holistic programme evaluation' aiming to better describe and understand the implications of 'emergent processes and outcomes'. We propose a programme evaluation framework informed by principles and tools from systems engineering. Systems engineers conceptualise complexity and emergent elements in unique ways that may complement and extend contemporary programme evaluations in HPE. We demonstrate how the abstract decomposition space (ADS), an engineering knowledge elicitation tool, provides the foundation for a systems engineering informed programme evaluation designed to capture both planned and emergent programme elements. We translate the ADS tool to use education-oriented language, and describe how evaluators can use it to create a programme-specific ADS through iterative refinement. We provide a conceptualisation of emergent elements and an equation that evaluators can use to identify the emergent elements in their programme. Using our framework, evaluators can analyse programmes not as isolated units with planned processes and planned outcomes, but as unfolding, complex interactive systems that will exhibit emergent processes and emergent outcomes. Subsequent analysis of these emergent elements will inform the evaluator as they seek to optimise and improve the programme. Our proposed systems engineering informed programme evaluation framework provides principles and tools for analysing the implications of planned and emergent elements, as well as their potential interactions. We acknowledge that our framework is preliminary and will require application and constant refinement. We suggest that our framework will also advance our understanding of the construct of 'emergence' in HPE research. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  3. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    PubMed Central

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  4. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity

    PubMed Central

    Ahn, Kwangwon

    2017-01-01

    We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for a better understanding of the financial market. Carrying out a big-data analysis of the analyst forecast data from I/B/E/S spanning nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects financial analysts in forming their forecasts. Here, regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts that incorporates interactions between analysts. Explaining the empirical data on analyst forecasts well, this provides an appealing instance of understanding social phenomena from the perspective of complex systems. PMID:28498831
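
    The Yule mechanism behind such heavy tails can be illustrated with a toy simulation (a sketch of the mechanism only; the paper's model additionally calibrates analyst interactions, and rho here is a hypothetical innovation rate). Each new forecast either starts a new "herd" or joins an existing one with probability proportional to its size.

```python
import numpy as np

def yule_herds(n_events, rho=1.0, seed=0):
    rng = np.random.default_rng(seed)
    herds = [1]
    for _ in range(n_events):
        if rng.random() < rho / (sum(herds) + rho):
            herds.append(1)                           # bold, independent forecast
        else:
            p = np.array(herds) / sum(herds)          # preferential attachment
            herds[rng.choice(len(herds), p=p)] += 1   # herding forecast
    return herds

sizes = yule_herds(5000)
print(len(sizes), max(sizes))   # the size distribution develops a heavy tail
```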

  5. Perceptual learning modules in mathematics: enhancing students' pattern recognition, structure extraction, and fluency.

    PubMed

    Kellman, Philip J; Massey, Christine M; Son, Ji Y

    2010-04-01

    Learning in educational settings emphasizes declarative and procedural knowledge. Studies of expertise, however, point to other crucial components of learning, especially improvements produced by experience in the extraction of information: perceptual learning (PL). We suggest that such improvements characterize both simple sensory and complex cognitive, even symbolic, tasks through common processes of discovery and selection. We apply these ideas in the form of perceptual learning modules (PLMs) to mathematics learning. We tested three PLMs, each emphasizing different aspects of complex task performance, in middle and high school mathematics. In the MultiRep PLM, practice in matching function information across multiple representations improved students' abilities to generate correct graphs and equations from word problems. In the Algebraic Transformations PLM, practice in seeing equation structure across transformations (but not solving equations) led to dramatic improvements in the speed of equation solving. In the Linear Measurement PLM, interactive trials involving extraction of information about units and lengths produced successful transfer to novel measurement problems and fraction problem solving. Taken together, these results suggest (a) that PL techniques have the potential to address crucial, neglected dimensions of learning, including discovery and fluent processing of relations; (b) PL effects apply even to complex tasks that involve symbolic processing; and (c) appropriately designed PL technology can produce rapid and enduring advances in learning. Copyright © 2009 Cognitive Science Society, Inc.

  6. Ant colony clustering with fitness perception and pheromone diffusion for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Ji, Junzhong; Song, Xiangjing; Liu, Chunnian; Zhang, Xiuzhen

    2013-08-01

    Community structure detection in complex networks has been intensively investigated in recent years. In this paper, we propose an adaptive approach based on ant colony clustering to discover communities in a complex network. The focus of the method is the clustering process of an ant colony in a virtual grid, where each ant represents a node in the complex network. During the ant colony search, the method uses a new fitness function to perceive the local environment and employs a pheromone diffusion model as a global information feedback mechanism to realize information exchange among ants. A significant advantage of our method is that the locations in the grid environment and the connections of the complex network structure are simultaneously taken into account as the ants move. Experimental results on computer-generated and real-world networks show the capability of our method to successfully detect community structures.

  7. Unifying Complexity and Information

    NASA Astrophysics Data System (ADS)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a role as fundamental as that of the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of this adaptability have remained unclear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of H for random processes, clearly redefining the boundary between the different criteria. Moreover, a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.

  8. Brain Dynamics Sustaining Rapid Rule Extraction from Speech

    ERIC Educational Resources Information Center

    de Diego-Balaguer, Ruth; Fuentemilla, Lluis; Rodriguez-Fornells, Antoni

    2011-01-01

    Language acquisition is a complex process that requires the synergic involvement of different cognitive functions, which include extracting and storing the words of the language and their embedded rules for progressive acquisition of grammatical information. As has been shown in other fields that study learning processes, synchronization…

  9. Knowledge Theories Can Inform Evaluation Practice: What Can a Complexity Lens Add?

    ERIC Educational Resources Information Center

    Hawe, Penelope; Bond, Lyndal; Butler, Helen

    2009-01-01

    Programs and policies invariably contain new knowledge. Theories about knowledge utilization, diffusion, implementation, transfer, and knowledge translation theories illuminate some mechanisms of change processes. But more often than not, when it comes to understanding patterns about change processes, "the foreground" is privileged more…

  10. Automated synthesis of image processing procedures using AI planning techniques

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.
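
    A miniature version of such a planner can be written as a search over STRIPS-style operators, each with preconditions and effects. The operator names below are invented stand-ins for VICAR subprograms, not MVP's actual operator models.

```python
# Toy operators: name -> (preconditions, effects)
OPS = {
    "radiometric_correct": ({"raw"}, {"radiometric_ok"}),
    "geometric_correct":   ({"radiometric_ok"}, {"geometric_ok"}),
    "enhance_contrast":    ({"geometric_ok"}, {"enhanced"}),
}

def plan(initial, goals):
    """Breadth-first forward search from the initial image state."""
    frontier = [(frozenset(initial), [])]
    seen = {frontier[0][0]}
    while frontier:
        state, steps = frontier.pop(0)
        if goals <= state:
            return steps
        for name, (pre, eff) in OPS.items():
            if pre <= state:
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None

print(plan({"raw"}, {"enhanced"}))
# ['radiometric_correct', 'geometric_correct', 'enhance_contrast']
```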

  11. Exploring the application of an evolutionary educational complex systems framework to teaching and learning about issues in the science and technology classroom

    NASA Astrophysics Data System (ADS)

    Yoon, Susan Anne

    Understanding the world through a complex systems lens has recently garnered a great deal of interest in many knowledge disciplines. In the educational arena, interactional studies, through their focus on understanding patterns of system behaviour including the dynamical processes and trajectories of learning, lend support for investigating how a complex systems approach can inform educational research. This study uses previously existing literature and tools for complex systems applications and seeks to extend this research base by exploring learning outcomes of a complex systems framework when applied to curriculum and instruction. It is argued that by applying the evolutionary dynamics of variation, interaction and selection, complexity may be harnessed to achieve growth in both the social and cognitive systems of the classroom. Furthermore, if the goal of education, i.e., the social system under investigation, is to teach for understanding, conceptual knowledge of the kind described in Popper's (1972; 1976) World 3 needs to evolve. Both the study of memetic processes and the knowledge building pioneered by Bereiter (cf. Bereiter, 2002) draw on the World 3 notion of ideas existing as conceptual artifacts that can be investigated as products outside of the individual mind, providing an educational lens from which to proceed. The curricular topic addressed is the development of an ethical understanding of the scientific and technological issues of genetic engineering. Eleven grade-8 students are studied as they proceed through 40 hours of curricular instruction based on the complex systems evolutionary framework. Results demonstrate growth in both complex systems thinking and content knowledge of the topic of genetic engineering. Several memetic processes are hypothesized to have influenced how and why ideas change. Categorized by factors influencing either reflective or non-reflective selection, these processes appear to have exerted differential effects on students' abilities to think and act in complex ways at various points throughout the study. Finally, an analysis of winner and loser memes is offered that is intended to reveal information about the conceptual system---its strengths and deficiencies---that can help educators assess curricular goals and organize and construct additional educational activities.

  12. Decoding the time-course of object recognition in the human brain: From visual features to categorical decisions.

    PubMed

    Contini, Erika W; Wardle, Susan G; Carlson, Thomas A

    2017-10-01

    Visual object recognition is a complex, dynamic process. Multivariate pattern analysis methods, such as decoding, have begun to reveal how the brain processes complex visual information. Recently, temporal decoding methods for EEG and MEG have offered the potential to evaluate the temporal dynamics of object recognition. Here we review the contribution of M/EEG time-series decoding methods to understanding visual object recognition in the human brain. Consistent with the current understanding of the visual processing hierarchy, low-level visual features dominate decodable object representations early in the time-course, with more abstract representations related to object category emerging later. A key finding is that the time-course of object processing is highly dynamic and rapidly evolving, with limited temporal generalisation of decodable information. Several studies have examined the emergence of object category structure, and we consider to what degree category decoding can be explained by sensitivity to low-level visual features. Finally, we evaluate recent work attempting to link human behaviour to the neural time-course of object processing. Copyright © 2017 Elsevier Ltd. All rights reserved.
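
    A minimal time-resolved decoding sketch follows (simulated epochs; in practice the data would come from a package such as MNE-Python): a classifier is cross-validated independently at every time point, and the accuracy trace shows when category information becomes decodable.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 200, 32, 100
X = rng.normal(size=(n_trials, n_sensors, n_times))   # toy M/EEG epochs
y = rng.integers(0, 2, n_trials)                      # two object categories
X[y == 1, :, 40:60] += 0.3                            # injected category signal

# Decode separately at each time point.
acc = [cross_val_score(LinearDiscriminantAnalysis(),
                       X[:, :, t], y, cv=5).mean()
       for t in range(n_times)]
print(int(np.argmax(acc)), round(max(acc), 2))        # peak latency, accuracy
```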

  13. How multiple social networks affect user awareness: The information diffusion process in multiplex networks

    NASA Astrophysics Data System (ADS)

    Li, Weihua; Tang, Shaoting; Fang, Wenyi; Guo, Quantong; Zhang, Xiao; Zheng, Zhiming

    2015-10-01

    The information diffusion process in single complex networks has been extensively studied, especially for modeling spreading activities in online social networks. However, individuals usually use multiple social networks at the same time, and can share information they have learned from one social network to another. This phenomenon gives rise to a new diffusion process on multiplex networks with more than one network layer. In this paper we account for this multiplex network spreading by proposing a model of information diffusion in two-layer multiplex networks. We develop a theoretical framework using bond percolation and cascading failure to describe the intralayer and interlayer diffusion. This allows us to obtain analytical solutions for the fraction of informed individuals as a function of the transmissibility T and the interlayer transmission rate θ. Simulation results show that interaction between layers can greatly enhance the information diffusion process. Explosive diffusion can even occur when the transmissibility of the focal layer is below the critical threshold, owing to interlayer transmission.
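
    The interplay of the two rates can be explored with a toy spreading simulation (a sketch, not the paper's percolation framework; the layer sizes and edge densities are assumptions). Information passes along edges of a layer with probability T and crosses to a node's replica in the other layer with probability θ.

```python
import networkx as nx
import numpy as np

def multiplex_spread(g1, g2, T, theta, seed_node=0, seed=0):
    rng = np.random.default_rng(seed)
    informed = {(1, seed_node)}
    stack = [(1, seed_node)]
    while stack:
        layer, node = stack.pop()
        g = g1 if layer == 1 else g2
        for nb in g.neighbors(node):                    # intralayer spread
            if (layer, nb) not in informed and rng.random() < T:
                informed.add((layer, nb)); stack.append((layer, nb))
        other = 2 if layer == 1 else 1                  # interlayer spread
        if (other, node) not in informed and rng.random() < theta:
            informed.add((other, node)); stack.append((other, node))
    return len(informed) / (2 * g1.number_of_nodes())

g1 = nx.erdos_renyi_graph(500, 0.01, seed=1)
g2 = nx.erdos_renyi_graph(500, 0.01, seed=2)
print(multiplex_spread(g1, g2, T=0.15, theta=0.5))
```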

  14. Individual nodeʼs contribution to the mesoscale of complex networks

    NASA Astrophysics Data System (ADS)

    Klimm, Florian; Borge-Holthoefer, Javier; Wessel, Niels; Kurths, Jürgen; Zamora-López, Gorka

    2014-12-01

    The analysis of complex networks is devoted to the statistical characterization of the topology of graphs at different scales of organization in order to understand their functionality. While the modular structure of networks has become an essential element to better apprehend their complexity, the efforts to characterize the mesoscale of networks have focused on the identification of the modules rather than describing the mesoscale in an informative manner. Here we propose a framework to characterize the position every node takes within the modular configuration of complex networks and to evaluate each node's function accordingly. For illustration, we apply this framework to a set of synthetic networks, empirical neural networks, and to the transcriptional regulatory network of Mycobacterium tuberculosis. We find that the architectures of both neuronal and transcriptional networks are optimized for the processing of multisensory information, with the coexistence of well-defined modules of specialized components and the presence of hubs conveying information from and to the distinct functional domains.
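
    As a minimal stand-in for such a characterization, the sketch below computes two classic per-node descriptors of position within a modular partition, the within-module degree z-score and the participation coefficient (Guimera and Amaral's role measures); these are related in spirit to, but not identical with, the paper's own framework, and the example network is arbitrary.

    ```python
    # Per-node mesoscale descriptors on a stand-in network.
    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    G = nx.karate_club_graph()
    modules = list(greedy_modularity_communities(G))
    module_of = {v: i for i, c in enumerate(modules) for v in c}

    # Within-module degree of every node.
    k_in = {v: sum(1 for u in G[v] if module_of[u] == module_of[v]) for v in G}
    for c in modules:
        vals = np.array([k_in[v] for v in c], dtype=float)
        mu, sd = vals.mean(), vals.std() or 1.0    # guard degenerate modules
        for v in c:
            z = (k_in[v] - mu) / sd                # within-module degree z-score
            k = G.degree(v)
            # Participation coefficient: 1 - sum_s (k_s / k)^2 over modules s.
            p = 1.0 - sum((sum(1 for u in G[v] if module_of[u] == s) / k) ** 2
                          for s in range(len(modules)))
            print(v, round(z, 2), round(p, 2))
    ```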

  15. Peace Process Pedagogy: Lessons from the No-Vote Victory in the Colombian Peace Referendum

    ERIC Educational Resources Information Center

    Gomez-Suarez, Andrei

    2017-01-01

    Is there a need for a new field within Peace Education that looks at the complex dynamics of transitional societies in the post-truth era? What formal and informal pedagogical strategies might be best suited for transforming "emotional anti-peace mindsets?" Drawing on practical examples from the complex political contingencies in…

  16. A complex case of congenital cystic renal disease

    PubMed Central

    Cordiner, David S; Evans, Clair A; Brundler, Marie-Anne; McPhillips, Maeve; Murio, Enric; Darling, Mark; Taheri, Sepideh

    2012-01-01

    This case outlines the potential complexity of autosomal recessive polycystic kidney disease (ARPKD). It highlights the challenges involved in managing this condition, some of the complications faced and areas of uncertainty in the decision making process. With a paucity of published paediatric cases on this subject, this should add to the pool of information currently available. PMID:22605879

  17. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E Zeynep; Cavuşoğlu, M Cenk

    2012-09-01

    Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate the integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation for new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes; this paper therefore focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  18. Efficient embedding of complex networks to hyperbolic space via their Laplacian

    PubMed Central

    Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.

    2016-01-01

    The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction. PMID:27445157

  19. Efficient embedding of complex networks to hyperbolic space via their Laplacian

    NASA Astrophysics Data System (ADS)

    Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.

    2016-07-01

    The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction.
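
    The angular part of such an embedding can be sketched with ordinary Laplacian eigenmaps, which is the core ingredient of the approach described above; the radial (popularity) coordinate assignment of the full hyperbolic method is omitted, and the generated network below is an illustrative assumption.

    ```python
    # Laplacian-eigenmaps core of a network embedding, on a stand-in graph.
    import numpy as np
    import networkx as nx

    G = nx.barabasi_albert_graph(500, 2, seed=0)   # stand-in growing network
    L = nx.normalized_laplacian_matrix(G).toarray()
    eigvals, eigvecs = np.linalg.eigh(L)           # ascending eigenvalues

    # Skip the trivial first eigenvector; the next two give planar coordinates
    # from which an angular coordinate per node can be read off.
    coords = eigvecs[:, 1:3]
    theta = np.arctan2(coords[:, 1], coords[:, 0])
    print(theta[:5])
    ```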

  20. Triangle network motifs predict complexes by complementing high-error interactomes with structural information.

    PubMed

    Andreopoulos, Bill; Winter, Christof; Labudde, Dirk; Schroeder, Michael

    2009-06-27

    A lot of high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from Munich Information center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than using second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that relatively little structural information would be sufficient for finding complexes involving most of the proteins and interactions in a typical PPIN.

  1. Triangle network motifs predict complexes by complementing high-error interactomes with structural information

    PubMed Central

    Andreopoulos, Bill; Winter, Christof; Labudde, Dirk; Schroeder, Michael

    2009-01-01

    Background A lot of high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. Results We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from Munich Information center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than using second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Conclusion Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that relatively little structural information would be sufficient for finding complexes involving most of the proteins and interactions in a typical PPIN. PMID:19558694
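
    The motif itself is simple to enumerate: two proteins that share an interaction partner in the PPIN (second-level neighbours) and are additionally linked by an SDDI close a triangle. The toy edge lists below are made-up placeholders, not data from the study.

    ```python
    # Toy enumeration of PPI-SDDI-PPI triangles.
    import networkx as nx

    ppi = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")])
    sddi = {frozenset(e) for e in [("A", "C"), ("C", "E")]}

    triangles = [
        (u, w, v)                      # u and v meet through shared partner w
        for w in ppi
        for u in ppi[w]
        for v in ppi[w]
        if u < v and frozenset((u, v)) in sddi
    ]
    print(triangles)                   # [('A', 'B', 'C'), ('C', 'D', 'E')]
    ```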

  2. Method and system for knowledge discovery using non-linear statistical analysis and a 1st and 2nd tier computer program

    DOEpatents

    Hively, Lee M [Philadelphia, TN]

    2011-07-12

    The invention relates to a method and apparatus for simultaneously processing different sources of test data into informational data and then processing different categories of informational data into knowledge-based data. The knowledge-based data can then be communicated between nodes in a system of multiple computers according to rules for a type of complex, hierarchical computer system modeled on a human brain.

  3. Microscopic information processing and communication in crowd dynamics

    NASA Astrophysics Data System (ADS)

    Henein, Colin Marc; White, Tony

    2010-11-01

    Due, perhaps, to the historical division of crowd dynamics research into psychological and engineering approaches, microscopic crowd models have tended toward modelling simple interchangeable particles with an emphasis on the simulation of physical factors. Despite the fact that people have complex (non-panic) behaviours in crowd disasters, important human factors in crowd dynamics such as information discovery and processing, changing goals and communication have not yet been well integrated at the microscopic level. We use our Microscopic Human Factors methodology to fuse a microscopic simulation of these human factors with a popular microscopic crowd model. By tightly integrating human factors with the existing model we can study the effects on the physical domain (movement, force and crowd safety) when human behaviour (information processing and communication) is introduced. In a large-room egress scenario with ample exits, information discovery and processing yields a crowd of non-interchangeable individuals who, despite close proximity, have different goals due to their different beliefs. This crowd heterogeneity leads to complex inter-particle interactions such as jamming transitions in open space; at high crowd energies, we found a freezing by heating effect (reminiscent of the disaster at Central Lenin Stadium in 1982) in which a barrier formation of naïve individuals trying to reach blocked exits prevented knowledgeable ones from exiting. Communication, when introduced, reduced this barrier formation, increasing both exit rates and crowd safety.

  4. Information thermodynamics of near-equilibrium computation

    NASA Astrophysics Data System (ADS)

    Prokopenko, Mikhail; Einav, Itai

    2015-06-01

    In studying fundamental physical limits and properties of computational processes, one is faced with the challenges of interpreting primitive information-processing functions through well-defined information-theoretic as well as thermodynamic quantities. In particular, transfer entropy, characterizing the function of computational transmission and its predictability, is known to peak near critical regimes. We focus on a thermodynamic interpretation of transfer entropy, aiming to explain the underlying critical behavior by associating information flows intrinsic to computational transmission with particular physical fluxes. Specifically, in isothermal systems near thermodynamic equilibrium, the gradient of the average transfer entropy is shown to be dynamically related to Fisher information and the curvature of the system's entropy. This relationship explicitly connects the predictability, sensitivity, and uncertainty of computational processes intrinsic to complex systems and allows us to consider thermodynamic interpretations of several important extreme cases and trade-offs.
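
    For readers who want the underlying quantity: with history length one, transfer entropy is T(X→Y) = Σ p(y_{t+1}, y_t, x_t) log₂[ p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ]. The plug-in estimator below, on simulated binary series, is a generic illustration of this definition rather than the paper's near-equilibrium setting.

    ```python
    # Plug-in transfer entropy estimate for binary series, history length 1.
    from collections import Counter
    import math
    import random

    random.seed(0)
    n = 100_000
    x = [random.randint(0, 1) for _ in range(n)]
    # y copies the previous x with 80% fidelity, so information flows X -> Y.
    y = [0] + [xi if random.random() < 0.8 else 1 - xi for xi in x[:-1]]

    triples = Counter(zip(y[1:], y[:-1], x[:-1]))     # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles_y = Counter(y[:-1])
    N = n - 1

    te = sum(
        c / N * math.log2((c / pairs_yx[yt, xt]) /
                          (pairs_yy[y1, yt] / singles_y[yt]))
        for (y1, yt, xt), c in triples.items()
    )
    print(f"TE(X->Y) ~ {te:.3f} bits")   # ~0.28 bits for this coupling
    ```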

  5. Physical Complexity and Cognitive Evolution

    NASA Astrophysics Data System (ADS)

    Jedlicka, Peter

    Our intuition tells us that there is a general trend in the evolution of nature, a trend towards greater complexity. However, there are several definitions of complexity, and hence it is difficult to argue for or against the validity of this intuition. Christoph Adami has recently introduced a novel measure called physical complexity that assigns low complexity to both ordered and random systems and high complexity to those in between. Physical complexity measures the amount of information that an organism stores in its genome about the environment in which it evolves. The theory of physical complexity predicts that evolution increases the amount of `knowledge' an organism accumulates about its niche. It might be fruitful to generalize Adami's concept of complexity to evolution as a whole (including the evolution of man). Physical complexity fits nicely into the philosophical framework of cognitive biology, which considers biological evolution a progressing process of knowledge accumulation (a gradual increase in epistemic complexity). According to this paradigm, evolution is a cognitive `ratchet' that pushes organisms unidirectionally towards higher complexity. A dynamic environment continually creates problems to be solved; to survive in the environment means to solve the problem, and the solution is embodied knowledge. Cognitive biology (as well as the theory of physical complexity) uses the concepts of information and entropy and views evolution from both information-theoretical and thermodynamic perspectives. Concerning humans as conscious beings, it seems necessary to postulate the emergence of a new kind of knowledge: self-aware and self-referential knowledge. The appearance of self-reflection in evolution indicates that the human brain has reached a new qualitative level of epistemic complexity.

  6. On the maximum-entropy/autoregressive modeling of time series

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
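
    In standard notation (not reproduced from the paper itself), the AR(p) model and its maximum-entropy spectrum can be written as follows; Prony's relation then identifies each complex-conjugate pole pair of the AR polynomial with one damped sinusoid in the time domain.

    ```latex
    % AR(p) model driven by white noise:
    x_t = \sum_{k=1}^{p} a_k\, x_{t-k} + \varepsilon_t,
      \qquad \varepsilon_t \sim \mathcal{N}(0, \sigma^2)
    % ME/AR spectrum, evaluated on the unit circle z = e^{i 2\pi f}:
    S(f) = \frac{\sigma^2}{\left| 1 - \sum_{k=1}^{p} a_k\, e^{-i 2\pi f k} \right|^{2}}
    % A pole pair z = r\,e^{\pm i 2\pi f_0} corresponds to the damped sinusoid
    % e^{t \ln r} \cos(2\pi f_0 t + \phi) in the time domain.
    ```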

  7. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    This report proposes an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, suitable processing of them yields the information needed to create cryptographic keys. Processing is based on reconstructing a mathematical model that generates time series diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics but also for protecting transmitted data in telemedicine complexes.

  8. The value of mechanistic biophysical information for systems-level understanding of complex biological processes such as cytokinesis.

    PubMed

    Pollard, Thomas D

    2014-12-02

    This review illustrates the value of quantitative information including concentrations, kinetic constants and equilibrium constants in modeling and simulating complex biological processes. Although much has been learned about some biological systems without these parameter values, they greatly strengthen mechanistic accounts of dynamical systems. The analysis of muscle contraction is a classic example of the value of combining an inventory of the molecules, atomic structures of the molecules, kinetic constants for the reactions, reconstitutions with purified proteins and theoretical modeling to account for the contraction of whole muscles. A similar strategy is now being used to understand the mechanism of cytokinesis using fission yeast as a favorable model system. Copyright © 2014 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  9. Occam’s Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel

    NASA Astrophysics Data System (ADS)

    Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-02-01

    A stochastic process's statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase in the quantum advantage. We demonstrate that maximum compression is determined by the process's cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.

  10. A Longitudinal Integration of Identity Styles and Educational Identity Processes in Adolescence

    ERIC Educational Resources Information Center

    Negru-Subtirica, Oana; Pop, Eleonora Ioana; Crocetti, Elisabetta

    2017-01-01

    Identity formation is a main adolescent psychosocial developmental task. The complex interconnection between different processes that are at the basis of one's identity is a research and applied intervention priority. In this context, the identity style model focuses on social-cognitive strategies (i.e., informational, normative, and…

  11. Evaluation of an electronic nose for improved biosolids alkaline-stabilization treatment and odor management

    USDA-ARS?s Scientific Manuscript database

    Electronic nose sensors are designed to detect differences in complex air sample matrices. For example, they have been used in the food industry to monitor process performance and quality control. However, no information is available on the application of sensor arrays to monitor process performanc...

  12. The Role of Independent Educational Consultants in the College Application Process

    ERIC Educational Resources Information Center

    Smith, Jill M.

    2014-01-01

    This dissertation focuses on the growing role of private, for-profit Independent Educational Consultants (IECs) in the college application process. Over the past two decades, an "admission industrial complex" of commercial enterprises designed to help students strategize about admissions and give them information about colleges has…

  13. Mismatch and Conflict: Neurophysiological and Behavioral Evidence for Conflict Priming

    ERIC Educational Resources Information Center

    Mager, Ralph; Meuth, Sven G.; Krauchi, Kurt; Schmidlin, Maria; Muller-Spahn, Franz; Falkenstein, Michael

    2009-01-01

    Conflict-related cognitive processes are critical for adapting to sudden environmental changes that confront the individual with inconsistent or ambiguous information. Thus, these processes play a crucial role to cope with daily life. Generally, conflicts tend to accumulate especially in complex and threatening situations. Therefore, the question…

  14. Instructional Dynamics in Two-Year Postsecondary Institutions: Concepts, Trends, and Assessment Issues. Information Series No. 318.

    ERIC Educational Resources Information Center

    Alfred, Richard L.; Hummel, Mary L.

    Postsecondary instructional dynamics is a complex process in which inputs (student characteristics and expectations, resources, and faculty characteristics and preparation) are converted through the educational process (instruction strategies, models, and techniques as well as supportive services) into outputs (outcomes and benefits of instruction…

  15. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  16. Process-Oriented Worked Examples: Improving Transfer Performance through Enhanced Understanding

    ERIC Educational Resources Information Center

    van Gog, Tamara; Paas, Fred; van Merrienboer, Jeroen J. G.

    2004-01-01

    The research on worked examples has shown that for novices, studying worked examples is often a more effective and efficient way of learning than solving conventional problems. This theoretical paper argues that adding process-oriented information to worked examples can further enhance transfer performance, especially for complex cognitive skills…

  17. Evaluation: Boundary Identification in the Non-Linear Special Education System.

    ERIC Educational Resources Information Center

    Yacobacci, Patricia M.

    The evaluation process within special education, as in general education, most often becomes one of data collection consisting of formal and informal tests given by the school psychologist and the classroom instructor. Influences of the complex environment on the educational process are often ignored. Evaluation factors include mainstreaming,…

  18. Description and operational status of the National Transonic Facility computer complex

    NASA Technical Reports Server (NTRS)

    Boyles, G. B., Jr.

    1986-01-01

    This paper describes the National Transonic Facility (NTF) computer complex and its support of tunnel operations. The capabilities of the research data acquisition and reduction systems are discussed, along with the types of data that can be acquired and presented. Pretest, test, and posttest capabilities are also outlined, along with a discussion of how the computer complex monitors the tunnel control processes and provides the tunnel operators with the information needed to control the tunnel. Planned enhancements to the computer complex in support of future testing are presented.

  19. Visual attention.

    PubMed

    Evans, Karla K; Horowitz, Todd S; Howe, Piers; Pedersini, Roccardo; Reijnen, Ester; Pinto, Yair; Kuzmova, Yoana; Wolfe, Jeremy M

    2011-09-01

    A typical visual scene we encounter in everyday life is complex and filled with a huge amount of perceptual information. The term 'visual attention' describes a set of mechanisms that limit some processing to a subset of incoming stimuli. Attentional mechanisms shape what we see and what we can act upon. They allow for concurrent selection of some (preferably, relevant) information and inhibition of other information. This selection permits the reduction of complexity and informational overload. Selection can be determined both by the 'bottom-up' saliency of information from the environment and by the 'top-down' state and goals of the perceiver. Attentional effects can take the form of modulating or enhancing the selected information. A central role for selective attention is to enable the 'binding' of selected information into unified and coherent representations of objects in the outside world. In the overview on visual attention presented here we review the mechanisms and consequences of selection and inhibition over space and time. We examine theoretical, behavioral and neurophysiologic work done on visual attention. We also discuss the relations between attention and other cognitive processes such as automaticity and awareness. WIREs Cogn Sci 2011, 2, 503-514. DOI: 10.1002/wcs.127. For further resources related to this article, please visit the WIREs website. Copyright © 2011 John Wiley & Sons, Ltd.

  20. Assessment of Unconscious Decision Aids Applied to Complex Patient-Centered Medical Decisions

    PubMed Central

    Manigault, Andrew Wilhelm; Whillock, Summer Rain

    2015-01-01

    Background To improve patient health, recent research urges for medical decision aids that are designed to enhance the effectiveness of specific medically related decisions. Many such decisions involve complex information, and decision aids that independently use deliberative (analytical and slower) or intuitive (more affective and automatic) cognitive processes for such decisions result in suboptimal decisions. Unconscious thought can arguably use both intuitive and deliberative (slow and analytic) processes, and this combination may further benefit complex patient (or practitioner) decisions as medical decision aids. Indeed, mounting research demonstrates that individuals render better decisions generally if they are distracted from thinking consciously about complex information after it is presented (but can think unconsciously), relative to thinking about that information consciously or not at all. Objective The current research tested whether the benefits of unconscious thought processes can be replicated using an Internet platform for a patient medical decision involving complex information. This research also explored the possibility that judgments reported after a period of unconscious thought are actually the result of a short period of conscious deliberation occurring during the decision report phase. Methods A total of 173 participants in a Web-based experiment received information about four medical treatments, the best (worst) associated with mostly positive (negative) side-effects/attributes and the others with equal positive-negative ratios. Next, participants were either distracted for 3 minutes (unconscious thought), instructed to think about the information for 3 minutes (conscious thought), or moved directly to the decision task (immediate decision). Finally, participants reported their choice of, and attitudes toward, the treatments while experiencing high, low, or no cognitive load, which varied their ability to think consciously while reporting judgments. Cognitive load was manipulated by having participants memorize semi-random (high), line structured (low), or no dot patterns and recall these intermittently with their decision reports. Overall then, participants were randomly assigned to the conditions of a 3 (thought condition) by 3 (cognitive-load level) between-subjects design. Results A logistic regression analysis indicated that the odds of participants choosing the best treatment were 2.25 times higher in the unconscious-thought condition compared to the immediate-decision condition (b=.81, Wald=4.32, P=.04, 95% CI 1.048-4.836), and 2.39 times greater compared to the conscious-thought condition (b=.87, Wald=4.87, P=.027, 95% CI 1.103-5.186). No difference was observed between the conscious-thought condition compared to the immediate-decision condition, and cognitive load manipulations did not affect choices or alter the above finding. Conclusions This research demonstrates a plausible benefit of unconscious thinking as a decision aid for complex medical decisions, and represents the first use of unconscious thought processes as a patient-centered medical decision aid. Further, the quality of decisions reached unconsciously does not appear to be affected by the amount of cognitive load participants experienced. PMID:25677337
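
    As a quick arithmetic check, the reported odds ratios follow directly from exponentiating the logistic-regression coefficients:

    ```latex
    % Odds ratio implied by a logistic coefficient b:
    \mathrm{OR} = e^{b}: \qquad e^{0.81} \approx 2.25, \qquad e^{0.87} \approx 2.39
    ```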

  1. Processing reafferent and exafferent visual information for action and perception.

    PubMed

    Reichenbach, Alexandra; Diedrichsen, Jörn

    2015-01-01

    A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were either made by watching the visual scene without moving or made simultaneously to the reaching tasks, such that the perceptual processing stream could also profit from the specialized processing of reafferent information in the latter case. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.

  2. Understanding the Influence of the Complex Relationships among Informal and Formal Supports on the Well-Being of Caregivers of Persons with Dementia

    ERIC Educational Resources Information Center

    Raina, Parminder; McIntyre, Chris; Zhu, Bin; McDowell, Ian; Santaguida, Pasqualina; Kristjansson, Betsy; Hendricks, Alexandra; Massfeller, Helen; Chambers, Larry

    2004-01-01

    This study examined the direct and indirect relationships between caring for a person with dementia and caregiver health. A conceptual model of the caregiver stress process considered informal caregiver characteristics, sources of caregiver stress, and the influence of informal and formal support on the well-being of the caregivers of persons with…

  3. Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates

    NASA Astrophysics Data System (ADS)

    Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki

    2018-04-01

    We propose a novel method for agile optimization of processing design through the visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and the complex mechanical information was intuitively understood by the surgeons acting as designers. In addition, mechanoluminescence analysis clarified that small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information, evaluated on the basis of mechanoluminescence, was successfully applied to judging the validity of the processing design. This clearly demonstrates the effectiveness of the new mechanoluminescence-based methodology for assisting agile optimization of processing design.

  4. Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols

    PubMed Central

    2016-01-01

    The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM–0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information. PMID:27385047

  5. Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols.

    PubMed

    Liu, Zhengchun; Liu, Yi; Kim, Eunkyoung; Bentley, William E; Payne, Gregory F

    2016-07-19

    The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM-0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information.

  6. Neural correlates in the processing of phoneme-level complexity in vowel production.

    PubMed

    Park, Haeil; Iverson, Gregory K; Park, Hae-Jeong

    2011-12-01

    We investigated how articulatory complexity at the phoneme level is manifested neurobiologically in an overt production task. fMRI images were acquired from young Korean-speaking adults as they pronounced bisyllabic pseudowords in which we manipulated phonological complexity defined in terms of vowel duration and instability (viz., COMPLEX: /tiɯi/ > MID-COMPLEX: /tiye/ > SIMPLE: /tii/). Increased activity in the left inferior frontal gyrus (Brodmann Areas (BA) 44 and 47), supplementary motor area and anterior insula was observed for the articulation of COMPLEX sequences relative to MID-COMPLEX; this was the case with the articulation of MID-COMPLEX relative to SIMPLE, except that the pars orbitalis (BA 47) was dominantly identified in the Broca's area. The differentiation indicates that phonological complexity is reflected in the neural processing of distinct phonemic representations, both by recruiting brain regions associated with retrieval of phonological information from memory and via articulatory rehearsal for the production of COMPLEX vowels. In addition, the finding that increased complexity engages greater areas of the brain suggests that brain activation can be a neurobiological measure of articulo-phonological complexity, complementing, if not substituting for, biomechanical measurements of speech motor activity. 2011 Elsevier Inc. All rights reserved.

  7. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process set up correctly. The critical dimension of components decreases following Moore's law; at the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on extensive process context information. These systems can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring spatial variability under control in real time through our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for 3D shape monitoring, implemented in the waferfab.
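
    A minimal flavour of such real-time SPC monitoring of spatial variability is sketched below: the intra-wafer dispersion of a critical dimension is tracked against three-sigma Shewhart limits. The data, baseline window and limits are illustrative assumptions, not the fab's actual recipe.

    ```python
    # Shewhart-style monitoring of intra-wafer spatial dispersion.
    import numpy as np

    rng = np.random.default_rng(7)
    # Hypothetical critical-dimension measurements: 50 wafers x 49 sites each.
    cd = rng.normal(45.0, 0.4, size=(50, 49))
    cd[40:] += rng.normal(0.0, 0.6, size=(10, 49))   # tool drift after wafer 40

    spread = cd.std(axis=1)              # intra-wafer spatial dispersion
    center = spread[:30].mean()          # baseline from qualified wafers
    sigma = spread[:30].std()
    ucl, lcl = center + 3 * sigma, max(center - 3 * sigma, 0.0)

    for i, s in enumerate(spread):
        if not lcl <= s <= ucl:
            print(f"wafer {i}: dispersion {s:.3f} outside [{lcl:.3f}, {ucl:.3f}]")
    ```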

  8. Octopus Cells in the Posteroventral Cochlear Nucleus Provide the Main Excitatory Input to the Superior Paraolivary Nucleus

    PubMed Central

    Felix II, Richard A.; Gourévitch, Boris; Gómez-Álvarez, Marcelo; Leijon, Sara C. M.; Saldaña, Enrique; Magnusson, Anna K.

    2017-01-01

    Auditory streaming enables perception and interpretation of complex acoustic environments that contain competing sound sources. At early stages of central processing, sounds are segregated into separate streams representing attributes that later merge into acoustic objects. Streaming of temporal cues is critical for perceiving vocal communication, such as human speech, but our understanding of circuits that underlie this process is lacking, particularly at subcortical levels. The superior paraolivary nucleus (SPON), a prominent group of inhibitory neurons in the mammalian brainstem, has been implicated in processing temporal information needed for the segmentation of ongoing complex sounds into discrete events. The SPON requires temporally precise and robust excitatory input(s) to convey information about the steep rise in sound amplitude that marks the onset of voiced sound elements. Unfortunately, the sources of excitation to the SPON and the impact of these inputs on the behavior of SPON neurons have yet to be resolved. Using anatomical tract tracing and immunohistochemistry, we identified octopus cells in the contralateral cochlear nucleus (CN) as the primary source of excitatory input to the SPON. Cluster analysis of miniature excitatory events also indicated that the majority of SPON neurons receive one type of excitatory input. Precise octopus cell-driven onset spiking coupled with transient offset spiking make SPON responses well-suited to signal transitions in sound energy contained in vocalizations. Targets of octopus cell projections, including the SPON, are strongly implicated in the processing of temporal sound features, which suggests a common pathway that conveys information critical for perception of complex natural sounds. PMID:28620283

  9. Use of the self-organising map network (SOMNet) as a decision support system for regional mental health planning.

    PubMed

    Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R

    2018-04-25

    Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between their structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the domain experts judged the DSS model feasible and that it reached level 7 of the TRL (system prototype demonstration in operational environment). This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
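
    The self-organising map at the heart of SOMNet reduces to a short update loop: find the best-matching prototype for each input and pull it, together with its grid neighbours, toward that input. Grid size, learning schedule and data below are illustrative assumptions; the full SOMNet layers network analysis and expert collaboration on top of maps like this.

    ```python
    # Bare-bones self-organising map training loop.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(500, 6))      # hypothetical area-level indicators
    grid = rng.normal(size=(10, 10, 6))   # 10x10 map of 6-d prototype vectors
    ii, jj = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")

    for t, x in enumerate(data):
        frac = t / len(data)
        lr = 0.5 * (1 - frac)                       # decaying learning rate
        radius = 3.0 * (1 - frac) + 0.5             # shrinking neighbourhood
        d = ((grid - x) ** 2).sum(axis=2)
        bi, bj = np.unravel_index(d.argmin(), d.shape)   # best-matching unit
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * radius ** 2))
        grid += lr * h[..., None] * (x - grid)      # pull the neighbourhood

    print(grid.shape)  # trained prototypes; similar inputs map to nearby units
    ```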

  10. Connected Worlds: Connecting the public with complex environmental systems

    NASA Astrophysics Data System (ADS)

    Uzzo, S. M.; Chen, R. S.; Downs, R. R.

    2016-12-01

    Among the most important concepts in environmental science learning is the structure and dynamics of coupled human and natural systems (CHANS). But the fundamental epistemology for understanding CHANS requires systems thinking, interdisciplinarity, and complexity. Although the Next Generation Science Standards mandate connecting ideas across disciplines and systems, traditional approaches to education do not provide more than superficial understanding of this concept. Informal science learning institutions have a key role in bridging gaps between the reductive nature of classroom learning and contemporary data-driven science. The New York Hall of Science, in partnership with Design I/O and Columbia University's Center for International Earth Science Information Network, has developed an approach to immerse visitors in complex human nature interactions and provide opportunities for those of all ages to elicit and notice environmental consequences of their actions. Connected Worlds is a nearly 1,000 m2 immersive, playful environment in which students learn about complexity and interconnectedness in ecosystems and how ecosystems might respond to human intervention. It engages students through direct interactions with fanciful flora and fauna within and among six biomes: desert, rainforest, grassland, mountain valley, reservoir, and wetlands, which are interconnected through stocks and flows of water. Through gestures and the manipulation of a dynamic water system, Connected Worlds enables students, teachers, and parents to experience how the ecosystems of planet Earth are connected and to observe relationships between the behavior of Earth's inhabitants and our shared world. It is also a cyberlearning platform to study how visitors notice and scaffold their understanding of complex environmental processes and the responses of these processes to human intervention, to help inform the improvement of education practices in complex environmental science.

  11. Resolving Complex Research Data Management Issues in Biomedical Laboratories: Qualitative Study of an Industry-Academia Collaboration

    PubMed Central

    Myneni, Sahiti; Patel, Vimla L.; Bova, G. Steven; Wang, Jian; Ackerman, Christopher F.; Berlinicke, Cynthia A.; Chen, Steve H.; Lindvall, Mikael; Zack, Donald J.

    2016-01-01

    This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods of data analysis to 1) characterize specific problems faced by biomedical researchers with traditional information management practices, 2) identify intervention areas to introduce a new research information management system called Labmatrix, and finally 3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end-user perseverance, human-centric interoperability evaluation, and demonstration of return on the investment of effort and time by laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. Technology transfer experience in a complex environment such as the biomedical laboratory can be eased with the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and, hopefully, to scientific productivity. PMID:26652980

  12. Multiple identities in Northern Ireland: hierarchical ordering in the representation of group membership.

    PubMed

    Crisp, R J; Hewstone, M; Cairns, E

    2001-12-01

    A study was conducted to explore whether participants in Northern Ireland attend to, and process information about, different group members as a function of a single dimension of category membership (religion) or as a function of additional and/or alternative bases for group membership. Utilizing a bogus 'newspaper story' paradigm, we explored whether participants would differentially recall target attributes as a function of two dimensions of category membership. Findings from this recall measure suggested that information concerning ingroup and outgroup members was processed as an interactive function of both religion and gender intergroup dimensions. Religion was only used to guide processing of more specific information if the story character was also an outgroup member on the gender dimension. These findings suggest a complex pattern of intergroup representation in the processing of group-relevant information in the Northern Irish context.

  13. Development of a Handbook for Educators: Addressing Working Memory Capacity in Elementary Students

    ERIC Educational Resources Information Center

    Fernandez, Julie Marie

    2013-01-01

    Working Memory (WM) refers to a brain system that provides temporary storage and manipulation of the information necessary for complex cognitive tasks such as language comprehension, learning, and reasoning. WM also requires the simultaneous storage and processing of information. WM is directly related to academic performance in the classroom.…

  14. Using Learning Analytics to Characterize Student Experimentation Strategies in Engineering Design

    ERIC Educational Resources Information Center

    Vieira, Camilo; Goldstein, Molly Hathaway; Purzer, Senay; Magana, Alejandra J.

    2016-01-01

    Engineering design is a complex process both for students to participate in and for instructors to assess. Informed designers use the key strategy of conducting experiments as they test ideas to inform next steps. Conversely, beginning designers experiment less, often with confounding variables. These behaviours are not easy to assess in…

  15. Improving IT Portfolio Management Decision Confidence Using Multi-Criteria Decision Making and Hypervariate Display Techniques

    ERIC Educational Resources Information Center

    Landmesser, John Andrew

    2014-01-01

    Information technology (IT) investment decision makers are required to process large volumes of complex data. An existing body of knowledge relevant to IT portfolio management (PfM), decision analysis, visual comprehension of large volumes of information, and IT investment decision making suggest Multi-Criteria Decision Making (MCDM) and…

  16. Faith Informing Competitive Youth Athletes in Christian Schooling

    ERIC Educational Resources Information Center

    Hoven, Matt

    2016-01-01

    How do students use religious faith to inform their actions in competitive sport? This qualitative study critically reflects on this question based upon the thinking processes and experiences of 15-year-old participants in sports and, in turn, produces a basic conceptual framework toward the question at hand. Overall, students reported a complex,…

  17. Online Community Detection for Large Complex Networks

    PubMed Central

    Pan, Gang; Zhang, Wangsheng; Wu, Zhaohui; Li, Shijian

    2014-01-01

    Complex networks describe a wide range of systems in nature and society. To understand complex networks, it is crucial to investigate their community structure. In this paper, we develop an online community detection algorithm with linear time complexity for large complex networks. Our algorithm processes a network edge by edge, in the order that the network is fed to the algorithm. When a new edge is added, the algorithm simply updates the existing community structure in constant time rather than re-computing over the whole network. Therefore, it can efficiently process large networks in real time. Our algorithm optimizes expected modularity, instead of modularity, at each step to avoid poor performance. The experiments are carried out using 11 public data sets, and are measured by two criteria, modularity and NMI (Normalized Mutual Information). The results show that our algorithm's running time is lower than that of the commonly used Louvain algorithm, while it gives competitive performance. PMID:25061683
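
    For reference, the (expected) modularity being optimized is the standard Newman-Girvan quantity:

    ```latex
    % A is the adjacency matrix, k_i the degree of node i, m the number of
    % edges, c_i the community of node i, and \delta the Kronecker delta.
    Q = \frac{1}{2m} \sum_{ij} \left( A_{ij} - \frac{k_i k_j}{2m} \right)
        \delta(c_i, c_j)
    ```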

  18. Multilevel depth and image fusion for human activity detection.

    PubMed

    Ni, Bingbing; Pei, Yong; Moulin, Pierre; Yan, Shuicheng

    2013-10-01

    Recognizing complex human activities usually requires the detection and modeling of individual visual features and the interactions between them. Current methods rely only on the visual features extracted from 2-D images, and therefore often lead to unreliable salient visual feature detection and inaccurate modeling of the interaction context between individual features. In this paper, we show that these problems can be addressed by combining data from a conventional camera and a depth sensor (e.g., Microsoft Kinect). We propose a novel complex activity recognition and localization framework that effectively fuses information from both grayscale and depth image channels at multiple levels of the video processing pipeline. At the individual visual feature detection level, depth-based filters are applied to the detected human/object rectangles to remove false detections. At the next level of interaction modeling, 3-D spatial and temporal contexts among human subjects or objects are extracted by integrating information from both grayscale and depth images. Depth information is also utilized to distinguish different types of indoor scenes. Finally, a latent structural model is developed to integrate the information from multiple levels of video processing for activity detection. Extensive experiments on two activity recognition benchmarks (one with depth information) and a challenging grayscale + depth human activity database containing complex human-human, human-object, and human-surroundings interactions demonstrate the effectiveness of the proposed multilevel grayscale + depth fusion scheme. Higher recognition and localization accuracies are obtained relative to previous methods.

  19. Objects and processes: Two notions for understanding biological information.

    PubMed

    Mercado-Reyes, Agustín; Padilla-Longoria, Pablo; Arroyo-Santos, Alfonso

    2015-09-07

    In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as metaphoric. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do not account fully for the complex phenomenon that the term "information" refers to. We propose a restructuring of the concept into two related, but independent notions, and conclude that a complete theory of biological information must account completely not only for both notions, but also for the relationship between them. Copyright © 2015 Elsevier Ltd. All rights reserved.
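
    The authors' exact complexity measure is not given here, but a common rough proxy for the kind of comparison the paper describes is compressed length: a meaningful (semantic) sequence and a random one of the same size can be compared by how well they compress. The sketch below is a generic illustration under that assumption, not a reproduction of the paper's experiment.

    ```python
    # Compression ratio as a crude complexity proxy for two sequences.
    import random
    import zlib

    random.seed(0)
    semantic = ("the cell transcribes dna into rna and "
                "translates rna into protein " * 30).encode()
    random_seq = bytes(random.getrandbits(8) for _ in range(len(semantic)))

    for name, s in [("semantic", semantic), ("random", random_seq)]:
        ratio = len(zlib.compress(s)) / len(s)
        print(name, round(ratio, 3))   # the semantic text compresses far better
    ```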

  20. Ab Initio and Monte Carlo Approaches for the Magnetocaloric Effect in Co- and In-Doped Ni-Mn-Ga Heusler Alloys

    NASA Astrophysics Data System (ADS)

    Sokolovskiy, Vladimir; Grünebohm, Anna; Buchelnikov, Vasiliy; Entel, Peter

    2014-09-01

    This special issue collects contributions from the participants of the "Information in Dynamical Systems and Complex Systems" workshop, which cover a wide range of important problems and new approaches that lie in the intersection of information theory and dynamical systems. The contributions include theoretical characterization and understanding of the different types of information flow and causality in general stochastic processes, inference and identification of coupling structure and parameters of system dynamics, rigorous coarse-grain modeling of network dynamical systems, and exact statistical testing of fundamental information-theoretic quantities such as the mutual information. The collective efforts reported herein reflect a modern perspective of the intimate connection between dynamical systems and information flow, leading to the promise of better understanding and modeling of natural complex systems and better/optimal design of engineering systems.

  1. Grounding explanations in evolving, diagnostic situations

    NASA Technical Reports Server (NTRS)

    Johannesen, Leila J.; Cook, Richard I.; Woods, David D.

    1994-01-01

    Certain fields of practice involve the management and control of complex dynamic systems: flight deck operations in commercial aviation, control of space systems, anesthetic management during surgery, and chemical or nuclear process control. Fault diagnosis of these dynamic systems generally must occur with the monitored process on-line and in conjunction with maintaining system integrity. This research seeks to understand in more detail what it means for an intelligent system to function cooperatively, or as a 'team player', in complex, dynamic environments. The approach taken was to study human practitioners engaged in the management of a complex, dynamic process: anesthesiologists during neurosurgical operations. The investigation focused on understanding how team members cooperate in management and fault diagnosis, and on comparing this interaction to the situation with an Artificial Intelligence (AI) system that provides diagnoses and explanations. Of particular concern was to study the ways in which practitioners support one another in keeping aware of relevant information concerning the state of the monitored process and of the problem-solving process.

  2. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physically based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models, owing to the lack of sufficient knowledge about a hydrologic process. This shortcoming results from the informal methods currently used for organizing and sharing information about hydrologic processes; a knowledge-base ontology provides the missing formal standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can evolve through the contributions of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
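
    A sketch of the kind of semantic query such a service might answer, written in Python with rdflib. The namespace, terms, and equation string are hypothetical; the HP ontology's actual vocabulary is not given in the abstract.

        from rdflib import Graph, Literal, Namespace, RDF

        HP = Namespace("http://example.org/hp#")   # hypothetical namespace

        g = Graph()
        g.add((HP.Infiltration, RDF.type, HP.HydrologicProcess))
        g.add((HP.Infiltration, HP.hasMethod, HP.GreenAmpt))
        g.add((HP.Infiltration, HP.hasMethod, HP.Philip))
        g.add((HP.GreenAmpt, HP.hasEquation,
               Literal("f(t) = K * (1 + psi * delta_theta / F(t))")))

        # SPARQL in the spirit of the paper's semantic querying service:
        # list the methods of the infiltration process and their equations.
        q = """
            PREFIX hp: <http://example.org/hp#>
            SELECT ?method ?eq WHERE {
                hp:Infiltration hp:hasMethod ?method .
                OPTIONAL { ?method hp:hasEquation ?eq }
            }"""
        for method, eq in g.query(q):
            print(method, eq)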

  3. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.

  4. "Chemical transformers" from nanoparticle ensembles operated with logic.

    PubMed

    Motornov, Mikhail; Zhou, Jian; Pita, Marcos; Gopishetty, Venkateshwarlu; Tokarev, Ihor; Katz, Evgeny; Minko, Sergiy

    2008-09-01

    pH-responsive nanoparticles were coupled with information-processing enzyme-based systems to yield "smart" signal-responsive hybrid systems with built-in Boolean logic. The enzyme systems performed AND/OR logic operations, transducing biochemical input signals into reversible structural changes (signal-directed self-assembly) of the nanoparticle assemblies, thus processing and amplifying the biochemical signals. The hybrid system mimics biological systems in effectively processing complex biochemical information, resulting in reversible changes of the self-assembled nanoparticle structures. This bioinspired approach to nanostructured morphing materials could be used in future self-assembled molecular robotic systems.

  5. Certification for civil flight decks and the human-computer interface

    NASA Technical Reports Server (NTRS)

    Mcclumpha, Andrew J.; Rudisill, Marianne

    1994-01-01

    This paper will address human factors aspects of civil flight deck certification, with emphasis on the pilot's interface with automation. In particular, three questions will be asked that relate to this certification process: (1) are the methods, data, and guidelines available from human factors adequate to address the problems of certifying as safe and error tolerant the complex automated systems of modern civil transport aircraft; (2) do aircraft manufacturers effectively apply human factors information during the aircraft flight deck design process; and (3) do regulatory authorities effectively apply human factors information during the aircraft certification process?

  6. Systems and processes that ensure high quality care.

    PubMed

    Bassett, Sally; Westmore, Kathryn

    2012-10-01

    This is the second in a series of articles examining the components of good corporate governance. It considers how the structures and processes for quality governance can affect an organisation's ability to be assured about the quality of care. Complex information systems and procedures can lead to poor quality care, but sound structures and processes alone are insufficient to ensure good governance, and behavioural factors play a significant part in making sure that staff are enabled to provide good quality care. The next article in this series looks at how the information reporting of an organisation can affect its governance.

  7. Clinical Information Systems as the Backbone of a Complex Information Logistics Process: Findings from the Clinical Information Systems Perspective for 2016.

    PubMed

    Hackl, W O; Ganslandt, T

    2017-08-01

    Objective: To summarize recent research and to propose a selection of best papers published in 2016 in the field of Clinical Information Systems (CIS). Method: The query used to retrieve the articles for the CIS section of the 2016 edition of the IMIA Yearbook of Medical Informatics was reused. It again aimed at identifying relevant publications in the field of CIS from PubMed and Web of Science and comprised search terms from the Medical Subject Headings (MeSH) catalog as well as additional free-text search terms. The retrieved articles were categorized in a multi-pass review carried out by the two section editors. The final selection of candidate papers was then peer-reviewed by Yearbook editors and external reviewers. Based on the review results, the best papers were chosen at the selection meeting with the IMIA Yearbook editorial board. Text mining, term co-occurrence mapping, and topic modelling techniques were used to obtain an overview of the content of the retrieved articles. Results: The query was carried out in mid-January 2017, yielding a consolidated result set of 2,190 articles published in 921 different journals. Of these, 14 papers were nominated as candidate best papers, and three were finally selected as the best papers in the CIS field. The content analysis of the articles revealed the broad spectrum of topics covered by CIS research. Conclusions: The CIS field is multi-dimensional and complex. It is hard to draw a well-defined outline between CIS and other domains or other sections of the IMIA Yearbook. The trends observed in previous years are progressing. Clinical information systems are more than just sociotechnical systems for data collection, processing, exchange, presentation, and archiving. They are the backbone of a complex, trans-institutional information logistics process. Georg Thieme Verlag KG Stuttgart.

  8. LOD BIM Element specification for Railway Turnout Systems Risk Mitigation using the Information Delivery Manual

    NASA Astrophysics Data System (ADS)

    Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan

    2017-10-01

    Railway turnouts are complex systems designed with complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtime and to monetary compensation in cases of death or injury. These are fundamental drivers for mitigating risks that arise from poor risk management during design. Prevention through Design (PtD) is a process that introduces tacit knowledge from industry professionals into the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) in which tacit knowledge can be stored in, and retrieved from, a digital database, making it easy to take prompt decisions because the information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, adding a layer of complexity to the management of information across the different stages of the project and across disciplines. To overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets that describe a rail turnout graphically and non-graphically as the model progresses, permitting their inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise, and the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for implementing LOD constructs within the IDM for code checking is required for the industry to progress in this field.

  9. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters, as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.
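
    A minimal sketch of the calibration step in Python, assuming a scalar material parameter with a Gaussian prior standing in for microscale/fabrication information and a known-noise Gaussian likelihood; all numbers are illustrative, not from the report.

        import numpy as np

        mu0, var0 = 10.0, 4.0      # prior from "microscale" information (assumed)
        noise_var = 1.0            # assumed measurement noise variance

        rng = np.random.default_rng(0)
        data = rng.normal(12.0, np.sqrt(noise_var), size=20)  # synthetic experiments

        # Conjugate Gaussian update of the calibration posterior.
        post_var = 1.0 / (1.0 / var0 + len(data) / noise_var)
        post_mu = post_var * (mu0 / var0 + data.sum() / noise_var)
        print(f"posterior: mean={post_mu:.2f}, sd={post_var**0.5:.2f}")
        # A validation phase would repeat this update against held-out
        # experiments and check consistency of the predicted QoIs.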

  10. The persistence of a visual dominance effect in a telemanipulator task: A comparison between visual and electrotactile feedback

    NASA Technical Reports Server (NTRS)

    Gaillard, J. P.

    1981-01-01

    The possibility of using electrotactile stimulation in teleoperation, and of observing how such information is interpreted as feedback to the operator, was investigated. It is proposed that visual feedback is more informative than electrotactile feedback, and that complex electrotactile feedback slows down both the motor decision and motor response processes, is processed as an all-or-nothing signal, and bypasses the receptive structure to feed directly into a working memory where information is sequentially processed and where treatment capacity is limited. The electrotactile stimulation is used as an alerting signal. It is suggested that the visual dominance effect results from the advantage of both a transfer function and a sensory memory register where information is pretreated and memorized for a short time. It is found that dividing attention affects the acquisition of information but not the subsequent decision processes.

  11. Informational technologies in modern educational structure

    NASA Astrophysics Data System (ADS)

    Fedyanin, A. B.

    2017-01-01

    The article describes the structure of the informational-technology complex applied in modern school education, outlines the most important educational methods, and presents the results of their implementation. It covers the forms and methods of informational support for the educational process, examined from several angles, including the psychological characteristics of students. The article also notes a range of worrying facts and dangerous trends connected with the use and spread of informational technologies that must be taken into account when informatizing the educational process. The material is based on many years of experience in operating and developing the informational educational environment at a secondary school specializing in physics and mathematics.

  12. Project Integration Architecture

    NASA Technical Reports Server (NTRS)

    Jones, William Henry

    2008-01-01

    The Project Integration Architecture (PIA) is a distributed, object-oriented, conceptual, software framework for the generation, organization, publication, integration, and consumption of all information involved in any complex technological process in a manner that is intelligible to both computers and humans. In the development of PIA, it was recognized that in order to provide a single computational environment in which all information associated with any given complex technological process could be viewed, reviewed, manipulated, and shared, it is necessary to formulate all the elements of such a process on the most fundamental level. In this formulation, any such element is regarded as being composed of any or all of three parts: input information, some transformation of that input information, and some useful output information. Another fundamental principle of PIA is the assumption that no consumer of information, whether human or computer, can be assumed to have any useful foreknowledge of an element presented to it. Consequently, a PIA-compliant computing system is required to be ready to respond to any questions, posed by the consumer, concerning the nature of the proffered element. In colloquial terms, a PIA-compliant system must be prepared to provide all the information needed to place the element in context. To satisfy this requirement, PIA extends the previously established object-oriented-programming concept of self-revelation and applies it on a grand scale. To enable pervasive use of self-revelation, PIA exploits another previously established object-oriented-programming concept, that of semantic infusion through class derivation. By means of self-revelation and semantic infusion through class derivation, a consumer of information can inquire about the contents of all information entities (e.g., databases and software) and can interact appropriately with those entities, as sketched below. Other key features of PIA are listed.
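
    A minimal Python sketch of self-revelation combined with semantic infusion through class derivation. The class names and interface are hypothetical; the abstract does not specify PIA's actual interfaces.

        class PIAObject:
            # Base class: any element can describe itself to a consumer
            # that has no foreknowledge of it.
            def reveal(self):
                return {
                    "lineage": [c.__name__ for c in type(self).__mro__[:-1]],
                    "attributes": {k: type(v).__name__
                                   for k, v in vars(self).items()},
                }

        class Transformation(PIAObject):
            # Semantic infusion through derivation: this subclass marks an
            # element as an input -> transformation -> output unit.
            def __init__(self, inputs, outputs):
                self.inputs = inputs
                self.outputs = outputs

        t = Transformation(inputs=["mach_number"], outputs=["thrust_estimate"])
        print(t.reveal())   # lineage and attributes revealed on request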

  13. Combination of binaural and harmonic masking release effects in the detection of a single component in complex tones.

    PubMed

    Klein-Hennig, Martin; Dietz, Mathias; Hohmann, Volker

    2018-03-01

    Both harmonic and binaural signal properties are relevant for auditory processing. To investigate how these cues combine in the auditory system, detection thresholds for an 800-Hz tone masked by a diotic (i.e., identical between the ears) harmonic complex tone were measured in six normal-hearing subjects. The target tone was presented either diotically or with an interaural phase difference (IPD) of 180° and in either harmonic or "mistuned" relationship to the diotic masker. Three different maskers were used, a resolved and an unresolved complex tone (fundamental frequency: 160 and 40 Hz) with four components below and above the target frequency and a broadband unresolved complex tone with 12 additional components. The target IPD provided release from masking in most masker conditions, whereas mistuning led to a significant release from masking only in the diotic conditions with the resolved and the narrowband unresolved maskers. A significant effect of mistuning was neither found in the diotic condition with the wideband unresolved masker nor in any of the dichotic conditions. An auditory model with a single analysis frequency band and different binaural processing schemes was employed to predict the data of the unresolved masker conditions. Sensitivity to modulation cues was achieved by including an auditory-motivated modulation filter in the processing pathway. The predictions of the diotic data were in line with the experimental results and literature data in the narrowband condition, but not in the broadband condition, suggesting that across-frequency processing is involved in processing modulation information. The experimental and model results in the dichotic conditions show that the binaural processor cannot exploit modulation information in binaurally unmasked conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Understanding cognitive processes behind acceptance or refusal of phase I trials.

    PubMed

    Pravettoni, Gabriella; Mazzocco, Ketti; Gorini, Alessandra; Curigliano, Giuseppe

    2016-04-01

    Participation in phase I trials gives patients the chance to gain some control over their disease by trying an experimental therapy. The patients' vulnerability, the informed consent process aimed at conveying the purpose and potential benefits of the phase I trial, and the complexity of the studies may all affect the patient's final decision. Emotionally difficult health conditions may lead patients to succumb to cognitive biases, allocating attention to only part of the provided information. Filling this gap in patients' information processing can foster strategies that help physicians tailor communication about clinical trials, providing personalized support and medical information shaped around patients' needs, thereby avoiding cognitive biases and improving the quality of informed, shared decisions. The present review analyses the cognitive and psychological factors that affect patients' decisions about whether to participate in early-phase clinical trials. Copyright © 2016. Published by Elsevier Ireland Ltd.

  15. Information Search and Decision Making: The Effects of Age and Complexity on Strategy Use

    PubMed Central

    Queen, Tara L.; Hess, Thomas M.; Ennis, Gilda E.; Dowd, Keith; Grühn, Daniel

    2012-01-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults’ performance. Participants utilized two decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants’ preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and this ability may benefit from accrued knowledge and experience. PMID:22663157

  16. Syntactic Recursion Facilitates and Working Memory Predicts Recursive Theory of Mind

    PubMed Central

    Arslan, Burcu; Hohenberger, Annette; Verbrugge, Rineke

    2017-01-01

    In this study, we focus on the possible roles of second-order syntactic recursion and working memory in terms of simple and complex span tasks in the development of second-order false belief reasoning. We tested 89 Turkish children in two age groups, one younger (4;6–6;5 years) and one older (6;7–8;10 years). Although second-order syntactic recursion is significantly correlated with the second-order false belief task, results of ordinal logistic regressions revealed that the main predictor of second-order false belief reasoning is complex working memory span. Unlike simple working memory and second-order syntactic recursion tasks, the complex working memory task required processing information serially with additional reasoning demands that require complex working memory strategies. Based on our results, we propose that children’s second-order theory of mind develops when they have efficient reasoning rules to process embedded beliefs serially, thus overcoming a possible serial processing bottleneck. PMID:28072823

  17. Costs and benefits of integrating information between the cerebral hemispheres: a computational perspective.

    PubMed

    Belger, A; Banich, M T

    1998-07-01

    Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.

  18. Watching diagnoses develop: Eye movements reveal symptom processing during diagnostic reasoning.

    PubMed

    Scholz, Agnes; Krems, Josef F; Jahn, Georg

    2017-10-01

    Finding a probable explanation for observed symptoms is a highly complex task that draws on information retrieval from memory. Recent research suggests that observed symptoms are interpreted in a way that maximizes coherence for a single likely explanation. This becomes particularly clear if symptom sequences support more than one explanation. However, there are no existing process data available that allow coherence maximization to be traced in ambiguous diagnostic situations, where critical information has to be retrieved from memory. In this experiment, we applied memory indexing, an eye-tracking method that affords rich time-course information concerning memory-based cognitive processing during higher order thinking, to reveal symptom processing and the preferred interpretation of symptom sequences. Participants first learned information about causes and symptoms presented in spatial frames. Gaze allocation to emptied spatial frames during symptom processing and during the diagnostic response reflected the subjective status of hypotheses held in memory and the preferred interpretation of ambiguous symptoms. Memory indexing traced how the diagnostic decision developed and revealed instances of hypothesis change and biases in symptom processing. Memory indexing thus provided direct online evidence for coherence maximization in processing ambiguous information.

  19. Technology and application of 3D tunnel information monitoring

    NASA Astrophysics Data System (ADS)

    Li, Changqing; Deng, Hongliang; Chen, Ge; Wang, Simiao; Guo, Yang; Wu, Shenglin

    2015-12-01

    Information-based monitoring and dynamic construction are essential in tunnel construction because of complex geological environments and the lack of basic information. Monitoring results show that 3D laser scanning technology combined with an information management system has important theoretical significance and application value for ensuring the safety of tunnel construction and for enriching construction theory and technology. Deformation and construction information near the tunnel working face and across the whole tunnel section can be known in real time, as can the deformation regularities during excavation, with early warnings and forecasts presented as graphics and data. These results help determine reasonable construction timing and provide a basis for selecting support parameters and lining.

  20. The effects of viewpoint on the virtual space of pictures

    NASA Technical Reports Server (NTRS)

    Sedgwick, H. A.

    1989-01-01

    Pictorial displays whose primary purpose is to convey accurate information about the 3-D spatial layout of an environment are discussed. How, and how well, pictures can convey such information is examined. It is suggested that picture perception is not best approached as a unitary, indivisible process. Rather, it is a complex process depending on multiple, partially redundant, interacting sources of visual information for both the real surface of the picture and the virtual space beyond. Each picture must be assessed for the particular information that it makes available. This will determine how accurately the virtual space represented by the picture is seen, as well as how it is distorted when seen from the wrong viewpoint.

  1. Bayesian networks and information theory for audio-visual perception modeling.

    PubMed

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple streams of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
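
    A sketch of the mutual-information screening that guides such model elicitation, in Python, on synthetic stand-ins for the audio-visual variables; the data and the 80% coupling are assumptions, not the study's measurements.

        import numpy as np
        from collections import Counter

        def mutual_information(x, y):
            # Plug-in estimate of I(X;Y) in bits for discrete samples.
            n = len(x)
            pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
            return sum(c / n * np.log2(c / n / (px[a] / n * py[b] / n))
                       for (a, b), c in pxy.items())

        rng = np.random.default_rng(1)
        stimulus = rng.integers(0, 4, size=5000)
        # Response follows the stimulus 80% of the time (assumed coupling).
        response = np.where(rng.random(5000) < 0.8, stimulus,
                            rng.integers(0, 4, size=5000))
        print(f"I(stimulus; response) = "
              f"{mutual_information(stimulus, response):.3f} bits")
        # High-MI variable pairs become candidate edges of the Bayesian
        # network; near-zero MI motivates conditional-independence checks.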

  2. Multi-functional bis(alkynyl)gold(iii) N⁁C complexes with distinct mechanochromic luminescence and electroluminescence properties

    PubMed Central

    Wong, Ben Yiu-Wing; Wong, Hok-Lai; Wong, Yi-Chun; Au, Vonika Ka-Man

    2017-01-01

    A new class of donor–acceptor type luminescent bis(alkynyl)gold(iii) N⁁C complexes has been synthesized and characterized. These gold(iii) complexes not only exhibit high photoluminescence quantum yields of up to 0.81, but also interesting mechanochromic luminescence behaviors that are reversible. Upon grinding, a dramatic luminescence color change from green to red can be observed in solid samples of the gold(iii) complexes, and the mechanochromic luminescence can be readily tuned via a judicious selection of substituents on the pyridine ring. In addition, solution-processable OLEDs based on this class of complexes with EQE values of up to 4.0% have been realized, representing the first demonstration of bis(alkynyl)gold(iii) N⁁C complexes as emissive materials in solution-processable OLEDs. PMID:29147519

  3. Information-theoretic metamodel of organizational evolution

    NASA Astrophysics Data System (ADS)

    Sepulveda, Alfredo

    2011-12-01

    Social organizations are abstractly modeled by holarchies (self-similar connected networks) and intelligent complex adaptive multiagent systems (large networks of autonomous reasoning agents interacting via scaled processes). However, little is known of how information shapes evolution in such organizations, a gap that can lead to misleading analytics. The research problem addressed in this study was the ineffective manner in which classical model-predict-control methods used in business analytics attempt to define organization evolution. The purpose of the study was to construct an effective metamodel for organization evolution based on a proposed complex adaptive structure, the info-holarchy. Theoretical foundations of this study were holarchies, complex adaptive systems, evolutionary theory, and quantum mechanics, among other recently developed physical and information theories. Research questions addressed how information evolution patterns gleaned from the study's inductive metamodel more aptly explained volatility in organizations. In this study, a hybrid grounded theory based on abstract inductive extensions of information theories was utilized as the research methodology. An overarching heuristic metamodel was framed from the theoretical analysis of the properties of these extension theories and applied to business, neural, and computational entities. This metamodel resulted in the synthesis of a metaphor for, and generalization of, organization evolution, serving as the recommended and appropriate analytical tool to view business dynamics for future applications. This study may manifest positive social change through a fundamental understanding of complexity in business from general information theories, resulting in more effective management.

  4. Intrinsic Information Processing and Energy Dissipation in Stochastic Input-Output Dynamical Systems

    DTIC Science & Technology

    2015-07-09

    Publications reported for this effort include: Crutchfield, "Information Anatomy of Stochastic Equilibria," Entropy (2014), doi: 10.3390/e16094713; and Virgil Griffith, Edwin Chong, Ryan James, Christopher Ellison, and James Crutchfield, "Intersection Information Based on Common Randomness," Entropy (2014), doi: 10.3390/e16041985.

  5. A Qualitative Case Study Approach To Examine Information Resources Management. (Utilisation d'une Approche Qualitative par Methode de cas pour Etudier la Gestion des Ressources D'information).

    ERIC Educational Resources Information Center

    Bergeron, Pierrette

    1997-01-01

    Illustrates how a qualitative approach was used to study the complex and poorly defined concept of information resources management. Explains the general approach to data collection, its advantages and limitations, and the process used to analyze the data. Presents results, along with lessons learned through using the method. (Author/AEF)

  6. How Do Students Regulate their Learning of Complex Systems with Hypermedia?.

    ERIC Educational Resources Information Center

    Azevedo, Roger; Seibert, Diane; Guthrie, John T.; Cromley, Jennifer G.; Wang, Huei-yu; Tron, Myriam

    This study examined the role of different goal-setting instructional interventions in facilitating students' shift to more sophisticated mental models of the circulatory system as indicated by both performance and process data. Researchers adopted the information processing model of self-regulated learning of P. Winne and colleagues (1998, 2001)…

  7. Theoretical Review of Phonics Instruction for Struggling/Beginning Readers of English

    ERIC Educational Resources Information Center

    Sitthitikul, Pragasit

    2014-01-01

    Learning to read is a complex task for beginners of English. They must coordinate many cognitive processes to read accurately and fluently, including recognizing words, constructing the meanings of sentences and text, and retaining the information read in memory. An essential part of the process for beginners involves learning the alphabetic…

  8. The Influence of Creative Process Engagement on Employee Creative Performance and Overall Job Performance: A Curvilinear Assessment

    ERIC Educational Resources Information Center

    Zhang, Xiaomeng; Bartol, Kathryn M.

    2010-01-01

    Integrating theories addressing attention and activation with creativity literature, we found an inverted U-shaped relationship between creative process engagement and overall job performance among professionals in complex jobs in an information technology firm. Work experience moderated the curvilinear relationship, with low-experience employees…

  9. A Formal Construction of Term Classes. Technical Report No. TR73-18.

    ERIC Educational Resources Information Center

    Yu, Clement T.

    The computational complexity of a formal process for the construction of term classes for information retrieval is examined. While the process is proven to be difficult computationally, heuristic methods are applied. Experimental results are obtained to illustrate the maximum possible improvement in system performance of retrieval using the formal…

  10. Learning Opportunities in PhD Supervisory Talks: A Social Constructionist Perspective

    ERIC Educational Resources Information Center

    Tian, Wenwen; Singhasiri, Wareesiri

    2016-01-01

    Although PhD supervision has been recognised as an educative process and a complex pedagogy for decades, there is little research into on-site pedagogic processes. Informed by social constructionism and a Foucauldian approach, this qualitative case study explores how learning opportunities were created by analysing both a supervisor's verbal…

  11. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking architecture. The network will provide automatic real-time transfer of information to database server computers, which perform data collection and validation. This information system will support integrated, data-sharing applications for everything from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real-time subsystems, between these subsystems and the central controller, between the central controller and system-level planning and analysis application software, and between the system-level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate, and complete data. Data describing the system configuration and the real-time processes can be collected, checked and reconciled, analyzed, and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built; retrofit is extremely difficult and costly.

  12. Information spreading in complex networks with participation of independent spreaders

    NASA Astrophysics Data System (ADS)

    Ma, Kun; Li, Weihua; Guo, Quantong; Zheng, Xiaoqi; Zheng, Zhiming; Gao, Chao; Tang, Shaoting

    2018-02-01

    Information diffusion dynamics in complex networks is often modeled as a contagion process among neighbors, analogous to epidemic diffusion. Previous literature has mainly focused on epidemic diffusion within one network, which neglects possible interactions between nodes beyond the underlying network: a disease can be transmitted to other nodes by other means, without following the links of the focal network. Here we account for this phenomenon by introducing independent spreaders into a susceptible-infectious-recovered contagion process. We derive the critical epidemic thresholds on Erdős-Rényi and scale-free networks as a function of the infection rate, the recovery rate, and the activeness of the independent spreaders. We also present simulation results on ER and SF networks, as well as on a real-world email network. The results show that the reach of a disease may be greater than link contagion alone can explain. These results also help to explain how the activeness of independent spreaders affects the diffusion process, and the approach can be used to explore many other dynamical processes.
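
    A sketch of a susceptible-infectious-recovered process with independent spreaders, in Python with networkx on an Erdős-Rényi graph. The parameter names and update scheme are illustrative stand-ins, not the paper's exact model.

        import random
        import networkx as nx

        def sir_with_independent_spreaders(G, beta, gamma, p_ind, steps=50, seed=0):
            # Besides link contagion (probability beta per infected neighbour),
            # every susceptible node can be hit by roaming independent
            # spreaders with per-step probability p_ind ("activeness").
            random.seed(seed)
            state = {n: "S" for n in G}
            state[random.choice(list(G))] = "I"
            for _ in range(steps):
                nxt = dict(state)
                for n in G:
                    if state[n] == "S":
                        via_links = any(state[m] == "I" and random.random() < beta
                                        for m in G[n])
                        if via_links or random.random() < p_ind:
                            nxt[n] = "I"
                    elif state[n] == "I" and random.random() < gamma:
                        nxt[n] = "R"
                state = nxt
            return sum(s != "S" for s in state.values())   # ever infected

        G = nx.erdos_renyi_graph(500, 0.01)
        print(sir_with_independent_spreaders(G, 0.05, 0.2, p_ind=0.0))
        print(sir_with_independent_spreaders(G, 0.05, 0.2, p_ind=0.01))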

  13. Content standards for medical image metadata

    NASA Astrophysics Data System (ADS)

    d'Ornellas, Marcos C.; da Rocha, Rafael P.

    2003-12-01

    Medical images are at the heart of healthcare diagnostic procedures. They have provided not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of treatment. For a medical center, the emphasis may shift from the generation of images to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for a health care information system is becoming a critical need. This data repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data, as well as the diversified user access requirements, the implementation of a medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. In addition, it focuses on the evaluation of image metadata content and on metadata quality management.

  14. A general framework for a collaborative water quality knowledge and information network.

    PubMed

    Dalcanale, Fernanda; Fontane, Darrell; Csapo, Jorge

    2011-03-01

    Increasing knowledge about the environment has brought about a better understanding of the complexity of the issues, and greater public availability of information has resulted in a steady shift from centralized decision making to increasing levels of participatory processes. The management of that information, in turn, is becoming more complex. One way to deal with this complexity is to develop tools that allow all players, including managers, researchers, educators, stakeholders, and civil society, to contribute to the information system at whatever level they are inclined to. In this project, a search for the available technology for collaboration, methods of community filtering, and community-based review was performed, and the possible implementation of these tools to create a general framework for a collaborative "Water Quality Knowledge and Information Network" was evaluated. The main goals of the network are to advance water quality education and knowledge; encourage distribution of and access to data; provide networking opportunities; allow public perceptions and concerns to be collected; promote exchange of ideas; and give general, open, and free access to information. A reference implementation was made available online and received positive feedback from the community, which also suggested possible improvements.

  16. Representation of People's Decisions in Health Information Systems. A Complementary Approach for Understanding Health Care Systems and Population Health.

    PubMed

    Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana

    2017-02-01

    In this study, we aimed: 1) to conceptualize the theoretical challenges that health information systems (HIS) face in representing patients' decisions about health and medical treatments in everyday life; and 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists, and health professionals working in quality management and in primary and secondary prevention of chronic diseases at the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists, and e-health stakeholders. HIS face the need, and the challenge, to represent social human processes grounded in constructivist and complexity theories, the current frameworks in the human sciences for understanding human learning and socio-cultural change. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, the analysis of social impact through community trials, and the modeling of complexity with system simulation tools. This analysis suggests the need to complement the traditional linear causal explanations of disease onset (and treatment), which form the bases of current HIS analysis models, with constructivist and complexity frameworks. Both may illuminate the complex interrelationships among patients, health services, and the health system. The aim of this strategy is to clarify people's decision-making processes in order to improve the efficiency, quality, and equity of health services and the health system.

  17. Reconceptualizing children's complex discharge with health systems theory: novel integrative review with embedded expert consultation and theory development.

    PubMed

    Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh

    2014-05-01

    To report a novel review undertaken to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual, and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Design: novel mixed-method integrative review informed by health systems theory. Data sources: CINAHL, PsycINFO, EMBASE, PubMed, citation searching, and personal contact. Review methods: informed by consultation with experts. English-language studies, opinion/discussion papers reporting research, best practice, and the experiences of children, parents, and healthcare professionals, together with purposively selected policies/guidelines from 2002 to December 2012, were abstracted using framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static, and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process, and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.

  18. Instantaneous Transfer Entropy for the Study of Cardiovascular and Cardiorespiratory Nonstationary Dynamics.

    PubMed

    Valenza, Gaetano; Faes, Luca; Citi, Luca; Orini, Michele; Barbieri, Riccardo

    2018-05-01

    Measures of transfer entropy (TE) quantify the direction and strength of coupling between two complex systems. Standard approaches assume stationarity of the observations, and therefore are unable to track time-varying changes in nonlinear information transfer with high temporal resolution. In this study, we aim to define and validate novel instantaneous measures of TE to provide an improved assessment of complex nonstationary cardiorespiratory interactions. We here propose a novel instantaneous point-process TE (ipTE) and validate its assessment as applied to cardiovascular and cardiorespiratory dynamics. In particular, heartbeat and respiratory dynamics are characterized through discrete time series, and modeled with probability density functions predicting the time of the next physiological event as a function of the past history. Likewise, nonstationary interactions between heartbeat and blood pressure dynamics are characterized as well. Furthermore, we propose a new measure of information transfer, the instantaneous point-process information transfer (ipInfTr), which is directly derived from point-process-based definitions of the Kolmogorov-Smirnov distance. Analysis on synthetic data, as well as on experimental data gathered from healthy subjects undergoing postural changes confirms that ipTE, as well as ipInfTr measures are able to dynamically track changes in physiological systems coupling. This novel approach opens new avenues in the study of hidden, transient, nonstationary physiological states involving multivariate autonomic dynamics in cardiovascular health and disease. The proposed method can also be tailored for the study of complex multisystem physiology (e.g., brain-heart or, more in general, brain-body interactions).
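
    For background, standard (stationary) transfer entropy, TE(X->Y) = sum over (y_next, y_past, x_past) of p(y_next, y_past, x_past) * log[ p(y_next | y_past, x_past) / p(y_next | y_past) ], can be estimated with a simple plug-in sketch as below; the paper's instantaneous point-process estimators (ipTE, ipInfTr) are more elaborate and are not reproduced here.

        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y, bins=4):
            # TE x->y in bits: how much the past of x improves prediction
            # of y beyond y's own past (discretised, history length 1).
            xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
            yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
            n = len(xd) - 1
            c3 = Counter(zip(yd[1:], yd[:-1], xd[:-1]))
            cyy = Counter(zip(yd[1:], yd[:-1]))
            cyx = Counter(zip(yd[:-1], xd[:-1]))
            cy = Counter(yd[:-1])
            return sum(c / n * np.log2(c * cy[y0] / (cyy[y1, y0] * cyx[y0, x0]))
                       for (y1, y0, x0), c in c3.items())

        rng = np.random.default_rng(2)
        x = rng.normal(size=2000)
        y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)   # y driven by x's past
        print(f"TE x->y: {transfer_entropy(x, y):.3f} bits")
        print(f"TE y->x: {transfer_entropy(y, x):.3f} bits")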

  19. A cognitive information processing framework for distributed sensor networks

    NASA Astrophysics Data System (ADS)

    Wang, Feiyi; Qi, Hairong

    2004-09-01

    In this paper, we present a cognitive agent framework (CAF) based on swarm intelligence and self-organization principles, and demonstrate it through collaborative processing for target classification in sensor networks. The framework involves integrated designs that provide both cognitive behavior at the organization level, to conquer complexity, and reactive behavior at the individual agent level, to retain simplicity. The design tackles various problems in current information-processing systems, including overly complex systems, maintenance difficulties, increasing vulnerability to attack, lack of capability to tolerate faults, and inability to identify and cope with low-frequency patterns. An important point distinguishing the presented work from classical AI research is that the acquired intelligence pertains not to distinct individuals but to groups. It also deviates from multi-agent systems (MAS) due to the sheer quantity of extremely simple agents we are able to accommodate, to the degree that some loss of coordination messages and the behavior of faulty/compromised agents will not affect the collective decision made by the group.

  20. Decision making in a human population living sustainably.

    PubMed

    Hicks, John S; Burgman, Mark A; Marewski, Julian N; Fidler, Fiona; Gigerenzer, Gerd

    2012-10-01

    The Tiwi people of northern Australia have managed natural resources continuously for 6000-8000 years. Tiwi management objectives and outcomes may reflect how they gather information about the environment. We qualitatively analyzed Tiwi documents and management techniques to examine the relation between the social and physical environment of decision makers and their decision-making strategies. We hypothesized that principles of bounded rationality, namely, the use of efficient rules to navigate complex decision problems, explain how Tiwi managers use simple decision strategies (i.e., heuristics) to make robust decisions. Tiwi natural resource managers reduced complexity in decision making through a process that gathers incomplete and uncertain information to quickly guide decisions toward effective outcomes. They used management feedback to validate decisions through an information loop that resulted in long-term sustainability of environmental use. We examined the Tiwi decision-making processes relative to management of barramundi (Lates calcarifer) fisheries and contrasted their management with the state government's management of barramundi. Decisions that enhanced the status of individual people and their attainment of aspiration levels resulted in reliable resource availability for Tiwi consumers. Different decision processes adopted by the state for management of barramundi may not secure similarly sustainable outcomes. ©2012 Society for Conservation Biology.

  1. Neuron-Like Networks Between Ribosomal Proteins Within the Ribosome

    NASA Astrophysics Data System (ADS)

    Poirot, Olivier; Timsit, Youri

    2016-05-01

    From the brain to the World Wide Web, information-processing networks share common scale-invariant properties. Here, we reveal the existence of neural-like networks at a molecular scale within the ribosome. We show that, with their extensions, ribosomal proteins form complex assortative interaction networks in which they communicate through tiny interfaces. The analysis of the crystal structures of 50S eubacterial particles reveals that most of these interfaces involve key phylogenetically conserved residues. The systematic observation of interactions between basic and aromatic amino acids at the interfaces and along the extensions provides new structural insights that may contribute to deciphering the molecular mechanisms of signal transmission within or between the ribosomal proteins. Similar to neurons interacting through "molecular synapses", ribosomal proteins form a network that suggests an analogy with a simple molecular brain, in which the "sensory proteins" innervate the functional ribosomal sites while the "inter-proteins" interconnect them into circuits suitable for processing the information flow that circulates during protein synthesis. It is likely that these circuits have evolved to coordinate both the complex macromolecular motions and the binding of multiple factors during translation. This opens new perspectives on nanoscale information transfer and processing.

  2. Relations between Short-term Memory Deficits, Semantic Processing, and Executive Function

    PubMed Central

    Allen, Corinne M.; Martin, Randi C.; Martin, Nadine

    2012-01-01

    Background Previous research has suggested separable short-term memory (STM) buffers for the maintenance of phonological and lexical-semantic information, as some patients with aphasia show better ability to retain semantic than phonological information and others show the reverse. Recently, researchers have proposed that deficits to the maintenance of semantic information in STM are related to executive control abilities. Aims The present study investigated the relationship of executive function abilities with semantic and phonological short-term memory (STM) and semantic processing in such patients, as some previous research has suggested that semantic STM deficits and semantic processing abilities are critically related to specific or general executive function deficits. Method and Procedures 20 patients with aphasia and STM deficits were tested on measures of short-term retention, semantic processing, and both complex and simple executive function tasks. Outcome and Results In correlational analyses, we found no relation between semantic STM and performance on simple or complex executive function tasks. In contrast, phonological STM was related to executive function performance in tasks that had a verbal component, suggesting that performance in some executive function tasks depends on maintaining or rehearsing phonological codes. Although semantic STM was not related to executive function ability, performance on semantic processing tasks was related to executive function, perhaps due to similar executive task requirements in both semantic processing and executive function tasks. Conclusions Implications for treatment and interpretations of executive deficits are discussed. PMID:22736889

  3. Application of simplified Complexity Theory concepts for healthcare social systems to explain the implementation of evidence into practice.

    PubMed

    Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane

    2016-02-01

    To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes that operate across multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework for the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data was informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to explain themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. The sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. The five core Complexity Theory concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests that routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. Reducing fasting times requires an incentivised new approach to emerge within the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.

  4. Temporally selective attention modulates early perceptual processing: event-related potential evidence.

    PubMed

    Sanders, Lisa D; Astheimer, Lori B

    2008-05-01

    Some of the most important information we encounter changes so rapidly that our perceptual systems cannot process all of it in detail. Spatially selective attention is critical for perception when more information than can be processed in detail is presented simultaneously at distinct locations. When presented with complex, rapidly changing information, listeners may need to selectively attend to specific times rather than to locations. We present evidence that listeners can direct selective attention to time points that differ by as little as 500 msec, and that doing so improves target detection, affects baseline neural activity preceding stimulus presentation, and modulates auditory evoked potentials at a perceptually early stage. These data demonstrate that attentional modulation of early perceptual processing is temporally precise and that listeners can flexibly allocate temporally selective attention over short intervals, making it a viable mechanism for preferentially processing the most relevant segments in rapidly changing streams.

  5. How neuroscience can inform the study of individual differences in cognitive abilities

    PubMed Central

    McFarland, Dennis J.

    2018-01-01

    Theories of human mental abilities should be consistent with what is known in neuroscience. Currently, tests of human mental abilities are modeled by cognitive constructs such as attention, working memory, and speed of information processing. These constructs are in turn related to a single general ability. However, brains are very complex systems, and whether most of the variability between the operations of different brains can be ascribed to a single factor is questionable. Research in neuroscience suggests that psychological processes such as perception, attention, decision and executive control are emergent properties of interacting distributed networks. The modules that make up these networks use similar computational processes that involve multiple forms of neural plasticity, each having different time constants. Accordingly, these networks might best be characterized in terms of the information they process rather than in terms of abstract psychological processes such as working memory and executive control. PMID:28195556

  6. Multiscale Granger causality

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Nollo, Giandomenico; Stramaglia, Sebastiano; Marinazzo, Daniele

    2017-10-01

    In the study of complex physical and biological systems represented by multivariate stochastic processes, an issue of great relevance is the description of the system dynamics spanning multiple temporal scales. While methods to assess the dynamic complexity of individual processes at different time scales are well established, multiscale analysis of directed interactions has never been formalized theoretically, and empirical evaluations are complicated by practical issues such as filtering and downsampling. Here we extend the very popular measure of Granger causality (GC), a prominent tool for assessing directed lagged interactions between joint processes, to quantify information transfer across multiple time scales. We show that the multiscale processing of a vector autoregressive (AR) process introduces a moving average (MA) component, and describe how to represent the resulting ARMA process using state space (SS) models and to combine the SS model parameters for computing exact GC values at arbitrarily large time scales. We exploit the theoretical formulation to identify peculiar features of multiscale GC in basic AR processes, and demonstrate with numerical simulations the much larger estimation accuracy of the SS approach compared to pure AR modeling of filtered and downsampled data. The improved computational reliability is exploited to disclose meaningful multiscale patterns of information transfer between global temperature and carbon dioxide concentration time series, both in paleoclimate and in recent years.
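
    For orientation, here is a minimal baseline sketch: coarse-grain two series by naive averaging, then estimate bivariate GC with ordinary least-squares AR fits. This is precisely the kind of filtered-and-downsampled estimate the paper improves on with its state space formulation, so treat it as a reference point, not the authors' estimator; all function and parameter names are mine.

    ```python
    import numpy as np

    def coarse_grain(x, tau):
        # Average non-overlapping windows of length tau: the naive rescaling whose
        # moving-average side effects the state space approach handles exactly.
        n = len(x) // tau
        return x[:n * tau].reshape(n, tau).mean(axis=1)

    def lagged(series_list, p):
        # Design matrix holding p lags of every series in series_list.
        n = len(series_list[0])
        cols = [s[p - k - 1: n - k - 1] for s in series_list for k in range(p)]
        return np.column_stack(cols)

    def gc(x, y, p=2):
        # Granger causality y -> x: log variance ratio of restricted vs. full AR fits.
        target = x[p:]
        rss = []
        for preds in ([x], [x, y]):
            X = np.column_stack([np.ones(len(target)), lagged(preds, p)])
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            resid = target - X @ beta
            rss.append(resid @ resid)
        return np.log(rss[0] / rss[1])

    rng = np.random.default_rng(1)
    n = 4000
    x, y = np.zeros(n), np.zeros(n)
    for t in range(1, n):
        y[t] = 0.6 * y[t - 1] + rng.standard_normal()                    # y is autonomous
        x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.standard_normal()   # y drives x

    for tau in (1, 2, 4):
        xs, ys = coarse_grain(x, tau), coarse_grain(y, tau)
        print(tau, round(gc(xs, ys), 3), round(gc(ys, xs), 3))  # y->x >> x->y
    ```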

  7. Learning Cell Biology as a Team: A Project-Based Approach to Upper-Division Cell Biology

    ERIC Educational Resources Information Center

    Wright, Robin; Boggs, James

    2002-01-01

    To help students develop successful strategies for learning how to learn and communicate complex information in cell biology, we developed a quarter-long cell biology class based on team projects. Each team researches a particular human disease and presents information about the cellular structure or process affected by the disease, the cellular…

  8. A Navigation Pattern Analysis of University Department's Websites Using a Processing Mining Approach

    ERIC Educational Resources Information Center

    Han, Kwan Hee; Hwang, Boram; Jeon, Jeonghwan

    2015-01-01

    The university's website is a useful tool in disseminating information to current and future college students and is supportive of the university's administrative activities. However, as the university's website has come to include more and more information and its design has gradually become more complex, it has become hard to find desired…

  9. The Use of Information and Communication Technology (ICT) as a Teaching Method in Vocational Education and Training in Tourism

    ERIC Educational Resources Information Center

    Mocanu, Elena Madalina; Deaconu, Alecxandrina

    2017-01-01

    Globalization and technological change that have characterized recent years have created a new global economy powered by technology, fueled by information and knowledge, with serious implications for the nature and purpose of education institutions. Effective integration of ICT into the education system is a complex, multilateral process that…

  10. Quantum Computing

    DTIC Science & Technology

    1998-04-01

    information representation and processing technology, although faster than the wheels and gears of the Charles Babbage computation machine, is still in...the same computational complexity class as the Babbage machine, with bits of information represented by entities which obey classical (non-quantum...nuclear double resonances Charles M Bowden and Jonathan P. Dowling Weapons Sciences Directorate, AMSMI-RD-WS-ST Missile Research, Development, and

  11. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizzo, Davinia B.; Blackburn, Mark R.

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision, supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during process execution and of the resulting model provided evidence that the process may be an effective tool in harnessing expert knowledge for a BN model.
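
    To make the modeling style concrete, here is a minimal sketch of an expert-elicited BN using the pgmpy library (class names as in recent pgmpy releases); the two parent factors, all probabilities, and variable names are hypothetical stand-ins, not the paper's elicited structure.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Hypothetical two-factor structure: expert-elicited causes of 6DOF suitability.
    model = BayesianNetwork([("FixtureQuality", "Suitability"),
                             ("CrossCoupling", "Suitability")])

    cpd_f = TabularCPD("FixtureQuality", 2, [[0.7], [0.3]])   # expert prior: good/poor
    cpd_c = TabularCPD("CrossCoupling", 2, [[0.6], [0.4]])    # expert prior: low/high
    cpd_s = TabularCPD("Suitability", 2,
                       # P(suitable | fixture, coupling), as elicited from experts
                       [[0.95, 0.6, 0.5, 0.1],    # suitable
                        [0.05, 0.4, 0.5, 0.9]],   # unsuitable
                       evidence=["FixtureQuality", "CrossCoupling"],
                       evidence_card=[2, 2])

    model.add_cpds(cpd_f, cpd_c, cpd_s)
    assert model.check_model()

    # "What-if": risk of relying on 6DOF testing given a poor fixture.
    print(VariableElimination(model).query(["Suitability"],
                                           evidence={"FixtureQuality": 1}))
    ```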

  12. Hazardous Materials Verification and Limited Characterization Report on Sodium and Caustic Residuals in Materials and Fuel Complex Facilities MFC-799/799A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary Mecham

    2010-08-01

    This report is a companion to the Facilities Condition and Hazard Assessment for Materials and Fuel Complex Sodium Processing Facilities MFC-799/799A and Nuclear Calibration Laboratory MFC-770C (referred to as the Facilities Condition and Hazards Assessment). This report specifically responds to the requirement of Section 9.2, Item 6, of the Facilities Condition and Hazards Assessment to provide an updated assessment and verification of the residual hazardous materials remaining in the Sodium Processing Facilities processing system. The hazardous materials of concern are sodium and sodium hydroxide (caustic). The information supplied in this report supports the end-point objectives identified in the Transition Plan for Multiple Facilities at the Materials and Fuels Complex, Advanced Test Reactor, Central Facilities Area, and Power Burst Facility, as well as the deactivation and decommissioning critical decision milestone 1, as specified in U.S. Department of Energy Guide 413.3-8, “Environmental Management Cleanup Projects.” Using a tailored approach and based on information obtained through a combination of process knowledge, emergency management hazardous assessment documentation, and visual inspection, this report provides sufficient detail regarding the quantity of hazardous materials for the purposes of facility transfer; it also establishes that further characterization/verification of these materials is unnecessary.

  13. Harnessing expert knowledge: Defining a Bayesian network decision model with limited data-Model structure for the vibration qualification problem

    DOE PAGES

    Rizzo, Davinia B.; Blackburn, Mark R.

    2018-03-30

    As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision, supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine “what-if” scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during process execution and of the resulting model provided evidence that the process may be an effective tool in harnessing expert knowledge for a BN model.

  14. Barriers to data quality resulting from the process of coding health information to administrative data: a qualitative study.

    PubMed

    Lucyk, Kelsey; Tang, Karen; Quan, Hude

    2017-11-22

    Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility of longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded to this end. Our objective is to explore the potential barriers to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high quality coding, in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.

  15. [Digitalization of radiological imaging information and consequences for patient care in the hospital].

    PubMed

    den Heeten, G J; Barneveld Binkhuysen, F H

    2001-08-25

    Determining the rate at which radiology must be digitalised has been a controversial issue for many years. Much radiological information is still obtained from the film-screen combination (X-rays), with all of its known inherent restrictions. The importance of imaging information in the healthcare process continues to increase for both radiologists and referring physicians, and the ongoing developments in information technology mean that it is possible to integrate imaging information and electronic patient files. The healthcare process can only become more effective and efficient when the appropriate information is in the right place at the right time, something that conventional methods, using photos that need to be physically moved, can scarcely satisfy. There is also a desire for integration with information obtained from nuclear medicine, pathology and endoscopy, and eventually of all stand-alone data systems relevant to individually oriented hospital healthcare. The transition from a conventional to a digital process is complex; it is accompanied by the transition from a data-oriented to a process-oriented system. Many years have already been invested in the integration of information systems and the development of digital systems within radiology, the current performance of which is such that many hospitals are considering the digitalisation process or are already implementing parts of it.

  16. Maximizing Modern Distribution of Complex Anatomical Spatial Information: 3D Reconstruction and Rapid Prototype Production of Anatomical Corrosion Casts of Human Specimens

    ERIC Educational Resources Information Center

    Li, Jianyi; Nie, Lanying; Li, Zeyu; Lin, Lijun; Tang, Lei; Ouyang, Jun

    2012-01-01

    Anatomical corrosion casts of human specimens are useful teaching aids. However, their use is limited due to ethical dilemmas associated with their production, their lack of perfect reproducibility, and their consumption of original specimens in the process of casting. In this study, new approaches with modern distribution of complex anatomical…

  17. U.S. Army Research Institute Program in Basic Research - FY 2007

    DTIC Science & Technology

    2008-05-01

    learner characteristics (e.g., cognitive ability or learning style), depth and complexity of content, or instructional design characteristics. There...trainers to think about ways of making learning purposeful. The effects of cognitive load on learning were minimally explored in the current research...Achievement in Complex Learning Environments as a Function of Information Processing Ability, Knowledge, and Self-Control Joseph F. Fagan

  18. An image overall complexity evaluation method based on LSD line detection

    NASA Astrophysics Data System (ADS)

    Li, Jianan; Duan, Jin; Yang, Xu; Xiao, Bo

    2017-04-01

    In the man-made world, both urban traffic roads and engineered buildings contain many linear features. Research on image complexity based on linear information has therefore become an important direction in the digital image processing field. This paper detects straight-line information in images and uses line-based indices to establish a quantitative, accurate mathematical relationship with image complexity. We use the LSD line detection algorithm, which detects straight lines reliably, and classify the detected lines according to an expert consultation strategy. A neural network is then trained to obtain the weight coefficients of the indices, and image complexity is calculated with the resulting complexity model. Experimental results show that the proposed method is effective: the number of straight lines in the image, together with their dispersion and uniformity, affects the complexity of the image.
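
    A rough sketch of such a line-based complexity index follows. It uses OpenCV's HoughLinesP as a stand-in detector (LSD, via cv2.createLineSegmentDetector, is present only in some OpenCV builds), and the fixed index weights replace the paper's neural-network-trained coefficients; the file name is illustrative.

    ```python
    import cv2
    import numpy as np

    def line_complexity(path):
        """Toy line-based complexity index: line count, midpoint dispersion,
        and orientation uniformity (entropy)."""
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        edges = cv2.Canny(gray, 50, 150)
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                                minLineLength=20, maxLineGap=5)
        if lines is None:
            return 0.0
        seg = lines[:, 0, :].astype(float)                    # (N, 4): x1, y1, x2, y2
        mids = np.column_stack([(seg[:, 0] + seg[:, 2]) / 2,
                                (seg[:, 1] + seg[:, 3]) / 2])
        angles = np.arctan2(seg[:, 3] - seg[:, 1], seg[:, 2] - seg[:, 0])
        counts, _ = np.histogram(angles, bins=12)
        p = counts[counts > 0] / counts.sum()
        n_lines = len(seg)                                    # number of lines
        dispersion = mids.std(axis=0).mean()                  # spread of midpoints
        uniformity = -np.sum(p * np.log(p))                   # orientation entropy
        # Fixed weights below are hypothetical stand-ins for trained coefficients.
        return 0.4 * n_lines / 100 + 0.3 * dispersion / 100 + 0.3 * uniformity

    print(line_complexity("scene.png"))   # file name is illustrative
    ```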

  19. An Advanced User Interface Approach for Complex Parameter Study Process Specification in the Information Power Grid

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob; Yan, Jerry C. (Technical Monitor)

    2000-01-01

    The creation of parameter study suites has recently become a more challenging problem as the parameter studies have now become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are now seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers great resource opportunity but at the expense of great difficulty of use. We present an approach to this problem which stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.
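
    The multi-tiered parameter-study structure the record describes can be sketched as nested cross products; the two stages and parameter names below are invented for illustration.

    ```python
    import itertools

    # A multi-tiered parameter study as nested sweeps: the cross product of the
    # first (aerodynamic) stage feeds a second (mesh) stage.
    stage1 = {"mach": [0.6, 0.8, 0.95], "alpha_deg": [0, 2, 4]}
    stage2 = {"mesh": ["coarse", "fine"]}

    runs = [dict(zip(stage1, v)) for v in itertools.product(*stage1.values())]
    jobs = [{**r, **dict(zip(stage2, w))}
            for r in runs for w in itertools.product(*stage2.values())]

    for job in jobs[:3]:
        print(job)            # each dict would become one grid job specification
    print(len(jobs), "jobs")  # 3 * 3 * 2 = 18
    ```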

  20. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
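
    For reference, a minimal sketch of classic MSE, whose averaging step is exactly the piecewise-constant filtering the paper reinterprets and generalizes; the sample-entropy routine is a plain O(N^2) version written for clarity, not efficiency.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.2):
        """Sample entropy: -log ratio of (m+1)- to m-length template matches."""
        x = np.asarray(x, float)
        r = r_frac * x.std()
        def matches(mm):
            t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)  # Chebyshev distance
            return (np.sum(d <= r) - len(t)) / 2                 # exclude self-matches
        return -np.log(matches(m + 1) / matches(m))

    def mse(x, max_scale=5, m=2):
        """Classic MSE: coarse-grain by averaging (a piecewise-constant filter),
        then compute sample entropy at each scale."""
        out = []
        for tau in range(1, max_scale + 1):
            n = len(x) // tau
            cg = np.asarray(x, float)[:n * tau].reshape(n, tau).mean(axis=1)
            out.append(sample_entropy(cg, m))
        return out

    rng = np.random.default_rng(0)
    print(mse(rng.normal(size=1000)))  # white noise: entropy falls with scale
    ```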

  1. ICT and mobile health to improve clinical process delivery. a research project for therapy management process innovation.

    PubMed

    Locatelli, Paolo; Montefusco, Vittorio; Sini, Elena; Restifo, Nicola; Facchini, Roberta; Torresani, Michele

    2013-01-01

    The volume and complexity of clinical and administrative information make Information and Communication Technologies (ICTs) essential for running and innovating healthcare. This paper describes a project aimed at designing, developing and implementing a set of organizational models, acknowledged procedures and ICT tools (Mobile & Wireless solutions and Automatic Identification and Data Capture technologies) to improve the support, safety, reliability and traceability of a specific therapy management process (stem cells). The value of the project lies in designing a solution based on mobile and identification technologies in close collaboration with physicians and the other actors involved in the process, to ensure usability and effectiveness in process management.

  2. DAMT - DISTRIBUTED APPLICATION MONITOR TOOL (HP9000 VERSION)

    NASA Technical Reports Server (NTRS)

    Keith, B.

    1994-01-01

    Typical network monitors measure status of host computers and data traffic among hosts. A monitor to collect statistics about individual processes must be unobtrusive and possess the ability to locate and monitor processes, locate and monitor circuits between processes, and report traffic back to the user through a single application program interface (API). DAMT, Distributed Application Monitor Tool, is a distributed application program that will collect network statistics and make them available to the user. This distributed application has one component (i.e., process) on each host the user wishes to monitor as well as a set of components at a centralized location. DAMT provides the first known implementation of a network monitor at the application layer of abstraction. Potential users only need to know the process names of the distributed application they wish to monitor. The tool locates the processes and the circuit between them, and reports any traffic between them at a user-defined rate. The tool operates without the cooperation of the processes it monitors. Application processes require no changes to be monitored by this tool. Neither does DAMT require the UNIX kernel to be recompiled. The tool obtains process and circuit information by accessing the operating system's existing process database. This database contains all information available about currently executing processes. Expanding the information monitored by the tool can be done by utilizing more information from the process database. Traffic on a circuit between processes is monitored by a low-level LAN analyzer that has access to the raw network data. The tool also provides features such as dynamic event reporting and virtual path routing. A reusable object approach was used in the design of DAMT. The tool has four main components: the Virtual Path Switcher, the Central Monitor Complex, the Remote Monitor, and the LAN Analyzer. All of DAMT's components are independent, asynchronously executing processes. The independent processes communicate with each other via UNIX sockets through a Virtual Path router, or Switcher. The Switcher maintains a routing table showing the host of each component process of the tool, eliminating the need for each process to do so. The Central Monitor Complex provides the single application program interface (API) to the user and coordinates the activities of DAMT. The Central Monitor Complex is itself divided into independent objects that perform its functions. The component objects are the Central Monitor, the Process Locator, the Circuit Locator, and the Traffic Reporter. Each of these objects is an independent, asynchronously executing process. User requests to the tool are interpreted by the Central Monitor. The Process Locator identifies whether a named process is running on a monitored host and which host that is. The circuit between any two processes in the distributed application is identified using the Circuit Locator. The Traffic Reporter handles communication with the LAN Analyzer and accumulates traffic updates until it must send a traffic report to the user. The Remote Monitor process is replicated on each monitored host. It serves the Central Monitor Complex processes with application process information. The Remote Monitor process provides access to operating system information about currently executing processes. It allows the Process Locator to find processes and the Circuit Locator to identify circuits between processes. It also provides lifetime information about currently monitored processes. The LAN Analyzer consists of two processes. Low-level monitoring is handled by the Sniffer. The Sniffer analyzes the raw data on a single, physical LAN. It responds to commands from the Analyzer process, which maintains the interface to the Traffic Reporter and keeps track of which circuits to monitor. DAMT is written in C for HP-9000 series computers running HP-UX and Sun 3 and 4 series computers running SunOS. DAMT requires 1Mb of disk space and 4Mb of RAM for execution. This package requires MIT's X Window System, Version 11 Revision 4, with OSF/Motif 1.1. The HP-9000 version (GSC-13589) includes sample HP-9000/375 and HP-9000/730 executables which were compiled under HP-UX, and the Sun version (GSC-13559) includes sample Sun3 and Sun4 executables compiled under SunOS. The standard distribution medium for the HP version of DAMT is a .25 inch HP pre-formatted streaming magnetic tape cartridge in UNIX tar format. It is also available on a 4mm magnetic tape in UNIX tar format. The standard distribution medium for the Sun version of DAMT is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. DAMT was developed in 1992.
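
    The Process Locator / Circuit Locator roles map naturally onto modern process-table queries. The sketch below is a loose analogue using the psutil library, not DAMT's C implementation; the process name is illustrative.

    ```python
    import psutil

    def locate_process(name):
        """Analogue of DAMT's Process Locator: scan the OS process table for a
        named process (psutil stands in for the kernel's process database)."""
        return [p for p in psutil.process_iter(attrs=["pid", "name"])
                if p.info["name"] == name]

    def circuits(proc):
        """Analogue of the Circuit Locator: list the process's open inet sockets,
        from which a circuit between two monitored processes could be matched."""
        try:
            return proc.connections(kind="inet")
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            return []

    for p in locate_process("sshd"):        # process name is illustrative
        for c in circuits(p):
            print(p.pid, c.laddr, c.raddr, c.status)
    ```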

  3. Social complexity beliefs predict posttraumatic growth in survivors of a natural disaster.

    PubMed

    Nalipay, Ma Jenina N; Bernardo, Allan B I; Mordeno, Imelu G

    2016-09-01

    Most studies on posttraumatic growth (PTG) have focused on personal characteristics, interpersonal resources, and the immediate environment. Less attention has been paid to the dynamic internal processes related to the development of PTG and to how these processes are affected by the broader culture. Calhoun and Tedeschi's (2006) model suggests a role of distal culture in PTG development, but empirical investigations of that point are limited. The present study investigated the role of social complexity, the generalized belief that social environments change and that human behavior is inconsistent, as a predictor of PTG. Social complexity was hypothesized to be associated with problem-solving approaches that are likely to give rise to cognitive processes that promote PTG. A sample of 446 survivors of Typhoon Haiyan, one of the strongest typhoons ever recorded at the time, answered self-report measures of social complexity, cognitive processing of trauma, and PTG. Structural equation modeling indicated a good fit between the data and the hypothesized model; belief in social complexity predicted stronger PTG, mediated by cognitive processing. The results provide evidence for how disaster survivors' beliefs about the changing nature of social environments, and their corresponding behavior changes, predict PTG, and suggest a psychological mechanism for how distal culture can influence PTG. Thus, assessing social complexity beliefs during the early phases of a postdisaster psychosocial intervention may provide useful information on who is likely to experience PTG. Trauma workers might consider culture-specific social themes related to social complexity in disaster-affected communities. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. Ethical challenges in developing an educational video to empower potential participants during consent processes in HIV cure research in South Africa.

    PubMed

    Staunton, Ciara; de Roubaix, Malcolm; Baatjies, Dianno; Black, Gill; Hendricks, Melany; Rossouw, Theresa; Moodley, Keymanthri

    2018-04-01

    Obtaining consent for HIV research is complex, particularly in low- and middle-income countries. Low levels of education, complexity of science and research processes, confusion about basic elements of research, and socio-economic conditions that make access to medical care difficult have collectively led to concerns about the adequacy of the consent process. Given the exponential growth of HIV prevention and treatment research in South Africa, HIV researchers are increasingly facing challenges obtaining authentic informed consent from potential participants. It is anticipated that HIV cure research, despite being in its infancy in South Africa, will introduce a new discourse into a population that is often struggling to understand the differences between 'cure', 'preventive and therapeutic vaccines' and other elements of the research process. Coupled with this, South Africa has a complex history of 'illegitimate' or 'false cures' for HIV. It is therefore logical to anticipate that HIV cure research may face significant challenges during consent processes. HIV prevention research in South Africa has demonstrated the importance of early community engagement in educating potential research participants and promoting community acceptance of research. Consequently, in an attempt to extrapolate from this experience of engaging with communities early regarding cure research, a 15-minute educational video entitled 'I have a dream: a world without HIV' was developed to educate and ultimately empower potential research participants to make informed choices during consent processes in future HIV cure clinical trials. To aid others in the development of educational interventions, this paper discusses the challenges faced in developing this educational video.

  5. Integrating transition theory and bioecological theory: a theoretical perspective for nurses supporting the transition to adulthood for young people with medical complexity.

    PubMed

    Joly, Elizabeth

    2016-06-01

    To present a discussion of a theoretical perspective developed by integrating Meleis' Transition Theory and Bronfenbrenner's Bioecological Theory of Human Development to inform nursing and advanced nursing practice supporting the transition to adulthood for young people with medical complexity. Theoretical perspectives to inform nursing practice in supporting successful transition are limited, yet nurses frequently encounter young people with medical complexity during the transition to adulthood. Discussion paper. Data sources comprised a literature search of CINAHL and Medline conducted in 2014, covering articles from 2003-2014, together with informal discussions with families and the author's experiences in a transition program. The integrated theoretical perspective described in this paper can inform nurses and advanced practice nurses on contextual influences, program and intervention development across spheres of influence, and outcomes for the transition to adulthood for young people with medical complexity. Young people and their families require effective reciprocal interactions with individuals and services across sectors to successfully transition to adulthood and become situated in the adult world. Intervention must also extend beyond the young person to include providers, services and health and social policy. Nurses can take a leadership role in supporting the transition to adulthood for young people with medical complexity through direct care, case management, education and research. It is integral that nurses holistically consider developmental processes, complexity and contextual conditions that promote positive outcomes during and beyond the transition to adulthood. © 2016 John Wiley & Sons Ltd.

  6. Complexity Variability Assessment of Nonlinear Time-Varying Cardiovascular Control

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Garcia, Ronald G.; Taylor, Jessica Noggle; Toschi, Nicola; Barbieri, Riccardo

    2017-02-01

    The application of complex systems theory to physiology and medicine has provided meaningful information about the nonlinear aspects underlying the dynamics of a wide range of biological processes and their disease-related aberrations. However, no studies have investigated whether meaningful information can be extracted by quantifying second-order moments of time-varying cardiovascular complexity. To this extent, we introduce a novel mathematical framework termed complexity variability, in which the variance of instantaneous Lyapunov spectra estimated over time serves as a reference quantifier. We apply the proposed methodology to four exemplary studies involving disorders drawn from cardiology, neurology and psychiatry: Congestive Heart Failure (CHF), Major Depression Disorder (MDD), Parkinson’s Disease (PD), and Post-Traumatic Stress Disorder (PTSD) patients with insomnia under a yoga training regime. We show that complexity assessments derived from simple time-averaging are not able to discern pathology-related changes in autonomic control, and we demonstrate that between-group differences in measures of complexity variability are consistent across pathologies. Pathological states such as CHF, MDD, and PD are associated with increased complexity variability when compared to healthy controls, whereas wellbeing derived from yoga in PTSD is associated with lower time-variance of complexity.

  7. Sample preparation for SFM imaging of DNA, proteins, and DNA-protein complexes.

    PubMed

    Ristic, Dejan; Sanchez, Humberto; Wyman, Claire

    2011-01-01

    Direct imaging is invaluable for understanding the mechanism of complex genome transactions where proteins work together to organize, transcribe, replicate, and repair DNA. Scanning (or atomic) force microscopy is an ideal tool for this, providing 3D information on molecular structure at nanometer resolution from defined components. This is a convenient and practical addition to in vitro studies as readily obtainable amounts of purified proteins and DNA are required. The images reveal structural details on the size and location of DNA-bound proteins as well as protein-induced arrangement of the DNA, which are directly correlated in the same complexes. In addition, even from static images, the different forms observed and their relative distributions can be used to deduce the variety and stability of different complexes that are necessarily involved in dynamic processes. Recently available instruments that combine fluorescence with topographic imaging allow the identification of specific molecular components in complex assemblies, which broadens the applications and increases the information obtained from direct imaging of molecular complexes. We describe here basic methods for preparing samples of proteins, DNA, and complexes of the two for topographic imaging and quantitative analysis. We also describe special considerations for combined fluorescence and topographic imaging of molecular complexes.

  8. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous works seem to indicate, the Bandt and Pompe methodology for estimating entropy does not distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
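
    A minimal sketch of the Bandt and Pompe estimator the record builds on: ordinal patterns discard amplitude information, which is exactly why series with different amplitude distributions can share the same permutation entropy.

    ```python
    import numpy as np
    from math import factorial, log

    def permutation_entropy(x, order=3, normalize=True):
        """Bandt-Pompe entropy over ordinal patterns of length `order`."""
        x = np.asarray(x)
        counts = {}
        for i in range(len(x) - order + 1):
            pattern = tuple(np.argsort(x[i:i + order]))   # ordinal pattern only:
            counts[pattern] = counts.get(pattern, 0) + 1  # amplitudes are discarded
        p = np.array(list(counts.values()), float)
        p /= p.sum()
        h = -np.sum(p * np.log(p))
        return h / log(factorial(order)) if normalize else h

    rng = np.random.default_rng(2)
    print(permutation_entropy(rng.normal(size=5000)))             # ~1: white noise
    print(permutation_entropy(rng.uniform(size=5000)))            # also ~1: same patterns
    print(permutation_entropy(np.sin(np.linspace(0, 40, 5000))))  # low: deterministic
    ```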

  9. Spatio-temporal dynamics in the origin of genetic information

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Jeong, Hawoong

    2005-04-01

    We study evolutionary processes induced by spatio-temporal dynamics in prebiotic evolution. Using numerical simulations, we demonstrate that hypercycles emerge from complex interaction structures in multispecies systems. We also find that a ‘hypercycle hybrid’ protects the hypercycle from its environment during the growth process. There is little selective advantage for one hypercycle to maintain coexistence with others, which raises the possibility of outcompetition between hypercycles, with a negative effect on information diversity. To enrich the information in hypercycles, symbiosis with parasites is suggested. It is shown that symbiosis with parasites can play an important role in prebiotic immunology.

  10. Selective perception of novel science: how definitions affect information processing about nanotechnology

    NASA Astrophysics Data System (ADS)

    Kim, Jiyoun; Akin, Heather; Brossard, Dominique; Xenos, Michael; Scheufele, Dietram A.

    2017-05-01

    This study examines how familiarity with an issue—nanotechnology—moderates the effect of exposure to science information on how people process mediated messages about a complex issue. In an online experiment, we provide a nationally representative sample with three definitions of nanotechnology (technical, technical applications, and technical risk/benefit definitions). We then ask them to read an article about the topic. We find significant interactions between perceived nano-familiarity and the definition received in terms of how respondents perceive favorable information conveyed in the stimulus. People less familiar with nanotechnology were more significantly affected by the type of definition they received.

  11. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction, owing to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration is proposed in this paper. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.

  12. A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.

    PubMed

    Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas

    2018-02-23

    We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in the local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results for some canonical dynamical models (stochastic processes with and without correlations, chaotic processes), complemented by extensive numerical simulations for a range of processes, which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state-of-the-art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines, including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics, and that a graph-theoretic extraction procedure allows these features to be used for statistical learning purposes.
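
    A toy stand-in for the idea, not the paper's graph-theoretic procedure: compare the local sharpness of maxima and minima, which is roughly symmetric for a plain random walk but not after a convexity-inducing transform that sharpens peaks relative to pits.

    ```python
    import numpy as np

    def peak_pit_sharpness(x):
        """Mean sharpness (curvature magnitude) at strict local maxima vs. minima."""
        x = np.asarray(x, float)
        inner, left, right = x[1:-1], x[:-2], x[2:]
        curv = left - 2 * inner + right                 # discrete second difference
        peak_sharp = -curv[(inner > left) & (inner > right)].mean()
        pit_sharp = curv[(inner < left) & (inner < right)].mean()
        return peak_sharp, pit_sharp

    rng = np.random.default_rng(3)
    sym = rng.normal(size=20000).cumsum()               # random walk: symmetric
    asym = np.exp(0.02 * sym)                           # exponentiated: peaks sharpen
    print(peak_pit_sharpness(sym))                      # ~equal values
    print(peak_pit_sharpness(asym))                     # peaks sharper than pits
    ```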

  13. Evolution and development of brain networks: from Caenorhabditis elegans to Homo sapiens.

    PubMed

    Kaiser, Marcus; Varier, Sreedevi

    2011-01-01

    Neural networks show a progressive increase in complexity during the time course of evolution. From diffuse nerve nets in Cnidaria to modular, hierarchical systems in macaque and humans, there is a gradual shift from simple processes involving a limited amount of tasks and modalities to complex functional and behavioral processing integrating different kinds of information from highly specialized tissue. However, studies in a range of species suggest that fundamental similarities, in spatial and topological features as well as in developmental mechanisms for network formation, are retained across evolution. 'Small-world' topology and highly connected regions (hubs) are prevalent across the evolutionary scale, ensuring efficient processing and resilience to internal (e.g. lesions) and external (e.g. environment) changes. Furthermore, in most species, even the establishment of hubs, long-range connections linking distant components, and a modular organization, relies on similar mechanisms. In conclusion, evolutionary divergence leads to greater complexity while following essential developmental constraints.

  14. Business Intelligence Applied to the ALMA Software Integration Process

    NASA Astrophysics Data System (ADS)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning for an astronomy project is a complex task, especially in a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project there is much valuable information about the process itself that you can collect. One way to receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during testing of the software, during the integration of the different components or, even worse, problems occurring in production. Usually little time is spent on analyzing these reports, but with some multidimensional processing you can extract valuable information from them that helps with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe the extraction, transformation and load process and how the data were processed. The main goal is to assess the software process and gain insights from this information.

  15. Temporal characteristics of audiovisual information processing.

    PubMed

    Fuhrmann Alpert, Galit; Hein, Grit; Tsai, Nancy; Naumer, Marcus J; Knight, Robert T

    2008-05-14

    In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency in which the blood oxygenation level-dependent signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, which is known as a region involved in object-related AV integration. Informative activity in the anterior portion of superior temporal gyrus, middle temporal gyrus, right occipital cortex, and inferior frontal cortex was found at a later latency. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
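
    The per-voxel latency analysis can be sketched with a plug-in mutual information estimate: shift the response, measure MI with the stimulus at each lag, and keep the most informative latency. The synthetic "voxel" below is an assumption for demonstration, not fMRI data.

    ```python
    import numpy as np

    def mutual_information(x, y, bins=8):
        """Plug-in MI estimate (in nats) from a 2-D histogram."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    # Synthetic "voxel": responds to the stimulus with a 2-sample lag plus noise.
    rng = np.random.default_rng(4)
    stim = rng.standard_normal(3000)
    bold = np.roll(stim, 2) + 0.8 * rng.standard_normal(3000)

    # Latency with the highest information content, as in the per-voxel analysis.
    mi = [mutual_information(stim[:-5], np.roll(bold, -k)[:-5]) for k in range(6)]
    print(int(np.argmax(mi)), [round(v, 3) for v in mi])  # argmax should be 2
    ```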

  16. Informational analysis involving application of complex information system

    NASA Astrophysics Data System (ADS)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. The approach has been applied to internal audit, integrating the accounting field into the information systems field. Technological advancements can improve the work performed by internal audit. We thus aim to find, within complex information systems, priorities for the internal audit work of a high-profile private institution of higher education. The method is quali-quantitative: strategic linguistic variables were defined and then transformed into quantitative variables through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the institution's strategic planning, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities for its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work in favor of attaining the objectives of the organization.

  17. Predictability decomposition detects the impairment of brain-heart dynamical networks during sleep disorders and their recovery with treatment

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Marinazzo, Daniele; Stramaglia, Sebastiano; Jurysta, Fabrice; Porta, Alberto; Nollo, Giandomenico

    2016-05-01

    This work introduces a framework to study the network formed by the autonomic component of heart rate variability (cardiac process η) and the amplitude of the different electroencephalographic waves (brain processes δ, θ, α, σ, β) during sleep. The framework exploits multivariate linear models to decompose the predictability of any given target process into measures of self-, causal and interaction predictability reflecting respectively the information retained in the process and related to its physiological complexity, the information transferred from the other source processes, and the information modified during the transfer according to redundant or synergistic interaction between the sources. The framework is here applied to the η, δ, θ, α, σ, β time series measured from the sleep recordings of eight severe sleep apnoea-hypopnoea syndrome (SAHS) patients studied before and after long-term treatment with continuous positive airway pressure (CPAP) therapy, and 14 healthy controls. Results show that the full and self-predictability of η, δ and θ decreased significantly in SAHS compared with controls, and were restored with CPAP for δ and θ but not for η. The causal predictability of η and δ occurred through significantly redundant source interaction during healthy sleep, which was lost in SAHS and recovered after CPAP. These results indicate that predictability analysis is a viable tool to assess the modifications of complexity and causality of the cerebral and cardiac processes induced by sleep disorders, and to monitor the restoration of the neuroautonomic control of these processes during long-term treatment.

  18. Analysis of haptic information in the cerebral cortex

    PubMed Central

    2016-01-01

    Haptic sensing of objects acquires information about a number of properties. This review summarizes current understanding about how these properties are processed in the cerebral cortex of macaques and humans. Nonnoxious somatosensory inputs, after initial processing in primary somatosensory cortex, are partially segregated into different pathways. A ventrally directed pathway carries information about surface texture into parietal opercular cortex and thence to medial occipital cortex. A dorsally directed pathway transmits information regarding the location of features on objects to the intraparietal sulcus and frontal eye fields. Shape processing occurs mainly in the intraparietal sulcus and lateral occipital complex, while orientation processing is distributed across primary somatosensory cortex, the parietal operculum, the anterior intraparietal sulcus, and a parieto-occipital region. For each of these properties, the respective areas outside primary somatosensory cortex also process corresponding visual information and are thus multisensory. Consistent with the distributed neural processing of haptic object properties, tactile spatial acuity depends on interaction between bottom-up tactile inputs and top-down attentional signals in a distributed neural network. Future work should clarify the roles of the various brain regions and how they interact at the network level. PMID:27440247

  19. Asymmetric multiple information cryptosystem based on chaotic spiral phase mask and random spectrum decomposition

    NASA Astrophysics Data System (ADS)

    Rafiq Abuturab, Muhammad

    2018-01-01

    A new asymmetric multiple-information cryptosystem based on chaotic spiral phase masks (CSPMs) and random spectrum decomposition is put forward. In the proposed system, each channel of a secret color image is first modulated with a CSPM and then gyrator transformed. The gyrator spectrum is randomly divided into two complex-valued masks. The same procedure is applied to multiple secret images to obtain their corresponding first and second complex-valued masks. Finally, the first and second masks of each channel are independently added to produce the first and second complex ciphertexts, respectively. The main feature of the proposed method is that different secret images are encrypted with different CSPMs, whose parameters serve as sensitive private decryption keys that are completely unknown to unauthorized users. Consequently, the proposed system is resistant to potential attacks. Moreover, the CSPMs are easy to position in the decoding process owing to their centering mark on the axial focal ring. The retrieved secret images are free from cross-talk noise effects. The decryption process can be implemented by optical experiment. Numerical simulation results demonstrate the viability and security of the proposed method.
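
    The random spectrum decomposition step is easy to sketch numerically. Below, a 2-D FFT and a random phase mask stand in for the gyrator transform and the CSPM (both stand-ins are assumptions); the point is that the two complex masks are individually noise-like yet recombine exactly.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    img = rng.random((64, 64))                      # one channel of a secret image

    # Stand-ins: a random phase mask and a 2-D FFT replace the chaotic spiral
    # phase mask and the gyrator transform used in the paper.
    phase_mask = np.exp(1j * 2 * np.pi * rng.random(img.shape))
    spectrum = np.fft.fft2(img * phase_mask)

    # Random spectrum decomposition: split the spectrum into two complex masks
    # that are individually noise-like but sum back to the original spectrum.
    mask1 = rng.random(img.shape) * np.exp(1j * 2 * np.pi * rng.random(img.shape))
    mask2 = spectrum - mask1                        # the second "ciphertext"

    # Decryption: recombine the masks, invert the transform, strip the phase mask.
    recovered = np.fft.ifft2(mask1 + mask2) / phase_mask
    print(np.allclose(recovered.real, img))         # True
    ```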

  20. Numerical information processing under the global rule expressed by the Euler-Riemann ζ function defined in the complex plane

    NASA Astrophysics Data System (ADS)

    Chatelin, Françoise

    2010-09-01

    When nonzero, the ζ function is intimately connected with numerical information processing. Two other functions play a key role, namely η(s) = ∑_{n≥1} (-1)^{n+1}/n^s and λ(s) = ∑_{n≥0} 1/(2n+1)^s. The paper opens with a survey of some of the seminal work of Euler [Mémoires Acad. Sci., Berlin 1768, 83 (1749)] and of the amazing theorem by Voronin [Math. USSR, Izv. 9, 443 (1975)]. Then, as a follow-up of Chatelin [Qualitative Computing. A Computational Journey into Nonlinearity (World Scientific, Singapore, in press)], we present a fresh look at the triple (η, ζ, λ) which suggests an elementary analysis based on the distances of the three complex numbers z, z/2, and 2/z to 0 and 1. This metric approach is used to contextualize any nonlinear computation when it is observed at a point describing a complex plane. The results, applied to ζ, η, and λ, shed a new epistemological light on the critical line. The suggested interpretation related to ζ carries computational significance.
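
    A quick numerical check, with mpmath, of the classical identities tying the triple together, η(s) = (1 - 2^{1-s}) ζ(s) and λ(s) = (1 - 2^{-s}) ζ(s); the sample point s is arbitrary, chosen with Re(s) > 1 so the defining sums converge.

    ```python
    from mpmath import mp, zeta, altzeta, nsum, inf

    mp.dps = 25
    s = mp.mpc(2, 3)                # any point with Re(s) > 1 for the direct sums

    eta = nsum(lambda n: (-1)**(int(n) + 1) / n**s, [1, inf])   # eta(s)
    lam = nsum(lambda n: 1 / (2 * n + 1)**s, [0, inf])          # lambda(s)

    # The classical identities linking the triple (eta, zeta, lambda):
    print(abs(eta - (1 - 2**(1 - s)) * zeta(s)))   # ~0
    print(abs(lam - (1 - 2**(-s)) * zeta(s)))      # ~0
    print(abs(eta - altzeta(s)))                   # mpmath's built-in eta agrees
    ```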

  1. Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model

    PubMed Central

    Marsh, John E.; Campbell, Tom A.

    2016-01-01

    The rostral brainstem receives both “bottom-up” input from the ascending auditory system and “top-down” descending corticofugal connections. Speech information passing through the inferior colliculus of elderly listeners reflects the periodicity envelope of a speech syllable. This information arguably also reflects a composite of temporal-fine-structure (TFS) information from the higher frequency vowel harmonics of that repeated syllable. The amplitude of those higher frequency harmonics, bearing even higher frequency TFS information, correlates positively with the word recognition ability of elderly listeners under reverberatory conditions. Also relevant is that working memory capacity (WMC), which is subject to age-related decline, constrains the processing of sounds at the level of the brainstem. Turning to the effects of a visually presented sensory or memory load on auditory processes, there is a load-dependent reduction of that processing, as manifest in the auditory brainstem responses (ABR) evoked by to-be-ignored clicks. Wave V decreases in amplitude with increases in the visually presented memory load. A visually presented sensory load also produces a load-dependent reduction of a slightly different sort: The sensory load of visually presented information limits the disruptive effects of background sound upon working memory performance. A new early filter model is thus advanced whereby systems within the frontal lobe (affected by sensory or memory load) cholinergically influence top-down corticofugal connections. Those corticofugal connections constrain the processing of complex sounds such as speech at the level of the brainstem. Selective attention thereby limits the distracting effects of background sound entering the higher auditory system via the inferior colliculus. Processing TFS in the brainstem relates to perception of speech under adverse conditions. Attentional selectivity is crucial when the signal heard is degraded or masked: e.g., speech in noise, speech in reverberatory environments. The assumptions of a new early filter model are consistent with these findings: A subcortical early filter, with a predictive selectivity based on acoustical (linguistic) context and foreknowledge, is under cholinergic top-down control. A prefrontal capacity limitation constrains this top-down control as is guided by the cholinergic processing of contextual information in working memory. PMID:27242396

  2. The strategic control of prospective memory monitoring in response to complex and probabilistic contextual cues.

    PubMed

    Bugg, Julie M; Ball, B Hunter

    2017-07-01

    Participants use simple contextual cues to reduce deployment of costly monitoring processes in contexts in which prospective memory (PM) targets are not expected. This study investigated whether this strategic monitoring pattern is observed in response to complex and probabilistic contextual cues. Participants performed a lexical decision task in which words or nonwords were presented in upper or lower locations on screen. The specific condition was informed that PM targets ("tor" syllable) would occur only in words in the upper location, whereas the nonspecific condition was informed that targets could occur in any location or word type. Context was blocked such that word type and location changed every 8 trials. In Experiment 1, the specific condition used the complex contextual cue to reduce monitoring in unexpected contexts relative to the nonspecific condition. This pattern was largely absent when the complex contextual cue was probabilistic (Experiment 2). Experiment 3 confirmed that strategic monitoring is observed for a complex cue that is deterministic, but not one that is probabilistic. Additionally, Experiments 1 and 3 demonstrated a disadvantage associated with strategic monitoring: namely, that the specific condition was less likely to respond to a PM target in an unexpected context. Experiment 3 provided evidence that this disadvantage is attributable to impaired noticing of the target. The novel findings suggest that use of a complex contextual cue per se is not a boundary condition for the strategic, context-specific allocation of monitoring processes to support prospective remembering; however, strategic monitoring is constrained by the predictive utility of the complex contextual cue.

  3. How does information congruence influence diagnosis performance?

    PubMed

    Chen, Kejin; Li, Zhizhong

    2015-01-01

    Diagnosis performance is critical for the safety of high-consequence industrial systems. It depends highly on the information provided, perceived, interpreted and integrated by operators. This article examines the influence of information congruence (congruent information vs. conflicting information vs. missing information) and its interaction with time pressure (high vs. low) on diagnosis performance on a simulated platform. The experimental results reveal that the participants confronted with conflicting information spent significantly more time generating correct hypotheses and rated the results with lower probability values than when confronted with the other two levels of information congruence and were more prone to arrive at a wrong diagnosis result than when they were provided with congruent information. This finding stresses the importance of the proper processing of non-congruent information in safety-critical systems. Time pressure significantly influenced display switching frequency and completion time. This result indicates the decisive role of time pressure. Practitioner Summary: This article examines the influence of information congruence and its interaction with time pressure on human diagnosis performance on a simulated platform. For complex systems in the process control industry, the results stress the importance of the proper processing of non-congruent information in safety-critical systems.

  4. Resolving complex research data management issues in biomedical laboratories: Qualitative study of an industry-academia collaboration.

    PubMed

    Myneni, Sahiti; Patel, Vimla L; Bova, G Steven; Wang, Jian; Ackerman, Christopher F; Berlinicke, Cynthia A; Chen, Steve H; Lindvall, Mikael; Zack, Donald J

    2016-04-01

    This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods of data analysis to (1) characterize specific problems faced by biomedical researchers with traditional information management practices, (2) identify intervention areas to introduce a new research information management system called Labmatrix, and finally to (3) evaluate and delineate important general collaboration (intervention) characteristics that can optimize outcomes of an implementation process in biomedical laboratories. Results emphasize the importance of end user perseverance, human-centric interoperability evaluation, and demonstration of return on investment of effort and time of laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation process of an information management system. Technology transfer experience in a complex environment such as the biomedical laboratory can be eased with the use of information systems that support human and cognitive interoperability. Such informatics features can also contribute to successful collaboration and hopefully to scientific productivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Electrospray ionization mass spectrometry for the hydrolysis complexes of cisplatin: implications for the hydrolysis process of platinum complexes.

    PubMed

    Feifan, Xie; Pieter, Colin; Jan, Van Bocxlaer

    2017-07-01

    Non-enzyme-dependent hydrolysis of the drug cisplatin is important for its mode of action and toxicity. However, the hydrolysis process of cisplatin is still not completely understood. In the present study, the hydrolysis of cisplatin in aqueous solution was systematically investigated by using electrospray ionization mass spectrometry coupled to liquid chromatography. A variety of previously unreported hydrolysis complexes corresponding to monomeric, dimeric and trimeric species were detected and identified. The characteristics of the Pt-containing complexes were investigated by using collision-induced dissociation (CID). The hydrolysis complexes demonstrate distinctive and correlative CID characteristics, which provide tools for informative identification. The most frequently observed dissociation mechanism was sequential loss of NH3, H2O and HCl. Loss of the Pt atom was observed as the final step of the CID process. The formation mechanisms of the observed complexes were explored and experimentally examined. The strongly bound dimeric species, which existed in solution, are assumed to be formed from the clustering of the parent compound and its monohydrated or dihydrated complexes. The role of the electrospray process in the formation of some of the observed ions was also evaluated, and the electrospray-ionization-related cold clusters were identified. The previously reported hydrolysis equilibria were tested and subsequently refined via a hydrolysis study, resulting in a renewed mechanistic equilibrium system for cisplatin as proposed from our results. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Safe medication management in specialized home healthcare - an observational study.

    PubMed

    Lindblad, Marléne; Flink, Maria; Ekstedt, Mirjam

    2017-08-24

    Medication management is a complex, error-prone process. The aim of this study was to explore what constitutes the complexity of the medication management process (MMP) in specialized home healthcare and how healthcare professionals handle this complexity. The study is theoretically grounded in resilience engineering. Data were collected during the MMP at three specialized home healthcare units in Sweden using two strategies: observation of workplaces and shadowing RNs in everyday work, including interviews. Transcribed material was analysed using grounded theory. The MMP in home healthcare was dynamic and complex, with unclear boundaries of responsibility, inadequate information systems and fluctuating work conditions. Healthcare professionals adapted their everyday clinical work by sharing responsibility while simultaneously being authoritative and preserving patients' active participation, autonomy and integrity. To promote a safe MMP, healthcare professionals constantly re-prioritized goals and handled gaps in communication and information transmission at a distance by creating new bridging solutions. Trade-offs and workarounds were necessary elements, but also posed a threat to patient safety, as these interim solutions were neither systematically evaluated nor developed into learning strategies. To manage a safe medication process in home healthcare, healthcare professionals need to adapt to fluctuating conditions and create bridging strategies through multiple parallel activities distributed over time, space and actors. The healthcare professionals' strategies could be integrated into continuous learning, while preserving boundaries of safety, instead of remaining more or less interim solutions. Patients and family caregivers as active partners in the MMP may be an underestimated resource for resilient home healthcare.

  7. SFA Ombudsman Serving Students. Interim Report, September 1999 to March 2000.

    ERIC Educational Resources Information Center

    Department of Education, Washington, DC. Student Financial Assistance.

    This 6-page pamphlet shares what the Student Financial Assistance (SFA) Office of the Ombudsman has learned from cases it has handled. An initial section offers information on the case management process, noting that the ombudsman process depends on the case's complexity and that many are resolved quickly. Next, it examines customer-reported case…

  8. Housing Seasonal Workers for the Minnesota Processed Vegetable Industry

    ERIC Educational Resources Information Center

    Ziebarth, Ann

    2006-01-01

    The place where we live and work is a reflection of a complex set of economic conditions and social relationships. Very little information is available regarding housing for Minnesota's migrant workers. It is estimated that approximately 20,000 people migrate to Minnesota each summer to work in the production and processing of green peas and sweet…

  9. Satellite on-board real-time SAR processor prototype

    NASA Astrophysics Data System (ADS)

    Bergeron, Alain; Doucet, Michel; Harnisch, Bernd; Suess, Martin; Marchese, Linda; Bourqui, Pascal; Desnoyers, Nicholas; Legros, Mathieu; Guillot, Ludovic; Mercier, Luc; Châteauneuf, François

    2017-11-01

    A Compact Real-Time Optronic SAR Processor has been successfully developed and tested up to a Technology Readiness Level of 4 (TRL4), the breadboard validation in a laboratory environment. SAR, or Synthetic Aperture Radar, is an active system allowing day and night imaging independent of the cloud coverage of the planet. The SAR raw data is a set of complex data for range and azimuth, which cannot be compressed. Specifically, for planetary missions and unmanned aerial vehicle (UAV) systems with limited communication data rates this is a clear disadvantage. SAR images are typically processed electronically applying dedicated Fourier transformations. This, however, can also be performed optically in real-time. Originally the first SAR images were optically processed. The optical Fourier processor architecture provides inherent parallel computing capabilities allowing real-time SAR data processing and thus the ability for compression and strongly reduced communication bandwidth requirements for the satellite. SAR signal return data are in general complex data. Both amplitude and phase must be combined optically in the SAR processor for each range and azimuth pixel. Amplitude and phase are generated by dedicated spatial light modulators and superimposed by an optical relay set-up. The spatial light modulators display the full complex raw data information over a two-dimensional format, one for the azimuth and one for the range. Since the entire signal history is displayed at once, the processor operates in parallel yielding real-time performances, i.e. without resulting bottleneck. Processing of both azimuth and range information is performed in a single pass. This paper focuses on the onboard capabilities of the compact optical SAR processor prototype that allows in-orbit processing of SAR images. Examples of processed ENVISAT ASAR images are presented. Various SAR processor parameters such as processing capabilities, image quality (point target analysis), weight and size are reviewed.
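
    The core operation the optical processor implements can be sketched digitally: SAR compression is matched filtering of the complex raw data in range and in azimuth, each step realized with Fourier transforms. A hedged digital analogue in Python (synthetic data; array sizes and chirp parameters are assumptions, not taken from the prototype):

      import numpy as np

      raw = np.random.randn(256, 512) + 1j * np.random.randn(256, 512)  # azimuth x range samples
      range_chirp = np.exp(1j * np.pi * 1e3 * np.linspace(-1, 1, 512) ** 2)
      azimuth_chirp = np.exp(1j * np.pi * 2e2 * np.linspace(-1, 1, 256) ** 2)

      # Range compression: multiply by the conjugate chirp spectrum along the range axis.
      rng = np.fft.ifft(np.fft.fft(raw, axis=1) * np.conj(np.fft.fft(range_chirp)), axis=1)
      # Azimuth compression: the same matched-filter step along the other axis.
      img = np.fft.ifft(np.fft.fft(rng, axis=0) * np.conj(np.fft.fft(azimuth_chirp))[:, None], axis=0)
      intensity = np.abs(img) ** 2  # focused image magnitude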

  10. Special Advanced Studies for Pollution Prevention. Delivery Order 0058: The Monitor - Spring 2000

    DTIC Science & Technology

    2001-04-01

    [Extraction fragment of a process-selection chart: evaluation criteria include process complexity, strippability, maturity, process type and chemistry, licensing requirements, and vendor information (e.g., Niplate 700); the surrounding pollution-prevention workflow includes identifying and evaluating potential alternatives, selecting the best alternative and developing the project, and prioritizing projects by commodity, across solution selection, planning, implementation, and evaluation phases.]

  11. Acquiring Expertise.

    DTIC Science & Technology

    1983-01-31

    [Extraction fragment citing LaBerge, D., & Samuels, S. J. (1974), "Toward a theory of automatic information processing in reading": the reading process is too complex to operate completely at the declarative processing level; it can only work well when… The remainder of the record is reference-list and distribution-list residue.]

  12. Evaluation of effectiveness of information systems implementation in organization (by example of ERP-systems)

    NASA Astrophysics Data System (ADS)

    Demyanova, O. V.; Andreeva, E. V.; Sibgatullina, D. R.; Kireeva-Karimova, A. M.; Gafurova, A. Y.; Zakirova, Ch S.

    2018-05-01

    ERP in a modern enterprise information system allows optimization of internal business processes, reduction of production costs, and greater attractiveness of enterprises to investors. It is an important component of success in competition and an important condition for attracting investment to key sectors of the state. A vivid example of such systems is the enterprise information system built on the ERP (Enterprise Resource Planning) methodology. ERP is an integrated set of methods, processes, technologies and tools. It is based on: supply chain management; advanced planning and scheduling; sales automation; a configuration tool; final resource planning; business intelligence; OLAP technology; an e-commerce module; and management of product data. The main purpose of ERP systems is the automation of interrelated processes of planning, accounting and management in key areas of the company. ERP systems are automated systems that effectively address complex problems, including optimal allocation of business resources and quick, efficient delivery of goods and services to the consumer. Knowledge embedded in ERP systems provides enterprise-wide automation, joining the activities of all functional departments of the company into a single complex system. At the level of qualitative estimates, most managers understand that the implementation of ERP systems is a necessary and useful procedure. Assessment of the effectiveness of information systems implementation is therefore relevant.

  13. Audio-based, unsupervised machine learning reveals cyclic changes in earthquake mechanisms in the Geysers geothermal field, California

    NASA Astrophysics Data System (ADS)

    Holtzman, B. K.; Paté, A.; Paisley, J.; Waldhauser, F.; Repetto, D.; Boschi, L.

    2017-12-01

    The earthquake process reflects complex interactions of stress, fracture and frictional properties. New machine learning methods reveal patterns in time-dependent spectral properties of seismic signals and enable identification of changes in faulting processes. Our methods are based closely on those developed for music information retrieval and voice recognition, using the spectrogram instead of the waveform directly. Unsupervised learning involves identification of patterns based on differences among signals without any additional information provided to the algorithm. Clustering of 46,000 earthquakes of $0.3
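
    In the spirit of the music-information-retrieval approach described, a hedged sketch of such a pipeline in Python (synthetic waveforms; the feature choice and cluster count are assumptions, not the authors' settings):

      import numpy as np
      from scipy.signal import spectrogram
      from sklearn.cluster import KMeans

      fs = 100.0                                # assumed sampling rate, Hz
      events = np.random.randn(50, 2048)        # one synthetic waveform per earthquake

      feats = []
      for w in events:
          f, t, S = spectrogram(w, fs=fs, nperseg=256)
          feats.append(np.log1p(S).mean(axis=1))  # mean log-power spectrum as the feature
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.array(feats))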

  14. Automated image processing of Landsat II digital data for watershed runoff prediction

    NASA Technical Reports Server (NTRS)

    Sasso, R. R.; Jensen, J. R.; Estes, J. E.

    1977-01-01

    Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention

  15. Directional dual-tree rational-dilation complex wavelet transform.

    PubMed

    Serbes, Gorkem; Gulcur, Halil Ozcan; Aydin, Nizamettin

    2014-01-01

    Dyadic discrete wavelet transform (DWT) has been used successfully in processing signals having non-oscillatory transient behaviour. However, due to the low Q-factor property of their wavelet atoms, the dyadic DWT is less effective in processing oscillatory signals such as embolic signals (ESs). ESs are extracted from quadrature Doppler signals, which are the output of Doppler ultrasound systems. In order to process ESs, firstly, a pre-processing operation known as phase filtering for obtaining directional signals from quadrature Doppler signals must be employed. Only then, wavelet based methods can be applied to these directional signals for further analysis. In this study, a directional dual-tree rational-dilation complex wavelet transform, which can be applied directly to quadrature signals and has the ability of extracting directional information during analysis, is introduced.
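
    A minimal sketch of the phase-filtering pre-processing step mentioned above (an assumed frequency-domain implementation, not the authors' code): the complex quadrature signal is split into two directional signals from its positive- and negative-frequency halves.

      import numpy as np

      fs = 8000.0
      t = np.arange(0, 1, 1 / fs)
      quadrature = np.exp(2j * np.pi * 500 * t) + 0.5 * np.exp(-2j * np.pi * 200 * t)

      spec = np.fft.fft(quadrature)
      half = spec.size // 2
      fwd, rev = spec.copy(), spec.copy()
      fwd[half:] = 0                      # keep positive frequencies (forward flow)
      rev[:half] = 0                      # keep negative frequencies (reverse flow)
      flow_toward = np.fft.ifft(fwd)      # directional signal, flow toward the probe
      flow_away = np.fft.ifft(rev)        # directional signal, flow away from the probe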

  16. Harvesting Social Signals to Inform Peace Processes Implementation and Monitoring

    PubMed Central

    Nigam, Aastha; Dambanemuya, Henry K.; Joshi, Madhav; Chawla, Nitesh V.

    2017-01-01

    Peace processes are complex, protracted, and contentious, involving significant bargaining and compromising among various societal and political stakeholders. In civil war terminations, it is pertinent to measure the pulse of the nation to ensure that the peace process is responsive to citizens' concerns. Social media yields tremendous power as a tool for dialogue, debate, organization, and mobilization, thereby adding more complexity to the peace process. Using Colombia's final peace agreement and national referendum as a case study, we investigate the influence of two important indicators: intergroup polarization and public sentiment toward the peace process. We present a detailed linguistic analysis to detect intergroup polarization and a predictive model that leverages Tweet structure, content, and user-based features to predict public sentiment toward the Colombian peace process. We demonstrate that had proaccord stakeholders leveraged public opinion from social media, the outcome of the Colombian referendum could have been different. PMID:29235916

  17. Impacts of complex behavioral responses on asymmetric interacting spreading dynamics in multiplex networks

    PubMed Central

    Liu, Quan-Hui; Wang, Wei; Tang, Ming; Zhang, Hai-Feng

    2016-01-01

    Information diffusion and disease spreading in communication-contact layered networks are typically asymmetrically coupled with each other, such that disease spreading can be significantly affected by how an individual who becomes aware of the disease responds to it. Many recent studies have demonstrated that human behavioral adoption is a complex and non-Markovian process, where the probability of behavior adoption depends on the cumulative number of times the information has been received and on the social reinforcement effect of the cumulative information. In this paper, the impacts of such a non-Markovian vaccination adoption behavior on the epidemic dynamics and the control effects are explored. It is found that this complex adoption behavior in the communication layer can significantly enhance the epidemic threshold and reduce the final infection rate. By defining the social cost as the total cost of vaccination and treatment, it can be seen that there exist an optimal social reinforcement effect and an optimal information transmission rate that allow the minimal social cost. Moreover, a mean-field theory is developed to verify the correctness of the simulation results. PMID:27156574

  1. Structural study of complexes formed by acidic and neutral organophosphorus reagents

    DOE PAGES

    Braatz, Alexander D.; Antonio, Mark R.; Nilsson, Mikael

    2016-12-23

    The coordination of the trivalent 4f ions, Ln = La3+, Dy3+, and Lu3+, with neutral and acidic organophosphorus reagents, both individually and combined, was studied by use of X-ray absorption spectroscopy. These studies provide metrical information about the interatomic interactions between these cations and the ligands tri-n-butyl phosphate (TBP) and di-n-butyl phosphoric acid (HDBP), whose behavior is of practical importance to chemical separation processes currently used on an industrial scale. Previous studies have suggested the existence of complexes involving a mixture of ligands, accounting for extraction synergy. Through systematic variation of the aqueous phase acidity and extractant concentration and combination, we have found that complexes of Ln with TBP:HDBP in any mixture, and with HDBP alone, involve direct Ln–O interactions with 6 oxygen atoms and distant Ln–P interactions with on average 3–5 phosphorus atoms per Ln ion. It was also found that Ln complexes formed by TBP alone seem to favor eight-oxygen coordination, though we were unable to obtain metrical results regarding the distant Ln–P interactions due to the low signal, attributed to a lower concentration of Ln ions in the organic phases. Our study does not support the existence of mixed Ln–TBP–HDBP complexes but, rather, indicates that the lanthanides are extracted as either Ln–HDBP complexes or Ln–TBP complexes and that these complexes exist in different ratios depending on the conditions of the extraction system. Furthermore, this fundamental structural information offers insight into the solvent extraction processes that are taking place and is of particular importance to issues arising from the separation and disposal of radioactive materials from used nuclear fuel.

  2. On the evolution of misunderstandings about evolutionary psychology.

    PubMed

    Young, J; Persell, R

    2000-04-01

    Some of the controversy surrounding evolutionary explanations of human behavior may be due to cognitive information-processing patterns that are themselves the result of evolutionary processes. Two such patterns are (1) the tendency to oversimplify information so as to reduce demand on cognitive resources and (2) our strong desire to generate predictability and stability from perceptions of the external world. For example, research on social stereotyping has found that people tend to focus automatically on simplified social-categorical information, to use such information when deciding how to behave, and to rely on such information even in the face of contradictory evidence. Similarly, an undying debate over nature vs. nurture is shaped by various data-reduction strategies that frequently oversimplify, and thus distort, the intent of the supporting arguments. This debate is also often marked by an assumption that either the nature or the nurture domain may be justifiably excluded at an explanatory level because one domain appears to operate in a sufficiently stable and predictable way for a particular argument. As a result, critiques inveighed against evolutionary explanations of behavior often incorporate simplified (and erroneous) assumptions about either the mechanics of how evolution operates or the inevitable implications of evolution for understanding human behavior. The influences of these tendencies are applied to a discussion of the heritability of behavioral characteristics. It is suggested that the common view that Mendelian genetics can explain the heritability of complex behaviors, with a one-gene-one-trait process, is misguided. Complex behaviors are undoubtedly a product of a more complex interaction between genes and environment, ensuring that both nature and nurture must be accommodated in a yet-to-be-developed post-Mendelian model of genetic influence. As a result, current public perceptions of evolutionary explanations of behavior are handicapped by the lack of clear articulation of the relationship between inherited genes and manifest behavior.

  3. Biological network extraction from scientific literature: state of the art and challenges.

    PubMed

    Li, Chen; Liakata, Maria; Rebholz-Schuhmann, Dietrich

    2014-09-01

    Networks of molecular interactions explain complex biological processes, and all known information on molecular events is contained in a number of public repositories including the scientific literature. Metabolic and signalling pathways are often viewed separately, even though both types are composed of interactions involving proteins and other chemical entities. It is necessary to be able to combine data from all available resources to judge the functionality, complexity and completeness of any given network overall, but especially the full integration of relevant information from the scientific literature is still an ongoing and complex task. Currently, the text-mining research community is steadily moving towards processing the full body of the scientific literature by making use of rich linguistic features such as full text parsing, to extract biological interactions. The next step will be to combine these with information from scientific databases to support hypothesis generation for the discovery of new knowledge and the extension of biological networks. The generation of comprehensive networks requires technologies such as entity grounding, coordination resolution and co-reference resolution, which are not fully solved and are required to further improve the quality of results. Here, we analyse the state of the art for the extraction of network information from the scientific literature and the evaluation of extraction methods against reference corpora, discuss challenges involved and identify directions for future research. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  4. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and intra-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distribute architecture by relying on an experimental framework.

  5. Renewal Processes in the Critical Brain

    NASA Astrophysics Data System (ADS)

    Allegrini, Paolo; Paradisi, Paolo; Menicucci, Danilo; Gemignani, Angelo

    We describe herein multidisciplinary research that develops and applies concepts from the theory of complexity, in turn stemming from recent advances in statistical physics, to cognitive neuroscience. We discuss (define) complexity and how the human brain is a paradigm of it. We discuss how the hypothesis of brain activity dynamically behaving as a critical system is gaining momentum in the literature; we then focus on a feature of critical systems (hence of the brain), namely the intermittent passage between metastable states, marked by events that locally reset the memory but give rise to correlation functions with infinite correlation times. The events, extracted from multi-channel electroencephalograms, mark (are interpreted as) a birth/death process of cooperation, namely of system elements being recruited into collective states. Finally, we discuss a recently discovered form of control (in the form of a new Linear Response Theory) that allows optimized information transmission between complex systems, named Complexity Matching.

  6. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
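
    The gap between C and Cq can be reproduced for the standard two-state "perturbed coin" example used in this line of work (a hedged sketch in Python; the flip probability p is an assumption, not the experiment's value):

      import numpy as np

      p = 0.4                                    # probability the coin flips each step (assumed)
      C = 1.0                                    # two equiprobable causal states: H = 1 bit

      overlap = 2 * np.sqrt(p * (1 - p))         # overlap of the two encoded quantum states
      eigs = np.array([(1 + overlap) / 2, (1 - overlap) / 2])
      Cq = float(-(eigs * np.log2(eigs)).sum())  # von Neumann entropy, well below 1 bit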

  7. Evaluation of terrain complexity by autocorrelation. [geomorphology and geobotany]

    NASA Technical Reports Server (NTRS)

    Craig, R. G.

    1982-01-01

    The topographic complexity of various sections of the Ozark, Appalachian, and Interior Low Plateaus, as well as of the New England, Piedmont, Blue Ridge, Ouachita, and Valley and Ridge Provinces of the Eastern United States, was characterized. The variability of autocorrelation within a small area (7 1/2-minute quadrangle) was compared with the variability at widely separated and diverse areas within the same physiographic region, to measure the degree of uniformity of the processes that can be expected to be encountered within a given physiographic province. The variability of autocorrelation across the eight geomorphic regions was compared and contrasted. The total study area was partitioned into subareas homogeneous in terrain complexity. The relation between the complexity measured, the geomorphic process mix implied, and the way in which geobotanical information is modified into a more or less recognizable entity is demonstrated. Sampling strategy is described.
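
    A toy illustration of the underlying measure (a one-dimensional form is assumed here; the study worked with terrain quadrangles): the faster the autocorrelation of an elevation transect decays with lag, the rougher, and hence more complex, the terrain.

      import numpy as np

      elevation = np.cumsum(np.random.randn(1024))   # synthetic elevation transect

      def autocorr(z, lag):
          z = z - z.mean()
          return float(np.dot(z[:-lag], z[lag:]) / np.dot(z, z))

      decay = [autocorr(elevation, k) for k in range(1, 50)]  # complexity signature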

  8. Toward a Practical Model of Cognitive/Information Task Analysis and Schema Acquisition for Complex Problem-Solving Situations.

    ERIC Educational Resources Information Center

    Braune, Rolf; Foshay, Wellesley R.

    1983-01-01

    The proposed three-step strategy for research on human information processing--concept hierarchy analysis, analysis of example sets to teach relations among concepts, and analysis of problem sets to build a progressively larger schema for the problem space--may lead to practical procedures for instructional design and task analysis. Sixty-four…

  9. Applying Analogical Reasoning Techniques for Teaching XML Document Querying Skills in Database Classes

    ERIC Educational Resources Information Center

    Mitri, Michel

    2012-01-01

    XML has become the most ubiquitous format for exchange of data between applications running on the Internet. Most Web Services provide their information to clients in the form of XML. The ability to process complex XML documents in order to extract relevant information is becoming as important a skill for IS students to master as querying…
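
    As a concrete illustration of the skill in question (a made-up document, not an example from the article), extracting information from XML with path-style queries in Python:

      import xml.etree.ElementTree as ET

      doc = """<catalog>
        <book id="b1"><title>Databases</title><price>42.00</price></book>
        <book id="b2"><title>XQuery Basics</title><price>35.50</price></book>
      </catalog>"""

      root = ET.fromstring(doc)
      for book in root.findall("./book"):        # XPath-like location path
          print(book.get("id"), book.findtext("title"), float(book.findtext("price")))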

  10. Social scientist's viewpoint on conflict management

    USGS Publications Warehouse

    Ertel, Madge O.

    1990-01-01

    Social scientists can bring to the conflict-management process objective, reliable information needed to resolve increasingly complex issues. Engineers need basic training in the principles of the social sciences and in strategies for public involvement. All scientists need to be sure that the information they provide is unbiased by their own value judgments and that fair standards and open procedures govern its use.

  11. Context Matters: The Value of Analyzing Human Factors within Educational Contexts as a Way of Informing Technology-Related Decisions within Design Research

    ERIC Educational Resources Information Center

    MacKinnon, Kim

    2012-01-01

    While design research can be useful for designing effective technology integrations within complex social settings, it currently fails to provide concrete methodological guidelines for gathering and organizing information about the research context, or for determining how such analyses ought to guide the iterative design and innovation process. A…

  12. How Linearity and Structural Complexity Interact and Affect the Recognition of Italian Derived Words.

    PubMed

    Bridgers, Franca Ferrari; Kacinik, Natalie

    2017-02-01

    The majority of words in most languages consist of derived poly-morphemic words but a cross-linguistic review of the literature (Amenta and Crepaldi in Front Psychol 3:232-243, 2012) shows a contradictory picture with respect to how such words are represented and processed. The current study examined the effects of linearity and structural complexity on the processing of Italian derived words. Participants performed a lexical decision task on three types of prefixed and suffixed words and nonwords differing in the complexity of their internal structure. The processing of these words was indeed found to vary according to the nature of the affixes, the order in which they appear, and the type of information the affix encodes. The results thus indicate that derived words are not a uniform class and the best account of these findings appears to be a constraint-based or probabilistic multi-route processing model (e.g., Kuperman et al. in Lang Cogn Process 23:1089-1132, 2008; J Exp Psychol Hum Percept Perform 35:876-895, 2009; J Mem Lang 62:83-97, 2010).

  13. Motivation alters impression formation and related neural systems

    PubMed Central

    Zaki, Jamil; Ambady, Nalini

    2017-01-01

    Observers frequently form impressions of other people based on complex or conflicting information. Rather than being objective, these impressions are often biased by observers’ motives. For instance, observers often downplay negative information they learn about ingroup members. Here, we characterize the neural systems associated with biased impression formation. Participants learned positive and negative information about ingroup and outgroup social targets. Following this information, participants worsened their impressions of outgroup, but not ingroup, targets. This tendency was associated with a failure to engage neural structures including the lateral prefrontal cortex, dorsal anterior cingulate cortex, temporoparietal junction, insula and precuneus when processing negative information about ingroup (but not outgroup) targets. To the extent that participants engaged these regions while learning negative information about ingroup members, they exhibited less ingroup bias in their impressions. These data are consistent with a model of ‘effortless bias’, under which perceivers fail to process goal-inconsistent information in order to maintain desired conclusions. PMID:27798250

  14. Information driven self-organization of complex robotic behaviors.

    PubMed

    Martius, Georg; Der, Ralf; Ay, Nihat

    2013-01-01

    Information theory is a powerful tool to express principles to drive autonomous systems because it is domain invariant and allows for an intuitive interpretation. This paper studies the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process as a driving force to generate behavior. We study nonlinear and nonstationary systems and introduce the time-local predicting information (TiPI) which allows us to derive exact results together with explicit update rules for the parameters of the controller in the dynamical systems framework. In this way the information principle, formulated at the level of behavior, is translated to the dynamics of the synapses. We underpin our results with a number of case studies with high-dimensional robotic systems. We show the spontaneous cooperativity in a complex physical system with decentralized control. Moreover, a jointly controlled humanoid robot develops a high behavioral variety depending on its physics and the environment it is dynamically embedded into. The behavior can be decomposed into a succession of low-dimensional modes that increasingly explore the behavior space. This is a promising way to avoid the curse of dimensionality which hinders learning systems to scale well.
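
    The driving quantity is the mutual information between the past and the future of the sensor process, PI = I(past; future) = H(future) − H(future | past). A hedged one-step illustration for a binary Markov chain (a toy setting, not the paper's robotic systems):

      import numpy as np

      P = np.array([[0.9, 0.1], [0.2, 0.8]])     # p(x_{t+1} | x_t)
      evals, evecs = np.linalg.eig(P.T)          # stationary distribution of the chain
      pi = np.real(evecs[:, np.argmax(np.real(evals))])
      pi /= pi.sum()

      def H(p):
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      H_future = H(pi @ P)                                 # H(x_{t+1})
      H_cond = sum(pi[i] * H(P[i]) for i in range(2))      # H(x_{t+1} | x_t)
      predictive_information = H_future - H_cond           # one-step PI in bits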

  15. The way to universal and correct medical presentation of diagnostic informations for complex spectrophotometry noninvasive medical diagnostic systems

    NASA Astrophysics Data System (ADS)

    Rogatkin, Dmitrii A.; Tchernyi, Vladimir V.

    2003-07-01

    Optical noninvasive diagnostic systems are now widely applied and investigated in different areas of medicine. One such technique is noninvasive spectrophotometry, a complex diagnostic technique comprising elastic scattering spectroscopy, absorption spectroscopy, fluorescence diagnostics, photoplethysmography, etc. Today, many real optical diagnostic systems report only technical parameters and physical data as the result of the diagnostic procedure. It is clear, however, that medical staff need information presented in more convenient medical terms. This presentation outlines a general way to develop diagnostic system software that can carry diagnostic data through full processing from the physical to the medical level. It is shown that this process is a three-level procedure and that the main diagnostic result for noninvasive spectrophotometry methods, the biochemical and morphological composition of the tested tissues, arises at the second level of calculation.

  16. Harnessing glycomics technologies: integrating structure with function for glycan characterization

    PubMed Central

    Robinson, Luke N.; Artpradit, Charlermchai; Raman, Rahul; Shriver, Zachary H.; Ruchirawat, Mathuros; Sasisekharan, Ram

    2013-01-01

    Glycans, or complex carbohydrates, are a ubiquitous class of biological molecules which impinge on a variety of physiological processes ranging from signal transduction to tissue development and microbial pathogenesis. In comparison to DNA and proteins, glycans present unique challenges to the study of their structure and function owing to their complex and heterogeneous structures and the dominant role played by multivalency in their sequence-specific biological interactions. Arising from these challenges, there is a need to integrate information from multiple complementary methods to decode structure-function relationships. Focusing on acidic glycans, we describe here key glycomics technologies for characterizing their structural attributes, including linkage, modifications, and topology, as well as for elucidating their role in biological processes. Two case studies, one involving sialylated branched glycans and the other sulfated glycosaminoglycans, are used to highlight how integration of orthogonal information from diverse datasets enables rapid convergence of glycan characterization for development of robust structure-function relationships. PMID:22522536

  17. Cliques of Neurons Bound into Cavities Provide a Missing Link between Structure and Function.

    PubMed

    Reimann, Michael W; Nolte, Max; Scolamiero, Martina; Turner, Katharine; Perin, Rodrigo; Chindemi, Giuseppe; Dłotko, Paweł; Levi, Ran; Hess, Kathryn; Markram, Henry

    2017-01-01

    The lack of a formal link between neural network structure and its emergent function has hampered our understanding of how the brain processes information. We have now come closer to describing such a link by taking the direction of synaptic transmission into account, constructing graphs of a network that reflect the direction of information flow, and analyzing these directed graphs using algebraic topology. Applying this approach to a local network of neurons in the neocortex revealed a remarkably intricate and previously unseen topology of synaptic connectivity. The synaptic network contains an abundance of cliques of neurons bound into cavities that guide the emergence of correlated activity. In response to stimuli, correlated activity binds synaptically connected neurons into functional cliques and cavities that evolve in a stereotypical sequence toward peak complexity. We propose that the brain processes stimuli by forming increasingly complex functional cliques and cavities.
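
    A toy version of the central object (one common formalization is assumed here: a directed clique is a set of neurons admitting an ordering in which every earlier node sends an edge to every later one):

      import itertools
      import networkx as nx

      G = nx.gnp_random_graph(30, 0.2, directed=True, seed=1)  # stand-in connectome

      def is_directed_clique(nodes):
          # some ordering of the nodes has all edges pointing "forward"
          return any(all(G.has_edge(u, v) for u, v in itertools.combinations(p, 2))
                     for p in itertools.permutations(nodes))

      n_3cliques = sum(is_directed_clique(c) for c in itertools.combinations(G, 3))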

  18. Connection, regulation, and care plan innovation: a case study of four nursing homes.

    PubMed

    Colón-Emeric, Cathleen S; Lekan-Rutledge, Deborah; Utley-Smith, Queen; Ammarell, Natalie; Bailey, Donald; Piven, Mary L; Corazzini, Kirsten; Anderson, Ruth A

    2006-01-01

    We describe how connections among nursing home staff impact the care planning process using a complexity science framework. We completed six-month case studies of four nursing homes. Field observations (n = 274), shadowing encounters (n = 69), and in-depth interviews (n = 122) of 390 staff at all levels were conducted. Qualitative analysis produced a conceptual/thematic description and complexity science concepts were used to produce conceptual insights. We observed that greater levels of staff connection were associated with higher care plan specificity and innovation. Connection of the frontline nursing staff was crucial for (1) implementation of the formal care plan and (2) spontaneous informal care planning responsive to changing resident needs. Although regulations could theoretically improve cognitive diversity and information flow in care planning, we observed instances of regulatory oversight resulting in less specific care plans and abandonment of an effective care planning process. Interventions which improve staff connectedness may improve resident outcomes.

  19. The Deceptively Simple N170 Reflects Network Information Processing Mechanisms Involving Visual Feature Coding and Transfer Across Hemispheres

    PubMed Central

    Ince, Robin A. A.; Jaworska, Katarzyna; Gross, Joachim; Panzeri, Stefano; van Rijsbergen, Nicola J.; Rousselet, Guillaume A.; Schyns, Philippe G.

    2016-01-01

    A key to understanding visual cognition is to determine “where”, “when”, and “how” brain responses reflect the processing of the specific visual features that modulate categorization behavior—the “what”. The N170 is the earliest Event-Related Potential (ERP) that preferentially responds to faces. Here, we demonstrate that a paradigmatic shift is necessary to interpret the N170 as the product of an information processing network that dynamically codes and transfers face features across hemispheres, rather than as a local stimulus-driven event. Reverse-correlation methods coupled with information-theoretic analyses revealed that visibility of the eyes influences face detection behavior. The N170 initially reflects coding of the behaviorally relevant eye contralateral to the sensor, followed by a causal communication of the other eye from the other hemisphere. These findings demonstrate that the deceptively simple N170 ERP hides a complex network information processing mechanism involving initial coding and subsequent cross-hemispheric transfer of visual features. PMID:27550865

  1. Image processing of metal surface with structured light

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Feng, Chang; Wang, Congzheng

    2014-09-01

    In a structured light vision measurement system, the ideal image of the structured light stripe contains, apart from the black background, only the gray-level information at the position of the stripe. The actual image, however, contains image noise, complex background and other content that does not belong to the stripe and interferes with the useful information. To extract the stripe center on a metal surface accurately, a new processing method was presented. Adaptive median filtering preliminarily removes the noise, and the noise introduced by the CCD camera and the measurement environment is further removed with a difference-image method. To highlight fine details and enhance the blurred regions between the stripe and the noise, a sharpening algorithm is used which combines the best features of the Laplacian and Sobel operators. Morphological opening and closing operations are used to compensate for the loss of information. Experimental results show that this method is effective in the image processing: it not only suppresses the interfering information but also heightens contrast, which benefits subsequent processing.
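
    A hedged sketch of the described pipeline in OpenCV-Python (file names and parameter values are assumptions; a plain median filter stands in for the adaptive median filter):

      import cv2

      img = cv2.imread("stripe.png", cv2.IMREAD_GRAYSCALE)        # hypothetical frame with the stripe
      bg = cv2.imread("background.png", cv2.IMREAD_GRAYSCALE)     # hypothetical frame without it

      den = cv2.medianBlur(img, 5)                 # suppress impulse noise
      diff = cv2.absdiff(den, bg)                  # difference image removes static background

      lap = cv2.convertScaleAbs(cv2.Laplacian(diff, cv2.CV_16S, ksize=3))
      sob = cv2.convertScaleAbs(cv2.Sobel(diff, cv2.CV_16S, 1, 0, ksize=3))
      sharp = cv2.addWeighted(lap, 0.5, sob, 0.5, 0)  # combine Laplacian and Sobel emphasis

      kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
      opened = cv2.morphologyEx(sharp, cv2.MORPH_OPEN, kernel)    # remove small bright specks
      closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)  # fill small gaps in the stripe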

  2. Landauer in the Age of Synthetic Biology: Energy Consumption and Information Processing in Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Mehta, Pankaj; Lang, Alex H.; Schwab, David J.

    2016-03-01

    A central goal of synthetic biology is to design sophisticated synthetic cellular circuits that can perform complex computations and information processing tasks in response to specific inputs. The tremendous advances in our ability to understand and manipulate cellular information processing networks raises several fundamental physics questions: How do the molecular components of cellular circuits exploit energy consumption to improve information processing? Can one utilize ideas from thermodynamics to improve the design of synthetic cellular circuits and modules? Here, we summarize recent theoretical work addressing these questions. Energy consumption in cellular circuits serves five basic purposes: (1) increasing specificity, (2) manipulating dynamics, (3) reducing variability, (4) amplifying signal, and (5) erasing memory. We demonstrate these ideas using several simple examples and discuss the implications of these theoretical ideas for the emerging field of synthetic biology. We conclude by discussing how it may be possible to overcome these limitations using "post-translational" synthetic biology that exploits reversible protein modification.

  3. Informed Consent and Genomic Incidental Findings: IRB Chair Perspectives

    PubMed Central

    Simon, Christian M.; Williams, Janet K.; Shinkunas, Laura; Brandt, Debra; Daack-Hirsch, Sandra; Driessnack, Martha

    2013-01-01

    It is unclear how genomic incidental finding (GIF) prospects should be addressed in informed consent processes. An exploratory study on this topic was conducted with 34 purposively sampled Chairs of institutional review boards (IRBs) at centers conducting genome-wide association studies. Most Chairs (96%) reported no knowledge of local IRB requirements regarding GIFs and informed consent. Chairs suggested consent processes should address the prospect of, and study disclosure policy on, GIFs; GIF management and follow-up; potential clinical significance of GIFs; potential risks of GIF disclosure; an opportunity for participants to opt out of GIF disclosure; and duration of the researcher's duty to disclose GIFs. Chairs were concerned about participant disclosure preferences changing over time; inherent limitations in determining the scope and accuracy of claims about GIFs; and making consent processes longer and more complex. IRB Chair and other stakeholder perspectives can help advance informed consent efforts to accommodate GIF prospects. PMID:22228060

  4. Informed consent and genomic incidental findings: IRB chair perspectives.

    PubMed

    Simon, Christian M; Williams, Janet K; Shinkunas, Laura; Brandt, Debra; Daack-Hirsch, Sandra; Driessnack, Martha

    2011-12-01

    It is unclear how genomic incidental finding (GIF) prospects should be addressed in informed consent processes. An exploratory study on this topic was conducted with 34 purposively sampled Chairs of institutional review boards (IRBs) at centers conducting genome-wide association studies. Most Chairs (96%) reported no knowledge of local IRB requirements regarding GIFs and informed consent. Chairs suggested consent processes should address the prospect of, and study disclosure policy on, GIFs; GIF management and follow-up; potential clinical significance of GIFs; potential risks of GIF disclosure; an opportunity for participants to opt out of GIF disclosure; and duration of the researcher's duty to disclose GIFs. Chairs were concerned about participant disclosure preferences changing over time; inherent limitations in determining the scope and accuracy of claims about GIFs; and making consent processes longer and more complex. IRB Chair and other stakeholder perspectives can help advance informed consent efforts to accommodate GIF prospects.

  5. Computer simulation of functioning of elements of security systems

    NASA Astrophysics Data System (ADS)

    Godovykh, A. V.; Stepanov, B. P.; Sheveleva, A. A.

    2017-01-01

    The article is devoted to the development of an information complex for simulating the functioning of security system elements. The complex is described in terms of its main objectives, its design concept, and the interrelation of its main elements. The proposed approach to computer simulation makes it possible to simulate the operation of the security system for training security staff under both normal and emergency conditions.

  6. Artificial intelligence techniques for colorectal cancer drug metabolism: ontology and complex network.

    PubMed

    Martínez-Romero, Marcos; Vázquez-Naya, José M; Rabuñal, Juan R; Pita-Fernández, Salvador; Macenlle, Ramiro; Castro-Alvariño, Javier; López-Roses, Leopoldo; Ulla, José L; Martínez-Calvo, Antonio V; Vázquez, Santiago; Pereira, Javier; Porto-Pazos, Ana B; Dorado, Julián; Pazos, Alejandro; Munteanu, Cristian R

    2010-05-01

    Colorectal cancer is one of the most frequent types of cancer in the world and has an important social impact. Understanding the specific metabolism of this disease and the transformations of the specific drugs will help in finding effective prevention, diagnosis and treatment of colorectal cancer. All the terms that describe the drug metabolism contribute to the construction of an ontology that helps scientists to link correlated information and to find the most useful data on this topic. The molecular components involved in this metabolism are embedded in complex networks, such as metabolic pathways, that describe all the molecular interactions in colorectal cancer. Graph-based processing of biological information leads to a numerical characterization of the colorectal cancer drug metabolic network through invariant values named topological indices. This method can thus help scientists to study the most important elements in the metabolic pathways and the dynamics of the networks during mutation, denaturation or evolution for any type of disease. This review presents the latest studies on the ontology and complex networks of colorectal cancer drug metabolism and a basic topological characterization of the drug metabolic process sub-ontology from the Gene Ontology.
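
    To make the notion of topological indices concrete, the sketch below computes a few standard graph invariants for a small invented interaction network with NetworkX; the nodes, edges, and index choices are illustrative, not taken from the review.

      # Toy example: topological indices of an invented metabolic graph.
      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([
          ("drug", "enzyme_A"), ("enzyme_A", "metabolite_1"),
          ("metabolite_1", "enzyme_B"), ("enzyme_B", "metabolite_2"),
          ("drug", "transporter"), ("transporter", "metabolite_2"),
      ])

      # Wiener index: sum of shortest-path lengths over all node pairs,
      # one of the classical invariants used to characterize networks.
      print("Wiener index:", nx.wiener_index(G))
      print("Average clustering:", nx.average_clustering(G))
      print("Degrees:", dict(G.degree()))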

  7. Beyond Wiki to Judgewiki for Transparent Climate Change Decisions

    NASA Astrophysics Data System (ADS)

    Capron, M. E.

    2008-12-01

    Climate Change is like the prisoner's dilemma, a zero-sum game, or cheating in sports. Everyone and every country is tempted to selfishly maintain or advance their standard of living. The tremendous difference between standards of living amplifies the desire to opt out of Climate Change solutions adverse to economic competitiveness. Climate Change is also exceedingly complex. No one person, one organization, one country, or partial collection of countries has the capacity and the global support needed to make decisions on Climate Change solutions. There are thousands of potential actions, tens of thousands of known and unknown environmental and economic impacts. Some actions are belatedly found to be unsustainable beyond token volumes, corn ethanol or soy-biodiesel for example. Mankind can address human nature and complexity with a globally transparent information and decision process available to all 7 billion of us. We need a process that builds trust and simplifies complexity. Fortunately, we have the Internet for trust building communication and computers to simplify complexity. Mankind can produce new software tailored to the challenge. We would combine group information collection software (a wiki) with a decision-matrix (a judge), market forecasting, and video games to produce the tool mankind needs for trust building transparent decisions on Climate Change actions. The resulting software would be a judgewiki.

  8. Prospective pilots of routine data capture by paediatricians in clinics and validation of the Disabilities Complexity Scale.

    PubMed

    Horridge, Karen A; Mcgarry, Kenneth; Williams, Jane; Whitlingum, Gabriel

    2016-06-01

    To pilot prospective data collection by paediatricians at the point of care across England using a defined terminology set; demonstrate feasibility of data collection and utility of data outputs; and confirm that counting the number of needs per child is valid for quantifying complexity. Paediatricians in 16 hospital and community settings collected and anonymized data. Participants completed a survey regarding the process. Data were analysed using R version 3.1.2. Overall, 8117 needs were recorded from 1224 consultations. Sixteen clinicians responded positively about the process and utility of data collection. The sum of needs varied significantly (p<0.01) by level of gross motor function ascertained using the Gross Motor Function Classification System for children with cerebral palsy; by epilepsy severity, as defined by the level of expertise required to manage it; and by severity of intellectual disability. Prospective data collection at the point of clinical care proved possible without disrupting clinics, even for those with the most complex needs, and took the least time when done electronically. Counting the number of needs was easy to do, quantified complexity in a way that informed clinical care for individuals, and related directly to validated scales of functioning. Data outputs could inform more appropriate design and commissioning of quality services. © 2016 Mac Keith Press.

  9. The Role of Dysfunctional Myths in a Decision-Making Process under Bounded Rationality: A Complex Dynamical Systems Perspective.

    PubMed

    Stamovlasis, Dimitrios; Vaiopoulou, Julie

    2017-07-01

    The present study examines the factors influencing a decision-making process, with specific focus on the role of dysfunctional myths (DM). DM are thoughts or beliefs that are rather irrational yet influential in people's decisions. In this paper a decision-making process regarding the career choice of university students majoring in natural sciences and education (N=496) is examined by analyzing survey data collected via the Career Decision-Making Difficulties Questionnaire (CDDQ). The difficulty of making the choice and the certainty about one's decision were the state variables, while the independent variables were factors related to the lack of information or knowledge needed, which reflect a bounded rationality. Cusp catastrophe analysis, based on both least squares and maximum likelihood procedures, showed that the nonlinear models predicting the two state variables were superior to linear alternatives. Factors related to lack of knowledge about the steps involved in the process of career decision-making, lack of information about the various occupations, lack of information about the self, and lack of motivation acted as asymmetry factors, while dysfunctional myths acted as the bifurcation factor for both state variables. The catastrophe model, grounded in empirical data, revealed a unique role for DM and a better interpretation within the context of complexity and the notion of bounded rationality. The analysis opens the nonlinear dynamical systems (NDS) perspective in studying decision-making processes. Theoretical and practical implications are discussed.
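
    For readers unfamiliar with cusp models, the sketch below shows the deterministic core of the analysis: equilibria of the cusp with asymmetry a and bifurcation factor b (here, the role attributed to dysfunctional myths) are the real roots of x^3 - b*x - a = 0, and for sufficiently large b the system becomes bistable. The parameter values are illustrative only.

      # Equilibria of the deterministic cusp: dx/dt = a + b*x - x**3,
      # with 'a' the asymmetry (e.g., lack of information) and 'b' the
      # bifurcation factor (the role played by dysfunctional myths).
      import numpy as np

      def cusp_equilibria(a, b):
          """Real roots of x**3 - b*x - a = 0 (stationary states)."""
          roots = np.roots([1.0, 0.0, -b, -a])
          return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

      # Small bifurcation value: one equilibrium (smooth responses).
      print(cusp_equilibria(a=0.1, b=0.2))
      # Large bifurcation value: three equilibria (the outer two stable),
      # i.e., sudden jumps between the two sheets of the cusp surface.
      print(cusp_equilibria(a=0.1, b=3.0))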

  10. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, the evolution of a target system can be explained in terms of the information storage of the system itself and the information transfer from the other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present of the target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate for the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
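
    A minimal plug-in (histogram-type) version of this decomposition for two discretized series is sketched below. It uses a single uniform past lag and naive counting rather than the paper's nonuniform embedding and bias-compensated nearest-neighbor estimators, so it illustrates the decomposition itself, not the estimation machinery; the data are invented.

      # Plug-in estimate of predictive information = storage + transfer
      # for two binary series, using one past lag (a drastic
      # simplification of the nonuniform-embedding approach).
      import numpy as np
      from collections import Counter

      def H(*cols):
          """Joint Shannon entropy (nats) of one or more symbol columns."""
          joint = list(zip(*cols))
          n = len(joint)
          return -sum((c / n) * np.log(c / n) for c in Counter(joint).values())

      rng = np.random.default_rng(0)
      y = rng.integers(0, 2, 5000)                 # driver series
      flip = (rng.random(5000) < 0.1).astype(int)  # 10% transmission noise
      x = np.roll(y, 1) ^ flip                     # target: noisy copy of past y

      xt, xp, yp = x[1:], x[:-1], y[:-1]           # present and past variables

      storage = H(xt) + H(xp) - H(xt, xp)                       # I(Xt; Xp)
      transfer = H(xt, xp) + H(xp, yp) - H(xt, xp, yp) - H(xp)  # I(Xt; Yp | Xp)
      print(f"storage  = {storage:.3f} nats")
      print(f"transfer = {transfer:.3f} nats")
      print(f"predictive information = {storage + transfer:.3f} nats")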

  11. Information-Theoretical Complexity Analysis of Selected Elementary Chemical Reactions

    NASA Astrophysics Data System (ADS)

    Molina-Espíritu, M.; Esquivel, R. O.; Dehesa, J. S.

    We investigate the complexity of selected elementary chemical reactions (namely, the hydrogenic-abstraction reaction and the identity SN2 exchange reaction) by means of the following single and composite information-theoretic measures: disequilibrium (D), exponential entropy (L), Fisher information (I), power entropy (J), the I-D, D-L and I-J planes, and the Fisher-Shannon (FS) and Lopez-Mancini-Calbet (LMC) shape complexities. These quantities, which are functionals of the one-particle density, are computed in both position (r) and momentum (p) spaces. The analysis revealed that the chemically significant regions of these reactions can be identified through most of the single information-theoretic measures and the two-component planes: not only the ones commonly revealed by the energy, such as the reactant/product (R/P) and the transition state (TS), but also those that are not present in the energy profile, such as the bond cleavage energy region (BCER), the bond breaking/forming regions (B-B/F) and the charge transfer process (CT). The analysis of the complexities shows that the energy profile of the abstraction reaction bears the same information-theoretical features as the LMC and FS measures, whereas the identity SN2 exchange reaction does not show such simple behavior with respect to the LMC and FS measures. Most of the chemical features of interest (BCER, B-B/F and CT) are only revealed when particular information-theoretic aspects of localizability (L or J), uniformity (D) and disorder (I) are considered.
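
    The single measures listed here are simple functionals of the density; the sketch below evaluates them for a one-dimensional density on a grid, with a unit Gaussian standing in for the one-particle density (for which the Fisher-Shannon complexity is exactly 1, a convenient sanity check).

      # Information-theoretic measures of a 1D density on a grid; a unit
      # Gaussian stands in for the one-particle density of the abstract.
      import numpy as np

      x = np.linspace(-10, 10, 4001)
      dx = x[1] - x[0]
      rho = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)       # normalized density

      S = -(rho * np.log(rho)).sum() * dx                # Shannon entropy
      L = np.exp(S)                                      # exponential entropy
      J = np.exp(2 * S) / (2 * np.pi * np.e)             # power entropy (1D)
      D = (rho**2).sum() * dx                            # disequilibrium
      I = (np.gradient(rho, dx)**2 / rho).sum() * dx     # Fisher information

      print(f"C_LMC = D*L = {D * L:.3f}")                # LMC shape complexity
      print(f"C_FS  = I*J = {I * J:.3f}")                # Fisher-Shannon (= 1 here)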

  12. Developing effective messages about potable recycled water: The importance of message structure and content

    NASA Astrophysics Data System (ADS)

    Price, J.; Fielding, K. S.; Gardner, J.; Leviston, Z.; Green, M.

    2015-04-01

    Community opposition is a barrier to potable recycled water schemes, and effective communication strategies about such schemes are needed. Drawing on the social psychological literature, two experimental studies are presented that explore messages which improve public perceptions of potable recycled water. The Elaboration-Likelihood Model of information processing and attitude change is tested and supported. Study 1 (N = 415) premeasured support for recycled water and trust in government information at Time 1. Messages varying in complexity and sidedness were presented at Time 2 (3 weeks later), and support and trust were remeasured. Support increased after receiving information, provided that participants received complex rather than simple information. Trust in government was also higher after receiving information, with tentative evidence that this was stronger in response to two-sided rather than one-sided messages. Initial attitudes to recycled water moderated responses to information: those initially neutral or ambivalent responded differently to simple and one-sided messages compared with participants holding positive or negative attitudes. Study 2 (N = 957) tested the effectiveness of information about the low relative risks, and/or the benefits, of potable recycled water, compared to control groups. Messages about the low risks resulted in higher support when the issue of recycled water was relevant. Messages about benefits resulted in higher perceived issue relevance but did not translate into greater support. The results highlight the importance of understanding people's motivation to process information, and the need to tailor communication to match attitudes and the stage of a recycled water scheme's development.

  13. Temporal Information Partitioning Networks (TIPNets): A process network approach to infer ecohydrologic shifts

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    In an ecohydrologic system, components of atmospheric, vegetation, and root-soil subsystems participate in forcing and feedback interactions at varying time scales and intensities. The structure of this network of complex interactions varies in terms of connectivity, strength, and time scale due to perturbations or changing conditions such as rainfall, drought, or land use. However, characterization of these interactions is difficult due to multivariate and weak dependencies in the presence of noise, nonlinearities, and limited data. We introduce a framework for Temporal Information Partitioning Networks (TIPNets), in which time-series variables are viewed as nodes, and lagged multivariate mutual information measures are links. These links are partitioned into synergistic, unique, and redundant information components, where synergy is information provided only jointly, unique information is only provided by a single source, and redundancy is overlapping information. We construct TIPNets from 1 min weather station data over several hour time windows. From a comparison of dry, wet, and rainy conditions, we find that information strengths increase when solar radiation and surface moisture are present, and surface moisture and wind variability are redundant and synergistic influences, respectively. Over a growing season, network trends reveal patterns that vary with vegetation and rainfall patterns. The framework presented here enables us to interpret process connectivity in a multivariate context, which can lead to better inference of behavioral shifts due to perturbations in ecohydrologic systems. This work contributes to more holistic characterizations of system behavior, and can benefit a wide variety of studies of complex systems.
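
    The information partitioning at the heart of the framework can be sketched for two sources and one target. The redundancy below is the simple minimum-mutual-information choice of Williams and Beer rather than the rescaled redundancy used in TIPNets, and the XOR data are invented, so treat this as illustrative of the bookkeeping only.

      # Partial information decomposition for two discrete sources and a
      # target, with min-MI redundancy as a stand-in measure.
      import numpy as np
      from collections import Counter

      def H(*cols):
          joint = list(zip(*cols))
          n = len(joint)
          return -sum((c / n) * np.log2(c / n) for c in Counter(joint).values())

      rng = np.random.default_rng(1)
      s1 = rng.integers(0, 2, 20000)
      s2 = rng.integers(0, 2, 20000)
      tar = s1 ^ s2                          # XOR: a purely synergistic relation

      I1 = H(tar) + H(s1) - H(tar, s1)           # I(tar; s1)
      I2 = H(tar) + H(s2) - H(tar, s2)           # I(tar; s2)
      I12 = H(tar) + H(s1, s2) - H(tar, s1, s2)  # I(tar; s1, s2)

      R = min(I1, I2)                        # redundant information
      U1, U2 = I1 - R, I2 - R                # unique information
      Syn = I12 - U1 - U2 - R                # synergistic information
      print(f"R={R:.3f}  U1={U1:.3f}  U2={U2:.3f}  S={Syn:.3f} (bits)")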

  14. Prognostics Methodology for Complex Systems

    NASA Technical Reports Server (NTRS)

    Gulati, Sandeep; Mackey, Ryan

    2003-01-01

    An automated method for scheduling maintenance and repair of complex systems is presented, based on a computational structure called the Informed Maintenance Grid (IMG). The method provides solutions to the two fundamental problems in autonomic logistics: (1) unambiguous detection of deterioration or impending loss of function, and (2) determination of the time remaining to perform maintenance or other corrective action based upon information from the system. The IMG provides a health determination over the medium-to-long-term operation of the system, from one or more days to years of study. The IMG is especially applicable to spacecraft, to both piloted and autonomous aircraft, and to industrial control processes.

  15. Preliminary Characterization of Erythrocytes Deformability on the Entropy-Complexity Plane

    PubMed Central

    Korol, Ana M; D’Arrigo, Mabel; Foresto, Patricia; Pérez, Susana; Martín, Maria T; Rosso, Osualdo A

    2010-01-01

    We present an application of wavelet-based Information Theory quantifiers (Normalized Total Shannon Entropy, MPR-Statistical Complexity and the Entropy-Complexity plane) to the characterization of red blood cell membrane viscoelasticity. These quantifiers exhibit important localization advantages provided by Wavelet Theory. The approach produces a clear characterization of this dynamical system, revealing an evident manifestation of a random process in the red cell samples of healthy individuals, and a sharp reduction of that randomness when analyzing a human haematological disease such as β-thalassaemia minor. PMID:21611139
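
    A sketch of the two quantifiers for a generic probability distribution P (for instance, normalized wavelet energies per resolution level) is given below; the normalization of the Jensen-Shannon divergence follows the usual MPR definition, and the example distributions are invented.

      # Normalized Shannon entropy and MPR statistical complexity of a
      # distribution P (e.g., normalized wavelet band energies).
      import numpy as np

      def shannon(p):
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      def entropy_complexity(p):
          n = len(p)
          h = shannon(p) / np.log(n)                   # normalized entropy H
          pe = np.full(n, 1.0 / n)                     # uniform reference
          js = shannon((p + pe) / 2) - shannon(p) / 2 - shannon(pe) / 2
          # Q0 normalizes JS by its maximum (a delta vs. the uniform).
          q0 = -2.0 / (((n + 1) / n) * np.log(n + 1)
                       - 2 * np.log(2 * n) + np.log(n))
          return h, q0 * js * h                        # (H, C = Q_J * H)

      p_flat = np.full(8, 1 / 8)                       # noise-like: H=1, C=0
      p_peak = np.array([0.5, 0.2, 0.1, 0.08, 0.06, 0.04, 0.015, 0.005])
      print(entropy_complexity(p_flat))
      print(entropy_complexity(p_peak))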

  16. Disentangling brain activity related to the processing of emotional visual information and emotional arousal.

    PubMed

    Kuniecki, Michał; Wołoszyn, Kinga; Domagalik, Aleksandra; Pilarczyk, Joanna

    2018-05-01

    Processing of emotional visual information engages cognitive functions and induces arousal. We aimed to examine the modulatory role of emotional valence on brain activations linked to the processing of visual information and those linked to arousal. Participants were scanned and their pupil size was measured while viewing negative and neutral images. Visual noise was added to the images in various proportions to parametrically manipulate the amount of visual information. Pupil size was used as an index of physiological arousal. We show that arousal induced by the negative images, as compared to the neutral ones, is primarily related to greater amygdala activity, whereas increasing the visibility of negative content relates to enhanced activity in the lateral occipital complex (LOC). We argue that more intense visual processing of negative scenes can occur irrespective of the level of arousal, which may suggest that higher areas of the visual stream are fine-tuned to process emotionally relevant objects. Both arousal and the processing of emotional visual information modulated activity within the ventromedial prefrontal cortex (vmPFC); overlapping activations within the vmPFC may reflect the integration of these aspects of emotional processing. Additionally, we show that emotionally evoked pupil dilations are related to activations in the amygdala, vmPFC, and LOC.

  17. Overview of DYMCAS, the Y-12 Material Control And Accountability System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alspaugh, D. H.

    2001-07-01

    This paper gives an overview of DYMCAS, the material control and accountability information system for the Y-12 National Security Complex. A common misconception, even within the DOE community, understates the nature and complexity of material control and accountability (MC and A) systems, likening them to parcel delivery systems tracking packages at various locations or banking systems that account for money, down to the penny. A major point set forth in this paper is that MC and A systems such as DYMCAS can be and often are very complex. Given accountability reporting requirements and the critical and sensitive nature of the task, no MC and A system can be simple. The complexity of site-level accountability systems, however, varies dramatically depending on the amounts, kinds, and forms of nuclear materials and the kinds of processing performed at the site. Some accountability systems are tailored to unique and highly complex site-level materials and material processing and, consequently, are highly complex systems. Sites with less complexity require less complex accountability systems, and where processes and practices are the same or similar, sites on the mid-to-low end of the complexity scale can effectively utilize a standard accountability system. In addition to being complex, a unique feature of DYMCAS is its integration with the site production control and manufacturing system. This paper will review the advantages of such integration, as well as related challenges, and make the point that the effectiveness of complex MC and A systems can be significantly enhanced through appropriate systems integration.

  18. Characterizing informative sequence descriptors and predicting binding affinities of heterodimeric protein complexes.

    PubMed

    Srinivasulu, Yerukala Sathipati; Wang, Jyun-Rong; Hsu, Kai-Ti; Tsai, Ming-Ju; Charoenkwan, Phasit; Huang, Wen-Lin; Huang, Hui-Ling; Ho, Shinn-Ying

    2015-01-01

    Protein-protein interactions (PPIs) are involved in various biological processes, and the underlying mechanisms of these interactions play a crucial role in therapeutics and protein engineering. Most machine learning approaches have been developed for predicting the binding affinity of protein-protein complexes based on structural and functional information. This work aims to predict the binding affinity of heterodimeric protein complexes from sequences only. It proposes a support vector machine (SVM) based binding affinity classifier, called SVM-BAC, to classify heterodimeric protein complexes based on the prediction of their binding affinity. SVM-BAC identified 14 of 580 sequence descriptors (physicochemical, energetic and conformational properties of the 20 amino acids) to classify 216 heterodimeric protein complexes into low and high binding affinity. SVM-BAC yielded a training accuracy, sensitivity, specificity, AUC and test accuracy of 85.80%, 0.89, 0.83, 0.86 and 83.33%, respectively, better than existing machine learning algorithms. The 14 features and support vector regression were further used to estimate the binding affinities (pKd) of 200 heterodimeric protein complexes; jackknife test performance was a correlation coefficient of 0.34 with a mean absolute error of 1.4. We further analyzed three informative physicochemical properties according to their contribution to prediction performance. The results reveal that the following properties are effective in predicting the binding affinity of heterodimeric protein complexes: apparent partition energy based on buried molar fractions, relations between chemical structure and biological activity in principal component analysis IV, and normalized frequency of beta turn. The proposed sequence-based prediction method SVM-BAC uses an optimal feature selection method to identify 14 informative features to classify and predict the binding affinity of heterodimeric protein complexes. The characterization analysis revealed that the average numbers of beta turns and hydrogen bonds at protein-protein interfaces are higher in high binding affinity complexes than in low binding affinity complexes.
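
    A hedged sketch of the overall workflow (scaling, selection of a 14-descriptor subset, SVM classification) using scikit-learn is shown below; the univariate F-test selector is a simple stand-in for the paper's optimal feature selection, and the data are random placeholders with the stated dimensions.

      # Workflow sketch in the spirit of SVM-BAC: select 14 of 580
      # descriptors, then classify complexes as low/high affinity.
      import numpy as np
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(42)
      X = rng.normal(size=(216, 580))      # 216 complexes x 580 descriptors
      y = rng.integers(0, 2, 216)          # low (0) vs high (1) affinity

      clf = make_pipeline(
          StandardScaler(),                # put descriptors on one scale
          SelectKBest(f_classif, k=14),    # stand-in for optimal selection
          SVC(kernel="rbf", C=1.0),        # RBF-kernel SVM classifier
      )
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"5-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")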

  19. Characterizing informative sequence descriptors and predicting binding affinities of heterodimeric protein complexes

    PubMed Central

    2015-01-01

    Background Protein-protein interactions (PPIs) are involved in various biological processes, and the underlying mechanisms of these interactions play a crucial role in therapeutics and protein engineering. Most machine learning approaches have been developed for predicting the binding affinity of protein-protein complexes based on structural and functional information. This work aims to predict the binding affinity of heterodimeric protein complexes from sequences only. Results This work proposes a support vector machine (SVM) based binding affinity classifier, called SVM-BAC, to classify heterodimeric protein complexes based on the prediction of their binding affinity. SVM-BAC identified 14 of 580 sequence descriptors (physicochemical, energetic and conformational properties of the 20 amino acids) to classify 216 heterodimeric protein complexes into low and high binding affinity. SVM-BAC yielded a training accuracy, sensitivity, specificity, AUC and test accuracy of 85.80%, 0.89, 0.83, 0.86 and 83.33%, respectively, better than existing machine learning algorithms. The 14 features and support vector regression were further used to estimate the binding affinities (pKd) of 200 heterodimeric protein complexes; jackknife test performance was a correlation coefficient of 0.34 with a mean absolute error of 1.4. We further analyzed three informative physicochemical properties according to their contribution to prediction performance. The results reveal that the following properties are effective in predicting the binding affinity of heterodimeric protein complexes: apparent partition energy based on buried molar fractions, relations between chemical structure and biological activity in principal component analysis IV, and normalized frequency of beta turn. Conclusions The proposed sequence-based prediction method SVM-BAC uses an optimal feature selection method to identify 14 informative features to classify and predict the binding affinity of heterodimeric protein complexes. The characterization analysis revealed that the average numbers of beta turns and hydrogen bonds at protein-protein interfaces are higher in high binding affinity complexes than in low binding affinity complexes. PMID:26681483

  20. Information-driven self-organization: the dynamical system approach to autonomous robot behavior.

    PubMed

    Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail

    2012-09-01

    In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there is much recent interest in how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot. The success of these approaches depends essentially on the choice of a convenient measure of information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are of Hebbian nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means recognizing and amplifying the latent modes of the robotic system. This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self-organization of behavior in complex robotic systems.
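
    For a concrete feel for the quantity, the sketch below computes the one-step predictive information of a Gaussian AR(1) process, for which it takes the closed form -0.5*ln(1 - r^2), with r the lag-one autocorrelation; the parameters are illustrative.

      # One-step predictive information of a Gaussian AR(1) process
      # x[t+1] = r*x[t] + noise, for which PI = -0.5*ln(1 - r**2).
      import numpy as np

      r, n = 0.8, 100_000
      rng = np.random.default_rng(7)
      x = np.empty(n)
      x[0] = rng.normal()
      for t in range(n - 1):
          x[t + 1] = r * x[t] + np.sqrt(1 - r**2) * rng.normal()  # unit variance

      r_hat = np.corrcoef(x[:-1], x[1:])[0, 1]  # sample lag-1 autocorrelation
      print(f"analytic  PI = {-0.5 * np.log(1 - r**2):.3f} nats")
      print(f"estimated PI = {-0.5 * np.log(1 - r_hat**2):.3f} nats")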

  1. Information search and decision making: effects of age and complexity on strategy use.

    PubMed

    Queen, Tara L; Hess, Thomas M; Ennis, Gilda E; Dowd, Keith; Grühn, Daniel

    2012-12-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults' performance. Participants utilized 2 decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants' preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and that this ability may benefit from accrued knowledge and experience. 2013 APA, all rights reserved

  2. Imaging of DNA and Protein by SFM and Combined SFM-TIRF Microscopy.

    PubMed

    Grosbart, Małgorzata; Ristić, Dejan; Sánchez, Humberto; Wyman, Claire

    2018-01-01

    Direct imaging is invaluable for understanding the mechanism of complex genome transactions where proteins work together to organize, transcribe, replicate and repair DNA. Scanning (or atomic) force microscopy is an ideal tool for this, providing 3D information on molecular structure at nm resolution from defined components. This is a convenient and practical addition to in vitro studies as readily obtainable amounts of purified proteins and DNA are required. The images reveal structural details on the size and location of DNA bound proteins as well as protein-induced arrangement of the DNA, which are directly correlated in the same complexes. In addition, even from static images, the different forms observed and their relative distributions can be used to deduce the variety and stability of different complexes that are necessarily involved in dynamic processes. Recently available instruments that combine fluorescence with topographic imaging allow the identification of specific molecular components in complex assemblies, which broadens the applications and increases the information obtained from direct imaging of molecular complexes. We describe here basic methods for preparing samples of proteins, DNA and complexes of the two for topographic imaging and quantitative analysis. We also describe special considerations for combined fluorescence and topographic imaging of molecular complexes.

  3. Using Complexity and Network Concepts to Inform Healthcare Knowledge Translation

    PubMed Central

    Kitson, Alison; Brook, Alan; Harvey, Gill; Jordan, Zoe; Marshall, Rhianon; O’Shea, Rebekah; Wilson, David

    2018-01-01

    Many representations of the movement of healthcare knowledge through society exist, and multiple models for the translation of evidence into policy and practice have been articulated. Most are linear or cyclical and very few come close to reflecting the dense and intricate relationships, systems and politics of organizations and the processes required to enact sustainable improvements. We illustrate how using complexity and network concepts can better inform knowledge translation (KT) and argue that changing the way we think and talk about KT could enhance the creation and movement of knowledge throughout those systems needing to develop and utilise it. From our theoretical refinement, we propose that KT is a complex network composed of five interdependent sub-networks, or clusters, of key processes (problem identification [PI], knowledge creation [KC], knowledge synthesis [KS], implementation [I], and evaluation [E]) that interact dynamically in different ways at different times across one or more sectors (community; health; government; education; research for example). We call this the KT Complexity Network, defined as a network that optimises the effective, appropriate and timely creation and movement of knowledge to those who need it in order to improve what they do. Activation within and throughout any one of these processes and systems depends upon the agents promoting the change, successfully working across and between multiple systems and clusters. The case is presented for moving to a way of thinking about KT using complexity and network concepts. This extends the thinking that is developing around integrated KT approaches. There are a number of policy and practice implications that need to be considered in light of this shift in thinking. PMID:29524952

  4. Self-Regulation Principles Underlying Risk Perception and Decision Making within the Context of Genomic Testing

    PubMed Central

    Cameron, Linda D.; Biesecker, Barbara Bowles; Peters, Ellen; Taber, Jennifer M.; Klein, William M. P.

    2017-01-01

    Advances in theory and research on self-regulation and decision-making processes have yielded important insights into how cognitive, emotional, and social processes shape risk perceptions and risk-related decisions. We examine how self-regulation theory can be applied to inform our understanding of decision-making processes within the context of genomic testing, a clinical arena in which individuals face complex risk information and potentially life-altering decisions. After presenting key principles of self-regulation, we present a genomic testing case example to illustrate how principles related to risk representations, approach and avoidance motivations, emotion regulation, defensive responses, temporal construals, and capacities such as numeric abilities can shape decisions and psychological responses during the genomic testing process. We conclude with implications for using self-regulation theory to advance science within genomic testing and opportunities for how this research can inform further developments in self-regulation theory. PMID:29225669

  5. Self-Regulation Principles Underlying Risk Perception and Decision Making within the Context of Genomic Testing.

    PubMed

    Cameron, Linda D; Biesecker, Barbara Bowles; Peters, Ellen; Taber, Jennifer M; Klein, William M P

    2017-05-01

    Advances in theory and research on self-regulation and decision-making processes have yielded important insights into how cognitive, emotional, and social processes shape risk perceptions and risk-related decisions. We examine how self-regulation theory can be applied to inform our understanding of decision-making processes within the context of genomic testing, a clinical arena in which individuals face complex risk information and potentially life-altering decisions. After presenting key principles of self-regulation, we present a genomic testing case example to illustrate how principles related to risk representations, approach and avoidance motivations, emotion regulation, defensive responses, temporal construals, and capacities such as numeric abilities can shape decisions and psychological responses during the genomic testing process. We conclude with implications for using self-regulation theory to advance science within genomic testing and opportunities for how this research can inform further developments in self-regulation theory.

  6. A comparative approach for the investigation of biological information processing: An examination of the structure and function of computer hard drives and DNA

    PubMed Central

    2010-01-01

    Background The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these properties reside in the computer's hard drive, and for eukaryotic cells they are manifest in the DNA and associated structures. Methods Presented herein is a descriptive framework that compares DNA and its associated proteins and sub-nuclear structure with the structure and function of the computer hard drive. We identify four essential properties of information for a centralized storage and processing system: (1) orthogonal uniqueness, (2) low level formatting, (3) high level formatting and (4) translation of stored to usable form. The corresponding aspects of the DNA complex and a computer hard drive are categorized using this classification. This is intended to demonstrate a functional equivalence between the components of the two systems, and thus the systems themselves. Results Both the DNA complex and the computer hard drive contain components that fulfill the essential properties of a centralized information storage and processing system. The functional equivalence of these components provides insight into both the design process of engineered systems and the evolved solutions addressing similar system requirements. However, there are points where the comparison breaks down, particularly when there are externally imposed information-organizing structures on the computer hard drive. A specific example of this is the imposition of the File Allocation Table (FAT) during high level formatting of the computer hard drive and the subsequent loading of an operating system (OS). Biological systems do not have an external source for a map of their stored information or for an operational instruction set; rather, they must contain an organizational template conserved within their intra-nuclear architecture that "manipulates" the laws of chemistry and physics into a highly robust instruction set. We propose that the epigenetic structure of the intra-nuclear environment and the non-coding RNA may play the roles of a Biological File Allocation Table (BFAT) and biological operating system (Bio-OS) in eukaryotic cells. Conclusions The comparison of functional and structural characteristics of the DNA complex and the computer hard drive leads to a new descriptive paradigm that identifies the DNA as a dynamic storage system of biological information. This system is embodied in an autonomous operating system that inductively follows organizational structures, data hierarchy and executable operations that are well understood in the computer science industry. Characterizing the "DNA hard drive" in this fashion can lead to insights arising from discrepancies in the descriptive framework, particularly with respect to positing the role of epigenetic processes in an information-processing context. Further expansions arising from this comparison include the view of cells as parallel computing machines and a new approach towards characterizing cellular control systems. PMID:20092652

  7. A comparative approach for the investigation of biological information processing: an examination of the structure and function of computer hard drives and DNA.

    PubMed

    D'Onofrio, David J; An, Gary

    2010-01-21

    The robust storage, updating and utilization of information are necessary for the maintenance and perpetuation of dynamic systems. These systems can exist as constructs of metal-oxide semiconductors and silicon, as in a digital computer, or in the "wetware" of organic compounds, proteins and nucleic acids that make up biological organisms. We propose that there are essential functional properties of centralized information-processing systems; for digital computers these properties reside in the computer's hard drive, and for eukaryotic cells they are manifest in the DNA and associated structures. Presented herein is a descriptive framework that compares DNA and its associated proteins and sub-nuclear structure with the structure and function of the computer hard drive. We identify four essential properties of information for a centralized storage and processing system: (1) orthogonal uniqueness, (2) low level formatting, (3) high level formatting and (4) translation of stored to usable form. The corresponding aspects of the DNA complex and a computer hard drive are categorized using this classification. This is intended to demonstrate a functional equivalence between the components of the two systems, and thus the systems themselves. Both the DNA complex and the computer hard drive contain components that fulfill the essential properties of a centralized information storage and processing system. The functional equivalence of these components provides insight into both the design process of engineered systems and the evolved solutions addressing similar system requirements. However, there are points where the comparison breaks down, particularly when there are externally imposed information-organizing structures on the computer hard drive. A specific example of this is the imposition of the File Allocation Table (FAT) during high level formatting of the computer hard drive and the subsequent loading of an operating system (OS). Biological systems do not have an external source for a map of their stored information or for an operational instruction set; rather, they must contain an organizational template conserved within their intra-nuclear architecture that "manipulates" the laws of chemistry and physics into a highly robust instruction set. We propose that the epigenetic structure of the intra-nuclear environment and the non-coding RNA may play the roles of a Biological File Allocation Table (BFAT) and biological operating system (Bio-OS) in eukaryotic cells. The comparison of functional and structural characteristics of the DNA complex and the computer hard drive leads to a new descriptive paradigm that identifies the DNA as a dynamic storage system of biological information. This system is embodied in an autonomous operating system that inductively follows organizational structures, data hierarchy and executable operations that are well understood in the computer science industry. Characterizing the "DNA hard drive" in this fashion can lead to insights arising from discrepancies in the descriptive framework, particularly with respect to positing the role of epigenetic processes in an information-processing context. Further expansions arising from this comparison include the view of cells as parallel computing machines and a new approach towards characterizing cellular control systems.

  8. Social cognition in a case of amnesia with neurodevelopmental mechanisms.

    PubMed

    Staniloiu, Angelica; Borsutzky, Sabine; Woermann, Friedrich G; Markowitsch, Hans J

    2013-01-01

    Episodic-autobiographical memory (EAM) is considered to emerge gradually in concert with the development of other cognitive abilities (such as executive functions, personal semantic knowledge, emotional knowledge, theory of mind (ToM) functions, language, and working memory). On the brain level its emergence is accompanied by structural and functional reorganization of different components of the so-called EAM network. This network includes the hippocampal formation, which is viewed as being vital for the acquisition of memories of personal events for long-term storage. Developmental studies have emphasized socio-cultural-linguistic mechanisms that may be unique to the development of EAM. Furthermore it was hypothesized that one of the main functions of EAM is the social one. In the research field, the link between EAM and social cognition remains however debated. Herein we aim to bring new insights into the relation between EAM and social information processing (including social cognition) by describing a young adult patient with amnesia with neurodevelopmental mechanisms due to perinatal complications accompanied by hypoxia. The patient was investigated medically, psychiatrically, and with neuropsychological and neuroimaging methods. Structural high resolution magnetic resonance imaging revealed significant bilateral hippocampal atrophy as well as indices for degeneration in the amygdalae, basal ganglia, and thalamus, when a less conservative threshold was applied. In addition to extensive memory investigations and testing other (non-social) cognitive functions, we employed a broad range of tests that assessed social information processing (social perception, social cognition, social regulation). Our results point to both preserved (empathy, core ToM functions, visual affect selection, and discrimination, affective prosody discrimination) and impaired domains of social information processing (incongruent affective prosody processing, complex social judgments). They support proposals for a role of the hippocampal formation in processing more complex social information that likely requires multimodal relational handling.

  9. Social cognition in a case of amnesia with neurodevelopmental mechanisms

    PubMed Central

    Staniloiu, Angelica; Borsutzky, Sabine; Woermann, Friedrich G.; Markowitsch, Hans J.

    2013-01-01

    Episodic–autobiographical memory (EAM) is considered to emerge gradually in concert with the development of other cognitive abilities (such as executive functions, personal semantic knowledge, emotional knowledge, theory of mind (ToM) functions, language, and working memory). On the brain level its emergence is accompanied by structural and functional reorganization of different components of the so-called EAM network. This network includes the hippocampal formation, which is viewed as being vital for the acquisition of memories of personal events for long-term storage. Developmental studies have emphasized socio-cultural-linguistic mechanisms that may be unique to the development of EAM. Furthermore it was hypothesized that one of the main functions of EAM is the social one. In the research field, the link between EAM and social cognition remains however debated. Herein we aim to bring new insights into the relation between EAM and social information processing (including social cognition) by describing a young adult patient with amnesia with neurodevelopmental mechanisms due to perinatal complications accompanied by hypoxia. The patient was investigated medically, psychiatrically, and with neuropsychological and neuroimaging methods. Structural high resolution magnetic resonance imaging revealed significant bilateral hippocampal atrophy as well as indices for degeneration in the amygdalae, basal ganglia, and thalamus, when a less conservative threshold was applied. In addition to extensive memory investigations and testing other (non-social) cognitive functions, we employed a broad range of tests that assessed social information processing (social perception, social cognition, social regulation). Our results point to both preserved (empathy, core ToM functions, visual affect selection, and discrimination, affective prosody discrimination) and impaired domains of social information processing (incongruent affective prosody processing, complex social judgments). They support proposals for a role of the hippocampal formation in processing more complex social information that likely requires multimodal relational handling. PMID:23805111

  10. Synaptic physiology of the flow of information in the cat's visual cortex in vivo

    PubMed Central

    Hirsch, Judith A; Martinez, Luis M; Alonso, José-Manuel; Desai, Komal; Pillai, Cinthi; Pierre, Carhine

    2002-01-01

    Each stage of the striate cortical circuit extracts novel information about the visual environment. We asked if this analytic process reflected laminar variations in synaptic physiology by making whole-cell recording with dye-filled electrodes from the cat's visual cortex and thalamus; the stimuli were flashed spots. Thalamic afferents terminate in layer 4, which contains two types of cell, simple and complex, distinguished by the spatial structure of the receptive field. Previously, we had found that the postsynaptic and spike responses of simple cells reliably followed the time course of flash-evoked thalamic activity. Here we report that complex cells in layer 4 (or cells intermediate between simple and complex) similarly reprised thalamic activity (response/trial, 99 ± 1.9 %; response duration 159 ± 57 ms; latency 25 ± 4 ms; average ± standard deviation; n = 7). Thus, all cells in layer 4 share a common synaptic physiology that allows secure integration of thalamic input. By contrast, at the second cortical stage (layer 2+3), where layer 4 directs its output, postsynaptic responses did not track simple patterns of antecedent activity. Typical responses to the static stimulus were intermittent and brief (response/trial, 31 ± 40 %; response duration 72 ± 60 ms, latency 39 ± 7 ms; n = 11). Only richer stimuli like those including motion evoked reliable responses. All told, the second level of cortical processing differs markedly from the first. At that later stage, ascending information seems strongly gated by connections between cortical neurons. Inputs must be combined in newly specified patterns to influence intracortical stages of processing. PMID:11927691

  11. The usability axiom of medical information systems.

    PubMed

    Pantazi, Stefan V; Kushniruk, Andre; Moehr, Jochen R

    2006-12-01

    In this article we begin by connecting the concept of simplicity of user interfaces of information systems with that of usability, and the concept of complexity of problem-solving in information systems with the concept of usefulness. We then state "the usability axiom" of medical information technology: information systems must be, at the same time, usable and useful. We show why, given existing technology, the axiom is a paradox, and we analyse and reformulate it several times from more fundamental information-processing perspectives. We underline the importance of the concept of representation and demonstrate the need for context-dependent representations. By means of thought experiments and examples, we advocate the need for context-dependent information processing and argue for the relevance of algorithmic information theory and case-based reasoning in this context. Further, we introduce the notion of concept spaces and offer a pragmatic perspective on context-dependent representations. We conclude that the efficient management of concept spaces may help resolve the medical information technology paradox. Finally, we propose a view of informatics centred on the concepts of context-dependent information processing and the management of concept spaces. On the premise that context-dependent information processing equates to knowledge processing, this view extends M. Musen's proposal and defines Medical Informatics as context-dependent medical information processing, which aligns well with existing knowledge-centric definitions of informatics in general and medical informatics in particular.

  12. Facilitation of information processing in the primary somatosensory area in the ball rotation task.

    PubMed

    Wasaka, Toshiaki; Kida, Tetsuo; Kakigi, Ryusuke

    2017-11-14

    Somatosensory input to the brain is known to be modulated during voluntary movement; the response in the primary somatosensory cortex (SI) is generally gated during simple movement of the corresponding body part. This study investigated sensorimotor integration in the SI during manual movement using a motor task that combined movement complexity and object manipulation. While the amplitudes of the M20 and M30 components generated in the SI showed a significant reduction during manual movement, the subsequent component (M38) was significantly higher in the motor task than in the stationary condition. In particular, the response in the ball rotation task was significantly enhanced compared with the ball grasping and stone-and-paper tasks. Although sensorimotor integration in the SI generally has an inhibitory effect on information processing, here we found facilitation. Since the ball rotation task increases the demand for somatosensory information needed to control the complex movements and to manipulate two balls in the palm, this may have resulted in the enhancement of the M38 component generated in the SI.

  13. Spreading dynamics in complex networks

    NASA Astrophysics Data System (ADS)

    Pei, Sen; Makse, Hernán A.

    2013-12-01

    Searching for influential spreaders in complex networks is an issue of great significance for applications across various domains, ranging from epidemic control, innovation diffusion, viral marketing, and social movement to idea propagation. In this paper, we first display some of the most important theoretical models that describe spreading processes, and then discuss the problem of locating both the individual and multiple influential spreaders respectively. Recent approaches in these two topics are presented. For the identification of privileged single spreaders, we summarize several widely used centralities, such as degree, betweenness centrality, PageRank, k-shell, etc. We investigate the empirical diffusion data in a large scale online social community—LiveJournal. With this extensive dataset, we find that various measures can convey very distinct information of nodes. Of all the users in the LiveJournal social network, only a small fraction of them are involved in spreading. For the spreading processes in LiveJournal, while degree can locate nodes participating in information diffusion with higher probability, k-shell is more effective in finding nodes with a large influence. Our results should provide useful information for designing efficient spreading strategies in reality.
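
    The centrality comparison described above is straightforward to reproduce on any graph; the sketch below ranks candidate spreaders in an invented scale-free toy network by degree, k-shell (core number), PageRank, and betweenness with NetworkX.

      # Rank candidate spreaders by the centralities the survey discusses.
      import networkx as nx

      G = nx.barabasi_albert_graph(200, 3, seed=1)   # scale-free toy network

      measures = {
          "degree": nx.degree_centrality(G),
          "k-shell": nx.core_number(G),
          "pagerank": nx.pagerank(G),
          "betweenness": nx.betweenness_centrality(G),
      }
      for name, scores in measures.items():
          top = sorted(scores, key=scores.get, reverse=True)[:5]
          print(f"{name:>12}: top-5 nodes {top}")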

  14. Synchronization invariance under network structural transformations

    NASA Astrophysics Data System (ADS)

    Arola-Fernández, Lluís; Díaz-Guilera, Albert; Arenas, Alex

    2018-06-01

    Synchronization processes are ubiquitous despite the many connectivity patterns that complex systems can show. Usually, the emergence of synchrony is a macroscopic observable; however, the microscopic details of the system, such as the underlying network of interactions, are often partially or totally unknown. We already know that different interaction structures can give rise to a common functionality, understood as a common macroscopic observable. Building upon this fact, here we propose network transformations that keep the collective behavior of a large system of Kuramoto oscillators invariant. We derive a method based on information theory principles that allows us to adjust the weights of the structural interactions to map random homogeneous in-degree networks into random heterogeneous networks and vice versa, keeping synchronization values invariant. The results of the proposed transformations reveal an interesting principle: heterogeneous networks can be mapped to homogeneous ones with local information, but the reverse process needs to exploit higher-order information. The formalism provides analytical insight to tackle real complex scenarios when dealing with uncertainty in the measurements of the underlying connectivity structure.
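
    The macroscopic observable in question can be made concrete with a minimal simulation: the sketch below integrates Kuramoto dynamics on an invented random weighted network and reports the order parameter r, the quantity the proposed transformations keep invariant; all parameters are illustrative.

      # Kuramoto dynamics on a weighted network; r is the synchrony
      # observable preserved by the structural transformations.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 100
      W = rng.random((n, n)) * (rng.random((n, n)) < 0.1)  # sparse weights
      W = (W + W.T) / 2                                    # undirected coupling
      omega = rng.normal(0, 1, n)                          # natural frequencies
      theta = rng.uniform(0, 2 * np.pi, n)

      dt, k = 0.01, 2.0
      for _ in range(5000):                                # Euler integration
          diff = theta[None, :] - theta[:, None]           # theta_j - theta_i
          theta += dt * (omega + k * (W * np.sin(diff)).sum(axis=1))

      r = np.abs(np.exp(1j * theta).mean())                # order parameter
      print(f"order parameter r = {r:.3f}")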

  15. Understanding the Data Complexity continuum to reduce data management costs and increase data usability through partnerships with the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Mesick, S.; Weathers, K. W.

    2017-12-01

    Data complexity can be seen as a continuum from complex to simple. Here, the term data complexity refers to data collections that are disorganized, poorly documented, and generally do not follow best data management practices. Complex data collections are challenging and expensive to manage. Simplified collections readily support automated archival processes, enhanced discovery and data access, and production of services that make data easier to reuse. In this session, NOAA NCEI scientific data stewards will discuss the data complexity continuum. The talk will explore data simplification concepts, methods, and tools that data managers can employ to gain more control over data management costs and processes while achieving policy goals for open data access and ready reuse. Topics will include guidance for data managers on the best allocation of limited data management resources; models for partnering with NCEI to accomplish shared data management goals; and case studies demonstrating the benefits of investing in documentation, accessibility, and services to increase data value and return on investment.

  16. Ontology patterns for complex topographic feature types

    USGS Publications Warehouse

    Varanka, Dalia E.

    2011-01-01

    Complex feature types are defined as integrated relations between basic features for a shared meaning or concept. The shared semantic concept is difficult to define in commonly used geographic information systems (GIS) and remote sensing technologies. The role of spatial relations between complex feature parts was recognized in early GIS literature but had limited representation in the feature or coverage data models of GIS. Spatial relations are specified more explicitly in semantic technology. In this paper, semantics for topographic feature ontology design patterns (ODP) are developed as data models for the representation of complex features. In the context of topographic processes, component assemblages are supported by resource systems and are found on local landscapes. The topographic ontology is organized across six thematic modules that can account for basic feature types, resource systems, and landscape types. Types of complex feature attributes include location, generative processes, and physical description. Node/edge networks model standard spatial relations and relations specific to topographic science to represent complex features. To demonstrate these concepts, data from The National Map of the U.S. Geological Survey were converted and assembled into ODP.
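
    As a sketch of this kind of explicit relation modeling (the namespace, feature names, and relations below are hypothetical, not the USGS ontology itself), a complex feature can be expressed as typed relations between basic features using the rdflib library:

        from rdflib import Graph, Namespace, RDF

        EX = Namespace("http://example.org/topo#")  # hypothetical namespace
        g = Graph()
        g.bind("topo", EX)

        # A dam complex modeled as basic features joined by spatial relations.
        g.add((EX.DamComplex, RDF.type, EX.ComplexFeature))
        g.add((EX.DamComplex, EX.hasPart, EX.Dam))
        g.add((EX.DamComplex, EX.hasPart, EX.Reservoir))
        g.add((EX.Dam, EX.impounds, EX.Reservoir))     # domain-specific relation
        g.add((EX.Reservoir, EX.locatedOn, EX.River))  # standard spatial relation

        print(g.serialize(format="turtle"))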

  17. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at the cost of increased complexity of the design process chains and of the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems-based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (which are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of the design of multiscale systems. In this paper, our focus is on design process chains. We present a systems-based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information-based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy that can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy-release characteristics. The design is carried out using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.
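
    The flavor of a value-of-information metric can be illustrated with a small expected-value-of-perfect-information calculation; the payoff numbers and uniform state probabilities below are toy assumptions, not the paper's MESM metric:

        import numpy as np

        # Rows: design alternatives; columns: equally likely model-error states.
        payoff = np.array([[8.0, 2.0, 5.0],
                           [4.0, 6.0, 5.0]])
        p = np.array([1 / 3, 1 / 3, 1 / 3])

        best_now = (payoff @ p).max()            # commit to one design today
        best_informed = payoff.max(axis=0) @ p   # choose per state after refining
        evpi = best_informed - best_now
        print(f"EVPI = {evpi:.2f}")
        # If EVPI is smaller than the cost of refining the simulation model,
        # the simpler, more uncertain model is good enough for this decision.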

  18. Discourse comprehension in L2: Making sense of what is not explicitly said.

    PubMed

    Foucart, Alice; Romero-Rivas, Carlos; Gort, Bernharda Lottie; Costa, Albert

    2016-12-01

    Using ERPs, we tested whether L2 speakers can integrate multiple sources of information (e.g., semantic and pragmatic information) during discourse comprehension. We presented native speakers and L2 speakers with three-sentence scenarios in which the final sentence was highly causally related, intermediately related, or causally unrelated to its context; its interpretation therefore required simple or complex inferences. Native speakers revealed a gradual N400-like effect, larger in the causally unrelated condition than in the highly related condition, and falling in between in the intermediately related condition, replicating previous results. In the crucial intermediately related condition, L2 speakers behaved like native speakers, although they showed additional processing in a later time window. Overall, the results show that, when reading, L2 speakers are able to process information from the local context and prior information (e.g., world knowledge) to build global coherence, suggesting that, like native speakers, they draw on different sources of information to make inferences online during discourse comprehension. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Hydrated Cations in the General Chemistry Course.

    ERIC Educational Resources Information Center

    Kauffman, George B.; Baxter, John F., Jr.

    1981-01-01

    Presents selected information regarding the descriptive chemistry of the common metal ions and their compounds, including the concepts of process of solution, polar molecules, ionic size and charge, complex ions, coordination number, and the Bronsted-Lowry acid-base theory. (CS)

  20. c-Mantic: A Cytoscape plugin for Semantic Web

    EPA Science Inventory

    Semantic Web tools can streamline the process of storing, analyzing and sharing biological information. Visualization is important for communicating such complex biological relationships. Here we use the flexibility and speed of the Cytoscape platform to interactively visualize s...

  1. How wilderness visitors choose entry points and campsites

    Treesearch

    Robert C. Lucas

    1990-01-01

    The process of selecting trailheads and campsites is described for visitors to the Bob Marshall Wilderness complex in Montana. Factors influencing decisions by different types of visitors are analyzed. Implications, particularly for information and education programs, are presented.

  2. Patterns of physiological activity accompanying performance on a perceptual-motor task.

    DOT National Transportation Integrated Search

    1969-04-01

    Air traffic controllers are required to spend considerable periods of time observing radar displays. Yet, information regarding physiological measures which best reflect the attentional process in complex vigilance tasks is generally lacking. As an i...

  3. [Clinical decision making and critical thinking in the nursing diagnostic process].

    PubMed

    Müller-Staub, Maria

    2006-10-01

    The daily routine requires complex thinking processes of nurses, but clinical decision making and critical thinking are underestimated in nursing. A great demand for educational measures in clinical judgement related to the diagnostic process was found among nurses. The German literature hardly describes nursing diagnoses as clinical judgements about human reactions to health problems or life processes. Critical thinking is described as an intellectual, disciplined process of active conceptualisation, application, and synthesis of information. It is gained through observation, experience, reflection, and communication, and it guides thinking and action. Critical thinking influences three aspects of clinical decision making: (a) diagnostic judgement, (b) therapeutic reasoning, and (c) ethical decision making. Human reactions are complex processes, and in interpreting them, human behavior is viewed through the lens of health. Therefore, more attention should be given to the nursing diagnostic process. This article presents the theoretical framework of the paper "Clinical decision making: Fostering critical thinking in the nursing diagnostic process through case studies".

  4. An Assessment of Information Exchange Practices, Challenges, and Opportunities to Support US Disease Surveillance in 3 States.

    PubMed

    Garcia, Macarena C; Garrett, Nedra Y; Singletary, Vivian; Brown, Sheereen; Hennessy-Burt, Tamara; Haney, Gillian; Link, Kimberly; Tripp, Jennifer; Mac Kenzie, William R; Yoon, Paula

    2017-12-07

    State and local public health agencies collect and use surveillance data to identify outbreaks, track cases, investigate causes, and implement measures to protect the public's health through various surveillance systems and data exchange practices. The purpose of this assessment was to better understand current practices at state and local public health agencies for collecting, managing, processing, reporting, and exchanging notifiable disease surveillance information. Over an 18-month period (January 2014-June 2015), we evaluated the process of data exchange between surveillance systems, reporting burdens, and challenges within 3 states (California, Idaho, and Massachusetts) that were using 3 different reporting systems. All 3 states use a combination of paper-based and electronic information systems for managing and exchanging data on reportable conditions within the state. The flow of data from local jurisdictions to the state health departments varies considerably. When state and local information systems are not interoperable, manual duplicative data entry and other work-arounds are often required. The results of the assessment show the complexity of disease reporting at the state and local levels and the multiple systems, processes, and resources engaged in preparing, processing, and transmitting data that limit interoperability and decrease efficiency. Through this structured assessment, the Centers for Disease Control and Prevention (CDC) has a better understanding of the complexities of surveillance using commercial off-the-shelf data systems (California and Massachusetts) and the CDC-developed National Electronic Disease Surveillance System Base System. More efficient data exchange and use of data will help facilitate interoperability within the National Notifiable Diseases Surveillance System.

  5. Organizational characteristics and processes are important in the adoption of the Alberta Nutrition Guidelines for Children and Youth in child-care centres.

    PubMed

    Farmer, Anna P; Nikolopoulos, Hara; McCargar, Linda; Berry, Tanya; Mager, Diana

    2015-06-01

    The objective of the present study was to gain an understanding of the organizational characteristics and processes in two child-care centres that may influence adoption of the Alberta Nutrition Guidelines for Children and Youth (ANGCY). In-depth qualitative case studies. Data were collected through direct observations, key informant interviews and field notes. Diffusion of Innovations theory guided the evaluation and intrinsic case analysis. Two urban child-care centres in Edmonton, Alberta, Canada identified as exemplary early adopter cases. Ten key informants, comprising directors and junior and senior staff members, participated in interviews. Organizational processes such as leadership, networking and knowledge brokering, health champions and organizational culture positively influenced adoption behaviour in child-care centres. A key determinant influencing organizational behaviour within both centres was the directors' strong leadership. Acceptance of and adherence to the guidelines were facilitated by organizational factors such as degree of centralization, formalization and complexity, and level of staff training and education. Knowledge brokering by directors was important for transferring and exchanging information across the centre. All child-care staff embraced their informal role as health champions as essential to supporting guideline adherence and encouraging healthy food and eating environments. Organizational processes and characteristics such as leadership, knowledge brokering and networking, organizational culture and health champions played an important role in the adoption of nutrition guidelines in child-care centres. The complex interplay of decision making, organization of work and specialization of roles influenced the extent to which nutrition guidelines were adopted.

  6. Effects of information processing speed on learning, memory, and executive functioning in people living with HIV/AIDS.

    PubMed

    Fellows, Robert P; Byrd, Desiree A; Morgello, Susan

    2014-01-01

    It is unclear whether or to what degree literacy, aging, and other neurologic abnormalities relate to cognitive deficits among people living with HIV/AIDS in the combined antiretroviral therapy (CART) era. The primary aim of this study was to simultaneously examine the association of age, HIV-associated motor abnormalities, major depressive disorder, and reading level with information processing speed, learning, memory, and executive functions, and to determine whether processing speed mediated any of the relationships between cognitive and noncognitive variables. Participants were 186 racially and ethnically diverse men and women living with HIV/AIDS who underwent comprehensive neurological, neuropsychological, and medical evaluations. Structural equation modeling was utilized to assess the extent to which information processing speed mediated the relationship between age, motor abnormalities, major depressive disorder, and reading level with other cognitive abilities. Age, motor dysfunction, reading level, and current major depressive disorder were all significantly associated with information processing speed. Information processing speed fully mediated the effects of age on learning, memory, and executive functioning and partially mediated the effect of major depressive disorder on learning and memory. The effect of motor dysfunction on learning and memory was fully mediated by processing speed. These findings provide support for information processing speed as a primary deficit, which may account, at least in part, for many of the other cognitive abnormalities recognized in complex HIV/AIDS populations. The association of age and information processing speed may account for HIV/aging synergies in the generation of CART-era cognitive abnormalities.

  7. Analysis of a municipal wastewater treatment plant using a neural network-based pattern analysis

    USGS Publications Warehouse

    Hong, Y.-S.T.; Rosen, Michael R.; Bhamidimarri, R.

    2003-01-01

    This paper addresses the problem of how to capture the complex relationships that exist between process variables and to diagnose the dynamic behaviour of a municipal wastewater treatment plant (WTP). Due to the complex biological reaction mechanisms and the highly time-varying, multivariable aspects of a real WTP, diagnosis of the WTP is still difficult in practice. The application of intelligent techniques, which can analyse multi-dimensional process data using sophisticated visualisation, can be useful for analysing and diagnosing the activated-sludge WTP. In this paper, the Kohonen Self-Organising Feature Map (KSOFM) neural network is applied to analyse multi-dimensional process data and to diagnose the inter-relationships of the process variables in a real activated-sludge WTP. By using component planes, detailed local relationships between the process variables, e.g., responses of the process variables under different operating conditions, as well as global information, are discovered. The operating condition and the inter-relationships among the process variables in the WTP have been diagnosed and extracted from the information obtained by clustering analysis of the maps. It is concluded that the KSOFM technique provides an effective analysis and diagnosis tool for understanding the system behaviour and extracting the knowledge contained in multi-dimensional data of a large-scale WTP. © 2003 Elsevier Science Ltd. All rights reserved.
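
    A minimal sketch of this kind of analysis, using the minisom library on synthetic standardized data rather than the plant's measurements, is:

        import numpy as np
        from minisom import MiniSom

        rng = np.random.default_rng(1)
        data = rng.random((500, 6))  # e.g., 500 samples of 6 process variables
        data = (data - data.mean(axis=0)) / data.std(axis=0)  # standardize

        som = MiniSom(10, 10, input_len=6, sigma=1.5, learning_rate=0.5,
                      random_seed=1)
        som.train_random(data, num_iteration=5000)

        # Slicing the weight cube per variable yields the "component planes"
        # used to read off inter-relationships between process variables.
        component_plane_0 = som.get_weights()[:, :, 0]
        print(component_plane_0.shape)  # (10, 10)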

  8. Secret information reconciliation based on punctured low-density parity-check codes for continuous-variable quantum key distribution

    NASA Astrophysics Data System (ADS)

    Jiang, Xue-Qin; Huang, Peng; Huang, Duan; Lin, Dakai; Zeng, Guihua

    2017-02-01

    Achieving information-theoretic security with practical complexity is of great interest to continuous-variable quantum key distribution in the postprocessing procedure. In this paper, we propose a reconciliation scheme based on punctured low-density parity-check (LDPC) codes. Compared to the well-known multidimensional reconciliation scheme, the present scheme has lower time complexity. In particular, when the chosen punctured LDPC code achieves the Shannon capacity, the proposed reconciliation scheme can remove the information that has been leaked to an eavesdropper in the quantum transmission phase. Therefore, no information is leaked to the eavesdropper after the reconciliation stage. This indicates that the privacy amplification algorithm of the postprocessing procedure is no longer needed after the reconciliation process. These features lead to a higher secret key rate, optimal performance, and availability for the involved quantum key distribution scheme.
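
    The puncturing idea itself can be sketched on a toy linear code; a (7,4) Hamming code stands in here for a long LDPC code, and this is only a conceptual illustration, not the paper's reconciliation protocol:

        import numpy as np

        # Generator matrix of a (7,4) Hamming code in [I | P] form.
        G = np.array([[1, 0, 0, 0, 1, 1, 0],
                      [0, 1, 0, 0, 1, 0, 1],
                      [0, 0, 1, 0, 0, 1, 1],
                      [0, 0, 0, 1, 1, 1, 1]])

        message = np.array([1, 0, 1, 1])
        codeword = message @ G % 2          # 7 coded bits

        punctured = [4, 6]                  # positions never transmitted
        transmitted = np.delete(codeword, punctured)

        # The receiver treats punctured positions as erasures and fills them
        # in during iterative decoding; varying how many bits are punctured
        # tunes the effective code rate without designing a new code.
        print(codeword, "->", transmitted)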

  9. Faster on Easy Items, More Accurate on Difficult Ones: Cognitive Ability and Performance on a Task of Varying Difficulty

    ERIC Educational Resources Information Center

    Dodonova, Yulia A.; Dodonov, Yury S.

    2013-01-01

    Using more complex items than those commonly employed within the information-processing approach, but still easier than those used in intelligence tests, this study analyzed how the association between processing speed and accuracy level changes as the difficulty of the items increases. The study involved measuring cognitive ability using Raven's…

  10. The Role of Self-Regulated Learning in Fostering Students' Conceptual Understanding of Complex Systems with Hypermedia

    ERIC Educational Resources Information Center

    Azevedo, Roger; Guthrie, John T.; Seibert, Diane

    2004-01-01

    This study examines the role of self-regulated learning (SRL) in facilitating students' shifts to more sophisticated mental models of the circulatory system as indicated by both performance and process data. We began with Winne and colleagues' information processing model of SRL (Winne, 2001; Winne & Hadwin, 1998) and used it to examine how…

  11. Extracting features of Gaussian self-similar stochastic processes via the Bandt-Pompe approach.

    PubMed

    Rosso, O A; Zunino, L; Pérez, D G; Figliola, A; Larrondo, H A; Garavaglia, M; Martín, M T; Plastino, A

    2007-12-01

    By recourse to appropriate information-theory quantifiers (normalized Shannon entropy and the Martín-Plastino-Rosso intensive statistical complexity measure), we revisit the characterization of Gaussian self-similar stochastic processes from a Bandt-Pompe viewpoint. We show that the ensuing approach exhibits considerable advantages with respect to other treatments. In particular, clear gaps in the quantifiers are found in the transition between the continuous processes and their associated noises.
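
    The Bandt-Pompe construction behind these quantifiers is simple to state: slide a window over the series, record the ordinal pattern of each window, and take the Shannon entropy of the pattern distribution. A minimal sketch of the standard construction (not the authors' code) is:

        import math
        from collections import Counter
        import numpy as np

        def permutation_entropy(x, order=3):
            """Normalized Shannon entropy of Bandt-Pompe ordinal patterns."""
            patterns = Counter(
                tuple(np.argsort(x[i:i + order]))
                for i in range(len(x) - order + 1)
            )
            total = sum(patterns.values())
            H = -sum((c / total) * math.log(c / total)
                     for c in patterns.values())
            return H / math.log(math.factorial(order))  # 0 ordered, 1 random

        rng = np.random.default_rng(2)
        print(permutation_entropy(rng.normal(size=5000)))             # near 1
        print(permutation_entropy(np.sin(np.linspace(0, 60, 5000))))  # low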

  12. Nested polynomial trends for the improvement of Gaussian process-based predictors

    NASA Astrophysics Data System (ADS)

    Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.

    2017-10-01

    The role of simulation keeps increasing for the sensitivity analysis and uncertainty quantification of complex systems. Such numerical procedures are generally based on the processing of a huge number of code evaluations. When the computational cost associated with one particular evaluation of the code is high, such direct approaches, based on the computer code only, are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. When confronted with deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency, and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of the mean function, based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of the method, this work compares its efficiency with that of alternative approaches on a series of examples.
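
    The idea of pairing a polynomial trend with a Gaussian process correction can be sketched as follows; the two composed polynomials below are fitted greedily stage by stage on a toy function, whereas the paper identifies its nested trend within the GPR framework, so this is only an illustration:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(3)
        X = rng.uniform(-1, 1, size=(15, 1))      # few "code evaluations"
        y = np.tanh(4 * X[:, 0]) + 0.01 * rng.normal(size=15)

        # Nested trend m(x) = p_outer(p_inner(x)), fitted stage by stage.
        p_inner = np.polynomial.Polynomial.fit(X[:, 0], y, deg=3)
        p_outer = np.polynomial.Polynomial.fit(p_inner(X[:, 0]), y, deg=2)
        trend = p_outer(p_inner(X[:, 0]))

        # Gaussian process regression on the residual around the trend.
        gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.3))
        gpr.fit(X, y - trend)

        X_new = np.linspace(-1, 1, 5).reshape(-1, 1)
        y_pred = p_outer(p_inner(X_new[:, 0])) + gpr.predict(X_new)
        print(y_pred)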

  13. Adaptive Classification of Landscape Process and Function: An Integration of Geoinformatics and Self-Organizing Maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.

    2009-07-17

    The advanced geospatial information extraction and analysis capabilities of Geographic Information Systems (GIS) and Artificial Neural Networks (ANNs), particularly Self-Organizing Maps (SOMs), provide a topology-preserving means for reducing and understanding complex data relationships in the landscape. The Adaptive Landscape Classification Procedure (ALCP) is presented as an adaptive and evolutionary capability in which varying types of data can be assimilated to address different management needs such as hydrologic response, erosion potential, habitat structure, instrumentation placement, and various forecast or what-if scenarios. This paper describes how the evaluation and analysis of spatial and/or temporal patterns in the landscape can provide insight into complex ecological, hydrological, climatic, and other natural and anthropogenic-influenced processes. Establishing relationships among high-dimensional datasets through neurocomputing-based pattern recognition methods can help (1) resolve large volumes of data into a structured and meaningful form; (2) provide an approach for inferring landscape processes in areas that have limited data available but exhibit similar landscape characteristics; and (3) discover the value of individual variables or groups of variables that contribute to specific processes in the landscape. Classification of hydrologic patterns in the landscape is demonstrated.

  14. Intersection of argumentation and the use of multiple representations in the context of socioscientific issues

    NASA Astrophysics Data System (ADS)

    Namdar, Bahadir; Shen, Ji

    2016-05-01

    Using multiple representations and argumentation are two fundamental processes in science. With advances in information and communication technologies, these two processes are blended more than ever before. However, little is known about how they interact with each other in student learning. Hence, we conducted a design-based study to distill the relationship between these two processes. Specifically, we designed a learning unit on nuclear energy and implemented it with a group of preservice middle school teachers. The participants used a web-based knowledge organization platform that incorporated three representational modes: textual, concept map, and pictorial. The participants organized their knowledge of nuclear energy by searching, sorting, and clustering information through these representational modes, and argued about the nuclear energy issue. We found that the use of multiple representations and argumentation interacted in a complex way. Based on our findings, we argue that the complexity can be unfolded in two aspects: (a) the use of multiple representations mediates argumentation in different forms and for different purposes; (b) the type of argumentation that leads to refinement of the use of multiple representations is often non-mediated and drawn from personal experience.

  15. Image wavelet decomposition and applications

    NASA Technical Reports Server (NTRS)

    Treil, N.; Mallat, S.; Bajcsy, R.

    1989-01-01

    The general problem of computer vision has been investigated for more than 20 years and is still one of the most challenging fields in artificial intelligence. Indeed, a look at the human visual system gives an idea of the complexity of any solution to the problem of visual recognition. This general task can be decomposed into a whole hierarchy of problems, ranging from pixel processing to high-level segmentation and complex object recognition. Contrasting an image across different representations provides useful information such as edges. An example of low-level signal and image processing using the theory of wavelets, which provides the basis for multiresolution representation, is introduced. Like the human brain, we use a multiorientation process that detects features independently in different orientation sectors; images of the same orientation but of different resolutions are contrasted to gather information about an image. An interesting image representation using energy zero crossings is developed. This representation is shown to be experimentally complete and leads to higher-level applications such as edge and corner finding, which in turn provide two basic steps toward image segmentation. The possibilities of feedback between different levels of processing are also discussed.
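
    A single level of such a decomposition can be reproduced with the PyWavelets library; the Haar wavelet and the synthetic image below are assumptions for illustration, and the paper's multi-orientation scheme is more elaborate:

        import numpy as np
        import pywt

        rng = np.random.default_rng(4)
        image = rng.random((128, 128))  # synthetic image

        # One level: coarse approximation plus three oriented detail sub-bands.
        approx, (horiz, vert, diag) = pywt.dwt2(image, "haar")
        print(approx.shape)  # (64, 64): half resolution in each direction

        # Recursing on the approximation yields the multiresolution pyramid;
        # large detail coefficients mark edges at that scale and orientation.
        pyramid = pywt.wavedec2(image, "haar", level=3)
        print(len(pyramid))  # coarsest approximation + 3 detail levels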

  16. Deconstructing the simplification of jury instructions: How simplifying the features of complexity affects jurors' application of instructions.

    PubMed

    Baguley, Chantelle M; McKimmie, Blake M; Masser, Barbara M

    2017-06-01

    Research consistently shows that techniques currently used to simplify jury instructions do not always improve mock jurors' comprehension. When improvements are observed, they are limited, and overall comprehension remains low. It is unclear, however, why this occurs. It is possible that current simplification techniques do not effectively simplify the features of complexity, present in standardized instructions, that have the greatest effect on jurors' comprehension. It is not yet known, however, how much each feature of complexity individually affects jurors' comprehension. To investigate this, the authors used existing data from published empirical studies to examine how simplifying each feature of complexity affects mock jurors' application of instructions, as jurors can only apply instructions to the extent they understand them. The results suggest that reducing the conceptual complexity and the proportion of supplementary information was associated with increased application of the instructions, whereas reducing the linguistic complexity and the amount of information, and providing the instructions in a written format, was not. In addition, the results showed an unexpected adverse effect of simplification: reducing the amount of information was associated with an increase in the punitiveness of mock jurors' verdicts, independently of the instruction content. Together, these results suggest a need to make jury instructions comprehensible, highlight the key principles in the decision process, and identify a way to eliminate the negative effect of reducing the amount of information. Addressing these needs is essential for developing a simplification technique that maximizes jurors' comprehension and application of instructions while minimizing the previously overlooked negative effects of simplification. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Mental workload prediction based on attentional resource allocation and information processing.

    PubMed

    Xiao, Xu; Wanyan, Xiaoru; Zhuang, Damin

    2015-01-01

    Mental workload is an important component in complex human-machine systems. The limited applicability of empirical workload measures produces the need for workload modeling and prediction methods. In the present study, a mental workload prediction model is built on the basis of attentional resource allocation and information processing to ensure pilots' accuracy and speed in understanding large amounts of flight information on the cockpit display interface. Validation with an empirical study of an abnormal attitude recovery task showed that this model's prediction of mental workload highly correlated with experimental results. This mental workload prediction model provides a new tool for optimizing human factors interface design and reducing human errors.

  18. Organizing Space Shuttle parametric data for maintainability

    NASA Technical Reports Server (NTRS)

    Angier, R. C.

    1983-01-01

    A model for the organization and management of Space Shuttle data is proposed. Shuttle avionics software is parametrically altered by a reconfiguration process for each flight. As the flight rate approaches an operational level, current methods of data management will become increasingly complex. An alternative method is introduced, using modularized standard data, and its implications for the data collection, integration, validation, and reconfiguration processes are explored. Information modules are cataloged for later use and may be combined at several levels for maintenance. For each flight, information modules can then be selected from the catalog at a high level. These concepts take advantage of the reusability of Space Shuttle information to reduce the cost of reconfiguration as flight experience increases.

  19. How to build an information gathering and processing system: lessons from naturally and artificially intelligent systems.

    PubMed

    Chappell, Jackie; Demery, Zoe P; Arriola-Rios, Veronica; Sloman, Aaron

    2012-02-01

    Imagine a situation in which you had to design a physical agent that could collect information from its environment, then store and process that information to help it respond appropriately to novel situations. What kinds of information should it attend to? How should the information be represented so as to allow efficient use and re-use? What kinds of constraints and trade-offs would there be? There are no unique answers. In this paper, we discuss some of the ways in which the need to address problems of varying kinds and complexity can be met by different information processing systems. We also discuss different ways in which relevant information can be obtained, and how different kinds of information can be processed and used, by both biological organisms and artificial agents. We analyse several constraints and design features, and show how they relate both to biological organisms and to lessons that can be learned from building artificial systems. Our standpoint overlaps with Karmiloff-Smith (1992) in that we assume that a collection of mechanisms geared to learning and developing in biological environments is available in forms that constrain, but do not determine, what can or will be learnt by individuals. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. A tool for calculating binding-site residues on proteins from PDB structures.

    PubMed

    Hu, Jing; Yan, Changhui

    2009-08-03

    In research on protein functional sites, researchers often need to identify binding-site residues on a protein. A commonly used strategy is to find a complex structure from the Protein Data Bank (PDB) that consists of the protein of interest and its interacting partner(s) and to calculate binding-site residues based on the complex structure. However, since a protein may participate in multiple interactions, the binding-site residues calculated from one complex structure usually do not reveal all binding sites on a protein. This requires researchers to find all PDB complexes that contain the protein of interest and combine the binding-site information gleaned from them, a very time-consuming process. In particular, combining binding-site information obtained from different PDB structures requires tedious work to align protein sequences. The process becomes overwhelmingly difficult when researchers have a large set of proteins to analyze, which is usually the case in practice. In this study, we have developed TCBRP http://yanbioinformatics.cs.usu.edu:8080/ppbindingsubmit, a tool for calculating binding-site residues on proteins. For an input protein, TCBRP can quickly find all binding-site residues on the protein by automatically combining the information obtained from all PDB structures that contain the protein of interest. Additionally, TCBRP presents the binding-site residues in different categories according to the interaction type. TCBRP also allows researchers to set the definition of binding-site residues. The developed tool is very useful for research on protein binding-site analysis and prediction.
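
    For a single complex, the per-structure step that such a tool automates can be sketched with Biopython; the file name, chain IDs, and the 4.5 angstrom cutoff below are illustrative assumptions, not TCBRP's actual definition of a binding-site residue:

        from Bio.PDB import PDBParser, NeighborSearch

        structure = PDBParser(QUIET=True).get_structure("cplx", "complex.pdb")
        model = structure[0]
        protein, partner = model["A"], model["B"]  # protein and its partner

        # Index all partner atoms, then flag protein residues having any atom
        # within the distance cutoff of the partner chain.
        search = NeighborSearch(list(partner.get_atoms()))
        binding_site = {
            residue.get_id()[1]
            for residue in protein
            for atom in residue
            if search.search(atom.coord, 4.5)
        }
        print(sorted(binding_site))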
