Sample records for complex memory hierarchies

  1. Memory-Scalable GPU Spatial Hierarchy Construction.

    PubMed

    Qiming Hou; Xin Sun; Kun Zhou; Lauterbach, C; Manocha, D

    2011-04-01

    Recent GPU algorithms for constructing spatial hierarchies have achieved promising performance for moderately complex models by using the breadth-first search (BFS) construction order. While being able to exploit the massive parallelism on the GPU, the BFS order also consumes excessive GPU memory, which becomes a serious issue for interactive applications involving very complex models with more than a few million triangles. In this paper, we propose to use the partial breadth-first search (PBFS) construction order to control memory consumption while maximizing performance. We apply the PBFS order to two hierarchy construction algorithms. The first algorithm, for kd-trees, automatically balances the level of parallelism against intermediate memory usage. With PBFS, peak memory consumption during construction can be efficiently controlled without costly CPU-GPU data transfer. We also develop memory allocation strategies to effectively limit memory fragmentation. The resulting algorithm scales well with GPU memory and constructs kd-trees of models with millions of triangles at interactive rates on GPUs with 1 GB memory. Compared with existing algorithms, our algorithm is an order of magnitude more scalable for a given GPU memory bound. The second algorithm is for out-of-core bounding volume hierarchy (BVH) construction for very large scenes based on the PBFS construction order. At each iteration, all constructed nodes are dumped to the CPU memory, and the GPU memory is freed for the next iteration's use. In this way, the algorithm is able to build trees that are too large to be stored in the GPU memory. Experiments show that our algorithm can construct BVHs for scenes with up to 20 M triangles, several times larger than previous GPU algorithms.
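    The core PBFS idea can be sketched in a few lines: expand only a budget-limited slice of the frontier per iteration, so peak intermediate memory stays bounded. This is a minimal CPU-side illustration with invented names, not the paper's GPU implementation.

```python
# Sketch of partial breadth-first search (PBFS) tree construction with a
# node budget. All names are hypothetical; the real algorithm runs on the
# GPU and splits kd-tree nodes by spatial heuristics, not list halving.

def build_pbfs(items, max_frontier, leaf_size=2):
    """Build a binary split tree, expanding at most `max_frontier`
    nodes per iteration to bound peak intermediate memory."""
    root = {"items": sorted(items), "children": None}
    frontier = [root]
    iterations = 0
    while frontier:
        # The PBFS idea: process only a budget-limited slice of the
        # frontier; the remaining nodes wait for a later iteration.
        batch, frontier = frontier[:max_frontier], frontier[max_frontier:]
        iterations += 1
        for node in batch:
            data = node["items"]
            if len(data) <= leaf_size:
                continue  # small enough: this node becomes a leaf
            mid = len(data) // 2
            left = {"items": data[:mid], "children": None}
            right = {"items": data[mid:], "children": None}
            node["children"] = (left, right)
            frontier.extend([left, right])
    return root, iterations
```

    With a full BFS the frontier doubles every level; capping it at `max_frontier` trades some parallelism for a controlled memory peak, which is the balance the abstract describes.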

  2. Formal verification of a set of memory management units

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, Gerald C.

    1992-01-01

    This document describes the verification of a set of memory management units (MMU). The verification effort demonstrates the use of hierarchical decomposition and abstract theories. The MMUs can be organized into a complexity hierarchy. Each new level in the hierarchy adds a few significant features or modifications to the lower level MMU. The units described include: (1) a page check translation look-aside module (TLM); (2) a page check TLM with supervisor line; (3) a base bounds MMU; (4) a virtual address translation MMU; and (5) a virtual address translation MMU with memory resident segment table.
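    The simplest unit in the hierarchy, the base-bounds MMU, has a one-line translation rule: add the base register, fault if the virtual address exceeds the bounds register. A toy sketch (illustrative names, not the verified hardware description):

```python
# Toy model of a base-bounds MMU's translation function.

def base_bounds_translate(vaddr, base, bounds):
    """Return the physical address, or raise on a bounds violation."""
    if not (0 <= vaddr < bounds):
        raise ValueError(f"fault: vaddr {vaddr} outside bounds {bounds}")
    return base + vaddr
```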

  3. It's all coming back to me now: perception and memory in amnesia.

    PubMed

    Baxter, Mark G

    2012-07-12

    Medial temporal lobe (MTL) structures may constitute a representational hierarchy, rather than a dedicated system for memory. Barense et al. (2012) show that intact memory for object features can interfere with perception of complex objects in individuals with MTL amnesia. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. A class-hierarchical, object-oriented approach to virtual memory management

    NASA Technical Reports Server (NTRS)

    Russo, Vincent F.; Campbell, Roy H.; Johnston, Gary M.

    1989-01-01

    The Choices family of operating systems exploits class hierarchies and object-oriented programming to facilitate the construction of customized operating systems for shared memory and networked multiprocessors. The software is being used in the Tapestry laboratory to study the performance of algorithms, mechanisms, and policies for parallel systems. Described here are the architectural design and class hierarchy of the Choices virtual memory management system. The software and hardware mechanisms and policies of a virtual memory system implement a memory hierarchy that exploits the trade-off between response times and storage capacities. In Choices, the notion of a memory hierarchy is captured by abstract classes. Concrete subclasses of those abstractions implement a virtual address space, segmentation, paging, physical memory management, secondary storage, and remote (that is, networked) storage. Captured in the notion of a memory hierarchy are classes that represent memory objects. These classes provide a storage mechanism that contains encapsulated data and have methods to read or write the memory object. Each of these classes provides specializations to represent the memory hierarchy.

  5. Simplified Interface to Complex Memory Hierarchies 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lang, Michael; Ionkov, Latchesar; Williams, Sean

    2017-02-21

    Memory systems are expected to get evermore complicated in the coming years, and it isn't clear exactly what form that complexity will take. On the software side, a simple, flexible way of identifying and working with memory pools is needed. Additionally, most developers seek code portability and do not want to learn the intricacies of complex memory. Hence, we believe that a library for interacting with complex memory systems should expose two kinds of abstraction: First, a low-level, mechanism-based interface designed for the runtime or advanced user that wants complete control, with its focus on simplified representation but with all decisions left to the caller. Second, a high-level, policy-based interface designed for ease of use for the application developer, in which we aim for best-practice decisions based on application intent. We have developed such a library, called SICM: Simplified Interface to Complex Memory.
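    The two abstraction levels the abstract contrasts can be caricatured as follows. Every class and method name here is invented for illustration and does not reflect the real SICM C API.

```python
# Hypothetical sketch of a two-level memory-pool interface: a low-level
# mechanism layer (caller decides everything) and a high-level policy
# layer (library decides from declared intent).

class Device:
    """Low-level, mechanism view: one memory pool, fully caller-controlled."""
    def __init__(self, name, capacity, latency_ns):
        self.name, self.capacity, self.latency_ns = name, capacity, latency_ns
        self.used = 0

    def alloc(self, nbytes):
        if self.used + nbytes > self.capacity:
            raise MemoryError(f"{self.name} pool exhausted")
        self.used += nbytes
        return (self.name, nbytes)

class PolicyAllocator:
    """High-level, policy view: pick a pool from application intent."""
    def __init__(self, devices):
        self.devices = devices

    def alloc(self, nbytes, intent="bandwidth"):
        # Best-practice heuristic: latency-sensitive data goes to the
        # fastest pool with room; otherwise prefer the emptiest pool.
        if intent == "latency":
            order = sorted(self.devices, key=lambda d: d.latency_ns)
        else:
            order = sorted(self.devices,
                           key=lambda d: d.capacity - d.used, reverse=True)
        for dev in order:
            if dev.used + nbytes <= dev.capacity:
                return dev.alloc(nbytes)
        raise MemoryError("no pool can satisfy the request")
```

    An advanced runtime would call `Device.alloc` directly; an application developer would state intent and let `PolicyAllocator` choose.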

  6. Implementing a bubble memory hierarchy system

    NASA Technical Reports Server (NTRS)

    Segura, R.; Nichols, C. D.

    1979-01-01

    This paper reports on the implementation of a magnetic bubble memory in a two-level hierarchical system. The hierarchy used a major-minor loop device and RAM under microprocessor control. Dynamic memory addressing, a dual-bus primary memory, and hardware data-modification detection are incorporated in the system to minimize access time. The objective of the system is to combine the advantages of bipolar memory with those of bubble-domain memory, providing a smart, optimal memory system that is easy to interface and independent of the user's system.

  7. Memory Benchmarks for SMP-Based High Performance Parallel Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, A B; de Supinski, B; Mueller, F

    2001-11-20

    As the speed gap between CPU and main memory continues to grow, memory accesses increasingly dominate the performance of many applications. The problem is particularly acute for symmetric multiprocessor (SMP) systems, where the shared memory may be accessed concurrently by a group of threads running on separate CPUs. Unfortunately, several key issues governing memory system performance in current systems are not well understood. Complex interactions between the levels of the memory hierarchy, buses or switches, DRAM back-ends, system software, and application access patterns can make it difficult to pinpoint bottlenecks and determine appropriate optimizations, and the situation is even more complex for SMP systems. To partially address this problem, we formulated a set of multi-threaded microbenchmarks for characterizing and measuring the performance of the underlying memory system in SMP-based high-performance computers. We report our use of these microbenchmarks on two important SMP-based machines. This paper has four primary contributions. First, we introduce a microbenchmark suite to systematically assess and compare the performance of different levels in SMP memory hierarchies. Second, we present a new tool based on hardware performance monitors to determine a wide array of memory system characteristics, such as cache sizes, quickly and easily; by using this tool, memory performance studies can be targeted to the full spectrum of performance regimes with many fewer data points than are otherwise required. Third, we present experimental results indicating that the performance of applications with large memory footprints remains largely constrained by memory. Fourth, we demonstrate that thread-level parallelism further degrades memory performance, even for the latest SMPs with hardware prefetching and switch-based memory interconnects.
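    The classic building block of such memory microbenchmarks is pointer chasing: traverse a random cyclic permutation so the hardware prefetcher cannot predict the next address, and time the per-access cost as the working set grows past each cache level. A structural sketch (real suites of this kind are written in C; in Python the interpreter overhead dominates, so this only shows the shape of the technique):

```python
# Minimal pointer-chasing microbenchmark skeleton.
import random
import time

def make_chain(n, seed=0):
    """Return `nxt`, an index array encoding one random cycle over 0..n-1,
    so every element is visited exactly once per lap."""
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)
    nxt = [0] * n
    for i in range(n):
        nxt[order[i]] = order[(i + 1) % n]
    return nxt

def chase(nxt, steps):
    """Follow the chain `steps` times; return (final index, ns per step)."""
    i = 0
    t0 = time.perf_counter_ns()
    for _ in range(steps):
        i = nxt[i]  # each load depends on the previous one: no overlap
    elapsed = time.perf_counter_ns() - t0
    return i, elapsed / steps
```

    Sweeping `n` from a few KB to hundreds of MB of chain and plotting ns/step exposes the latency plateau of each level of the hierarchy.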

  8. A general model for memory interference in a multiprocessor system with memory hierarchy

    NASA Technical Reports Server (NTRS)

    Taha, Badie A.; Standley, Hilda M.

    1989-01-01

    The problem of memory interference in a multiprocessor system with a hierarchy of shared buses and memories is addressed. The behavior of the processors is represented by a sequence of memory requests with each followed by a determined amount of processing time. A statistical queuing network model for determining the extent of memory interference in multiprocessor systems with clusters of memory hierarchies is presented. The performance of the system is measured by the expected number of busy memory clusters. The results of the analytic model are compared with simulation results, and the correlation between them is found to be very high.

  9. Short-term plasticity as a neural mechanism supporting memory and attentional functions.

    PubMed

    Jääskeläinen, Iiro P; Ahveninen, Jyrki; Andermann, Mark L; Belliveau, John W; Raij, Tommi; Sams, Mikko

    2011-11-08

    Based on behavioral studies, several relatively distinct perceptual and cognitive functions have been defined in cognitive psychology, such as sensory memory, short-term memory, and selective attention. Here, we review evidence suggesting that some of these functions may be supported by shared underlying neuronal mechanisms. Specifically, we present, based on an integrative review of the literature, a hypothetical model wherein short-term plasticity, in the form of transient center-excitatory and surround-inhibitory modulations, constitutes a generic processing principle that supports sensory memory, short-term memory, involuntary attention, selective attention, and perceptual learning. In our model, the size and complexity of receptive fields/level of abstraction of neural representations, as well as the length of temporal receptive windows, increases as one steps up the cortical hierarchy. Consequently, the type of input (bottom-up vs. top-down) and the level of cortical hierarchy that the inputs target determine whether short-term plasticity supports purely sensory vs. semantic short-term memory or attentional functions. Furthermore, we suggest that rather than discrete memory systems, there are continuums of memory representations from short-lived sensory ones to more abstract longer-duration representations, such as those tapped by behavioral studies of short-term memory. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  11. Operating systems [of computers]

    NASA Technical Reports Server (NTRS)

    Denning, P. J.; Brown, R. L.

    1984-01-01

    A computer operating system creates a hierarchy of levels of abstraction, so that at a given level all details concerning lower levels can be ignored. This hierarchical structure separates functions according to their complexity, characteristic time scale, and level of abstraction. The lowest levels include the system's hardware; concepts associated explicitly with the coordination of multiple tasks appear at intermediate levels, which conduct 'primitive processes'. A software semaphore is the mechanism that controls primitive processes which must be synchronized. At higher levels lie, in rising order, the access to the secondary storage devices of a particular machine, a 'virtual memory' scheme for managing the main and secondary memories, communication between processes by way of a mechanism called a 'pipe', access to external input and output devices, and a hierarchy of directories cataloguing the hardware and software objects to which access must be controlled.
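    The software semaphore mentioned above can be shown in miniature with Python's standard library: two "primitive processes" (threads) synchronize access to a shared counter, so no increment is lost.

```python
# Two threads coordinate through a binary semaphore: the classic P/V
# (acquire/release) protocol around a critical section.
import threading

counter = 0
sem = threading.Semaphore(1)  # binary semaphore guarding the counter

def worker(increments):
    global counter
    for _ in range(increments):
        sem.acquire()   # P operation: wait for exclusive access
        counter += 1    # critical section
        sem.release()   # V operation: let the next process in

threads = [threading.Thread(target=worker, args=(1000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# counter is now exactly 2000: no update was lost to a race.
```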

  12. Software Techniques for Non-Von Neumann Architectures

    DTIC Science & Technology

    1990-01-01

    [Excerpt from a system-characteristics table] Comm. topology: programmable Benes network; hypercubic lattice for QCD. Control: centralized. Assignment: static. Memory: shared. Synch: universal. Max CPUs: 566 processor boards (each = 4 floating-point units, 2 multipliers). CPU size: 32-bit floating-point chips. Performance: 11.4 Gflops. Market: quantum chromodynamics (QCD). [...] functions there should exist a capability to define hierarchies and lattices of complex objects. A complex object can be made up of a set of simple objects.

  13. Stream Processors

    NASA Astrophysics Data System (ADS)

    Erez, Mattan; Dally, William J.

    Stream processors, like other multicore architectures, partition their functional units and storage into multiple processing elements. In contrast to typical architectures, which contain symmetric general-purpose cores and a cache hierarchy, stream processors have a significantly leaner design. Stream processors are specifically designed for the stream execution model, in which applications have large amounts of explicit parallel computation, structured and predictable control, and memory accesses that can be performed at a coarse granularity. Applications in the streaming model are expressed in a gather-compute-scatter form, yielding programs with explicit control over transferring data to and from on-chip memory. Relying on these characteristics, which are common to many media processing and scientific computing applications, stream architectures redefine the boundary between software and hardware responsibilities, with software bearing much of the complexity required to manage concurrency, locality, and latency tolerance. Thus, stream processors have minimal control, consisting of fetching medium- and coarse-grained instructions and executing them directly on the many ALUs. Moreover, the on-chip storage hierarchy of stream processors is under explicit software control, as is all communication, eliminating the need for complex reactive hardware mechanisms.
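    The gather-compute-scatter form can be sketched as follows: the program explicitly stages data into a small on-chip buffer (modeled here as a Python list), runs a kernel over it, and writes results back in bulk. The buffer size and function names are illustrative only.

```python
# Toy model of the gather-compute-scatter streaming form.

ONCHIP_WORDS = 4  # stand-in for a small on-chip stream register file

def stream_process(memory, indices, kernel):
    """Process `indices` in on-chip-sized batches under explicit
    software control, as the streaming model requires."""
    out = dict(enumerate(memory))  # scatter target, seeded with old values
    for start in range(0, len(indices), ONCHIP_WORDS):
        batch = indices[start:start + ONCHIP_WORDS]
        onchip = [memory[i] for i in batch]    # gather: bulk load on-chip
        results = [kernel(x) for x in onchip]  # compute: local, parallel-friendly
        for i, r in zip(batch, results):       # scatter: bulk store back
            out[i] = r
    return [out[i] for i in range(len(memory))]
```

    The point is that data movement is explicit in the program text, so software, not a reactive cache, decides what is resident on chip and when.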

  14. Eye Movement Evidence for Hierarchy Effects on Memory Representation of Discourses.

    PubMed

    Wu, Yingying; Yang, Xiaohong; Yang, Yufang

    2016-01-01

    In this study, we applied the text-change paradigm to investigate whether and how discourse hierarchy affected the memory representation of a discourse. Three kinds of three-sentence discourses were constructed. In the hierarchy-high condition and the hierarchy-low condition, the three sentences of the discourses were hierarchically organized and the last sentence of each discourse was located at the high level and the low level of the discourse hierarchy, respectively. In the linear condition, the three sentences of the discourses were linearly organized. Critical words were always located at the last sentence of the discourses. These discourses were successively presented twice and the critical words were changed to semantically related words in the second presentation. The results showed that during the early processing stage, the critical words were read for longer times when they were changed in the hierarchy-high and the linear conditions, but not in the hierarchy-low condition. During the late processing stage, the changed-critical words were again found to induce longer reading times only when they were in the hierarchy-high condition. These results suggest that words in a discourse have better memory representation when they are located at the higher rather than at the lower level of the discourse hierarchy. Global discourse hierarchy is established as an important factor in constructing the mental representation of a discourse.

  15. Eye Movement Evidence for Hierarchy Effects on Memory Representation of Discourses

    PubMed Central

    Wu, Yingying; Yang, Xiaohong; Yang, Yufang

    2016-01-01

    In this study, we applied the text-change paradigm to investigate whether and how discourse hierarchy affected the memory representation of a discourse. Three kinds of three-sentence discourses were constructed. In the hierarchy-high condition and the hierarchy-low condition, the three sentences of the discourses were hierarchically organized and the last sentence of each discourse was located at the high level and the low level of the discourse hierarchy, respectively. In the linear condition, the three sentences of the discourses were linearly organized. Critical words were always located at the last sentence of the discourses. These discourses were successively presented twice and the critical words were changed to semantically related words in the second presentation. The results showed that during the early processing stage, the critical words were read for longer times when they were changed in the hierarchy-high and the linear conditions, but not in the hierarchy-low condition. During the late processing stage, the changed-critical words were again found to induce longer reading times only when they were in the hierarchy-high condition. These results suggest that words in a discourse have better memory representation when they are located at the higher rather than at the lower level of the discourse hierarchy. Global discourse hierarchy is established as an important factor in constructing the mental representation of a discourse. PMID:26789002

  16. A Cross-Modal Perspective on the Relationships between Imagery and Working Memory

    PubMed Central

    Likova, Lora T.

    2013-01-01

    Mapping the distinctions and interrelationships between imagery and working memory (WM) remains challenging. Although each of these major cognitive constructs is defined and treated in various ways across studies, most accept that both imagery and WM involve a form of internal representation available to our awareness. In WM, there is a further emphasis on goal-oriented, active maintenance, and use of this conscious representation to guide voluntary action. Multicomponent WM models incorporate representational buffers, such as the visuo-spatial sketchpad, plus central executive functions. If there is a visuo-spatial “sketchpad” for WM, does imagery involve the same representational buffer? Alternatively, does WM employ an imagery-specific representational mechanism to occupy our awareness? Or do both constructs utilize a more generic “projection screen” of an amodal nature? To address these issues, in a cross-modal fMRI study, I introduce a novel Drawing-Based Memory Paradigm, and conceptualize drawing as a complex behavior that is readily adaptable from the visual to non-visual modalities (such as the tactile modality), which opens intriguing possibilities for investigating cross-modal learning and plasticity. Blindfolded participants were trained through our Cognitive-Kinesthetic Method (Likova, 2010a, 2012) to draw complex objects guided purely by the memory of felt tactile images. If this WM task had been mediated by transfer of the felt spatial configuration to the visual imagery mechanism, the response-profile in visual cortex would be predicted to have the “top-down” signature of propagation of the imagery signal downward through the visual hierarchy. Remarkably, the pattern of cross-modal occipital activation generated by the non-visual memory drawing was essentially the inverse of this typical imagery signature. 
The sole visual hierarchy activation was isolated to the primary visual area (V1), and accompanied by deactivation of the entire extrastriate cortex, thus 'cutting off' any signal propagation from/to V1 through the visual hierarchy. The implications of these findings for the debate on the interrelationships between the core cognitive constructs of WM and imagery and the nature of internal representations are evaluated. PMID:23346061

  17. The medial temporal lobe-conduit of parallel connectivity: a model for attention, memory, and perception.

    PubMed

    Mozaffari, Brian

    2014-01-01

    Based on the notion that the brain is equipped with a hierarchical organization, which embodies environmental contingencies across many time scales, this paper suggests that the medial temporal lobe (MTL)-located deep in the hierarchy-serves as a bridge connecting supra- to infra-MTL levels. Bridging the upper and lower regions of the hierarchy provides a parallel architecture that optimizes information flow between upper and lower regions to aid attention, encoding, and processing of quick, complex visual phenomena. Bypassing intermediate hierarchy levels, information conveyed through the MTL "bridge" allows upper levels to make educated predictions about the prevailing context and accordingly select lower representations to increase the efficiency of predictive coding throughout the hierarchy. This selection or activation/deactivation is associated with endogenous attention. In the event that these "bridge" predictions are inaccurate, this architecture enables the rapid encoding of novel contingencies. A review of hierarchical models in relation to memory is provided along with a new theory, Medial-temporal-lobe Conduit for Parallel Connectivity (MCPC). In this scheme, consolidation is considered as a secondary process, occurring after a MTL-bridged connection, which eventually allows upper and lower levels to access each other directly. With repeated reactivations, as contingencies become consolidated, less MTL activity is predicted. Finally, MTL bridging may aid processing of transient but structured perceptual events, by allowing communication between upper and lower levels without calling on intermediate levels of representation.

  18. A role for glucocorticoids in the long-term establishment of a social hierarchy.

    PubMed

    Timmer, Marjan; Sandi, Carmen

    2010-11-01

    Stress can affect the establishment and maintenance of social hierarchies. In the present study, we investigated the role of increasing corticosterone levels before or just after a first social encounter between two rats of a dyad in the establishment and the long-term maintenance of a social hierarchy. We show that pre-social encounter corticosterone treatment does not affect the outcome of the hierarchy during a first encounter, but induces a long-term memory for the hierarchy when the corticosterone-injected rat becomes dominant during the encounter, but not when it becomes subordinate. Post-social encounter corticosterone leads to a long-term maintenance of the hierarchy only when the subordinate rat of the dyad is injected with corticosterone. This corticosterone effect mimics previously reported actions of stress on the same model and, hence, implicates glucocorticoids in the consolidation of the memory for a recently established hierarchy. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Generating Adaptive Behaviour within a Memory-Prediction Framework

    PubMed Central

    Rawlinson, David; Kowadlo, Gideon

    2012-01-01

    The Memory-Prediction Framework (MPF) and its Hierarchical-Temporal Memory implementation (HTM) have been widely applied to unsupervised learning problems, for both classification and prediction. To date, there has been no attempt to incorporate MPF/HTM in reinforcement learning or other adaptive systems; that is, to use knowledge embodied within the hierarchy to control a system, or to generate behaviour for an agent. This problem is interesting because the human neocortex is believed to play a vital role in the generation of behaviour, and the MPF is a model of the human neocortex. We propose some simple and biologically-plausible enhancements to the Memory-Prediction Framework. These cause it to explore and interact with an external world, while trying to maximize a continuous, time-varying reward function. All behaviour is generated and controlled within the MPF hierarchy. The hierarchy develops from a random initial configuration by interaction with the world and reinforcement learning only. Among other demonstrations, we show that a 2-node hierarchy can learn to successfully play “rocks, paper, scissors” against a predictable opponent. PMID:22272231

  20. A memristor-based nonvolatile latch circuit

    NASA Astrophysics Data System (ADS)

    Robinett, Warren; Pickett, Matthew; Borghetti, Julien; Xia, Qiangfei; Snider, Gregory S.; Medeiros-Ribeiro, Gilberto; Williams, R. Stanley

    2010-06-01

    Memristive devices, which exhibit a dynamical conductance state that depends on the excitation history, can be used as nonvolatile memory elements by storing information as different conductance states. We describe the implementation of a nonvolatile synchronous flip-flop circuit that uses a nanoscale memristive device as the nonvolatile memory element. Controlled testing of the circuit demonstrated successful state storage and restoration, with an error rate of 0.1%, during 1000 power loss events. These results indicate that integration of digital logic devices and memristors could open the way for nonvolatile computation with applications in small platforms that rely on intermittent power sources. This demonstrated feasibility of tight integration of memristors with CMOS (complementary metal-oxide-semiconductor) circuitry challenges the traditional memory hierarchy, in which nonvolatile memory is only available as a large, slow, monolithic block at the bottom of the hierarchy. In contrast, the nonvolatile, memristor-based memory cell can be fast, fine-grained and small, and is compatible with conventional CMOS electronics. This threatens to upset the traditional memory hierarchy, and may open up new architectural possibilities beyond it.
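    The flip-flop scheme the abstract describes can be caricatured as a checkpoint/restore protocol: before power loss the volatile bit is written into a nonvolatile conductance state, and on power-up it is read back. This is a toy behavioral model with invented names, not the paper's circuit.

```python
# Toy behavioral model of a nonvolatile, memristor-backed latch.

class NonvolatileLatch:
    def __init__(self):
        self.q = 0  # volatile flip-flop state
        self.memristor = {"conductance": "low"}  # nonvolatile element

    def clock(self, d):
        """Normal synchronous operation: latch the data input."""
        self.q = d

    def power_down(self):
        """Checkpoint the bit as a conductance state, then lose volatile state."""
        self.memristor["conductance"] = "high" if self.q else "low"
        self.q = None  # power is gone: volatile state is lost

    def power_up(self):
        """Restore the bit from the surviving conductance state."""
        self.q = 1 if self.memristor["conductance"] == "high" else 0
```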

  1. Stress amplifies memory for social hierarchy.

    PubMed

    Cordero, María Isabel; Sandi, Carmen

    2007-11-01

    Individuals differ in their social status and societies in the extent of social status differences among their members. There is great interest in understanding the key factors that contribute to the establishment of social dominance structures. Given that stress can affect behavior and cognition, we hypothesized that, given equal opportunities to become either dominant or submissive, stress experienced by one of the individuals during their first encounter would determine the long-term establishment of a social hierarchy by acting as a two-stage rocket: (1) by influencing the rank achieved after a social encounter and (2) by facilitating and/or promoting a long-term memory for the specific hierarchy. Using a novel model for the assessment of long-term dominance hierarchies in rats, we present here the first evidence supporting this hypothesis. In control conditions, the social rank established through a first interaction and food competition test between two male rats is not maintained when animals are confronted 1 week later. However, if one of the rats is stressed just before their first encounter, the dominance hierarchy developed on day 1 is still clearly observed 1 week later, with the stressed animal becoming submissive (i.e., loser in competition tests) in both social interactions. Our findings also allow us to propose that stress potentiates a hierarchy-linked recognition memory between "specific" individuals through mechanisms that involve de novo protein synthesis. These results implicate stress among the key mechanisms contributing to create social imbalance and highlight memory mechanisms as key mediators of stress-induced long-term establishment of social rank.

  2. Two-Hierarchy Entanglement Swapping for a Linear Optical Quantum Repeater

    NASA Astrophysics Data System (ADS)

    Xu, Ping; Yong, Hai-Lin; Chen, Luo-Kan; Liu, Chang; Xiang, Tong; Yao, Xing-Can; Lu, He; Li, Zheng-Da; Liu, Nai-Le; Li, Li; Yang, Tao; Peng, Cheng-Zhi; Zhao, Bo; Chen, Yu-Ao; Pan, Jian-Wei

    2017-10-01

    Quantum repeaters play a significant role in achieving long-distance quantum communication. In the past decades, tremendous effort has been devoted towards constructing a quantum repeater. As one of the crucial elements, entanglement has been created in different memory systems via entanglement swapping. The realization of j-hierarchy entanglement swapping, i.e., connecting quantum memory and further extending the communication distance, is important for implementing a practical quantum repeater. Here, we report the first demonstration of a fault-tolerant two-hierarchy entanglement swapping with linear optics using parametric down-conversion sources. In the experiment, the dominant or most probable noise terms in the one-hierarchy entanglement swapping, which are on the same order of magnitude as the desired state and prevent further entanglement connections, are automatically washed out by a proper design of the detection setting, and the communication distance can be extended. Given suitable quantum memory, our techniques can be directly applied to implementing an atomic ensemble based quantum repeater, and are of significant importance in scalable quantum information processing.

  3. Two-Hierarchy Entanglement Swapping for a Linear Optical Quantum Repeater.

    PubMed

    Xu, Ping; Yong, Hai-Lin; Chen, Luo-Kan; Liu, Chang; Xiang, Tong; Yao, Xing-Can; Lu, He; Li, Zheng-Da; Liu, Nai-Le; Li, Li; Yang, Tao; Peng, Cheng-Zhi; Zhao, Bo; Chen, Yu-Ao; Pan, Jian-Wei

    2017-10-27

    Quantum repeaters play a significant role in achieving long-distance quantum communication. In the past decades, tremendous effort has been devoted towards constructing a quantum repeater. As one of the crucial elements, entanglement has been created in different memory systems via entanglement swapping. The realization of j-hierarchy entanglement swapping, i.e., connecting quantum memory and further extending the communication distance, is important for implementing a practical quantum repeater. Here, we report the first demonstration of a fault-tolerant two-hierarchy entanglement swapping with linear optics using parametric down-conversion sources. In the experiment, the dominant or most probable noise terms in the one-hierarchy entanglement swapping, which are on the same order of magnitude as the desired state and prevent further entanglement connections, are automatically washed out by a proper design of the detection setting, and the communication distance can be extended. Given suitable quantum memory, our techniques can be directly applied to implementing an atomic ensemble based quantum repeater, and are of significant importance in scalable quantum information processing.

  4. Visual perception as retrospective Bayesian decoding from high- to low-level features

    PubMed Central

    Ding, Stephanie; Cueva, Christopher J.; Tsodyks, Misha; Qian, Ning

    2017-01-01

    When a stimulus is presented, its encoding is known to progress from low- to high-level features. How these features are decoded to produce perception is less clear, and most models assume that decoding follows the same low- to high-level hierarchy of encoding. There are also theories arguing for global precedence, reversed hierarchy, or bidirectional processing, but they are descriptive without quantitative comparison with human perception. Moreover, observers often inspect different parts of a scene sequentially to form overall perception, suggesting that perceptual decoding requires working memory, yet few models consider how working-memory properties may affect decoding hierarchy. We probed decoding hierarchy by comparing absolute judgments of single orientations and relative/ordinal judgments between two sequentially presented orientations. We found that lower-level, absolute judgments failed to account for higher-level, relative/ordinal judgments. However, when ordinal judgment was used to retrospectively decode memory representations of absolute orientations, striking aspects of absolute judgments, including the correlation and forward/backward aftereffects between two reported orientations in a trial, were explained. We propose that the brain prioritizes decoding of higher-level features because they are more behaviorally relevant, and more invariant and categorical, and thus easier to specify and maintain in noisy working memory, and that more reliable higher-level decoding constrains less reliable lower-level decoding. PMID:29073108
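    The retrospective-decoding idea can be illustrated with a toy Monte Carlo model (this is not the authors' model; the `decode` helper and all parameters are illustrative): noisy memory samples of two orientations are re-decoded conditional on the more reliable ordinal judgment, which pushes the two estimates apart.

    ```python
    import random
    import statistics

    def decode(o1, o2, noise=5.0, n=20000, seed=0):
        # Noisy working-memory samples of two orientations, re-decoded
        # retrospectively: keep only samples consistent with the ordinal
        # judgment "the second orientation was larger than the first".
        rng = random.Random(seed)
        keep1, keep2 = [], []
        for _ in range(n):
            m1 = rng.gauss(o1, noise)
            m2 = rng.gauss(o2, noise)
            if m2 > m1:
                keep1.append(m1)
                keep2.append(m2)
        return statistics.mean(keep1), statistics.mean(keep2)

    d1, d2 = decode(42.0, 45.0)
    # Conditioning on the ordinal report pushes the estimates apart,
    # qualitatively mirroring the repulsive aftereffects described above.
    assert d1 < 42.0 < 45.0 < d2
    ```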

  5. Locality Aware Concurrent Start for Stencil Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Gao, Guang R.; Manzano Franco, Joseph B.

    Stencil computations are at the heart of many physical simulations used in scientific codes. Thus, there exists a plethora of optimization efforts for this family of computations. Among these techniques, tiling techniques that allow concurrent start have proven to be very efficient in providing better performance for these critical kernels. Nevertheless, with many-core designs being the norm, these optimization techniques might not be able to fully exploit locality (both spatial and temporal) on multiple levels of the memory hierarchy without compromising parallelism. It is no longer true that the machine can be seen as a homogeneous collection of nodes with caches, main memory and an interconnect network. New architectural designs exhibit complex groupings of nodes, cores, threads, caches and memory connected by an ever-evolving network-on-chip design. These new designs may benefit greatly from carefully crafted schedules and groupings that encourage parallel actors (i.e., threads, cores or nodes) to be aware of the computational history of other actors in close proximity. In this paper, we provide an efficient tiling technique that allows hierarchical concurrent start for memory-hierarchy-aware tile groups. Each execution schedule and tile shape exploits the available parallelism, load balance and locality present in the given applications. We demonstrate our technique on the Intel Xeon Phi architecture with selected and representative stencil kernels. We show improvement ranging from 5.58% to 31.17% over existing state-of-the-art techniques.
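    The concurrent-start property can be shown with a minimal sketch (a 1D Jacobi stencil with plain spatial tiling, not the paper's hierarchical scheme; function names are hypothetical):

    ```python
    import numpy as np

    def jacobi_step(u):
        # One 3-point averaging sweep over the interior (fixed boundaries).
        v = u.copy()
        v[1:-1] = (u[:-2] + u[1:-1] + u[2:]) / 3.0
        return v

    def jacobi_tiled(u, tile=4):
        # The same sweep, one spatial tile at a time. Every tile reads only
        # the previous iterate, so all tiles are independent and could all
        # start concurrently -- the "concurrent start" property.
        v = u.copy()
        n = len(u)
        for lo in range(1, n - 1, tile):
            hi = min(lo + tile, n - 1)
            v[lo:hi] = (u[lo - 1:hi - 1] + u[lo:hi] + u[lo + 1:hi + 1]) / 3.0
        return v

    u = np.sin(np.arange(16.0))
    assert np.allclose(jacobi_step(u), jacobi_tiled(u))
    ```

    Hierarchy-aware schemes additionally shape tiles and group them so that tiles sharing data are scheduled close together in the cache hierarchy.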

  6. Memory Effects on Movement Behavior in Animal Foraging

    PubMed Central

    Bracis, Chloe; Gurarie, Eliezer; Van Moorter, Bram; Goodwin, R. Andrew

    2015-01-01

    An individual’s choices are shaped by its experience, a fundamental property of behavior important to understanding complex processes. Learning and memory are observed across many taxa and can drive behaviors, including foraging behavior. To explore the conditions under which memory provides an advantage, we present a continuous-space, continuous-time model of animal movement that incorporates learning and memory. Using simulation models, we evaluate the benefit memory provides across several types of landscapes with variable-quality resources and compare the memory model within a nested hierarchy of simpler models (behavioral switching and random walk). We find that memory almost always leads to improved foraging success, but that this effect is most marked in landscapes containing sparse, contiguous patches of high-value resources that regenerate relatively fast and are located in an otherwise devoid landscape. In these cases, there is a large payoff for finding a resource patch, due to size, value, or locational difficulty. While memory-informed search is difficult to differentiate from other factors using solely movement data, our results suggest that disproportionate spatial use of higher value areas, higher consumption rates, and consumption variability all point to memory influencing the movement direction of animals in certain ecosystems. PMID:26288228
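    A minimal discrete sketch of the idea (not the paper's continuous-space, continuous-time model; the landscape, payoffs and `forage` helper are illustrative): a forager that remembers the best patch found so far can be compared against a pure random walk.

    ```python
    import random

    def forage(memory, steps=300, seed=7):
        # Toy 1D landscape: one high-value patch at x = 10.
        resources = {10: 5.0}
        rng = random.Random(seed)
        pos, best, gained = 0, None, 0.0
        for _ in range(steps):
            if memory and best is not None:
                # Move toward (then sit on) the best patch remembered so far.
                pos += (best > pos) - (best < pos)
            else:
                pos += rng.choice((-1, 1))  # uninformed random walk
            if pos in resources:
                gained += resources[pos]
                best = pos
        return gained

    # Until the patch is first found the two strategies follow the same
    # random path, so the memory forager can never do worse.
    assert forage(memory=True) >= forage(memory=False)
    ```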

  7. Memory Effects on Movement Behavior in Animal Foraging.

    PubMed

    Bracis, Chloe; Gurarie, Eliezer; Van Moorter, Bram; Goodwin, R Andrew

    2015-01-01

    An individual's choices are shaped by its experience, a fundamental property of behavior important to understanding complex processes. Learning and memory are observed across many taxa and can drive behaviors, including foraging behavior. To explore the conditions under which memory provides an advantage, we present a continuous-space, continuous-time model of animal movement that incorporates learning and memory. Using simulation models, we evaluate the benefit memory provides across several types of landscapes with variable-quality resources and compare the memory model within a nested hierarchy of simpler models (behavioral switching and random walk). We find that memory almost always leads to improved foraging success, but that this effect is most marked in landscapes containing sparse, contiguous patches of high-value resources that regenerate relatively fast and are located in an otherwise devoid landscape. In these cases, there is a large payoff for finding a resource patch, due to size, value, or locational difficulty. While memory-informed search is difficult to differentiate from other factors using solely movement data, our results suggest that disproportionate spatial use of higher value areas, higher consumption rates, and consumption variability all point to memory influencing the movement direction of animals in certain ecosystems.

  8. Visual perception as retrospective Bayesian decoding from high- to low-level features.

    PubMed

    Ding, Stephanie; Cueva, Christopher J; Tsodyks, Misha; Qian, Ning

    2017-10-24

    When a stimulus is presented, its encoding is known to progress from low- to high-level features. How these features are decoded to produce perception is less clear, and most models assume that decoding follows the same low- to high-level hierarchy of encoding. There are also theories arguing for global precedence, reversed hierarchy, or bidirectional processing, but they are descriptive without quantitative comparison with human perception. Moreover, observers often inspect different parts of a scene sequentially to form overall perception, suggesting that perceptual decoding requires working memory, yet few models consider how working-memory properties may affect decoding hierarchy. We probed decoding hierarchy by comparing absolute judgments of single orientations and relative/ordinal judgments between two sequentially presented orientations. We found that lower-level, absolute judgments failed to account for higher-level, relative/ordinal judgments. However, when ordinal judgment was used to retrospectively decode memory representations of absolute orientations, striking aspects of absolute judgments, including the correlation and forward/backward aftereffects between two reported orientations in a trial, were explained. We propose that the brain prioritizes decoding of higher-level features because they are more behaviorally relevant, and more invariant and categorical, and thus easier to specify and maintain in noisy working memory, and that more reliable higher-level decoding constrains less reliable lower-level decoding. Published under the PNAS license.

  9. The medial temporal lobe—conduit of parallel connectivity: a model for attention, memory, and perception

    PubMed Central

    Mozaffari, Brian

    2014-01-01

    Based on the notion that the brain is equipped with a hierarchical organization, which embodies environmental contingencies across many time scales, this paper suggests that the medial temporal lobe (MTL)—located deep in the hierarchy—serves as a bridge connecting supra- to infra-MTL levels. Bridging the upper and lower regions of the hierarchy provides a parallel architecture that optimizes information flow between upper and lower regions to aid attention, encoding, and processing of quick, complex visual phenomena. Bypassing intermediate hierarchy levels, information conveyed through the MTL “bridge” allows upper levels to make educated predictions about the prevailing context and accordingly select lower representations to increase the efficiency of predictive coding throughout the hierarchy. This selection or activation/deactivation is associated with endogenous attention. In the event that these “bridge” predictions are inaccurate, this architecture enables the rapid encoding of novel contingencies. A review of hierarchical models in relation to memory is provided along with a new theory, Medial-temporal-lobe Conduit for Parallel Connectivity (MCPC). In this scheme, consolidation is considered a secondary process, occurring after an MTL-bridged connection, which eventually allows upper and lower levels to access each other directly. With repeated reactivations, as contingencies become consolidated, less MTL activity is predicted. Finally, MTL bridging may aid the processing of transient but structured perceptual events, by allowing communication between upper and lower levels without calling on intermediate levels of representation. PMID:25426036

  10. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    PubMed

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by consolidating event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  11. Dynamic Organization of Hierarchical Memories

    PubMed Central

    Kurikawa, Tomoki; Kaneko, Kunihiko

    2016-01-01

    In the brain, external objects are categorized in a hierarchical way. Although it is widely accepted that objects are represented as static attractors in neural state space, this view does not take into account the interaction between intrinsic neural dynamics and external input, which is essential to understanding how the neural system responds to inputs. Indeed, structured spontaneous neural activity is known to exist in the absence of external inputs, and its relationship with evoked activity is debated. How categorical representations are embedded in spontaneous and evoked activity therefore has to be uncovered. To address this question, we studied the bifurcation process under increasing input after hierarchically clustered associative memories are learned. We found a “dynamic categorization”: without input, neural activity wanders globally over the state space including all memories. Then, as input strength increases, the diffuse representation of a higher category transitions to focused representations specific to each object. The hierarchy of memories is embedded in the transition probabilities from one memory to another during the spontaneous dynamics. With increased input strength, neural activity wanders over a narrower state space including a smaller set of memories, showing a more specific category or memory corresponding to the applied input. Moreover, such coarse-to-fine transitions are also observed temporally during the transient process under constant input, which agrees with experimental findings in the temporal cortex. These results suggest that the hierarchy emerging through interaction with an external input underlies the hierarchy seen during the transient process, as well as in the spontaneous activity. PMID:27618549

  12. Reactive Goal Decomposition Hierarchies for On-Board Autonomy

    NASA Astrophysics Data System (ADS)

    Hartmann, L.

    2002-01-01

    As our experience grows, space missions and systems are expected to address ever more complex and demanding requirements with fewer resources (e.g., mass, power, budget). One approach to accommodating these higher expectations is to increase the level of autonomy to improve the capabilities and robustness of on-board systems and to simplify operations. The goal decomposition hierarchies described here provide a simple but powerful form of goal-directed behavior that is relatively easy to implement for space systems. A goal corresponds to a state or condition that an operator of the space system would like to bring about. In the system described here, goals are decomposed into simpler subgoals until the subgoals are simple enough to execute directly. For each goal there is an activation condition and a set of decompositions. The decompositions correspond to different ways of achieving the higher level goal. Each decomposition contains a gating condition and a set of subgoals to be "executed" sequentially or in parallel. The gating conditions are evaluated in order, and for the first one that is true, the corresponding decomposition is executed in order to achieve the higher level goal. The activation condition specifies global conditions (i.e., for all decompositions of the goal) that need to hold in order for the goal to be achieved. In real time, parameters and state information are passed between goals and subgoals in the decomposition; a termination indication (success, failure, degree) is passed up when a decomposition finishes executing. The lowest level decompositions include servo control loops and finite state machines for generating control signals and sequencing I/O. Semaphores and shared memory are used to synchronize and coordinate decompositions that execute in parallel. The goal decomposition hierarchy is reactive in that the generated behavior is sensitive to the real-time state of the system and the environment.
That is, the system is able to react to state and environment and in general can terminate the execution of a decomposition and attempt a new decomposition at any level in the hierarchy. This goal decomposition system is suitable for workstation, microprocessor and FPGA implementation and thus is able to support the full range of prototyping activities, from mission design in the laboratory to development of the FPGA firmware for the flight system. This approach is based on previous artificial intelligence work including (1) Brooks' subsumption architecture for robot control, (2) Firby's Reactive Action Package System (RAPS) for mediating between high level automated planning and low level execution and (3) hierarchical task networks for automated planning. Reactive goal decomposition hierarchies can be used for a wide variety of on-board autonomy applications including automating low level operation sequences (such as scheduling prerequisite operations, e.g., heaters, warm-up periods, monitoring power constraints), coordinating multiple spacecraft as in formation flying and constellations, robot manipulator operations, rendezvous, docking, servicing, assembly, on-orbit maintenance, planetary rover operations, solar system and interstellar probes, intelligent science data gathering and disaster early warning. Goal decomposition hierarchies can support high level fault tolerance. Given models of on-board resources and goals to accomplish, the decomposition hierarchy could allocate resources to goals, taking existing faults into account and reallocating resources in real time as new faults arise. Resources to be modeled include memory (e.g., ROM, FPGA configuration memory, processor memory, payload instrument memory), processors, on-board and interspacecraft network nodes and links, sensors, actuators (e.g., attitude determination and control, guidance and navigation) and payload instruments. 
A goal decomposition hierarchy could be defined to map mission goals and tasks to available on-board resources. As faults occur and are detected, the resource allocation is modified to avoid using the faulty resource. Goal decomposition hierarchies can implement variable autonomy (in which the operator chooses to command the system at a high or low level), mixed-initiative planning (in which the system is able to interact with the operator, e.g., to request operator intervention when a working envelope is exceeded) and distributed control (in which, for example, multiple spacecraft cooperate to accomplish a task without a fixed master). The full paper will describe in greater detail how goal decompositions work, how they can be implemented, techniques for implementing a candidate application and the current state of the FPGA implementation.
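    The activation/gating scheme described above can be sketched as follows (a hypothetical minimal implementation; the class and function names are ours, and the full system's parallel execution, parameter passing and termination degrees are omitted):

    ```python
    class Decomposition:
        def __init__(self, gate, subgoals):
            self.gate = gate          # gating condition: state -> bool
            self.subgoals = subgoals  # executed sequentially in this sketch

    class Goal:
        def __init__(self, name, activation, decompositions=(), action=None):
            self.name = name
            self.activation = activation          # global condition for this goal
            self.decompositions = decompositions  # alternative ways to achieve it
            self.action = action                  # leaf goals execute directly

        def execute(self, state):
            if not self.activation(state):
                return False
            if self.action:                       # leaf: act on the state
                self.action(state)
                return True
            # Gates are evaluated in order; the first true one is executed,
            # so behavior reacts to the current state.
            for d in self.decompositions:
                if d.gate(state):
                    return all(g.execute(state) for g in d.subgoals)
            return False

    # Toy example: warm up a heater before taking a measurement.
    state = {"temp": 10, "measured": False}
    warm = Goal("warm", lambda s: True, action=lambda s: s.update(temp=25))
    measure = Goal("measure", lambda s: s["temp"] >= 20,
                   action=lambda s: s.update(measured=True))
    observe = Goal("observe", lambda s: True, decompositions=[
        Decomposition(lambda s: s["temp"] < 20, [warm, measure]),
        Decomposition(lambda s: True, [measure]),
    ])
    assert observe.execute(state) and state["measured"]
    ```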

  13. Fast maximum intensity projections of large medical data sets by exploiting hierarchical memory architectures.

    PubMed

    Kiefer, Gundolf; Lehmann, Helko; Weese, Jürgen

    2006-04-01

    Maximum intensity projections (MIPs) are an important visualization technique for angiographic data sets. Efficient data inspection requires frame rates of at least five frames per second at preserved image quality. Despite the advances in computer technology, this task remains a challenge. On the one hand, the sizes of computed tomography and magnetic resonance images are increasing rapidly. On the other hand, rendering algorithms do not automatically benefit from the advances in processor technology, especially for large data sets. This is due to the faster evolving processing power and the slower evolving memory access speed, which is bridged by hierarchical cache memory architectures. In this paper, we investigate memory access optimization methods and use them for generating MIPs on general-purpose central processing units (CPUs) and graphics processing units (GPUs), respectively. These methods can work on any level of the memory hierarchy, and we show that properly combined methods can optimize memory access on multiple levels of the hierarchy at the same time. We present performance measurements to compare different algorithm variants and illustrate the influence of the respective techniques. On current hardware, the efficient handling of the memory hierarchy for CPUs improves the rendering performance by a factor of 3 to 4. On GPUs, we observed that the effect is even larger, especially for large data sets. The methods can easily be adjusted to different hardware specifics, although their impact can vary considerably. They can also be used for other rendering techniques than MIPs, and their use for more general image processing task could be investigated in the future.
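    One of the memory-access ideas, processing the volume in slabs so the working set stays cache-resident, can be sketched as follows (an illustrative NumPy stand-in, not the authors' CPU/GPU implementations):

    ```python
    import numpy as np

    def mip_blocked(volume, axis=0, block=8):
        # Maximum intensity projection computed slab by slab, so that each
        # slab's working set can stay resident in a given cache level.
        out = None
        n = volume.shape[axis]
        for lo in range(0, n, block):
            idx = np.arange(lo, min(lo + block, n))
            m = np.take(volume, idx, axis=axis).max(axis=axis)
            out = m if out is None else np.maximum(out, m)
        return out

    rng = np.random.default_rng(0)
    vol = rng.random((70, 16, 16))
    assert np.allclose(mip_blocked(vol), vol.max(axis=0))
    ```

    Because the maximum is associative, the slab size can be tuned independently at each level of the memory hierarchy without changing the result.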

  14. Exploring Machine Learning Techniques For Dynamic Modeling on Future Exascale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Shuaiwen; Tallent, Nathan R.; Vishnu, Abhinav

    2013-09-23

    Future exascale systems must be optimized for both power and performance at scale in order to achieve DOE’s goal of a sustained petaflop within 20 Megawatts by 2022 [1]. Massive parallelism of the future systems combined with complex memory hierarchies will form a barrier to efficient application and architecture design. These challenges are exacerbated with emerging complex architectures such as GPGPUs and Intel Xeon Phi as parallelism increases orders of magnitude and system power consumption can easily triple or quadruple. Therefore, we need techniques that can reduce the search space for optimization, isolate power-performance bottlenecks, identify root causes for software/hardware inefficiency, and effectively direct runtime scheduling.

  15. Extreme-scale Algorithms and Solver Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch in such a way that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.

  16. Does formal complexity reflect cognitive complexity? Investigating aspects of the Chomsky Hierarchy in an artificial language learning study.

    PubMed

    Öttl, Birgit; Jäger, Gerhard; Kaup, Barbara

    2015-01-01

    This study investigated whether formal complexity, as described by the Chomsky Hierarchy, corresponds to cognitive complexity during language learning. According to the Chomsky Hierarchy, nested dependencies (context-free) are less complex than cross-serial dependencies (mildly context-sensitive). In two artificial grammar learning (AGL) experiments participants were presented with a language containing either nested or cross-serial dependencies. A learning effect for both types of dependencies could be observed, but no difference between dependency types emerged. These behavioral findings do not seem to reflect complexity differences as described in the Chomsky Hierarchy. This study extends previous findings in demonstrating learning effects for nested and cross-serial dependencies with more natural stimulus materials in a classical AGL paradigm after only one hour of exposure. The current findings can be taken as a starting point for further exploring the degree to which the Chomsky Hierarchy reflects cognitive processes.
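    The two dependency patterns can be made concrete with string checkers (illustrative only; recognizing strings in Python says nothing about the grammar class needed to generate them): nested dependencies follow a^n b^n, while cross-serial dependencies follow a^n b^m c^n d^m.

    ```python
    import re

    def is_nested(s):
        # Nested (context-free) dependencies: a^n b^n, e.g. "aabb",
        # where the i-th "a" pairs with the (n-i+1)-th "b".
        m = re.fullmatch(r"(a*)(b*)", s)
        return bool(m) and len(m.group(1)) == len(m.group(2)) > 0

    def is_cross_serial(s):
        # Cross-serial (mildly context-sensitive) dependencies:
        # a^n b^m c^n d^m, where the i-th "a" pairs with the i-th "c".
        m = re.fullmatch(r"(a*)(b*)(c*)(d*)", s)
        return (bool(m)
                and len(m.group(1)) == len(m.group(3)) > 0
                and len(m.group(2)) == len(m.group(4)) > 0)

    assert is_nested("aaabbb") and not is_nested("aabbb")
    assert is_cross_serial("aabccd") and not is_cross_serial("aabcd")
    ```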

  17. Does Formal Complexity Reflect Cognitive Complexity? Investigating Aspects of the Chomsky Hierarchy in an Artificial Language Learning Study

    PubMed Central

    Öttl, Birgit; Jäger, Gerhard; Kaup, Barbara

    2015-01-01

    This study investigated whether formal complexity, as described by the Chomsky Hierarchy, corresponds to cognitive complexity during language learning. According to the Chomsky Hierarchy, nested dependencies (context-free) are less complex than cross-serial dependencies (mildly context-sensitive). In two artificial grammar learning (AGL) experiments participants were presented with a language containing either nested or cross-serial dependencies. A learning effect for both types of dependencies could be observed, but no difference between dependency types emerged. These behavioral findings do not seem to reflect complexity differences as described in the Chomsky Hierarchy. This study extends previous findings in demonstrating learning effects for nested and cross-serial dependencies with more natural stimulus materials in a classical AGL paradigm after only one hour of exposure. The current findings can be taken as a starting point for further exploring the degree to which the Chomsky Hierarchy reflects cognitive processes. PMID:25885790

  18. Novices and Experts in Geoinformatics: the Cognitive Gap.

    NASA Astrophysics Data System (ADS)

    Zhilin, M.

    2012-04-01

    Modern geoinformatics is an extremely powerful tool for problem analysis and decision making in various fields. Currently, the general public uses geoinformatics predominantly for navigation (GPS) and for sharing information about particular places (GoogleMaps, Wikimapia). Communities also use geoinformatics for particular purposes: history enthusiasts use it to align historical and present-day maps (www.retromap.ru), birdwatchers mark places where they have seen birds (geobirds.com/rangemaps), etc. However, many stakeholders, such as local authorities, are not aware of the advantages and possibilities of geoinformatics. The same problem is observed for students. At the same time, many professional geoinformatic tools have been developed, but sometimes experts cannot even explain their purpose to non-experts. So the question is how to narrow the gap between experts and non-experts in the understanding and application of geoinformatics. We think that this gap has a cognitive basis. According to modern cognitive theories (Shiffrin-Atkinson and subsequent models), information first has to pass through a perceptual filter that cuts off information that seems to be irrelevant. The mind estimates relevance implicitly (unconsciously), based on previous knowledge and judgments about what is important. The information then comes to working memory, which is used (a) for processing and (b) for problem solving. Working memory has limited capacity and can operate with only about seven objects simultaneously. Information then passes to long-term memory, which has unlimited capacity. There it is stored in more or less complex structures with associative links. When necessary, it is extracted back into working memory. If a great amount of information is linked (“chunked”), working memory operates with it as one object of the seven, thus overcoming the limitations of working memory capacity. 
To be adopted, information should (a) pass through the perceptual filter, (b) not overload working memory and (c) be structured in long-term memory. Experts easily adopt domain-specific information because they (a) understand the terminology and consider the information important, thus passing it through the perceptual filter, and (b) have many complex domain-specific chunks that are processed by working memory as a whole, thus avoiding overload. Novices (students and the general public) have neither the understanding and sense of importance nor the necessary chunks. The following measures should be taken to bridge experts' and novices' understanding of geoinformatics. The expert community should popularize geoscientific problems by developing understandable language and accessible tools for solving them. This requires close collaboration with the educational system (especially secondary education). If students understand a problem, they can find and apply an appropriate tool for it. Geoscientific problems and models are extremely complex. In cognitive terms, they require a hierarchy of chunks. This hierarchy should develop coherently, beginning with simple chunks and later joining them into complex ones. It requires an appropriate sequence of learning tasks. Correct solutions are not essential; students should understand how the problems are solved and realize the limitations of the models. We think that tasks such as weather forecasting and global climate modeling are suitable. The first step in bridging experts and novices is the elaboration of a set and sequence of learning tasks, as well as tools for their solution. The tools should be easy for everybody who understands the task and as versatile as possible; otherwise students will waste a lot of time mastering them. This development requires close collaboration between geoscientists and educators.

  19. CHAMPION: Intelligent Hierarchical Reasoning Agents for Enhanced Decision Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohimer, Ryan E.; Greitzer, Frank L.; Noonan, Christine F.

    2011-11-15

    We describe the design and development of an advanced reasoning framework employing semantic technologies, organized within a hierarchy of computational reasoning agents that interpret domain specific information. Designed based on an inspirational metaphor of the pattern recognition functions performed by the human neocortex, the CHAMPION reasoning framework represents a new computational modeling approach that derives invariant knowledge representations through memory-prediction belief propagation processes that are driven by formal ontological language specification and semantic technologies. The CHAMPION framework shows promise for enhancing complex decision making in diverse problem domains including cyber security, nonproliferation and energy consumption analysis.

  20. Research on Contextual Memorizing of Meaning in Foreign Language Vocabulary

    ERIC Educational Resources Information Center

    Xu, Linjing; Xiong, Qingxia; Qin, Yufang

    2018-01-01

    The purpose of this study was to examine the role of contexts in the memory of meaning in foreign vocabularies. The study was based on the cognitive processing hierarchy theory of Craik and Lockhart (1972), the memory trace theory of McClelland and Rumelhart (1986) and the memory trace theory of cognitive psychology. The subjects were non-English…

  1. Spatial resolution in visual memory.

    PubMed

    Ben-Shalom, Asaf; Ganel, Tzvi

    2015-04-01

    Representations in visual short-term memory are considered to contain relatively elaborated information on object structure. Conversely, representations in earlier stages of the visual hierarchy are thought to be dominated by a sensory-based, feed-forward buildup of information. In four experiments, we compared the spatial resolution of different object properties between two points in time along the processing hierarchy in visual short-term memory. Subjects were asked either to estimate the distance between objects or to estimate the size of one of the objects' features under two experimental conditions, of either a short or a long delay period between the presentation of the target stimulus and the probe. When different objects were referred to, similar spatial resolution was found for the two delay periods, suggesting that initial processing stages are sensitive to object-based properties. Conversely, superior resolution was found for the short, as compared with the long, delay when features were referred to. These findings suggest that initial representations in visual memory are hybrid in that they allow fine-grained resolution for object features alongside normal visual sensitivity to the segregation between objects. The findings are also discussed in reference to the distinction made in earlier studies between visual short-term memory and iconic memory.

  2. Memory hierarchy using row-based compression

    DOEpatents

    Loh, Gabriel H.; O'Connor, James M.

    2016-10-25

    A system includes a first memory and a device coupleable to the first memory. The device includes a second memory to cache data from the first memory. The second memory includes a plurality of rows, each row including a corresponding set of compressed data blocks of non-uniform sizes and a corresponding set of tag blocks. Each tag block represents a corresponding compressed data block of the row. The device further includes decompression logic to decompress data blocks accessed from the second memory. The device further includes compression logic to compress data blocks to be stored in the second memory.
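    A toy software model of the described row organization (hypothetical names; zlib stands in for the hardware compression logic): each row packs compressed blocks of non-uniform size and keeps a tag per block recording where its compressed bytes live.

    ```python
    import zlib

    class CacheRow:
        def __init__(self):
            self.tags = {}   # tag -> (offset, length) of its compressed block
            self.data = b""  # packed compressed blocks of non-uniform size

        def store(self, tag, block: bytes):
            comp = zlib.compress(block)               # "compression logic"
            self.tags[tag] = (len(self.data), len(comp))
            self.data += comp

        def load(self, tag) -> bytes:
            off, n = self.tags[tag]
            return zlib.decompress(self.data[off:off + n])  # "decompression logic"

    row = CacheRow()
    row.store(0x10, b"A" * 64)          # highly compressible -> small block
    row.store(0x11, bytes(range(64)))   # less compressible -> larger block
    assert row.load(0x10) == b"A" * 64
    ```

    The point of the non-uniform layout is that compressible blocks free up row space, so a row can cache more blocks than a fixed-size layout would allow.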

  3. The rules of implicit evaluation by race, religion, and age.

    PubMed

    Axt, Jordan R; Ebersole, Charles R; Nosek, Brian A

    2014-09-01

    The social world is stratified. Social hierarchies are known but often disavowed as anachronisms or unjust. Nonetheless, hierarchies may persist in social memory. In three studies (total N > 200,000), we found evidence of social hierarchies in implicit evaluation by race, religion, and age. Participants implicitly evaluated their own racial group most positively and the remaining racial groups in accordance with the following hierarchy: Whites > Asians > Blacks > Hispanics. Similarly, participants implicitly evaluated their own religion most positively and the remaining religions in accordance with the following hierarchy: Christianity > Judaism > Hinduism or Buddhism > Islam. In a final study, participants of all ages implicitly evaluated age groups following this rule: children > young adults > middle-age adults > older adults. These results suggest that the rules of social evaluation are pervasively embedded in culture and mind. © The Author(s) 2014.

  4. Complexity measures to track the evolution of a SNOMED hierarchy.

    PubMed

    Wei, Duo; Wang, Yue; Perl, Yehoshua; Xu, Junchuan; Halper, Michael; Spackman, Kent A

    2008-11-06

    SNOMED CT is an extensive terminology with an attendant amount of complexity. Two measures are proposed for quantifying that complexity. Both are based on abstraction networks, called the area taxonomy and the partial-area taxonomy, that provide, for example, distributions of the relationships within a SNOMED hierarchy. The complexity measures are employed specifically to track the complexity of versions of the Specimen hierarchy of SNOMED before and after it is put through an auditing process. The pre-audit and post-audit versions are compared. The results show that the auditing process indeed leads to a simplification of the terminology's structure.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Kai; Song, Linze; Shi, Qiang, E-mail: qshi@iccas.ac.cn

    Based on the path integral approach, we derive a new realization of the exact non-Markovian stochastic Schrödinger equation (SSE). The main difference from the previous non-Markovian quantum state diffusion (NMQSD) method is that the complex Gaussian stochastic process used for the forward propagation of the wave function is correlated, which may be used to reduce the amplitude of the non-Markovian memory term at high temperatures. The new SSE is then written into the recently developed hierarchy of pure states scheme, in a form that is more closely related to the hierarchical equation of motion approach. Numerical simulations are then performed to demonstrate the efficiency of the new method.

  6. Using arborescences to estimate hierarchicalness in directed complex networks

    PubMed Central

    2018-01-01

    Complex networks are a useful tool for the understanding of complex systems. One of the emerging properties of such systems is their tendency to form hierarchies: networks can be organized in levels, with nodes in each level exerting control on the ones beneath them. In this paper, we focus on the problem of estimating how hierarchical a directed network is. We propose a structural argument: a network has a strong top-down organization if we need to delete only few edges to reduce it to a perfect hierarchy—an arborescence. In an arborescence, all edges point away from the root and there are no horizontal connections, both characteristics we desire in our idealization of what a perfect hierarchy requires. We test our arborescence score in synthetic and real-world directed networks against the current state of the art in hierarchy detection: agony, flow hierarchy and global reaching centrality. These tests highlight that our arborescence score is intuitive and we can visualize it; it is able to better distinguish between networks with and without a hierarchical structure; it agrees the most with the literature about the hierarchy of well-studied complex systems; and it is not just a score, but it provides an overall scheme of the underlying hierarchy of any directed complex network. PMID:29381761
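The core idea — score a network by how few edges must be deleted to leave a spanning arborescence — can be approximated with a BFS tree. A toy sketch (not the authors' published algorithm; the BFS tree is just one easily computed arborescence, so this is a simple proxy for the score):

```python
from collections import deque

def arborescence_score(edges, root):
    """Fraction of edges kept when the graph is reduced to a BFS spanning
    arborescence rooted at `root`: the fewer edges we must delete, the
    more top-down (hierarchical) the network."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    seen = {root}
    tree_edges = 0
    q = deque([root])
    while q:
        u = q.popleft()
        for v in adj.get(u, []):
            if v not in seen:          # keep only one incoming edge per node
                seen.add(v)
                tree_edges += 1
                q.append(v)
    return tree_edges / len(edges) if edges else 1.0
```

A perfect tree scores 1.0; horizontal or redundant incoming edges lower the score, matching the intuition that they must be deleted to recover a pure hierarchy.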

  7. Diverse Heterologous Primary Infections Radically Alter Immunodominance Hierarchies and Clinical Outcomes Following H7N9 Influenza Challenge in Mice

    PubMed Central

    Duan, Susu; Meliopoulos, Victoria A.; McClaren, Jennifer L.; Guo, Xi-Zhi J.; Sanders, Catherine J.; Smallwood, Heather S.; Webby, Richard J.; Schultz-Cherry, Stacey L.; Doherty, Peter C.; Thomas, Paul G.

    2015-01-01

    The recent emergence of a novel H7N9 influenza A virus (IAV) causing severe human infections in China raises concerns about a possible pandemic. The lack of pre-existing neutralizing antibodies in the broader population highlights the potential protective role of IAV-specific CD8+ cytotoxic T lymphocyte (CTL) memory specific for epitopes conserved between H7N9 and previously encountered IAVs. In the present study, the heterosubtypic immunity generated by prior H9N2 or H1N1 infections significantly, but variably, reduced morbidity and mortality, pulmonary virus load and time to clearance in mice challenged with the H7N9 virus. In all cases, the recall of established CTL memory was characterized by earlier, greater airway infiltration of effectors targeting the conserved or cross-reactive H7N9 IAV peptides; though, depending on the priming IAV, each case was accompanied by distinct CTL epitope immunodominance hierarchies for the prominent KbPB1703, DbPA224, and DbNP366 epitopes. While the presence of conserved, variable, or cross-reactive epitopes between the priming H9N2 and H1N1 and the challenge H7N9 IAVs clearly influenced any change in the immunodominance hierarchy, the changing patterns were not tied solely to epitope conservation. Furthermore, the total size of the IAV-specific memory CTL pool after priming was a better predictor of favorable outcomes than the extent of epitope conservation or secondary CTL expansion. Modifying the size of the memory CTL pool significantly altered its subsequent protective efficacy on disease severity or virus clearance, confirming the important role of heterologous priming. These findings establish that both the protective efficacy of heterosubtypic immunity and CTL immunodominance hierarchies are reflective of the immunological history of the host, a finding that has implications for understanding human CTL responses and the rational design of CTL-mediated vaccines. PMID:25668410

  8. An approach to separating the levels of hierarchical structure building in language and mathematics.

    PubMed

    Makuuchi, Michiru; Bahlmann, Jörg; Friederici, Angela D

    2012-07-19

    We aimed to dissociate two levels of hierarchical structure building in language and mathematics, namely 'first-level' (the build-up of hierarchical structure with externally given elements) and 'second-level' (the build-up of hierarchical structure with internally represented elements produced by first-level processes). Using functional magnetic resonance imaging, we investigated these processes in three domains: sentence comprehension, arithmetic calculation (using Reverse Polish notation, which gives two operands followed by an operator) and a working memory control task. All tasks required the build-up of hierarchical structures at the first- and second-level, resulting in a similar computational hierarchy across language and mathematics, as well as in a working memory control task. Using a novel method that estimates the difference in the integration cost for conditions of different trial durations, we found an anterior-to-posterior functional organization in the prefrontal cortex, according to the level of hierarchy. Common to all domains, the ventral premotor cortex (PMv) supports first-level hierarchy building, while the dorsal pars opercularis (POd) subserves second-level hierarchy building, with lower activation for language compared with the other two tasks. These results suggest that the POd and the PMv support domain-general mechanisms for hierarchical structure building, with the POd being uniquely efficient for language.
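The Reverse Polish notation used in the arithmetic task builds hierarchy by making operands available before their operator; a minimal stack evaluator illustrates why no parentheses are needed:

```python
def eval_rpn(tokens):
    """Evaluate Reverse Polish notation: operands are pushed on a stack,
    and each operator pops its two operands and pushes the result."""
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b,
           "/": lambda a, b: a / b}
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()   # second operand was pushed last
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack.pop()
```

For example, `eval_rpn("5 1 2 + 4 * + 3 -".split())` evaluates the nested expression 5 + (1 + 2) * 4 - 3 purely through stack order, which is what makes RPN a convenient probe for hierarchical structure building.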

  9. Semantic Memory Redux: An Experimental Test of Hierarchical Category Representation

    ERIC Educational Resources Information Center

    Murphy, Gregory L.; Hampton, James A.; Milovanovic, Goran S.

    2012-01-01

    Four experiments investigated the classic issue in semantic memory of whether people organize categorical information in hierarchies and use inference to retrieve information from them, as proposed by Collins and Quillian (1969). Past evidence has focused on RT to confirm sentences such as "All birds are animals" or "Canaries breathe." However,…

  10. The hierarchical and functional connectivity of higher-order cognitive mechanisms: neurorobotic model to investigate the stability and flexibility of working memory

    PubMed Central

    Alnajjar, Fady; Yamashita, Yuichi; Tani, Jun

    2013-01-01

    Higher-order cognitive mechanisms (HOCM), such as planning, cognitive branching, and switching, are known to be the outcomes of unique neural organization and dynamics between various regions of the frontal lobe. Although some recent anatomical and neuroimaging studies have shed light on the architecture underlying the formation of such mechanisms, the neural dynamics and pathways within and between frontal-lobe regions that form and/or tune the stability level of working memory remain controversial. A model to clarify this aspect is therefore required. In this study, we propose a simple neurocomputational model that suggests the basic concept of how HOCM, including cognitive branching and switching in particular, may mechanistically emerge from time-based neural interactions. The proposed model is constructed such that its functional and structural hierarchy mimics, to a certain degree, the biological hierarchy believed to exist between local regions in the frontal lobe. Thus, the hierarchy is attained not only through the layout of the neural connections but also through distinct types of neurons, each with different time properties. To validate the model, cognitive branching and switching tasks were simulated in a physical humanoid robot driven by the model. Results reveal that separation between the lower- and higher-level neurons in such a model is an essential factor in forming a working memory appropriate for handling cognitive branching and switching. Analysis of the results also illustrates that the breadth of this separation is important in determining the characteristics of the resulting memory, either static or dynamic. This work can be considered a joint effort between synthetic and empirical studies, which can open an alternative research area for better understanding of brain mechanisms. PMID:23423881

  12. CHIMERA: Top-down model for hierarchical, overlapping and directed cluster structures in directed and weighted complex networks

    NASA Astrophysics Data System (ADS)

    Franke, R.

    2016-11-01

    In many networks discovered in biology, medicine, neuroscience and other disciplines, special properties like a certain degree distribution and a hierarchical cluster structure (also called communities) can be observed as general organizing principles. Detecting the cluster structure of an unknown network promises to identify functional subdivisions, hierarchy and interactions on a mesoscale. Choosing an appropriate detection algorithm is not trivial because there are multiple network, cluster and algorithmic properties to be considered. Edges can be weighted and/or directed, and clusters can overlap or build a hierarchy in several ways. Algorithms differ not only in runtime and memory requirements but also in the network and cluster properties they allow; each is also based on a specific definition of what a cluster is. On the one hand, a comprehensive network creation model is needed to build a large variety of benchmark networks with different reasonable structures to compare algorithms. On the other hand, if a cluster structure is already known, it is desirable to separate the effects of this structure from other network properties. This can be done with null-model networks that mimic an observed cluster structure to improve statistics on other network features. A third important application is the general study of properties in networks with different cluster structures, possibly evolving over time. Good benchmark and creation models are currently available. What is still missing is a precise sandbox model to build hierarchical, overlapping and directed clusters for undirected or directed, binary or weighted complex random networks on the basis of a sophisticated blueprint. This gap is closed by the model CHIMERA (Cluster Hierarchy Interconnection Model for Evaluation, Research and Analysis), which is introduced and described here for the first time.

  13. PODIO: An Event-Data-Model Toolkit for High Energy Physics Experiments

    NASA Astrophysics Data System (ADS)

    Gaede, F.; Hegner, B.; Mato, P.

    2017-10-01

    PODIO is a C++ library that supports the automatic creation of event data models (EDMs) and efficient I/O code for HEP experiments. It is developed as a new EDM Toolkit for future particle physics experiments in the context of the AIDA2020 EU programme. Experience from LHC and the linear collider community shows that existing solutions partly suffer from overly complex data models with deep object-hierarchies or unfavorable I/O performance. The PODIO project was created in order to address these problems. PODIO is based on the idea of employing plain-old-data (POD) data structures wherever possible, while avoiding deep object-hierarchies and virtual inheritance. At the same time it provides the necessary high-level interface towards the developer physicist, such as the support for inter-object relations and automatic memory-management, as well as a Python interface. To simplify the creation of efficient data models PODIO employs code generation from a simple yaml-based markup language. In addition, it was developed with concurrency in mind in order to support the use of modern CPU features, for example giving basic support for vectorization techniques.

  14. T Cell Receptor-Major Histocompatibility Complex Interaction Strength Defines Trafficking and CD103+ Memory Status of CD8 T Cells in the Brain.

    PubMed

    Sanecka, Anna; Yoshida, Nagisa; Kolawole, Elizabeth Motunrayo; Patel, Harshil; Evavold, Brian D; Frickel, Eva-Maria

    2018-01-01

    T cell receptor-major histocompatibility complex (TCR-MHC) affinities span a wide range in a polyclonal T cell response, yet it is undefined how affinity shapes long-term properties of CD8 T cells during chronic infection with persistent antigen. Here, we investigate how the affinity of the TCR-MHC interaction shapes the phenotype of memory CD8 T cells in the chronically Toxoplasma gondii-infected brain. We employed CD8 T cells from three lines of transnuclear (TN) mice that harbor in their endogenous loci different T cell receptors specific for the same Toxoplasma antigenic epitope ROP7. The three TN CD8 T cell clones span a wide range of affinities to MHCI-ROP7. These three CD8 T cell clones have a distinct and fixed hierarchy in terms of effector function in response to the antigen, measured as proliferation capacity, trafficking, T cell maintenance, and memory formation. In particular, the T cell clone of lowest affinity does not home to the brain. The two higher-affinity T cell clones show differences in establishing resident-like memory populations (CD103+) in the brain, with the higher-affinity clone persisting longer in the host during chronic infection. Transcriptional profiling of naïve and activated ROP7-specific CD8 T cells revealed that Klf2, encoding a transcription factor that is known to be a negative marker for T cell trafficking, is upregulated in the activated lowest-affinity ROP7 clone. Our data thus suggest that TCR-MHC affinity dictates memory CD8 T cell fate at the site of infection.

  15. A general graphical user interface for automatic reliability modeling

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.

  16. MILC Code Performance on High End CPU and GPU Supercomputer Clusters

    NASA Astrophysics Data System (ADS)

    DeTar, Carleton; Gottlieb, Steven; Li, Ruizi; Toussaint, Doug

    2018-03-01

    With recent developments in parallel supercomputing architecture, many core, multi-core, and GPU processors are now commonplace, resulting in more levels of parallelism, memory hierarchy, and programming complexity. It has been necessary to adapt the MILC code to these new processors starting with NVIDIA GPUs, and more recently, the Intel Xeon Phi processors. We report on our efforts to port and optimize our code for the Intel Knights Landing architecture. We consider performance of the MILC code with MPI and OpenMP, and optimizations with QOPQDP and QPhiX. For the latter approach, we concentrate on the staggered conjugate gradient and gauge force. We also consider performance on recent NVIDIA GPUs using the QUDA library.

  17. Deep recurrent neural network reveals a hierarchy of process memory during dynamic natural vision.

    PubMed

    Shi, Junxing; Wen, Haiguang; Zhang, Yizhen; Han, Kuan; Liu, Zhongming

    2018-05-01

    The human visual cortex extracts both spatial and temporal visual features to support perception and guide behavior. Deep convolutional neural networks (CNNs) provide a computational framework to model cortical representation and organization for spatial visual processing, but they are unable to explain how the brain processes temporal information. To overcome this limitation, we extended a CNN by adding recurrent connections to different layers of the CNN to allow spatial representations to be remembered and accumulated over time. The extended model, or the recurrent neural network (RNN), embodied a hierarchical and distributed model of process memory as an integral part of visual processing. Unlike the CNN, the RNN learned spatiotemporal features from videos to enable action recognition. The RNN better predicted cortical responses to natural movie stimuli than the CNN at all visual areas, especially those along the dorsal stream. As a fully observable model of visual processing, the RNN also revealed a cortical hierarchy of temporal receptive windows, the dynamics of process memory, and spatiotemporal representations. These results support the hypothesis of process memory and demonstrate the potential of using the RNN for in-depth computational understanding of dynamic natural vision. © 2018 Wiley Periodicals, Inc.
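The basic mechanism — recurrent connections that let a layer's spatial features accumulate over time — can be sketched with a leaky integrator (a stand-in for the paper's trained RNN; `w_rec` here is a hypothetical fixed constant controlling the temporal receptive window):

```python
import numpy as np

def leaky_recurrent(features, w_rec=0.7):
    """Accumulate a sequence of per-frame feature vectors with a simple
    leaky recurrent update h_t = (1 - w_rec) * x_t + w_rec * h_{t-1}.
    Larger w_rec means a longer temporal receptive window, i.e. the
    layer 'remembers' spatial features further into the past."""
    h = np.zeros_like(features[0])
    history = []
    for x in features:
        h = (1.0 - w_rec) * x + w_rec * h
        history.append(h.copy())
    return np.stack(history)
```

Applying such an update at each layer of a feed-forward hierarchy, with different `w_rec` values per layer, yields the kind of graded hierarchy of temporal windows the abstract describes.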

  18. Binary mesh partitioning for cache-efficient visualization.

    PubMed

    Tchiboukdjian, Marc; Danjean, Vincent; Raffin, Bruno

    2010-01-01

    One important bottleneck when visualizing large data sets is the data transfer between processor and memory. Cache-aware (CA) and cache-oblivious (CO) algorithms take the memory hierarchy into consideration to design cache-efficient algorithms. CO approaches have the advantage of adapting to unknown and varying memory hierarchies. Recent CA and CO algorithms developed for 3D mesh layouts significantly improve the performance of previous approaches, but they lack theoretical performance guarantees. We present in this paper an O(N log N) algorithm to compute a CO layout for unstructured but well-shaped meshes. We prove that a coherent traversal of an N-size mesh in dimension d induces less than N/B + O(N/M^{1/d}) cache misses, where B and M are the block size and the cache size, respectively. Experiments show that our layout computation is faster and significantly less memory-consuming than the best known CO algorithm. Performance is comparable to this algorithm for classical visualization-algorithm access patterns, or better when the BSP tree produced while computing the layout is used as an acceleration data structure adjusted to the layout. We also show that cache-oblivious approaches lead to significant performance increases on recent GPU architectures.

  19. Segregation and persistence of form in the lateral occipital complex.

    PubMed

    Ferber, Susanne; Humphrey, G Keith; Vilis, Tutis

    2005-01-01

    While the lateral occipital complex (LOC) has been shown to be implicated in object recognition, it is unclear whether this brain area is responsive to low-level, stimulus-driven features or to high-level representational processes. We used scrambled shape-from-motion displays to disambiguate the presence of contours from figure-ground segregation and to measure the strength of the binding process for shapes without contours. We found persisting brain activation in the LOC for scrambled displays after the motion stopped, indicating that this brain area subserves and maintains figure-ground segregation processes, a low-level function in the object-processing hierarchy. In our second experiment, we found that the figure-ground segregation process has some form of spatial constancy, indicating top-down influences. The persisting activation after the motion stops suggests an intermediate role in object recognition processes for this brain area and might provide further evidence for the idea that the lateral occipital complex subserves mnemonic functions mediating between iconic and short-term memory.

  20. Potential implementation of reservoir computing models based on magnetic skyrmions

    NASA Astrophysics Data System (ADS)

    Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin

    2018-05-01

    Reservoir Computing is a type of recursive neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on utilizing memristor techniques to implement recursive neural networks. This paper examines the potential of magnetic skyrmion fabrics, and the complex current patterns which form in them, as an attractive physical instantiation for Reservoir Computing. We argue that their nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for an effective and energy-efficient nonlinear processing of spatio-temporal events with the aim of event recognition and prediction.
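The paradigm's key property — the reservoir is fixed and random, and only a linear readout is trained — is what allows a physical substrate such as a skyrmion fabric to serve as the reservoir. A minimal echo-state-style sketch (all sizes and parameters are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(inputs, n_res=50, rho=0.9):
    # Fixed random reservoir: never trained, so it could equally be a
    # physical medium with rich internal dynamics.
    w_in = rng.normal(size=n_res)
    w = rng.normal(size=(n_res, n_res))
    w *= rho / max(abs(np.linalg.eigvals(w)))  # set spectral radius to rho
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(w @ x + w_in * u)  # nonlinear state update with memory
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    # The linear ridge-regression readout is the only trained component.
    s = states
    return np.linalg.solve(s.T @ s + ridge * np.eye(s.shape[1]), s.T @ targets)

# one-step memory task: reconstruct the previous input from the current state
u = rng.uniform(-1.0, 1.0, 300)
S = run_reservoir(u)
w_out = train_readout(S[1:], u[:-1])
pred = S[1:] @ w_out
```

The reservoir's recurrent feedback retains a fading trace of past inputs, so a purely linear readout can recover them — the memory functionality the abstract attributes to the nested feedback loops.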

  1. On the origins of hierarchy in complex networks

    PubMed Central

    Corominas-Murtra, Bernat; Goñi, Joaquín; Solé, Ricard V.; Rodríguez-Caso, Carlos

    2013-01-01

    Hierarchy seems to pervade complexity in both living and artificial systems. Despite its relevance, no general theory that captures all features of hierarchy and its origins has yet been proposed. Here we present a formal approach, resulting from the convergence of theoretical morphology and network theory, that allows constructing a 3D morphospace of hierarchies and hence comparing the hierarchical organization of ecological, cellular, technological, and social networks. Embedded within large voids in the morphospace of all possible hierarchies, four major groups are identified. Two of them match what is expected from random networks with similar connectivity, thus suggesting that nonadaptive factors are at work. Ecological and gene networks define the other two, indicating that their topological order is the result of functional constraints. These results are consistent with an exploration of the morphospace using in silico evolved networks. PMID:23898177

  2. [P. Janet's concept of the notion of time].

    PubMed

    Fouks, L; Guibert, S; Montot, M

    1988-10-01

    The authors primarily show how P. Janet, influenced by Bergson, describes the evolution of the human mind, its complexities and progressive hierarchies, from the reflex arc to the deferred arc which allows the emergence of feelings. The notion of time develops late; it enters with the group of feelings. It is interior and subjective, and its study is followed by analysis of the concepts of presence, absence, strain, and memory, which for P. Janet is essentially prospective, its essential act being narration. The notion of lived time is studied in the neurotics, whose horror of the present is put forward, and in the depressed; in the melancholia of waiting, where time does not fly; and in mania, through the "delighted ones", the "restless ones", and the delirious.

  3. Teaching Beginners to Program: Some Cognitive Considerations.

    ERIC Educational Resources Information Center

    Rogers, Jean B.

    Learning to program involves developing an understanding of two hierarchies of concepts. One hierarchy consists of data and extends from very literal data (which represents only itself) to very abstract data incorporating variable values in complex interrelationships. The other hierarchy consists of the operations performed on the data and extends…

  4. On Russian concepts of Soil Memory - expansion of Dokuchaev's pedological paradigm

    NASA Astrophysics Data System (ADS)

    Tsatskin, A.

    2012-04-01

    Having developed from Dokuchaev's research on chernozem soils on loess, the Russian school of pedology traditionally focused on soils as an essential component of the landscape. Dokuchaev's soil-landscape paradigm (SLP) was later considerably advanced and expanded by Hans Jenny to include surface soils on other continents. In the 1970s, Sokolov and Targulian in Russia introduced the new term of soil memory: the inherent ability of a soil to record in its morphology and properties the processes of earlier stages of its development. This understanding was built upon the ideas of soil organizational hierarchy and the different rates of specific soil processes proposed by Yaalon. Soil-memory terminology became particularly popular in Russia, as expressed in the 2008 multi-author monograph on soil memory. The Soil Memory book, edited by Targulian and Goryachkin and written by 34 authors, touches upon the following themes: general approaches (Section 1), mineral carriers of soil memory (Section 2), biological carriers of soil memory (Section 3), and anthropogenic soil memory (Section 4). The book presents an original account of new interdisciplinary projects on Russian soils and represents an important contribution to the classical Dokuchaev-Jenny SL paradigm. There is still controversy as to how the Russian term soil memory relates to the Western terms of soil as a record or archive of earlier events and processes during the time of soil formation. Targulian and Goryachkin agree that the terms are close, albeit not entirely interchangeable. They insist that soil memory may have a more comprehensive meaning, e.g., applicable to complex cases in which soil properties of currently ambiguous origin cannot provide valid environmental reconstructions or be dated by available techniques. In any case, terminology is not the main issue.
The Russian soil memory concept advances the frontiers of pedology by deepening the time-related soil functions and encouraging closer cooperation with isotope dating experts. This approach will hopefully help us all in better understanding, management and protection of the Earth's critical zone.

  5. Application of phase-change materials in memory taxonomy.

    PubMed

    Wang, Lei; Tu, Liang; Wen, Jing

    2017-01-01

    Phase-change materials are suitable for data storage because they exhibit reversible transitions between crystalline and amorphous states that have distinguishable electrical and optical properties. Consequently, these materials find applications in diverse memory devices ranging from conventional optical discs to emerging nanophotonic devices. Current research efforts are mostly devoted to phase-change random access memory, whereas the applications of phase-change materials in other types of memory devices are rarely reported. Here we review the physical principles of phase-change materials and devices, aiming to help researchers understand the concept of phase-change memory. We classify phase-change memory devices into phase-change optical discs, phase-change scanning probe memory, phase-change random access memory, and phase-change nanophotonic devices, according to their locations in the memory hierarchy. For each device type we discuss the physical principles in conjunction with merits and weaknesses for data storage applications. We also outline state-of-the-art technologies and future prospects.

  6. 3D hierarchical spatial representation and memory of multimodal sensory data

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Dow, Paul A.; Huber, David J.

    2009-04-01

This paper describes an efficient method and system for representing, processing, and understanding multi-modal sensory data. More specifically, it describes a computational method and system for processing and remembering multiple locations in multimodal sensory space (e.g., visual, auditory, somatosensory, etc.). The multimodal representation and memory are based on a biologically inspired hierarchy of spatial representations implemented with novel analogues of real representations used in the human brain. The novelty of the work lies in the computationally efficient and robust spatial representation of 3D locations in multimodal sensory space, as well as in an associated working memory for storage and recall of these representations at the desired level for goal-oriented action. We describe (1) a simple and efficient method for human-like hierarchical spatial representations of sensory data and how to associate, integrate, and convert between these representations (head-centered coordinate system, body-centered coordinate system, etc.); (2) a robust method for training and learning a mapping of points in multimodal sensory space (e.g., camera-visible object positions, locations of auditory sources, etc.) to the above hierarchical spatial representations; and (3) a specification and implementation of a hierarchical spatial working memory based on the above for storage and recall at the desired level for goal-oriented action(s). This work is most useful for any machine or human-machine application that requires processing multimodal sensory inputs, making sense of them from a spatial perspective (e.g., where the sensory information is coming from with respect to the machine and its parts), and then taking some goal-oriented action based on this spatial understanding. A multi-level spatial representation hierarchy means that heterogeneous sensory inputs (e.g., visual, auditory, somatosensory, etc.) can map onto the hierarchy at different levels. When controlling various machine/robot degrees of freedom, the desired movements and actions can be computed from these different levels in the hierarchy. The most basic embodiment of this machine could be a pan-tilt camera system, an array of microphones, a machine with an arm/hand-like structure, and/or a robot with some or all of the above capabilities. We describe the approach and system, and present preliminary results on a real robotic platform.
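    The conversion between coordinate frames that this abstract mentions (head-centered to body-centered, etc.) can be illustrated with a minimal 2D sketch. The rotation-plus-offset model, the angle, and the neck offset below are illustrative assumptions, not the paper's actual (3D) formulation:

```python
import math

def head_to_body(point, head_yaw_rad, neck_offset):
    """Convert a 2D head-centered point to body-centered coordinates
    by applying the head's yaw rotation and adding the neck position.
    Illustrative only; real systems use full 3D transforms."""
    x, y = point
    c, s = math.cos(head_yaw_rad), math.sin(head_yaw_rad)
    # Rotate the point by the head's yaw, then translate by the neck offset.
    bx = c * x - s * y + neck_offset[0]
    by = s * x + c * y + neck_offset[1]
    return (bx, by)

# A target straight ahead of the head, with the head turned 90 degrees:
print(head_to_body((1.0, 0.0), math.pi / 2, (0.0, 0.5)))  # ≈ (0.0, 1.5)
```

    Chaining such transforms (eye-to-head, head-to-body, body-to-world) is what lets a single sensed location be stored and recalled at whichever level of the hierarchy a given action needs.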

  7. A Study of the Effects of Variation of Short-Term Memory Load, Reading Response Length, and Processing Hierarchy on TOEFL Listening Comprehension Item Performance. Report 33.

    ERIC Educational Resources Information Center

    Henning, Grant

    Criticisms of the Test of English as a Foreign Language (TOEFL) have included speculation that the listening test places too much burden on short-term memory as compared with comprehension, that a knowledge of reading is required to respond successfully, and that many items appear to require mere recall and matching rather than higher-order…

  8. Using CLIPS in the domain of knowledge-based massively parallel programming

    NASA Technical Reports Server (NTRS)

    Dvorak, Jiri J.

    1994-01-01

The Program Development Environment (PDE) is a tool for massively parallel programming of distributed-memory architectures. Adopting a knowledge-based approach, the PDE eliminates the complexity introduced by parallel hardware with distributed memory and offers complete transparency with respect to parallelism exploitation. The knowledge-based part of the PDE is realized in CLIPS. Its principal task is to find an efficient parallel realization of the application specified by the user in a comfortable, abstract, domain-oriented formalism. A large collection of fine-grain parallel algorithmic skeletons, represented as COOL objects in a tree hierarchy, contains the algorithmic knowledge. A hybrid knowledge base with rule modules and procedural parts, encoding expertise about the application domain, parallel programming, software engineering, and parallel hardware, enables a high degree of automation in the software development process. In this paper, important aspects of the implementation of the PDE using CLIPS and COOL are shown, including the embedding of CLIPS with C++-based parts of the PDE. The appropriateness of the chosen approach and of the CLIPS language for knowledge-based software engineering is discussed.

  9. Hierarchy Measure for Complex Networks

    PubMed Central

    Mones, Enys; Vicsek, Lilla; Vicsek, Tamás

    2012-01-01

    Nature, technology and society are full of complexity arising from the intricate web of the interactions among the units of the related systems (e.g., proteins, computers, people). Consequently, one of the most successful recent approaches to capturing the fundamental features of the structure and dynamics of complex systems has been the investigation of the networks associated with the above units (nodes) together with their relations (edges). Most complex systems have an inherently hierarchical organization and, correspondingly, the networks behind them also exhibit hierarchical features. Indeed, several papers have been devoted to describing this essential aspect of networks, however, without resulting in a widely accepted, converging concept concerning the quantitative characterization of the level of their hierarchy. Here we develop an approach and propose a quantity (measure) which is simple enough to be widely applicable, reveals a number of universal features of the organization of real-world networks and, as we demonstrate, is capable of capturing the essential features of the structure and the degree of hierarchy in a complex network. The measure we introduce is based on a generalization of the m-reach centrality, which we first extend to directed/partially directed graphs. Then, we define the global reaching centrality (GRC), which is the difference between the maximum and the average value of the generalized reach centralities over the network. We investigate the behavior of the GRC considering both a synthetic model with an adjustable level of hierarchy and real networks. Results for real networks show that our hierarchy measure is related to the controllability of the given system. We also propose a visualization procedure for large complex networks that can be used to obtain an overall qualitative picture about the nature of their hierarchical structure. PMID:22470477
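    The global reaching centrality (GRC) defined in this abstract is straightforward to compute: each node's local reaching centrality is the fraction of other nodes it can reach, and the GRC averages the gap to the maximum. A minimal sketch on a toy directed graph (not from the paper; NetworkX also ships an implementation):

```python
from collections import deque

def global_reaching_centrality(adj):
    """GRC of a directed graph given as {node: [successors]}.
    Local reaching centrality C_R(i) = fraction of other nodes reachable
    from i; GRC = average of (max C_R - C_R(i)) over all nodes."""
    nodes = list(adj)
    n = len(nodes)

    def reach(start):
        # BFS counting nodes reachable from `start` (excluding itself).
        seen, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in adj.get(u, []):
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return (len(seen) - 1) / (n - 1)

    cr = [reach(u) for u in nodes]
    cmax = max(cr)
    return sum(cmax - c for c in cr) / (n - 1)

# A star directed outward from a hub is maximally hierarchical: GRC = 1.
star = {"hub": ["a", "b", "c"], "a": [], "b": [], "c": []}
print(global_reaching_centrality(star))  # → 1.0
```

    By the same logic, a strongly connected graph, where every node reaches every other, gives GRC = 0, matching the intuition that it has no hierarchy.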

  10. Multiprocessor switch with selective pairing

    DOEpatents

    Gara, Alan; Gschwind, Michael K; Salapura, Valentina

    2014-03-11

System, method, and computer program product for a multiprocessing system to offer selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). Each pair of cores providing a highly reliable thread connects with system components such as a memory "nest" (or memory hierarchy), an optional system controller, an optional interrupt controller, optional I/O or peripheral devices, etc. The memory nest is attached to the selective pairing facility via a switch or a bus.

  11. C-MOS array design techniques: SUMC multiprocessor system study

    NASA Technical Reports Server (NTRS)

    Clapp, W. A.; Helbig, W. A.; Merriam, A. S.

    1972-01-01

The current capabilities of LSI techniques for speed and reliability, plus the possibilities of assembling large configurations of LSI logic and storage elements, have demanded the study of multiprocessors and multiprocessing techniques, problems, and potentialities. Evaluated are three previous systems studies for a space ultrareliable modular computer multiprocessing system, and a new multiprocessing system is proposed that is flexibly configured with up to four central processors, four I/O processors, and 16 main memory units, plus auxiliary memory and peripheral devices. This multiprocessor system features a multilevel interrupt, qualified S/360 compatibility for ground-based generation of programs, virtual memory management of a storage hierarchy through I/O processors, and multiport access to multiple and shared memory units.

  12. Hierarchy in the eye of the beholder: (Anti-)egalitarianism shapes perceived levels of social inequality.

    PubMed

    Kteily, Nour S; Sheehy-Skeffington, Jennifer; Ho, Arnold K

    2017-01-01

    Debate surrounding the issue of inequality and hierarchy between social groups has become increasingly prominent in recent years. At the same time, individuals disagree about the extent to which inequality between advantaged and disadvantaged groups exists. Whereas prior work has examined the ways in which individuals legitimize (or delegitimize) inequality as a function of their motivations, we consider whether individuals' orientation toward group-based hierarchy motivates the extent to which they perceive inequality between social groups in the first place. Across 8 studies in both real-world (race, gender, and class) and artificial contexts, and involving members of both advantaged and disadvantaged groups, we show that the more individuals endorse hierarchy between groups, the less they perceive inequality between groups at the top and groups at the bottom. Perceiving less inequality is associated with rejecting egalitarian social policies aimed at reducing it. We show that these differences in hierarchy perception as a function of individuals' motivational orientation hold even when inequality is depicted abstractly using images, and even when individuals are financially incentivized to accurately report their true perceptions. Using a novel methodology to assess accurate memory of hierarchy, we find that differences may be driven by both antiegalitarians underestimating inequality, and egalitarians overestimating it. In sum, our results identify a novel perceptual bias rooted in individuals' chronic motivations toward hierarchy-maintenance, with the potential to influence their policy attitudes. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. A linguistic geometry for space applications

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1994-01-01

We develop a formal theory, the so-called Linguistic Geometry, in order to discover the inner properties of human expert heuristics that were successful in a certain class of complex control systems, and to apply them to different systems. This research relies on the formalization of search heuristics of highly skilled human experts, which allow for the decomposition of a complex system into a hierarchy of subsystems, and thus solve intractable problems by reducing the search. The hierarchy of subsystems is represented as a hierarchy of formal attribute languages. This paper includes a formal survey of the Linguistic Geometry and a new example of a solution of an optimization problem for space robotic vehicles. This example includes actual generation of the hierarchy of languages and some details of trajectory generation, and demonstrates the drastic reduction of search in comparison with conventional search algorithms.

  14. Self-organizing hierarchies in sensor and communication networks.

    PubMed

    Prokopenko, Mikhail; Wang, Peter; Valencia, Philip; Price, Don; Foreman, Mark; Farmer, Anthony

    2005-01-01

    We consider a hierarchical multicellular sensing and communication network, embedded in an ageless aerospace vehicle that is expected to detect and react to multiple impacts and damage over a wide range of impact energies. In particular, we investigate self-organization of impact boundaries enclosing critically damaged areas, and impact networks connecting remote cells that have detected noncritical impacts. Each level of the hierarchy is shown to have distinct higher-order emergent properties, desirable in self-monitoring and self-repairing vehicles. In addition, cells and communication messages are shown to need memory (hysteresis) in order to retain desirable emergent behavior within and between various hierarchical levels. Spatiotemporal robustness of self-organizing hierarchies is quantitatively measured with graph-theoretic and information-theoretic techniques, such as the Shannon entropy. This allows us to clearly identify phase transitions separating chaotic dynamics from ordered and robust patterns.

  15. Toward Transparent Data Management in Multi-layer Storage Hierarchy for HPC Systems

    DOE PAGES

    Wadhwa, Bharti; Byna, Suren; Butt, Ali R.

    2018-04-17

Upcoming exascale high performance computing (HPC) systems are expected to comprise a multi-tier storage hierarchy, and thus will necessitate innovative storage and I/O mechanisms. Traditional disk and block-based interfaces and file systems face severe challenges in utilizing the capabilities of storage hierarchies due to the lack of hierarchy support and semantic interfaces. Object-based and semantically rich data abstractions for scientific data management on large-scale systems offer a sustainable solution to these challenges. Such data abstractions can also simplify users' involvement in data movement. Here, we take the first steps toward realizing such an object abstraction and explore storage mechanisms for these objects to enhance I/O performance, especially for scientific applications. We explore how an object-based interface can facilitate next-generation scalable computing systems by presenting the mapping of data I/O from two real-world HPC scientific use cases: a plasma physics simulation code (VPIC) and a cosmology simulation code (HACC). Our storage model stores data objects in different physical organizations to support data movement across layers of the memory/storage hierarchy. Our implementation scales well to 16K parallel processes, and compared to the state of the art, such as MPI-IO and HDF5, our object-based data abstractions and data placement strategy in the multi-level storage hierarchy achieve up to 7X I/O performance improvement for scientific data.

  17. The Science of Computing: Virtual Memory

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1986-01-01

In the March-April issue, I described how a computer's storage system is organized as a hierarchy consisting of cache, main memory, and secondary memory (e.g., disk). The cache and main memory form a subsystem that functions like main memory but attains speeds approaching cache. What happens if a program and its data are too large for the main memory? This is not a frivolous question. Every generation of computer users has been frustrated by insufficient memory. A new line of computers may have sufficient storage for the computations of its predecessor, but new programs will soon exhaust its capacity. In 1960, a long-range planning committee at MIT dared to dream of a computer with 1 million words of main memory. In 1985, the Cray-2 was delivered with 256 million words. Computational physicists dream of computers with 1 billion words. Computer architects have done an outstanding job of enlarging main memories, yet they have never kept up with demand. Only the shortsighted believe they can.

  18. Application of phase-change materials in memory taxonomy

    PubMed Central

    Wang, Lei; Tu, Liang; Wen, Jing

    2017-01-01

Phase-change materials are suitable for data storage because they exhibit reversible transitions between crystalline and amorphous states that have distinguishable electrical and optical properties. Consequently, these materials find applications in diverse memory devices ranging from conventional optical discs to emerging nanophotonic devices. Current research efforts are mostly devoted to phase-change random access memory, whereas the applications of phase-change materials in other types of memory devices are rarely reported. Here we review the physical principles of phase-change materials and devices, aiming to help researchers understand the concept of phase-change memory. We classify phase-change memory devices into phase-change optical disc, phase-change scanning probe memory, phase-change random access memory, and phase-change nanophotonic device, according to their locations in the memory hierarchy. For each device type we discuss the physical principles in conjunction with merits and weaknesses for data storage applications. We also outline state-of-the-art technologies and future prospects. PMID:28740557

  19. Optimal Planning and Problem-Solving

    NASA Technical Reports Server (NTRS)

    Clement, Bradley; Schaffer, Steven; Rabideau, Gregg

    2008-01-01

CTAEMS MDP Optimal Planner is problem-solving software designed to command a single spacecraft/rover, or a team of spacecraft/rovers, to perform the best action possible at all times according to an abstract model of the spacecraft/rover and its environment. It also may be useful in solving logistical problems encountered in commercial applications such as shipping and manufacturing. The planner reasons about uncertainty according to specified probabilities of outcomes, using a plan hierarchy to avoid exploring certain kinds of suboptimal actions. Also, planned actions are calculated as the state-action space is expanded, rather than afterward, to reduce by an order of magnitude the processing time and memory used. The software solves planning problems with actions that can execute concurrently, that have uncertain duration and quality, and that have functional dependencies on others that affect quality. These problems are modeled in a hierarchical planning language called C_TAEMS, a derivative of the TAEMS language for specifying domains for the DARPA Coordinators program. In realistic environments, actions often have uncertain outcomes and can have complex relationships with other tasks. The planner approaches problems by considering all possible actions that may be taken from any state reachable from a given, initial state, and from within the constraints of a given task hierarchy that specifies what tasks may be performed by which team member.

  20. Exploring the Complexities of Army Civilians and the Army Profession

    DTIC Science & Technology

    2013-03-01


  1. Why and when hierarchy impacts team effectiveness: A meta-analytic integration.

    PubMed

    Greer, Lindred L; de Jong, Bart A; Schouten, Maartje E; Dannals, Jennifer E

    2018-06-01

    Hierarchy has the potential to both benefit and harm team effectiveness. In this article, we meta-analytically investigate different explanations for why and when hierarchy helps or hurts team effectiveness, drawing on results from 54 prior studies (N = 13,914 teams). Our findings show that, on net, hierarchy negatively impacts team effectiveness (performance: ρ = -.08; viability: ρ = -.11), and that this effect is mediated by increased conflict-enabling states. Additionally, we show that the negative relationship between hierarchy and team performance is exacerbated by aspects of the team structure (i.e., membership instability, skill differentiation) and the hierarchy itself (i.e., mutability), which make hierarchical teams prone to conflict. The predictions regarding the positive effect of hierarchy on team performance as mediated by coordination-enabling processes, and the moderating roles of several aspects of team tasks (i.e., interdependence, complexity) and the hierarchy (i.e., form) were not supported, with the exception that task ambiguity enhanced the positive effects of hierarchy. Given that our findings largely support dysfunctional views on hierarchy, future research is needed to understand when and why hierarchy may be more likely to live up to its purported functional benefits. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. A Framework for Distributed Problem Solving

    NASA Astrophysics Data System (ADS)

    Leone, Joseph; Shin, Don G.

    1989-03-01

This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability to make associations between vast amounts of related concepts, sort out the combined results, and promote the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.
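    The amplification/aggregation flow described here can be sketched as a toy two-level hierarchy. The agenda (a numeric range to sum), the splitting rule, and the combination rule below are hypothetical stand-ins for the model's domain-specific mechanisms:

```python
def amplify(agenda, num_units):
    """Amplification: expound an agenda into more specific subtasks,
    one per subordinate processing unit (toy version: split a range)."""
    lo, hi = agenda
    step = (hi - lo) // num_units
    return [(lo + i * step, lo + (i + 1) * step if i < num_units - 1 else hi)
            for i in range(num_units)]

def aggregate(results):
    """Aggregation: combine subordinate reports into a unified
    resolution promoted upward (toy version: a sum)."""
    return sum(results)

# The superior node amplifies the agenda "sum 0..99" to 4 units, which
# report partial results that are aggregated into the resolution.
subtasks = amplify((0, 100), 4)
resolution = aggregate(sum(range(lo, hi)) for lo, hi in subtasks)
print(resolution)  # → 4950
```

    In the AM/AG model the interesting cases are associative rather than arithmetic, but the same downward fan-out and upward combination structure applies at every level of the hierarchy.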

  3. Structural Measures to Track the Evolution of SNOMED CT Hierarchies

    PubMed Central

    Wei, Duo; Gu, Huanying (Helen); Perl, Yehoshua; Halper, Michael; Ochs, Christopher; Elhanan, Gai; Chen, Yan

    2015-01-01

    The Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT) is an extensive reference terminology with an attendant amount of complexity. It has been updated continuously and revisions have been released semi-annually to meet users’ needs and to reflect the results of quality assurance (QA) activities. Two measures based on structural features are proposed to track the effects of both natural terminology growth and QA activities based on aspects of the complexity of SNOMED CT. These two measures, called the structural density measure and accumulated structural measure, are derived based on two abstraction networks, the area taxonomy and the partial-area taxonomy. The measures derive from attribute relationship distributions and various concept groupings that are associated with the abstraction networks. They are used to track the trends in the complexity of structures as SNOMED CT changes over time. The measures were calculated for consecutive releases of five SNOMED CT hierarchies, including the Specimen hierarchy. The structural density measure shows that natural growth tends to move a hierarchy’s structure toward a more complex state, whereas the accumulated structural measure shows that QA processes tend to move a hierarchy’s structure toward a less complex state. It is also observed that both the structural density and accumulated structural measures are useful tools to track the evolution of an entire SNOMED CT hierarchy and reveal internal concept migration within it. PMID:26260003

  4. Recursion Relations for Double Ramification Hierarchies

    NASA Astrophysics Data System (ADS)

    Buryak, Alexandr; Rossi, Paolo

    2016-03-01

In this paper we study various properties of the double ramification hierarchy, an integrable hierarchy of Hamiltonian PDEs introduced in Buryak (Commun Math Phys 336(3):1085-1107, 2015) using intersection theory of the double ramification cycle in the moduli space of stable curves. In particular, we prove a recursion formula that recovers the full hierarchy starting from just one of the Hamiltonians, the one associated to the first descendant of the unit of a cohomological field theory. Moreover, we introduce analogues of the topological recursion relations and the divisor equation both for the Hamiltonian densities and for the string solution of the double ramification hierarchy. This machinery is very efficient and we apply it to various computations for the trivial and Hodge cohomological field theories, and for the r-spin Witten's classes. Moreover, we prove the Miura equivalence between the double ramification hierarchy and the Dubrovin-Zhang hierarchy for the Gromov-Witten theory of the complex projective line (extended Toda hierarchy).

  5. Experimental evaluation of multiprocessor cache-based error recovery

    NASA Technical Reports Server (NTRS)

    Janssens, Bob; Fuchs, W. K.

    1991-01-01

Several variations of cache-based checkpointing for rollback error recovery in shared-memory multiprocessors have been recently developed. By modifying the cache replacement policy, these techniques use the inherent redundancy in the memory hierarchy to periodically checkpoint the computation state. Three schemes, differing in the manner in which they avoid rollback propagation, are evaluated. By simulation with address traces from parallel applications running on an Encore Multimax shared-memory multiprocessor, the performance effect of integrating the recovery schemes in the cache coherence protocol is evaluated. The results indicate that the cache-based schemes can provide checkpointing capability with low performance overhead but uncontrollably high variability in the checkpoint interval.

  6. A review of emerging non-volatile memory (NVM) technologies and applications

    NASA Astrophysics Data System (ADS)

    Chen, An

    2016-11-01

    This paper will review emerging non-volatile memory (NVM) technologies, with the focus on phase change memory (PCM), spin-transfer-torque random-access-memory (STTRAM), resistive random-access-memory (RRAM), and ferroelectric field-effect-transistor (FeFET) memory. These promising NVM devices are evaluated in terms of their advantages, challenges, and applications. Their performance is compared based on reported parameters of major industrial test chips. Memory selector devices and cell structures are discussed. Changing market trends toward low power (e.g., mobile, IoT) and data-centric applications create opportunities for emerging NVMs. High-performance and low-cost emerging NVMs may simplify memory hierarchy, introduce non-volatility in logic gates and circuits, reduce system power, and enable novel architectures. Storage-class memory (SCM) based on high-density NVMs could fill the performance and density gap between memory and storage. Some unique characteristics of emerging NVMs can be utilized for novel applications beyond the memory space, e.g., neuromorphic computing, hardware security, etc. In the beyond-CMOS era, emerging NVMs have the potential to fulfill more important functions and enable more efficient, intelligent, and secure computing systems.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, Michael A.; Berry, Jonathan W.; Hammond, Simon D.

A challenge in computer architecture is that processors often cannot be fed data from DRAM as fast as CPUs can consume it. Therefore, many applications are memory-bandwidth bound. With this motivation and the realization that traditional architectures (with all DRAM reachable only via bus) are insufficient to feed groups of modern processing units, vendors have introduced a variety of non-DDR 3D memory technologies (Hybrid Memory Cube (HMC), Wide I/O 2, High Bandwidth Memory (HBM)). These offer higher bandwidth and lower power by stacking DRAM chips on the processor or nearby on a silicon interposer. We will call these solutions "near-memory," and, if user-addressable, "scratchpad." High-performance systems on the market now offer two levels of main memory: near-memory on package and traditional DRAM further away. In the near term we expect the latencies of near-memory and DRAM to be similar. Here, it is natural to think of near-memory as another module on the DRAM level of the memory hierarchy. Vendors are expected to offer modes in which the near-memory is used as cache, but we believe that this will be inefficient.

  8. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software-to-hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  9. Who is the boss? Individual recognition memory and social hierarchy formation in crayfish.

    PubMed

    Jiménez-Morales, Nayeli; Mendoza-Ángeles, Karina; Porras-Villalobos, Mercedes; Ibarra-Coronado, Elizabeth; Roldán-Roldán, Gabriel; Hernández-Falcón, Jesús

    2018-01-01

    Under laboratory conditions, crayfish establish hierarchical orders through agonistic encounters whose outcome defines the dominant one and one, or more, submissive animals. These agonistic encounters are ritualistic, based on threats, pushes, attacks, grabs, and avoidance behaviors that include retreats and escape responses. Agonistic behavior in a triad of unfamiliar, size-matched animals is intense on the first day of social interaction and the intensity fades on daily repetitions. The dominant animal keeps its status for long periods, and the submissive ones seem to remember 'who the boss is'. It has been assumed that animals remember and recognize their hierarchical status by urine signals, but the putative substance mediating this recognition has not been reported. The aim of this work was to characterize this hierarchical recognition memory. Triads of unfamiliar crayfish (male animals, size and weight-matched) were faced during standardized agonistic protocols for five consecutive days to analyze memory acquisition dynamics (Experiment 1). In Experiment 2, dominant crayfish were shifted among triads to disclose whether hierarchy depended upon individual recognition memory or recognition of status. The maintenance of the hierarchical structure without behavioral reinforcement was assessed by immobilizing the dominant animal during eleven daily agonistic encounters, and considering any shift in the dominance order (Experiment 3). Standard amnesic treatments (anisomycin, scopolamine or cold-anesthesia) were given to all members of the triads immediately after the first interaction session to prevent individual recognition memory consolidation and evaluate its effect on the hierarchical order (Experiment 4). 
Acquisition of hierarchical recognition occurs at the first agonistic encounter, and agonistic behavior gradually diminishes over the following days; animals keep their hierarchical order despite the inability of the dominant crayfish to attack the submissive ones. Finally, blocking of protein synthesis or muscarinic receptors, as well as cold anesthesia, impairs memory consolidation. These findings suggest that agonistic encounters induce the acquisition of a robust and lasting social recognition memory in crayfish. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Infinite hierarchy of nonlinear Schrödinger equations and their solutions.

    PubMed

    Ankiewicz, A; Kedziora, D J; Chowdury, A; Bandelow, U; Akhmediev, N

    2016-01-01

We study the infinite integrable nonlinear Schrödinger equation hierarchy beyond the Lakshmanan-Porsezian-Daniel equation, which is a particular (fourth-order) case of the hierarchy. In particular, we present the generalized Lax pair and generalized soliton solutions, plane wave solutions, Akhmediev breathers, Kuznetsov-Ma breathers, periodic solutions, and rogue wave solutions for this infinite-order hierarchy. We find that "even-order" equations in the set affect phase and "stretching factors" in the solutions, while "odd-order" equations affect the velocities. Hence odd-order equation solutions can be real functions, while even-order equation solutions are always complex.
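    For orientation, the lowest (second-order) member of this integrable hierarchy is the standard focusing nonlinear Schrödinger equation; sign and scaling conventions vary across the literature:

```latex
i\psi_t + \tfrac{1}{2}\psi_{xx} + |\psi|^2\psi = 0
```

    Higher members of the hierarchy add higher-order dispersive and nonlinear corrections, with the fourth-order member being the Lakshmanan-Porsezian-Daniel equation discussed in the abstract.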

  11. Feature-Based Visual Short-Term Memory Is Widely Distributed and Hierarchically Organized.

    PubMed

    Dotson, Nicholas M; Hoffman, Steven J; Goodell, Baldwin; Gray, Charles M

    2018-06-15

    Feature-based visual short-term memory is known to engage both sensory and association cortices. However, the extent of the participating circuit and the neural mechanisms underlying memory maintenance are still a matter of vigorous debate. To address these questions, we recorded neuronal activity from 42 cortical areas in monkeys performing a feature-based visual short-term memory task and an interleaved fixation task. We find that task-dependent differences in firing rates are widely distributed throughout the cortex, while stimulus-specific changes in firing rates are more restricted and hierarchically organized. We also show that microsaccades during the memory delay encode the stimuli held in memory and that units modulated by microsaccades are more likely to exhibit stimulus specificity, suggesting that eye movements contribute to visual short-term memory processes. These results support a framework in which most cortical areas, within a modality, contribute to mnemonic representations at timescales that increase along the cortical hierarchy. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.

    2016-12-01

    The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which introduces several new features, including on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
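
    For intuition, the core of k-means is the two-step Lloyd iteration sketched below in pure Python; the paper's accelerated, vectorised, massively parallel implementation is far more elaborate, and the 1-D data and parameters here are invented for illustration.

```python
# Pure-Python sketch of the Lloyd iteration at the heart of k-means; the
# paper's accelerated, SIMD-vectorised implementation is far more involved,
# and the data and parameters below are hypothetical.

def kmeans(points, centroids, iters=10):
    """Alternate nearest-centroid assignment and centroid update."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:   # assignment step: nearest centroid wins
            j = min(range(len(centroids)),
                    key=lambda c: (p - centroids[c]) ** 2)
            clusters[j].append(p)
        # update step: each centroid moves to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

centers = kmeans([1.0, 1.2, 0.8, 8.0, 8.3, 7.9], [0.0, 10.0])
```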

  13. Two-level main memory co-design: Multi-threaded algorithmic primitives, analysis, and simulation

    DOE PAGES

    Bender, Michael A.; Berry, Jonathan W.; Hammond, Simon D.; ...

    2017-01-03

    A challenge in computer architecture is that processors often cannot be fed data from DRAM as fast as CPUs can consume it. Therefore, many applications are memory-bandwidth bound. With this motivation and the realization that traditional architectures (with all DRAM reachable only via bus) are insufficient to feed groups of modern processing units, vendors have introduced a variety of non-DDR 3D memory technologies (Hybrid Memory Cube (HMC), Wide I/O 2, High Bandwidth Memory (HBM)). These offer higher bandwidth and lower power by stacking DRAM chips on the processor or nearby on a silicon interposer. We will call these solutions “near-memory,” and, if user-addressable, “scratchpad.” High-performance systems on the market now offer two levels of main memory: near-memory on package and traditional DRAM further away. In the near term we expect the latencies of near-memory and DRAM to be similar. Here, it is natural to think of near-memory as another module on the DRAM level of the memory hierarchy. Vendors are expected to offer modes in which the near memory is used as cache, but we believe that this will be inefficient.
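
    The authors' preference for a user-addressable scratchpad over hardware caching implies that software decides what lives in near-memory. The toy greedy placement policy below illustrates one such decision; the ranking heuristic (traffic per byte of capacity) and all array names and sizes are hypothetical, not taken from the paper.

```python
# Toy illustration of software-managed "scratchpad" placement: rank arrays
# by memory traffic per byte of capacity and pin the most demanding ones
# into the small high-bandwidth near-memory. All values are invented.

def place_in_scratchpad(arrays, capacity_gb):
    """arrays: (name, size_gb, gb_touched_per_iteration) tuples."""
    ranked = sorted(arrays, key=lambda a: a[2] / a[1], reverse=True)
    chosen, used = [], 0.0
    for name, size_gb, _traffic in ranked:
        if used + size_gb <= capacity_gb:   # pin it if it still fits
            chosen.append(name)
            used += size_gb
    return chosen

picks = place_in_scratchpad(
    [("lookup_table", 2.0, 400.0), ("state", 8.0, 80.0), ("log", 4.0, 4.0)],
    capacity_gb=10.0)
```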

  14. Measuring, Enabling and Comparing Modularity, Regularity and Hierarchy in Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2005-01-01

    For computer-automated design systems to scale to complex designs they must be able to produce designs that exhibit the characteristics of modularity, regularity and hierarchy - characteristics that are found both in man-made and natural designs. Here we claim that these characteristics are enabled by implementing the attributes of combination, control-flow and abstraction in the representation. To support this claim we use an evolutionary algorithm to evolve solutions to different sizes of a table design problem using five different representations, each with different combinations of modularity, regularity and hierarchy enabled and show that the best performance happens when all three of these attributes are enabled. We also define metrics for modularity, regularity and hierarchy in design encodings and demonstrate that high fitness values are achieved with high values of modularity, regularity and hierarchy and that there is a positive correlation between increases in fitness and increases in modularity, regularity and hierarchy.

  15. A complex dominance hierarchy is controlled by polymorphism of small RNAs and their targets.

    PubMed

    Yasuda, Shinsuke; Wada, Yuko; Kakizaki, Tomohiro; Tarutani, Yoshiaki; Miura-Uno, Eiko; Murase, Kohji; Fujii, Sota; Hioki, Tomoya; Shimoda, Taiki; Takada, Yoshinobu; Shiba, Hiroshi; Takasaki-Yasuda, Takeshi; Suzuki, Go; Watanabe, Masao; Takayama, Seiji

    2016-12-22

    In diploid organisms, phenotypic traits are often biased by effects known as Mendelian dominant-recessive interactions between inherited alleles. Phenotypic expression of SP11 alleles, which encode the male determinants of self-incompatibility in Brassica rapa, is governed by a complex dominance hierarchy. Here, we show that a single polymorphic 24-nucleotide small RNA, named SP11 methylation inducer 2 (Smi2), controls the linear dominance hierarchy of the four SP11 alleles (S44 > S60 > S40 > S29). In all dominant-recessive interactions, small RNA variants derived from the linked region of dominant SP11 alleles exhibited high sequence similarity to the promoter regions of recessive SP11 alleles and acted in trans to epigenetically silence their expression. Together with our previous study, we propose a new model: sequence similarity between polymorphic small RNAs and their targets regulates mono-allelic gene expression, which explains the entire five-phased linear dominance hierarchy of SP11 phenotypic expression in Brassica.

  16. SMT-Aware Instantaneous Footprint Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Probir; Liu, Xu; Song, Shuaiwen

    Modern architectures employ simultaneous multithreading (SMT) to increase thread-level parallelism. SMT threads share many functional units and the whole memory hierarchy of a physical core. Without careful code design, SMT threads can easily contend with each other for these shared resources, causing severe performance degradation. Minimizing SMT thread contention for HPC applications running on dedicated platforms is very challenging, because they usually spawn threads within Single Program Multiple Data (SPMD) models. To address this important issue, we introduce a simple scheme for SMT-aware code optimization, which aims to reduce the memory contention across SMT threads.

  17. Efficient algorithms for accurate hierarchical clustering of huge datasets: tackling the entire protein space.

    PubMed

    Loewenstein, Yaniv; Portugaly, Elon; Fromer, Menachem; Linial, Michal

    2008-07-01

    UPGMA (average linking) is probably the most popular algorithm for hierarchical data clustering, especially in computational biology. However, UPGMA requires the entire dissimilarity matrix in memory. Due to this prohibitive requirement, UPGMA is not scalable to very large datasets. We present a novel class of memory-constrained UPGMA (MC-UPGMA) algorithms. Given any practical memory size constraint, this framework guarantees the correct clustering solution without explicitly requiring all dissimilarities in memory. The algorithms are general and are applicable to any dataset. We present a data-dependent characterization of hardness and clustering efficiency. The presented concepts are applicable to any agglomerative clustering formulation. We apply our algorithm to the entire collection of protein sequences, to automatically build a comprehensive evolutionary-driven hierarchy of proteins from sequence alone. The newly created tree captures protein families better than state-of-the-art large-scale methods such as CluSTr, ProtoNet4 or single-linkage clustering. We demonstrate that leveraging the entire mass embodied in all sequence similarities makes it possible to significantly improve on current protein family clusterings, which are unable to directly tackle the sheer mass of this data. Furthermore, we argue that non-metric constraints are an inherent complexity of the sequence space and should not be overlooked. The robustness of UPGMA allows significant improvement, especially for multidomain proteins, and for large or divergent families. A comprehensive tree built from all UniProt sequence similarities, together with navigation and classification tools, will be made available as part of the ProtoNet service. A C++ implementation of the algorithm is available on request.
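
    For reference, the textbook in-memory UPGMA computation that MC-UPGMA reworks is sketched below on an invented three-leaf example. This naive form stores every pairwise dissimilarity, which is exactly what becomes prohibitive at the scale of the entire protein space.

```python
from itertools import combinations

# Naive in-memory UPGMA (average linkage) on a tiny, invented distance
# matrix. The UPGMA distance between two clusters is the mean of all
# leaf-to-leaf distances between them.

def upgma(d):
    """d: {frozenset({leaf_a, leaf_b}): distance}. Returns a nested tuple."""
    leaves = sorted({x for pair in d for x in pair})
    clusters = [(leaf, (leaf,)) for leaf in leaves]   # (tree, member leaves)

    def avg(ci, cj):
        return sum(d[frozenset({a, b})] for a in ci[1] for b in cj[1]) / (
            len(ci[1]) * len(cj[1]))

    while len(clusters) > 1:
        # merge the closest pair of clusters under average linkage
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: avg(clusters[ij[0]], clusters[ij[1]]))
        cj = clusters.pop(j)          # pop the higher index first
        ci = clusters.pop(i)
        clusters.append(((ci[0], cj[0]), ci[1] + cj[1]))
    return clusters[0][0]

tree = upgma({frozenset({"A", "B"}): 2.0,
              frozenset({"A", "C"}): 8.0,
              frozenset({"B", "C"}): 8.0})
```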

  18. Bibliometric Evidence for a Hierarchy of the Sciences.

    PubMed

    Fanelli, Daniele; Glänzel, Wolfgang

    2013-01-01

    The hypothesis of a Hierarchy of the Sciences, first formulated in the 19th century, predicts that, moving from simple and general phenomena (e.g. particle dynamics) to complex and particular (e.g. human behaviour), researchers lose the ability to reach theoretical and methodological consensus. This hypothesis places each field of research along a continuum of complexity and "softness", with profound implications for our understanding of scientific knowledge. Today, however, the idea is still unproven and philosophically overlooked, too often confused with simplistic dichotomies that contrast natural and social sciences, or science and the humanities. Empirical tests of the hypothesis have usually compared only a few fields and this, combined with other limitations, makes their results contradictory and inconclusive. We verified whether discipline characteristics reflect a hierarchy, a dichotomy or neither, by sampling nearly 29,000 papers published contemporaneously in 12 disciplines and measuring a set of parameters hypothesised to reflect theoretical and methodological consensus. The biological sciences had in most cases intermediate values between the physical and the social, with bio-molecular disciplines appearing harder than zoology, botany or ecology. In multivariable analyses, most of these parameters were independent predictors of the hierarchy, even when mathematics and the humanities were included. These results support a "gradualist" view of scientific knowledge, suggesting that the Hierarchy of the Sciences provides the best rational framework to understand disciplines' diversity. A deeper grasp of the relationship between subject matter's complexity and consensus could have profound implications for how we interpret, publish, popularize and administer scientific research.

  19. Mathematical concepts for modeling human behavior in complex man-machine systems

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.

    1979-01-01

    Many human behavior (e.g., manual control) models have been found to be inadequate for describing processes in certain real complex man-machine systems. An attempt is made to find a way to overcome this problem by examining the range of applicability of existing mathematical models with respect to the hierarchy of human activities in real complex tasks. Automobile driving is chosen as a baseline scenario, and a hierarchy of human activities is derived by analyzing this task in general terms. A structural description leads to a block diagram and a time-sharing computer analogy.

  20. Formal language theory: refining the Chomsky hierarchy

    PubMed Central

    Jäger, Gerhard; Rogers, James

    2012-01-01

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages). PMID:22688632

  1. Formal language theory: refining the Chomsky hierarchy.

    PubMed

    Jäger, Gerhard; Rogers, James

    2012-07-19

    The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages).
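
    A standard illustration of the regular/context-free boundary discussed here (the example is generic, not drawn from the article): the language {a^n b^n : n >= 0} is context-free but not regular, because recognising it requires unbounded counting. A single counter, i.e. the simplest possible use of a stack, suffices:

```python
# Recognise {a^n b^n}: context-free but provably not regular, since a
# finite automaton cannot track an unbounded count of 'a's.

def is_anbn(s):
    """Accept strings of n 'a's followed by exactly n 'b's (n >= 0)."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:          # an 'a' after a 'b' is malformed
                return False
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1          # pop one 'a' from the counter
            if count < 0:
                return False
        else:
            return False
    return count == 0
```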

  2. Boy, Am I Tired!! Sleep....Why You Need It!

    ERIC Educational Resources Information Center

    Olivieri, Chrystyne

    2016-01-01

    Sleep is essential to a healthy human being. It is among the basic necessities of life, located at the bottom of Maslow's Hierarchy of Needs. It is a dynamic activity, necessary to maintain mood, memory and cognitive performance. Sleep disorders are strongly associated with the development of acute and chronic medical conditions. This article…

  3. Cognitive representation of "musical fractals": Processing hierarchy and recursion in the auditory domain.

    PubMed

    Martins, Mauricio Dias; Gingras, Bruno; Puig-Waldmueller, Estela; Fitch, W Tecumseh

    2017-04-01

    The human ability to process hierarchical structures has been a longstanding research topic. However, the nature of the cognitive machinery underlying this faculty remains controversial. Recursion, the ability to embed structures within structures of the same kind, has been proposed as a key component of our ability to parse and generate complex hierarchies. Here, we investigated the cognitive representation of both recursive and iterative processes in the auditory domain. The experiment used a two-alternative forced-choice paradigm: participants were exposed to three-step processes in which pure-tone sequences were built either through recursive or iterative processes, and had to choose the correct completion. Foils were constructed according to generative processes that did not match the previous steps. Both musicians and non-musicians were able to represent recursion in the auditory domain, although musicians performed better. We also observed that general 'musical' aptitudes played a role in both recursion and iteration, although the influence of musical training was somewhat independent of melodic memory. Moreover, unlike iteration, recursion in audition was well correlated with its non-auditory (recursive) analogues in the visual and action sequencing domains. These results suggest that the cognitive machinery involved in establishing recursive representations is domain-general, even though this machinery requires access to information resulting from domain-specific processes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Real Time Monitoring and Prediction of the Monsoon Intraseasonal Oscillations: An index based on Nonlinear Laplacian Spectral Analysis Technique

    NASA Astrophysics Data System (ADS)

    Cherumadanakadan Thelliyil, S.; Ravindran, A. M.; Giannakis, D.; Majda, A.

    2016-12-01

    An improved index for real time monitoring and forecast verification of monsoon intraseasonal oscillations (MISO) is introduced using the recently developed Nonlinear Laplacian Spectral Analysis (NLSA) algorithm. Previous studies have demonstrated the proficiency of NLSA in capturing low frequency variability and intermittency of a time series. Using NLSA, a hierarchy of Laplace-Beltrami (LB) eigenfunctions is extracted from the unfiltered daily GPCP rainfall data over the south Asian monsoon region. Two modes representing the full life cycle of the complex northeastward-propagating boreal summer MISO are identified from the hierarchy of Laplace-Beltrami eigenfunctions. These two MISO modes have a number of advantages over the conventionally used Extended Empirical Orthogonal Function (EEOF) MISO modes, including higher memory and better predictability, higher fractional variance over the western Pacific, Western Ghats and adjoining Arabian Sea regions, and more realistic representation of regional heat sources associated with the MISO. The skill of the NLSA-based MISO indices in real time prediction of MISO is demonstrated using hindcasts of CFSv2 extended range prediction runs. It is shown that these indices yield a higher prediction skill than the other conventional indices, supporting the use of NLSA in real time prediction of MISO. Real time monitoring and prediction of MISO finds its application in the agriculture, construction and hydro-electric power sectors, and is hence an important component of monsoon prediction.

  5. Bring It On, Complexity! Present and Future of Self-Organising Middle-Out Abstraction

    NASA Astrophysics Data System (ADS)

    Mammen, Sebastian Von; Steghöfer, Jan-Philipp

    The following sections are included: * The Great Complexity Challenge * Self-Organising Middle-Out Abstraction * Optimising Graphics, Physics and Artificial Intelligence * Emergence and Hierarchies in a Natural System * The Technical Concept of SOMO * Observation of interactions * Interaction pattern recognition and behavioural abstraction * Creating and adjusting hierarchies * Confidence measures * Execution model * Learning SOMO: parameters, knowledge propagation, and procreation * Current Implementations * Awareness Beyond Virtuality * Integration and emergence * Model inference * SOMO net * SOMO after me * The Future of SOMO

  6. Good Intentions Pave the Way to Hierarchy: A Retrospective Autoethnographic Approach

    ERIC Educational Resources Information Center

    Tilley-Lubbs, Gresilda A.

    2009-01-01

    I explore certain complexities of partnering university students with members of the Mexican and Honduran immigrant community through service-learning. I reveal how my "good intentions" inadvertently created social hierarchy and deficit notions of the community, establishing the students as "haves" and community members as "have-nots." Critically…

  7. Multiprocessor architectural study

    NASA Technical Reports Server (NTRS)

    Kosmala, A. L.; Stanten, S. F.; Vandever, W. H.

    1972-01-01

    An architectural design study was made of a multiprocessor computing system intended to meet functional and performance specifications appropriate to a manned space station application. Intermetrics' previous experience and accumulated knowledge of the multiprocessor field are used to generate a baseline philosophy for the design of a future SUMC* multiprocessor. Interrupts are defined and the crucial questions of interrupt structure, such as processor selection and response time, are discussed. Memory hierarchy and performance are discussed extensively, with particular attention to the design approach which utilizes a cache memory associated with each processor. The ability of an individual processor to approach its theoretical maximum performance is then analyzed in terms of a hit ratio. Memory management is envisioned as a virtual memory system implemented either through segmentation or paging. Addressing is discussed in terms of various register designs adopted by current computers and those of advanced design.

  8. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex

    PubMed Central

    Chaudhuri, Rishidev; Knoblauch, Kenneth; Gariel, Marie-Alice; Kennedy, Henry; Wang, Xiao-Jing

    2015-01-01

    We developed a large-scale dynamical model of the macaque neocortex, which is based on recently acquired directed- and weighted-connectivity data from tract-tracing experiments, and which incorporates heterogeneity across areas. A hierarchy of timescales naturally emerges from this system: sensory areas show brief, transient responses to input (appropriate for sensory processing), whereas association areas integrate inputs over time and exhibit persistent activity (suitable for decision-making and working memory). The model displays multiple temporal hierarchies, as evidenced by contrasting responses to visual versus somatosensory stimulation. Moreover, slower prefrontal and temporal areas have a disproportionate impact on global brain dynamics. These findings establish a circuit mechanism for “temporal receptive windows” that are progressively enlarged along the cortical hierarchy, suggest an extension of time integration in decision-making from local to large circuits, and should prompt a re-evaluation of the analysis of functional connectivity (measured by fMRI or EEG/MEG) by taking into account inter-areal heterogeneity. PMID:26439530

  9. Visualising large hierarchies with Flextree

    NASA Astrophysics Data System (ADS)

    Song, Hongzhi; Curran, Edwin P.; Sterritt, Roy

    2003-05-01

    One of the main tasks in Information Visualisation research is creating visual tools to facilitate human understanding of large and complex information spaces. Hierarchies, being a good mechanism in organising such information, are ubiquitous. Although much research effort has been spent on finding useful representations for hierarchies, visualising large hierarchies is still a difficult topic. One of the difficulties is how to show both structure and node content information in one view. Another is how to achieve multiple foci in a focus+context visualisation. This paper describes a novel hierarchy visualisation technique called FlexTree to address these problems. It contains some important features that have not been exploited so far. In this visualisation, a profile or contour unique to the hierarchy being visualised can be gained in a histogram-like layout. A normalised view of a common attribute of all nodes can be acquired, and selection of this attribute is controllable by the user. Multiple foci are consistently accessible within a global context through interaction. Furthermore, it can handle a large hierarchy that contains several thousand nodes in a PC environment. In addition, results from an informal evaluation are also presented.

  10. Auditory connections and functions of prefrontal cortex

    PubMed Central

    Plakke, Bethany; Romanski, Lizabeth M.

    2014-01-01

    The functional auditory system extends from the ears to the frontal lobes with successively more complex functions occurring as one ascends the hierarchy of the nervous system. Several areas of the frontal lobe receive afferents from both early and late auditory processing regions within the temporal lobe. Afferents from the early part of the cortical auditory system, the auditory belt cortex, which are presumed to carry information regarding auditory features of sounds, project to only a few prefrontal regions and are most dense in the ventrolateral prefrontal cortex (VLPFC). In contrast, projections from the parabelt and the rostral superior temporal gyrus (STG) most likely convey more complex information and target a larger, widespread region of the prefrontal cortex. Neuronal responses reflect these anatomical projections as some prefrontal neurons exhibit responses to features in acoustic stimuli, while other neurons display task-related responses. For example, recording studies in non-human primates indicate that VLPFC is responsive to complex sounds including vocalizations and that VLPFC neurons in area 12/47 respond to sounds with similar acoustic morphology. In contrast, neuronal responses during auditory working memory involve a wider region of the prefrontal cortex. In humans, the frontal lobe is involved in auditory detection, discrimination, and working memory. Past research suggests that dorsal and ventral subregions of the prefrontal cortex process different types of information with dorsal cortex processing spatial/visual information and ventral cortex processing non-spatial/auditory information. While this is apparent in the non-human primate and in some neuroimaging studies, most research in humans indicates that specific task conditions, stimuli or previous experience may bias the recruitment of specific prefrontal regions, suggesting a more flexible role for the frontal lobe during auditory cognition. PMID:25100931

  11. Exascale Hardware Architectures Working Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemmert, S; Ang, J; Chiang, P

    2011-03-15

    The ASC Exascale Hardware Architecture working group is challenged to provide input on the following areas impacting the future use and usability of potential exascale computer systems: processor, memory, and interconnect architectures, as well as the power and resilience of these systems. Going forward, there are many challenging issues that will need to be addressed. First, power constraints in processor technologies will lead to steady increases in parallelism within a socket. Additionally, all cores may not be fully independent nor fully general purpose. Second, there is a clear trend toward less balanced machines, in terms of compute capability compared to memory and interconnect performance. In order to mitigate the memory issues, memory technologies will introduce 3D stacking, eventually moving on-socket and likely on-die, providing greatly increased bandwidth but unfortunately also likely providing smaller memory capacity per core. Off-socket memory, possibly in the form of non-volatile memory, will create a complex memory hierarchy. Third, communication energy will dominate the energy required to compute, such that interconnect power and bandwidth will have a significant impact. All of the above changes are driven by the need for greatly increased energy efficiency, as current technology will prove unsuitable for exascale due to the unsustainable power requirements of such a system. These changes will have the most significant impact on programming models and algorithms, but they will be felt across all layers of the machine. There is a clear need to engage all ASC working groups in planning for how to deal with technological changes of this magnitude. The primary function of the Hardware Architecture Working Group is to facilitate codesign with hardware vendors to ensure future exascale platforms are capable of efficiently supporting the ASC applications, which in turn need to meet the mission needs of the NNSA Stockpile Stewardship Program.
This issue is relatively immediate, as there is only a small window of opportunity to influence hardware design for 2018 machines. Given the short timeline, a firm co-design methodology with vendors is of prime importance.

  12. The derivation and approximation of coarse-grained dynamics from Langevin dynamics

    NASA Astrophysics Data System (ADS)

    Ma, Lina; Li, Xiantao; Liu, Chun

    2016-11-01

    We present a derivation of a coarse-grained description, in the form of a generalized Langevin equation, from the Langevin dynamics model that describes the dynamics of bio-molecules. The focus is placed on the form of the memory kernel function, the colored noise, and the second fluctuation-dissipation theorem that connects them. Also presented is a hierarchy of approximations for the memory and random noise terms, using rational approximations in the Laplace domain. These approximations offer increasing accuracy. More importantly, they eliminate the need to evaluate the integral associated with the memory term at each time step. Direct sampling of the colored noise can also be avoided within this framework. Therefore, the numerical implementation of the generalized Langevin equation is much more efficient.
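
    For orientation, a generalized Langevin equation of the kind derived here typically takes the following form, with memory kernel θ, colored noise R, and the second fluctuation-dissipation theorem linking them (generic notation, assumed rather than quoted from the paper):

```latex
M\,\dot{v}(t) = -\nabla U\bigl(x(t)\bigr)
              - \int_0^t \theta(t-s)\, v(s)\,\mathrm{d}s + R(t),
\qquad
\bigl\langle R(t)\, R(s)^{\mathsf{T}} \bigr\rangle = k_B T\, \theta(t-s)
```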

  13. Complex networks as an emerging property of hierarchical preferential attachment.

    PubMed

    Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J

    2015-12-01

    Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.
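
    The base ingredient of the model is ordinary preferential attachment; a single-level sketch is given below (the paper's contribution is applying this rule at many embedded levels, which is not shown; the graph size and seed are arbitrary).

```python
import random

# Single-level preferential attachment: each arriving node links to an
# existing node with probability proportional to that node's degree.
# Sampling uniformly from a list in which node i appears deg(i) times
# implements the degree-proportional choice.

def preferential_attachment(n, seed=None):
    """Grow an n-node graph one node (and one edge) at a time."""
    rng = random.Random(seed)
    edges = [(0, 1)]             # seed graph: a single edge
    degree_pool = [0, 1]         # node i occurs deg(i) times in this list
    for new in range(2, n):
        target = rng.choice(degree_pool)   # degree-proportional sampling
        edges.append((new, target))
        degree_pool += [new, target]
    return edges

g = preferential_attachment(50, seed=1)
```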

  14. Complex networks as an emerging property of hierarchical preferential attachment

    NASA Astrophysics Data System (ADS)

    Hébert-Dufresne, Laurent; Laurence, Edward; Allard, Antoine; Young, Jean-Gabriel; Dubé, Louis J.

    2015-12-01

    Real complex systems are not rigidly structured; no clear rules or blueprints exist for their construction. Yet, amidst their apparent randomness, complex structural properties universally emerge. We propose that an important class of complex systems can be modeled as an organization of many embedded levels (potentially infinite in number), all of them following the same universal growth principle known as preferential attachment. We give examples of such hierarchy in real systems, for instance, in the pyramid of production entities of the film industry. More importantly, we show how real complex networks can be interpreted as a projection of our model, from which their scale independence, their clustering, their hierarchy, their fractality, and their navigability naturally emerge. Our results suggest that complex networks, viewed as growing systems, can be quite simple, and that the apparent complexity of their structure is largely a reflection of their unobserved hierarchical nature.

  15. Risk prioritisation using the analytic hierarchy process

    NASA Astrophysics Data System (ADS)

    Sum, Rabihah Md.

    2015-12-01

    This study demonstrated how to use the Analytic Hierarchy Process (AHP) to prioritise risks of an insurance company. AHP is a technique to structure complex problems by arranging elements of the problems in a hierarchy, assigning numerical values to subjective judgements on the relative importance of the elements and synthesizing the judgements to determine which elements have the highest priority. The study is motivated by wide application of AHP as a prioritisation technique in complex problems. It aims to show AHP is able to minimise some limitations of risk assessment technique using likelihood and impact. The study shows AHP is able to provide consistency check on subjective judgements, organise a large number of risks into a structured framework, assist risk managers to make explicit risk trade-offs, and provide an easy to understand and systematic risk assessment process.
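
    The AHP arithmetic the study applies can be illustrated with a small worked example (all judgement values invented, not taken from the study): three risks are compared pairwise on Saaty's 1-9 scale, and priorities are approximated by the normalised geometric means of the rows, a standard approximation to the principal-eigenvector weights.

```python
from math import prod

# Hypothetical AHP example: matrix[i][j] records how many times more
# important risk i is judged to be than risk j (reciprocal below the
# diagonal). Row geometric means, normalised to sum to 1, approximate
# the principal eigenvector of the comparison matrix.

def ahp_priorities(matrix):
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                    # normalise to sum 1

A = [  # market vs credit vs operational risk (made-up judgements)
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(A)
```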

  16. Measuring the Evolution of Ontology Complexity: The Gene Ontology Case Study

    PubMed Central

    Dameron, Olivier; Bettembourg, Charles; Le Meur, Nolwenn

    2013-01-01

    Ontologies support automatic sharing, combination and analysis of life sciences data. They undergo regular curation and enrichment. We studied the impact of an ontology evolution on its structural complexity. As a case study we used the sixty monthly releases between January 2008 and December 2012 of the Gene Ontology and its three independent branches, i.e. biological processes (BP), cellular components (CC) and molecular functions (MF). For each case, we measured complexity by computing metrics related to the size, the node connectivity and the hierarchical structure. The number of classes and relations increased monotonically for each branch, with different growth rates. BP and CC had similar connectivity, superior to that of MF. Connectivity increased monotonically for BP, decreased for CC and remained stable for MF, with a marked increase for the three branches in November and December 2012. Hierarchy-related measures showed that CC and MF had similar proportions of leaves, average depths and average heights. BP had a lower proportion of leaves, and a higher average depth and average height. For BP and MF, the late 2012 increase of connectivity resulted in an increase of the average depth and average height and a decrease of the proportion of leaves, indicating that a major enrichment effort of the intermediate-level hierarchy occurred. The variation of the number of classes and relations in an ontology does not provide enough information about the evolution of its complexity. However, connectivity and hierarchy-related metrics revealed different patterns of values as well as of evolution for the three branches of the Gene Ontology. CC was similar to BP in terms of connectivity, and similar to MF in terms of hierarchy. Overall, BP complexity increased, CC was refined with the addition of leaves providing a finer level of annotations but decreasing slightly its complexity, and MF complexity remained stable. PMID:24146805
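    To make the hierarchy-related metrics concrete, here is a toy computation of the proportion of leaves and the average leaf depth on a small invented is-a tree (names are illustrative, not from the Gene Ontology, which is a DAG; the study's exact metric definitions may differ):

```python
# Toy is-a hierarchy, child -> parent; the root is "biological_process".
parent = {
    "metabolic_process": "biological_process",
    "signaling": "biological_process",
    "glycolysis": "metabolic_process",
    "lipid_metabolism": "metabolic_process",
}
nodes = {"biological_process"} | set(parent)
leaves = nodes - set(parent.values())  # nodes that are never a parent

def depth(n):
    # Number of edges from n up to the root.
    return 0 if n not in parent else 1 + depth(parent[n])

proportion_of_leaves = len(leaves) / len(nodes)
average_leaf_depth = sum(depth(n) for n in leaves) / len(leaves)
print(proportion_of_leaves)  # 3 leaves out of 5 nodes -> 0.6
```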

  17. Coevolution of landesque capital intensive agriculture and sociopolitical hierarchy

    PubMed Central

    Sheehan, Oliver; Gray, Russell D.; Atkinson, Quentin D.

    2018-01-01

    One of the defining trends of the Holocene has been the emergence of complex societies. Two essential features of complex societies are intensive resource use and sociopolitical hierarchy. Although it is widely agreed that these two phenomena are associated cross-culturally and have both contributed to the rise of complex societies, the causality underlying their relationship has been the subject of longstanding debate. Materialist theories of cultural evolution tend to view resource intensification as driving the development of hierarchy, but the reverse order of causation has also been advocated, along with a range of intermediate views. Phylogenetic methods have the potential to test between these different causal models. Here we report the results of a phylogenetic study that modeled the coevolution of one type of resource intensification—the development of landesque capital intensive agriculture—with political complexity and social stratification in a sample of 155 Austronesian-speaking societies. We found support for the coevolution of landesque capital with both political complexity and social stratification, but the contingent and nondeterministic nature of both of these relationships was clear. There was no indication that intensification was the “prime mover” in either relationship. Instead, the relationship between intensification and social stratification was broadly reciprocal, whereas political complexity was more of a driver than a result of intensification. These results challenge the materialist view and emphasize the importance of both material and social factors in the evolution of complex societies, as well as the complex and multifactorial nature of cultural evolution. PMID:29555760

  18. [Dynamic hierarchy of regulatory peptides. Structure of the induction relations of regulators as the target for therapeutic agents].

    PubMed

    Koroleva, S V; Miasoedov, N F

    2012-01-01

    Based on database information (literature from 1970-2010) on the effects of regulatory peptides (RP) and non-peptide neurotransmitters (dopamine, serotonin, norepinephrine, acetylcholine), possible cascade processes of endogenous regulators were analyzed. It was found that the entire continuum of RPs and mediators forms a chaotic soup of ordered three-level compartments. Such a dynamic functional hierarchy of endogenous regulators makes it possible to create start-up and corrective programs for a variety of physiological functions. Some examples of static and dynamic patterns of the induction processes of RPs and mediators (regulating states of anxiety, depression, learning and memory, feeding behavior, reproductive processes, etc.) are considered.

  19. Efficient algorithms for accurate hierarchical clustering of huge datasets: tackling the entire protein space

    PubMed Central

    Loewenstein, Yaniv; Portugaly, Elon; Fromer, Menachem; Linial, Michal

    2008-01-01

    Motivation: UPGMA (average linking) is probably the most popular algorithm for hierarchical data clustering, especially in computational biology. However, UPGMA requires the entire dissimilarity matrix in memory. Due to this prohibitive requirement, UPGMA is not scalable to very large datasets. Application: We present a novel class of memory-constrained UPGMA (MC-UPGMA) algorithms. Given any practical memory size constraint, this framework guarantees the correct clustering solution without explicitly requiring all dissimilarities in memory. The algorithms are general and are applicable to any dataset. We present a data-dependent characterization of hardness and clustering efficiency. The presented concepts are applicable to any agglomerative clustering formulation. Results: We apply our algorithm to the entire collection of protein sequences, to automatically build a comprehensive evolutionary-driven hierarchy of proteins from sequence alone. The newly created tree captures protein families better than state-of-the-art large-scale methods such as CluSTr, ProtoNet4 or single-linkage clustering. We demonstrate that leveraging the entire mass embodied in all sequence similarities makes it possible to significantly improve on current protein family clusterings, which are unable to directly tackle the sheer mass of this data. Furthermore, we argue that non-metric constraints are an inherent complexity of the sequence space and should not be overlooked. The robustness of UPGMA allows significant improvement, especially for multidomain proteins, and for large or divergent families. Availability: A comprehensive tree built from all UniProt sequence similarities, together with navigation and classification tools will be made available as part of the ProtoNet service. A C++ implementation of the algorithm is available on request. Contact: lonshy@cs.huji.ac.il PMID:18586742
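    The clustering rule itself is short. The sketch below is naive in-memory UPGMA on a toy integer-labeled distance matrix, i.e. exactly the all-pairs-in-memory formulation whose O(n^2) footprint MC-UPGMA is designed to avoid:

```python
# Naive UPGMA (average linkage) with the full distance matrix in memory.
# Distances are kept in a dict keyed by frozenset pairs of cluster ids.
def upgma(dist):
    members = {x for pair in dist for x in pair}
    size = {x: 1 for x in members}
    merges = []  # (cluster_a, cluster_b, merge_distance)
    nxt = max(members) + 1
    while len(size) > 1:
        pair = min(dist, key=dist.get)        # closest pair of clusters
        a, b = tuple(pair)
        merges.append((a, b, dist.pop(pair)))
        # Size-weighted average-linkage update against every other cluster.
        for c in list(size):
            if c in (a, b):
                continue
            d_ac = dist.pop(frozenset({a, c}))
            d_bc = dist.pop(frozenset({b, c}))
            dist[frozenset({nxt, c})] = (
                (size[a] * d_ac + size[b] * d_bc) / (size[a] + size[b])
            )
        size[nxt] = size.pop(a) + size.pop(b)  # merged cluster gets id nxt
        nxt += 1
    return merges

# Toy matrix over leaves 0-3; the merge distances come out as 2.0, 4.0, 8.0.
D = {frozenset({0, 1}): 2.0, frozenset({0, 2}): 6.0, frozenset({0, 3}): 10.0,
     frozenset({1, 2}): 6.0, frozenset({1, 3}): 10.0, frozenset({2, 3}): 4.0}
merges = upgma(D)
```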

  20. Benchmarking Memory Performance with the Data Cube Operator

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael A.; Shabanov, Leonid V.

    2004-01-01

    Data movement across a computer memory hierarchy and across computational grids is known to be a limiting factor for applications processing large data sets. We use the Data Cube Operator on an Arithmetic Data Set, called ADC, to benchmark the capabilities of computers and of computational grids to handle large distributed data sets. We present a prototype implementation of a parallel algorithm for computation of the operator. The algorithm follows a known approach for computing views from the smallest parent. The ADC stresses all levels of grid memory and storage by producing some of the 2^d views of an Arithmetic Data Set of d-tuples described by a small number of integers. We control the data intensity of the ADC by selecting the tuple parameters, the sizes of the views, and the number of realized views. Benchmarking results of the memory performance of a number of computer architectures and of a small computational grid are presented.
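    For readers unfamiliar with the operator: the data cube over d attributes materializes all 2^d group-by views. The sketch below recomputes each view from the base data for clarity; the paper's algorithm instead derives each view from its smallest already-materialized parent view, which is what makes the computation memory-intensive:

```python
from itertools import combinations

# All 2^d group-by views of a tiny data set of d-tuples. Each view
# aggregates a measure over one subset of the d attributes.
def data_cube(rows, measure):
    d = len(rows[0])
    views = {}
    for k in range(d + 1):
        for attrs in combinations(range(d), k):
            view = {}
            for row in rows:
                key = tuple(row[i] for i in attrs)
                view[key] = view.get(key, 0) + measure(row)
            views[attrs] = view
    return views

rows = [(1, 2), (1, 3), (2, 3)]
cube = data_cube(rows, lambda r: 1)   # count as the measure
print(len(cube))  # 2^2 = 4 views for d = 2
```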

  1. Lasting Adaptations in Social Behavior Produced by Social Disruption and Inhibition of Adult Neurogenesis

    PubMed Central

    Opendak, Maya; Offit, Lily; Monari, Patrick; Schoenfeld, Timothy J.; Sonti, Anup N.; Cameron, Heather A.

    2016-01-01

    Research on social instability has focused on its detrimental consequences, but most people are resilient and respond by invoking various coping strategies. To investigate cellular processes underlying such strategies, a dominance hierarchy of rats was formed and then destabilized. Regardless of social position, rats from disrupted hierarchies had fewer new neurons in the hippocampus compared with rats from control cages and those from stable hierarchies. Social disruption produced a preference for familiar over novel conspecifics, a change that did not involve global memory impairments or increased anxiety. Using the neuropeptide oxytocin as a tool to increase neurogenesis in the hippocampus of disrupted rats restored preference for novel conspecifics to predisruption levels. Conversely, reducing the number of new neurons by limited inhibition of adult neurogenesis in naive transgenic GFAP–thymidine kinase rats resulted in social behavior similar to disrupted rats. Together, these results provide novel mechanistic evidence that social disruption shapes behavior in a potentially adaptive way, possibly by reducing adult neurogenesis in the hippocampus. SIGNIFICANCE STATEMENT To investigate cellular processes underlying adaptation to social instability, a dominance hierarchy of rats was formed and then destabilized. Regardless of social position, rats from disrupted hierarchies had fewer new neurons in the hippocampus compared with rats from control cages and those from stable hierarchies. Unexpectedly, these changes were accompanied by changes in social strategies without evidence of impairments in cognition or anxiety regulation. Restoring adult neurogenesis in disrupted rats using oxytocin and conditionally suppressing the production of new neurons in socially naive GFAP–thymidine kinase rats showed that loss of 6-week-old neurons may be responsible for adaptive changes in social behavior. PMID:27358459

  2. Using architecture information and real-time resource state to reduce power consumption and communication costs in parallel applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, James M.; Devine, Karen Dragon; Gentile, Ann C.

    2014-09-01

    As computer systems grow in both size and complexity, the need for applications and run-time systems to adjust to their dynamic environment also grows. The goal of the RAAMP LDRD was to combine static architecture information and real-time system state with algorithms to conserve power, reduce communication costs, and avoid network contention. We developed new data collection and aggregation tools to extract static hardware information (e.g., node/core hierarchy, network routing) as well as real-time performance data (e.g., CPU utilization, power consumption, memory bandwidth saturation, percentage of used bandwidth, number of network stalls). We created application interfaces that allowed this data to be used easily by algorithms. Finally, we demonstrated the benefit of integrating system and application information for two use cases. The first used real-time power consumption and memory bandwidth saturation data to throttle concurrency to save power without increasing application execution time. The second used static or real-time network traffic information to reduce or avoid network congestion by remapping MPI tasks to allocated processors. Results from our work are summarized in this report; more details are available in our publications [2, 6, 14, 16, 22, 29, 38, 44, 51, 54].

  3. The future of memory

    NASA Astrophysics Data System (ADS)

    Marinella, M.

    In the not too distant future, the traditional memory and storage hierarchy may be replaced by a single Storage Class Memory (SCM) device integrated on or near the logic processor. Traditional magnetic hard drives, NAND flash, DRAM, and higher level caches (L2 and up) will be replaced with a single high performance memory device. The Storage Class Memory paradigm will require high speed (< 100 ns read/write), excellent endurance (> 10^12), nonvolatility (retention > 10 years), and low switching energies (< 10 pJ per switch). The International Technology Roadmap for Semiconductors (ITRS) has recently evaluated several potential candidate SCM technologies, including Resistive (or Redox) RAM, Spin Torque Transfer RAM (STT-MRAM), and phase change memory (PCM). All of these devices show potential well beyond that of current flash technologies, and research efforts are underway to improve their endurance, write speeds, and scalability to be on par with DRAM. This progress has interesting implications for space electronics: each of these emerging device technologies shows excellent resistance to the types of radiation typically found in space applications. Commercially developed, high density storage class memory-based systems may include a memory that is physically radiation hard, and suitable for space applications without major shielding efforts. This paper reviews the Storage Class Memory concept, emerging memory devices, and their possible applicability to radiation hardened electronics for space.

  4. Schooling Space: Where South Africans Learnt to Position Themselves within the Hierarchy of Apartheid Society

    ERIC Educational Resources Information Center

    Karlsson, Jenni

    2004-01-01

    In setting out to understand how South African school space was harnessed to the political project of apartheid, the author explores memory accounts from several adults who attended school during the apartheid era. Her analysis of their reminiscences found that non-pedagogic areas of the school and public domain beyond school premises were places…

  5. Parallel Multivariate Spatio-Temporal Clustering of Large Ecological Datasets on Hybrid Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sreepathi, Sarat; Kumar, Jitendra; Mills, Richard T.

    A proliferation of data from vast networks of remote sensing platforms (satellites, unmanned aircraft systems (UAS), airborne etc.), observational facilities (meteorological, eddy covariance etc.), state-of-the-art sensors, and simulation models offers unprecedented opportunities for scientific discovery. Unsupervised classification is a widely applied data mining approach to derive insights from such data. However, classification of very large data sets is a complex computational problem that requires efficient numerical algorithms and implementations on high performance computing (HPC) platforms. Additionally, increasing power, space, cooling and efficiency requirements have led to the deployment of hybrid supercomputing platforms with complex architectures and memory hierarchies like the Titan system at Oak Ridge National Laboratory. The advent of such accelerated computing architectures offers new challenges and opportunities for big data analytics in general and specifically, large scale cluster analysis in our case. Although there is an existing body of work on parallel cluster analysis, those approaches do not fully meet the needs imposed by the nature and size of our large data sets. Moreover, they had scaling limitations and were mostly limited to traditional distributed memory computing platforms. We present a parallel Multivariate Spatio-Temporal Clustering (MSTC) technique based on k-means cluster analysis that can target hybrid supercomputers like Titan. We developed a hybrid MPI, CUDA and OpenACC implementation that can utilize both CPU and GPU resources on computational nodes. We describe performance results on Titan that demonstrate the scalability and efficacy of our approach in processing large ecological data sets.
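    At the core of MSTC is standard k-means (Lloyd's algorithm). The serial sketch below shows the iteration that the authors' hybrid MPI/CUDA/OpenACC implementation distributes; the assignment step parallelizes trivially over observations, which is what makes GPU acceleration attractive. Seeding and the toy data are illustrative, not from the paper:

```python
# Serial Lloyd's k-means: alternate assignment and centroid-update steps.
def kmeans(points, k, iters=50):
    centers = [list(p) for p in points[:k]]   # deterministic toy seeding
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                       # assignment step
            j = min(range(k),
                    key=lambda c: sum((x - y) ** 2
                                      for x, y in zip(p, centers[c])))
            groups[j].append(p)
        for j, g in enumerate(groups):         # update step
            if g:
                centers[j] = [sum(col) / len(g) for col in zip(*g)]
    return centers

# Two well-separated toy clusters in 2-D.
centers = kmeans([(0, 0), (0, 1), (10, 10), (10, 11)], k=2)
```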

  6. The mammary cellular hierarchy and breast cancer.

    PubMed

    Oakes, Samantha R; Gallego-Ortega, David; Ormandy, Christopher J

    2014-11-01

    Advances in the study of hematopoietic cell maturation have paved the way to a deeper understanding of the stem and progenitor cellular hierarchy in the mammary gland. The mammary epithelium, unlike the hematopoietic cellular hierarchy, sits in a complex niche where communication between epithelial cells and signals from the systemic hormonal milieu, as well as from extra-cellular matrix, influence cell fate decisions and contribute to tissue homeostasis. We review the discovery, definition and regulation of the mammary cellular hierarchy and we describe the development of the concepts that have guided our investigations. We outline recent advances in in vivo lineage tracing that is now challenging many of our assumptions regarding the behavior of mammary stem cells, and we show how understanding these cellular lineages has altered our view of breast cancer.

  7. Associative Recognition Memory Awareness Improved by Theta-Burst Stimulation of Frontopolar Cortex

    PubMed Central

    Ryals, Anthony J.; Rogers, Lynn M.; Gross, Evan Z.; Polnaszek, Kelly L.; Voss, Joel L.

    2016-01-01

    Neuroimaging and lesion studies have implicated specific prefrontal cortex locations in subjective memory awareness. Based on this evidence, a rostrocaudal organization has been proposed whereby increasingly anterior prefrontal regions are increasingly involved in memory awareness. We used theta-burst transcranial magnetic stimulation (TBS) to temporarily modulate dorsolateral versus frontopolar prefrontal cortex to test for distinct causal roles in memory awareness. In three sessions, participants received TBS bilaterally to frontopolar cortex, dorsolateral prefrontal cortex, or a control location prior to performing an associative-recognition task involving judgments of memory awareness. Objective memory performance (i.e., accuracy) did not differ based on stimulation location. In contrast, frontopolar stimulation significantly influenced several measures of memory awareness. During study, judgments of learning were more accurate such that lower ratings were given to items that were subsequently forgotten selectively following frontopolar TBS. Confidence ratings during test were also higher for correct trials following frontopolar TBS. Finally, trial-by-trial correspondence between overt performance and subjective awareness during study demonstrated a linear increase across control, dorsolateral, and frontopolar TBS locations, supporting a rostrocaudal hierarchy of prefrontal contributions to memory awareness. These findings indicate that frontopolar cortex contributes causally to memory awareness, which was improved selectively by anatomically targeted TBS. PMID:25577574

  8. On the origin of non-exponential fluorescence decays in enzyme-ligand complex

    NASA Astrophysics Data System (ADS)

    Wlodarczyk, Jakub; Kierdaszuk, Borys

    2004-05-01

    Complex fluorescence decays have usually been analyzed with the aid of a multi-exponential model, but interpretation of the individual exponential terms has not been adequately characterized. In such cases the intensity decays have also been analyzed in terms of a continuous lifetime distribution, as a consequence of the interaction of the fluorophore with its environment, conformational heterogeneity, or their dynamical nature. We show that non-exponential fluorescence decay of enzyme-ligand complexes may result from time-dependent energy transport. The latter, in our opinion, may be accounted for by electron transport from the protein tyrosines to their neighboring residues. We introduce a time-dependent hopping rate of the form v(t) ~ (a+bt)^(-1). This in turn leads to a luminescence decay function of the form I(t) = I0 exp(-t/τ1)(1 + lt/(γτ2))^(-γ). Such a decay function provides good fits to highly complex fluorescence decays. The power-like tail implies a time hierarchy in the energy migration process due to the hierarchical energy-level structure. Moreover, such a power-like term is a manifestation of the so-called Tsallis nonextensive statistics and is suitable for description of systems with long-range interactions, memory effects, as well as fluctuations of the characteristic fluorescence lifetime. The proposed decay function was applied in the analysis of fluorescence decays of a tyrosine protein, i.e. the enzyme purine nucleoside phosphorylase from E. coli in a complex with formycin A (an inhibitor) and orthophosphate (a co-substrate).
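    The quoted decay law is straightforward to evaluate numerically. The parameter values below are arbitrary, chosen only to illustrate the shape: with τ1 much longer than the observation window the exponential factor is near 1 and the power-like factor dominates the tail:

```python
import math

# I(t) = I0 * exp(-t/tau1) * (1 + l*t/(gamma*tau2))^(-gamma)
# Parameter values are illustrative, not fitted to any data.
def intensity(t, I0=1.0, tau1=1e6, l=1.0, gamma=2.0, tau2=1.0):
    return I0 * math.exp(-t / tau1) * (1 + l * t / (gamma * tau2)) ** (-gamma)

print(intensity(0.0))  # 1.0 at t = 0 by construction
```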

  9. Coevolution of landesque capital intensive agriculture and sociopolitical hierarchy.

    PubMed

    Sheehan, Oliver; Watts, Joseph; Gray, Russell D; Atkinson, Quentin D

    2018-04-03

    One of the defining trends of the Holocene has been the emergence of complex societies. Two essential features of complex societies are intensive resource use and sociopolitical hierarchy. Although it is widely agreed that these two phenomena are associated cross-culturally and have both contributed to the rise of complex societies, the causality underlying their relationship has been the subject of longstanding debate. Materialist theories of cultural evolution tend to view resource intensification as driving the development of hierarchy, but the reverse order of causation has also been advocated, along with a range of intermediate views. Phylogenetic methods have the potential to test between these different causal models. Here we report the results of a phylogenetic study that modeled the coevolution of one type of resource intensification-the development of landesque capital intensive agriculture-with political complexity and social stratification in a sample of 155 Austronesian-speaking societies. We found support for the coevolution of landesque capital with both political complexity and social stratification, but the contingent and nondeterministic nature of both of these relationships was clear. There was no indication that intensification was the "prime mover" in either relationship. Instead, the relationship between intensification and social stratification was broadly reciprocal, whereas political complexity was more of a driver than a result of intensification. These results challenge the materialist view and emphasize the importance of both material and social factors in the evolution of complex societies, as well as the complex and multifactorial nature of cultural evolution. Copyright © 2018 the Author(s). Published by PNAS.

  10. Research findings from the Memories of Nursing oral history project.

    PubMed

    Thomas, Gail; Rosser, Elizabeth

    2017-02-23

    Capturing the stories of nurses who practised in the past offers the opportunity to reflect on the changes in practice over time to determine lessons for the future. This article shares some of the memories of a group of 16 nurses who were interviewed in Bournemouth, UK, between 2009 and 2016. Thematic analysis of the interview transcripts identified a number of themes, three of which are presented: defining moments, hygiene and hierarchy. The similarities and differences between their experiences and contemporary nursing practice are discussed to highlight how it may be timely to think back in order to take practice forward positively in the future.

  11. Arithmetic Data Cube as a Data Intensive Benchmark

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael A.; Shabano, Leonid

    2003-01-01

    Data movement across computational grids and across the memory hierarchy of individual grid machines is known to be a limiting factor for applications involving large data sets. In this paper we introduce the Data Cube Operator on an Arithmetic Data Set, which we call the Arithmetic Data Cube (ADC). We propose to use the ADC to benchmark grid capabilities to handle large distributed data sets. The ADC stresses all levels of grid memory by producing 2^d views of an Arithmetic Data Set of d-tuples described by a small number of parameters. We control the data intensity of the ADC by controlling the sizes of the views through the choice of the tuple parameters.

  12. GPU color space conversion

    NASA Astrophysics Data System (ADS)

    Chase, Patrick; Vondran, Gary

    2011-01-01

    Tetrahedral interpolation is commonly used to implement continuous color space conversions from sparse 3D and 4D lookup tables. We investigate the implementation and optimization of tetrahedral interpolation algorithms for GPUs, and compare to the best known CPU implementations as well as to a well known GPU-based trilinear implementation. We show that a $500 NVIDIA GTX-580 GPU is 3x faster than a $1000 Intel Core i7 980X CPU for 3D interpolation, and 9x faster for 4D interpolation. Performance-relevant GPU attributes are explored including thread scheduling, local memory characteristics, global memory hierarchy, and cache behaviors. We consider existing tetrahedral interpolation algorithms and tune based on the structure and branching capabilities of current GPUs. Global memory performance is improved by reordering and expanding the lookup table to ensure optimal access behaviors. Per multiprocessor local memory is exploited to implement optimally coalesced global memory accesses, and local memory addressing is optimized to minimize bank conflicts. We explore the impacts of lookup table density upon computation and memory access costs. Also presented are CPU-based 3D and 4D interpolators, using SSE vector operations that are faster than any previously published solution.
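    For reference, the core of tetrahedral interpolation within a single 3D lookup-table cell can be sketched as follows (a generic textbook formulation, not the authors' GPU kernel): sort the fractional coordinates to select one of six tetrahedra, then blend four corner values with weights taken from the successive differences of the sorted fractions:

```python
# cell maps the 8 corner bit-coordinates (bx, by, bz) to LUT values;
# f is the fractional position (fx, fy, fz) inside the cell, each in [0, 1].
def tetra_interp(cell, f):
    order = sorted(range(3), key=lambda i: -f[i])  # axes by descending fraction
    fs = [f[i] for i in order]
    # The sort selects one of six tetrahedra; the blend weights are the
    # successive differences of the sorted fractions (they sum to 1).
    weights = [1 - fs[0], fs[0] - fs[1], fs[1] - fs[2], fs[2]]
    corner = [0, 0, 0]
    corners = [tuple(corner)]
    for axis in order:                             # walk (0,0,0) -> (1,1,1)
        corner[axis] = 1
        corners.append(tuple(corner))
    return sum(w * cell[c] for w, c in zip(weights, corners))

# Sanity check: the scheme reproduces any function linear in (x, y, z) exactly.
cell = {(x, y, z): x + 2 * y + 3 * z
        for x in (0, 1) for y in (0, 1) for z in (0, 1)}
print(tetra_interp(cell, (0.5, 0.25, 0.75)))  # 0.5 + 2*0.25 + 3*0.75 = 3.25
```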

  13. What people know about electronic devices: A descriptive study

    NASA Astrophysics Data System (ADS)

    Kieras, D. E.

    1982-10-01

    Informal descriptive results on the nature of people's natural knowledge of electronic devices are presented. Expert and nonexpert subjects were given an electronic device to examine and describe orally. The devices ranged from familiar everyday devices, to those familiar only to the expert, to unusual devices unfamiliar even to an expert. College students were asked to describe everyday devices from memory. The results suggest that device knowledge consists of the major categories of what the device is for, how it is used, its structure in terms of subdevices, its physical layout, how it works, and its behavior. A preliminary theoretical framework for device knowledge is that it consists of a hierarchy of schemas, corresponding to a hierarchical decomposition of the device into subdevices, with each level containing the major categories of information.

  14. Complexity, signal detection, and the application of ergonomics: reflections on a healthcare case study.

    PubMed

    Dekker, Sidney

    2012-05-01

    Complexity is a defining characteristic of healthcare, and ergonomic interventions in clinical practice need to take into account aspects vital for the success or failure of new technology. The introduction of new monitoring technology, for example, creates many ripple effects through clinical relationships and agents' cross-adaptations. This paper uses the signal detection paradigm to account for a case in which multiple clinical decision makers, across power hierarchies and gender gaps, manipulate each others' sensitivities to evidence and decision criteria. These are possible to analyze and predict with an applied ergonomics that is sensitive to the social complexities of the workplace, including power, gender, hierarchy and fuzzy system boundaries. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
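    In the signal detection paradigm invoked here, an observer is characterized by a sensitivity to evidence (d′) and a decision criterion (c), both computed from hit and false-alarm rates. A minimal illustration under the standard equal-variance Gaussian model (the rates below are hypothetical, not from the case study):

```python
from statistics import NormalDist

z = NormalDist().inv_cdf   # probit (inverse standard normal CDF)

# Equal-variance Gaussian SDT: sensitivity d' and criterion c.
def sdt(hit_rate, fa_rate):
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion

d, c = sdt(0.84, 0.16)   # symmetric rates -> unbiased observer (c = 0)
```

Shifting either quantity, as the clinicians in the case study do to one another, changes hit and false-alarm rates without any change in the underlying evidence.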

  15. On the thermodynamics of multilevel evolution.

    PubMed

    Tessera, Marc; Hoelzer, Guy A

    2013-09-01

    Biodiversity is hierarchically structured both phylogenetically and functionally. Phylogenetic hierarchy is understood as a product of branching organic evolution as described by Darwin. Ecosystem biologists understand some aspects of functional hierarchy, such as food web architecture, as a product of evolutionary ecology; but functional hierarchy extends to much lower scales of organization than those studied by ecologists. We argue that the more general use of the term "evolution" employed by physicists and applied to non-living systems connects directly to the narrow biological meaning. Physical evolution is best understood as a thermodynamic phenomenon, and this perspective comfortably includes all of biological evolution. We suggest four dynamical factors that build on each other in a hierarchical fashion and set the stage for the Darwinian evolution of biological systems: (1) the entropic erosion of structure; (2) the construction of dissipative systems; (3) the reproduction of growing systems and (4) the historical memory accrued to populations of reproductive agents by the acquisition of hereditary mechanisms. A particular level of evolution can underpin the emergence of higher levels, but evolutionary processes persist at each level in the hierarchy. We also argue that particular evolutionary processes can occur at any level of the hierarchy where they are not obstructed by material constraints. This theoretical framework provides an extensive basis for understanding natural selection as a multilevel process. The extensive literature on thermodynamics in turn provides an important advantage to this perspective on the evolution of higher levels of organization, such as the evolution of altruism that can accompany the emergence of social organization. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. Hierarchy in directed random networks.

    PubMed

    Mones, Enys

    2013-02-01

    In recent years, the theory and application of complex networks have been developing quickly in a remarkable way due to the increasing amount of data from real systems and the fruitful application of powerful methods used in statistical physics. Many important characteristics of social or biological systems can be described by the study of their underlying structure of interactions. Hierarchy is one of these features that can be formulated in the language of networks. In this paper we present some (qualitative) analytic results on the hierarchical properties of random network models with zero correlations and also investigate, mainly numerically, the effects of different types of correlations. The behavior of the hierarchy is different in the absence and the presence of giant components. We show that the hierarchical structure can be drastically different if there are one-point correlations in the network. We also show numerical results suggesting that the hierarchy does not change monotonically with the correlations and there is an optimal level of nonzero correlations maximizing the level of hierarchy.

  17. Towards organizing health knowledge on community-based health services.

    PubMed

    Akbari, Mohammad; Hu, Xia; Nie, Liqiang; Chua, Tat-Seng

    2016-12-01

    Online community-based health services accumulate a huge amount of unstructured health question answering (QA) records at a continuously increasing pace. The ability to organize these health QA records has been found to be effective for data access. Existing approaches for organizing information are often not applicable to the health domain due to the nature of the domain, characterized by complex relations among entities, a large vocabulary gap, and heterogeneous users. To tackle these challenges, we propose a top-down organization scheme, which can automatically assign unstructured health-related records into a hierarchy using prior domain knowledge. Besides automatic hierarchy prototype generation, it also enables each data instance to be associated with multiple leaf nodes and profiles each node with terminologies. Based on this scheme, we design a hierarchy-based health information retrieval system. Experiments on a real-world dataset demonstrate the effectiveness of our scheme in organizing health QA into a topic hierarchy and retrieving health QA records from the topic hierarchy.

  18. Abstraction of complex concepts with a refined partial-area taxonomy of SNOMED

    PubMed Central

    Wang, Yue; Halper, Michael; Wei, Duo; Perl, Yehoshua; Geller, James

    2012-01-01

    An algorithmically-derived abstraction network, called the partial-area taxonomy, for a SNOMED hierarchy has led to the identification of concepts considered complex. The designation “complex” is arrived at automatically on the basis of structural analyses of overlap among the constituent concept groups of the partial-area taxonomy. Such complex concepts, called overlapping concepts, constitute a tangled portion of a hierarchy and can be obstacles to users trying to gain an understanding of the hierarchy’s content. A new methodology for partitioning the entire collection of overlapping concepts into singly-rooted groups that are more manageable to work with and comprehend is presented. Different kinds of overlapping concepts with varying degrees of complexity are identified. This leads to an abstract model of the overlapping concepts called the disjoint partial-area taxonomy, which serves as a vehicle for enhanced, high-level display. The methodology is demonstrated with an application to SNOMED’s Specimen hierarchy. Overall, the resulting disjoint partial-area taxonomy offers a refined view of the hierarchy’s structural organization and conceptual content that can aid users, such as maintenance personnel, working with SNOMED. The utility of the disjoint partial-area taxonomy as the basis for a SNOMED auditing regimen is presented in a companion paper. PMID:21878396

  19. Quality assurance of chemical ingredient classification for the National Drug File - Reference Terminology.

    PubMed

    Zheng, Ling; Yumak, Hasan; Chen, Ling; Ochs, Christopher; Geller, James; Kapusnik-Uner, Joan; Perl, Yehoshua

    2017-09-01

    The National Drug File - Reference Terminology (NDF-RT) is a large and complex drug terminology consisting of several classification hierarchies on top of an extensive collection of drug concepts. These hierarchies provide important information about clinical drugs, e.g., their chemical ingredients, mechanisms of action, dosage form and physiological effects. Within NDF-RT such information is represented using tens of thousands of roles connecting drugs to classifications. In previous studies, we have introduced various kinds of Abstraction Networks to summarize the content and structure of terminologies in order to facilitate their visual comprehension and support quality assurance of terminologies. However, these previous kinds of Abstraction Networks are not appropriate for summarizing the NDF-RT classification hierarchies, due to their unique structure. In this paper, we present the novel Ingredient Abstraction Network (IAbN) to summarize, visualize and support the audit of NDF-RT's Chemical Ingredients hierarchy and its associated drugs. A common theme in our quality assurance framework is to use characterizations of sets of concepts, revealed by the Abstraction Network structure, to identify concepts whose modeling is more complex than that of other concepts. For the IAbN, we characterize drug ingredient concepts as more complex if they belong to IAbN groups with multiple parent groups. We show that such concepts have a statistically significantly higher rate of errors than a control sample and identify two especially common patterns of errors. Copyright © 2017 Elsevier Inc. All rights reserved.
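
    The IAbN characterization above reduces to a purely structural test: flag any group that has more than one parent group in the group hierarchy. A minimal sketch of that test follows, using a tiny made-up ingredient hierarchy (the group names are invented for illustration and are not NDF-RT content):

```python
# Hypothetical miniature group hierarchy: group name -> list of parent groups.
# The IAbN-style criterion flags groups with more than one parent as complex.
parent_groups = {
    "analgesics":                 [],
    "salicylates":                ["analgesics"],
    "propionic-acid-derivatives": ["analgesics"],
    "aspirin-group":              ["salicylates"],
    "ibuprofen-group":            ["propionic-acid-derivatives", "salicylates"],
}

complex_groups = sorted(g for g, ps in parent_groups.items() if len(ps) > 1)
print(complex_groups)  # ['ibuprofen-group']
```

    Concepts falling into `complex_groups` would then be prioritized for auditing, mirroring the paper's finding that multi-parent groups carry a higher error rate.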

  20. Circular instead of hierarchical: methodological principles for the evaluation of complex interventions

    PubMed Central

    Walach, Harald; Falkenberg, Torkel; Fønnebø, Vinjar; Lewith, George; Jonas, Wayne B

    2006-01-01

    Background The reasoning behind evaluating medical interventions is that a hierarchy of methods exists which successively produces improved, and therefore more rigorous, evidence upon which to base clinical decisions. At the foundation of this hierarchy are case studies, retrospective and prospective case series, followed by cohort studies with historical and concomitant non-randomized controls. Open-label randomized controlled studies (RCTs) and, finally, blinded placebo-controlled RCTs, which offer the most internal validity, are considered the most reliable evidence. Rigorous RCTs remove bias. Evidence from RCTs forms the basis of meta-analyses and systematic reviews. This hierarchy, founded on a pharmacological model of therapy, is generalized to other interventions which may be complex and non-pharmacological (healing, acupuncture and surgery). Discussion The hierarchical model is valid for limited questions of efficacy, for instance for regulatory purposes and newly devised products and pharmacological preparations. It is inadequate for the evaluation of complex interventions such as physiotherapy, surgery and complementary and alternative medicine (CAM). This has to do with the essential tension between internal validity (rigor and the removal of bias) and external validity (generalizability). Summary Instead of an Evidence Hierarchy, we propose a Circular Model. This would imply a multiplicity of methods, using different designs, counterbalancing their individual strengths and weaknesses to arrive at pragmatic but equally rigorous evidence which would provide significant assistance in clinical and health systems innovation. Such evidence would better inform national health care technology assessment agencies and promote evidence-based health reform. PMID:16796762

  1. Cognitive Theory within the Framework of an Information Processing Model and Learning Hierarchy: Viable Alternative to the Bloom-Mager System.

    ERIC Educational Resources Information Center

    Stahl, Robert J.

    This review of the current status of the human information processing model presents the Stahl Perceptual Information Processing and Operations Model (SPInPrOM) as a model of how thinking, memory, and the processing of information take place within the individual learner. A related system, the Domain of Cognition, is presented as an alternative to…

  2. Justification of Estimates for Fiscal Year 1983 Submitted to Congress.

    DTIC Science & Technology

    1982-02-01

    hierarchies to aid software production; completion of the components of an adaptive suspension vehicle including a storage energy unit, hydraulics, laser...and corrosion (long storage times), and radiation-induced breakdown. Solid-lubricated main engine bearings for cruise missile engines would offer...environments will cause "soft error" (computational and memory storage errors) in advanced microelectronic circuits. Research on high-speed, low-power

  3. A neural model of the temporal dynamics of figure-ground segregation in motion perception.

    PubMed

    Raudies, Florian; Neumann, Heiko

    2010-03-01

    How does the visual system manage to segment a visual scene into surfaces and objects and to attend to a target object? Based on psychological and physiological investigations, it has been proposed that the perceptual organization and segmentation of a scene is achieved by the processing at different levels of the visual cortical hierarchy. According to this, motion onset detection, motion-defined shape segregation, and target selection are accomplished by processes which bind together simple features into fragments of increasingly complex configurations at different levels in the processing hierarchy. As an alternative to this hierarchical processing hypothesis, it has been proposed that the processing stages for feature detection and segregation are reflected in different temporal episodes in the response patterns of individual neurons. Such temporal epochs have been observed in the activation pattern of neurons as low as in area V1. Here, we present a neural network model of motion detection, figure-ground segregation and attentive selection which explains these response patterns in a unifying framework. Based on known principles of functional architecture of the visual cortex, we propose that initial motion and motion boundaries are detected at different and hierarchically organized stages in the dorsal pathway. Visual shapes that are defined by boundaries, which were generated from juxtaposed opponent motions, are represented at different stages in the ventral pathway. Model areas in the different pathways interact through feedforward and modulating feedback, while mutual interactions enable the communication between motion and form representations. Selective attention is devoted to shape representations by sending modulating feedback signals from higher levels (working memory) to intermediate levels to enhance their responses. 
Areas in the motion and form pathway are coupled through top-down feedback with V1 cells at the bottom end of the hierarchy. We propose that the different temporal episodes in the response pattern of V1 cells, as recorded in recent experiments, reflect the strength of modulating feedback signals. This feedback results from the consolidated shape representations from coherent motion patterns and the attentive modulation of responses along the cortical hierarchy. The model makes testable predictions concerning the duration and delay of the temporal episodes of V1 cell responses as well as their response variations that were caused by modulating feedback signals. Copyright 2009 Elsevier Ltd. All rights reserved.

  4. Cellular automaton simulation examining progenitor hierarchy structure effects on mammary ductal carcinoma in situ.

    PubMed

    Bankhead, Armand; Magnuson, Nancy S; Heckendorn, Robert B

    2007-06-07

    A computer simulation is used to model ductal carcinoma in situ, a form of non-invasive breast cancer. The simulation uses known histological morphology, cell types, and stochastic cell proliferation to evolve tumorous growth within a duct. The ductal simulation is based on a hybrid cellular automaton design using genetic rules to determine each cell's behavior. The genetic rules are a mutable abstraction that demonstrates genetic heterogeneity in a population. Our goal was to examine the role (if any) that recently discovered mammary stem cell hierarchies play in genetic heterogeneity, DCIS initiation and aggressiveness. Results show that simpler progenitor hierarchies result in greater genetic heterogeneity and evolve DCIS significantly faster. However, the more complex progenitor hierarchy structure was able to sustain the rapid reproduction of a cancer cell population for longer periods of time.
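
    The hybrid cellular-automaton design described above can be caricatured in a few lines of code. The sketch below is an invented toy, not the paper's model (the cell levels, the 0.5 division probability, and the 0.05 mutation rate are all assumptions): each division copies or mutates an abstract genome tag, and the hierarchy depth controls how many progenitor stages a lineage passes through before differentiating.

```python
import random

def simulate(levels, steps=200, mut_rate=0.05, capacity=500):
    """Toy progenitor hierarchy: level 0 is a stem cell, level `levels` is fully
    differentiated. Returns the number of distinct genome tags in the final
    population, a crude proxy for genetic heterogeneity."""
    random.seed(42)                # fixed seed for a reproducible run
    cells = [(0, 0)]               # (hierarchy level, genome tag): one ancestral stem cell
    next_tag = 1
    for _ in range(steps):
        newborns = []
        for level, tag in cells:
            if level < levels and random.random() < 0.5:   # stochastic division
                child_tag = tag
                if random.random() < mut_rate:             # mutable "genetic rules"
                    child_tag, next_tag = next_tag, next_tag + 1
                newborns.append((level + 1, child_tag))
        # fully differentiated cells are shed; the duct caps the population
        cells = ([c for c in cells if c[0] < levels] + newborns)[:capacity]
    return len({tag for _, tag in cells})

print(simulate(levels=2), simulate(levels=4))  # heterogeneity, shallow vs deep hierarchy
```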

  5. 3D printed hierarchical honeycombs with shape integrity under large compressive deformations

    DOE PAGES

    Chen, Yanyu; Li, Tiantian; Jia, Zian; ...

    2017-10-12

    Here, we describe the in-plane compressive performance of a new type of hierarchical cellular structure created by replacing cell walls in regular honeycombs with triangular lattice configurations. The fabrication of this relatively complex material architecture with size features spanning from micrometer to centimeter is facilitated by the availability of commercial 3D printers. We apply to these hierarchical honeycombs a thermal treatment that facilitates the shape preservation and structural integrity of the structures under large compressive loading. The proposed hierarchical honeycombs exhibit a progressive failure mode, along with improved stiffness and energy absorption under uniaxial compression. High energy dissipation and shape integrity at large imposed strains (up to 60%) have also been observed in these hierarchical honeycombs under cyclic loading. Experimental and numerical studies suggest that these anomalous mechanical behaviors are attributed to the introduction of a structural hierarchy, intrinsically controlled by the cell wall slenderness of the triangular lattice and by the shape memory effect induced by the thermal and mechanical compressive treatment.

  6. 3D printed hierarchical honeycombs with shape integrity under large compressive deformations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yanyu; Li, Tiantian; Jia, Zian

    Here, we describe the in-plane compressive performance of a new type of hierarchical cellular structure created by replacing cell walls in regular honeycombs with triangular lattice configurations. The fabrication of this relatively complex material architecture with size features spanning from micrometer to centimeter is facilitated by the availability of commercial 3D printers. We apply to these hierarchical honeycombs a thermal treatment that facilitates the shape preservation and structural integrity of the structures under large compressive loading. The proposed hierarchical honeycombs exhibit a progressive failure mode, along with improved stiffness and energy absorption under uniaxial compression. High energy dissipation and shape integrity at large imposed strains (up to 60%) have also been observed in these hierarchical honeycombs under cyclic loading. Experimental and numerical studies suggest that these anomalous mechanical behaviors are attributed to the introduction of a structural hierarchy, intrinsically controlled by the cell wall slenderness of the triangular lattice and by the shape memory effect induced by the thermal and mechanical compressive treatment.

  7. Limits to the usability of iconic memory.

    PubMed

    Rensink, Ronald A

    2014-01-01

    Human vision briefly retains a trace of a stimulus after it disappears. This trace (iconic memory) is often believed to be a surrogate for the original stimulus, a representational structure that can be used as if the original stimulus were still present. To investigate its nature, a flicker-search paradigm was developed that relied upon a full scan (rather than partial report) of its contents. Results show that for visual search it can indeed act as a surrogate, with little cost for alternating between visible and iconic representations. However, the duration over which it can be used depends on the type of task: some tasks can use iconic memory for at least 240 ms, others for only about 190 ms, and still others for no more than about 120 ms. The existence of these different limits suggests that iconic memory may have multiple layers, each corresponding to a particular level of the visual hierarchy. In this view, the inability to use a layer of iconic memory may reflect an inability to maintain feedback connections to the corresponding representation.

  8. The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Wolf, David R.

    2004-01-01

    The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to the description of the emergence of complex behaviors due to many-body or many-agent processes. They are especially well suited to quantifying how the information carried among a set of variables or agents decomposes over its subsets. In more graphical language, they provide the information theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a powerful toolkit for the analysis of the complexity structure of complex many-agent systems. The information correlation functions are the natural generalization to an arbitrary number of sets of variables of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
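
    The first members of the hierarchy are easy to make concrete. The sketch below is illustrative (a plug-in estimator on samples, not the paper's derivation): the two-set member is ordinary mutual information, and the three-set member (often called co-information) is the alternating sum of joint entropies. The XOR triple is the textbook synergistic system: no pair of variables shares information, yet the three together do, which the negative three-set value signals.

```python
import math
import random
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (bits) from symbol frequencies."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """Two-set member of the hierarchy: I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def co_information(xs, ys, zs):
    """Three-set member: the alternating sum of joint entropies."""
    return (entropy(xs) + entropy(ys) + entropy(zs)
            - entropy(list(zip(xs, ys)))
            - entropy(list(zip(xs, zs)))
            - entropy(list(zip(ys, zs)))
            + entropy(list(zip(xs, ys, zs))))

# Z = X XOR Y with independent fair bits: pairwise independent, jointly dependent.
random.seed(0)
xs = [random.randint(0, 1) for _ in range(4000)]
ys = [random.randint(0, 1) for _ in range(4000)]
zs = [x ^ y for x, y in zip(xs, ys)]
print(round(mutual_information(xs, ys), 2))  # ≈ 0  (no pairwise information)
print(round(co_information(xs, ys, zs), 2))  # ≈ -1 (purely synergistic)
```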

  9. Radiative and precipitation controls on root zone soil moisture spectra

    DOE PAGES

    Nakai, Taro; Katul, Gabriel G.; Kotani, Ayumi; ...

    2014-10-20

    Here, we show that temporal variability in root zone soil moisture content (w) exhibits a Lorentzian spectrum, with memory dictated by a damping term, when forced with white-noise precipitation. In the context of regional dimming, radiation and precipitation variability are both needed to reproduce w trends, prompting interest in how the w memory is altered by radiative forcing. A hierarchy of models that sequentially introduce the spectrum of precipitation, net radiation, and the effect of w on evaporative and drainage losses was used to analyze the spectrum of w at subtropical and temperate forested sites. Reproducing the w spectra at long time scales necessitated simultaneous precipitation and net radiation measurements, depending on site conditions. The w memory inferred from observed w spectra was 25–38 days, larger than that determined from maximum wet evapotranspiration and field capacity. Finally, the w memory can be reasonably inferred from the Lorentzian spectrum when precipitation and evapotranspiration are in phase.
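
    The damping picture can be checked with a toy simulation. The sketch below is an assumption-laden stand-in for the paper's model hierarchy: a linear storage dw/dt = -w/tau + p(t), forced by white-noise "precipitation", has a Lorentzian spectrum and an exponential autocorrelation, so the memory tau can be recovered as the e-folding lag of the sample autocorrelation.

```python
import math
import random

random.seed(1)
tau, dt, n = 25.0, 1.0, 100_000    # assumed memory (days), daily steps
w, series = 0.0, []
for _ in range(n):
    # damped storage forced by white-noise precipitation (an AR(1) process)
    w += dt * (-w / tau + random.gauss(0.0, 1.0))
    series.append(w)

mean = sum(series) / n
var = sum((x - mean) ** 2 for x in series) / n

def autocorr(lag):
    """Sample autocorrelation of the series at the given lag."""
    c = sum((series[i] - mean) * (series[i + lag] - mean) for i in range(n - lag))
    return c / ((n - lag) * var)

# memory = first lag at which the autocorrelation drops below 1/e
lag = next(k for k in range(1, 250) if autocorr(k) < 1.0 / math.e)
print(lag)  # close to the assumed tau of 25 days
```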

  10. Temporal evolution of brain reorganization under cross-modal training: insights into the functional architecture of encoding and retrieval networks

    NASA Astrophysics Data System (ADS)

    Likova, Lora T.

    2015-03-01

    This study is based on the recent discovery of massive and well-structured cross-modal memory activation generated in the primary visual cortex (V1) of totally blind people as a result of novel training in drawing without any vision (Likova, 2012). This unexpected functional reorganization of primary visual cortex was obtained after undergoing only a week of training by the novel Cognitive-Kinesthetic Method, and was consistent across pilot groups of different categories of visual deprivation: congenitally blind, late-onset blind and blindfolded (Likova, 2014). These findings led us to implicate V1 as the implementation of the theoretical visuo-spatial 'sketchpad' for working memory in the human brain. Since neither the source nor the subsequent 'recipient' of this non-visual memory information in V1 is known, these results raise a number of important questions about the underlying functional organization of the respective encoding and retrieval networks in the brain. To address these questions, an individual totally blind from birth was given a week of Cognitive-Kinesthetic training, accompanied by functional magnetic resonance imaging (fMRI) both before and just after training, and again after a two-month consolidation period. The results revealed a remarkable temporal sequence of training-based response reorganization in both the hippocampal complex and the temporal-lobe object processing hierarchy over the prolonged consolidation period. In particular, a pattern of profound learning-based transformations in the hippocampus was strongly reflected in V1, with the retrieval function showing massive growth as a result of the Cognitive-Kinesthetic memory training and consolidation, while the initially strong hippocampal response during tactile exploration and encoding became non-existent. 
    Furthermore, after training, an alternating patch structure in the form of a cascade of discrete ventral regions underwent radical transformations to reach complete functional specialization in terms of either encoding or retrieval as a function of the stage of learning. Moreover, several distinct patterns of learning-evolution emerged within the patches as a function of their anatomical location, implying a complex reorganization of the object-processing sub-networks through the learning period. These first findings of complex patterns of training-based encoding/retrieval reorganization thus have broad implications for a newly emerging view of perception/memory interactions and their reorganization through the learning process. Note that the temporal evolution of these forms of extended functional reorganization could not be uncovered with the conventional assessment paradigms used in traditional approaches to functional mapping, which may therefore have to be revisited. Moreover, as the present results were obtained in learning under life-long blindness, they imply modality-independent operations, transcending the usual tight association with visual processing. The present approach of memory-drawing training in blindness has the dual advantage of being both a non-visual and a causal intervention, which makes it a promising 'scalpel' to disentangle interactions among diverse cognitive functions.

  11. Algorithms for Data Intensive Applications on Intelligent and Smart Memories

    DTIC Science & Technology

    2003-03-01

    editors). Parallel Algorithms and Architectures. North Holland, 1986. [8] P. Diniz. USC ISI, Personal Communication, March, 2001. [9] M. Frigo, C. E ...hierarchy as well as the Translation Lookaside Buffer (TLB) affect the effectiveness of cache-friendly optimizations. These penalties vary among...processors and cause large variations in the effectiveness of cache performance optimizations. The area of graph problems is fundamental in a wide variety of

  12. Computer architecture evaluation for structural dynamics computations: Project summary

    NASA Technical Reports Server (NTRS)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.
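
    The third project, a queueing model of a shared memory hierarchy, can be illustrated with the simplest possible instance. This is a generic M/M/1 sketch with invented rates, not the model developed in the project: each shared memory module is treated as a single server with Poisson request arrivals and exponential service times.

```python
def mm1_residence_time(arrival_rate, service_rate):
    """Mean time a request spends at an M/M/1 server (queueing + service):
    1 / (mu - lambda), valid only while the queue is stable (lambda < mu)."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# 8 processors each issuing 0.1 requests/cycle to a module serving 1 request/cycle:
# utilization 0.8, so the mean residence time is about 5 cycles.
print(mm1_residence_time(8 * 0.1, 1.0))
```

    Such closed-form estimates are what make a queueing model attractive for quickly comparing shared-memory configurations before detailed simulation.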

  13. Skill and Working Memory.

    DTIC Science & Technology

    1982-04-30

    clusters of rooms or areas. The fairly localized property of architectural patterns at the lowest level in the hierarchy is reminiscent of the localized...three digits. We have termed these clusters of groups "supergroups". Finally, when these supergroups became too large (more than 4 or 5 groups), SF...Supergroups -> Clusters of Supergroups. In another study, run separately on SF and DD, after an hour’s

  14. Data storage technology comparisons

    NASA Technical Reports Server (NTRS)

    Katti, Romney R.

    1990-01-01

    The role of data storage and data storage technology is an integral, though conceptually often underestimated, part of data processing technology. Data storage is important in the mass storage mode, in which generated data are buffered for later use. But data storage technology is also important in the data flow mode, when data are manipulated and hence required to flow between databases, datasets, and processors. This latter mode is commonly associated with memory hierarchies which support computation. VLSI devices can reasonably be defined as electronic circuit devices such as channel and control electronics as well as highly integrated, solid-state devices that are fabricated using thin film deposition technology. VLSI devices in both capacities play an important role in data storage technology. In addition to random access memories (RAM), read-only memories (ROM), and other silicon-based variations such as PROMs, EPROMs, and EEPROMs, integrated devices find their way into a variety of memory technologies which offer significant performance advantages. These memory technologies include magnetic tape, magnetic disk, magneto-optic disk, and vertical Bloch line memory. In this paper, some comparisons between selected technologies are made to demonstrate why more than one memory technology exists today, based for example on access time and storage density at the active bit and system levels.

  15. Cultural factors influencing Japanese nurses' assertive communication: Part 2 - hierarchy and power.

    PubMed

    Omura, Mieko; Stone, Teresa E; Levett-Jones, Tracy

    2018-03-23

    Hierarchy and power characterize health-care relationships around the world, constituting a barrier to assertive communication and a risk to patient safety. This issue is more problematic and complex in countries such as Japan, where deep-seated cultural values related to hierarchy and power persist. The current paper is the second of two that present the findings from a study exploring Japanese nurses' views and experiences of how cultural values impact assertive communication for health-care professionals. We conducted semistructured interviews with 23 registered nurses, following which data were analyzed using directed content analysis. Two overarching themes emerged from the analysis: hierarchy/power and collectivism. In the present study, we focus on cultural values related to hierarchy and power, including differences in professional status, gender imbalance, seniority/generation gap, bullying, and humility/modesty. The findings from our research provide meaningful insights into how Japanese cultural values influence and constrain nurses' communication and speaking up behaviors, and can be used to inform educational programs designed to teach assertiveness skills. © 2018 John Wiley & Sons Australia, Ltd.

  16. Overview of emerging nonvolatile memory technologies

    PubMed Central

    2014-01-01

    Nonvolatile memory technologies in Si-based electronics date back to the 1990s. The ferroelectric field-effect transistor (FeFET) was one of the most promising devices for replacing conventional Flash memory, which was already facing physical scaling limitations at that time. A variant of charge storage memory referred to as Flash memory is widely used in consumer electronic products such as cell phones and music players, while NAND Flash-based solid-state disks (SSDs) are increasingly displacing hard disk drives as the primary storage device in laptops, desktops, and even data centers. The integration limit of Flash memories is approaching, and many new types of memory to replace them have been proposed. Emerging memory technologies promise to store more data at less cost than the expensive-to-build silicon chips used by popular consumer gadgets such as digital cameras, cell phones and portable music players, and they are being investigated as potential alternatives to existing memories in future computing systems. Emerging nonvolatile memory technologies such as magnetic random-access memory (MRAM), spin-transfer torque random-access memory (STT-RAM), ferroelectric random-access memory (FeRAM), phase-change memory (PCM), and resistive random-access memory (RRAM) combine the speed of static random-access memory (SRAM), the density of dynamic random-access memory (DRAM), and the nonvolatility of Flash memory, and so become very attractive candidates for future memory hierarchies. Many other new classes of emerging memory technologies, such as transparent and plastic, three-dimensional (3-D), and quantum-dot memory technologies, have also gained tremendous popularity in recent years. It is thus no exaggeration to say that computer memory could soon earn the ultimate commercial validation for scale-up and production: the cheap plastic knockoff. 
    Therefore, this review is devoted to this rapidly developing class of memory technologies and their scaling, based on an investigation of recent progress in advanced Flash memory devices. PMID:25278820

  17. Overview of emerging nonvolatile memory technologies.

    PubMed

    Meena, Jagan Singh; Sze, Simon Min; Chand, Umesh; Tseng, Tseung-Yuen

    2014-01-01

    Nonvolatile memory technologies in Si-based electronics date back to the 1990s. The ferroelectric field-effect transistor (FeFET) was one of the most promising devices for replacing conventional Flash memory, which was already facing physical scaling limitations at that time. A variant of charge storage memory referred to as Flash memory is widely used in consumer electronic products such as cell phones and music players, while NAND Flash-based solid-state disks (SSDs) are increasingly displacing hard disk drives as the primary storage device in laptops, desktops, and even data centers. The integration limit of Flash memories is approaching, and many new types of memory to replace them have been proposed. Emerging memory technologies promise to store more data at less cost than the expensive-to-build silicon chips used by popular consumer gadgets such as digital cameras, cell phones and portable music players, and they are being investigated as potential alternatives to existing memories in future computing systems. Emerging nonvolatile memory technologies such as magnetic random-access memory (MRAM), spin-transfer torque random-access memory (STT-RAM), ferroelectric random-access memory (FeRAM), phase-change memory (PCM), and resistive random-access memory (RRAM) combine the speed of static random-access memory (SRAM), the density of dynamic random-access memory (DRAM), and the nonvolatility of Flash memory, and so become very attractive candidates for future memory hierarchies. Many other new classes of emerging memory technologies, such as transparent and plastic, three-dimensional (3-D), and quantum-dot memory technologies, have also gained tremendous popularity in recent years. It is thus no exaggeration to say that computer memory could soon earn the ultimate commercial validation for scale-up and production: the cheap plastic knockoff. 
    Therefore, this review is devoted to this rapidly developing class of memory technologies and their scaling, based on an investigation of recent progress in advanced Flash memory devices.

  18. Detecting Role Errors in the Gene Hierarchy of the NCI Thesaurus

    PubMed Central

    Min, Hua; Cohen, Barry; Halper, Michael; Oren, Marc; Perl, Yehoshua

    2008-01-01

    Gene terminologies are playing an increasingly important role in the ever-growing field of genomic research. While errors in large, complex terminologies are inevitable, gene terminologies are even more susceptible to them due to the rapid growth of genomic knowledge and the nature of its discovery. It is therefore very important to establish quality-assurance protocols for such genomic-knowledge repositories. Different kinds of terminologies oftentimes require auditing methodologies adapted to their particular structures. In light of this, an auditing methodology tailored to the characteristics of the NCI Thesaurus’s (NCIT’s) Gene hierarchy is presented. The Gene hierarchy is of particular interest to the NCIT’s designers due to the primary role of genomics in current cancer research. This multiphase methodology focuses on detecting role-errors, such as missing roles or roles with incorrect or incomplete target structures, occurring within that hierarchy. The methodology is based on two kinds of abstraction networks, called taxonomies, that highlight the role distribution among concepts within the IS-A (subsumption) hierarchy. These abstract views tend to highlight portions of the hierarchy having a higher concentration of errors. The errors found during an application of the methodology are reported. Hypotheses pertaining to the efficacy of our methodology are investigated. PMID:19221606

  19. Set-relevance determines the impact of distractors on episodic memory retrieval.

    PubMed

    Kwok, Sze Chai; Shallice, Tim; Macaluso, Emiliano

    2014-09-01

    We investigated the interplay between stimulus-driven attention and memory retrieval with a novel interference paradigm that engaged both systems concurrently on each trial. Participants encoded a 45-min movie on Day 1 and, on Day 2, performed a temporal order judgment task during fMRI. Each retrieval trial comprised three images presented sequentially, and the task required participants to judge the temporal order of the first and the last images ("memory probes") while ignoring the second image, which was task irrelevant ("attention distractor"). We manipulated the content relatedness and the temporal proximity between the distractor and the memory probes, as well as the temporal distance between two probes. Behaviorally, short temporal distances between the probes led to reduced retrieval performance. Distractors that at encoding were temporally close to the first probe image reduced these costs, specifically when the distractor was content unrelated to the memory probes. The imaging results associated the distractor probe temporal proximity with activation of the right ventral attention network. By contrast, the precuneus was activated for high-content relatedness between distractors and probes and in trials including a short distance between the two memory probes. The engagement of the right ventral attention network by specific types of distractors suggests a link between stimulus-driven attention control and episodic memory retrieval, whereas the activation pattern of the precuneus implicates this region in memory search within knowledge/content-based hierarchies.

  20. Software Issues in High-Performance Computing and a Framework for the Development of HPC Applications

    DTIC Science & Technology

    1995-01-01

    possible to determine communication points. For this version, a C program spawning Posix threads and using semaphores to synchronize would have to...performance such as the time required for network communication and synchronization as well as issues of asynchrony and memory hierarchy. For example...enhances reusability. Process (or task) parallel computations can also be succinctly expressed with a small set of process creation and synchronization

  1. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 2, Issue 1

    DTIC Science & Technology

    2010-01-01

    Researchers in AHPCRC Technical Area 4 focus on improving processes for developing scalable, accurate parallel programs that are easily ported from one...Virtual levels in Sequoia represent an abstract memory hierarchy without specifying data transfer mechanisms, giving the

  2. Exploring performance and energy tradeoffs for irregular applications: A case study on the Tilera many-core architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panyala, Ajay; Chavarría-Miranda, Daniel; Manzano, Joseph B.

    High performance, parallel applications with irregular data accesses are becoming a critical workload class for modern systems. In particular, the execution of such workloads on emerging many-core systems is expected to be a significant component of applications in data mining, machine learning, scientific computing and graph analytics. However, power and energy constraints limit the capabilities of individual cores, memory hierarchy and on-chip interconnect of such systems, thus leading to architectural and software trade-offs that must be understood in the context of the intended application's behavior. Irregular applications are notoriously hard to optimize given their data-dependent access patterns, lack of structured locality and complex data structures and code patterns. We have ported two irregular applications, graph community detection using the Louvain method (Grappolo) and high-performance conjugate gradient (HPCCG), to the Tilera many-core system and have conducted a detailed study of platform-independent and platform-specific optimizations that improve their performance as well as reduce their overall energy consumption. To conduct this study, we employ an auto-tuning based approach that explores the optimization design space along three dimensions: memory layout schemes, GCC compiler flag choices and OpenMP loop scheduling options. We leverage MIT's OpenTuner auto-tuning framework to explore and recommend energy-optimal choices for different combinations of parameters. We then conduct an in-depth architectural characterization to understand the memory behavior of the selected workloads. Finally, we perform a correlation study to demonstrate the interplay between the hardware behavior and application characteristics. Using auto-tuning, we demonstrate whole-node energy savings and performance improvements of up to 49.6% and 60% relative to a baseline instantiation, and up to 31% and 45.4% relative to manually optimized variants.
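The three-dimensional search the abstract describes can be sketched as a toy auto-tuner. The design-space values and the cost model below are invented for illustration; a real study would invoke OpenTuner and actually compile and run the benchmark at each point:

```python
import itertools

# Hypothetical design space mirroring the study's three dimensions;
# the concrete values and the cost model are illustrative only.
LAYOUTS = ["row-major", "blocked", "struct-of-arrays"]
FLAGS = ["-O2", "-O3", "-O3 -funroll-loops"]
SCHEDULES = ["static", "dynamic", "guided"]

def measure(layout, flags, schedule):
    """Stand-in for running the benchmark and reading time/energy.
    A real auto-tuner would build and execute the application here."""
    cost = 100.0
    cost -= 20.0 if layout == "blocked" else 0.0
    cost -= 10.0 if "-O3" in flags else 0.0
    cost -= 5.0 if schedule == "dynamic" else 0.0
    return cost

def tune():
    # Exhaustive search over the cross-product; real tuners such as
    # OpenTuner sample the space heuristically instead.
    space = itertools.product(LAYOUTS, FLAGS, SCHEDULES)
    return min(space, key=lambda cfg: measure(*cfg))

best = tune()  # the energy-optimal configuration under the toy model
```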

  3. Modeling the pressure-strain correlation of turbulence: An invariant dynamical systems approach

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.; Sarkar, Sutanu; Gatski, Thomas B.

    1990-01-01

    The modeling of the pressure-strain correlation of turbulence is examined from a basic theoretical standpoint with a view toward developing improved second-order closure models. Invariance considerations along with elementary dynamical systems theory are used in the analysis of the standard hierarchy of closure models. In these commonly used models, the pressure-strain correlation is assumed to be a linear function of the mean velocity gradients with coefficients that depend algebraically on the anisotropy tensor. It is proven that for plane homogeneous turbulent flows the equilibrium structure of this hierarchy of models is encapsulated by a relatively simple model which is only quadratically nonlinear in the anisotropy tensor. This new quadratic model - the SSG model - is shown to outperform the Launder, Reece, and Rodi model (as well as more recent models that have a considerably more complex nonlinear structure) in a variety of homogeneous turbulent flows. Some deficiencies still remain for the description of rotating turbulent shear flows that are intrinsic to this general hierarchy of models and, hence, cannot be overcome by the mere introduction of more complex nonlinearities. It is thus argued that the recent trend of adding substantially more complex nonlinear terms containing the anisotropy tensor may be of questionable value in the modeling of the pressure-strain correlation. Possible alternative approaches are discussed briefly.
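The "relatively simple model which is only quadratically nonlinear in the anisotropy tensor" can be made concrete. The schematic shape below is the commonly quoted form of the SSG-type closure, reproduced from the general turbulence-modeling literature rather than from this paper; $C_1 \ldots C_5$ are calibration coefficients, $b_{ij}$ the anisotropy tensor, $S_{ij}$ and $W_{ij}$ the mean strain-rate and rotation tensors, $K$ the turbulent kinetic energy, $\varepsilon$ its dissipation rate, and $\mathcal{P}$ the production:

```latex
\Pi_{ij} = -\left(C_1 \varepsilon + C_1^{*}\,\mathcal{P}\right) b_{ij}
  + C_2\,\varepsilon \left(b_{ik} b_{kj} - \tfrac{1}{3} b_{kl} b_{kl}\,\delta_{ij}\right)
  + \left(C_3 - C_3^{*} \sqrt{b_{kl} b_{kl}}\right) K S_{ij}
  + C_4 K \left(b_{ik} S_{jk} + b_{jk} S_{ik} - \tfrac{2}{3} b_{kl} S_{kl}\,\delta_{ij}\right)
  + C_5 K \left(b_{ik} W_{jk} + b_{jk} W_{ik}\right)
```

The key point of the abstract is that only the $C_2$ term is nonlinear in $b_{ij}$, and that nonlinearity is at most quadratic.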

  4. Modelling the pressure-strain correlation of turbulence - An invariant dynamical systems approach

    NASA Technical Reports Server (NTRS)

    Speziale, Charles G.; Sarkar, Sutanu; Gatski, Thomas B.

    1991-01-01

    The modeling of the pressure-strain correlation of turbulence is examined from a basic theoretical standpoint with a view toward developing improved second-order closure models. Invariance considerations along with elementary dynamical systems theory are used in the analysis of the standard hierarchy of closure models. In these commonly used models, the pressure-strain correlation is assumed to be a linear function of the mean velocity gradients with coefficients that depend algebraically on the anisotropy tensor. It is proven that for plane homogeneous turbulent flows the equilibrium structure of this hierarchy of models is encapsulated by a relatively simple model which is only quadratically nonlinear in the anisotropy tensor. This new quadratic model - the SSG model - is shown to outperform the Launder, Reece, and Rodi model (as well as more recent models that have a considerably more complex nonlinear structure) in a variety of homogeneous turbulent flows. Some deficiencies still remain for the description of rotating turbulent shear flows that are intrinsic to this general hierarchy of models and, hence, cannot be overcome by the mere introduction of more complex nonlinearities. It is thus argued that the recent trend of adding substantially more complex nonlinear terms containing the anisotropy tensor may be of questionable value in the modeling of the pressure-strain correlation. Possible alternative approaches are discussed briefly.

  5. Dynamic storage in resource-scarce browsing multimedia applications

    NASA Astrophysics Data System (ADS)

    Elenbaas, Herman; Dimitrova, Nevenka

    1998-10-01

    In the convergence of information and entertainment there is a conflict between the consumer's expectation of fast access to high quality multimedia content through narrow bandwidth channels and the size of this content. During the retrieval and presentation of a multimedia application, two problems have to be solved: the limited bandwidth during transmission of the retrieved multimedia content and the limited memory for temporary caching. In this paper we propose an approach to latency optimization in information-browsing applications: a method for flattening hierarchically linked documents in a manner convenient for network transport over slow channels, minimizing browsing latency. Flattening of the hierarchy involves linearization, compression and bundling of the document nodes. After the transfer, the compressed hierarchy is stored on a local device, where it can be partly unbundled to fit the caching limits at the local site while giving the user access to the content.
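The linearize-bundle-compress pipeline described above can be sketched with standard-library tools; the document structure and field names here are hypothetical, not the paper's format:

```python
import json
import zlib

# Toy hierarchy of linked document nodes (structure and names illustrative).
doc = {"id": "root", "body": "index",
       "children": [
           {"id": "a", "body": "chapter a", "children": []},
           {"id": "b", "body": "chapter b",
            "children": [{"id": "b1", "body": "section b1", "children": []}]}]}

def linearize(node, parent=None):
    """Depth-first flattening: nested links become flat parent references."""
    yield {"id": node["id"], "parent": parent, "body": node["body"]}
    for child in node["children"]:
        yield from linearize(child, node["id"])

# Linearization + bundling: one flat record list in a single byte stream.
bundle = json.dumps(list(linearize(doc))).encode()
# Compression for the slow channel.
wire = zlib.compress(bundle)

# At the receiver the bundle is unpacked (here fully; a cache-limited
# client could decompress and materialize only a prefix of the records).
records = json.loads(zlib.decompress(wire))
```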

  6. Is awareness necessary for true inference?

    PubMed

    Leo, Peter D; Greene, Anthony J

    2008-09-01

    In transitive inference, participants learn a set of context-dependent discriminations that can be organized into a hierarchy that supports inference. Several studies show that inference occurs with or without task awareness. However, some studies assert that without awareness, performance is attributable to pseudoinference. By this account, inference-like performance is achieved by differential stimulus weighting according to the stimuli's proximity to the end items of the hierarchy. We implement an inference task that cannot be based on differential stimulus weighting. The design itself rules out pseudoinference strategies. Success on the task without evidence of deliberative strategies would therefore suggest that true inference can be achieved implicitly. We found that accurate performance on the inference task was not dependent on explicit awareness. The finding is consistent with a growing body of evidence that indicates that forms of learning and memory supporting inference and flexibility do not necessarily depend on task awareness.
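The distinction between true inference and end-item weighting can be made concrete with a toy premise hierarchy; the item names and five-item structure below are illustrative:

```python
# Training pairs from a five-item hierarchy A>B>C>D>E: only adjacent
# discriminations are taught, as in a typical transitive-inference task.
premises = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")]

def infer(winner_loser_pairs, x, y):
    """True inference: x beats y iff a chain of premises links x to y,
    not because of either item's proximity to the hierarchy's ends."""
    beats = {}
    for w, l in winner_loser_pairs:
        beats.setdefault(w, set()).add(l)
    # transitive closure by depth-first search from x
    stack, seen = [x], set()
    while stack:
        cur = stack.pop()
        for nxt in beats.get(cur, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return y in seen

# The critical untrained pair B vs. D involves no end item, so success
# on it cannot be explained by differential end-item weighting.
assert infer(premises, "B", "D") and not infer(premises, "D", "B")
```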

  7. Alignment hierarchies: engineering architecture from the nanometre to the micrometre scale.

    PubMed

    Kureshi, Alvena; Cheema, Umber; Alekseeva, Tijna; Cambrey, Alison; Brown, Robert

    2010-12-06

    Natural tissues are built of metabolites, soluble proteins and solid extracellular matrix components (largely fibrils) together with cells. These are configured in highly organized hierarchies of structure across length scales from nanometre to millimetre, with alignments that are dominated by anisotropies in their fibrillar matrix. If we are to successfully engineer tissues, these hierarchies need to be mimicked with an understanding of the interaction between them. In particular, the movement of different elements of the tissue (e.g. molecules, cells and bulk fluids) is controlled by matrix structures at distinct scales. We present three novel systems to introduce alignment of collagen fibrils, cells and growth factor gradients within a three-dimensional collagen scaffold using fluid flow, embossing and layering of construct. Importantly, these can be seen as different parts of the same hierarchy of three-dimensional structure, as they are all formed into dense collagen gels. Fluid flow aligns collagen fibrils at the nanoscale, embossed topographical features provide alignment cues at the microscale and introducing layered configuration to three-dimensional collagen scaffolds provides microscale- and mesoscale-aligned pathways for protein factor delivery as well as barriers to confine protein diffusion to specific spatial directions. These seemingly separate methods can be employed to increase complexity of simple extracellular matrix scaffolds, providing insight into new approaches to directly fabricate complex physical and chemical cues at different hierarchical scales, similar to those in natural tissues.

  8. Modeling Patient Treatment With Medical Records: An Abstraction Hierarchy to Understand User Competencies and Needs.

    PubMed

    St-Maurice, Justin D; Burns, Catherine M

    2017-07-28

    Health care is a complex sociotechnical system. Patient treatment is evolving and needs to incorporate the use of technology and new patient-centered treatment paradigms. Cognitive work analysis (CWA) is an effective framework for understanding complex systems, and work domain analysis (WDA) is useful for understanding complex ecologies. Although previous applications of CWA have described patient treatment, due to their scope of work, patients were previously characterized as biomedical machines, rather than as patient actors involved in their own care. An abstraction hierarchy that characterizes patients as beings with complex social values and priorities is needed. This can help better understand treatment in a modern approach to care. The purpose of this study was to perform a WDA to represent the treatment of patients with medical records. The methods to develop this model included the analysis of written texts and collaboration with subject matter experts. Our WDA represents the ecology through its functional purposes, abstract functions, generalized functions, physical functions, and physical forms. Compared with other work domain models, this model is able to articulate the nuanced balance between medical treatment, patient education, and limited health care resources. Concepts in the analysis were similar to the modeling choices of other WDAs but combined them into a comprehensive, systematic, and contextual overview. The model is helpful for understanding user competencies and needs. Future models could be developed to model the patient's domain and enable the exploration of the shared decision-making (SDM) paradigm. Our work domain model links treatment goals, decision-making constraints, and task workflows. This model can be used by system developers who would like to use ecological interface design (EID) to improve systems. Our hierarchy is the first in a future set that could explore new treatment paradigms. 
Future hierarchies could model the patient as a controller and could be useful for mobile app development. ©Justin D St-Maurice, Catherine M Burns. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 28.07.2017.

  9. Modeling Patient Treatment With Medical Records: An Abstraction Hierarchy to Understand User Competencies and Needs

    PubMed Central

    2017-01-01

    Background Health care is a complex sociotechnical system. Patient treatment is evolving and needs to incorporate the use of technology and new patient-centered treatment paradigms. Cognitive work analysis (CWA) is an effective framework for understanding complex systems, and work domain analysis (WDA) is useful for understanding complex ecologies. Although previous applications of CWA have described patient treatment, due to their scope of work, patients were previously characterized as biomedical machines, rather than as patient actors involved in their own care. Objective An abstraction hierarchy that characterizes patients as beings with complex social values and priorities is needed. This can help better understand treatment in a modern approach to care. The purpose of this study was to perform a WDA to represent the treatment of patients with medical records. Methods The methods to develop this model included the analysis of written texts and collaboration with subject matter experts. Our WDA represents the ecology through its functional purposes, abstract functions, generalized functions, physical functions, and physical forms. Results Compared with other work domain models, this model is able to articulate the nuanced balance between medical treatment, patient education, and limited health care resources. Concepts in the analysis were similar to the modeling choices of other WDAs but combined them into a comprehensive, systematic, and contextual overview. The model is helpful for understanding user competencies and needs. Future models could be developed to model the patient's domain and enable the exploration of the shared decision-making (SDM) paradigm. Conclusion Our work domain model links treatment goals, decision-making constraints, and task workflows. This model can be used by system developers who would like to use ecological interface design (EID) to improve systems. 
Our hierarchy is the first in a future set that could explore new treatment paradigms. Future hierarchies could model the patient as a controller and could be useful for mobile app development. PMID:28754650

  10. Hierarchy of sedimentary discontinuity surfaces and condensed beds from the middle Paleozoic of eastern North America: Implications for cratonic sequence stratigraphy

    USGS Publications Warehouse

    McLaughlin, P.I.; Brett, Carlton E.; Wilson, M.A.

    2008-01-01

    Sedimentological analyses of middle Paleozoic epeiric sea successions in North America suggest a hierarchy of discontinuity surfaces and condensed beds of increasing complexity. Simple firmgrounds and hardgrounds, which are comparatively ephemeral features, form the base of the hierarchy. Composite hardgrounds, reworked concretions, authigenic mineral crusts and monomictic intraformational conglomerates indicate more complex histories. Polymictic intraformational conglomerates, ironstones and phosphorites form the most complex discontinuity surfaces and condensed beds. Complexity of discontinuities is closely linked to depositional environments, duration of sediment starvation, and degree of reworking, which in turn show a relationship to stratigraphic cyclicity. A model of cratonic sequence stratigraphy is generated by combining data on the complexity and lateral distribution of discontinuities in the context of facies successions. Lowstand, early transgressive and late transgressive systems tracts are representative of sea-level rise. Early and late transgressive systems tracts are separated by the maximum starvation surface (typically a polymictic intraformational conglomerate or condensed phosphorite), deposited during the peak rate of sea-level rise. Conversely, the maximum flooding surface, representing the highest stand of sea level, is marked by little to no break in sedimentation. The highstand and falling stage systems tracts are deposited during relative sea-level fall. They are separated by the forced-regression surface, a thin discontinuity surface or condensed bed developed during the most rapid rate of sea-level fall. The lowest stand of sea level is marked by the sequence boundary. In subaerially exposed areas it is occasionally modified as a rockground or composite hardground.

  11. Parallel Optical Random Access Memory (PORAM)

    NASA Technical Reports Server (NTRS)

    Alphonse, G. A.

    1989-01-01

    It is shown that the need to minimize component count, power and size, and to maximize packing density require a parallel optical random access memory to be designed in a two-level hierarchy: a modular level and an interconnect level. Three module designs are proposed, in the order of research and development requirements. The first uses state-of-the-art components, including individually addressed laser diode arrays, acousto-optic (AO) deflectors and magneto-optic (MO) storage medium, aimed at moderate size, moderate power, and high packing density. The next design level uses an electron-trapping (ET) medium to reduce optical power requirements. The third design uses a beam-steering grating surface emitter (GSE) array to reduce size further and minimize the number of components.

  12. I/O efficient algorithms and applications in geographic information systems

    NASA Astrophysics Data System (ADS)

    Danner, Andrew

    Modern remote sensing methods such as laser altimetry (lidar) and Interferometric Synthetic Aperture Radar (IfSAR) produce georeferenced elevation data at unprecedented rates. Many Geographic Information System (GIS) algorithms designed for terrain modeling applications cannot process these massive data sets. The primary problem is that these data sets are too large to fit in the main internal memory of modern computers and must therefore reside on larger, but considerably slower disks. In these applications, the transfer of data between disk and main memory, or I/O, becomes the primary bottleneck. Working in a theoretical model that more accurately represents this two-level memory hierarchy, we can develop algorithms that are I/O-efficient and reduce the amount of disk I/O needed to solve a problem. In this thesis we aim to modernize GIS algorithms and develop a number of I/O-efficient algorithms for processing geographic data derived from massive elevation data sets. For each application, we convert a geographic question to an algorithmic question, develop an I/O-efficient algorithm that is theoretically efficient, implement our approach and verify its performance using real-world data. The applications we consider include constructing a gridded digital elevation model (DEM) from an irregularly spaced point cloud, removing topological noise from a DEM, modeling surface water flow over a terrain, extracting river networks and watershed hierarchies from the terrain, and locating polygons containing query points in a planar subdivision. We initially developed solutions to each of these applications individually. However, we also show how to combine individual solutions to form a scalable geo-processing pipeline that seamlessly solves a sequence of sub-problems with little or no manual intervention. We present experimental results that demonstrate orders of magnitude improvement over previously known algorithms.
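In the two-level I/O model referred to above (internal memory of size M, disk blocks of size B, all counted in items), the standard scan and sort bounds can be computed directly; the concrete sizes below are illustrative, not from the thesis:

```python
import math

def scan_ios(N, B):
    """I/Os to scan N items with block size B in the two-level model."""
    return math.ceil(N / B)

def sort_ios(N, M, B):
    """Textbook external merge-sort bound: O((N/B) log_{M/B}(N/B)) I/Os,
    computed here as (number of blocks) x (number of merge passes)."""
    n_blocks = math.ceil(N / B)
    fan_in = M // B                     # runs that can be merged at once
    passes = max(1, math.ceil(math.log(n_blocks, fan_in)))
    return n_blocks * passes

# A billion elevation points, 2^27-item memory, 2^13-item blocks:
# sorting costs only a small constant factor more than scanning, which
# is why I/O-efficient GIS algorithms are built on sorting primitives.
scan = scan_ios(10**9, 2**13)
sort = sort_ios(10**9, 2**27, 2**13)
```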

  13. Visual Short-Term Memory Capacity for Simple and Complex Objects

    ERIC Educational Resources Information Center

    Luria, Roy; Sessa, Paola; Gotler, Alex; Jolicoeur, Pierre; Dell'Acqua, Roberto

    2010-01-01

    Does the capacity of visual short-term memory (VSTM) depend on the complexity of the objects represented in memory? Although some previous findings indicated lower capacity for more complex stimuli, other results suggest that complexity effects arise during retrieval (due to errors in the comparison process with what is in memory) that is not…

  14. A SPATIALLY EXPLICIT HIERARCHICAL APPROACH TO MODELING COMPLEX ECOLOGICAL SYSTEMS: THEORY AND APPLICATIONS. (R827676)

    EPA Science Inventory

    Ecological systems are generally considered among the most complex because they are characterized by a large number of diverse components, nonlinear interactions, scale multiplicity, and spatial heterogeneity. Hierarchy theory, as well as empirical evidence, suggests that comp...

  15. Using Maslow's hierarchy to highlight power imbalances between visiting health professional student volunteers and the host community: An applied qualitative study.

    PubMed

    Evans, Tracey; Akporuno, Orezioghene; Owens, Katrina M; Lickers, Brittany; Marlinga, Jazmin; Lin, Henry C; Loh, Lawrence C

    2017-01-01

    Health professional students from high-income countries increasingly participate in short-term experiences in global health (STEGH) conducted abroad. One common criticism of STEGH is the inherent power differential that exists between visiting learners and the local community. To highlight this power differential, this paper explores perceived benefits as described by volunteer and community respondents and applies Maslow's hierarchy of needs to commonly identified themes in each respondent group. A semistructured survey was used to collect qualitative responses from both volunteers and community members located in a Dominican Republic community that is a hotspot for traditionally conducted STEGH. Thematic analysis identified themes of perceived benefits from both respondent groups; each group's common themes were then classified and compared within Maslow's hierarchy of needs. Each respondent group identified resource provision as a perceived benefit of STEGH, but volunteer respondents primarily focused on the provision of highly skilled, complex resources while community respondents focused on basic necessities (food, water, etc.). Volunteer respondents were also the only group to mention spiritual/religious/life experiences, personal skills development, and relationships as perceived benefits. Applying Maslow's hierarchy thus demonstrates a difference in needs: community respondents focused on benefits that address deficiency needs at the bottom of the hierarchy, while volunteers focused on benefits addressing self-transcendence/actualization needs at the top of the hierarchy. The perceived difference in needs met by STEGH between volunteers and the host community within Maslow's hierarchy may drive an inherent power differential. Refocusing STEGH on the relationship level of the hierarchy (i.e., focusing on partnerships) might help mitigate this imbalance and empower host communities.

  16. A tribal abstraction network for SNOMED CT target hierarchies without attribute relationships.

    PubMed

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Chen, Yan; Agrawal, Ankur; Case, James T; Hripcsak, George

    2015-05-01

    Large and complex terminologies, such as Systematized Nomenclature of Medicine-Clinical Terms (SNOMED CT), are prone to errors and inconsistencies. Abstraction networks are compact summarizations of the content and structure of a terminology. Abstraction networks have been shown to support terminology quality assurance. In this paper, we introduce an abstraction network derivation methodology which can be applied to SNOMED CT target hierarchies whose classes are defined using only hierarchical relationships (i.e., without attribute relationships) and to similar description-logic-based terminologies. We introduce the tribal abstraction network (TAN), based on the notion of a tribe: a subhierarchy rooted at a child of the hierarchy root, assuming only the existence of concepts with multiple parents. The TAN summarizes a hierarchy that does not have attribute relationships using sets of concepts, called tribal units, that belong to exactly the same multiple tribes. Tribal units are further divided into refined tribal units, which contain closely related concepts. A quality assurance methodology that utilizes TAN summarizations is introduced. A TAN is derived for the Observable entity hierarchy of SNOMED CT, summarizing its content. A TAN-based quality assurance review of the concepts of the hierarchy is performed, and erroneous concepts are shown to appear more frequently in large refined tribal units than in small refined tribal units. Furthermore, more erroneous concepts appear in large refined tribal units belonging to more tribes than to fewer tribes. In this paper we introduce the TAN for summarizing SNOMED CT target hierarchies. A TAN was derived for the Observable entity hierarchy of SNOMED CT. A quality assurance methodology utilizing the TAN was introduced and demonstrated. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
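A minimal sketch of the tribal-unit idea, assuming only a parent list per concept; the toy hierarchy and concept names are invented, not drawn from SNOMED CT:

```python
from collections import defaultdict

# Toy IS-A hierarchy given as parent lists (names illustrative).
# Tribes are the children of the root: t1, t2, t3.
parents = {
    "root": [],
    "t1": ["root"], "t2": ["root"], "t3": ["root"],
    "x": ["t1"], "y": ["t1", "t2"], "z": ["t1", "t2"], "w": ["t2", "t3"],
}

def tribes_of(concept):
    """Set of tribes (root's children) from which a concept is reachable."""
    if parents[concept] == ["root"]:
        return {concept}                 # the concept is itself a tribe root
    out = set()
    for p in parents[concept]:
        if p != "root":
            out |= tribes_of(p)
    return out

# Tribal units: concepts belonging to exactly the same set of multiple
# tribes (single-tribe concepts are not summarized by the TAN).
units = defaultdict(list)
for c in parents:
    t = tribes_of(c)
    if len(t) > 1:
        units[frozenset(t)].append(c)
```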

  17. In praise of hierarchy.

    PubMed

    Jaques, E

    1990-01-01

    Hierarchy has not had its day. After 3,000 years as the preferred structure for large organizations, managerial hierarchy is still the most natural and effective organizational form that a big company can employ. Now, as in the past, the key to organizational success is individual accountability, and hierarchy preserves unambiguous accountability for getting work done. Unfortunately, hierarchy is widely misunderstood and abused. Pay grades are confused with real layers of responsibility, for example, and incompetent bosses abound. As a result, many experts now urge us to adopt group-oriented or "flat" structures. But groups are never held accountable as groups for what they do or fail to do, and groups don't have careers. The proper use of hierarchy derives from the nature of work. As organizational tasks range from simple to very complex, there are sharp jumps in the level of difficulty and responsibility. Surprisingly, people in hundreds of companies in dozens of countries agree on where these jumps take place. They are tied to an objective measure, the time span of the longest task or program assigned to each managerial role, and they occur at 3 months, 1 year, 2 years, 5 years, 10 years, and 20 years. As the time span increases, so does the level of experience, knowledge, and mental stamina required to do the work. This increasing level of mental capacity lets companies put people in jobs they can do, it allows managers to add value to the work of their subordinates, it creates hierarchical layers acceptable to everyone in the organization, and it allows employees to be evaluated by people they accept as organizational superiors. Best of all, understanding hierarchy allows organizations to set up hierarchies with no more than seven layers (often fewer) and to know what the structure is good for and how it ought to perform.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    Computer architecture experts expect that non-volatile memory (NVM) hierarchies will play a more significant role in future systems including mobile, enterprise, and HPC architectures. With this expectation in mind, we present NVL-C: a novel programming system that facilitates the efficient and correct programming of NVM main memory systems. The NVL-C programming abstraction extends C with a small set of intuitive language features that target NVM main memory, and can be combined directly with traditional C memory model features for DRAM. We have designed these new features to enable compiler analyses and run-time checks that can improve performance and guard against a number of subtle programming errors, which, when left uncorrected, can corrupt NVM-stored data. Moreover, to enable recovery of data across application or system failures, these NVL-C features include a flexible directive for specifying NVM transactions. So that our implementation might be extended to other compiler front ends and languages, the majority of our compiler analyses are implemented in an extended version of LLVM's intermediate representation (LLVM IR). We evaluate NVL-C on a number of applications to show its flexibility, performance, and correctness.

  19. Limits to the usability of iconic memory

    PubMed Central

    Rensink, Ronald A.

    2014-01-01

    Human vision briefly retains a trace of a stimulus after it disappears. This trace—iconic memory—is often believed to be a surrogate for the original stimulus, a representational structure that can be used as if the original stimulus were still present. To investigate its nature, a flicker-search paradigm was developed that relied upon a full scan (rather than partial report) of its contents. Results show that for visual search it can indeed act as a surrogate, with little cost for alternating between visible and iconic representations. However, the duration over which it can be used depends on the type of task: some tasks can use iconic memory for at least 240 ms, others for only about 190 ms, and still others for no more than about 120 ms. The existence of these different limits suggests that iconic memory may have multiple layers, each corresponding to a particular level of the visual hierarchy. In this view, the inability to use a layer of iconic memory may reflect an inability to maintain feedback connections to the corresponding representation. PMID:25221539

  20. Breather-to-soliton transformation rules in the hierarchy of nonlinear Schrödinger equations.

    PubMed

    Chowdury, Amdad; Krolikowski, Wieslaw

    2017-06-01

    We study the exact first-order soliton and breather solutions of the integrable nonlinear Schrödinger equation hierarchy up to fifth order. We reveal the underlying physical mechanism which transforms a breather into a soliton. Furthermore, we show how the dynamics of the Akhmediev breathers, which exist on a constant background as a result of modulation instability, is connected with solitons on a zero background. We also demonstrate that, while a first-order rogue wave can be directly transformed into a soliton, higher-order rogue wave solutions become rational two-soliton solutions with a complex collisional structure on a background. Our results will have practical implications in supercontinuum generation, turbulence, and other complex nonlinear scenarios.
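For orientation, the hierarchy referred to here is commonly written as the focusing NLS equation extended by higher-order operators with independent coefficients; the schematic form below follows the usual convention in this literature and is not necessarily the authors' exact normalization:

```latex
% Base member plus higher-order extensions of the NLS hierarchy:
i\psi_x + \tfrac{1}{2}\psi_{tt} + |\psi|^2\psi
  - i\,\alpha_3 H_3[\psi] + \alpha_4 H_4[\psi] - i\,\alpha_5 H_5[\psi] = 0
% where H_3 denotes the third-order (Hirota-type) operator, H_4 the
% fourth-order (Lakshmanan--Porsezian--Daniel-type) operator, and H_5
% the quintic operator; setting alpha_3 = alpha_4 = alpha_5 = 0
% recovers the standard focusing NLS equation.
```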

  1. Prospects of a mathematical theory of human behavior in complex man-machine systems tasks. [time sharing computer analogy of automobile driving

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.

    1978-01-01

    A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.

  2. Brain Behavior Evolution during Learning: Emergence of Hierarchical Temporal Memory

    DTIC Science & Technology

    2013-08-30

    organization and synapse strengthening and reconnection operating within and upon the existing processing structures [2]. To say the least, the brain is... that it is a tree increases, then we say its hierarchy increases. We explore different starting values and different thresholds and find that... impulses from two neuronal columns (say i and k) to reach column j at the exact same time. This means when column j is analyzing whether or not to

  3. A hierarchy of time-scales and the brain.

    PubMed

    Kiebel, Stefan J; Daunizeau, Jean; Friston, Karl J

    2008-11-01

    In this paper, we suggest that cortical anatomy recapitulates the temporal hierarchy that is inherent in the dynamics of environmental states. Many aspects of brain function can be understood in terms of a hierarchy of temporal scales at which representations of the environment evolve. The lowest level of this hierarchy corresponds to fast fluctuations associated with sensory processing, whereas the highest levels encode slow contextual changes in the environment, under which faster representations unfold. First, we describe a mathematical model that exploits the temporal structure of fast sensory input to track the slower trajectories of their underlying causes. This model of sensory encoding or perceptual inference establishes a proof of concept that slowly changing neuronal states can encode the paths or trajectories of faster sensory states. We then review empirical evidence that suggests that a temporal hierarchy is recapitulated in the macroscopic organization of the cortex. This anatomic-temporal hierarchy provides a comprehensive framework for understanding cortical function: the specific time-scale that engages a cortical area can be inferred by its location along a rostro-caudal gradient, which reflects the anatomical distance from primary sensory areas. This is most evident in the prefrontal cortex, where complex functions can be explained as operations on representations of the environment that change slowly. The framework provides predictions about, and principled constraints on, cortical structure-function relationships, which can be tested by manipulating the time-scales of sensory input.
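    The slow-cause/fast-fluctuation idea can be illustrated with a toy generative model (a sketch of our own, not the authors' model): a slowly drifting hidden state sets the instantaneous frequency of a fast signal, so the fast signal's local structure carries information about its slow cause.

```python
import numpy as np

def two_timescale(T=2000, dt=0.01, seed=0):
    """Toy two-level generative model: a slow state s drifts around 1.0
    and sets the frequency of a fast oscillation xf, so representations
    of s must evolve more slowly than xf itself. Illustrative only; the
    constants are arbitrary assumptions."""
    rng = np.random.default_rng(seed)
    xs = np.empty(T)
    xf = np.empty(T)
    s, phase = 1.0, 0.0
    for t in range(T):
        # slow level: weakly mean-reverting drift with small noise
        s += dt * (-0.1 * (s - 1.0)) + 0.02 * np.sqrt(dt) * rng.standard_normal()
        # fast level: oscillation whose frequency is the slow state
        phase += dt * 2.0 * np.pi * s
        xs[t], xf[t] = s, np.sin(phase)
    return xs, xf
```

    The slow trajectory changes far less per step than the fast one, which is the structure a hierarchical estimator can exploit.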

  4. Understanding Social Hierarchies: The Neural and Psychological Foundations of Status Perception

    PubMed Central

    Koski, Jessica; Xie, Hongling; Olson, Ingrid R.

    2017-01-01

    Social groups across species rapidly self-organize into hierarchies, where members vary in their level of power, influence, skill, or dominance. In this review we explore the nature of social hierarchies and the traits associated with status in both humans and nonhuman primates, and how status varies across development in humans. Our review finds that we can rapidly identify social status based on a wide range of cues. Like monkeys, we tend to use certain cues, like physical strength, to make status judgments, although layered on top of these more primitive perceptual cues are socio-cultural status cues like job titles and educational attainment. One's relative status has profound effects on attention, memory, and social interactions, as well as health and wellness. These effects can be particularly pernicious in children and adolescents. Developmental research on peer groups and social exclusion suggests teenagers may be particularly sensitive to social status information, but research focused specifically on status processing and associated brain areas is very limited. Recent evidence from neuroscience suggests there may be an underlying neural network, including regions involved in executive, emotional, and reward processing, that is sensitive to status information. We conclude with questions for future research as well as stressing the need to expand social neuroscience research on status processing to adolescents. PMID:25697184

  5. Generic hierarchical engine for mask data preparation

    NASA Astrophysics Data System (ADS)

    Kalus, Christian K.; Roessl, Wolfgang; Schnitker, Uwe; Simecek, Michal

    2002-07-01

    Electronic layouts are usually flattened on their path from the hierarchical source downstream to the wafer. Mask data preparation has long been identified as a severe bottleneck. Data volumes are not merely doubling every year along the ITRS roadmap; with the advent of optical proximity correction and phase-shifting masks, they are escalating to unmanageable heights. Hierarchical treatment is one of the most powerful means of keeping memory and CPU consumption within reasonable bounds, yet only recently has this technique attracted wider attention. Mask data preparation is the most critical area calling for a sound infrastructure to reduce the handling problem. Other applications, such as large-area simulation and manufacturing rule checking (MRC), are also gaining attention; all of them would profit from a generic engine capable of efficiently treating hierarchical data. In this paper we present a generic engine for hierarchical treatment that solves the major problem: steady transitions along cell borders. Several alternatives exist for walking through the hierarchy tree, and to date they have not been thoroughly investigated. One is a bottom-up approach that treats cells starting with the most elementary ones; the other is a top-down approach, which lends itself to creating a new hierarchy tree. In addition, since layouts vary widely in character, degree of hierarchy, and quality, a generic engine has to take intelligent decisions when exploding the hierarchy tree. Several applications are shown, in particular how far the limits can be pushed with the current hierarchical engine.
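    The bottom-up alternative can be sketched minimally (the data layout and names here are hypothetical, not the engine's actual interfaces): each cell is visited after its subcells, and results are memoized so repeated references to the same cell are processed only once, which is the reuse that makes hierarchical treatment cheap compared with flattening.

```python
def process_bottom_up(cells, op, root):
    """Bottom-up walk of a cell hierarchy.
    cells: dict mapping cell name -> list of referenced subcell names.
    op(name, subresults): per-cell computation.
    Each cell is computed once and cached; repeated references reuse the
    cached result. (Recursion suffices for shallow hierarchies; a real
    engine would use an explicit stack or topological order.)"""
    done = {}
    def visit(name):
        if name not in done:
            subresults = [visit(sub) for sub in cells.get(name, [])]
            done[name] = op(name, subresults)
        return done[name]
    visit(root)
    return done

# Example: count flattened shape instances, one shape per cell reference.
cells = {"top": ["cellA", "cellA", "cellB"], "cellA": ["cellB"], "cellB": []}
counts = process_bottom_up(cells, lambda name, subs: 1 + sum(subs), "top")
```

    Although "cellA" is referenced twice and "cellB" three times in the flattened view, each is computed exactly once.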

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  7. The Year in Cognitive Neuroscience

    PubMed Central

    Fitch, W Tecumseh; Martins, Mauricio D

    2014-01-01

    Sixty years ago, Karl Lashley suggested that complex action sequences, from simple motor acts to language and music, are a fundamental but neglected aspect of neural function. Lashley demonstrated the inadequacy of then-standard models of associative chaining, positing a more flexible and generalized “syntax of action” necessary to encompass key aspects of language and music. He suggested that hierarchy in language and music builds upon a more basic sequential action system, and provided several concrete hypotheses about the nature of this system. Here, we review a diverse set of modern data concerning musical, linguistic, and other action processing, finding them largely consistent with an updated neuroanatomical version of Lashley's hypotheses. In particular, the lateral premotor cortex, including Broca's area, plays important roles in hierarchical processing in language, music, and at least some action sequences. Although the precise computational function of the lateral prefrontal regions in action syntax remains debated, Lashley's notion—that this cortical region implements a working-memory buffer or stack scannable by posterior and subcortical brain regions—is consistent with considerable experimental data. PMID:24697242

  8. On the Value of Reptilian Brains to Map the Evolution of the Hippocampal Formation.

    PubMed

    Reiter, Sam; Liaw, Hua-Peng; Yamawaki, Tracy M; Naumann, Robert K; Laurent, Gilles

    2017-01-01

    Our ability to navigate through the world depends on the function of the hippocampus. This old cortical structure plays a critical role in spatial navigation in mammals and in a variety of processes, including declarative and episodic memory and social behavior. Intense research has revealed much about hippocampal anatomy, physiology, and computation; yet, even intensely studied phenomena such as the shaping of place cell activity or the function of hippocampal firing patterns during sleep remain incompletely understood. Interestingly, while the hippocampus may be a 'higher order' area linked to a complex cortical hierarchy in mammals, it is an old cortical structure in evolutionary terms. The reptilian cortex, structurally much simpler than the mammalian cortex and hippocampus, therefore presents a good alternative model for exploring hippocampal function. Here, we trace common patterns in the evolution of the hippocampus of reptiles and mammals and ask which parts can be profitably compared to understand functional principles. In addition, we describe a selection of the highly diverse repertoire of reptilian behaviors to illustrate the value of a comparative approach towards understanding hippocampal function. © 2017 S. Karger AG, Basel.

  9. English semantic word-pair norms and a searchable Web portal for experimental stimulus creation.

    PubMed

    Buchanan, Erin M; Holmes, Jessica L; Teasley, Marilee L; Hutchison, Keith A

    2013-09-01

    As researchers explore the complexity of memory and language hierarchies, the need to expand normed stimulus databases is growing. Therefore, we present 1,808 words, paired with their features and concept-concept information, that were collected using previously established norming methods (McRae, Cree, Seidenberg, & McNorgan Behavior Research Methods 37:547-559, 2005). This database supplements existing stimuli and complements the Semantic Priming Project (Hutchison, Balota, Cortese, Neely, Niemeyer, Bengson, & Cohen-Shikora 2010). The data set includes many types of words (including nouns, verbs, adjectives, etc.), expanding on previous collections of nouns and verbs (Vinson & Vigliocco Journal of Neurolinguistics 15:317-351, 2008). We describe the relation between our and other semantic norms, as well as giving a short review of word-pair norms. The stimuli are provided in conjunction with a searchable Web portal that allows researchers to create a set of experimental stimuli without prior programming knowledge. When researchers use this new database in tandem with previous norming efforts, precise stimuli sets can be created for future research endeavors.

  10. Automatic Blocking Of QR and LU Factorizations for Locality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Q; Kennedy, K; You, H

    2004-03-26

    QR and LU factorizations for dense matrices are important linear algebra computations that are widely used in scientific applications. To perform these computations efficiently on modern computers, the factorization algorithms need to be blocked when operating on large matrices, so as to exploit the deep cache hierarchy prevalent in today's computer memory systems. Because both QR (based on Householder transformations) and LU factorization algorithms contain complex loop structures, few compilers can fully automate the blocking of these algorithms. Though linear algebra libraries such as LAPACK provide manually blocked implementations of these algorithms, automatically generating blocked versions of the computations yields additional benefits, such as automatic adaptation of different blocking strategies. This paper demonstrates how to apply an aggressive loop transformation technique, dependence hoisting, to produce efficient blockings for both QR and LU with partial pivoting. We present different blocking strategies that can be generated by our optimizer and compare the performance of auto-blocked versions with manually tuned versions in LAPACK, using reference BLAS, ATLAS BLAS, and native BLAS specially tuned for the underlying machine architectures.
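    To make the blocking idea concrete, here is a sketch of a right-looking blocked LU factorization in NumPy, without partial pivoting (unlike the algorithms treated in the paper) and with a fixed rather than auto-tuned block size. The point of blocking is that the trailing-submatrix update becomes one large matrix-matrix product, the operation that caches and BLAS handle best.

```python
import numpy as np

def blocked_lu(A, bs=32):
    """Right-looking blocked LU without pivoting, on a copy of A.
    Returns M with unit-lower-triangular L and upper-triangular U packed
    into one matrix. Safe only for matrices that need no pivoting, e.g.
    diagonally dominant ones."""
    M = A.astype(float).copy()
    n = M.shape[0]
    for k in range(0, n, bs):
        e = min(k + bs, n)
        # 1) Factor the panel M[k:n, k:e] with unblocked LU.
        for j in range(k, e):
            M[j+1:n, j] /= M[j, j]
            M[j+1:n, j+1:e] -= np.outer(M[j+1:n, j], M[j, j+1:e])
        # 2) Forward-substitute to form U12 = L11^{-1} * A12.
        for j in range(k, e):
            M[j+1:e, e:n] -= np.outer(M[j+1:e, j], M[j, e:n])
        # 3) Trailing update: one big GEMM -- the cache-friendly part.
        M[e:n, e:n] -= M[e:n, k:e] @ M[k:e, e:n]
    return M
```

    A compiler transformation such as the paper's dependence hoisting aims to derive this kind of structure automatically from the naive triple loop.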

  11. VLBI-resolution radio-map algorithms: Performance analysis of different levels of data-sharing on multi-socket, multi-core architectures

    NASA Astrophysics Data System (ADS)

    Tabik, S.; Romero, L. F.; Mimica, P.; Plata, O.; Zapata, E. L.

    2012-09-01

    A broad area in astronomy focuses on simulating extragalactic objects based on Very Long Baseline Interferometry (VLBI) radio-maps. Several algorithms in this scope simulate the radio-maps that would be observed if emitted from a predefined extragalactic object. This work analyzes the performance and scaling of this kind of algorithm on multi-socket, multi-core architectures. In particular, we evaluate a sharing approach, a privatizing approach and a hybrid approach on systems with a complex memory hierarchy that includes a shared Last Level Cache (LLC). In addition, we investigate which manual processes can be systematized and then automated in future work. The experiments show that the data-privatizing model scales efficiently on medium-scale multi-socket, multi-core systems (up to 48 cores), while, regardless of algorithmic and scheduling optimizations, the sharing approach is unable to reach acceptable scalability on more than one socket. However, the hybrid model with a specific level of data-sharing provides the best scalability on all of the multi-socket, multi-core systems used.

  12. Memory as the "whole brain work": a large-scale model based on "oscillations in super-synergy".

    PubMed

    Başar, Erol

    2005-01-01

    According to recent trends, memory depends on several brain structures working in concert across many levels of neural organization; "memory is a constant work-in-progress." The proposition of a brain theory based on super-synergy in neural populations is most pertinent for understanding this constant work in progress. This report introduces a new model of memory based on the processes of EEG oscillations and brain dynamics. The model is shaped by the following conceptual and experimental steps: 1. The machineries of super-synergy in the whole brain are responsible for the formation of sensory-cognitive percepts. 2. The expression "dynamic memory" is used for memory processes that evoke relevant changes in alpha, gamma, theta and delta activities. The concerted action of distributed multiple oscillatory processes provides a major key for understanding distributed memory, which also encompasses phyletic memory and reflexes. 3. The evolving memory, which incorporates reciprocal actions or reverberations in the APLR alliance and during working memory processes, is especially emphasized. 4. A new model related to a "hierarchy of memories as a continuum" is introduced. 5. The notions of "longer activated memory" and "persistent memory" are proposed instead of long-term memory. 6. The new analysis to recognize faces emphasizes the importance of EEG oscillations in neurophysiology and Gestalt analysis. 7. The proposed basic framework, called "Memory in the Whole Brain Work," emphasizes that memory and all brain functions are inseparable and act as a whole in the whole brain. 8. According to recent publications, the role of genetic factors is fundamental in living-system settings and oscillations, and accordingly in memory. 9. A link from the "whole brain" to the "whole body," incorporating the vegetative and neurological systems, is proposed, with EEG oscillations and ultraslow oscillations serving as a control parameter.

  13. Hierarchy Bayesian model based services awareness of high-speed optical access networks

    NASA Astrophysics Data System (ADS)

    Bai, Hui-feng

    2018-03-01

    As the speed of optical access networks soars and the services they carry multiply, the service-supporting ability of optical access networks suffers greatly from the shortage of service awareness. Aiming to solve this problem, a hierarchy Bayesian model based services awareness mechanism is proposed for high-speed optical access networks. This approach builds a so-called hierarchy Bayesian model according to the structure of typical optical access networks. Moreover, the proposed scheme is able to conduct simple service-awareness operations in each optical network unit (ONU) and to perform complex service awareness from a whole-system view in the optical line terminal (OLT). Simulation results show that the proposed scheme achieves better quality of service (QoS) in terms of packet loss rate and time delay.
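    The abstract gives no equations, but the two-level idea can be sketched generically (the names, the three service classes, and the conditional-independence assumption below are ours, not the paper's): each ONU contributes local evidence about the service class, and the OLT fuses the evidence from all ONUs by Bayes' rule.

```python
import numpy as np

def normalize(p):
    return p / p.sum()

def onu_update(prior, likelihood):
    """Per-ONU Bayes update over service classes: posterior ∝ prior * likelihood."""
    return normalize(np.asarray(prior, float) * np.asarray(likelihood, float))

def olt_fuse(prior, onu_likelihoods):
    """OLT-level fusion: fold in each ONU's likelihood, assuming the
    observations at different ONUs are conditionally independent given
    the service class (our simplifying assumption)."""
    post = np.asarray(prior, dtype=float)
    for lik in onu_likelihoods:
        post = normalize(post * np.asarray(lik, dtype=float))
    return post

prior = np.array([1/3, 1/3, 1/3])       # three hypothetical service classes
liks = [np.array([0.7, 0.2, 0.1]),       # evidence reported by ONU 1
        np.array([0.6, 0.3, 0.1])]       # evidence reported by ONU 2
posterior = olt_fuse(prior, liks)
```

    Two ONUs both favoring the first class sharpen the OLT's posterior toward it, which is the qualitative behavior a hierarchical awareness scheme relies on.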

  14. Symmetries of hyper-Kähler (or Poisson gauge field) hierarchy

    NASA Astrophysics Data System (ADS)

    Takasaki, K.

    1990-08-01

    Symmetry properties of the space of complex (or formal) hyper-Kähler metrics are studied in the language of hyper-Kähler hierarchies. The construction of finite symmetries is analogous to the theory of Riemann-Hilbert transformations, loop group elements now taking values in a (pseudo-) group of canonical transformations of a symplectic manifold. In spite of their highly nonlinear and involved nature, infinitesimal expressions of these symmetries are shown to have a rather simple form. These infinitesimal transformations are extended to the Plebanski key functions to give rise to a nonlinear realization of a Poisson loop algebra. The Poisson algebra structure turns out to originate in a contact structure behind a set of symplectic structures inherent in the hyper-Kähler hierarchy. Possible relations to membrane theory are briefly discussed.

  15. Memory Synapses Are Defined by Distinct Molecular Complexes: A Proposal

    PubMed Central

    Sossin, Wayne S.

    2018-01-01

    Synapses are diverse in form and function. While there are strong evidential and theoretical reasons for believing that memories are stored at synapses, the concept of a specialized “memory synapse” is rarely discussed. Here, we review the evidence that memories are stored at the synapse and consider the opposing possibilities. We argue that if memories are stored in an active fashion at synapses, then these memory synapses must have distinct molecular complexes that distinguish them from other synapses. In particular, examples from Aplysia sensory-motor neuron synapses and synapses on defined engram neurons in rodent models are discussed. Specific hypotheses for molecular complexes that define memory synapses are presented, including persistently active kinases, transmitter receptor complexes and trans-synaptic adhesion proteins. PMID:29695960

  16. State recovery and lockstep execution restart in a system with multiprocessor pairing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gara, Alan; Gschwind, Michael K; Salapura, Valentina

    System, method and computer program product for a multiprocessing system to offer selective pairing of processor cores for increased processing reliability. A selective pairing facility is provided that selectively connects, i.e., pairs, multiple microprocessor or processor cores to provide one highly reliable thread (or thread group). Each pair of microprocessor or processor cores that provides one highly reliable thread connects with system components such as a memory "nest" (or memory hierarchy), an optional system controller, an optional interrupt controller, and optional I/O or peripheral devices. The memory nest is attached to the selective pairing facility via a switch or a bus. Each selectively paired processor core includes a transactional execution facility, wherein the system is configured to enable processor rollback to a previous state and to reinitialize lockstep execution in order to recover from an incorrect execution detected by the selective pairing facility.

  17. Discovering Event Structure in Continuous Narrative Perception and Memory.

    PubMed

    Baldassano, Christopher; Chen, Janice; Zadbood, Asieh; Pillow, Jonathan W; Hasson, Uri; Norman, Kenneth A

    2017-08-02

    During realistic, continuous perception, humans automatically segment experiences into discrete events. Using a novel model of cortical event dynamics, we investigate how cortical structures generate event representations during narrative perception and how these events are stored to and retrieved from memory. Our data-driven approach allows us to detect event boundaries as shifts between stable patterns of brain activity without relying on stimulus annotations and reveals a nested hierarchy from short events in sensory regions to long events in high-order areas (including angular gyrus and posterior medial cortex), which represent abstract, multimodal situation models. High-order event boundaries are coupled to increases in hippocampal activity, which predict pattern reinstatement during later free recall. These areas also show evidence of anticipatory reinstatement as subjects listen to a familiar narrative. Based on these results, we propose that brain activity is naturally structured into nested events, which form the basis of long-term memory representations. Copyright © 2017 Elsevier Inc. All rights reserved.
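    The authors' method is a probabilistic event-segmentation model fit to brain activity; as a crude illustration of the underlying intuition only (not their algorithm), one can flag event boundaries as timepoints where the multivariate activity pattern decorrelates from the preceding timepoint:

```python
import numpy as np

def naive_boundaries(X, thresh=0.5):
    """Flag timepoints whose activity pattern correlates poorly with the
    previous timepoint. X: array of shape (time, features). A crude
    stand-in for a proper event-segmentation model; thresh is an
    arbitrary assumption."""
    bounds = []
    for t in range(1, len(X)):
        r = np.corrcoef(X[t - 1], X[t])[0, 1]
        if r < thresh:
            bounds.append(t)
    return bounds

# Two stable patterns with an abrupt switch at t = 5.
A = np.array([1.0, 0.0] * 4)
B = np.array([0.0, 1.0] * 4)
X = np.vstack([A] * 5 + [B] * 5)
```

    On this synthetic input the only flagged boundary is the pattern switch; real data require a model that tolerates noise and nested timescales, which is what the paper's approach provides.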

  18. A linguistic geometry for 3D strategic planning

    NASA Technical Reports Server (NTRS)

    Stilman, Boris

    1995-01-01

    This paper is a new step in the development and application of Linguistic Geometry. This formal theory is intended to discover the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and to apply them to different systems. In this paper we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing Linguistic Geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are demonstrated on a new pilot example: the solution of an extremely complex 3D optimization problem of strategic planning for the space combat of autonomous vehicles. This example demonstrates deep and highly selective search in comparison with conventional search algorithms.

  20. The Role of Pulvinar in the Transmission of Information in the Visual Hierarchy

    PubMed Central

    Cortes, Nelson; van Vreeswijk, Carl

    2012-01-01

    Visual receptive field (RF) attributes in the visual cortex of primates have been explained mainly in terms of cortical connections: visual RFs progress from simple to complex through cortico-cortical pathways from lower to higher levels in the visual hierarchy. This feedforward flow of information is paired with top-down processes through the feedback pathway. Although the hierarchical organization explains the spatial properties of RFs, it is unclear how a non-linear transmission of activity through the visual hierarchy can yield smooth contrast response functions at all levels of the hierarchy. Depending on the gain, non-linear transfer functions create either a bimodal response to contrast or no contrast dependence of the response at the highest level of the hierarchy. One possible mechanism to regulate this transmission of visual contrast information from low to high levels involves an external component that shortcuts the flow of information through the hierarchy. A candidate for this shortcut is the pulvinar nucleus of the thalamus. To investigate the representation of stimulus contrast, a hierarchical model network of ten cortical areas is examined. In each level of the network, the activity from the previous layer is integrated and then non-linearly transmitted to the next level. The arrangement of interactions creates a gradient from simple to complex RFs of increasing size as one moves from lower to higher cortical levels. The visual input is modeled as a Gaussian random input whose width codes for the contrast. This input is applied to the first area. The output activity ratio among different contrast values is analyzed for the last level to assess sensitivity to contrast and contrast-invariant tuning. For a purely cortical system, the output of the last area can be approximately contrast invariant, but the sensitivity to contrast is poor. To account for an alternative visual processing pathway, non-reciprocal connections from and to a parallel pulvinar-like structure of nine areas are coupled to the system. Compared to the pure feedforward model, the cortico-pulvino-cortical output presents much more sensitivity to contrast and has a similar level of contrast invariance of the tuning. PMID:22654750
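    A minimal numerical sketch of the purely cortical case (a toy of our own, with arbitrary gain values, not the paper's model): iterating a sigmoidal transfer function across ten areas either saturates the response into an all-or-none function of contrast (high gain) or washes the contrast dependence out entirely (low gain), which is the transmission dilemma the pulvinar shortcut is proposed to resolve.

```python
import numpy as np

def cortical_chain(contrast, n_areas=10, gain=8.0):
    """Pass a contrast value through a chain of areas, each applying a
    sigmoid centered at 0.5. High gain drives the final output toward a
    bimodal (all-or-none) function of contrast; low gain collapses all
    contrasts toward the same output. Gain values are assumptions."""
    r = contrast
    for _ in range(n_areas):
        r = 1.0 / (1.0 + np.exp(-gain * (r - 0.5)))
    return r
```

    With gain = 8, contrasts 0.4 and 0.6 end up near 0 and 1 respectively; with gain = 1, contrasts 0.1 and 0.9 both collapse toward 0.5.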

  1. Influence of consonant frequency on Icelandic-speaking children's speech acquisition.

    PubMed

    Másdóttir, Thóra; Stokes, Stephanie F

    2016-04-01

    A developmental hierarchy of phonetic feature complexity has been proposed, suggesting that later-emerging sounds have greater articulatory complexity than those learned earlier. The aim of this research was to explore this hierarchy in a relatively unexplored language, Icelandic. Twenty-eight typically developing Icelandic-speaking children were tested at ages 2;4 and 3;4. Word-initial and word-medial phonemic inventories and a phonemic implicational hierarchy are described. The frequency of occurrence of Icelandic consonants in the speech of the 2;4- and 3;4-year-old children was, from most to least frequent, n, s, t, p, r, m, l, k, f, ʋ, j, ɵ, h, kʰ, c, [Formula: see text], ɰ, pʰ, tʰ, cʰ, ç, [Formula: see text], [Formula: see text], [Formula: see text]. Consonant frequency was a strong predictor of consonant accuracy at age 2;4 (r(23) = -0.75), but the effect was weaker at 3;4 (r(23) = -0.51). Acquisition of /c/, /[Formula: see text]/ and /l/ occurred earlier, relative to English, Swedish, Dutch and German. A frequency-bound practice effect on emerging consonants is proposed to account for the early emergence of /c/, /[Formula: see text]/ and /l/ in Icelandic.

  2. A GPU-Accelerated Approach for Feature Tracking in Time-Varying Imagery Datasets.

    PubMed

    Peng, Chao; Sahani, Sandip; Rushing, John

    2017-10-01

    We propose a novel parallel connected component labeling (CCL) algorithm along with efficient out-of-core data management to detect and track feature regions of large time-varying imagery datasets. Our approach contributes to the big data field with parallel algorithms tailored for GPU architectures. We remove the data dependency between frames and achieve pixel-level parallelism. Due to the large size, the entire dataset cannot fit into cached memory. Frames have to be streamed through the memory hierarchy (disk to CPU main memory and then to GPU memory), partitioned, and processed as batches, where each batch is small enough to fit into the GPU. To reconnect the feature regions that are separated due to data partitioning, we present a novel batch merging algorithm to extract the region connection information across multiple batches in a parallel fashion. The information is organized in a memory-efficient structure and supports fast indexing on the GPU. Our experiment uses a commodity workstation equipped with a single GPU. The results show that our approach can efficiently process a weather dataset composed of terabytes of time-varying radar images. The advantages of our approach are demonstrated by comparing to the performance of an efficient CPU cluster implementation which is being used by the weather scientists.
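    The batch-merging step can be pictured with a toy serial union-find over the shared boundary row between two batches (a sketch of the idea only; the paper's version runs in parallel on the GPU with a memory-efficient index structure):

```python
def merge_boundary(last_row_a, first_row_b):
    """Union the labels of vertically adjacent foreground pixels across a
    batch boundary (0 = background). last_row_a is the bottom row of
    batch A's labeling, first_row_b the top row of batch B's. Returns a
    dict mapping each touched (batch, label) pair to its merged root."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra
    for a, b in zip(last_row_a, first_row_b):
        if a and b:  # both pixels are foreground -> same region
            union(('A', a), ('B', b))
    return {x: find(x) for x in list(parent)}
```

    If batch A's regions 1 and 2 both touch batch B's region 5 across the boundary, all three collapse to a single root, reconnecting the feature region that data partitioning split.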

  3. Long-term memory of hierarchical relationships in free-living greylag geese.

    PubMed

    Weiss, Brigitte M; Scheiber, Isabella B R

    2013-01-01

    Animals may memorise spatial and social information for many months and even years. Here, we investigated long-term memory of hierarchically ordered relationships, where the position of a reward depended on the relationship of a stimulus relative to other stimuli in the hierarchy. Seventeen greylag geese (Anser anser) had been trained on discriminations between successive pairs of five or seven implicitly ordered colours, where the higher ranking colour in each pair was rewarded. Geese were re-tested on the task 2, 6 and 12 months after learning the dyadic colour relationships. They chose the correct colour above chance at all three points in time, whereby performance was better in colour pairs at the beginning or end of the colour series. Nonetheless, they also performed above chance on internal colour pairs, which is indicative of long-term memory for quantitative differences in associative strength and/or for relational information. There were no indications for a decline in performance over time, indicating that geese may remember dyadic relationships for at least 6 months and probably well over 1 year. Furthermore, performance in the memory task was unrelated to the individuals' sex and their performance while initially learning the dyadic colour relationships. We discuss possible functions of this long-term memory in the social domain.

  4. Symbolic Dynamics and Grammatical Complexity

    NASA Astrophysics Data System (ADS)

    Hao, Bai-Lin; Zheng, Wei-Mou

    The following sections are included: * Formal Languages and Their Complexity * Formal Language * Chomsky Hierarchy of Grammatical Complexity * The L-System * Regular Language and Finite Automaton * Finite Automaton * Regular Language * Stefan Matrix as Transfer Function for Automaton * Beyond Regular Languages * Feigenbaum and Generalized Feigenbaum Limiting Sets * Even and Odd Fibonacci Sequences * Odd Maximal Primitive Prefixes and Kneading Map * Even Maximal Primitive Prefixes and Distinct Excluded Blocks * Summary of Results

  5. Classroom-Oriented Research from a Complex Systems Perspective

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2016-01-01

    Bringing a complex systems perspective to bear on classroom-oriented research challenges researchers to think differently, seeing the classroom ecology as one dynamic system nested in a hierarchy of such systems at different levels of scale, all of which are spatially and temporally situated. This article begins with an introduction to complex…

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Philip LaRoche

At the end of his life, Stephen Jay Kline, longtime professor of mechanical engineering at Stanford University, completed a book on how to address complex systems. The title of the book is 'Conceptual Foundations of Multi-Disciplinary Thinking' (1995), but the topic of the book is systems. Kline first establishes certain limits that are characteristic of our conscious minds. Kline then establishes a complexity measure for systems and uses that complexity measure to develop a hierarchy of systems. Kline then argues that our minds, due to their characteristic limitations, are unable to model the complex systems in that hierarchy. Computers are of no help to us here. Our attempts at modeling these complex systems are based on the way we successfully model some simple systems, in particular, 'inert, naturally-occurring' objects and processes, such as what is the focus of physics. But complex systems overwhelm such attempts. As a result, the best we can do in working with these complex systems is to use a heuristic, what Kline calls the 'Guideline for Complex Systems.' Kline documents the problems that have developed due to 'oversimple' system models and from the inappropriate application of a system model from one domain to another. One prominent such problem is the Procrustean attempt to make the disciplines that deal with complex systems be 'physics-like.' Physics deals with simple systems, not complex ones, using Kline's complexity measure. The models that physics has developed are inappropriate for complex systems. Kline documents a number of the wasteful and dangerous fallacies of this type.

  7. The composite complex span: French validation of a short working memory task.

    PubMed

    Gonthier, Corentin; Thomassin, Noémylle; Roulin, Jean-Luc

    2016-03-01

    Most studies in individual differences in the field of working memory research use complex span tasks to measure working memory capacity. Various complex span tasks based on different materials have been developed, and these tasks have proven both reliable and valid; several complex span tasks are often combined to provide a domain-general estimate of working memory capacity with even better psychometric properties. The present work sought to address two issues. Firstly, having participants perform several full-length complex span tasks in succession makes for a long and tedious procedure. Secondly, few complex span tasks have been translated and validated in French. We constructed a French working memory task labeled the Composite Complex Span (CCS). The CCS includes shortened versions of three classic complex span tasks: the reading span, symmetry span, and operation span. We assessed the psychometric properties of the CCS, including test-retest reliability and convergent validity, with Raven's Advanced Progressive Matrices and with an alpha span task; the CCS demonstrated satisfying qualities in a sample of 1,093 participants. This work provides evidence that shorter versions of classic complex span tasks can yield valid working memory estimates. The materials and normative data for the CCS are also included.

  8. Visual short-term memory capacity for simple and complex objects.

    PubMed

    Luria, Roy; Sessa, Paola; Gotler, Alex; Jolicoeur, Pierre; Dell'Acqua, Roberto

    2010-03-01

Does the capacity of visual short-term memory (VSTM) depend on the complexity of the objects represented in memory? Although some previous findings indicated lower capacity for more complex stimuli, other results suggest that complexity effects arise during retrieval (due to errors in the comparison process with what is in memory) and are not related to storage limitations of VSTM per se. We used ERPs to track neuronal activity specifically related to retention in VSTM by measuring the sustained posterior contralateral negativity during a change detection task (which required detecting whether an item changed between a memory and a test array). The sustained posterior contralateral negativity during the retention interval was larger for complex objects than for simple objects, suggesting that neurons mediating VSTM needed to work harder to maintain more complex objects. This, in turn, is consistent with the view that VSTM capacity depends on complexity.

  9. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  10. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE PAGES

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; ...

    2017-10-01

GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. Here, the goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.

  11. Stochastic optimization of GeantV code by use of genetic algorithms

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Behera, S. P.; Brun, R.; Canal, P.; Carminati, F.; Cosmo, G.; Duhem, L.; Elvira, D.; Folger, G.; Gheata, A.; Gheata, M.; Goulas, I.; Hariri, F.; Jun, S. Y.; Konstantinov, D.; Kumawat, H.; Ivantchenko, V.; Lima, G.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.
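The black-box setting this record describes, searching a parameter set using only point-wise function evaluations, can be illustrated with a minimal (1+λ) evolution strategy. Everything below is a sketch under stated assumptions: the `sphere` objective is a hypothetical stand-in for an expensive simulation-throughput measurement, and this is not GeantV's actual tuner.

```python
import random

def one_plus_lambda_es(cost, x0, sigma=0.5, lam=8, iters=60, seed=1):
    """(1+lambda) evolution strategy: mutate the incumbent, keep any
    improvement. Uses only point-wise evaluations of `cost`."""
    rng = random.Random(seed)
    parent, best = list(x0), cost(x0)
    for _ in range(iters):
        for _ in range(lam):
            child = [p + rng.gauss(0.0, sigma) for p in parent]
            c = cost(child)
            if c < best:
                parent, best = child, c
    return parent, best

# Hypothetical stand-in for the expensive simulation-throughput objective.
def sphere(x):
    return sum(v * v for v in x)
```

A real tuner would add recombination, adaptive step sizes, and batched fitness evaluations, which is where the multivariate analysis operator mentioned in the abstract would enter.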

  12. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.

GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter and a geometrical modeler library for describing the detector and locating the particles and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching the optimum set of parameters using only point-wise function evaluations. Here, the goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in case of resource expensive or time consuming evaluations of fitness functions, in order to speed-up the convergence of the black-box optimization problem.

  13. Flows, scaling, and the control of moment hierarchies for stochastic chemical reaction networks

    NASA Astrophysics Data System (ADS)

    Smith, Eric; Krishnamurthy, Supriya

    2017-12-01

Stochastic chemical reaction networks (CRNs) are complex systems that combine the features of concurrent transformation of multiple variables in each elementary reaction event and nonlinear relations between states and their rates of change. Most general results concerning CRNs are limited to restricted cases where a topological characteristic known as deficiency takes a value 0 or 1, implying uniqueness and positivity of steady states and surprising, low-information forms for their associated probability distributions. Here we derive equations of motion for fluctuation moments at all orders for stochastic CRNs at general deficiency. We show, for the standard base case of proportional sampling without replacement (which underlies the mass-action rate law), that the generator of the stochastic process acts on the hierarchy of factorial moments with a finite representation. Whereas simulation of high-order moments for many-particle systems is costly, this representation reduces the solution of moment hierarchies to a complexity comparable to solving a heat equation. At steady states, moment hierarchies for finite CRNs interpolate between low-order and high-order scaling regimes, which may be approximated separately by distributions similar to those for deficiency-zero networks and connected through matched asymptotic expansions. In CRNs with multiple stable or metastable steady states, boundedness of high-order moments provides the starting condition for recursive solution downward to low-order moments, reversing the order usually used to solve moment hierarchies. A basis for a subset of network flows defined by having the same mean-regressing property as the flows in deficiency-zero networks gives the leading contribution to low-order moments in CRNs at general deficiency, in a 1/n expansion in large particle numbers.
Our results give a physical picture of the different informational roles of mean-regressing and non-mean-regressing flows and clarify the dynamical meaning of deficiency not only for first-moment conditions but for all orders in fluctuations.

  14. Complexity, Training Paradigm Design, and the Contribution of Memory Subsystems to Grammar Learning

    PubMed Central

    Ettlinger, Marc; Wong, Patrick C. M.

    2016-01-01

    Although there is variability in nonnative grammar learning outcomes, the contributions of training paradigm design and memory subsystems are not well understood. To examine this, we presented learners with an artificial grammar that formed words via simple and complex morphophonological rules. Across three experiments, we manipulated training paradigm design and measured subjects' declarative, procedural, and working memory subsystems. Experiment 1 demonstrated that passive, exposure-based training boosted learning of both simple and complex grammatical rules, relative to no training. Additionally, procedural memory correlated with simple rule learning, whereas declarative memory correlated with complex rule learning. Experiment 2 showed that presenting corrective feedback during the test phase did not improve learning. Experiment 3 revealed that structuring the order of training so that subjects are first exposed to the simple rule and then the complex improved learning. The cumulative findings shed light on the contributions of grammatical complexity, training paradigm design, and domain-general memory subsystems in determining grammar learning success. PMID:27391085

  15. Parameterized Micro-benchmarking: An Auto-tuning Approach for Complex Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Wenjing; Krishnamoorthy, Sriram; Agrawal, Gagan

    2012-05-15

Auto-tuning has emerged as an important practical method for creating highly optimized implementations of key computational kernels and applications. However, the growing complexity of architectures and applications is creating new challenges for auto-tuning. Complex applications can involve a prohibitively large search space that precludes empirical auto-tuning. Similarly, architectures are becoming increasingly complicated, making it hard to model performance. In this paper, we focus on the challenge to auto-tuning presented by applications with a large number of kernels and kernel instantiations. While these kernels may share a somewhat similar pattern, they differ considerably in problem sizes and the exact computation performed. We propose and evaluate a new approach to auto-tuning which we refer to as parameterized micro-benchmarking. It is an alternative to the two existing classes of approaches to auto-tuning: analytical model-based and empirical search-based. Particularly, we argue that the former may not be able to capture all the architectural features that impact performance, whereas the latter might be too expensive for an application that has several different kernels. In our approach, different expressions in the application, different possible implementations of each expression, and the key architectural features are used to derive a simple micro-benchmark and a small parameter space. This allows us to learn the most significant features of the architecture that can impact the choice of implementation for each kernel. We have evaluated our approach in the context of GPU implementations of tensor contraction expressions encountered in excited state calculations in quantum chemistry. We have focused on two aspects of GPUs that affect tensor contraction execution: memory access patterns and kernel consolidation. Using our parameterized micro-benchmarking approach, we obtain a speedup of up to 2x over the version that used default optimizations, but no auto-tuning. We demonstrate that observations made from micro-benchmarks match the behavior seen from real expressions. In the process, we make important observations about the memory hierarchy of two of the most recent NVIDIA GPUs, which can be used in other optimization frameworks as well.
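The core idea of parameterized micro-benchmarking, timing a small representative kernel over a small parameter space and letting the measurements pick the implementation per problem size, can be sketched as follows. The two "kernel variants" and the best-of-N timer are hypothetical CPU stand-ins for the paper's GPU kernels, included only to show the selection loop.

```python
import time

def microbench(fn, arg, repeats=5):
    """Best-of-N wall-clock time for one kernel variant. A sketch: a real
    harness would also warm caches and pin clock frequencies."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn(arg)
        best = min(best, time.perf_counter() - t0)
    return best

def loop_sum(xs):
    s = 0
    for v in xs:
        s += v
    return s

def pick_variant(variants, sizes):
    """For each problem size, keep the variant with the lowest measured time."""
    choice = {}
    for n in sizes:
        data = list(range(n))
        choice[n] = min(variants, key=lambda name: microbench(variants[name], data))
    return choice

# Two hypothetical variants sharing a pattern but differing in implementation.
variants = {"builtin_sum": sum, "loop_sum": loop_sum}
choice = pick_variant(variants, sizes=(1_000, 50_000))
```

The point is that the micro-benchmark, not an analytical model, decides which variant wins at each size, mirroring the paper's middle ground between model-based and full empirical search.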

  16. Vanishing point: Scale independence in geomorphological hierarchies

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan D.

    2016-08-01

Scale linkage problems in geosciences are often associated with a hierarchy of components. Both dynamical systems perspectives and intuition suggest that processes or relationships operating at fundamentally different scales are independent with respect to influences on system dynamics. But how far apart is "fundamentally different"? That is, what is the "vanishing point" at which scales are no longer interdependent? And how do we reconcile that with the idea (again, supported by both theory and intuition) that we can work our way along scale hierarchies from microscale to planetary and vice versa? Graph and network theory are employed here to address these questions. Analysis of two archetypal hierarchical networks shows low algebraic connectivity, indicating low levels of inferential synchronization. This explains the apparent paradox between scale independence and hierarchical linkages. Incorporating more hierarchical levels increases the complexity or entropy of the network as a whole, but at a nonlinear rate: complexity increases as a power α of the number of levels in the hierarchy, with α < 1 and usually ≤ 0.6. Algebraic connectivity, however, decreases at a more rapid rate. Thus, the ability to infer one part of the hierarchical network from other levels decays rapidly as more levels are added. Relatedness among system components decreases with differences in scale or resolution, analogous to distance decay in the spatial domain. These findings suggest a strategy of identifying and focusing on the most important or interesting scale levels, rather than attempting to identify the smallest or largest scale levels and working top-down or bottom-up from there. Examples are given from soil geomorphology and karst flow networks.
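The abstract's central quantity, algebraic connectivity, can be checked numerically on an archetypal hierarchy: the Fiedler value (second-smallest Laplacian eigenvalue) of complete binary trees falls rapidly as levels are added. This is an independent numerical illustration, not the author's code, and assumes NumPy is available.

```python
import numpy as np

def binary_tree_laplacian(levels):
    """Graph Laplacian of a complete binary tree with `levels` levels
    (nodes stored heap-style: node i has children 2i+1 and 2i+2)."""
    n = 2 ** levels - 1
    L = np.zeros((n, n))
    for child in range(1, n):
        parent = (child - 1) // 2
        L[parent, child] = L[child, parent] = -1.0
    np.fill_diagonal(L, -L.sum(axis=1))  # diagonal = node degree
    return L

def algebraic_connectivity(L):
    """Second-smallest Laplacian eigenvalue (the Fiedler value)."""
    return float(np.sort(np.linalg.eigvalsh(L))[1])

fiedler = [algebraic_connectivity(binary_tree_laplacian(k)) for k in (2, 3, 4, 5)]
```

Each added level shrinks the Fiedler value sharply, consistent with the claim that inferential synchronization between distant levels of the hierarchy vanishes.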

  17. Influence of motivation on control hierarchy in the human frontal cortex.

    PubMed

    Bahlmann, Jörg; Aarts, Esther; D'Esposito, Mark

    2015-02-18

The frontal cortex mediates cognitive control and motivation to shape human behavior. It is generally observed that medial frontal areas are involved in motivational aspects of behavior, whereas lateral frontal regions are involved in cognitive control. Recent models of cognitive control suggest a rostro-caudal gradient in lateral frontal regions, such that progressively more rostral (anterior) regions process more complex aspects of cognitive control. How motivation influences such a control hierarchy is still under debate. Although some researchers argue that both systems work in parallel, others argue in favor of an interaction between motivation and cognitive control. In the latter case it is yet unclear how motivation would affect the different levels of the control hierarchy. This was investigated in the present functional MRI study applying different levels of cognitive control under different motivational states (low vs high reward anticipation). Three levels of cognitive control were tested by varying rule complexity: stimulus-response mapping (low-level), flexible task updating (mid-level), and sustained cue-task associations (high-level). We found an interaction between levels of cognitive control and motivation in medial and lateral frontal subregions. Specifically, flexible updating (mid-level of control) showed the strongest beneficial effect of reward and only this level exhibited functional coupling between dopamine-rich midbrain regions and the lateral frontal cortex. These findings suggest that motivation differentially affects the levels of a control hierarchy, influencing recruitment of frontal cortical control regions depending on specific task demands.

  18. Predictability and hierarchy in Drosophila behavior.

    PubMed

    Berman, Gordon J; Bialek, William; Shaevitz, Joshua W

    2016-10-18

    Even the simplest of animals exhibit behavioral sequences with complex temporal dynamics. Prominent among the proposed organizing principles for these dynamics has been the idea of a hierarchy, wherein the movements an animal makes can be understood as a set of nested subclusters. Although this type of organization holds potential advantages in terms of motion control and neural circuitry, measurements demonstrating this for an animal's entire behavioral repertoire have been limited in scope and temporal complexity. Here, we use a recently developed unsupervised technique to discover and track the occurrence of all stereotyped behaviors performed by fruit flies moving in a shallow arena. Calculating the optimally predictive representation of the fly's future behaviors, we show that fly behavior exhibits multiple time scales and is organized into a hierarchical structure that is indicative of its underlying behavioral programs and its changing internal states.

  19. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhary, Alok; Kandemir, Mahmut

In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology for I/O intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.

  20. The Sleep Elaboration-Awake Pruning (SEAP) theory of memory: long term memories grow in complexity during sleep and undergo selection while awake. Clinical, psychopharmacological and creative implications.

    PubMed

    Charlton, Bruce G; Andras, Peter

    2009-07-01

Long term memory (LTM) systems need to be adaptive, such that they enhance an organism's reproductive fitness, and self-reproducing, in order to maintain their complexity of communications over time in the face of entropic loss of information. Traditional 'representation-consolidation' accounts conceptualize memory adaptiveness as due to memories being 'representations' of the environment, and the longevity of memories as due to 'consolidation' processes. The assumption is that memory representations are formed while an animal is awake and interacting with the environment, and these memories are consolidated mainly while the animal is asleep. So the traditional view of memory is 'instructionist' and assumes that information is transferred from the environment into the brain. By contrast, we see memories as arising endogenously within the brain's LTM system mainly during sleep, to create complex but probably maladaptive memories which are then simplified ('pruned') and selected during the awake period. When awake the LTM system is brought into a more intense interaction with past and present experience. Ours is therefore a 'selectionist' account of memory, and could be termed the Sleep Elaboration-Awake Pruning (or SEAP) theory. The SEAP theory explains the longevity of memories in the face of entropy by the tendency for memories to grow in complexity during sleep; and explains the adaptiveness of memory by selection for consistency with perceptions and previous memories during the awake state. Sleep is therefore that behavioural state during which most of the internal processing of the system of LTM occurs; and the reason sleep remains poorly understood is that its primary activity is the expansion of long term memories.
By re-conceptualizing the relationship between memory, sleep and the environment, SEAP provides a radically new framework for memory research, with implications for the measurement of memory and the design of empirical investigations in clinical, psychopharmacological and creative domains. For example, it would be predicted that states of insufficient alertness such as delirium would produce errors of commission (memory distortion and false memories, as with psychotic delusions), while sleep deprivation would produce errors of memory omission (memory loss). Ultimately, the main argument in favour of SEAP is that long term memory must be a complex adaptive system, and complex systems arise, are selected and sustained according to the principles of systems theory; and therefore LTM cannot be functioning in the way assumed by 'representation-consolidation' theories.

  1. Helium Nanobubbles Enhance Superelasticity and Retard Shear Localization in Small-Volume Shape Memory Alloy.

    PubMed

    Han, Wei-Zhong; Zhang, Jian; Ding, Ming-Shuai; Lv, Lan; Wang, Wen-Hong; Wu, Guang-Heng; Shan, Zhi-Wei; Li, Ju

    2017-06-14

The intriguing phenomenon of metal superelasticity relies on stress-induced martensitic transformation (SIMT), which is well known to be governed by cooperative strain accommodation developing at multiple length scales. It is therefore scientifically interesting to see what happens when this natural length-scale hierarchy is disrupted. One method is producing pillars that confine the sample volume to the micrometer length scale. Here we apply yet another intervention, helium nanobubble injection, which produces porosity on the order of several nanometers. While the pillar confinement suppresses superelasticity, we found that a dispersion of 5-10 nm helium nanobubbles does the opposite, promoting superelasticity in a Ni53.5Fe19.5Ga27 shape memory alloy. The role of helium nanobubbles in modulating the competition between ordinary dislocation slip plasticity and SIMT is discussed.

  2. Most people do not ignore salient invalid cues in memory-based decisions.

    PubMed

    Platzer, Christine; Bröder, Arndt

    2012-08-01

Previous experimental studies have shown that decisions from memory tend to rely only on a few cues, following simple noncompensatory heuristics like "take the best." However, it has also repeatedly been demonstrated that a pictorial, as opposed to a verbal, representation of cue information fosters the inclusion of more cues in compensatory strategies, suggesting a facilitated retrieval of cue patterns. These studies did not properly control for visual salience of cues, however. In the experiment reported here, the cue salience hierarchy established in a pilot study was either congruent or incongruent with the validity order of the cues. Only the latter condition increased compensatory decision making, suggesting that the apparent representational format effect is, rather, a salience effect: Participants automatically retrieve and incorporate salient cues irrespective of their validity. Results are discussed with respect to reaction time data.
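The "take the best" heuristic this record references can be stated compactly. The sketch below is one common formulation (stop at the first cue that discriminates); the cue names are hypothetical, and real formulations differ in how unknown cue values are handled.

```python
def take_the_best(cues_a, cues_b, validity_order):
    """'Take the best': scan cues from highest to lowest validity and
    decide on the first cue that discriminates between the options.
    Cue values: 1 (positive), 0 (negative), None (unknown).
    Returns 'A', 'B', or 'guess' when no cue discriminates."""
    for cue in validity_order:
        a, b = cues_a.get(cue), cues_b.get(cue)
        if a is not None and b is not None and a != b:
            return "A" if a > b else "B"
    return "guess"
```

A compensatory strategy, by contrast, would weigh and sum all cues rather than stopping at the first discriminating one, which is the contrast at stake in the experiment.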

  3. Array-based, parallel hierarchical mesh refinement algorithms for unstructured meshes

    DOE PAGES

    Ray, Navamita; Grindeanu, Iulian; Zhao, Xinglin; ...

    2016-08-18

In this paper, we describe an array-based hierarchical mesh refinement capability through uniform refinement of unstructured meshes for efficient solution of PDEs using finite element methods and multigrid solvers. A multi-degree, multi-dimensional and multi-level framework is designed to generate the nested hierarchies from an initial coarse mesh that can be used for a variety of purposes, such as in multigrid solvers/preconditioners, to do solution convergence and verification studies, and to improve overall parallel efficiency by decreasing I/O bandwidth requirements (by loading smaller meshes and refining in memory). We also describe a high-order boundary reconstruction capability that can be used to project the new points after refinement using high-order approximations instead of linear projection, in order to minimize and provide more control over geometrical errors introduced by curved boundaries. The capability is developed under the parallel unstructured mesh framework Mesh Oriented dAtaBase (MOAB; Tautges et al., 2004). We describe the underlying data structures and algorithms to generate such hierarchies in parallel and present numerical results for computational efficiency and effect on mesh quality. Furthermore, we also present results to demonstrate the applicability of the developed capability to study convergence properties of different point projection schemes for various mesh hierarchies and to a multigrid finite-element solver for elliptic problems.
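Uniform refinement of the kind this record describes can be sketched for triangle meshes: each level inserts a midpoint on every edge (shared between neighboring triangles) and splits each triangle into four. This is a minimal illustrative sketch, not MOAB's array-based implementation; it uses linear midpoint placement, whereas the paper's high-order reconstruction would project new boundary points onto the curved geometry.

```python
def refine_uniform(vertices, triangles):
    """One level of uniform 1-to-4 triangle refinement.
    `vertices`: list of (x, y); `triangles`: list of vertex-index triples.
    Returns the enlarged vertex list and the refined triangle list."""
    vertices = list(vertices)
    midpoint = {}  # edge (i, j) with i < j -> index of its midpoint vertex

    def mid(i, j):
        key = (min(i, j), max(i, j))
        if key not in midpoint:            # shared edges get one midpoint
            xi, yi = vertices[i]
            xj, yj = vertices[j]
            midpoint[key] = len(vertices)
            vertices.append(((xi + xj) / 2, (yi + yj) / 2))
        return midpoint[key]

    refined = []
    for a, b, c in triangles:
        ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
        refined += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return vertices, refined
```

Applying the function repeatedly yields the nested hierarchy a multigrid solver consumes: each level's triangles contain exactly four children of the next.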

  4. Distributed mixed-integer fuzzy hierarchical programming for municipal solid waste management. Part II: scheme analysis and mechanism revelation.

    PubMed

    Cheng, Guanhui; Huang, Guohe; Dong, Cong; Xu, Ye; Chen, Jiapei; Chen, Xiujuan; Li, Kailong

    2017-03-01

As presented in the first companion paper, distributed mixed-integer fuzzy hierarchical programming (DMIFHP) was developed for municipal solid waste management (MSWM) under complexities of heterogeneities, hierarchy, discreteness, and interactions. Beijing was selected as a representative case. This paper focuses on presenting the obtained schemes and the revealed mechanisms of the Beijing MSWM system. The optimal MSWM schemes for Beijing under various solid waste treatment policies, and their differences, are discussed. The impacts of facility expansion, hierarchy, and spatial heterogeneities and potential extensions of DMIFHP are also discussed. A few findings are revealed by the results and a series of comparisons and analyses. For instance, DMIFHP is capable of robustly reflecting these complexities in MSWM systems, especially for Beijing. The optimal MSWM schemes are of fragmented patterns due to the dominant role of the proximity principle in allocating solid waste treatment resources, and they are closely related to regulated ratios of landfilling, incineration, and composting. Communities without significant differences among distances to different types of treatment facilities are more sensitive to these ratios than others. The complexities of hierarchy and heterogeneities pose significant impacts on MSWM practices. Spatial dislocation of MSW generation rates and facility capacities, caused by unreasonable planning in the past, may result in insufficient utilization of treatment capacities under substantial influences of transportation costs. The problems of unreasonable MSWM planning, e.g., severe imbalance among different technologies and complete vacancy of ten facilities, should receive deliberation from the public and the municipal or local governments in Beijing.
These findings are helpful for gaining insights into MSWM systems under these complexities, mitigating key challenges in the planning of these systems, improving the related management practices, and eliminating potential socio-economic and eco-environmental issues resulting from unreasonable management.

  5. Complex Sentence Comprehension and Working Memory in Children With Specific Language Impairment

    PubMed Central

    Montgomery, James W.; Evans, Julia L.

    2015-01-01

    Purpose This study investigated the association of 2 mechanisms of working memory (phonological short-term memory [PSTM], attentional resource capacity/allocation) with the sentence comprehension of school-age children with specific language impairment (SLI) and 2 groups of control children. Method Twenty-four children with SLI, 18 age-matched (CA) children, and 16 language- and memory-matched (LMM) children completed a nonword repetition task (PSTM), the competing language processing task (CLPT; resource capacity/allocation), and a sentence comprehension task comprising complex and simple sentences. Results (1) The SLI group performed worse than the CA group on each memory task; (2) all 3 groups showed comparable simple sentence comprehension, but for complex sentences, the SLI and LMM groups performed worse than the CA group; (3) for the SLI group, (a) CLPT correlated with complex sentence comprehension, and (b) nonword repetition correlated with simple sentence comprehension; (4) for CA children, neither memory variable correlated with either sentence type; and (5) for LMM children, only CLPT correlated with complex sentences. Conclusions Comprehension of both complex and simple grammar by school-age children with SLI is a mentally demanding activity, requiring significant working memory resources. PMID:18723601

  6. Face the hierarchy: ERP and oscillatory brain responses in social rank processing.

    PubMed

    Breton, Audrey; Jerbi, Karim; Henaff, Marie-Anne; Cheylus, Anne; Baudouin, Jean-Yves; Schmitz, Christina; Krolak-Salmon, Pierre; Van der Henst, Jean-Baptiste

    2014-01-01

    Recognition of social hierarchy is a key feature that helps us navigate through our complex social environment. Neuroimaging studies have identified brain structures involved in the processing of hierarchical stimuli, but the precise temporal dynamics of brain activity associated with such processing remain largely unknown. Here, we used electroencephalography to examine the effect of social hierarchy on neural responses elicited by faces. In contrast to previous studies, the key manipulation was that a hierarchical context was constructed, not by varying facial expressions, but by presenting neutral-expression faces in a game setting. Once the performance-based hierarchy was established, participants were presented with high-rank, middle-rank and low-rank player faces and had to evaluate the rank of each face with respect to their own position. Both event-related potentials and task-related oscillatory activity were investigated. Three main findings emerge from the study. First, the experimental manipulation had no effect on the early N170 component, which may suggest that hierarchy did not modulate the structural encoding of neutral-expression faces. Second, hierarchy significantly modulated the amplitude of the late positive potential (LPP) within a 400-700 ms time-window, with a more prominent LPP occurring when the participants processed the face of the highest-rank player. Third, high-rank faces were associated with the highest reduction of alpha power. Taken together, these findings provide novel electrophysiological evidence for enhanced allocation of attentional resources in the presence of high-rank faces. At a broader level, this study brings new insights into the neural processing underlying social categorization.

  7. An extension of stochastic hierarchy equations of motion for the equilibrium correlation functions

    NASA Astrophysics Data System (ADS)

    Ke, Yaling; Zhao, Yi

    2017-06-01

    In this paper, a traditional stochastic hierarchy equations of motion method is extended to correlated real-time and imaginary-time propagation for use in calculating equilibrium correlation functions. The central idea is a combined employment of stochastic unravelling and hierarchical techniques for the temperature-dependent and temperature-free parts of the influence functional, respectively, in the path-integral formalism of open quantum systems coupled to a harmonic bath. The feasibility and validity of the proposed method are demonstrated on the emission spectra of a homodimer, which are compared with those obtained through the deterministic hierarchy equations of motion. Besides, it is interesting to find that the complex noises generated from a small portion of real-time and imaginary-time cross terms can be safely dropped, producing stable and accurate position and flux correlation functions in a broad parameter regime.

  8. Model Hierarchies in Edge-Based Compartmental Modeling for Infectious Disease Spread

    PubMed Central

    Miller, Joel C.; Volz, Erik M.

    2012-01-01

    We consider the family of edge-based compartmental models for epidemic spread developed in [11]. These models allow for a range of complex behaviors, and in particular allow us to explicitly incorporate duration of a contact into our mathematical models. Our focus here is to identify conditions under which simpler models may be substituted for more detailed models, and in so doing we define a hierarchy of epidemic models. In particular we provide conditions under which it is appropriate to use the standard mass action SIR model, and we show what happens when these conditions fail. Using our hierarchy, we provide a procedure leading to the choice of the appropriate model for a given population. Our result about the convergence of models to the Mass Action model gives clear, rigorous conditions under which the Mass Action model is accurate. PMID:22911242
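
At the base of the hierarchy the abstract describes sits the standard mass-action SIR model. As a minimal illustration (the rates, step size, and initial conditions below are invented, not taken from the paper), the model can be sketched with a fixed-step Euler integration:

```python
# Hypothetical sketch: the standard mass-action SIR model, the simplest
# member of the model hierarchy, integrated with a fixed-step Euler scheme.
# beta, gamma, dt, and the initial conditions are illustrative choices.

def simulate_sir(beta, gamma, s0, i0, dt=0.01, t_max=100.0):
    """Return (S, I, R) trajectories for the mass-action SIR ODEs:
       dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    history = [(s, i, r)]
    for _ in range(int(t_max / dt)):
        new_infections = beta * s * i * dt
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        history.append((s, i, r))
    return history

history = simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01)
s_final, i_final, r_final = history[-1]
```

The point of the paper is precisely when a fraction-based model like this one suffices, and when contact duration forces a more detailed member of the hierarchy.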

  9. Stability of glassy hierarchical networks

    NASA Astrophysics Data System (ADS)

    Zamani, M.; Camargo-Forero, L.; Vicsek, T.

    2018-02-01

    The structure of interactions in most animal and human societies can be best represented by complex hierarchical networks. In order to maintain close-to-optimal function, both stability and adaptability are necessary. Here we investigate the stability of hierarchical networks that emerge from the simulations of an organization type with an efficiency function reminiscent of the Hamiltonian of spin glasses. Using this quantitative approach we find a number of expected (from everyday observations) and highly non-trivial results for the obtained locally optimal networks, including, for example: (i) stability increases with growing efficiency and level of hierarchy; (ii) the same perturbation results in a larger change for more efficient states; (iii) networks with a lower level of hierarchy become more efficient after perturbation; (iv) due to the huge number of possible optimal states, only a small fraction of them exhibit resilience and, finally, (v) ‘attacks’ targeting the nodes selectively (regarding their position in the hierarchy) can result in paradoxical outcomes.

  10. Colloquium: Hierarchy of scales in language dynamics

    NASA Astrophysics Data System (ADS)

    Blythe, Richard A.

    2015-11-01

    Methods and insights from statistical physics are finding an increasing variety of applications where one seeks to understand the emergent properties of a complex interacting system. One such area concerns the dynamics of language at a variety of levels of description, from the behaviour of individual agents learning simple artificial languages from each other, up to changes in the structure of languages shared by large groups of speakers over historical timescales. In this Colloquium, we survey a hierarchy of scales at which language and linguistic behaviour can be described, along with the main progress in understanding that has been made at each of them - much of which has come from the statistical physics community. We argue that future developments may arise by linking the different levels of the hierarchy together in a more coherent fashion, in particular where this allows more effective use of rich empirical data sets.

  11. Computing chemical organizations in biological networks.

    PubMed

    Centler, Florian; Kaleta, Christoph; di Fenizio, Pietro Speroni; Dittrich, Peter

    2008-07-15

    Novel techniques are required to analyze computational models of intracellular processes as they increase steadily in size and complexity. The theory of chemical organizations has recently been introduced as such a technique that links the topology of biochemical reaction network models to their dynamical repertoire. The network is decomposed into algebraically closed and self-maintaining subnetworks called organizations. They form a hierarchy representing all feasible system states including all steady states. We present three algorithms to compute the hierarchy of organizations for network models provided in SBML format. Two of them compute the complete organization hierarchy, while the third one uses heuristics to obtain a subset of all organizations for large models. While the constructive approach computes the hierarchy starting from the smallest organization in a bottom-up fashion, the flux-based approach employs self-maintaining flux distributions to determine organizations. A runtime comparison on 16 different network models of natural systems showed that neither of the two exhaustive algorithms is superior in all cases. Studying a 'genome-scale' network model with 762 species and 1193 reactions, we demonstrate how the organization hierarchy helps to uncover the model structure and makes it possible to evaluate the model's quality, for example by detecting components and subsystems of the model whose maintenance is not explained by the model. All data and a Java implementation that plugs into the Systems Biology Workbench are available from http://www.minet.uni-jena.de/csb/prj/ot/tools.
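
To make the "algebraically closed" condition concrete, here is an illustrative sketch (not the authors' Java implementation): a species set is closed if every reaction whose reactants all lie in the set produces only species that are also in the set. The second condition, self-maintenance, additionally requires a flux-balance check (typically a linear program) and is omitted here; the toy network is invented.

```python
# Brute-force enumeration of closed species sets for a tiny reaction
# network. Organizations are the closed AND self-maintaining sets; only
# the closure test is shown here.

from itertools import combinations

def is_closed(species_set, reactions):
    """reactions: list of (reactants, products) pairs, each a set of names."""
    for reactants, products in reactions:
        if reactants <= species_set and not products <= species_set:
            return False
    return True

def closed_sets(all_species, reactions):
    """Enumerate all closed subsets (feasible only for tiny networks)."""
    species = sorted(all_species)
    result = []
    for k in range(len(species) + 1):
        for combo in combinations(species, k):
            if is_closed(set(combo), reactions):
                result.append(frozenset(combo))
    return result

# Toy network: a + b -> c, c -> a
reactions = [({"a", "b"}, {"c"}), ({"c"}, {"a"})]
sets = closed_sets({"a", "b", "c"}, reactions)
```

For this toy network, {a, b} is not closed (it can produce c), while {a, c} is, which is the kind of containment structure that gives rise to the hierarchy the abstract describes.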

  12. The Evolutionary Origins of Hierarchy

    PubMed Central

    Huizinga, Joost; Clune, Jeff

    2016-01-01

    Hierarchical organization—the recursive composition of sub-modules—is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force–the cost of connections–promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics. PMID:27280881

  13. The Evolutionary Origins of Hierarchy.

    PubMed

    Mengistu, Henok; Huizinga, Joost; Mouret, Jean-Baptiste; Clune, Jeff

    2016-06-01

    Hierarchical organization-the recursive composition of sub-modules-is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force-the cost of connections-promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

  14. A Memento of Complexity: The Rhetorics of Memory, Ambience, and Emergence

    ERIC Educational Resources Information Center

    Southergill, Glen T.

    2014-01-01

    Drawing from complexity theory, this dissertation develops a schema of rhetorical memory that exhibits extended characteristics. Scholars traditionally conceptualize memory, the fourth canon in classical rhetoric, as place (loci) or image (phantasm). However, memory rhetoric resists the traditional loci-phantasm framework and instead emerges from…

  15. Deficits of long-term memory in ecstasy users are related to cognitive complexity of the task.

    PubMed

    Brown, John; McKone, Elinor; Ward, Jeff

    2010-03-01

    Despite animal evidence that methylenedioxymethamphetamine (ecstasy) causes lasting damage in brain regions related to long-term memory, results regarding human memory performance have been variable. This variability may reflect the cognitive complexity of the memory tasks. However, previous studies have tested only a limited range of cognitive complexity. Furthermore, comparisons across different studies are made difficult by regional variations in ecstasy composition and patterns of use. The objective of this study is to evaluate ecstasy-related deficits in human verbal memory over a wide range of cognitive complexity using subjects drawn from a single geographical population. Ecstasy users were compared to non-drug using controls on verbal tasks with low cognitive complexity (stem completion), moderate cognitive complexity (stem-cued recall and word list learning) and high cognitive complexity (California Verbal Learning Test, Verbal Paired Associates and a novel Verbal Triplet Associates test). Where significant differences were found, both groups were also compared to cannabis users. More cognitively complex memory tasks were associated with clearer ecstasy-related deficits than low complexity tasks. In the most cognitively demanding task, ecstasy-related deficits remained even after multiple learning opportunities, whereas the performance of cannabis users approached that of non-drug using controls. Ecstasy users also had weaker deliberate strategy use than both non-drug and cannabis controls. Results were consistent with the proposal that ecstasy-related memory deficits are more reliable on tasks with greater cognitive complexity. This could arise either because such tasks require a greater contribution from the frontal lobe or because they require greater interaction between multiple brain regions.

  16. Children's Verbal Working Memory: Role of Processing Complexity in Predicting Spoken Sentence Comprehension

    ERIC Educational Resources Information Center

    Magimairaj, Beula M.; Montgomery, James W.

    2012-01-01

    Purpose: This study investigated the role of processing complexity of verbal working memory tasks in predicting spoken sentence comprehension in typically developing children. Of interest was whether simple and more complex working memory tasks have similar or different power in predicting sentence comprehension. Method: Sixty-five children (6- to…

  17. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resource. Groundwater is a complex and fuzzy system with many uncertainties, which is impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach with the analytic hierarchy process and fuzzy comprehensive evaluation integrated together. Firstly, the risk factors of groundwater contamination are identified by the sources-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on DRASTIC model is established. Due to the complexity in the process of transitions between the possible pollution risks and the uncertainties of factors, the analytic hierarchy process is applied to determine the weights of each factor, and the fuzzy sets theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, which provides a more flexible and reliable way to deal with the linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
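
The two building blocks combined in this abstract can be sketched in miniature. The sketch below uses the row-geometric-mean approximation for the AHP priority weights (rather than the principal eigenvector) and weighted-average fuzzy composition; all matrices and numbers are invented for illustration, not taken from the paper.

```python
# Hedged sketch: AHP weights from a pairwise comparison matrix, composed
# with a fuzzy evaluation matrix R whose rows are membership degrees of
# each factor in each risk level.

import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via normalized row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def fuzzy_evaluate(weights, R):
    """Weighted-average fuzzy composition: B_j = sum_i w_i * R[i][j]."""
    return [sum(w * row[j] for w, row in zip(weights, R))
            for j in range(len(R[0]))]

# Three hypothetical risk factors; factor 1 judged moderately dominant
pairwise = [[1, 3, 5],
            [1/3, 1, 3],
            [1/5, 1/3, 1]]
# Membership degrees of each factor in risk levels (low, medium, high)
R = [[0.1, 0.3, 0.6],
     [0.4, 0.4, 0.2],
     [0.7, 0.2, 0.1]]

w = ahp_weights(pairwise)
b = fuzzy_evaluate(w, R)
risk_level = ["low", "medium", "high"][b.index(max(b))]
```

Because the dominant factor points toward high risk, the composed evaluation vector b assigns it the largest membership, which is how the integrated method resolves conflicting factor-level judgments.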

  18. From Brown-Peterson to continual distractor via operation span: A SIMPLE account of complex span.

    PubMed

    Neath, Ian; VanWormer, Lisa A; Bireta, Tamra J; Surprenant, Aimée M

    2014-09-01

    Three memory tasks-Brown-Peterson, complex span, and continual distractor-all alternate presentation of a to-be-remembered item and a distractor activity, but each task is associated with a different memory system, short-term memory, working memory, and long-term memory, respectively. SIMPLE, a relative local distinctiveness model, has previously been fit to data from both the Brown-Peterson and continual distractor tasks; here we use the same version of the model to fit data from a complex span task. Despite the many differences between the tasks, including unpredictable list length, SIMPLE fit the data well. Because SIMPLE posits a single memory system, these results constitute yet another demonstration that performance on tasks originally thought to tap different memory systems can be explained without invoking multiple memory systems.
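
SIMPLE's core mechanism, relative local distinctiveness on a log-transformed temporal dimension, can be sketched briefly. The decay parameter c and the retention intervals below are invented for illustration; the published model also includes a transformation from distinctiveness to recall probability that is omitted here.

```python
# Minimal sketch of SIMPLE's distinctiveness computation: an item is
# retrievable to the extent that its position on a log-time scale is
# distinct from its neighbours', which yields a recency advantage.

import math

def simple_discriminability(retention_times, c=10.0):
    """retention_times: time since presentation of each item at test.
    Returns the relative distinctiveness of each item."""
    log_t = [math.log(t) for t in retention_times]
    scores = []
    for ti in log_t:
        # Summed similarity of this item to every item (including itself)
        summed = sum(math.exp(-c * abs(ti - tj)) for tj in log_t)
        scores.append(1.0 / summed)
    return scores

# Five items presented 2 s apart; the last item is 2 s old at test
times = [10.0, 8.0, 6.0, 4.0, 2.0]
scores = simple_discriminability(times)
```

On the log scale the most recent item is the most isolated, so it receives the highest distinctiveness score regardless of which task (Brown-Peterson, complex span, or continual distractor) produced the retention intervals, which is what lets a single mechanism fit all three.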

  19. Working memory subsystems and task complexity in young boys with Fragile X syndrome.

    PubMed

    Baker, S; Hooper, S; Skinner, M; Hatton, D; Schaaf, J; Ornstein, P; Bailey, D

    2011-01-01

    Working memory problems have been targeted as core deficits in individuals with Fragile X syndrome (FXS); however, there have been few studies that have examined working memory in young boys with FXS, and even fewer studies that have studied the working memory performance of young boys with FXS across different degrees of complexity. The purpose of this study was to investigate the phonological loop and visual-spatial working memory in young boys with FXS, in comparison to mental age-matched typical boys, and to examine the impact of complexity of the working memory tasks on performance. The performance of young boys (7 to 13-years-old) with FXS (n = 40) was compared with that of mental age and race matched typically developing boys (n = 40) on measures designed to test the phonological loop and the visuospatial sketchpad across low, moderate and high degrees of complexity. Multivariate analyses were used to examine group differences across the specific working memory systems and degrees of complexity. Results suggested that boys with FXS showed deficits in phonological loop and visual-spatial working memory tasks when compared with typically developing mental age-matched boys. For the boys with FXS, the phonological loop was significantly lower than the visual-spatial sketchpad; however, there was no significant difference in performance across the low, moderate and high degrees of complexity in the working memory tasks. Reverse tasks from both the phonological loop and visual-spatial sketchpad appeared to be the most challenging for both groups, but particularly for the boys with FXS. These findings implicate a generalised deficit in working memory in young boys with FXS, with a specific disproportionate impairment in the phonological loop. Given the lack of differentiation on the low versus high complexity tasks, simple span tasks may provide an adequate estimate of working memory until greater involvement of the central executive is achieved. 
© 2010 The Authors. Journal of Intellectual Disability Research © 2010 Blackwell Publishing Ltd.

  20. Working memory subsystems and task complexity in young boys with Fragile X syndrome

    PubMed Central

    Baker, S.; Hooper, S.; Skinner, M.; Hatton, D.; Schaaf, J.; Ornstein, P.; Bailey, D.

    2011-01-01

    Background Working memory problems have been targeted as core deficits in individuals with Fragile X syndrome (FXS); however, there have been few studies that have examined working memory in young boys with FXS, and even fewer studies that have studied the working memory performance of young boys with FXS across different degrees of complexity. The purpose of this study was to investigate the phonological loop and visual–spatial working memory in young boys with FXS, in comparison to mental age-matched typical boys, and to examine the impact of complexity of the working memory tasks on performance. Methods The performance of young boys (7 to 13-years-old) with FXS (n = 40) was compared with that of mental age and race matched typically developing boys (n = 40) on measures designed to test the phonological loop and the visuospatial sketchpad across low, moderate and high degrees of complexity. Multivariate analyses were used to examine group differences across the specific working memory systems and degrees of complexity. Results Results suggested that boys with FXS showed deficits in phonological loop and visual–spatial working memory tasks when compared with typically developing mental age-matched boys. For the boys with FXS, the phonological loop was significantly lower than the visual–spatial sketchpad; however, there was no significant difference in performance across the low, moderate and high degrees of complexity in the working memory tasks. Reverse tasks from both the phonological loop and visual–spatial sketchpad appeared to be the most challenging for both groups, but particularly for the boys with FXS. Conclusions These findings implicate a generalised deficit in working memory in young boys with FXS, with a specific disproportionate impairment in the phonological loop. 
Given the lack of differentiation on the low versus high complexity tasks, simple span tasks may provide an adequate estimate of working memory until greater involvement of the central executive is achieved. PMID:21121991

  1. Beyond the FFA: The Role of the Ventral Anterior Temporal Lobes in Face Processing

    PubMed Central

    Collins, Jessica A.; Olson, Ingrid R.

    2014-01-01

    Extensive research has supported the existence of a specialized face-processing network that is distinct from the visual processing areas used for general object recognition. The majority of this work has been aimed at characterizing the response properties of the fusiform face area (FFA) and the occipital face area (OFA), which together are thought to constitute the core network of brain areas responsible for facial identification. Although accruing evidence has shown that face-selective patches in the ventral anterior temporal lobes (vATLs) are interconnected with the FFA and OFA, and that they play a role in facial identification, the relative contribution of these brain areas to the core face-processing network has remained unarticulated. Here we review recent research critically implicating the vATLs in face perception and memory. We propose that current models of face processing should be revised such that the ventral anterior temporal lobes serve a centralized role in the visual face-processing network. We speculate that a hierarchically organized system of face processing areas extends bilaterally from the inferior occipital gyri to the vATLs, with facial representations becoming increasingly complex and abstracted from low-level perceptual features as they move forward along this network. The anterior temporal face areas may serve as the apex of this hierarchy, instantiating the final stages of face recognition. We further argue that the anterior temporal face areas are ideally suited to serve as an interface between face perception and face memory, linking perceptual representations of individual identity with person-specific semantic knowledge. PMID:24937188

  2. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    PubMed

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
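
StreamQRE itself is a Java-embedded language and its combinators are not reproduced here. As a rough, hypothetical analogue of its two ingredients, relational partitioning of a stream by key and an incremental quantitative aggregation per substream, the following Python sketch processes each item in constant time with memory proportional to the number of distinct keys; all names and data are invented.

```python
# Loose analogue of a keyed quantitative stream query: maintain a
# running average per key, updating state incrementally per item.

class KeyedRunningAverage:
    """O(1) work per item; memory grows only with the number of keys."""

    def __init__(self):
        self.state = {}  # key -> (count, total)

    def process(self, key, value):
        count, total = self.state.get(key, (0, 0.0))
        self.state[key] = (count + 1, total + value)

    def query(self, key):
        count, total = self.state[key]
        return total / count

agg = KeyedRunningAverage()
for patient, reading in [("p1", 98.6), ("p2", 99.1), ("p1", 101.2)]:
    agg.process(patient, reading)
```

The per-item cost and bounded state are the properties whose precise complexity bounds the StreamQRE compiler is designed to guarantee for far richer, hierarchically structured queries.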

  3. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data*

    PubMed Central

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G.; Khanna, Sanjeev

    2017-01-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings. PMID:29151821

  4. Ferroelectric Memory Devices and a Proposed Standardized Test System Design

    DTIC Science & Technology

    1992-06-01

    positive clock transition. This provides automatic data protection in case of power loss. The device is being evaluated for applications such as automobile ... systems requiring nonvolatile memory, and as these systems become more complex, the demand for reprogrammable nonvolatile memory increases. The ... complexity and cost in making conventional nonvolatile memory reprogrammable also increases, so the potential for using ferroelectric memory as a replacement

  5. Relation between bandgap and resistance drift in amorphous phase change materials

    PubMed Central

    Rütten, Martin; Kaes, Matthias; Albert, Andreas; Wuttig, Matthias; Salinga, Martin

    2015-01-01

    Memory based on phase change materials is currently the most promising candidate for bridging the gap in access time between memory and storage in traditional memory hierarchy. However, multilevel storage is still hindered by the so-called resistance drift commonly related to structural relaxation of the amorphous phase. Here, we present the temporal evolution of infrared spectra measured on amorphous thin films of the three phase change materials Ag4In3Sb67Te26, GeTe and the most popular Ge2Sb2Te5. A widening of the bandgap upon annealing accompanied by a decrease of the optical dielectric constant ε∞ is observed for all three materials. Quantitative comparison with experimental data for the apparent activation energy of conduction reveals that the temporal evolution of bandgap and activation energy can be decoupled. The case of Ag4In3Sb67Te26, where the increase of activation energy is significantly smaller than the bandgap widening, demonstrates the possibility to identify new phase change materials with reduced resistance drift. PMID:26621533

  6. Relation between bandgap and resistance drift in amorphous phase change materials.

    PubMed

    Rütten, Martin; Kaes, Matthias; Albert, Andreas; Wuttig, Matthias; Salinga, Martin

    2015-12-01

    Memory based on phase change materials is currently the most promising candidate for bridging the gap in access time between memory and storage in traditional memory hierarchy. However, multilevel storage is still hindered by the so-called resistance drift commonly related to structural relaxation of the amorphous phase. Here, we present the temporal evolution of infrared spectra measured on amorphous thin films of the three phase change materials Ag4In3Sb67Te26, GeTe and the most popular Ge2Sb2Te5. A widening of the bandgap upon annealing accompanied by a decrease of the optical dielectric constant ε∞ is observed for all three materials. Quantitative comparison with experimental data for the apparent activation energy of conduction reveals that the temporal evolution of bandgap and activation energy can be decoupled. The case of Ag4In3Sb67Te26, where the increase of activation energy is significantly smaller than the bandgap widening, demonstrates the possibility to identify new phase change materials with reduced resistance drift.

  7. Defining a stem cell hierarchy in the intestine: markers, caveats and controversies

    PubMed Central

    Smith, Nicholas R.; Gallagher, Alexandra C.

    2016-01-01

    Abstract The past decade has seen rapid advances in identifying the once elusive intestinal stem cell (ISC) populations that fuel the continual renewal of the epithelial layer. This advance was largely driven by identification of novel stem cell marker genes, revealing the existence of quiescent, slowly‐ and active‐cycling ISC populations. However, a critical barrier for translating this knowledge to human health and disease remains elucidating the functional interplay between diverse stem cell populations. Currently, the precise hierarchical and regulatory relationships between these ISC populations are under intense scrutiny. The classical theory of a linear hierarchy, where quiescent and slowly‐cycling stem cells self‐renew but replenish an active‐cycling population, is well established in other rapidly renewing tissues such as the haematopoietic system. Efforts to definitively establish a similar stem cell hierarchy within the intestinal epithelium have yielded conflicting results, been difficult to interpret, and suggest non‐conventional alternatives to a linear hierarchy. While these new and potentially paradigm‐shifting discoveries are intriguing, the field will require development of a number of critical tools, including highly specific stem cell marker genes along with more rigorous experimental methodologies, to delineate the complex cellular relationships within this dynamic organ system. PMID:26864260

  8. Memory Indexing: A Novel Method for Tracing Memory Processes in Complex Cognitive Tasks

    ERIC Educational Resources Information Center

    Renkewitz, Frank; Jahn, Georg

    2012-01-01

    We validate an eye-tracking method applicable for studying memory processes in complex cognitive tasks. The method is tested with a task on probabilistic inferences from memory. It provides valuable data on the time course of processing, thus clarifying previous results on heuristic probabilistic inference. Participants learned cue values of…

  9. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kandemir, Mahmut Taylan; Choudary, Alok; Thakur, Rajeev

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology that targets I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report compared to the previous report are IOGenie and SSD/NVM-specific optimizations.

  10. Organization and hierarchy of the human functional brain network lead to a chain-like core.

    PubMed

    Mastrandrea, Rossana; Gabrielli, Andrea; Piras, Fabrizio; Spalletta, Gianfranco; Caldarelli, Guido; Gili, Tommaso

    2017-07-07

    The brain is a paradigmatic example of a complex system: its functionality emerges as a global property of local mesoscopic and microscopic interactions. Complex network theory allows us to elicit the functional architecture of the brain in terms of links (correlations) between nodes (grey matter regions) and to extract information out of the noise. Here we present the analysis of functional magnetic resonance imaging data from forty healthy humans at rest for the investigation of the basal scaffold of the functional brain network organization. We show how brain regions tend to coordinate by forming a highly hierarchical chain-like structure of homogeneously clustered anatomical areas. A maximum spanning tree approach revealed the centrality of the occipital cortex and the peculiar aggregation of cerebellar regions to form a closed core. We also report the hierarchy of network segregation and the level of clusters integration as a function of the connectivity strength between brain regions.
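    The maximum spanning tree step mentioned in this record can be sketched with Kruskal's algorithm run on the heaviest edges first. The toy graph and its correlation-like weights below are illustrative only, not the paper's data:

```python
def max_spanning_tree(n, edges):
    """Kruskal's algorithm on edges (w, u, v), keeping the LARGEST weights.
    Union-find with path compression tracks connected components."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges, reverse=True):  # heaviest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                             # no cycle: accept the edge
            parent[ru] = rv
            tree.append((w, u, v))
    return tree

# Toy weighted graph on 4 nodes (weights are made up for illustration).
edges = [(0.9, 0, 1), (0.2, 0, 2), (0.6, 1, 2), (0.8, 2, 3), (0.1, 1, 3)]
mst = max_spanning_tree(4, edges)
print(sorted(mst))  # [(0.6, 1, 2), (0.8, 2, 3), (0.9, 0, 1)]
```

    For n nodes the tree keeps the n-1 strongest non-cycle-forming links, which is how such analyses expose a backbone (here, a chain-like core) of the full correlation network.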

  11. Library API for Z-Order Memory Layout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes

    This library provides a simple-to-use API for implementing an alternative to the traditional row-major order in-memory layout, one based on a Morton-order space filling curve (SFC), specifically, a Z-order variant of the Morton order curve. The library enables programmers, after a simple initialization step, to convert a multidimensional array from row-major to Z-order layouts, then use a single, generic API call to access data from any arbitrary (i,j,k) location within the array, whether it be stored in row-major or Z-order format. The motivation for using an SFC in-memory layout is improved spatial locality, which results in increased use of local high speed cache memory. The basic idea is that with row-major order layouts, a data access to some location that is nearby in index space is likely far away in physical memory, resulting in poor spatial locality and slow runtime. On the other hand, with an SFC-based layout, accesses that are nearby in index space are much more likely to also be nearby in physical memory, resulting in much better spatial locality, and better runtime performance. Numerous studies over the years have shown significant runtime performance gains are realized by using an SFC-based memory layout compared to a row-major layout, sometimes by as much as 50%, which result from the better use of the memory and cache hierarchy attendant with an SFC-based layout (see, for example, [Beth2012]). This library implementation is intended for use with codes that work with structured, array-based data in 2 or 3 dimensions. It is not appropriate for use with unstructured or point-based data.
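    A minimal sketch of the 2D Z-order (Morton) indexing this record describes, using the standard bit-interleaving trick. The function names are hypothetical; this is not the library's actual API:

```python
def part1by1(x):
    """Spread the bits of a 16-bit integer apart: abcd -> 0a0b0c0d."""
    x &= 0xFFFF
    x = (x | (x << 8)) & 0x00FF00FF
    x = (x | (x << 4)) & 0x0F0F0F0F
    x = (x | (x << 2)) & 0x33333333
    x = (x | (x << 1)) & 0x55555555
    return x

def morton2d(i, j):
    """Interleave the bits of (i, j) to get the Z-order (Morton) index,
    so points nearby in index space tend to be nearby in memory."""
    return (part1by1(i) << 1) | part1by1(j)

# Lay out a 4x4 array in Z-order: each 2x2 block becomes contiguous.
rows, cols = 4, 4
zbuf = [None] * (rows * cols)
for i in range(rows):
    for j in range(cols):
        zbuf[morton2d(i, j)] = (i, j)  # store element (i, j) at its Z index

print(morton2d(0, 0), morton2d(0, 1), morton2d(1, 0), morton2d(1, 1))  # 0 1 2 3
```

    Note how the 2x2 block (0..1, 0..1) maps to the contiguous indices 0..3; this is the locality property that improves cache behavior relative to row-major order.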

  12. CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS

    PubMed Central

    Shpitser, Ilya; Tchetgen, Eric Tchetgen

    2017-01-01

    Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects, may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use, have resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl’s front-door criterion. PMID:28919652

  13. Discovering Planetary Nebula Geometries: Explorations with a Hierarchy of Models

    NASA Technical Reports Server (NTRS)

    Huyser, Karen A.; Knuth, Kevin H.; Fischer, Bernd; Schumann, Johann; Granquist-Fraser, Domhnull; Hajian, Arsen R.

    2004-01-01

    Astronomical objects known as planetary nebulae (PNe) consist of a shell of gas expelled by an aging medium-sized star as it makes its transition from a red giant to a white dwarf. In many cases this gas shell can be approximately described as a prolate ellipsoid. Knowledge of the physics of ionization processes in this gaseous shell enables us to construct a model in three dimensions (3D) called the Ionization-Bounded Prolate Ellipsoidal Shell model (IBPES model). Using this model we can generate synthetic nebular images, which can be used in conjunction with Hubble Space Telescope (HST) images of actual PNe to perform Bayesian model estimation. Since the IBPES model is characterized by thirteen parameters, model estimation requires the search of a 13-dimensional parameter space. The 'curse of dimensionality,' compounded by a computationally intense forward problem, makes forward searches extremely time-consuming and frequently causes them to become trapped in local solutions. We find that both the speed and the quality of the search can be improved by judiciously reducing the dimensionality of the search space. Our basic approach employs a hierarchy of models of increasing complexity that converges to the IBPES model. Earlier studies establish that a hierarchical sequence converges more quickly, and to a better solution, than a search relying only on the most complex model. Here we report results for a hierarchy of five models. The first three models treat the nebula as a 2D image, while the last two models explore its characteristics as a 3D object and enable us to characterize the physics of the nebula. This five-model hierarchy is applied to HST images of ellipsoidal PNe to estimate their geometric properties and gas density profiles.

  14. CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS.

    PubMed

    Shpitser, Ilya; Tchetgen, Eric Tchetgen

    2016-12-01

    Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects, may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use, have resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl's front-door criterion.

  15. Trends in data locality abstractions for HPC systems

    DOE PAGES

    Unat, Didem; Dubey, Anshu; Hoefler, Torsten; ...

    2017-05-10

    The cost of data movement has always been an important concern in high performance computing (HPC) systems. It has now become the dominant factor in terms of both energy consumption and performance. Support for expression of data locality has been explored in the past, but those efforts have had only modest success in being adopted in HPC applications for various reasons. However, with the increasing complexity of the memory hierarchy and higher parallelism in emerging HPC systems, locality management has acquired a new urgency. Developers can no longer limit themselves to low-level solutions and ignore the potential for productivity and performance portability obtained by using locality abstractions. Fortunately, the trend emerging in recent literature on the topic alleviates many of the concerns that got in the way of their adoption by application developers. Data locality abstractions are available in the forms of libraries, data structures, languages and runtime systems; a common theme is increasing productivity without sacrificing performance. Furthermore, this paper examines these trends and identifies commonalities that can combine various locality concepts to develop a comprehensive approach to expressing and managing data locality on future large-scale high-performance computing systems.

  16. Hierarchical processing in music, language, and action: Lashley revisited.

    PubMed

    Fitch, W Tecumseh; Martins, Mauricio D

    2014-05-01

    Sixty years ago, Karl Lashley suggested that complex action sequences, from simple motor acts to language and music, are a fundamental but neglected aspect of neural function. Lashley demonstrated the inadequacy of then-standard models of associative chaining, positing a more flexible and generalized "syntax of action" necessary to encompass key aspects of language and music. He suggested that hierarchy in language and music builds upon a more basic sequential action system, and provided several concrete hypotheses about the nature of this system. Here, we review a diverse set of modern data concerning musical, linguistic, and other action processing, finding them largely consistent with an updated neuroanatomical version of Lashley's hypotheses. In particular, the lateral premotor cortex, including Broca's area, plays important roles in hierarchical processing in language, music, and at least some action sequences. Although the precise computational function of the lateral prefrontal regions in action syntax remains debated, Lashley's notion, that this cortical region implements a working-memory buffer or stack scannable by posterior and subcortical brain regions, is consistent with considerable experimental data. © 2014 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals Inc. on behalf of The New York Academy of Sciences.

  17. Limits on perceptual encoding can be predicted from known receptive field properties of human visual cortex.

    PubMed

    Cohen, Michael A; Rhee, Juliana Y; Alvarez, George A

    2016-01-01

    Human cognition has a limited capacity that is often attributed to the brain having finite cognitive resources, but the nature of these resources is usually not specified. Here, we show evidence that perceptual interference between items can be predicted by known receptive field properties of the visual cortex, suggesting that competition within representational maps is an important source of the capacity limitations of visual processing. Across the visual hierarchy, receptive fields get larger and represent more complex, high-level features. Thus, when presented simultaneously, high-level items (e.g., faces) will often land within the same receptive fields, while low-level items (e.g., color patches) will often not. Using a perceptual task, we found long-range interference between high-level items, but only short-range interference for low-level items, with both types of interference being weaker across hemifields. Finally, we show that long-range interference between items appears to occur primarily during perceptual encoding and not during working memory maintenance. These results are naturally explained by the distribution of receptive fields and establish a link between perceptual capacity limits and the underlying neural architecture.

  18. A Discussion of the Discrete Fourier Transform Execution on a Typical Desktop PC

    NASA Technical Reports Server (NTRS)

    White, Michael J.

    2006-01-01

    This paper will discuss and compare the execution times of three examples of the Discrete Fourier Transform (DFT). The first two examples will demonstrate the direct implementation of the algorithm. In the first example, the Fourier coefficients are generated at the execution of the DFT. In the second example, the coefficients are generated prior to execution and the DFT coefficients are indexed at execution. The last example will demonstrate the Cooley-Tukey algorithm, better known as the Fast Fourier Transform. All examples were written in C and executed on a PC using a Pentium 4 running at 1.7 GHz. As a function of N, the total complex data size, the direct implementation DFT executes, as expected, at order of N^2 and the FFT executes at order of N log2 N. At N=16K, there is an increase in processing time beyond what is expected. This is not caused by implementation but is a consequence of the effect that machine architecture and memory hierarchy have on implementation. This paper will include a brief overview of digital signal processing, along with a discussion of contemporary work with discrete Fourier processing.
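    The two direct-DFT variants compared in the record (coefficients generated at execution vs. tabulated beforehand and indexed) can be sketched in a few lines. This is an illustrative reconstruction in Python, not the paper's C code:

```python
import cmath

def dft_direct(x):
    """Direct DFT, O(N^2): the twiddle factor is recomputed for every term."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def dft_precomputed(x):
    """Direct DFT with the N twiddle factors tabulated once and indexed
    as W[(k*n) % N], trading a small table for repeated exp() calls."""
    N = len(x)
    W = [cmath.exp(-2j * cmath.pi * m / N) for m in range(N)]
    return [sum(x[n] * W[(k * n) % N] for n in range(N)) for k in range(N)]

x = [1.0, 1.0, 1.0, 1.0]
X = dft_precomputed(x)
print([round(abs(c), 6) for c in X])  # [4.0, 0.0, 0.0, 0.0]
```

    Both versions perform N^2 multiply-adds; the precomputed table only removes the per-term exponential, which is why the paper's N=16K anomaly is attributed to the memory hierarchy rather than to either implementation.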

  19. Invariant visual object recognition: a model, with lighting invariance.

    PubMed

    Rolls, Edmund T; Stringer, Simon M

    2006-01-01

    How are invariant representations of objects formed in the visual cortex? We describe a neurophysiological and computational approach which focusses on a feature hierarchy model in which invariant representations can be built by self-organizing learning based on the statistics of the visual input. The model can use temporal continuity in an associative synaptic learning rule with a short term memory trace, and/or it can use spatial continuity in Continuous Transformation learning. The model of visual processing in the ventral cortical stream can build representations of objects that are invariant with respect to translation, view, size, and in this paper we show also lighting. The model has been extended to provide an account of invariant representations in the dorsal visual system of the global motion produced by objects such as looming, rotation, and object-based movement. The model has been extended to incorporate top-down feedback connections to model the control of attention by biased competition in for example spatial and object search tasks. The model has also been extended to account for how the visual system can select single objects in complex visual scenes, and how multiple objects can be represented in a scene.

  20. Automated hierarchical classification of protein domain subfamilies based on functionally-divergent residue signatures

    PubMed Central

    2012-01-01

    Background The NCBI Conserved Domain Database (CDD) consists of a collection of multiple sequence alignments of protein domains that are at various stages of being manually curated into evolutionary hierarchies based on conserved and divergent sequence and structural features. These domain models are annotated to provide insights into the relationships between sequence, structure and function via web-based BLAST searches. Results Here we automate the generation of conserved domain (CD) hierarchies using a combination of heuristic and Markov chain Monte Carlo (MCMC) sampling procedures and starting from a (typically very large) multiple sequence alignment. This procedure relies on statistical criteria to define each hierarchy based on the conserved and divergent sequence patterns associated with protein functional-specialization. At the same time this facilitates the sequence and structural annotation of residues that are functionally important. These statistical criteria also provide a means to objectively assess the quality of CD hierarchies, a non-trivial task considering that the protein subgroups are often very distantly related—a situation in which standard phylogenetic methods can be unreliable. Our aim here is to automatically generate (typically sub-optimal) hierarchies that, based on statistical criteria and visual comparisons, are comparable to manually curated hierarchies; this serves as the first step toward the ultimate goal of obtaining optimal hierarchical classifications. A plot of runtimes for the most time-intensive (non-parallelizable) part of the algorithm indicates a nearly linear time complexity so that, even for the extremely large Rossmann fold protein class, results were obtained in about a day. Conclusions This approach automates the rapid creation of protein domain hierarchies and thus will eliminate one of the most time consuming aspects of conserved domain database curation. 
At the same time, it also facilitates protein domain annotation by identifying those pattern residues that most distinguish each protein domain subgroup from other related subgroups. PMID:22726767

  1. Relations between Spatial Distribution, Social Affiliations and Dominance Hierarchy in a Semi-Free Mandrill Population

    PubMed Central

    Naud, Alexandre; Chailleux, Eloise; Kestens, Yan; Bret, Céline; Desjardins, Dominic; Petit, Odile; Ngoubangoye, Barthélémy; Sueur, Cédric

    2016-01-01

    Although there exist advantages to group-living in comparison to a solitary lifestyle, costs and gains of group-living may be unequally distributed among group members. Predation risk, vigilance levels and food intake may be unevenly distributed across group spatial geometry and certain within-group spatial positions may be more or less advantageous depending on the spatial distribution of these factors. In species characterized by a dominance hierarchy, high-ranking individuals are commonly observed in advantageous spatial positions. However, in complex social systems, individuals can develop affiliative relationships that may balance the effect of dominance relationships in individuals' spatial distribution. The objective of the present study is to investigate how the group spatial distribution of a semi-free ranging colony of Mandrills relates to its social organization. Using spatial observations in an area surrounding the feeding zone, we tested the following three hypotheses: (1) does dominance hierarchy explain being observed in proximity or far from a food patch? (2) Do affiliative associations also explain being observed in proximity or far from a food patch? (3) Do the differences in rank in the group hierarchy explain being co-observed in proximity of a food patch? Our results showed that high-ranking individuals were observed more often in proximity to the feeding zone, while low-ranking individuals were observed more often at the boundaries of the observation area. Furthermore, we observed that affiliative relationships were also associated with individual spatial distributions and explain more of the total variance of the spatial distribution in comparison with dominance hierarchy. Finally, we found that individuals observed at the same moment in proximity to the feeding zone were more likely to be distant in the hierarchy while controlling for maternal kinship, age and sex similarity. 
This study sheds light on how affiliative networks and dominance hierarchies relate to spatial positions in primates. PMID:27199845

  2. Relations between Spatial Distribution, Social Affiliations and Dominance Hierarchy in a Semi-Free Mandrill Population.

    PubMed

    Naud, Alexandre; Chailleux, Eloise; Kestens, Yan; Bret, Céline; Desjardins, Dominic; Petit, Odile; Ngoubangoye, Barthélémy; Sueur, Cédric

    2016-01-01

    Although there exist advantages to group-living in comparison to a solitary lifestyle, costs and gains of group-living may be unequally distributed among group members. Predation risk, vigilance levels and food intake may be unevenly distributed across group spatial geometry and certain within-group spatial positions may be more or less advantageous depending on the spatial distribution of these factors. In species characterized by a dominance hierarchy, high-ranking individuals are commonly observed in advantageous spatial positions. However, in complex social systems, individuals can develop affiliative relationships that may balance the effect of dominance relationships in individuals' spatial distribution. The objective of the present study is to investigate how the group spatial distribution of a semi-free ranging colony of Mandrills relates to its social organization. Using spatial observations in an area surrounding the feeding zone, we tested the following three hypotheses: (1) does dominance hierarchy explain being observed in proximity or far from a food patch? (2) Do affiliative associations also explain being observed in proximity or far from a food patch? (3) Do the differences in rank in the group hierarchy explain being co-observed in proximity of a food patch? Our results showed that high-ranking individuals were observed more often in proximity to the feeding zone, while low-ranking individuals were observed more often at the boundaries of the observation area. Furthermore, we observed that affiliative relationships were also associated with individual spatial distributions and explain more of the total variance of the spatial distribution in comparison with dominance hierarchy. Finally, we found that individuals observed at the same moment in proximity to the feeding zone were more likely to be distant in the hierarchy while controlling for maternal kinship, age and sex similarity. 
This study sheds light on how affiliative networks and dominance hierarchies relate to spatial positions in primates.

  3. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
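    The variance decomposition this record relies on can be illustrated with a minimal Monte Carlo "pick-freeze" estimator of first-order Sobol indices on a toy linear model. This sketch is not the paper's geostatistical hierarchical method, and the model and sample sizes are assumptions for illustration:

```python
import random

def sobol_first_order(f, dim, n=100_000, seed=0):
    """Monte Carlo pick-freeze estimate of first-order Sobol indices
    S_i = Var(E[Y|X_i]) / Var(Y) for independent U(0,1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(a) for a in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # C_i: all columns drawn from B except column i, frozen from A.
        yC = [f(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * yc for ya, yc in zip(yA, yC)) / n - mean ** 2
        indices.append(cov / var)
    return indices

# Toy model Y = 2*X1 + X2: analytically S1 = 4/5 = 0.8 and S2 = 1/5 = 0.2.
S = sobol_first_order(lambda x: 2 * x[0] + x[1], dim=2)
print(S)
```

    The estimates should land near the analytic values 0.8 and 0.2; the paper's contribution is making this kind of decomposition tractable when the "inputs" are entire spatially correlated fields rather than a few scalars.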

  4. Complex Span versus Updating Tasks of Working Memory: The Gap Is Not that Deep

    ERIC Educational Resources Information Center

    Schmiedek, Florian; Hildebrandt, Andrea; Lovden, Martin; Wilhelm, Oliver; Lindenberger, Ulman

    2009-01-01

    How to best measure working memory capacity is an issue of ongoing debate. Besides established complex span tasks, which combine short-term memory demands with generally unrelated secondary tasks, there exists a set of paradigms characterized by continuous and simultaneous updating of several items in working memory, such as the n-back, memory…

  5. A multi-factor Rasch scale for artistic judgment.

    PubMed

    Bezruczko, Nikolaus

    2002-01-01

    Measurement properties are reported for a combined scale of abstract and figurative artistic judgment aptitude items. Abstract items are synthetic, rule-based images from the Visual Designs Test (VDT), which implements a statistical algorithm to control design complexity and redundancy, and figurative items are canvas paintings in five styles (Fauvism, Post-Impressionism, Surrealism, Renaissance, and Baroque) created especially for this research. The paintings integrate syntactic structure from VDT Abstract designs with thematic content for each style at four levels of complexity while controlling redundancy. Trained test administrators collected preferences for synthetic abstract designs and authentic figurative art from 462 examinees in Johnson O'Connor Research Foundation testing offices in Boston, New York, Chicago, and Dallas. The Rasch model replicated measurement properties for VDT Abstract items and identified an item hierarchy that was statistically invariant between genders and generally stable across age for new, authentic figurative items. Further examination of the figurative item hierarchy revealed that complexity interacts with style and meaning. Sound measurement properties for a combined VDT Abstract and Figurative scale show promise for a comprehensive artistic judgment construct.

  6. Context and meter enhance long-range planning in music performance

    PubMed Central

    Mathias, Brian; Pfordresher, Peter Q.; Palmer, Caroline

    2015-01-01

    Neural responses demonstrate evidence of resonance, or oscillation, during the production of periodic auditory events. Music contains periodic auditory events that give rise to a sense of beat, which in turn generates a sense of meter on the basis of multiple periodicities. Metrical hierarchies may aid memory for music by facilitating similarity-based associations among sequence events at different periodic distances that unfold in longer contexts. A fundamental question is how metrical associations arising from a musical context influence memory during music performance. Longer contexts may facilitate metrical associations at higher hierarchical levels more than shorter contexts, a prediction of the range model, a formal model of planning processes in music performance (Palmer and Pfordresher, 2003; Pfordresher et al., 2007). Serial ordering errors, in which intended sequence events are produced in incorrect sequence positions, were measured as skilled pianists performed musical pieces that contained excerpts embedded in long or short musical contexts. Pitch errors arose from metrically similar positions and further sequential distances more often when the excerpt was embedded in long contexts compared to short contexts. Musicians’ keystroke intensities and error rates also revealed influences of metrical hierarchies, which differed for performances in long and short contexts. The range model accounted for contextual effects and provided better fits to empirical findings when metrical associations between sequence events were included. Longer sequence contexts may facilitate planning during sequence production by increasing conceptual similarity between hierarchically associated events. These findings are consistent with the notion that neural oscillations at multiple periodicities may strengthen metrical associations across sequence events during planning. PMID:25628550

  7. Simple and Complex Memory Spans and Their Relation to Fluid Abilities: Evidence from List-Length Effects

    ERIC Educational Resources Information Center

    Unsworth, Nash; Engle, Randall W.

    2006-01-01

Complex (working memory) span tasks have generally shown larger and more consistent correlations with higher-order cognition than have simple (or short-term memory) span tasks. The relation of verbal complex and verbal simple span tasks to fluid abilities as a function of list length was examined. The results suggest that the simple…

  8. Amnesia and the organization of the hippocampal system.

    PubMed

    Mishkin, M; Vargha-Khadem, F; Gadian, D G

    1998-01-01

    Early hippocampal injury in humans has been found to result in a limited form of global anterograde amnesia. At issue is whether the limitation is qualitative, with the amnesia reflecting substantially greater impairment in episodic than in semantic memory, or only quantitative, with both episodic and semantic memory being partially and equivalently impaired. Evidence from neuroanatomical and lesion studies in animals suggests that the hippocampus and subhippocampal cortices form a hierarchically organized system, such that the greatest convergence of information (and, by implication, the richest amount of association) takes place within the hippocampus, located at the top of the hierarchy. On the one hand, this evidence is consistent with the view that selective hippocampal damage produces a differential impairment in context-rich episodic memory as compared with context-free semantic memory, because only the latter can be supported by the subhippocampal cortices. On the other hand, given the system's hierarchical form of organization, this dissociation of deficits is difficult to prove, because a quantitatively limited deficit will nearly always be a viable alternative. A final choice between the alternative views is therefore likely to depend less on further evidence gathered in brain-injured patients than on which view accounts for more of the data gathered from converging approaches to the problem.

  9. The storage system of PCM based on random access file system

    NASA Astrophysics Data System (ADS)

    Han, Wenbing; Chen, Xiaogang; Zhou, Mi; Li, Shunfen; Li, Gezi; Song, Zhitang

    2016-10-01

Emerging memory technologies such as phase change memory (PCM) offer fast, random access to persistent storage with better scalability. Establishing PCM in the storage hierarchy to narrow the performance gap is an active topic of academic and industrial research. However, existing file systems, which access the storage medium via a slow, block-based interface, do not perform well on emerging PCM storage. In this paper, we propose a novel file system, RAFS, built on an embedded platform, to realize the performance potential of PCM. We attach PCM chips to the memory bus and build RAFS on the physical address space. In the proposed file system, we simplify the traditional system architecture to eliminate block-related operations and layers. Furthermore, we adopt memory mapping and bypass the page cache to reduce copy overhead between the process address space and the storage device. XIP mechanisms are also supported in RAFS. To the best of our knowledge, we are among the first to implement a file system on real PCM chips. We have analyzed and evaluated its performance with the IOZONE benchmark tools. Our experimental results show that RAFS on PCM outperforms Ext4fs on SDRAM with small record lengths. Based on DRAM, RAFS is significantly faster than Ext4fs by 18% to 250%.
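The copy-avoidance idea behind this kind of memory-mapped design can be illustrated with an ordinary file-backed mmap. This is a hedged sketch only: RAFS maps PCM physical addresses directly on the memory bus, whereas this example maps a regular file through the OS, and the path and sizes here are made up for illustration.

```python
# Illustrative sketch: a standard file-backed mmap, showing the general idea
# of mapping storage into the process address space so loads and stores
# happen in place, without read()/write() copies through buffers.
# (RAFS itself maps PCM directly; the file and names here are hypothetical.)
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "pcm_region")  # hypothetical region
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # back the mapping with one page

fd = os.open(path, os.O_RDWR)
try:
    with mmap.mmap(fd, 4096) as region:
        region[0:5] = b"hello"     # store directly into the mapped region
        data = region[0:5]         # load directly, no read() copy
        region.flush()             # persist changes to the backing store
finally:
    os.close(fd)

print(data)  # b'hello'
```

The point of the sketch is that the slice assignment and slice read operate on the mapped bytes themselves, which is the behavior RAFS generalizes by removing the block layer entirely.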

  10. Working memory training may increase working memory capacity but not fluid intelligence.

    PubMed

    Harrison, Tyler L; Shipstead, Zach; Hicks, Kenny L; Hambrick, David Z; Redick, Thomas S; Engle, Randall W

    2013-12-01

    Working memory is a critical element of complex cognition, particularly under conditions of distraction and interference. Measures of working memory capacity correlate positively with many measures of real-world cognition, including fluid intelligence. There have been numerous attempts to use training procedures to increase working memory capacity and thereby performance on the real-world tasks that rely on working memory capacity. In the study reported here, we demonstrated that training on complex working memory span tasks leads to improvement on similar tasks with different materials but that such training does not generalize to measures of fluid intelligence.

  11. Parallel effects of memory set activation and search on timing and working memory capacity.

    PubMed

    Schweickert, Richard; Fortin, Claudette; Xi, Zhuangzhuang; Viau-Quesnel, Charles

    2014-01-01

Accurately estimating a time interval is required in everyday activities such as driving or cooking. Estimating time is relatively easy, provided a person attends to it. But a brief shift of attention to another task usually interferes with timing. Most processes carried out concurrently with timing interfere with it. Curiously, some do not. Literature on a few processes suggests a general proposition, the Timing and Complex-Span Hypothesis: a process interferes with concurrent timing if and only if process performance is related to complex span. Complex span is the number of items correctly recalled in order, when each item presented for study is followed by a brief activity. Literature on task switching, visual search, memory search, word generation and mental time travel supports the hypothesis. Previous work found that another process, activation of a memory set in long-term memory, is not related to complex span. If the Timing and Complex-Span Hypothesis is true, activation should not interfere with concurrent timing in dual-task conditions. We tested such activation in single-task memory search conditions and in dual-task conditions where memory search was executed with concurrent timing. In Experiment 1, activating a memory set increased reaction time, with no significant effect on time production. In Experiment 2, set size and memory set activation were manipulated. Activation and set size had a puzzling interaction for time productions, perhaps due to difficult conditions, leading us to use a related but easier task in Experiment 3. In Experiment 3, increasing set size lengthened time production, but memory activation had no significant effect. Results here and in previous literature on the whole support the Timing and Complex-Span Hypothesis. Results also support a sequential organization of activation and search of memory. This organization predicts that activation and set size have additive effects on reaction time and multiplicative effects on percent correct, which was found.

  12. The Influences of Emotion on Learning and Memory

    PubMed Central

    Tyng, Chai M.; Amin, Hafeez U.; Saad, Mohamad N. M.; Malik, Aamir S.

    2017-01-01

Emotion has a substantial influence on the cognitive processes in humans, including perception, attention, learning, memory, reasoning, and problem solving. Emotion has a particularly strong influence on attention, especially modulating the selectivity of attention as well as motivating action and behavior. This attentional and executive control is intimately linked to learning processes, as intrinsically limited attentional capacities are better focused on relevant information. Emotion also facilitates encoding and helps retrieval of information efficiently. However, the effects of emotion on learning and memory are not always univalent, as studies have reported that emotion either enhances or impairs learning and long-term memory (LTM) retention, depending on a range of factors. Recent neuroimaging findings have indicated that the amygdala and prefrontal cortex cooperate with the medial temporal lobe in an integrated manner that affords (i) the amygdala modulating memory consolidation; (ii) the prefrontal cortex mediating memory encoding and formation; and (iii) the hippocampus for successful learning and LTM retention. We also review the nested hierarchies of circular emotional control and cognitive regulation (bottom-up and top-down influences) within the brain to achieve optimal integration of emotional and cognitive processing. This review highlights a basic evolutionary approach to emotion to understand the effects of emotion on learning and memory and the functional roles played by various brain regions and their mutual interactions in relation to emotional processing. We also summarize the current state of knowledge on the impact of emotion on memory and map implications for educational settings. In addition to elucidating the memory-enhancing effects of emotion, neuroimaging findings extend our understanding of emotional influences on learning and memory processes; this knowledge may be useful for the design of effective educational curricula to provide a conducive learning environment for both traditional “live” learning in classrooms and “virtual” learning through online-based educational technologies. PMID:28883804

  13. New solitary wave solutions to the (2+1)-dimensional Calogero-Bogoyavlenskii-Schiff and the Kadomtsev-Petviashvili hierarchy equations

    NASA Astrophysics Data System (ADS)

    Baskonus, Haci Mehmet; Sulaiman, Tukur Abdulkadir; Bulut, Hasan

    2017-10-01

In this paper, with the help of Wolfram Mathematica 9, we employ the powerful sine-Gordon expansion method to investigate the solution structures of two well-known nonlinear evolution equations, namely the Calogero-Bogoyavlenskii-Schiff and Kadomtsev-Petviashvili hierarchy equations. We obtain new solutions with complex, hyperbolic and trigonometric function structures. All solutions obtained in this paper were verified against their corresponding equations. We also plot three- and two-dimensional graphics of all the obtained solutions using the same program in Wolfram Mathematica 9. We finally present a comprehensive conclusion.

  14. Lifting the Markov blankets of socio-cultural evolution. A comment on "Answering Schrödinger's question: A free-energy formulation" by Maxwell James Désormeau Ramstead et al.

    NASA Astrophysics Data System (ADS)

    Leydesdorff, Loet

    2018-03-01

Ramstead et al. [8] claim an encompassing ontology which can be used as a heuristic for studying life, mind, and society both empirically and in terms of computer simulations. The system's levels self-organize into a hierarchy; "Markov blankets" close the various levels off from one another. Homo sapiens sapiens is placed at the top of this hierarchy as "the world's most complex living systems." Humans are said to generate "(epi)genetically-specified expectations that have been shaped by selection to guide action-perception cycles toward adaptive or unsurprising states."

  15. Planetary environments and the conditions of life

    NASA Technical Reports Server (NTRS)

    Chang, S.

    1988-01-01

Geophysical models of the first 600 Ma of the earth's history following accretion and core formation point to a period of great environmental disequilibrium. In such an environment, the passage of energy from the earth's interior and from the sun through gas-liquid-solid domains and their boundaries with each other generated a dynamically interacting, complex hierarchy of self-organized structures ranging from bubbles at the sea-air interface to tectonic plates. The ability of a planet to produce such a hierarchy is speculated to be a prerequisite to the origin and sustenance of life. The application of this criterion to Mars argues against the origin of Martian life.

  16. Working Memory Subsystems and Task Complexity in Young Boys with Fragile X Syndrome

    ERIC Educational Resources Information Center

    Baker, S.; Hooper, S.; Skinner, M.; Hatton, D.; Schaaf, J.; Ornstein, P.; Bailey, D.

    2011-01-01

    Background: Working memory problems have been targeted as core deficits in individuals with Fragile X syndrome (FXS); however, there have been few studies that have examined working memory in young boys with FXS, and even fewer studies that have studied the working memory performance of young boys with FXS across different degrees of complexity.…

  17. Spatial operator algebra for flexible multibody dynamics

    NASA Technical Reports Server (NTRS)

    Jain, A.; Rodriguez, G.

    1993-01-01

    This paper presents an approach to modeling the dynamics of flexible multibody systems such as flexible spacecraft and limber space robotic systems. A large number of degrees of freedom and complex dynamic interactions are typical in these systems. This paper uses spatial operators to develop efficient recursive algorithms for the dynamics of these systems. This approach very efficiently manages complexity by means of a hierarchy of mathematical operations.

  18. The Visual Orientation Memory of "Drosophila" Requires Foraging (PKG) Upstream of Ignorant (RSK2) in Ring Neurons of the Central Complex

    ERIC Educational Resources Information Center

    Kuntz, Sara; Poeck, Burkhard; Sokolowski, Marla B.; Strauss, Roland

    2012-01-01

    Orientation and navigation in a complex environment requires path planning and recall to exert goal-driven behavior. Walking "Drosophila" flies possess a visual orientation memory for attractive targets which is localized in the central complex of the adult brain. Here we show that this type of working memory requires the cGMP-dependent protein…

  19. Extraction and prediction of indices for monsoon intraseasonal oscillations: an approach based on nonlinear Laplacian spectral analysis

    NASA Astrophysics Data System (ADS)

    Sabeerali, C. T.; Ajayamohan, R. S.; Giannakis, Dimitrios; Majda, Andrew J.

    2017-11-01

An improved index for real-time monitoring and forecast verification of monsoon intraseasonal oscillations (MISOs) is introduced using the recently developed nonlinear Laplacian spectral analysis (NLSA) technique. Using NLSA, a hierarchy of Laplace-Beltrami (LB) eigenfunctions is extracted from unfiltered daily rainfall data from the Global Precipitation Climatology Project over the south Asian monsoon region. Two modes representing the full life cycle of the northeastward-propagating boreal summer MISO are identified from the hierarchy of LB eigenfunctions. These modes have a number of advantages over MISO modes extracted via extended empirical orthogonal function analysis, including higher memory and predictability, stronger amplitude and higher fractional explained variance over the western Pacific, Western Ghats, and adjoining Arabian Sea regions, and more realistic representation of the regional heat sources over the Indian and Pacific Oceans. Real-time prediction of NLSA-derived MISO indices is demonstrated via extended-range hindcasts based on NCEP Coupled Forecast System version 2 operational output. It is shown that in these hindcasts the NLSA MISO indices remain predictable out to ~3 weeks.
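The eigenfunction-extraction step above can be sketched with a diffusion-maps-style computation. This is a simplified, assumed version for exposition only: the actual NLSA technique adds time-lagged embedding and a more careful kernel construction, and the function and parameter names here are illustrative.

```python
# Simplified, assumed sketch of extracting Laplacian eigenfunctions from
# data snapshots, in the spirit of NLSA / diffusion maps. Names and the
# kernel choice are illustrative, not the method of the paper.
import numpy as np

def laplacian_eigenfunctions(X, eps=1.0, k=3):
    """X: (n_samples, n_features) snapshots; returns the leading k
    eigenvalues/eigenvectors of the Markov-normalized kernel matrix."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    K = np.exp(-d2 / eps)                                # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                 # row-stochastic matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)                       # sort by eigenvalue
    return vals.real[order][:k], vecs.real[:, order][:, :k]

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))                             # toy snapshots
vals, vecs = laplacian_eigenfunctions(X)
# The leading eigenvalue of a row-stochastic matrix is 1, with a constant
# eigenvector; the subsequent eigenfunctions order the data along slow modes.
```

In the paper's setting, the snapshots would be daily rainfall fields, and the leading nontrivial eigenfunctions play the role of the MISO modes.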

  20. [Occupational complexity and late-life memory and reasoning abilities].

    PubMed

    Ishioka, Yoshiko; Gondo, Yasuyuki; Masui, Yukie; Nakagawa, Takeshi; Tabuchi, Megumi; Ogawa, Madoka; Kamide, Kei; Ikebe, Kazunori; Arai, Yasumichi; Ishizaki, Tatsuro; Takahashi, Ryutaro

    2015-08-01

This study examined the associations between the complexity of an individual's primary lifetime occupation and his or her late-life memory and reasoning performance, using data from 824 community-dwelling participants aged 69-72 years. The complexity of work with data, people, and things was evaluated based on the Japanese job complexity score. The associations between occupational complexity and participants' memory and reasoning abilities were examined in multiple regression analyses. An association was found between more complex work with people and higher memory performance, as well as between more complex work with data and higher reasoning performance, after having controlled for gender, school records, and education. Further, an interaction effect was observed between gender and complexity of work with data in relation to reasoning performance: work involving a high degree of complexity with data was associated with high reasoning performance in men. These findings suggest the need to consider late-life cognitive functioning within the context of adulthood experiences, specifically those related to occupation and gender.

  1. Performance Prediction Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chennupati, Gopinath; Santhi, Nanadakishore; Eidenbenz, Stephen

The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains the hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their tasks through message exchanges to remain active, sleep, wake up, begin, and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels and their respective sizes, cache lines, access times for different cache levels, and average cycle counts of ALU operations. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes a tasklist as input. A tasklist is an unordered set of ALU and other CPU-type operations (in particular, virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with calls to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudocode. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU-core-level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly included the L1, L2, and L3 hit rates as inputs to the tasklists. Explicit hit rates inevitably reflect only the application modeler's best guess, perhaps informed by a few small test problems using hardware counters; hard-coded hit rates also make the hardware model insensitive to changes in cache sizes. Instead, we use reuse distance distributions in the tasklists. In general, reuse profiles require the application modeler to run a very expensive trace analysis on the real code, which realistically can be done at best for small examples.
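The tasklist/time_compute() interaction described above can be sketched as follows. This is a hedged illustration: the class, its fields, the tasklist encoding, and the parameter values are assumptions made for exposition, not the actual PPT API.

```python
# Hypothetical sketch of a time_compute()-style core model. The tasklist
# encoding and all parameter values below are illustrative assumptions,
# not PPT's real interface.
from dataclasses import dataclass

@dataclass
class CoreModel:
    clock_ghz: float           # clock speed in GHz
    alu_cycles: float          # average cycles per ALU operation
    mem_access_times_ns: dict  # per-level access time in ns

    def time_compute(self, tasklist):
        """tasklist: unordered list of ("alu", count) or
        ("mem", level, count) items; returns predicted time in ns."""
        ns = 0.0
        for item in tasklist:
            if item[0] == "alu":
                # cycles / (cycles per ns) = ns
                ns += item[1] * self.alu_cycles / self.clock_ghz
            elif item[0] == "mem":
                _, level, count = item
                ns += count * self.mem_access_times_ns[level]
        return ns

core = CoreModel(clock_ghz=2.5, alu_cycles=1.0,
                 mem_access_times_ns={"L1": 1.5, "L2": 5.0, "DRAM": 80.0})
# A toy kernel: 1000 ALU ops, 900 L1 hits, 100 DRAM accesses.
t = core.time_compute([("alu", 1000), ("mem", "L1", 900), ("mem", "DRAM", 100)])
print(t)  # 9750.0 ns
```

The AMM's contribution, in these terms, is to derive the per-level memory counts from reuse distance distributions rather than requiring the modeler to supply hit rates directly.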

  2. Syntactic Recursion Facilitates and Working Memory Predicts Recursive Theory of Mind

    PubMed Central

    Arslan, Burcu; Hohenberger, Annette; Verbrugge, Rineke

    2017-01-01

    In this study, we focus on the possible roles of second-order syntactic recursion and working memory in terms of simple and complex span tasks in the development of second-order false belief reasoning. We tested 89 Turkish children in two age groups, one younger (4;6–6;5 years) and one older (6;7–8;10 years). Although second-order syntactic recursion is significantly correlated with the second-order false belief task, results of ordinal logistic regressions revealed that the main predictor of second-order false belief reasoning is complex working memory span. Unlike simple working memory and second-order syntactic recursion tasks, the complex working memory task required processing information serially with additional reasoning demands that require complex working memory strategies. Based on our results, we propose that children’s second-order theory of mind develops when they have efficient reasoning rules to process embedded beliefs serially, thus overcoming a possible serial processing bottleneck. PMID:28072823

  3. Hierarchically Enhanced Impact Resistance of Bioinspired Composites.

    PubMed

    Gu, Grace X; Takaffoli, Mahdi; Buehler, Markus J

    2017-07-01

    An order of magnitude tougher than nacre, conch shells are known for being one of the toughest body armors in nature. However, the complexity of the conch shell architecture creates a barrier to emulating its cross-lamellar structure in synthetic materials. Here, a 3D biomimetic conch shell prototype is presented, which can replicate the crack arresting mechanisms embedded in the natural architecture. Through an integrated approach combining simulation, additive manufacturing, and drop tower testing, the function of hierarchy in conch shell's multiscale microarchitectures is explicated. The results show that adding the second level of cross-lamellar hierarchy can boost impact performance by 70% and 85% compared to a single-level hierarchy and the stiff constituent, respectively. The overarching mechanism responsible for the impact resistance of conch shell is the generation of pathways for crack deviation, which can be generalized to the design of future protective apparatus such as helmets and body armor. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Evidence for a distributed hierarchy of action representation in the brain

    PubMed Central

    Grafton, Scott T.; de C. Hamilton, Antonia F.

    2007-01-01

    Complex human behavior is organized around temporally distal outcomes. Behavioral studies based on tasks such as normal prehension, multi-step object use and imitation establish the existence of relative hierarchies of motor control. The retrieval errors in apraxia also support the notion of a hierarchical model for representing action in the brain. In this review, three functional brain imaging studies of action observation using the method of repetition suppression are used to identify a putative neural architecture that supports action understanding at the level of kinematics, object centered goals and ultimately, motor outcomes. These results, based on observation, may match a similar functional anatomic hierarchy for action planning and execution. If this is true, then the findings support a functional anatomic model that is distributed across a set of interconnected brain areas that are differentially recruited for different aspects of goal oriented behavior, rather than a homogeneous mirror neuron system for organizing and understanding all behavior. PMID:17706312

  5. No Evidence for a Fixed Object Limit in Working Memory: Spatial Ensemble Representations Inflate Estimates of Working Memory Capacity for Complex Objects

    ERIC Educational Resources Information Center

    Brady, Timothy F.; Alvarez, George A.

    2015-01-01

    A central question for models of visual working memory is whether the number of objects people can remember depends on object complexity. Some influential "slot" models of working memory capacity suggest that people always represent 3-4 objects and that only the fidelity with which these objects are represented is affected by object…

  6. Working memory at work: how the updating process alters the nature of working memory transfer.

    PubMed

    Zhang, Yanmin; Verhaeghen, Paul; Cerella, John

    2012-01-01

    In three N-Back experiments, we investigated components of the process of working memory (WM) updating, more specifically access to items stored outside the focus of attention and transfer from the focus to the region of WM outside the focus. We used stimulus complexity as a marker. We found that when WM transfer occurred under full attention, it was slow and highly sensitive to stimulus complexity, much more so than WM access. When transfer occurred in conjunction with access, however, it was fast and no longer sensitive to stimulus complexity. Thus the updating context altered the nature of WM processing: The dual-task situation (transfer in conjunction with access) drove memory transfer into a more efficient mode, indifferent to stimulus complexity. In contrast, access times consistently increased with complexity, unaffected by the processing context. This study reinforces recent reports that retrieval is a (perhaps the) key component of working memory functioning. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. Working Memory at Work: How the Updating Process Alters the Nature of Working Memory Transfer

    PubMed Central

    Zhang, Yanmin; Verhaeghen, Paul; Cerella, John

    2011-01-01

    In three N-Back experiments, we investigated components of the process of working memory (WM) updating, more specifically access to items stored outside the focus of attention and transfer from the focus to the region of WM outside the focus. We used stimulus complexity as a marker. We found that when WM transfer occurred under full attention, it was slow and highly sensitive to stimulus complexity, much more so than WM access. When transfer occurred in conjunction with access, however, it was fast and no longer sensitive to stimulus complexity. Thus the updating context altered the nature of WM processing: The dual-task situation (transfer in conjunction with access) drove memory transfer into a more efficient mode, indifferent to stimulus complexity. In contrast, access times consistently increased with complexity, unaffected by the processing context. This study reinforces recent reports that retrieval is a (perhaps the) key component of working memory functioning. PMID:22105718

  8. View-tolerant face recognition and Hebbian learning imply mirror-symmetric neural tuning to head orientation

    PubMed Central

    Leibo, Joel Z.; Liao, Qianli; Freiwald, Winrich A.; Anselmi, Fabio; Poggio, Tomaso

    2017-01-01

The primate brain contains a hierarchy of visual areas, dubbed the ventral stream, which rapidly computes object representations that are both specific for object identity and robust against identity-preserving transformations like depth-rotations [1, 2]. Current computational models of object recognition, including recent deep learning networks, generate these properties through a hierarchy of alternating selectivity-increasing filtering and tolerance-increasing pooling operations, similar to simple-complex cells operations [3, 4, 5, 6]. Here we prove that a class of hierarchical architectures and a broad set of biologically plausible learning rules generate approximate invariance to identity-preserving transformations at the top level of the processing hierarchy. However, all past models tested failed to reproduce the most salient property of an intermediate representation of a three-level face-processing hierarchy in the brain: mirror-symmetric tuning to head orientation [7]. Here we demonstrate that one specific biologically-plausible Hebb-type learning rule generates mirror-symmetric tuning to bilaterally symmetric stimuli like faces at intermediate levels of the architecture and show why it does so. Thus the tuning properties of individual cells inside the visual stream appear to result from group properties of the stimuli they encode and to reflect the learning rules that sculpted the information-processing system within which they reside. PMID:27916522

  9. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218

  10. Does stress remove the HDAC brakes for the formation and persistence of long-term memory?

    PubMed

    White, André O; Wood, Marcelo A

    2014-07-01

    It has been known for numerous decades that gene expression is required for long-lasting forms of memory. In the past decade, the study of epigenetic mechanisms in memory processes has revealed yet another layer of complexity in the regulation of gene expression. Epigenetic mechanisms do not only provide complexity in the protein regulatory complexes that control coordinate transcription for specific cell function, but the epigenome encodes critical information that integrates experience and cellular history for specific cell functions as well. Thus, epigenetic mechanisms provide a unique mechanism of gene expression regulation for memory processes. This may be why critical negative regulators of gene expression, such as histone deacetylases (HDACs), have powerful effects on the formation and persistence of memory. For example, HDAC inhibition has been shown to transform a subthreshold learning event into robust long-term memory and also generate a form of long-term memory that persists beyond the point at which normal long-term memory fails. A key question that is explored in this review, from a learning and memory perspective, is whether stress-dependent signaling drives the formation and persistence of long-term memory via HDAC-dependent mechanisms. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Does stress remove the HDAC brakes for the formation and persistence of long-term memory?

    PubMed Central

    White, André O.; Wood, Marcelo A.

    2013-01-01

    It has been known for decades that gene expression is required for long-lasting forms of memory. In the past decade, the study of epigenetic mechanisms in memory processes has revealed yet another layer of complexity in the regulation of gene expression. Epigenetic mechanisms not only add complexity to the protein regulatory complexes that control coordinated transcription for specific cell functions; the epigenome also encodes critical information that integrates experience and cellular history. Thus, epigenetic mechanisms provide a unique mode of gene expression regulation for memory processes. This may be why critical negative regulators of gene expression, such as histone deacetylases (HDACs), have powerful effects on the formation and persistence of memory. For example, HDAC inhibition has been shown to transform a subthreshold learning event into robust long-term memory and to generate a form of long-term memory that persists beyond the point at which normal long-term memory fails. A key question explored in this review, from a learning and memory perspective, is whether stress-dependent signaling drives the formation and persistence of long-term memory via HDAC-dependent mechanisms. PMID:24149059

  12. Complex-valued Multidirectional Associative Memory

    NASA Astrophysics Data System (ADS)

    Kobayashi, Masaki; Yamazaki, Haruaki

    The Hopfield model is a representative associative memory. It was extended to the Bidirectional Associative Memory (BAM) by Kosko and to the Multidirectional Associative Memory (MAM) by Hagiwara. These models have two or more layers, and because the connections between layers are symmetric, they are guaranteed to converge. MAM can store tuples of many patterns, such as (x1, x2,…), where xm is the pattern on layer m. Noest, Hirose, and Nemoto proposed the complex-valued Hopfield model, and Lee proposed the complex-valued Bidirectional Associative Memory. Zemel proved the rotation invariance of the complex-valued Hopfield model, meaning that rotated versions of stored patterns are also stored. In this paper, a complex-valued Multidirectional Associative Memory is proposed, and its rotation invariance is also proved. Moreover, computer simulations show that differences between the phase angles of given patterns are automatically reduced. We first define the complex-valued Multidirectional Associative Memory and the energy function of the network, and use the energy function to prove that the network is guaranteed to converge. We then define the learning rule and characterize the recall process, namely that the differences between the phase angles of given patterns are automatically reduced. In particular, we prove the following theorem: when only one tuple of patterns is stored, if patterns with different phase angles are given to the layers, the differences are automatically reduced. Finally, we investigate how these angle differences affect noise robustness. They reduce noise robustness, because the input to each layer becomes small; we show this by computer simulation.
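As a rough illustration of the rotation invariance discussed above, here is a minimal complex-valued Hopfield-style network (a sketch under assumed conventions: K = 8 phase states per neuron, Hebbian outer-product learning, synchronous phase-quantized updates; the paper's multidirectional, multilayer model is more general):

```python
import numpy as np

K = 8  # number of phase states per neuron (an assumption for this sketch)
phases = np.exp(2j * np.pi * np.arange(K) / K)

def quantize(z):
    # Map each complex activation to the nearest of the K unit phasors.
    return phases[np.argmax(np.real(np.conj(phases)[:, None] * z), axis=0)]

def train(patterns):
    # Hebbian outer-product rule with zero diagonal, as in the Hopfield model.
    n = patterns[0].size
    W = sum(np.outer(p, np.conj(p)) for p in patterns) / n
    np.fill_diagonal(W, 0)
    return W

def recall(W, state, steps=10):
    for _ in range(steps):
        state = quantize(W @ state)
    return state

rng = np.random.default_rng(0)
x = phases[rng.integers(0, K, size=16)]   # one stored pattern
W = train([x])

# Rotation invariance: a globally rotated copy of x is also a fixed point.
rotated = phases[1] * x
assert np.allclose(recall(W, rotated), rotated)
```

Because the K phase values are closed under a global rotation by 2π/K, the rotated copy of a stored pattern is itself stable, which is the rotation-invariance property the abstract refers to.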

  13. Representational Complexity and Memory Retrieval in Language Comprehension

    ERIC Educational Resources Information Center

    Hofmeister, Philip

    2011-01-01

    Mental representations formed from words or phrases may vary considerably in their feature-based complexity. Modern theories of retrieval in sentence comprehension do not indicate how this variation and the role of encoding processes should influence memory performance. Here, memory retrieval in language comprehension is shown to be influenced by…

  14. Role of Working Memory in Typically Developing Children's Complex Sentence Comprehension

    ERIC Educational Resources Information Center

    Montgomery, James W.; Magimairaj, Beula M.; O'Malley, Michelle H.

    2008-01-01

    The influence of three mechanisms of working memory (phonological short-term memory [PSTM] capacity, attentional resource control/allocation, and processing speed) on children's complex (and simple) sentence comprehension was investigated. Fifty-two children (6-12 years) completed a nonword repetition task (indexing PSTM), concurrent verbal…

  15. Presentation Media, Information Complexity, and Learning Outcomes

    ERIC Educational Resources Information Center

    Andres, Hayward P.; Petersen, Candice

    2002-01-01

    Cognitive processing limitations restrict the number of complex information items held and processed in human working memory. To overcome such limitations, a verbal working memory channel is used to construct an if-then proposition representation of facts and a visual working memory channel is used to construct a visual imagery of geometric…

  16. Sentence Complexity and Working Memory Effects in Ambiguity Resolution

    ERIC Educational Resources Information Center

    Kim, Ji Hyon; Christianson, Kiel

    2013-01-01

    Two self-paced reading experiments using a paraphrase decision task paradigm were performed to investigate how sentence complexity contributed to the relative clause (RC) attachment preferences of speakers of different working memory capacities (WMCs). Experiment 1 (English) showed working memory effects on relative clause processing in both…

  17. From lepton protoplasm to the genesis of hadrons

    NASA Astrophysics Data System (ADS)

    Eliseev, S. M.; Kosmachev, O. S.

    2016-01-01

    Theory of matter under extreme conditions opens a new stage in particle physics. It is necessary here to combine Dirac's elementary particle physics with Prigogine's dynamics of nonequilibrium systems. In this article we discuss the problem of the hierarchy of complexity: what can be considered the lowest level of organization of extreme matter, on the basis of which the self-organization of more complex forms occurs?

  18. KENNEDY SPACE CENTER, FLA. - In front of the Space Memorial Mirror at the KSC Visitor Complex, Center Director Jim Kennedy (right) speaks to visitors gathered for the memorial service honoring the crew of Columbia. At left are KSC Deputy Director Woodrow Whitlow Jr. and Executive Director of Florida Space Authority Winston Scott; at right is Dr. Stephen Feldman, president of the Astronaut Memorial Foundation. Feb. 1 is the one-year anniversary of the loss of the crew and orbiter Columbia in a tragic accident as the ship returned to Earth following mission STS-107. The public was also invited to the memorial service held at the KSC Visitor Complex.

    NASA Image and Video Library

    2004-02-01

    KENNEDY SPACE CENTER, FLA. - In front of the Space Memorial Mirror at the KSC Visitor Complex, Center Director Jim Kennedy (right) speaks to visitors gathered for the memorial service honoring the crew of Columbia. At left are KSC Deputy Director Woodrow Whitlow Jr. and Executive Director of Florida Space Authority Winston Scott; at right is Dr. Stephen Feldman, president of the Astronaut Memorial Foundation. Feb. 1 is the one-year anniversary of the loss of the crew and orbiter Columbia in a tragic accident as the ship returned to Earth following mission STS-107. The public was also invited to the memorial service held at the KSC Visitor Complex.

  19. The Structure of Working Memory Abilities across the Adult Life Span

    PubMed Central

    Hale, Sandra; Rose, Nathan S.; Myerson, Joel; Strube, Michael J; Sommers, Mitchell; Tye-Murray, Nancy; Spehar, Brent

    2010-01-01

    The present study addresses three questions regarding age differences in working memory: (1) whether performance on complex span tasks decreases as a function of age at a faster rate than performance on simple span tasks; (2) whether spatial working memory decreases at a faster rate than verbal working memory; and (3) whether the structure of working memory abilities is different for different age groups. Adults, ages 20–89 (n=388), performed three simple and three complex verbal span tasks and three simple and three complex spatial memory tasks. Performance on the spatial tasks decreased at faster rates as a function of age than performance on the verbal tasks, but within each domain, performance on complex and simple span tasks decreased at the same rates. Confirmatory factor analyses revealed that domain-differentiated models yielded better fits than models involving domain-general constructs, providing further evidence of the need to distinguish verbal and spatial working memory abilities. Regardless of which domain-differentiated model was examined, and despite the faster rates of decrease in the spatial domain, age group comparisons revealed that the factor structure of working memory abilities was highly similar in younger and older adults and showed no evidence of age-related dedifferentiation. PMID:21299306

  20. Self-Organization of Spatio-Temporal Hierarchy via Learning of Dynamic Visual Image Patterns on Action Sequences

    PubMed Central

    Jung, Minju; Hwang, Jungsik; Tani, Jun

    2015-01-01

    It is well known that the visual cortex efficiently processes high-dimensional spatial information by using a hierarchical structure. Recently, computational models that were inspired by the spatial hierarchy of the visual cortex have shown remarkable performance in image recognition. Up to now, however, most biological and computational modeling studies have mainly focused on the spatial domain and do not discuss temporal domain processing of the visual cortex. Several studies on the visual cortex and other brain areas associated with motor control support that the brain also uses its hierarchical structure as a processing mechanism for temporal information. Based on the success of previous computational models using spatial hierarchy and temporal hierarchy observed in the brain, the current report introduces a novel neural network model for the recognition of dynamic visual image patterns based solely on the learning of exemplars. This model is characterized by the application of both spatial and temporal constraints on local neural activities, resulting in the self-organization of a spatio-temporal hierarchy necessary for the recognition of complex dynamic visual image patterns. The evaluation with the Weizmann dataset in recognition of a set of prototypical human movement patterns showed that the proposed model is significantly robust in recognizing dynamically occluded visual patterns compared to other baseline models. Furthermore, an evaluation test for the recognition of concatenated sequences of those prototypical movement patterns indicated that the model is endowed with a remarkable capability for the contextual recognition of long-range dynamic visual image patterns. PMID:26147887

  1. Self-Organization of Spatio-Temporal Hierarchy via Learning of Dynamic Visual Image Patterns on Action Sequences.

    PubMed

    Jung, Minju; Hwang, Jungsik; Tani, Jun

    2015-01-01

    It is well known that the visual cortex efficiently processes high-dimensional spatial information by using a hierarchical structure. Recently, computational models that were inspired by the spatial hierarchy of the visual cortex have shown remarkable performance in image recognition. Up to now, however, most biological and computational modeling studies have mainly focused on the spatial domain and do not discuss temporal domain processing of the visual cortex. Several studies on the visual cortex and other brain areas associated with motor control support that the brain also uses its hierarchical structure as a processing mechanism for temporal information. Based on the success of previous computational models using spatial hierarchy and temporal hierarchy observed in the brain, the current report introduces a novel neural network model for the recognition of dynamic visual image patterns based solely on the learning of exemplars. This model is characterized by the application of both spatial and temporal constraints on local neural activities, resulting in the self-organization of a spatio-temporal hierarchy necessary for the recognition of complex dynamic visual image patterns. The evaluation with the Weizmann dataset in recognition of a set of prototypical human movement patterns showed that the proposed model is significantly robust in recognizing dynamically occluded visual patterns compared to other baseline models. Furthermore, an evaluation test for the recognition of concatenated sequences of those prototypical movement patterns indicated that the model is endowed with a remarkable capability for the contextual recognition of long-range dynamic visual image patterns.

  2. Is the Link from Working Memory to Analogy Causal? No Analogy Improvements following Working Memory Training Gains

    PubMed Central

    Richey, J. Elizabeth; Phillips, Jeffrey S.; Schunn, Christian D.; Schneider, Walter

    2014-01-01

    Analogical reasoning has been hypothesized to critically depend upon working memory through correlational data [1], but less work has tested this relationship through experimental manipulation [2]. An opportunity for examining the connection between working memory and analogical reasoning has emerged from the growing, although somewhat controversial, body of literature suggesting that complex working memory training can sometimes lead to working memory improvements that transfer to novel working memory tasks. This study investigated whether working memory improvements, if replicated, would increase analogical reasoning ability. We assessed participants’ performance on verbal and visual analogy tasks after a complex working memory training program incorporating verbal and spatial tasks [3], [4]. Participants’ improvements on the working memory training tasks transferred to other short-term and working memory tasks, supporting the possibility of broad effects of working memory training. However, we found no effects on analogical reasoning. We propose several possible explanations for the lack of an impact of working memory improvements on analogical reasoning. PMID:25188356

  3. Detecting and evaluating communities in complex human and biological networks

    NASA Astrophysics Data System (ADS)

    Morrison, Greg; Mahadevan, L.

    2012-02-01

    We develop a simple method for detecting the community structure in a network by utilizing a measure of closeness between nodes. This approach readily leads to a method of coarse graining the network, which allows the detection of the natural hierarchy (or hierarchies) of community structure without appealing to an unknown resolution parameter. The closeness measure can also be used to evaluate the robustness of an individual node's assignment to its community (rather than evaluating only the quality of the global structure). Each of these methods for community detection and evaluation is demonstrated on a variety of real-world networks of biological or sociological importance, illustrating the power and flexibility of the approach.

  4. Soliton, rational, and periodic solutions for the infinite hierarchy of defocusing nonlinear Schrödinger equations.

    PubMed

    Ankiewicz, Adrian

    2016-07-01

    Analysis of short-pulse propagation in positive dispersion media, e.g., in optical fibers and in shallow water, requires assorted high-order derivative terms. We present an infinite-order "dark" hierarchy of equations, starting from the basic defocusing nonlinear Schrödinger equation. We present generalized soliton solutions, plane-wave solutions, and periodic solutions of all orders. We find that "even"-order equations in the set affect phase and "stretching factors" in the solutions, while "odd"-order equations affect the velocities. Hence odd-order equation solutions can be real functions, while even-order equation solutions are complex. There are various applications in optics and water waves.
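For context, the basic defocusing nonlinear Schrödinger equation from which the "dark" hierarchy starts can be written (in one common fiber-optics normalization, with x the propagation variable and t the transverse variable; signs and scalings vary by convention):

```latex
i\,\psi_x + \frac{1}{2}\,\psi_{tt} - |\psi|^2\,\psi = 0
```

The hierarchy described in the abstract augments this lowest-order equation with an infinite sequence of higher-order derivative terms.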

  5. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult with such methods to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
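A minimal sketch of the PCA stage on synthetic data (the loss-map matrix here is hypothetical; the paper additionally applies Local Outlier Factor to the reduced coordinates, for which a simple distance score stands in below):

```python
import numpy as np

# Hypothetical loss-map data: rows = loss maps, columns = monitor signals.
rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(200, 50))
anomaly = rng.normal(0.0, 1.0, size=(1, 50)) + 6.0   # one shifted, atypical map
X = np.vstack([normal, anomaly])

# PCA via SVD on the mean-centered data; keep the leading components.
mu = X.mean(axis=0)
Xc = X - mu
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
P = Vt[:k]                        # principal directions

# Score each map by its distance from the centroid in the reduced space;
# the paper uses Local Outlier Factor on such coordinates instead.
proj = Xc @ P.T
score = np.linalg.norm(proj, axis=1)
print(int(np.argmax(score)))      # 200, the injected anomalous map
```

A density-based detector such as Local Outlier Factor refines this idea by flagging maps whose local neighborhood density in the reduced space is atypical, rather than only those far from the global centroid.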

  6. Consequence of preterm birth in early adolescence: the role of language on auditory short-term memory.

    PubMed

    Fraello, David; Maller-Kesselman, Jill; Vohr, Betty; Katz, Karol H; Kesler, Shelli; Schneider, Karen; Reiss, Allan; Ment, Laura; Spann, Marisa N

    2011-06-01

    This study tested the hypothesis that preterm early adolescents' short-term memory is compromised when presented with increasingly complex verbal information and that associated neuroanatomical volumes would differ between preterm and term groups. Forty-nine preterm and 20 term subjects were evaluated at age 12 years with neuropsychological measures and magnetic resonance imaging (MRI). There were no differences between groups in simple short-term and working memory. Preterm subjects performed lower on learning and short-term memory tests that included increased verbal complexity. They had reduced right parietal, left temporal, and right temporal white matter volumes and greater bilateral frontal gray and right frontal white matter volumes. There was a positive association between complex working memory and the left hippocampus and frontal white matter in term subjects. While not correlated, memory scores and volumes of cortical regions known to subserve language and memory were reduced in preterm subjects. This study provides evidence of possible mechanisms for learning problems in former preterm infants.

  7. Neural Mechanisms of Information Storage in Visual Short-Term Memory

    PubMed Central

    Serences, John T.

    2016-01-01

    The capacity to briefly memorize fleeting sensory information supports visual search and behavioral interactions with relevant stimuli in the environment. Traditionally, studies investigating the neural basis of visual short term memory (STM) have focused on the role of prefrontal cortex (PFC) in exerting executive control over what information is stored and how it is adaptively used to guide behavior. However, the neural substrates that support the actual storage of content-specific information in STM are more controversial, with some attributing this function to PFC and others to the specialized areas of early visual cortex that initially encode incoming sensory stimuli. In contrast to these traditional views, I will review evidence suggesting that content-specific information can be flexibly maintained in areas across the cortical hierarchy ranging from early visual cortex to PFC. While the factors that determine exactly where content-specific information is represented are not yet entirely clear, recognizing the importance of task-demands and better understanding the operation of non-spiking neural codes may help to constrain new theories about how memories are maintained at different resolutions, across different timescales, and in the presence of distracting information. PMID:27668990

  8. Acquisition and improvement of human motor skills: Learning through observation and practice

    NASA Technical Reports Server (NTRS)

    Iba, Wayne

    1991-01-01

    Skilled movement is an integral part of the human existence. A better understanding of motor skills and their development is a prerequisite to the construction of truly flexible intelligent agents. We present MAEANDER, a computational model of human motor behavior, that uniformly addresses both the acquisition of skills through observation and the improvement of skills through practice. MAEANDER consists of a sensory-effector interface, a memory of movements, and a set of performance and learning mechanisms that let it recognize and generate motor skills. The system initially acquires such skills by observing movements performed by another agent and constructing a concept hierarchy. Given a stored motor skill in memory, MAEANDER will cause an effector to behave appropriately. All learning involves changing the hierarchical memory of skill concepts to more closely correspond to either observed experience or to desired behaviors. We evaluated MAEANDER empirically with respect to how well it acquires and improves both artificial movement types and handwritten script letters from the alphabet. We also evaluated MAEANDER as a psychological model by comparing its behavior to robust phenomena in humans and by considering the richness of the predictions it makes.

  9. Neural mechanisms of information storage in visual short-term memory.

    PubMed

    Serences, John T

    2016-11-01

    The capacity to briefly memorize fleeting sensory information supports visual search and behavioral interactions with relevant stimuli in the environment. Traditionally, studies investigating the neural basis of visual short term memory (STM) have focused on the role of prefrontal cortex (PFC) in exerting executive control over what information is stored and how it is adaptively used to guide behavior. However, the neural substrates that support the actual storage of content-specific information in STM are more controversial, with some attributing this function to PFC and others to the specialized areas of early visual cortex that initially encode incoming sensory stimuli. In contrast to these traditional views, I will review evidence suggesting that content-specific information can be flexibly maintained in areas across the cortical hierarchy ranging from early visual cortex to PFC. While the factors that determine exactly where content-specific information is represented are not yet entirely clear, recognizing the importance of task-demands and better understanding the operation of non-spiking neural codes may help to constrain new theories about how memories are maintained at different resolutions, across different timescales, and in the presence of distracting information. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. The influence of levels of processing on recall from working memory and delayed recall tasks.

    PubMed

    Loaiza, Vanessa M; McCabe, David P; Youngblood, Jessie L; Rose, Nathan S; Myerson, Joel

    2011-09-01

    Recent research in working memory has highlighted the similarities involved in retrieval from complex span tasks and episodic memory tasks, suggesting that these tasks are influenced by similar memory processes. In the present article, the authors manipulated the level of processing engaged when studying to-be-remembered words during a reading span task (Experiment 1) and an operation span task (Experiment 2) in order to assess the role of retrieval from secondary memory during complex span tasks. Immediate recall from both span tasks was greater for items studied under deep processing instructions compared with items studied under shallow processing instructions regardless of trial length. Recall was better for deep than for shallow levels of processing on delayed recall tests as well. These data are consistent with the primary-secondary memory framework, which suggests that to-be-remembered items are displaced from primary memory (i.e., the focus of attention) during the processing phases of complex span tasks and therefore must be retrieved from secondary memory. (c) 2011 APA, all rights reserved.

  11. Non-monotonic relationships between emotional arousal and memory for color and location.

    PubMed

    Boywitt, C Dennis

    2015-01-01

    Recent research points to the decreased diagnostic value of subjective retrieval experience for memory accuracy for emotional stimuli. While for neutral stimuli rich recollective experiences are associated with better context memory than merely familiar memories, this association appears questionable for emotional stimuli. The present research tested the implicit assumption that the effect of emotional arousal on memory is monotonic, that is, steadily increasing (or decreasing) with increasing arousal. In two experiments emotional arousal was manipulated in three steps using emotional pictures, and subjective retrieval experience as well as context memory were assessed. The results show an inverted U-shaped relationship between arousal and recognition memory, but for context memory and retrieval experience the relationship was more complex. For frame colour, context memory decreased linearly, while for spatial location it followed the inverted U-shaped function. The complex, non-monotonic relationships between arousal and memory are discussed as possible explanations for earlier divergent findings.

  12. An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity

    PubMed Central

    Whittington, James C. R.; Bogacz, Rafal

    2017-01-01

    To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of presynaptic and postsynaptic neurons. Several models have been proposed that approximate the backpropagation algorithm with local synaptic plasticity, but these models require complex external control over the network or relatively complex plasticity rules. Here we show that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity. Furthermore, for certain parameters, the weight change in the predictive coding model converges to that of the backpropagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of hierarchy are modified to minimize the error on the output. PMID:28333583
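The locality idea above can be illustrated with a deliberately tiny toy (an assumption-laden sketch, not the paper's full predictive coding network): a single linear layer learns y ≈ Wx using only quantities available at the synapse, the presynaptic activity x and the postsynaptic prediction error e.

```python
import numpy as np

# Single linear layer trained with a local, Hebbian-style update:
# each weight change is a product of presynaptic activity and the
# postsynaptic prediction error, with no non-local terms.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(3, 5))   # hypothetical target mapping
W = np.zeros((3, 5))
lr = 0.05

for _ in range(2000):
    x = rng.normal(size=5)
    y = W_true @ x                 # supervised feedback signal
    e = y - W @ x                  # local prediction error at the output
    W += lr * np.outer(e, x)       # Hebbian product: error (post) x activity (pre)

print(np.max(np.abs(W - W_true)) < 1e-3)  # True
```

In this one-layer linear case the local update coincides with the classical delta rule; the paper's contribution is that in multilayer predictive coding networks, analogous local error units let the weight changes approximate, and for certain parameters converge to, those of backpropagation.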

  13. An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity.

    PubMed

    Whittington, James C R; Bogacz, Rafal

    2017-05-01

    To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of presynaptic and postsynaptic neurons. Several models have been proposed that approximate the backpropagation algorithm with local synaptic plasticity, but these models require complex external control over the network or relatively complex plasticity rules. Here we show that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity. Furthermore, for certain parameters, the weight change in the predictive coding model converges to that of the backpropagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of hierarchy are modified to minimize the error on the output.

  14. Optimal causal inference: estimating stored information and approximating causal architecture.

    PubMed

    Still, Susanne; Crutchfield, James P; Ellison, Christopher J

    2010-09-01

    We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.

  15. Grammatical complexity for two-dimensional maps

    NASA Astrophysics Data System (ADS)

    Hagiwara, Ryouichi; Shudo, Akira

    2004-11-01

    We calculate the grammatical complexity of the symbol sequences generated from the Hénon map and the Lozi map using the recently developed methods to construct the pruning front. When the map is hyperbolic, the language of symbol sequences is regular in the sense of the Chomsky hierarchy and the corresponding grammatical complexity takes finite values. It is found that the complexity exhibits a self-similar structure as a function of the system parameter, and the similarity of the pruning fronts is discussed as an origin of such self-similarity. For non-hyperbolic cases, it is observed that the complexity monotonically increases as we increase the resolution of the pruning front.

  16. Panarchy

    USGS Publications Warehouse

    Garmestani, Ahjond S.; Allen, Craig R.; El-Shaarawi, Abdel H.; Piegorsch, Walter W.

    2012-01-01

    Panarchy is the term coined to describe hierarchical systems where control is not only top down, as typically considered, but also bottom up. A panarchy is composed of adaptive cycles, and an adaptive cycle describes the processes of development and decay in a system. Complex systems self-organize into hierarchies because this structure limits the possible spread of destructive phenomena (e.g., forest fires, epidemics) that could result in catastrophic system failure. Thus, hierarchical organization enhances the resilience of complex systems.

  17. Operating Systems.

    ERIC Educational Resources Information Center

    Denning, Peter J.; Brown, Robert L.

    1984-01-01

    A computer operating system spans multiple layers of complexity, from commands entered at a keyboard to the details of electronic switching. In addition, the system is organized as a hierarchy of abstractions. Various parts of such a system and system dynamics (using the Unix operating system as an example) are described. (JN)

  18. Chapter 13. Physiology and ecology of host defense against microbial invaders

    USDA-ARS?s Scientific Manuscript database

    Insects mount a complex hierarchy of defenses that pathogens must overcome before successful infection is achieved. Behavioral avoidance and antiseptic behaviors by host insects reduce the degree of encounters between the insect and pathogens. Any pathogen that contacts or establishes on a potentia...

  19. Adaptive simplification of complex multiscale systems.

    PubMed

    Chiavazzo, Eliodoro; Karlin, Ilya

    2011-03-01

    A fully adaptive methodology is developed for reducing the complexity of large dissipative systems. This represents a significant step toward extracting essential physical knowledge from complex systems, by addressing the challenging problem of determining the minimal number of variables needed to exactly capture the system dynamics. An accurate reduced description is achieved by constructing a hierarchy of slow invariant manifolds, with an embarrassingly simple implementation in any dimension. The method is validated with the autoignition of the hydrogen-air mixture, where a reduction to a cascade of slow invariant manifolds is observed.

  20. Relation of Three Mechanisms of Working Memory to Children's Complex Span Performance

    ERIC Educational Resources Information Center

    Magimairaj, Beula; Montgomery, James; Marinellie, Sally; McCarthy, John

    2009-01-01

    There is a paucity of research examining the relative contribution of the different mechanisms of working memory (short-term storage [STM], processing speed) to children's complex memory span. This study served to replicate and extend the few extant studies that have examined the issue. In this study, the relative contribution of three mechanisms…

  1. Social networks as embedded complex adaptive systems.

    PubMed

    Benham-Hutchins, Marge; Clancy, Thomas R

    2010-09-01

    As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 15th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, the authors discuss healthcare social networks as a hierarchy of embedded complex adaptive systems. The authors further examine the use of social network analysis tools as a means to understand complex communication patterns and reduce medical errors.

  2. A concept taxonomy and an instrument hierarchy: tools for establishing and evaluating the conceptual framework of a patient-reported outcome (PRO) instrument as applied to product labeling claims.

    PubMed

    Erickson, Pennifer; Willke, Richard; Burke, Laurie

    2009-01-01

    To facilitate development and evaluation of a PRO instrument conceptual framework, we propose two tools--a PRO concept taxonomy and a PRO instrument hierarchy. FDA's draft guidance on patient-reported outcome (PRO) measures states that a clear description of the conceptual framework of an instrument is useful for evaluating its adequacy to support a treatment benefit claim for use in product labeling. The draft guidance, however, does not propose tools for establishing or evaluating a PRO instrument's conceptual framework. We draw from our review of PRO concepts and instruments that appear in prescription drug labeling approved in the United States from 1997 to 2007. We propose taxonomy terms that define relationships between PRO concepts, including "family," "compound concept," and "singular concept." Based on the range of complexity represented by the concepts, as defined by the taxonomy, we propose nine instrument orders for PRO measurement. The nine orders range from individual event counts to multi-item, multiscale instruments. This analysis of PRO concepts and instruments illustrates that the taxonomy and hierarchy are applicable to PRO concepts across a wide range of therapeutic areas and provide a basis for defining instrument conceptual framework complexity. Although the utility of these tools in the drug development, review, and approval processes has not yet been demonstrated, they could improve communication and enhance efficiency in the instrument development and review process.

  3. Evaluation of the Majorana phases of a general Majorana neutrino mass matrix: Testability of hierarchical flavour models

    NASA Astrophysics Data System (ADS)

    Samanta, Rome; Chakraborty, Mainak; Ghosal, Ambar

    2016-03-01

    We evaluate the Majorana phases for a general 3 × 3 complex symmetric neutrino mass matrix on the basis of Mohapatra-Rodejohann's phase convention, using the three rephasing invariant quantities I12, I13 and I23 proposed by Sarkar and Singh. We find them interesting as they allow us to evaluate each Majorana phase in a model-independent way even if one eigenvalue is zero. Utilizing the solution of a general complex symmetric mass matrix for eigenvalues and mixing angles, we determine the Majorana phases for both hierarchies, normal and inverted, taking into account the constraints from neutrino oscillation global fit data as well as the bound on the sum of the three light neutrino masses (Σimi) and the neutrinoless double beta decay (ββ0ν) parameter |m11|. This methodology for finding the Majorana phases is then applied in some predictive models for both hierarchical cases (normal and inverted) to evaluate the corresponding Majorana phases. It is shown that all the sub-cases presented in the inverted hierarchy section can be realized in a model with texture zeros and scaling ansatz within the framework of inverse seesaw, although one of the sub-cases following the normal hierarchy is yet to be established. Except for the case of quasi-degenerate neutrinos, the methodology obtained in this work is able to evaluate the corresponding Majorana phases, given any model of neutrino masses.

  4. Photo-reactive charge trapping memory based on lanthanide complex.

    PubMed

    Zhuang, Jiaqing; Lo, Wai-Sum; Zhou, Li; Sun, Qi-Jun; Chan, Chi-Fai; Zhou, Ye; Han, Su-Ting; Yan, Yan; Wong, Wing-Tak; Wong, Ka-Leung; Roy, V A L

    2015-10-09

    Photo-induced excitons are widely exploited in photovoltaic devices and photodetectors, and efforts to broaden their applications continue. However, few reports exist on organic field effect transistor (OFET) memory employing photo-induced charges. Here, we demonstrate an OFET memory containing a novel organic lanthanide complex Eu(tta)3ppta (Eu(tta)3 = Europium(III) thenoyltrifluoroacetonate, ppta = 2-phenyl-4,6-bis(pyrazol-1-yl)-1,3,5-triazine), in which the photo-induced charges can be successfully trapped and detrapped. The luminescent complex emits intense red emission upon ultraviolet (UV) light excitation and serves as a trapping element for holes injected from the pentacene semiconductor layer. The memory window can be significantly enlarged by light-assisted programming and erasing procedures, during which the photo-induced excitons in the semiconductor layer are separated by voltage bias. The enhancement of the memory window is attributed to the increased number of photo-induced excitons under UV light. The charges are stored in this luminescent complex for at least 10^4 s after withdrawing the voltage bias. The present study on photo-assisted memory may motivate research on a new type of light-tunable charge-trapping photo-reactive memory devices.

  5. Photo-reactive charge trapping memory based on lanthanide complex

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiaqing; Lo, Wai-Sum; Zhou, Li; Sun, Qi-Jun; Chan, Chi-Fai; Zhou, Ye; Han, Su-Ting; Yan, Yan; Wong, Wing-Tak; Wong, Ka-Leung; Roy, V. A. L.

    2015-10-01

    Photo-induced excitons are widely exploited in photovoltaic devices and photodetectors, and efforts to broaden their applications continue. However, few reports exist on organic field effect transistor (OFET) memory employing photo-induced charges. Here, we demonstrate an OFET memory containing a novel organic lanthanide complex Eu(tta)3ppta (Eu(tta)3 = Europium(III) thenoyltrifluoroacetonate, ppta = 2-phenyl-4,6-bis(pyrazol-1-yl)-1,3,5-triazine), in which the photo-induced charges can be successfully trapped and detrapped. The luminescent complex emits intense red emission upon ultraviolet (UV) light excitation and serves as a trapping element for holes injected from the pentacene semiconductor layer. The memory window can be significantly enlarged by light-assisted programming and erasing procedures, during which the photo-induced excitons in the semiconductor layer are separated by voltage bias. The enhancement of the memory window is attributed to the increased number of photo-induced excitons under UV light. The charges are stored in this luminescent complex for at least 10^4 s after withdrawing the voltage bias. The present study on photo-assisted memory may motivate research on a new type of light-tunable charge-trapping photo-reactive memory devices.

  6. Human area MT+ shows load-dependent activation during working memory maintenance with continuously morphing stimulation.

    PubMed

    Galashan, Daniela; Fehr, Thorsten; Kreiter, Andreas K; Herrmann, Manfred

    2014-07-11

    Initially, human area MT+ was considered a visual area solely processing motion information, but further research has shown that it is also involved in various cognitive operations, such as working memory tasks requiring motion-related information to be maintained or cognitive tasks with implied or expected motion. In the present fMRI study in humans, we focused on MT+ modulation during working memory maintenance using a dynamic shape-tracking working memory task with no motion-related working memory content. Working memory load was systematically varied using complex and simple stimulus material and parametrically increasing retention periods. Activation patterns for the difference between retention of complex and simple memorized stimuli were examined in order to preclude that the reported effects are caused by differences in retrieval. Conjunction analysis over all delay durations for the maintenance of complex versus simple stimuli demonstrated a widespread activation pattern. Percent signal change (PSC) in area MT+ revealed a pattern with higher values for the maintenance of complex shapes compared to the retention of a simple circle, and with higher values for increasing delay durations. The present data extend previous knowledge by demonstrating that visual area MT+ presents a brain activity pattern usually found in brain regions that are actively involved in working memory maintenance.

  7. Brain and effort: brain activation and effort-related working memory in healthy participants and patients with working memory deficits.

    PubMed

    Engström, Maria; Landtblom, Anne-Marie; Karlsson, Thomas

    2013-01-01

    Despite the interest in the neuroimaging of working memory, little is still known about the neurobiology of complex working memory in tasks that require simultaneous manipulation and storage of information. In addition to the central executive network, we assumed that the recently described salience network [involving the anterior insular cortex (AIC) and the anterior cingulate cortex (ACC)] might be of particular importance to working memory tasks that require complex, effortful processing. Healthy participants (n = 26) and participants suffering from working memory problems related to the Kleine-Levin syndrome (KLS) (a specific form of periodic idiopathic hypersomnia; n = 18) participated in the study. Participants were further divided into a high- and low-capacity group, according to performance on a working memory task (listening span). In a functional magnetic resonance imaging (fMRI) study, participants were administered the reading span complex working memory task tapping cognitive effort. The fMRI-derived blood oxygen level dependent (BOLD) signal was modulated by (1) effort in both the central executive and the salience network and (2) capacity in the salience network in that high performers evidenced a weaker BOLD signal than low performers. In the salience network there was a dichotomy between the left and the right hemisphere; the right hemisphere elicited a steeper increase of the BOLD signal as a function of increasing effort. There was also a stronger functional connectivity within the central executive network because of increased task difficulty. The ability to allocate cognitive effort in complex working memory is contingent upon focused resources in the executive and in particular the salience network. Individual capacity during the complex working memory task is related to activity in the salience (but not the executive) network so that high-capacity participants evidence a lower signal and possibly hence a larger dynamic response.

  8. Brain and effort: brain activation and effort-related working memory in healthy participants and patients with working memory deficits

    PubMed Central

    Engström, Maria; Landtblom, Anne-Marie; Karlsson, Thomas

    2013-01-01

    Despite the interest in the neuroimaging of working memory, little is still known about the neurobiology of complex working memory in tasks that require simultaneous manipulation and storage of information. In addition to the central executive network, we assumed that the recently described salience network [involving the anterior insular cortex (AIC) and the anterior cingulate cortex (ACC)] might be of particular importance to working memory tasks that require complex, effortful processing. Method: Healthy participants (n = 26) and participants suffering from working memory problems related to the Kleine–Levin syndrome (KLS) (a specific form of periodic idiopathic hypersomnia; n = 18) participated in the study. Participants were further divided into a high- and low-capacity group, according to performance on a working memory task (listening span). In a functional magnetic resonance imaging (fMRI) study, participants were administered the reading span complex working memory task tapping cognitive effort. Principal findings: The fMRI-derived blood oxygen level dependent (BOLD) signal was modulated by (1) effort in both the central executive and the salience network and (2) capacity in the salience network in that high performers evidenced a weaker BOLD signal than low performers. In the salience network there was a dichotomy between the left and the right hemisphere; the right hemisphere elicited a steeper increase of the BOLD signal as a function of increasing effort. There was also a stronger functional connectivity within the central executive network because of increased task difficulty. Conclusion: The ability to allocate cognitive effort in complex working memory is contingent upon focused resources in the executive and in particular the salience network. Individual capacity during the complex working memory task is related to activity in the salience (but not the executive) network so that high-capacity participants evidence a lower signal and possibly hence a larger dynamic response. PMID:23616756

  9. Sex-specific mechanism of social hierarchy in mice.

    PubMed

    van den Berg, Wouter E; Lamballais, Sander; Kushner, Steven A

    2015-05-01

    The establishment of social hierarchies is a naturally occurring, evolutionarily conserved phenomenon with a well-established impact on fitness and health. Investigations of complex social group dynamics may offer novel opportunities for translational studies of autism spectrum disorder. Here we describe a robust behavioral paradigm using an automated version of the tube test. Isogenic groups of male and female mice establish linear social hierarchies that remain highly stable for at least 14 days, the longest interval tested. Remarkably, however, their social strategy is sex-specific: females primarily utilize intrinsic attributes, whereas males are strongly influenced by prior social experience. Using both genetic and pharmacological manipulations, we identify testosterone as a critical sex-specific factor for determining which social strategy is used. Males inheriting a null mutation of the sex-determining region Y (Sry) gene used a similar social cognitive strategy as females. In contrast, females with transgenic expression of Sry utilized a typically male social strategy. Analogously, castration of males and testosterone supplementation of females yielded similar outcomes, with a reversal of their social cognitive strategy. Together, our results demonstrate a sex-specific mechanism underlying social hierarchy, in which both males and females retain the functional capacity to adapt their social strategy. More generally, we expect the automated tube test to provide an important complementary approach for both fundamental and translational studies of social behavior.

  10. View-Tolerant Face Recognition and Hebbian Learning Imply Mirror-Symmetric Neural Tuning to Head Orientation.

    PubMed

    Leibo, Joel Z; Liao, Qianli; Anselmi, Fabio; Freiwald, Winrich A; Poggio, Tomaso

    2017-01-09

    The primate brain contains a hierarchy of visual areas, dubbed the ventral stream, which rapidly computes object representations that are both specific for object identity and robust against identity-preserving transformations, like depth rotations [1, 2]. Current computational models of object recognition, including recent deep-learning networks, generate these properties through a hierarchy of alternating selectivity-increasing filtering and tolerance-increasing pooling operations, similar to simple-complex cells operations [3-6]. Here, we prove that a class of hierarchical architectures and a broad set of biologically plausible learning rules generate approximate invariance to identity-preserving transformations at the top level of the processing hierarchy. However, all past models tested failed to reproduce the most salient property of an intermediate representation of a three-level face-processing hierarchy in the brain: mirror-symmetric tuning to head orientation [7]. Here, we demonstrate that one specific biologically plausible Hebb-type learning rule generates mirror-symmetric tuning to bilaterally symmetric stimuli, like faces, at intermediate levels of the architecture and show why it does so. Thus, the tuning properties of individual cells inside the visual stream appear to result from group properties of the stimuli they encode and to reflect the learning rules that sculpted the information-processing system within which they reside.

  11. Toward a Neurobiology of Delusions

    PubMed Central

    Corlett, P.R.; Taylor, J.R.; Wang, X.-J.; Fletcher, P.C.; Krystal, J.H.

    2013-01-01

    Delusions are the false and often incorrigible beliefs that can cause severe suffering in mental illness. We cannot yet explain them in terms of underlying neurobiological abnormalities. However, by drawing on recent advances in the biological, computational and psychological processes of reinforcement learning, memory, and perception it may be feasible to account for delusions in terms of cognition and brain function. The account focuses on a particular parameter, prediction error – the mismatch between expectation and experience – that provides a computational mechanism common to cortical hierarchies, frontostriatal circuits and the amygdala as well as parietal cortices. We suggest that delusions result from aberrations in how brain circuits specify hierarchical predictions, and how they compute and respond to prediction errors. Defects in these fundamental brain mechanisms can vitiate perception, memory, bodily agency and social learning such that individuals with delusions experience an internal and external world that healthy individuals would find difficult to comprehend. The present model attempts to provide a framework through which we can build a mechanistic and translational understanding of these puzzling symptoms. PMID:20558235

  12. The central role of recognition in auditory perception: a neurobiological model.

    PubMed

    McLachlan, Neil; Wilson, Sarah

    2010-01-01

    The model presents neurobiologically plausible accounts of sound recognition (including absolute pitch), neural plasticity involved in pitch, loudness and location information integration, and streaming and auditory recall. It is proposed that a cortical mechanism for sound identification modulates the spectrotemporal response fields of inferior colliculus neurons and regulates the encoding of the echoic trace in the thalamus. Identification involves correlation of sequential spectral slices of the stimulus-driven neural activity with stored representations in association with multimodal memories, verbal lexicons, and contextual information. Identities are then consolidated in auditory short-term memory and bound with attribute information (usually pitch, loudness, and direction) that has been integrated according to the identities' spectral properties. Attention to, or recall of, a particular identity will excite a particular sequence in the identification hierarchies and so lead to modulation of thalamus and inferior colliculus neural spectrotemporal response fields. This operates as an adaptive filter for identities, or their attributes, and explains many puzzling human auditory behaviors, such as the cocktail party effect, selective attention, and continuity illusions.

  13. Naturalistic Assessment of Everyday Activities and Prompting Technologies in Mild Cognitive Impairment

    PubMed Central

    Seelye, Adriana M.; Schmitter-Edgecombe, Maureen; Cook, Diane J.; Crandall, Aaron

    2014-01-01

    Older adults with mild cognitive impairment (MCI) often have difficulty performing complex instrumental activities of daily living (IADLs), which are critical to independent living. In this study, amnestic multi-domain MCI (N = 29), amnestic single-domain MCI (N = 18), and healthy older participants (N = 47) completed eight scripted IADLs (e.g., cook oatmeal on the stove) in a smart apartment testbed. We developed and experimented with a graded hierarchy of technology-based prompts to investigate both the amount of prompting and type of prompts required to assist individuals with MCI in completing the activities. When task errors occurred, progressive levels of assistance were provided, starting with the lowest level needed to adjust performance. Results showed that the multi-domain MCI group made more errors and required more prompts than the single-domain MCI and healthy older adult groups. Similar to the other two groups, the multi-domain MCI group responded well to the indirect prompts and did not need a higher level of prompting to get back on track successfully with the tasks. Need for prompting assistance was best predicted by verbal memory abilities in multi-domain amnestic MCI. Participants across groups indicated that they perceived the prompting technology to be very helpful. PMID:23351284

  14. A parallel finite element procedure for contact-impact problems using edge-based smooth triangular element and GPU

    NASA Astrophysics Data System (ADS)

    Cai, Yong; Cui, Xiangyang; Li, Guangyao; Liu, Wenyang

    2018-04-01

    The edge-based smoothed finite element method (ES-FEM) can improve the computational accuracy of triangular shell elements and the mesh partition efficiency of complex models. In this paper, an approach is developed to perform explicit finite element simulations of contact-impact problems with a graphical processing unit (GPU) using a special edge-based smoothed triangular shell element based on ES-FEM. Of critical importance for this problem is achieving finer-grained parallelism to enable efficient data loading and to minimize communication between the device and host. Four kinds of parallel strategies are then developed to efficiently solve these ES-FEM based shell element formulas, and various optimization methods are adopted to ensure aligned memory access. Special focus is dedicated to developing an approach for the parallel construction of edge systems. A parallel hierarchy-territory contact-searching algorithm (HITA) and a parallel penalty function calculation method are embedded in this parallel explicit algorithm. Finally, the program flow is well designed, and a GPU-based simulation system is developed, using Nvidia's CUDA. Several numerical examples are presented to illustrate the high quality of the results obtained with the proposed methods. In addition, the GPU-based parallel computation is shown to significantly reduce the computing time.

  15. The visual orientation memory of Drosophila requires Foraging (PKG) upstream of Ignorant (RSK2) in ring neurons of the central complex

    PubMed Central

    Kuntz, Sara; Poeck, Burkhard; Sokolowski, Marla B.; Strauss, Roland

    2012-01-01

    Orientation and navigation in a complex environment requires path planning and recall to exert goal-driven behavior. Walking Drosophila flies possess a visual orientation memory for attractive targets which is localized in the central complex of the adult brain. Here we show that this type of working memory requires the cGMP-dependent protein kinase encoded by the foraging gene in just one type of ellipsoid-body ring neurons. Moreover, genetic and epistatic interaction studies provide evidence that Foraging functions upstream of the Ignorant Ribosomal-S6 Kinase 2, thus revealing a novel neuronal signaling pathway necessary for this type of memory in Drosophila. PMID:22815538

  16. A multi-criteria decisionmaking approach to management indicator species selection for the Monongahela National Forest, West Virginia.

    Treesearch

    Kurtis R. Moseley; W.Mark Ford; John W. Edwards; Michael P. Strager

    2010-01-01

    The management indicator species concept is useful for land managers charged with monitoring and conserving complex biological diversity over large landscapes with limited available resources. We used the analytical hierarchy process (AHP) to determine the best management indicator species (MIS) for three...
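
    As a rough illustration of the analytic hierarchy process (AHP) used in this kind of study, the sketch below derives priority weights from a pairwise comparison matrix via power iteration and checks Saaty's consistency ratio. The criteria and judgments are invented for illustration; they are not taken from the Monongahela study.

```python
# Hedged AHP sketch: priorities = principal (Perron) eigenvector of a
# positive pairwise comparison matrix, computed by power iteration.

def ahp_weights(A, iters=200):
    """Normalized principal-eigenvector priorities of matrix A."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

def lambda_max(A, w):
    """Rayleigh-style estimate of the principal eigenvalue."""
    n = len(A)
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    return sum(Aw[i] / w[i] for i in range(n)) / n

def consistency_ratio(A):
    """Saaty's consistency ratio; CR < 0.1 is conventionally acceptable."""
    n = len(A)
    w = ahp_weights(A)
    ci = (lambda_max(A, w) - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
    return ci / ri

# Hypothetical judgments for three invented criteria, e.g.
# habitat specificity vs. monitoring cost vs. disturbance sensitivity.
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
weights = ahp_weights(A)
```

    Candidate species would then be scored against each criterion and ranked by the weighted sum, which is how AHP turns pairwise expert judgments into a selection.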

  17. Hierarchical Task Analysis and Training Decisions.

    ERIC Educational Resources Information Center

    Shepherd, A.

    1985-01-01

    Hierarchical task analysis (HTA), which requires description of a task in terms of a hierarchy of operations and plans, is reviewed and examined as a basis for making training decisions. Benefits of HTA in terms of economy of analysis and as a means of accounting for complex performance are outlined. (Author/MBR)

  18. Hierarchy of Interactive Functions in Father-Mother-Baby Three-Way Games

    ERIC Educational Resources Information Center

    Frascarolo, France; Favez, Nicolas; Carneiro, Claudio; Fivaz-Depeursinge, Elisabeth

    2004-01-01

    In developmental research, the family has mainly been studied through dyadic interaction. Three-way interactions have received less attention, partly because of their complexity. This difficulty may be overcome by distinguishing between four hierarchically embedded functions in three-way interactions: (1) participation (inclusion of all…

  19. Assessment of Language Comprehension of 6-Year-Old Deaf Children.

    ERIC Educational Resources Information Center

    Geffner, Donna S.; Freeman, Lisa Rothman

    1980-01-01

    Results show that comprehension of word types (nouns, verbs, etc.) and linguistic structure can be orderly, producing a hierarchy of complexity similar to that found in normally hearing children. However, performance was about three years behind that of normally hearing peers. Journal availability: Elsevier North Holland, Inc., 52 Vanderbilt…

  20. Must "Hard Problems" Be Hard?

    ERIC Educational Resources Information Center

    Kolata, Gina

    1985-01-01

    To determine how hard it is for computers to solve problems, researchers have classified groups of problems (polynomial hierarchy) according to how much time they seem to require for their solutions. A difficult and complex proof is offered which shows that a combinatorial approach (using Boolean circuits) may resolve the problem. (JN)

  1. A spatial classification and database for management, research, and policy making: The Great Lakes aquatic habitat framework

    EPA Science Inventory

    Managing the world’s largest and most complex freshwater ecosystem, the Laurentian Great Lakes, requires a spatially hierarchical basin-wide database of ecological and socioeconomic information that is comparable across the region. To meet such a need, we developed a hierarchi...

  2. Children's Play and Culture Learning in an Egalitarian Foraging Society

    ERIC Educational Resources Information Center

    Boyette, Adam H.

    2016-01-01

    Few systematic studies of play in foragers exist despite their significance for understanding the breadth of contexts for human development and the ontogeny of cultural learning. Forager societies lack complex social hierarchies, avenues for prestige or wealth accumulation, and formal educational institutions, and thereby represent a contrast to…

  3. Decoherence and thermalization of a pure quantum state in quantum field theory.

    PubMed

    Giraud, Alexandre; Serreau, Julien

    2010-06-11

    We study the real-time evolution of a self-interacting O(N) scalar field initially prepared in a pure, coherent quantum state. We present a complete solution of the nonequilibrium quantum dynamics from a 1/N expansion of the two-particle-irreducible effective action at next-to-leading order, which includes scattering and memory effects. We demonstrate that, restricting one's attention (or ability to measure) to a subset of the infinite hierarchy of correlation functions, one observes an effective loss of purity or coherence and, on longer time scales, thermalization. We point out that the physics of decoherence is well described by classical statistical field theory.

  4. [Sensory integration: hierarchy and synchronization].

    PubMed

    Kriukov, V I

    2005-01-01

    This is the first in a series of mini-reviews devoted to the basic problems and most important effects of attention in terms of neuronal modeling. We believe that the absence of a unified view of the wealth of new data on attention is the main obstacle to further understanding of higher nervous activity. The present work deals with the fundamental problem of reconciling two competing architectures designed to integrate sensory information in the brain. The other mini-reviews will be concerned with the remaining five or six problems of attention, all of them ultimately to be resolved uniformly within the framework of a small modification of the dominant model of attention and memory.

  5. Using CART to Identify Thresholds and Hierarchies in the Determinants of Funding Decisions.

    PubMed

    Schilling, Chris; Mortimer, Duncan; Dalziel, Kim

    2017-02-01

    There is much interest in understanding decision-making processes that determine funding outcomes for health interventions. We use classification and regression trees (CART) to identify cost-effectiveness thresholds and hierarchies in the determinants of funding decisions. The hierarchical structure of CART is suited to analyzing complex conditional and nonlinear relationships. Our analysis uncovered hierarchies where interventions were grouped according to their type and objective. Cost-effectiveness thresholds varied markedly depending on which group the intervention belonged to: lifestyle-type interventions with a prevention objective had an incremental cost-effectiveness threshold of $2356, suggesting that such interventions need to be close to cost saving or dominant to be funded. For lifestyle-type interventions with a treatment objective, the threshold was much higher at $37,024. Lower down the tree, intervention attributes such as the level of patient contribution and the eligibility for government reimbursement influenced the likelihood of funding within groups of similar interventions. Comparison between our CART models and previously published results demonstrated concurrence with standard regression techniques while providing additional insights regarding the role of the funding environment and the structure of decision-maker preferences.
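
    The threshold-finding step of CART can be sketched with a single split on one feature: choose the cutpoint on the incremental cost-effectiveness ratio (ICER) that minimizes weighted Gini impurity of the funded/rejected labels. The ICER values and decisions below are synthetic, invented for illustration, and unrelated to the study's data.

```python
# Minimal CART-style threshold search on one feature.
# Synthetic data only: cheap interventions funded, expensive ones rejected.

def gini(labels):
    """Gini impurity of a list of 0/1 funding decisions."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(icers, funded):
    """Return the ICER threshold minimizing weighted Gini impurity,
    trying midpoints between consecutive sorted ICER values."""
    pairs = sorted(zip(icers, funded))
    best_t, best_score = None, float('inf')
    for i in range(1, len(pairs)):
        t = (pairs[i - 1][0] + pairs[i][0]) / 2.0
        left = [f for c, f in pairs if c <= t]
        right = [f for c, f in pairs if c > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best_score:
            best_t, best_score = t, score
    return best_t

icers = [1000, 2000, 3000, 30000, 40000, 50000]
funded = [1, 1, 1, 0, 0, 0]
threshold = best_split(icers, funded)  # midpoint between 3000 and 30000
```

    A full CART model simply applies this search recursively, also trying splits on categorical attributes (intervention type, objective), which is what produces the hierarchies of thresholds described above.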

  6. Cognitive Load Theory: A Broader View on the Role of Memory in Learning and Education

    ERIC Educational Resources Information Center

    Paas, Fred; Ayres, Paul

    2014-01-01

    According to cognitive load theory (CLT), the limitations of working memory (WM) in the learning of new tasks together with its ability to cooperate with an unlimited long-term memory (LTM) for familiar tasks enable human beings to deal effectively with complex problems and acquire highly complex knowledge and skills. With regard to WM, CLT has…

  7. Memory-induced nonlinear dynamics of excitation in cardiac diseases.

    PubMed

    Landaw, Julian; Qu, Zhilin

    2018-04-01

    Excitable cells, such as cardiac myocytes, exhibit short-term memory, i.e., the state of the cell depends on its history of excitation. Memory can originate from slow recovery of membrane ion channels or from accumulation of intracellular ion concentrations, such as calcium ion or sodium ion concentration accumulation. Here we examine the effects of memory on excitation dynamics in cardiac myocytes under two diseased conditions, early repolarization and reduced repolarization reserve, each with memory from two different sources: slow recovery of a potassium ion channel and slow accumulation of the intracellular calcium ion concentration. We first carry out computer simulations of action potential models described by differential equations to demonstrate complex excitation dynamics, such as chaos. We then develop iterated map models that incorporate memory, which accurately capture the complex excitation dynamics and bifurcations of the action potential models. Finally, we carry out theoretical analyses of the iterated map models to reveal the underlying mechanisms of memory-induced nonlinear dynamics. Our study demonstrates that the memory effect can be unmasked or greatly exacerbated under certain diseased conditions, which promotes complex excitation dynamics, such as chaos. The iterated map models reveal that memory converts a monotonic iterated map function into a nonmonotonic one to promote the bifurcations leading to high periodicity and chaos.

  8. Memory-induced nonlinear dynamics of excitation in cardiac diseases

    NASA Astrophysics Data System (ADS)

    Landaw, Julian; Qu, Zhilin

    2018-04-01

    Excitable cells, such as cardiac myocytes, exhibit short-term memory, i.e., the state of the cell depends on its history of excitation. Memory can originate from slow recovery of membrane ion channels or from accumulation of intracellular ion concentrations, such as calcium ion or sodium ion concentration accumulation. Here we examine the effects of memory on excitation dynamics in cardiac myocytes under two diseased conditions, early repolarization and reduced repolarization reserve, each with memory from two different sources: slow recovery of a potassium ion channel and slow accumulation of the intracellular calcium ion concentration. We first carry out computer simulations of action potential models described by differential equations to demonstrate complex excitation dynamics, such as chaos. We then develop iterated map models that incorporate memory, which accurately capture the complex excitation dynamics and bifurcations of the action potential models. Finally, we carry out theoretical analyses of the iterated map models to reveal the underlying mechanisms of memory-induced nonlinear dynamics. Our study demonstrates that the memory effect can be unmasked or greatly exacerbated under certain diseased conditions, which promotes complex excitation dynamics, such as chaos. The iterated map models reveal that memory converts a monotonic iterated map function into a nonmonotonic one to promote the bifurcations leading to high periodicity and chaos.
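    The flavor of an iterated map with memory can be conveyed by a toy two-variable beat-to-beat map: a restitution relation for action potential duration (APD) plus a slowly decaying memory variable that shortens the next APD. This is an illustrative sketch, not the authors' model; `iterate_apd` and all parameter values are assumptions.

```python
import math

def iterate_apd(bcl, beats=200, a0=200.0, m0=0.0,
                a_max=300.0, tau=60.0, tau_m=500.0, gamma=0.3):
    """Iterate a toy APD map with memory. a is the action potential
    duration, m a slow memory variable that accumulates during the
    action potential and decays during diastole; bcl is the pacing
    cycle length. All parameter values are illustrative."""
    a, m = a0, m0
    trace = []
    for _ in range(beats):
        d = max(bcl - a, 1.0)  # diastolic interval
        # memory builds up while the cell is depolarized, decays while at rest
        m = math.exp(-d / tau_m) * (1.0 - (1.0 - m) * math.exp(-a / tau_m))
        # exponential restitution curve, scaled down by accumulated memory
        a = (1.0 - gamma * m) * a_max * (1.0 - math.exp(-d / tau))
        trace.append(a)
    return trace
```

    At long cycle lengths the memory variable settles and the map converges to a fixed point; shortening `bcl` steepens the effective map and is the route to the period-doubling bifurcations and chaos analyzed in the paper.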

  9. CARL: a LabVIEW 3 computer program for conducting exposure therapy for the treatment of dental injection fear.

    PubMed

    Coldwell, S E; Getz, T; Milgrom, P; Prall, C W; Spadafora, A; Ramsay, D S

    1998-04-01

    This paper describes CARL (Computer Assisted Relaxation Learning), a computerized, exposure-based therapy program for the treatment of dental injection fear. The CARL program operates primarily in two modes: in vitro, which presents a videotaped exposure hierarchy, and in vivo, which presents scripts for a dentist or hygienist to use while working with a subject. Two additional modes are used to train subjects to use the program and to administer behavioral assessment tests. The program contains five different modules, which function to register a subject, train subjects to use physical and cognitive relaxation techniques, deliver an exposure hierarchy, question subjects about the helpfulness of each of the therapy components, and test for memory effects of anxiolytic medication. Nine subjects have completed the CARL therapy program and 1-yr follow-up as participants in a placebo-controlled clinical trial examining the effects of alprazolam on exposure therapy for dental injection phobia. All nine subjects were able to receive two dental injections, and all reduced their general fear of dental injections. Initial results therefore indicate that the CARL program successfully reduces dental injection fear.

  10. The Basis of the Syllable Hierarchy: Articulatory Pressures or Universal Phonological Constraints?

    PubMed

    Zhao, Xu; Berent, Iris

    2018-02-01

    Across languages, certain syllable types are systematically preferred to others (e.g., [Formula: see text] lbif, where [Formula: see text] indicates a preference). Previous research has shown that these preferences are active in the brains of individual speakers, they are evident even when none of these syllable types exists in participants' language, and even when the stimuli are presented in print. These results suggest that the syllable hierarchy cannot be reduced to either lexical or auditory/phonetic pressures. Here, we examine whether the syllable hierarchy is due to articulatory pressures. According to the motor embodiment view, the perception of a linguistic stimulus requires simulating its production; dispreferred syllables (e.g., lbif) are universally disliked because their production is harder to simulate. To address this possibility, we assessed syllable preferences while articulation was mechanically suppressed. Our four experiments each found significant effects of suppression. Remarkably, people remained sensitive to the syllable hierarchy regardless of suppression. Specifically, results with auditory materials (Experiments 1-2) showed strong effects of syllable structure irrespective of suppression. Moreover, syllable structure uniquely accounted for listeners' behavior even when controlling for several phonetic characteristics of our auditory materials. Results with printed stimuli (Experiments 3-4) were more complex, as participants in these experiments relied on both phonological and graphemic information. Nonetheless, readers were sensitive to most of the syllable hierarchy (e.g., [Formula: see text]), and these preferences emerged when articulation was suppressed, and even when the statistical properties of our materials were controlled via a regression analysis. Together, these findings indicate that speakers possess broad grammatical preferences that are irreducible to either sensory or motor factors.

  11. Fast analysis of molecular dynamics trajectories with graphics processing units-Radial distribution function histogramming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, Benjamin G., E-mail: ben.levine@temple.ed; Stone, John E., E-mail: johns@ks.uiuc.ed; Kohlmeyer, Axel, E-mail: akohlmey@temple.ed

    2011-05-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.

  12. Fast Analysis of Molecular Dynamics Trajectories with Graphics Processing Units—Radial Distribution Function Histogramming

    PubMed Central

    Stone, John E.; Kohlmeyer, Axel

    2011-01-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU’s memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 seconds per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis. PMID:21547007
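    The rate-limiting histogramming step is easy to state serially; the GPU versions described above tile this double loop through shared memory and use atomic increments on the bins. A minimal serial reference sketch (`rdf_histogram` is a hypothetical name, not VMD's API):

```python
import itertools
import math

def rdf_histogram(points, r_max, n_bins):
    """Bin all pairwise distances below r_max into n_bins equal-width
    bins -- the serial form of the rate-limiting step of an RDF
    calculation. points is a list of (x, y, z) tuples."""
    counts = [0] * n_bins
    bin_width = r_max / n_bins
    for p, q in itertools.combinations(points, 2):
        r = math.dist(p, q)
        if r < r_max:
            counts[int(r / bin_width)] += 1  # one histogram increment per pair
    return counts
```

    The pair loop is O(n^2) in the number of atoms, which is why reusing tiles of coordinates in fast on-chip memory, rather than re-reading them from DRAM, dominates the achievable performance.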

  13. Hierarchy of forward-backward stochastic Schrödinger equation

    NASA Astrophysics Data System (ADS)

    Ke, Yaling; Zhao, Yi

    2016-07-01

    Driven by the impetus to simulate quantum dynamics in photosynthetic complexes or even larger molecular aggregates, we have established a hierarchy of forward-backward stochastic Schrödinger equation in the light of stochastic unravelling of the symmetric part of the influence functional in the path-integral formalism of reduced density operator. The method is numerically exact and is suited for Debye-Drude spectral density, Ohmic spectral density with an algebraic or exponential cutoff, as well as discrete vibrational modes. The power of this method is verified by performing the calculations of time-dependent population differences in the valuable spin-boson model from zero to high temperatures. By simulating excitation energy transfer dynamics of the realistic full FMO trimer, some important features are revealed.

  14. Image space subdivision for fast ray tracing

    NASA Astrophysics Data System (ADS)

    Yu, Billy T.; Yu, William W.

    1999-09-01

    Ray tracing is notorious for its computational requirements, and a number of techniques have been developed to speed up the process. A well-known statistic indicates that ray-object intersections occupy over 95% of the total image generation time, so it is most beneficial to attack this bottleneck. Ray-object intersection reduction techniques fall into three major categories: bounding volume hierarchies, space subdivision, and directional subdivision. This paper introduces a technique in the third category. To further speed up the process, it takes advantage of hierarchy by adopting an MX-CIF quadtree in the image space. This special kind of quadtree provides simple object allocation and ease of implementation. The text also includes a theoretical proof of the expected performance: for ray-polygon comparison, the technique reduces the order of complexity from linear to square root, O(n) -> O(sqrt(n)). Experiments with various shapes, sizes, and complexities were conducted to verify the expectation. Results showed that the computational improvement grew with the complexity of the scenery; the experimental improvement was more than 90% and agreed with the theoretical value when the number of polygons exceeded 3000. The more complex the scene, the more efficient the acceleration. The algorithm described was implemented at the polygon level; however, it could easily be enhanced and extended to the object or higher levels.
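    The defining property of an MX-CIF-style quadtree is that each bounding box is stored at the smallest quadrant that fully contains it, so objects straddling a boundary stay at the parent. A minimal insertion routine can be sketched as follows (an illustrative sketch, not the paper's implementation; `QuadNode` is a hypothetical class):

```python
class QuadNode:
    """One square region of the image plane. Boxes that do not fit
    entirely inside a single child quadrant are pinned at this node."""

    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size  # lower-left corner, edge length
        self.objects = []       # bounding boxes pinned at this node
        self.children = None    # four sub-quadrants, built lazily

    def insert(self, box, min_size=1.0):
        # box = (xmin, ymin, xmax, ymax) in image space
        if self.size / 2 >= min_size:
            child = self._child_containing(box)
            if child is not None:
                child.insert(box, min_size)  # descend to the smallest fit
                return
        self.objects.append(box)  # straddles a boundary (or leaf): pin here

    def _child_containing(self, box):
        if self.children is None:
            h = self.size / 2
            self.children = [QuadNode(self.x + dx * h, self.y + dy * h, h)
                             for dy in (0, 1) for dx in (0, 1)]
        xmin, ymin, xmax, ymax = box
        for c in self.children:
            if (c.x <= xmin and xmax <= c.x + c.size and
                    c.y <= ymin and ymax <= c.y + c.size):
                return c
        return None
```

    A ray then only needs to test the boxes pinned along the quadrants it actually pierces, which is the source of the square-root behavior claimed above.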

  15. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    PubMed

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  16. HDAC3 and the Molecular Brake Pad Hypothesis

    PubMed Central

    McQuown, Susan C.; Wood, Marcelo A.

    2011-01-01

    Successful transcription of specific genes required for long-term memory processes involves the orchestrated effort of not only transcription factors, but also very specific enzymatic protein complexes that modify chromatin structure. Chromatin modification has been identified as a pivotal molecular mechanism underlying certain forms of synaptic plasticity and memory. The best-studied form of chromatin modification in the learning and memory field is histone acetylation, which is regulated by histone acetyltransferases and histone deacetylases (HDACs). HDAC inhibitors have been shown to strongly enhance long-term memory processes, and recent work has aimed to identify contributions of individual HDACs. In this review, we focus on HDAC3 and discuss its recently defined role as a negative regulator of long-term memory formation. HDAC3 is part of a corepressor complex and has direct interactions with class II HDACs that may be important for its molecular and behavioral consequences. And last, we propose the “molecular brake pad” hypothesis of HDAC function. The HDACs and associated corepressor complexes may function in neurons, in part, as “molecular brake pads.” HDACs are localized to promoters of active genes and act as a persistent clamp that requires strong activity-dependent signaling to temporarily release these complexes (or brake pads) to activate gene expression required for long-term memory formation. Thus, HDAC inhibition removes the “molecular brake pads” constraining the processes necessary for long-term memory and results in strong, persistent memory formation. PMID:21521655

  17. Automated Cache Performance Analysis And Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohror, Kathryn

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand," requiring significant time and expertise. To the best of our knowledge, no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and to create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on the infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses.
    Prior to the introduction of PEBS counters, cache behavior could only be measured reliably in the aggregate across tens or hundreds of thousands of instructions. With the newest iteration of PEBS technology, cache events can be tied to a tuple of instruction pointer, target address (for both loads and stores), memory hierarchy level, and observed latency. With this information we can now begin asking questions regarding the efficiency of not only regions of code, but how these regions interact with particular data structures and how these interactions evolve over time. In the short term, this information will be vital for performance analysts understanding and optimizing the behavior of their codes for the memory hierarchy. In the future, we can begin to ask how data layouts might be changed to improve performance and, for a particular application, what the theoretical optimal performance might be. The overall benefit to be produced by this effort was a commercial-quality, easy-to-use, and scalable performance tool that will allow both beginner and experienced parallel programmers to automatically tune their applications for optimal cache usage. Effective use of such a tool can literally save weeks of performance tuning effort. Easy to use. With the proposed innovations, finding and fixing memory performance issues would be more automated, hiding most to all of the performance engineer expertise "under the hood" of the Open|SpeedShop performance tool. One of the biggest public benefits from the proposed innovations is that it makes performance analysis usable by a larger group of application developers. Intuitive reporting of results. The Open|SpeedShop performance analysis tool has a rich set of intuitive yet detailed reports for presenting performance results to application developers. Our goal was to leverage this existing technology to present the results from our memory performance addition to Open|SpeedShop. Suitable for experts as well as novices.
    Application performance is getting more difficult to measure as the hardware platforms applications run on become more complicated. This makes life difficult for the application developer, who needs to know more about the hardware platform, including the memory system hierarchy, in order to understand the performance of their application. Some application developers are comfortable in that scenario, while others want to do their scientific research without having to understand all the nuances of the hardware platform they are running their application on. Our proposed innovations were aimed at supporting both expert and novice performance analysts. Useful in many markets. The enhancement to Open|SpeedShop would appeal to a broader market space, as it will be useful in scientific, commercial, and cloud computing environments. Our goal was to combine technology developed initially at the Lawrence Livermore National Laboratory with the development and commercial software experience of Argo Navis Technologies, LLC (ANT) to deliver these objectives.

  18. The Hebb repetition effect in simple and complex memory span.

    PubMed

    Oberauer, Klaus; Jones, Timothy; Lewandowsky, Stephan

    2015-08-01

    The Hebb repetition effect refers to the finding that immediate serial recall is improved over trials for memory lists that are surreptitiously repeated across trials, relative to new lists. We show in four experiments that the Hebb repetition effect is also observed with a complex-span task, in which encoding or retrieval of list items alternates with an unrelated processing task. The interruption of encoding or retrieval by the processing task did not reduce the size of the Hebb effect, demonstrating that incidental long-term learning forms integrated representations of lists, excluding the interleaved processing events. Contrary to the assumption that complex-span performance relies more on long-term memory than standard immediate serial recall (simple span), the Hebb effect was not larger in complex-span than in simple-span performance. The Hebb effect in complex span was also not modulated by the opportunity for refreshing list items, questioning a role of refreshing for the acquisition of the long-term memory representations underlying the effect.

  19. Similarity, Not Complexity, Determines Visual Working Memory Performance

    ERIC Educational Resources Information Center

    Jackson, Margaret C.; Linden, David E. J.; Roberts, Mark V.; Kriegeskorte, Nikolaus; Haenschel, Corinna

    2015-01-01

    A number of studies have shown that visual working memory (WM) is poorer for complex versus simple items, traditionally accounted for by higher information load placing greater demands on encoding and storage capacity limits. Other research suggests that it may not be complexity that determines WM performance per se, but rather increased…

  20. Slime mold uses an externalized spatial "memory" to navigate in complex environments.

    PubMed

    Reid, Chris R; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine

    2012-10-23

    Spatial memory enhances an organism's navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem--a classic test of autonomous navigational ability commonly used in robotics--requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on methods other than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism's ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms.
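    The avoidance mechanism can be mimicked by a toy grid walker that marks every cell it occupies and refuses to re-enter it, retreating when boxed in; `navigate`, the grid layout, and the U-shaped barrier below are all illustrative assumptions, not the study's experimental setup.

```python
def navigate(grid, start, goal):
    """Greedy walker with an externalized 'memory': visited cells are
    never re-entered, and the walker backtracks when surrounded by
    walls and already-visited cells. grid[x][y] == 1 marks a wall.
    Returns the sequence of forward moves, ending at the goal."""
    visited = {start}
    stack, path = [start], [start]
    while stack:
        x, y = stack[-1]
        if (x, y) == goal:
            return path
        moves = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        moves = [(nx, ny) for nx, ny in moves
                 if 0 <= nx < len(grid) and 0 <= ny < len(grid[0])
                 and grid[nx][ny] == 0 and (nx, ny) not in visited]
        if moves:
            # like gradient-following: prefer the move closest to the goal
            nxt = min(moves, key=lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1]))
            visited.add(nxt)
            stack.append(nxt)
            path.append(nxt)
        else:
            stack.pop()  # dead end: the memory trail forces a retreat
    return None
```

    Pure gradient-following would oscillate forever against the inside of the barrier; the refusal to revisit explored cells is what pushes the walker out of the trap, which is the externalized-memory argument in miniature.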

  1. Strategy difficulty effects in young and older adults' episodic memory are modulated by inter-stimulus intervals and executive control processes.

    PubMed

    Burger, Lucile; Uittenhove, Kim; Lemaire, Patrick; Taconnat, Laurence

    2017-04-01

    Efficient execution of strategies is crucial to memory performance and to age-related differences in this performance. Relative strategy complexity influences memory performance and aging effects on memory. Here, we aimed to further our understanding of the effects of relative strategy complexity by looking at the role of cognitive control functions and the time-course of the effects of relative strategy complexity. Thus, we manipulated inter-stimulus intervals (ISI) and assessed executive functions. Results showed that (a) performance as a function of the relative strategy difficulty of the current and previous trial was modulated by ISI, (b) these effects were modulated by inhibition capacities, and (c) significant age differences were found in the way ISI modulates relative strategy difficulty. These findings have important implications for understanding the relationships between aging, executive control, and strategy execution in episodic memory. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Technical support for digital systems technology development. Task order 1: ISP contention analysis and control

    NASA Technical Reports Server (NTRS)

    Stehle, Roy H.; Ogier, Richard G.

    1993-01-01

    Alternatives for realizing a packet-based network switch for use on a frequency division multiple access/time division multiplexed (FDMA/TDM) geostationary communication satellite were investigated. Each of the eight downlink beams supports eight directed dwells. The design needed to accommodate multicast packets with very low probability of loss due to contention. Three switch architectures were designed and analyzed. An output-queued, shared bus system yielded a functionally simple system, utilizing a first-in, first-out (FIFO) memory per downlink dwell, but at the expense of a large total memory requirement. A shared memory architecture offered the most efficiency in memory requirements, requiring about half the memory of the shared bus design. The processing requirement for the shared-memory system adds system complexity that may offset the benefits of the smaller memory. An alternative design using a shared memory buffer per downlink beam decreases circuit complexity through a distributed design, and requires at most 1000 packets of memory more than the completely shared memory design. Modifications to the basic packet switch designs were proposed to accommodate circuit-switched traffic, which must be served on a periodic basis with minimal delay. Methods for dynamically controlling the downlink dwell lengths were developed and analyzed. These methods adapt quickly to changing traffic demands, and do not add significant complexity or cost to the satellite and ground station designs. Methods for reducing the memory requirement by not requiring the satellite to store full packets were also proposed and analyzed. In addition, optimal packet and dwell lengths were computed as functions of memory size for the three switch architectures.
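    The buffering trade-off between dedicated per-dwell FIFOs and a fully shared packet memory can be illustrated with a toy occupancy model; the traffic pattern, the one-packet-per-tick drain rate, and the `peak_memory` helper are assumptions for illustration, not the report's simulation.

```python
def peak_memory(arrivals, n_queues):
    """Toy model of the switch-buffering trade-off: dedicated FIFOs
    must each be sized for their own peak occupancy, while a shared
    memory only needs to cover the peak of the total occupancy.
    arrivals is a list of per-tick lists of destination queue indices."""
    per_queue = [0] * n_queues
    fifo_peaks = [0] * n_queues
    shared_now = shared_peak = 0
    for batch in arrivals:
        for dst in batch:                       # packets arriving this tick
            per_queue[dst] += 1
            fifo_peaks[dst] = max(fifo_peaks[dst], per_queue[dst])
            shared_now += 1
        shared_peak = max(shared_peak, shared_now)
        for q in range(n_queues):               # each queue drains one packet per tick
            if per_queue[q]:
                per_queue[q] -= 1
                shared_now -= 1
    return sum(fifo_peaks), shared_peak
```

    Because bursts to different downlinks rarely coincide, the shared peak never exceeds (and is usually well below) the sum of the per-queue peaks, which is the intuition behind the roughly halved memory requirement reported above.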

  3. Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction.

    PubMed

    Tanaka, Gouhei; Aihara, Kazuyuki

    2009-09-01

    A widely used complex-valued activation function for complex-valued multistate Hopfield networks is revealed to be essentially based on a multilevel step function. By replacing the multilevel step function with other multilevel characteristics, we present two alternative complex-valued activation functions. One is based on a multilevel sigmoid function, while the other on a characteristic of a multistate bifurcating neuron. Numerical experiments show that both modifications to the complex-valued activation function bring about improvements in network performance for a multistate associative memory. The advantage of the proposed networks over the complex-valued Hopfield networks with the multilevel step function is more pronounced when a complex-valued neuron represents a larger number of multivalued states. Further, the performance of the proposed networks in reconstructing noisy 256 gray-level images is demonstrated in comparison with other recent associative memories to clarify their advantages and disadvantages.
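    The conventional multilevel step activation amounts to quantizing the phase of the neuron's complex input into K equal sectors; the paper's proposals replace this hard quantizer with a multilevel sigmoid or a bifurcating-neuron characteristic. A minimal sketch of the step version (`multilevel_step` is a hypothetical name):

```python
import cmath
import math

def multilevel_step(z, K):
    """Multilevel step activation for a complex-valued multistate
    neuron: quantize the phase of z into K equal sectors and return
    the corresponding unit-magnitude state exp(2*pi*i*k/K)."""
    theta = cmath.phase(z) % (2 * math.pi)   # phase mapped to [0, 2*pi)
    k = int(theta / (2 * math.pi / K))       # index of the phase sector
    return cmath.exp(2j * math.pi * k / K)
```

    With K = 256 each neuron state maps directly onto one gray level, which is why the number of representable multivalued states is the axis along which the proposed smooth activations outperform this hard step.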

  4. Memory ability of children with complex communication needs.

    PubMed

    Larsson, Maria; Sandberg, Annika Dahlgren

    2008-01-01

    Phonological memory is central to language and to reading and writing skills. Many children with complex communication needs (CCN) experience problems with reading and writing acquisition. One reason could be the absence of articulatory ability, which might have a negative effect on phonological memory. Phonological and visuo-spatial short-term memory and working memory were tested in 15 children with CCN, aged 5 - 12 years, and compared to children with natural speech matched for gender, and mental and linguistic age. Results indicated weaker phonological STM and visuo-spatial STM and WM in children with CCN. The lack of articulatory ability could be assumed to affect subvocal rehearsal and, therefore, phonological memory which, in turn, may affect reading and writing acquisition. Weak visuo-spatial memory could also complicate the use of Bliss symbols and other types of augmentative and alternative communication.

  5. Structural hierarchy controlling dimerization and target DNA recognition in the AHR transcriptional complex.

    PubMed

    Seok, Seung-Hyeon; Lee, Woojong; Jiang, Li; Molugu, Kaivalya; Zheng, Aiping; Li, Yitong; Park, Sanghyun; Bradfield, Christopher A; Xing, Yongna

    2017-05-23

    The aryl hydrocarbon receptor (AHR) belongs to the PAS (PER-ARNT-SIM) family transcription factors and mediates broad responses to numerous environmental pollutants and cellular metabolites, modulating diverse biological processes from adaptive metabolism, acute toxicity, to normal physiology of vascular and immune systems. The AHR forms a transcriptionally active heterodimer with ARNT (AHR nuclear translocator), which recognizes the dioxin response element (DRE) in the promoter of downstream genes. We determined the crystal structure of the mammalian AHR-ARNT heterodimer in complex with the DRE, in which ARNT curls around AHR into a highly intertwined asymmetric architecture, with extensive heterodimerization interfaces and AHR interdomain interactions. Specific recognition of the DRE is determined locally by the DNA-binding residues, which discriminates it from the closely related hypoxia response element (HRE), and is globally affected by the dimerization interfaces and interdomain interactions. Changes at the interdomain interactions caused either AHR constitutive nuclear localization or failure to translocate to nucleus, underlying an allosteric structural pathway for mediating ligand-induced exposure of nuclear localization signal. These observations, together with the global higher flexibility of the AHR PAS-A and its loosely packed structural elements, suggest a dynamic structural hierarchy for complex scenarios of AHR activation induced by its diverse ligands.

  6. Structural hierarchy controlling dimerization and target DNA recognition in the AHR transcriptional complex

    PubMed Central

    Lee, Woojong; Jiang, Li; Molugu, Kaivalya; Zheng, Aiping; Li, Yitong; Park, Sanghyun; Bradfield, Christopher A.; Xing, Yongna

    2017-01-01

    The aryl hydrocarbon receptor (AHR) belongs to the PAS (PER-ARNT-SIM) family transcription factors and mediates broad responses to numerous environmental pollutants and cellular metabolites, modulating diverse biological processes from adaptive metabolism, acute toxicity, to normal physiology of vascular and immune systems. The AHR forms a transcriptionally active heterodimer with ARNT (AHR nuclear translocator), which recognizes the dioxin response element (DRE) in the promoter of downstream genes. We determined the crystal structure of the mammalian AHR–ARNT heterodimer in complex with the DRE, in which ARNT curls around AHR into a highly intertwined asymmetric architecture, with extensive heterodimerization interfaces and AHR interdomain interactions. Specific recognition of the DRE is determined locally by the DNA-binding residues, which discriminates it from the closely related hypoxia response element (HRE), and is globally affected by the dimerization interfaces and interdomain interactions. Changes at the interdomain interactions caused either AHR constitutive nuclear localization or failure to translocate to nucleus, underlying an allosteric structural pathway for mediating ligand-induced exposure of nuclear localization signal. These observations, together with the global higher flexibility of the AHR PAS-A and its loosely packed structural elements, suggest a dynamic structural hierarchy for complex scenarios of AHR activation induced by its diverse ligands. PMID:28396409

  7. Structural hierarchy controlling dimerization and target DNA recognition in the AHR transcriptional complex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seok, Seung-Hyeon; Lee, Woojong; Jiang, Li

    The aryl hydrocarbon receptor (AHR) belongs to the PAS (PER-ARNT-SIM) family transcription factors and mediates broad responses to numerous environmental pollutants and cellular metabolites, modulating diverse biological processes from adaptive metabolism, acute toxicity, to normal physiology of vascular and immune systems. The AHR forms a transcriptionally active heterodimer with ARNT (AHR nuclear translocator), which recognizes the dioxin response element (DRE) in the promoter of downstream genes. We determined the crystal structure of the mammalian AHR–ARNT heterodimer in complex with the DRE, in which ARNT curls around AHR into a highly intertwined asymmetric architecture, with extensive heterodimerization interfaces and AHR interdomain interactions. Specific recognition of the DRE is determined locally by the DNA-binding residues, which discriminates it from the closely related hypoxia response element (HRE), and is globally affected by the dimerization interfaces and interdomain interactions. Changes at the interdomain interactions caused either AHR constitutive nuclear localization or failure to translocate to nucleus, underlying an allosteric structural pathway for mediating ligand-induced exposure of nuclear localization signal. These observations, together with the global higher flexibility of the AHR PAS-A and its loosely packed structural elements, suggest a dynamic structural hierarchy for complex scenarios of AHR activation induced by its diverse ligands.

  8. Managing search complexity in linguistic geometry.

    PubMed

    Stilman, B

    1997-01-01

    This paper is a new step in the development of linguistic geometry. This formal theory is intended to discover and generalize the inner properties of human expert heuristics, which have been successful in a certain class of complex control systems, and apply them to different systems. In this paper, we investigate heuristics extracted in the form of hierarchical networks of planning paths of autonomous agents. Employing linguistic geometry tools, the dynamic hierarchy of networks is represented as a hierarchy of formal attribute languages. The main ideas of this methodology are shown in the paper on two pilot examples of the solution of complex optimization problems. The first example is a problem of strategic planning for air combat, in which concurrent actions of four vehicles are simulated as serial interleaving moves. The second example is a problem of strategic planning for the space combat of eight autonomous vehicles (with interleaving moves) that requires generation of a search tree of depth 25 with a branching factor of 30. This is beyond the capabilities of modern and conceivable future computers (employing conventional approaches). In both examples, the linguistic geometry tools yielded deep and highly selective searches in comparison with conventional search algorithms. For the first example, a sketch of the proof of optimality of the solution is given.
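    The infeasibility claim above can be checked with back-of-the-envelope arithmetic: a uniform tree of branching factor 30 and depth 25 has on the order of 10^36 positions at the horizon. The sketch below uses an assumed (and generously fast) search rate of 10^12 nodes per second, a figure not taken from the paper:

```python
# Back-of-the-envelope size of a uniform game tree with the paper's
# stated parameters; the search rate below is a hypothetical figure.
branching, depth = 30, 25
leaves = branching ** depth                 # positions at the search horizon
rate = 1e12                                 # assumed: 10^12 nodes/second
years = leaves / rate / (3600 * 24 * 365)
print(f"{leaves:.2e} leaves; {years:.1e} years at {rate:.0e} nodes/s")
```

    Even at that rate, exhausting the horizon would take on the order of 10^17 years, which is why a highly selective search strategy is required.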

  9. Data Movement Dominates: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacob, Bruce L.

    Over the past three years in this project, what we have observed is that the primary reason for data movement in large-scale systems is that the per-node capacity is not large enough—i.e., one of the solutions to the data-movement problem (certainly not the only solution that is required, but a significant one nonetheless) is to increase per-node capacity so that inter-node traffic is reduced. This unfortunately is not as simple as it sounds. Today's main memory systems for datacenters, enterprise computing systems, and supercomputers fail to provide high per-socket capacity [Dirik & Jacob 2009; Cooper-Balis et al. 2012], except at extremely high price points (factors of 10–100x the cost/bit of consumer main-memory systems) [Stokes 2008]. The reason is that our choice of technology for today's main memory systems—i.e., DRAM, which we have used as a main-memory technology since the 1970s [Jacob et al. 2007]—can no longer keep up with our needs for density and price per bit. Main memory systems have always been built from the cheapest, densest, lowest-power memory technology available, and DRAM is no longer the cheapest, the densest, nor the lowest-power storage technology out there. It is now time for DRAM to go the way that SRAM went: move out of the way for a cheaper, slower, denser storage technology, and become a cache instead. This inflection point has happened before, in the context of SRAM yielding to DRAM. There was once a time when SRAM was the storage technology of choice for all main memories [Tomasulo 1967; Thornton 1970; Kidder 1981]. However, once DRAM hit volume production in the 1970s and 80s, it supplanted SRAM as a main memory technology because it was cheaper, and it was denser. It also happened to be lower power, but that was not the primary consideration of the day.
    At the time, it was recognized that DRAM was much slower than SRAM, but it was only at the supercomputer level (for instance, the Cray X-MP in the 1980s and its follow-on, the Cray Y-MP, in the 1990s) that one could afford to build ever-larger main memories out of SRAM—the reasoning for moving to DRAM was that an appropriately designed memory hierarchy, built of DRAM as main memory and SRAM as a cache, would approach the performance of SRAM, at the price-per-bit of DRAM [Mashey 1999]. Today it is quite clear that, were one to build an entire multi-gigabyte main memory out of SRAM instead of DRAM, one could improve the performance of almost any computer system by up to an order of magnitude—but this option is not even considered, because to build that system would be prohibitively expensive. It is now time to revisit the same design choice in the context of modern technologies and modern systems. For reasons both technical and economic, we can no longer afford to build ever-larger main memory systems out of DRAM. Flash memory, on the other hand, is significantly cheaper and denser than DRAM and therefore should take its place. While it is true that flash is significantly slower than DRAM, one can afford to build much larger main memories out of flash than out of DRAM, and we show that an appropriately designed memory hierarchy, built of flash as main memory and DRAM as a cache, will approach the performance of DRAM, at the price-per-bit of flash. In our studies as part of this project, we have investigated Non-Volatile Main Memory (NVMM), a new main-memory architecture for large-scale computing systems, one that is specifically designed to address the weaknesses described previously. In particular, it provides the following features. Non-volatility: the bulk of the storage is comprised of NAND flash, and in this organization DRAM is used only as a cache, not as main memory.
    Furthermore, the flash is journaled, which means that operations such as checkpoint/restore are already built into the system. 1+ terabytes of storage per socket: SSDs and DRAM DIMMs have roughly the same form factor (several square inches of PCB surface area), and terabyte SSDs are now commonplace. Performance approaching that of DRAM: DRAM is used as a cache to the flash system. Price-per-bit approaching that of NAND: flash is currently well under $0.50 per gigabyte; DDR3 SDRAM is currently just over $10 per gigabyte [Newegg 2014]. Even today, one can build an easily affordable main memory system with a terabyte or more of NAND storage per CPU socket (which would be extremely expensive were one to use DRAM), and our cycle-accurate, full-system experiments show that this can be done at a performance point that lies within a factor of two of DRAM.

  10. Preliminary Investigation of a Video-Based Stimulus Preference Assessment

    ERIC Educational Resources Information Center

    Snyder, Katie; Higbee, Thomas S.; Dayton, Elizabeth

    2012-01-01

    Video clips may be an effective format for presenting complex stimuli in preference assessments. In this preliminary study, we evaluated the correspondence between preference hierarchies generated from preference assessments that included either toys or videos of the toys. The top-ranked item corresponded in both assessments for 5 of the 6…

  11. Studying the Cognitive Emphases of Teachers' Classroom Questions.

    ERIC Educational Resources Information Center

    Davis, O. L., Jr.; And Others

    1969-01-01

    Recent interest in the direct, descriptive study of teaching has led to renewed attention to the types of questions asked by teachers. An important contributing factor has also been the progress made in the analysis of cognitive operations and the identification of a complex hierarchy of operations. Several observation systems developed from…

  12. Using Landscape Hierarchies To Guide Restoration Of Disturbed Ecosystems

    Treesearch

    Brian J. Palik; Charles P. Goebel; Katherine L. Kirkman; Larry West

    2000-01-01

    Reestablishing native plant communities is an important focus of ecosystem restoration. In complex landscapes containing a diversity of ecosystem types, restoration requires a set of reference vegetation conditions for the ecosystems of concern, and a predictive model to relate plant community composition to physical variables. Restoration also requires an approach for...

  13. Italian Education beyond Hierarchy: Governance, Evaluation and Headship

    ERIC Educational Resources Information Center

    Grimaldi, Emiliano; Serpieri, Roberto

    2014-01-01

    This article deals with the changes introduced in the Italian education system after the 1997 School Autonomy reform. Looking at the complex interplay between global influences and processes of local inflection, the work explores the degree to which we are witnessing a significant shift towards a new mode of governance and the interplay between…

  14. A Human Factor Analysis to Mitigate Fall Risk Factors in an Aerospace Environment

    NASA Technical Reports Server (NTRS)

    Ware, Joylene H.

    2010-01-01

    This slide presentation reviews the study done to quantify the risks from falls from three locations (i.e., Shuttle Landing Facility Launch Complex Payloads and Vehicle Assembly Building) at the Kennedy Space Center. The Analytical Hierarchy Process (AHP) is reviewed and the mathematical model developed is detailed.

  15. Formulation and closure of compressible turbulence equations in the light of kinetic theory

    NASA Technical Reports Server (NTRS)

    Tsuge, S.; Sagara, K.

    1976-01-01

    Fluid-dynamic moment equations, based on a kinetic hierarchy system, are derived governing the interaction between turbulent and thermal fluctuations. The kinetic theory is shown to reduce the inherent complexity of the conventional formalism of compressible turbulence theory and to minimize arbitrariness in formulating the closure condition.

  16. Suffix Ordering and Morphological Processing

    ERIC Educational Resources Information Center

    Plag, Ingo; Baayen, Harald

    2009-01-01

    There is a long-standing debate about the principles constraining the combinatorial properties of suffixes. Hay 2002 and Hay & Plag 2004 proposed a model in which suffixes can be ordered along a hierarchy of processing complexity. We show that this model generalizes to a larger set of suffixes, and we provide independent evidence supporting the…

  17. Courting the Buyer: The Relationship of Newspaper, Audience, and Advertisers.

    ERIC Educational Resources Information Center

    Thompson, Timothy N.

    By applying Kenneth Burke's concepts of Order, the Secret, and the Kill to the newspaper-audience-advertiser relationship, the narrow imagery that depicts that relationship only in economic terms can be counteracted. Burke's maps of hierarchy, mystery, and transcendence in human action allow the depiction of a complex meshing of patterns,…

  18. Understanding Rasch Measurement: Partial Credit Model and Pivot Anchoring.

    ERIC Educational Resources Information Center

    Bode, Rita K.

    2001-01-01

    Describes the Rasch measurement partial credit model, what it is, how it differs from other Rasch models, and when and how to use it. Also describes the calibration of instruments with increasingly complex items. Explains pivot anchoring and illustrates its use and describes the effect of pivot anchoring on step calibrations, item hierarchy, and…

  19. Complex Adaptive Systems Based Data Integration: Theory and Applications

    ERIC Educational Resources Information Center

    Rohn, Eliahu

    2008-01-01

    Data Definition Languages (DDLs) have been created and used to represent data in programming languages and in database dictionaries. This representation includes descriptions in the form of data fields and relations in the form of a hierarchy, with the common exception of relational databases where relations are flat. Network computing created an…

  20. Proportional Reasoning of Preservice Elementary Education Majors: An Epistemic Model of the Proportional Reasoning Construct.

    ERIC Educational Resources Information Center

    Fleener, M. Jayne

    Current research and learning theory suggest that a hierarchy of proportional reasoning exists that can be tested. Using G. Vergnaud's four complexity variables (structure, content, numerical characteristics, and presentation) and T. E. Kieren's model of rational number knowledge building, an epistemic model of proportional reasoning was…

  1. Prospective memory training in older adults and its relevance for successful aging.

    PubMed

    Hering, Alexandra; Rendell, Peter G; Rose, Nathan S; Schnitzspahn, Katharina M; Kliegel, Matthias

    2014-11-01

    In research on cognitive plasticity, two training approaches have been established: (1) training of strategies to improve performance in a given task (e.g., encoding strategies to improve episodic memory performance) and (2) training of basic cognitive processes (e.g., working memory, inhibition) that underlie a range of more complex cognitive tasks (e.g., planning) to improve both the training target and the complex transfer tasks. Strategy training aims to compensate or circumvent limitations in underlying processes, while process training attempts to augment or to restore these processes. Although research on both approaches has produced some promising findings, results are still heterogeneous and the impact of most training regimes for everyday life is unknown. We, therefore, discuss recent proposals of training regimes aiming to improve prospective memory (i.e., forming and realizing delayed intentions) as this type of complex cognition is highly relevant for independent living. Furthermore, prospective memory is associated with working memory and executive functions and age-related decline is widely reported. We review initial evidence suggesting that both training regimes (i.e., strategy and/or process training) can successfully be applied to improve prospective memory. Conceptual and methodological implications of the findings for research on age-related prospective memory and for training research in general are discussed.

  2. Working Memory and Reasoning Benefit from Different Modes of Large-scale Brain Dynamics in Healthy Older Adults.

    PubMed

    Lebedev, Alexander V; Nilsson, Jonna; Lövdén, Martin

    2018-07-01

    Researchers have proposed that solving complex reasoning problems, a key indicator of fluid intelligence, involves the same cognitive processes as solving working memory tasks. This proposal is supported by an overlap of the functional brain activations associated with the two types of tasks and by high correlations between interindividual differences in performance. We replicated these findings in 53 older participants but also showed that solving reasoning and working memory problems benefits from different configurations of the functional connectome and that this dissimilarity increases with a higher difficulty load. Specifically, superior performance in a typical working memory paradigm (n-back) was associated with upregulation of modularity (increased between-network segregation), whereas performance in the reasoning task was associated with effective downregulation of modularity. We also showed that working memory training promotes task-invariant increases in modularity. Because superior reasoning performance is associated with downregulation of modular dynamics, training may thus have fostered an inefficient way of solving the reasoning tasks. This could help explain why working memory training does little to promote complex reasoning performance. The study concludes that complex reasoning abilities cannot be reduced to working memory and suggests the need to reconsider the feasibility of using working memory training interventions to attempt to achieve effects that transfer to broader cognition.

  3. DReAM: Demand Response Architecture for Multi-level District Heating and Cooling Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Saptarshi; Chandan, Vikas; Arya, Vijay

    In this paper, we exploit the inherent hierarchy of heat exchangers in District Heating and Cooling (DHC) networks and propose DReAM, a novel Demand Response (DR) architecture for multi-level DHC networks. DReAM serves to economize system operation while still respecting the comfort requirements of individual consumers. Contrary to many present-day DR schemes that work at a consumer-level granularity, DReAM works at a level of hierarchy above buildings, i.e., substations that supply heat to a group of buildings. This improves the overall DR scalability and reduces the computational complexity. In the first step of the proposed approach, mathematical models of individual substations and their downstream networks are abstracted into appropriately constructed low-complexity structural forms. In the second step, this abstracted information is employed by the utility to perform DR optimization that determines the optimal heat inflow to individual substations rather than buildings, in order to achieve the targeted objectives across the network. We validate the proposed DReAM framework through experimental results under different scenarios on a test network.

  4. Organizational Adaptative Behavior: The Complex Perspective of Individuals-Tasks Interaction

    NASA Astrophysics Data System (ADS)

    Wu, Jiang; Sun, Duoyong; Hu, Bin; Zhang, Yu

    Organizations with different organizational structures exhibit different behaviors when responding to environmental changes. In this paper, we use a computational model to examine organizational adaptation on four dimensions: Agility, Robustness, Resilience, and Survivability. We analyze the dynamics of organizational adaptation in a simulation study, from the complex perspective of the interaction between tasks and individuals in a sales enterprise. The simulation studies in different scenarios show that more flexible communication between employees and fewer hierarchy levels with suitable centralization can improve organizational adaptation.

  5. Exchange interactions in a dinuclear manganese (II) complex with cyanopyridine-N-oxide bridging ligands

    NASA Astrophysics Data System (ADS)

    Markosyan, A. S.; Gaidukova, I. Yu.; Ruchkin, A. V.; Anokhin, A. O.; Irkhin, V. Yu.; Ryazanov, M. V.; Kuz'mina, N. P.; Nikiforov, V. N.

    2014-01-01

    The magnetic properties of dinuclear manganese(II) complex [Mn(hfa)2cpo]2 (where hfa is hexafluoroacetylacetonate anion and cpo is 4-cyanopyridine-N-oxide) are presented. The non-monotonous dependence of magnetic susceptibility is explained in terms of the hierarchy of exchange parameters by using exact diagonalization. The thermodynamic behavior of pure cpo and [Mn(hfa)2(cpo)]2 is simulated numerically by an extrapolation to spin S=5/2. The Mn-Mn exchange integral is evaluated.

  6. Major Robert Lawrence Memorial Tribute

    NASA Image and Video Library

    2017-12-08

    Following an Astronauts Memorial Foundation tribute honoring U.S. Air Force Maj. Robert Lawrence, guests place flowers at the Space Mirror Memorial at the Kennedy Space Center Visitor Complex. Selected in 1967 for the Manned Orbiting Laboratory Program, Lawrence was the first African-American astronaut. He lost his life in a training accident 50 years ago. The ceremony took place in the Center for Space Education at the Kennedy Space Center Visitor Complex.

  7. Towards robust algorithms for current deposition and dynamic load-balancing in a GPU particle in cell code

    NASA Astrophysics Data System (ADS)

    Rossi, Francesco; Londrillo, Pasquale; Sgattoni, Andrea; Sinigardi, Stefano; Turchetti, Giorgio

    2012-12-01

    We present `jasmine', an implementation of a fully relativistic, 3D, electromagnetic Particle-In-Cell (PIC) code, capable of running simulations in various laser plasma acceleration regimes on Graphics Processing Unit (GPU) HPC clusters. Standard energy/charge-preserving FDTD-based algorithms have been implemented using double precision and quadratic (or arbitrary-sized) shape functions for the particle weighting. When porting a PIC scheme to the GPU architecture (or, in general, a shared memory environment), the particle-to-grid operations (e.g. the evaluation of the current density) require special care to avoid memory inconsistencies and conflicts. Here we present a robust implementation of this operation that is efficient for any number of particles per cell and any particle shape function order. Our algorithm exploits the exposed GPU memory hierarchy and avoids the use of atomic operations, which can hurt performance especially when many particles lie in the same cell. We show the code's multi-GPU scalability results and present a dynamic load-balancing algorithm. The code is written using a Python-based C++ meta-programming technique, which translates into a high level of modularity and allows for easy performance tuning and simple extension of the core algorithms to various simulation schemes.
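    The particle-to-grid hazard described above can be illustrated in miniature. The sketch below is a hypothetical CPU-side analogue, not the paper's GPU kernels: rather than scattering each particle's contribution with an atomic add, contributions are summed per cell in one segmented reduction (`np.bincount`), so cost does not degrade when many particles share a cell:

```python
import numpy as np

def deposit_charge(cell_idx, charge, n_cells):
    """Deposit particle charges onto a 1D grid without atomic updates.

    np.bincount performs a segmented sum over particles grouped by cell
    index, the conflict-free counterpart of a per-particle scatter-add.
    """
    return np.bincount(cell_idx, weights=charge, minlength=n_cells)

# Four particles, two of which land in cell 2.
cells = np.array([0, 2, 2, 3])
q = np.array([1.0, 0.5, 0.5, 2.0])
rho = deposit_charge(cells, q, n_cells=5)
print(rho)  # [1. 0. 1. 2. 0.]
```

    On a GPU the same idea appears as per-block accumulation into scratch buffers followed by a reduction; the names and shapes here are illustrative only.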

  8. A Probabilistic Model of Social Working Memory for Information Retrieval in Social Interactions.

    PubMed

    Li, Liyuan; Xu, Qianli; Gan, Tian; Tan, Cheston; Lim, Joo-Hwee

    2018-05-01

    Social working memory (SWM) plays an important role in navigating social interactions. Inspired by studies in psychology, neuroscience, cognitive science, and machine learning, we propose a probabilistic model of SWM to mimic human social intelligence for personal information retrieval (IR) in social interactions. First, we establish a semantic hierarchy as social long-term memory to encode personal information. Next, we propose a semantic Bayesian network as the SWM, which integrates the cognitive functions of accessibility and self-regulation. One subgraphical model implements the accessibility function to learn the social consensus about IR based on social information concepts, clustering, social context, and similarity between persons. Beyond accessibility, one more layer is added to simulate the function of self-regulation, performing personal adaptation to the consensus based on human personality. Two learning algorithms are proposed to train the probabilistic SWM model on a raw dataset of high uncertainty and incompleteness: one is an efficient learning algorithm based on Newton's method, and the other is a genetic algorithm. Systematic evaluations show that the proposed SWM model is able to learn human social intelligence effectively and outperforms the baseline Bayesian cognitive model. Toward real-world applications, we implement our model on Google Glass as a wearable assistant for social interaction.

  9. Relating Complexity and Error Rates of Ontology Concepts. More Complex NCIt Concepts Have More Errors.

    PubMed

    Min, Hua; Zheng, Ling; Perl, Yehoshua; Halper, Michael; De Coronado, Sherri; Ochs, Christopher

    2017-05-18

    Ontologies are knowledge structures that lend support to many health-information systems. A study is carried out to assess the quality of ontological concepts based on a measure of their complexity. The results show a relation between complexity of concepts and error rates of concepts. A measure of lateral complexity defined as the number of exhibited role types is used to distinguish between more complex and simpler concepts. Using a framework called an area taxonomy, a kind of abstraction network that summarizes the structural organization of an ontology, concepts are divided into two groups along these lines. Various concepts from each group are then subjected to a two-phase QA analysis to uncover and verify errors and inconsistencies in their modeling. A hierarchy of the National Cancer Institute thesaurus (NCIt) is used as our test-bed. A hypothesis pertaining to the expected error rates of the complex and simple concepts is tested. Our study was done on the NCIt's Biological Process hierarchy. Various errors, including missing roles, incorrect role targets, and incorrectly assigned roles, were discovered and verified in the two phases of our QA analysis. The overall findings confirmed our hypothesis by showing a statistically significant difference between the amounts of errors exhibited by more laterally complex concepts vis-à-vis simpler concepts. QA is an essential part of any ontology's maintenance regimen. In this paper, we reported on the results of a QA study targeting two groups of ontology concepts distinguished by their level of complexity, defined in terms of the number of exhibited role types. The study was carried out on a major component of an important ontology, the NCIt. The findings suggest that more complex concepts tend to have a higher error rate than simpler concepts. These findings can be utilized to guide ongoing efforts in ontology QA.

  10. Memory for past public events depends on retrieval frequency but not memory age in Alzheimer's disease.

    PubMed

    Müller, Stephan; Mychajliw, Christian; Hautzinger, Martin; Fallgatter, Andreas J; Saur, Ralf; Leyhe, Thomas

    2014-01-01

    Alzheimer's disease (AD) is characterized by retrograde memory deficits primarily caused by dysfunction of the hippocampal complex. Unresolved questions exist concerning the time course of hippocampal involvement in conscious recollection of declarative knowledge, as reports of temporal gradients of retrograde amnesia have been inconclusive. The aim of this study was to examine whether the extent and severity of retrograde amnesia is mediated by retrieval frequency or, in contrast, whether it depends on the age of the memory according to the assumptions of the main current theories of memory formation. We compared recall of past public events in patients with AD and healthy control (HC) individuals using the Historic Events Test (HET). The HET assesses knowledge about famous public events of the past 60 years divided into four time segments and consists of subjective memory rating, dating accuracy, and contextual memory tasks. Although memory for public events was impaired in AD patients, there was a strong effect of retrieval frequency across all time segments and both groups. As AD and HC groups derived similar benefits from greater retrieval frequency, cortical structures other than the hippocampal complex may mediate memory retrieval. These findings suggest that more frequently retrieved events and facts become more independent of the hippocampal complex and thus better protected against early damage of AD. This could explain why cognitive activity may delay the onset of memory decline in persons who develop AD.

  11. KENNEDY SPACE CENTER, FLA. - KSC Deputy Director Woodrow Whitlow Jr. closes the memorial service held for the crew of Columbia at the Space Memorial Mirror in the KSC Visitor Complex. He is surrounded by dancers of the Shoshone-Bannock Native American community who performed a healing ceremony during the memorial. Feb. 1 is the one-year anniversary of the loss of the crew and orbiter Columbia in a tragic accident as the ship returned to Earth following mission STS-107. Students and staff of the Shoshone-Bannock Nation had an experiment on board Columbia. The public was invited to the memorial service, held in the KSC Visitor Complex, which included comments by Center Director Jim Kennedy and Executive Director of Florida Space Authority Winston Scott. Scott is a former astronaut who flew on Columbia in 1997.

    NASA Image and Video Library

    2004-02-01

    KENNEDY SPACE CENTER, FLA. - KSC Deputy Director Woodrow Whitlow Jr. closes the memorial service held for the crew of Columbia at the Space Memorial Mirror in the KSC Visitor Complex. He is surrounded by dancers of the Shoshone-Bannock Native American community who performed a healing ceremony during the memorial. Feb. 1 is the one-year anniversary of the loss of the crew and orbiter Columbia in a tragic accident as the ship returned to Earth following mission STS-107. Students and staff of the Shoshone-Bannock Nation had an experiment on board Columbia. The public was invited to the memorial service, held in the KSC Visitor Complex, which included comments by Center Director Jim Kennedy and Executive Director of Florida Space Authority Winston Scott. Scott is a former astronaut who flew on Columbia in 1997.

  12. SimGen: A General Simulation Method for Large Systems.

    PubMed

    Taylor, William R

    2017-02-03

    SimGen is a stand-alone computer program that reads a script of commands to represent complex macromolecules, including proteins and nucleic acids, in a structural hierarchy that can then be viewed using an integral graphical viewer or animated through a high-level application programming interface in C++. Structural levels in the hierarchy range from α-carbon or phosphate backbones through secondary structure to domains, molecules, and multimers, with each level represented in an identical data structure that can be manipulated using the application programming interface. Unlike most coarse-grained simulation approaches, the higher-level objects represented in SimGen can be soft, allowing the lower-level objects that they contain to interact directly. The default motion simulated by SimGen is a Brownian-like diffusion that can be set to occur across all levels of representation in the hierarchy. Links can also be defined between objects, which, when combined with large high-level random movements, result in an effective search strategy for constraint satisfaction, including structure prediction from predicted pairwise distances. The implementation of SimGen makes use of the hierarchic data structure to avoid unnecessary calculation, especially for collision detection, allowing it to be simultaneously run and viewed on a laptop computer while simulating large systems of over 20,000 objects. It has been used previously to model complex molecular interactions, including the motion of a myosin-V dimer "walking" on an actin fibre, RNA stem-loop packing, and the simulation of cell motion and aggregation. Several extensions to this original functionality are described. Copyright © 2016 The Francis Crick Institute. Published by Elsevier Ltd. All rights reserved.

  13. The neural processing of hierarchical structure in music and speech at different timescales

    PubMed Central

    Farbood, Morwaread M.; Heeger, David J.; Marcus, Gary; Hasson, Uri; Lerner, Yulia

    2015-01-01

    Music, like speech, is a complex auditory signal that contains structures at multiple timescales, and as such is a potentially powerful entry point into the question of how the brain integrates complex streams of information. Using an experimental design modeled after previous studies that used scrambled versions of a spoken story (Lerner et al., 2011) and a silent movie (Hasson et al., 2008), we investigate whether listeners perceive hierarchical structure in music beyond short (~6 s) time windows and whether there is cortical overlap between music and language processing at multiple timescales. Experienced pianists were presented with an extended musical excerpt scrambled at multiple timescales—by measure, phrase, and section—while brain activity was measured with functional magnetic resonance imaging (fMRI). The reliability of evoked activity, as quantified by inter-subject correlation of the fMRI responses, was measured. We found that response reliability depended systematically on musical structure coherence, revealing a topographically organized hierarchy of processing timescales. Early auditory areas (at the bottom of the hierarchy) responded reliably in all conditions. For brain areas at the top of the hierarchy, the original (unscrambled) excerpt evoked more reliable responses than any of the scrambled excerpts, indicating that these brain areas process long-timescale musical structures, on the order of minutes. The topography of processing timescales was analogous to that reported previously for speech, but the timescale gradients for music and speech overlapped with one another only partially, suggesting that temporally analogous structures—words/measures, sentences/musical phrases, paragraphs/sections—are processed separately. PMID:26029037

  15. The Relations between Early Working Memory Abilities and Later Developing Reading Skills: A Longitudinal Study from Kindergarten to Fifth Grade

    ERIC Educational Resources Information Center

    Nevo, Einat; Bar-Kochva, Irit

    2015-01-01

    This study investigated the relations of early working-memory abilities (phonological and visual-spatial short-term memory [STM] and complex memory and episodic buffer memory) and later developing reading skills. Sixty Hebrew-speaking children were followed from kindergarten through Grade 5. Working memory was tested in kindergarten and reading in…

  16. The contributions of handedness and working memory to episodic memory.

    PubMed

    Sahu, Aparna; Christman, Stephen D; Propper, Ruth E

    2016-11-01

    Past studies have independently shown associations of working memory and degree of handedness with episodic memory retrieval. The current study goes a step further by examining whether handedness and working memory independently predict episodic memory. In agreement with past studies, there was an inconsistent-handed advantage for episodic memory; however, this advantage was absent for working memory tasks. Furthermore, regression analyses showed that handedness and complex working memory predicted episodic memory performance at different times. Results are discussed in light of theories of episodic memory and hemispheric interaction.

  17. Explicit pre-training instruction does not improve implicit perceptual-motor sequence learning

    PubMed Central

    Sanchez, Daniel J.; Reber, Paul J.

    2012-01-01

    Memory systems theory argues for separate neural systems supporting implicit and explicit memory in the human brain. Neuropsychological studies support this dissociation, but empirical studies of cognitively healthy participants generally observe that both kinds of memory are acquired to at least some extent, even in implicit learning tasks. A key question is whether this observation reflects parallel intact memory systems or an integrated representation of memory in healthy participants. Learning of complex tasks in which both explicit instruction and practice is used depends on both kinds of memory, and how these systems interact will be an important component of the learning process. Theories that posit an integrated, or single, memory system for both types of memory predict that explicit instruction should contribute directly to strengthening task knowledge. In contrast, if the two types of memory are independent and acquired in parallel, explicit knowledge should have no direct impact and may serve in a “scaffolding” role in complex learning. Using an implicit perceptual-motor sequence learning task, the effect of explicit pre-training instruction on skill learning and performance was assessed. Explicit pre-training instruction led to robust explicit knowledge, but sequence learning did not benefit from the contribution of pre-training sequence memorization. The lack of an instruction benefit suggests that during skill learning, implicit and explicit memory operate independently. While healthy participants will generally accrue parallel implicit and explicit knowledge in complex tasks, these types of information appear to be separately represented in the human brain consistent with multiple memory systems theory. PMID:23280147

  18. Segregating the core computational faculty of human language from working memory.

    PubMed

    Makuuchi, Michiru; Bahlmann, Jörg; Anwander, Alfred; Friederici, Angela D

    2009-05-19

    In contrast to simple structures in animal vocal behavior, hierarchical structures such as center-embedded sentences manifest the core computational faculty of human language. Previous artificial grammar learning studies found that the left pars opercularis (LPO) subserves the processing of hierarchical structures. However, it is not clear whether this area is activated by the structural complexity per se or by the increased memory load entailed in processing hierarchical structures. To dissociate the effect of structural complexity from the effect of memory cost, we conducted a functional magnetic resonance imaging study of German sentence processing with a 2-way factorial design tapping structural complexity (with/without hierarchical structure, i.e., center-embedding of clauses) and working memory load (long/short distance between syntactically dependent elements; i.e., subject nouns and their respective verbs). Functional imaging data revealed that the processes for structure and memory operate separately but co-operatively in the left inferior frontal gyrus; activities in the LPO increased as a function of structural complexity, whereas activities in the left inferior frontal sulcus (LIFS) were modulated by the distance over which the syntactic information had to be transferred. Diffusion tensor imaging showed that these 2 regions were interconnected through white matter fibers. Moreover, functional coupling between the 2 regions was found to increase during the processing of complex, hierarchically structured sentences. These results suggest a neuroanatomical segregation of syntax-related aspects represented in the LPO from memory-related aspects reflected in the LIFS, which are, however, highly interconnected functionally and anatomically.

  19. Effects of dividing attention on memory for declarative and procedural aspects of tool use.

    PubMed

    Roy, Shumita; Park, Norman W

    2016-07-01

    Tool-related knowledge and skills are supported by a complex set of memory processes that are not well understood. Some aspects of tools are mediated by either declarative or procedural memory, while other aspects may rely on an interaction of both systems. Although motor skill learning is believed to be primarily supported by procedural memory, there is debate in the current literature regarding the role of declarative memory. Growing evidence suggests that declarative memory may be involved during early stages of motor skill learning, although findings have been mixed. In the current experiment, healthy, younger adults were trained to use a set of novel complex tools and were tested on their memory for various aspects of the tools. Declarative memory encoding was interrupted by dividing attention during training. Findings showed that dividing attention during training was detrimental for subsequent memory for tool attributes as well as accurate demonstration of tool use and tool grasping. However, dividing attention did not interfere with motor skill learning, suggesting that declarative memory is not essential for skill learning associated with tools.

  20. The effects of cholesterol on learning and memory.

    PubMed

    Schreurs, Bernard G

    2010-07-01

    Cholesterol is vital to normal brain function including learning and memory but that involvement is as complex as the synthesis, metabolism and excretion of cholesterol itself. Dietary cholesterol influences learning tasks from water maze to fear conditioning even though cholesterol does not cross the blood brain barrier. Excess cholesterol has many consequences including peripheral pathology that can signal brain via cholesterol metabolites, pro-inflammatory mediators and antioxidant processes. Manipulations of cholesterol within the central nervous system through genetic, pharmacological, or metabolic means circumvent the blood brain barrier and affect learning and memory but often in animals already otherwise compromised. The human literature is no less complex. Cholesterol reduction using statins improves memory in some cases but not others. There is also controversy over statin use to alleviate memory problems in Alzheimer's disease. Correlations of cholesterol and cognitive function are mixed and association studies find some genetic polymorphisms are related to cognitive function but others are not. In sum, the field is in flux with a number of seemingly contradictory results and many complexities. Nevertheless, understanding cholesterol effects on learning and memory is too important to ignore.

  1. Settlement Dynamics and Hierarchy from Agent Decision-Making: a Method Derived from Entropy Maximization.

    PubMed

    Altaweel, Mark

    2015-01-01

    This paper presents an agent-based complex system simulation of settlement structure change using methods derived from entropy maximization modeling. The approach is applied to model the movement of people and goods in urban settings to study how settlement size hierarchy develops. While entropy maximization is well known for assessing settlement structure change over different spatiotemporal settings, approaches have rarely attempted to develop and apply this methodology to understand how individual and household decisions may affect settlement size distributions. A new method developed in this paper allows individual decision-makers to choose where to settle based on social-environmental factors and to evaluate settlements based on geography and relative benefits, while retaining concepts derived from entropy maximization, with settlement size affected by movement ability and site-attractiveness feedbacks. To demonstrate the applicability of the theoretical and methodological approach, case study settlement patterns from the Middle Bronze Age (MBA) and Iron Age (IA) in the Iraqi North Jazirah Survey (NJS) are used. Results indicate clear differences in settlement factors and household choices in simulations that lead to settlement size hierarchies comparable to the two evaluated periods. Conflict and socio-political cohesion, in both their presence and absence, are suggested to have major roles in shaping the observed settlement hierarchy. More broadly, the model is made applicable to different empirically based settings, while being generalized to incorporate data uncertainty, making the model useful for understanding urbanism from top-down and bottom-up perspectives.
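
    The core of an entropy-maximizing choice rule with an attractiveness feedback can be sketched briefly. The parameter names (alpha, beta), the feedback increment, and the functional form below are illustrative assumptions in the spirit of Wilson-style spatial interaction models, not the paper's calibrated model.

```python
import math
import random

def settlement_choice_probs(attractiveness, costs, alpha=1.0, beta=0.5):
    """Entropy-maximizing choice weights: p_j is proportional to
    W_j**alpha * exp(-beta * c_j), where W_j is site attractiveness and
    c_j the movement cost from the agent's location to site j."""
    weights = [w ** alpha * math.exp(-beta * c)
               for w, c in zip(attractiveness, costs)]
    total = sum(weights)
    return [w / total for w in weights]

def simulate_settling(n_agents, attractiveness, costs, seed=42):
    """Each agent picks a settlement in turn; arrivals raise a site's
    attractiveness, giving the size/attractiveness feedback that lets a
    settlement hierarchy emerge from individual decisions."""
    rng = random.Random(seed)
    sizes = [0] * len(attractiveness)
    attr = list(attractiveness)
    for _ in range(n_agents):
        probs = settlement_choice_probs(attr, costs)
        j = rng.choices(range(len(attr)), weights=probs)[0]
        sizes[j] += 1
        attr[j] += 0.01   # illustrative feedback strength
    return sizes
```

    Raising beta models poorer movement ability (agents stay near cheap sites); the feedback term tends to concentrate population, which is one route to the size hierarchies the paper studies.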

  2. Contrasting lexical similarity and formal definitions in SNOMED CT: consistency and implications.

    PubMed

    Agrawal, Ankur; Elhanan, Gai

    2014-02-01

    To quantify the presence of inconsistencies in the formal definitions of SNOMED CT (SCT) concepts, and to evaluate a lexical method for detecting them. Utilizing SCT's Procedure hierarchy, we algorithmically formulated similarity sets: groups of concepts whose fully specified names share a similar lexical structure. We formulated five random samples of 50 similarity sets each, grouped by a shared parameter: the same number of parents, the same attributes, the same groups, all of the former combined, as well as a randomly selected control sample. All samples' sets were reviewed for types of formal definition inconsistencies: hierarchical, attribute assignment, attribute target values, groups, and definitional. For the Procedure hierarchy, 2111 similarity sets were formulated, covering 18.1% of eligible concepts. The evaluation revealed that 38% (control) to 70% (different relationships) of similarity sets within the samples exhibited significant inconsistencies. The rate of inconsistencies for the sample with different relationships was highly significant compared to the control, as were the numbers of attribute assignment and hierarchical inconsistencies within their respective samples. While the formal definitions of SCT are only a minor consideration at this time of the HITECH initiative, they are essential in the grand scheme of sophisticated, meaningful use of captured clinical data. However, a significant portion of the concepts in the most semantically complex hierarchy of SCT, the Procedure hierarchy, is modeled inconsistently in a manner that affects their computability. Lexical methods can efficiently identify such inconsistencies and possibly allow for their algorithmic resolution. Copyright © 2013 Elsevier Inc. All rights reserved.
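
    One way to form lexical similarity sets is to bucket concept names that differ in exactly one word. This is an assumed reading of "similar lexical structure" for illustration; the paper's exact grouping rule may differ.

```python
from collections import defaultdict

def similarity_sets(fully_specified_names, min_size=2):
    """Group concept names that differ in exactly one word by keying each
    name under every one-word-wildcarded variant of its token sequence."""
    buckets = defaultdict(set)
    for name in fully_specified_names:
        tokens = name.lower().split()
        for i in range(len(tokens)):
            key = tuple(tokens[:i] + ["*"] + tokens[i + 1:])
            buckets[key].add(name)
    # keep only buckets that actually group distinct concepts
    return [sorted(s) for s in buckets.values() if len(s) >= min_size]
```

    Members of one set (e.g. hypothetical names "Biopsy of liver" and "Biopsy of kidney") would then be expected to have parallel formal definitions, so reviewers can flag sets whose members are modeled differently.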

  3. Slime mold uses an externalized spatial “memory” to navigate in complex environments

    PubMed Central

    Reid, Chris R.; Latty, Tanya; Dussutour, Audrey; Beekman, Madeleine

    2012-01-01

    Spatial memory enhances an organism’s navigational ability. Memory typically resides within the brain, but what if an organism has no brain? We show that the brainless slime mold Physarum polycephalum constructs a form of spatial memory by avoiding areas it has previously explored. This mechanism allows the slime mold to solve the U-shaped trap problem—a classic test of autonomous navigational ability commonly used in robotics—requiring the slime mold to reach a chemoattractive goal behind a U-shaped barrier. Drawn into the trap, the organism must rely on other methods than gradient-following to escape and reach the goal. Our data show that spatial memory enhances the organism’s ability to navigate in complex environments. We provide a unique demonstration of a spatial memory system in a nonneuronal organism, supporting the theory that an externalized spatial memory may be the functional precursor to the internal memory of higher organisms. PMID:23045640

  4. Influence of an immunodominant herpes simplex virus type 1 CD8+ T cell epitope on the target hierarchy and function of subdominant CD8+ T cells

    PubMed Central

    2017-01-01

    Herpes simplex virus type 1 (HSV-1) latency in sensory ganglia such as the trigeminal ganglia (TG) is associated with a persistent immune infiltrate that includes effector memory CD8+ T cells that can influence HSV-1 reactivation. In C57BL/6 mice, HSV-1 induces a highly skewed CD8+ T cell repertoire, in which half of the CD8+ T cells (gB-CD8s) recognize a single epitope on glycoprotein B (gB498-505), while the remainder (non-gB-CD8s) recognize, in varying proportions, 19 subdominant epitopes on 12 viral proteins. The gB-CD8s remain functional in TG throughout latency, while non-gB-CD8s exhibit varying degrees of functional compromise. To understand how dominance hierarchies relate to CD8+ T cell function during latency, we characterized the TG-associated CD8+ T cells following corneal infection with a recombinant HSV-1 lacking the immunodominant gB498-505 epitope (S1L). S1L induced a numerically equivalent CD8+ T cell infiltrate in the TG that was HSV-specific but lacked specificity for gB498-505. Instead, there was a general increase of non-gB-CD8s, with specific subdominant epitopes arising to codominance. In a latent S1L infection, non-gB-CD8s in the TG showed a hierarchy targeting different epitopes at latency than at acute times, and these cells retained increased functionality at latency. These non-gB-CD8s also displayed an ability to block HSV reactivation in ex vivo ganglionic cultures equivalent to that of cells from TG infected with wild-type HSV-1. These data indicate that loss of the immunodominant gB498-505 epitope alters the dominance hierarchy and reduces the functional compromise of CD8+ T cells specific for subdominant HSV-1 epitopes during viral latency. PMID:29206240

  5. Human infants' learning of social structures: the case of dominance hierarchy.

    PubMed

    Mascaro, Olivier; Csibra, Gergely

    2014-01-01

    We tested 15-month-olds' capacity to represent social-dominance hierarchies with more than two agents. Our results showed that infants found it harder to memorize dominance relations that were presented in an order that hindered the incremental formation of a single structure (Study 1). These results suggest that infants attempt to build structures incrementally, relation by relation, thereby simplifying the complex problem of recognizing a social structure. Infants also found circular dominance structures harder to process than linear dominance structures (Study 2). These expectations about the shape of structures may facilitate learning. Our results suggest that infants attempt to represent social structures composed of social relations. They indicate that human infants go beyond learning about individual social partners and their respective relations and form hypotheses about how social groups are organized.

  6. Decision Support System for Determining Scholarship Selection using an Analytical Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Puspitasari, T. D.; Sari, E. O.; Destarianto, P.; Riskiawan, H. Y.

    2018-01-01

    A decision support system is a computer application that analyzes and presents data so that users can make decisions more easily. Scholarship selection in the senior high school studied, in East Java, was not easy: an application was needed to improve the accuracy of targeting prospective beneficiaries among poor students and to speed up the screening process. This research builds a system using the Analytical Hierarchy Process (AHP), a method that decomposes a complex, unstructured problem into groups, organizes those groups into a hierarchical order, assigns numerical values to subjective judgments of relative importance, and finally determines by synthesis which elements have the highest priority. The accuracy of the system in this research is 90%.
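
    The AHP mechanics summarized above, pairwise comparisons synthesized into priorities, can be sketched generically; this is standard AHP arithmetic (normalized-column approximation plus Saaty's consistency check), not the paper's implementation.

```python
def ahp_priorities(matrix):
    """Approximate the AHP priority vector by normalized column averages and
    compute the consistency ratio (CR); a CR under ~0.1 is conventionally
    considered acceptable. 'matrix' is a square pairwise-comparison matrix."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    w = [sum(row) / n for row in norm]                  # priority vector
    # estimate the principal eigenvalue lambda_max from A*w
    lam = sum(sum(matrix[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)                            # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return w, (ci / ri if ri else 0.0)
```

    For a perfectly consistent 3x3 matrix such as [[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]], the priorities come out as 4/7, 2/7, 1/7 with CR ≈ 0.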

  7. Mammary Stem Cells: Premise, Properties, and Perspectives.

    PubMed

    Lloyd-Lewis, Bethan; Harris, Olivia B; Watson, Christine J; Davis, Felicity M

    2017-08-01

    Adult mammary stem cells (MaSCs) drive postnatal organogenesis and remodeling in the mammary gland, and their longevity and potential have important implications for breast cancer. However, despite intense investigation the identity, location, and differentiation potential of MaSCs remain subject to deliberation. The application of genetic lineage-tracing models, combined with quantitative 3D imaging and biophysical methods, has provided new insights into the mammary epithelial hierarchy that challenge classical definitions of MaSC potency and behaviors. We review here recent advances - discussing fundamental unresolved properties of MaSC potency, dynamics, and plasticity - and point to evolving technologies that promise to shed new light on this intractable debate. Elucidation of the physiological mammary differentiation hierarchy is paramount to understanding the complex heterogeneous breast cancer landscape. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Brief Report: Effect of Spatial Complexity on Visual Short-Term Memory and Self-Reported Autistic-Like Traits in Typically Developed Individuals

    ERIC Educational Resources Information Center

    Takahashi, Junichi; Gyoba, Jiro; Yamawaki, Nozomi

    2013-01-01

    This report examines effects of the spatial complexity of configurations on visual short-term memory (VSTM) capacity for individuals from the general population differing on autism-spectrum quotient (AQ) scores. During each trial, nine-line segments with various orientations were arrayed in simple or complex configurations and presented in both…

  9. Memory for Negation in Coordinate and Complex Sentences

    ERIC Educational Resources Information Center

    Harris, Richard J.

    1976-01-01

    Two experiments were run to test memory for the negation morpheme "not" in coordinate sentences (e.g., The ballerina had twins and the policewoman did not have triplets) and complex sentences (e.g., The ghost scared Hamlet into not murdering Shakespeare). (Editor)

  10. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating the impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior: a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
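
    The Theory of Planned Behavior realization described above can be sketched as a simple synchronous update rule. The weights and threshold here are illustrative assumptions, not the paper's estimated parameters.

```python
def update_behaviors(attitudes, network, adopted,
                     w_attitude=0.6, w_norm=0.4, threshold=0.5):
    """One step of a Theory-of-Planned-Behavior-style update: agent i adopts
    a health behavior when a weighted mix of its own attitude and the
    adoption rate among its network neighbours crosses a threshold.
    'network[i]' lists the indices of agent i's neighbours."""
    new = list(adopted)
    for i, neighbours in enumerate(network):
        if neighbours:
            norm = sum(adopted[j] for j in neighbours) / len(neighbours)
        else:
            norm = 0.0
        intention = w_attitude * attitudes[i] + w_norm * norm
        new[i] = 1 if intention >= threshold else 0
    return new
```

    Iterating this rule lets adoption diffuse: a high-attitude agent adopts first, its neighbours' perceived norm rises, and adoption can cascade through the social network, which is the local agent-agent interaction the abstract refers to.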

  12. Mapping the developmental constraints on working memory span performance.

    PubMed

    Bayliss, Donna M; Jarrold, Christopher; Baddeley, Alan D; Gunn, Deborah M; Leigh, Eleanor

    2005-07-01

    This study investigated the constraints underlying developmental improvements in complex working memory span performance among 120 children of between 6 and 10 years of age. Independent measures of processing efficiency, storage capacity, rehearsal speed, and basic speed of processing were assessed to determine their contribution to age-related variance in complex span. Results showed that developmental improvements in complex span were driven by 2 age-related but separable factors: 1 associated with general speed of processing and 1 associated with storage ability. In addition, there was an age-related contribution shared between working memory, processing speed, and storage ability that was important for higher level cognition. These results pose a challenge for models of complex span performance that emphasize the importance of processing speed alone.

  13. HEP - A semaphore-synchronized multiprocessor with central control. [Heterogeneous Element Processor

    NASA Technical Reports Server (NTRS)

    Gilliland, M. C.; Smith, B. J.; Calvert, W.

    1976-01-01

    The paper describes the design concept of the Heterogeneous Element Processor (HEP), a system tailored to the special needs of scientific simulation. In order to achieve high-speed computation required by simulation, HEP features a hierarchy of processes executing in parallel on a number of processors, with synchronization being largely accomplished by hardware. A full-empty-reserve scheme of synchronization is realized by zero-one-valued hardware semaphores. A typical system has, besides the control computer and the scheduler, an algebraic module, a memory module, a first-in first-out (FIFO) module, an integrator module, and an I/O module. The architecture of the scheduler and the algebraic module is examined in detail.
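
    The full-empty scheme can be illustrated with a software analogue. HEP realized this in hardware with zero-one-valued semaphores on memory cells; this sketch substitutes a lock and condition variable, so it shows the synchronization discipline, not the hardware mechanism.

```python
import threading

class FullEmptyCell:
    """A memory cell guarded by a HEP-style full/empty state: reads block
    until the cell is 'full' and leave it 'empty'; writes block until it is
    'empty' and leave it 'full'."""
    def __init__(self):
        self._cond = threading.Condition()
        self._full = False
        self._value = None

    def write(self, value):
        with self._cond:
            while self._full:            # wait for the cell to be empty
                self._cond.wait()
            self._value, self._full = value, True
            self._cond.notify_all()

    def read(self):
        with self._cond:
            while not self._full:        # wait for the cell to be full
                self._cond.wait()
            value, self._full = self._value, False
            self._cond.notify_all()
            return value
```

    Chaining such cells between pipeline stages gives producer-consumer synchronization with no explicit locks in the application code, which is how HEP let many processes proceed in parallel with synchronization "largely accomplished by hardware".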

  14. Characterizing Task-Based OpenMP Programs

    PubMed Central

    Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats

    2015-01-01

    Programmers struggle to understand performance of task-based OpenMP programs since profiling tools only report thread-based performance. Performance tuning also requires task-based performance in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing exposed task parallelism and per-task instruction profiles of benchmarks in the widely-used Barcelona OpenMP Tasks Suite. Programmers can tune performance faster and understand performance tradeoffs more effectively than existing tools by using our method to characterize task-based performance. PMID:25860023

  15. The complexity of divisibility.

    PubMed

    Bausch, Johannes; Cubitt, Toby

    2016-09-01

    We address two sets of long-standing open questions in linear algebra and probability theory, from a computational complexity perspective: stochastic matrix divisibility, and divisibility and decomposability of probability distributions. We prove that finite divisibility of stochastic matrices is an NP-complete problem, and extend this result to nonnegative matrices, and completely-positive trace-preserving maps, i.e. the quantum analogue of stochastic matrices. We further prove a complexity hierarchy for the divisibility and decomposability of probability distributions, showing that finite distribution divisibility is in P, but decomposability is NP-hard. For the former, we give an explicit polynomial-time algorithm. All results on distributions extend to weak-membership formulations, proving that the complexity of these problems is robust to perturbations.
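
    The divisibility notion for distributions can be made concrete by brute convolution. This only demonstrates the definition over an explicit candidate list; it does not reproduce the paper's polynomial-time decision algorithm.

```python
def convolve(p, q):
    """Distribution of X + Y for independent X ~ p and Y ~ q, where p and q
    are probability vectors over finite supports 0..len-1."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def is_two_divisible(p, candidates, tol=1e-9):
    """Toy check: p is 2-divisible iff p equals q * q (self-convolution) for
    some distribution q; here we only test the supplied candidates."""
    return any(len(convolve(q, q)) == len(p)
               and all(abs(x - y) <= tol for x, y in zip(convolve(q, q), p))
               for q in candidates)
```

    For example, the binomial distribution [0.25, 0.5, 0.25] is 2-divisible (it is the self-convolution of [0.5, 0.5]), whereas [0.5, 0, 0.5] has no such factorization into two i.i.d. parts.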

  16. Overexpression of the vesicular acetylcholine transporter enhances dendritic complexity of adult-born hippocampal neurons and improves acquisition of spatial memory during aging.

    PubMed

    Nagy, Paul Michael; Aubert, Isabelle

    2015-05-01

    Aging is marked by progressive impairments in the process of adult neurogenesis and spatial memory performance. The underlying mechanisms for these impairments have not been fully established; however, they may coincide with decline of cholinergic signaling in the hippocampus. This study investigates whether augmenting cholinergic neurotransmission, by enhancing the expression of the vesicular acetylcholine transporter (VAChT), influences the age-related decline in the development of newborn hippocampal cells and spatial memory. We found that enhanced VAChT expression in the hippocampus of mice contributes to lifelong increases in the dendritic complexity of newborn neurons. Furthermore, enhanced VAChT expression improved memory acquisition through an increased use of spatially precise search strategies in the Morris water maze through the course of the aging process. These data suggest that VAChT overexpression contributes to increases in dendritic complexity and improved spatial memory during aging. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems, it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.

  18. The varieties of immunological experience: of pathogens, stress, and dendritic cells.

    PubMed

    Pulendran, Bali

    2015-01-01

    In the 40 years since their discovery, dendritic cells (DCs) have been recognized as central players in immune regulation. DCs sense microbial stimuli through pathogen-recognition receptors (PRRs) and decode, integrate, and present information derived from such stimuli to T cells, thus stimulating immune responses. DCs can also regulate the quality of immune responses. Several functionally specialized subsets of DCs exist, but DCs also display functional plasticity in response to diverse stimuli. In addition to sensing pathogens via PRRs, emerging evidence suggests that DCs can also sense stress signals, such as amino acid starvation, through ancient stress and nutrient sensing pathways, to stimulate adaptive immunity. Here, I discuss these exciting advances in the context of a historic perspective on the discovery of DCs and their role in immune regulation. I conclude with a discussion of emerging areas in DC biology in the systems immunology era and suggest that the impact of DCs on immunity can be usefully contextualized in a hierarchy-of-organization model in which DCs, their receptors and signaling networks, cell-cell interactions, tissue microenvironment, and the host macroenvironment represent different levels of the hierarchy. Immunity or tolerance can then be represented as a complex function of each of these hierarchies.

  19. Accessible, almost ab initio multi-scale modeling of entangled polymers via slip-links

    NASA Astrophysics Data System (ADS)

    Andreev, Marat

    It is widely accepted that the dynamics of entangled polymers can be described by the tube model. Here we advocate an alternative approach to entanglement modeling known as slip-links. Recently, slip-links were shown to possess important advantages over tube models: they have strong connections to atomistic, multichain levels of description, agree with non-equilibrium thermodynamics, are applicable to any chain architecture, and can be used in linear or non-linear rheology. We present a hierarchy of slip-link models that are connected to each other through successive coarse graining. Models in the hierarchy are consistent in their overlapping domains of applicability, allowing a straightforward mapping of parameters. In particular, the most detailed level of description has four parameters, three of which can be determined directly from atomistic simulations. On the other hand, the least detailed member of the hierarchy is numerically accessible and allows for non-equilibrium flow predictions of complex chain architectures. Using a GPU implementation, these predictions can be obtained in minutes of computational time on a single desktop equipped with a mainstream gaming GPU. The GPU code is available online for free download.

  20. Recognition and characterization of hierarchical interstellar structure. II - Structure tree statistics

    NASA Technical Reports Server (NTRS)

    Houlahan, Padraig; Scalo, John

    1992-01-01

    A new method of image analysis is described, in which images partitioned into 'clouds' are represented by simplified skeleton images, called structure trees, that preserve the spatial relations of the component clouds while disregarding information concerning their sizes and shapes. The method can be used to discriminate between images of projected hierarchical (multiply nested) and random three-dimensional simulated collections of clouds constructed on the basis of observed interstellar properties, and even intermediate systems formed by combining random and hierarchical simulations. For a given structure type, the method can distinguish between different subclasses of models with different parameters and reliably estimate their hierarchical parameters: average number of children per parent, scale reduction factor per level of hierarchy, density contrast, and number of resolved levels. An application to a column density image of the Taurus complex constructed from IRAS data is given. Moderately strong evidence for a hierarchical structural component is found, and parameters of the hierarchy, as well as the average volume filling factor and mass efficiency of fragmentation per level of hierarchy, are estimated. The existence of nested structure contradicts models in which large molecular clouds are supposed to fragment, in a single stage, into roughly stellar-mass cores.
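    Two of the hierarchy parameters estimated above, the average number of children per parent and the number of resolved levels, are plain tree statistics. A toy sketch (the tree and the cloud names are invented for illustration and are not the authors' structure-tree code):

    ```python
    # A nested cloud hierarchy represented as parent -> list of children.
    tree = {
        "complex": ["cloudA", "cloudB"],
        "cloudA": ["coreA1", "coreA2", "coreA3"],
        "cloudB": ["coreB1", "coreB2"],
    }

    def avg_children_per_parent(tree):
        """Mean branching factor over all non-leaf nodes."""
        parents = [k for k in tree if tree[k]]
        return sum(len(tree[p]) for p in parents) / len(parents)

    def n_levels(tree, root):
        """Depth of the hierarchy, counting the root as level 1."""
        children = tree.get(root, [])
        if not children:
            return 1
        return 1 + max(n_levels(tree, c) for c in children)

    print(avg_children_per_parent(tree))  # (2 + 3 + 2) / 3, i.e. about 2.33
    print(n_levels(tree, "complex"))      # 3 resolved levels
    ```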

  1. Hurwitz numbers and products of random matrices

    NASA Astrophysics Data System (ADS)

    Orlov, A. Yu.

    2017-09-01

    We study multimatrix models, which may be viewed as integrals of products of tau functions depending on the eigenvalues of products of random matrices. We consider tau functions of the two-component Kadomtsev-Petviashvili (KP) hierarchy (semi-infinite relativistic Toda lattice) and of the B-type KP (BKP) hierarchy introduced by Kac and van de Leur. Such integrals are sometimes tau functions themselves. We consider models that generate Hurwitz numbers H_{E,F}, where E is the Euler characteristic of the base surface and F is the number of branch points. We show that in the case where the integrands contain the product of n > 2 matrices, the integral generates Hurwitz numbers with E ≤ 2 and F ≤ n+2. The numbers E and F both depend on n and on the order of the factors in the matrix product. The Euler characteristic E can be either an even or an odd number, i.e., it can match both orientable and nonorientable (Klein) base surfaces, depending on the presence of the tau function of the BKP hierarchy in the integrand. We study two cases: products of complex matrices and products of unitary matrices.

  2. Selective transfer of visual working memory training on Chinese character learning.

    PubMed

    Opitz, Bertram; Schneiders, Julia A; Krick, Christoph M; Mecklinger, Axel

    2014-01-01

    Previous research has shown a systematic relationship between phonological working memory capacity and second language proficiency for alphabetic languages. However, little is known about the impact of working memory processes on second language learning in a non-alphabetic language such as Mandarin Chinese. Due to the greater complexity of the Chinese writing system we expect that visual working memory rather than phonological working memory exerts a unique influence on learning Chinese characters. This issue was explored in the present experiment by comparing visual working memory training with an active (auditory working memory training) control condition and a passive, no training control condition. Training induced modulations in language-related brain networks were additionally examined using functional magnetic resonance imaging in a pretest-training-posttest design. As revealed by pre- to posttest comparisons and analyses of individual differences in working memory training gains, visual working memory training led to positive transfer effects on visual Chinese vocabulary learning compared to both control conditions. In addition, we found sustained activation after visual working memory training in the (predominantly visual) left infero-temporal cortex that was associated with behavioral transfer. In the control conditions, activation either increased (active control condition) or decreased (passive control condition) without reliable behavioral transfer effects. This suggests that visual working memory training leads to more efficient processing and more refined responses in brain regions involved in visual processing. Furthermore, visual working memory training boosted additional activation in the precuneus, presumably reflecting mental image generation of the learned characters. 
We, therefore, suggest that the conjoint activity of the mid-fusiform gyrus and the precuneus after visual working memory training reflects an interaction of working memory and imagery processes with complex visual stimuli that fosters the coherent synthesis of a percept from a complex visual input in service of enhanced Chinese character learning. © 2013 Published by Elsevier Ltd.

  3. Molecular brake pad hypothesis: pulling off the brakes for emotional memory

    PubMed Central

    Vogel-Ciernia, Annie

    2015-01-01

    Under basal conditions histone deacetylases (HDACs) and their associated co-repressor complexes serve as molecular ‘brake pads’ to prevent the gene expression required for long-term memory formation. Following a learning event, HDACs and their co-repressor complexes are removed from a subset of specific gene promoters, allowing the histone acetylation and active gene expression required for long-term memory formation. Inhibition of HDACs increases histone acetylation, extends gene expression profiles, and allows for the formation of persistent long-term memories for training events that are otherwise forgotten. We propose that emotionally salient experiences have utilized this system to form strong and persistent memories for behaviorally significant events. Consequently, the presence or absence of HDACs at a selection of specific gene promoters could serve as a critical barrier for permitting the formation of long-term memories. PMID:23096102

  4. Misremembering emotion: Inductive category effects for complex emotional stimuli.

    PubMed

    Corbin, Jonathan C; Crawford, L Elizabeth; Vavra, Dylan T

    2017-07-01

    Memories of objects are biased toward what is typical of the category to which they belong. Prior research on memory for emotional facial expressions has demonstrated a bias towards an emotional expression prototype (e.g., slightly happy faces are remembered as happier). We investigate an alternate source of bias in memory for emotional expressions - the central tendency bias. The central tendency bias skews reconstruction of a memory trace towards the center of the distribution for a particular attribute. This bias has been attributed to a Bayesian combination of an imprecise memory for a particular object with prior information about its category. Until now, studies examining the central tendency bias have focused on simple stimuli. We extend this work to socially relevant, complex, emotional facial expressions. We morphed facial expressions on a continuum from sad to happy. Different ranges of emotion were used in four experiments in which participants viewed individual expressions and, after a variable delay, reproduced each face by adjusting a morph to match it. Estimates were biased toward the center of the presented stimulus range, and the bias increased at longer memory delays, consistent with the Bayesian prediction that as trace memory loses precision, category knowledge is given more weight. The central tendency effect persisted within and across emotion categories (sad, neutral, and happy). This article expands the scope of work on inductive category effects to memory for complex, emotional stimuli.
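    The Bayesian account sketched above combines an imprecise memory trace with the category prior by precision weighting, so longer delays (noisier traces) pull estimates toward the category center. A minimal numeric sketch (the specific means and standard deviations are invented for illustration):

    ```python
    def bayes_estimate(trace, trace_sd, prior_mean, prior_sd):
        """Posterior mean for a Gaussian memory trace combined with a Gaussian
        category prior: a precision-weighted average (Bayesian shrinkage)."""
        w = (1 / trace_sd**2) / (1 / trace_sd**2 + 1 / prior_sd**2)
        return w * trace + (1 - w) * prior_mean

    # A face encoded at morph level 80 on a 0-100 sad-to-happy continuum;
    # the prior sits at the center of the presented stimulus range (50).
    short_delay = bayes_estimate(80, trace_sd=5, prior_mean=50, prior_sd=15)
    long_delay = bayes_estimate(80, trace_sd=12, prior_mean=50, prior_sd=15)

    # The longer delay (larger trace_sd) pulls the estimate closer to 50,
    # reproducing the delay-dependent central tendency bias qualitatively.
    print(short_delay, long_delay)
    ```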

  5. Access, Status, and Representation: Some Reflections from Two Ethnographic Studies of Elite Schools

    ERIC Educational Resources Information Center

    Gaztambide-Fernandez, Ruben A.; Howard, Adam

    2012-01-01

    In this article, we use our experiences to demonstrate the limits of the "studying up" metaphor to capture the complexity of the dynamics involved in doing research on groups that occupy positions of power within social hierarchies. The article focuses on different facets of the research process, alternating between our individual narratives and a…

  6. The Effect of Orthographic Complexity on Spanish Spelling in Grades 1-3

    ERIC Educational Resources Information Center

    Ford, Karen; Invernizzi, Marcia; Huang, Francis

    2018-01-01

    This study was designed to identify a continuum of orthographic features that characterize Spanish spelling development in Grades 1-3. Two research questions guided this work: (1) Is there a hierarchy of orthographic features that affect students' spelling accuracy in Spanish over and above other school-level, student-level, and word-level…

  7. The potential for species conservation in tropical secondary forests

    Treesearch

    Robin L. Chazdon; Carlos A. Peres; Daisy Dent; Douglas Sheil; Ariel E. Lugo; David Lamb; Nigel E. Stork; Scott E. Miller

    2009-01-01

    In the wake of widespread loss of old-growth forests throughout the tropics, secondary forests will likely play a growing role in the conservation of forest biodiversity. We considered a complex hierarchy of factors that interact in space and time to determine the conservation potential of tropical secondary forests. Beyond the characteristics of local forest patches,...

  8. Distributive Leadership in Public Schools: Experiences and Perceptions of Teachers in the Soweto Region

    ERIC Educational Resources Information Center

    Naicker, Suraiya R.; Mestry, Raj

    2011-01-01

    In current times, the increasing demands of principalship and the complexities facing schools have led to the emergence of distributive forms of leadership in schools. The dissatisfaction with traditional models has resulted in a paradigm shift where leadership focus on the position of individuals in the hierarchy has been rejected in favour of…

  9. Project Success for the SLD Child, Motor-Perception Activities.

    ERIC Educational Resources Information Center

    Wayne - Carroll Public Schools, Wayne, NE.

    Presented is a curriculum guide for a perceptual motor program which was developed by Project Success (Nebraska) through a Title III grant for language learning disabled elementary level students in kindergarten through grade 3. The program is said to be arranged in a hierarchy of skills ranging from simple to complex and to be written so that the…

  10. A unified construction for the algebro-geometric quasiperiodic solutions of the Lotka-Volterra and relativistic Lotka-Volterra hierarchy

    NASA Astrophysics Data System (ADS)

    Zhao, Peng; Fan, Engui

    2015-04-01

    In this paper, a new type of integrable differential-difference hierarchy, namely, the generalized relativistic Lotka-Volterra (GRLV) hierarchy, is introduced. This hierarchy is closely related to Lotka-Volterra lattice and relativistic Lotka-Volterra lattice, which allows us to provide a unified and effective way to obtain some exact solutions for both the Lotka-Volterra hierarchy and the relativistic Lotka-Volterra hierarchy. In particular, we shall construct algebro-geometric quasiperiodic solutions for the LV hierarchy and the RLV hierarchy in a unified manner on the basis of the finite gap integration theory.
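    For orientation, the first member of the Lotka-Volterra hierarchy is the standard Volterra lattice, quoted here for context (the generalized relativistic equations are defined in the paper itself and are not reproduced):

    ```latex
    \frac{\mathrm{d}a_n}{\mathrm{d}t} = a_n \left( a_{n+1} - a_{n-1} \right)
    ```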

  11. Simulation of n-qubit quantum systems. III. Quantum operations

    NASA Astrophysics Data System (ADS)

    Radtke, T.; Fritzsche, S.

    2007-05-01

    During the last decade, several quantum information protocols, such as quantum key distribution, teleportation or quantum computation, have attracted a lot of interest. Despite the recent success and research efforts in quantum information processing, however, we are just at the beginning of understanding the role of entanglement and the behavior of quantum systems in noisy environments, i.e. for nonideal implementations. Therefore, in order to facilitate the investigation of entanglement and decoherence in n-qubit quantum registers, here we present a revised version of the FEYNMAN program for working with quantum operations and their associated (Jamiołkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. Apart from the implementation of different noise models, the current program extension may help investigate the fragility of many quantum states, one of the main obstacles in realizing quantum information protocols today. Program summaryTitle of program: Feynman Catalogue identifier: ADWE_v3_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADWE_v3_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Licensing provisions: None Operating systems: Any system that supports MAPLE; tested under Microsoft Windows XP, SuSe Linux 10 Program language used:MAPLE 10 Typical time and memory requirements: Most commands that act upon quantum registers with five or less qubits take ⩽10 seconds of processor time (on a Pentium 4 processor with ⩾2 GHz or equivalent) and 5-20 MB of memory. Especially when working with symbolic expressions, however, the memory and time requirements critically depend on the number of qubits in the quantum registers, owing to the exponential dimension growth of the associated Hilbert space. 
For example, complex (symbolic) noise models (with several Kraus operators) for multi-qubit systems often result in very large symbolic expressions that dramatically slow down the evaluation of measures or other quantities. In these cases, MAPLE's assume facility sometimes helps to reduce the complexity of symbolic expressions, but often only numerical evaluation is possible. Since the complexity of the FEYNMAN commands is very different, no general scaling law for the CPU time and memory usage can be given. No. of bytes in distributed program including test data, etc.: 799 265 No. of lines in distributed program including test data, etc.: 18 589 Distribution format: tar.gz Reasons for new version: While the previous program versions were designed mainly to create and manipulate the state of quantum registers, the present extension aims to support quantum operations as the essential ingredient for studying the effects of noisy environments. Does this version supersede the previous version: Yes Nature of the physical problem: Today, entanglement is identified as the essential resource in virtually all aspects of quantum information theory. In most practical implementations of quantum information protocols, however, decoherence typically limits the lifetime of entanglement. It is therefore necessary and highly desirable to understand the evolution of entanglement in noisy environments. Method of solution: Using the computer algebra system MAPLE, we have developed a set of procedures that support the definition and manipulation of n-qubit quantum registers as well as (unitary) logic gates and (nonunitary) quantum operations that act on the quantum registers. The provided hierarchy of commands can be used interactively in order to simulate and analyze the evolution of n-qubit quantum systems in ideal and nonideal quantum circuits.
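    As a concrete reference point for what a quantum operation does, here is a minimal Python sketch of applying a set of Kraus operators to a one-qubit density matrix (a generic phase-damping channel; this illustrates the mathematics rho -> sum_k K_k rho K_k^dagger, not the FEYNMAN program's MAPLE code, and the helper names are invented):

    ```python
    import math

    def dagger(M):
        """Conjugate transpose of a matrix given as nested lists."""
        return [[M[j][i].conjugate() for j in range(len(M))]
                for i in range(len(M[0]))]

    def matmul(A, B):
        n, m, p = len(A), len(B[0]), len(B)
        return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(m)]
                for i in range(n)]

    def apply_channel(rho, kraus):
        """rho -> sum_k K rho K^dagger for a one-qubit density matrix."""
        out = [[0j, 0j], [0j, 0j]]
        for K in kraus:
            T = matmul(matmul(K, rho), dagger(K))
            for i in range(2):
                for j in range(2):
                    out[i][j] += T[i][j]
        return out

    p = 0.3  # dephasing probability
    K0 = [[1, 0], [0, math.sqrt(1 - p)]]
    K1 = [[0, 0], [0, math.sqrt(p)]]

    # |+><+|, a fully coherent superposition state.
    rho = [[0.5, 0.5], [0.5, 0.5]]
    rho_out = apply_channel(rho, [K0, K1])

    # Populations survive; the off-diagonal coherence shrinks by sqrt(1-p),
    # and the trace stays 1 (the channel is trace preserving).
    print(rho_out[0][0].real, abs(rho_out[0][1]))
    ```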

  12. Major Robert Lawrence Memorial Tribute

    NASA Image and Video Library

    2017-12-08

    During an Astronauts Memorial Foundation tribute honoring U.S. Air Force Maj. Robert Lawrence, his sister, Barbara Lawrence, Ph.D., far right, places a flower at the Space Mirror Memorial at the Kennedy Space Center Visitor Complex. Selected in 1967 for the Manned Orbiting Laboratory Program, Lawrence was the first African-American astronaut. He lost his life in a training accident 50 years ago. The ceremony took place in the Center for Space Education at the Kennedy Space Center Visitor Complex.

  13. Long-Term Moderate Exercise Rescues Age-Related Decline in Hippocampal Neuronal Complexity and Memory.

    PubMed

    Tsai, Sheng-Feng; Ku, Nai-Wen; Wang, Tzu-Feng; Yang, Yan-Hsiang; Shih, Yao-Hsiang; Wu, Shih-Ying; Lee, Chu-Wan; Yu, Megan; Yang, Ting-Ting; Kuo, Yu-Min

    2018-05-07

    Aging impairs hippocampal neuroplasticity and hippocampus-related learning and memory. In contrast, exercise training is known to improve hippocampal neuronal function. However, whether exercise is capable of restoring memory function in old animals is less clear. Here, we investigated the effects of exercise on hippocampal neuroplasticity and memory function during aging. Young (3 months), middle-aged (9-12 months), and old (18 months) mice underwent moderate-intensity treadmill running training for 6 weeks, and their hippocampus-related learning and memory and the plasticity of their CA1 neurons were evaluated. Memory performance (Morris water maze and novel object recognition tests) and the dendritic complexity (branch number and length) and spine density of hippocampal CA1 neurons decreased with age. The induction and maintenance of high-frequency stimulation-induced long-term potentiation in the CA1 area and the expression of neuroplasticity-related proteins were not affected by age. Treadmill running increased CA1 neuron long-term potentiation and dendritic complexity in all three age groups, and it restored learning and memory ability in middle-aged and old mice. Furthermore, treadmill running upregulated the hippocampal expression of brain-derived neurotrophic factor and monocarboxylate transporter-4 in middle-aged mice, glutamine synthetase in old mice, and full-length TrkB in middle-aged and old mice. Hippocampus-related memory function declines from middle age, but long-term moderate-intensity running effectively increased hippocampal neuroplasticity and memory in mice of different ages, even when the memory impairment had progressed to an advanced stage. Thus, long-term, moderate-intensity exercise training might be a way of delaying and treating aging-related memory decline. © 2018 S. Karger AG, Basel.

  14. Intrinsic frequency biases and profiles across human cortex.

    PubMed

    Mellem, Monika S; Wohltjen, Sophie; Gotts, Stephen J; Ghuman, Avniel Singh; Martin, Alex

    2017-11-01

    Recent findings in monkeys suggest that intrinsic periodic spiking activity in selective cortical areas occurs at timescales that follow a sensory or lower order-to-higher order processing hierarchy (Murray JD, Bernacchia A, Freedman DJ, Romo R, Wallis JD, Cai X, Padoa-Schioppa C, Pasternak T, Seo H, Lee D, Wang XJ. Nat Neurosci 17: 1661-1663, 2014). It has not yet been fully explored if a similar timescale hierarchy is present in humans. Additionally, these measures in the monkey studies have not addressed findings that rhythmic activity within a brain area can occur at multiple frequencies. In this study we investigate in humans if regions may be biased toward particular frequencies of intrinsic activity and if a full cortical mapping still reveals an organization that follows this hierarchy. We examined the spectral power in multiple frequency bands (0.5-150 Hz) from task-independent data using magnetoencephalography (MEG). We compared standardized power across bands to find regional frequency biases. Our results demonstrate a mix of lower and higher frequency biases across sensory and higher order regions. Thus they suggest a more complex cortical organization that does not simply follow this hierarchy. Additionally, some regions do not display a bias for a single band, and a data-driven clustering analysis reveals a regional organization with high standardized power in multiple bands. Specifically, theta and beta are both high in dorsal frontal cortex, whereas delta and gamma are high in ventral frontal cortex and temporal cortex. Occipital and parietal regions are biased more narrowly toward alpha power, and ventral temporal lobe displays specific biases toward gamma. Thus intrinsic rhythmic neural activity displays a regional organization but one that is not necessarily hierarchical. NEW & NOTEWORTHY The organization of rhythmic neural activity is not well understood. 
Whereas it has been postulated that rhythms are organized in a hierarchical manner across brain regions, our novel analysis allows comparison of full cortical maps across different frequency bands, which demonstrate that the rhythmic organization is more complex. Additionally, data-driven methods show that rhythms of multiple frequencies or timescales occur within a particular region and that this nonhierarchical organization is widespread. Copyright © 2017 the American Physiological Society.

  15. Hierarchy, Dominance, and Deliberation: Egalitarian Values Require Mental Effort.

    PubMed

    Van Berkel, Laura; Crandall, Christian S; Eidelman, Scott; Blanchar, John C

    2015-09-01

    Hierarchy and dominance are ubiquitous. Because social hierarchy is early learned and highly rehearsed, the value of hierarchy enjoys relative ease over competing egalitarian values. In six studies, we interfere with deliberate thinking and measure endorsement of hierarchy and egalitarianism. In Study 1, bar patrons' blood alcohol content was correlated with hierarchy preference. In Study 2, cognitive load increased the authority/hierarchy moral foundation. In Study 3, low-effort thought instructions increased hierarchy endorsement and reduced equality endorsement. In Study 4, ego depletion increased hierarchy endorsement and caused a trend toward reduced equality endorsement. In Study 5, low-effort thought instructions increased endorsement of hierarchical attitudes among those with a sense of low personal power. In Study 6, participants' thinking quickly allocated more resources to high-status groups. Across five operationalizations of impaired deliberative thought, hierarchy endorsement increased and egalitarianism receded. These data suggest hierarchy may persist in part because it has a psychological advantage. © 2015 by the Society for Personality and Social Psychology, Inc.

  16. Testing New Programming Paradigms with NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Schultz, M.; Yan, J.

    2000-01-01

    Over the past decade, high performance computing has evolved rapidly, not only in hardware architectures but also in the increasing complexity of real applications. Technologies have been developed that aim at scaling up to thousands of processors on both distributed and shared memory systems. Development of parallel programs on these computers is always a challenging task. Today, writing parallel programs with message passing (e.g. MPI) is the most popular way of achieving scalability and high performance. However, writing message passing programs is difficult and error prone. In recent years, new effort has been made in defining new parallel programming paradigms. The best examples are HPF (based on data parallelism) and OpenMP (based on shared memory parallelism). Both provide simple and clear extensions to sequential programs and thus greatly simplify the tedious tasks encountered in writing message passing programs. HPF is independent of the memory hierarchy; however, due to the immaturity of compiler technology, its performance is still questionable. Although the use of parallel compiler directives is not new, OpenMP offers a portable solution in the shared-memory domain. Another important development involves the tremendous progress in the internet and its associated technology. Although still in its infancy, Java promises portability in a heterogeneous environment and offers the possibility to "compile once and run anywhere." To test these new technologies, we implemented new parallel versions of the NAS Parallel Benchmarks (NPBs) with HPF and OpenMP directives, and extended the work with Java and Java threads. The purpose of this study is to examine the effectiveness of alternative programming paradigms. NPBs consist of five kernels and three simulated applications that mimic the computation and data movement of large-scale computational fluid dynamics (CFD) applications. We started with the serial version included in NPB2.3.
Optimization of memory and cache usage was applied to several benchmarks, notably BT and SP, resulting in better sequential performance. In order to overcome the lack of an HPF performance model and to guide the development of the HPF codes, we employed an empirical performance model for several primitives found in the benchmarks. We encountered a few limitations of HPF, such as the lack of support for the "REDISTRIBUTION" directive and no easy way to handle irregular computation. The parallelization with OpenMP directives was done at the outermost loop level to achieve the largest granularity. The performance of six HPF and OpenMP benchmarks is compared with their MPI counterparts for the Class-A problem size in the figure on the next page. These results were obtained on an SGI Origin2000 (195 MHz) with the MIPSpro-f77 compiler 7.2.1 for the OpenMP and MPI codes and the PGI pghpf-2.4.3 compiler with MPI interface for the HPF programs.

  17. Positive feelings facilitate working memory and complex decision making among older adults.

    PubMed

    Carpenter, Stephanie M; Peters, Ellen; Västfjäll, Daniel; Isen, Alice M

    2013-01-01

    The impact of induced mild positive feelings on working memory and complex decision making among older adults (aged 63-85) was examined. Participants completed a computer-administered card task in which they could win money if they chose from "gain" decks and lose money if they chose from "loss" decks. Individuals in the positive-feeling condition chose better than neutral-feeling participants and earned more money overall. Participants in the positive-feeling condition also demonstrated improved working-memory capacity. These effects of positive-feeling induction have implications for affect theory as well as, potentially, practical implications for people of all ages dealing with complex decisions.

  18. Targeted Memory Reactivation during Sleep Adaptively Promotes the Strengthening or Weakening of Overlapping Memories.

    PubMed

    Oyarzún, Javiera P; Morís, Joaquín; Luque, David; de Diego-Balaguer, Ruth; Fuentemilla, Lluís

    2017-08-09

    System memory consolidation is conceptualized as an active process whereby newly encoded memory representations are strengthened through selective memory reactivation during sleep. However, our learning experience is highly overlapping in content (i.e., shares common elements), and memories of these events are organized in an intricate network of overlapping associated events. It remains to be explored whether and how selective memory reactivation during sleep has an impact on these overlapping memories acquired during awake time. Here, we test in a group of adult women and men the prediction that selective memory reactivation during sleep entails the reactivation of associated events and that this may lead the brain to adaptively regulate whether these associated memories are strengthened or pruned from memory networks on the basis of their relative associative strength with the shared element. Our findings demonstrate the existence of efficient regulatory neural mechanisms governing how complex memory networks are shaped during sleep as a function of their associative memory strength. SIGNIFICANCE STATEMENT Numerous studies have demonstrated that system memory consolidation is an active, selective, and sleep-dependent process in which only subsets of new memories become stabilized through their reactivation. However, the learning experience is highly overlapping in content and thus events are encoded in an intricate network of related memories. It remains to be explored whether and how memory reactivation has an impact on overlapping memories acquired during awake time. Here, we show that sleep memory reactivation promotes strengthening and weakening of overlapping memories based on their associative memory strength. These results suggest the existence of an efficient regulatory neural mechanism that avoids the formation of cluttered memory representation of multiple events and promotes stabilization of complex memory networks. 
Copyright © 2017 the authors 0270-6474/17/377748-11$15.00/0.

  19. Segregating the core computational faculty of human language from working memory

    PubMed Central

    Makuuchi, Michiru; Bahlmann, Jörg; Anwander, Alfred; Friederici, Angela D.

    2009-01-01

    In contrast to simple structures in animal vocal behavior, hierarchical structures such as center-embedded sentences manifest the core computational faculty of human language. Previous artificial grammar learning studies found that the left pars opercularis (LPO) subserves the processing of hierarchical structures. However, it is not clear whether this area is activated by the structural complexity per se or by the increased memory load entailed in processing hierarchical structures. To dissociate the effect of structural complexity from the effect of memory cost, we conducted a functional magnetic resonance imaging study of German sentence processing with a 2-way factorial design tapping structural complexity (with/without hierarchical structure, i.e., center-embedding of clauses) and working memory load (long/short distance between syntactically dependent elements; i.e., subject nouns and their respective verbs). Functional imaging data revealed that the processes for structure and memory operate separately but co-operatively in the left inferior frontal gyrus; activities in the LPO increased as a function of structural complexity, whereas activities in the left inferior frontal sulcus (LIFS) were modulated by the distance over which the syntactic information had to be transferred. Diffusion tensor imaging showed that these 2 regions were interconnected through white matter fibers. Moreover, functional coupling between the 2 regions was found to increase during the processing of complex, hierarchically structured sentences. These results suggest a neuroanatomical segregation of syntax-related aspects represented in the LPO from memory-related aspects reflected in the LIFS, which are, however, highly interconnected functionally and anatomically. PMID:19416819

  20. Working Memory in Children with Reading Disabilities

    ERIC Educational Resources Information Center

    Gathercole, Susan Elizabeth; Alloway, Tracy Packiam; Willis, Catherine; Adams, Anne-Marie

    2006-01-01

    This study investigated associations between working memory (measured by complex memory tasks) and both reading and mathematics abilities, as well as the possible mediating factors of fluid intelligence, verbal abilities, short-term memory (STM), and phonological awareness, in a sample of 46 6- to 11-year-olds with reading disabilities. As a…

  1. Are Working Memory Measures Free of Socioeconomic Influence?

    ERIC Educational Resources Information Center

    Engel, Pascale Marguerite Josiane; Santos, Flavia Heloisa; Gathercole, Susan Elizabeth

    2008-01-01

    Purpose: This study evaluated the impact of socioeconomic factors on children's performance on tests of working memory and vocabulary. Method: Twenty Brazilian children, aged 6 and 7 years, from low-income families, completed tests of working memory (verbal short-term memory and verbal complex span) and vocabulary (expressive and receptive). A…

  2. Memory: Issues of Import to School Psychologists.

    ERIC Educational Resources Information Center

    John, Kirk R.

    This document defines memory as a complex, interactive process that is a prerequisite for all higher learning. Without intact memory skills, a host of disorders may ensue ranging from mild learning problems to disorientation and helplessness (Lezak, 1983). Because of the pervasive and central role memory plays in people's lives, school…

  3. Sparse distributed memory overview

    NASA Technical Reports Server (NTRS)

    Raugh, Mike

    1990-01-01

The Sparse Distributed Memory (SDM) project is investigating the theory and applications of a massively parallel computing architecture, called sparse distributed memory, that will support the storage and retrieval of sensory and motor patterns characteristic of autonomous systems. The immediate objectives of the project are centered in studies of the memory itself and in the use of the memory to solve problems in speech, vision, and robotics. Investigation of methods for encoding sensory data is an important part of the research. Examples of NASA missions that may benefit from this work are Space Station, planetary rovers, and solar exploration. Sparse distributed memory offers promising technology for systems that must learn through experience and be capable of adapting to new circumstances, and for operating any large complex system requiring automatic monitoring and control. Sparse distributed memory is a massively parallel architecture motivated by efforts to understand how the human brain works. Sparse distributed memory is an associative memory, able to retrieve information from cues that only partially match patterns stored in the memory. It is able to store long temporal sequences derived from the behavior of a complex system, such as progressive records of the system's sensory data and correlated records of the system's motor controls.
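As a rough illustration of the associative mechanism this record describes, the following sketch stores a binary pattern across many randomly addressed "hard locations" and retrieves it from a partial-match cue by majority vote. All parameters (200 locations, 64 address bits, Hamming radius 29) are hypothetical choices for the sketch, not the project's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, R = 200, 64, 29  # hard locations, address bits, activation radius (hypothetical)

hard_addresses = rng.integers(0, 2, size=(N, D))
counters = np.zeros((N, D), dtype=int)

def activated(address):
    # Locations whose address is within Hamming distance R of the cue
    return np.sum(hard_addresses != address, axis=1) <= R

def write(address, data):
    # Increment counters for 1-bits, decrement for 0-bits, at every activated location
    counters[activated(address)] += np.where(data == 1, 1, -1)

def read(address):
    # Pool the counters of activated locations and take a bitwise majority vote
    sums = counters[activated(address)].sum(axis=0)
    return (sums > 0).astype(int)

pattern = rng.integers(0, 2, size=D)
write(pattern, pattern)

noisy_cue = pattern.copy()
noisy_cue[:5] ^= 1           # corrupt 5 of the 64 cue bits
recovered = read(noisy_cue)  # typically reconstructs the stored pattern
```

Because many locations respond to any one cue, a cue that only partially matches still activates a largely overlapping set of locations, which is what makes the retrieval tolerant of noise.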

  4. Complex network structure influences processing in long-term and short-term memory.

    PubMed

    Vitevitch, Michael S; Chan, Kit Ying; Roodenrys, Steven

    2012-07-01

Complex networks describe how entities in systems interact; the structure of such networks is argued to influence processing. One measure of network structure, clustering coefficient, C, measures the extent to which neighbors of a node are also neighbors of each other. Previous psycholinguistic experiments found that the C of phonological word-forms influenced retrieval from the mental lexicon (that portion of long-term memory dedicated to language) during the on-line recognition and production of spoken words. In the present study we examined how network structure influences other retrieval processes in long- and short-term memory. In a false-memory task (examining long-term memory), participants falsely recognized more words with low- than high-C. In a recognition memory task (examining veridical memories in long-term memory), participants correctly recognized more words with low- than high-C. However, participants in a serial recall task (examining redintegration in short-term memory) recalled lists comprised of high-C words more accurately than lists comprised of low-C words. These results demonstrate that network structure influences cognitive processes associated with several forms of memory including lexical, long-term, and short-term.
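The clustering coefficient C used in this record has a simple definition: for a node with k neighbors, C is the fraction of the k(k-1)/2 possible neighbor pairs that are themselves connected. A minimal sketch over a toy undirected graph (the words are illustrative phonological neighbors, not the study's stimuli):

```python
from itertools import combinations

# Toy undirected graph as adjacency sets (illustrative, not the study's lexicon)
graph = {
    "cat":  {"bat", "hat", "cot"},
    "bat":  {"cat", "hat"},
    "hat":  {"cat", "bat"},
    "cot":  {"cat", "coat"},
    "coat": {"cot"},
}

def clustering_coefficient(g, node):
    """Fraction of a node's neighbor pairs that are themselves linked."""
    neighbors = g[node]
    k = len(neighbors)
    if k < 2:
        return 0.0  # C is undefined/zero for fewer than two neighbors
    links = sum(1 for u, v in combinations(neighbors, 2) if v in g[u])
    return 2 * links / (k * (k - 1))

c_cat = clustering_coefficient(graph, "cat")  # 1 of 3 neighbor pairs linked -> 1/3
```

Here "cat" has three neighbors, of which only the pair ("bat", "hat") is itself connected, giving C = 1/3; a high-C word's neighbors form a tightly knit cluster, a low-C word's neighbors are mutually unrelated.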

  5. Phenomenological analysis of medical time series with regular and stochastic components

    NASA Astrophysics Data System (ADS)

    Timashev, Serge F.; Polyakov, Yuriy S.

    2007-06-01

Flicker-Noise Spectroscopy (FNS), a general approach to the extraction and parameterization of resonant and stochastic components contained in medical time series, is presented. The basic idea of FNS is to treat the correlation links present in sequences of different irregularities, such as spikes, "jumps", and discontinuities in derivatives of different orders, on all levels of the spatiotemporal hierarchy of the system under study as main information carriers. The tools to extract and analyze the information are power spectra and difference moments (structural functions), which complement each other's information. The structural function stochastic component is formed exclusively by "jumps" of the dynamic variable while the power spectrum stochastic component is formed by both spikes and "jumps" on every level of the hierarchy. The information "passport" characteristics that are determined by fitting the derived expressions to the experimental variations for the stochastic components of power spectra and structural functions are interpreted as the correlation times and parameters that describe the rate of "memory loss" on these correlation time intervals for different irregularities. The number of the extracted parameters is determined by the requirements of the problem under study. Application of this approach to the analysis of tremor velocity signals for a Parkinsonian patient is discussed.
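The two FNS tools named here are standard signal-processing quantities. A minimal sketch (on a synthetic series, not the paper's tremor data) computes the second-order difference moment and the periodogram of a signal containing a resonant component and a random-walk ("jumps") component:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(4096)
# Synthetic series: a resonant (periodic) part plus a random-walk stochastic part
x = np.sin(2 * np.pi * t / 64) + np.cumsum(rng.normal(0, 0.05, t.size))

def structure_function(x, taus):
    # Second-order difference moment: Phi(tau) = <[x(t + tau) - x(t)]**2>
    return np.array([np.mean((x[tau:] - x[:-tau]) ** 2) for tau in taus])

taus = np.arange(1, 200)
phi = structure_function(x, taus)                # grows with lag for the random walk
power = np.abs(np.fft.rfft(x - x.mean())) ** 2   # periodogram: peak at the resonance
```

Fitting parameterized model expressions to curves like `phi` and `power`, as the paper describes, is what yields the correlation times and "memory loss" rates.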

  6. Cell of origin associated classification of B-cell malignancies by gene signatures of the normal B-cell hierarchy.

    PubMed

    Johnsen, Hans Erik; Bergkvist, Kim Steve; Schmitz, Alexander; Kjeldsen, Malene Krag; Hansen, Steen Møller; Gaihede, Michael; Nørgaard, Martin Agge; Bæch, John; Grønholdt, Marie-Louise; Jensen, Frank Svendsen; Johansen, Preben; Bødker, Julie Støve; Bøgsted, Martin; Dybkær, Karen

    2014-06-01

    Recent findings have suggested biological classification of B-cell malignancies as exemplified by the "activated B-cell-like" (ABC), the "germinal-center B-cell-like" (GCB) and primary mediastinal B-cell lymphoma (PMBL) subtypes of diffuse large B-cell lymphoma and "recurrent translocation and cyclin D" (TC) classification of multiple myeloma. Biological classification of B-cell derived cancers may be refined by a direct and systematic strategy where identification and characterization of normal B-cell differentiation subsets are used to define the cancer cell of origin phenotype. Here we propose a strategy combining multiparametric flow cytometry, global gene expression profiling and biostatistical modeling to generate B-cell subset specific gene signatures from sorted normal human immature, naive, germinal centrocytes and centroblasts, post-germinal memory B-cells, plasmablasts and plasma cells from available lymphoid tissues including lymph nodes, tonsils, thymus, peripheral blood and bone marrow. This strategy will provide an accurate image of the stage of differentiation, which prospectively can be used to classify any B-cell malignancy and eventually purify tumor cells. This report briefly describes the current models of the normal B-cell subset differentiation in multiple tissues and the pathogenesis of malignancies originating from the normal germinal B-cell hierarchy.

  7. The compositional and evolutionary logic of metabolism

    NASA Astrophysics Data System (ADS)

    Braakman, Rogier; Smith, Eric

    2013-02-01

Metabolism is built on a foundation of organic chemistry, and employs structures and interactions at many scales. Despite these sources of complexity, metabolism also displays striking and robust regularities in the forms of modularity and hierarchy, which may be described compactly in terms of relatively few principles of composition. These regularities render metabolic architecture comprehensible as a system, and also suggest the order in which layers of that system came into existence. In addition, metabolism also serves as a foundational layer in other hierarchies, up to at least the levels of cellular integration including bioenergetics and molecular replication, and trophic ecology. The recapitulation of patterns first seen in metabolism, in these higher levels, motivates us to interpret metabolism as a source of causation or constraint on many forms of organization in the biosphere. Many of the forms of modularity and hierarchy exhibited by metabolism are readily interpreted as stages in the emergence of catalytic control by living systems over organic chemistry, sometimes recapitulating or incorporating geochemical mechanisms. We identify as modules, either subsets of chemicals and reactions, or subsets of functions, that are re-used in many contexts with a conserved internal structure. At the small molecule substrate level, module boundaries are often associated with the most complex reaction mechanisms, catalyzed by highly conserved enzymes. Cofactors form a biosynthetically and functionally distinctive control layer over the small-molecule substrate. The most complex members among the cofactors are often associated with the reactions at module boundaries in the substrate networks, while simpler cofactors participate in widely generalized reactions. 
The highly tuned chemical structures of cofactors (sometimes exploiting distinctive properties of the elements of the periodic table) thereby act as ‘keys’ that incorporate classes of organic reactions within biochemistry. Module boundaries provide the interfaces where change is concentrated, when we catalogue extant diversity of metabolic phenotypes. The same modules that organize the compositional diversity of metabolism are argued, with many explicit examples, to have governed long-term evolution. Early evolution of core metabolism, and especially of carbon-fixation, appears to have required very few innovations, and to have used few rules of composition of conserved modules, to produce adaptations to simple chemical or energetic differences of environment without diverse solutions and without historical contingency. We demonstrate these features of metabolism at each of several levels of hierarchy, beginning with the small-molecule metabolic substrate and network architecture, continuing with cofactors and key conserved reactions, and culminating in the aggregation of multiple diverse physical and biochemical processes in cells.

  8. Distinguishing cognitive state with multifractal complexity of hippocampal interspike interval sequences

    PubMed Central

    Fetterhoff, Dustin; Kraft, Robert A.; Sandler, Roman A.; Opris, Ioan; Sexton, Cheryl A.; Marmarelis, Vasilis Z.; Hampson, Robert E.; Deadwyler, Sam A.

    2015-01-01

    Fractality, represented as self-similar repeating patterns, is ubiquitous in nature and the brain. Dynamic patterns of hippocampal spike trains are known to exhibit multifractal properties during working memory processing; however, it is unclear whether the multifractal properties inherent to hippocampal spike trains reflect active cognitive processing. To examine this possibility, hippocampal neuronal ensembles were recorded from rats before, during and after a spatial working memory task following administration of tetrahydrocannabinol (THC), a memory-impairing component of cannabis. Multifractal detrended fluctuation analysis was performed on hippocampal interspike interval sequences to determine characteristics of monofractal long-range temporal correlations (LRTCs), quantified by the Hurst exponent, and the degree/magnitude of multifractal complexity, quantified by the width of the singularity spectrum. Our results demonstrate that multifractal firing patterns of hippocampal spike trains are a marker of functional memory processing, as they are more complex during the working memory task and significantly reduced following administration of memory impairing THC doses. Conversely, LRTCs are largest during resting state recordings, therefore reflecting different information compared to multifractality. In order to deepen conceptual understanding of multifractal complexity and LRTCs, these measures were compared to classical methods using hippocampal frequency content and firing variability measures. These results showed that LRTCs, multifractality, and theta rhythm represent independent processes, while delta rhythm correlated with multifractality. Taken together, these results provide a novel perspective on memory function by demonstrating that the multifractal nature of spike trains reflects hippocampal microcircuit activity that can be used to detect and quantify cognitive, physiological, and pathological states. PMID:26441562
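Monofractal detrended fluctuation analysis, whose scaling slope gives the Hurst exponent this record quantifies LRTCs with, can be sketched as follows. This is a plain DFA-1 on synthetic white noise, not the paper's multifractal (MFDFA) variant or its spike-train data:

```python
import numpy as np

def dfa_hurst(x, scales):
    """DFA-1: slope of log fluctuation vs. log window size estimates
    the Hurst exponent of the series x."""
    profile = np.cumsum(x - np.mean(x))          # integrate the mean-removed series
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)          # remove the linear trend per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))       # RMS fluctuation at this scale
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(3)
white = rng.normal(size=4096)                     # uncorrelated series: H near 0.5
h = dfa_hurst(white, scales=[16, 32, 64, 128, 256])
```

The multifractal extension used in the paper repeats this computation for a range of moment orders q and summarizes the spread of the resulting exponents as the singularity-spectrum width.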

  9. Avalanches and generalized memory associativity in a network model for conscious and unconscious mental functioning

    NASA Astrophysics Data System (ADS)

    Siddiqui, Maheen; Wedemann, Roseli S.; Jensen, Henrik Jeldtoft

    2018-01-01

We explore statistical characteristics of avalanches associated with the dynamics of a complex-network model, where two modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's ideas regarding the neuroses and that consciousness is related to symbolic and linguistic memory activity in the brain. It incorporates the Stariolo-Tsallis generalization of the Boltzmann Machine in order to model memory retrieval and associativity. In the present work, we define and measure avalanche size distributions during memory retrieval, in order to gain insight regarding basic aspects of the functioning of these complex networks. The avalanche sizes defined for our model should be related to the time consumed and also to the size of the neuronal region which is activated, during memory retrieval. This allows the qualitative comparison of the behaviour of the distribution of cluster sizes, obtained during fMRI measurements of the propagation of signals in the brain, with the distribution of avalanche sizes obtained in our simulation experiments. This comparison corroborates the indication that the Nonextensive Statistical Mechanics formalism may indeed be better suited to model the complex networks which constitute brain and mental structure.

  10. Performance measurements of the first RAID prototype

    NASA Technical Reports Server (NTRS)

    Chervenak, Ann L.

    1990-01-01

The performance of RAID the First, a prototype disk array (Redundant Arrays of Inexpensive Disks), is examined. A hierarchy of bottlenecks was discovered in the system that limit overall performance. The most serious is the memory system contention on the Sun 4/280 host CPU, which limits array bandwidth to 2.3 MBytes/sec. The array performs more successfully on small random operations, achieving nearly 300 I/Os per second before the Sun 4/280 becomes CPU limited. Other bottlenecks in the system are the VME backplane, bandwidth on the disk controller, and overheads associated with the SCSI protocol. All are examined in detail. The main conclusion is that to achieve the potential bandwidth of arrays, more powerful CPUs alone will not suffice. Just as important are adequate host memory bandwidth and support for high bandwidth on disk controllers. Current disk controllers are more often designed to achieve large numbers of small random operations, rather than high bandwidth. Operating systems also need to change to support high bandwidth from disk arrays. In particular, they should transfer data in larger blocks, and should support asynchronous I/O to improve sequential write performance.

  11. Roofline model toolkit: A practical tool for architectural and program analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo, Yu Jung; Williams, Samuel; Van Straalen, Brian

We present preliminary results of the Roofline Toolkit for multicore, manycore, and accelerated architectures. This paper focuses on the processor architecture characterization engine, a collection of portable instrumented microbenchmarks implemented with Message Passing Interface (MPI), and OpenMP used to express thread-level parallelism. These benchmarks are specialized to quantify the behavior of different architectural features. Compared to previous work on performance characterization, these microbenchmarks focus on capturing the performance of each level of the memory hierarchy, along with thread-level parallelism, instruction-level parallelism and explicit SIMD parallelism, measured in the context of the compilers and run-time environments. We also measure sustained PCIe throughput with four GPU memory management mechanisms. By combining results from the architecture characterization with the Roofline model based solely on architectural specifications, this work offers insights for performance prediction of current and future architectures and their software systems. To that end, we instrument three applications and plot their resultant performance on the corresponding Roofline model when run on a Blue Gene/Q architecture.
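The Roofline model underlying the toolkit bounds attainable performance by the lesser of the compute ceiling and the product of arithmetic intensity (flops per byte moved) and memory bandwidth. A one-function sketch with hypothetical machine numbers (not measurements from the paper):

```python
def roofline(peak_gflops, bandwidth_gbs, arithmetic_intensity):
    """Attainable GFLOP/s under the Roofline model:
    min(peak compute, memory bandwidth * flops-per-byte)."""
    return min(peak_gflops, bandwidth_gbs * arithmetic_intensity)

# Hypothetical machine: 500 GFLOP/s peak, 100 GB/s DRAM bandwidth
ridge = 500 / 100                 # intensity where the two ceilings meet (5 flops/byte)
low = roofline(500, 100, 0.25)    # bandwidth-bound kernel: 25 GFLOP/s
high = roofline(500, 100, 16.0)   # compute-bound kernel: capped at 500 GFLOP/s
```

Kernels with intensity below the ridge point are limited by the memory hierarchy; above it, by the functional units. The toolkit's microbenchmarks measure exactly these ceilings empirically rather than from specification sheets.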

  12. False Memories for Shape Activate the Lateral Occipital Complex

    ERIC Educational Resources Information Center

    Karanian, Jessica M.; Slotnick, Scott D.

    2017-01-01

    Previous functional magnetic resonance imaging evidence has shown that false memories arise from higher-level conscious processing regions rather than lower-level sensory processing regions. In the present study, we assessed whether the lateral occipital complex (LOC)--a lower-level conscious shape processing region--was associated with false…

  13. Borrowing Others' Words: Text, Ownership, Memory, and Plagiarism.

    ERIC Educational Resources Information Center

    Pennycook, Alastair

    1996-01-01

    Considers some of the complexities of text, ownership, memorization, and plagiarism. The article suggests that plagiarism needs to be understood in terms of complex relationships between text, memory, and learning as part of an undertaking to explore different relationships between learning, literacy, and cultural difference. (49 references)…

  14. The fluency of social hierarchy: the ease with which hierarchical relationships are seen, remembered, learned, and liked.

    PubMed

    Zitek, Emily M; Tiedens, Larissa Z

    2012-01-01

    We tested the hypothesis that social hierarchies are fluent social stimuli; that is, they are processed more easily and therefore liked better than less hierarchical stimuli. In Study 1, pairs of people in a hierarchy based on facial dominance were identified faster than pairs of people equal in their facial dominance. In Study 2, a diagram representing hierarchy was memorized more quickly than a diagram representing equality or a comparison diagram. This faster processing led the hierarchy diagram to be liked more than the equality diagram. In Study 3, participants were best able to learn a set of relationships that represented hierarchy (asymmetry of power)--compared to relationships in which there was asymmetry of friendliness, or compared to relationships in which there was symmetry--and this processing ease led them to like the hierarchy the most. In Study 4, participants found it easier to make decisions about a company that was more hierarchical and thus thought the hierarchical organization had more positive qualities. In Study 5, familiarity as a basis for the fluency of hierarchy was demonstrated by showing greater fluency for male than female hierarchies. This study also showed that when social relationships are difficult to learn, people's preference for hierarchy increases. Taken together, these results suggest one reason people might like hierarchies--hierarchies are easy to process. This fluency for social hierarchies might contribute to the construction and maintenance of hierarchies.

  15. The development of episodic memory: items, contexts, and relations.

    PubMed

    Yim, Hyungwook; Dennis, Simon J; Sloutsky, Vladimir M

    2013-11-01

    Episodic memory involves the formation of relational structures that bind information about the stimuli people experience to the contexts in which they experience them. The ability to form and retain such structures may be at the core of the development of episodic memory. In the first experiment reported here, 4- and 7-year-olds were presented with paired-associate learning tasks requiring memory structures of different complexity. A multinomial-processing tree model was applied to estimate the use of different structures in the two age groups. The use of two-way list-context-to-target structures and three-way structures was found to increase between the ages of 4 and 7. Experiment 2 demonstrated that the ability to form increasingly complex relational memory structures develops between the ages of 4 and 7 years and that this development extends well into adulthood. These results have important implications for theories of memory development.

  16. KENNEDY SPACE CENTER, FLA. - Center Director Jim Kennedy speaks to attendees at a memorial service honoring the crew of Columbia. He stands in front of the Space Memorial Mirror at the KSC Visitor Complex. Feb. 1 is the one-year anniversary of the loss of the crew and orbiter Columbia in a tragic accident as the ship returned to Earth following mission STS-107. Attended by many friends, co-workers and families, the memorial service was also open to the public.

    NASA Image and Video Library

2004-02-01

  17. Quadratic Polynomial Regression using Serial Observation Processing: Implementation within DART

    NASA Astrophysics Data System (ADS)

    Hodyss, D.; Anderson, J. L.; Collins, N.; Campbell, W. F.; Reinecke, P. A.

    2017-12-01

Many Ensemble-Based Kalman filtering (EBKF) algorithms process the observations serially. Serial observation processing views the data assimilation process as an iterative sequence of scalar update equations. What is useful about this data assimilation algorithm is that it has very low memory requirements and does not need complex methods to perform the typical high-dimensional inverse calculation of many other algorithms. Recently, the push has been towards the prediction, and therefore the assimilation of observations, for regions and phenomena for which high resolution is required and/or highly nonlinear physical processes are operating. For these situations, a basic hypothesis is that the use of the EBKF is sub-optimal and performance gains could be achieved by accounting for aspects of the non-Gaussianity. To this end, we develop here a new component of the Data Assimilation Research Testbed [DART] to allow a wide variety of users to test this hypothesis. This new version of DART allows one to run several variants of the EBKF as well as several variants of the quadratic polynomial filter using the same forecast model and observations. Differences between the results of the two systems will then highlight the degree of non-Gaussianity in the system being examined. We will illustrate in this work the differences between the performance of linear versus quadratic polynomial regression in a hierarchy of models from Lorenz-63 to a simple general circulation model.
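The serial structure described here (each observation reduced to a scalar update, so no high-dimensional matrix inverse is ever formed) can be sketched with a minimal perturbed-observation ensemble Kalman filter. This is an illustration of the scheme's shape under simple assumptions (direct observations of state components, hypothetical sizes), not DART's implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def serial_enkf_update(ens, obs, obs_idx, obs_var):
    """Assimilate scalar observations one at a time.
    ens: (n_members, n_state) ensemble; each observation measures one state index."""
    ens = ens.copy()
    n = ens.shape[0]
    for y, j, r in zip(obs, obs_idx, obs_var):
        hx = ens[:, j]                            # ensemble in observation space
        var_hx = hx.var(ddof=1)
        # Scalar gain per state variable: cov(x, hx) / (var(hx) + r)
        gain = ens.T @ (hx - hx.mean()) / ((n - 1) * (var_hx + r))
        # Perturbed-observation innovations, one per member
        innov = y + rng.normal(0, np.sqrt(r), n) - hx
        ens += np.outer(innov, gain)              # rank-one update, no matrix inverse
    return ens

prior = rng.normal(0, 1, size=(50, 3))
post = serial_enkf_update(prior, obs=[2.0], obs_idx=[0], obs_var=[0.1])
```

Each pass through the loop costs only a few vector operations, which is why the memory footprint stays low regardless of how many observations are assimilated.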

  18. Understanding mental retardation in Down's syndrome using trisomy 16 mouse models.

    PubMed

    Galdzicki, Z; Siarey, R J

    2003-06-01

Mental retardation in Down's syndrome, human trisomy 21, is characterized by developmental delays, language and memory deficits and other cognitive abnormalities. Neurophysiological and functional information is needed to understand the mechanisms of mental retardation in Down's syndrome. The trisomy mouse models provide windows into the molecular and developmental effects associated with abnormal chromosome numbers. The distal segment of mouse chromosome 16 is homologous to nearly the entire long arm of human chromosome 21. Therefore, mice with full or segmental trisomy 16 (Ts65Dn) are considered reliable animal models of Down's syndrome. Ts65Dn mice demonstrate impaired learning in spatial tests and abnormalities in hippocampal synaptic plasticity. We hypothesize that the physiological impairments in the Ts65Dn mouse hippocampus can model the suboptimal brain function occurring at various levels of Down's syndrome brain hierarchy, starting at a single neuron, and then affecting simple and complex neuronal networks. Once these elements create the gross brain structure, their dysfunctional activity cannot be overcome by extensive plasticity and redundancy, and therefore, at the end of the maturation period the mind inside this brain remains deficient and delayed in its capabilities. The complicated interactions that govern this aberrant developmental process cannot be rescued through existing compensatory mechanisms. In summary, overexpression of genes from chromosome 21 shifts biological homeostasis in the Down's syndrome brain to a new less functional state.

  19. 3D CSEM inversion based on goal-oriented adaptive finite element method

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Key, K.

    2016-12-01

We present a parallel 3D frequency domain controlled-source electromagnetic inversion code named MARE3DEM. Non-linear inversion of observed data is performed with the Occam variant of regularized Gauss-Newton optimization. The forward operator is based on the goal-oriented finite element method that efficiently calculates the responses and sensitivity kernels in parallel using a data decomposition scheme where independent modeling tasks contain different frequencies and subsets of the transmitters and receivers. To accommodate complex 3D conductivity variation with high flexibility and precision, we adopt the dual-grid approach where the forward mesh conforms to the inversion parameter grid and is adaptively refined until the forward solution converges to the desired accuracy. This dual-grid approach is memory efficient, since the inverse parameter grid remains independent from fine meshing generated around the transmitter and receivers by the adaptive finite element method. In addition, the unstructured inverse mesh efficiently handles multiple scale structures and allows for fine-scale model parameters within the region of interest. Our mesh generation engine keeps track of the refinement hierarchy so that the map of conductivity and sensitivity kernel between the forward and inverse mesh is retained. We employ the adjoint-reciprocity method to calculate the sensitivity kernels which establish a linear relationship between changes in the conductivity model and changes in the modeled responses. Our code uses a direct solver for the linear systems, so the adjoint problem is efficiently computed by re-using the factorization from the primary problem. Further computational efficiency and scalability are obtained in the regularized Gauss-Newton portion of the inversion using parallel dense matrix-matrix multiplication and matrix factorization routines implemented with the ScaLAPACK library. 
We show the scalability, reliability and the potential of the algorithm to deal with complex geological scenarios by applying it to the inversion of synthetic marine controlled source EM data generated for a complex 3D offshore model with significant seafloor topography.

  20. Expertise for upright faces improves the precision but not the capacity of visual working memory.

    PubMed

    Lorenc, Elizabeth S; Pratte, Michael S; Angeloni, Christopher F; Tong, Frank

    2014-10-01

    Considerable research has focused on how basic visual features are maintained in working memory, but little is currently known about the precision or capacity of visual working memory for complex objects. How precisely can an object be remembered, and to what extent might familiarity or perceptual expertise contribute to working memory performance? To address these questions, we developed a set of computer-generated face stimuli that varied continuously along the dimensions of age and gender, and we probed participants' memories using a method-of-adjustment reporting procedure. This paradigm allowed us to separately estimate the precision and capacity of working memory for individual faces, on the basis of the assumptions of a discrete capacity model, and to assess the impact of face inversion on memory performance. We found that observers could maintain up to four to five items on average, with equally good memory capacity for upright and upside-down faces. In contrast, memory precision was significantly impaired by face inversion at every set size tested. Our results demonstrate that the precision of visual working memory for a complex stimulus is not strictly fixed but, instead, can be modified by learning and experience. We find that perceptual expertise for upright faces leads to significant improvements in visual precision, without modifying the capacity of working memory.

  1. A transcription factor hierarchy defines an environmental stress response network.

    PubMed

    Song, Liang; Huang, Shao-Shan Carol; Wise, Aaron; Castanon, Rosa; Nery, Joseph R; Chen, Huaming; Watanabe, Marina; Thomas, Jerushah; Bar-Joseph, Ziv; Ecker, Joseph R

    2016-11-04

    Environmental stresses are universally encountered by microbes, plants, and animals. Yet systematic studies of stress-responsive transcription factor (TF) networks in multicellular organisms have been limited. The phytohormone abscisic acid (ABA) influences the expression of thousands of genes, allowing us to characterize complex stress-responsive regulatory networks. Using chromatin immunoprecipitation sequencing, we identified genome-wide targets of 21 ABA-related TFs to construct a comprehensive regulatory network in Arabidopsis thaliana. Determinants of dynamic TF binding and a hierarchy among TFs were defined, illuminating the relationship between differential gene expression patterns and ABA pathway feedback regulation. By extrapolating regulatory characteristics of observed canonical ABA pathway components, we identified a new family of transcriptional regulators modulating ABA and salt responsiveness and demonstrated their utility in modulating plant resilience to osmotic stress. Copyright © 2016, American Association for the Advancement of Science.

  2. Auditing Complex Concepts in Overlapping Subsets of SNOMED

    PubMed Central

    Wang, Yue; Wei, Duo; Xu, Junchuan; Elhanan, Gai; Perl, Yehoshua; Halper, Michael; Chen, Yan; Spackman, Kent A.; Hripcsak, George

    2008-01-01

    Limited resources and the sheer volume of concepts make auditing a large terminology, such as SNOMED CT, a daunting task. It is essential to devise techniques that can aid an auditor by automatically identifying concepts that deserve attention. A methodology for this purpose based on a previously introduced abstraction network (called the p-area taxonomy) for a SNOMED CT hierarchy is presented. The methodology algorithmically gathers concepts appearing in certain overlapping subsets, defined exclusively with respect to the p-area taxonomy, for review. The results of applying the methodology to SNOMED’s Specimen hierarchy are presented. These results are compared against a control sample composed of concepts residing in subsets without the overlaps. With the use of the double bootstrap, the concept group produced by our methodology is shown to yield a statistically significant higher proportion of error discoveries. PMID:18998838

  3. Auditing complex concepts in overlapping subsets of SNOMED.

    PubMed

    Wang, Yue; Wei, Duo; Xu, Junchuan; Elhanan, Gai; Perl, Yehoshua; Halper, Michael; Chen, Yan; Spackman, Kent A; Hripcsak, George

    2008-11-06

    Limited resources and the sheer volume of concepts make auditing a large terminology, such as SNOMED CT, a daunting task. It is essential to devise techniques that can aid an auditor by automatically identifying concepts that deserve attention. A methodology for this purpose based on a previously introduced abstraction network (called the p-area taxonomy) for a SNOMED CT hierarchy is presented. The methodology algorithmically gathers concepts appearing in certain overlapping subsets, defined exclusively with respect to the p-area taxonomy, for review. The results of applying the methodology to SNOMED's Specimen hierarchy are presented. These results are compared against a control sample composed of concepts residing in subsets without the overlaps. With the use of the double bootstrap, the concept group produced by our methodology is shown to yield a statistically significant higher proportion of error discoveries.
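
    The statistical claim rests on whether the overlap group's error-discovery proportion exceeds the control's. The paper uses a double bootstrap; the simpler percentile bootstrap below conveys the underlying idea, with hypothetical 30/100 vs. 12/100 error counts.

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_diff_ci(x, y, n_boot=10000, alpha=0.05):
    """Percentile bootstrap CI for the difference in proportions mean(x) - mean(y).
    x, y are 0/1 arrays (1 = erroneous concept found on review)."""
    nx, ny = len(x), len(y)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        diffs[b] = rng.choice(x, nx).mean() - rng.choice(y, ny).mean()
    lo, hi = np.quantile(diffs, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# hypothetical audit outcomes: 30/100 errors in overlap concepts, 12/100 in control
overlap = np.r_[np.ones(30), np.zeros(70)]
control = np.r_[np.ones(12), np.zeros(88)]
lo, hi = bootstrap_diff_ci(overlap, control)
print(f"95% CI for difference in error proportion: [{lo:.3f}, {hi:.3f}]")
# a CI excluding 0 supports a genuinely higher error rate in the overlap group
```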

  4. Consolidation of Hierarchy-Structured Nanopowder Agglomerates and Its Application to Net-Shaping Nanopowder Materials

    PubMed Central

    Lee, Jai-Sung; Choi, Joon-Phil; Lee, Geon-Yong

    2013-01-01

    This paper provides an overview of our recent investigations into the consolidation of hierarchy-structured nanopowder agglomerates and related applications to net-shaping nanopowder materials. Understanding the nanopowder agglomerate sintering (NAS) process is essential to the processing of net-shaped nanopowder materials and components with small and complex shapes. The key concept of the NAS process is to enhance material transport by controlling the powder interface volume of nanopowder agglomerates. Based upon this concept, we have suggested a new idea of full-density processing for fabricating micro-powder injection molded parts using metal nanopowder agglomerates produced by hydrogen reduction of metal oxide powders. Studies on the full-density sintering of die-compacted and powder-injection-molded iron-based nano-agglomerate powders are introduced and discussed in terms of the densification process and microstructure. PMID:28788317

  5. Support vector machine prediction of enzyme function with conjoint triad feature and hierarchical context.

    PubMed

    Wang, Yong-Cui; Wang, Yong; Yang, Zhi-Xia; Deng, Nai-Yang

    2011-06-20

    Enzymes are the largest class of proteins, and their functions are usually annotated by the Enzyme Commission (EC), which uses a hierarchical structure, i.e., four numbers separated by periods, to classify enzyme function. Automatically categorizing an enzyme into the EC hierarchy is crucial to understanding its specific molecular mechanism. In this paper, we introduce two key improvements in predicting enzyme function within the machine learning framework. One is to introduce efficient sequence encoding methods for representing given proteins. The second is to develop a structure-based prediction method with low computational complexity. In particular, we propose to use the conjoint triad feature (CTF) to represent the given protein sequences by considering not only the composition of amino acids but also the neighbor relationships in the sequence. We then develop a support vector machine (SVM)-based method, named SVMHL (SVM for hierarchy labels), to output enzyme function while fully considering the hierarchical structure of the EC. The experimental results show that our SVMHL with the CTF outperforms SVMHL with the amino acid composition (AAC) feature in both predictive accuracy and Matthews correlation coefficient (MCC). In addition, SVMHL with the CTF obtains accuracy and MCC ranging from 81% to 98% and 0.82 to 0.98 when predicting the first three EC digits on a low-homology enzyme dataset. We further demonstrate that our method outperforms methods that do not take into account the hierarchical relationships among enzyme categories, as well as alternative methods that incorporate prior knowledge about inter-class relationships. Our structure-based prediction model, SVMHL with the CTF, reduces the computational complexity and outperforms the alternative approaches in enzyme function prediction. Our new method will therefore be a useful tool for the enzyme function prediction community.
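
    The conjoint triad feature maps each amino acid to one of seven classes and counts class triads along the sequence, giving a 7^3 = 343-dimensional vector. A minimal sketch follows; the class grouping is the one commonly used in the CTF literature (Shen et al.), and normalizing by the maximum count is one common choice, so both should be checked against the paper rather than taken as its exact procedure.

```python
import numpy as np

# the seven amino-acid classes of the conjoint triad feature
# (grouping follows the CTF literature; treat it as an assumption of this sketch)
CLASSES = ["AGV", "ILFP", "YMTS", "HNQW", "RK", "DE", "C"]
AA_TO_CLASS = {aa: i for i, grp in enumerate(CLASSES) for aa in grp}

def conjoint_triad(seq):
    """343-dim CTF vector: counts of each class triad along the sequence,
    normalized by the maximum count."""
    v = np.zeros(7 ** 3)
    cls = [AA_TO_CLASS[aa] for aa in seq if aa in AA_TO_CLASS]
    for a, b, c in zip(cls, cls[1:], cls[2:]):
        v[a * 49 + b * 7 + c] += 1
    m = v.max()
    return v / m if m > 0 else v

vec = conjoint_triad("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
print(vec.shape, vec.max())
```

    The resulting fixed-length vector can be fed to any standard SVM implementation, which is what makes the encoding attractive for variable-length sequences.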

  6. Complex sources of variance in female dominance rank in a nepotistic society

    PubMed Central

    Lea, Amanda J.; Learn, Niki H.; Theus, Marcus J.; Altmann, Jeanne; Alberts, Susan C.

    2016-01-01

    Many mammalian societies are structured by dominance hierarchies, and an individual’s position within this hierarchy can influence reproduction, behaviour, physiology and health. In nepotistic hierarchies, which are common in cercopithecine primates and also seen in spotted hyaenas, Crocuta crocuta, adult daughters are expected to rank immediately below their mother, and in reverse age order (a phenomenon known as ‘youngest ascendancy’). This pattern is well described, but few studies have systematically examined the frequency or causes of departures from the expected pattern. Using a longitudinal data set from a natural population of yellow baboons, Papio cynocephalus, we measured the influence of maternal kin, paternal kin and group size on female rank positions at two life history milestones, menarche and first live birth. At menarche, most females (73%) ranked adjacent to their family members (i.e. the female held an ordinal rank in consecutive order with other members of her maternal family); however, only 33% of females showed youngest ascendancy within their matriline at menarche. By the time they experienced their first live birth, many females had improved their dominance rank: 78% ranked adjacent to their family members and 49% showed youngest ascendancy within their matriline. The presence of mothers and maternal sisters exerted a powerful influence on rank outcomes. However, the presence of fathers, brothers and paternal siblings did not produce a clear effect on female dominance rank in our analyses, perhaps because females in our data set co-resided with variable numbers and types of paternal and male relatives. Our results also raise the possibility that female body size or competitive ability may influence dominance rank, even in this classically nepotistic species. In total, our analyses reveal that the predictors of dominance rank in nepotistic rank systems are much more complex than previously thought. PMID:26997663

  7. Spin generalization of the Calogero–Moser hierarchy and the matrix KP hierarchy

    NASA Astrophysics Data System (ADS)

    Pashkov, V.; Zabrodin, A.

    2018-05-01

    We establish a correspondence between rational solutions to the matrix KP hierarchy and the spin generalization of the Calogero–Moser system at the level of hierarchies. Namely, it is shown that the rational solutions to the matrix KP hierarchy are isomorphic to the spin Calogero–Moser system in the sense that the dynamics of the poles of solutions to the matrix KP hierarchy in the higher times is governed by the higher Hamiltonians of the spin Calogero–Moser integrable hierarchy with rational potential.

  8. The best motivator priorities parents choose via analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Latha, P.

    2015-05-01

    Motivation is probably the most important factor that educators can target in order to improve learning. Numerous cross-disciplinary theories have been postulated to explain motivation. While each of these theories has some truth, no single theory seems to adequately explain all human motivation. The fact is that human beings in general, and pupils in particular, are complex creatures with complex needs and desires. In this paper, the Analytic Hierarchy Process (AHP) is proposed as a solution to large, dynamic, and complex real-world multi-criteria decision-making problems, applied here to parents' selection of the most suitable motivator when choosing a school for their children. Data were analyzed using SPSS 17.0 ("Statistical Package for the Social Sciences"). Both descriptive and inferential statistics were used: descriptive statistics to characterize the demographic factors of the pupil and parent respondents, and inferential statistics to determine the pupils' and parents' highest motivator priorities. AHP was then used to rank the criteria considered by parents: school principals, teachers, pupils, and parents. The moderating factor was the set of schools in Ampang selected on the basis of the "Standard Kualiti Pendidikan Malaysia" (SKPM). One-way ANOVA was used to test for significance, and the data were used to calculate the AHP weightings. School principals were found to be the best motivator for parents in choosing a school for their pupils, followed by teachers, parents, and pupils.
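
    AHP priority weights are obtained from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio checks whether the judgments are acceptably coherent. A sketch with a hypothetical comparison matrix follows; the 1-9 scale values are made up for illustration and are not the study's data.

```python
import numpy as np

def ahp_weights(M):
    """Priority weights = principal right eigenvector of the pairwise
    comparison matrix, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

# hypothetical pairwise comparisons of the four motivators on Saaty's 1-9 scale:
# principals vs teachers vs parents vs pupils
M = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])
w, lam = ahp_weights(M)
n = M.shape[0]
CI = (lam - n) / (n - 1)   # consistency index
CR = CI / 0.90             # Saaty's random index for n = 4
print(np.round(w, 3), round(CR, 3))
# CR < 0.1 indicates acceptably consistent judgments
```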

  9. Theories of Memory and Aging: A Look at the Past and a Glimpse of the Future

    PubMed Central

    Festini, Sara B.

    2017-01-01

    The present article reviews theories of memory and aging over the past 50 years. Particularly notable is a progression from early single-mechanism perspectives to complex multifactorial models proposed to account for commonly observed age deficits in memory function. The seminal mechanistic theories of processing speed, limited resources, and inhibitory deficits are discussed and viewed as especially important theories for understanding age-related memory decline. Additionally, advances in multivariate techniques including structural equation modeling provided new tools that led to the development of more complex multifactorial theories than existed earlier. The important role of neuroimaging is considered, along with the current prevalence of intervention studies. We close with predictions about new directions that future research on memory and aging will take. PMID:27257229

  10. Attention modulations on the perception of social hierarchy at distinct temporal stages: an electrophysiological investigation.

    PubMed

    Feng, Chunliang; Tian, Tengxiang; Feng, Xue; Luo, Yue-Jia

    2015-04-01

    Recent behavioral and neuroscientific studies have revealed the preferential processing of superior-hierarchy cues. However, it remains poorly understood whether top-down controlled mechanisms modulate temporal dynamics of neurocognitive substrates underlying the preferential processing of these biologically and socially relevant cues. This was investigated in the current study by recording event-related potentials from participants who were presented with superior or inferior social hierarchy. Participants performed a hierarchy-judgment task that required attention to hierarchy cues or a gender-judgment task that withdrew their attention from these cues. Superior-hierarchy cues evoked stronger neural responses than inferior-hierarchy cues at both early (N170/N200) and late (late positive potential, LPP) temporal stages. Notably, the modulations of top-down attention were identified on the LPP component, such that superior-hierarchy cues evoked larger LPP amplitudes than inferior-hierarchy cues only in the attended condition; whereas the modulations of the N170/N200 component by hierarchy cues were evident in both attended and unattended conditions. These findings suggest that the preferential perception of superior-hierarchy cues involves both relatively automatic attentional bias at the early temporal stage as well as flexible and voluntary cognitive evaluation at the late temporal stage. Finally, these hierarchy-related effects were absent when participants were shown the same stimuli which, however, were not associated with social-hierarchy information in a non-hierarchy task (Experiment 2), suggesting that effects of social hierarchy at early and late temporal stages could not be accounted for by differences in physical attributes between these social cues. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. How Semantic and Episodic Memory Contribute to Autobiographical Memory. Commentary on Burt

    ERIC Educational Resources Information Center

    Tendolkar, Indira

    2008-01-01

    In his article, Chris Burt focuses on the relationship between time and autobiographical memory. The question Burt puts forward is whether temporal markers in reports on autobiographic memories reflect specific temporal information or result from rather complex cognitive processing of time-relevant knowledge. The aspect of time is inherent to the…

  12. Visual Working Memory Capacity for Objects from Different Categories: A Face-Specific Maintenance Effect

    ERIC Educational Resources Information Center

    Wong, Jason H.; Peterson, Matthew S.; Thompson, James C.

    2008-01-01

    The capacity of visual working memory was examined when complex objects from different categories were remembered. Previous studies have not examined how visual similarity affects object memory, though it has long been known that similar-sounding phonological information interferes with rehearsal in auditory working memory. Here, experiments…

  13. Memory and the Self

    ERIC Educational Resources Information Center

    Conway, Martin A.

    2005-01-01

    The Self-Memory System (SMS) is a conceptual framework that emphasizes the interconnectedness of self and memory. Within this framework memory is viewed as the data base of the self. The self is conceived as a complex set of active goals and associated self-images, collectively referred to as the "working self." The relationship between the…

  14. Remembering Complex Objects in Visual Working Memory: Do Capacity Limits Restrict Objects or Features?

    PubMed Central

    Hardman, Kyle; Cowan, Nelson

    2014-01-01

    Visual working memory stores stimuli from our environment as representations that can be accessed by high-level control processes. This study addresses a longstanding debate in the literature about whether storage limits in visual working memory include a limit to the complexity of discrete items. We examined the issue with a number of change-detection experiments that used complex stimuli which possessed multiple features per stimulus item. We manipulated the number of relevant features of the stimulus objects in order to vary feature load. In all of our experiments, we found that increased feature load led to a reduction in change-detection accuracy. However, we found that feature load alone could not account for the results, but that a consideration of the number of relevant objects was also required. This study supports capacity limits for both feature and object storage in visual working memory. PMID:25089739
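
    Change-detection performance in this literature is often summarized with Cowan's K for single-probe displays, K = N(H - FA). The paper's modeling is more elaborate, but the formula shows how accuracy at each set size converts to an item estimate; the hit and false-alarm rates below are hypothetical.

```python
def cowan_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K estimate for single-probe change detection: K = N * (H - FA)."""
    return set_size * (hit_rate - false_alarm_rate)

# hypothetical accuracies at three set sizes
for n, h, fa in [(2, 0.95, 0.05), (4, 0.85, 0.10), (6, 0.70, 0.15)]:
    print(n, round(cowan_k(n, h, fa), 2))
# K plateauing near 3 across set sizes is the classic capacity-limit signature
```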

  15. Multi-criteria decision analysis for waste management in Saharawi refugee camps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garfi, M.; Tondelli, S.; Bonoli, A.

    2009-10-15

    The aim of this paper is to compare different waste management solutions in the Saharawi refugee camps (Algeria) and to test the feasibility of a decision-making method developed for particular conditions in which environmental and social aspects must be considered. It is based on multi-criteria analysis, in particular the analytic hierarchy process (AHP), a mathematical technique for multi-criteria decision making (Saaty, T.L., 1980. The Analytic Hierarchy Process. McGraw-Hill, New York, USA; Saaty, T.L., 1990. How to Make a Decision: The Analytic Hierarchy Process. European Journal of Operational Research; Saaty, T.L., 1994. Decision Making for Leaders: The Analytic Hierarchy Process in a Complex World. RWS Publications, Pittsburgh, PA), and on a participatory approach focusing on the local community's concerns. The research compares four waste collection and management alternatives: collection using three tipper trucks with disposal and burning in an open area; collection using seven dumpers with disposal in a landfill; collection using seven dumpers and three tipper trucks with disposal in a landfill; and collection using three tipper trucks with disposal in a landfill. The results show that the second and third solutions provide the better scenarios for waste management. Furthermore, the discussion of the results points out the multidisciplinarity of the approach and the balance among social, environmental, and technical impacts. This is a very important aspect of a humanitarian and environmental project, confirming the appropriateness of the chosen method.

  16. Response Inhibition Is Facilitated by a Change to Red Over Green in the Stop Signal Paradigm

    PubMed Central

    Blizzard, Shawn; Fierro-Rojas, Adriela; Fallah, Mazyar

    2017-01-01

    Actions are informed by the complex interactions of response execution and inhibition networks. These networks integrate sensory information with internal states and behavioral goals to produce an appropriate action or to update an ongoing action. Recent investigations have shown that, behaviorally, attention is captured through a hierarchy of colors. These studies showed how the color hierarchy affected visual processing. To determine whether the color hierarchy extends to higher-level executive functions such as response execution and inhibition, we conducted several experiments using the stop-signal task (SST). In the first experiment, we modified the classic paradigm so that the go signals could vary in task-irrelevant color, with an auditory stop signal. We found that the task-irrelevant color of the go signals did not differentially affect response times. In the second experiment we determined that making the color of the go signal relevant for response selection still did not affect reaction times (RTs) and, thus, execution. In the third experiment, we modified the paradigm so that the stop signal was a task-relevant change in the color of the go signal. The mean RT to the red stop signal was approximately 25 ms faster than to the green stop signal. In other words, red stop signals facilitated response inhibition more than green stop signals; however, there was no comparative facilitation of response execution. These findings suggest that response inhibition, but not execution, networks are sensitive to differences in color salience. They also suggest that the color hierarchy is based on attentional networks and not simply on early sensory processing. PMID:28101011

  17. Probing neutrino mass hierarchy by comparing the charged-current and neutral-current interaction rates of supernova neutrinos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Kwang-Chang; Leung Center for Cosmology and Particle Astrophysics; Lee, Fei-Fan

    2016-07-22

    The neutrino mass hierarchy is one of the neutrino fundamental properties yet to be determined. We introduce a method to determine the neutrino mass hierarchy by comparing the interaction rates of neutral-current (NC) interactions, ν(ν̄) + p → ν(ν̄) + p, and inverse beta decays (IBD), ν̄_e + p → n + e⁺, of supernova neutrinos in scintillation detectors. Neutrino flavor conversions inside the supernova are sensitive to the neutrino mass hierarchy. Due to Mikheyev-Smirnov-Wolfenstein effects, the full swapping of the ν̄_e flux with the ν̄_x (x = μ, τ) flux occurs in the inverted hierarchy, while such a swapping does not occur in the normal hierarchy. As a result, more high-energy IBD events occur in the detector in the inverted hierarchy than in the normal hierarchy. By comparing the IBD interaction rate with the mass-hierarchy-independent NC interaction rate, one can determine the neutrino mass hierarchy.

  18. Probing neutrino mass hierarchy by comparing the charged-current and neutral-current interaction rates of supernova neutrinos

    NASA Astrophysics Data System (ADS)

    Lai, Kwang-Chang; Lee, Fei-Fan; Lee, Feng-Shiuh; Lin, Guey-Lin; Liu, Tsung-Che; Yang, Yi

    2016-07-01

    The neutrino mass hierarchy is one of the neutrino fundamental properties yet to be determined. We introduce a method to determine the neutrino mass hierarchy by comparing the interaction rates of neutral-current (NC) interactions, ν(ν̄) + p → ν(ν̄) + p, and inverse beta decays (IBD), ν̄_e + p → n + e⁺, of supernova neutrinos in scintillation detectors. Neutrino flavor conversions inside the supernova are sensitive to the neutrino mass hierarchy. Due to Mikheyev-Smirnov-Wolfenstein effects, the full swapping of the ν̄_e flux with the ν̄_x (x = μ, τ) flux occurs in the inverted hierarchy, while such a swapping does not occur in the normal hierarchy. As a result, more high-energy IBD events occur in the detector in the inverted hierarchy than in the normal hierarchy. By comparing the IBD interaction rate with the mass-hierarchy-independent NC interaction rate, one can determine the neutrino mass hierarchy.

  19. A portable approach for PIC on emerging architectures

    NASA Astrophysics Data System (ADS)

    Decyk, Viktor

    2016-03-01

    A portable approach for designing Particle-in-Cell (PIC) algorithms on emerging exascale computers is based on the recognition that three distinct programming paradigms are needed: low-level vector (SIMD) processing, middle-level shared-memory parallel programming, and high-level distributed-memory programming. In addition, there is a memory hierarchy associated with each level. Such algorithms can be initially developed using vectorizing compilers, OpenMP, and MPI. This is the approach recommended by Intel for the Phi processor. These algorithms can then be translated, and possibly specialized, to other programming models and languages as needed. For example, the vector processing and shared-memory programming might be done with CUDA instead of vectorizing compilers and OpenMP, but generally the algorithm itself is not greatly changed. The UCLA PICKSC web site at http://www.idre.ucla.edu/ contains example open-source skeleton codes (mini-apps) illustrating each of these three programming models, individually and in combination. Fortran 2003 now supports abstract data types, and design patterns can be used to support a variety of implementations within the same code base. Fortran 2003 also supports interoperability with C, so that implementations in C languages are also easy to use. Finally, main codes can be translated into dynamic environments such as Python, while still taking advantage of high-performing compiled languages. Parallel languages are still evolving, with interesting developments in co-Array Fortran, UPC, and OpenACC, among others, and these can also be supported within the same software architecture. Work supported by NSF and DOE Grants.
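
    The lowest level of the three-paradigm hierarchy, vector (SIMD) processing, amounts to expressing the particle update over whole arrays rather than one particle at a time. A toy illustration in NumPy follows; real PIC skeleton codes use compiled Fortran/C, and the field and push here are deliberately simplified.

```python
import numpy as np

rng = np.random.default_rng(3)
n, dt = 10000, 0.1
x = rng.random(n)            # particle positions
v = rng.standard_normal(n)   # particle velocities
E = np.sin(2 * np.pi * x)    # toy field evaluated at each particle

# scalar version of a simple push (one particle at a time)
def push_loop(x, v, E):
    x2, v2 = x.copy(), v.copy()
    for i in range(len(x2)):
        v2[i] += E[i] * dt
        x2[i] += v2[i] * dt
    return x2, v2

# vectorized version: the whole particle array is updated at once,
# which compilers/interpreters can map onto SIMD lanes
def push_vec(x, v, E):
    v2 = v + E * dt
    x2 = x + v2 * dt
    return x2, v2

xa, va = push_loop(x, v, E)
xb, vb = push_vec(x, v, E)
print(np.allclose(xa, xb) and np.allclose(va, vb))
```

    The same separation repeats up the hierarchy: the shared-memory level distributes particle tiles across threads, and the distributed-memory level exchanges particles and guard cells between domains.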

  20. A complexity theory model in science education problem solving: random walks for working memory and mental capacity.

    PubMed

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2003-07-01

    The present study examines the role of limited human channel capacity from a science education perspective. A model of science problem solving has previously been validated by applying concepts and tools of complexity theory (the working-memory random-walk method). The method correlated the subjects' rank-order achievement scores in organic-synthesis chemistry problems with the subjects' working memory capacity. In this work, we apply the same nonlinear approach to a different data set, taken from chemical-equilibrium problem solving. In contrast to the organic-synthesis problems, these problems are algorithmic, require numerical calculations, and have a complex logical structure. As a result, these problems cause deviations from the model and affect the pattern observed with the nonlinear method. In addition to Baddeley's working memory capacity, Pascual-Leone's mental (M-) capacity is examined by the same random-walk method. As the complexity of the problem increases, the fractal dimension of the working memory random walk demonstrates a sudden drop, while the fractal dimension of the M-capacity random walk decreases in a linear fashion. A review of the basic features of the two capacities and their relation is included. The method and findings have consequences for problem solving not only in chemistry and science education, but also in other disciplines.

  1. Toward a Practical Model of Cognitive/Information Task Analysis and Schema Acquisition for Complex Problem-Solving Situations.

    ERIC Educational Resources Information Center

    Braune, Rolf; Foshay, Wellesley R.

    1983-01-01

    The proposed three-step strategy for research on human information processing--concept hierarchy analysis, analysis of example sets to teach relations among concepts, and analysis of problem sets to build a progressively larger schema for the problem space--may lead to practical procedures for instructional design and task analysis. Sixty-four…

  2. Learning under Conditions of Hierarchy and Discipline: The Case of the German Army, 1939-1940

    ERIC Educational Resources Information Center

    Visser, Max

    2008-01-01

    To survive in and adapt to dynamic, turbulent, and complex environments, organizations need to engage in learning. This truism is particularly relevant for army organizations in times of war and armed conflict. In this article a case of army operations during World War II is analyzed on the basis of Ortenblad's integrated model of the learning…

  3. Gender and Schooling in the Early Years. Research on Women and Education

    ERIC Educational Resources Information Center

    Koch, Janice, Ed.; Irby, Beverly, Ed.

    2005-01-01

    In this volume, gender and schooling in the early years addresses a broad range of issues including, but not limited, to gender equity in education. We explore, for example, the complex world of play in Fromberg's chapter and are reminded that for young children, play involves issues of power and hierarchy in ways that parallel the role of gender…

  4. Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time.

    PubMed

    Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G

    2014-01-20

    Designed by biological and social evolutionary pressures, facial expressions of emotion comprise specific facial movements to support a near-optimal system of signaling and decoding. Although highly dynamical, little is known about the form and function of facial expression temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously to optimize categorization of the six classic emotions, or sequentially to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling, information theory, and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of "biologically basic to socially specific" information over time. Early in the signaling dynamics, facial expressions systematically transmit few, biologically rooted face signals supporting the categorization of fewer elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). Here, we show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication is comprised of six basic (i.e., psychologically irreducible) categories, and instead suggesting four. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Modes of Interaction between Individuals Dominate the Topologies of Real World Networks

    PubMed Central

    Lee, Insuk; Kim, Eiru; Marcotte, Edward M.

    2015-01-01

    We find that the topologies of real world networks, such as those formed within human societies, by the Internet, or among cellular proteins, are dominated by the mode of the interactions considered among the individuals. Specifically, a major dichotomy in previously studied networks arises from modeling networks in terms of pairwise versus group tasks. The former often intrinsically give rise to scale-free, disassortative, hierarchical networks, whereas the latter often give rise to single- or broad-scale, assortative, nonhierarchical networks. These dependencies explain contrasting observations among previous topological analyses of real world complex systems. We also observe this trend in systems with natural hierarchies, in which alternate representations of the same networks, but which capture different levels of the hierarchy, manifest these signature topological differences. For example, in both the Internet and cellular proteomes, networks of lower-level system components (routers within domains or proteins within biological processes) are assortative and nonhierarchical, whereas networks of upper-level system components (internet domains or biological processes) are disassortative and hierarchical. Our results demonstrate that network topologies of complex systems must be interpreted in light of their hierarchical natures and interaction types. PMID:25793969

  6. Hierarchical mutual information for the comparison of hierarchical community structures in complex networks

    NASA Astrophysics Data System (ADS)

    Perotti, Juan Ignacio; Tessone, Claudio Juan; Caldarelli, Guido

    2015-12-01

    The quest for a quantitative characterization of community and modular structure of complex networks has produced a variety of methods and algorithms to classify different networks. However, it is not clear whether such methods provide consistent, robust, and meaningful results when considering hierarchies as a whole. Part of the problem is the lack of a similarity measure for the comparison of hierarchical community structures. In this work we contribute the hierarchical mutual information, a generalization of the traditional mutual information that makes it possible to compare hierarchical partitions and hierarchical community structures. The normalized version of the hierarchical mutual information should behave analogously to the traditional normalized mutual information. Here the correct behavior of the hierarchical mutual information is corroborated on an extensive battery of numerical experiments. The experiments are performed on artificial hierarchies and on the hierarchical community structure of artificial and empirical networks. Furthermore, the experiments illustrate some of the practical applications of the hierarchical mutual information, namely the comparison of different community detection methods and the study of the consistency, robustness, and temporal evolution of the hierarchical modular structure of networks.
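    For intuition, the flat (non-hierarchical) normalized mutual information that the paper generalizes can be sketched as follows; this is an illustrative implementation of the classical measure, not the authors' code:

```python
from collections import Counter
from math import log

def mutual_information(a, b):
    """Mutual information (in nats) between two flat partitions,
    given as equal-length lists of cluster labels."""
    n = len(a)
    pa = Counter(a)            # cluster sizes in partition a
    pb = Counter(b)            # cluster sizes in partition b
    pab = Counter(zip(a, b))   # joint cluster co-occurrence counts
    return sum((nxy / n) * log(n * nxy / (pa[x] * pb[y]))
               for (x, y), nxy in pab.items())

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log(c / n) for c in Counter(labels).values())

def nmi(a, b):
    """Normalized mutual information: 1 for identical partitions,
    0 for statistically independent ones."""
    h = (entropy(a) + entropy(b)) / 2
    return mutual_information(a, b) / h if h > 0 else 1.0
```

    For example, `nmi([0, 0, 1, 1], [0, 0, 1, 1])` is 1.0, while `nmi([0, 0, 1, 1], [0, 1, 0, 1])` is 0.0. The hierarchical variant extends this comparison from single partitions to whole trees of nested partitions.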

  7. KENNEDY SPACE CENTER, FLA. - Shoshone-Bannock dancers perform a healing ceremony at the Space Memorial Mirror during the Columbia memorial service at the KSC Visitor Complex.

    NASA Image and Video Library

    2004-02-01

    KENNEDY SPACE CENTER, FLA. - At the Space Memorial Mirror in the KSC Visitor Complex, visitors gather around dancers from the Shoshone-Bannock Native American community, Fort Hall, Idaho, who are performing a healing ceremony during the memorial service held for the crew of Columbia. Feb. 1 is the one-year anniversary of the loss of the crew and orbiter Columbia in a tragic accident as the ship returned to Earth following mission STS-107. Students and staff of the Shoshone-Bannock Nation had an experiment on board Columbia. The public was invited to the memorial service, held in the KSC Visitor Complex, which included comments by Center Director Jim Kennedy and Executive Director of Florida Space Authority Winston Scott. Scott is a former astronaut who flew on Columbia in 1997.

  8. Ray Casting of Large Multi-Resolution Volume Datasets

    NASA Astrophysics Data System (ADS)

    Lux, C.; Fröhlich, B.

    2009-04-01

    High quality volume visualization through ray casting on graphics processing units (GPU) has become an important approach for many application domains. We present a GPU-based, multi-resolution ray casting technique for the interactive visualization of massive volume data sets commonly found in the oil and gas industry. Large volume data sets are represented as a multi-resolution hierarchy based on an octree data structure. The original volume data is decomposed into small bricks of a fixed size acting as the leaf nodes of the octree. These nodes are the highest resolution of the volume. Coarser resolutions are represented through inner nodes of the hierarchy, which are generated by downsampling eight neighboring nodes on a finer level. Due to the limited memory resources of current desktop workstations and graphics hardware, only a limited working set of bricks can be maintained locally for a frame to be displayed. This working set is chosen to represent the whole volume at different local resolution levels depending on the current viewer position, transfer function, and distinct areas of interest. During runtime the working set of bricks is maintained in CPU and GPU memory and is adaptively updated by asynchronously fetching data from external sources like hard drives or a network. The CPU memory acts as a second-level cache for these sources, from which the GPU representation is updated. Our volume ray casting algorithm is based on a 3D texture atlas in GPU memory. This texture atlas contains the complete working set of bricks of the current multi-resolution representation of the volume, which enables the ray casting algorithm to access the whole working set through only a single 3D texture. For traversing rays through the volume, information about the locations and resolution levels of visited bricks is required for correct compositing computations. We encode this information into a small 3D index texture which represents the current octree subdivision on its finest level and spatially organizes the bricked data. This approach allows us to render a bricked multi-resolution volume data set using only a single rendering pass with no loss of compositing precision. In contrast, most state-of-the-art volume rendering systems handle the bricked data as individual 3D textures, which are rendered one at a time while the results are composited into a lower precision frame buffer. Furthermore, our method enables us to integrate advanced volume rendering techniques like empty-space skipping, adaptive sampling, and preintegrated transfer functions in a very straightforward manner at virtually no extra cost. Our interactive volume ray casting implementation allows high quality visualizations of massive volume data sets of tens of gigabytes in size on standard desktop workstations.
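    The single-lookup index idea can be illustrated in one dimension; the names and layout below are our own simplification, not the paper's data structures:

```python
def build_index(bricks, finest_res):
    """Flatten a multi-resolution set of bricks into one index array at the
    finest resolution -- a 1-D analogue of the paper's 3-D index texture.
    Each brick is (level, offset_at_its_level); a level-l brick covers
    2**l finest cells.  A ray sample then needs only a single lookup to
    find which brick, at which resolution, covers its position."""
    index = [None] * finest_res
    for brick_id, (level, offset) in enumerate(bricks):
        start = offset * (2 ** level)
        for cell in range(start, start + 2 ** level):
            index[cell] = (brick_id, level)
    return index

# Three bricks tiling 4 finest cells: one coarse brick over cells 0-1,
# two fine bricks over cells 2 and 3.
bricks = [(1, 0), (0, 2), (0, 3)]
```

    Here `build_index(bricks, 4)` yields `[(0, 1), (0, 1), (1, 0), (2, 0)]`: cells 0 and 1 resolve to brick 0 at the coarse level, cells 2 and 3 to the fine bricks. In the real system the same role is played by a 3D index texture sampled during ray traversal.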

  9. Control of information in working memory: Encoding and removal of distractors in the complex-span paradigm.

    PubMed

    Oberauer, Klaus; Lewandowsky, Stephan

    2016-11-01

    The article reports four experiments with complex-span tasks in which encoding of memory items alternates with processing of distractors. The experiments test two assumptions of a computational model of complex span, SOB-CS: (1) distractor processing impairs memory because distractors are encoded into working memory, thereby interfering with memoranda; and (2) free time following distractors is used to remove them from working memory by unbinding their representations from list context. Experiment 1 shows that distractors are erroneously chosen for recall more often than not-presented stimuli, demonstrating that distractors are encoded into memory. Distractor intrusions declined with longer free time, as predicted by distractor removal. Experiment 2 shows these effects even when distractors precede the memory list, ruling out an account based on selective rehearsal of memoranda during free time. Experiments 3 and 4 test the notion that distractors decay over time. Both experiments show that, contrary to the notion of distractor decay, the chance of a distractor intruding at test does not decline with increasing time since encoding of that distractor. Experiment 4 provides additional evidence against the prediction from distractor decay that distractor intrusions decline over an unfilled retention interval. Taken together, the results support SOB-CS and rule out alternative explanations. Data and simulation code are available on Open Science Framework: osf.io/3ewh7. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Hänsel, Gretel and the slime mould—how an external spatial memory aids navigation in complex environments

    NASA Astrophysics Data System (ADS)

    Smith-Ferguson, Jules; Reid, Chris R.; Latty, Tanya; Beekman, Madeleine

    2017-10-01

    The ability to navigate through an environment is critical to most organisms’ ability to survive and reproduce. The presence of a memory system greatly enhances navigational success. Therefore, natural selection is likely to drive the creation of memory systems, even in non-neuronal organisms, if having such a system is adaptive. Here we examine if the external spatial memory system present in the acellular slime mould, Physarum polycephalum, provides an adaptive advantage for resource acquisition. P. polycephalum lays tracks of extracellular slime as it moves through its environment. Previous work has shown that the presence of extracellular slime allows the organism to escape from a trap in laboratory experiments simply by avoiding areas previously explored. Here we further investigate the benefits of using extracellular slime as an external spatial memory by testing the organism’s ability to navigate through environments of differing complexity with and without the ability to use its external memory. Our results suggest that the external memory has an adaptive advantage in ‘open’ and simple bounded environments. However, in a complex bounded environment, the extracellular slime provides no advantage, and may even negatively affect the organism’s navigational abilities. Our results indicate that the exact experimental set up matters if one wants to fully understand how the presence of extracellular slime affects the slime mould’s search behaviour.

  11. Frills, Furbelows and Activated Memory: Syntactically Optional Elements in the Spontaneous Language Production of Bilingual Speakers

    ERIC Educational Resources Information Center

    Fehringer, Carol; Fry, Christina

    2007-01-01

    Previous studies have shown a correlation between working memory (WM) and syntactic complexity (variously defined) in language comprehension. The present study investigates this relationship in spontaneous language production, proposing a novel metric, informed by language development and disorders, where complexity is construed in terms of those…

  12. Roles of Working Memory Performance and Instructional Strategy in Complex Cognitive Task Performance

    ERIC Educational Resources Information Center

    Cevik, V.; Altun, A.

    2016-01-01

    This study aims to investigate how working memory (WM) performances and instructional strategy choices affect learners' complex cognitive task performance in online environments. Three different e-learning environments were designed based on Merrill's (2006a) model of instructional strategies. The lack of experimental research on his framework is…

  13. Validity of the MicroDYN Approach: Complex Problem Solving Predicts School Grades beyond Working Memory Capacity

    ERIC Educational Resources Information Center

    Schweizer, Fabian; Wustenberg, Sascha; Greiff, Samuel

    2013-01-01

    This study examines the validity of the complex problem solving (CPS) test MicroDYN by investigating a) the relation between its dimensions--rule identification (exploration strategy), rule knowledge (acquired knowledge), rule application (control performance)--and working memory capacity (WMC), and b) whether CPS predicts school grades in…

  14. The Effects of Pictorial Complexity and Cognitive Style on Visual Recall Memory.

    ERIC Educational Resources Information Center

    Jesky, Romaine R.; Berry, Louis H.

    The effect of the interaction between cognitive style differences (field dependence/field independence) and various degrees of visual complexity on pictorial recall memory was explored using three sets of visuals in three different formats--line drawing, black and white, and color. The subjects were 86 undergraduate students enrolled in two core…

  15. Block algebra in two-component BKP and D type Drinfeld-Sokolov hierarchies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chuanzhong, E-mail: lichuanzhong@nbu.edu.cn; He, Jingsong, E-mail: hejingsong@nbu.edu.cn

    We construct generalized additional symmetries of a two-component BKP hierarchy defined by two pseudo-differential Lax operators. These additional symmetry flows form a Block type algebra with some modified (or additional) terms because of a B type reduction condition of this integrable hierarchy. Further, we show that the D type Drinfeld-Sokolov hierarchy, which is a reduction of the two-component BKP hierarchy, possesses a complete Block type additional symmetry algebra. This D type Drinfeld-Sokolov hierarchy has an algebraic structure similar to that of the bigraded Toda hierarchy, which is a differential-discrete integrable system.

  16. GrouseFlocks: steerable exploration of graph hierarchy space.

    PubMed

    Archambault, Daniel; Munzner, Tamara; Auber, David

    2008-01-01

    Several previous systems allow users to interactively explore a large input graph through cuts of a superimposed hierarchy. This hierarchy is often created using clustering algorithms or topological features present in the graph. However, many graphs have domain-specific attributes associated with the nodes and edges, which could be used to create many possible hierarchies providing unique views of the input graph. GrouseFlocks is a system for the exploration of this graph hierarchy space. By allowing users to see several different possible hierarchies on the same graph, the system helps users investigate graph hierarchy space instead of a single fixed hierarchy. GrouseFlocks provides a simple set of operations so that users can create and modify their graph hierarchies based on selections. These selections can be made manually or based on patterns in the attribute data provided with the graph. It provides feedback to the user within seconds, allowing interactive exploration of this space.

  17. The exposure hierarchy as a measure of progress and efficacy in the treatment of social anxiety disorder.

    PubMed

    Katerelos, Marina; Hawley, Lance L; Antony, Martin M; McCabe, Randi E

    2008-07-01

    This study explored the psychometric properties and utility of the exposure hierarchy as a measure of treatment outcome for social anxiety disorder (SAD). An exposure hierarchy was created for each of 103 individuals with a diagnosis of SAD who completed a course of cognitive behavioral group therapy. Exposure hierarchy ratings were collected on a weekly basis, and a series of self-report measures were collected before and after treatment. Results indicated that the exposure hierarchy demonstrated high test-retest reliability, as well as significant convergent validity, as participants' exposure hierarchy ratings correlated positively with scores on conceptually related measures. Hierarchy ratings were significantly associated with changes in SAD symptoms over time. However, exposure hierarchy ratings were also correlated with general measures of psychopathology, suggesting limited discriminant validity. The study highlights the clinical and scientific utility of the exposure hierarchy.

  18. Toda hierarchies and their applications

    NASA Astrophysics Data System (ADS)

    Takasaki, Kanehisa

    2018-05-01

    The 2D Toda hierarchy occupies a central position in the family of integrable hierarchies of the Toda type. The 1D Toda hierarchy and the Ablowitz–Ladik (aka relativistic Toda) hierarchy can be derived from the 2D Toda hierarchy as reductions. These integrable hierarchies have been applied to various problems of mathematics and mathematical physics since the 1990s. A recent example is a series of studies on models of statistical mechanics called the melting crystal model. This research has revealed that the aforementioned two reductions of the 2D Toda hierarchy underlie two different melting crystal models. Technical clues are a fermionic realization of the quantum torus algebra, special algebraic relations therein called shift symmetries, and a matrix factorization problem. The two melting crystal models thus exhibit remarkable similarity with the Hermitian and unitary matrix models, for which the two reductions of the 2D Toda hierarchy play the role of fundamental integrable structures.

  19. Efficient calculation of open quantum system dynamics and time-resolved spectroscopy with distributed memory HEOM (DM-HEOM).

    PubMed

    Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav

    2018-06-11

    Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as into the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.

  20. PCM-Based Durable Write Cache for Fast Disk I/O

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zhuo; Wang, Bin; Carpenter, Patrick

    2012-01-01

    Flash based solid-state devices (FSSDs) have been adopted within the memory hierarchy to improve the performance of hard disk drive (HDD) based storage system. However, with the fast development of storage-class memories, new storage technologies with better performance and higher write endurance than FSSDs are emerging, e.g., phase-change memory (PCM). Understanding how to leverage these state-of-the-art storage technologies for modern computing systems is important to solve challenging data intensive computing problems. In this paper, we propose to leverage PCM for a hybrid PCM-HDD storage architecture. We identify the limitations of traditional LRU caching algorithms for PCM-based caches, and develop a novel hash-based write caching scheme called HALO to improve random write performance of hard disks. To address the limited durability of PCM devices and solve the degraded spatial locality in traditional wear-leveling techniques, we further propose novel PCM management algorithms that provide effective wear-leveling while maximizing access parallelism. We have evaluated this PCM-based hybrid storage architecture using applications with a diverse set of I/O access patterns. Our experimental results demonstrate that the HALO caching scheme leads to an average reduction of 36.8% in execution time compared to the LRU caching scheme, and that the SFC wear leveling extends the lifetime of PCM by a factor of 21.6.
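    The general idea of a hash/region-bucketed write cache, as opposed to plain LRU, can be sketched as follows. This is our own simplified illustration of the concept, not the paper's HALO algorithm:

```python
from collections import defaultdict

class RegionWriteCache:
    """Illustrative sketch of region-bucketed write caching: buffer random
    block writes grouped by disk region, so that evicting one bucket
    destages spatially close blocks in ascending order, turning random
    writes into near-sequential disk I/O."""

    def __init__(self, capacity, region_size=64):
        self.capacity = capacity          # max buffered blocks
        self.region_size = region_size    # blocks per disk region
        self.buckets = defaultdict(dict)  # region id -> {block: data}
        self.count = 0

    def write(self, block, data):
        region = block // self.region_size
        if block not in self.buckets[region]:
            self.count += 1
        self.buckets[region][block] = data  # rewrites coalesce in cache
        if self.count > self.capacity:
            self.destage_fullest()

    def destage_fullest(self):
        """Flush the fullest region as one ascending, near-sequential batch."""
        region = max(self.buckets, key=lambda r: len(self.buckets[r]))
        batch = sorted(self.buckets.pop(region).items())
        self.count -= len(batch)
        return batch
```

    Compared with LRU, which evicts by recency alone, grouping by region trades some recency awareness for destage batches the disk can service with few seeks.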
