Sample records for promising computing paradigm

  1. An Architecture for Cross-Cloud System Management

    NASA Astrophysics Data System (ADS)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

    The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
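
    The homogeneous-management idea described here is essentially an adapter layer. A minimal sketch, assuming two hypothetical interface adapters (all class and method names are illustrative, not the authors' architecture):

    ```python
    from abc import ABC, abstractmethod

    class ComputeProvider(ABC):
        """Uniform interface hiding provider-specific management APIs."""
        @abstractmethod
        def launch(self, image: str, count: int) -> list[str]: ...
        @abstractmethod
        def terminate(self, instance_id: str) -> None: ...

    class SoapInterfaceAdapter(ComputeProvider):   # hypothetical adapter
        def launch(self, image, count):
            return [f"soap-{image}-{i}" for i in range(count)]  # stand-in for the SOAP call
        def terminate(self, instance_id):
            print(f"terminating {instance_id} via SOAP")

    class QueryInterfaceAdapter(ComputeProvider):  # hypothetical adapter
        def launch(self, image, count):
            return [f"query-{image}-{i}" for i in range(count)]
        def terminate(self, instance_id):
            print(f"terminating {instance_id} via Query API")

    def provision(providers: list[ComputeProvider], image: str, n: int) -> list[str]:
        """Spread n instances across interfaces through the uniform API."""
        ids = []
        for k, p in enumerate(providers):
            share = n // len(providers) + (1 if k < n % len(providers) else 0)
            ids += p.launch(image, share)
        return ids

    print(provision([SoapInterfaceAdapter(), QueryInterfaceAdapter()], "ami-x", 5))
    ```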

  2. Task Scheduling in Desktop Grids: Open Problems

    NASA Astrophysics Data System (ADS)

    Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny

    2017-12-01

    We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.
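
    As a toy illustration of the optimal-task-grouping trade-off (the cost model and numbers below are ours, not from the survey): batching k tasks amortizes per-batch overhead, but one failed task forces the whole batch to be recomputed, so an intermediate batch size wins.

    ```python
    def expected_cost_per_task(k, task_time=10.0, overhead=30.0, fail_p=0.01):
        """Expected time per task when k tasks form one batch and the batch
        is retried until it succeeds (independent per-task failures)."""
        p_batch_ok = (1.0 - fail_p) ** k
        return (overhead + k * task_time) / p_batch_ok / k

    best_k = min(range(1, 101), key=expected_cost_per_task)
    print(best_k, round(expected_cost_per_task(best_k), 2))
    ```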

  3. Supervised Machine Learning for Population Genetics: A New Paradigm

    PubMed Central

    Schrider, Daniel R.; Kern, Andrew D.

    2018-01-01

    As population genomic datasets grow in size, researchers are faced with the daunting task of making sense of a flood of information. To keep pace with this explosion of data, computational methodologies for population genetic inference are rapidly being developed to best utilize genomic sequence data. In this review we discuss a new paradigm that has emerged in computational population genomics: that of supervised machine learning (ML). We review the fundamentals of ML, discuss recent applications of supervised ML to population genetics that outperform competing methods, and describe promising future directions in this area. Ultimately, we argue that supervised ML is an important and underutilized tool that has considerable potential for the world of evolutionary genomics. PMID:29331490
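
    A minimal sketch of the supervised-ML recipe the review describes: train a classifier on summary statistics computed from labelled simulations, then apply it to observed data. The features below are random stand-ins for real population-genetic simulations:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    # Stand-in "summary statistics" (e.g. nucleotide diversity, Tajima's D)
    # simulated under two labelled evolutionary scenarios.
    X_neutral = rng.normal(0.0, 1.0, size=(500, 10))
    X_sweep = rng.normal(0.5, 1.0, size=(500, 10))
    X = np.vstack([X_neutral, X_sweep])
    y = np.array([0] * 500 + [1] * 500)  # 0 = neutral, 1 = selective sweep

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    observed = rng.normal(0.4, 1.0, size=(1, 10))   # an "observed" genomic window
    print(clf.predict_proba(observed))              # class probabilities
    ```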

  4. A novel brain-computer interface based on the rapid serial visual presentation paradigm.

    PubMed

    Acqualagna, Laura; Treder, Matthias Sebastian; Schreuder, Martijn; Blankertz, Benjamin

    2010-01-01

    Most present-day visual brain computer interfaces (BCIs) suffer from the fact that they rely on eye movements, are slow-paced, or feature a small vocabulary. As a potential remedy, we explored a novel BCI paradigm consisting of a central rapid serial visual presentation (RSVP) of the stimuli. It has a large vocabulary and realizes a BCI system based on covert non-spatial selective visual attention. In an offline study, eight participants were presented with sequences of rapid bursts of symbols. Two different speeds and two different color conditions were investigated. Robust early visual and P300 components were elicited time-locked to the presentation of the target. Offline classification revealed a mean accuracy of up to 90% for selecting the correct symbol out of 30 possibilities. The results suggest that RSVP-BCI is a promising new paradigm, also for patients with oculomotor impairments.
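
    In outline, such offline classification epochs the EEG around each stimulus, extracts spatio-temporal features, and trains a linear classifier to separate target from non-target epochs. A minimal sketch on synthetic data (the authors' exact pipeline is not specified here):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_epochs, n_channels, n_times = 300, 16, 50
    X = rng.normal(size=(n_epochs, n_channels, n_times))   # fake EEG epochs
    y = rng.integers(0, 2, n_epochs)                       # 1 = target symbol
    X[y == 1, :, 25:35] += 0.4        # crude stand-in for a P300 deflection

    features = X.reshape(n_epochs, -1)                     # channels x time, flattened
    lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    print(cross_val_score(lda, features, y, cv=5).mean())  # target vs non-target accuracy
    ```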

  5. Client/Server Architecture Promises Radical Changes.

    ERIC Educational Resources Information Center

    Freeman, Grey; York, Jerry

    1991-01-01

    This article discusses the emergence of the client/server paradigm for the delivery of computer applications, its emergence in response to the proliferation of microcomputers and local area networks, the applicability of the model in academic institutions, and its implications for college campus information technology organizations. (Author/DB)

  6. Exposure Control Using Adaptive Multi-Stage Item Bundles.

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    This paper presents a multistage adaptive testing test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…

  7. A Hybrid Brain-Computer Interface Based on the Fusion of P300 and SSVEP Scores.

    PubMed

    Yin, Erwei; Zeyl, Timothy; Saab, Rami; Chau, Tom; Hu, Dewen; Zhou, Zongtan

    2015-07-01

    The present study proposes a hybrid brain-computer interface (BCI) with 64 selectable items based on the fusion of P300 and steady-state visually evoked potential (SSVEP) brain signals. With this approach, row/column (RC) P300 and two-step SSVEP paradigms were integrated to create two hybrid paradigms, which we denote as the double RC (DRC) and 4-D spellers. In each hybrid paradigm, the target is simultaneously detected based on both P300 and SSVEP potentials as measured by the electroencephalogram. We further proposed a maximum-probability estimation (MPE) fusion approach to combine the P300 and SSVEP on a score level and compared this approach to other approaches based on linear discriminant analysis, a naïve Bayes classifier, and support vector machines. The experimental results obtained from thirteen participants indicated that the 4-D hybrid paradigm outperformed the DRC paradigm and that the MPE fusion achieved higher accuracy compared with the other approaches. Importantly, 12 of the 13 participants using the 4-D paradigm achieved an accuracy of over 90%, and the average accuracy was 95.18%. These promising results suggest that the proposed hybrid BCI system could be used in the design of a high-performance BCI-based keyboard.
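
    The flavor of score-level fusion can be sketched as follows (an illustrative estimator of ours, not the paper's exact MPE formulation): normalize each modality's per-item scores into probabilities and select the item with maximum joint probability.

    ```python
    import numpy as np

    def fuse(p300_scores, ssvep_scores):
        """Softmax-normalize each modality's scores, then pick the item
        maximizing the product of the two probabilities."""
        def softmax(s):
            e = np.exp(s - s.max())
            return e / e.sum()
        joint = softmax(np.asarray(p300_scores)) * softmax(np.asarray(ssvep_scores))
        return int(np.argmax(joint)), float(joint.max() / joint.sum())

    rng = np.random.default_rng(2)
    p300, ssvep = rng.normal(size=64), rng.normal(size=64)   # 64 selectable items
    p300[42] += 2.0; ssvep[42] += 2.0    # the attended item scores high in both
    print(fuse(p300, ssvep))             # -> (42, fused confidence)
    ```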

  8. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
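
    Spurious correlation is easy to demonstrate: with far more variables than samples, some purely random predictor will correlate strongly with the response. A minimal sketch:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, p = 60, 5000                   # few samples, many variables
    X = rng.normal(size=(n, p))       # every predictor independent of y
    y = rng.normal(size=n)

    Xc, yc = X - X.mean(0), y - y.mean()
    r = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    print(np.abs(r).max())            # typically > 0.5 despite zero true signal
    ```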

  9. Parallel Computing Strategies for Irregular Algorithms

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.
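
    One load-balancing ingredient can be sketched with the classic longest-processing-time heuristic: sort tasks by estimated cost and repeatedly assign the next task to the least-loaded processor (illustrative only; the paper uses more sophisticated partitioning and ordering techniques):

    ```python
    import heapq

    def lpt_schedule(task_costs, n_procs):
        """Greedy longest-processing-time-first mapping of irregular tasks."""
        loads = [(0.0, p) for p in range(n_procs)]   # min-heap of (load, proc)
        heapq.heapify(loads)
        assignment = {}
        for task, cost in sorted(enumerate(task_costs), key=lambda t: -t[1]):
            load, p = heapq.heappop(loads)
            assignment[task] = p
            heapq.heappush(loads, (load + cost, p))
        return assignment, max(load for load, _ in loads)

    tasks = [7.0, 3.0, 9.0, 2.0, 5.0, 8.0, 4.0, 6.0]  # made-up task costs
    print(lpt_schedule(tasks, 3))                     # mapping and makespan
    ```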

  10. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469

  11. Proxy-equation paradigm: A strategy for massively parallel asynchronous computations

    NASA Astrophysics Data System (ADS)

    Mittal, Ankita; Girimaji, Sharath

    2017-09-01

    Massively parallel simulations of transport equation systems call for a paradigm change in algorithm development to achieve efficient scalability. Traditional approaches require time synchronization of processing elements (PEs), which severely restricts scalability. Relaxing the synchronization requirement introduces error and slows down convergence. In this paper, we propose and develop a novel "proxy equation" concept for a general transport equation that (i) tolerates asynchrony with minimal added error, (ii) preserves convergence order and thus (iii) is expected to scale efficiently on massively parallel machines. The central idea is to modify a priori the transport equation at the PE boundaries to offset asynchrony errors. Proof-of-concept computations are performed using a one-dimensional advection (convection) diffusion equation. The results demonstrate the promise and advantages of the present strategy.
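
    For reference, the one-dimensional advection-diffusion model problem used in the proof-of-concept computations has the standard form (notation ours):

    ```latex
    \frac{\partial u}{\partial t} + c\,\frac{\partial u}{\partial x}
      = \alpha\,\frac{\partial^2 u}{\partial x^2}
    ```

    with advection speed c and diffusivity α; the proxy-equation idea modifies this equation a priori near PE boundaries so that the error introduced by asynchronous updates is offset.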

  12. The virtual mirror: a new interaction paradigm for augmented reality environments.

    PubMed

    Bichlmeier, Christoph; Heining, Sandro Michael; Feuerstein, Marco; Navab, Nassir

    2009-09-01

    Medical augmented reality (AR) has been widely discussed within the medical imaging as well as computer aided surgery communities. Different systems for exemplary medical applications have been proposed, and some of them have produced promising results. One major issue still hindering AR technology from being regularly used in medical applications is the interaction between the physician and the superimposed 3-D virtual data. Classical interaction paradigms, for instance with keyboard and mouse, are not adequate for interacting with visualized medical 3-D imaging data in an AR environment. This paper introduces the concept of a tangible/controllable Virtual Mirror for medical AR applications. This concept intuitively augments the direct view of the surgeon with all desired views on volumetric medical imaging data registered with the operation site, without moving around the operating table or displacing the patient. We selected two medical procedures to demonstrate and evaluate the potential of the Virtual Mirror for the surgical workflow. Results confirm the intuitiveness of this new paradigm and its perceptive advantages for AR-based computer aided interventions.

  13. An Efficient ERP-Based Brain-Computer Interface Using Random Set Presentation and Face Familiarity

    PubMed Central

    Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup. PMID:25384045

  14. An efficient ERP-based brain-computer interface using random set presentation and face familiarity.

    PubMed

    Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan

    2014-01-01

    Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup.

  15. All-optical reservoir computer based on saturation of absorption.

    PubMed

    Dejonckheere, Antoine; Duport, François; Smerieri, Anteo; Fang, Li; Oudar, Jean-Louis; Haelterman, Marc; Massar, Serge

    2014-05-05

    Reservoir computing is a new bio-inspired computation paradigm. It exploits a dynamical system driven by a time-dependent input to carry out computation. For efficient information processing, only a few parameters of the reservoir need to be tuned, which makes it a promising framework for hardware implementation. Recently, electronic, opto-electronic and all-optical experimental reservoir computers were reported. In those implementations, the nonlinear response of the reservoir is provided by active devices such as optoelectronic modulators or optical amplifiers. By contrast, we propose here the first reservoir computer based on a fully passive nonlinearity, namely the saturable absorption of a semiconductor mirror. Our experimental setup constitutes an important step towards the development of ultrafast low-consumption analog computers.
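
    In software terms, the reservoir idea reduces to driving a fixed random dynamical system with the input and training only a linear readout. A minimal echo-state-network sketch in numpy (the paper's reservoir is optical, not simulated):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    N, T = 200, 1000                      # reservoir size, time steps
    u = rng.uniform(-1, 1, T)             # input signal
    target = np.roll(u, 3)                # task: recall the input 3 steps back

    W_in = rng.uniform(-0.5, 0.5, N)
    W = rng.normal(size=(N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

    x, states = np.zeros(N), []
    for t in range(T):
        x = np.tanh(W @ x + W_in * u[t])  # the untrained "reservoir" dynamics
        states.append(x.copy())
    S = np.array(states)[100:]            # discard warm-up
    y = target[100:]

    # Only the linear readout is trained (ridge regression).
    W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
    print(np.corrcoef(S @ W_out, y)[0, 1])   # close to 1
    ```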

  16. Software Systems for High-performance Quantum Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; Britt, Keith A

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  17. On Patterns in Affective Media

    NASA Astrophysics Data System (ADS)

    Adamatzky, Andrew

    In computational experiments with cellular automaton models of affective solutions, where chemical species represent happiness, anger, fear, confusion and sadness, we study the space-time dynamics of emotions. We demonstrate the feasibility of the affective-solution paradigm on the example of emotional abuse therapy. The results outlined in the present paper offer an unconventional but promising technique for designing, analyzing and interpreting the spatio-temporal dynamics of mass moods in crowds.

  18. A software methodology for compiling quantum programs

    NASA Astrophysics Data System (ADS)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  19. A practical approach to virtualization in HEP

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Aguado Sánchez, C.; Blomer, J.; Harutyunyan, A.; Mudrinic, M.

    2011-01-01

    In the attempt to solve the problem of processing data coming from LHC experiments at CERN at a rate of 15 PB per year, for almost a decade the High Energy Physics (HEP) community has focused its efforts on the development of the Worldwide LHC Computing Grid. This generated large interest and expectations, promising to revolutionize computing. Meanwhile, having initially taken part in the Grid standardization process, industry has moved in a different direction and started promoting the Cloud Computing paradigm, which aims to solve problems on a similar scale and in an equally seamless way as was expected in the idealized Grid approach. A key enabling technology behind Cloud computing is server virtualization. In early 2008, an R&D project was established in the PH-SFT group at CERN to investigate how virtualization technology could be used to improve and simplify the daily interaction of physicists with experiment software frameworks and the Grid infrastructure. In this article we shall first briefly compare the Grid and Cloud computing paradigms and then summarize the results of the R&D activity, pointing out where and how virtualization technology could be effectively used in our field in order to maximize practical benefits whilst avoiding potential pitfalls.

  20. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
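
    A flavor of the framework, based on ProjectQ's published Python API (exact details may vary between versions): gates are applied with the `|` syntax, and the MainEngine compiles the circuit for the chosen backend, here the built-in simulator.

    ```python
    from projectq import MainEngine
    from projectq.ops import All, CNOT, H, Measure

    eng = MainEngine()              # default backend: built-in simulator
    q = eng.allocate_qureg(2)

    H | q[0]                        # put qubit 0 into superposition
    CNOT | (q[0], q[1])             # entangle -> Bell state
    All(Measure) | q

    eng.flush()                     # push the circuit through the compiler
    print([int(bit) for bit in q])  # 00 or 11, each with probability 1/2
    ```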

  21. RNA nanotechnology for computer design and in vivo computation

    PubMed Central

    Qiu, Meikang; Khisamutdinov, Emil; Zhao, Zhengyi; Pan, Cheryl; Choi, Jeong-Woo; Leontis, Neocles B.; Guo, Peixuan

    2013-01-01

    Molecular-scale computing has been explored since 1989 owing to the foreseeable limitation of Moore's law for silicon-based computation devices. With the potential of massive parallelism, low energy consumption and capability of working in vivo, molecular-scale computing promises a new computational paradigm. Inspired by the concepts from the electronic computer, DNA computing has realized basic Boolean functions and has progressed into multi-layered circuits. Recently, RNA nanotechnology has emerged as an alternative approach. Owing to the newly discovered thermodynamic stability of a special RNA motif (Shu et al. 2011 Nat. Nanotechnol. 6, 658–667 (doi:10.1038/nnano.2011.105)), RNA nanoparticles are emerging as another promising medium for nanodevice and nanomedicine as well as molecular-scale computing. Like DNA, RNA sequences can be designed to form desired secondary structures in a straightforward manner, but RNA is structurally more versatile and more thermodynamically stable owing to its non-canonical base-pairing, tertiary interactions and base-stacking property. A 90-nucleotide RNA can exhibit 4⁹⁰ nanostructures, and its loops and tertiary architecture can serve as a mounting dovetail that eliminates the need for external linking dowels. Its enzymatic and fluorogenic activity creates diversity in computational design. Varieties of small RNA can work cooperatively, synergistically or antagonistically to carry out computational logic circuits. The riboswitch and enzymatic ribozyme activities and its special in vivo attributes offer a great potential for in vivo computation. Unique features in transcription, termination, self-assembly, self-processing and acid resistance enable in vivo production of RNA nanoparticles that harbour various regulators for intracellular manipulation. With all these advantages, RNA computation is promising, but it is still in its infancy. Many challenges still exist. Collaborations between RNA nanotechnologists and computer scientists are necessary to advance this nascent technology. PMID:24000362

  22. RNA nanotechnology for computer design and in vivo computation.

    PubMed

    Qiu, Meikang; Khisamutdinov, Emil; Zhao, Zhengyi; Pan, Cheryl; Choi, Jeong-Woo; Leontis, Neocles B; Guo, Peixuan

    2013-10-13

    Molecular-scale computing has been explored since 1989 owing to the foreseeable limitation of Moore's law for silicon-based computation devices. With the potential of massive parallelism, low energy consumption and capability of working in vivo, molecular-scale computing promises a new computational paradigm. Inspired by the concepts from the electronic computer, DNA computing has realized basic Boolean functions and has progressed into multi-layered circuits. Recently, RNA nanotechnology has emerged as an alternative approach. Owing to the newly discovered thermodynamic stability of a special RNA motif (Shu et al. 2011 Nat. Nanotechnol. 6, 658-667 (doi:10.1038/nnano.2011.105)), RNA nanoparticles are emerging as another promising medium for nanodevice and nanomedicine as well as molecular-scale computing. Like DNA, RNA sequences can be designed to form desired secondary structures in a straightforward manner, but RNA is structurally more versatile and more thermodynamically stable owing to its non-canonical base-pairing, tertiary interactions and base-stacking property. A 90-nucleotide RNA can exhibit 4⁹⁰ nanostructures, and its loops and tertiary architecture can serve as a mounting dovetail that eliminates the need for external linking dowels. Its enzymatic and fluorogenic activity creates diversity in computational design. Varieties of small RNA can work cooperatively, synergistically or antagonistically to carry out computational logic circuits. The riboswitch and enzymatic ribozyme activities and its special in vivo attributes offer a great potential for in vivo computation. Unique features in transcription, termination, self-assembly, self-processing and acid resistance enable in vivo production of RNA nanoparticles that harbour various regulators for intracellular manipulation. With all these advantages, RNA computation is promising, but it is still in its infancy. Many challenges still exist. Collaborations between RNA nanotechnologists and computer scientists are necessary to advance this nascent technology.

  23. Sequenced subjective accents for brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Vlek, R. J.; Schaefer, R. S.; Gielen, C. C. A. M.; Farquhar, J. D. R.; Desain, P.

    2011-06-01

    Subjective accenting is a cognitive process in which identical auditory pulses at an isochronous rate turn into the percept of an accenting pattern. This process can be voluntarily controlled, making it a candidate for communication from human user to machine in a brain-computer interface (BCI) system. In this study we investigated whether subjective accenting is a feasible paradigm for BCI and how its time-structured nature can be exploited for optimal decoding from non-invasive EEG data. Ten subjects perceived and imagined different metric patterns (two-, three- and four-beat) superimposed on a steady metronome. With an offline classification paradigm, we classified imagined accented from non-accented beats on a single trial (0.5 s) level with an average accuracy of 60.4% over all subjects. We show that decoding of imagined accents is also possible with a classifier trained on perception data. Cyclic patterns of accents and non-accents were successfully decoded with a sequence classification algorithm. Classification performances were compared by means of bit rate. Performance in the best scenario translates into an average bit rate of 4.4 bits min⁻¹ over subjects, which makes subjective accenting a promising paradigm for an online auditory BCI.
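
    Bit rate in BCI studies is commonly computed with the Wolpaw formula, shown here for context (the paper's exact definition may differ): for N classes and single-trial accuracy P, the information per selection is

    ```latex
    B = \log_2 N + P \log_2 P + (1 - P)\,\log_2\!\frac{1 - P}{N - 1}
    ```

    which is multiplied by the number of selections per minute to obtain bits min⁻¹.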

  24. A Review Study on Cloud Computing Issues

    NASA Astrophysics Data System (ADS)

    Kanaan Kadhim, Qusay; Yusof, Robiah; Sadeq Mahdi, Hamid; Al-shami, Sayed Samer Ali; Rahayu Selamat, Siti

    2018-05-01

    Cloud computing is the most promising current implementation of utility computing in the business world, because it provides some key features over classic utility computing, such as elasticity to allow clients to dynamically scale up and scale down resources at execution time. Nevertheless, cloud computing is still in its premature stage and suffers from a lack of standardization. Security issues are the main challenges to cloud computing adoption. Thus, critical industries such as government organizations (ministries) are reluctant to trust cloud computing, fearing the loss of their sensitive data, which resides on the cloud with no knowledge of its location, and citing the lack of transparency about the mechanisms Cloud Service Providers (CSPs) use to secure data and applications; this has created a barrier against adopting this agile computing paradigm. This study aims to review and classify the issues that surround the implementation of cloud computing, a hot area that needs to be addressed by future research.

  25. Neurobiomimetic constructs for intelligent unmanned systems and robotics

    NASA Astrophysics Data System (ADS)

    Braun, Jerome J.; Shah, Danelle C.; DeAngelus, Marianne A.

    2014-06-01

    This paper discusses a paradigm we refer to as neurobiomimetic, which involves emulations of brain neuroanatomy and neurobiology aspects and processes. Neurobiomimetic constructs include rudimentary and down-scaled computational representations of brain regions, sub-regions, and synaptic connectivity. Many different instances of neurobiomimetic constructs are possible, depending on aspects such as the initial conditions of synaptic connectivity, the number of neuron elements in regions, connectivity specifics, and more, and we refer to these instances as `animats'. While downscaled for computational feasibility, the animats are very large constructs; the animats implemented in this work contain over 47,000 neuron elements and over 720,000 synaptic connections. The paper outlines aspects of the implemented animats, the spatial memory and learning cognitive task, the virtual-reality environment constructed to study the animats performing that task, and a discussion of results. In a broad sense, we argue that the neurobiomimetic paradigm pursued in this work constitutes a particularly promising path to artificial cognition and intelligent unmanned systems. Biological brains readily cope with challenges of real-life tasks that consistently prove beyond even the most sophisticated algorithmic approaches known. At the cross-over point of neuroscience, cognitive science and computer science, paradigms such as the one pursued in this work aim to mimic the mechanisms of biological brains and, as such, we argue, may lead to machines with abilities closer to those of biological species.

  26. The genesis of neurosurgery and the evolution of the neurosurgical operative environment: part II--concepts for future development, 2003 and beyond.

    PubMed

    Liu, Charles Y; Spicer, Mark; Apuzzo, Michael L J

    2003-01-01

    The future development of the neurosurgical operative environment is driven principally by concurrent development in science and technology. In the new millennium, these developments are taking on a Jules Verne quality, with the ability to construct and manipulate the human organism and its surroundings at the level of atoms and molecules seemingly at hand. Thus, an examination of currents in technology advancement from the neurosurgical perspective can provide insight into the evolution of the neurosurgical operative environment. In the future, the optimal design solution for the operative environment requirements of specialized neurosurgery may take the form of composites of venues that are currently mutually distinct. Advances in microfabrication technology and laser optical manipulators are expanding the scope and role of robotics, with novel opportunities for bionic integration. Assimilation of biosensor technology into the operative environment promises to provide neurosurgeons of the future with a vastly expanded set of physiological data, which will require concurrent simplification and optimization of analysis and presentation schemes to facilitate practical usefulness. Nanotechnology derivatives are shattering the maximum limits of resolution and magnification allowed by conventional microscopes. Furthermore, quantum computing and molecular electronics promise to greatly enhance computational power, allowing the emerging reality of simulation and virtual neurosurgery for rehearsal and training purposes. Progressive minimalism is evident throughout, leading ultimately to a paradigm shift as the nanoscale is approached. At the interface between the old and new technological paradigms, issues related to integration may dictate the ultimate emergence of the products of the new paradigm. Once initiated, however, history suggests that the process of change will proceed rapidly and dramatically, with the ultimate neurosurgical operative environment of the future being far more complex in functional capacity but strikingly simple in apparent form.

  27. P300 brain computer interface: current challenges and emerging trends

    PubMed Central

    Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea

    2012-01-01

    A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397

  28. Computational pan-genomics: status, promises and challenges.

    PubMed

    2018-01-01

    Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few years. Simply scaling up established bioinformatics pipelines will not be sufficient for leveraging the full potential of such rich genomic data sets. Instead, novel, qualitatively different computational methods and paradigms are needed. We will witness the rapid extension of computational pan-genomics, a new sub-area of research in computational biology. In this article, we generalize existing definitions and understand a pan-genome as any collection of genomic sequences to be analyzed jointly or to be used as a reference. We examine already available approaches to construct and use pan-genomes, discuss the potential benefits of future technologies and methodologies and review open challenges from the vantage point of the above-mentioned biological disciplines. As a prominent example for a computational paradigm shift, we particularly highlight the transition from the representation of reference genomes as strings to representations as graphs. We outline how this and other challenges from different application domains translate into common computational problems, point out relevant bioinformatics techniques and identify open problems in computer science. With this review, we aim to increase awareness that a joint approach to computational pan-genomics can help address many of the problems currently faced in various domains. © The Author 2016. Published by Oxford University Press.
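
    The string-to-graph transition can be made concrete with a toy variation graph: shared sequence becomes nodes, alternative alleles become branches, and each genome is a path. A minimal sketch (real tools use succinct graph data structures):

    ```python
    # Toy variation graph for REF = "ACGTA" with a SNP T->G at position 4.
    nodes = {1: "ACG", 2: "T", 3: "G", 4: "A"}
    edges = {1: [2, 3], 2: [4], 3: [4]}   # adjacency between node ids

    def spell(path):
        """Reconstruct a haplotype by walking a path through the graph."""
        return "".join(nodes[n] for n in path)

    print(spell([1, 2, 4]))   # ACGTA (reference haplotype)
    print(spell([1, 3, 4]))   # ACGGA (alternate haplotype)
    ```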

  29. A Decade of Neural Networks: Practical Applications and Prospects

    NASA Technical Reports Server (NTRS)

    Kemeny, Sabrina E.

    1994-01-01

    The Jet Propulsion Laboratory Neural Network Workshop, sponsored by NASA and DOD, brings together sponsoring agencies, active researchers, and the user community to formulate a vision for the next decade of neural network research and application prospects. While the speed and computing power of microprocessors continue to grow at an ever-increasing pace, the demand to intelligently and adaptively deal with the complex, fuzzy, and often ill-defined world around us remains to a large extent unaddressed. Powerful, highly parallel computing paradigms such as neural networks promise to have a major impact in addressing these needs. Papers in the workshop proceedings highlight benefits of neural networks in real-world applications compared to conventional computing techniques. Topics include fault diagnosis, pattern recognition, and multiparameter optimization.

  30. Minding One's Reach (To Eat): The Promise of Computer Mouse-Tracking to Study Self-Regulation of Eating.

    PubMed

    Lopez, Richard B; Stillman, Paul E; Heatherton, Todd F; Freeman, Jonathan B

    2018-01-01

    In this review, we present the case for using computer mouse-tracking techniques to examine psychological processes that support (and hinder) self-regulation of eating. We first argue that computer mouse-tracking is suitable for studying the simultaneous engagement of-and dynamic interactions between-multiple perceptual and cognitive processes as they unfold and interact over a fine temporal scale (i.e., hundreds of milliseconds). Next, we review recent work that implemented mouse-tracking techniques by measuring mouse movements as participants chose between various food items (of varying nutritional content). Lastly, we propose next steps for future investigations to link behavioral features from mouse-tracking paradigms, corresponding neural correlates, and downstream eating behaviors.
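
    A standard mouse-tracking index is the trajectory's maximum deviation from the straight line between start point and chosen response. A minimal numpy sketch (the reviewed studies use dedicated mouse-tracking packages):

    ```python
    import numpy as np

    def max_deviation(xy):
        """Maximum perpendicular distance of a cursor path from the straight
        start-to-end line; larger values suggest stronger attraction toward
        the competing response option."""
        xy = np.asarray(xy, dtype=float)
        d = xy[-1] - xy[0]
        d /= np.linalg.norm(d)
        rel = xy - xy[0]
        perp = rel - np.outer(rel @ d, d)   # component orthogonal to the line
        return np.linalg.norm(perp, axis=1).max()

    path = [(0, 0), (1, 3), (3, 5), (6, 6), (10, 10)]  # made-up cursor samples
    print(round(max_deviation(path), 3))               # ~1.414 for this path
    ```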

  31. Goal selection versus process control while learning to use a brain-computer interface

    NASA Astrophysics Data System (ADS)

    Royer, Audrey S.; Rose, Minn L.; He, Bin

    2011-06-01

    A brain-computer interface (BCI) can be used to accomplish a task without requiring motor output. Two major control strategies used by BCIs during task completion are process control and goal selection. In process control, the user exerts continuous control and independently executes the given task. In goal selection, the user communicates their goal to the BCI and then receives assistance executing the task. A previous study has shown that goal selection is more accurate and faster in use. An unanswered question is: which control strategy is easier to learn? This study directly compares goal selection and process control while learning to use a sensorimotor rhythm-based BCI. Twenty young healthy human subjects were randomly assigned either to a goal selection or a process control-based paradigm for eight sessions. At the end of the study, the best user from each paradigm completed two additional sessions using all paradigms randomly mixed. The results of this study show that goal selection required a shorter training period to achieve greater speed, accuracy, and information transfer than process control. These results held for the best subjects as well as in the general subject population. The demonstrated characteristics of goal selection make it a promising option to increase the utility of BCIs intended for both disabled and able-bodied users.

  32. Drug target inference through pathway analysis of genomics data

    PubMed Central

    Ma, Haisu; Zhao, Hongyu

    2013-01-01

    Statistical modeling coupled with bioinformatics is commonly used for drug discovery. Although there exist many approaches for single-target-based drug design and target inference, recent years have seen a paradigm shift to system-level pharmacological research. Pathway analysis of genomics data represents one promising direction for computational inference of drug targets. This article aims at providing a comprehensive review of the evolving issues in this field, covering methodological developments, their pros and cons, as well as future research directions. PMID:23369829
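
    At the core of many pathway-analysis methods is an over-representation test. A minimal sketch using a hypergeometric tail probability (the numbers are illustrative):

    ```python
    from scipy.stats import hypergeom

    M = 20000   # genes in the background
    n = 150     # genes annotated to the pathway
    N = 300     # genes responsive to the drug (e.g. differentially expressed)
    k = 12      # overlap between the two gene sets

    # P(overlap >= k) if the responsive genes were drawn at random
    p_value = hypergeom.sf(k - 1, M, n, N)
    print(f"pathway enrichment p = {p_value:.2e}")
    ```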

  33. Metal oxide resistive random access memory based synaptic devices for brain-inspired computing

    NASA Astrophysics Data System (ADS)

    Gao, Bin; Kang, Jinfeng; Zhou, Zheng; Chen, Zhe; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan

    2016-04-01

    The traditional Boolean computing paradigm based on the von Neumann architecture is facing great challenges for future information technology applications such as big data, the Internet of Things (IoT), and wearable devices, due to limitations such as binary data storage and computing, non-parallel data processing, and the bus requirement between memory units and logic units. The brain-inspired neuromorphic computing paradigm is believed to be one of the promising solutions for realizing more complex functions at a lower cost. To perform such brain-inspired computing with a low cost and low power consumption, novel devices for use as electronic synapses are needed. Metal oxide resistive random access memory (ReRAM) devices have emerged as the leading candidate for electronic synapses. This paper comprehensively addresses the recent work on the design and optimization of metal oxide ReRAM-based synaptic devices. A performance enhancement methodology and optimized operation scheme to achieve analog resistive switching and low-energy training behavior are provided. A three-dimensional vertical synapse network architecture is proposed for high-density integration and low-cost fabrication. The impacts of the ReRAM synaptic device features on the performance of neuromorphic systems are also discussed on the basis of a constructed neuromorphic visual system with a pattern recognition function. Possible solutions to achieve the high recognition accuracy and efficiency of neuromorphic systems are presented.
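
    Computationally, the synaptic array acts as an analog crossbar: applied voltages times a conductance matrix yield all weighted sums in one step, and training nudges conductances within device limits. An illustrative model, not the paper's device physics:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    G = rng.uniform(1e-6, 1e-4, size=(64, 16))   # conductances (S), 64 inputs x 16 neurons

    def forward(v_in):
        """The crossbar computes all weighted sums in parallel: I = G^T v."""
        return G.T @ v_in                        # output currents (A)

    def update(i, j, potentiate, dG=2e-6, g_min=1e-6, g_max=1e-4):
        """Analog SET/RESET pulse: bounded conductance increment or decrement."""
        step = dG if potentiate else -dG
        G[i, j] = np.clip(G[i, j] + step, g_min, g_max)

    v = rng.uniform(0.0, 0.2, 64)                # read voltages
    print(forward(v)[:4])
    update(0, 0, potentiate=True)                # strengthen one synapse
    ```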

  34. A Perspective on the Role of Computational Models in Immunology.

    PubMed

    Chakraborty, Arup K

    2017-04-26

    This is an exciting time for immunology because the future promises to be replete with exciting new discoveries that can be translated to improve health and treat disease in novel ways. Immunologists are attempting to answer increasingly complex questions concerning phenomena that range from the genetic, molecular, and cellular scales to that of organs, whole animals or humans, and populations of humans and pathogens. An important goal is to understand how the many different components involved interact with each other within and across these scales for immune responses to emerge, and how aberrant regulation of these processes causes disease. To aid this quest, large amounts of data can be collected using high-throughput instrumentation. The nonlinear, cooperative, and stochastic character of the interactions between components of the immune system as well as the overwhelming amounts of data can make it difficult to intuit patterns in the data or a mechanistic understanding of the phenomena being studied. Computational models are increasingly important in confronting and overcoming these challenges. I first describe an iterative paradigm of research that integrates laboratory experiments, clinical data, computational inference, and mechanistic computational models. I then illustrate this paradigm with a few examples from the recent literature that make vivid the power of bringing together diverse types of computational models with experimental and clinical studies to fruitfully interrogate the immune system.

  35. Nonmetastatic Castration-resistant Prostate Cancer: A Modern Perspective.

    PubMed

    Cancian, Madeline; Renzulli, Joseph F

    2018-06-01

    Nonmetastatic castration-resistant prostate cancer (nmCRPC) presents a challenge to urologists, as currently there are no Food and Drug Administration-approved therapies. However, there are new imaging modalities, including fluciclovine positron emission tomography-computed tomography and ⁶⁸Ga-PSMA (prostate-specific membrane antigen) positron emission tomography-computed tomography, which are improving the accuracy of diagnosis. With improved imaging, we are better able to target therapy. Today there are 3 ongoing clinical trials studying second-generation antiandrogens in nmCRPC, which hold the promise of a new treatment paradigm. In this article, we will review the new imaging techniques and the rationale behind novel treatment modalities in nmCRPC. Copyright © 2018 Elsevier Inc. All rights reserved.

  36. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology

    PubMed Central

    Salazar, Brittany M.; Balczewski, Emily A.; Ung, Choong Yong; Zhu, Shizhen

    2016-01-01

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring “big data” applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which “big data” and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases. PMID:28035989

  37. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology.

    PubMed

    Salazar, Brittany M; Balczewski, Emily A; Ung, Choong Yong; Zhu, Shizhen

    2016-12-27

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.

  38. Gaze-independent BCI-spelling using rapid serial visual presentation (RSVP).

    PubMed

    Acqualagna, Laura; Blankertz, Benjamin

    2013-05-01

    A Brain Computer Interface (BCI) speller is a communication device, which can be used by patients suffering from neurodegenerative diseases to select symbols in a computer application. For patients unable to overtly fixate the target symbol, it is crucial to develop a speller independent of gaze shifts. In the present online study, we investigated rapid serial visual presentation (RSVP) as a paradigm for mental typewriting. We investigated the RSVP speller in three conditions, varying the Stimulus Onset Asynchrony (SOA) and the use of color features. A vocabulary of 30 symbols was presented one-by-one in a pseudo-random sequence at the same location of the display. All twelve participants were able to successfully operate the RSVP speller. The results show a mean online spelling rate of 1.43 symb/min and a mean symbol selection accuracy of 94.8% in the best condition. We conclude that the RSVP is a promising paradigm for BCI spelling and its performance is competitive with the fastest gaze-independent spellers in the literature. The RSVP speller does not require gaze shifts towards different target locations and can be operated by non-spatial visual attention; therefore it can be considered a valid paradigm for applications with patients with impaired oculo-motor control. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  39. Materials-by-design: computation, synthesis, and characterization from atoms to structures

    NASA Astrophysics Data System (ADS)

    Yeo, Jingjie; Jung, Gang Seob; Martín-Martínez, Francisco J.; Ling, Shengjie; Gu, Grace X.; Qin, Zhao; Buehler, Markus J.

    2018-05-01

    In the 50 years that succeeded Richard Feynman’s exposition of the idea that there is ‘plenty of room at the bottom’ for manipulating individual atoms for the synthesis and manufacturing processing of materials, the materials-by-design paradigm is being developed gradually through synergistic integration of experimental material synthesis and characterization with predictive computational modeling and optimization. This paper reviews how this paradigm creates the possibility to develop materials according to specific, rational designs from the molecular to the macroscopic scale. We discuss promising techniques in experimental small-scale material synthesis and large-scale fabrication methods to manipulate atomistic or macroscale structures, which can be designed by computational modeling. These include recombinant protein technology to produce peptides and proteins with tailored sequences encoded by recombinant DNA, self-assembly processes induced by conformational transition of proteins, additive manufacturing for designing complex structures, and qualitative and quantitative characterization of materials at different length scales. We describe important material characterization techniques using numerous methods of spectroscopy and microscopy. We detail numerous multi-scale computational modeling techniques that complement these experimental techniques: DFT at the atomistic scale; fully atomistic and coarse-grain molecular dynamics at the molecular to mesoscale; continuum modeling at the macroscale. Additionally, we present case studies that utilize experimental and computational approaches in an integrated manner to broaden our understanding of the properties of two-dimensional materials and materials based on silk and silk-elastin-like proteins.

  40. Fundamental energy limits of SET-based Brownian NAND and half-adder circuits. Preliminary findings from a physical-information-theoretic methodology

    NASA Astrophysics Data System (ADS)

    Ercan, İlke; Suyabatmaz, Enes

    2018-06-01

    The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables the simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes a solid ground for studying the computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal oxide semiconductor technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
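
    The reference point for such fundamental bounds is Landauer's limit, stated here for context (the paper's circuit-specific bounds come from a more detailed methodology): erasing one bit irreversibly dissipates at least

    ```latex
    E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\ \text{J at } T = 300\ \text{K}.
    ```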

  41. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004, and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of future computing environments. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  42. Programmable DNA-Mediated Multitasking Processor.

    PubMed

    Shu, Jian-Jun; Wang, Qi-Wen; Yong, Kian-Yan; Shao, Fangwei; Lee, Kee Jin

    2015-04-30

    Because of DNA's appealing features as a perfect material, including minuscule size, defined structural repeats and rigidity, programmable DNA-mediated processing is a promising computing paradigm that employs DNA as an information-storing and -processing substrate to tackle computational problems. The massive parallelism of DNA hybridization exhibits transcendent potential to improve multitasking capabilities and yield a tremendous speed-up over conventional electronic processors with stepwise signal cascades. As an example of multitasking capability, we present an in vitro programmable DNA-mediated optimal route planning processor as a functional unit embedded in contemporary navigation systems. The novel programmable DNA-mediated processor has several advantages over the existing silicon-mediated methods, such as conducting massive data storage and simultaneous processing via much fewer materials than conventional silicon devices.

  43. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Selecting Paradigms From Cognitive Neuroscience for Translation into Use in Clinical Trials: Proceedings of the Third CNTRICS Meeting

    PubMed Central

    Barch, Deanna M.; Carter, Cameron S.; Arnsten, Amy; Buchanan, Robert W.; Cohen, Jonathan D.; Geyer, Mark; Green, Michael F.; Krystal, John H.; Nuechterlein, Keith; Robbins, Trevor; Silverstein, Steven; Smith, Edward E.; Strauss, Milton; Wykes, Til; Heinssen, Robert

    2009-01-01

    This overview describes the goals and objectives of the third conference conducted as part of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) initiative. This third conference was focused on selecting specific paradigms from cognitive neuroscience that measured the constructs identified in the first CNTRICS meeting, with the goal of facilitating the translation of these paradigms into use in clinical trials contexts. To identify such paradigms, we had an open nomination process in which the field was asked to nominate potentially relevant paradigms and to provide information on several domains relevant to selecting the most promising tasks for each construct (e.g., construct validity, neural bases, psychometrics, availability of animal models). Our goal was to identify 1–2 promising tasks for each of the 11 constructs identified at the first CNTRICS meeting. In this overview article, we describe the on-line survey used to generate nominations for promising tasks, the criteria that were used to select the tasks, the rationale behind the criteria, and the ways in which breakout groups worked together to identify the most promising tasks from among those nominated. This article serves as an introduction to the set of 6 articles included in this special issue that provide information about the specific tasks discussed and selected for the constructs from each of 6 broad domains (working memory, executive control, attention, long-term memory, perception, and social cognition). PMID:19023126

  5. Rational design of liposomal drug delivery systems, a review: Combined experimental and computational studies of lipid membranes, liposomes and their PEGylation.

    PubMed

    Bunker, Alex; Magarkar, Aniket; Viitala, Tapani

    2016-10-01

    Combined experimental and computational studies of lipid membranes and liposomes, with the aim of attaining mechanistic understanding, result in a synergy that makes possible the rational design of liposomal drug delivery system (LDS)-based therapies. The LDS is the leading form of nanoscale drug delivery platform, an avenue in drug research known as "nanomedicine" that holds the promise of transcending the current paradigm of drug development, which has led to diminishing returns. Unfortunately, this field of research has so far been far more successful in generating publications than new drug therapies, partly as a result of the trial-and-error-based methodologies used. We discuss experimental techniques capable of obtaining mechanistic insight into LDS structure and behavior. Insight obtained purely experimentally is, however, limited; computational modeling using molecular dynamics simulation can provide insight not otherwise available. We review computational research that makes use of the multiscale modeling paradigm, simulating the phospholipid membrane with all-atom resolution and the entire liposome with coarse-grained models. We discuss in greater detail the computational modeling of liposome PEGylation. Overall, we wish to convey the power that lies in the combined use of experimental and computational methodologies, and we hope to provide a roadmap for the rational design of LDS-based therapies. Computational modeling is able to provide mechanistic insight that explains the context of experimental results and can also take the lead and inspire new directions for experimental research into LDS development.

  6. Contextuality supplies the 'magic' for quantum computation.

    PubMed

    Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph

    2014-06-19

    Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via 'magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple 'hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.

  7. Single-molecule protein sequencing through fingerprinting: computational assessment

    NASA Astrophysics Data System (ADS)

    Yao, Yao; Docter, Margreet; van Ginkel, Jetty; de Ridder, Dick; Joo, Chirlmin

    2015-10-01

    Proteins are vital in all biological systems as they constitute the main structural and functional components of cells. Recent advances in mass spectrometry have brought the promise of complete proteomics by helping draft the human proteome. Yet, this commonly used protein sequencing technique has fundamental limitations in sensitivity. Here we propose a method for single-molecule (SM) protein sequencing. A major challenge lies in the fact that proteins are composed of 20 different amino acids, which demands 20 molecular reporters. We computationally demonstrate that it suffices to measure only two types of amino acids to identify proteins and suggest an experimental scheme using SM fluorescence. When achieved, this highly sensitive approach will result in a paradigm shift in proteomics, with major impact in the biological and medical sciences.
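
    A minimal sketch of the paper's central idea, using made-up sequences and an assumed choice of cysteine/lysine as the two measured residues: reduce each protein to the ordered pattern of just two amino-acid types and check whether these fingerprints still tell the proteins apart.

    ```python
    # Toy check of the two-amino-acid fingerprint idea: reduce each protein
    # sequence to the ordered pattern of just two residue types (here cysteine
    # 'C' and lysine 'K', an assumed labeling choice) and see whether the
    # reduced fingerprints still distinguish the proteins.
    def fingerprint(seq, residues=("C", "K")):
        """Keep only the chosen residues, preserving their order."""
        return "".join(aa for aa in seq if aa in residues)

    # A few made-up sequences standing in for a proteome database.
    proteome = {
        "protA": "MKTCLLVAKCCR",
        "protB": "MGGCATKLLCKV",
        "protC": "MKKLAVCTTGCK",
    }

    prints = {name: fingerprint(seq) for name, seq in proteome.items()}
    unique = len(set(prints.values())) == len(prints)
    for name, fp in prints.items():
        print(f"{name}: {fp}")
    print("all fingerprints unique:", unique)
    ```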

  8. A novel BCI based on ERP components sensitive to configural processing of human faces

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Zhao, Qibin; Jing, Jin; Wang, Xingyu; Cichocki, Andrzej

    2012-04-01

    This study introduces a novel brain-computer interface (BCI) based on an oddball paradigm using stimuli of facial images with loss of configural face information (e.g., inversion of the face). To the best of our knowledge, the configural processing of human faces has been widely studied in cognitive neuroscience but has not yet been applied to BCI. Our experiments confirm that the face-sensitive event-related potential (ERP) components N170 and vertex positive potential (VPP) reflect early structural encoding of faces and can be modulated by the configural processing of faces. With the proposed novel paradigm, we investigate the effects of the ERP components N170, VPP and P300 on target detection for BCI. An eight-class BCI platform is developed to analyze ERPs and evaluate target detection performance using linear discriminant analysis without complicated feature extraction processing. The online classification accuracy of 88.7% and information transfer rate of 38.7 bits min^-1 using stimuli of inverted faces with only a single trial suggest that the proposed paradigm based on the configural processing of faces is very promising for visual stimulus-driven BCI applications.
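
    As a rough illustration of the classification step, single-trial ERP target detection can be sketched with off-the-shelf linear discriminant analysis; the snippet below uses synthetic features in place of real epoched EEG, so all numbers are invented.

    ```python
    # Minimal sketch of single-trial ERP target detection with linear
    # discriminant analysis (LDA), as in the paper's pipeline, but on
    # synthetic data: real use would replace X with epoched EEG amplitudes
    # around the N170/VPP/P300 latencies.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_features = 200, 64          # e.g., channels x time samples, flattened
    X = rng.normal(size=(n_trials, n_features))
    y = rng.integers(0, 2, size=n_trials)   # 1 = target (face) trial, 0 = non-target
    X[y == 1, :8] += 0.8                    # inject a fake "ERP component" into targets

    clf = LinearDiscriminantAnalysis()
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```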

  9. A paradigm for the study of paranoia in the general population: the Prisoner's Dilemma Game.

    PubMed

    Ellett, Lyn; Allen-Crooks, Rhani; Stevens, Adele; Wildschut, Tim; Chadwick, Paul

    2013-01-01

    A growing body of research shows that paranoia is common in the general population. We report three studies that examined the Prisoner's Dilemma Game (PDG) as a paradigm for evaluation of non-clinical paranoia. The PDG captures three key qualities that are at the heart of paranoia: it is interpersonal, it concerns threat, and it concerns the perception of others' intentions towards the self. Study 1 (n=175) found that state paranoia was positively associated with selection of the competitive PDG choice. Study 2 (n=111) found that this association was significant only when participants believed they were playing the PDG against another person, and not when playing against a computer. This finding underscores the interpersonal nature of paranoia and the concomitant necessity of studying paranoia in interpersonal context. In Study 3 (n=152), we assessed both trait and state paranoia, and differentiated between distrust- and greed-based competition. Both trait and state paranoia were positively associated with distrust-based competition (but not with greed-based competition). Crucially, we found that the association between trait paranoia and distrust-based competition was fully mediated by state paranoia. The PDG is a promising paradigm for the study of non-clinical paranoia.

  10. A novel BCI based on ERP components sensitive to configural processing of human faces.

    PubMed

    Zhang, Yu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2012-04-01

    This study introduces a novel brain-computer interface (BCI) based on an oddball paradigm using stimuli of facial images with loss of configural face information (e.g., inversion of the face). To the best of our knowledge, the configural processing of human faces has been widely studied in cognitive neuroscience but has not yet been applied to BCI. Our experiments confirm that the face-sensitive event-related potential (ERP) components N170 and vertex positive potential (VPP) reflect early structural encoding of faces and can be modulated by the configural processing of faces. With the proposed novel paradigm, we investigate the effects of the ERP components N170, VPP and P300 on target detection for BCI. An eight-class BCI platform is developed to analyze ERPs and evaluate target detection performance using linear discriminant analysis without complicated feature extraction processing. The online classification accuracy of 88.7% and information transfer rate of 38.7 bits min^-1 using stimuli of inverted faces with only a single trial suggest that the proposed paradigm based on the configural processing of faces is very promising for visual stimulus-driven BCI applications.

  11. Underwater Threat Source Localization: Processing Sensor Network TDOAs with a Terascale Optical Core Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barhen, Jacob; Imam, Neena

    2007-01-01

    Revolutionary computing technologies are defined in terms of technological breakthroughs, which leapfrog over near-term projected advances in conventional hardware and software to produce paradigm shifts in computational science. For underwater threat source localization using information provided by a dynamical sensor network, one of the most promising computational advances builds upon the emergence of digital optical-core devices. In this article, we present initial results of sensor network calculations that focus on the concept of signal wavefront time-difference-of-arrival (TDOA). The corresponding algorithms are implemented on the EnLight processing platform recently introduced by Lenslet Laboratories. This tera-scale digital optical core processor is optimized for array operations, which it performs in a fixed-point-arithmetic architecture. Our results (i) illustrate the ability to reach the required accuracy in the TDOA computation, and (ii) demonstrate that a considerable speed-up can be achieved when using the EnLight 64a prototype processor as compared to a dual Intel Xeon™ processor.
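
    The core TDOA estimate can be sketched with a plain cross-correlation; this class of array operation is what the optical-core processor accelerates, and plain NumPy stands in for it here. The signal and sample rate below are invented.

    ```python
    # Hedged sketch of the core TDOA computation: estimate the delay between
    # two sensors' recordings of the same source by cross-correlation.
    import numpy as np

    fs = 10_000                      # sample rate (Hz), illustrative
    t = np.arange(0, 0.1, 1 / fs)
    source = np.sin(2 * np.pi * 440 * t) * np.exp(-30 * t)

    true_delay = 25                  # samples by which sensor B lags sensor A
    a = source + 0.05 * np.random.randn(t.size)
    b = np.roll(source, true_delay) + 0.05 * np.random.randn(t.size)

    xcorr = np.correlate(b, a, mode="full")
    lag = xcorr.argmax() - (t.size - 1)   # offset of the correlation peak
    print(f"estimated TDOA: {lag / fs * 1e3:.2f} ms (true {true_delay / fs * 1e3:.2f} ms)")
    ```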

  12. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    PubMed

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
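
    A minimal sketch of the best-response dynamics that the COD algorithm exploits, with invented costs: each sensor offloads only if the congestion-dependent offloading cost beats its local cost, and the finite improvement property of the potential game guarantees that the update loop terminates.

    ```python
    # Illustrative sketch of decentralized offloading via best-response updates.
    # Each sensor compares a fixed local-execution cost with an offloading cost
    # that grows with the number of sensors sharing the wireless AP (congestion).
    # Because the game is a potential game, asynchronous best responses converge
    # by the finite improvement property. All numbers are made up.
    import random

    random.seed(1)
    n = 12
    local_cost = [random.uniform(4, 10) for _ in range(n)]   # delay+energy proxy
    base_offload, congestion = 2.0, 0.8

    def offload_cost(num_offloaders):
        return base_offload + congestion * num_offloaders

    decision = [0] * n            # 0 = compute locally, 1 = offload
    changed = True
    while changed:                # finite improvement: guaranteed to terminate
        changed = False
        for i in range(n):
            others = sum(decision) - decision[i]
            best = 1 if offload_cost(others + 1) < local_cost[i] else 0
            if best != decision[i]:
                decision[i], changed = best, True

    print("offloaders:", sum(decision), "system cost:",
          round(sum(offload_cost(sum(decision)) if d else local_cost[i]
                    for i, d in enumerate(decision)), 2))
    ```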

  13. Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    PubMed Central

    Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-01-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. PMID:19564686

  14. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    PubMed

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.

  15. Co-"Lab"oration: A New Paradigm for Building a Management Information Systems Course

    ERIC Educational Resources Information Center

    Breimer, Eric; Cotler, Jami; Yoder, Robert

    2010-01-01

    We propose a new paradigm for building a Management Information Systems course that focuses on laboratory activities developed collaboratively using Computer-Mediated Communication and Collaboration tools. A highlight of our paradigm is the "practice what you preach" concept where the computer communication tools and collaboration…

  16. Reprogrammable logic in memristive crossbar for in-memory computing

    NASA Astrophysics Data System (ADS)

    Cheng, Long; Zhang, Mei-Yun; Li, Yi; Zhou, Ya-Xiong; Wang, Zhuo-Rui; Hu, Si-Yu; Long, Shi-Bing; Liu, Ming; Miao, Xiang-Shui

    2017-12-01

    Memristive stateful logic has emerged as a promising next-generation in-memory computing paradigm to address escalating computing-performance pressures in traditional von Neumann architecture. Here, we present a nonvolatile reprogrammable logic method that can process data between different rows and columns in a memristive crossbar array based on material implication (IMP) logic. Arbitrary Boolean logic can be executed with a reprogrammable cell containing four memristors in a crossbar array. In the fabricated Ti/HfO2/W memristive array, some fundamental functions, such as universal NAND logic and data transfer, were experimentally implemented. Moreover, using eight memristors in a 2  ×  4 array, a one-bit full adder was theoretically designed and verified by simulation to exhibit the feasibility of our method to accomplish complex computing tasks. In addition, some critical logic-related performances were further discussed, such as the flexibility of data processing, cascading problem and bit error rate. Such a method could be a step forward in developing IMP-based memristive nonvolatile logic for large-scale in-memory computing architecture.
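
    The logic construction itself is easy to sketch in software: one IMP step computes (NOT p) OR q, and NAND follows from two IMP steps plus a work memristor preset to logic 0, mirroring the universality argument (device physics omitted; this is a truth-table model, not the fabricated array).

    ```python
    # Sketch of memristive stateful logic based on material implication (IMP).
    # A memristor's resistance state stores one bit; one IMP step conditionally
    # writes a target device from a source device.
    def IMP(p, q):
        """Material implication: q' = (NOT p) OR q, written into q."""
        return (not p) or q

    def NAND(p, q):
        s = False          # work memristor preset to logic 0
        s = IMP(q, s)      # s = NOT q
        return IMP(p, s)   # (NOT p) OR (NOT q) = NAND(p, q)

    for p in (False, True):
        for q in (False, True):
            print(int(p), int(q), "->", int(NAND(p, q)))
    ```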

  17. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    ERIC Educational Resources Information Center

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  18. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  19. Audio-visual affective expression recognition

    NASA Astrophysics Data System (ADS)

    Huang, Thomas S.; Zeng, Zhihong

    2007-11-01

    Automatic affective expression recognition has attracted more and more attention from researchers in different disciplines, and will significantly contribute to a new paradigm for human-computer interaction (affect-sensitive interfaces, socially intelligent environments) and advance research in affect-related fields including psychology, psychiatry, and education. Multimodal information integration is a process that enables humans to assess affective states robustly and flexibly. In order to understand the richness and subtleness of human emotion behavior, the computer should be able to integrate information from multiple sensors. We introduce in this paper our efforts toward machine understanding of audio-visual affective behavior, based on both deliberate and spontaneous displays. Some promising methods are presented to integrate information from both audio and visual modalities. Our experiments show the advantage of audio-visual fusion in affective expression recognition over audio-only or visual-only approaches.
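
    One simple fusion scheme, shown below with invented posteriors, is late fusion: average the class probabilities produced by separate audio and visual classifiers. The paper's methods are more elaborate; this only illustrates why combining modalities can change the decision.

    ```python
    # Toy late-fusion of modality classifiers: average the per-class probability
    # estimates from an audio model and a visual model.
    import numpy as np

    classes = ["neutral", "happy", "angry"]
    p_audio = np.array([0.2, 0.5, 0.3])    # stand-in posterior from audio model
    p_video = np.array([0.1, 0.3, 0.6])    # stand-in posterior from visual model

    p_fused = (p_audio + p_video) / 2
    print("fused decision:", classes[int(p_fused.argmax())])
    ```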

  20. Information processing psychology: A promising paradigm for research in science teaching

    NASA Astrophysics Data System (ADS)

    Stewart, James H.; Atkin, Julia A.

    Three research paradigms, those of Ausubel, Gagné and Piaget, have received a great deal of attention in the literature of science education. In this article a fourth paradigm is presented: an information-processing psychology paradigm. The article is composed of two sections. The first section describes a model of memory developed by information processing psychologists. The second section describes how such a model could be used to guide science education research on learning and problem solving. Received: 19 October 1981

  1. Feasibility Study of a Generalized Framework for Developing Computer-Aided Detection Systems-a New Paradigm.

    PubMed

    Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu

    2017-10-01

    We propose a generalized framework for developing computer-aided detection (CADe) systems whose characteristics depend only on those of the training dataset. The purpose of this study is to show the feasibility of the framework. Two different CADe systems were experimentally developed by a prototype of the framework, but with different training datasets. The CADe systems include four components: preprocessing, candidate area extraction, candidate detection, and candidate classification. Four pretrained algorithms with dedicated optimization/setting methods corresponding to the respective components were prepared in advance. The pretrained algorithms were sequentially trained in the order of processing of the components. In this study, two different datasets, brain MRA with cerebral aneurysms and chest CT with lung nodules, were collected to develop two different types of CADe systems in the framework. The performances of the developed CADe systems were evaluated by threefold cross-validation. The CADe systems for detecting cerebral aneurysms in brain MRAs and for detecting lung nodules in chest CTs were successfully developed using the respective datasets. The framework was shown to be feasible by the successful development of the two different types of CADe systems. The feasibility of this framework shows promise for a new paradigm in the development of CADe systems: the development of CADe systems without any lesion-specific algorithm design.
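
    The framework's key idea can be sketched as below, with invented class names and placeholder stages: a fixed four-component pipeline trained sequentially on whatever dataset is supplied, so that the resulting CADe system is characterized by the training data rather than by lesion-specific code.

    ```python
    # Hedged sketch of the framework's idea; all names are invented for
    # illustration and the stages are placeholders.
    class Stage:
        """Placeholder component with a fit/transform contract."""
        def __init__(self, name):
            self.name = name
        def fit(self, data, labels):
            print(f"training {self.name} on {len(data)} cases")
        def transform(self, data):
            return data

    class CADePipeline:
        def __init__(self, *stages):
            self.stages = list(stages)      # four slots, in processing order

        def train(self, images, labels):
            data = images
            for stage in self.stages:       # sequential, stage-by-stage training
                stage.fit(data, labels)
                data = stage.transform(data)

        def predict(self, image):
            data = image
            for stage in self.stages:
                data = stage.transform(data)
            return data                     # candidate lesions with scores

    pipeline = CADePipeline(Stage("preprocessing"), Stage("candidate area extraction"),
                            Stage("candidate detection"), Stage("candidate classification"))
    pipeline.train(["case1", "case2", "case3"], [0, 1, 0])
    ```

    Trained on a brain MRA dataset or a chest CT dataset, the same pipeline would yield an aneurysm detector or a lung-nodule detector, respectively.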

  2. Double Super-Exchange in Silicon Quantum Dots Connected by Short-Bridged Networks

    NASA Astrophysics Data System (ADS)

    Li, Huashan; Wu, Zhigang; Lusk, Mark

    2013-03-01

    Silicon quantum dots (QDs) with diameters in the range of 1-2 nm are attractive for photovoltaic applications. They absorb photons more readily, transport excitons with greater efficiency, and show greater promise in multiple-exciton generation and hot-carrier collection paradigms. However, their high excitonic binding energy makes it difficult to dissociate excitons into separate charge carriers. One possible remedy is to create dot assemblies in which a second material creates a Type-II heterojunction with the dot so that exciton dissociation occurs locally. This talk will focus on such a Type-II heterojunction paradigm in which QDs are connected via covalently bonded, short-bridge molecules. For such interpenetrating networks of dots and molecules, our first-principles computational investigation shows that it is possible to rapidly and efficiently separate electrons to QDs and holes to bridge units. The bridge network serves as an efficient mediator of electron superexchange between QDs while the dots themselves play the complementary role of efficient hole-superexchange mediators. Dissociation, photoluminescence and carrier transport rates will be presented for bridge networks of silicon QDs that exhibit such double superexchange. This material is based upon work supported by the Renewable Energy Materials Research Science and Engineering Center (REMRSEC) under Grant No. DMR-0820518 and the Golden Energy Computing Organization (GECO).

  3. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations

    PubMed Central

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-01-01

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era. PMID:27649151

  4. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations.

    PubMed

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-09-14

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era.

  5. Computational models and motor learning paradigms: Could they provide insights for neuroplasticity after stroke? An overview.

    PubMed

    Kiper, Pawel; Szczudlik, Andrzej; Venneri, Annalena; Stozek, Joanna; Luque-Moreno, Carlos; Opara, Jozef; Baba, Alfonc; Agostini, Michela; Turolla, Andrea

    2016-10-15

    Computational approaches for modelling the central nervous system (CNS) aim to develop theories on processes occurring in the brain that allow the transformation of all information needed for the execution of motor acts. Computational models have been proposed in several fields, to interpret not only CNS functioning but also its efferent behaviour. Computational model theories can provide insights into neuromuscular and brain function, allowing us to reach a deeper understanding of neuroplasticity. Neuroplasticity is the process occurring in the CNS that is able to permanently change both structure and function due to interaction with the external environment. To understand such a complex process, several paradigms related to motor learning and computational modeling have been put forward. These paradigms have been explained through several internal model concepts and supported by neurophysiological and neuroimaging studies. Therefore, it has been possible to develop theories about the basis of different learning paradigms according to known computational models. Here we review the computational models and motor learning paradigms used to describe CNS and neuromuscular functions, as well as their role in the recovery process. These theories have the potential to provide a way to rigorously explain all the potential of CNS learning, providing a basis for future clinical studies.

  6. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    NASA Technical Reports Server (NTRS)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease the design and development of highly complex codes. This paper examines the design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured-programming code in terms of software maintenance, portability, flexibility, and speed.
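
    As a flavor of what such an OOP decomposition looks like (an illustrative sketch, not the paper's code), a 1D time-domain field solver can encapsulate its grid and update scheme in classes instead of Fortran common blocks:

    ```python
    # Illustrative OOP decomposition of a 1D FDTD time-domain solver.
    # Field storage and the leapfrog update live in objects; all sizes and
    # parameters are invented.
    import numpy as np

    class YeeGrid1D:
        """Holds the staggered E and H field arrays."""
        def __init__(self, n_cells):
            self.ez = np.zeros(n_cells)
            self.hy = np.zeros(n_cells - 1)

    class FDTDSolver:
        """Encapsulates the leapfrog update; Courant number fixed at 0.5."""
        def __init__(self, grid):
            self.grid, self.courant = grid, 0.5

        def step(self):
            g = self.grid
            g.hy += self.courant * np.diff(g.ez)
            g.ez[1:-1] += self.courant * np.diff(g.hy)

    grid = YeeGrid1D(200)
    solver = FDTDSolver(grid)
    for t in range(100):
        grid.ez[100] += np.exp(-((t - 30) / 10) ** 2)   # soft Gaussian source
        solver.step()
    print("peak |Ez| after 100 steps:", float(np.abs(grid.ez).max()))
    ```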

  7. Taking the Mystery Out of Research in Computing Information Systems: A New Approach to Teaching Research Paradigm Architecture.

    ERIC Educational Resources Information Center

    Heslin, J. Alexander, Jr.

    In senior-level undergraduate research courses in Computer Information Systems (CIS), students are required to read and assimilate a large volume of current research literature. One course objective is to demonstrate to the student that there are patterns or models or paradigms of research. A new approach in identifying research paradigms is…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demeure, I.M.

    The research presented here is concerned with representation techniques and tools to support the design, prototyping, simulation, and evaluation of message-based parallel, distributed computations. The author describes ParaDiGM (Parallel, Distributed computation Graph Model), a visual representation technique for parallel, message-based distributed computations. ParaDiGM provides several views of a computation depending on the aspect of concern. It is made of two complementary submodels, the DCPG (Distributed Computing Precedence Graph) model and the PAM (Process Architecture Model) model. DCPGs are precedence graphs used to express the functionality of a computation in terms of tasks, message-passing, and data. PAM graphs are used to represent the partitioning of a computation into schedulable units or processes, and the pattern of communication among those units. There is a natural mapping between the two models. He illustrates the utility of ParaDiGM as a representation technique by applying it to various computations (e.g., an adaptive global optimization algorithm, the client-server model). ParaDiGM representations are concise. They can be used in documenting the design and the implementation of parallel, distributed computations, in describing such computations to colleagues, and in comparing and contrasting various implementations of the same computation. He then describes VISA (VISual Assistant), a software tool to support the design, prototyping, and simulation of message-based parallel, distributed computations. VISA is based on the ParaDiGM model. In particular, it supports the editing of ParaDiGM graphs to describe the computations of interest, and the animation of these graphs to provide visual feedback during simulations. The graphs are supplemented with various attributes, simulation parameters, and interpretations, which are procedures that can be executed by VISA.
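
    At its simplest, a DCPG-style precedence graph reduces to tasks plus depends-on edges; the toy graph below (invented tasks, Python's standard-library topological sorter) shows the kind of structure such a tool would edit and animate.

    ```python
    # Minimal sketch of a task precedence graph and one valid execution order,
    # the kind of schedule a simulator could animate. The graph is invented.
    from graphlib import TopologicalSorter   # Python 3.9+

    dcpg = {                 # task -> set of tasks it depends on
        "partition": set(),
        "solve_left": {"partition"},
        "solve_right": {"partition"},
        "merge": {"solve_left", "solve_right"},
    }
    print("one valid execution order:", list(TopologicalSorter(dcpg).static_order()))
    ```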

  9. An Efficient Offloading Scheme For MEC System Considering Delay and Energy Consumption

    NASA Astrophysics Data System (ADS)

    Sun, Yanhua; Hao, Zhe; Zhang, Yanhua

    2018-01-01

    With the increasing number of mobile devices, mobile edge computing (MEC), which provides cloud computing capabilities proximate to mobile devices in 5G networks, has been envisioned as a promising paradigm to enhance users' experience. In this paper, we investigate a joint consideration of delay and energy consumption offloading scheme (JCDE) for MEC systems in 5G heterogeneous networks. An optimization problem is formulated to minimize the delay and energy consumption of the offloading system, in which the delay and energy consumption of transmitting and computing tasks are taken into account. We adopt an iterative greedy algorithm to solve the optimization problem. Furthermore, simulations were carried out to validate the utility and effectiveness of our proposed scheme. The effect of parameter variations on the system is analysed as well. Numerical results demonstrate the delay and energy-efficiency improvement of our proposed scheme compared with a scheme from prior work.
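
    A sketch of a joint delay/energy objective with an iterative greedy assignment, using invented parameters rather than the paper's model: each device is placed wherever its weighted cost is lower, and repeated passes revisit assignments as server congestion changes.

    ```python
    # Illustrative joint delay/energy offloading objective with greedy passes.
    # All weights, delays, and the congestion model are made up.
    import random

    random.seed(7)
    n, w_delay, w_energy = 10, 0.6, 0.4

    devices = [{"local_delay": random.uniform(3, 8), "local_energy": random.uniform(2, 5),
                "tx_delay": random.uniform(0.5, 1.5), "tx_energy": random.uniform(0.3, 1.0)}
               for _ in range(n)]

    def cost_local(d):
        return w_delay * d["local_delay"] + w_energy * d["local_energy"]

    def cost_mec(d, n_offload):
        mec_delay = d["tx_delay"] + 0.5 * n_offload      # server queueing grows with load
        return w_delay * mec_delay + w_energy * d["tx_energy"]

    offload = [False] * n
    for _ in range(20):                                   # iterative greedy passes
        for i, d in enumerate(devices):
            others = sum(offload) - offload[i]
            offload[i] = cost_mec(d, others + 1) < cost_local(d)

    total = sum(cost_mec(d, sum(offload)) if o else cost_local(d)
                for d, o in zip(devices, offload))
    print(f"{sum(offload)} devices offload; total system cost {total:.2f}")
    ```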

  10. Emergent Adaptive Noise Reduction from Communal Cooperation of Sensor Grid

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Jones, Michael G.; Nark, Douglas M.; Lodding, Kenneth N.

    2010-01-01

    In the last decade, the realization of small, inexpensive, and powerful devices with sensors, computers, and wireless communication has promised the development of massive sensor networks with dense deployments over large areas, capable of high-fidelity situational assessments. However, most management models have been based on centralized control, and research has concentrated on methods for passing data from sensor devices to the central controller. Most implementations have been small but, as it is not scalable, this methodology is insufficient for massive deployments. Here, a specific application of a large sensor network for adaptive noise reduction demonstrates a new paradigm in which communities of sensor/computer devices assess local conditions and make local decisions from which a global behaviour emerges. This approach obviates many of the problems of centralized control, as it is not prone to a single point of failure and is more scalable, efficient, robust, and fault tolerant.

  11. Adapting Instruction to Individual Learner Differences: A Research Paradigm for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Mills, Steven C.; Ragan, Tillman J.

    This paper examines a research paradigm that is particularly suited to experimentation-related computer-based instruction and integrated learning systems. The main assumption of the model is that one of the most powerful capabilities of computer-based instruction, and specifically of integrated learning systems, is the capacity to adapt…

  12. A Refined Computer Harassment Paradigm: Validation, and Test of Hypotheses about Target Characteristics

    ERIC Educational Resources Information Center

    Siebler, Frank; Sabelus, Saskia; Bohner, Gerd

    2008-01-01

    A refined computer paradigm for assessing sexual harassment is presented, validated, and used for testing substantive hypotheses. Male participants were given an opportunity to send sexist jokes to a computer-simulated female chat partner. In Study 1 (N = 44), the harassment measure (number of sexist jokes sent) correlated positively with…

  13. A paradigm shift in biomass technology from complete to partial cellulose hydrolysis: lessons learned from nature.

    PubMed

    Chen, Rachel

    2015-01-01

    A key characteristic of current biomass technology is the requirement for complete hydrolysis of cellulose and hemicellulose, which stems from the inability of microbial strains to use partially hydrolyzed cellulose, or cellodextrin. The complete-hydrolysis paradigm has been practiced over the past four decades, with major enzyme companies perfecting their cellulase mixes for maximal yield of monosaccharides and corresponding strain-development efforts focused almost solely on the conversion of monosaccharides, not cellodextrin, to products. While still in its infancy, a new paradigm requiring only partial hydrolysis has begun to take hold, promising a shift in biomass technology at its fundamental core. The new paradigm has the potential to reduce the requirement for cellulase enzymes in the hydrolysis step and provides new strategies for metabolic engineers, synthetic biologists, and the like in engineering fermenting organisms. Several recent publications reveal that microorganisms engineered to metabolize cellodextrins, rather than monomeric glucose, can reap significant energy gains in both uptake and subsequent phosphorylation. These energetic benefits can in turn be directed toward enhanced robustness and increased productivity of a bioprocess. Furthermore, the new cellodextrin metabolism endows the biocatalyst with the ability to evade catabolite repression, a cellular regulatory mechanism that hampers rapid conversion of biomass sugars to products. Together, the new paradigm offers significant advantages over the old and promises to overcome several critical barriers in biomass technology. More research, however, is needed to realize these promises, especially in the discovery and engineering of cellodextrin transporters, in developing a cost-effective method for cellodextrin generation, and in better integrating cellodextrin metabolism with endogenous glycolysis.

  14. A perspective of adaptation in healthcare.

    PubMed

    Mezghani, Emna; Da Silveira, Marcos; Pruski, Cédric; Exposito, Ernesto; Drira, Khalil

    2014-01-01

    Emerging technologies in healthcare have shown great promise for managing patient care. In recent years, the evolution of Information and Communication Technologies has pushed many research studies to consider treatment plan adaptation in this area. The main goal is to accelerate decision making by dynamically generating new treatments in response to unexpected situations. This paper portrays treatment adaptation from a new perspective inspired by the human nervous system, named autonomic computing. The selected potential studies are classified according to the maturity levels of this paradigm. To guarantee optimal and accurate treatment adaptation, challenges related to medical knowledge and data are identified, and future directions to be explored in healthcare systems are discussed.

  15. A Privacy Access Control Framework for Web Services Collaboration with Role Mechanisms

    NASA Astrophysics Data System (ADS)

    Liu, Linyuan; Huang, Zhiqiu; Zhu, Haibin

    With the popularity of Internet technology, web services are becoming the most promising paradigm for distributed computing. This increased use of web services has meant that more and more personal information of consumers is being shared with web service providers, leading to the need to guarantee the privacy of consumers. This paper proposes a role-based privacy access control framework for web services collaboration; it utilizes roles to specify the privacy privileges of services and considers the impact of services' historical experience in playing roles on their reputation degree. Compared with traditional privacy access control approaches, this framework can make fine-grained authorization decisions, thus efficiently protecting consumers' privacy.
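
    The role mechanism can be pictured as a lookup from roles to permitted data fields, with a reputation threshold gating the sensitive ones; all roles, fields, and thresholds below are invented for illustration and are not the paper's policy model.

    ```python
    # Toy role-based privacy check in the spirit of the framework: a service's
    # role determines which consumer data fields it may read, and a reputation
    # score gates sensitive fields.
    ROLE_PRIVILEGES = {
        "shipping_service": {"name", "address"},
        "payment_service": {"name", "card_token"},
        "analytics_service": set(),              # no personal fields
    }
    SENSITIVE = {"card_token"}
    REPUTATION_THRESHOLD = 0.8

    def can_access(role, field, reputation):
        if field not in ROLE_PRIVILEGES.get(role, set()):
            return False
        if field in SENSITIVE and reputation < REPUTATION_THRESHOLD:
            return False                          # low reputation blocks sensitive data
        return True

    print(can_access("payment_service", "card_token", reputation=0.9))   # True
    print(can_access("payment_service", "card_token", reputation=0.5))   # False
    print(can_access("analytics_service", "address", reputation=0.99))   # False
    ```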

  16. Orientation selectivity in a multi-gated organic electrochemical transistor

    NASA Astrophysics Data System (ADS)

    Gkoupidenis, Paschalis; Koutsouras, Dimitrios A.; Lonjaret, Thomas; Fairfield, Jessamyn A.; Malliaras, George G.

    2016-06-01

    Neuromorphic devices offer promising computational paradigms that transcend the limitations of conventional technologies. A prominent example, inspired by the workings of the brain, is spatiotemporal information processing. Here we demonstrate orientation selectivity, a spatiotemporal processing function of the visual cortex, using a poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS) organic electrochemical transistor with multiple gates. Spatially distributed inputs on a gate electrode array are found to correlate with the output of the transistor, leading to the ability to discriminate between different stimulus orientations. The demonstration of spatiotemporal processing in an organic electronic device paves the way for neuromorphic devices with new form factors and a facile interface with biology.

  17. Effects of Bright Light Therapy of Sleep, Cognition, Brain Function, and Neurochemistry in Mild Traumatic Brain Injury

    DTIC Science & Technology

    2012-01-01

    computerized stimulation paradigms for use during functional neuroimaging (i.e., MSIT). Accomplishments: the following computer tasks were ... and Stability Test; programming of all computerized functional MRI stimulation paradigms and assessment tasks using E-Prime software was completed; computer stimulation paradigms were tested in the scanner environment to ensure that they could be presented and seen by subjects in the scanner.

  18. P2P Technology for High-Performance Computing: An Overview

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J. (Technical Monitor); Berry, Jason

    2003-01-01

    The transition from cluster computing to peer-to-peer (P2P) high-performance computing has recently attracted the attention of the computer science community. It has been recognized that existing local networks and dedicated clusters of headless workstations can serve as inexpensive yet powerful virtual supercomputers. It has also been recognized that the vast number of lower-end computers connected to the Internet stay idle for as long as 90% of the time. The growing speed of Internet connections and the high availability of free CPU time encourage exploration of the possibility of using the whole Internet, rather than local clusters, as a massively parallel yet almost freely available P2P supercomputer. As part of a larger project on P2P high-performance computing, it has been my goal to compile an overview of the P2P paradigm. I have studied various P2P platforms and compiled systematic brief descriptions of their most important characteristics. I have also experimented and obtained hands-on experience with selected P2P platforms, focusing on those that seem promising with respect to P2P high-performance computing. I have also compiled relevant literature and web references. I have prepared a draft technical report and summarized my findings in a poster paper.

  19. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their Impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview-type presentations and three exhibits by vendors.

  20. Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.

    PubMed

    Krishnamurthy, V; Krishnamurthy, E V

    1999-03-01

    A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, the computations are interpreted as the outcome arising out of interaction of elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among the subsets of elements, so that the elements evolve toward an equilibrium, unstable, or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic. Also, it can handle a probabilistic mode of computation if we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-like query processing and transactions.
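
    A toy interpreter for this paradigm: a rule consumes two elements of the object space and produces one, applied until quiescence. (A real implementation would apply rules concurrently across the ensemble; this sketch sequentializes them, and the rules shown are invented examples.)

    ```python
    # Tiny multiset-rewriting interpreter: interactions annihilate two elements
    # and create a new one, repeated until no further rule application is
    # possible (here, until one element remains).
    def evolve(space, rule):
        space = list(space)
        while len(space) > 1:
            x, y = space.pop(), space.pop()    # pick two interacting elements
            space.append(rule(x, y))           # interaction creates a new element
        return space[0]

    print(evolve([3, 1, 4, 1, 5, 9, 2, 6], max))                  # -> 9
    print(evolve([3, 1, 4, 1, 5, 9, 2, 6], lambda x, y: x + y))   # -> 31
    ```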

  1. In Search of Gender Free Paradigms for Computer Science Education. [Proceedings of a Preconference Research Workshop at the National Educational Computing Conference (Nashville, Tennessee, June 24, 1990).

    ERIC Educational Resources Information Center

    Martin, C. Dianne, Ed.; Murchie-Beyma, Eric, Ed.

    This monograph includes nine papers delivered at a National Educational Computing Conference (NECC) preconference workshop, and a previously unpublished paper on gender and attitudes. The papers, which are presented in four categories, are: (1) "Report on the Workshop: In Search of Gender Free Paradigms for Computer Science Education"…

  2. Detecting Lung Diseases from Exhaled Aerosols: Non-Invasive Lung Diagnosis Using Fractal Analysis and SVM Classification

    PubMed Central

    Xi, Jinxiang; Zhao, Weizhong; Yuan, Jiayao Eddie; Kim, JongWon; Si, Xiuhua; Xu, Xiaowei

    2015-01-01

    Background: Each lung structure exhales a unique pattern of aerosols, which can be used to detect and monitor lung diseases non-invasively. The challenges are accurately interpreting the exhaled aerosol fingerprints and quantitatively correlating them to the lung diseases. Objective and Methods: In this study, we presented a paradigm of an exhaled aerosol test that addresses the above two challenges and is promising to detect the site and severity of lung diseases. This paradigm consists of two steps: image feature extraction using sub-regional fractal analysis and data classification using a support vector machine (SVM). Numerical experiments were conducted to evaluate the feasibility of the breath test in four asthmatic lung models. A high-fidelity image-CFD approach was employed to compute the exhaled aerosol patterns under different disease conditions. Findings: By employing the 10-fold cross-validation method, we achieved 100% classification accuracy among four asthmatic models using an ideal 108-sample dataset and 99.1% accuracy using a more realistic 324-sample dataset. The fractal-SVM classifier has been shown to be robust, highly sensitive to structural variations, and inherently suitable for investigating aerosol-disease correlations. Conclusion: For the first time, this study quantitatively linked the exhaled aerosol patterns with their underlying diseases and set the stage for the development of a computer-aided diagnostic system for non-invasive detection of obstructive respiratory diseases. PMID:26422016
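
    The classification half of the paradigm is standard enough to sketch: an RBF-kernel SVM scored by 10-fold cross-validation, here on synthetic stand-ins for the sub-regional fractal-dimension features (the 4 x 27 = 108-sample layout mirrors the study's ideal dataset; everything else is invented).

    ```python
    # Hedged sketch: SVM with 10-fold cross-validation on synthetic feature
    # vectors standing in for sub-regional fractal dimensions of exhaled-aerosol
    # images.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    rng = np.random.default_rng(42)
    n_per_class, n_features = 27, 16        # e.g., 16 sub-regional fractal dimensions
    X, y = [], []
    for label in range(4):                  # four asthmatic lung models, as in the study
        center = rng.uniform(1.2, 1.9, n_features)   # plausible fractal-dimension range
        X.append(center + 0.05 * rng.normal(size=(n_per_class, n_features)))
        y += [label] * n_per_class
    X = np.vstack(X)

    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(SVC(kernel="rbf", C=10), X, np.array(y), cv=cv)
    print(f"10-fold accuracy: {scores.mean():.3f}")
    ```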

  3. "Big data" and "open data": What kind of access should researchers enjoy?

    PubMed

    Chatellier, Gilles; Varlet, Vincent; Blachier-Poisson, Corinne

    2016-02-01

    The healthcare sector is currently facing a new paradigm: the explosion of "big data". Coupled with advances in computer technology, the field of "big data" appears promising, allowing us to better understand the natural history of diseases, to follow up on the implementation of new technologies (devices, drugs), and to participate in precision medicine, etc. Data sources are multiple (medical and administrative data, electronic medical records, data from rapidly developing technologies such as DNA sequencing, connected devices, etc.) and heterogeneous, while their use requires complex methods for accurate analysis. Moreover, faced with this new paradigm, we must determine who could (or should) have access to which data, how to combine collective interest and protection of personal data, and how to finance in the long term both operating costs and database interrogation. This article analyses the opportunities and challenges related to the use of open and/or "big data", from the viewpoint of pharmacologists and representatives of the pharmaceutical and medical device industry.

  4. Flexible robotics: a new paradigm.

    PubMed

    Aron, Monish; Haber, Georges-Pascal; Desai, Mihir M; Gill, Inderbir S

    2007-05-01

    The use of robotics in urologic surgery has seen exponential growth over the last 5 years. Existing surgical robots operate rigid instruments on the master/slave principle and currently allow extraluminal manipulations and surgical procedures. Flexible robotics is an entirely novel paradigm. This article explores the potential of flexible robotic platforms that could permit endoluminal and transluminal surgery in the future. Computerized catheter-control systems are being developed primarily for cardiac applications. This development is driven by the need for precise positioning and manipulation of the catheter tip in the three-dimensional cardiovascular space. Such systems employ either remote navigation in a magnetic field or a computer-controlled electromechanical flexible robotic system. We have adapted this robotic system for flexible ureteropyeloscopy and have to date completed the initial porcine studies. Flexible robotics is on the horizon. It has potential for improved scope-tip precision, superior operative ergonomics, and reduced occupational radiation exposure. In the near future, in urology, we believe that it holds promise for endoluminal therapeutic ureterorenoscopy. Looking further ahead, within the next 3-5 years, it could enable transluminal surgery.

  5. Linking consistency with object/thread semantics - An approach to robust computation

    NASA Technical Reports Server (NTRS)

    Chen, Raymond C.; Dasgupta, Partha

    1989-01-01

    This paper presents an object/thread based paradigm that links data consistency with object/thread semantics. The paradigm can be used to achieve a wide range of consistency semantics from strict atomic transactions to standard process semantics. The paradigm supports three types of data consistency. Object programmers indicate the type of consistency desired on a per-operation basis and the system performs automatic concurrency control and recovery management to ensure that those consistency requirements are met. This allows programmers to customize consistency and recovery on a per-application basis without having to supply complicated, custom recovery management schemes. The paradigm allows robust and nonrobust computation to operate concurrently on the same data in a well defined manner. The operating system needs to support only one vehicle of computation - the thread.
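
    One way to picture per-operation consistency (a toy illustration, not the paper's system): operations carry a declared consistency type, and stricter types get concurrency control, here reduced to a simple lock.

    ```python
    # Illustrative sketch of per-operation consistency: the object programmer
    # tags each operation with a desired consistency type; a real runtime would
    # use the tag to apply concurrency control and recovery management. In this
    # toy, the "atomic" operation simply takes a lock itself.
    import threading

    def consistency(level):
        def wrap(method):
            method._consistency = level   # declarative per-operation tag
            return method
        return wrap

    class Account:
        def __init__(self):
            self._balance, self._lock = 0, threading.Lock()

        @consistency("atomic")
        def deposit(self, amount):
            with self._lock:              # stricter consistency: serialized update
                self._balance += amount

        @consistency("standard")
        def peek(self):
            return self._balance          # plain thread semantics, no locking

    acct = Account()
    threads = [threading.Thread(target=acct.deposit, args=(1,)) for _ in range(100)]
    for th in threads: th.start()
    for th in threads: th.join()
    print(acct.peek(), Account.deposit._consistency)   # 100 atomic
    ```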

  6. AN INTELLIGENT REPRODUCTIVE AND DEVELOPMENTAL TESTING PARADIGM FOR THE 21ST CENTURY

    EPA Science Inventory

    Addressing the chemical evaluation bottleneck that currently exists can only be achieved through progressive changes to the current testing paradigm. The primary resources for addressing these issues lie in computational toxicology, a field enriched by recent advances in computer...

  7. Synthetic Elucidation of Design Principles for Molecular Qubits

    NASA Astrophysics Data System (ADS)

    Graham, Michael James

    Quantum information processing (QIP) is an emerging computational paradigm with the potential to enable a vast increase in computational power, fundamentally transforming fields from structural biology to finance. QIP employs qubits, or quantum bits, as its fundamental units of information, which can exist in not just the classical states of 0 or 1, but in a superposition of the two. In order to successfully perform QIP, this superposition state must be sufficiently long-lived. One promising paradigm for the implementation of QIP involves employing unpaired electrons in coordination complexes as qubits. This architecture is highly tunable and scalable; however, coordination complexes frequently suffer from short superposition lifetimes, or T2. In order to capitalize on the promise of molecular qubits, it is necessary to develop a set of design principles that allow the rational synthesis of complexes with sufficiently long values of T2. In this dissertation, I report efforts to use the synthesis of series of complexes to elucidate design principles for molecular qubits. Chapter 1 details previous work by our group and others in the field. Chapter 2 details the first efforts of our group to determine the impact of varying spin and spin-orbit coupling on T2. Chapter 3 examines the effect of removing nuclear spins on coherence time, and reports a series of vanadyl bis(dithiolene) complexes which exhibit extremely long coherence lifetimes, in excess of the 100 μs threshold for qubit viability. Chapters 4 and 5 form two complementary halves of a study to determine the exact relationship between electronic spin-nuclear spin distance and the effect of the nuclear spins on T2. Finally, chapter 6 suggests next directions for the field as a whole, including the potential for work in this field to impact the development of other technologies as diverse as quantum sensors and magnetic resonance imaging contrast agents.

  8. Nonlinear Real-Time Optical Signal Processing

    DTIC Science & Technology

    1990-09-01

    ...pattern recognition. Additional work concerns the relationship of parallel computation paradigms to optical computing and halftone screen techniques for implementing general nonlinear functions... 2.4 Nonlinear Optical Processing with Halftones: Degradation and Compensation Models. This paper is concerned with...

  9. New Paradigms for Computer Aids to Invention.

    ERIC Educational Resources Information Center

    Langston, M. Diane

    Many people are interested in computer aids to rhetorical invention and want to know how to evaluate an invention aid, what the criteria are for a good one, and how to assess the trade-offs involved in buying one product or another. The frame of reference for this evaluation is an "old paradigm," which treats the computer as if it were…

  10. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at

    PubMed Central

    Bukowski, Henryk; Hietanen, Jari K.; Samson, Dana

    2015-01-01

    ABSTRACT Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations. PMID:26924936

  11. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at.

    PubMed

    Bukowski, Henryk; Hietanen, Jari K; Samson, Dana

    2015-09-14

    Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations.

  12. All-optical reservoir computing.

    PubMed

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
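
    The single-nonlinearity-plus-delay architecture can be imitated numerically. The following sketch, with illustrative parameters and a tanh standing in for the amplifier's saturation, trains a linear readout on the states of a masked delay line; it demonstrates the paradigm, not the optical hardware:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Task: one-step-ahead prediction of a noisy sine wave.
    T = 2000
    u = np.sin(0.2 * np.arange(T)) + 0.05 * rng.standard_normal(T)

    N = 50                           # virtual nodes along the delay loop
    mask = rng.uniform(-1, 1, N)     # fixed input mask, one weight per node
    x = np.zeros(N)                  # the delay line holds the reservoir state
    states = np.zeros((T, N))
    for t in range(T):
        for i in range(N):
            # Saturating nonlinearity; x[i-1] (cyclically) is the delayed feedback.
            x[i] = np.tanh(0.9 * x[i - 1] + 0.5 * mask[i] * u[t])
        states[t] = x

    # Linear readout trained by ridge regression -- the only trained part.
    X, y = states[100:1000], u[101:1001]
    w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
    pred = states[1000:-1] @ w
    nmse = np.mean((pred - u[1001:]) ** 2) / np.var(u)
    print(f"test NMSE: {nmse:.4f}")
    ```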

  13. Möglichkeiten und Grenzen eines evolutionären Paradigmas in der Erziehungswissenschaft (Possibilities and Limits of an Evolutionary Paradigm in Educational Science).

    ERIC Educational Resources Information Center

    Nipkow, Karl Ernst

    2002-01-01

    Describes a sectoral and paradigmatic approach to evolutionary research. Argues that an evolutionary paradigm does not exist. Examines the socio-biological approach and that of a system-theoretical oriented general evolutionary theory. Utilizes the topics of cooperation, delimitation, and indoctrination to explain more promising ways of adoption.…

  14. A comparison among several P300 brain-computer interface speller paradigms.

    PubMed

    Fazel-Rezai, Reza; Gavett, Scott; Ahmad, Waqas; Rabbi, Ahmed; Schneider, Eric

    2011-10-01

    Since the brain-computer interface (BCI) speller was first proposed by Farwell and Donchin, there have been modifications in the visual aspects of P300 paradigms. Most of the changes are based on the original matrix format, such as changes in the number of rows and columns, font size, flash/blank time, and flash order. The improvement in the resulting accuracy and speed of such systems has always been the ultimate goal. In this study, we have compared several different speller paradigms, including row/column, single-character flashing, and two region-based paradigms which are not based on the matrix format. In the first region-based paradigm, at the first level, characters and symbols are distributed over seven regions alphabetically, while in the second region-based paradigm they are distributed in the most frequently used order. At the second level, each one of the regions is further subdivided into seven subsets. The experimental results showed that the average accuracy and user acceptability for the two region-based paradigms were higher than those for traditional paradigms such as row/column and single-character flashing.
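
    The two-level region structure can be made concrete with a toy mapping. The symbol inventory and its ordering below are invented for illustration; the point is that any of 49 targets is reached with two P300 selections instead of row and column flashes:

    ```python
    import string

    # 7 regions x 7 cells = 49 targets: letters, digits, and a few symbols.
    symbols = list(string.ascii_uppercase + string.digits + ".,;:!?@#&*+-=/")[:49]
    regions = [symbols[i * 7:(i + 1) * 7] for i in range(7)]   # level 1 split

    def select(region_idx, cell_idx):
        """Two successive P300 detections: region first, then cell."""
        return regions[region_idx][cell_idx]

    target = "5"   # locate the two indices a user would select for '5'
    r = next(i for i, reg in enumerate(regions) if target in reg)
    c = regions[r].index(target)
    print(select(0, 0), select(r, c))   # -> 'A' and '5'
    ```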

  15. A study on strategic provisioning of cloud computing services.

    PubMed

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Successful service provisioning can guarantee the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics. Hence, continuous service provisioning that satisfies user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  16. Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective

    PubMed Central

    Mattout, Jérémie

    2012-01-01

    A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291

  17. Compressive sensing scalp EEG signals: implementations and practical performance.

    PubMed

    Abdulghani, Amir M; Casson, Alexander J; Rodriguez-Villegas, Esther

    2012-11-01

    Highly miniaturised, wearable computing and communication systems allow unobtrusive, convenient and long term monitoring of a range of physiological parameters. For long term operation from the physically smallest batteries, the average power consumption of a wearable device must be very low. It is well known that the overall power consumption of these devices can be reduced by the inclusion of low power consumption, real-time compression of the raw physiological data in the wearable device itself. Compressive sensing is a new paradigm for providing data compression: it has shown significant promise in fields such as MRI, and is potentially suitable for use in wearable computing systems because the compression process required in the wearable device has low computational complexity. However, the practical performance very much depends on the characteristics of the signal being sensed. As such, the utility of the technique cannot be extrapolated from one application to another. Long term electroencephalography (EEG) is a fundamental tool for the investigation of neurological disorders and is increasingly used in many non-medical applications, such as brain-computer interfaces. This article investigates in detail the practical performance of different implementations of the compressive sensing theory when applied to scalp EEG signals.
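
    The encode/decode asymmetry that makes compressive sensing attractive for wearables can be sketched as follows. The signal here is synthetically sparse; real scalp EEG is only approximately sparse in a suitable dictionary, which is exactly why the practical performance studied above cannot be taken for granted:

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)

    n, m, k = 256, 64, 8             # signal length, measurements, sparsity
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

    # Encoder (all the wearable must do): one random projection, y = Phi x.
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)
    y = Phi @ x

    # Decoder (runs off-body): sparse recovery by L1-regularized regression.
    x_hat = Lasso(alpha=1e-3, max_iter=50000, fit_intercept=False).fit(Phi, y).coef_
    print(f"relative error: {np.linalg.norm(x_hat - x) / np.linalg.norm(x):.3f}")
    ```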

  18. A Study on Strategic Provisioning of Cloud Computing Services

    PubMed Central

    Rejaul Karim Chowdhury, Md

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Successful service provisioning can guarantee the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics. Hence, continuous service provisioning that satisfies user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified. PMID:25032243

  19. A new paradigm for atomically detailed simulations of kinetics in biophysical systems.

    PubMed

    Elber, Ron

    2017-01-01

    The kinetics of biochemical and biophysical events determine the course of life processes and have attracted considerable interest and research. For example, modeling of biological networks and cellular responses relies on the availability of information on rate coefficients. Atomically detailed simulations hold the promise of supplementing experimental data to obtain a more complete kinetic picture. However, simulations at biological time scales are challenging. Typical computer resources are insufficient to provide the ensemble of trajectories at the correct length that is required for straightforward calculations of time scales. In recent years, new technologies have emerged that make atomically detailed simulations of rate coefficients possible. Instead of computing complete trajectories from reactants to products, these approaches launch a large number of short trajectories at different positions. Since the trajectories are short, they are computed trivially in parallel on modern computer architectures. The starting and termination positions of the short trajectories are chosen, following statistical mechanics theory, to enhance efficiency. These trajectories are then analyzed, and the analysis produces accurate estimates of time scales as long as hours. The theory of Milestoning that exploits the use of short trajectories is discussed, and several applications are described.
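
    The bookkeeping behind such short-trajectory methods can be seen in a toy Milestoning-style calculation on a 1D random walk: estimate, from many independent short trajectories, the transition kernel K between milestones and the mean lifetime t at each milestone, then solve T = t + K T for the mean first-passage time. The walk and milestone placement are illustrative only:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 20   # milestones 0..N; reflecting wall at 0, absorbing target at N

    K = np.zeros((N, N + 1))   # kernel: which milestone is hit next
    t = np.ones(N)             # mean lifetime between milestones (1 step here)
    for i in range(N):
        if i == 0:
            right = 1.0                                # wall: always step right
        else:
            # "Short trajectories": sample the next milestone many times.
            right = rng.integers(0, 2, 10_000).mean()  # ~0.5 for an unbiased walk
            K[i, i - 1] = 1.0 - right
        K[i, i + 1] = right

    # Mean first-passage times satisfy T_i = t_i + sum_j K_ij T_j, with T_N = 0.
    T = np.linalg.solve(np.eye(N) - K[:, :N], t)
    print(f"estimated MFPT 0 -> N: {T[0]:.0f}   (exact value: {N**2})")
    ```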

  20. Answer Set Programming and Other Computing Paradigms

    ERIC Educational Resources Information Center

    Meng, Yunsong

    2013-01-01

    Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…

  1. An online hybrid brain-computer interface combining multiple physiological signals for webpage browse.

    PubMed

    Long Chen; Zhongpeng Wang; Feng He; Jiajia Yang; Hongzhi Qi; Peng Zhou; Baikun Wan; Dong Ming

    2015-08-01

    The hybrid brain-computer interface (hBCI) can provide a higher information transfer rate than classical BCIs because it combines more than one brain-computer or human-machine interaction paradigm, such as the combination of the P300 and SSVEP paradigms. We first constructed independent subsystems for three different paradigms and tested each of them in online experiments. We then constructed a serial hybrid BCI system that combined these paradigms to achieve the functions of typing letters, moving and clicking a cursor, and switching among these functions for the purpose of browsing webpages. Five subjects were involved in this study, and all of them successfully realized these functions in the online tests. After training, the subjects achieved an accuracy above 90%, which met the requirement for operating the system efficiently. The results demonstrated an efficient system capable of robustness, which provides an approach for clinical application.

  2. Artificial Intelligence and brain.

    PubMed

    Shapshak, Paul

    2018-01-01

    From the start, Kurt Gödel observed that computer and brain paradigms were considered on a par by researchers and that researchers had misunderstood his theorems. He maintained, with displeasure at this misreading, that the brain transcends computers. In this brief article, we point out that Artificial Intelligence (AI) comprises a multitude of human-made methodologies, systems, and languages, implemented with computer technology. These advances enhance development in the electron and quantum realms. In the biological realm, animal neurons function by likewise utilizing electron flow, and are products of evolution. Mirror neurons are an important paradigm in neuroscience research. Moreover, the paradigm shift proposed here - 'hall of mirror neurons' - is a potentially productive further research tactic. These concepts further expand AI and brain research.

  3. CNTRICS Imaging Biomarkers Final Task Selection: Long-Term Memory and Reinforcement Learning

    PubMed Central

    Ragland, John D.; Cohen, Neal J.; Cools, Roshan; Frank, Michael J.; Hannula, Deborah E.; Ranganath, Charan

    2012-01-01

    Functional imaging paradigms hold great promise as biomarkers for schizophrenia research as they can detect altered neural activity associated with the cognitive and emotional processing deficits that are so disabling to this patient population. In an attempt to identify the most promising functional imaging biomarkers for research on long-term memory (LTM), the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) initiative selected “item encoding and retrieval,” “relational encoding and retrieval,” and “reinforcement learning” as key LTM constructs to guide the nomination process. This manuscript reports on the outcome of the third CNTRICS biomarkers meeting in which nominated paradigms in each of these domains were discussed by a review panel to arrive at a consensus on which of the nominated paradigms could be recommended for immediate translational development. After briefly describing this decision process, information is presented from the nominating authors describing the 4 functional imaging paradigms that were selected for immediate development. In addition to describing the tasks, information is provided on cognitive and neural construct validity, sensitivity to behavioral or pharmacological manipulations, availability of animal models, psychometric characteristics, effects of schizophrenia, and avenues for future development. PMID:22102094

  4. An Instructional Paradigm for the Teaching of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Howard, Craig D.

    2012-01-01

    This article outlines an instructional paradigm that guides the design of interventions that build skills in computer-mediated communication (CMC). It is applicable to learning at multiple levels of communicative proficiency and aims to heighten awareness, the understanding of the impact of media configurations, the role of cultures and social…

  5. Approaches to Cyber Intrusion Response. A Legal Foundations Study. Report 12 of 12

    DTIC Science & Technology

    1997-01-01

    ...Computers: Controlling Behavior in Cyberspace through a Contract Law Paradigm, 35 Jurimetrics J. 1-15 (1994); Ingrid Becker, Cybercrime: Cops Can't... • Pro: This approach allows the victim...

  6. A Framework for Safe Composition of Heterogeneous SOA Services in a Pervasive Computing Environment with Resource Constraints

    ERIC Educational Resources Information Center

    Reyes Alamo, Jose M.

    2010-01-01

    The Service Oriented Computing (SOC) paradigm defines services as software artifacts whose implementations are separated from their specifications. Application developers rely on services to simplify design and reduce development time and cost. Within the SOC paradigm, different Service Oriented Architectures (SOAs) have been developed.…

  7. Test-retest reliability of effective connectivity in the face perception network.

    PubMed

    Frässle, Stefan; Paulus, Frieder Michel; Krach, Sören; Jansen, Andreas

    2016-02-01

    Computational approaches have great potential for moving neuroscience toward mechanistic models of the functional integration among brain regions. Dynamic causal modeling (DCM) offers a promising framework for inferring the effective connectivity among brain regions and thus unraveling the neural mechanisms of both normal cognitive function and psychiatric disorders. While the benefit of such approaches depends heavily on their reliability, systematic analyses of the within-subject stability are rare. Here, we present a thorough investigation of the test-retest reliability of an fMRI paradigm for DCM analysis dedicated to unraveling intra- and interhemispheric integration among the core regions of the face perception network. First, we examined the reliability of face-specific BOLD activity in 25 healthy volunteers, who performed a face perception paradigm in two separate sessions. We found good to excellent reliability of BOLD activity within the DCM-relevant regions. Second, we assessed the stability of effective connectivity among these regions by analyzing the reliability of Bayesian model selection and model parameter estimation in DCM. Reliability was excellent for the negative free energy and good for model parameter estimation, when restricting the analysis to parameters with substantial effect sizes. Third, even when the experiment was shortened, reliability of BOLD activity and DCM results dropped only slightly as a function of the length of the experiment. This suggests that the face perception paradigm presented here provides reliable estimates for both conventional activation and effective connectivity measures. We conclude this paper with an outlook on potential clinical applications of the paradigm for studying psychiatric disorders. Hum Brain Mapp 37:730-744, 2016. © 2015 Wiley Periodicals, Inc.

  8. Application of a single-flicker online SSVEP BCI for spatial navigation.

    PubMed

    Chen, Jingjing; Zhang, Dan; Engel, Andreas K; Gong, Qin; Maye, Alexander

    2017-01-01

    A promising approach for brain-computer interfaces (BCIs) employs the steady-state visual evoked potential (SSVEP) for extracting control information. Main advantages of these SSVEP BCIs are a simple and low-cost setup, little effort to adjust the system parameters to the user and comparatively high information transfer rates (ITR). However, traditional frequency-coded SSVEP BCIs require the user to gaze directly at the selected flicker stimulus, which is liable to cause fatigue or even photic epileptic seizures. The spatially coded SSVEP BCI we present in this article addresses this issue. It uses a single flicker stimulus that appears always in the extrafoveal field of view, yet it allows the user to control four control channels. We demonstrate the embedding of this novel SSVEP stimulation paradigm in the user interface of an online BCI for navigating a 2-dimensional computer game. Offline analysis of the training data reveals an average classification accuracy of 96.9±1.64%, corresponding to an information transfer rate of 30.1±1.8 bits/min. In online mode, the average classification accuracy reached 87.9±11.4%, which resulted in an ITR of 23.8±6.75 bits/min. We did not observe a strong relation between a subject's offline and online performance. Analysis of the online performance over time shows that users can reliably control the new BCI paradigm with stable performance over at least 30 minutes of continuous operation.
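
    The bits/min figures quoted above follow from the standard Wolpaw information transfer rate formula. A quick check, where the 3.5 s selection time is an assumption back-derived to approximately reproduce the offline number (the abstract does not state the trial timing):

    ```python
    from math import log2

    def wolpaw_itr(n_classes, accuracy, selection_time_s):
        """Wolpaw information transfer rate in bits/min."""
        p, n = accuracy, n_classes
        bits = log2(n)
        if 0.0 < p < 1.0:
            bits += p * log2(p) + (1.0 - p) * log2((1.0 - p) / (n - 1))
        return bits * 60.0 / selection_time_s

    # Four control channels; 3.5 s per selection is a hypothetical value.
    print(f"offline: {wolpaw_itr(4, 0.969, 3.5):.1f} bits/min")  # ~30, cf. 30.1
    print(f"online : {wolpaw_itr(4, 0.879, 3.5):.1f} bits/min")  # lower at ~88%
    ```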

  9. Lambda Data Grid: Communications Architecture in Support of Grid Computing

    DTIC Science & Technology

    2006-12-21

    ...number of paradigm shifts in the 20th century, including the growth of large geographically dispersed teams and the use of simulations and computational... get results. The work in this thesis automates the orchestration of networks with other resources, better utilizing all resources in a time-efficient... domains, over transatlantic links in around a minute. The main goal of this thesis is to build a new grid-computing paradigm that fully harnesses the...

  10. The 'Biologically-Inspired Computing' Column

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike

    2006-01-01

    The field of Biology changed dramatically in 1953, with the determination by Francis Crick and James Dewey Watson of the double helix structure of DNA. This discovery changed Biology for ever, allowing the sequencing of the human genome, and the emergence of a "new Biology" focused on DNA, genes, proteins, data, and search. Computational Biology and Bioinformatics heavily rely on computing to facilitate research into life and development. Simultaneously, an understanding of the biology of living organisms indicates a parallel with computing systems: molecules in living cells interact, grow, and transform according to the "program" dictated by DNA. Moreover, paradigms of Computing are emerging based on modelling and developing computer-based systems exploiting ideas that are observed in nature. This includes building into computer systems self-management and self-governance mechanisms that are inspired by the human body's autonomic nervous system, modelling evolutionary systems analogous to colonies of ants or other insects, and developing highly-efficient and highly-complex distributed systems from large numbers of (often quite simple) largely homogeneous components to reflect the behaviour of flocks of birds, swarms of bees, herds of animals, or schools of fish. This new field of "Biologically-Inspired Computing", often known in other incarnations by other names, such as: Autonomic Computing, Pervasive Computing, Organic Computing, Biomimetics, and Artificial Life, amongst others, is poised at the intersection of Computer Science, Engineering, Mathematics, and the Life Sciences. Successes have been reported in the fields of drug discovery, data communications, computer animation, control and command, exploration systems for space, undersea, and harsh environments, to name but a few, and augur much promise for future progress.

  11. Comparison of two paradigms for distributed shared memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levelt, W.G.; Kaashoek, M.F.; Bal, H.E.

    1990-08-01

    The paper compares two paradigms for Distributed Shared Memory on loosely coupled computing systems: the shared data-object model as used in Orca, a programming language specially designed for loosely coupled computing systems, and the Shared Virtual Memory model. For both paradigms the authors have implemented two systems, one using only point-to-point messages, the other using broadcasting as well. They briefly describe these two paradigms and their implementations. Then they compare their performance on four applications: the traveling salesman problem, alpha-beta search, matrix multiplication and the all pairs shortest paths problem. The measurements show that both paradigms can be used efficiently for programming large-grain parallel applications. Significant speedups were obtained on all applications. The unstructured Shared Virtual Memory paradigm achieves the best absolute performance, although this is largely due to the preliminary nature of the Orca compiler used. The structured shared data-object model achieves the highest speedups and is much easier to program and to debug.

  12. The Virtual Teacher (VT) Paradigm: Learning New Patterns of Interpersonal Coordination Using the Human Dynamic Clamp

    PubMed Central

    2015-01-01

    The Virtual Teacher paradigm, a version of the Human Dynamic Clamp (HDC), is introduced into studies of learning patterns of inter-personal coordination. Combining mathematical modeling and experimentation, we investigate how the HDC may be used as a Virtual Teacher (VT) to help humans co-produce and internalize new inter-personal coordination pattern(s). Human learners produced rhythmic finger movements whilst observing a computer-driven avatar, animated by dynamic equations stemming from the well-established Haken-Kelso-Bunz (1985) and Schöner-Kelso (1988) models of coordination. We demonstrate that the VT is successful in shifting the pattern co-produced by the VT-human system toward any value (Experiment 1) and that the VT can help humans learn unstable relative phasing patterns (Experiment 2). Using transfer entropy, we find that information flow from one partner to the other increases when VT-human coordination loses stability. This suggests that variable joint performance may actually facilitate interaction, and in the long run learning. VT appears to be a promising tool for exploring basic learning processes involved in social interaction, unraveling the dynamics of information flow between interacting partners, and providing possible rehabilitation opportunities. PMID:26569608

  13. The Virtual Teacher (VT) Paradigm: Learning New Patterns of Interpersonal Coordination Using the Human Dynamic Clamp.

    PubMed

    Kostrubiec, Viviane; Dumas, Guillaume; Zanone, Pier-Giorgio; Kelso, J A Scott

    2015-01-01

    The Virtual Teacher paradigm, a version of the Human Dynamic Clamp (HDC), is introduced into studies of learning patterns of inter-personal coordination. Combining mathematical modeling and experimentation, we investigate how the HDC may be used as a Virtual Teacher (VT) to help humans co-produce and internalize new inter-personal coordination pattern(s). Human learners produced rhythmic finger movements whilst observing a computer-driven avatar, animated by dynamic equations stemming from the well-established Haken-Kelso-Bunz (1985) and Schöner-Kelso (1988) models of coordination. We demonstrate that the VT is successful in shifting the pattern co-produced by the VT-human system toward any value (Experiment 1) and that the VT can help humans learn unstable relative phasing patterns (Experiment 2). Using transfer entropy, we find that information flow from one partner to the other increases when VT-human coordination loses stability. This suggests that variable joint performance may actually facilitate interaction, and in the long run learning. VT appears to be a promising tool for exploring basic learning processes involved in social interaction, unraveling the dynamics of information flow between interacting partners, and providing possible rehabilitation opportunities.

  14. KeyWare: an open wireless distributed computing environment

    NASA Astrophysics Data System (ADS)

    Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir

    1995-12-01

    Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.

  15. Cloud@Home: A New Enhanced Computing Paradigm

    NASA Astrophysics Data System (ADS)

    Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco

    Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet Computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)) and Green computing (a new frontier of Ethical computing starting from the assumption that in the near future energy costs will be related to environmental pollution).

  16. Quantum computers: Definition and implementations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Delgado, Carlos A.; Kok, Pieter

    The DiVincenzo criteria for implementing a quantum computer have been seminal in focusing both experimental and theoretical research in quantum-information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. Therefore, the question is what the general criteria for implementing quantum computers are. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following criteria: any quantum computer must consist of a quantum memory, with an additional structure that (1) facilitates a controlled quantum evolution of the quantum memory; (2) includes a method for information theoretic cooling of the memory; and (3) provides a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we present a decision tree for selecting an avenue toward building a quantum computer. This is intended to help experimentalists determine the most natural paradigm given a particular physical implementation.

  17. EHR standards--A comparative study.

    PubMed

    Blobel, Bernd; Pharow, Peter

    2006-01-01

    For ensuring quality and efficiency of patient care, the care paradigm is moving from organization-centered, through process-controlled, towards personal care. This paradigm change in health systems leads to new paradigms for analyzing, designing, implementing and deploying supporting health information systems, including EHR systems as the core application in a distributed eHealth environment. The paper defines the architectural paradigm for future-proof EHR systems and compares advanced EHR architectures by referencing them against the Generic Component Model. The paper also introduces the evolving paradigm of autonomic computing for self-organizing health information systems.

  18. Unlocking the potential of metagenomics through replicated experimental design.

    PubMed

    Knight, Rob; Jansson, Janet; Field, Dawn; Fierer, Noah; Desai, Narayan; Fuhrman, Jed A; Hugenholtz, Phil; van der Lelie, Daniel; Meyer, Folker; Stevens, Rick; Bailey, Mark J; Gordon, Jeffrey I; Kowalchuk, George A; Gilbert, Jack A

    2012-06-07

    Metagenomics holds enormous promise for discovering novel enzymes and organisms that are biomarkers or drivers of processes relevant to disease, industry and the environment. In the past two years, we have seen a paradigm shift in metagenomics to the application of cross-sectional and longitudinal studies enabled by advances in DNA sequencing and high-performance computing. These technologies now make it possible to broadly assess microbial diversity and function, allowing systematic investigation of the largely unexplored frontier of microbial life. To achieve this aim, the global scientific community must collaborate and agree upon common objectives and data standards to enable comparative research across the Earth's microbiome. Improvements in comparability of data will facilitate the study of biotechnologically relevant processes, such as bioprospecting for new glycoside hydrolases or identifying novel energy sources.

  19. On-chip phase-change photonic memory and computing

    NASA Astrophysics Data System (ADS)

    Cheng, Zengguang; Ríos, Carlos; Youngblood, Nathan; Wright, C. David; Pernice, Wolfram H. P.; Bhaskaran, Harish

    2017-08-01

    The use of photonics in computing is a hot topic of interest, driven by the need for ever-increasing speed along with reduced power consumption. In existing computing architectures, photonic data storage would dramatically improve the performance by reducing latencies associated with electrical memories. At the same time, the rise of `big data' and `deep learning' is driving the quest for non-von Neumann and brain-inspired computing paradigms. To succeed in both aspects, we have demonstrated non-volatile multi-level photonic memory avoiding the von Neumann bottleneck in the existing computing paradigm and a photonic synapse resembling the biological synapses for brain-inspired computing using phase-change materials (Ge2Sb2Te5).

  20. Relate@IU>>>Share@IU: A New and Different Computer-Based Communications Paradigm.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; Roberto, Joseph; Korkmaz, Ali; Oh, Jeong-En; Twal, Riad

    The purpose of this study was to examine problems with the current computer-based electronic communication systems and to initially test and revise a new and different paradigm for e-collaboration, Relate@IU. Understanding the concept of sending links to resources, rather than sending the resource itself, is at the core of how Relate@IU differs…

  1. Application of two neural network paradigms to the study of voluntary employee turnover.

    PubMed

    Somers, M J

    1999-04-01

    Two neural network paradigms--multilayer perceptron and learning vector quantization--were used to study voluntary employee turnover with a sample of 577 hospital employees. The objectives of the study were twofold. The 1st was to assess whether neural computing techniques offered greater predictive accuracy than did conventional turnover methodologies. The 2nd was to explore whether computer models of turnover based on neural network technologies offered new insights into turnover processes. When compared with logistic regression analysis, both neural network paradigms provided considerably more accurate predictions of turnover behavior, particularly with respect to the correct classification of leavers. In addition, these neural network paradigms captured nonlinear relationships that are relevant for theory development. Results are discussed in terms of their implications for future research.
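
    A minimal sketch of the study's comparison, using synthetic class-imbalanced data as a stand-in for the hospital sample and scikit-learn's MLPClassifier for the multilayer perceptron (learning vector quantization has no stock scikit-learn implementation):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import recall_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Synthetic stand-in for turnover data: ~15% leavers, nonlinear boundary.
    X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                               weights=[0.85], flip_y=0.02, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)

    # The study's key comparison: correct classification of leavers (class 1).
    for name, model in [("logistic", logit), ("MLP", mlp)]:
        print(name, "leaver recall:",
              round(recall_score(y_te, model.predict(X_te)), 3))
    ```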

  2. Smart Collaborative Caching for Information-Centric IoT in Fog Computing.

    PubMed

    Song, Fei; Ai, Zheng-Yang; Li, Jun-Jie; Pau, Giovanni; Collotta, Mario; You, Ilsun; Zhang, Hong-Ke

    2017-11-01

    The significant changes enabled by fog computing have demonstrated that the Internet of Things (IoT) urgently needs further evolutionary reforms. Limited by an inflexible design philosophy, the traditional network structure is hard-pressed to meet the latest demands. However, Information-Centric Networking (ICN) is a promising option to bridge and cover these enormous gaps. In this paper, a Smart Collaborative Caching (SCC) scheme is established by leveraging high-level ICN principles for IoT within the fog computing paradigm. The proposed solution is intended to be utilized in resource pooling, content storing, node locating and other related situations. By investigating the available characteristics of ICN, some challenges of such a combination are reviewed in depth. The details of building SCC, including the basic model and advanced algorithms, are presented based on theoretical analysis and simplified examples. The validation focuses on two typical scenarios: simple status inquiry and complex content sharing. The number of clusters, packet loss probability and other parameters are also considered. The analytical results demonstrate that the performance of our scheme, regarding total packet number and average transmission latency, outperforms that of the original ones. We expect that the SCC will contribute an efficient solution to the related studies.

  3. Smart Collaborative Caching for Information-Centric IoT in Fog Computing

    PubMed Central

    Song, Fei; Ai, Zheng-Yang; Li, Jun-Jie; Zhang, Hong-Ke

    2017-01-01

    The significant changes enabled by fog computing have demonstrated that the Internet of Things (IoT) urgently needs further evolutionary reforms. Limited by an inflexible design philosophy, the traditional network structure is hard-pressed to meet the latest demands. However, Information-Centric Networking (ICN) is a promising option to bridge and cover these enormous gaps. In this paper, a Smart Collaborative Caching (SCC) scheme is established by leveraging high-level ICN principles for IoT within the fog computing paradigm. The proposed solution is intended to be utilized in resource pooling, content storing, node locating and other related situations. By investigating the available characteristics of ICN, some challenges of such a combination are reviewed in depth. The details of building SCC, including the basic model and advanced algorithms, are presented based on theoretical analysis and simplified examples. The validation focuses on two typical scenarios: simple status inquiry and complex content sharing. The number of clusters, packet loss probability and other parameters are also considered. The analytical results demonstrate that the performance of our scheme, regarding total packet number and average transmission latency, outperforms that of the original ones. We expect that the SCC will contribute an efficient solution to the related studies. PMID:29104219

  4. An overview of wireless structural health monitoring for civil structures.

    PubMed

    Lynch, Jerome Peter

    2007-02-15

    Wireless monitoring has emerged in recent years as a promising technology that could greatly impact the field of structural monitoring and infrastructure asset management. This paper is a summary of research efforts that have resulted in the design of numerous wireless sensing unit prototypes explicitly intended for implementation in civil structures. Wireless sensing units integrate wireless communications and mobile computing with sensors to deliver a relatively inexpensive sensor platform. A key design feature of wireless sensing units is the collocation of computational power and sensors; the tight integration of computing with a wireless sensing unit provides sensors with the opportunity to self-interrogate measurement data. In particular, there is strong interest in using wireless sensing units to build structural health monitoring systems that interrogate structural data for signs of damage. After the hardware and the software designs of wireless sensing units are completed, the Alamosa Canyon Bridge in New Mexico is utilized to validate their accuracy and reliability. To improve the ability of low-cost wireless sensing units to detect the onset of structural damage, the wireless sensing unit paradigm is extended to include the capability to command actuators and active sensors.

  5. Sublattice parallel replica dynamics.

    PubMed

    Martínez, Enrique; Uberuaga, Blas P; Voter, Arthur F

    2014-06-01

    Exascale computing presents a challenge for the scientific community as new algorithms must be developed to take full advantage of the new computing paradigm. Atomistic simulation methods that offer full fidelity to the underlying potential, i.e., molecular dynamics (MD) and parallel replica dynamics, fail to use the whole machine speedup, leaving a region in time and sample size space that is unattainable with current algorithms. In this paper, we present an extension of the parallel replica dynamics algorithm [A. F. Voter, Phys. Rev. B 57, R13985 (1998)] by combining it with the synchronous sublattice approach of Shim and Amar [Phys. Rev. B 71, 125432 (2005)], thereby exploiting event locality to improve the algorithm scalability. This algorithm is based on a domain decomposition in which events happen independently in different regions in the sample. We develop an analytical expression for the speedup given by this sublattice parallel replica dynamics algorithm and compare it with parallel MD and traditional parallel replica dynamics. We demonstrate how this algorithm, which introduces a slight additional approximation of event locality, enables the study of physical systems unreachable with traditional methodologies and promises to better utilize the resources of current high performance and future exascale computers.

  6. Integration of nanoscale memristor synapses in neuromorphic computing architectures

    NASA Astrophysics Data System (ADS)

    Indiveri, Giacomo; Linares-Barranco, Bernabé; Legenstein, Robert; Deligeorgis, George; Prodromakis, Themistoklis

    2013-09-01

    Conventional neuro-computing architectures and artificial neural networks have often been developed with no or loose connections to neuroscience. As a consequence, they have largely ignored key features of biological neural processing systems, such as their extremely low-power consumption features or their ability to carry out robust and efficient computation using massively parallel arrays of limited precision, highly variable, and unreliable components. Recent developments in nano-technologies are making available extremely compact and low power, but also variable and unreliable solid-state devices that can potentially extend the offerings of availing CMOS technologies. In particular, memristors are regarded as a promising solution for modeling key features of biological synapses due to their nanoscale dimensions, their capacity to store multiple bits of information per element and the low energy required to write distinct states. In this paper, we first review the neuro- and neuromorphic computing approaches that can best exploit the properties of memristor and scale devices, and then propose a novel hybrid memristor-CMOS neuromorphic circuit which represents a radical departure from conventional neuro-computing approaches, as it uses memristors to directly emulate the biophysics and temporal dynamics of real synapses. We point out the differences between the use of memristors in conventional neuro-computing architectures and the hybrid memristor-CMOS circuit proposed, and argue how this circuit represents an ideal building block for implementing brain-inspired probabilistic computing paradigms that are robust to variability and fault tolerant by design.
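
    As a cartoon of how a memristive synapse could store plasticity, here is a pair-based STDP rule with soft conductance bounds. The time constants and amplitudes are invented, and the rule abstracts away both the device physics and the hybrid memristor-CMOS circuit proposed in the paper:

    ```python
    import numpy as np

    # Toy memristive synapse: conductance G is bounded and updated by the
    # relative timing of pre- and post-synaptic spikes (pair-based STDP).
    G_MIN, G_MAX = 0.0, 1.0
    TAU, A_PLUS, A_MINUS = 20.0, 0.05, 0.06   # ms, dimensionless amplitudes

    def stdp_update(G, dt):
        """dt = t_post - t_pre in ms; positive dt (causal pairing) potentiates."""
        if dt > 0:
            dG = A_PLUS * np.exp(-dt / TAU) * (G_MAX - G)   # soft bound above
        else:
            dG = -A_MINUS * np.exp(dt / TAU) * (G - G_MIN)  # soft bound below
        return np.clip(G + dG, G_MIN, G_MAX)

    G = 0.5
    for dt in [+5.0] * 20:     # repeated causal pairings strengthen the synapse
        G = stdp_update(G, dt)
    print(f"after 20 causal pairings: G = {G:.3f}")
    ```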

  7. Hybrid cloud: bridging of private and public cloud computing

    NASA Astrophysics Data System (ADS)

    Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol

    2018-05-01

    Cloud computing has been quickly emerging as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, the level of awareness of data security issues in most businesses is low, since some Cloud Service Providers (CSP) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source in character, which makes it one of the more secure cloud computing models and may thus resolve data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build an actual server and node, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through its public cloud interface. As a result, the design and deployment of HCDM were conducted successfully; besides offering good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source character. This study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant for solving major security issues of IT-based startup companies, especially in Indonesia.

  8. Cancer research in need of a scientific revolution: Using 'paradigm shift' as a method of investigation.

    PubMed

    Wion, Didier; Appaix, Florence; Burruss, Meriwether; Berger, Francois; van der Sanden, Boudewijn

    2015-09-01

    Despite important human and financial resources and considerable accumulation of scientific publications, patents, and clinical trials, cancer research has been slow in achieving a therapeutic revolution similar to the one that occurred in the last century for infectious diseases. It has been proposed that science proceeds not only by accumulating data but also through paradigm shifts. Here, we propose to use the concept of 'paradigm shift' as a method of investigation when dominant paradigms fail to achieve their promises. The first step in using the 'paradigm shift' method in cancer research requires identifying its founding paradigms. In this review, two of these founding paradigms will be discussed: (i) the reification of cancer as a tumour mass and (ii) the translation of concepts from infectious disease into cancer research. We show how these founding paradigms can generate biases that lead to over-diagnosis and over-treatment and also hamper the development of curative cancer therapies. We apply the 'paradigm shift' method to produce perspective reversals consistent with current experimental evidence. The 'paradigm shift' method highlights the existence of a tumour physiologic-prophylactic-pathologic continuum. It integrates the target/antitarget concept and the view that cancer is also an extracellular disease. The 'paradigm shift' method has immediate implications for cancer prevention and therapy. It could be a general method of investigation for other diseases awaiting therapy.

  9. Performance improvement of ERP-based brain-computer interface via varied geometric patterns.

    PubMed

    Ma, Zheng; Qiu, Tianshuang

    2017-12-01

    Recently, many studies have been focusing on optimizing the stimulus of an event-related potential (ERP)-based brain-computer interface (BCI). However, little is known about the effectiveness of increasing the stimulus unpredictability. We investigated a new stimulus type of varied geometric patterns in which both the complexity and the unpredictability of the stimulus are increased. The proposed and classical paradigms were compared in within-subject experiments with 16 healthy participants. Results showed that the BCI performance was significantly improved for the proposed paradigm, with an average online written symbol rate increasing by 138% compared with that of the classical paradigm. Amplitudes of primary ERP components, such as N1, P2a, P2b, and N2, were also found to be significantly enhanced with the proposed paradigm. In this paper, a novel ERP BCI paradigm with a new stimulus type of varied geometric patterns is proposed. By jointly increasing the complexity and unpredictability of the stimulus, the performance of an ERP BCI can be considerably improved.

  10. Challenges and Promises of Overcoming Epistemological and Methodological Partiality: Advancing Engineering Education through Acceptance of Diverse Ways of Knowing

    ERIC Educational Resources Information Center

    Douglas, Elliot P.; Koro-Ljungberg, Mirka; Borrego, Maura

    2010-01-01

    The purpose of this paper is to explore some challenges and promises when the epistemological diversity embedded in qualitative research traditions is introduced to research communities with one dominant research paradigm, such as engineering education. Literature is used from other fields and empirical data are used from engineering education,…

  11. A Tutoring and Student Modelling Paradigm for Gaming Environments.

    ERIC Educational Resources Information Center

    Burton, Richard R.; Brown, John Seely

    This paper describes a paradigm for tutorial systems capable of automatically providing feedback and hints in a game environment. The paradigm is illustrated by a tutoring system for the PLATO game "How the West Was Won." The system uses a computer-based "Expert" player to evaluate a student's moves and construct a "differential model" of the…

  12. Building Bridges across Knowledge Systems: Ubuntu and Participative Research Paradigms in Bantu Communities

    ERIC Educational Resources Information Center

    Muwanga-Zake, J. W. F.

    2009-01-01

    This paper discusses how Ubuntu as a philosophy and a methodology was used among Bantu in South Africa together with participative Western paradigms in evaluating an educational computer game. The paper argues that research among Bantu has to articulate research experiences within Ubuntu paradigms if valid outcomes are to be realised. (Contains 1…

  13. Paradigms for Realizing Machine Learning Algorithms.

    PubMed

    Agneeswaran, Vijay Srinivas; Tonpay, Pranay; Tiwary, Jayati

    2013-12-01

    The article explains the three generations of machine learning algorithms, with all three trying to operate on big data. The first generation tools are SAS, SPSS, etc., while second generation realizations include Mahout and RapidMiner (which work over Hadoop), and the third generation paradigms include Spark and GraphLab, among others. The essence of the article is that for a number of machine learning algorithms, it is important to look beyond Hadoop's Map-Reduce paradigm in order to make them work on big data. A number of promising contenders have emerged in the third generation that can be exploited to realize deep analytics on big data.
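
    Why iterative learners strain the Map-Reduce paradigm is easy to see in a sketch: each gradient step is itself one map (per-partition gradients) plus one reduce (their sum), so training means re-reading the data on every iteration; this repeated pass is what in-memory third-generation engines avoid. A pure-Python illustration with synthetic data:

    ```python
    import numpy as np
    from functools import reduce

    rng = np.random.default_rng(0)
    shards = [rng.standard_normal((1000, 5)) for _ in range(4)]   # data partitions
    truth = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
    labels = [(X @ truth + 0.1 * rng.standard_normal(len(X)) > 0).astype(float)
              for X in shards]

    def grad(w, X, y):
        """Per-shard ('map') gradient of the logistic loss."""
        p = 1.0 / (1.0 + np.exp(-X @ w))
        return X.T @ (p - y)

    w = np.zeros(5)
    for it in range(100):                       # every iteration is a full pass:
        parts = [grad(w, X, y) for X, y in zip(shards, labels)]   # map
        g = reduce(np.add, parts)                                 # reduce
        w -= 0.001 * g
    print(np.round(w / np.linalg.norm(w), 2))   # direction approaches truth/|truth|
    ```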

  14. Heterogeneous concurrent computing with exportable services

    NASA Technical Reports Server (NTRS)

    Sunderam, Vaidy

    1995-01-01

    Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.

  15. Comparing the OpenMP, MPI, and Hybrid Programming Paradigm on an SMP Cluster

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; anMey, Dieter; Hatay, Ferhat F.

    2003-01-01

    With the advent of parallel hardware and software technologies users are faced with the challenge to choose a programming paradigm best suited for the underlying computer architecture. With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors (SMP), parallel programming techniques have evolved to support parallelism beyond a single level. Which programming paradigm is the best will depend on the nature of the given problem, the hardware architecture, and the available software. In this study we will compare different programming paradigms for the parallelization of a selected benchmark application on a cluster of SMP nodes. We compare the timings of different implementations of the same CFD benchmark application employing the same numerical algorithm on a cluster of Sun Fire SMP nodes. The rest of the paper is structured as follows: In section 2 we briefly discuss the programming models under consideration. We describe our compute platform in section 3. The different implementations of our benchmark code are described in section 4 and the performance results are presented in section 5. We conclude our study in section 6.

  16. The paradigm compiler: Mapping a functional language for the connection machine

    NASA Technical Reports Server (NTRS)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.

  17. Assisted closed-loop optimization of SSVEP-BCI efficiency

    PubMed Central

    Fernandez-Vargas, Jacobo; Pfaff, Hanns U.; Rodríguez, Francisco B.; Varona, Pablo

    2012-01-01

    We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, control over the BCI performance depends entirely on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. The innovative assistance consists of the delivery of online information together with the online adaptation of BCI stimulus properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown state and trait dimensions of the subjects. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes interindividual variability into account: e.g., under the new protocol, baseline resting-state EEG measures predict subjects' BCI performances. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might be expanded to innovative uses, e.g., as new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research. PMID:23443214

  19. An auditory brain-computer interface evoked by natural speech

    NASA Astrophysics Data System (ADS)

    Lopez-Gordo, M. A.; Fernandez, E.; Romero, S.; Pelayo, F.; Prieto, Alberto

    2012-06-01

    Brain-computer interfaces (BCIs) are mainly intended for people unable to perform any muscular movement, such as patients in a complete locked-in state. The majority of BCIs interact visually with the user, either in the form of stimulation or biofeedback. However, visual BCIs are limited in their ultimate use because they require subjects to gaze, explore and shift eye-gaze using their muscles, thus excluding patients in a complete locked-in state or under the condition of the unresponsive wakefulness syndrome. In this study, we present a novel fully auditory EEG-BCI based on a dichotic listening paradigm using human voice for stimulation. This interface has been evaluated with healthy volunteers, achieving an average information transmission rate of 1.5 bits min^-1 in full-length trials and 2.7 bits min^-1 using the optimal trial length, recorded with only one channel and without formal training. This novel technique opens the door to more natural communication with users unable to use visual BCIs, with promising results in terms of performance, usability, training and cognitive effort.
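
    Information transmission rates like those quoted above are conventionally computed with the Wolpaw formula, which combines the number of classes, the accuracy, and the trial duration. The sketch below shows that standard formula in Python; the paper's exact computation may differ, and the example numbers are hypothetical.

      import math

      def wolpaw_itr_bits_per_min(n_classes, accuracy, trial_secs):
          """Bits per minute for N equiprobable classes at a given accuracy."""
          n, p = n_classes, accuracy
          bits = math.log2(n)                    # information at perfect accuracy
          if 0.0 < p < 1.0:                      # penalty for classification errors
              bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
          return bits * (60.0 / trial_secs)

      # Hypothetical example: a binary dichotic-listening choice decoded with
      # 85% accuracy from 20-second trials.
      print(wolpaw_itr_bits_per_min(2, 0.85, 20.0))  # ~1.17 bits/min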

  20. Implications of the Java language on computer-based patient records.

    PubMed

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for the robust applications required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to this problem. In this paper we examine the Java language and its implications for CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.

  1. Workplace loyalty in the 1990s.

    PubMed

    Umiker, W

    1995-03-01

    The loyalty paradigm is evolving as part of our cultural modifications. We must enunciate our own standards and temper them with realism. Employers can seldom promise permanent jobs, and employees are reluctant to hitch their stars to one organization. In health care institutions, ethical considerations may result in divergent allegiances. The new loyalty paradigm is affected by the movements toward participative management and team building. A number of suggestions are offered for enhancing a pragmatic form of loyalty.

  2. Investigating true and false confessions within a novel experimental paradigm.

    PubMed

    Russano, Melissa B; Meissner, Christian A; Narchet, Fadia M; Kassin, Saul M

    2005-06-01

    The primary goal of the current study was to develop a novel experimental paradigm with which to study the influence of psychologically based interrogation techniques on the likelihood of true and false confessions. The paradigm involves guilty and innocent participants being accused of intentionally breaking an experimental rule, or "cheating." In the first demonstration of this paradigm, we explored the influence of two common police interrogation tactics: minimization and an explicit offer of leniency, or a "deal." Results indicated that guilty persons were more likely to confess than innocent persons, and that the use of minimization and the offer of a deal increased the rate of both true and false confessions. Police investigators are encouraged to avoid interrogation techniques that imply or directly promise leniency, as they appear to reduce the diagnostic value of any confession that is elicited.

  3. Generating classes of 3D virtual mandibles for AR-based medical simulation.

    PubMed

    Hippalgaonkar, Neha R; Sider, Alexa D; Hamza-Lup, Felix G; Santhanam, Anand P; Jaganathan, Bala; Imielinska, Celina; Rolland, Jannick P

    2008-01-01

    Simulation and modeling represent promising tools for several application domains, from engineering to forensic science and medicine. Advances in 3D imaging technology convey paradigms such as augmented reality (AR) and mixed reality inside promising simulation tools for the training industry. Motivated by the requirement of superimposing anatomically correct 3D models on a human patient simulator (HPS) and visualizing them in an AR environment, the purpose of this research effort was to develop and validate a method for scaling a source human mandible to a target human mandible within a 2 mm root mean square (RMS) error. Results show that, given the distance between the same two landmarks on two different mandibles, a relative scaling factor may be computed. Using this scaling factor, results show that a 3D virtual mandible model can be made morphometrically equivalent to a real target-specific mandible within a 1.30 mm RMS error. The virtual mandible may be further used as a reference target for registering other anatomic models, such as the lungs, on the HPS. Such registration will be made possible by physical constraints between the mandible and the spinal column in the horizontal normal rest position.

  4. Organic-Inorganic Hybrid Ruddlesden-Popper Perovskites: An Emerging Paradigm for High-Performance Light-Emitting Diodes.

    PubMed

    Liu, Xiao-Ke; Gao, Feng

    2018-05-03

    Recently, lead halide perovskite materials have attracted extensive interest, in particular, in the research field of solar cells. These materials are fascinating "soft" materials with semiconducting properties comparable to the best inorganic semiconductors like silicon and gallium arsenide. As one of the most promising perovskite family members, organic-inorganic hybrid Ruddlesden-Popper perovskites (HRPPs) offer rich chemical and structural flexibility for exploring excellent properties for optoelectronic devices, such as solar cells and light-emitting diodes (LEDs). In this Perspective, we present an overview of HRPPs on their structural characteristics, synthesis of pure HRPP compounds and thin films, control of their preferential orientations, and investigations of heterogeneous HRPP thin films. Based on these recent advances, future directions and prospects have been proposed. HRPPs are promising to open up a new paradigm for high-performance LEDs.

  5. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge". The Grid (Foster, 2002; Foster, Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  6. Beyond maximum speed—a novel two-stimulus paradigm for brain-computer interfaces based on event-related potentials (P300-BCI)

    NASA Astrophysics Data System (ADS)

    Kaufmann, Tobias; Kübler, Andrea

    2014-10-01

    Objective. The speed of brain-computer interfaces (BCI) based on event-related potentials (ERP) is inherently limited by the commonly used one-stimulus paradigm. In this paper, we introduce a novel paradigm that can increase the spelling speed by a factor of 2, thereby extending the one-stimulus paradigm to a two-stimulus paradigm. Two different stimuli (a face and a symbol) are presented at the same time, superimposed on different characters, and ERPs are classified using a multi-class classifier. Here, we present a proof of principle obtained with healthy participants. Approach. Eight participants were confronted with the novel two-stimulus paradigm and, for comparison, with two one-stimulus paradigms that used either one of the stimuli. Classification accuracies (percentage of correctly predicted letters) and elicited ERPs from the three paradigms were compared in a comprehensive offline analysis. Main results. The accuracies slightly decreased with the novel system compared to the established one-stimulus face paradigm. However, the use of two stimuli allowed for spelling at twice the maximum speed of the one-stimulus paradigms, and participants still achieved an average accuracy of 81.25%. This study introduced an alternative way of increasing the spelling speed in ERP-BCIs and illustrated that ERP-BCIs may not yet have reached their speed limit. Future research is needed in order to improve the reliability of the novel approach, as some participants displayed reduced accuracies. Furthermore, a comparison to the most recent BCI systems with individually adjusted, rapid stimulus timing is needed to draw conclusions about the practical relevance of the proposed paradigm. Significance. We introduced a novel two-stimulus paradigm that might be of high value for users who have reached the speed limit with the current one-stimulus ERP-BCI systems.
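
    The algorithmic core of the approach is multi-class (rather than binary target/non-target) classification of ERP epochs. The toy sketch below is not the paper's pipeline; it uses synthetic feature vectors and a scikit-learn linear discriminant classifier simply to illustrate how one flash can carry information about two superimposed stimuli when three epoch classes are distinguished.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      n_epochs, n_features = 300, 64           # hypothetical epoch feature vectors
      X = rng.normal(size=(n_epochs, n_features))
      y = rng.integers(0, 3, size=n_epochs)    # 0=non-target, 1=face, 2=symbol

      # Shift the class means apart so the toy problem is learnable.
      X[y == 1, :8] += 1.0    # face-evoked ERP components
      X[y == 2, 8:16] += 1.0  # symbol-evoked ERP components

      clf = LinearDiscriminantAnalysis().fit(X[:200], y[:200])
      print("held-out accuracy:", clf.score(X[200:], y[200:]))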

  7. Cognitive remediation versus active computer control in bipolar disorder with psychosis: study protocol for a randomized controlled trial.

    PubMed

    Lewandowski, Kathryn Eve; Sperry, Sarah H; Ongur, Dost; Cohen, Bruce M; Norris, Lesley A; Keshavan, Matcheri S

    2016-03-12

    Cognitive dysfunction is a major feature of bipolar disorder with psychosis and is strongly associated with functional outcomes. Computer-based cognitive remediation has shown promise in improving cognition in patients with schizophrenia. However, despite similar neurocognitive deficits between patients with schizophrenia and bipolar disorder, few studies have extended neuroscience-based cognitive remediation programs to this population. The Treatment to Enhance Cognition in Bipolar Disorder study is an investigator-initiated, parallel group, randomized, blinded clinical trial of an Internet-based cognitive remediation protocol for patients with bipolar disorder I with psychosis (n = 100). We also describe the development of our dose-matched active control paradigm. Both conditions involve 70 sessions of computer-based activities over 24 weeks. The control intervention was developed to mirror the treatment condition in dose and format but without the neuroplasticity-based task design and structure. All participants undergo neuropsychological and clinical assessment at baseline, after approximately 25 hours of study activities, post treatment, and after 6 months of no study contact to assess durability. Neuroimaging at baseline and post treatment are offered in an "opt-in" format. The primary outcomes are scores on the MATRICS battery; secondary and exploratory outcomes include measures of clinical symptoms, community functioning, and neuroimaging changes. Associations between change in cognitive measures and change in community functioning will be assessed. Baseline predictors of treatment response will be examined. The present study is the first we are aware of to implement an Internet-based cognitive remediation program in patients with bipolar disorder with psychosis and to develop a comparable web-based control paradigm. The mixed online and study-site format allows accessible treatment while providing weekly staff contact and bridging. Based on user-provided feedback, participant blinding is feasible. ClinicalTrials.gov NCT01470781 ; 11 July 2011.

  8. Computing with dynamical systems based on insulator-metal-transition oscillators

    NASA Astrophysics Data System (ADS)

    Parihar, Abhinav; Shukla, Nikhil; Jerry, Matthew; Datta, Suman; Raychowdhury, Arijit

    2017-04-01

    In this paper, we review recent work on novel computing paradigms using coupled oscillatory dynamical systems. We explore systems of relaxation oscillators based on linear state transitioning devices, which switch between two discrete states with hysteresis. By harnessing the dynamics of complex, connected systems, we embrace the philosophy of "let physics do the computing" and demonstrate how complex phase and frequency dynamics of such systems can be controlled, programmed, and observed to solve computationally hard problems. Although our discussion in this paper is limited to insulator-to-metallic state transition devices, the general philosophy of such computing paradigms can be translated to other media, including optical systems. We present the mathematical treatment necessary to understand the time evolution of these systems and demonstrate through recent experimental results the potential of such computational primitives.

  9. Reinforced Adversarial Neural Computer for de Novo Molecular Design.

    PubMed

    Putin, Evgeny; Asadulaev, Arip; Ivanenkov, Yan; Aladinskiy, Vladimir; Sanchez-Lengeling, Benjamin; Aspuru-Guzik, Alán; Zhavoronkov, Alex

    2018-06-12

    In silico modeling is a crucial milestone in modern drug design and development. Although computer-aided approaches in this field are well studied, the application of deep learning methods in this research area is still in its early stages. In this work, we present an original deep neural network (DNN) architecture named RANC (Reinforced Adversarial Neural Computer) for the de novo design of novel small-molecule organic structures, based on the generative adversarial network (GAN) paradigm and reinforcement learning (RL). As a generator, RANC uses a differentiable neural computer (DNC), a category of neural networks with an explicit memory bank, which increases generation capability and can mitigate common problems found in adversarial settings. The comparative results have shown that RANC, trained on the SMILES string representation of the molecules, outperforms its first DNN-based counterpart ORGANIC by several metrics relevant to drug discovery: the number of unique structures, passing medicinal chemistry filters (MCFs), Muegge criteria, and high QED scores. RANC is able to generate structures that match the distributions of the key chemical features/descriptors (e.g., MW, logP, TPSA) and lengths of the SMILES strings in the training data set. Therefore, RANC can reasonably be regarded as a promising starting point for developing novel molecules with activity against different biological targets or pathways. In addition, this approach allows scientists to save time and covers a broad chemical space populated with novel and diverse compounds.

  10. Statistical Control Paradigm for Aerospace Structures Under Impulsive Disturbances

    DTIC Science & Technology

    2006-08-03

    Results indicate that the existing attitude control system, with an innovative and robust statistical controller design, shows significant promise for use in attitude hold mode operation. Three thrusters are used to control the attitude of the satellite.

  11. Towards pattern generation and chaotic series prediction with photonic reservoir computers

    NASA Astrophysics Data System (ADS)

    Antonik, Piotr; Hermans, Michiel; Duport, François; Haelterman, Marc; Massar, Serge

    2016-03-01

    Reservoir Computing is a bio-inspired computing paradigm for processing time-dependent signals that is particularly well suited for analog implementations. Our team has demonstrated several photonic reservoir computers with performance comparable to digital algorithms on a series of benchmark tasks such as channel equalisation and speech recognition. Recently, we showed that our opto-electronic reservoir computer could be trained online with a simple gradient descent algorithm programmed on an FPGA chip. This setup makes it in principle possible to feed the output signal back into the reservoir, and thus highly enrich the dynamics of the system. This makes it possible to tackle complex prediction tasks in hardware, such as pattern generation and chaotic and financial series prediction, which have so far only been studied in digital implementations. Here we report simulation results of our opto-electronic setup with an FPGA chip and output feedback applied to pattern generation and Mackey-Glass chaotic series prediction. The simulations take into account the major aspects of our experimental setup. We find that pattern generation can be easily implemented on the current setup with very good results. The Mackey-Glass series prediction task is more complex and requires a large reservoir and a more elaborate training algorithm. With these adjustments promising results are obtained, and we now know what improvements are needed to match previously reported numerical results. These simulation results will serve as a basis of comparison for the experiments we will carry out in the coming months.
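
    As a purely software illustration of the output-feedback idea, the minimal echo-state-network sketch below performs pattern generation in numpy: the reservoir is first driven by the desired pattern (teacher forcing), a linear readout is trained, and the readout output is then fed back into the reservoir so the system free-runs. All sizes and constants are illustrative, and the offline ridge-regression training stands in for the online FPGA training described above.

      import numpy as np

      rng = np.random.default_rng(42)
      N, T_train, T_free = 200, 1000, 200

      W = rng.normal(size=(N, N))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius 0.9
      w_fb = rng.uniform(-1.0, 1.0, N)                    # output-feedback weights

      target = np.sin(0.2 * np.arange(T_train + T_free + 1))  # pattern to generate

      # Teacher forcing: drive the reservoir with the desired output signal.
      x, states = np.zeros(N), []
      for t in range(T_train):
          x = np.tanh(W @ x + w_fb * target[t])
          states.append(x.copy())
      S = np.asarray(states)

      # Ridge-regression readout trained to predict the next target value
      # (the experimental setup instead trains online on the FPGA).
      w_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N),
                              S.T @ target[1:T_train + 1])

      # Free run: the readout output replaces the teacher signal.
      y, preds = target[T_train], []
      for _ in range(T_free):
          x = np.tanh(W @ x + w_fb * y)
          y = x @ w_out
          preds.append(y)

      err = np.asarray(preds) - target[T_train + 1:T_train + 1 + T_free]
      print("free-run RMSE:", np.sqrt(np.mean(err ** 2)))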

  12. Protein Modelling: What Happened to the “Protein Structure Gap”?

    PubMed Central

    Schwede, Torsten

    2013-01-01

    Computational modeling and prediction of three-dimensional macromolecular structures and complexes from their sequence has been a long-standing vision in structural biology, as it holds the promise to bypass part of the laborious process of experimental structure solution. Over the last two decades, a paradigm shift has occurred: starting from a situation where the “structure knowledge gap” between the huge number of protein sequences and the small number of known structures hampered the widespread use of structure-based approaches in life science research, today some form of structural information – either experimental or computational – is available for the majority of amino acids encoded by common model organism genomes. Template-based homology modeling techniques have matured to a point where they are now routinely used to complement experimental techniques. With the scientific focus of interest moving towards larger macromolecular complexes and dynamic networks of interactions, the integration of computational modeling methods with low-resolution experimental techniques allows studying large and complex molecular machines. Computational modeling and prediction techniques still face a number of challenges which hamper their more widespread use by non-expert scientists. For example, it is often difficult to convey the underlying assumptions of a computational technique, as well as the expected accuracy and structural variability of a specific model. However, these aspects are crucial to understand the limitations of a model, and to decide which interpretations and conclusions can be supported. PMID:24010712

  13. Enzyme-regulated the changes of pH values for assembling a colorimetric and multistage interconnection logic network with multiple readouts.

    PubMed

    Huang, Yanyan; Ran, Xiang; Lin, Youhui; Ren, Jinsong; Qu, Xiaogang

    2015-04-22

    Based on enzymatic-reaction-triggered changes in pH values and biocomputing, a novel multistage interconnected biological network with multiple easily detectable signal outputs has been developed. Compared with traditional chemical computing, the enzyme-based biological system can overcome the interference between reactions or the incompatibility of individual computing gates, and offers a unique opportunity to assemble multicomponent/multifunctional logic circuitries. Our system included four enzyme inputs: β-galactosidase (β-gal), glucose oxidase (GOx), esterase (Est) and urease (Ur). With the assistance of two signal transducers (gold nanoparticles and acid-base indicators) or a pH meter, the outputs of the biological network can be conveniently read with the naked eye. In contrast to current methods, the approach presented here realizes cost-effective, label-free and colorimetric logic operations without complicated instrumentation. By designing a series of Boolean logic operations, we can logically judge the composition of samples on the basis of visual output signals. Our work offers a promising paradigm for future biological computing technology and might be highly useful in future intelligent diagnostics, prodrug activation, smart drug delivery, process control, and electronic applications.

  14. Sensitivity to Social Contingency in Adults with High-Functioning Autism during Computer-Mediated Embodied Interaction.

    PubMed

    Zapata-Fonseca, Leonardo; Froese, Tom; Schilbach, Leonhard; Vogeley, Kai; Timmermans, Bert

    2018-02-08

    Autism Spectrum Disorder (ASD) can be understood as a social interaction disorder. This makes the emerging "second-person approach" to social cognition a more promising framework for studying ASD than classical approaches focusing on mindreading capacities in detached, observer-based arrangements. According to the second-person approach, embodied, perceptual, and embedded or interactive capabilities are also required for understanding others, and these are hypothesized to be compromised in ASD. We therefore recorded the dynamics of real-time sensorimotor interaction in pairs of control participants and participants with High-Functioning Autism (HFA), using the minimalistic human-computer interface paradigm known as "perceptual crossing" (PC). We investigated whether HFA is associated with impaired detection of social contingency, i.e., a reduced sensitivity to the other's responsiveness to one's own behavior. Surprisingly, our analysis reveals that, at least under the conditions of this highly simplified, computer-mediated, embodied form of social interaction, people with HFA perform as well as controls. This finding supports the increasing use of virtual reality interfaces for helping people with ASD to better compensate for their social disabilities. Further dynamical analyses are necessary for a better understanding of the mechanisms that lead to these somewhat surprising results.

  15. Optical studies of current-induced magnetization switching and photonic quantum states

    NASA Astrophysics Data System (ADS)

    Lorenz, Virginia

    2017-04-01

    The ever-decreasing size of electronic components is leading to a fundamental change in the way computers operate, as at the few-nanometer scale, resistive heating and quantum mechanics prohibit efficient and stable operation. One of the most promising next-generation computing paradigms is Spintronics, which uses the spin of the electron to manipulate and store information in the form of magnetic thin films. I will present our optical studies of the fundamental mechanisms by which we can efficiently manipulate magnetization using electrical current. Although electron spin is a quantum-mechanical property, Spintronics relies on macroscopic magnetization and thus does not take advantage of quantum mechanics in the algorithms used to encode and transmit information. For the second part of my talk, I will present our work under the umbrella of new computing and communication technologies based on the quantum mechanical properties of photons. Quantum technologies often require the carriers of information, or qubits, to have specific properties. Photonic quantum states are good information carriers because they travel fast and are robust to environmental fluctuations, but characterizing and controlling photonic sources so the photons have just the right properties is still a challenge. I will describe our work towards enabling quantum-physics-based secure long-distance communication using photons.

  16. Turbomachinery computational fluid dynamics: asymptotes and paradigm shifts.

    PubMed

    Dawes, W N

    2007-10-15

    This paper reviews the development of computational fluid dynamics (CFD) specifically for turbomachinery simulations and with a particular focus on application to problems with complex geometry. The review is structured by considering this development as a series of paradigm shifts, followed by asymptotes. The original S1-S2 blade-blade-throughflow model is briefly described, followed by the development of two-dimensional then three-dimensional blade-blade analysis. This in turn evolved from inviscid to viscous analysis and then from steady to unsteady flow simulations. This development trajectory led over a surprisingly small number of years to an accepted approach: a 'CFD orthodoxy'. A very important current area of intense interest and activity in turbomachinery simulation is in accounting for real geometry effects, not just in the secondary air and turbine cooling systems but also associated with the primary path. The requirements here are threefold: capturing and representing these geometries in a computer model; making rapid design changes to these complex geometries; and managing the very large associated computational models on PC clusters. Accordingly, the challenges in the application of the current CFD orthodoxy to complex geometries are described in some detail. The main aim of this paper is to argue that the current CFD orthodoxy is on a new asymptote and is not in fact suited for application to complex geometries and that a paradigm shift must be sought. In particular, the new paradigm must be geometry centric and inherently parallel without serial bottlenecks. The main contribution of this paper is to describe such a potential paradigm shift, inspired by the animation industry, based on a fundamental shift in perspective from explicit to implicit geometry and then illustrate this with a number of applications to turbomachinery.

  17. Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.

    PubMed

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions.

  18. Language-learning disabilities: Paradigms for the nineties.

    PubMed

    Wiig, E H

    1991-01-01

    We are beginning a decade during which many traditional paradigms in education, special education, and speech-language pathology will undergo change. Among paradigms considered promising for speech-language pathology in the schools are collaborative language intervention and strategy training for language and communication. This presentation introduces management models for developing a collaborative language intervention process, among them the Deming Management Method for Total Quality (TQ) (Deming 1986). Implementation models for language assessment and IEP planning and multicultural issues are also introduced (Damico and Nye 1990; Secord and Wiig in press). While attention to the processes involved in developing and implementing collaborative language intervention is paramount, content should not be neglected. To this end, strategy training for language and communication is introduced as a viable paradigm. Macro- and micro-level process models for strategy training are featured and general issues are discussed (Ellis, Deshler, and Schumaker 1989; Swanson 1989; Wiig 1989).

  19. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    PubMed

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
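
    The article's formula ships with MATLAB code; as a language-agnostic illustration of the same ideal-observer idea, the Monte Carlo sketch below estimates the maximum Pc for a two-alternative forced-choice (2AFC) task with arbitrary scipy distributions. It is a stand-in written for this summary, not a transcription of the article's code: the optimal rule chooses the interval with the larger likelihood ratio, and for equal-variance Gaussians the estimate can be checked against the closed form Pc = Φ(d′/√2).

      import numpy as np
      from scipy import stats

      def max_pc_2afc(signal_dist, noise_dist, n_trials=200_000, seed=0):
          """Monte Carlo maximum proportion correct for a 2AFC task."""
          rng = np.random.default_rng(seed)
          xs = signal_dist.rvs(size=n_trials, random_state=rng)  # signal interval
          xn = noise_dist.rvs(size=n_trials, random_state=rng)   # noise interval
          # Log-likelihood ratio of each observation (signal vs noise).
          llr = lambda x: signal_dist.logpdf(x) - noise_dist.logpdf(x)
          # The ideal observer is correct when the signal interval has the
          # larger likelihood ratio (ties are negligible for continuous data).
          return np.mean(llr(xs) > llr(xn))

      # Equal-variance Gaussian check: d' = 1 gives Pc = Phi(1/sqrt(2)) ~ 0.760.
      print(max_pc_2afc(stats.norm(1, 1), stats.norm(0, 1)))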

  20. Cloud Computing in Support of Applied Learning: A Baseline Study of Infrastructure Design at Southern Polytechnic State University

    ERIC Educational Resources Information Center

    Conn, Samuel S.; Reichgelt, Han

    2013-01-01

    Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…

  1. Optimally stopped variational quantum algorithms

    NASA Astrophysics Data System (ADS)

    Vinci, Walter; Shabani, Alireza

    2018-04-01

    Quantum processors promise a paradigm shift in high-performance computing which needs to be assessed by accurate benchmarking measures. In this article, we introduce a benchmark for the variational quantum algorithm (VQA), recently proposed as a heuristic algorithm for small-scale quantum processors. In VQA, a classical optimization algorithm guides the processor's quantum dynamics to yield the best solution for a given problem. A complete assessment of the scalability and competitiveness of VQA should take into account both the quality and the time of dynamics optimization. The method of optimal stopping, employed here, provides such an assessment by explicitly including time as a cost factor. Here, we showcase this measure for benchmarking VQA as a solver for quadratic unconstrained binary optimization (QUBO) problems. Moreover, we show that a better choice for the cost function of the classical routine can significantly improve the performance of the VQA and even its scaling properties.
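
    To make the hybrid loop concrete, here is a toy sketch of the classical side of a VQA on a QUBO instance. The quantum processor is replaced by a stub that samples bitstrings from a product distribution, and the update rule is a simple elite-sample heuristic, so none of this reproduces the paper's algorithm; it only illustrates the structure being benchmarked. An optimal-stopping benchmark would monitor the best cost found against elapsed time and halt when further search stops paying off.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 8
      Q = rng.normal(size=(n, n))
      Q = (Q + Q.T) / 2                        # random symmetric QUBO matrix

      def qubo_cost(x):
          return x @ Q @ x

      def sample_bitstrings(theta, shots=64):
          """Stub for the quantum processor: bit i is 1 with prob. sigmoid(theta_i)."""
          p = 1.0 / (1.0 + np.exp(-theta))
          return (rng.random((shots, n)) < p).astype(float)

      theta, best = np.zeros(n), np.inf
      for step in range(200):                  # the classical optimization loop
          samples = sample_bitstrings(theta)
          costs = np.array([qubo_cost(x) for x in samples])
          best = min(best, costs.min())
          # Nudge the sampling distribution toward the lowest-cost samples.
          elite = samples[costs <= np.quantile(costs, 0.2)]
          theta += 0.5 * (2.0 * elite.mean(axis=0) - 1.0 - np.tanh(theta))
          # An optimal-stopping rule would weigh 'best' against time spent here.

      print("best QUBO cost found:", best)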

  2. Integrated logic circuits using single-atom transistors

    PubMed Central

    Mol, J. A.; Verduijn, J.; Levine, R. D.; Remacle, F.

    2011-01-01

    Scaling down the size of computing circuits is about to reach the limitations imposed by the discrete atomic structure of matter. Reducing the power requirements and thereby dissipation of integrated circuits is also essential. New paradigms are needed to sustain the rate of progress that society has become used to. Single-atom transistors, SATs, cascaded in a circuit are proposed as a promising route that is compatible with existing technology. We demonstrate the use of quantum degrees of freedom to perform logic operations in a complementary-metal–oxide–semiconductor device. Each SAT performs multilevel logic by electrically addressing the electronic states of a dopant atom. A single electron transistor decodes the physical multivalued output into the conventional binary output. A robust scalable circuit of two concatenated full adders is reported, where by utilizing charge and quantum degrees of freedom, the functionality of the transistor is pushed far beyond that of a simple switch. PMID:21808050

  3. Angelcare mobile system: homecare patient monitoring using bluetooth and GPRS.

    PubMed

    Ribeiro, Anna G D; Maitelli, Andre L; Valentim, Ricardo A M; Brandao, Glaucio B; Guerreiro, Ana M G

    2010-01-01

    Rapid technological progress has brought new paradigms to computing, and with them many benefits to society. The paradigm of ubiquitous computing embeds computing in people's daily lives without being noticed, by combining several existing technologies such as wireless communication and sensors. Several of these benefits have reached the medical area, bringing new methods for surgery, appointments and examinations. This work presents telemedicine software that brings the idea of ubiquity to the medical area, changing the relationship between doctor and patient. It also brings security and confidence to a patient being monitored in homecare.

  4. Collaborative Working Architecture for IoT-Based Applications.

    PubMed

    Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus

    2018-05-23

    New sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructure for performing the computations and outsourcing work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, there is still much research to be done to properly gear all of these systems to work together. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels which combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
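
    At the core of the MCC paradigm is the decision of whether a task should run on the constrained IoT device or be offloaded to a cloud level. The back-of-envelope sketch below illustrates one common latency-based formulation of that decision; all parameters (clock rates, uplink bandwidth, round-trip time) are illustrative and not taken from the paper.

      from dataclasses import dataclass

      @dataclass
      class Task:
          cycles: float      # CPU cycles the task requires
          input_bits: float  # data to upload if the task is offloaded

      def local_time(task, device_hz):
          return task.cycles / device_hz

      def offload_time(task, cloud_hz, uplink_bps, rtt_s):
          return task.input_bits / uplink_bps + task.cycles / cloud_hz + rtt_s

      task = Task(cycles=5e9, input_bits=2e6)
      t_local = local_time(task, device_hz=1e9)          # 5.0 s on the device
      t_cloud = offload_time(task, cloud_hz=2e10,
                             uplink_bps=1e6, rtt_s=0.1)  # ~2.35 s via the cloud
      print("offload" if t_cloud < t_local else "run locally")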

  5. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  6. Recent Developments in Toxico-Cheminformatics; Supporting a New Paradigm for Predictive Toxicology

    EPA Science Inventory

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through the harnessing of legacy toxicity data, creation of data linkages, and generation of new high-content and high-throughput screening d...

  7. Advances in Toxico-Cheminformatics: Supporting a New Paradigm for Predictive Toxicology

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through the harnessing of legacy toxicity data, creation of data linkages, and generation of new high-throughput screening (HTS) data. The D...

  8. Region based Brain Computer Interface for a home control application.

    PubMed

    Akman Aydin, Eda; Bay, Omer Faruk; Guler, Inan

    2015-08-01

    Environment control is one of the important challenges for disabled people who suffer from neuromuscular diseases. A Brain Computer Interface (BCI) provides a communication channel between the human brain and the environment without requiring any muscular activation. The most important expectations for a home control application are high accuracy and reliable control. The region-based paradigm is a stimulus paradigm based on the oddball principle that requires selection of a target at two levels. This paper presents an application of the region-based paradigm to smart home control for people with neuromuscular diseases. In this study, a region-based stimulus interface containing 49 commands was designed. Five non-disabled subjects participated in the experiments. Offline analysis of the experiments yielded 95% accuracy for five flashes. This result showed that the region-based paradigm can be used to select the commands of a smart home control application with high accuracy and a low number of repetitions. Furthermore, no statistically significant difference was observed between the accuracies of the two levels.

  9. The Personal Software Process: Downscaling the factory

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.

    1994-01-01

    It is argued that the next wave of software process improvement (SPI) activities will be based on a people-centered paradigm. The most promising such paradigm, Watts Humphrey's personal software process (PSP), is summarized and its advantages are listed. The concepts of the PSP are also shown to fit a down-scaled version of Basili's experience factory. The author's data and lessons learned while practicing the PSP are presented, along with personal experience, observations, and advice from the perspective of a consultant and teacher of the personal software process.

  10. Paradigms of psychiatry: eclecticism and its discontents.

    PubMed

    Ghaemi, Seyyed Nassir

    2006-11-01

    To assess the paradigms of psychiatry and examine their strengths and limitations. The biopsychosocial model, and eclecticism in general, serves as the primary paradigm of mainstream contemporary psychiatry. In the past few decades, the biopsychosocial model served as a cease-fire between the biological and psychoanalytic extremism that characterized much of the 19th and 20th century history of psychiatry. Despite being broad and fostering an 'anything goes' mentality, it fails to provide much guidance as a model. In recent years, the biological school has gained prominence and now is under attack from many quarters. Critics tend toward dogmatism themselves, usually of postmodernist or libertarian varieties. Three alternative approaches are pragmatism, integrationism, and pluralism. Pluralism, as technically defined here based on the work of Karl Jaspers, rejects or accepts different methods but holds that some methods are better than others for specific circumstances or conditions. The compromise paradigm of biopsychosocial eclecticism has failed to sufficiently guide contemporary psychiatry. The concurrent revival of the biological model has led to postmodernist counter-reactions which, though valid in many specifics, promise to replace one ideological dogma with another. New paradigms are needed.

  11. Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.

    NASA Astrophysics Data System (ADS)

    Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca

    2015-12-01

    The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which has proved successful and still meets its goals. However, Grid technology has not spread much over other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their existing computing model with cloud deployments and take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing from the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fairshare-based access to the resources by the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack-based cloud service. The system, exploiting the dynamic partitioning mechanism already being used to enable multicore computing, allowed us to avoid a static split of the computing resources in the Tier-1 farm while permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition according to suitable policies for the request and release of computing resources. Nodes requested into the partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as worker nodes in the batch-system farm to cloud compute nodes made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation, and its integration with our current batch system, LSF.
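
    The sketch below is a deliberately simplified toy of the role-switching policy described above; no real LSF or OpenStack calls are made, and the node names and demand signals are invented for illustration.

      # Illustrative only: no real LSF or OpenStack API calls are made.
      batch_nodes = {f"wn{i:02d}" for i in range(8)}  # batch-farm worker nodes
      cloud_nodes = set()                             # cloud compute nodes

      def rebalance(batch_demand, cloud_demand):
          """Move one node per call toward the role with unmet demand."""
          if cloud_demand > len(cloud_nodes) and batch_nodes:
              node = batch_nodes.pop()   # drain the node from the batch farm...
              cloud_nodes.add(node)      # ...and hand it to cloud tenants
          elif batch_demand > len(batch_nodes) and cloud_nodes:
              node = cloud_nodes.pop()   # release tenant workloads...
              batch_nodes.add(node)      # ...and rejoin the batch farm

      for _ in range(3):                 # cloud demand rises: three nodes move over
          rebalance(batch_demand=4, cloud_demand=3)
      print(sorted(batch_nodes), sorted(cloud_nodes))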

  12. Changing computing paradigms towards power efficiency

    PubMed Central

    Klavík, Pavel; Malossi, A. Cristiano I.; Bekas, Costas; Curioni, Alessandro

    2014-01-01

    Power awareness is fast becoming immensely important in computing, ranging from traditional high-performance computing applications to the new generation of data-centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations, which finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications. PMID:24842033
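
    A standard way to combine low- and high-precision arithmetic for linear systems is mixed-precision iterative refinement: solve cheaply in single precision, then correct the solution using double-precision residuals. The numpy sketch below shows that generic textbook scheme, offered in the spirit of the paper rather than as its actual implementation; the test matrix is made diagonally dominant so the refinement converges.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      A = rng.normal(size=(n, n)) + n * np.eye(n)  # well-conditioned test matrix
      b = rng.normal(size=n)

      A32 = A.astype(np.float32)                   # cheap low-precision copy
      x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)

      for _ in range(5):                           # refinement in high precision
          r = b - A @ x                            # float64 residual
          dx = np.linalg.solve(A32, r.astype(np.float32))
          x += dx.astype(np.float64)

      print("final residual norm:", np.linalg.norm(b - A @ x))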

  13. Challenges and Opportunities in Gen3 Embedded Cooling with High-Quality Microgap Flow

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Avram; Robinson, Franklin L.; Deisenroth, David C.

    2018-01-01

    Gen3, Embedded Cooling, promises to revolutionize thermal management of advanced microelectronic systems by eliminating the sequential conductive and interfacial thermal resistances which dominate the present 'remote cooling' paradigm. Single-phase interchip microfluidic flow with high thermal conductivity chips and substrates has been used successfully to cool single transistors dissipating more than 40 kW/cm², but efficient heat removal from transistor arrays, larger chips, and chip stacks operating at these prodigious heat fluxes would require the use of high vapor fraction (quality), two-phase cooling in intra- and inter-chip microgap channels. The motivation, as well as the challenges and opportunities associated with evaporative embedded cooling in realistic form factors, is the focus of this paper. The paper will begin with a brief review of the history of thermal packaging, reflecting the 70-year 'inward migration' of cooling technology from the computer room, to the rack, and then to the single chip and multichip module with 'remote' or attached air- and liquid-cooled coldplates. Discussion of the limitations of this approach and recent results from single-phase embedded cooling will follow. This will set the stage for discussion of the development challenges associated with application of this Gen3 thermal management paradigm to commercial semiconductor hardware, including dealing with the effects of channel length, orientation, and manifold-driven centrifugal acceleration on the governing behavior.

  14. Integration of drug dosing data with physiological data streams using a cloud computing paradigm.

    PubMed

    Bressan, Nadja; James, Andrew; McGregor, Carolyn

    2013-01-01

    Many drugs are used during the provision of intensive care for the preterm newborn infant. Recommendations for drug dosing in newborns depend upon data from population-based pharmacokinetic research. There is a need to be able to modify drug dosing in response to the preterm infant's response to the standard dosing recommendations. The real-time integration of physiological data with drug dosing data would facilitate individualized drug dosing for these immature infants. This paper proposes the use of a novel computational framework that employs real-time, temporal data analysis for this task. Deployment of the framework within the cloud computing paradigm will enable widespread distribution of individualized drug dosing for newborn infants.

  15. Computational Prosodic Markers for Autism

    ERIC Educational Resources Information Center

    Van Santen, Jan P.H.; Prud'hommeaux, Emily T.; Black, Lois M.; Mitchell, Margaret

    2010-01-01

    We present results obtained with new instrumental methods for the acoustic analysis of prosody to evaluate prosody production by children with Autism Spectrum Disorder (ASD) and Typical Development (TD). Two tasks elicit focal stress--one in a vocal imitation paradigm, the other in a picture-description paradigm; a third task also uses a vocal…

  16. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process level parallelism, thread level parallelization, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architecture (SMA). This analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.

  17. Recent Developments in Toxico-Cheminformatics and Progress Towards a New Paradigm for Predictive Toxicology (2)

    EPA Science Inventory

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through harnessing of legacy toxicity data, creation of data linkages, and generation of new in vitro screening data. In association with EPA...

  18. Recent Developments in Toxico-Cheminformatics and Progress Towards a New Paradigm for Predictive Toxicology

    EPA Science Inventory

    EPA’s Computational Toxicology Center is building capabilities to support a new paradigm for toxicity screening and prediction through harnessing of legacy toxicity data, creation of data linkages, and generation of new in vitro screening data. In association with EPA’s ToxCastTM...

  19. Cloud computing for context-aware enhanced m-Health services.

    PubMed

    Fernandez-Llatas, Carlos; Pileggi, Salvatore F; Ibañez, Gema; Valero, Zoe; Sala, Pilar

    2015-01-01

    m-Health services are increasing their presence in our lives due to the high penetration of new smartphone devices. This new scenario poses new challenges in terms of information accessibility, requiring new paradigms that enable applications to access data in a continuous and ubiquitous way while ensuring the level of privacy required by the kind of data accessed. This paper proposes an architecture based on cloud computing paradigms in order to empower new m-Health applications to enrich their results by providing secure access to user data.

  20. A multiply-add engine with monolithically integrated 3D memristor crossbar/CMOS hybrid circuit.

    PubMed

    Chakrabarti, B; Lastras-Montaño, M A; Adam, G; Prezioso, M; Hoskins, B; Payvand, M; Madhavan, A; Ghofrani, A; Theogarajan, L; Cheng, K-T; Strukov, D B

    2017-02-14

    Silicon (Si) based complementary metal-oxide semiconductor (CMOS) technology has been the driving force of the information-technology revolution. However, the scaling of CMOS technology as per Moore's law has reached a serious bottleneck. Among the emerging technologies, memristive devices are promising for both memory and computing applications. Hybrid CMOS/memristor circuits with the CMOL (CMOS + "Molecular") architecture have been proposed to combine the extremely high density of memristive devices with the robustness of CMOS technology, leading to terabit-scale memory and extremely efficient computing paradigms. In this work, we demonstrate a hybrid 3D CMOL circuit with two layers of memristive crossbars monolithically integrated on a pre-fabricated CMOS substrate. The integrated crossbars can be fully operated through the underlying CMOS circuitry. The memristive devices in both layers exhibit analog switching behavior with controlled tunability and stable multi-level operation. We perform dot-product operations with the 2D and 3D memristive crossbars to demonstrate the applicability of such 3D CMOL hybrid circuits as a multiply-add engine. To the best of our knowledge, this is the first demonstration of a functional 3D CMOL hybrid circuit.
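
    The physics behind the dot-product claim is compact enough to sketch numerically (our illustration, not the paper's code or parameter values): each weight is stored as a device conductance, and by Ohm's and Kirchhoff's laws the column currents of a crossbar driven by row voltages are exactly a matrix-vector multiply-add, computed in one analog step.

        # Hedged sketch of crossbar multiply-add: I = G^T V.
        import numpy as np

        rng = np.random.default_rng(0)
        G = rng.uniform(1e-6, 1e-4, size=(8, 4))  # conductances (S); range assumed
        V = rng.uniform(0.0, 0.5, size=8)         # row read voltages (V)

        I = G.T @ V   # column currents: one analog multiply-add per column
        print(I)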

  1. A multiply-add engine with monolithically integrated 3D memristor crossbar/CMOS hybrid circuit

    PubMed Central

    Chakrabarti, B.; Lastras-Montaño, M. A.; Adam, G.; Prezioso, M.; Hoskins, B.; Cheng, K.-T.; Strukov, D. B.

    2017-01-01

    Silicon (Si) based complementary metal-oxide semiconductor (CMOS) technology has been the driving force of the information-technology revolution. However, the scaling of CMOS technology as per Moore’s law has reached a serious bottleneck. Among the emerging technologies, memristive devices are promising for both memory and computing applications. Hybrid CMOS/memristor circuits with the CMOL (CMOS + “Molecular”) architecture have been proposed to combine the extremely high density of memristive devices with the robustness of CMOS technology, leading to terabit-scale memory and extremely efficient computing paradigms. In this work, we demonstrate a hybrid 3D CMOL circuit with two layers of memristive crossbars monolithically integrated on a pre-fabricated CMOS substrate. The integrated crossbars can be fully operated through the underlying CMOS circuitry. The memristive devices in both layers exhibit analog switching behavior with controlled tunability and stable multi-level operation. We perform dot-product operations with the 2D and 3D memristive crossbars to demonstrate the applicability of such 3D CMOL hybrid circuits as a multiply-add engine. To the best of our knowledge, this is the first demonstration of a functional 3D CMOL hybrid circuit. PMID:28195239

  2. In Silico Chemogenomics Drug Repositioning Strategies for Neglected Tropical Diseases.

    PubMed

    Andrade, Carolina Horta; Neves, Bruno Junior; Melo-Filho, Cleber Camilo; Rodrigues, Juliana; Silva, Diego Cabral; Braga, Rodolpho Campos; Cravo, Pedro Vitor Lemos

    2018-03-08

    Only ~1% of all drug candidates against Neglected Tropical Diseases (NTDs) have reached clinical trials in recent decades, underscoring the need for new, safe, and effective treatments. In this context, drug repositioning, which finds novel indications for approved drugs whose pharmacokinetic and safety profiles are already known, is emerging as a promising strategy for tackling NTDs. Chemogenomics is a direct descendant of the typical drug discovery process, which involves the systematic screening of chemical compounds against drug targets in high-throughput screening (HTS) efforts to identify lead compounds. In contrast to the one-drug-one-target paradigm, however, chemogenomics attempts to identify all potential ligands for all possible targets and diseases. In this review, we summarize current methodological developments in drug repositioning that use state-of-the-art computational ligand- and structure-based chemogenomics approaches. Furthermore, we highlight recent progress in computational drug repositioning for some NTDs, based on curation and modeling of genomic, biological, and chemical data. We also present in-house and other successful examples and suggest possible solutions to existing pitfalls.

  3. Restoration of neurological functions by neuroprosthetic technologies: future prospects and trends towards micro-, nano-, and biohybrid systems.

    PubMed

    Stieglitz, T

    2007-01-01

    Today's applications of neural prostheses that successfully help patients increase their activities of daily living and participate in social life again are relatively simple implants that elicit a definite tissue response and are readily recognized as foreign bodies. The latest developments in genetic engineering, nanotechnology, and materials science have paved the way towards highly complex systems for interfacing the human nervous system. Combinations of neural cells with microimplants promise stable biohybrid interfaces. Nanotechnology opens the door to macromolecular landscapes on implants that mimic the topology and surface interactions of biological cells. Computer science envisions technical cognitive systems that act and react to a changing or adaptive environment through knowledge-based inference mechanisms. Different sciences are starting to interact and to discuss the synergies that arise when methods and paradigms from biology, computer science and engineering, neuroscience, and psychology are combined. They envision an era of "converging technologies" that will completely change the understanding of science and postulate a new vision of humans. In this chapter, these research lines are discussed through examples, along with the societal implications and ethical questions that arise from these new opportunities.

  4. New Toxico-Cheminformatics & Computational Toxicology ...

    EPA Pesticide Factsheets

    EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate data mining and data read-across. The DSSTox Structure-Browser provides structure searchability across the entire published DSSTox toxicity-related inventory, and is enabling linkages between previously isolated toxicity data resources. As of early March 2008, the public DSSTox inventory has been integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. The most recent DSSTox version of the Carcinogenic Potency Database file (CPDBAS) illustrates ways in which various summary definitions of carcinogenic activity can be employed in modeling and data mining. Phase I of the ToxCast™ project is generating high-throughput screening data from several hundred biochemical and cell-based assays for a set of 320 chemicals, mostly pesticide actives, with rich toxicology profiles. Incorporating and expanding traditional SAR concepts into this new high-throughput, data-rich world poses conceptual and practical challenges, but also holds great promise for improving predictive capabilities.

  5. CNTRICS Final Biomarker Selection: Control of Attention

    PubMed Central

    Luck, Steven J.; Ford, Judith M.; Sarter, Martin; Lustig, Cindy

    2012-01-01

    Attention is widely believed to be dysfunctional in schizophrenia. The Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) group previously concluded that the processes involved in the top-down control of attention are particularly impaired in schizophrenia and should be the focus of future research. These processes determine which sources of input should be attended, linking goal representations in prefrontal cortex with more posterior regions that implement the actual selection of attended information. A more recent meeting of the CNTRICS group assessed several paradigms that might be useful for identifying biomarkers of attentional control and that could be used for treatment development and assessment. Two types of paradigms were identified as being particularly promising. In one approach, neural activity is measured (using electroencephalography or functional magnetic resonance imaging) during the period between an attention-directing cue and a target. In a second approach, neural activity is measured under low- and high-distraction conditions. These approaches make it possible to identify the goal representations that guide attention and the interactions between these goal representations and the implementation of selection. Although more basic science research with healthy volunteers and clinical research with schizophrenia patients is needed before these paradigms will be ready to provide clinically useful biomarkers, they hold substantial promise for aiding in the development and assessment of new treatments. PMID:21765166

  6. Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems

    ERIC Educational Resources Information Center

    Bostandjiev, Svetlin Alex I.

    2012-01-01

    The evolution of the Web brought new and interesting problems to computer scientists, which we loosely classify into the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large number of people in a collective-intelligence fashion (e.g., wikis), and performing computations on social…

  7. Behavioural and computational varieties of response inhibition in eye movements.

    PubMed

    Cutsuridis, Vassilis

    2017-04-19

    Response inhibition is the ability to override a planned or an already initiated response. It is the hallmark of executive control, as its deficits favour impulsive behaviours, which may be detrimental to an individual's life. This article reviews behavioural and computational guises of response inhibition, focusing only on inhibition of oculomotor responses. It first reviews behavioural paradigms of response inhibition in eye movement research, namely the countermanding and antisaccade paradigms, both proven to be useful tools for the study of response inhibition in cognitive neuroscience and psychopathology. It then briefly reviews the neural mechanisms of response inhibition in these two behavioural paradigms. Computational models that embody hypotheses and/or theories of the mechanisms underlying performance in both paradigms are discussed, along with a critical analysis of their strengths and weaknesses. All models assume a race between decision processes; which decision process wins the race depends on different mechanisms in each paradigm. Response latency has been shown to be a stochastic process and has proven to be an important measure of the cognitive control processes involved in response stopping in healthy and patient groups. The inhibitory deficits found in different brain diseases, including schizophrenia and obsessive-compulsive disorder, are then reviewed. Finally, new directions are suggested for improving the performance of models of response inhibition by drawing inspiration from the successes of models in other domains. This article is part of the themed issue 'Movement suppression: brain mechanisms for stopping and stillness'.

  8. A PACS archive architecture supported on cloud services.

    PubMed

    Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis

    2012-05-01

    Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on the storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have the financial resources or requirements that justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, is used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a Cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent of the cloud provider, and the core modules were successfully instantiated on two example cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the Cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to set up medical data repositories and can have a significant impact on healthcare institutions by reducing IT infrastructures.
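
    The provider independence described above is an architectural property rather than an algorithm; a minimal sketch of one common way to achieve it (names hypothetical, not taken from the paper) is to code the archive against an abstract store and plug each cloud behind it as an adapter.

        # Hedged sketch: provider-agnostic storage behind one interface.
        from abc import ABC, abstractmethod

        class BlobStore(ABC):
            @abstractmethod
            def put(self, study_id: str, data: bytes) -> None: ...
            @abstractmethod
            def get(self, study_id: str) -> bytes: ...

        class InMemoryStore(BlobStore):
            """Stand-in for an Amazon S3 or Google Storage adapter."""
            def __init__(self):
                self._blobs = {}
            def put(self, study_id, data):
                self._blobs[study_id] = data
            def get(self, study_id):
                return self._blobs[study_id]

        def archive_study(store: BlobStore, study_id: str, dicom_bytes: bytes):
            # Encryption and integrity checks would sit here in a real archive.
            store.put(study_id, dicom_bytes)

        store = InMemoryStore()
        archive_study(store, "study-001", b"DICOM...")
        print(store.get("study-001"))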

  9. The Seismic risk perception in Italy deduced by a statistical sample

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Pessina, Vera; Peruzza, Laura; Cerbara, Loredana; Crescimbene, Cristiana

    2015-04-01

    At the 2014 EGU Assembly we presented the results of a web survey on the perception of seismic risk in Italy. The data were derived from over 8,500 questionnaires coming from all Italian regions. Our questionnaire was built using the semantic differential method (Osgood et al., 1957) with a seven-point Likert scale, and was inspired by the main theoretical approaches to risk perception (the psychometric paradigm, cultural theory, etc.). The results were promising and seemed to clearly indicate an underestimation of seismic risk by the Italian population. On the basis of these promising results, the DPC funded our research for a second year. At the 2015 EGU Assembly we present the results of a new survey based on an Italian statistical sample. The importance of statistical significance at the national scale was also stressed by ISTAT (the Italian National Institute of Statistics), which, considering the study to be of national interest, accepted the "project on the perception of seismic risk" as a pilot study within the National Statistical System (SISTAN) and encouraged our research unit to proceed in this direction. The survey was conducted by a company specialised in population surveys using the CATI method (computer-assisted telephone interviewing). Preliminary results will be discussed. Statistical support was provided by the research partner CNR-IRPPS. This research is funded by the Italian Civil Protection Department (DPC).

  10. Assessing social engagement in heterogeneous groups of zebrafish: a new paradigm for autism-like behavioral responses.

    PubMed

    Maaswinkel, Hans; Zhu, Liqun; Weng, Wei

    2013-01-01

    Because of its highly developed social character, zebrafish is a promising model system for the study of the genetic and neurochemical basis of altered social engagement such as is common in autism and schizophrenia. The traditional shoaling paradigm investigates social cohesion in homogeneous groups of zebrafish. However, the social dynamics of mixed groups is gaining interest from a therapeutic point of view and thus warrants animal modeling. Furthermore, mutant zebrafish are not always available in large numbers. Therefore, we developed a new paradigm that allows exploring shoaling in heterogeneous groups. The effects of MK-801, a non-competitive antagonist of the glutamate N-methyl-D-aspartate (NMDA) receptor, on social cohesion were studied to evaluate the paradigm. The drug has previously been shown to mimic aspects of autism and schizophrenia. Our results show that a single MK-801-treated zebrafish reduced social cohesion of the entire shoal drastically. Preliminary observations suggest that the social dynamics of the shoal as a whole was altered.

  11. Can false memories be corrected by feedback in the DRM paradigm?

    PubMed

    McConnell, Melissa D; Hunt, R Reed

    2007-07-01

    Normal processes of comprehension frequently yield false memories as an unwanted by-product. The simple paradigm now known as the Deese/Roediger-McDermott (DRM) paradigm takes advantage of this fact and has been used to reliably produce false memories for laboratory study. Among the findings from past research is the difficulty of preventing false memories in this paradigm. The purpose of the present experiments was to examine the effectiveness of feedback in correcting false memories. Two experiments were conducted in which participants recalled DRM lists and either received feedback on their performance or did not. A subsequent recall test was administered to assess the effect of feedback. The results showed promising effects of feedback: feedback enhanced both error correction and the propagation of correct recall. The data also replicated earlier studies showing substantial error perseveration following feedback, and they provide new information on the occurrence of errors after feedback. The results are discussed in terms of the activation-monitoring theory of false memory.

  12. Assessing Social Engagement in Heterogeneous Groups of Zebrafish: A New Paradigm for Autism-Like Behavioral Responses

    PubMed Central

    Maaswinkel, Hans; Zhu, Liqun; Weng, Wei

    2013-01-01

    Because of its highly developed social character, zebrafish is a promising model system for the study of the genetic and neurochemical basis of altered social engagement such as is common in autism and schizophrenia. The traditional shoaling paradigm investigates social cohesion in homogeneous groups of zebrafish. However, the social dynamics of mixed groups is gaining interest from a therapeutic point of view and thus warrants animal modeling. Furthermore, mutant zebrafish are not always available in large numbers. Therefore, we developed a new paradigm that allows exploring shoaling in heterogeneous groups. The effects of MK-801, a non-competitive antagonist of the glutamate N-methyl-D-aspartate (NMDA) receptor, on social cohesion were studied to evaluate the paradigm. The drug has previously been shown to mimic aspects of autism and schizophrenia. Our results show that a single MK-801-treated zebrafish reduced social cohesion of the entire shoal drastically. Preliminary observations suggest that the social dynamics of the shoal as a whole was altered. PMID:24116082

  13. The Study of Surface Computer Supported Cooperative Work and Its Design, Efficiency, and Challenges

    ERIC Educational Resources Information Center

    Hwang, Wu-Yuin; Su, Jia-Han

    2012-01-01

    In this study, a Surface Computer Supported Cooperative Work paradigm is proposed. Recently, multitouch technology has become widely available for human-computer interaction. We found it has great potential to facilitate more awareness of human-to-human interaction than personal computers (PCs) in colocated collaborative work. However, other…

  14. Designing Ubiquitous Computing to Enhance Children's Learning in Museums

    ERIC Educational Resources Information Center

    Hall, T.; Bannon, L.

    2006-01-01

    In recent years, novel paradigms of computing have emerged, which enable computational power to be embedded in artefacts and in environments in novel ways. These developments may create new possibilities for using computing to enhance learning. This paper presents the results of a design process that set out to explore interactive techniques,…

  15. An Introductory Course on Service-Oriented Computing for High Schools

    ERIC Educational Resources Information Center

    Tsai, W. T.; Chen, Yinong; Cheng, Calvin; Sun, Xin; Bitter, Gary; White, Mary

    2008-01-01

    Service-Oriented Computing (SOC) is a new computing paradigm that has been adopted by major computer companies as well as government agencies such as the Department of Defense for mission-critical applications. SOC is being used for developing Web and electronic business applications, as well as robotics, gaming, and scientific applications. Yet,…

  16. Assessing Transit Oriented Development Strategies with a New Combined Modal Split and Traffic Assignment Model

    DOT National Transportation Integrated Search

    2017-08-30

    Transit oriented development (TOD) has emerged in recent years as a promising paradigm to promote public transportation, increase active transportation usage, mitigate congestion, and alleviate air pollution. However, there is a lack of analytic stud...

  17. Changing computing paradigms towards power efficiency.

    PubMed

    Klavík, Pavel; Malossi, A Cristiano I; Bekas, Costas; Curioni, Alessandro

    2014-06-28

    Power awareness is fast becoming immensely important in computing, ranging from traditional high-performance computing applications to the new generation of data-centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations, which finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics, and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grained level. In addition, we verify previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications.
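
    The abstract does not reproduce the kernel, but the classic mixed-precision scheme it alludes to is easy to sketch (our assumption of the standard iterative-refinement form, not the authors' code): do the expensive solve in low precision and recover full accuracy by refining the residual in high precision.

        # Minimal sketch: mixed-precision iterative refinement for Ax = b.
        import numpy as np

        def mixed_precision_solve(A, b, tol=1e-12, max_iter=50):
            A32 = A.astype(np.float32)  # low-precision copy: cheaper, lower power
            x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
            for _ in range(max_iter):
                r = b - A @ x           # residual in full float64 precision
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                d = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
                x += d                  # cheap correction, accurate accumulation
            return x

        A = np.random.default_rng(1).normal(size=(100, 100)) + 100 * np.eye(100)
        b = np.ones(100)
        print(np.linalg.norm(A @ mixed_precision_solve(A, b) - b))  # small residual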

  18. School on Cloud: Towards a Paradigm Shift

    ERIC Educational Resources Information Center

    Koutsopoulos, Kostis C.; Kotsanis, Yannis C.

    2014-01-01

    This paper presents the basic concept of the EU Network School on Cloud: namely, that present conditions require a new teaching and learning paradigm based on the integrated dimension of education when considering the use of cloud computing. In other words, it is suggested that there is a need for an integrated approach which is simultaneously…

  19. Social Studies and Emerging Paradigms: Artificial Intelligence and Consciousness Education.

    ERIC Educational Resources Information Center

    Braun, Joseph A., Jr.

    1987-01-01

    Asks three questions: (1) Are machines capable of thinking as people do? (2) How is the thinking of computers similar and different from human thinking? and (3) What exactly is thinking? Examines research in artificial intelligence. Describes the theory and research of consciousness education and discusses an emerging paradigm for human thinking…

  20. Hybrid cloud and cluster computing paradigms for life science applications

    PubMed Central

    2010-01-01

    Background: Clouds and MapReduce have shown themselves to be broadly useful approaches to scientific computing, especially for parallel data-intensive applications. However, they have limited applicability in some areas, such as data mining, because MapReduce performs poorly on problems with the iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud-and-cluster environment. This motivates the design and implementation of an open-source iterative MapReduce system, Twister. Results: Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open-source Twister iterative MapReduce system and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life science applications. Conclusions: The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life science applications. Methods: We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce, and Twister in these different environments. PMID:21210982

  1. Hybrid cloud and cluster computing paradigms for life science applications.

    PubMed

    Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey

    2010-12-21

    Clouds and MapReduce have shown themselves to be broadly useful approaches to scientific computing, especially for parallel data-intensive applications. However, they have limited applicability in some areas, such as data mining, because MapReduce performs poorly on problems with the iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud-and-cluster environment. This motivates the design and implementation of an open-source iterative MapReduce system, Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open-source Twister iterative MapReduce system and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life science applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life science applications. We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce, and Twister in these different environments.
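
    To make the "iterative structure" concrete, here is a single-machine Python sketch (our illustration, not Twister code) of a k-means step written as map (assign each point to its nearest centroid) and reduce (average each cluster), looped to convergence; plain MapReduce re-launches this loop body from scratch at every iteration, which is the inefficiency Twister targets.

        # Hedged sketch: an iterative map/reduce loop (k-means style).
        import numpy as np

        def kmeans_mapreduce(points, k, iters=20):
            centroids = points[:k].copy()
            for _ in range(iters):  # the iterative part
                keys = np.argmin(   # map: key = index of nearest centroid
                    ((points[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
                centroids = np.array([  # reduce: mean of each key's points
                    points[keys == j].mean(axis=0) if np.any(keys == j) else centroids[j]
                    for j in range(k)])
            return centroids

        pts = np.random.default_rng(2).normal(size=(300, 2))
        print(kmeans_mapreduce(pts, k=3))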

  2. Grid computing technology for hydrological applications

    NASA Astrophysics Data System (ADS)

    Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.

    2011-06-01

    Summary: Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and in which complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, whether oriented to tackling scientific challenges or to complex problem solving in engineering, are expected to converge into the so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues, that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large relating to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological surveying by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.

  3. "New Fatherhood" and the Politics of Dependency

    ERIC Educational Resources Information Center

    Shuffelton, Amy

    2014-01-01

    Although "new fatherhood" promises a reconstruction of the domesticity paradigm that positions fathers as breadwinners and mothers as caretakers, it maintains the notion that families are self-supporting entities and thereby neglects the extensive interdependence involved in raising children. As a result, it cannot successfully overturn…

  4. Agoraphobia and Paradigm Strain: A Family Systems Perspective.

    ERIC Educational Resources Information Center

    Shean, Glenn; Rohrbaugh, Michael

    Agoraphobia is an increasingly common, often chronically incapacitating anxiety disorder. Both behavior therapy and pharmacotherapy can be effective in reducing the intensity of agoraphobic symptoms. There are promising new developments, however, from a family systems perspective. Researchers are finding that an agoraphobic's marriage and family…

  5. Neural Bases Of Food Perception: Coordinate-Based Meta-Analyses Of Neuroimaging Studies In Multiple Modalities

    PubMed Central

    Huerta, Claudia I; Sarkar, Pooja R; Duong, Timothy Q.; Laird, Angela R; Fox, Peter T

    2013-01-01

    Objective: The purpose of this study was to compare the results of the three food-cue paradigms most commonly used for functional neuroimaging studies to determine: i) commonalities and differences in the neural response patterns by paradigm; and ii) the relative robustness and reliability of responses to each paradigm. Design and Methods: Functional magnetic resonance imaging (fMRI) studies using standardized stereotactic coordinates to report brain responses to food cues were identified using online databases. Studies were grouped by food-cue modality as: i) tastes (8 studies); ii) odors (8 studies); and iii) images (11 studies). Activation likelihood estimation (ALE) was used to identify statistically reliable regional responses within each stimulation paradigm. Results: Brain response distributions were distinctly different for the three stimulation modalities, corresponding to known differences in the locations of the respective primary and associative cortices. Visual stimulation induced the most robust and extensive responses. The left anterior insula was the only brain region reliably responding to all three stimulus categories. Conclusions: These findings suggest the visual food-cue paradigm as a promising candidate for imaging studies addressing the neural substrate of therapeutic interventions. PMID:24174404

  6. The Use of Computers in Mathematics Education: A Paradigm Shift from "Computer Assisted Instruction" towards "Student Programming"

    ERIC Educational Resources Information Center

    Aydin, Emin

    2005-01-01

    The purpose of this study is to review the changes that computers have on mathematics itself and on mathematics curriculum. The study aims at investigating different applications of computers in education in general, and mathematics education in particular and their applications on mathematics curriculum and on teaching and learning of…

  7. The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.

    PubMed

    Jobe, Thomas H.; Helgason, Cathy M.

    1998-04-01

    Twentieth-century medical science has embraced nineteenth-century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth-century multi-valued logic, together with computational structures that are content-addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex, multi-causal clinical phenomenon: stroke. We show that the fuzzy logic paradigm represented clinical complexity in cerebrovascular disease better than the current methodology based on probability theory. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.

  8. Neuromorphic Computing, Architectures, Models, and Applications. A Beyond-CMOS Approach to Future Computing, June 29-July 1, 2016, Oak Ridge, TN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potok, Thomas; Schuman, Catherine; Patton, Robert

    The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines, from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focused on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, several recommendations emerged on how to proceed.

  9. Automation of learning-set testing - The video-task paradigm

    NASA Technical Reports Server (NTRS)

    Washburn, David A.; Hopkins, William D.; Rumbaugh, Duane M.

    1989-01-01

    Researchers interested in studying discrimination learning in primates have typically utilized variations of the Wisconsin General Test Apparatus (WGTA). In the present experiment, a new testing apparatus for the study of primate learning is proposed. In the video-task paradigm, rhesus monkeys (Macaca mulatta) respond to computer-generated stimuli by manipulating a joystick. Using this apparatus, discrimination learning-set data were obtained for 2 monkeys. Performance on Trial 2 exceeded 80 percent within 200 discrimination learning problems. These data illustrate the utility of the video-task paradigm in comparative research. Additionally, the efficient learning and rich data that characterized this study suggest several advantages of the present testing paradigm over traditional WGTA testing.

  10. Solving Math and Science Problems in the Real World with a Computational Mind

    ERIC Educational Resources Information Center

    Olabe, Juan Carlos; Basogain, Xabier; Olabe, Miguel Ángel; Maíz, Inmaculada; Castaño, Carlos

    2014-01-01

    This article presents a new paradigm for the study of Math and Sciences curriculum during primary and secondary education. A workshop for Education undergraduates at four different campuses (n = 242) was designed to introduce participants to the new paradigm. In order to make a qualitative analysis of the current school methodologies in…

  11. The Paradigm Recursion: Is It More Accessible When Introduced in Middle School?

    ERIC Educational Resources Information Center

    Gunion, Katherine; Milford, Todd; Stege, Ulrike

    2009-01-01

    Recursion is a programming paradigm as well as a problem solving strategy thought to be very challenging to grasp for university students. This article outlines a pilot study, which expands the age range of students exposed to the concept of recursion in computer science through instruction in a series of interesting and engaging activities. In…

  12. Design, challenge, and promise of stimuli-responsive nanoantibiotics

    NASA Astrophysics Data System (ADS)

    Edson, Julius A.; Kwon, Young Jik

    2016-10-01

    Over the past few years, there have been calls for novel antimicrobials to combat the rise of drug-resistant bacteria. While some promising new discoveries have met this call, they are not nearly enough. The major problem is that although these new antimicrobials serve as a short-term solution, they lack the potential to provide a long-term one. The conventional method of creating new antibiotics relies heavily on the discovery of an antimicrobial compound from another microbe. This development paradigm is flawed because microbes can easily transfer a resistance mechanism when faced with an environmental pressure. Furthermore, there is some evidence that the environment of a microbe can provide a hint as to its virulence. Because of this, the use of materials with antimicrobial properties has been garnering interest. Nanoantibiotics (nAbts) provide a new way to circumvent the current paradigm of antimicrobial discovery and present a novel mechanism of attack that microbes have not yet encountered, which may lead to a longer-term solution against the formation of drug resistance. Stimuli-responsiveness allows for environment-specific activation and efficacy of the nAbts, and may also open up new design methods for various applications. These nAbts show promise, but there is still ample work to be done in their development. This review looks at possible ways of improving and optimizing nAbts by making them stimuli-responsive, then considers the challenges ahead and industrial applications.

  13. Instance-based learning: integrating sampling and repeated decisions from experience.

    PubMed

    Gonzalez, Cleotilde; Dutt, Varun

    2011-10-01

    In decisions from experience, there are 2 experimental paradigms: sampling and repeated-choice. In the sampling paradigm, participants sample between 2 options as many times as they want (i.e., the stopping point is variable), observe the outcome with no real consequences each time, and finally select 1 of the 2 options that causes them to earn or lose money. In the repeated-choice paradigm, participants select 1 of the 2 options a fixed number of times and receive immediate outcome feedback that affects their earnings. These 2 experimental paradigms have been studied independently, and different cognitive processes have often been assumed to take place in each, as represented in widely diverse computational models. We demonstrate that behavior in these 2 paradigms relies upon common cognitive processes proposed by the instance-based learning theory (IBLT; Gonzalez, Lerch, & Lebiere, 2003) and that the stopping point is the only difference between the 2 paradigms. A single cognitive model based on IBLT (with an added stopping-point rule in the sampling paradigm) captures human choices and predicts the sequence of choice selections across both paradigms. We integrate the paradigms through quantitative model comparison, where IBLT outperforms the best models created for each paradigm separately. We discuss the implications for the psychology of decision making.
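
    The abstract omits IBLT's mechanics; a commonly cited formulation (our summary of the ACT-R-style equations the theory builds on, not this paper's notation) is that each stored instance i carries an activation, instances are retrieved probabilistically, and the value of option j is a blend of stored outcomes:

        A_i = \ln \sum_{t_p \in T_i} (t - t_p)^{-d} + \sigma\,\ln\frac{1-\gamma_i}{\gamma_i},
        \qquad p_i = \frac{e^{A_i/\tau}}{\sum_l e^{A_l/\tau}},
        \qquad V_j = \sum_i p_i x_i

    where T_i holds the past retrieval times of instance i, d is a decay parameter, \gamma_i is uniform noise, \tau is a temperature, and x_i is the outcome stored in instance i; on this reading, the stopping-point rule for the sampling paradigm is the only paradigm-specific addition.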

  14. A Whirlwind Tour of Computational Geometry.

    ERIC Educational Resources Information Center

    Graham, Ron; Yao, Frances

    1990-01-01

    Described is computational geometry, which uses concepts and results from classical geometry, topology, and combinatorics, as well as standard algorithmic techniques such as sorting and searching, graph manipulations, and linear programming. Also included are special techniques and paradigms. (KR)

  15. Potential Paradigms and Possible Problems for CALL.

    ERIC Educational Resources Information Center

    Phillips, Martin

    1987-01-01

    Describes three models of CALL (computer assisted language learning) activity--games, the expert system, and the prosthetic approaches. A case is made for CALL development within a more instrumental view of the role of computers. (Author/CB)

  16. Location-Based Services in Vehicular Networks

    ERIC Educational Resources Information Center

    Wu, Di

    2013-01-01

    Location-based services have been identified as a promising communication paradigm in highly mobile and dynamic vehicular networks. However, existing mobile ad hoc networking cannot be directly applied to vehicular networking due to differences in traffic conditions, mobility models and network topologies. On the other hand, hybrid architectures…

  17. MAX - An advanced parallel computer for space applications

    NASA Technical Reports Server (NTRS)

    Lewis, Blair F.; Bunker, Robert L.

    1991-01-01

    MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous, event- and data-driven environment. A large-grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamic allocation of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, testing, and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.

  18. Perceptual Measurement in Schizophrenia: Promising Electrophysiology and Neuroimaging Paradigms From CNTRICS

    PubMed Central

    Butler, Pamela D.; Chen, Yue; Ford, Judith M.; Geyer, Mark A.; Silverstein, Steven M.; Green, Michael F.

    2012-01-01

    The sixth meeting of the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia (CNTRICS) focused on selecting promising imaging paradigms for each of the cognitive constructs selected in the first CNTRICS meeting. In the domain of perception, the 2 constructs of interest were “gain control” and “visual integration.” CNTRICS received 6 task nominations for imaging paradigms for gain control and 3 task nominations for integration. The breakout group for perception evaluated the degree to which each of these tasks met prespecified criteria. For gain control, the breakout group believed that one task (mismatch negativity) was already mature and was being incorporated into multisite clinical trials. The breakout group recommended that 1 visual task (steady-state visual evoked potentials to magnocellular- vs parvocellular-biased stimuli) and 2 auditory measures (an event-related potential (ERP) measure of corollary discharge and a functional magnetic resonance imaging (fMRI) version of prepulse inhibition of startle) be adapted for use in clinical trials in schizophrenia research. For visual integration, the breakout group recommended that fMRI and ERP versions of a contour integration test and an fMRI version of a coherent motion test be adapted for use in clinical trials. This manuscript describes the ways in which each of these tasks met the criteria used in the breakout group to evaluate and recommend tasks for further development. PMID:21890745

  19. XAFS investigation of polyamidoxime-bound uranyl contests the paradigm from small molecule studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayes, Richard T.; Piechowicz, Marek; Lin, Zekai

    In this study, limited resource availability and population growth have motivated interest in harvesting valuable metals from unconventional reserves, but developing selective adsorbents for this task requires structural knowledge of metal binding environments. Amidoxime polymers have been identified as the most promising platform for large-scale extraction of uranium from seawater. However, despite more than 30 years of research, the uranyl coordination environment on these adsorbents has not been positively identified. We report the first XAFS investigation of polyamidoxime-bound uranyl, with EXAFS fits suggesting a cooperative chelating model, rather than the tridentate or η2 motifs proposed by small molecule and computational studies. Samples exposed to environmental seawater also display a feature consistent with a μ2-oxo-bridged transition metal in the uranyl coordination sphere, suggesting in situ formation of a specific binding site or mineralization of uranium on the polymer surface. These unexpected findings challenge several long-held assumptions and have significant implications for the development of polymer adsorbents with high selectivity.

  20. XAFS investigation of polyamidoxime-bound uranyl contests the paradigm from small molecule studies

    DOE PAGES

    Mayes, Richard T.; Piechowicz, Marek; Lin, Zekai; ...

    2015-11-12

    In this study, limited resource availability and population growth have motivated interest in harvesting valuable metals from unconventional reserves, but developing selective adsorbents for this task requires structural knowledge of metal binding environments. Amidoxime polymers have been identified as the most promising platform for large-scale extraction of uranium from seawater. However, despite more than 30 years of research, the uranyl coordination environment on these adsorbents has not been positively identified. We report the first XAFS investigation of polyamidoxime-bound uranyl, with EXAFS fits suggesting a cooperative chelating model, rather than the tridentate or η2 motifs proposed by small molecule and computational studies. Samples exposed to environmental seawater also display a feature consistent with a μ2-oxo-bridged transition metal in the uranyl coordination sphere, suggesting in situ formation of a specific binding site or mineralization of uranium on the polymer surface. These unexpected findings challenge several long-held assumptions and have significant implications for the development of polymer adsorbents with high selectivity.

  1. Architectural Principles and Experimentation of Distributed High Performance Virtual Clusters

    ERIC Educational Resources Information Center

    Younge, Andrew J.

    2016-01-01

    With the advent of virtualization and Infrastructure-as-a-Service (IaaS), the broader scientific computing community is considering the use of clouds for their scientific computing needs. This is due to the relative scalability, ease of use, advanced user environment customization abilities, and the many novel computing paradigms available for…

  2. A Semantic Based Policy Management Framework for Cloud Computing Environments

    ERIC Educational Resources Information Center

    Takabi, Hassan

    2013-01-01

    The cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force, and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  3. In the Clouds: The Implications of Cloud Computing for Higher Education Information Technology Governance and Decision Making

    ERIC Educational Resources Information Center

    Dulaney, Malik H.

    2013-01-01

    Emerging technologies challenge the management of information technology in organizations. Paradigm changing technologies, such as cloud computing, have the ability to reverse the norms in organizational management, decision making, and information technology governance. This study explores the effects of cloud computing on information technology…

  4. The Efficacy of the Internet-Based Blackboard Platform in Developmental Writing Classes

    ERIC Educational Resources Information Center

    Shudooh, Yusuf M.

    2016-01-01

    The application of computer-assisted platforms in writing classes is a relatively new paradigm in education. The adoption of computer-assisted writing classes is gaining ground in many western and non-western universities. Numerous issues can be addressed when conducting computer-assisted classes (CAC). However, a few studies conducted to assess…

  5. The Implications of Well-Formedness on Web-Based Educational Resources.

    ERIC Educational Resources Information Center

    Mohler, James L.

    Within all institutions, Web developers are beginning to utilize technologies that make sites more than static information resources. XML (Extensible Markup Language) and XSL (Extensible Stylesheet Language) are key technologies that promise to extend the Web beyond the "information storehouse" paradigm and provide…

  6. Philosophical Hermeneutics: A Tradition with Promise

    ERIC Educational Resources Information Center

    Agrey, Loren G.

    2014-01-01

    For years the predominant paradigm for educational research has been the privileged quantitative data collection and analysis methods which are "de rigueur" in the natural sciences and which are also dominant in the human sciences. An alternative to the approach of a dispassioned observer on the sidelines recording every observation…

  7. A novel hybrid BCI speller based on the incorporation of SSVEP into the P300 paradigm

    NASA Astrophysics Data System (ADS)

    Yin, Erwei; Zhou, Zongtan; Jiang, Jun; Chen, Fanglin; Liu, Yadong; Hu, Dewen

    2013-04-01

    Objective. Although extensive studies have shown improvements in spelling accuracy, the conventional P300 speller often exhibits errors that occur mostly in the same row or column as the target. To address this issue, we propose a novel hybrid brain-computer interface (BCI) approach that incorporates the steady-state visual evoked potential (SSVEP) into the conventional P300 paradigm. Approach. We designed a periodic stimulus mechanism and superimposed it onto the P300 stimuli to increase the difference between symbols in the same row or column. Furthermore, we integrated random flashes and periodic flickers to simultaneously evoke the P300 and SSVEP, respectively. Finally, we developed a hybrid detection mechanism based on the P300 and SSVEP in which target symbols are detected by the fusion of three-dimensional time-frequency features. Main results. The results obtained from 12 healthy subjects show that an online classification accuracy of 93.85% and an information transfer rate of 56.44 bit/min were achieved using the proposed BCI speller in only a single trial. Specifically, 5 of the 12 subjects exhibited an information transfer rate of 63.56 bit/min with an accuracy of 100%. Significance. These pilot studies suggest that the proposed BCI speller achieves better and more stable system performance than the conventional P300 speller and is promising for quick spelling in stimulus-driven BCI applications.
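
    As a concrete (and simplified, assumed rather than taken from the paper) picture of the SSVEP half of the fusion: each flicker frequency can be scored by how much of an epoch's variance is explained by sine/cosine references at that frequency, a lightweight stand-in for the canonical-correlation detectors common in SSVEP work; the hybrid speller would fuse such scores with P300 features.

        # Hedged sketch: frequency scoring for SSVEP detection.
        import numpy as np

        def ssvep_scores(epoch, freqs, fs):
            t = np.arange(len(epoch)) / fs
            scores = {}
            for f in freqs:
                refs = np.column_stack(
                    [np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t)])
                coef, *_ = np.linalg.lstsq(refs, epoch, rcond=None)
                resid = epoch - refs @ coef
                # Fraction of variance explained by this frequency's references.
                scores[f] = 1 - (resid ** 2).sum() / ((epoch - epoch.mean()) ** 2).sum()
            return scores

        fs = 250.0
        t = np.arange(0, 2, 1 / fs)
        epoch = (np.sin(2 * np.pi * 12 * t)
                 + 0.5 * np.random.default_rng(3).normal(size=t.size))
        print(ssvep_scores(epoch, freqs=[10, 12, 15], fs=fs))  # 12 Hz scores highest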

  8. High-Speed Photonic Reservoir Computing Using a Time-Delay-Based Architecture: Million Words per Second Classification

    NASA Astrophysics Data System (ADS)

    Larger, Laurent; Baylón-Fuentes, Antonio; Martinenghi, Romain; Udaltsov, Vladimir S.; Chembo, Yanne K.; Jacquot, Maxime

    2017-01-01

    Reservoir computing, originally referred to as an echo state network or a liquid state machine, is a brain-inspired paradigm for processing temporal information. It involves learning a "read-out" interpretation of the nonlinear transients developed by a high-dimensional dynamical system when the latter is excited by the information signal to be processed. This novel computational paradigm is derived from recurrent neural network and machine learning techniques. It has recently been implemented in photonic hardware for a dynamical system, which opens the path to ultrafast brain-inspired computing. We report on a novel implementation involving an electro-optic phase-delay dynamics designed with off-the-shelf optoelectronic telecom devices, thus providing the targeted wide bandwidth. Computational efficiency is demonstrated experimentally with speech-recognition tasks. State-of-the-art speed performance reaches one million words per second, with a very low word error rate. In addition to this record-speed processing, our investigations have revealed computing-efficiency improvements through yet-unexplored temporal-information-processing techniques, such as simultaneous multisample injection and pitched sampling at the read-out compared to the information "write-in".
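
    The defining trick of reservoir computing, training only the linear read-out, fits in a few lines; the sketch below (our toy echo state network, nothing hardware- or paper-specific) drives a fixed random recurrent network with a signal and fits the read-out by ridge regression on the recorded transients.

        # Hedged sketch: echo state network with a trained linear read-out.
        import numpy as np

        rng = np.random.default_rng(4)
        n_res = 200
        W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
        W = rng.normal(size=(n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

        u = rng.uniform(-1, 1, 1000)  # input signal
        y = np.roll(u, 5)             # toy task: recall the input 5 steps back

        x = np.zeros(n_res)
        states = []
        for ut in u:
            x = np.tanh(W @ x + W_in @ np.array([ut]))  # nonlinear transient
            states.append(x.copy())
        X, Y = np.array(states)[100:], y[100:]          # drop warm-up samples

        ridge = 1e-6                                    # only the read-out is trained
        W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
        print(np.corrcoef(X @ W_out, Y)[0, 1])          # correlation close to 1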

  9. Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.

    2010-06-06

    The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges in the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.
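
    The state-space reading is easy to make concrete (our illustration, not the report's models): a system-dynamics stock-and-flow update with additive Gaussian noise is exactly one time-slice of a linear-Gaussian Dynamic Bayes Net.

        # Hedged sketch: SD stock update as a noisy state transition
        # x_{t+1} = x_t + dt * (inflow - k * x_t) + w_t.
        import numpy as np

        rng = np.random.default_rng(7)
        dt, k, inflow = 0.1, 0.2, 5.0
        x = 0.0
        for _ in range(500):
            w = rng.normal(scale=0.1)          # process noise: the BN side
            x = x + dt * (inflow - k * x) + w  # stock-flow update: the SD side
        print(round(x, 2), "vs deterministic steady state", inflow / k)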

  10. First-Principles Design of Novel Catalytic and Chemoresponsive Materials

    NASA Astrophysics Data System (ADS)

    Roling, Luke T.

    An emerging trend in materials design is the use of computational chemistry tools to accelerate materials discovery and implementation. In particular, the parallel nature of computational models enables high-throughput screening approaches that would be laborious and time-consuming with experiments alone, and can be useful for identifying promising candidate materials for experimental synthesis and evaluation. Additionally, atomic-scale modeling allows researchers to obtain a detailed understanding of phenomena invisible to many current experimental techniques. In this thesis, we highlight mechanistic studies and successes in catalyst design for heterogeneous electrochemical reactions, discussing both anode and cathode chemistries. In particular, we evaluate the properties of a new class of Pd-Pt core-shell and hollow nanocatalysts toward the oxygen reduction reaction. We do not limit our study to electrochemical reactivity, but also consider these catalysts in a broader context by performing in-depth studies of their stability at elevated temperatures as well as investigating the mechanisms by which they are able to form. We also present fundamental surface science studies, investigating graphene formation and H2 dissociation, which are processes of both fundamental and practical interest in many catalytic applications. Finally, we extend our materials design paradigm outside the field of catalysis to develop and apply a model for the detection of small chemical analytes by chemoresponsive liquid crystals, and offer several predictions for improving the detection of small chemicals. A close connection between computation, synthesis, and experimental evaluation is essential to the work described herein, as computations are used to gain fundamental insight into experimental observations, and experiments and synthesis are in turn used to validate predictions of material activities from computational models.

  11. Genetic manipulation for inherited neurodegenerative diseases: myth or reality?

    PubMed Central

    Yu-Wai-Man, Patrick

    2016-01-01

    Rare genetic diseases affect about 7% of the general population and over 7000 distinct clinical syndromes have been described with the majority being due to single gene defects. This review will provide a critical overview of genetic strategies that are being pioneered to halt or reverse disease progression in inherited neurodegenerative diseases. This field of research covers a vast area and only the most promising treatment paradigms will be discussed with a particular focus on inherited eye diseases, which have paved the way for innovative gene therapy paradigms, and mitochondrial diseases, which are currently generating a lot of debate centred on the bioethics of germline manipulation. PMID:27002113

  12. Research to reduce the suicide rate among older adults: methodology roadblocks and promising paradigms

    PubMed Central

    Szanto, Katalin; Lenze, Eric J.; Waern, Margda; Duberstein, Paul; Bruce, Martha L.; Epstein-Lubow, Gary; Conwell, Yeates

    2013-01-01

    The National Institute of Mental Health and the National Action Alliance for Suicide Prevention have requested input into the development of a national suicide research agenda. In response, a working group of the American Association for Geriatric Psychiatry has prepared recommendations to ensure that the suicide prevention dialogue includes older adults, a large and fast-growing population at high risk of suicide. In this Open Forum, the working group describes three methodology roadblocks to research into suicide prevention among elderly persons and three paradigms that might provide directions for future research into suicide prevention strategies for older adults. PMID:23728601

  13. Research to reduce the suicide rate among older adults: methodology roadblocks and promising paradigms.

    PubMed

    Szanto, Katalin; Lenze, Eric J; Waern, Margda; Duberstein, Paul; Bruce, Martha L; Epstein-Lubow, Gary; Conwell, Yeates

    2013-06-01

    The National Institute of Mental Health and the National Action Alliance for Suicide Prevention have requested input into the development of a national suicide research agenda. In response, a working group of the American Association for Geriatric Psychiatry has prepared recommendations to ensure that the suicide prevention dialogue includes older adults, a large and fast-growing population at high risk of suicide. In this Open Forum, the working group describes three methodology roadblocks to research into suicide prevention among elderly persons and three paradigms that might provide directions for future research into suicide prevention strategies for older adults.

  14. A novel channel selection method for optimal classification in different motor imagery BCI paradigms.

    PubMed

    Shan, Haijun; Xu, Haojie; Zhu, Shanan; He, Bin

    2015-10-21

    For sensorimotor rhythm based brain-computer interface (BCI) systems, classification of different motor imageries (MIs) remains a crucial problem. An important aspect is how many scalp electrodes (channels) should be used to reach optimal performance in classifying motor imaginations. While previous research on channel selection has mainly focused on MI task paradigms without feedback, the present work investigates optimal channel selection in MI paradigms with real-time feedback (two-class and four-class control paradigms). In the present study, three datasets, recorded respectively from an MI task experiment and from two-class and four-class control experiments, were analyzed offline. Multiple frequency-spatial synthesized features were comprehensively extracted from every channel, and a new enhanced method, IterRelCen, was proposed to perform channel selection. IterRelCen is built on the Relief algorithm but enhanced in two respects: the target sample selection strategy is changed, and iterative computation is adopted, making it more robust for feature selection. Finally, a multiclass support vector machine was applied as the classifier. The smallest number of channels that yielded the best classification accuracy was considered optimal. One-way ANOVA was employed to test the significance of performance differences among using the optimal channels, all channels, and three typical MI channels (C3, C4, Cz). The results show that the proposed method outperformed other channel selection methods, achieving average classification accuracies of 85.2%, 94.1%, and 83.2% for the three datasets, respectively. Moreover, the channel selection results reveal that the average numbers of optimal channels differed significantly among the three MI paradigms: from the MI task paradigm, to the two-class control paradigm, to the four-class control paradigm, the number of channels required to optimize classification accuracy increased. These findings demonstrate that IterRelCen has a strong ability for feature selection, and may provide useful information for optimizing EEG-based BCI systems and further improving the performance of noninvasive BCIs.
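
    For readers unfamiliar with Relief-style channel ranking, the sketch below shows the base algorithm that IterRelCen builds on. It is plain Relief only: the paper's enhancements (the changed target-sample selection strategy and the iterative computation) are omitted, and the data are synthetic:

    ```python
    import numpy as np

    def relief_weights(X, y, n_iter=100, seed=0):
        """Plain Relief feature weighting (a simplified stand-in for the
        paper's IterRelCen). X: (n_samples, n_features), y: binary labels."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(n_iter):
            i = rng.integers(n)
            same, diff = (y == y[i]), (y != y[i])
            same[i] = False
            dists = np.abs(X - X[i]).sum(axis=1)
            hit = X[same][np.argmin(dists[same])]    # nearest same-class sample
            miss = X[diff][np.argmin(dists[diff])]   # nearest other-class sample
            w += np.abs(X[i] - miss) - np.abs(X[i] - hit)
        return w / n_iter

    # Toy usage: rank 8 hypothetical channels and keep the top 3.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 8))
    y = rng.integers(0, 2, 60)
    X[y == 1, 0] += 1.0                       # make channel 0 informative
    top = np.argsort(relief_weights(X, y))[::-1][:3]
    print("selected channels:", top)
    ```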

  15. New Toxico-Cheminformatics & Computational Toxicology Initiatives At EPA

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling.

  16. The Fate of the Method of 'Paradigms' in Paleobiology.

    PubMed

    Rudwick, Martin J S

    2017-11-02

    An earlier article described the mid-twentieth century origins of the method of "paradigms" in paleobiology, as a way of making testable hypotheses about the functional morphology of extinct organisms. The present article describes the use of "paradigms" through the 1970s and, briefly, to the end of the century. After I had proposed the paradigm method to help interpret the ecological history of brachiopods, my students developed it in relation to that and other invertebrate phyla, notably in Euan Clarkson's analysis of vision in trilobites. David Raup's computer-aided "theoretical morphology" was then combined with my functional or adaptive emphasis, in Adolf Seilacher's tripartite "constructional morphology." Stephen Jay Gould, who had strongly endorsed the method, later switched to criticizing the "adaptationist program" he claimed it embodied. Although the explicit use of paradigms in paleobiology had declined by the end of the century, the method was tacitly subsumed into functional morphology as "biomechanics."

  17. Projection decomposition algorithm for dual-energy computed tomography via deep neural network.

    PubMed

    Xu, Yifu; Yan, Bin; Chen, Jian; Zeng, Lei; Li, Lei

    2018-03-15

    Dual-energy computed tomography (DECT) has been widely used to improve identification of substances from different spectral information. Decomposition of mixed test samples into two basis materials relies on a well-calibrated material decomposition function. This work aims to establish and validate a data-driven algorithm for estimating that decomposition function. A deep neural network (DNN) consisting of two sub-nets is proposed to solve the projection decomposition problem. The compressing sub-net, essentially a stacked auto-encoder (SAE), learns a compact representation of the energy spectrum. The decomposing sub-net, with a two-layer structure, fits the nonlinear transform between energy projection and basis material thickness. The proposed DNN not only delivers images with lower standard deviation and higher quality on both simulated and real data, but also yields the best performance in cases mixed with photon noise. Moreover, the DNN takes only 0.4 s to generate a decomposition solution at a 360 × 512 scale, about 200 times faster than competing algorithms. The DNN model is applicable to decomposition tasks with different dual energies. Experimental results demonstrated the strong function-fitting ability of the DNN. Thus, the deep learning paradigm provides a promising approach to solving the nonlinear problem in DECT.
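
    A structural sketch of the two-sub-net idea follows, assuming PyTorch; the layer sizes, activations, and bin count are illustrative placeholders rather than the published architecture:

    ```python
    import torch
    import torch.nn as nn

    class DECTDecomposer(nn.Module):
        """Sketch of the paper's two-sub-net structure (sizes invented)."""
        def __init__(self, n_bins=64, n_latent=8):
            super().__init__()
            # compressing sub-net: learns a compact spectral representation
            self.compress = nn.Sequential(
                nn.Linear(n_bins, 32), nn.Sigmoid(),
                nn.Linear(32, n_latent), nn.Sigmoid(),
            )
            # decomposing sub-net: small nonlinear fit to thicknesses
            self.decompose = nn.Sequential(
                nn.Linear(n_latent, 16), nn.Sigmoid(),
                nn.Linear(16, 2),        # [thickness_mat1, thickness_mat2]
            )

        def forward(self, proj):
            return self.decompose(self.compress(proj))

    model = DECTDecomposer()
    dummy = torch.rand(4, 64)            # 4 dual-energy projection samples
    print(model(dummy).shape)            # torch.Size([4, 2])
    ```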

  18. The Human Factors and Ergonomics of P300-Based Brain-Computer Interfaces

    PubMed Central

    Powers, J. Clark; Bieliaieva, Kateryna; Wu, Shuohao; Nam, Chang S.

    2015-01-01

    Individuals with severe neuromuscular impairments face many challenges in communication and manipulation of the environment. Brain-computer interfaces (BCIs) show promise in presenting real-world applications that can provide such individuals with the means to interact with the world using only brain waves. Although there has been a growing body of research in recent years, much relates only to technology, and not to technology in use—i.e., real-world assistive technology employed by users. This review examined the literature to highlight studies that implicate the human factors and ergonomics (HFE) of P300-based BCIs. We assessed 21 studies on three topics to speak directly to improving the HFE of these systems: (1) alternative signal evocation methods within the oddball paradigm; (2) environmental interventions to improve user performance and satisfaction within the constraints of current BCI systems; and (3) measures and methods of measuring user acceptance. We found that HFE is central to the performance of P300-based BCI systems, although researchers do not often make explicit this connection. Incorporation of measures of user acceptance and rigorous usability evaluations, increased engagement of disabled users as test participants, and greater realism in testing will help progress the advancement of P300-based BCI systems in assistive applications. PMID:26266424

  19. EPA Project Updates: DSSTox and ToxCast Generating New ...

    EPA Pesticide Factsheets

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate data-mining and data read-across. The DSSTox Structure-Browser, launched in September 2007, provides structure searchability across all published DSSTox toxicity-related inventory and is enabling linkages between previously isolated toxicity data resources. As of early March 2008, the public DSSTox inventory has been integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. The most recent DSSTox version of the Carcinogenic Potency Database file (CPDBAS) illustrates ways in which various summary definitions of carcinogenic activity can be employed in modeling and data mining. Phase I of the ToxCast project is generating high-throughput screening data from several hundred biochemical and cell-based assays for a set of 320 chemicals, mostly pesticide actives, with rich toxicology profiles. Incorporating and expanding traditional SAR concepts into this new high-throughput, data-rich environment poses conceptual and practical challenges, but also holds great promise for improving predictive capabilities.

  20. A Computer Model for Red Blood Cell Chemistry

    DTIC Science & Technology

    1996-10-01

    There is a growing need for interactive computational tools for medical education and research. The most exciting paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at ... to a modern Windows, object-oriented interface. This development will provide students with a useful computational tool for learning. More important ...

  1. Chaos, patterns, coherent structures, and turbulence: Reflections on nonlinear science.

    PubMed

    Ecke, Robert E

    2015-09-01

    The paradigms of nonlinear science were succinctly articulated over 25 years ago as deterministic chaos, pattern formation, coherent structures, and adaptation/evolution/learning. For chaos, the main unifying concept was universal routes to chaos in general nonlinear dynamical systems, built upon a framework of bifurcation theory. Pattern formation focused on spatially extended nonlinear systems, taking advantage of symmetry properties to develop highly quantitative amplitude equations of the Ginzburg-Landau type to describe early nonlinear phenomena in the vicinity of critical points. Solitons, mathematically precise localized nonlinear wave states, were generalized to a larger and less precise class of coherent structures such as, for example, concentrated regions of vorticity from laboratory wake flows to the Jovian Great Red Spot. The combination of these three ideas was hoped to provide the tools and concepts for the understanding and characterization of the strongly nonlinear problem of fluid turbulence. Although this early promise has been largely unfulfilled, steady progress has been made using the approaches of nonlinear science. I provide a series of examples of bifurcations and chaos, of one-dimensional and two-dimensional pattern formation, and of turbulence to illustrate both the progress and limitations of the nonlinear science approach. As experimental and computational methods continue to improve, the promise of nonlinear science to elucidate fluid turbulence continues to advance in a steady manner, indicative of the grand challenge nature of strongly nonlinear multi-scale dynamical systems.
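
    For concreteness, the amplitude equations referred to above are of the complex Ginzburg-Landau type; near a pattern-forming instability, the slowly varying complex amplitude A(x, t) obeys, in rescaled units,

    ```latex
    \partial_t A = A + (1 + i b)\,\nabla^2 A - (1 + i c)\,|A|^2 A ,
    ```

    where b and c parameterize linear and nonlinear dispersion, respectively.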

  2. Computer modeling of prostate cancer treatment. A paradigm for oncologic management?

    PubMed

    Miles, B J; Kattan, M W

    1995-04-01

    This article discusses the relevance of computer modeling to the management of prostate cancer. Several computer modeling techniques are reviewed and the advantages and disadvantages of each are discussed. An example that uses a computer model to compare alternative strategies for clinically localized prostate cancer is examined in detail. The quality of the data used in computer models is critical, and these models play an important role in medical decision making.

  3. Paradigm Paralysis and the Plight of the PC in Education.

    ERIC Educational Resources Information Center

    O'Neil, Mick

    1998-01-01

    Examines the varied factors involved in providing Internet access in K-12 education, including expense, computer installation and maintenance, and security, and explores how the network computer could be useful in this context. Operating systems and servers are discussed. (MSE)

  4. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    ERIC Educational Resources Information Center

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  5. The nonequilibrium quantum many-body problem as a paradigm for extreme data science

    NASA Astrophysics Data System (ADS)

    Freericks, J. K.; Nikolić, B. K.; Frieder, O.

    2014-12-01

    Generating big data pervades much of physics. But some problems, which we call extreme data problems, are too large to be treated within big data science. The nonequilibrium quantum many-body problem on a lattice is just such a problem, where the Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer (and can be effectively thought of as an infinite-sized data set). Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve more accurately and for longer times. We review a number of these different ideas here.
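
    The arithmetic behind treating the Hilbert space as an effectively infinite data set is worth making explicit. A back-of-the-envelope calculation, assuming standard double-precision complex amplitudes:

    ```python
    # The state vector of N spin-1/2 sites holds 2**N complex amplitudes,
    # at 16 bytes per double-precision complex number.
    for n in (30, 40, 50):
        gb = (2 ** n) * 16 / 1e9
        print(f"N={n}: {gb:,.1f} GB")
    # N=30 fits in RAM (~17 GB); N=40 needs ~17.6 TB; N=50 needs ~18 PB,
    # beyond any existing machine, hence effectively an infinite data set.
    ```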

  6. Effects of Person-Centered Attitudes on Professional and Social Competence in a Blended Learning Paradigm

    ERIC Educational Resources Information Center

    Motschnig-Pitrik, Renate; Mallich, Katharina

    2004-01-01

    Web-based technology increases the hours we spend sitting in front of the screens of our computers. But can it also be used in a way to improve our social skills? The blended learning paradigm of Person-Centered e-Learning (PCeL) precisely aims to achieve intellectual as well as social and personal development by combining the benefits of online…

  7. A self-paced brain-computer interface for controlling a robot simulator: an online event labelling paradigm and an extended Kalman filter based algorithm for online training.

    PubMed

    Tsui, Chun Sing Louis; Gan, John Q; Roberts, Stephen J

    2009-03-01

    Due to the non-stationarity of EEG signals, online training and adaptation are essential to EEG-based brain-computer interface (BCI) systems. Self-paced BCIs offer more natural human-machine interaction than synchronous BCIs, but it is a great challenge to train and adapt a self-paced BCI online because the user's control intention and timing are usually unknown. This paper proposes a novel motor imagery based self-paced BCI paradigm for controlling a simulated robot in a specifically designed environment that is able to provide the user's control intention and timing during online experiments, so that online training and adaptation of the motor imagery based self-paced BCI can be effectively investigated. We demonstrate the usefulness of the proposed paradigm with an extended Kalman filter based method to adapt the BCI classifier parameters, presenting experimental results of online self-paced BCI training with four subjects.
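
    The following sketch conveys the flavor of Kalman-filter-based online adaptation of a linear classifier. It uses a plain linear Kalman update on the classifier weights with invented noise settings; the paper's actual method is an extended Kalman filter and differs in detail:

    ```python
    import numpy as np

    d = 3                                # feature dimension (illustrative)
    w = np.zeros(d)                      # classifier weights = filter state
    P = np.eye(d)                        # weight uncertainty
    Q, R = 1e-4 * np.eye(d), 0.5         # drift and observation noise

    rng = np.random.default_rng(0)
    w_true = np.array([1.0, -0.5, 0.3])  # "ground-truth" discriminant
    for _ in range(200):
        x = rng.normal(size=d)                   # EEG feature vector
        y = float(np.sign(w_true @ x))           # known label (online training)
        P = P + Q                                # predict: weights may drift
        k = P @ x / (x @ P @ x + R)              # Kalman gain
        w = w + k * (y - w @ x)                  # correct toward the label
        P = P - np.outer(k, x) @ P
    print("adapted weights:", np.round(w, 2))
    ```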

  8. Promises and Lies: An Exploration of Curriculum Managers' Experiences in FE

    ERIC Educational Resources Information Center

    Thompson, Carol; Wolstencroft, Peter

    2015-01-01

    This article examines the important but under-researched role of the curriculum manager within further education. It reviews managers' perceptions of the role through the lens of the professional-managerial paradigm, with a particular emphasis on the conflict in values experienced by managers trying to implement processes driven by the financial…

  9. Understanding Common Core State Standards

    ERIC Educational Resources Information Center

    Kendall, John S.

    2011-01-01

    Now that the Common Core standards are coming to just about every school, what every school leader needs is a straightforward explanation that lays out the benefits of the Common Core in plain English, provides a succinct overview, and gets everyone thinking about how to transition to this promising new paradigm. This handy, inexpensive booklet…

  10. Mixed Methods Research: What Are the Key Issues to Consider?

    ERIC Educational Resources Information Center

    Ghosh, Rajashi

    2016-01-01

    Mixed methods research (MMR) is increasingly becoming a popular methodological approach in several fields due to the promise it holds for comprehensive understanding of complex problems being researched. However, researchers interested in MMR often lack reference to a guide that can explain the key issues pertaining to the paradigm wars…

  11. Rational Choice Theory and the Politics of Education: Promise and Limitations.

    ERIC Educational Resources Information Center

    Boyd, William Lowe; And Others

    1994-01-01

    Rational choice theory and its three branches (game theory, collective choice theory, and organizational economics) have altered the face of political science, sociology, and organizational theory. This chapter reviews rational choice theory, examines a small body of work that relies on the rational choice paradigm to study educational politics,…

  12. Treating Childhood Anxiety in Schools: Service Delivery in a Response to Intervention Paradigm

    ERIC Educational Resources Information Center

    Sulkowski, Michael L.; Joyce, Diana K.; Storch, Eric A.

    2012-01-01

    Millions of youth who attend schools in the United States suffer from clinically significant anxiety. Left untreated, these students often experience significant disruptions in their academic, social, and family functioning. Fortunately, promising treatments exist for childhood anxiety that are amenable for delivery in school settings. However,…

  13. What's Cooking in the MOOC Kitchen: Layered MOOCs

    ERIC Educational Resources Information Center

    Crosslin, Matt; Wakefield, Jenny S.

    2016-01-01

    During several panel presentations at the AECT Annual Convention in Indianapolis in November 2015, concerns with MOOCs were raised. In this paper the authors discuss a few of those concerns of extra interest, and explain the relatively new customizable dual-layer MOOC course design. This new paradigm of MOOC design holds promise to alleviate some…

  14. Games for Participatory Science: A Paradigm for Game-Based Learning for Promoting Science Literacy

    ERIC Educational Resources Information Center

    Shapiro, R. Benjamin; Squire, Kurt D.

    2011-01-01

    Debates in forums such as "Educational Technology" and the National Academies of Science (National Research Council, 2011) emphasize the promise (and indeed recent successes) of digital game-based learning programs, but also the need for research-driven approaches that carefully delineate learning goals. This article introduces one such…

  15. Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Alruwaili, Manal

    With processor counts growing rapidly, supercomputer-class parallelism will be available on desktops within the next decade. For mass-scale application development on such massively parallel desktop hardware, existing popular languages with large libraries must be augmented with new constructs and paradigms that exploit massive parallel computing and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10, and UPC++, exploit distributed computing, data-parallel computing, and process-level thread parallelism in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) extensions for object distribution that exploit the PGAS model; 2) the flexibility to migrate or clone an object between places for load balancing; or 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different PGAS-model languages; propose new constructs that extend C++ with object distribution, object cloning, and object migration; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked to work concurrently on different elements of distributed data using remote method invocations. I present the new constructs, their grammar, and their behavior, and explain them using simple programs that utilize these constructs.
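
    To make the MIDD idea concrete, here is a loose stand-in using Python processes in place of PGAS places: the same method is invoked concurrently on different partitions of distributed data. This is only an analogy for the proposed C++ constructs, not an implementation of them, and every name in it is illustrative:

    ```python
    from multiprocessing import Pool

    def local_sum(partition):
        """The method invoked at every 'place' on its local partition."""
        return sum(partition)

    def global_sum(data, n_places=4):
        k = len(data) // n_places            # block-partition the data
        partitions = [data[i * k:(i + 1) * k] for i in range(n_places)]
        with Pool(n_places) as pool:         # one worker per place
            # MIDD flavor: the same invocation runs concurrently on
            # every partition; results are then reduced.
            return sum(pool.map(local_sum, partitions))

    if __name__ == "__main__":
        print(global_sum(list(range(1000))))  # 499500
    ```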

  16. Using Google Applications as Part of Cloud Computing to Improve Knowledge and Teaching Skills of Faculty Members at the University of Bisha, Bisha, Saudi Arabia

    ERIC Educational Resources Information Center

    Alshihri, Bandar A.

    2017-01-01

    Cloud computing is a recent computing paradigm that has been integrated into the educational system. It provides numerous opportunities for delivering a variety of computing services in a way that has not been experienced before. The Google Company is among the top business companies that afford their cloud services by launching a number of…

  17. A computational framework to empower probabilistic protein design

    PubMed Central

    Fromer, Menachem; Yanover, Chen

    2008-01-01

    Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717
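
    The Boltzmann formulation is easy to state on a toy instance. The sketch below computes exact per-position marginals for a two-position, three-letter "protein" by brute-force enumeration, with an invented pairwise energy table; at realistic protein lengths this enumeration is intractable, which is why the paper turns to belief propagation:

    ```python
    import itertools
    import numpy as np

    alphabet = ["A", "L", "V"]
    rng = np.random.default_rng(0)
    E_pair = rng.normal(size=(3, 3))      # pairwise energy table (invented)
    kT = 1.0

    # Boltzmann weight for every two-residue sequence
    probs = {}
    for i, j in itertools.product(range(3), range(3)):
        probs[(i, j)] = np.exp(-E_pair[i, j] / kT)
    Z = sum(probs.values())               # partition function

    # Marginal amino-acid distribution at position 0
    marg0 = [sum(p for (i, j), p in probs.items() if i == a) / Z
             for a in range(3)]
    for aa, m in zip(alphabet, marg0):
        print(f"P(position 0 = {aa}) = {m:.3f}")
    ```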

  18. Jahn-Teller effect in molecular electronics: quantum cellular automata

    NASA Astrophysics Data System (ADS)

    Tsukerblat, B.; Palii, A.; Clemente-Juan, J. M.; Coronado, E.

    2017-05-01

    The article summarizes the main results of application of the theory of the Jahn-Teller (JT) and pseudo JT effects to the description of molecular quantum dot cellular automata (QCA), a new paradigm of quantum computing. The following issues are discussed: 1) QCA as a new paradigm of quantum computing, principles and advantages; 2) molecular implementation of QCA; 3) role of the JT effect in charge trapping, encoding of binary information in the quantum cell and non-linear cell-cell response; 4) spin-switching in molecular QCA based on mixed-valence cell; 5) intervalence optical absorption in tetrameric molecular mixed-valence cell through the symmetry assisted approach to the multimode/multilevel JT and pseudo JT problems.

  19. User-centered design in brain-computer interfaces-a case study.

    PubMed

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken; that of a user-centered design. It is the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered from an ischemic brain stem stroke, leading to a severe motor- and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies on both categories, the visual paradigm could be used with lower cognitive workload. Besides attention and working memory, several other neurophysiological and -psychological indicators - and the role they play in the BCIs at hand - are discussed. The user's performance on the first BCI paradigm would typically have excluded her from further ERP-based BCI studies. However, this study clearly shows that, with the numerous paradigms now at our disposal, the pursuit for a functioning BCI system should not be stopped after an initial failed attempt. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  20. Scientific Computing Paradigm

    NASA Technical Reports Server (NTRS)

    VanZandt, John

    1994-01-01

    The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.

  1. Telecommunications Options Connect OCLC and Libraries to the Future: The Co-Evolution of OCLC Connectivity Options and the Library Computing Environment.

    ERIC Educational Resources Information Center

    Breeding, Marshall

    1998-01-01

    The Online Computer Library Center's (OCLC) access options have kept pace with the evolving trends in telecommunications and the library computing environment. As libraries deploy microcomputers and develop networks, OCLC offers access methods consistent with these environments. OCLC works toward reorienting its network paradigm through TCP/IP…

  2. Performance of OVERFLOW-D Applications based on Hybrid and MPI Paradigms on IBM Power4 System

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biegel, Bryan (Technical Monitor)

    2002-01-01

    This report briefly discusses our preliminary performance experiments with parallel versions of OVERFLOW-D applications. These applications are based on MPI and hybrid paradigms on the IBM Power4 system here at the NAS Division. This work is part of an effort to determine the suitability of the system and its parallel libraries (MPI/OpenMP) for specific scientific computing objectives.

  3. Mobile Computing Solutions for Effective and Efficient Generation and Dissemination of Tactical Operations Orders

    DTIC Science & Technology

    2017-06-01

    … instructor and practitioner in conjunction with Marine Corps and Joint doctrine, to establish a new paradigm for capturing, storing, and transmitting the informational elements of an operations order. We developed a working technology demonstrator that incorporates a new object-oriented data structure into existing open-source mobile technology to provide …

  4. A Gaze Independent Brain-Computer Interface Based on Visual Stimulation through Closed Eyelids

    NASA Astrophysics Data System (ADS)

    Hwang, Han-Jeong; Ferreria, Valeria Y.; Ulrich, Daniel; Kilic, Tayfun; Chatziliadis, Xenofon; Blankertz, Benjamin; Treder, Matthias

    2015-10-01

    A classical brain-computer interface (BCI) based on visual event-related potentials (ERPs) is of limited application value for paralyzed patients with severe oculomotor impairments. In this study, we introduce a novel gaze independent BCI paradigm that can be potentially used for such end-users because visual stimuli are administered on closed eyelids. The paradigm involved verbally presented questions with 3 possible answers. Online BCI experiments were conducted with twelve healthy subjects, where they selected one option by attending to one of three different visual stimuli. It was confirmed that typical cognitive ERPs can be evidently modulated by the attention of a target stimulus in eyes-closed and gaze independent condition, and further classified with high accuracy during online operation (74.58% ± 17.85 s.d.; chance level 33.33%), demonstrating the effectiveness of the proposed novel visual ERP paradigm. Also, stimulus-specific eye movements observed during stimulation were verified as reflex responses to light stimuli, and they did not contribute to classification. To the best of our knowledge, this study is the first to show the possibility of using a gaze independent visual ERP paradigm in an eyes-closed condition, thereby providing another communication option for severely locked-in patients suffering from complex ocular dysfunctions.

  5. Effect of a combination of flip and zooming stimuli on the performance of a visual brain-computer interface for spelling.

    PubMed

    Cheng, Jiao; Jin, Jing; Daly, Ian; Zhang, Yu; Wang, Bei; Wang, Xingyu; Cichocki, Andrzej

    2018-02-13

    Brain-computer interface (BCI) systems allow their users to communicate with the external world by recognizing intention directly from brain activity, without the assistance of the peripheral motor nervous system. The P300 speller is one of the most widely used visual BCI applications. In previous studies, a flip stimulus (rotating the background area of the character), based on apparent motion, suffered less from refractory effects; however, it did not significantly improve performance. In addition, a presentation paradigm using a "zooming" action (changing the size of the symbol) has been shown to evoke relatively higher P300 amplitudes and yield better BCI performance. To extend this method of stimulus presentation and, consequently, to improve BCI performance, we present a new paradigm combining the flip stimulus with a zooming action. This new presentation modality allowed BCI users to focus their attention more easily. We investigated whether such an action could combine the advantages of both types of stimulus presentation to bring a significant improvement in performance compared to the conventional flip stimulus. The experimental results showed that the proposed paradigm obtained significantly higher classification accuracies and bit rates than the conventional flip paradigm (p < 0.01).

  6. Near-infrared spectroscopy (NIRS)-based eyes-closed brain-computer interface (BCI) using prefrontal cortex activation due to mental arithmetic

    PubMed Central

    Shin, Jaeyoung; Müller, Klaus-R; Hwang, Han-Jeong

    2016-01-01

    We propose a near-infrared spectroscopy (NIRS)-based brain-computer interface (BCI) that can be operated in the eyes-closed (EC) state. To evaluate the feasibility of NIRS-based EC BCIs, we compared an eyes-open (EO) BCI paradigm and an EC BCI paradigm with respect to hemodynamic response and classification accuracy. To this end, subjects performed either mental arithmetic or imagined vocalization of the English alphabet as a baseline task with very low cognitive loading. The performances of two linear classifiers were compared, with shrinkage linear discriminant analysis (LDA) showing an advantage. The classification accuracy of the EC paradigm (75.6 ± 7.3%) was lower than that of the EO paradigm (77.0 ± 9.2%), but the difference was not statistically significant (p = 0.5698). Subjects reported that they found it more comfortable (p = 0.057) and easier (p < 0.05) to perform the EC BCI tasks. The difference in task difficulty may account for the slightly lower classification accuracy on EC data. From these results, we confirm the feasibility of NIRS-based EC BCIs, which may ultimately be a BCI option for patients who cannot keep their eyes open consistently. PMID:27824089

  7. Near-infrared spectroscopy (NIRS)-based eyes-closed brain-computer interface (BCI) using prefrontal cortex activation due to mental arithmetic.

    PubMed

    Shin, Jaeyoung; Müller, Klaus-R; Hwang, Han-Jeong

    2016-11-08

    We propose a near-infrared spectroscopy (NIRS)-based brain-computer interface (BCI) that can be operated in the eyes-closed (EC) state. To evaluate the feasibility of NIRS-based EC BCIs, we compared an eyes-open (EO) BCI paradigm and an EC BCI paradigm with respect to hemodynamic response and classification accuracy. To this end, subjects performed either mental arithmetic or imagined vocalization of the English alphabet as a baseline task with very low cognitive loading. The performances of two linear classifiers were compared, with shrinkage linear discriminant analysis (LDA) showing an advantage. The classification accuracy of the EC paradigm (75.6 ± 7.3%) was lower than that of the EO paradigm (77.0 ± 9.2%), but the difference was not statistically significant (p = 0.5698). Subjects reported that they found it more comfortable (p = 0.057) and easier (p < 0.05) to perform the EC BCI tasks. The difference in task difficulty may account for the slightly lower classification accuracy on EC data. From these results, we confirm the feasibility of NIRS-based EC BCIs, which may ultimately be a BCI option for patients who cannot keep their eyes open consistently.
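
    The shrinkage LDA classifier favored in this study is available off the shelf; a minimal sketch with scikit-learn follows, using synthetic stand-ins for the NIRS features:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 20))       # 100 trials x 20 NIRS features
    y = rng.integers(0, 2, 100)          # mental arithmetic vs. baseline
    X[y == 1] += 0.4                     # inject a weak class difference

    # Shrinkage LDA: lsqr solver with automatic (Ledoit-Wolf) shrinkage
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    clf.fit(X[:70], y[:70])
    print("held-out accuracy:", clf.score(X[70:], y[70:]))
    ```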

  8. Neuroimaging and Anxiety: the Neural Substrates of Pathological and Non-pathological Anxiety.

    PubMed

    Taylor, James M; Whalen, Paul J

    2015-06-01

    Advances in the use of noninvasive neuroimaging to study the neural correlates of pathological and non-pathological anxiety have shone new light on the underlying neural bases for both the development and manifestation of anxiety. This review summarizes the most commonly observed neural substrates of the phenotype of anxiety. We focus on the neuroimaging paradigms that have shown promise in exposing this relevant brain circuitry. In this way, we offer a broad overview of how anxiety is studied in the neuroimaging laboratory and the key findings that offer promise for future research and a clearer understanding of anxiety.

  9. Is Human-Computer Interaction Social or Parasocial?

    ERIC Educational Resources Information Center

    Sundar, S. Shyam

    Conducted in the attribution-research paradigm of social psychology, a study examined whether human-computer interaction is fundamentally social (as in human-human interaction) or parasocial (as in human-television interaction). All 30 subjects (drawn from an undergraduate class on communication) were exposed to an identical interaction with…

  10. The effects of semantic congruency: a research of audiovisual P300-speller.

    PubMed

    Cao, Yong; An, Xingwei; Ke, Yufeng; Jiang, Jin; Yang, Hanjun; Chen, Yuqian; Jiao, Xuejun; Qi, Hongzhi; Ming, Dong

    2017-07-25

    Over the past few decades, there have been many studies of aspects of brain-computer interfaces (BCIs). Of particular interest are event-related potential (ERP)-based BCI spellers that aim to assist mental typewriting. Audiovisual stimuli based BCI systems have attracted much attention from researchers, and most existing studies of audiovisual BCIs have been based on a semantically incongruent stimulus paradigm. However, no related study had reported whether system performance or participant comfort differs between BCIs based on semantically congruent and semantically incongruent paradigms. The goal of this study was to investigate the effects of semantic congruency on system performance and participant comfort in an audiovisual BCI. Two audiovisual paradigms (semantically congruent and incongruent) were adopted, and 11 healthy subjects participated in the experiment. High-density electrical mapping of ERPs and behavioral data were measured for the two stimulus paradigms. The behavioral data indicated no significant difference in offline classification accuracy between the congruent and incongruent paradigms. Nevertheless, eight of the 11 participants reported a preference for the semantically congruent experiment, two reported no difference between the two conditions, and only one preferred the semantically incongruent paradigm. In addition, higher ERP amplitudes were found in the incongruent paradigm. In short, the semantically congruent paradigm offered better participant comfort while maintaining the same recognition rate as the incongruent paradigm. Furthermore, our study suggests that speller paradigm design must take both system performance and user experience into consideration rather than merely pursuing a larger ERP response.

  11. Simultaneous detection of P300 and steady-state visually evoked potentials for hybrid brain-computer interface.

    PubMed

    Combaz, Adrien; Van Hulle, Marc M

    2015-01-01

    We study the feasibility of a hybrid Brain-Computer Interface (BCI) combining simultaneous visual oddball and Steady-State Visually Evoked Potential (SSVEP) paradigms, where both types of stimuli are superimposed on a computer screen. Potentially, such a combination could result in a system being able to operate faster than a purely P300-based BCI and encode more targets than a purely SSVEP-based BCI. We analyse the interactions between the brain responses of the two paradigms, and assess the possibility to detect simultaneously the brain activity evoked by both paradigms, in a series of 3 experiments where EEG data are analysed offline. Despite differences in the shape of the P300 response between pure oddball and hybrid condition, we observe that the classification accuracy of this P300 response is not affected by the SSVEP stimulation. We do not observe either any effect of the oddball stimulation on the power of the SSVEP response in the frequency of stimulation. Finally results from the last experiment show the possibility of detecting both types of brain responses simultaneously and suggest not only the feasibility of such hybrid BCI but also a gain over pure oddball- and pure SSVEP-based BCIs in terms of communication rate.

  12. A Comparison of PETSC Library and HPF Implementations of an Archetypal PDE Computation

    NASA Technical Reports Server (NTRS)

    Hayder, M. Ehtesham; Keyes, David E.; Mehrotra, Piyush

    1997-01-01

    Two paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation: a nonlinear, structured-grid partial differential equation boundary value problem, using the same algorithm on the same hardware. Both paradigms, parallel libraries represented by Argonne's PETSC, and parallel languages represented by the Portland Group's HPF, are found to be easy to use for this problem class, and both are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required of the application programmer under either paradigm includes specification of the data partitioning (corresponding to a geometrically simple decomposition of the domain of the PDE). Programming in SPMD style for the PETSC library requires writing the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm, introducing concurrency through subdomain blocking (an effort similar to the index mapping), and modest experimentation with rewriting loops to expose the latent concurrency to the compiler. Correctness and scalability are cross-validated on up to 32 nodes of an IBM SP2.
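
    The affine global-to-local index mappings mentioned above reduce, in the simplest one-dimensional block-partitioned case, to integer division and remainder. A sketch, assuming the processor count divides the grid size evenly:

    ```python
    def owner_and_local(g, n, p):
        """Map global index g (grid of n points, p processors) to
        (processor rank, local index) under block partitioning."""
        block = n // p                   # assumes p divides n evenly
        return g // block, g % block

    def to_global(rank, local, n, p):
        """Inverse affine map: (rank, local index) -> global index."""
        return rank * (n // p) + local

    n, p = 64, 4
    for g in (0, 17, 63):
        r, l = owner_and_local(g, n, p)
        assert to_global(r, l, n, p) == g    # round-trip check
        print(f"global {g:2d} -> rank {r}, local {l}")
    ```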

  13. A prisoner's dilemma experiment on cooperation with people and human-like computers.

    PubMed

    Kiesler, S; Sproull, L; Waters, K

    1996-01-01

    The authors investigated basic properties of social exchange and interaction with technology in an experiment on cooperation with a human-like computer partner or a real human partner. Talking with a computer partner may trigger social identity feelings or commitment norms. Participants played a prisoner's dilemma game with a confederate or a computer partner. Discussion, inducements to make promises, and partner cooperation varied across trials. On Trial 1, after discussion, most participants proposed cooperation. They kept their promises as much with a text-only computer as with a person, but less with a more human-like computer. Cooperation dropped sharply when any partner avoided discussion. The strong impact of discussion fits a social contract explanation of cooperation following discussion. Participants broke their promises to a computer more than to a person, however, indicating that people make heterogeneous commitments.

  14. Amoeba-inspired nanoarchitectonic computing: solving intractable computational problems using nanoscale photoexcitation transfer dynamics.

    PubMed

    Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko

    2013-06-18

    Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this Article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoebalike spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers when the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
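
    For contrast with the amoeba-inspired dynamics, the conventional stochastic search baseline can be as simple as WalkSAT-style local search. The sketch below, on an invented four-clause formula, repeatedly flips one variable from a random unsatisfied clause until the formula is satisfied or a budget runs out:

    ```python
    import random

    # Clauses are tuples of literals: positive k means variable k is true,
    # negative k means it is false. The formula below is invented.
    clauses = [(1, -2), (-1, 3), (2, 3), (-3, -2)]

    def unsatisfied(assign, clauses):
        return [c for c in clauses
                if not any(assign[abs(l)] == (l > 0) for l in c)]

    random.seed(0)
    assign = {v: random.random() < 0.5 for v in (1, 2, 3)}
    for step in range(1000):
        bad = unsatisfied(assign, clauses)
        if not bad:
            print(f"satisfying assignment after {step} flips: {assign}")
            break
        v = abs(random.choice(random.choice(bad)))  # var from a bad clause
        assign[v] = not assign[v]                   # flip it
    else:
        print("no solution found within the flip budget")
    ```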

  15. Further Evidence in Support of the Universal Nilpotent Grammatical Computational Paradigm of Quantum Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcer, Peter J.; Rowlands, Peter

    2010-12-22

    Further evidence is presented in favour of the computational paradigm, conceived and constructed by Rowlands and Diaz, as detailed in Rowlands' book Zero to Infinity (2007), and in particular the authors' paper 'The Grammatical Universe: the Laws of Thermodynamics and Quantum Entanglement'. The paradigm, which has isomorphic group and algebraic quantum mechanical language interpretations, not only predicts the well-established facts of quantum physics, the periodic table, chemistry / valence and of molecular biology, whose understanding it extends; it also provides an elegant, simple solution to the unresolved quantum measurement problem. In this fundamental paradigm, all the computational constructs / predictions that emerge, follow from the simple fact, that, as in quantum mechanics, the wave function is defined only up to an arbitrary fixed phase. This fixed phase provides a simple physical understanding of the quantum vacuum in quantum field theory, where only relative phases, known to be able to encode 3+1 relativistic space-time geometries, can be measured. It is the arbitrary fixed measurement standard, against which everything that follows is to be measured, even though the standard itself cannot be, since nothing exists against which to measure it. The standard, as an arbitrary fixed reference phase, functions as the holographic basis for a self-organized universal quantum process of emergent novel fermion states of matter where, following each emergence, the arbitrary standard is re-fixed anew so as to provide a complete history / holographic record or hologram of the current fixed past, advancing an unending irreversible evolution, such as is the evidence of our senses. The fermion states, in accord with the Pauli exclusion principle, each correspond to a unique nilpotent symbol in the infinite alphabet (which specifies the grammar in this nilpotent universal computational rewrite system (NUCRS) paradigm); and the alphabet, as Hill and Rowlands hypothesize on substantial evidence [26], includes that of the RNA / DNA genetic code and, as holographic phase encodings / holograms, the 4D geometries of all living systems as self-organised grammatical computational rewrite machines / machinery. Human brains, natural grammatical (written symbol) languages, 4D geometric self-awareness and a totally new emergent property of matter, human consciousness, can thus with some measure of confidence be postulated as further genetic consequences which follow from this self-organizing fundamental rewrite NUCRS construction. For it, like natural language, possesses a semantics and not just a syntax, where the initial symbol, i.e. the arbitrary fixed phase measurement standard, is able to function as the template for the blueprints of the emergent 4D relativistic real and virtual geometries to come, in a 'from the Self Creation to the creation of the human self' computational rewrite process evolution.

  16. Further Evidence in Support of the Universal Nilpotent Grammatical Computational Paradigm of Quantum Physics

    NASA Astrophysics Data System (ADS)

    Marcer, Peter J.; Rowlands, Peter

    2010-12-01

    Further evidence is presented in favour of the computational paradigm, conceived and constructed by Rowlands and Diaz, as detailed in Rowlands' book Zero to Infinity (2007) [2], and in particular the authors' paper `The Grammatical Universe: the Laws of Thermodynamics and Quantum Entanglement' [1]. The paradigm, which has isomorphic group and algebraic quantum mechanical language interpretations, not only predicts the well-established facts of quantum physics, the periodic table, chemistry / valence and of molecular biology, whose understanding it extends; it also provides an elegant, simple solution to the unresolved quantum measurement problem. In this fundamental paradigm, all the computational constructs / predictions that emerge, follow from the simple fact, that, as in quantum mechanics, the wave function is defined only up to an arbitrary fixed phase. This fixed phase provides a simple physical understanding of the quantum vacuum in quantum field theory, where only relative phases, known to be able to encode 3+1 relativistic space-time geometries, can be measured. It is the arbitrary fixed measurement standard, against which everything that follows is to be measured, even though the standard itself cannot be, since nothing exists against which to measure it. The standard, as an arbitrary fixed reference phase, functions as the holographic basis for a self-organized universal quantum process of emergent novel fermion states of matter where, following each emergence, the arbitrary standard is re-fixed anew so as to provide a complete history / holographic record or hologram of the current fixed past, advancing an unending irreversible evolution, such as is the evidence of our senses. The fermion states, in accord with the Pauli exclusion principle, each correspond to a unique nilpotent symbol in the infinite alphabet (which specifies the grammar in this nilpotent universal computational rewrite system (NUCRS) paradigm); and the alphabet, as Hill and Rowlands hypothesize on substantial evidence [26], includes that of the RNA / DNA genetic code and, as holographic phase encodings / holograms, the 4D geometries of all living systems as self-organised grammatical computational rewrite machines / machinery. Human brains, natural grammatical (written symbol) languages, 4D geometric self-awareness and a totally new emergent property of matter, human consciousness, can thus with some measure of confidence be postulated as further genetic consequences which follow from this self-organizing fundamental rewrite NUCRS construction. For it, like natural language, possesses a semantics and not just a syntax, where the initial symbol, i.e. the arbitrary fixed phase measurement standard, is able to function as the template for the blueprints of the emergent 4D relativistic real and virtual geometries to come, in a `from the Self Creation to the creation of the human self' computational rewrite process evolution.

  17. Early life stress paradigms in rodents: potential animal models of depression?

    PubMed

    Schmidt, Mathias V; Wang, Xiao-Dong; Meijer, Onno C

    2011-03-01

    While human depressive illness is indeed uniquely human, many of its symptoms may be modeled in rodents. Based on human etiology, the assumption has been made that depression-like behavior in rats and mice can be modulated by some of the powerful early life programming effects that are known to occur after manipulations in the first weeks of life. Here we review the evidence that is available in literature for early life manipulation as risk factors for the development of depression-like symptoms such as anhedonia, passive coping strategies, and neuroendocrine changes. Early life paradigms that were evaluated include early handling, separation, and deprivation protocols, as well as enriched and impoverished environments. We have also included a small number of stress-related pharmacological models. We find that for most early life paradigms per se, the actual validity for depression is limited. A number of models have not been tested with respect to classical depression-like behaviors, while in many cases, the outcome of such experiments is variable and depends on strain and additional factors. Because programming effects confer vulnerability rather than disease, a number of paradigms hold promise for usefulness in depression research, in combination with the proper genetic background and adult life challenges.

  18. At the Edge of Chaos: A New Paradigm for Social Work?

    ERIC Educational Resources Information Center

    Hudson, Christopher G.

    2000-01-01

    Reviews key concepts and applications of chaos theory and the broader complex systems theory in the context of general systems theory and the search for a unified conceptual framework for social work. Concludes that chaos theory shows promise as a solution to many problems posed by the now dated general systems approach. (DB)

  19. Paradigm Shift: The New Promise of Information Technology.

    ERIC Educational Resources Information Center

    Tapscott, Don; Caston, Art

    A fundamental change is taking place in the nature and application of technology in business, a change with profound and far-reaching implications for companies of every size and shape. A multimillion dollar research program conducted by the DMR Group, Inc., studied more than 4,500 organizations in North America, Europe, and the Far East to…

  20. "Seeing the Whole Elephant": Changing Mindsets and Empowering Stakeholders to Meaningfully Manage Accountability and Improvement

    ERIC Educational Resources Information Center

    Bush-Mecenas, Susan; Marsh, Julie A.; Montes de Oca, David; Hough, Heather

    2018-01-01

    School accountability and improvement policy are on the precipice of a paradigm shift. While the multiple-measure dashboard accountability approach holds great promise for promoting more meaningful learning opportunities for all students, our research indicates that this can come with substantial challenges in practice. We reflect upon the lessons…

  1. At the Crossroads of Clinical Practice and Teacher Leadership: A Changing Paradigm for Professional Practice

    ERIC Educational Resources Information Center

    Sawyer, Richard D.; Neel, Michael; Coulter, Matthew

    2016-01-01

    This paper examines the endemic separation between K-12 schools and colleges of education in teacher preparation. Specifically, we examine a new approach related to the promise of clinical practice--a clinical practice program that overlaps a public high school, a graduate-level teacher preparation program, and a professional practice doctoral…

  2. Implementing Early Literacy: Promising Success for All Kindergarten Children. Technical Report No. 517.

    ERIC Educational Resources Information Center

    Stewart, Janice P.; And Others

    A study demonstrated the viability of an instructional paradigm that identifies adult mediation within the zone of proximal development to be a significant factor in young children's learning. Six teachers participated during the first year and eight teachers during the second year. Complete data were collected for 200 kindergarten children.…

  3. Scaling, Similarity, and the Fourth Paradigm for Hydrology

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, Christa D.; Clark, Martyn; Samaniego, Luis; Verhoest, Niko E. C.; van Emmerik, Tim; Uijlenhoet, Remko; Achieng, Kevin; Franz, Trenton E.; Woods, Ross

    2017-01-01

    In this synthesis paper addressing hydrologic scaling and similarity, we posit that the search for universal laws of hydrology is hindered by our focus on computational simulation (the third paradigm), and assert that it is time for hydrology to embrace a fourth paradigm of data-intensive science. Advances in information-based hydrologic science, coupled with an explosion of hydrologic data and advances in parameter estimation and modelling, have laid the foundation for a data-driven framework for scrutinizing hydrological scaling and similarity hypotheses. We summarize important scaling and similarity concepts (hypotheses) that require testing, describe a mutual information framework for testing these hypotheses, describe boundary condition, state, flux, and parameter data requirements across scales to support testing these hypotheses, and discuss some challenges to overcome while pursuing the fourth hydrological paradigm. We call upon the hydrologic sciences community to develop a focused effort towards adopting the fourth paradigm and to apply it to outstanding challenges in scaling and similarity.

  4. Exploring Cloud Computing for Distance Learning

    ERIC Educational Resources Information Center

    He, Wu; Cernusca, Dan; Abdous, M'hammed

    2011-01-01

    The use of distance courses in learning is growing exponentially. To better support faculty and students for teaching and learning, distance learning programs need to constantly innovate and optimize their IT infrastructures. The new IT paradigm called "cloud computing" has the potential to transform the way that IT resources are utilized and…

  5. The Impact of Internet-Based Instruction on Teacher Education: The "Paradigm Shift."

    ERIC Educational Resources Information Center

    Lan, Jiang JoAnn

    This study incorporated Internet-based instruction into two education technology courses for preservice teachers. One was a required, undergraduate, beginning-level educational computing course. The other was a graduate, advanced-level computing course. The experiment incorporated Internet-based instruction into course delivery in order to create…

  6. Implementing Project Based Learning in Computer Classroom

    ERIC Educational Resources Information Center

    Asan, Askin; Haliloglu, Zeynep

    2005-01-01

    Project-based learning offers the opportunity to apply theoretical and practical knowledge, and to develop students' group work and collaboration skills. In this paper we present the design of an effective computer class that implements the well-known and highly accepted project-based learning paradigm. A pre-test/post-test control group…

  7. Developing an Academic Information Resources Master Plan: A Paradigm.

    ERIC Educational Resources Information Center

    Gentry, H. R.

    Since 1980, Johnson County Community College (JCCC) has rapidly expanded its use of personal computers in instruction and administration. Many computers were being acquired, however, before appropriate policies, plans, and procedures were developed for their effective management. In response to this problem, the following actions were identified…

  8. Semantic knowledge for histopathological image analysis: from ontologies to processing portals and deep learning

    NASA Astrophysics Data System (ADS)

    Kergosien, Yannick L.; Racoceanu, Daniel

    2017-11-01

    This article presents our vision of the next generation of challenges in computational/digital pathology. The key role of the domain ontology, developed in a sustainable manner (i.e. using reference checklists and protocols as living semantic repositories), opens the way to effective and sustainable traceability and relevance feedback for the use of existing machine learning algorithms that have proven highly effective in the latest digital pathology challenges (i.e. convolutional neural networks). Being able to work in an accessible web-service environment, with strictly controlled issues regarding intellectual property (image and data processing/analysis algorithms) and medical data/image confidentiality, is essential for the future. Among the web services involved in the proposed approach, living yellow pages for computational pathology seem particularly important for reaching operational awareness, validation, and feasibility. This represents a very promising route to the next generation of tools, able to bring more guidance to computer scientists and more confidence to pathologists, towards effective and efficient daily use. Moreover, consistent feedback and insights are likely to emerge in the near future from these sophisticated machine learning tools back to the pathologists, strengthening the interaction between the different actors of a sustainable biomedical ecosystem (patients, clinicians, biologists, engineers, scientists etc.). Besides going digital/computational, with virtual slide technology demanding new workflows, pathology must prepare for another coming revolution: semantic web technologies now enable the knowledge of experts to be stored in databases, shared through the Internet, and accessed by machines. Traceability, disambiguation of reports, quality monitoring, and interoperability between health centers are some of the associated benefits that pathologists have been seeking. However, major changes are also to be expected in the relation of human diagnosis to machine-based procedures. Improving on a former imaging platform which used a local knowledge base and a reasoning engine to combine image processing modules into higher-level tasks, we propose a framework where different actors of the histopathology imaging world can cooperate using web services, exchanging knowledge as well as imaging services, and where the results of such collaborations on diagnostic tasks can be evaluated in international challenges such as those recently organized for mitosis detection, nuclear atypia, or tissue architecture in the context of cancer grading. This framework is likely to offer effective context guidance and traceability to deep learning approaches, with a promising perspective offered by the multi-task learning (MTL) paradigm, distinguished by its applicability to several different learning algorithms, its non-reliance on specialized architectures, and the promising results it has demonstrated, in particular on the problem of weak supervision, an issue that arises when direct links from pathology terms in reports to corresponding regions within images are missing.

  9. Achieving High Performance on the i860 Microprocessor

    NASA Technical Reports Server (NTRS)

    Lee, King; Kutler, Paul (Technical Monitor)

    1998-01-01

    The i860 is a high performance microprocessor used in the Intel Touchstone project. This paper proposes a paradigm for programming the i860 that is modelled on the vector instructions of the Cray computers. Fortran callable assembler subroutines were written that mimic the concurrent vector instructions of the Cray. Cache takes the place of vector registers. Using this paradigm we have achieved twice the performance of compiled code on a traditional solve.

  10. "I'm Keeping Those There, Are You?": The Role of a New User Interface Paradigm--Separate Control of Shared Space (SCOSS)--in the Collaborative Decision-Making Process

    ERIC Educational Resources Information Center

    Kerawalla, Lucinda; Pearce, Darren; Yuill, Nicola; Luckin, Rosemary; Harris, Amanda

    2008-01-01

    We take a socio-cultural approach to comparing how dual control of a new user interface paradigm--Separate Control of Shared Space (SCOSS)--and dual control of a single user interface can work to mediate the collaborative decision-making process between pairs of children carrying out a multiple categorisation word task on a shared computer.…

  11. Paradigm of Legal Protection of Computer Software Contracts in the United States: Brief Overview of “Principles of the Law of Software Contracts”

    NASA Astrophysics Data System (ADS)

    Furuya, Haruhisa; Hiratsuka, Mitsuyoshi

    This article overviews the historical transition of legal protection of computer software contracts in the United States and presents how it should function under the Uniform Commercial Code and its amended Article 2B, the Uniform Computer Information Transactions Act, and the recently approved “Principles of the Law of Software Contracts”.

  12. Quantum Optical Implementations of Current Quantum Computing Paradigms

    DTIC Science & Technology

    2005-05-01

    Conferences and Proceedings: The results were presented at several conferences. These include: 1. M. O. Scully, “Foundations of Quantum Mechanics”, in… …applications have revealed a strong connection between the fundamental aspects of quantum mechanics that govern physical systems and the informational… …could be solved in polynomial time using quantum computers. Another set of problems where quantum mechanics can carry out computations substantially…

  13. Passive in-home health and wellness monitoring: overview, value and examples.

    PubMed

    Alwan, Majd

    2009-01-01

    Modern sensor and communication technology, coupled with advances in data analysis and artificial intelligence techniques, is causing a paradigm shift in remote management and monitoring of chronic disease. In-home monitoring technology brings the added benefit of measuring individualized health status and reporting it to the care provider and caregivers alike, allowing timely and targeted preventive interventions, even in home and community based settings. This paper presents a paradigm for geriatric care based on monitoring older adults passively in their own living settings through placing sensors in their living environments or the objects they use. Activity and physiological data can be analyzed, archived and mined to detect indicators of early disease onset or changes in health conditions at various levels. Examples of monitoring systems are discussed and results from field evaluation pilot studies are summarized. The approach has shown great promise for a significant value proposition to all the stakeholders involved in caring for older adults. The paradigm would allow care providers to extend their services into the communities they serve.

  14. Shifting the HIV training and research paradigm to address disparities in HIV outcomes

    PubMed Central

    LEVISON, Julie H.; ALEGRÍA, Margarita

    2016-01-01

    Tailored programs to diversify the pool of HIV/AIDS investigators and provide sufficient training and support for minority investigators to compete successfully are uncommon in the US and abroad. This paper encourages a shift in the HIV/AIDS training and research paradigm to effectively train and mentor Latino researchers in the US, Latin America and the Caribbean. We suggest three strategies to accomplish this: 1) coaching senior administrative and academic staff of HIV/AIDS training programs on the needs, values, and experiences unique to Latino investigators; 2) encouraging mentors to be receptive to a different set of research questions and approaches that Latino researchers offer due to their life experiences and perspectives; and 3) creating a virtual infrastructure to share resources and tackle challenges faced by minority researchers. Shifts in the research paradigm to include, retain, and promote Latino HIV/AIDS researchers will benefit the scientific process and the patients and communities who await the promise of HIV/AIDS research. PMID:27501811

  15. Suppressing flashes of items surrounding targets during calibration of a P300-based brain-computer interface improves performance

    NASA Astrophysics Data System (ADS)

    Frye, G. E.; Hauser, C. K.; Townsend, G.; Sellers, E. W.

    2011-04-01

    Since the introduction of the P300 brain-computer interface (BCI) speller by Farwell and Donchin in 1988, the speed and accuracy of the system have been significantly improved. Larger electrode montages and various signal processing techniques are responsible for most of the improvement in performance. New presentation paradigms have also led to improvements in bit rate and accuracy (e.g. Townsend et al (2010 Clin. Neurophysiol. 121 1109-20)). In particular, the checkerboard paradigm for online P300 BCI-based spelling performs well, has helped to document what makes for a successful paradigm, and is a good platform for further experimentation. The current paper further examines the checkerboard paradigm by suppressing items which surround the target from flashing during calibration (i.e. the suppression condition). In the online feedback mode the standard checkerboard paradigm is used with a stepwise linear discriminant classifier derived from the suppression condition and one derived from the standard checkerboard condition, counter-balanced. The results of this research demonstrate that using suppression during calibration produces significantly more character selections/min (6.46, with time between selections included) than the standard checkerboard condition (5.55), and significantly fewer target flashes are needed per selection in the SUP condition (5.28) as compared to the RCP condition (6.17). Moreover, accuracy in the SUP and RCP conditions remained equivalent (~90%). Mean theoretical bit rate was 53.62 bits/min in the suppression condition and 46.36 bits/min in the standard checkerboard condition (ns). Waveform morphology also showed significant differences in amplitude and latency.
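
    A minimal sketch of the kind of linear discriminant classification used in such P300 spellers may be useful, assuming flattened channel-by-time ERP epochs as features; the data below are synthetic stand-ins, and this generic LDA fit is not the authors' exact stepwise feature-selection procedure.

        # Generic P300 epoch classification with LDA on synthetic stand-in data.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        n_epochs, n_channels, n_samples = 400, 8, 60

        X = rng.normal(size=(n_epochs, n_channels, n_samples))
        y = rng.integers(0, 2, size=n_epochs)
        X[y == 1, :, 25:35] += 0.5          # crude P300-like bump on target epochs

        X_flat = X.reshape(n_epochs, -1)    # features = channels x time points

        lda = LinearDiscriminantAnalysis()
        lda.fit(X_flat[:300], y[:300])      # "calibration" epochs
        print("held-out accuracy:", lda.score(X_flat[300:], y[300:]))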

  16. An Agent Inspired Reconfigurable Computing Implementation of a Genetic Algorithm

    NASA Technical Reports Server (NTRS)

    Weir, John M.; Wells, B. Earl

    2003-01-01

    Many software systems have been successfully implemented using an agent paradigm which employs a number of independent entities that communicate with one another to achieve a common goal. The distributed nature of such a paradigm makes it an excellent candidate for use in high speed reconfigurable computing hardware environments such as those present in modern FPGAs. In this paper, a distributed genetic algorithm that can be applied to the agent based reconfigurable hardware model is introduced. The effectiveness of this new algorithm is evaluated by comparing the quality of the solutions found by the new algorithm with those found by traditional genetic algorithms. The performance of a reconfigurable hardware implementation of the new algorithm on an FPGA is compared to traditional single processor implementations.
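
    To make the agent-style distribution concrete, here is a hedged software sketch of an island-model genetic algorithm: independent subpopulations evolve in parallel and periodically exchange their best individuals, much as communicating agents would. The one-max fitness and all parameters are illustrative; this is not the paper's FPGA design.

        # Island-model GA sketch: ISLANDS independent subpopulations with
        # ring migration of the best individual every 10 generations.
        import random

        GENES, POP, ISLANDS, GENERATIONS = 32, 20, 4, 50
        fitness = lambda ind: sum(ind)                      # one-max toy objective

        def evolve(pop):
            pop = sorted(pop, key=fitness, reverse=True)
            elite = pop[: POP // 2]
            children = []
            while len(elite) + len(children) < POP:
                a, b = random.sample(elite, 2)
                cut = random.randrange(GENES)
                child = a[:cut] + b[cut:]                   # one-point crossover
                i = random.randrange(GENES)
                child[i] ^= 1                               # point mutation
                children.append(child)
            return elite + children

        islands = [[[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
                   for _ in range(ISLANDS)]
        for gen in range(GENERATIONS):
            islands = [evolve(p) for p in islands]
            if gen % 10 == 9:                               # ring migration step
                best = [max(p, key=fitness) for p in islands]
                for k, p in enumerate(islands):
                    p[-1] = best[(k - 1) % ISLANDS][:]
        print("best fitness:", max(fitness(max(p, key=fitness)) for p in islands))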

  17. Engaging Women in Computer Science and Engineering: Promising Practices for Promoting Gender Equity in Undergraduate Research Experiences

    ERIC Educational Resources Information Center

    Kim, Karen A.; Fann, Amy J.; Misa-Escalante, Kimberly O.

    2011-01-01

    Building on research that identifies and addresses issues of women's underrepresentation in computing, this article describes promising practices in undergraduate research experiences that promote women's long-term interest in computer science and engineering. Specifically, this article explores whether and how REU programs include programmatic…

  18. Photochromic molecular implementations of universal computation.

    PubMed

    Chaplin, Jack C; Krasnogor, Natalio; Russell, Noah A

    2014-12-01

    Unconventional computing is an area of research in which novel materials and paradigms are utilised to implement computation. Previously we have demonstrated how registers, logic gates and logic circuits can be implemented, unconventionally, with a biocompatible molecular switch, NitroBIPS, embedded in a polymer matrix. NitroBIPS and related molecules have been shown elsewhere to be capable of modifying many biological processes in a manner that is dependent on its molecular form. Thus, one possible application of this type of unconventional computing is to embed computational processes into biological systems. Here we expand on our earlier proof-of-principle work and demonstrate that universal computation can be implemented using NitroBIPS. We have previously shown that spatially localised computational elements, including registers and logic gates, can be produced. We explain how parallel registers can be implemented, then demonstrate an application of parallel registers in the form of Turing machine tapes, and demonstrate both parallel registers and logic circuits in the form of elementary cellular automata. The Turing machines and elementary cellular automata utilise the same samples and same hardware to implement their registers, logic gates and logic circuits; and both represent examples of universal computing paradigms. This shows that homogeneous photochromic computational devices can be dynamically repurposed without invasive reconfiguration. The result represents an important, necessary step towards demonstrating the general feasibility of interfacial computation embedded in biological systems or other unconventional materials and environments. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
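
    Because elementary cellular automata are one of the universal paradigms the authors implement photochromically, a plain software rendering of the ECA update rule may clarify what the molecular substrate has to compute. Rule 110 below is a standard choice known to be computationally universal; nothing here models NitroBIPS itself.

        # Elementary cellular automaton: each cell's next state is a lookup of
        # its 3-cell neighbourhood in the bits of the rule number.
        def eca_step(cells, rule=110):
            n = len(cells)
            out = []
            for i in range(n):
                left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
                idx = (left << 2) | (centre << 1) | right   # neighbourhood as 0..7
                out.append((rule >> idx) & 1)               # bit idx of the rule
            return out

        row = [0] * 31 + [1] + [0] * 31                     # single seed cell
        for _ in range(16):
            print("".join(".#"[c] for c in row))
            row = eca_step(row)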

  19. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires a clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still lack a mathematically accurate and unambiguous description, and available field data are often very noisy, so it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  20. Machine learning in computational docking.

    PubMed

    Khamis, Mohamed A; Gomaa, Walid; Ahmed, Walaa F

    2015-03-01

    The objective of this paper is to highlight the state-of-the-art machine learning (ML) techniques in computational docking. The use of smart computational methods in the life cycle of drug design is a relatively recent development that has gained much popularity and interest over the last few years. Central to this methodology is the notion of computational docking, which is the process of predicting the best pose (orientation + conformation) of a small molecule (drug candidate) when bound to a target larger receptor molecule (protein) in order to form a stable complex molecule. In computational docking, a large number of binding poses are evaluated and ranked using a scoring function. The scoring function is a mathematical predictive model that produces a score representing the binding free energy, and hence the stability, of the resulting complex molecule. Generally, such a function should produce a set of plausible ligands ranked according to their binding stability, along with their binding poses. In more practical terms, an effective scoring function should produce promising drug candidates which can then be synthesized and physically screened using a high throughput screening process. Therefore, the key to computer-aided drug design is the design of an efficient, highly accurate scoring function (using ML techniques). The methods presented in this paper are specifically based on ML techniques. Although many traditional techniques have been proposed, their performance was generally poor. Only in the last few years has ML technology been applied to the design of scoring functions, and the results have been very promising. The ML-based techniques draw on various molecular features extracted from the abundance of protein-ligand information in public molecular databases, e.g., the protein data bank bind (PDBbind). In this paper, we present this paradigm shift, elaborating on the main constituent elements of the ML approach to molecular docking along with the state-of-the-art research in this area. For instance, the best random forest (RF)-based scoring function on PDBbind v2007 achieves a Pearson correlation coefficient between the predicted and experimentally determined binding affinities of 0.803, while the best conventional scoring function achieves 0.644. The best RF-based ranking power ranks the ligands correctly based on their experimentally determined binding affinities with accuracy 62.5% and identifies the top binding ligand with accuracy 78.1%. We conclude with open questions and potential future research directions that can be pursued in smart computational docking: using molecular features of different natures (geometrical, energy terms, pharmacophore), applying advanced ML techniques (e.g., deep learning), and combining more than one ML model. Copyright © 2015 Elsevier B.V. All rights reserved.
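
    As a hedged illustration of the RF-based scoring functions discussed above, the sketch below regresses a synthetic binding affinity on generic protein-ligand descriptor counts and reports a Pearson correlation on held-out complexes. The features and data are placeholders, not PDBbind, and the descriptor choice is an assumption.

        # RF scoring-function sketch: random forest regression of binding
        # affinity on descriptor counts, evaluated by Pearson correlation.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from scipy.stats import pearsonr

        rng = np.random.default_rng(1)
        n_complexes, n_features = 500, 36        # e.g. element-pair contact counts

        X = rng.poisson(5.0, size=(n_complexes, n_features)).astype(float)
        true_weights = rng.normal(size=n_features) * 0.1
        y = X @ true_weights + rng.normal(scale=0.5, size=n_complexes)  # affinity

        rf = RandomForestRegressor(n_estimators=300, random_state=0)
        rf.fit(X[:400], y[:400])
        r, _ = pearsonr(y[400:], rf.predict(X[400:]))
        print(f"Pearson r on held-out complexes: {r:.3f}")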

  1. Implementing Signature Neural Networks with Spiking Neurons.

    PubMed

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to develop realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work we adapt the core concepts of the recently proposed Signature Neural Network paradigm (i.e., neural signatures to identify each unit in the network, local information contextualization during processing, and multicoding strategies for information propagation regarding the origin and the content of the data) to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have yet been used in the context of ANNs of spiking neurons. This paper provides a proof of concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence of inhibitory connections. These parameters also modulate the memory capabilities of the network. The dynamical modes observed in the different informational dimensions in a given moment are independent and they only depend on the parameters shaping the information processing in this dimension. In view of these results, we argue that plasticity mechanisms inside individual cells and multicoding strategies can provide additional computational properties to spiking neural networks, which could enhance their capacity and performance in a wide variety of real-world tasks.
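
    Since the paradigm hinges on information carried by precise spike timing, a minimal leaky integrate-and-fire simulation shows where those spike times come from. Parameters and the step input are illustrative and unrelated to the paper's network model.

        # Leaky integrate-and-fire neuron: membrane voltage integrates input,
        # fires when it crosses threshold, then resets; spike times are recorded.
        import numpy as np

        dt, T = 0.1, 100.0                      # time step and duration, ms
        tau, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -70.0

        t = np.arange(0.0, T, dt)
        current = np.where((t > 20) & (t < 80), 2.0, 0.0)   # step input current

        v, spikes = v_rest, []
        for i, ti in enumerate(t):
            dv = (-(v - v_rest) + 10.0 * current[i]) / tau  # leaky integration
            v += dv * dt
            if v >= v_thresh:                               # threshold crossing
                spikes.append(ti)                           # precise spike time
                v = v_reset
        print("spike times (ms):", [round(s, 1) for s in spikes])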

  2. A novel task-oriented optimal design for P300-based brain-computer interfaces.

    PubMed

    Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen

    2014-10-01

    Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.

  3. Toward brain-actuated car applications: Self-paced control with a motor imagery-based brain-computer interface.

    PubMed

    Yu, Yang; Zhou, Zongtan; Yin, Erwei; Jiang, Jun; Tang, Jingsheng; Liu, Yadong; Hu, Dewen

    2016-10-01

    This study presents a paradigm for controlling a car using an asynchronous electroencephalogram (EEG)-based brain-computer interface (BCI), together with the results of a simulation performed in an experimental environment outside the laboratory. The paradigm uses two distinct motor imagery (MI) tasks, imaginary left- and right-hand movements, to generate a multi-task car control strategy consisting of starting the engine, moving forward, turning left, turning right, moving backward, and stopping the engine. Five healthy subjects participated in the online car control experiment, and all successfully controlled the car by following a previously outlined route. Subject S1 exhibited the most satisfactory BCI-based performance, which was comparable to the manual control-based performance. We hypothesize that the proposed self-paced car control paradigm based on EEG signals could potentially be used in car control applications; it provides a complementary or alternative way for individuals with locked-in disorders to achieve more mobility in the future, as well as a supplementary car-driving strategy to assist healthy people in driving a car. Copyright © 2016 Elsevier Ltd. All rights reserved.
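
    The mapping from only two MI classes to six car commands is the interesting design point. The finite-state sketch below is a hypothetical illustration of how the issued command can depend on the car's current state as well as on the decoded class; the actual mapping used in the paper is not reproduced here, and the table is deliberately partial.

        # Hypothetical state machine: (car state, decoded MI class) -> command.
        # Backward/stop transitions are omitted for brevity.
        CAR_COMMANDS = {
            ("off",    "left"):  ("start engine", "idle"),
            ("idle",   "left"):  ("turn left",    "idle"),
            ("idle",   "right"): ("move forward", "moving"),
            ("moving", "left"):  ("turn left",    "moving"),
            ("moving", "right"): ("turn right",   "moving"),
        }

        def step(state, mi_class):
            command, new_state = CAR_COMMANDS.get((state, mi_class), ("no-op", state))
            print(f"{state:>7} + {mi_class:<5} -> {command}")
            return new_state

        state = "off"
        for decoded in ["left", "right", "left", "right"]:  # stream of MI decisions
            state = step(state, decoded)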

  4. A novel task-oriented optimal design for P300-based brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen

    2014-10-01

    Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.

  5. Associative vocabulary learning: development and testing of two paradigms for the (re-) acquisition of action- and object-related words.

    PubMed

    Freundlieb, Nils; Ridder, Volker; Dobel, Christian; Enriquez-Geppert, Stefanie; Baumgaertner, Annette; Zwitserlood, Pienie; Gerloff, Christian; Hummel, Friedhelm C; Liuzzi, Gianpiero

    2012-01-01

    Despite a growing number of studies, the neurophysiology of adult vocabulary acquisition is still poorly understood. One reason is that paradigms that can easily be combined with neuroscientific methods are rare. Here, we tested the efficiency of two paradigms for vocabulary (re-) acquisition, and compared the learning of novel words for actions and objects. Cortical networks involved in adult native-language word processing are widespread, with differences postulated between words for objects and actions. Words and what they stand for are supposed to be grounded in perceptual and sensorimotor brain circuits depending on their meaning. If there are specific brain representations for different word categories, we hypothesized behavioural differences in the learning of action-related and object-related words. Paradigm A, with the learning of novel words for body-related actions spread out over a number of days, revealed fast learning of these new action words, and stable retention up to 4 weeks after training. The single-session Paradigm B employed objects and actions. Performance during acquisition did not differ between action-related and object-related words (time*word category: p = 0.01), but the translation rate was clearly better for object-related (79%) than for action-related words (53%, p = 0.002). Both paradigms yielded robust associative learning of novel action-related words, as previously demonstrated for object-related words. Translation success differed for action- and object-related words, which may indicate different neural mechanisms. The paradigms tested here are well suited to investigate such differences with neuroscientific means. Given the stable retention and minimal requirements for conscious effort, these learning paradigms are promising for vocabulary re-learning in brain-lesioned people. In combination with neuroimaging, neuro-stimulation or pharmacological intervention, they may well advance the understanding of language learning to optimize therapeutic strategies.

  6. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  7. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  8. Voxel-based automated detection of focal cortical dysplasia lesions using diffusion tensor imaging and T2-weighted MRI data.

    PubMed

    Wang, Yanming; Zhou, Yawen; Wang, Huijuan; Cui, Jin; Nguchu, Benedictor Alexander; Zhang, Xufei; Qiu, Bensheng; Wang, Xiaoxiao; Zhu, Mingwang

    2018-05-21

    The aim of this study was to automatically detect focal cortical dysplasia (FCD) lesions in patients with extratemporal lobe epilepsy by relying on diffusion tensor imaging (DTI) and T2-weighted magnetic resonance imaging (MRI) data. We implemented an automated classifier using voxel-based multimodal features to identify gray and white matter abnormalities of FCD in patient cohorts. In addition to the commonly used T2-weighted image intensity feature, DTI-based features were also utilized. A Gaussian processes for machine learning (GPML) classifier was tested on 12 patients with FCD (8 with histologically confirmed FCD) scanned at 1.5 T and cross-validated using a leave-one-out strategy. Moreover, we compared the performance of the multimodal GPML paradigm with that of single-modality GPML and a classical support vector machine (SVM). Our results demonstrated that the GPML performance on DTI-based features (mean AUC = 0.63) matches the GPML performance on the T2-weighted image intensity feature (mean AUC = 0.64). More promisingly, GPML yielded significantly improved performance (mean AUC = 0.76) when the DTI-based features were added to the multimodal paradigm. Based on the results, it can also be clearly stated that the proposed GPML strategy performed better and is robust to unbalanced datasets, in contrast to the SVM, which performed more poorly (AUC = 0.69). Therefore, the GPML paradigm using multimodal MRI data containing the DTI modality yields promising results for the detection of FCD lesions and provides an effective direction for future research. Copyright © 2018 Elsevier Inc. All rights reserved.
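
    As a rough sketch of the multimodal classification idea, the code below trains a Gaussian process classifier on concatenated T2 and DTI-style voxel features with leave-one-out validation. scikit-learn's GPC stands in for the GPML toolbox, the features are synthetic, and the leave-one-out loop here is over voxels for brevity rather than over patients as in the study.

        # Multimodal GP classification sketch with leave-one-out validation.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessClassifier
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(2)
        n_voxels = 60
        X_t2 = rng.normal(size=(n_voxels, 1))               # T2 intensity feature
        X_dti = rng.normal(size=(n_voxels, 2))              # e.g. FA and MD features
        X = np.hstack([X_t2, X_dti])                        # multimodal feature vector
        y = (X @ np.array([1.0, 0.8, -0.6])
             + rng.normal(scale=0.8, size=n_voxels)) > 0    # synthetic lesion label

        correct = 0
        for train, test in LeaveOneOut().split(X):
            gpc = GaussianProcessClassifier().fit(X[train], y[train])
            correct += gpc.predict(X[test])[0] == y[test][0]
        print("LOO accuracy:", correct / n_voxels)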

  9. Systems Pharmacology-Based Discovery of Natural Products for Precision Oncology Through Targeting Cancer Mutated Genes.

    PubMed

    Fang, J; Cai, C; Wang, Q; Lin, P; Zhao, Z; Cheng, F

    2017-03-01

    Massive cancer genomics data have facilitated the rapid revolution of a novel oncology drug discovery paradigm through targeting clinically relevant driver genes or mutations for the development of precision oncology. Natural products with polypharmacological profiles have been demonstrated as promising agents for the development of novel cancer therapies. In this study, we developed an integrated systems pharmacology framework that facilitated identifying potential natural products that target mutated genes across 15 cancer types or subtypes in the realm of precision medicine. High performance was achieved for our systems pharmacology framework. In case studies, we computationally identified novel anticancer indications for several US Food and Drug Administration-approved or clinically investigational natural products (e.g., resveratrol, quercetin, genistein, and fisetin) through targeting significantly mutated genes in multiple cancer types. In summary, this study provides a powerful tool for the development of molecularly targeted cancer therapies through targeting the clinically actionable alterations by exploiting the systems pharmacology of natural products. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  10. Fractional charge and inter-Landau-level states at points of singular curvature.

    PubMed

    Biswas, Rudro R; Son, Dam Thanh

    2016-08-02

    The quest for universal properties of topological phases is fundamentally important because these signatures are robust to variations in system-specific details. Aspects of the response of quantum Hall states to smooth spatial curvature are well-studied, but challenging to observe experimentally. Here we go beyond this prevailing paradigm and obtain general results for the response of quantum Hall states to points of singular curvature in real space; such points may be readily experimentally actualized. We find, using continuum analytical methods, that the point of curvature binds an excess fractional charge and sequences of quantum states split away, energetically, from the degenerate bulk Landau levels. Importantly, these inter-Landau-level states are bound to the topological singularity and have energies that are universal functions of bulk parameters and the curvature. Our exact diagonalization of lattice tight-binding models on closed manifolds demonstrates that these results continue to hold even when lattice effects are significant. An important technological implication of these results is that these inter-Landau-level states, being both energetically and spatially isolated quantum states, are promising candidates for constructing qubits for quantum computation.

  11. Tensor-based classification of an auditory mobile BCI without a subject-specific calibration phase

    NASA Astrophysics Data System (ADS)

    Zink, Rob; Hunyadi, Borbála; Van Huffel, Sabine; De Vos, Maarten

    2016-04-01

    Objective. One of the major drawbacks in EEG brain-computer interfaces (BCI) is the need for subject-specific training of the classifier. By removing the need for a supervised calibration phase, new users could potentially explore a BCI faster. In this work we aim to remove this subject-specific calibration phase and allow direct classification. Approach. We explore canonical polyadic decompositions and block term decompositions of the EEG. These methods exploit structure in higher dimensional data arrays called tensors. The BCI tensors are constructed by concatenating ERP templates from other subjects to a target and non-target trial and the inherent structure guides a decomposition that allows accurate classification. We illustrate the new method on data from a three-class auditory oddball paradigm. Main results. The presented approach leads to a fast and intuitive classification with accuracies competitive with a supervised and cross-validated LDA approach. Significance. The described methods are a promising new way of classifying BCI data with a forthright link to the original P300 ERP signal over the conventional and widely used supervised approaches.
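
    A compact alternating-least-squares implementation of the canonical polyadic decomposition illustrates the tensor machinery referred to above; this is generic CP-ALS on a synthetic rank-1 array, not the authors' classification pipeline.

        # CP decomposition via alternating least squares for a 3-way array.
        import numpy as np

        def unfold(T, mode):
            return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

        def khatri_rao(A, B):
            # column-wise Kronecker product, rows ordered to match C-order unfolding
            return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

        def cp_als(T, rank, n_iter=200):
            rng = np.random.default_rng(0)
            factors = [rng.normal(size=(s, rank)) for s in T.shape]
            for _ in range(n_iter):
                for mode in range(3):
                    others = [factors[m] for m in range(3) if m != mode]
                    kr = khatri_rao(others[0], others[1])
                    gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
                    factors[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(gram)
            return factors

        rng = np.random.default_rng(1)
        T = np.einsum("i,j,k->ijk", rng.random(6), rng.random(5), rng.random(4))
        A, B, C = cp_als(T, rank=1)
        approx = np.einsum("ir,jr,kr->ijk", A, B, C)
        print("relative error:", np.linalg.norm(T - approx) / np.linalg.norm(T))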

  12. Studying and Treating Schizophrenia Using Virtual Reality: A New Paradigm

    PubMed Central

    Freeman, Daniel

    2008-01-01

    Understanding schizophrenia requires consideration of patients’ interactions in the social world. Misinterpretation of other people’s behavior is a key feature of persecutory ideation. The occurrence and intensity of hallucinations is affected by the social context. Negative symptoms such as anhedonia, asociality, and blunted affect reflect difficulties in social interactions. Withdrawal and avoidance of other people is frequent in schizophrenia, leading to isolation and rumination. The use of virtual reality (VR)—interactive immersive computer environments—allows one of the key variables in understanding psychosis, social environments, to be controlled, providing exciting applications to research and treatment. Seven applications of virtual social environments to schizophrenia are set out: symptom assessment, identification of symptom markers, establishment of predictive factors, tests of putative causal factors, investigation of the differential prediction of symptoms, determination of toxic elements in the environment, and development of treatment. The initial VR studies of persecutory ideation, which illustrate the ascription of personalities and mental states to virtual people, are highlighted. VR, suitably applied, holds great promise in furthering the understanding and treatment of psychosis. PMID:18375568

  13. Tensor-based classification of an auditory mobile BCI without a subject-specific calibration phase.

    PubMed

    Zink, Rob; Hunyadi, Borbála; Huffel, Sabine Van; Vos, Maarten De

    2016-04-01

    One of the major drawbacks in EEG brain-computer interfaces (BCI) is the need for subject-specific training of the classifier. By removing the need for a supervised calibration phase, new users could potentially explore a BCI faster. In this work we aim to remove this subject-specific calibration phase and allow direct classification. We explore canonical polyadic decompositions and block term decompositions of the EEG. These methods exploit structure in higher dimensional data arrays called tensors. The BCI tensors are constructed by concatenating ERP templates from other subjects to a target and non-target trial and the inherent structure guides a decomposition that allows accurate classification. We illustrate the new method on data from a three-class auditory oddball paradigm. The presented approach leads to a fast and intuitive classification with accuracies competitive with a supervised and cross-validated LDA approach. The described methods are a promising new way of classifying BCI data with a forthright link to the original P300 ERP signal over the conventional and widely used supervised approaches.

  14. Context-Aware Recommender Systems

    NASA Astrophysics Data System (ADS)

    Adomavicius, Gediminas; Tuzhilin, Alexander

    The importance of contextual information has been recognized by researchers and practitioners in many disciplines, including e-commerce personalization, information retrieval, ubiquitous and mobile computing, data mining, marketing, and management. While a substantial amount of research has already been performed in the area of recommender systems, most existing approaches focus on recommending the most relevant items to users without taking into account any additional contextual information, such as time, location, or the company of other people (e.g., for watching movies or dining out). In this chapter we argue that relevant contextual information does matter in recommender systems and that it is important to take this information into account when providing recommendations. We discuss the general notion of context and how it can be modeled in recommender systems. Furthermore, we introduce three different algorithmic paradigms - contextual pre-filtering, post-filtering, and modeling - for incorporating contextual information into the recommendation process, discuss the possibilities of combining several context-aware recommendation techniques into a single unifying approach, and provide a case study of one such combined approach. Finally, we present additional capabilities for context-aware recommenders and discuss important and promising directions for future research.
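
    Of the three paradigms, contextual pre-filtering is the simplest to sketch: ratings are filtered to the query context before an ordinary context-free recommender is applied. The toy data, context variable and scoring rule below are all made up for illustration.

        # Contextual pre-filtering sketch: filter ratings by context, then rank.
        ratings = [
            # (user, item, context, rating)
            ("u1", "pizzeria",   "weekday", 4), ("u1", "steakhouse", "weekend", 5),
            ("u2", "pizzeria",   "weekday", 5), ("u2", "steakhouse", "weekend", 2),
            ("u2", "pizzeria",   "weekend", 3), ("u1", "sushi_bar",  "weekday", 4),
        ]

        def recommend(user, context, k=1):
            # 1. pre-filter: keep only ratings observed in the query context
            in_context = [r for r in ratings if r[2] == context]
            # 2. context-free step: rank items by mean rating of *other* users
            scores = {}
            for u, item, _, rating in in_context:
                if u != user:
                    scores.setdefault(item, []).append(rating)
            ranked = sorted(scores, key=lambda i: -sum(scores[i]) / len(scores[i]))
            return ranked[:k]

        print(recommend("u1", "weekday"))   # recommendations valid for weekdays only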

  15. Fillers as Signals: Evidence from a Question-Answering Paradigm

    ERIC Educational Resources Information Center

    Walker, Esther J.; Risko, Evan F.; Kingstone, Alan

    2014-01-01

    The present study examined the influence of a human or computer "partner" on the production of fillers ("um" and "uh") during a question and answer task. Experiment 1 investigated whether or not responding to a human partner as opposed to a computer partner results in a higher rate of filler production. Participants…

  16. Running R Statistical Computing Environment Software on the Peregrine

    Science.gov Websites

    R is a collaborative project that supports the development of new statistical methodologies and enjoys a large user base; please consult the distribution for details. On Peregrine, R can be combined with programming paradigms that better leverage modern HPC systems; see the CRAN task view for High Performance Computing.

  17. Designing Social Presence in e-Learning Environments: Testing the Effect of Interactivity on Children

    ERIC Educational Resources Information Center

    Tung, Fang-Wu; Deng, Yi-Shin

    2006-01-01

    The "computers are social actors" paradigm asserts that human-to-computer interactions are fundamentally social responses. Earlier research has shown that effective management of the social presence in user interface design can improve user engagement and motivation. Much of this research has focused on adult subjects. This study…

  18. How Learning Logic Programming Affects Recursion Comprehension

    ERIC Educational Resources Information Center

    Haberman, Bruria

    2004-01-01

    Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…

  19. Philosophy of Language. Course Notes for a Tutorial on Computational Semantics.

    ERIC Educational Resources Information Center

    Wilks, Yorick

    This course was part of a tutorial focusing on the state of computational semantics, i.e., the state of work on natural language within the artificial intelligence (AI) paradigm. The discussion in the course centered on the philosophers Richard Montague and Ludwig Wittgenstein. The course was divided into three sections: (1)…

  20. Choosing Learning Methods Suitable for Teaching and Learning in Computer Science

    ERIC Educational Resources Information Center

    Taylor, Estelle; Breed, Marnus; Hauman, Ilette; Homann, Armando

    2013-01-01

    Our aim is to determine which teaching methods students in Computer Science and Information Systems prefer. There are in total 5 different paradigms (behaviorism, cognitivism, constructivism, design-based and humanism) with 32 models among them. Each model is unique and states different learning methods. Recommendations are made on methods that…

  1. Integrating a Music Curriculum into an External Degree Program Using Computer Assisted Instruction.

    ERIC Educational Resources Information Center

    Brinkley, Robert C.

    This paper outlines the method and theoretical basis for establishing and implementing an independent study music curriculum. The curriculum combines practical and theoretical paradigms and leads to an external degree. The computer, in direct interaction with the student, is the primary instructional tool, and the teacher is involved in indirect…

  2. Autonomous control systems: applications to remote sensing and image processing

    NASA Astrophysics Data System (ADS)

    Jamshidi, Mohammad

    2001-11-01

    One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear and interconnected, and information on the system is so uncertain that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, an integrated manufacturing plant, the Hubble Telescope, the International Space Station, etc. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of the soft computing control architecture will be nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and the enhancement of analog and digital images.

  3. A programming language for composable DNA circuits

    PubMed Central

    Phillips, Andrew; Cardelli, Luca

    2009-01-01

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415
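
    A schematic sketch of toehold-mediated strand displacement, the computational primitive the language is built on, may help: strands are modeled as tuples of abstract domains, branch migration is collapsed into a single rule, and the domain names are invented for the example. This is not the paper's language or its simulator.

        # Toehold-mediated strand displacement, abstracted to a single rule.
        def displace(gate, invader):
            """If the invader starts with the gate's exposed toehold and matches
            the bound domain, it displaces the incumbent output strand."""
            toehold, bound, incumbent = gate
            if invader[0] == toehold and invader[1] == bound:
                new_gate = (None, bound, invader)   # toehold consumed, invader bound
                return new_gate, incumbent          # incumbent released as output
            return gate, None

        gate = ("t", "x", ("x", "y"))               # waits for <t x>, holds <x y>
        for strand in [("a", "x"), ("t", "x")]:     # only the second has the toehold
            gate, released = displace(gate, strand)
            print(strand, "->", "released" if released else "no reaction",
                  released or "")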

  4. A programming language for composable DNA circuits.

    PubMed

    Phillips, Andrew; Cardelli, Luca

    2009-08-06

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing.

  5. GPU Accelerated Vector Median Filter

    NASA Technical Reports Server (NTRS)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n^2 vectors has to be compared with the other n^2 - 1 vectors in distance computations. General purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which has, to the best of our knowledge, never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimizations of the GPU algorithm.
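
    A direct (CPU-side) numpy rendering of the vector median filter clarifies what the GPU port accelerates: each output pixel is the window vector minimising the summed distance to all other vectors in its window. The brute-force pairwise distance computation below is exactly the cost that motivates a CUDA implementation; no GPU code is shown.

        # Vector median filter: per window, pick the colour vector with the
        # smallest summed Euclidean distance to all other vectors in the window.
        import numpy as np

        def vector_median_filter(img, n=3):
            h, w, _ = img.shape
            pad = n // 2
            padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
            out = np.empty_like(img)
            for y in range(h):
                for x in range(w):
                    win = padded[y:y + n, x:x + n].reshape(-1, 3).astype(float)
                    dists = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2)
                    out[y, x] = win[dists.sum(axis=1).argmin()]  # vector median
            return out

        noisy = np.random.randint(0, 256, size=(16, 16, 3), dtype=np.uint8)
        print(vector_median_filter(noisy).shape)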

  6. Online Graph Completion: Multivariate Signal Recovery in Computer Vision.

    PubMed

    Kim, Won Hwa; Jalal, Mona; Hwang, Seongjae; Johnson, Sterling C; Singh, Vikas

    2017-07-01

    The adoption of "human-in-the-loop" paradigms in computer vision and machine learning is leading to various applications where the actual data acquisition (e.g., human supervision) and the underlying inference algorithms are closely intertwined. While classical work in active learning provides effective solutions when the learning module involves classification and regression tasks, many practical issues such as partially observed measurements, financial constraints and even additional distributional or structural aspects of the data typically fall outside the scope of this treatment. For instance, with sequential acquisition of partial measurements of data that manifest as a matrix (or tensor), novel strategies for completion (or collaborative filtering) of the remaining entries have only been studied recently. Motivated by vision problems where we seek to annotate a large dataset of images via a crowdsourced platform or, alternatively, to complement results from a state-of-the-art object detector using human feedback, we study the "completion" problem defined on graphs, where requests for additional measurements must be made sequentially. We design the optimization model in the Fourier domain of the graph and describe how ideas based on adaptive submodularity provide algorithms that work well in practice. On a large set of images collected from Imgur, we see promising results on images that are otherwise difficult to categorize. We also show applications to an experimental design problem in neuroimaging.
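
    A minimal sketch of recovery in the Fourier domain of a graph, under the common bandlimited-signal assumption: eigenvectors of the graph Laplacian serve as the Fourier basis, and a smooth signal is reconstructed from a few sampled vertices by least squares. The random graph and the sample set are illustrative; the paper's adaptive-submodularity selection of which measurements to request is not implemented here.

        # Bandlimited signal recovery in the graph Fourier domain.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 30
        A = (rng.random((n, n)) < 0.15).astype(float)
        A = np.triu(A, 1); A = A + A.T                     # random undirected graph
        L = np.diag(A.sum(1)) - A                          # combinatorial Laplacian
        eigvals, U = np.linalg.eigh(L)                     # graph Fourier basis

        k = 5                                              # bandlimit: smooth signals
        signal = U[:, :k] @ rng.normal(size=k)             # ground-truth smooth signal

        observed = rng.choice(n, size=15, replace=False)   # sampled vertex set
        coeffs, *_ = np.linalg.lstsq(U[observed, :k], signal[observed], rcond=None)
        recovered = U[:, :k] @ coeffs
        print("recovery error:", np.linalg.norm(recovered - signal))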

  7. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    PubMed Central

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-01-01

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration. PMID:25116904

  8. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    PubMed

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-08-11

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  9. A novel P300-based brain-computer interface stimulus presentation paradigm: moving beyond rows and columns

    PubMed Central

    Townsend, G.; LaPallo, B.K.; Boulay, C.B.; Krusienski, D.J.; Frye, G.E.; Hauser, C.K.; Schwartz, N.E.; Vaughan, T.M.; Wolpaw, J.R.; Sellers, E.W.

    2010-01-01

    Objective An electroencephalographic brain-computer interface (BCI) can provide a non-muscular means of communication for people with amyotrophic lateral sclerosis (ALS) or other neuromuscular disorders. We present a novel P300-based BCI stimulus presentation paradigm – the checkerboard paradigm (CBP). CBP performance is compared to that of the standard row/column paradigm (RCP) introduced by Farwell and Donchin (1988). Methods Using an 8×9 matrix of alphanumeric characters and keyboard commands, 18 participants used the CBP and RCP in counter-balanced fashion. With approximately 9-12 minutes of calibration data, we used a stepwise linear discriminant analysis for online classification of subsequent data. Results Mean online accuracy was significantly higher for the CBP, 92%, than for the RCP, 77%. Correcting for extra selections due to errors, mean bit rate was also significantly higher for the CBP, 23 bits/min, than for the RCP, 17 bits/min. Moreover, the two paradigms produced significantly different waveforms. Initial tests with three advanced ALS participants produced similar results. Furthermore, these individuals preferred the CBP to the RCP. Conclusions These results suggest that the CBP is markedly superior to the RCP in performance and user acceptability. Significance The CBP has the potential to provide a substantially more effective BCI than the RCP. This is especially important for people with severe neuromuscular disabilities. PMID:20347387
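
    The bit rates quoted above follow the standard Wolpaw definition for an N-choice selection; the sketch below reproduces the formula, with the selections-per-minute figure assumed for illustration rather than taken from the paper.

      # Wolpaw bit rate: bits per selection for an N-choice BCI with
      # accuracy p; multiply by selections/min to obtain bits/min.
      from math import log2

      def bits_per_selection(n, p):
          if p >= 1.0:
              return log2(n)
          return log2(n) + p * log2(p) + (1 - p) * log2((1 - p) / (n - 1))

      n = 72                            # 8 x 9 matrix of selectable items
      for label, p in (("CBP", 0.92), ("RCP", 0.77)):
          b = bits_per_selection(n, p)
          # 4 selections/min is an assumed rate, for illustration only
          print(f"{label}: {b:.2f} bits/selection, {4 * b:.1f} bits/min")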

  10. Field Computation and Nonpropositional Knowledge.

    DTIC Science & Technology

    1987-09-01

    …field computer. It is based on a generalization of Taylor's theorem to continuous-dimensional vector spaces. A number of field computations are illustrated, including several transformations… paradigm. The "old" AI has been quite successful in performing a number of difficult tasks, such as theorem proving, chess playing, and medical diagnosis…

  11. Attentional Selection in Object Recognition

    DTIC Science & Technology

    1993-02-01

    …order. It also affects the choice of strategies in both the filtering and arbiter stages… such processing. In Treisman's model this was hidden in the concept of the selection filter. Later computational models of attention tried to… This thesis presents a novel approach to the selection problem by proposing a computational model of visual attentional selection as a paradigm for…

  12. A dynamic, embodied paradigm to investigate the role of serotonin in decision-making

    PubMed Central

    Asher, Derrik E.; Craig, Alexis B.; Zaldivar, Andrew; Brewer, Alyssa A.; Krichmar, Jeffrey L.

    2013-01-01

    Serotonin (5-HT) is a neuromodulator that has been attributed to cost assessment and harm aversion. In this review, we look at the role 5-HT plays in making decisions when subjects are faced with potential harmful or costly outcomes. We review approaches for examining the serotonergic system in decision-making. We introduce our group’s paradigm used to investigate how 5-HT affects decision-making. In particular, our paradigm combines techniques from computational neuroscience, socioeconomic game theory, human–robot interaction, and Bayesian statistics. We will highlight key findings from our previous studies utilizing this paradigm, which helped expand our understanding of 5-HT’s effect on decision-making in relation to cost assessment. Lastly, we propose a cyclic multidisciplinary approach that may aid in addressing the complexity of exploring 5-HT and decision-making by iteratively updating our assumptions and models of the serotonergic system through exhaustive experimentation. PMID:24319413

  13. Teacher Leadership: A Promising Paradigm for Improving Instruction in Science and Mathematics.

    ERIC Educational Resources Information Center

    Pellicer, Leonard O.; Anderson, Lorin W.

    Without question teacher leadership is more important today to the success of America's schools than it has ever been before. As schools and the populations they serve have grown in size and complexity, principals can no longer be expected to be the sole, or even the primary, source of instructional leadership. This realization has come about as a…

  14. More than Just Story-Telling: Cultural-Historical Activity Theory as an Under-Utilized Methodology for Educational Change Research

    ERIC Educational Resources Information Center

    Lee, Yew-Jin

    2011-01-01

    Sociocultural theory is increasingly popular as a paradigm for research in education. A recent member in this family of theories is introduced--cultural-historical activity theory (CHAT)--that shows much promise to complement and invigorate the field of educational change, a large, multi-faceted, and persistent problematic. In particular,…

  15. Quest to Learn: Developing the School for Digital Kids

    ERIC Educational Resources Information Center

    Salen, Katie; Torres, Robert; Wolozin, Loretta; Rufo-Tepper, Rebecca; Shapiro, Arana

    2011-01-01

    Quest to Learn, an innovative school for grades 6 to 12 in New York City, grew out of the idea that gaming and game design offer a promising new paradigm for curriculum and learning. The designers of Quest to Learn developed an approach to learning that draws from what games do best: drop kids into inquiry-based, complex problem spaces that are…

  16. Harm Reduction for the Prevention of Youth Gambling Problems: Lessons Learned From Adolescent High-Risk Behavior Prevention Programs

    ERIC Educational Resources Information Center

    Dickson, Laurie M.; Derevensky, Jeffrey L.; Gupta, Rina

    2004-01-01

    Despite the growing popularity of the harm reduction approach in the field of adolescent alcohol and substance abuse, a harm reduction approach to prevention and treatment of youth problem gambling remains largely unexplored. This article poses the question of whether the harm reduction paradigm is a promising approach to the prevention of…

  17. Tactile and bone-conduction auditory brain computer interface for vision and hearing impaired users.

    PubMed

    Rutkowski, Tomasz M; Mori, Hiromu

    2015-04-15

    The paper presents a report on the recently developed BCI alternative for users suffering from impaired vision (lack of focus or eye-movements) or from the so-called "ear-blocking-syndrome" (limited hearing). We report on our recent studies of the extent to which vibrotactile stimuli delivered to the head of a user can serve as a platform for a brain computer interface (BCI) paradigm. In the proposed tactile and bone-conduction auditory BCI, multiple novel head positions are used to evoke combined somatosensory and auditory (via the bone conduction effect) P300 brain responses, in order to define a multimodal tactile and bone-conduction auditory brain computer interface (tbcaBCI). In order to further remove EEG interference and to improve P300 response classification, the synchrosqueezing transform (SST) is applied. SST outperforms classical time-frequency analysis methods for non-linear and non-stationary signals such as EEG. The proposed method is also computationally more efficient than empirical mode decomposition. The SST filtering allows for online EEG preprocessing, which is essential in the case of BCI. Experimental results with healthy BCI-naive users performing online tbcaBCI validate the paradigm, while the feasibility of the concept is illuminated through information transfer rate case studies. We present a comparison of the proposed SST-based preprocessing method, combined with a logistic regression (LR) classifier, against classical preprocessing and LDA-based classification BCI techniques. The proposed tbcaBCI paradigm, together with data-driven preprocessing methods, is a step forward in robust BCI applications research. Copyright © 2014 Elsevier B.V. All rights reserved.
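
    The SST preprocessing stage is beyond a short sketch, but the classification stage named above is conventional; a minimal version (logistic regression on hypothetical single-trial feature vectors, with a weak injected target effect) might look as follows.

      # Classification stage only: logistic regression separating target
      # from non-target trials. Feature vectors here are synthetic stand-ins
      # for SST-preprocessed ERP epochs.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 64))       # 200 trials, 64 features (assumed)
      y = rng.integers(0, 2, size=200)     # target vs non-target labels
      X[y == 1, :8] += 0.5                 # weak simulated P300 effect

      clf = LogisticRegression(max_iter=1000)
      print(f"CV accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")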

  18. The Impact of Information Technology on the Design, Development, and Implementation of a Lunar Exploration Mission

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Sims, Michael H.; Briggs, Geoffrey A.

    1996-01-01

    From the beginning to the present, expeditions to the Moon have involved a large investment of human labor. This has been true for all aspects of the process, from the initial design of the mission, whether scientific or technological, through the development of the instruments and the spacecraft, to the flight and operational phases. In addition to the time constraints that this situation imposes, there is also the significant expense associated with such labor-intensive work. As a result, lunar expeditions have been limited to a few robotic missions and the manned Apollo program missions of the 1970s. With the rapid rise of the new information technologies, new paradigms are emerging that promise to greatly reduce both the time and cost of such missions. With the rapidly increasing capabilities of computer hardware and software systems, as well as networks and communication systems, a new balance of work is being developed between the human and the machine system. This new balance holds the promise of greatly increased exploration capability, along with dramatically reduced design, development, and operating costs. These new information technologies, utilizing knowledge-based software and very high-speed computer systems, will provide new design and development tools, scheduling mechanisms, and vehicle and system health monitoring capabilities that have hitherto been unavailable to the mission and spacecraft designer and the system operator. This paper will utilize typical lunar missions, both robotic and crewed, as a basis to describe and illustrate how these new information system technologies could be applied to all aspects of such missions. In particular, new system design trade-off tools will be described, along with technologies that will allow a much greater degree of autonomy for exploration vehicles than has heretofore been possible. In addition, new information technologies that will significantly reduce human operational requirements will be discussed.

  19. Small Universal Bacteria and Plasmid Computing Systems.

    PubMed

    Wang, Xun; Zheng, Pan; Ma, Tongmao; Song, Tao

    2018-05-29

    Bacterial computing is a known candidate in natural computing, the aim being to construct "bacterial computers" for solving complex problems. In this paper, a new kind of bacterial computing system, named the bacteria and plasmid computing system (BP system), is proposed. We investigate the computational power of BP systems with finite numbers of bacteria and plasmids. Specifically, it is shown in a constructive way that a BP system with 2 bacteria and 34 plasmids is Turing universal. The results provide a theoretical cornerstone for constructing powerful bacterial computers and demonstrate a paradigm that uses a "reasonable" number of bacteria and plasmids for such devices.

  20. Optimizing the stimulus presentation paradigm design for the P300-based brain-computer interface using performance prediction.

    PubMed

    Mainsah, B O; Reeves, G; Collins, L M; Throckmorton, C S

    2017-08-01

    The role of a brain-computer interface (BCI) is to discern a user's intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.

  1. Optimizing the stimulus presentation paradigm design for the P300-based brain-computer interface using performance prediction

    NASA Astrophysics Data System (ADS)

    Mainsah, B. O.; Reeves, G.; Collins, L. M.; Throckmorton, C. S.

    2017-08-01

    Objective. The role of a brain-computer interface (BCI) is to discern a user’s intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. Approach. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. Main results. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. Significance. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.

  2. A Co-Adaptive Brain-Computer Interface for End Users with Severe Motor Impairment

    PubMed Central

    Faller, Josef; Scherer, Reinhold; Costa, Ursula; Opisso, Eloy; Medina, Josep; Müller-Putz, Gernot R.

    2014-01-01

    Co-adaptive training paradigms for event-related desynchronization (ERD) based brain-computer interfaces (BCI) have proven effective for healthy users. As of yet, it is not clear whether co-adaptive training paradigms can also benefit users with severe motor impairment. The primary goal of our paper was to evaluate a novel cue-guided, co-adaptive BCI training paradigm with severely impaired volunteers. The co-adaptive BCI supports a non-control state, which is an important step toward intuitive, self-paced control. A secondary aim was to have the same participants operate a specifically designed self-paced BCI training paradigm based on the auto-calibrated classifier. The co-adaptive BCI analyzed the electroencephalogram from three bipolar derivations (C3, Cz, and C4) online, while the 22 end users alternately performed right hand movement imagery (MI), left hand MI and relax with eyes open (non-control state). After less than five minutes, the BCI auto-calibrated and proceeded to provide visual feedback for the MI task that could be classified better against the non-control state. The BCI continued to regularly recalibrate. In every calibration step, the system performed trial-based outlier rejection and trained a linear discriminant analysis classifier based on one auto-selected logarithmic band-power feature. In 24 minutes of training, the co-adaptive BCI worked significantly (p = 0.01) better than chance for 18 of 22 end users. The self-paced BCI training paradigm worked significantly (p = 0.01) better than chance in 11 of 20 end users. The presented co-adaptive BCI complements existing approaches in that it supports a non-control state, requires very little setup time, requires no BCI expert and works online based on only two electrodes. The preliminary results from the self-paced BCI paradigm compare favorably to previous studies, and the collected data will allow us to further improve self-paced BCI systems for disabled users. PMID:25014055
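
    The feature-classifier pair at the heart of each recalibration step is standard; the sketch below shows one such step (log band-power of band-pass-filtered epochs into LDA) on synthetic data, with the band edges, sampling rate and epoch sizes assumed rather than taken from the paper.

      # One auto-calibration step of the kind described above: a logarithmic
      # band-power feature feeding an LDA classifier. All parameters assumed.
      import numpy as np
      from scipy.signal import butter, filtfilt
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      fs = 250                                       # sampling rate (Hz), assumed
      b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")  # mu band

      def log_bandpower(epochs):                     # epochs: (trials, samples)
          filtered = filtfilt(b, a, epochs, axis=1)
          return np.log(np.var(filtered, axis=1, keepdims=True))

      rng = np.random.default_rng(1)
      mi = rng.normal(scale=1.0, size=(40, 1000))    # motor-imagery epochs
      rest = rng.normal(scale=1.3, size=(40, 1000))  # non-control-state epochs
      X = np.vstack([log_bandpower(mi), log_bandpower(rest)])
      y = np.array([1] * 40 + [0] * 40)

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(f"training accuracy: {lda.score(X, y):.2f}")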

  3. Optimized Motor Imagery Paradigm Based on Imagining Chinese Characters Writing Movement.

    PubMed

    Qiu, Zhaoyang; Allison, Brendan Z; Jin, Jing; Zhang, Yu; Wang, Xingyu; Li, Wei; Cichocki, Andrzej

    2017-07-01

    Motor imagery (MI) is a mental representation of motor behavior. MI-based brain computer interfaces (BCIs) can provide communication for the physically impaired. The performance of an MI-based BCI mainly depends on the subject's ability to self-modulate electroencephalogram signals. Proper training can help naive subjects learn to modulate brain activity proficiently. However, training typically involves abstract motor tasks and is time-consuming. To improve the performance of naive subjects during motor imagery, a novel paradigm was presented that guides naive subjects to modulate brain activity effectively. In this new paradigm, pictures of the left or right hand were used as cues for subjects to complete the motor imagery task. Fourteen healthy subjects (11 male, aged 22-25 years, mean 23.6 ± 1.16) participated in this study. The task was to imagine writing a Chinese character. Specifically, subjects could imagine hand movements corresponding to the sequence of writing strokes in the Chinese character. This paradigm was meant to provide an effective and familiar action for most Chinese people, giving them a specific, extensively practiced task to help them modulate brain activity. Results showed that the writing task paradigm yielded significantly better performance than the traditional arrow paradigm (p < 0.001). Questionnaire replies indicated that most subjects thought the new paradigm was easier. The proposed new motor imagery paradigm could guide subjects to modulate brain activity effectively, with significant improvements over the traditional paradigm in both classification accuracy and usability.

  4. Computers in the English Program: Promises and Pitfalls. New York State English Council Monograph.

    ERIC Educational Resources Information Center

    Chew, Charles R., Ed.

    Since an increasing number of English teachers are being asked to find a place for computers in the English program, this monograph focuses on issues connected to this technology. The first article sets the stage with a discussion of the power and potential of the computer. Other articles focus on the following topics: (1) promises and…

  5. The Chinese Facial Emotion Recognition Database (CFERD): a computer-generated 3-D paradigm to measure the recognition of facial emotional expressions at different intensities.

    PubMed

    Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long

    2012-12-30

    The Chinese Facial Emotion Recognition Database (CFERD), a computer-generated three-dimensional (3D) paradigm, was developed to measure the recognition of facial emotional expressions at different intensities. The stimuli consisted of 3D colour photographic images of six basic facial emotional expressions (happiness, sadness, disgust, fear, anger and surprise) and neutral faces of the Chinese. The purpose of the present study is to describe the development and validation of the CFERD with nonclinical healthy participants (N=100; 50 men; age ranging between 18 and 50 years), and to generate a normative data set. The results showed that the sensitivity index d' [d' = Z(hit rate) - Z(false alarm rate), where Z(p), p ∈ [0,1], is the inverse of the cumulative Gaussian distribution function]…
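
    With the bracketed definition restored, d' is a two-line computation; in the sketch below, Z is the inverse cumulative Gaussian (SciPy's percent-point function), and the hit and false-alarm rates are illustrative values, not CFERD norms.

      # Sensitivity index d' = Z(hit rate) - Z(false alarm rate), with Z
      # the inverse of the cumulative Gaussian distribution.
      from scipy.stats import norm

      def d_prime(hit_rate, false_alarm_rate):
          return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

      print(d_prime(0.85, 0.20))   # illustrative rates; prints ~1.88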

  6. Performance analysis of a large-grain dataflow scheduling paradigm

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Wills, Robert W.

    1993-01-01

    A paradigm for scheduling computations on a network of multiprocessors using large-grain dataflow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself will be dynamic (i.e., determined at run time). Many applications characterized by static flow exist, including real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, the macro-dataflow method allows the scheduling to be insulated from the application designer and enables maximum utilization of available resources. Further, by allowing multitasking, processor utilizations can approach 100 percent while maintaining maximum speedup. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.
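
    The scheduling idea is easy to state in miniature: the flow graph is fixed, but the task-to-processor assignment is decided as predecessors complete. The toy scheduler below (graph, durations and processor count all illustrative) captures that run-time behavior; it is not the paper's simulator.

      # Toy dynamic scheduler for a static dataflow graph: a task becomes
      # ready once all predecessors finish; ready tasks go to whichever
      # processor frees up first.
      import heapq

      deps = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
      duration = {"A": 2, "B": 3, "C": 1, "D": 2}

      procs = [(0, i) for i in range(2)]     # (free-at time, processor id)
      heapq.heapify(procs)
      finish, pending = {}, dict(deps)

      while pending:
          ready = sorted(t for t, d in pending.items() if all(p in finish for p in d))
          for task in ready:
              free_at, pid = heapq.heappop(procs)
              start = max([free_at] + [finish[p] for p in deps[task]])
              finish[task] = start + duration[task]
              heapq.heappush(procs, (finish[task], pid))
              print(f"{task} on P{pid}: t={start}..{finish[task]}")
              del pending[task]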

  7. Computation of forces arising from the polarizable continuum model within the domain-decomposition paradigm

    NASA Astrophysics Data System (ADS)

    Gatto, Paolo; Lipparini, Filippo; Stamm, Benjamin

    2017-12-01

    The domain-decomposition (dd) paradigm, originally introduced for the conductor-like screening model, has been recently extended to the dielectric Polarizable Continuum Model (PCM), resulting in the ddPCM method. We present here a complete derivation of the analytical derivatives of the ddPCM energy with respect to the positions of the solute's atoms and discuss their efficient implementation. As it is the case for the energy, we observe a quadratic scaling, which is discussed and demonstrated with numerical tests.

  8. [The positioning of nursing research in the academic studies: the origin and development of qualitative and quantitative studies].

    PubMed

    Lu, Pei-Pei; Ting, Shing-Shiang; Chen, Mei-Ling; Tang, Woung-Ru

    2005-12-01

    The purpose of this study is to discuss the historical context of qualitative and quantitative research, so as to explain the principles of qualitative study and examine the positioning of nursing research within academic study as a whole. This paper guides the reader through the historical context of empirical science, discusses the influences of qualitative and quantitative research on nursing research, investigates the nature of research paradigms, and examines the positioning of nursing research in relation to the natural sciences, the humanities and the social sciences; lastly, it presents the research standards proposed by Yardley in 2000. The research paradigms include positivism, postpositivism, criticism, and constructivism, which can be compared along the dimensions of ontology, epistemology, and methodology. The nature of a paradigm is determined by its assumptions about ontology, epistemology, and methodology; the paradigm determines how researchers view the world and decides what to answer, how to research, and how to answer. The difference in academic environment is reflected in the long-term dialogue between qualitative and quantitative studies, as well as in the standards for criticism. This paper introduces Yardley's (2000) criteria for evaluating the quality of qualitative study, namely sensitivity to context; commitment and rigour; transparency and coherence; and impact and importance. The paper is intended to provide a guideline for readers in evaluating the quality of qualitative study.

  9. Testing neurophysiological markers related to fear-potentiated startle.

    PubMed

    Seligowski, Antonia V; Bondy, Erin; Singleton, Paris; Orcutt, Holly K; Ressler, Kerry J; Auerbach, Randy P

    2018-06-11

    Fear-potentiated startle (FPS) paradigms provide insight into fear learning mechanisms that contribute to impairment among individuals with posttraumatic stress symptoms (PTSS). Electrophysiology also has provided insight into these mechanisms through the examination of event-related potentials (ERPs) such as the P100 and LPP. It remains unclear, however, whether the P100 and LPP may be related to fear learning processes within the FPS paradigm. To this end, we tested differences in ERP amplitudes for conditioned stimuli associated (CS+) and not associated (CS-) with an aversive unconditioned stimulus (US) during fear acquisition. Participants included 54 female undergraduate students (mean age = 20.26). The FPS response was measured via electromyography of the orbicularis oculi muscle. EEG data were collected during the FPS paradigm. While the difference between CS+ and CS- P100 amplitude was not significant, LPP amplitudes were significantly enhanced following the CS+ relative to CS-. Furthermore, the LPP difference wave (CS+ minus CS-) was associated with FPS scores for the CS- during the later portion of fear acquisition. These findings suggest that conditioned stimuli may have altered emotional encoding (LPP) during the FPS paradigm. Thus, the LPP may be a promising neurophysiological marker that is related to fear learning processes. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Multidimensional Space-Time Methodology for Development of Planetary and Space Sciences, S-T Data Management and S-T Computational Tomography

    NASA Astrophysics Data System (ADS)

    Andonov, Zdravko

    This R&D represent innovative multidimensional 6D-N(6n)D Space-Time (S-T) Methodology, 6D-6nD Coordinate Systems, 6D Equations, new 6D strategy and technology for development of Planetary Space Sciences, S-T Data Management and S-T Computational To-mography. . . The Methodology is actual for brain new RS Microwaves' Satellites and Compu-tational Tomography Systems development, aimed to defense sustainable Earth, Moon, & Sun System evolution. Especially, extremely important are innovations for monitoring and protec-tion of strategic threelateral system H-OH-H2O Hydrogen, Hydroxyl and Water), correspond-ing to RS VHRS (Very High Resolution Systems) of 1.420-1.657-22.089GHz microwaves. . . One of the Greatest Paradox and Challenge of World Science is the "transformation" of J. L. Lagrange 4D Space-Time (S-T) System to H. Minkovski 4D S-T System (O-X,Y,Z,icT) for Einstein's "Theory of Relativity". As a global result: -In contemporary Advanced Space Sciences there is not real adequate 4D-6D Space-Time Coordinate System and 6D Advanced Cosmos Strategy & Methodology for Multidimensional and Multitemporal Space-Time Data Management and Tomography. . . That's one of the top actual S-T Problems. Simple and optimal nD S-T Methodology discovery is extremely important for all Universities' Space Sci-ences' Education Programs, for advances in space research and especially -for all young Space Scientists R&D!... The top ten 21-Century Challenges ahead of Planetary and Space Sciences, Space Data Management and Computational Space Tomography, important for successfully de-velopment of Young Scientist Generations, are following: 1. R&D of W. R. Hamilton General Idea for transformation all Space Sciences to Time Sciences, beginning with 6D Eukonal for 6D anisotropic mediums & velocities. Development of IERS Earth & Space Systems (VLBI; LLR; GPS; SLR; DORIS Etc.) for Planetary-Space Data Management & Computational Planetary & Space Tomography. 2. R&D of S. W. Hawking Paradigm for 2D Complex Time and Quan-tum Wave Cosmology Paradigm for Decision of the Main Problem of Contemporary Physics. 3. R&D of Einstein-Minkowski Geodesies' Paradigm in the 4D-Space-Time Continuum to 6D-6nD Space-Time Continuum Paradigms and 6D S-T Equations. . . 4. R&D of Erwin Schrüdinger 4D S-T Universe' Evolutional Equation; It's David Bohm 4D generalization for anisotropic mediums and innovative 6D -for instantaneously quantum measurement -Bohm-Schrüdinger 6D S-T Universe' Evolutional Equation. 5. R&D of brain new 6D Planning of S-T Experi-ments, brain new 6D Space Technicks and Space Technology Generalizations, especially for 6D RS VHRS Research, Monitoring and 6D Computational Tomography. 6. R&D of "6D Euler-Poisson Equations" and "6D Kolmogorov Turbulence Theory" for GeoDynamics and for Space Dynamics as evolution of Gauss-Riemann Paradigms. 7. R&D of N. Boneff NASA RD for Asteroid "Eros" & Space Science' Laws Evolution. 8. R&D of H. Poincare Paradigm for Nature and Cosmos as 6D Group of Transferences. 9. R&D of K. Popoff N-Body General Problem & General Thermodynamic S-T Theory as Einstein-Prigogine-Landau' Paradigms Development. ü 10. R&D of 1st GUT since 1958 by N. S. Kalitzin (Kalitzin N. S., 1958: Uber eine einheitliche Feldtheorie. ZAHeidelberg-ARI, WZHUmnR-B., 7 (2), 207-215) and "Multitemporal Theory of Relativity" -With special applications to Photon Rockets and all Space-Time R&D. 
GENERAL CONCLUSION: Multidimensional Space-Time Methodology is advance in space research, corresponding to the IAF-IAA-COSPAR Innovative Strategy and R&D Programs -UNEP, UNDP, GEOSS, GMES, Etc.

  11. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    ERIC Educational Resources Information Center

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  12. Development of a Computer-Assisted Cranial Nerve Simulation from the Visible Human Dataset

    ERIC Educational Resources Information Center

    Yeung, Jeffrey C.; Fung, Kevin; Wilson, Timothy D.

    2011-01-01

    Advancements in technology and personal computing have allowed for the development of novel teaching modalities such as online web-based modules. These modules are currently being incorporated into medical curricula and, in some paradigms, have been shown to be superior to classroom instruction. We believe that these modules have the potential of…

  13. Evaluating the Acceptance of Cloud-Based Productivity Computer Solutions in Small and Medium Enterprises

    ERIC Educational Resources Information Center

    Dominguez, Alfredo

    2013-01-01

    Cloud computing has emerged as a new paradigm for on-demand delivery and consumption of shared IT resources over the Internet. Research has predicted that small and medium organizations (SMEs) would be among the earliest adopters of cloud solutions; however, this projection has not materialized. This study set out to investigate if behavior…

  14. Cloud Implementation in Organizations: Critical Success Factors, Challenges, and Impacts on the IT Function

    ERIC Educational Resources Information Center

    Suo, Shuguang

    2013-01-01

    Organizations have been forced to rethink business models and restructure facilities through IT innovation as they have faced the challenges arising from globalization, mergers and acquisitions, big data, and the ever-changing demands of customers. Cloud computing has emerged as a new computing paradigm that has fundamentally shaped the business…

  15. Web-Based Seamless Migration for Task-Oriented Mobile Distance Learning

    ERIC Educational Resources Information Center

    Zhang, Degan; Li, Yuan-chao; Zhang, Huaiyu; Zhang, Xinshang; Zeng, Guangping

    2006-01-01

    As a new kind of computing paradigm, pervasive computing will meet the requirements of human being that anybody maybe obtain services in anywhere and at anytime, task-oriented seamless migration is one of its applications. Apparently, the function of seamless mobility is suitable for mobile services, such as mobile Web-based learning. In this…

  16. Information technologies for astrophysics circa 2001

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is also easy to see what limits our current paradigms place on our thinking about technologies that would allow us to understand the laws governing very large systems about which we have large datasets. Three limiting paradigms are: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear mode of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

  17. Challenging the paradigm of singularity excision in gravitational collapse.

    PubMed

    Baiotti, Luca; Rezzolla, Luciano

    2006-10-06

    A paradigm deeply rooted in modern numerical relativity calculations prescribes the removal of those regions of the computational domain where a physical singularity may develop. We here challenge this paradigm by performing three-dimensional simulations of the collapse of uniformly rotating stars to black holes without excision. We show that this choice, combined with suitable gauge conditions and the use of minute numerical dissipation, improves dramatically the long-term stability of the evolutions. In turn, this allows for the calculation of the waveforms well beyond what was previously possible, providing information on the black-hole ringing and setting a new mark on the present knowledge of the gravitational-wave emission from the stellar collapse to a rotating black hole.

  18. Genetic selection for coping style predicts stressor susceptibility.

    PubMed

    Veenema, A H; Meijer, O C; de Kloet, E R; Koolhaas, J M

    2003-03-01

    Genetically selected aggressive (SAL) and nonaggressive (LAL) male wild house-mice, which show distinctly different coping styles, also display a differential regulation of the hypothalamic-pituitary-adrenal axis after exposure to an acute stressor. To test the hypothesis that coping style predicts stressor susceptibility, the present study examined line differences in response to a chronic stressor. Chronic psychosocial stress was evoked using two paradigms. In the first paradigm, a SAL or LAL male lived in sensory contact (without tactile contact) with a dominant SAL male for 25 days (sensory contact stress). In the second paradigm, a SAL or LAL male was, in addition to the first paradigm, defeated by a SAL male for 21 consecutive days (defeat stress). In LAL mice, the sensory contact stressor induced chronic body weight loss and increased plasma adrenocorticotropic hormone levels compared to SAL mice, as well as increased corticosterone levels, thymus involution and a lower hippocampal mineralocorticoid receptor (MR) : glucocorticoid receptor (GR) ratio compared to LAL controls. The defeat stressor increased corticosterone secretion and caused adrenal hypertrophy and thymus involution in both mouse lines. Defeated LAL mice showed longer-lasting body weight loss and higher corticosterone concentrations than SAL mice, and a lower hippocampal MR : GR ratio and decreased immobility behaviour in the forced swimming test compared to LAL controls. Hypothalamic corticotropin-releasing hormone mRNA expression was higher in defeated SAL mice than in controls. The present data show that both stress paradigms induced line-dependent physiological and neuroendocrine changes, but that the sensory contact stressor produced chronic stress symptoms in LAL mice only. This latter stress paradigm therefore seems promising for analysing the role of genetic factors in individual differences in stress-related psychopathology.

  19. Big(ger) Data as Better Data in Open Distance Learning

    ERIC Educational Resources Information Center

    Prinsloo, Paul; Archer, Elizabeth; Barnes, Glen; Chetty, Yuraisha; van Zyl, Dion

    2015-01-01

    In the context of the hype, promise and perils of Big Data and the currently dominant paradigm of data-driven decision-making, it is important to critically engage with the potential of Big Data for higher education. We do not question the potential of Big Data, but we do raise a number of issues, and present a number of theses to be seriously…

  20. Oral Traditions: A Contextual Framework for Complex Science Concepts--Laying the Foundation for a Paradigm of Promise in Rural Science Education

    ERIC Educational Resources Information Center

    Avery, Leanne M.; Hains, Bryan J.

    2017-01-01

    The overarching goal of this paper is to bring a diverse educational context--rural sayings and oral traditions situated in ecological habitats--to light and emphasize that they need to be taken into consideration regarding twenty-first century science education. The rural sayings or tenets presented here are also considered alternative ways of…

  1. Bridging the practitioner-scientist gap in group psychotherapy research.

    PubMed

    Lau, Mark A; Ogrodniczuk, John; Joyce, Anthony S; Sochting, Ingrid

    2010-04-01

    Bridging the practitioner-scientist gap requires a different clinical research paradigm: participatory research that encourages community agency-academic partnerships. In this context, clinicians help define priorities, determine the type of evidence that will have an impact on their practice (affecting the methods that are used to produce the evidence), and develop strategies for translating, implementing, and disseminating their findings into evidence-based practice. Within this paradigm, different roles are assumed by the partners, and sometimes these roles are blended. This paper will consider the perspectives of people who assume these different roles (clinician, researcher, and clinician-researcher) with group psychotherapy as the specific focus. Finally, the establishment of a practice-research network will be discussed as a potentially promising way to better engage group therapists in research.

  2. Impact of Different Visual Field Testing Paradigms on Sample Size Requirements for Glaucoma Clinical Trials.

    PubMed

    Wu, Zhichao; Medeiros, Felipe A

    2018-03-20

    Visual field testing is an important endpoint in glaucoma clinical trials, and the testing paradigm used can have a significant impact on the sample size requirements. To investigate this, the study included 353 eyes of 247 glaucoma patients seen over a 3-year period to extract real-world visual field rates of change and variability estimates, which informed sample size estimates from computer simulations. The clinical trial scenario assumed that a new treatment was added for one of two groups that were both under routine clinical care, with various treatment effects examined. Three different visual field testing paradigms were evaluated: a) evenly spaced testing; b) the United Kingdom Glaucoma Treatment Study (UKGTS) follow-up scheme, which adds clustered tests at the beginning and end of follow-up in addition to evenly spaced testing; and c) a clustered testing paradigm, with clusters of tests at the beginning and end of the trial period and two intermediary visits. The sample size requirements were reduced by 17-19% and 39-40% using the UKGTS and clustered testing paradigms, respectively, when compared to the evenly spaced approach. These findings highlight how the clustered testing paradigm can substantially reduce sample size requirements and improve the feasibility of future glaucoma clinical trials.
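
    Much of the clustered paradigm's advantage can be seen from first principles: the variance of an ordinary-least-squares slope estimate is sigma^2 / sum_i (t_i - t_mean)^2, so schedules that push tests toward the ends of follow-up shrink slope variance for the same number of tests. The sketch below compares that factor for two illustrative 10-test, 3-year schedules; the actual trial schedules may differ.

      # Relative variance of the OLS slope estimate under two visit schedules:
      # Var(slope) = sigma^2 / sum((t - mean(t))^2), so only the spread of
      # visit times matters for a fixed number of tests.
      import numpy as np

      def slope_var_factor(times):
          t = np.asarray(times, dtype=float)
          return 1.0 / np.sum((t - t.mean()) ** 2)

      even = np.linspace(0, 3, 10)                     # evenly spaced (years)
      clustered = [0, 0, 0, 1, 1.5, 2, 3, 3, 3, 3]     # clusters at both ends

      base = slope_var_factor(even)
      print(f"evenly spaced: {slope_var_factor(even) / base:.2f}")
      print(f"clustered:     {slope_var_factor(clustered) / base:.2f}")  # ~0.57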

  3. Interaction Entropy: A New Paradigm for Highly Efficient and Reliable Computation of Protein-Ligand Binding Free Energy.

    PubMed

    Duan, Lili; Liu, Xiao; Zhang, John Z H

    2016-05-04

    Efficient and reliable calculation of protein-ligand binding free energy is a grand challenge in computational biology and is of critical importance in drug design and many other molecular recognition problems. The main challenge lies in the calculation of entropic contribution to protein-ligand binding or interaction systems. In this report, we present a new interaction entropy method which is theoretically rigorous, computationally efficient, and numerically reliable for calculating entropic contribution to free energy in protein-ligand binding and other interaction processes. Drastically different from the widely employed but extremely expensive normal mode method for calculating entropy change in protein-ligand binding, the new method calculates the entropic component (interaction entropy or -TΔS) of the binding free energy directly from molecular dynamics simulation without any extra computational cost. Extensive study of over a dozen randomly selected protein-ligand binding systems demonstrated that this interaction entropy method is both computationally efficient and numerically reliable and is vastly superior to the standard normal mode approach. This interaction entropy paradigm introduces a novel and intuitive conceptual understanding of the entropic effect in protein-ligand binding and other general interaction systems as well as a practical method for highly efficient calculation of this effect.
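
    As described, the entropic term comes directly from fluctuations of the interaction energy along the MD trajectory; a common statement of the interaction-entropy formula is -T*dS = kT * ln<exp(beta * dE_int)>, with dE_int the deviation of the interaction energy from its mean. The sketch below applies that formula to synthetic energy samples; it is a paraphrase of the method, not code from the paper.

      # Interaction entropy from interaction-energy samples (kcal/mol):
      # -T*dS = kT * ln(<exp(beta * dE)>), where dE = E - <E>.
      import numpy as np

      kB = 0.0019872041                 # Boltzmann constant, kcal/(mol K)
      T = 300.0                         # temperature (K)
      beta = 1.0 / (kB * T)

      rng = np.random.default_rng(2)
      e_int = rng.normal(-45.0, 1.5, size=5000)   # synthetic MD snapshots

      dE = e_int - e_int.mean()
      minus_TdS = kB * T * np.log(np.mean(np.exp(beta * dE)))
      print(f"-T*dS = {minus_TdS:.2f} kcal/mol")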

  4. Evaluating Brain-Computer Interface Performance in an ALS Population: Checkerboard and Color Paradigms.

    PubMed

    Ryan, David B; Colwell, Kenneth A; Throckmorton, Chandra S; Collins, Leslie M; Caves, Kevin; Sellers, Eric W

    2018-03-01

    The objective of this study was to investigate the performance of 3 brain-computer interface (BCI) paradigms in an amyotrophic lateral sclerosis (ALS) population (n = 11). Using a repeated-measures design, participants completed 3 BCI conditions: row/column (RCW), checkerboard (CBW), and gray-to-color (CBC). Based on previous studies, it was hypothesized that the CBC and CBW conditions would result in higher accuracy, information transfer rate, waveform amplitude, and user preference over the RCW condition, and that an offline dynamic stopping simulation would increase information transfer rate. Higher mean accuracy was observed in the CBC condition (89.7%), followed by the CBW condition (84.3%), and lowest in the RCW condition (78.7%); however, these differences did not reach statistical significance (P = .062). Eight of the eleven participants preferred the CBC condition and the remaining three preferred the CBW condition. The offline dynamic stopping simulation significantly increased information transfer rate (P = .005) and decreased accuracy (P < .001). The findings of this study suggest that color stimuli provide a modest improvement in performance and that participants prefer color stimuli over monochromatic stimuli. Given these findings, BCI paradigms that use color stimuli should be considered for individuals who have ALS.

  5. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.

  6. Biogeochemical processes on tree islands in the greater everglades: Initiating a new paradigm

    USGS Publications Warehouse

    Wetzel, P.R.; Sklar, Fred H.; Coronado, C.A.; Troxler, T.G.; Krupa, S.L.; Sullivan, P.L.; Ewe, S.; Price, R.M.; Newman, S.; Orem, W.H.

    2011-01-01

    Scientists' understanding of the role of tree islands in the Everglades has evolved from a plant community of minor biogeochemical importance to a plant community recognized as the driving force for localized phosphorus accumulation within the landscape. Results from this review suggest that tree transpiration, nutrient infiltration from the soil surface, and groundwater flow create a soil zone of confluence where nutrients and salts accumulate under the head of a tree island during dry periods. Results also suggest accumulated salts and nutrients are flushed downstream by regional water flows during wet periods. That trees modulate their environment to create biogeochemical hot spots and strong nutrient gradients is a significant ecological paradigm shift in the understanding of the biogeochemical processes in the Everglades. In terms of island sustainability, this new paradigm suggests the need for distinct dry-wet cycles as well as a hydrologic regime that supports tree survival. Restoration of historic tree islands needs further investigation but the creation of functional tree islands is promising. Copyright © 2011 Taylor & Francis Group, LLC.

  7. Contextual fear conditioning in zebrafish.

    PubMed

    Kenney, Justin W; Scott, Ian C; Josselyn, Sheena A; Frankland, Paul W

    2017-10-01

    Zebrafish are a genetically tractable vertebrate that hold considerable promise for elucidating the molecular basis of behavior. Although numerous recent advances have been made in the ability to precisely manipulate the zebrafish genome, much less is known about many aspects of learning and memory in adult fish. Here, we describe the development of a contextual fear conditioning paradigm using an electric shock as the aversive stimulus. We find that contextual fear conditioning is modulated by shock intensity, prevented by an established amnestic agent (MK-801), lasts at least 14 d, and exhibits extinction. Furthermore, fish of various background strains (AB, Tu, and TL) are able to acquire fear conditioning, but differ in fear extinction rates. Taken together, we find that contextual fear conditioning in zebrafish shares many similarities with the widely used contextual fear conditioning paradigm in rodents. Combined with the amenability of genetic manipulation in zebrafish, we anticipate that our paradigm will prove to be a useful complementary system in which to examine the molecular basis of vertebrate learning and memory. © 2017 Kenney et al.; Published by Cold Spring Harbor Laboratory Press.

  8. Evolving binary classifiers through parallel computation of multiple fitness cases.

    PubMed

    Cagnoni, Stefano; Bergenti, Federico; Mordonini, Monica; Adorni, Giovanni

    2005-06-01

    This paper describes two versions of a novel approach to developing binary classifiers, based on two evolutionary computation paradigms: cellular programming and genetic programming. Such an approach achieves high computational efficiency both during evolution and at runtime. Evolution speed is optimized by allowing multiple solutions to be computed in parallel. Runtime performance is optimized explicitly through parallel computation in the case of cellular programming, or implicitly by exploiting the intrinsic parallelism of bitwise operators on standard sequential architectures in the case of genetic programming. The approach was tested on a digit recognition problem and compared with a reference classifier.
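
    The "intrinsic parallelism of bitwise operators" can be made concrete: packing one fitness case per bit lets a single word-wide operation evaluate a Boolean classifier on dozens of cases at once. In the sketch below, a hand-written candidate expression stands in for an evolved program, and the data are random.

      # Sub-machine-code evaluation: bit i of each integer is fitness case i,
      # so one pass of bitwise operators scores 64 cases in parallel.
      import random

      random.seed(4)
      n_cases = 64
      mask = (1 << n_cases) - 1

      f0 = random.getrandbits(n_cases)          # feature 0, one bit per case
      f1 = random.getrandbits(n_cases)          # feature 1
      target = ((f0 & f1) ^ (f0 >> 1)) & mask   # hypothetical ground truth

      candidate = (f0 & f1) & mask              # an evolved program would go here
      hits = ~(candidate ^ target) & mask       # 1-bits where prediction is right
      print(f"fitness: {bin(hits).count('1')}/{n_cases} cases correct")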

  9. Reservoir computing with a slowly modulated mask signal for preprocessing using a mutually coupled optoelectronic system

    NASA Astrophysics Data System (ADS)

    Tezuka, Miwa; Kanno, Kazutaka; Bunsen, Masatoshi

    2016-08-01

    Reservoir computing is a machine-learning paradigm based on information processing in the human brain. We numerically demonstrate reservoir computing with a slowly modulated mask signal for preprocessing by using a mutually coupled optoelectronic system. The performance of our system is quantitatively evaluated by a chaotic time series prediction task. Our system can produce performance comparable to that of reservoir computing with a single feedback system and a fast modulated mask signal. We show that it is possible to slow down the modulation speed of the mask signal by using the mutually coupled system in reservoir computing.
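
    The optoelectronic hardware cannot be condensed into a few lines, but the reservoir-computing recipe it implements can: drive a fixed random dynamical system with masked input and train only a linear readout. The echo-state sketch below does this for one-step-ahead prediction of a chaotic logistic-map series; all sizes and hyperparameters are assumptions, not the paper's.

      # Minimal echo-state reservoir with a random input mask and a ridge
      # readout, predicting a chaotic logistic-map series one step ahead.
      import numpy as np

      rng = np.random.default_rng(3)
      N, T = 100, 2000                           # reservoir size, series length

      u = np.empty(T); u[0] = 0.4                # chaotic series: x <- 3.9 x (1-x)
      for t in range(T - 1):
          u[t + 1] = 3.9 * u[t] * (1 - u[t])

      mask = rng.uniform(-1, 1, size=N)          # random input mask
      W = rng.normal(size=(N, N))
      W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

      X = np.zeros((T, N)); x = np.zeros(N)
      for t in range(T):                         # collect reservoir states
          x = np.tanh(W @ x + mask * u[t])
          X[t] = x

      A, y = X[100:-1], u[101:]                  # discard 100-step warm-up
      Wout = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ y)
      nrmse = np.sqrt(np.mean((A @ Wout - y) ** 2)) / y.std()
      print(f"one-step prediction NRMSE: {nrmse:.3f}")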

  10. Fortran for the nineties

    NASA Technical Reports Server (NTRS)

    Himer, J. T.

    1992-01-01

    Fortran has largely enjoyed prominence for the past few decades as the computer programming language of choice for numerically intensive scientific, engineering, and process control applications. Fortran's well understood static language syntax has allowed parsers and compiler optimizing technologies to generate some of the most efficient and fastest run-time executables, particularly on high-end scalar and vector supercomputers. Computing architectures and paradigms have changed considerably since the last ANSI/ISO Fortran release in 1978, and while FORTRAN 77 has more than survived, its aged features provide only partial functionality for today's demanding computing environments. The simple block procedural languages have been necessarily evolving, or giving way, to specialized supercomputing, network resource, and object-oriented paradigms. To address these new computing demands, ANSI has worked for the last 12 years, with three international public reviews, to deliver Fortran 90. Fortran 90 has superseded and replaced ISO FORTRAN 77 internationally as the sole Fortran standard; in the US, Fortran 90 is expected to be adopted as the ANSI standard this summer, coexisting with ANSI FORTRAN 77 until at least 1996. The development path and current state of Fortran will be briefly described, highlighting the many new Fortran 90 syntactic and semantic additions which support (among others): free form source; array syntax; new control structures; modules and interfaces; pointers; derived data types; dynamic memory; enhanced I/O; operator overloading; data abstraction; user optional arguments; new intrinsics for array, bit manipulation, and system inquiry; and enhanced portability through better generic control of underlying system arithmetic models. Examples from dynamical astronomy and signal and image processing will attempt to illustrate Fortran 90's applicability to today's general scalar, vector, and parallel scientific and engineering requirements and object-oriented programming paradigms. Time permitting, current work on the future development of Fortran 2000 and collateral standards will be introduced.

  11. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    DTIC Science & Technology

    2015-07-14

    computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a...security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX...platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Private Network

  12. Advances in Parallel Computing and Databases for Digital Pathology in Cancer Research

    DTIC Science & Technology

    2016-11-13

    these technologies and how we have used them in the past. We are interested in learning more about the needs of clinical pathologists as we continue to...such as image processing and correlation. Further, High Performance Computing (HPC) paradigms such as the Message Passing Interface (MPI) have been...Defense for Research and Engineering. such as pMatlab [4], or bcMPI [5] can significantly reduce the need for deep knowledge of parallel computing. In

  13. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of student projects are reviewed, together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy are also explained. The review closes with future teaching and research directions.

  14. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  15. Sign use and cognition in automated scientific discovery: are computers only special kinds of signs?

    NASA Astrophysics Data System (ADS)

    Giza, Piotr

    2018-04-01

    James Fetzer criticizes the computational paradigm prevailing in cognitive science by questioning what he takes to be its most elementary ingredient: that cognition is computation across representations. He argues that if cognition is taken to be a purposive, meaningful, algorithmic problem-solving activity, then computers are incapable of cognition. Instead, they appear to be signs of a special kind that can facilitate computation. He proposes the conception of minds as semiotic systems as an alternative paradigm for understanding mental phenomena, one that seems to overcome the difficulties of computationalism. I argue that with computer systems dealing with scientific discovery, the matter is not so simple. The alleged superiority of humans, who use signs to stand for something other than themselves, over computers that are merely "physical symbol systems" or "automatic formal systems" is easy to establish in everyday life, but becomes far from obvious when scientific discovery is at stake. In science, as opposed to everyday life, the meaning of symbols is, apart from very low-level experimental investigations, defined implicitly by the way the symbols are used in explanatory theories or experimental laws relevant to the field; in consequence, human and machine discoverers are much more on a par. Moreover, the great practical success of the genetic programming method, and recent attempts to apply it to the automatic generation of cognitive theories, seem to show that computer systems are capable of very efficient problem-solving activity in science that is neither purposive nor meaningful, nor algorithmic. This, I think, undermines Fetzer's argument that computer systems are incapable of cognition because computation across representations is bound to be a purposive, meaningful, algorithmic problem-solving activity.

  16. Execution environment for intelligent real-time control systems

    NASA Technical Reports Server (NTRS)

    Sztipanovits, Janos

    1987-01-01

    Modern telerobot control technology requires the integration of symbolic and non-symbolic programming techniques, different models of parallel computations, and various programming paradigms. The Multigraph Architecture, which has been developed for the implementation of intelligent real-time control systems is described. The layered architecture includes specific computational models, integrated execution environment and various high-level tools. A special feature of the architecture is the tight coupling between the symbolic and non-symbolic computations. It supports not only a data interface, but also the integration of the control structures in a parallel computing environment.

  17. Learning, epigenetics, and computation: An extension on Fitch's proposal. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    NASA Astrophysics Data System (ADS)

    Okanoya, Kazuo

    2014-09-01

    The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints useful in narrowing down the problem space for both cognition and neural mechanisms. I provide two examples from our own studies that reinforce and extend Fitch's proposal.

  18. AGIS: The ATLAS Grid Information System

    NASA Astrophysics Data System (ADS)

    Anisenkov, A.; Di Girolamo, A.; Klimentov, A.; Oleynik, D.; Petrosyan, A.; Atlas Collaboration

    2014-06-01

    ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm, with a high degree of decentralization and computing resources able to meet the ATLAS requirements for petabyte-scale data operations. In this paper we describe the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.

  19. Information Technology as the Paradigm High-Speed Management Support Tool: The Uses of Computer Mediated Communication, Virtual Realism, and Telepresence.

    ERIC Educational Resources Information Center

    Newby, Gregory B.

    Information technologies such as computer mediated communication (CMC), virtual reality, and telepresence can provide the communication flow required by high-speed management techniques that high-technology industries have adopted in response to changes in the climate of competition. Intra-corporate CMC might be used for a variety of purposes…

  20. Evaluating Students' Programming Skill Behaviour and Personalizing Their Computer Learning Environment Using "The Hour of Code" Paradigm

    ERIC Educational Resources Information Center

    Mallios, Nikolaos; Vassilakopoulos, Michael Gr.

    2015-01-01

    One of the most intriguing objectives when teaching computer science in mid-adolescence high school students is attracting and mainly maintaining their concentration within the limits of the class. A number of theories have been proposed and numerous methodologies have been applied, aiming to assist in the implementation of a personalized learning…

  1. CLIPS on the NeXT computer

    NASA Technical Reports Server (NTRS)

    Charnock, Elizabeth; Eng, Norman

    1990-01-01

    This paper discusses the integration of CLIPS into a hybrid expert-system/neural-network AI tool for the NeXT computer. The main discussion is devoted to the joining of these two AI paradigms in a mutually beneficial relationship. We conclude that expert systems and neural networks should not be considered as competing AI implementation methods, but rather as complementary components of a whole.

  2. Balancing Expression and Structure in Game Design: Developing Computational Participation Using Studio-Based Design Pedagogy

    ERIC Educational Resources Information Center

    DeVane, Ben; Steward, Cody; Tran, Kelly M.

    2016-01-01

    This article reports on a project that used a game-creation tool to introduce middle-school students ages 10 to 13 to problem-solving strategies similar to those in computer science through the lens of studio-based design arts. Drawing on historic paradigms in design pedagogy and contemporary educational approaches in the digital arts to teach…

  3. Consistent data-driven computational mechanics

    NASA Astrophysics Data System (ADS)

    González, D.; Chinesta, F.; Cueto, E.

    2018-05-01

    We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.

  4. The Computer as Coach: An Athletic Paradigm for Intellectual Education. AI Memo 389.

    ERIC Educational Resources Information Center

    Goldstein, Ira

    This paper is a preliminary proposal to develop the theory and design for "coaches" for computer games, to implement prototypes, and to experiment with their ability to convey important intellectual skills. The focus of this project will be restricted to developing a coach for a single example of an intellectual game called Wumpus. It is…

  5. Training to use a commercial brain-computer interface as access technology: a case study.

    PubMed

    Taherian, Sarvnaz; Selitskiy, Dmitry; Pau, James; Davies, T Claire; Owens, R Glynn

    2016-01-01

    This case study describes how an individual with spastic quadriplegic cerebral palsy was trained over a period of four weeks to use a commercial electroencephalography (EEG)-based brain-computer interface (BCI). The participant spent three sessions exploring the system and seven sessions playing a game focused on EEG feedback training of left- and right-arm motor imagery; a customised training game paradigm was employed. The participant showed improvement in the production of two distinct EEG patterns. The participant's performance was influenced by motivation, fatigue and concentration. Six weeks post-training the participant could still control the BCI and used it to type a sentence using an augmentative and alternative communication application on a wirelessly linked device. The results from this case study highlight the importance of creating a dynamic, relevant and engaging training environment for BCIs. Implications for Rehabilitation: Customising a training paradigm to suit the user's interests can influence adherence to assistive technology training. Mood, fatigue, physical illness and motivation influence the usability of a brain-computer interface. Commercial brain-computer interfaces, which require little set-up time, may be used as access technology for individuals with severe disabilities.

  6. Low-Speed Investigation of Upper-Surface Leading-Edge Blowing on a High-Speed Civil Transport Configuration

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.; Laflin, Brenda E. Gile; Kemmerly, Guy T.; Campbell, Bryan A.

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to support engineering design effectively. It is argued that, when examined in terms of these attributes, the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by the interaction of a large number of very simple models may be an inspiration for such algorithms; cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  7. The Road to the Global Village.

    ERIC Educational Resources Information Center

    Wright, Karen

    1990-01-01

    Discussed is the growth and development of electronic communications and computer technology. Modems, the "Whole Person Paradigm," artificial intelligence, multimedia, virtual reality, and communications technologies are described. Implications for world economies are suggested. (CW)

  8. Stochastic computing with biomolecular automata

    PubMed Central

    Adar, Rivka; Benenson, Yaakov; Linshiz, Gregory; Rosner, Amit; Tishby, Naftali; Shapiro, Ehud

    2004-01-01

    Stochastic computing has a broad range of applications, yet electronic computers realize its basic step, stochastic choice between alternative computation paths, in a cumbersome way. Biomolecular computers use a different computational paradigm and hence afford novel designs. We constructed a stochastic molecular automaton in which stochastic choice is realized by means of competition between alternative biochemical pathways, and choice probabilities are programmed by the relative molar concentrations of the software molecules coding for the alternatives. Programmable and autonomous stochastic molecular automata have been shown to perform direct analysis of disease-related molecular indicators in vitro and may have the potential to provide in situ medical diagnosis and cure. PMID:15215499
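
    The programming mechanism described here, choice probabilities set by the relative molar concentrations of competing software molecules, can be mimicked in a toy simulation. The sketch below is our illustration of the principle, not the biochemical implementation.

        # Hedged sketch: stochastic choice weighted by "molar concentrations".
        import random

        def stochastic_step(concentrations):
            """Pick the next computation path with probability proportional to
            the concentration of the molecules encoding each alternative."""
            total = sum(concentrations.values())
            r = random.uniform(0, total)
            for path, conc in concentrations.items():
                r -= conc
                if r <= 0:
                    return path
            return path  # floating-point edge case

        # Two alternative transitions, programmed 3:1 by relative concentration.
        counts = {"accept": 0, "reject": 0}
        for _ in range(10_000):
            counts[stochastic_step({"accept": 3.0, "reject": 1.0})] += 1
        print(counts)  # roughly 7500 accept / 2500 reject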

  9. A Prototype SSVEP Based Real Time BCI Gaming System

    PubMed Central

    Martišius, Ignas

    2016-01-01

    Although brain-computer interface technology is mainly designed with disabled people in mind, it can also be beneficial to healthy subjects, for example, in gaming or virtual reality systems. In this paper we discuss the typical architecture, paradigms, requirements, and limitations of electroencephalogram-based gaming systems. We have developed a prototype three-class brain-computer interface system, based on the steady state visually evoked potentials paradigm and the Emotiv EPOC headset. An online target shooting game, implemented in the OpenViBE environment, has been used for user feedback. The system utilizes wave atom transform for feature extraction, achieving an average accuracy of 78.2% using linear discriminant analysis classifier, 79.3% using support vector machine classifier with a linear kernel, and 80.5% using a support vector machine classifier with a radial basis function kernel. PMID:27051414
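
    The classification stage can be outlined with scikit-learn. The sketch below substitutes random placeholder vectors for the paper's wave atom transform features, so it only illustrates the three-classifier comparison pattern, not the reported accuracies.

        # Hedged sketch of the LDA / linear-SVM / RBF-SVM comparison.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(42)
        X = rng.standard_normal((300, 32))   # placeholder feature vectors
        y = rng.integers(0, 3, size=300)     # three SSVEP target classes

        for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                          ("linear SVM", SVC(kernel="linear")),
                          ("RBF SVM", SVC(kernel="rbf"))]:
            acc = cross_val_score(clf, X, y, cv=5).mean()
            print(f"{name}: {acc:.3f}")      # near chance on random features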

  10. Adaptive regularization network based neural modeling paradigm for nonlinear adaptive estimation of cerebral evoked potentials.

    PubMed

    Zhang, Jian-Hua; Böhme, Johann F

    2007-11-01

    In this paper we report an adaptive regularization network (ARN) approach to realizing fast blind separation of cerebral evoked potentials (EPs) from background electroencephalogram (EEG) activity with no need to make any explicit assumption on the statistical (or deterministic) signal model. The ARNs are proposed to construct nonlinear EEG and EP signal models. A novel adaptive regularization training (ART) algorithm is proposed to improve the generalization performance of the ARN. Two adaptive neural modeling methods based on the ARN are developed and their implementation and performance analysis are also presented. The computer experiments using simulated and measured visual evoked potential (VEP) data have shown that the proposed ARN modeling paradigm yields computationally efficient and more accurate VEP signal estimation owing to its intrinsic model-free and nonlinear processing characteristics.

  11. Paradigms of perception in clinical practice.

    PubMed

    Jacobson, Francine L; Berlanstein, Bruce P; Andriole, Katherine P

    2006-06-01

    Display strategies for medical images in radiology have evolved in tandem with the technology by which images are made. The close of the 20th century, nearly coincident with the 100th anniversary of the discovery of x-rays, brought radiologists to a new crossroad in the evolution of image display. The increasing availability, speed, and flexibility of computer technology can now revolutionize how images are viewed and interpreted. Radiologists are not yet in agreement regarding the next paradigm for image display. The possibilities are being explored systematically through the Society for Computer Applications in Radiology's Transforming the Radiological Interpretation Process initiative. The varied input of radiologists who work in a large variety of settings will enable new display strategies to best serve radiologists in the detection and quantification of disease. Considerations and possibilities for the future are presented in this paper.

  12. A Prototype SSVEP Based Real Time BCI Gaming System.

    PubMed

    Martišius, Ignas; Damaševičius, Robertas

    2016-01-01

    Although brain-computer interface technology is mainly designed with disabled people in mind, it can also be beneficial to healthy subjects, for example, in gaming or virtual reality systems. In this paper we discuss the typical architecture, paradigms, requirements, and limitations of electroencephalogram-based gaming systems. We have developed a prototype three-class brain-computer interface system, based on the steady state visually evoked potentials paradigm and the Emotiv EPOC headset. An online target shooting game, implemented in the OpenViBE environment, has been used for user feedback. The system utilizes wave atom transform for feature extraction, achieving an average accuracy of 78.2% using linear discriminant analysis classifier, 79.3% using support vector machine classifier with a linear kernel, and 80.5% using a support vector machine classifier with a radial basis function kernel.

  13. Brain communication in the locked-in state.

    PubMed

    De Massari, Daniele; Ruf, Carolin A; Furdea, Adrian; Matuz, Tamara; van der Heiden, Linda; Halder, Sebastian; Silvoni, Stefano; Birbaumer, Niels

    2013-06-01

    Patients in the completely locked-in state have no means of communication, and they have been the target population for brain-computer interface research over the last 15 years. Although different paradigms have been tested and different physiological signals used, to date no sufficiently documented completely locked-in state patient has been able to control a brain-computer interface over an extended time period. We introduce Pavlovian semantic conditioning to enable basic communication in the completely locked-in state. This novel paradigm is based on semantic conditioning for online classification of neuroelectric or any other physiological signals to discriminate between covert (cognitive) 'yes' and 'no' responses. The paradigm comprised the presentation of affirmative and negative statements used as conditioned stimuli, while the unconditioned stimulus consisted of electrical stimulation of the skin paired with affirmative statements. Three patients with advanced amyotrophic lateral sclerosis participated over an extended time period, one of whom was in a completely locked-in state and the other two in the locked-in state. The patients' level of vigilance was assessed through auditory oddball procedures to study the correlation between vigilance level and the classifier's performance. The average online classification accuracies of slow cortical components of electroencephalographic signals were around chance level for all the patients. The use of a non-linear classifier in the offline classification procedure resulted in a substantial improvement of the accuracy in one locked-in state patient, achieving 70% correct classification. A reliable level of performance in the completely locked-in state patient was not achieved uniformly throughout the 37 sessions despite intact cognitive processing capacity, but in some sessions communication accuracies up to 70% were achieved. Paradigm modifications are proposed. A rapid drop of vigilance was detected, suggesting attentional variations or variations of the circadian period as important factors in brain-computer interface communication with locked-in state and completely locked-in state patients.

  14. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Feng; Zhang, Xin; Xie, Jun

    2015-03-10

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain-computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing the subject's fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the high-frequency (beyond 25 Hz) response of SSVEP, with the paradigm presented on an LED. The SNR (signal-to-noise ratio) of the high-frequency (beyond 40 Hz) response is very low and cannot be distinguished by traditional analysis methods. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for three frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces n^n targets from n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable-frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract the time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and fixed sifting (iterating 10 times) are used to overcome the shortcomings of the end effect and the stopping criterion; generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the proposed SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, so as to improve the ITR and the stability of the BCI system. Moreover, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally fatigue the subject and prevent safety hazards linked to photo-induced epileptic seizures, ensuring system efficiency and safety. This study tests three subjects in order to verify the feasibility of the proposed method.
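
    The combinatorics of the coding scheme are easy to verify: with the study's three stimulation frequencies and code words of length three, frequency combination coding yields 3^3 = 27 candidate targets, as the small sketch below (our illustration) enumerates.

        # Enumerating frequency-combination codes for n = 3 stimulation frequencies.
        from itertools import product

        freqs = [25.0, 33.33, 40.0]                    # Hz, as in the study
        codes = list(product(freqs, repeat=len(freqs)))
        print(len(codes))                              # 3**3 = 27 targets
        print(codes[0])                                # (25.0, 25.0, 25.0)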

  15. Information technologies for astrophysics circa 2001

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    It is easy to extrapolate current trends to see where technologies relating to information systems in astrophysics and other disciplines will be by the end of the decade. These technologies include miniaturization, multiprocessing, software technology, networking, databases, graphics, pattern computation, and interdisciplinary studies. It is less easy to see what limits our current paradigms place on our thinking about technologies that will allow us to understand the laws governing very large systems about which we have large data sets. Three limiting paradigms are as follows: saving all the bits collected by instruments or generated by supercomputers; obtaining technology for information compression, storage, and retrieval off the shelf; and the linear model of innovation. We must extend these paradigms to meet our goals for information technology at the end of the decade.

  16. Promising Directions in School-Based Systems Level Consultation: A Commentary on "Has Consultation Achieved Its Primary Prevention Potential?," an Article by Joseph E. Zins

    ERIC Educational Resources Information Center

    Hojnoski, Robin L.

    2007-01-01

    From his review of research on consultation, Zins (1995) concluded (a) that a paradigm shift to a more prevention-oriented approach was necessary; (b) that systems-level consultation held significant potential; and (c) that systems-level consultation had the most data to support its use in preventive efforts. This commentary reviews promising…

  17. The problems and prospects of the public-private partnership in the Russian fuel and energy sector

    NASA Astrophysics Data System (ADS)

    Nikitenko, SM; Goosen, EV

    2017-02-01

    This article highlights some opportunities for shifting the paradigm for the development of natural resources in the Russian fuel and energy sector using public-private partnership instruments. It shows three main directions for developing public-private partnerships in the area of subsoil use and emphasizes the role of innovations in implementing the most promising projects in the fuel and energy sector of Russia.

  18. Biology and data-intensive scientific discovery in the beginning of the 21st century.

    PubMed

    Smith, Arnold; Balazinska, Magdalena; Baru, Chaitan; Gomelsky, Mark; McLennan, Michael; Rose, Lynn; Smith, Burton; Stewart, Elizabeth; Kolker, Eugene

    2011-04-01

    The life sciences are poised at the beginning of a paradigm-changing evolution in the way scientific questions are answered. Data-Intensive Science (DIS) promises to provide new ways of approaching scientific challenges and answering questions. This article is a summary of the life sciences issues and challenges discussed at the DIS workshop in Seattle, September 19-20, 2010. © Mary Ann Liebert, Inc.

  19. Proceedings of the Annual Meeting of the Association for Education in Journalism and Mass Communication (86th, Kansas City, Missouri, July 30-August 2, 2003). Science Communication Interest Group.

    ERIC Educational Resources Information Center

    2003

    The Science Communication Interest Group of the proceedings contains the following 7 papers: "Risk Perceptions and Food Safety: A Test of the Psychometric Paradigm" (Joye C. Gordon); "An Entertainment-Education Video as a Tool to Influence Mammography Compliance Behavior in Latinas" (Gail D. Love); "Promise or Peril: How…

  20. Supervised dictionary learning for inferring concurrent brain networks.

    PubMed

    Zhao, Shijie; Han, Junwei; Lv, Jinglei; Jiang, Xi; Hu, Xintao; Zhao, Yu; Ge, Bao; Guo, Lei; Liu, Tianming

    2015-10-01

    Task-based fMRI (tfMRI) has been widely used to explore functional brain networks via predefined stimulus paradigms in the fMRI scan. Traditionally, the general linear model (GLM) has been the dominant approach to detecting task-evoked networks. However, GLM focuses on task-evoked or event-evoked brain responses and possibly ignores intrinsic brain functions. In comparison, dictionary learning and sparse coding methods have attracted much attention recently, and these methods have shown promise in automatically and systematically decomposing fMRI signals into meaningful task-evoked and intrinsic concurrent networks. Nevertheless, two notable limitations of current data-driven dictionary learning methods are that the prior knowledge of the task paradigm is not sufficiently utilized and that establishing correspondences among dictionary atoms in different brains has been challenging. In this paper, we propose a novel supervised dictionary learning and sparse coding method for inferring functional networks from tfMRI data, which combines the advantages of model-driven and data-driven methods. The basic idea is to fix the task stimulus curves as predefined model-driven dictionary atoms and to optimize only the remaining data-driven dictionary atoms. Application of this novel methodology to the publicly available Human Connectome Project (HCP) tfMRI datasets has achieved promising results.
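
    The fixed-atom idea translates directly into code. The sketch below is a simplified illustration under stated assumptions (ridge regression stands in for the paper's sparse coding, and a plain gradient step updates the dictionary): the task regressors occupy a frozen block of the dictionary while only the data-driven block is learned.

        # Hedged sketch of dictionary learning with a fixed model-driven block.
        import numpy as np

        rng = np.random.default_rng(0)
        T, V, K_FIXED, K_FREE = 200, 500, 3, 17      # timepoints, voxels, atom counts

        D_fixed = rng.standard_normal((T, K_FIXED))  # stand-in task stimulus curves
        D_free = rng.standard_normal((T, K_FREE))
        X = rng.standard_normal((T, V))              # stand-in tfMRI signal matrix

        lam, lr = 0.1, 1e-3
        for _ in range(50):
            D = np.hstack([D_fixed, D_free])
            # Coding step (ridge for brevity; the paper uses sparse coding).
            A = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ X)
            # Dictionary step: only the data-driven atoms move.
            R = X - D @ A
            D_free += lr * (R @ A[K_FIXED:].T)
            D_free /= np.linalg.norm(D_free, axis=0, keepdims=True)

        print(np.linalg.norm(R))  # reconstruction error of the last iteration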

  1. The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking

    NASA Astrophysics Data System (ADS)

    Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng

    Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, competition and cooperation are three fundamental processes in nature. Computer scientists are familiar with the study of competition, or the 'struggle for life', through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which significant advances in behavioral biology have been achieved in the last three decades, is on the verge of spawning important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The Handicap principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. (iii) To pose several open questions, the answers to which may bear refreshing insights for trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is evolutionary game theory modeling, or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).

  2. The Experiment Factory: Standardizing Behavioral Experiments.

    PubMed

    Sochat, Vanessa V; Eisenberg, Ian W; Enkavi, A Zeynep; Li, Jamie; Bissett, Patrick G; Poldrack, Russell A

    2016-01-01

    The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms.

  3. The Experiment Factory: Standardizing Behavioral Experiments

    PubMed Central

    Sochat, Vanessa V.; Eisenberg, Ian W.; Enkavi, A. Zeynep; Li, Jamie; Bissett, Patrick G.; Poldrack, Russell A.

    2016-01-01

    The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms. PMID:27199843

  4. Fractional Flow Reserve and Coronary Computed Tomographic Angiography: A Review and Critical Analysis.

    PubMed

    Hecht, Harvey S; Narula, Jagat; Fearon, William F

    2016-07-08

    Invasive fractional flow reserve (FFR) is now the gold standard for intervention. Noninvasive functional imaging analyses derived from coronary computed tomographic angiography (CTA) offer alternatives for evaluating lesion-specific ischemia. CT-FFR, CT myocardial perfusion imaging, and transluminal attenuation gradient/corrected contrast opacification have been studied using invasive FFR as the gold standard. CT-FFR has demonstrated significant improvement in specificity and positive predictive value compared with CTA alone for predicting FFR of ≤0.80, as well as decreasing the frequency of nonobstructive invasive coronary angiography. High-risk plaque characteristics have also been strongly implicated in abnormal FFR. Myocardial computed tomographic perfusion is an alternative method with promising results, although it involves more radiation and contrast. Transluminal attenuation gradient/corrected contrast opacification is more controversial and may be more related to vessel diameter than to stenosis. Important questions remain: (1) can CTA quality be improved to decrease unevaluable studies? (2) is the diagnostic accuracy of CT-FFR sufficient? (3) can CT-FFR guide intervention without invasive FFR confirmation? (4) what are the long-term outcomes of CT-FFR-guided treatment and how do they compare with other functional imaging-guided paradigms? (5) what degree of stenosis on CTA warrants CT-FFR? (6) how should high-risk plaque be incorporated into treatment decisions? (7) how will CT-FFR influence other functional imaging test utilization, and what will be the effect on the practice of cardiology? (8) will a workstation-based CT-FFR be mandatory? Rapid progress to date suggests that CTA-based lesion-specific ischemia will be the gatekeeper to the cardiac catheterization laboratory and will transform the world of intervention. © 2016 American Heart Association, Inc.

  5. Data Network Weather Service Reporting - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Frey

    2012-08-30

    This final report describes a three-year effort to develop a new forecasting paradigm for computer network performance, carried out in coordination with Fermilab's construction of the e-Weather Center.

  6. Computational Constraints in Cognitive Theories of Forgetting

    PubMed Central

    Ecker, Ullrich K. H.; Lewandowsky, Stephan

    2012-01-01

    This article highlights some of the benefits of computational modeling for theorizing in cognition. We demonstrate how computational models have been used recently to argue that (1) forgetting in short-term memory is based on interference not decay, (2) forgetting in list-learning paradigms is more parsimoniously explained by a temporal distinctiveness account than by various forms of consolidation, and (3) intrusion asymmetries that appear when information is learned in different contexts can be explained by temporal context reinstatement rather than labilization and reconsolidation processes. PMID:23091467

  7. Computational constraints in cognitive theories of forgetting.

    PubMed

    Ecker, Ullrich K H; Lewandowsky, Stephan

    2012-01-01

    This article highlights some of the benefits of computational modeling for theorizing in cognition. We demonstrate how computational models have been used recently to argue that (1) forgetting in short-term memory is based on interference not decay, (2) forgetting in list-learning paradigms is more parsimoniously explained by a temporal distinctiveness account than by various forms of consolidation, and (3) intrusion asymmetries that appear when information is learned in different contexts can be explained by temporal context reinstatement rather than labilization and reconsolidation processes.

  8. Generic Divide and Conquer Internet-Based Computing

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J. (Technical Monitor); Radenski, Atanas

    2003-01-01

    The growth of Internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of peer-to-peer (P2P) software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this project is to achieve a better understanding of the transition to Internet-based high-performance computing and to develop solutions for some of the technical challenges of this transition. In particular, we are interested in creating long-term motivation for end users to provide their idle processor time to support computationally intensive tasks. We believe that a practical P2P architecture should provide useful service to both clients with high-performance computing needs and contributors of lower-end computing resources. To achieve this, we are designing a dual-service architecture for P2P high-performance divide-and-conquer computing; we are also experimenting with a prototype implementation. Our proposed architecture incorporates a master server, utilizes dual satellite servers, and operates on the Internet in a dynamically changing, large configuration of lower-end nodes provided by volunteer contributors. A dual satellite server comprises a high-performance computing engine and a lower-end contributor service engine. The computing engine provides generic support for divide-and-conquer computations. The service engine is intended to provide free, useful HTTP-based services to contributors of lower-end computing resources. Our proposed architecture is complementary to and accessible from computational grids, such as Globus, Legion, and Condor. Grids provide remote access to existing higher-end computing resources; in contrast, our goal is to utilize the idle processor time of lower-end Internet nodes. Our project is focused on a generic divide-and-conquer paradigm and on mobile applications of this paradigm that can operate on a loose and ever-changing pool of lower-end Internet nodes.
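
    The generic divide-and-conquer support described above suggests a skeleton like the one below. This is a hypothetical, purely local sketch of the contract (application-supplied split/solve/merge callbacks), not the project's actual P2P engine; in the intended setting each subproblem could be shipped to a volunteer node instead of recursing locally.

        # Hedged sketch of a generic divide-and-conquer skeleton.
        from typing import Callable, List, TypeVar

        P = TypeVar("P")  # problem type
        S = TypeVar("S")  # solution type

        def divide_and_conquer(problem: P,
                               is_base: Callable[[P], bool],
                               solve_base: Callable[[P], S],
                               split: Callable[[P], List[P]],
                               merge: Callable[[List[S]], S]) -> S:
            if is_base(problem):
                return solve_base(problem)
            parts = split(problem)
            return merge([divide_and_conquer(p, is_base, solve_base, split, merge)
                          for p in parts])

        # Usage example: summing a list by recursive halving.
        total = divide_and_conquer(
            list(range(100)),
            is_base=lambda xs: len(xs) <= 2,
            solve_base=sum,
            split=lambda xs: [xs[:len(xs) // 2], xs[len(xs) // 2:]],
            merge=sum)
        print(total)  # 4950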

  9. Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering

    PubMed Central

    Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan

    2016-01-01

    The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875

  10. Comparative study of internet cloud and cloudlet over wireless mesh networks for real-time applications

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2014-05-01

    Mobile cloud computing is gaining worldwide momentum for ubiquitous, on-demand cloud services offered to mobile users by providers such as Amazon and Google at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities, known as cloudlets, to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low-cost and high-performance cloud services for its users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet respectively. The study quantifies the influence of the two different cloud networking architectures on supporting real-time video streaming. We also enable movement of the users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and the Amazon cloud respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.

  11. Pioneering topological methods for network-based drug-target prediction by exploiting a brain-network self-organization theory.

    PubMed

    Durán, Claudio; Daminelli, Simone; Thomas, Josephine M; Haupt, V Joachim; Schroeder, Michael; Cannistraci, Carlo Vittorio

    2017-04-26

    The bipartite network representation of the drug-target interactions (DTIs) in a biosystem enhances understanding of the drugs' multifaceted action modes, suggests therapeutic switching for approved drugs and unveils possible side effects. As experimental testing of DTIs is costly and time-consuming, computational predictors are of great aid. Here, for the first time, state-of-the-art DTI supervised predictors custom-made in network biology were compared, using standard and innovative validation frameworks, with unsupervised, purely topological models designed for general-purpose link prediction in bipartite networks. Surprisingly, our results show that the bipartite topology alone, if adequately exploited by means of the recently proposed local-community-paradigm (LCP) theory (initially detected in brain-network topological self-organization and afterwards generalized to any complex network), is able to suggest highly reliable predictions, with performance comparable to that of the state-of-the-art supervised methods that exploit additional (non-topological, for instance biochemical) DTI knowledge. Furthermore, a detailed analysis of the novel predictions revealed that each class of methods prioritizes distinct true interactions; hence, combining methodologies based on diverse principles represents a promising strategy to improve drug-target discovery. To conclude, this study promotes the power of bio-inspired computing, demonstrating that simple unsupervised rules inspired by principles of topological self-organization and adaptiveness, arising during learning in living intelligent systems (like the brain), can equal the performance of complicated algorithms based on advanced, supervised and knowledge-based engineering. © The Author 2017. Published by Oxford University Press.

  12. Mixing Microworld and CAS Features in Building Computer Systems that Help Students Learn Algebra

    ERIC Educational Resources Information Center

    Nicaud, Jean-Francois; Bouhineau, Denis; Chaachoua, Hamid

    2004-01-01

    We present the design principles for a new kind of computer system that helps students learn algebra. The fundamental idea is to have a system based on the microworld paradigm that allows students to make their own calculations, as they do with paper and pencil, without being obliged to use commands, and to verify the correctness of these…

  13. Quantum simulations with noisy quantum computers

    NASA Astrophysics Data System (ADS)

    Gambetta, Jay

    Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies that a quantum computer can solve problems that cannot be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However, the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.

  14. Recent advances in exploring the neural underpinnings of auditory scene perception

    PubMed Central

    Snyder, Joel S.; Elhilali, Mounya

    2017-01-01

    Studies of auditory scene analysis have traditionally relied on paradigms using artificial sounds—and conventional behavioral techniques—to elucidate how we perceptually segregate auditory objects or streams from each other. In the past few decades, however, there has been growing interest in uncovering the neural underpinnings of auditory segregation using human and animal neuroscience techniques, as well as computational modeling. This largely reflects the growth in the fields of cognitive neuroscience and computational neuroscience and has led to new theories of how the auditory system segregates sounds in complex arrays. The current review focuses on neural and computational studies of auditory scene perception published in the past few years. Following the progress that has been made in these studies, we describe (1) theoretical advances in our understanding of the most well-studied aspects of auditory scene perception, namely segregation of sequential patterns of sounds and concurrently presented sounds; (2) the diversification of topics and paradigms that have been investigated; and (3) how new neuroscience techniques (including invasive neurophysiology in awake humans, genotyping, and brain stimulation) have been used in this field. PMID:28199022

  15. Spatial decoupling of targets and flashing stimuli for visual brain-computer interfaces

    NASA Astrophysics Data System (ADS)

    Waytowich, Nicholas R.; Krusienski, Dean J.

    2015-06-01

    Objective. Recently, paradigms using code-modulated visual evoked potentials (c-VEPs) have proven to achieve among the highest information transfer rates for noninvasive brain-computer interfaces (BCIs). One issue with current c-VEP paradigms, and visual-evoked paradigms in general, is that they require direct foveal fixation of the flashing stimuli. These interfaces are often visually unpleasant and can be irritating and fatiguing to the user, thus adversely impacting practical performance. In this study, a novel c-VEP BCI paradigm is presented that attempts to perform spatial decoupling of the targets and flashing stimuli using two distinct concepts: spatial separation and boundary positioning. Approach. For the paradigm, the flashing stimuli form a ring that encompasses the intended non-flashing targets, which are spatially separated from the stimuli. The user fixates on the desired target, which is classified using the changes to the EEG induced by the flashing stimuli located in the non-foveal visual field. Additionally, a subset of targets is also positioned at or near the stimulus boundaries, which decouples targets from direct association with a single stimulus. This allows a greater number of target locations for a fixed number of flashing stimuli. Main results. Results from 11 subjects showed practical classification accuracies for the non-foveal condition, with comparable performance to the direct-foveal condition for longer observation lengths. Online results from 5 subjects confirmed the offline results with an average accuracy across subjects of 95.6% for a 4-target condition. The offline analysis also indicated that targets positioned at or near the boundaries of two stimuli could be classified with the same accuracy as traditional superimposed (non-boundary) targets. Significance. The implications of this research are that c-VEPs can be detected and accurately classified to achieve comparable BCI performance without requiring potentially irritating direct foveation of flashing stimuli. Furthermore, this study shows that it is possible to increase the number of targets beyond the number of stimuli without degrading performance. Given the superior information transfer rate of c-VEP paradigms, these results can lead to the development of more practical and ergonomic BCIs.
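
    Although the abstract does not spell out the classifier, c-VEP decoding is commonly framed as template correlation: the recorded epoch is compared against a stored response template per target. The sketch below is a generic illustration of that framing with made-up dimensions, not the study's pipeline.

        # Hedged sketch of template-correlation decoding for a c-VEP BCI.
        import numpy as np

        def classify_cvep(epoch, templates):
            """Return the index of the target template most correlated with
            the recorded EEG epoch (all 1-D arrays of equal length)."""
            scores = [np.corrcoef(epoch, t)[0, 1] for t in templates]
            return int(np.argmax(scores))

        rng = np.random.default_rng(1)
        templates = [rng.standard_normal(600) for _ in range(4)]  # 4 targets
        epoch = templates[2] + 0.5 * rng.standard_normal(600)     # noisy target 3
        print(classify_cvep(epoch, templates))                    # 2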

  16. Managing water resources infrastructure in the face of different values

    NASA Astrophysics Data System (ADS)

    Mostert, Erik

    Water resources infrastructure (WRI) plays a key role in water management. It can serve or negatively affect some seven to ten different and sometimes conflicting values. WRI management is therefore not a purely technical issue. Economic analyses can help to some extent, but only for values related to current human use. Multi-criteria analysis can cover all values, but in the end WRI management is not an analytical issue, but a governance issue. Different governance paradigms exist: markets, hierarchies and “third alternatives”, such as common pool resources management and network management. This article presents social learning as the most promising paradigm. Positive experiences with social learning have been described and guidance on putting social learning into practice exists. Nonetheless, there are no magic solutions for managing WRI in the face of different values.

  17. The arcuate fasciculus and the disconnection theme in language and aphasia: History and current state

    PubMed Central

    Catani, Marco; Mesulam, Marsel

    2009-01-01

    Few themes have been more central to neurological models of aphasia than the disconnection paradigm and the role of the arcuate fasciculus. Introduced by luminaries of 19th Century neurology and resurrected by the charismatic work of Norman Geschwind, the disconnection theme has triggered spectacular advances of modern understanding of language and aphasia. But the disconnection paradigm had alternate fortunes, ranging from irrational exuberance to benign neglect, and its followers have not always shared the same view on its functional consequences and anatomical correlates. Our goal in this paper is, first, to survey the 19th Century roots of the connectionist approach to aphasia and, second, to describe emerging imaging technologies based on diffusion tensor imaging (DTI) that promise to consolidate and expand the disconnection approach to language and its disorders. PMID:18614162

  18. The arcuate fasciculus and the disconnection theme in language and aphasia: history and current state.

    PubMed

    Catani, Marco; Mesulam, Marsel

    2008-09-01

    Few themes have been more central to neurological models of aphasia than the disconnection paradigm and the role of the arcuate fasciculus. Introduced by luminaries of 19th Century neurology and resurrected by the charismatic work of Norman Geschwind, the disconnection theme has triggered spectacular advances of modern understanding of language and aphasia. But the disconnection paradigm had alternate fortunes, ranging from irrational exuberance to benign neglect, and its followers have not always shared the same view on its functional consequences and anatomical correlates. Our goal in this paper is, first, to survey the 19th Century roots of the connectionist approach to aphasia and, second, to describe emerging imaging technologies based on diffusion tensor imaging (DTI) that promise to consolidate and expand the disconnection approach to language and its disorders.

  19. Changing paradigm of cancer therapy: precision medicine by next-generation sequencing

    PubMed Central

    Xue, Yuan; Wilcox, William R.

    2016-01-01

    Precision medicine aims to identify the right drug, for the right patient, at the right dose, at the right time, which is particularly important in cancer therapy. Problems such as the variability of treatment response and resistance to medication have been long-standing challenges in oncology, especially for the development of new medications. Solid tumors, unlike hematologic malignancies or brain tumors, are remarkably diverse in their cellular origins and developmental timing. The ability of next-generation sequencing (NGS) to analyze the comprehensive landscape of genetic alterations holds promise for diseases with a highly complex and heterogeneous genetic composition, such as cancer. Here we provide an overview of how NGS is able to facilitate precision medicine and change the paradigm of cancer therapy, especially for solid tumors, through technical advancements, molecular diagnosis, response monitoring and clinical trials. PMID:27144059

  20. A review of estimation of distribution algorithms in bioinformatics

    PubMed Central

    Armañanzas, Rubén; Inza, Iñaki; Santana, Roberto; Saeys, Yvan; Flores, Jose Luis; Lozano, Jose Antonio; Peer, Yves Van de; Blanco, Rosa; Robles, Víctor; Bielza, Concha; Larrañaga, Pedro

    2008-01-01

    Evolutionary search algorithms have become an essential asset in the algorithmic toolbox for solving high-dimensional optimization problems across a broad range of bioinformatics applications. Genetic algorithms, the most well-known and representative evolutionary search technique, have been the subject of the majority of such applications. Estimation of distribution algorithms (EDAs) offer a novel evolutionary paradigm that constitutes a natural and attractive alternative to genetic algorithms. They make use of a probabilistic model, learnt from the promising solutions, to guide the search process. In this paper, we set out a basic taxonomy of EDA techniques, underlining the nature and complexity of the probabilistic model of each EDA variant. We review a set of innovative works that make use of EDA techniques to solve challenging bioinformatics problems, emphasizing the EDA paradigm's potential for further research in this domain. PMID:18822112
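
    To make the EDA idea concrete, here is a minimal sketch of the simplest member of the family, the univariate marginal distribution algorithm (UMDA), run on a toy bit-string problem; the fitness function, population sizes, and smoothing constants are illustrative choices, not taken from the reviewed works.

```python
import numpy as np

def umda(fitness, n_bits=20, pop=100, top=30, iters=50, seed=0):
    """Univariate Marginal Distribution Algorithm, the simplest EDA:
    each generation re-estimates a probability vector from the best
    ('promising') solutions and samples the next population from it."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                        # initial probabilistic model
    for _ in range(iters):
        population = rng.random((pop, n_bits)) < p  # sample candidate solutions
        scores = np.apply_along_axis(fitness, 1, population)
        elite = population[np.argsort(scores)[-top:]]
        p = 0.9 * elite.mean(axis=0) + 0.05         # re-learn model, with smoothing
    return p

# Toy run on onemax (count of 1-bits); the model should drift towards all ones.
print(np.round(umda(lambda bits: bits.sum()), 2))
```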

  1. Genetic manipulation for inherited neurodegenerative diseases: myth or reality?

    PubMed

    Yu-Wai-Man, Patrick

    2016-10-01

    Rare genetic diseases affect about 7% of the general population and over 7000 distinct clinical syndromes have been described with the majority being due to single gene defects. This review will provide a critical overview of genetic strategies that are being pioneered to halt or reverse disease progression in inherited neurodegenerative diseases. This field of research covers a vast area and only the most promising treatment paradigms will be discussed with a particular focus on inherited eye diseases, which have paved the way for innovative gene therapy paradigms, and mitochondrial diseases, which are currently generating a lot of debate centred on the bioethics of germline manipulation.

  2. A data mining paradigm for identifying key factors in biological processes using gene expression data.

    PubMed

    Li, Jin; Zheng, Le; Uchiyama, Akihiko; Bin, Lianghua; Mauro, Theodora M; Elias, Peter M; Pawelczyk, Tadeusz; Sakowicz-Burkiewicz, Monika; Trzeciak, Magdalena; Leung, Donald Y M; Morasso, Maria I; Yu, Peng

    2018-06-13

    A large volume of biological data is being generated for studying mechanisms of various biological processes. These precious data enable large-scale computational analyses to gain biological insights. However, it remains a challenge to mine the data efficiently for knowledge discovery. The heterogeneity of these data makes it difficult to consistently integrate them, slowing down the process of biological discovery. We introduce a data processing paradigm to identify key factors in biological processes via systematic collection of gene expression datasets, primary analysis of data, and evaluation of consistent signals. To demonstrate its effectiveness, our paradigm was applied to epidermal development and identified many genes that play a potential role in this process. Besides the known epidermal development genes, a substantial proportion of the identified genes are still not supported by gain- or loss-of-function studies, yielding many novel genes for future studies. Among them, we selected a top gene for loss-of-function experimental validation and confirmed its function in epidermal differentiation, proving the ability of this paradigm to identify new factors in biological processes. In addition, this paradigm revealed many key genes in cold-induced thermogenesis using data from cold-challenged tissues, demonstrating its generalizability. This paradigm can lead to fruitful results for studying molecular mechanisms in an era of explosive accumulation of publicly available biological data.
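
    The evaluation-of-consistent-signals step can be pictured with a short sketch; the data layout, fold-change cutoff, and consistency threshold below are hypothetical stand-ins, not the paper's actual criteria.

```python
import numpy as np

def consistent_genes(logfc, min_fraction=0.8):
    """Flag genes differentially expressed in the same direction across
    most datasets; logfc is a (datasets x genes) array of log fold-changes."""
    up = (logfc > 1.0).mean(axis=0)
    down = (logfc < -1.0).mean(axis=0)
    return np.where((up >= min_fraction) | (down >= min_fraction))[0]

rng = np.random.default_rng(1)
logfc = rng.normal(0.0, 0.5, (10, 1000))   # 10 collected datasets, 1000 genes
logfc[:, 42] += 2.0                        # one gene consistently up-regulated
print(consistent_genes(logfc))             # should report gene 42
```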

  3. The Human Brain Project and neuromorphic computing

    PubMed Central

    Calimera, Andrea; Macii, Enrico; Poncino, Massimo

    Summary: Understanding how the brain manages billions of processing units connected via kilometers of fibers and trillions of synapses, while consuming a few tens of watts, could provide the key to a completely new category of hardware (neuromorphic computing systems). In order to achieve this, a paradigm shift for computing as a whole is needed, which will see it moving away from current “bit precise” computing models and towards new techniques that exploit the stochastic behavior of simple, reliable, very fast, low-power computing devices embedded in intensely recursive architectures. In this paper we summarize how these objectives will be pursued in the Human Brain Project. PMID:24139655

  4. Designing Interactive Learning Systems.

    ERIC Educational Resources Information Center

    Barker, Philip

    1990-01-01

    Describes multimedia, computer-based interactive learning systems that support various forms of individualized study. Highlights include design models; user interfaces; design guidelines; media utilization paradigms, including hypermedia and learner-controlled models; metaphors and myths; authoring tools; optical media; workstations; four case…

  5. Multi-paradigm simulation at nanoscale: Methodology and application to functional carbon material

    NASA Astrophysics Data System (ADS)

    Su, Haibin

    2012-12-01

    Multi-paradigm methods to span the scales from quantum mechanics to practical issues of functional nanoassembly and nanofabrication are enabling first-principles predictions to guide and complement experimental developments, by computationally designing and optimizing the compositions and structures of materials to assemble nanoscale systems with the requisite properties. In this talk, we employ multi-paradigm approaches to investigate functional carbon materials with versatile character, including fullerene, carbon nanotube (CNT), graphene, and related hybrid structures, which have already created an enormous impact on next-generation nano devices. The topics will cover the reaction dynamics of C60 dimerization and the more challenging complex tubular fullerene formation process in peapod structures; the computational design of a new generation of peapod nano-oscillators; the predicted magnetic state in NanoBuds; opto-electronic properties of graphene nanoribbons; and disorder/vibronic effects on transport in carbon-rich materials.

  6. Computer-Aided Clinical Trial Recruitment Based on Domain-Specific Language Translation: A Case Study of Retinopathy of Prematurity

    PubMed Central

    2017-01-01

    Reusing the data from healthcare information systems can effectively facilitate clinical trials (CTs). How to select candidate patients eligible for CT recruitment criteria is a central task. Related work either depends on DBA (database administrator) to convert the recruitment criteria to native SQL queries or involves the data mapping between a standard ontology/information model and individual data source schema. This paper proposes an alternative computer-aided CT recruitment paradigm, based on syntax translation between different DSLs (domain-specific languages). In this paradigm, the CT recruitment criteria are first formally represented as production rules. The referenced rule variables are all from the underlying database schema. Then the production rule is translated to an intermediate query-oriented DSL (e.g., LINQ). Finally, the intermediate DSL is directly mapped to native database queries (e.g., SQL) automated by ORM (object-relational mapping). PMID:29065644
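
    The translation chain (production rule to query-oriented DSL to native SQL) can be illustrated with a small sketch; the schema, rule format, and helper below are hypothetical, and the paper's actual pipeline targets LINQ with ORM-generated SQL rather than this direct translation.

```python
# Hypothetical recruitment rule: premature infants below gestational-age
# and birth-weight thresholds (illustrative schema, not the paper's).
RULE = {
    "table": "patients",
    "conditions": [("gestational_age_weeks", "<", 31),
                   ("birth_weight_g", "<=", 1500)],
}

def rule_to_sql(rule):
    """Translate one production rule into a parameterized SQL query."""
    where = " AND ".join(f"{col} {op} ?" for col, op, _ in rule["conditions"])
    params = [value for _, _, value in rule["conditions"]]
    return f"SELECT * FROM {rule['table']} WHERE {where}", params

sql, params = rule_to_sql(RULE)
print(sql)     # SELECT * FROM patients WHERE gestational_age_weeks < ? AND birth_weight_g <= ?
print(params)  # [31, 1500]
```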

  7. Foundations and Emerging Paradigms for Computing in Living Cells.

    PubMed

    Ma, Kevin C; Perli, Samuel D; Lu, Timothy K

    2016-02-27

    Genetic circuits, composed of complex networks of interacting molecular machines, enable living systems to sense their dynamic environments, perform computation on the inputs, and formulate appropriate outputs. By rewiring and expanding these circuits with novel parts and modules, synthetic biologists have adapted living systems into vibrant substrates for engineering. Diverse paradigms have emerged for designing, modeling, constructing, and characterizing such artificial genetic systems. In this paper, we first provide an overview of recent advances in the development of genetic parts and highlight key engineering approaches. We then review the assembly of these parts into synthetic circuits from the perspectives of digital and analog logic, systems biology, and metabolic engineering, three areas of particular theoretical and practical interest. Finally, we discuss notable challenges that the field of synthetic biology still faces in achieving reliable and predictable forward-engineering of artificial biological circuits.

  8. Computer-Aided Clinical Trial Recruitment Based on Domain-Specific Language Translation: A Case Study of Retinopathy of Prematurity.

    PubMed

    Zhang, Yinsheng; Zhang, Guoming; Shang, Qian

    2017-01-01

    Reusing the data from healthcare information systems can effectively facilitate clinical trials (CTs). How to select candidate patients eligible for CT recruitment criteria is a central task. Related work either depends on DBA (database administrator) to convert the recruitment criteria to native SQL queries or involves the data mapping between a standard ontology/information model and individual data source schema. This paper proposes an alternative computer-aided CT recruitment paradigm, based on syntax translation between different DSLs (domain-specific languages). In this paradigm, the CT recruitment criteria are first formally represented as production rules. The referenced rule variables are all from the underlying database schema. Then the production rule is translated to an intermediate query-oriented DSL (e.g., LINQ). Finally, the intermediate DSL is directly mapped to native database queries (e.g., SQL) automated by ORM (object-relational mapping).

  9. Methods for Computationally Efficient Structured CFD Simulations of Complex Turbomachinery Flows

    NASA Technical Reports Server (NTRS)

    Herrick, Gregory P.; Chen, Jen-Ping

    2012-01-01

    This research presents more efficient computational methods by which to perform multi-block structured Computational Fluid Dynamics (CFD) simulations of turbomachinery, thus facilitating higher-fidelity solutions of complicated geometries and their associated flows. This computational framework offers flexibility in allocating resources to balance process count and wall-clock computation time, while facilitating research interests of simulating axial compressor stall inception with more complete gridding of the flow passages and rotor tip clearance regions than is typically practiced with structured codes. The paradigm presented herein facilitates CFD simulation of previously impractical geometries and flows. These methods are validated and demonstrate improved computational efficiency when applied to complicated geometries and flows.

  10. Open and closed cortico-subcortical loops: A neuro-computational account of access to consciousness in the distractor-induced blindness paradigm.

    PubMed

    Ebner, Christian; Schroll, Henning; Winther, Gesche; Niedeggen, Michael; Hamker, Fred H

    2015-09-01

    How the brain decides which information to process 'consciously' has been debated over for decades without a simple explanation at hand. While most experiments manipulate the perceptual energy of presented stimuli, the distractor-induced blindness task is a prototypical paradigm to investigate gating of information into consciousness without or with only minor visual manipulation. In this paradigm, subjects are asked to report intervals of coherent dot motion in a rapid serial visual presentation (RSVP) stream, whenever these are preceded by a particular color stimulus in a different RSVP stream. If distractors (i.e., intervals of coherent dot motion prior to the color stimulus) are shown, subjects' abilities to perceive and report intervals of target dot motion decrease, particularly with short delays between intervals of target color and target motion. We propose a biologically plausible neuro-computational model of how the brain controls access to consciousness to explain how distractor-induced blindness originates from information processing in the cortex and basal ganglia. The model suggests that conscious perception requires reverberation of activity in cortico-subcortical loops and that basal-ganglia pathways can either allow or inhibit this reverberation. In the distractor-induced blindness paradigm, inadequate distractor-induced response tendencies are suppressed by the inhibitory 'hyperdirect' pathway of the basal ganglia. If a target follows such a distractor closely, temporal aftereffects of distractor suppression prevent target identification. The model reproduces experimental data on how delays between target color and target motion affect the probability of target detection.

  11. Flight simulation using a Brain-Computer Interface: A pilot, pilot study.

    PubMed

    Kryger, Michael; Wester, Brock; Pohlmeyer, Eric A; Rich, Matthew; John, Brendan; Beaty, James; McLoughlin, Michael; Boninger, Michael; Tyler-Kabara, Elizabeth C

    2017-01-01

    As Brain-Computer Interface (BCI) systems advance for uses such as robotic arm control it is postulated that the control paradigms could apply to other scenarios, such as control of video games, wheelchair movement or even flight. The purpose of this pilot study was to determine whether our BCI system, which involves decoding the signals of two 96-microelectrode arrays implanted into the motor cortex of a subject, could also be used to control an aircraft in a flight simulator environment. The study involved six sessions in which various parameters were modified in order to achieve the best flight control, including plane type, view, control paradigm, gains, and limits. Successful flight was determined qualitatively by evaluating the subject's ability to perform requested maneuvers, maintain flight paths, and avoid control losses such as dives, spins and crashes. By the end of the study, it was found that the subject could successfully control an aircraft. The subject could use both the jet and propeller plane with different views, adopting an intuitive control paradigm. From the subject's perspective, this was one of the most exciting and entertaining experiments she had performed in two years of research. In conclusion, this study provides a proof-of-concept that traditional motor cortex signals combined with a decoding paradigm can be used to control systems besides a robotic arm for which the decoder was developed. Aside from possible functional benefits, it also shows the potential for a new recreational activity for individuals with disabilities who are able to master BCI control.

  12. Towards a symbiotic brain-computer interface: exploring the application-decoder interaction

    NASA Astrophysics Data System (ADS)

    Verhoeven, T.; Buteneers, P.; Wiersema, J. R.; Dambre, J.; Kindermans, P. J.

    2015-12-01

    Objective. State of the art brain-computer interface (BCI) research focuses on improving individual components such as the application or the decoder that converts the user’s brain activity to control signals. In this study, we investigate the interaction between these components in the P300 speller, a BCI for communication. We introduce a synergistic approach in which the stimulus presentation sequence is modified to enhance the machine learning decoding. In this way we aim for an improved overall BCI performance. Approach. First, a new stimulus presentation paradigm is introduced which gives us flexibility in tuning the sequence of visual stimuli presented to the user. Next, an experimental setup in which this paradigm is compared to other paradigms uncovers the underlying mechanism of the interdependence between the application and the performance of the decoder. Main results. Extensive analysis of the experimental results reveals the changing requirements of the decoder concerning the data recorded during the spelling session. When little data has been recorded, the balance in the number of target and non-target stimuli shown to the user is more important than the signal-to-noise ratio (SNR) of the recorded response signals. Only when more data has been collected does the SNR become the dominant factor. Significance. For BCIs in general, knowing the dominant factor that affects decoder performance and being able to respond to it is of utmost importance for improving system performance. For the P300 speller, the proposed tunable paradigm offers the possibility to tune the application to the decoder’s needs at any time and, as such, fully exploit this application-decoder interaction.

  13. Mentat: An object-oriented macro data flow system

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Liu, Jane W. S.

    1988-01-01

    Mentat, an object-oriented macro data flow system designed to facilitate parallelism in distributed systems, is presented. The macro data flow model is a model of computation similar to the data flow model with two principal differences: the computational complexity of the actors is much greater than in traditional data flow systems, and there are persistent actors that maintain state information between executions. Mentat is a system that combines the object-oriented programming paradigm and the macro data flow model of computation. Mentat programs use a dynamic structure called a future list to represent the future of computations.
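
    A loose sketch of the macro data flow flavor using present-day Python futures (purely illustrative; Mentat's future list is a language-level construct, and its actors run distributed rather than in a local thread pool):

```python
from concurrent.futures import ThreadPoolExecutor

def actor_a(x):
    # A coarse-grained "actor": far heavier than a classic dataflow node.
    return x * x

def actor_b(x, y):
    # Conceptually fires only once both of its inputs are available.
    return x + y

with ThreadPoolExecutor() as pool:
    fa = pool.submit(actor_a, 3)   # futures stand in for not-yet-computed values
    fb = pool.submit(actor_a, 4)
    print(actor_b(fa.result(), fb.result()))  # 25
```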

  14. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate the research on data center management and cloud services the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.

  15. Challenges and potential solutions for big data implementations in developing countries.

    PubMed

    Luna, D; Mayan, J C; García, M J; Almerares, A A; Househ, M

    2014-08-15

    The volume of data, the velocity with which they are generated, and their variety and lack of structure hinder their use. This creates the need to change the way information is captured, stored, processed, and analyzed, leading to the paradigm shift called Big Data. This article describes the challenges and possible solutions for developing countries when implementing Big Data projects in the health sector. A non-systematic review of the literature was performed in PubMed and Google Scholar. The following keywords were used: "big data", "developing countries", "data mining", "health information systems", and "computing methodologies". A thematic review of selected articles was performed. There are challenges when implementing any Big Data program, including exponential growth of data, special infrastructure needs, the need for a trained workforce, the need to agree on interoperability standards, privacy and security issues, and the need to include people, processes, and policies to ensure adoption. Developing countries have particular characteristics that hinder further development of these projects. The advent of Big Data promises great opportunities for the healthcare field. In this article, we attempt to describe the challenges developing countries would face and enumerate the options to be used to achieve successful implementations of Big Data programs.

  16. The Design and Analysis of a Novel Split-H-Shaped Metamaterial for Multi-Band Microwave Applications

    PubMed Central

    Islam, Sikder Sunbeam; Faruque, Mohammad Rashed Iqbal; Islam, Mohammad Tariqul

    2014-01-01

    This paper presents the design and analysis of a novel split-H-shaped metamaterial unit cell structure that is applicable in a multi-band frequency range and that exhibits negative permeability and permittivity in those frequency bands. In the basic design, the separate split-square resonators are joined by a metal link to form an H-shaped unit structure. Moreover, an analysis and a comparison of the 1 × 1 array and 2 × 2 array structures and the 1 × 1 and 2 × 2 unit cell configurations were performed. All of these configurations demonstrate multi-band operating frequencies (S-band, C-band, X-band and Ku-band) with double-negative characteristics. The equivalent circuit model and measured result for each unit cell are presented to validate the resonant behavior. The commercially available finite-difference time-domain (FDTD)-based simulation software, Computer Simulation Technology (CST) Microwave Studio, was used to obtain the reflection and transmission parameters of each unit cell. This is a novel and promising design in the electromagnetic paradigm for its simplicity, scalability, double-negative characteristics and multi-band operation. PMID:28788116

  17. Can surgical simulation be used to train detection and classification of neural networks?

    PubMed

    Zisimopoulos, Odysseas; Flouty, Evangello; Stacey, Mark; Muscroft, Sam; Giataganas, Petros; Nehme, Jean; Chow, Andre; Stoyanov, Danail

    2017-10-01

    Computer-assisted interventions (CAI) aim to increase the effectiveness, precision and repeatability of procedures to improve surgical outcomes. The presence and motion of surgical tools is a key information input for CAI surgical phase recognition algorithms. Vision-based tool detection and recognition approaches are an attractive solution and can be designed to take advantage of the powerful deep learning paradigm that is rapidly advancing image recognition and classification. The challenge for such algorithms is the availability and quality of labelled data used for training. In this Letter, surgical simulation is used to train tool detection and segmentation based on deep convolutional neural networks and generative adversarial networks. The authors experiment with two network architectures for image segmentation in tool classes commonly encountered during cataract surgery. A commercially-available simulator is used to create a simulated cataract dataset for training models prior to performing transfer learning on real surgical data. To the best of the authors' knowledge, this is the first attempt to train deep learning models for surgical instrument detection on simulated data while demonstrating promising results to generalise on real data. Results indicate that simulated data does have some potential for training advanced classification methods for CAI systems.

  18. Imaging with terahertz radiation

    NASA Astrophysics Data System (ADS)

    Chan, Wai Lam; Deibel, Jason; Mittleman, Daniel M.

    2007-08-01

    Within the last several years, the field of terahertz science and technology has changed dramatically. Many new advances in the technology for generation, manipulation, and detection of terahertz radiation have revolutionized the field. Much of this interest has been inspired by the promise of valuable new applications for terahertz imaging and sensing. Among a long list of proposed uses, one finds compelling needs such as security screening and quality control, as well as whimsical notions such as counting the almonds in a bar of chocolate. This list has grown in parallel with the development of new technologies and new paradigms for imaging and sensing. Many of these proposed applications exploit the unique capabilities of terahertz radiation to penetrate common packaging materials and provide spectroscopic information about the materials within. Several of the techniques used for terahertz imaging have been borrowed from other, more well established fields such as x-ray computed tomography and synthetic aperture radar. Others have been developed exclusively for the terahertz field, and have no analogies in other portions of the spectrum. This review provides a comprehensive description of the various techniques which have been employed for terahertz image formation, as well as discussing numerous examples which illustrate the many exciting potential uses for these emerging technologies.

  19. The Design and Analysis of a Novel Split-H-Shaped Metamaterial for Multi-Band Microwave Applications.

    PubMed

    Islam, Sikder Sunbeam; Faruque, Mohammad Rashed Iqbal; Islam, Mohammad Tariqul

    2014-07-02

    This paper presents the design and analysis of a novel split-H-shaped metamaterial unit cell structure that is applicable in a multi-band frequency range and that exhibits negative permeability and permittivity in those frequency bands. In the basic design, the separate split-square resonators are joined by a metal link to form an H-shaped unit structure. Moreover, an analysis and a comparison of the 1 × 1 array and 2 × 2 array structures and the 1 × 1 and 2 × 2 unit cell configurations were performed. All of these configurations demonstrate multi-band operating frequencies (S-band, C-band, X-band and Ku-band) with double-negative characteristics. The equivalent circuit model and measured result for each unit cell are presented to validate the resonant behavior. The commercially available finite-difference time-domain (FDTD)-based simulation software, Computer Simulation Technology (CST) Microwave Studio, was used to obtain the reflection and transmission parameters of each unit cell. This is a novel and promising design in the electromagnetic paradigm for its simplicity, scalability, double-negative characteristics and multi-band operation.

  20. When Dijkstra Meets Vanishing Point: A Stereo Vision Approach for Road Detection.

    PubMed

    Zhang, Yigong; Su, Yingna; Yang, Jian; Ponce, Jean; Kong, Hui

    2018-05-01

    In this paper, we propose a vanishing-point constrained Dijkstra road model for road detection in a stereo-vision paradigm. First, the stereo camera is used to generate the u- and v-disparity maps of the road image, from which the horizon can be extracted. With the horizon and ground region constraints, we can robustly locate the vanishing point of the road region. Second, a weighted graph is constructed using all pixels of the image, and the detected vanishing point is treated as the source node of the graph. By computing a vanishing-point constrained Dijkstra minimum-cost map, where both the disparity and the gradient of the gray image are used to calculate the cost between two neighboring pixels, the problem of detecting road borders in the image is transformed into that of finding two shortest paths that originate at the vanishing point and end at two pixels in the last row of the image. The proposed approach has been implemented and tested over 2600 grayscale images of different road scenes in the KITTI data set. The experimental results demonstrate that this training-free approach can detect the horizon, vanishing point, and road regions very accurately and robustly. It can achieve promising performance.
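
    The minimum-cost-map step is plain Dijkstra over the pixel grid; a compact sketch follows, with a uniform toy cost in place of the paper's disparity-plus-gradient edge costs.

```python
import heapq
import numpy as np

def dijkstra_cost_map(cost, source):
    """Minimum cumulative cost from `source` to every pixel (4-connected grid)."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r, c]:
            continue                      # stale heap entry, skip
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                heapq.heappush(heap, (dist[nr, nc], (nr, nc)))
    return dist

cost = np.ones((50, 80))                  # toy image; real costs mix disparity and gradient
dist = dijkstra_cost_map(cost, (5, 40))   # source: the detected vanishing point
print(int(np.argmin(dist[-1])))           # cheapest road-border endpoint in the last row
```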

  1. Designed electromagnetic pulsed therapy: clinical applications.

    PubMed

    Gordon, Glen A

    2007-09-01

    First reduced to science by Maxwell in 1865, electromagnetic technology as therapy received little interest from basic scientists or clinicians until the 1980s. It now promises applications that include mitigation of inflammation (electrochemistry) and stimulation of classes of genes following the onset of illness and injury (electrogenomics). The use of electromagnetism to stop inflammation and restore tissue seems a logical phenomenology: stop the inflammation, then upregulate classes of restorative gene loci to initiate healing. Studies in the fields of MRI and NMR have aided the understanding of cell response to low-energy EMF inputs via electromagnetically responsive elements. By understanding protein iterations, that is, how they process information to direct energy, we can maximize this technology to aid restorative intervention, a promising step forward over current paradigms of therapy.

  2. Substance abuse prevention in American Indian and Alaska Native communities.

    PubMed

    Whitbeck, Les B; Walls, Melissa L; Welch, Melissa L

    2012-09-01

    In this article we review three categories of American Indian/Alaska Native (AIAN) substance abuse prevention programs: (1) published empirical trials; (2) promising programs, published and unpublished, that are in the process of development and that have the potential for empirical trials; and (3) examples of innovative grassroots programs that originate at the local level and may have promise for further development. AIAN communities are taking more and more independent control of substance abuse prevention. We point out that European American prevention scientists are largely unaware of the numerous grassroots prevention efforts under way in AIAN communities, and we urge a paradigm shift from adapting European American prevention science "best practices" to creating cultural "best practices" by working from inside AIAN communities.

  3. Fuel Efficiencies Through Airframe Improvements

    NASA Technical Reports Server (NTRS)

    Bezos-O'Connor, Gaudy M.; Mangelsdorf, Mark F.; Maliska, Heather A.; Washburn, Anthony E.; Wahls, Richard A.

    2011-01-01

    The factors of continuing strong growth in air traffic volume, the vital role of the air transport system on the economy, and concerns about the environmental impact of aviation have added focus to the National Aeronautics Research Policy. To address these concerns in the context of the National Policy, NASA has set aggressive goals in noise reduction, emissions, and energy consumption. With respect to the goal of reducing energy consumption in the fleet, the development of promising airframe technologies is required to realize the significant improvements that are desired. Furthermore, the combination of advances in materials and structures with aerodynamic technologies may lead to a paradigm shift in terms of potential configurations for the future. Some of these promising airframe technologies targeted at improved efficiency are highlighted.

  4. A bridge from physics to biology.

    PubMed

    Preparata, Giuliano

    2010-01-01

    Through molecular biology, the 'atomistic paradigm' tries to remove from the analysis of living matter every element of what appears as the distinguishing character of the chain of the biological processes: their cooperative, collective aspects. Living matter appears, on the contrary, governed by Quantum Field Theory (QFT), spontaneously creating order when the thermodynamical conditions are right. 'Electrodynamical coherence' (EC) is the most promising hint for the existence of a bridge between Physics and Biology.

  5. Comparison of tactile, auditory, and visual modality for brain-computer interface use: a case study with a patient in the locked-in state.

    PubMed

    Kaufmann, Tobias; Holz, Elisa M; Kübler, Andrea

    2013-01-01

    This paper describes a case study with a patient in the classic locked-in state, who currently has no means of independent communication. Following a user-centered approach, we investigated event-related potentials (ERP) elicited in different modalities for use in brain-computer interface (BCI) systems. Such systems could provide her with an alternative communication channel. To investigate the most viable modality for achieving BCI based communication, classic oddball paradigms (1 rare and 1 frequent stimulus, ratio 1:5) in the visual, auditory and tactile modality were conducted (2 runs per modality). Classifiers were built on one run and tested offline on another run (and vice versa). In these paradigms, the tactile modality was clearly superior to other modalities, displaying high offline accuracy even when classification was performed on single trials only. Consequently, we tested the tactile paradigm online and the patient successfully selected targets without any error. Furthermore, we investigated use of the visual or tactile modality for different BCI systems with more than two selection options. In the visual modality, several BCI paradigms were tested offline. Neither matrix-based nor so-called gaze-independent paradigms constituted a means of control. These results may thus question the gaze-independence of current gaze-independent approaches to BCI. A tactile four-choice BCI resulted in high offline classification accuracies. Yet, online use raised various issues. Although performance was clearly above chance, practical daily life use appeared unlikely when compared to other communication approaches (e.g., partner scanning). Our results emphasize the need for user-centered design in BCI development including identification of the best stimulus modality for a particular user. Finally, the paper discusses feasibility of EEG-based BCI systems for patients in classic locked-in state and compares BCI to other AT solutions that we also tested during the study.

  6. A novel paradigm for cell and molecule interaction ontology: from the CMM model to IMGT-ONTOLOGY

    PubMed Central

    2010-01-01

    Background: Biology is moving fast toward the virtuous circle of other disciplines: from data to quantitative modeling and back to data. Models are usually developed by mathematicians, physicists, and computer scientists to translate qualitative or semi-quantitative biological knowledge into a quantitative approach. To eliminate semantic confusion between biology and other disciplines, it is necessary to have a list of the most important and frequently used concepts coherently defined. Results: We propose a novel paradigm for generating new concepts for an ontology, starting from a model rather than from a database. We apply that approach to generate concepts for cell and molecule interaction starting from an agent based model. This effort provides a solid infrastructure that is useful to overcome the semantic ambiguities that arise between biologists and mathematicians, physicists, and computer scientists, when they interact in a multidisciplinary field. Conclusions: This effort represents the first attempt at linking molecule ontology with cell ontology, in IMGT-ONTOLOGY, the well established ontology in immunogenetics and immunoinformatics, and a paradigm for life science biology. With the increasing use of models in biology and medicine, the need to link different levels, from molecules to cells to tissues and organs, is increasingly important. PMID:20167082

  7. Evaluating brain-computer interface performance using color in the P300 checkerboard speller.

    PubMed

    Ryan, D B; Townsend, G; Gates, N A; Colwell, K; Sellers, E W

    2017-10-01

    Current Brain-Computer Interface (BCI) systems typically flash an array of items from grey to white (GW). The objective of this study was to evaluate BCI performance using uniquely colored stimuli. In addition to the GW stimuli, the current study tested two types of color stimuli (grey to color [GC] and color intensification [CI]). The main hypotheses were that in a checkerboard paradigm, unique color stimuli will: (1) increase BCI performance over the standard GW paradigm; (2) elicit larger event-related potentials (ERPs); and, (3) improve offline performance with an electrode selection algorithm (i.e., Jumpwise). Online results (n=36) showed that GC provides higher accuracy and information transfer rate than the CI and GW conditions. Waveform analysis showed that GC produced higher amplitude ERPs than CI and GW. Information transfer rate was improved by the Jumpwise-selected channel locations in all conditions. Unique color stimuli (GC) improved BCI performance and enhanced ERPs. Jumpwise-selected electrode locations improved offline performance. These results show that in a checkerboard paradigm, unique color stimuli increase BCI performance, are preferred by participants, and are important to the design of end-user applications; thus, they could lead to an increase in end-user performance and acceptance of BCI technology.

  8. Precision Agriculture Design Method Using a Distributed Computing Architecture on Internet of Things Context.

    PubMed

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Martínez, José

    2018-05-28

    The Internet of Things (IoT) has opened productive ways to cultivate soil with the use of low-cost hardware (sensors/actuators) and communication (Internet) technologies. Remote equipment and crop monitoring, predictive analytics, weather forecasting for crops, and smart logistics and warehousing are some examples of these new opportunities. Nevertheless, farmers are agriculture experts but usually do not have experience with IoT applications. Users of IoT applications must participate in their design to improve integration and use. In this work, different industrial agricultural facilities are analysed with farmers and growers to design new functionalities based on the deployment of IoT paradigms. A user-centred design model is used to obtain knowledge and experience in the process of introducing technology into agricultural applications. Internet of Things paradigms are used as resources to facilitate decision making. The IoT architecture, operating rules and smart processes are implemented using a distributed model based on edge and fog computing paradigms. A communication architecture is proposed using these technologies. The aim is to help farmers develop smart systems, both in current and in new facilities. Different decision trees to automate the installation, designed by the farmer, can be easily deployed using the method proposed in this document.
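
    A minimal sketch of the kind of farmer-designed decision rule such an edge node could evaluate locally (sensor names, thresholds, and commands are hypothetical, not taken from the paper):

```python
# Rules are evaluated on the edge device against the latest sensor readings.
RULES = [
    {"if": lambda s: s["soil_moisture"] < 0.25 and not s["raining"],
     "then": "irrigation_on"},
    {"if": lambda s: s["soil_moisture"] >= 0.35,
     "then": "irrigation_off"},
]

def evaluate(sensors):
    """Return the first matching actuation command, or None to do nothing."""
    for rule in RULES:
        if rule["if"](sensors):
            return rule["then"]
    return None

print(evaluate({"soil_moisture": 0.18, "raining": False}))  # irrigation_on
```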

  9. Do You Think You Can? The Influence of Student Self-Efficacy on the Effectiveness of Tutorial Dialogue for Computer Science

    ERIC Educational Resources Information Center

    Wiggins, Joseph B.; Grafsgaard, Joseph F.; Boyer, Kristy Elizabeth; Wiebe, Eric N.; Lester, James C.

    2017-01-01

    In recent years, significant advances have been made in intelligent tutoring systems, and these advances hold great promise for adaptively supporting computer science (CS) learning. In particular, tutorial dialogue systems that engage students in natural language dialogue can create rich, adaptive interactions. A promising approach to increasing…

  10. The Technology Fix: The Promise and Reality of Computers in Our Schools

    ERIC Educational Resources Information Center

    Pflaum, William D.

    2004-01-01

    During the technology boom of the 1980s and 1990s, computers seemed set to revolutionize education. Do any of these promises sound familiar? (1) Technology would help all students learn better, thanks to multimedia programs capable of adapting to individual needs, learning styles, and skill levels; (2) Technology would transform the teacher's role…

  11. On the 'principle of the quantumness', the quantumness of Relativity, and the computational grand-unification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Ariano, Giacomo Mauro

    2010-05-04

    I will argue that the proposal of establishing operational foundations of Quantum Theory should have top priority, and that Lucien Hardy's program on Quantum Gravity should be paralleled by an analogous program on Quantum Field Theory (QFT), which needs to be reformulated, notwithstanding its experimental success. In this paper, after reviewing recently suggested operational 'principles of the quantumness', I address the problem of whether Quantum Theory and Special Relativity are unrelated theories, or instead, if the one implies the other. I show how Special Relativity can be indeed derived from causality of Quantum Theory, within the computational paradigm 'the universe is a huge quantum computer', reformulating QFT as a Quantum-Computational Field Theory (QCFT). In QCFT Special Relativity emerges from the fabric of the computational network, which also naturally embeds gauge invariance. In this scheme even the quantization rule and the Planck constant can in principle be derived as emergent from the underlying causal tapestry of space-time. In this way Quantum Theory remains the only theory operating the huge computer of the universe. Is the computational paradigm only a speculative tautology (theory as simulation of reality), or does it have a scientific value? The answer will come from Occam's razor, depending on the mathematical simplicity of QCFT. Here I will just start scratching the surface of QCFT, analyzing simple field theories, including Dirac's. The number of problems and unmotivated recipes that plague QFT strongly motivates us to undertake the QCFT project, since QCFT makes all such problems manifest, and forces a re-foundation of QFT.

  12. Basic concepts and development of an all-purpose computer interface for ROC/FROC observer study.

    PubMed

    Shiraishi, Junji; Fukuoka, Daisuke; Hara, Takeshi; Abe, Hiroyuki

    2013-01-01

    In this study, we initially investigated various aspects of the requirements for a computer interface employed in receiver operating characteristic (ROC) and free-response ROC (FROC) observer studies, which involve digital images and ratings obtained by observers (radiologists). Secondly, by taking these aspects into account, an all-purpose computer interface for these observer performance studies was developed. Basically, observer studies can be classified into three paradigms: one rating for one case without identification of a signal location, one rating for one case with identification of a signal location, and multiple ratings for one case with identification of signal locations. For these paradigms, display modes on the computer interface can be used for single/multiple views of a static image, continuous viewing with cascade images (i.e., CT, MRI), and dynamic viewing of movies (i.e., DSA, ultrasound). Various functions in these display modes, including windowing (contrast/level), magnification, and annotations, need to be selected by the experimenter according to the purpose of the research. In addition, the rules of judgment for distinguishing between true positives and false positives are an important factor for estimating diagnostic accuracy in an observer study. We developed a computer interface which runs on the Windows operating system, taking into account all aspects required for various observer studies. This computer interface requires experimenters to have sufficient knowledge of ROC/FROC observer studies, but it can be used for observer studies of any design. This computer interface will be distributed publicly in the near future.
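
    Downstream of any such interface, the collected ratings feed an ROC analysis; a minimal sketch of the empirical ROC area (AUC) computed from observer ratings follows (the 5-point confidence ratings are hypothetical).

```python
import numpy as np

def roc_auc(ratings_pos, ratings_neg):
    """Empirical AUC from ordinal ratings via the Mann-Whitney statistic."""
    pos = np.asarray(ratings_pos, float)[:, None]
    neg = np.asarray(ratings_neg, float)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

# Ratings for signal-present vs. signal-absent cases on a 5-point scale.
print(roc_auc([5, 4, 4, 3, 5], [1, 2, 3, 1, 2]))  # 0.98
```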

  13. Psst, Can You Keep a Secret?

    PubMed

    Vassilev, Apostol; Mouha, Nicky; Brandão, Luís

    2018-01-01

    The security of encrypted data depends not only on the theoretical properties of cryptographic primitives but also on the robustness of their implementations in software and hardware. Threshold cryptography introduces a computational paradigm that enables higher assurance for such implementations.
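
    As a toy illustration of the threshold paradigm, here is a minimal Shamir k-of-n secret sharing sketch (requires Python 3.8+ for modular inverses via pow); real threshold cryptography relies on vetted libraries, side-channel-hardened implementations, and richer primitives than this.

```python
import random

P = 2**127 - 1  # a Mersenne prime serving as the field modulus

def split(secret, n=3, k=2):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789)
assert reconstruct(shares[:2]) == 123456789   # any 2 of the 3 shares suffice
```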

  14. Feature Discovery by Competitive Learning.

    ERIC Educational Resources Information Center

    Rumelhart, David E.; Zipser, David

    1985-01-01

    Reports results of studies with an unsupervised learning paradigm called competitive learning which is examined using computer simulation and formal analysis. When competitive learning is applied to parallel networks of neuron-like elements, many potentially useful learning tasks can be accomplished. (Author)
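
    A minimal numerical sketch of the winner-take-all update at the heart of competitive learning (data, learning rate, and unit count are illustrative, not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
# Three Gaussian clusters serve as toy input patterns.
data = np.vstack([rng.normal(c, 0.1, (100, 2)) for c in ([0, 0], [1, 1], [0, 1])])
weights = rng.random((3, 2))   # one weight vector per competing unit
lr = 0.05

for x in rng.permutation(data):
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))  # units compete
    weights[winner] += lr * (x - weights[winner])            # winner moves toward input

print(np.round(weights, 2))    # each row should approach one cluster centre
```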

  15. Behavior Models for Software Architecture

    DTIC Science & Technology

    2014-11-01

    MP. Existing process modeling frameworks (BPEL, BPMN [Grosskopf et al. 2009], IDEF) usually follow the “single flowchart” paradigm. MP separates... Process: Business Process Modeling using BPMN, Meghan Kiffer Press. HAREL, D., 1987, A Visual Formalism for Complex Systems. Science of Computer

  16. Highlights from the previous volumes

    NASA Astrophysics Data System (ADS)

    Vergini Eduardo, G.; Pan, Y.; al., Vardi R. et; al., Akkermans Eric et; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems

  17. From Image Analysis to Computer Vision: Motives, Methods, and Milestones.

    DTIC Science & Technology

    1998-07-01

    images. Initially, work on digital image analysis dealt with specific classes of images such as text, photomicrographs, nuclear particle tracks, and aerial... photographs; but by the 1960s, general algorithms and paradigms for image analysis began to be formulated. When the artificial intelligence... scene, but eventually from image sequences obtained by a moving camera; at this stage, image analysis had become scene analysis or computer vision

  18. DEVSML 2.0: The Language and the Stack

    DTIC Science & Technology

    2012-03-01

    problems outside it. For example, HTML for web pages, Verilog and VHDL for hardware description, etc. are DSLs for very specific domains. A DSL can be... Engineering (MDE) paradigm where meta-modeling allows such transformations. The metamodeling approach to Model Integrated Computing (MIC) brings... University of Arizona, 2007 [5] Mittal, S, Martin, JLR, Zeigler, BP, "DEVS-Based Web Services for Net-centric T&E", Summer Computer Simulation

  19. Converging Paradigms: A Reflection on Parallel Theoretical Developments in Psychoanalytic Metapsychology and Empirical Dream Research.

    PubMed

    Schmelowszky, Ágoston

    2016-08-01

    In recent decades one can perceive a striking parallelism between the shifting perspective of leading representatives of empirical dream research concerning their conceptualization of dreaming and the paradigm shift within clinically based psychoanalytic metapsychology with respect to its theory on the significance of dreaming. In metapsychology, dreaming becomes more and more a central metaphor of mental functioning in general. The theories of Klein, Bion, and Matte-Blanco can be considered as milestones of this paradigm shift. In empirical dream research, the competing theories of Hobson and of Solms respectively argued for and against the meaningfulness of the dream-work in the functioning of the mind. In the meantime, empirical data coming from various sources seemed to prove the significance of dream consciousness for the development and maintenance of adaptive waking consciousness. Metapsychological speculations and hypotheses based on empirical research data seem to point in the same direction, promising a more secure theoretical base for contemporary psychoanalytic practice. In this paper the author brings together these diverse theoretical developments and presents conclusions regarding psychoanalytic theory and technique, as well as proposing an outline of an empirical research plan for testing the specificity of psychoanalysis in developing dream formation.

  20. Games people play-toward an enactive view of cooperation in social neuroscience.

    PubMed

    Engemann, Denis A; Bzdok, Danilo; Eickhoff, Simon B; Vogeley, Kai; Schilbach, Leonhard

    2012-01-01

    The field of social neuroscience has made considerable progress in unraveling the neural correlates of human cooperation by making use of brain imaging methods. Within this field, neuroeconomic research has drawn on paradigms from experimental economics, such as the Prisoner's Dilemma (PD) and the Trust Game. These paradigms capture the topic of conflict in cooperation, while focusing strongly on outcome-related decision processes. Cooperation, however, does not equate with that perspective, but relies on additional psychological processes and events, including shared intentions and mutually coordinated joint action. These additional facets of cooperation have been successfully addressed by research in developmental psychology, cognitive science, and social philosophy. Corresponding neuroimaging data, however, is still sparse. Therefore, in this paper, we present a juxtaposition of these mutually related but mostly independent trends in cooperation research. We propose that the neuroscientific study of cooperation could benefit from paradigms and concepts employed in developmental psychology and social philosophy. Bringing both to a neuroimaging environment might allow studying the neural correlates of cooperation by using formal models of decision-making as well as capturing the neural responses that underlie joint action scenarios, thus, promising to advance our understanding of the nature of human cooperation.

  1. Games people play—toward an enactive view of cooperation in social neuroscience

    PubMed Central

    Engemann, Denis A.; Bzdok, Danilo; Eickhoff, Simon B.; Vogeley, Kai; Schilbach, Leonhard

    2012-01-01

    The field of social neuroscience has made considerable progress in unraveling the neural correlates of human cooperation by making use of brain imaging methods. Within this field, neuroeconomic research has drawn on paradigms from experimental economics, such as the Prisoner's Dilemma (PD) and the Trust Game. These paradigms capture the topic of conflict in cooperation, while focusing strongly on outcome-related decision processes. Cooperation, however, does not equate with that perspective, but relies on additional psychological processes and events, including shared intentions and mutually coordinated joint action. These additional facets of cooperation have been successfully addressed by research in developmental psychology, cognitive science, and social philosophy. Corresponding neuroimaging data, however, is still sparse. Therefore, in this paper, we present a juxtaposition of these mutually related but mostly independent trends in cooperation research. We propose that the neuroscientific study of cooperation could benefit from paradigms and concepts employed in developmental psychology and social philosophy. Bringing both to a neuroimaging environment might allow studying the neural correlates of cooperation by using formal models of decision-making as well as capturing the neural responses that underlie joint action scenarios, thus, promising to advance our understanding of the nature of human cooperation. PMID:22675293

  2. Visualizing a silicon quantum computer

    NASA Astrophysics Data System (ADS)

    Sanders, Barry C.; Hollenberg, Lloyd C. L.; Edmundson, Darran; Edmundson, Andrew

    2008-12-01

    Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.

  3. Balancing a simulated inverted pendulum through motor imagery: an EEG-based real-time control paradigm.

    PubMed

    Yue, Jingwei; Zhou, Zongtan; Jiang, Jun; Liu, Yadong; Hu, Dewen

    2012-08-30

    Most brain-computer interfaces (BCIs) are non-time-restraint systems. However, designing a real-time BCI paradigm for controlling unstable devices is still a challenging problem. This paper presents a real-time feedback BCI paradigm for controlling an inverted pendulum on a cart (IPC). In this paradigm, sensorimotor rhythms (SMRs) were recorded using 15 active electrodes placed on the surface of the subject's scalp. Subsequently, common spatial patterns (CSP) were used as the basic filter to extract spatial patterns. Finally, linear discriminant analysis (LDA) was used to translate the patterns into control commands that could stabilize the simulated inverted pendulum. Offline training sessions were employed to teach the subjects to execute the corresponding mental tasks, such as left/right hand motor imagery. Five subjects could successfully balance the online inverted pendulum for more than 35 s. The results demonstrate that BCIs are able to control nonlinear unstable devices. Furthermore, the demonstration and extension of real-time continuous control might be useful for the real-life application and generalization of BCI.
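
    As a compact illustration of the CSP-plus-LDA recipe named above (a generic sketch on simulated trials; the channel count and toy signal model are invented, and an online system would additionally map classifier outputs to pendulum commands):

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(X1, X2, n_pairs=2):
    """CSP spatial filters from two classes of trials (trial x channel x sample)."""
    C1 = np.mean([np.cov(t) for t in X1], axis=0)
    C2 = np.mean([np.cov(t) for t in X2], axis=0)
    vals, vecs = eigh(C1, C1 + C2)             # generalized eigendecomposition
    idx = np.argsort(vals)
    return vecs[:, np.r_[idx[:n_pairs], idx[-n_pairs:]]].T

def features(trials, W):
    """Normalized log band-power of the spatially filtered trials."""
    Z = np.einsum('fc,tcs->tfs', W, trials)
    v = Z.var(axis=2)
    return np.log(v / v.sum(axis=1, keepdims=True))

rng = np.random.default_rng(0)                 # simulated 15-channel SMR data
X1 = rng.standard_normal((40, 15, 250)); X1[:, 3] *= 3.0   # class 1: channel 3 active
X2 = rng.standard_normal((40, 15, 250)); X2[:, 9] *= 3.0   # class 2: channel 9 active

W = csp_filters(X1, X2)
X = np.vstack([features(X1, W), features(X2, W)])
y = np.r_[np.zeros(40), np.ones(40)]
print(LinearDiscriminantAnalysis().fit(X, y).score(X, y))  # training accuracy
```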

  4. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    NASA Technical Reports Server (NTRS)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of existing soft-computing software by supporting comprehensive multidisciplinary functionalities, from management tools to engineering systems. Furthermore, the built-in features help the user process and analyze information more efficiently through a friendly yet powerful interface, and allow the user to specify custom processing modules, hence adding to the standard configuration of the software environment.

  5. AGIS: The ATLAS Grid Information System

    NASA Astrophysics Data System (ADS)

    Anisenkov, Alexey; Belov, Sergey; Di Girolamo, Alessandro; Gayazov, Stavro; Klimentov, Alexei; Oleynik, Danila; Senchenko, Alexander

    2012-12-01

    ATLAS is a particle physics experiment at the Large Hadron Collider at CERN. The experiment produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm with a high degree of decentralization, and its computing resources are able to meet ATLAS requirements for petabyte-scale data operations. In this paper we present the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about the resources, services and topology of the whole ATLAS Grid, as needed by ATLAS Distributed Computing applications and services.

  6. On the Use of CAD-Native Predicates and Geometry in Surface Meshing

    NASA Technical Reports Server (NTRS)

    Aftosmis, M. J.

    1999-01-01

    Several paradigms for accessing computer-aided design (CAD) geometry during surface meshing for computational fluid dynamics are discussed. File translation, inconsistent geometry engines, and nonnative point construction are all identified as sources of nonrobustness. The paper argues in favor of accessing CAD parts and assemblies in their native format, without translation, and for the use of CAD-native predicates and constructors in surface mesh generation. The discussion also emphasizes the importance of examining the computational requirements for exact evaluation of triangulation predicates during surface meshing.
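
    The closing point about exact evaluation of triangulation predicates can be made concrete. Below is a small Python sketch (not from the paper) of the classic 2D orientation predicate evaluated in exact rational arithmetic; with plain floats, the sign of the determinant can come out wrong for nearly collinear inputs.

        from fractions import Fraction

        def orient2d_exact(a, b, c):
            # Sign of the doubled signed area of triangle abc, computed exactly.
            # +1: counter-clockwise, -1: clockwise, 0: exactly collinear.
            ax, ay = map(Fraction, a)
            bx, by = map(Fraction, b)
            cx, cy = map(Fraction, c)
            det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
            return (det > 0) - (det < 0)

        # A collinear configuration of the kind where naive floating-point
        # evaluation is error-prone for slightly perturbed inputs:
        print(orient2d_exact((0.5, 0.5), (12.0, 12.0), (24.0, 24.0)))  # 0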

  7. Robust artifactual independent component classification for BCI practitioners.

    PubMed

    Winkler, Irene; Brandl, Stephanie; Horn, Franziska; Waldburger, Eric; Allefeld, Carsten; Tangermann, Michael

    2014-06-01

    EEG artifacts of non-neural origin can be separated from neural signals by independent component analysis (ICA). It is unclear (1) how robustly recently proposed artifact classifiers transfer to novel users, novel paradigms or changed electrode setups, and (2) how artifact cleaning by a machine learning classifier affects the performance of brain-computer interfaces (BCIs). Addressing (1), the robustness of a recently proposed classifier, under different transfer strategies between paradigms and electrode setups, is investigated on offline data from 35 users and 3 EEG paradigms, comprising 6303 expert-labeled components from two ICA and preprocessing variants. Addressing (2), the effect of artifact removal on single-trial BCI classification is estimated on BCI trials from 101 users and 3 paradigms. We show that (1) the proposed artifact classifier generalizes to completely different EEG paradigms, and a proposed novel strategy yields similar artifact classification under massively reduced electrode setups. Addressing (2), ICA artifact cleaning has little influence on average BCI performance when analyzed with state-of-the-art BCI methods. When slow motor-related features are exploited, performance varies strongly between individuals, as artifacts may obstruct relevant neural activity or be inadvertently used for BCI control. EEG practitioners can reproduce the proposed strategies, as the method is made available as an EEGLAB plug-in.
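
    For readers wanting the gist of ICA-based cleaning in code, here is a rough Python sketch. The kurtosis rule is only a placeholder for the trained component classifier the paper describes (which ships as an EEGLAB plug-in), and the toy data are synthetic.

        import numpy as np
        from scipy.stats import kurtosis
        from sklearn.decomposition import FastICA

        def clean_eeg(X, kurt_thresh=8.0):
            # X: (n_times, n_channels). Unmix, drop artifact-like components, remix.
            # The kurtosis rule stands in for a trained component classifier.
            ica = FastICA(n_components=X.shape[1], random_state=0)
            S = ica.fit_transform(X)                       # independent components
            artifact = kurtosis(S, axis=0) > kurt_thresh   # blinks etc. are very peaky
            S[:, artifact] = 0.0                           # zero out flagged components
            return ica.inverse_transform(S)

        rng = np.random.default_rng(1)
        eeg = rng.standard_normal((5000, 15))
        eeg[:, 3] += (rng.random(5000) < 0.01) * 40.0  # inject spiky 'blink' activity
        cleaned = clean_eeg(eeg)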

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorai, Prashun; Toberer, Eric S.; Stevanović, Vladan

    Here, at room temperature and above, most magnetic materials adopt a spin-disordered (paramagnetic) state whose electronic properties can differ significantly from their low-temperature, spin-ordered counterparts. Yet computational searches for new functional materials usually assume some type of magnetic order. In the present work, we demonstrate a methodology to incorporate spin disorder in computational searches and predict the electronic properties of the paramagnetic phase. We implement this method in a high-throughput framework to assess the potential for thermoelectric performance of 1350 transition-metal sulfides and find that all magnetic systems we identify as promising in the spin-ordered ground state cease to be promising in the paramagnetic phase due to disorder-induced deterioration of the charge carrier transport properties. We also identify promising non-magnetic candidates that do not suffer from these spin disorder effects. In addition to identifying promising materials, our results offer insights into the apparent scarcity of magnetic systems among known thermoelectrics and highlight the importance of including spin disorder in computational searches.

  9. In Vitro Models of Human Toxicity Pathways

    EPA Science Inventory

    For toxicity testing and assessment programs to address the large numbers of substances of potential concern, a paradigm shift in the assessment of chemical hazard and risk is needed that takes advantage of advances in molecular toxicology, computational sciences, and information...

  10. Psst, Can You Keep a Secret?

    PubMed Central

    Vassilev, Apostol; Mouha, Nicky; Brandão, Luís

    2018-01-01

    The security of encrypted data depends not only on the theoretical properties of cryptographic primitives but also on the robustness of their implementations in software and hardware. Threshold cryptography introduces a computational paradigm that enables higher assurance for such implementations. PMID:29576634
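
    The abstract leaves the paradigm abstract, so a concrete instance may help: Shamir's secret sharing, the textbook threshold scheme, splits a secret into n shares such that any t of them reconstruct it while fewer reveal nothing. A minimal Python sketch (illustrative, not a hardened implementation):

        import secrets

        P = 2**127 - 1  # a Mersenne prime used as the field modulus

        def split(secret, n, t):
            # Split `secret` into n shares; any t of them reconstruct it.
            coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
            return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
                    for x in range(1, n + 1)]

        def reconstruct(shares):
            # Lagrange interpolation at x = 0 over GF(P).
            total = 0
            for i, (xi, yi) in enumerate(shares):
                num = den = 1
                for j, (xj, _) in enumerate(shares):
                    if i != j:
                        num = num * (-xj) % P
                        den = den * (xi - xj) % P
                total = (total + yi * num * pow(den, -1, P)) % P
            return total

        shares = split(secret=123456789, n=5, t=3)
        assert reconstruct(shares[:3]) == 123456789  # any 3 of the 5 shares suffice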

  11. Performance Support on the Shop Floor.

    ERIC Educational Resources Information Center

    Kasvi, Jyrki J. J.; Vartiainen, Matti

    2000-01-01

    Discussion of performance support on the shop floor highlights four support systems for assembly lines that incorporate personal computer workstations in local area networks and use multimedia documents. Considers new customer-focused production paradigms; organizational learning; knowledge development; and electronic performance support systems…

  12. Cloud computing security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to fully embrace the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  13. Consequences of "going digital" for pathology professionals - entering the cloud.

    PubMed

    Laurinavicius, Arvydas; Raslavicus, Paul

    2012-01-01

    New opportunities and the adoption of digital technologies will transform the way pathology professionals and services work. Many areas of our daily lives, as well as other medical professions, have already experienced this change, which has resulted in a paradigm shift in many activities. Pathology is an image-based discipline; therefore, the arrival of digital imaging in this domain promises a major shift in our work and in the mindset it requires. Recognizing the physical and digital duality of the pathology workflow, we can prepare for the imminent increase of the digital component, synergize with it, and enjoy its benefits. Development of a new generation of laboratory information systems, along with seamless integration of digital imaging, decision support, and knowledge databases, will enable pathologists to work in a distributed environment. The paradigm of "cloud pathology" is proposed as an ultimate vision of digital pathology workstations plugged into integrated multidisciplinary patient care systems.

  14. The value and validation of broad spectrum biosensors for diagnosis and biodefense

    PubMed Central

    Metzgar, David; Sampath, Rangarajan; Rounds, Megan A; Ecker, David J

    2013-01-01

    Broad spectrum biosensors capable of identifying diverse organisms are transitioning from the realm of research into the clinic. These technologies simultaneously capture signals from a wide variety of biological entities using universal processes. Specific organisms are then identified through bioinformatic signature-matching processes. This is in contrast to currently accepted molecular diagnostic technologies, which utilize unique reagents and processes to detect each organism of interest. This paradigm shift greatly increases the breadth of molecular diagnostic tools with little increase in biochemical complexity, enabling simultaneous diagnostic, epidemiologic, and biothreat surveillance capabilities at the point of care. This, in turn, offers the promise of increased biosecurity and better antimicrobial stewardship. Efficient realization of these potential gains will require novel regulatory paradigms reflective of the generalized, information-based nature of these assays, allowing extension of empirical data obtained from readily available organisms to support broader reporting of rare, difficult to culture, or extremely hazardous organisms. PMID:24128433

  15. A new biology for a new century.

    PubMed

    Woese, Carl R

    2004-06-01

    Biology today is at a crossroads. The molecular paradigm, which so successfully guided the discipline throughout most of the 20th century, is no longer a reliable guide. Its vision of biology now realized, the molecular paradigm has run its course. Biology, therefore, has a choice to make, between the comfortable path of continuing to follow molecular biology's lead or the more invigorating one of seeking a new and inspiring vision of the living world, one that addresses the major problems in biology that 20th century biology, molecular biology, could not handle and, so, avoided. The former course, though highly productive, is certain to turn biology into an engineering discipline. The latter holds the promise of making biology an even more fundamental science, one that, along with physics, probes and defines the nature of reality. This is a choice between a biology that solely does society's bidding and a biology that is society's teacher.

  16. Advances in thermoelectric materials research: Looking back and moving forward.

    PubMed

    He, Jian; Tritt, Terry M

    2017-09-29

    High-performance thermoelectric materials lie at the heart of thermoelectrics, the simplest technology applicable to direct thermal-to-electrical energy conversion. In its recent 60-year history, the field of thermoelectric materials research has stalled several times, but each time it was rejuvenated by new paradigms. This article reviews several potentially paradigm-changing mechanisms enabled by defects, size effects, critical phenomena, anharmonicity, and the spin degree of freedom. These mechanisms decouple the otherwise adversely interdependent physical quantities toward higher material performance. We also briefly discuss a number of promising materials, advanced material synthesis and preparation techniques, and new opportunities. The renewable energy landscape will be reshaped if the current trend in thermoelectric materials research is sustained into the foreseeable future. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  17. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost-effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  18. Paradigms in Chronic Obstructive Pulmonary Disease: Phenotypes, Immunobiology, and Therapy with a Focus on Vascular Disease

    PubMed Central

    Schivo, Michael; Albertson, Timothy E.; Haczku, Angela; Kenyon, Nicholas J.; Zeki, Amir A.; Kuhn, Brooks T.; Louie, Samuel; Avdalovic, Mark V.

    2018-01-01

    Chronic obstructive pulmonary disease (COPD) is a complex and heterogeneous syndrome that represents a major global health burden. COPD phenotypes have recently emerged based on large cohort studies addressing the need to better characterize the syndrome. Though comprehensive phenotyping is still at an early stage, factors such as ethnicity and radiographic, serum, and exhaled-breath biomarkers have shown promise. COPD is also an immunological disease in which innate and adaptive immune responses to the environment and tobacco smoke are altered. The frequent overlap between COPD and other systemic diseases, such as cardiovascular disease, has influenced COPD therapy, and treatments for both conditions may lead to improved patient outcomes. Here we discuss current paradigms that center on improving the definition of COPD, understanding the immunological overlap between COPD and vascular inflammation, and the treatment of COPD, with a focus on comorbid cardiovascular disease. PMID:28258130

  19. Advanced Bioinks for 3D Printing: A Materials Science Perspective.

    PubMed

    Chimene, David; Lennox, Kimberly K; Kaunas, Roland R; Gaharwar, Akhilesh K

    2016-06-01

    Advanced bioinks for 3D printing are rationally designed materials intended to improve the functionality of printed scaffolds outside the traditional paradigm of the "biofabrication window". While the biofabrication window paradigm necessitates compromise between suitability for fabrication and ability to accommodate encapsulated cells, recent developments in advanced bioinks have resulted in improved designs for a range of biofabrication platforms without this tradeoff. This has resulted in a new generation of bioinks with high print fidelity, shear-thinning characteristics, and crosslinked scaffolds with high mechanical strength, high cytocompatibility, and the ability to modulate cellular functions. In this review, we describe some of the promising strategies being pursued to achieve these goals, including multimaterial, interpenetrating network, nanocomposite, and supramolecular bioinks. We also provide an overview of current and emerging trends in advanced bioink synthesis and biofabrication, and evaluate the potential applications of these novel biomaterials to clinical use.

  20. Translating genomic information into clinical medicine: lung cancer as a paradigm.

    PubMed

    Levy, Mia A; Lovly, Christine M; Pao, William

    2012-11-01

    We are currently in an era of rapidly expanding knowledge about the genetic landscape and architectural blueprints of various cancers. These discoveries have led to a new taxonomy of malignant diseases based upon clinically relevant molecular alterations in addition to histology or tissue of origin. The new molecularly based classification holds the promise of rational rather than empiric approaches for the treatment of cancer patients. However, the accelerated pace of discovery and the expanding number of targeted anti-cancer therapies present a significant challenge for healthcare practitioners to remain informed and up-to-date on how to apply cutting-edge discoveries into daily clinical practice. In this Perspective, we use lung cancer as a paradigm to discuss challenges related to translating genomic information into the clinic, and we present one approach we took at Vanderbilt-Ingram Cancer Center to address these challenges.

  1. E-health beyond technology: analyzing the paradigm shift that lies beneath.

    PubMed

    Moerenhout, Tania; Devisch, Ignaas; Cornelis, Gustaaf C

    2018-03-01

    Information and computer technology has come to play an increasingly important role in medicine, to the extent that e-health has been described as a disruptive innovation or revolution in healthcare. The attention is very much focused on the technology itself, and advances that have been made in genetics and biology. This leads to the question: What is changing in medicine today concerning e-health? To what degree could these changes be characterized as a 'revolution'? We will apply the work of Thomas Kuhn, Larry Laudan, Michel Foucault and other philosophers-which offers an alternative understanding of progress and revolution in medicine to the classic discovery-oriented approach-to our analysis. Nowadays, the long-standing curative or reactive paradigm in medicine is facing a crisis due to an aging population, a significant increase in chronic diseases and the development of more expensive diagnostic tools and therapies. This promotes the evolution towards a new paradigm with an emphasis on preventive medicine. E-health constitutes an essential part of this new paradigm that seeks to solve the challenges presented by an aging population, skyrocketing costs and so forth. Our approach changes the focus from the technology itself toward the underlying paradigm shift in medicine. We will discuss the relevance of this approach by applying it to the surge in digital self-tracking through health apps and wearables: the recognition of the underlying paradigm shift leads to a more comprehensive understanding of self-tracking than a solely discovery-oriented or technology-focused view can provide.

  2. Bridging paradigms: hybrid mechanistic-discriminative predictive models.

    PubMed

    Doyle, Orla M; Tsaneva-Atansaova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa

    2013-03-01

    Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.

  3. Novel Androgen Deprivation Therapy (ADT) in the Treatment of Advanced Prostate Cancer.

    PubMed

    Aragon-Ching, Jeanny B; Dahut, William L

    2010-07-01

    Androgen deprivation therapy has been the mainstay of treatment for advanced and metastatic prostate cancer. The use of novel agents targeting the androgen receptor and its signaling pathways offers a promising approach that is both safe and effective. We describe the rationale behind the use of these compounds in clinical development and the existing challenges as to how best to incorporate these new and emerging therapies in the changing treatment paradigm of metastatic prostate cancer.

  4. Performance limitations of label-free sensors in molecular diagnosis using complex samples

    NASA Astrophysics Data System (ADS)

    Varma, Manoj

    2016-03-01

    Label-free biosensors promised a paradigm involving direct detection of biomarkers from complex samples such as serum, without the multistep sample processing typical of labelled methods such as ELISA or immunofluorescence assays. Label-free sensors have witnessed decades of development, with a veritable zoo of techniques available today exploiting a multitude of physical effects. It is appropriate now to critically assess whether label-free technologies have delivered on their promise with respect to diagnostic applications, particularly ambitious goals such as early cancer detection using serum biomarkers, which require low limits of detection (LoD). Comparison of nearly 120 LoD values reported by labelled and label-free sensing approaches, over a wide range of detection techniques and target molecules in serum, revealed that labelled techniques achieve 2-3 orders of magnitude better LoDs. Data from experiments where labelled and label-free assays were performed simultaneously using the same assay parameters also confirm that the LoD achieved by labelled techniques is 2 to 3 orders of magnitude better. Furthermore, label-free techniques required significant signal amplification, e.g., using nanoparticle-conjugated secondary antibodies, to achieve LoDs comparable to labelled methods, substantially deviating from the original "direct detection" paradigm. This finding has important implications for the practical limits of applying label-free detection methods in molecular diagnosis.

  5. Digital technology in respiratory diseases: Promises, (no) panacea and time for a new paradigm.

    PubMed

    Pinnock, Hilary; McKinstry, Brian

    2016-05-01

    In a world where digital technology has revolutionized the way we work, shop and manage our finances, it is unsurprising that digital systems are suggested as potential solutions for delivering clinically effective and cost-effective care to an aging population with one or more long-term conditions. However, recent evidence suggesting that telehealth may not be quite the panacea that was promised has led to discussions on the mechanisms and role of digital technology in respiratory care. Implementation in rural and remote settings offers significant benefits in terms of convenient access to care, but is contingent on technical and organizational infrastructure. Telemonitoring systems rely on algorithms to detect deterioration and trigger alerts; machine learning may enable the telemonitoring of the future to develop personalized systems that are sensitive to clinical status whilst reducing false alerts. By providing access to information, offering convenient and flexible modes of communication and enabling the transfer of monitoring data to support professional assessment, telehealth can support self-management. At present, all too often, expensive 'off the shelf' systems are purchased and given to clinicians to use. It is time for the paradigm to shift. As clinicians, we should identify the specific challenges we face in delivering care, and expect flexible systems that can be customized to individual patients' requirements and adapted to our diverse healthcare contexts. © The Author(s) 2016.

  6. Refilling the half-empty glass--Investigating the potential role of the Interpretation Modification Paradigm for Depression (IMP-D).

    PubMed

    Möbius, Martin; Tendolkar, Indira; Lohner, Valerie; Baltussen, Mirte; Becker, Eni S

    2015-12-01

    Cognitive biases are known to cause and maintain depression. However, little research has been done on techniques targeting the interpretation tendencies found in depression, despite promising findings from anxiety studies. This paper presents two experiments investigating the suitability, in healthy individuals, of an Interpretation Modification Paradigm for Depression (IMP-D), an approach that has already proven effective in anxiety (Beard & Amir, 2008). Unlike other paradigms, the IMP-D aims at modifying an interpretation bias both at the response level and at the more implicit reaction-time level, making the task less susceptible to demand effects. The Word-Sentence Association Paradigm for Depression (Hindash & Amir, 2011) was modified and administered to healthy volunteers (experiment I: N = 81; experiment II: N = 105). To enhance a positive interpretation bias, endorsing benign and rejecting negative interpretations of ambiguous scenarios were reinforced through feedback. This intervention was compared to the opposite training (both experiments) and a control training (experiment II only). Both experiments revealed a significant increase in bias towards benign interpretations at the level of overt decisions, while a change at the reaction-time level was found only in the first experiment. These modifications were not reflected in group differences in emotional vulnerability. Possible limitations regarding the reliability of inter-dependent response and reaction-time measures are discussed. The IMP-D is able to modify interpretation biases, but adaptations are required to maximize its beneficial effects. Copyright © 2015 Elsevier Ltd. All rights reserved.
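
    To make the training contingency concrete, here is a minimal Python sketch of one WSAP-style trial with feedback; the example item and the respond callback are hypothetical, and the real task also records reaction times, which this sketch omits.

        import random

        # Hypothetical item: an ambiguous scenario plus a benign and a negative reading.
        ITEMS = [("People laugh after something you said.",
                  "The joke worked.", "They are mocking you.")]

        def run_trial(item, respond):
            # One trial: show one interpretation, collect a yes/no association
            # judgement, and give feedback that reinforces the benign reading.
            scenario, benign, negative = item
            probe = random.choice([benign, negative])
            endorsed = respond(scenario, probe)         # True if judged as related
            reinforced = endorsed == (probe == benign)  # positive-training rule
            return "correct" if reinforced else "incorrect"

        # e.g. a scripted participant who always endorses the probe:
        print(run_trial(ITEMS[0], respond=lambda scenario, probe: True))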

  7. Perspectives of mobile learning in optics and photonics

    NASA Astrophysics Data System (ADS)

    Curticapean, Dan; Christ, Andreas; Feißt, Markus

    2010-08-01

    Mobile learning (m-learning) can be considered a new paradigm of e-learning. The developed solution enables the presentation of animations and 3D virtual reality (VR) on mobile devices and is well suited for mobile learning. Difficult relations in physics, as well as intricate experiments in optics, can be visualised on mobile devices without the need for a personal computer. Because the computational load is outsourced to a server, coverage is worldwide.

  8. Video-task assessment of learning and memory in Macaques (Macaca mulatta) - Effects of stimulus movement on performance

    NASA Technical Reports Server (NTRS)

    Washburn, David A.; Hopkins, William D.; Rumbaugh, Duane M.

    1989-01-01

    Effects of stimulus movement on learning, transfer, matching, and short-term memory performance were assessed with 2 monkeys using a video-task paradigm in which the animals responded to computer-generated images by manipulating a joystick. Performance on tests of learning set, transfer index, matching to sample, and delayed matching to sample in the video-task paradigm was comparable to that obtained in previous investigations using the Wisconsin General Testing Apparatus. Additionally, learning, transfer, and matching were reliably and significantly better when the stimuli or discriminanda moved than when the stimuli were stationary. External manipulations such as stimulus movement may increase attention to the demands of a task, which in turn should increase the efficiency of learning. These findings have implications for the investigation of learning in other populations, as well as for the application of the video-task paradigm to comparative study.

  9. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was threefold: to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  10. Contemporary cybernetics and its facets of cognitive informatics and computational intelligence.

    PubMed

    Wang, Yingxu; Kinsner, Witold; Zhang, Du

    2009-08-01

    This paper explores the architecture, theoretical foundations, and paradigms of contemporary cybernetics from perspectives of cognitive informatics (CI) and computational intelligence. The modern domain and the hierarchical behavioral model of cybernetics are elaborated at the imperative, autonomic, and cognitive layers. The CI facet of cybernetics is presented, which explains how the brain may be mimicked in cybernetics via CI and neural informatics. The computational intelligence facet is described with a generic intelligence model of cybernetics. The compatibility between natural and cybernetic intelligence is analyzed. A coherent framework of contemporary cybernetics is presented toward the development of transdisciplinary theories and applications in cybernetics, CI, and computational intelligence.

  11. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  12. Spatiotemporal Beamforming: A Transparent and Unified Decoding Approach to Synchronous Visual Brain-Computer Interfacing.

    PubMed

    Wittevrongel, Benjamin; Van Hulle, Marc M

    2017-01-01

    Brain-Computer Interfaces (BCIs) decode brain activity with the aim of establishing a direct communication channel with an external device. Although they have been hailed as a means to (re-)establish communication in persons suffering from severe motor and/or communication disabilities, only recently have BCI applications begun to challenge other assistive technologies. Owing to their considerably increased performance and the advent of affordable technological solutions, BCI technology is expected to trigger a paradigm shift not only in assistive technology but also in the way we interface with technology. However, the flipside of the quest for accuracy and speed is most evident in EEG-based visual BCI, where it has led to a gamut of increasingly complex classifiers tailored to the needs of specific stimulation paradigms and use contexts. In this contribution, we argue that spatiotemporal beamforming can serve several synchronous visual BCI paradigms. We demonstrate this for three popular visual paradigms, without even attempting to optimize their electrode sets. For each selectable target, a spatiotemporal beamformer is applied to assess whether the corresponding signal-of-interest is present in the preprocessed multichannel EEG signals. The target with the highest beamformer output is then selected by the decoder (maximum selection). In addition to this simple selection rule, we also investigated whether interactions between beamformer outputs could be employed to increase accuracy, by combining the outputs for all targets into a feature vector and applying three common classification algorithms. The results show that the accuracy of spatiotemporal beamforming with maximum selection is on par with that of the classification algorithms, and that interactions between beamformer outputs do not further improve accuracy.
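
    As a schematic Python rendering of the decoder just described, the sketch below scores each selectable target with an LCMV-style spatiotemporal beamformer and applies maximum selection; the data shapes, noise-covariance estimate and regularization constant are toy assumptions, not the paper's settings.

        import numpy as np

        def beamformer_score(trial, template, cov_inv):
            # LCMV-style spatiotemporal beamformer output for one target template.
            # trial, template: (n_channels, n_samples), flattened to one vector.
            a = template.ravel()
            w = cov_inv @ a / (a @ cov_inv @ a)  # unit-gain constraint: w @ a == 1
            return w @ trial.ravel()

        rng = np.random.default_rng(2)
        n_ch, n_s, n_targets = 8, 50, 4
        templates = rng.standard_normal((n_targets, n_ch, n_s))  # per-target responses
        noise = rng.standard_normal((200, n_ch * n_s))           # calibration trials
        cov = noise.T @ noise / len(noise) + 0.1 * np.eye(n_ch * n_s)  # regularized
        cov_inv = np.linalg.inv(cov)

        trial = templates[2] + 0.5 * rng.standard_normal((n_ch, n_s))  # attends target 2
        scores = [beamformer_score(trial, t, cov_inv) for t in templates]
        print(int(np.argmax(scores)))  # maximum selection: index of the decoded target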

  13. A novel Brain Computer Interface for classification of social joint attention in autism and comparison of 3 experimental setups: A feasibility study.

    PubMed

    Amaral, Carlos P; Simões, Marco A; Mouga, Susana; Andrade, João; Castelo-Branco, Miguel

    2017-10-01

    We present a novel virtual-reality P300-based Brain Computer Interface (BCI) paradigm that uses social cues to direct the focus of attention. We combined interactive immersive virtual-reality (VR) technology with the properties of P300 signals in a training tool that can be used in social attention disorders such as autism spectrum disorder (ASD). We tested the novel social attention training paradigm (a P300-based BCI paradigm for rehabilitation of joint-attention skills) in 13 healthy participants, on 3 EEG systems. The most suitable setup was then tested online with 4 ASD subjects. Statistical accuracy was assessed based on the detection of the P300, using spatial filtering and a Naïve-Bayes classifier. We compared: 1 - g.Mobilab+ (active dry electrodes, wireless transmission); 2 - g.Nautilus (active electrodes, wireless transmission); 3 - V-Amp with actiCAP Xpress dry electrodes. Statistically significant classification was achieved with all systems. g.Nautilus proved to be the best-performing system in terms of P300 detection accuracy, preparation time, speed and reported comfort. Proof-of-concept tests in ASD participants showed that this setup is feasible for training joint-attention skills in ASD. This work provides a unique combination of easy-to-use BCI systems with new technologies such as VR to train joint-attention skills in autism. Our P300 BCI paradigm is feasible for future Phase I/II clinical trials to train joint-attention skills, with successful classification within a few trials, online, in ASD participants. The g.Nautilus system is the best-performing one to use with the developed BCI setup. Copyright © 2017 Elsevier B.V. All rights reserved.
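
    In outline, the detection step (features from spatially filtered P300 epochs fed to a Naïve-Bayes classifier) reduces to something like the following Python toy; the synthetic epochs, decimate-and-flatten features and 1-in-6 target rate are stand-in assumptions, not the study's actual preprocessing.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        n_epochs, n_ch, n_s = 300, 8, 60
        X = rng.standard_normal((n_epochs, n_ch, n_s))
        y = rng.random(n_epochs) < 1 / 6  # ~1 in 6 stimuli is the attended one
        X[y, :, 25:40] += 1.0             # crude P300-like positive deflection

        feats = X[:, :, ::5].reshape(n_epochs, -1)  # decimate in time, flatten ch x time
        clf = make_pipeline(StandardScaler(), GaussianNB()).fit(feats[:250], y[:250])
        print(clf.score(feats[250:], y[250:]))      # held-out single-epoch accuracy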

  14. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

    A new era is dawning in which vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged in which seemingly limitless compute and storage resources are provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance for data analytics workloads is an important issue in cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their patterns of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. Throughout, we use the Map-Reduce paradigm as an illustration of data analytics.
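
    Since the paper uses Map-Reduce as its running illustration, a toy in-memory version of the paradigm's three phases (map, shuffle, reduce) is sketched below in Python; word count is the standard didactic job, not an example from the paper.

        from collections import defaultdict
        from itertools import chain

        def map_phase(record):
            # Emit (key, value) pairs; here, one pair per word.
            return [(word, 1) for word in record.split()]

        def shuffle(pairs):
            # Group all values by key, as the framework would between phases.
            groups = defaultdict(list)
            for key, value in pairs:
                groups[key].append(value)
            return groups

        def reduce_phase(key, values):
            return key, sum(values)

        records = ["cloud data analytics", "data analytics in the cloud"]
        pairs = chain.from_iterable(map_phase(r) for r in records)
        result = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
        print(result)  # e.g. {'cloud': 2, 'data': 2, 'analytics': 2, ...}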

  15. Quaternary beetle research: the state of the art

    NASA Astrophysics Data System (ADS)

    Elias, Scott A.

    2006-08-01

    Quaternary beetle research has progressed in a variety of ways during the last decade. New kinds of data are being extracted from the fossil specimens themselves, such as ancient DNA and stable isotopes. The ancient DNA studies hold the promise of providing new insights into the stability of beetle genotypes, and the study of stable isotopes of H and O from fossil beetle chitin promises an independent proxy for the reconstruction of temperature and precipitation. The discipline is also expanding into previously unstudied regions, such as Australia, New Zealand, and northern Asia. Along with the new study regions, new schools of thought are forming in the discipline, challenging old research paradigms. This is a necessary step forward for the discipline as it grows and develops in the 21st century.

  16. Shifting Paradigms in Management Education: What Happens When We Take Groups Seriously.

    ERIC Educational Resources Information Center

    Mundell, Bryan; Pennarola, Ferdinando

    1999-01-01

    An Italian university's capstone business administration course is designed around andragogical principles. Students spend 90% of their time in independent teamwork on multidisciplinary problems. The course uses information technology in the form of databases and networked computers. (SK)

  17. A Paradigm for the Next Millenium: Health Information Science.

    ERIC Educational Resources Information Center

    Sadler, Lewis

    1991-01-01

    Described is a curriculum for a new multidisciplinary science-Health Information Science-that incorporates aspects of computer science, cognitive psychology, bioengineering, biomedical visualization, medicine, dentistry, anthropology, mathematics, library science, and the visual arts. The situation of the medical illustration profession is…

  18. The MUSOS (MUsic SOftware System) Toolkit: A computer-based, open source application for testing memory for melodies.

    PubMed

    Rainsford, M; Palmer, M A; Paine, G

    2018-04-01

    Despite numerous innovative studies, rates of replication in the field of music psychology are extremely low (Frieler et al., 2013). Two key methodological challenges affecting researchers wishing to administer and reproduce studies in music cognition are the difficulty of measuring musical responses, particularly when conducting free-recall studies, and access to a reliable set of novel stimuli unrestricted by copyright or licensing issues. In this article, we propose a solution for these challenges in computer-based administration. We present a computer-based application for testing memory for melodies. Created using the software Max/MSP (Cycling '74, 2014a), the MUSOS (Music Software System) Toolkit uses a simple modular framework configurable for testing common paradigms such as recall, old-new recognition, and stem completion. The program is accompanied by a stimulus set of 156 novel, copyright-free melodies, in audio and Max/MSP file formats. Two pilot tests were conducted to establish the properties of the accompanying stimulus set that are relevant to music cognition and general memory research. By using this software, a researcher without specialist musical training may administer and accurately measure responses from common paradigms used in the study of memory for music.

  19. A Comprehensive Review of Existing Risk Assessment Models in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Amini, Ahmad; Jamil, Norziana

    2018-05-01

    Cloud computing is a popular paradigm in information technology and computing, as it offers numerous advantages in terms of economic savings and minimal management effort. Although its elasticity and flexibility bring tremendous benefits, it still raises many information security issues due to the unique characteristics that allow ubiquitous computing. Therefore, the vulnerabilities and threats in cloud computing have to be identified, and a proper risk assessment mechanism has to be in place for better cloud computing management. Various quantitative and qualitative risk assessment models have been proposed but, to our knowledge, none of them is suitable for the cloud computing environment. In this paper, we compare and analyse the strengths and weaknesses of existing risk assessment models. We then propose a new risk assessment model that sufficiently addresses all the characteristics of cloud computing, which the existing models do not.

  20. Using Common Graphics Paradigms Implemented in a Java Applet to Represent Complex Scheduling Requirements

    NASA Technical Reports Server (NTRS)

    Jaap, John; Meyer, Patrick; Davis, Elizabeth

    1997-01-01

    The experiments planned for the International Space Station promise to be complex, lengthy and diverse. The scarcity of space station resources will cause significant competition for resources between experiments. The scheduling job facing the Space Station mission planning software requires a concise and comprehensive description of the experiments' requirements (to ensure a valid schedule) and a good description of the experiments' flexibility (to effectively utilize available resources). In addition, the continuous operation of the station, the wide geographic dispersion of station users, and the budgetary pressure to reduce operations manpower make a low-cost solution mandatory. A graphical representation of the scheduling requirements for station payloads, implemented via an Internet-based application, promises to be an elegant solution that addresses all of these issues. The graphical representation of experiment requirements permits a station user to describe an experiment by defining "activities" and "sequences of activities". Activities define the resource requirements (with alternatives) and other quantitative constraints of tasks to be performed. Activity definitions use an "outline" graphics paradigm. Sequences define the time relationships between activities. Sequences may also define time relationships with activities of other payloads or space station systems. Sequences of activities are described by a "network" graphics paradigm. The bulk of this paper describes the graphical approach to representing requirements and provides examples that show the ease and clarity with which complex requirements can be represented. A Java applet, to run in a web browser, is being developed to support the graphical representation of payload scheduling requirements. Implementing the entry and editing of requirements via the web solves the problems introduced by the geographic dispersion of users. Manpower reduction is achieved through a concise representation that eliminates the misunderstandings possible with verbose representations while capturing the complete requirements and flexibility of the experiments.
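
    As a purely hypothetical sketch of the two constructs described above, activities carrying quantitative resource requirements and sequences carrying temporal relations, consider the following Python data model; every field name and unit here is invented for illustration and does not reflect the actual mission-planning schema.

        from dataclasses import dataclass, field

        @dataclass
        class Activity:
            # A task with quantitative resource needs; names and units are invented.
            name: str
            duration_min: int
            resources: dict = field(default_factory=dict)  # e.g. {"power_W": 120}

        @dataclass
        class Sequence:
            # A temporal relation between two activities.
            before: str
            after: str
            min_gap_min: int = 0  # 'after' may start no sooner than this gap

        thaw = Activity("thaw_sample", duration_min=30,
                        resources={"power_W": 60, "crew": 1})
        run = Activity("run_experiment", duration_min=120,
                       resources={"power_W": 450})
        plan = [Sequence(before="thaw_sample", after="run_experiment", min_gap_min=10)]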
