Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2003-01-01
The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004 and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
2003-01-01
The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview-type presentations and three exhibits by vendors.
Social Studies and Emerging Paradigms: Artificial Intelligence and Consciousness Education.
ERIC Educational Resources Information Center
Braun, Joseph A., Jr.
1987-01-01
Asks three questions: (1) Are machines capable of thinking as people do? (2) How is the thinking of computers similar and different from human thinking? and (3) What exactly is thinking? Examines research in artificial intelligence. Describes the theory and research of consciousness education and discusses an emerging paradigm for human thinking…
Designing Ubiquitous Computing to Enhance Children's Learning in Museums
ERIC Educational Resources Information Center
Hall, T.; Bannon, L.
2006-01-01
In recent years, novel paradigms of computing have emerged, which enable computational power to be embedded in artefacts and in environments in novel ways. These developments may create new possibilities for using computing to enhance learning. This paper presents the results of a design process that set out to explore interactive techniques,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marcer, Peter J.; Rowlands, Peter
2010-12-22
Further evidence is presented in favour of the computational paradigm, conceived and constructed by Rowlands and Diaz, as detailed in Rowlands' book Zero to Infinity (2007), and in particular the authors' paper 'The Grammatical Universe: the Laws of Thermodynamics and Quantum Entanglement'. The paradigm, which has isomorphic group and algebraic quantum mechanical language interpretations, not only predicts the well-established facts of quantum physics, the periodic table, chemistry / valence and of molecular biology, whose understanding it extends; it also provides an elegant, simple solution to the unresolved quantum measurement problem. In this fundamental paradigm, all the computational constructs / predictions that emerge, follow from the simple fact, that, as in quantum mechanics, the wave function is defined only up to an arbitrary fixed phase. This fixed phase provides a simple physical understanding of the quantum vacuum in quantum field theory, where only relative phases, known to be able to encode 3+1 relativistic space-time geometries, can be measured. It is the arbitrary fixed measurement standard, against which everything that follows is to be measured, even though the standard itself cannot be, since nothing exists against which to measure it. The standard, as an arbitrary fixed reference phase, functions as the holographic basis for a self-organized universal quantum process of emergent novel fermion states of matter where, following each emergence, the arbitrary standard is re-fixed anew so as to provide a complete history / holographic record or hologram of the current fixed past, advancing an unending irreversible evolution, such as is the evidence of our senses. The fermion states, in accord with the Pauli exclusion principle, each correspond to a unique nilpotent symbol in the infinite alphabet (which specifies the grammar in this nilpotent universal computational rewrite system (NUCRS) paradigm); and the alphabet, as Hill and Rowlands hypothesize on substantial evidence [26], includes that of the RNA / DNA genetic code and, as holographic phase encodings / holograms, the 4D geometries of all living systems as self-organised grammatical computational rewrite machines / machinery. Human brains, natural grammatical (written symbol) languages, 4D geometric self-awareness and a totally new emergent property of matter, human consciousness, can thus with some measure of confidence be postulated as further genetic consequences which follow from this self-organizing fundamental rewrite NUCRS construction. For it, like natural language, possesses a semantics and not just a syntax, where the initial symbol, i.e. the arbitrary fixed phase measurement standard, is able to function as the template for the blueprints of the emergent 4D relativistic real and virtual geometries to come, in a 'from the Self Creation to the creation of the human self' computational rewrite process evolution.
NASA Astrophysics Data System (ADS)
Marcer, Peter J.; Rowlands, Peter
2010-12-01
Further evidence is presented in favour of the computational paradigm, conceived and constructed by Rowlands and Diaz, as detailed in Rowlands' book Zero to Infinity (2007) [2], and in particular the authors' paper `The Grammatical Universe: the Laws of Thermodynamics and Quantum Entanglement' [1]. The paradigm, which has isomorphic group and algebraic quantum mechanical language interpretations, not only predicts the well-established facts of quantum physics, the periodic table, chemistry / valence and of molecular biology, whose understanding it extends; it also provides an elegant, simple solution to the unresolved quantum measurement problem. In this fundamental paradigm, all the computational constructs / predictions that emerge, follow from the simple fact, that, as in quantum mechanics, the wave function is defined only up to an arbitrary fixed phase. This fixed phase provides a simple physical understanding of the quantum vacuum in quantum field theory, where only relative phases, known to be able to encode 3+1 relativistic space-time geometries, can be measured. It is the arbitrary fixed measurement standard, against which everything that follows is to be measured, even though the standard itself cannot be, since nothing exists against which to measure it. The standard, as an arbitrary fixed reference phase, functions as the holographic basis for a self-organized universal quantum process of emergent novel fermion states of matter where, following each emergence, the arbitrary standard is re-fixed anew so as to provide a complete history / holographic record or hologram of the current fixed past, advancing an unending irreversible evolution, such as is the evidence of our senses. The fermion states, in accord with the Pauli exclusion principle, each correspond to a unique nilpotent symbol in the infinite alphabet (which specifies the grammar in this nilpotent universal computational rewrite system (NUCRS) paradigm); and the alphabet, as Hill and Rowlands hypothesize on substantial evidence [26], includes that of the RNA / DNA genetic code and, as holographic phase encodings / holograms, the 4D geometries of all living systems as self-organised grammatical computational rewrite machines / machinery. Human brains, natural grammatical (written symbol) languages, 4D geometric self-awareness and a totally new emergent property of matter, human consciousness, can thus with some measure of confidence be postulated as further genetic consequences which follow from this self-organizing fundamental rewrite NUCRS construction. For it, like natural language, possesses a semantics and not just a syntax, where the initial symbol, i.e. the arbitrary fixed phase measurement standard, is able to function as the template for the blueprints of the emergent 4D relativistic real and virtual geometries to come, in a `from the Self Creation to the creation of the human self' computational rewrite process evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas; Schuman, Catherine; Patton, Robert
The White House and Department of Energy have been instrumental in driving the development of a neuromorphic computing program to help the United States continue its lead in basic research into (1) Beyond Exascale—high performance computing beyond Moore’s Law and von Neumann architectures, (2) Scientific Discovery—new paradigms for understanding increasingly large and complex scientific data, and (3) Emerging Architectures—assessing the potential of neuromorphic and quantum architectures. Neuromorphic computing spans a broad range of scientific disciplines from materials science to devices, to computer science, to neuroscience, all of which are required to solve the neuromorphic computing grand challenge. In our workshop we focus on the computer science aspects, specifically from a neuromorphic device through an application. Neuromorphic devices present a very different paradigm to the computer science community from traditional von Neumann architectures, which raises six major questions about building a neuromorphic application from the device level. We used these fundamental questions to organize the workshop program and to direct the workshop panels and discussions. From the white papers, presentations, panels, and discussions, there emerged several recommendations on how to proceed.
ERIC Educational Resources Information Center
Dulaney, Malik H.
2013-01-01
Emerging technologies challenge the management of information technology in organizations. Paradigm changing technologies, such as cloud computing, have the ability to reverse the norms in organizational management, decision making, and information technology governance. This study explores the effects of cloud computing on information technology…
Client/Server Architecture Promises Radical Changes.
ERIC Educational Resources Information Center
Freeman, Grey; York, Jerry
1991-01-01
This article discusses the emergence of the client/server paradigm for the delivery of computer applications, its emergence in response to the proliferation of microcomputers and local area networks, the applicability of the model in academic institutions, and its implications for college campus information technology organizations. (Author/DB)
ERIC Educational Resources Information Center
Rimland, Jeffrey C.
2013-01-01
In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…
A programming language for composable DNA circuits
Phillips, Andrew; Cardelli, Luca
2009-01-01
Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415
A programming language for composable DNA circuits.
Phillips, Andrew; Cardelli, Luca
2009-08-06
Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing.
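As an illustration of the kind of model such a language compiles to, the following is a minimal Python sketch (not the authors' language or its semantics): a two-step strand-displacement cascade expressed as an abstract chemical reaction network and integrated as mass-action ODEs. Species names and rate constants are illustrative assumptions.

```python
from scipy.integrate import solve_ivp

# Reactions (mass action, simplified two-step cascade):
#   input + gate      -> output (+ waste)       rate k1  (toehold-mediated displacement)
#   output + reporter -> fluorophore (+ waste)  rate k2  (readout)
k1, k2 = 1e4, 1e5  # assumed bimolecular rate constants, /M/s

def rhs(t, y):
    inp, gate, out, rep, fluor = y
    r1 = k1 * inp * gate
    r2 = k2 * out * rep
    return [-r1, -r1, r1 - r2, -r2, r2]

y0 = [100e-9, 50e-9, 0.0, 50e-9, 0.0]   # initial concentrations (M)
sol = solve_ivp(rhs, (0.0, 3600.0), y0)  # simulate one hour
print("fluorophore after 1 h: %.2e M" % sol.y[4, -1])
```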
Foundations and Emerging Paradigms for Computing in Living Cells.
Ma, Kevin C; Perli, Samuel D; Lu, Timothy K
2016-02-27
Genetic circuits, composed of complex networks of interacting molecular machines, enable living systems to sense their dynamic environments, perform computation on the inputs, and formulate appropriate outputs. By rewiring and expanding these circuits with novel parts and modules, synthetic biologists have adapted living systems into vibrant substrates for engineering. Diverse paradigms have emerged for designing, modeling, constructing, and characterizing such artificial genetic systems. In this paper, we first provide an overview of recent advances in the development of genetic parts and highlight key engineering approaches. We then review the assembly of these parts into synthetic circuits from the perspectives of digital and analog logic, systems biology, and metabolic engineering, three areas of particular theoretical and practical interest. Finally, we discuss notable challenges that the field of synthetic biology still faces in achieving reliable and predictable forward-engineering of artificial biological circuits. Copyright © 2016. Published by Elsevier Ltd.
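To make the digital-logic perspective on genetic parts concrete, here is a minimal Python sketch (parameter values are assumptions chosen for readability, not values from the review): an ODE model of a single transcriptional inverter (NOT gate), the kind of part composed into larger synthetic circuits.

```python
from scipy.integrate import solve_ivp

beta, gamma = 10.0, 1.0   # maximal expression rate, dilution/degradation rate
K, n = 1.0, 2.0           # repression threshold and Hill coefficient

def inverter(t, y, repressor_input):
    protein = y[0]
    production = beta / (1.0 + (repressor_input / K) ** n)  # Hill-type repression
    return [production - gamma * protein]

for level in (0.0, 5.0):  # logical 0 and logical 1 at the input
    sol = solve_ivp(inverter, (0, 10), [0.0], args=(level,))
    print("input %.1f -> steady output %.2f" % (level, sol.y[0, -1]))
```

A high repressor input drives the output low and vice versa, which is the behaviour composed when such inverters are wired into cascades and logic gates.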
ERIC Educational Resources Information Center
Faiola, Anthony; Matei, Sorin Adam
2010-01-01
The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…
ERIC Educational Resources Information Center
Dominguez, Alfredo
2013-01-01
Cloud computing has emerged as a new paradigm for on-demand delivery and consumption of shared IT resources over the Internet. Research has predicted that small and medium organizations (SMEs) would be among the earliest adopters of cloud solutions; however, this projection has not materialized. This study set out to investigate if behavior…
ERIC Educational Resources Information Center
Suo, Shuguang
2013-01-01
Organizations have been forced to rethink business models and restructure facilities through IT innovation as they have faced the challenges arising from globalization, mergers and acquisitions, big data, and the ever-changing demands of customers. Cloud computing has emerged as a new computing paradigm that has fundamentally shaped the business…
Supervised Machine Learning for Population Genetics: A New Paradigm
Schrider, Daniel R.; Kern, Andrew D.
2018-01-01
As population genomic datasets grow in size, researchers are faced with the daunting task of making sense of a flood of information. To keep pace with this explosion of data, computational methodologies for population genetic inference are rapidly being developed to best utilize genomic sequence data. In this review we discuss a new paradigm that has emerged in computational population genomics: that of supervised machine learning (ML). We review the fundamentals of ML, discuss recent applications of supervised ML to population genetics that outperform competing methods, and describe promising future directions in this area. Ultimately, we argue that supervised ML is an important and underutilized tool that has considerable potential for the world of evolutionary genomics. PMID:29331490
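The supervised-ML workflow described above can be summarized in a toy Python sketch: train a classifier on labelled feature vectors (summary statistics computed per genomic window) and predict the class of new windows. Here the "statistics" are random stand-ins; in practice they would be quantities such as nucleotide diversity or Tajima's D computed from coalescent or forward simulations.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_windows, n_stats = 2000, 12
X_neutral = rng.normal(0.0, 1.0, size=(n_windows, n_stats))
X_sweep = rng.normal(0.5, 1.0, size=(n_windows, n_stats))      # shifted statistics
X = np.vstack([X_neutral, X_sweep])
y = np.array([0] * n_windows + [1] * n_windows)                 # 0 = neutral, 1 = sweep

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```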
Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.
Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller
2018-02-01
The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.
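A minimal agent-based sketch in Python can illustrate the complex-systems idea (this is not the authors' model; the network, parameters and dynamics are illustrative assumptions): each student's drinking propensity is nudged toward the average of their social contacts, so population-level patterns emerge from local interactions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_students, n_weeks, social_pull = 500, 52, 0.1
drinks = rng.poisson(3, size=n_students).astype(float)            # weekly drinks per student
contacts = rng.integers(0, n_students, size=(n_students, 5))      # 5 random contacts each

for week in range(n_weeks):
    peer_mean = drinks[contacts].mean(axis=1)                      # local social influence
    drinks += social_pull * (peer_mean - drinks) + rng.normal(0, 0.5, n_students)
    drinks = np.clip(drinks, 0, None)

print("mean weekly drinks after one year: %.2f" % drinks.mean())
```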
Creating a Structured AOP Knowledgebase via Ontology-Based Annotations
The Adverse Outcome Pathway (AOP) framework is increasingly used to integrate data from traditional and emerging toxicity testing paradigms. As the number of AOP descriptions has increased, so has the need to define the AOP in terms that can be interpreted computationally. We wil...
P300 brain computer interface: current challenges and emerging trends
Fazel-Rezai, Reza; Allison, Brendan Z.; Guger, Christoph; Sellers, Eric W.; Kleih, Sonja C.; Kübler, Andrea
2012-01-01
A brain-computer interface (BCI) enables communication without movement based on brain signals measured with electroencephalography (EEG). BCIs usually rely on one of three types of signals: the P300 and other components of the event-related potential (ERP), steady state visual evoked potential (SSVEP), or event related desynchronization (ERD). Although P300 BCIs were introduced over twenty years ago, the past few years have seen a strong increase in P300 BCI research. This closed-loop BCI approach relies on the P300 and other components of the ERP, based on an oddball paradigm presented to the subject. In this paper, we overview the current status of P300 BCI technology, and then discuss new directions: paradigms for eliciting P300s; signal processing methods; applications; and hybrid BCIs. We conclude that P300 BCIs are quite promising, as several emerging directions have not yet been fully explored and could lead to improvements in bit rate, reliability, usability, and flexibility. PMID:22822397
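The oddball analysis chain behind a P300 BCI can be sketched with synthetic data (sampling rate, amplitudes and the shape of the evoked component below are assumptions for illustration): rare "target" epochs carry a small positive deflection near 300 ms, and a linear classifier separates target from non-target epochs.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
fs, n_samples, n_epochs = 128, 128, 400                   # 1 s epochs at 128 Hz
t = np.arange(n_samples) / fs
p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # bump at ~300 ms

y = (rng.random(n_epochs) < 0.25).astype(int)             # ~25% targets (oddball)
X = rng.normal(0, 1, size=(n_epochs, n_samples))          # background EEG noise
X[y == 1] += p300                                         # add the evoked component

lda = LinearDiscriminantAnalysis()
print("5-fold CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```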
The Strategic Nature of Changing Your Mind
ERIC Educational Resources Information Center
Walsh, Matthew M.; Anderson, John R.
2009-01-01
In two experiments, we studied how people's strategy choices emerge through an initial and then a more considered evaluation of available strategies. The experiments employed a computer-based paradigm where participants solved multiplication problems using mental and calculator solutions. In addition to recording responses and solution times, we…
Creating a Structured Adverse Outcome Pathway Knowledgebase via Ontology-Based Annotations
The Adverse Outcome Pathway (AOP) framework is increasingly used to integrate data based on traditional and emerging toxicity testing paradigms. As the number of AOP descriptions has increased, so has the need to define the AOP in computable terms. Herein, we present a comprehens...
An Architecture for Cross-Cloud System Management
NASA Astrophysics Data System (ADS)
Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad
The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
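The core architectural idea, a homogeneous management interface in front of heterogeneous provider APIs, can be sketched in a few lines of Python. The provider classes below are stubs, not real SDK calls; actual adapters would wrap each provider's own interface (for example the EC2 interfaces discussed in the paper).

```python
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Common interface the management layer programs against."""
    @abstractmethod
    def start_instance(self, image_id: str) -> str: ...
    @abstractmethod
    def stop_instance(self, instance_id: str) -> None: ...

class FakeEC2Adapter(ComputeProvider):
    def start_instance(self, image_id):
        return f"ec2-{image_id}-0001"          # a real adapter would call the EC2 API here
    def stop_instance(self, instance_id):
        print(f"stopping {instance_id} via EC2 adapter")

class FakeOpenStackAdapter(ComputeProvider):
    def start_instance(self, image_id):
        return f"os-{image_id}-0001"           # a real adapter would call the OpenStack API here
    def stop_instance(self, instance_id):
        print(f"stopping {instance_id} via OpenStack adapter")

def scale_out(providers, image_id):
    """Cross-cloud operation written once against the common interface."""
    return [p.start_instance(image_id) for p in providers]

print(scale_out([FakeEC2Adapter(), FakeOpenStackAdapter()], "base-image"))
```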
Universal Design: Implications for Computing Education
ERIC Educational Resources Information Center
Burgstahler, Sheryl
2011-01-01
Universal design (UD), a concept that grew from the field of architecture, has recently emerged as a paradigm for designing instructional methods, curriculum, and assessments that are welcoming and accessible to students with a wide range of characteristics, including those related to race, ethnicity, native language, gender, age, and disability.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Ariano, Giacomo Mauro
2010-05-04
I will argue that the proposal of establishing operational foundations of Quantum Theory should have top priority, and that Lucien Hardy's program on Quantum Gravity should be paralleled by an analogous program on Quantum Field Theory (QFT), which needs to be reformulated, notwithstanding its experimental success. In this paper, after reviewing recently suggested operational 'principles of the quantumness', I address the problem of whether Quantum Theory and Special Relativity are unrelated theories, or instead, if the one implies the other. I show how Special Relativity can indeed be derived from causality of Quantum Theory, within the computational paradigm 'the universe is a huge quantum computer', reformulating QFT as a Quantum-Computational Field Theory (QCFT). In QCFT Special Relativity emerges from the fabric of the computational network, which also naturally embeds gauge invariance. In this scheme even the quantization rule and the Planck constant can in principle be derived as emergent from the underlying causal tapestry of space-time. In this way Quantum Theory remains the only theory operating the huge computer of the universe. Is the computational paradigm only a speculative tautology (theory as simulation of reality), or does it have a scientific value? The answer will come from Occam's razor, depending on the mathematical simplicity of QCFT. Here I will just start scratching the surface of QCFT, analyzing simple field theories, including Dirac's. The number of problems and unmotivated recipes that plague QFT strongly motivates us to undertake the QCFT project, since QCFT makes all such problems manifest, and forces a re-foundation of QFT.
Multi-Step Usage of in Vivo Models During Rational Drug Design and Discovery
Williams, Charles H.; Hong, Charles C.
2011-01-01
In this article we propose a systematic development method for rational drug design while reviewing paradigms in industry, emerging techniques and technologies in the field. Although the process of drug development today has been accelerated by emergence of computational methodologies, it is a herculean challenge requiring exorbitant resources; and often fails to yield clinically viable results. The current paradigm of target based drug design is often misguided and tends to yield compounds that have poor absorption, distribution, metabolism, and excretion, toxicology (ADMET) properties. Therefore, an in vivo organism based approach allowing for a multidisciplinary inquiry into potent and selective molecules is an excellent place to begin rational drug design. We will review how organisms like the zebrafish and Caenorhabditis elegans can not only be starting points, but can be used at various steps of the drug development process from target identification to pre-clinical trial models. This systems biology based approach paired with the power of computational biology; genetics and developmental biology provide a methodological framework to avoid the pitfalls of traditional target based drug design. PMID:21731440
Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.
NASA Astrophysics Data System (ADS)
Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca
2015-12-01
The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which proved successful and still ensures its goals. However, Grid technology has not spread much over other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to achieve integration of their current computing model with cloud deployments and take advantage of the so-called opportunistic resources (including HPC facilities) which are usually not Grid compliant. One missing feature in the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility to access the Tier-1 computing resources as an OpenStack based cloud service. The system, exploiting the dynamic partitioning mechanism already being used to enable Multicore computing, allowed us to avoid a static splitting of the computing resources in the Tier-1 farm, while permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition, according to suitable policies for request and release of computing resources. Nodes being requested in the partition switch their role and become available to play a different one. In the cloud use case hosts may switch from acting as Worker Node in the Batch system farm to cloud compute node member, made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation and integration with our current batch system, LSF.
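The dynamic partitioning concept itself can be illustrated with a small Python sketch (this is not the CNAF/LSF implementation; thresholds, node counts and policies are illustrative assumptions): a policy loop drains or reclaims worker nodes so that the split between the batch farm and the cloud partition follows demand rather than being fixed.

```python
import random

nodes = {f"wn{i:03d}": "batch" for i in range(20)}   # all nodes start as batch workers

def rebalance(batch_queue_depth, cloud_requests):
    batch_nodes = [n for n, role in nodes.items() if role == "batch"]
    cloud_nodes = [n for n, role in nodes.items() if role == "cloud"]
    if cloud_requests > len(cloud_nodes) and len(batch_nodes) > 5:
        moved = batch_nodes[0]            # drain one batch node and hand it to the cloud
        nodes[moved] = "cloud"
        return f"{moved}: batch -> cloud"
    if batch_queue_depth > 100 and cloud_nodes:
        moved = cloud_nodes[0]            # reclaim an idle cloud node for batch work
        nodes[moved] = "batch"
        return f"{moved}: cloud -> batch"
    return "no change"

for _ in range(5):
    print(rebalance(random.randint(0, 200), random.randint(0, 10)))
```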
Emerging Themes in Image Informatics and Molecular Analysis for Digital Pathology.
Bhargava, Rohit; Madabhushi, Anant
2016-07-11
Pathology is essential for research in disease and development, as well as for clinical decision making. For more than 100 years, pathology practice has involved analyzing images of stained, thin tissue sections by a trained human using an optical microscope. Technological advances are now driving major changes in this paradigm toward digital pathology (DP). The digital transformation of pathology goes beyond recording, archiving, and retrieving images, providing new computational tools to inform better decision making for precision medicine. First, we discuss some emerging innovations in both computational image analytics and imaging instrumentation in DP. Second, we discuss molecular contrast in pathology. Molecular DP has traditionally been an extension of pathology with molecularly specific dyes. Label-free, spectroscopic images are rapidly emerging as another important information source, and we describe the benefits and potential of this evolution. Third, we describe multimodal DP, which is enabled by computational algorithms and combines the best characteristics of structural and molecular pathology. Finally, we provide examples of application areas in telepathology, education, and precision medicine. We conclude by discussing challenges and emerging opportunities in this area.
Emerging Themes in Image Informatics and Molecular Analysis for Digital Pathology
Bhargava, Rohit; Madabhushi, Anant
2017-01-01
Pathology is essential for research in disease and development, as well as for clinical decision making. For more than 100 years, pathology practice has involved analyzing images of stained, thin tissue sections by a trained human using an optical microscope. Technological advances are now driving major changes in this paradigm toward digital pathology (DP). The digital transformation of pathology goes beyond recording, archiving, and retrieving images, providing new computational tools to inform better decision making for precision medicine. First, we discuss some emerging innovations in both computational image analytics and imaging instrumentation in DP. Second, we discuss molecular contrast in pathology. Molecular DP has traditionally been an extension of pathology with molecularly specific dyes. Label-free, spectroscopic images are rapidly emerging as another important information source, and we describe the benefits and potential of this evolution. Third, we describe multimodal DP, which is enabled by computational algorithms and combines the best characteristics of structural and molecular pathology. Finally, we provide examples of application areas in telepathology, education, and precision medicine. We conclude by discussing challenges and emerging opportunities in this area. PMID:27420575
ERIC Educational Resources Information Center
Stamas, Paul J.
2013-01-01
This case study research followed the two-year transition of a medium-sized manufacturing firm towards a service-oriented enterprise. A service-oriented enterprise is an emerging architecture of the firm that leverages the paradigm of services computing to integrate the capabilities of the firm with the complementary competencies of business…
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1998-01-01
The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1999-01-01
The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
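The "intrinsically parallel" idea cited in the two abstracts above can be made concrete with a short Python sketch: a cellular automaton (Conway's Game of Life, used here only as a stand-in example) updates every cell from its local neighbourhood alone, so the grid can in principle be split across processors with near-linear scaling. The vectorised NumPy update below stands in for a distributed implementation.

```python
import numpy as np

def life_step(grid):
    # Sum the eight neighbours of every cell, with periodic boundaries.
    neighbours = sum(np.roll(np.roll(grid, dx, 0), dy, 1)
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives if it has 3 neighbours, or is alive with 2 neighbours.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

rng = np.random.default_rng(3)
grid = rng.integers(0, 2, size=(64, 64))
for _ in range(10):
    grid = life_step(grid)
print("live cells after 10 steps:", int(grid.sum()))
```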
Bång, Magnus; Larsson, Anders; Eriksson, Henrik
2003-01-01
In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment.
Kassab, Ghassan S.; An, Gary; Sander, Edward A.; Miga, Michael; Guccione, Julius M.; Ji, Songbai; Vodovotz, Yoram
2016-01-01
In this era of tremendous technological capabilities and increased focus on improving clinical outcomes, decreasing costs, and increasing precision, there is a need for a more quantitative approach to the field of surgery. Multiscale computational modeling has the potential to bridge the gap to the emerging paradigms of Precision Medicine and Translational Systems Biology, in which quantitative metrics and data guide patient care through improved stratification, diagnosis, and therapy. Achievements by multiple groups have demonstrated the potential for 1) multiscale computational modeling, at a biological level, of diseases treated with surgery and the surgical procedure process at the level of the individual and the population; along with 2) patient-specific, computationally-enabled surgical planning, delivery, and guidance and robotically-augmented manipulation. In this perspective article, we discuss these concepts, and cite emerging examples from the fields of trauma, wound healing, and cardiac surgery. PMID:27015816
Graphics Processing Units for HEP trigger systems
NASA Astrophysics Data System (ADS)
Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.
2016-07-01
General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPUs for synchronous low-level triggers, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.
NASA Technical Reports Server (NTRS)
Banks, Daniel W.; Laflin, Brenda E. Gile; Kemmerly, Guy T.; Campbell, Bryan A.
1999-01-01
The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
Computational ecology as an emerging science
Petrovskii, Sergei; Petrovskaya, Natalia
2012-01-01
It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336
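The point about choosing adequate numerical methods can be illustrated with a classic ecological model (textbook parameters, not values from the paper): the Lotka-Volterra predator-prey system has closed orbits, and a naive explicit Euler scheme visibly drifts where an adaptive Runge-Kutta integrator does not.

```python
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, d = 1.0, 0.1, 1.5, 0.075

def lv(t, y):
    prey, pred = y
    return [a * prey - b * prey * pred, -c * pred + d * prey * pred]

# Explicit Euler with a coarse step.
y, dt, steps = np.array([10.0, 5.0]), 0.05, 2000
for _ in range(steps):
    y = y + dt * np.array(lv(0, y))

# Adaptive Runge-Kutta over the same time span.
sol = solve_ivp(lv, (0, steps * dt), [10.0, 5.0], rtol=1e-8)
print("Euler end state:", np.round(y, 2), " RK45 end state:", np.round(sol.y[:, -1], 2))
```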
The emerging role of cloud computing in molecular modelling.
Ebejer, Jean-Paul; Fulle, Simone; Morris, Garrett M; Finn, Paul W
2013-07-01
There is a growing recognition of the importance of cloud computing for large-scale and data-intensive applications. The distinguishing features of cloud computing and their relationship to other distributed computing paradigms are described, as are the strengths and weaknesses of the approach. We review the use made to date of cloud computing for molecular modelling projects and the availability of front ends for molecular modelling applications. Although the use of cloud computing technologies for molecular modelling is still in its infancy, we demonstrate its potential by presenting several case studies. Rapid growth can be expected as more applications become available and costs continue to fall; cloud computing can make a major contribution not just in terms of the availability of on-demand computing power, but could also spur innovation in the development of novel approaches that utilize that capacity in more effective ways. Copyright © 2013 Elsevier Inc. All rights reserved.
Bång, Magnus; Larsson, Anders; Eriksson, Henrik
2003-01-01
In this paper, we present a new approach to clinical workplace computerization that departs from the window-based user interface paradigm. NOSTOS is an experimental computer-augmented work environment designed to support data capture and teamwork in an emergency room. NOSTOS combines multiple technologies, such as digital pens, walk-up displays, headsets, a smart desk, and sensors to enhance an existing paper-based practice with computer power. The physical interfaces allow clinicians to retain mobile paper-based collaborative routines and still benefit from computer technology. The requirements for the system were elicited from situated workplace studies. We discuss the advantages and disadvantages of augmenting a paper-based clinical work environment. PMID:14728131
NASA Astrophysics Data System (ADS)
Chen, Su Shing; Caulfield, H. John
1994-03-01
Adaptive Computing, as distinct from Classical Computing, is emerging as a field that represents the culmination of more than 40 years of work in various scientific and technological areas, including cybernetics, neural networks, pattern recognition networks, learning machines, self-reproducing automata, genetic algorithms, fuzzy logics, probabilistic logics, chaos, electronics, optics, and quantum devices. This volume of "Critical Reviews on Adaptive Computing: Mathematics, Electronics, and Optics" is intended as a synergistic approach to this emerging field. There are many researchers in these areas working on important results. However, we have not seen a general effort to summarize and synthesize these results in theory as well as implementation. In order to reach a higher level of synergism, we propose Adaptive Computing as the field comprising the above-mentioned computational paradigms and their various realizations. The field should include both the Theory (or Mathematics) and the Implementation. Our emphasis is on the interplay of Theory and Implementation. The interplay, an adaptive process itself, of Theory and Implementation is the only "holistic" way to advance our understanding and realization of brain-like computation. We feel that a theory without implementation has the tendency to become unrealistic and "out-of-touch" with reality, while an implementation without theory runs the risk of being superficial and obsolete.
Sustainable and Autonomic Space Exploration Missions
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Sterritt, Roy; Rouff, Christopher; Rash, James L.; Truszkowski, Walter
2006-01-01
Visions for future space exploration have long term science missions in sight, resulting in the need for sustainable missions. Survivability is a critical property of sustainable systems and may be addressed through autonomicity, an emerging paradigm for self-management of future computer-based systems based on inspiration from the human autonomic nervous system. This paper examines some of the ongoing research efforts to realize these survivable systems visions, with specific emphasis on developments in Autonomic Policies.
Liu, Charles Y; Spicer, Mark; Apuzzo, Michael L J
2003-01-01
The future development of the neurosurgical operative environment is driven principally by concurrent development in science and technology. In the new millennium, these developments are taking on a Jules Verne quality, with the ability to construct and manipulate the human organism and its surroundings at the level of atoms and molecules seemingly at hand. Thus, an examination of currents in technology advancement from the neurosurgical perspective can provide insight into the evolution of the neurosurgical operative environment. In the future, the optimal design solution for the operative environment requirements of specialized neurosurgery may take the form of composites of venues that are currently mutually distinct. Advances in microfabrication technology and laser optical manipulators are expanding the scope and role of robotics, with novel opportunities for bionic integration. Assimilation of biosensor technology into the operative environment promises to provide neurosurgeons of the future with a vastly expanded set of physiological data, which will require concurrent simplification and optimization of analysis and presentation schemes to facilitate practical usefulness. Nanotechnology derivatives are shattering the maximum limits of resolution and magnification allowed by conventional microscopes. Furthermore, quantum computing and molecular electronics promise to greatly enhance computational power, allowing the emerging reality of simulation and virtual neurosurgery for rehearsal and training purposes. Progressive minimalism is evident throughout, leading ultimately to a paradigm shift as the nanoscale is approached. At the interface between the old and new technological paradigms, issues related to integration may dictate the ultimate emergence of the products of the new paradigm. Once initiated, however, history suggests that the process of change will proceed rapidly and dramatically, with the ultimate neurosurgical operative environment of the future being far more complex in functional capacity but strikingly simple in apparent form.
Apuzzo, M L; Liu, C Y
2001-10-01
THIS ARTICLE DISCUSSES elements in the definition of modernity and emerging futurism in neurological surgery. In particular, it describes evolution, discovery, and paradigm shifts in the field and forces responsible for their realization. It analyzes the cyclical reinvention of the discipline experienced during the past generation and attempts to identify apertures to the near and more remote future. Subsequently, it focuses on forces and discovery in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism that is evident in the field. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.
Dreizin, David; Nam, Arthur J; Hirsch, Jeffrey; Bernstein, Mark P
2018-06-20
This article reviews the conceptual framework, available evidence, and practical considerations pertaining to nascent and emerging advances in patient-centered CT-imaging and CT-guided surgery for maxillofacial trauma. These include cinematic rendering-a novel method for advanced 3D visualization, incorporation of quantitative CT imaging into the assessment of orbital fractures, low-dose CT imaging protocols made possible with contemporary scanners and reconstruction techniques, the rapidly growing use of cone-beam CT, virtual fracture reduction with design software for surgical pre-planning, the use of 3D printing for fabricating models and implants, and new avenues in CT-guided computer-aided surgery.
The Soil Carbon Paradigm Shift: Triangulating Theories, Measurements, and Models
NASA Astrophysics Data System (ADS)
Blankinship, J. C.; Crow, S. E.; Schimel, J.; Sierra, C. A.; Schaedel, C.; Plante, A. F.; Thompson, A.; Berhe, A. A.; Druhan, J. L.; Heckman, K. A.; Keiluweit, M.; Lawrence, C. R.; Marin-Spiotta, E.; Rasmussen, C.; Wagai, R.; Wieder, W. R.
2016-12-01
Predicting global responses of soil carbon (C) to environmental change remains confounded by a number of paradigms that have emerged from separate approaches. A prevailing paradigm in biogeochemistry interprets soil C as discrete pools based on estimated or measured turnover times (e.g., CENTURY model). An alternative is emerging that envisions the stabilization of soil C in tension between decomposition by microbial agents and protection by physical and chemical mechanisms. We propose an approach to bridge the gap between different paradigms, and to improve soil C forecasting by conceptualizing each paradigm as a triangle composed of three nodes: theory, analytical measurement, and numerical model. Paradigms tend to emerge from what can either be represented in models or measured using analytical instruments. But they gain power when all three elements are integrated in a balanced trinity. Our goal was to compare how theory, measurement, and model fit together in our understanding of soil C to learn from past successes, evaluate the strengths and weaknesses of current paradigms, and guide development of new understanding. We used a case-study approach to analyze each corner of the paradigm-triangle: i) paradigms that have strong theory but are constrained by weak linkages with measurements or models, ii) paradigms with robust models that have weak linkages with theory or measurements, and iii) paradigms with many measurements but little theoretical support or ability to be parameterized in numerical models. We conclude that established models like CENTURY dominate because theory and measurements that underlie the model form strong linkages that previously created a balanced triangle. Evolving paradigms based on physical protection and microbial agency are still struggling to gain traction because the theory is challenging to represent in models. The explicit examination of the strengths of emerging paradigms can, therefore, help refine and accelerate our ability to constrain projections of soil C dynamics.
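The "discrete pools with turnover times" paradigm discussed above can be sketched in a few lines of Python (a heavily simplified, CENTURY-like illustration; pool names, turnover times, transfer fractions and inputs are assumptions chosen for readability): each pool decays at a first-order rate and passes a fraction of the decomposed carbon to slower pools.

```python
import numpy as np

pools = np.array([1.0, 3.0, 10.0])            # kg C m^-2 in fast, slow, passive pools
k = np.array([1 / 2.0, 1 / 25.0, 1 / 400.0])  # 1 / turnover time (per year)
transfer = np.array([[0.0, 0.3, 0.01],        # fraction of fast-pool loss routed onward
                     [0.0, 0.0, 0.05],
                     [0.0, 0.0, 0.0]])
inputs = np.array([0.3, 0.0, 0.0])            # litter input to the fast pool (kg C m^-2 yr^-1)

for year in range(100):
    loss = k * pools                           # first-order decomposition
    pools = pools - loss + loss @ transfer + inputs

print("pools after 100 years:", np.round(pools, 2))
```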
Metal oxide resistive random access memory based synaptic devices for brain-inspired computing
NASA Astrophysics Data System (ADS)
Gao, Bin; Kang, Jinfeng; Zhou, Zheng; Chen, Zhe; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan
2016-04-01
The traditional Boolean computing paradigm based on the von Neumann architecture is facing great challenges for future information technology applications such as big data, the Internet of Things (IoT), and wearable devices, due to the limited processing capability issues such as binary data storage and computing, non-parallel data processing, and the buses requirement between memory units and logic units. The brain-inspired neuromorphic computing paradigm is believed to be one of the promising solutions for realizing more complex functions with a lower cost. To perform such brain-inspired computing with a low cost and low power consumption, novel devices for use as electronic synapses are needed. Metal oxide resistive random access memory (ReRAM) devices have emerged as the leading candidate for electronic synapses. This paper comprehensively addresses the recent work on the design and optimization of metal oxide ReRAM-based synaptic devices. A performance enhancement methodology and optimized operation scheme to achieve analog resistive switching and low-energy training behavior are provided. A three-dimensional vertical synapse network architecture is proposed for high-density integration and low-cost fabrication. The impacts of the ReRAM synaptic device features on the performances of neuromorphic systems are also discussed on the basis of a constructed neuromorphic visual system with a pattern recognition function. Possible solutions to achieve the high recognition accuracy and efficiency of neuromorphic systems are presented.
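The synaptic-crossbar idea underlying such devices can be illustrated with a short Python sketch (device counts, conductance ranges and the weight mapping are assumptions, not values from the paper): weights are stored as analog conductances, and a vector-matrix multiply is performed "in memory" by applying input voltages to the rows and summing currents on the columns, per Ohm's and Kirchhoff's laws.

```python
import numpy as np

rng = np.random.default_rng(4)
n_inputs, n_neurons = 64, 10
g_min, g_max = 1e-6, 1e-4                        # assumed device conductance range (S)

weights = rng.normal(0, 0.3, size=(n_inputs, n_neurons))
# Map signed weights onto a pair of positive conductances per synapse (G+ and G-).
scale = np.abs(weights).max()
g_pos = g_min + (g_max - g_min) * np.clip(weights, 0, None) / scale
g_neg = g_min + (g_max - g_min) * np.clip(-weights, 0, None) / scale

v_in = rng.random(n_inputs) * 0.2                # input voltages (V)
i_out = v_in @ g_pos - v_in @ g_neg              # column currents = analog dot products
print("winning neuron:", int(np.argmax(i_out)))
```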
A Perspective on the Role of Computational Models in Immunology.
Chakraborty, Arup K
2017-04-26
This is an exciting time for immunology because the future promises to be replete with exciting new discoveries that can be translated to improve health and treat disease in novel ways. Immunologists are attempting to answer increasingly complex questions concerning phenomena that range from the genetic, molecular, and cellular scales to that of organs, whole animals or humans, and populations of humans and pathogens. An important goal is to understand how the many different components involved interact with each other within and across these scales for immune responses to emerge, and how aberrant regulation of these processes causes disease. To aid this quest, large amounts of data can be collected using high-throughput instrumentation. The nonlinear, cooperative, and stochastic character of the interactions between components of the immune system as well as the overwhelming amounts of data can make it difficult to intuit patterns in the data or a mechanistic understanding of the phenomena being studied. Computational models are increasingly important in confronting and overcoming these challenges. I first describe an iterative paradigm of research that integrates laboratory experiments, clinical data, computational inference, and mechanistic computational models. I then illustrate this paradigm with a few examples from the recent literature that make vivid the power of bringing together diverse types of computational models with experimental and clinical studies to fruitfully interrogate the immune system.
Simulating the formation of cosmic structure.
Frenk, C S
2002-06-15
A timely combination of new theoretical ideas and observational discoveries has brought about significant advances in our understanding of cosmic evolution. Computer simulations have played a key role in these developments by providing the means to interpret astronomical data in the context of physical and cosmological theory. In the current paradigm, our Universe has a flat geometry, is undergoing accelerated expansion and is gravitationally dominated by elementary particles that make up cold dark matter. Within this framework, it is possible to simulate in a computer the emergence of galaxies and other structures from small quantum fluctuations imprinted during an epoch of inflationary expansion shortly after the Big Bang. The simulations must take into account the evolution of the dark matter as well as the gaseous processes involved in the formation of stars and other visible components. Although many unresolved questions remain, a coherent picture for the formation of cosmic structure is now beginning to emerge.
Emergent Adaptive Noise Reduction from Communal Cooperation of Sensor Grid
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Jones, Michael G.; Nark, Douglas M.; Lodding, Kenneth N.
2010-01-01
In the last decade, the realization of small, inexpensive, and powerful devices with sensors, computers, and wireless communication has promised the development of massive-sized sensor networks with dense deployments over large areas capable of high-fidelity situational assessments. However, most management models have been based on centralized control, and research has concentrated on methods for passing data from sensor devices to the central controller. Most implementations have been small; because centralized control does not scale, this methodology is insufficient for massive deployments. Here, a specific application of a large sensor network for adaptive noise reduction demonstrates a new paradigm in which communities of sensor/computer devices assess local conditions and make local decisions from which a global behaviour emerges. This approach obviates many of the problems of centralized control, as it is not prone to a single point of failure and is more scalable, efficient, robust, and fault tolerant.
Toward superconducting critical current by design
Sadovskyy, Ivan A.; Jia, Ying; Leroux, Maxime; ...
2016-03-31
The interaction of vortex matter with defects in applied superconductors directly determines their current carrying capacity. Defects range from chemically grown nanostructures and crystalline imperfections to the layered structure of the material itself. The vortex-defect interactions are non-additive in general, leading to complex dynamic behavior that has proven difficult to capture in analytical models. With recent rapid progress in computational powers, a new paradigm has emerged that aims at simulation assisted design of defect structures with predictable ‘critical-current-by-design’: analogous to the materials genome concept of predicting stable materials structures of interest. We demonstrate the feasibility of this paradigm by combining large-scale time-dependent Ginzburg-Landau numerical simulations with experiments on commercial high temperature superconductor (HTS) containing well-controlled correlated defects.
Modeling the peak of emergence in systems: Design and katachi.
Cardier, Beth; Goranson, H T; Casas, Niccolo; Lundberg, Patric; Erioli, Alessio; Takaki, Ryuji; Nagy, Dénes; Ciavarra, Richard; Sanford, Larry D
2017-12-01
It is difficult to model emergence in biological systems using reductionist paradigms. A requirement for computational modeling is that individual entities can be recorded parametrically and related logically, but their transformation into whole systems cannot be captured this way. The problem stems from an inability to formally represent the implicit influences that inform emergent organization, such as context, shifts in causal agency or scale, and self-reference. This lack hampers biological systems modeling and its computational counterpart, indicating a need for new fundamental abstraction frameworks that support system-level characteristics. We develop an approach that formally captures these characteristics, focusing on the way they come together to enable transformation at the 'peak' of the emergent process. An example from virology is presented, in which two seemingly antagonistic systems - the herpes cold sore virus and its host - are capable of altering their basic biological objectives to achieve a new equilibrium. The usual barriers to modeling this process are overcome by incorporating mechanisms from practices centered on its emergent peak: design and katachi. In the Japanese science of form, katachi refers to the emergence of intrinsic structure from real situations, where an optimal balance between implicit influences is achieved. Design indicates how such optimization is guided by principles of flow. These practices leverage qualities of situated abstraction, which we understand through the intuitive method of physicist Kôdi Husimi. Early results indicate that this approach can capture the functional transformations of biological emergence, whilst being reasonably computable. Due to its geometric foundations and narrative-based extension to logic, the method will also generate speculative predictions. This research forms the foundations of a new biomedical modeling platform, which is discussed. Copyright © 2017. Published by Elsevier Ltd.
Parallel, distributed and GPU computing technologies in single-particle electron microscopy
Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger
2009-01-01
Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. PMID:19564686
Parallel, distributed and GPU computing technologies in single-particle electron microscopy.
Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger
2009-07-01
Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.
Co-"Lab"oration: A New Paradigm for Building a Management Information Systems Course
ERIC Educational Resources Information Center
Breimer, Eric; Cotler, Jami; Yoder, Robert
2010-01-01
We propose a new paradigm for building a Management Information Systems course that focuses on laboratory activities developed collaboratively using Computer-Mediated Communication and Collaboration tools. A highlight of our paradigm is the "practice what you preach" concept where the computer communication tools and collaboration…
NASA Astrophysics Data System (ADS)
Ercan, İlke; Suyabatmaz, Enes
2018-06-01
The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations in Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes a solid ground that enables studying computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal oxide semiconductor technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
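The fundamental lower bounds referred to here are of the Landauer type. As a generic reminder (this is textbook background, not the paper's specific bound for the SET or CMOS realizations), any physical process that irreversibly erases information must dissipate at least

    E_{\mathrm{diss}} \;\ge\; N_{\mathrm{erased}} \, k_{B} T \ln 2 ,

where N_erased is the number of bits of information irreversibly lost and T is the operating temperature. Physical-information-theoretic analyses of concrete circuits refine bounds of this form by accounting for the device physics and the logical reversibility of each operation.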
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but is one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.
Policy Dispute and Paradigm Evaluation: A Response to Rowland.
ERIC Educational Resources Information Center
Lichtman, Allan J.; Rohrer, Daniel M.
1982-01-01
Responds to Rowland's article (CS 705 841). Contends that policy systems analysis emerges as the only acceptable paradigm for competitive debate and that it satisfies the criteria for paradigm evaluation. (PD)
The 'Biologically-Inspired Computing' Column
NASA Technical Reports Server (NTRS)
Hinchey, Mike
2006-01-01
The field of Biology changed dramatically in 1953, with the determination by Francis Crick and James Dewey Watson of the double helix structure of DNA. This discovery changed Biology for ever, allowing the sequencing of the human genome, and the emergence of a "new Biology" focused on DNA, genes, proteins, data, and search. Computational Biology and Bioinformatics heavily rely on computing to facilitate research into life and development. Simultaneously, an understanding of the biology of living organisms indicates a parallel with computing systems: molecules in living cells interact, grow, and transform according to the "program" dictated by DNA. Moreover, paradigms of Computing are emerging based on modelling and developing computer-based systems exploiting ideas that are observed in nature. This includes building into computer systems self-management and self-governance mechanisms that are inspired by the human body's autonomic nervous system, modelling evolutionary systems analogous to colonies of ants or other insects, and developing highly-efficient and highly-complex distributed systems from large numbers of (often quite simple) largely homogeneous components to reflect the behaviour of flocks of birds, swarms of bees, herds of animals, or schools of fish. This new field of "Biologically-Inspired Computing", often known in other incarnations by other names, such as: Autonomic Computing, Pervasive Computing, Organic Computing, Biomimetics, and Artificial Life, amongst others, is poised at the intersection of Computer Science, Engineering, Mathematics, and the Life Sciences. Successes have been reported in the fields of drug discovery, data communications, computer animation, control and command, exploration systems for space, undersea, and harsh environments, to name but a few, and augur much promise for future progress.
Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing
ERIC Educational Resources Information Center
Ullman, David F.; Haggerty, Blake
2010-01-01
Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…
The Newtonian Mechanistic Paradigm, Special Education, and Contours of Alternatives: An Overview.
ERIC Educational Resources Information Center
Heshusius, Lous
1989-01-01
The article examines theoretical reorientations in special education away from the Newtonian mechanistic paradigm toward an emerging holistic paradigm. Recent literature is critiqued for renaming theories as paradigms, thereby providing an illusion of change while leaving fundamental mechanistic assumptions in place. (Author/DB)
Neuromorphic Kalman filter implementation in IBM’s TrueNorth
NASA Astrophysics Data System (ADS)
Carney, R.; Bouchard, K.; Calafiura, P.; Clark, D.; Donofrio, D.; Garcia-Sciveres, M.; Livezey, J.
2017-10-01
Following the advent of a post-Moore’s law field of computation, novel architectures continue to emerge. With composite, multi-million connection neuromorphic chips like IBM’s TrueNorth, neural engineering has now become a feasible technology in this novel computing paradigm. High Energy Physics experiments are continuously exploring new methods of computation and data handling, including neuromorphic, to support the growing challenges of the field and be prepared for future commodity computing trends. This work details the first instance of a Kalman filter implementation in IBM’s neuromorphic architecture, TrueNorth, for both parallel and serial spike trains. The implementation is tested on multiple simulated systems and its performance is evaluated with respect to an equivalent non-spiking Kalman filter. The limits of the implementation are explored whilst varying the size of weight and threshold registers, the number of spikes used to encode a state, size of neuron block for spatial encoding, and neuron potential reset schemes.
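For reference, the non-spiking baseline against which such a neuromorphic implementation is evaluated is the standard discrete-time Kalman filter. A minimal, generic sketch of one predict/update cycle (not tied to the TrueNorth spike encoding; the toy constant-velocity model below is an assumption for illustration) is:

    import numpy as np

    def kalman_step(x, P, z, F, H, Q, R):
        """One predict/update cycle of a linear Kalman filter."""
        x_pred = F @ x                            # predict state
        P_pred = F @ P @ F.T + Q                  # predict covariance
        S = H @ P_pred @ H.T + R                  # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)     # update with measurement z
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Toy 1-D constant-velocity track with position-only measurements
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = 1e-3 * np.eye(2)
    R = np.array([[0.5]])
    x, P = np.zeros(2), np.eye(2)
    for z in [0.9, 2.1, 2.9, 4.2]:
        x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)

A spiking implementation must approximate the same predict/update arithmetic with limited-precision weights and thresholds, which is why register sizes and spike-encoding choices dominate its accuracy.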
Security Risks of Cloud Computing and Its Emergence as 5th Utility Service
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
Cloud Computing is being projected by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as a fifth utility where clients have access to processing for applications and software projects that need very high processing speed and huge data capacity, including compute-intensive scientific and engineering research problems as well as e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in Cloud Computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights the cloud providers' response to cloud security risks.
On the Modeling and Management of Cloud Data Analytics
NASA Astrophysics Data System (ADS)
Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni
A new era is dawning where vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems, one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
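Since the paper uses Map-Reduce as its running illustration of a data-analytics workload, the following minimal in-memory sketch of the paradigm (a toy word count, purely illustrative and unrelated to the authors' system) shows the map, shuffle, and reduce phases:

    from collections import defaultdict
    from itertools import chain

    def map_phase(document):
        # Emit (key, value) pairs for every record in the input split
        return [(word.lower(), 1) for word in document.split()]

    def shuffle(pairs):
        # Group intermediate values by key
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        # Aggregate each key's values
        return {key: sum(values) for key, values in groups.items()}

    documents = ["data analytics in the cloud", "cloud resources host data"]
    pairs = chain.from_iterable(map_phase(d) for d in documents)
    print(reduce_phase(shuffle(pairs)))   # e.g. {'data': 2, 'cloud': 2, ...}

In a real deployment the map and reduce phases run on many virtual machines, and the placement and migration questions discussed above determine how much data the shuffle phase must move across the cloud.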
Perceptual crossing: the simplest online paradigm
Auvray, Malika; Rohde, Marieke
2012-01-01
Researchers in social cognition increasingly realize that many phenomena cannot be understood by investigating offline situations only, focusing on individual mechanisms and an observer perspective. There are processes of dynamic emergence specific to online situations, when two or more persons are engaged in a real-time interaction that are more than just the sum of the individual capacities or behaviors, and these require the study of online social interaction. Auvray et al.'s (2009) perceptual crossing paradigm offers possibly the simplest paradigm for studying such online interactions: two persons, a one-dimensional space, one bit of information, and a yes/no answer. This study has provoked a lot of resonance in different areas of research, including experimental psychology, computer/robot modeling, philosophy, psychopathology, and even in the field of design. In this article, we review and critically assess this body of literature. We give an overview of both behavioral experimental research and simulated agent modeling done using the perceptual crossing paradigm. We discuss different contexts in which work on perceptual crossing has been cited. This includes the controversy about the possible constitutive role of perceptual crossing for social cognition. We conclude with an outlook on future research possibilities, in particular those that could elucidate the link between online interaction dynamics and individual social cognition. PMID:22723776
Kiper, Pawel; Szczudlik, Andrzej; Venneri, Annalena; Stozek, Joanna; Luque-Moreno, Carlos; Opara, Jozef; Baba, Alfonc; Agostini, Michela; Turolla, Andrea
2016-10-15
Computational approaches for modelling the central nervous system (CNS) aim to develop theories on processes occurring in the brain that allow the transformation of all information needed for the execution of motor acts. Computational models have been proposed in several fields, to interpret not only the CNS functioning, but also its efferent behaviour. Computational model theories can provide insights into neuromuscular and brain function allowing us to reach a deeper understanding of neuroplasticity. Neuroplasticity is the process occurring in the CNS that is able to permanently change both structure and function due to interaction with the external environment. To understand such a complex process several paradigms related to motor learning and computational modeling have been put forward. These paradigms have been explained through several internal model concepts, and supported by neurophysiological and neuroimaging studies. Therefore, it has been possible to make theories about the basis of different learning paradigms according to known computational models. Here we review the computational models and motor learning paradigms used to describe the CNS and neuromuscular functions, as well as their role in the recovery process. These theories have the potential to provide a way to rigorously explain all the potential of CNS learning, providing a basis for future clinical studies. Copyright © 2016 Elsevier B.V. All rights reserved.
Slater, James; Rill, Velisar
2004-04-01
Coronary artery disease (CAD) is the leading cause of morbidity and mortality in the United States and other industrialized countries. In the undeveloped world a similar epidemic is brewing. A new pathophysiologic paradigm has emerged, which assigns the mediators of inflammation a much larger role in the disease process. This paradigm has helped explain the unpredictable nature of many adverse consequences of CAD. The long latent phase of the disease, and often sudden initial presentation, make efforts at early detection extremely important. Considerable work has been devoted to identify, as well as influence, predisposing risk factors for developing arteriosclerosis. Novel markers of inflammation, like C-reactive protein, have been identified and compared to traditional risk factors. In addition, new imaging modalities introduce the possibility of screening for subclinical disease. Electron beam and multidetector computed tomography (CT) scanners, as well as other techniques, are emerging as powerful tools to detect early disease presence and allow intervention to take place before major clinical events occur. Advances in our understanding of the pathophysiology of CAD, and our ability to image the stages of silent disease will go hand in hand to revolutionize our approach to prevention and treatment of this deadly malady.
Toward a patient-based paradigm for blood transfusion.
Farrugia, Albert; Vamvakas, Eleftherios
2014-01-01
The current "manufacturing paradigm" of transfusion practice has detached transfusion from the clinical environment. As an example, fresh whole blood in large-volume hemorrhage may be superior to whole blood reconstituted from multiple components. Multicomponent apheresis can overcome logistical difficulties in matching patient needs with fresh component availability and can deliver the benefits of fresh whole blood. Because of the different transfusion needs of patients in emerging economies and the vulnerability of these blood systems to emerging infections, fresh whole blood and multicomponent apheresis can better meet patient needs when compared with transplants of the "manufacturing paradigm". We propose that patient blood management, along with panels of repeat, paid, accredited apheresis and fresh whole-blood donors can be used in emerging economies to support decentralized blood services. This alternative transfusion-medicine paradigm could eventually also be adopted by established economies to focus transfusion medicine on local patient needs and to alleviate the problem of the aging volunteer donor base.
Passive motion paradigm: an alternative to optimal control.
Mohan, Vishwanathan; Morasso, Pietro
2011-01-01
In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the "degrees of freedom (DoFs) problem," the common core of production, observation, reasoning, and learning of "actions." OCT, directly derived from engineering design techniques of control systems, quantifies task goals as "cost functions" and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative "softer" approach, the passive motion paradigm (PMP), that we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that "animates" the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints "at runtime," hence solving the "DoFs problem" without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not restricted to shaping motor output during action execution but also provides the self with information on the feasibility, consequence, understanding and meaning of "potential actions." In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures.
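A minimal numerical sketch of the PMP idea (illustrative only; the two-link arm, gains and admittance below are hypothetical choices, not the authors' model) animates the body schema with an attractor toward the goal: the goal induces a virtual force field in task space, which is mapped to joint space through the Jacobian transpose, so redundancy is resolved at runtime without explicit kinematic inversion or cost-function optimization:

    import numpy as np

    L1, L2 = 1.0, 1.0             # link lengths (assumed)
    K, A, dt = 2.0, 1.0, 0.01     # task-space stiffness, joint admittance, time step

    def forward(q):
        # End-effector position of a planar 2-link arm
        return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                         L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

    def jacobian(q):
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    q = np.array([0.3, 0.5])          # initial joint configuration
    goal = np.array([1.2, 0.8])       # task-space goal
    for _ in range(2000):
        F = K * (goal - forward(q))           # force field induced by the goal
        q = q + dt * A * (jacobian(q).T @ F)  # passive "animation" of the body schema

    print(forward(q))   # the hand relaxes toward the goal without inverting the kinematics

The same relaxation can be run without sending motor commands, which is in the spirit of the covert-action simulation discussed above.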
Additive manufacturing: Toward holistic design
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...
2017-03-18
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.
Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.
An Object-Oriented Approach to Writing Computational Electromagnetics Codes
NASA Technical Reports Server (NTRS)
Zimmerman, Martin; Mallasch, Paul G.
1996-01-01
Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.
Granular computing with multiple granular layers for brain big data processing.
Wang, Guoyin; Xu, Ji
2014-12-01
Big data is the term for a collection of datasets so huge and complex that it becomes difficult to process using on-hand theoretical models and technical tools. Brain big data is one of the most typical and important kinds of big data, collected using powerful equipment such as functional magnetic resonance imaging, multichannel electroencephalography, magnetoencephalography, positron emission tomography, near-infrared spectroscopic imaging, and various other devices. Granular computing with multiple granular layers, referred to as multi-granular computing (MGrC) for short hereafter, is an emerging computing paradigm of information processing, which simulates the multi-granular intelligent thinking model of the human brain. It concerns the processing of complex information entities called information granules, which arise in the process of data abstraction and derivation of information and even knowledge from data. This paper analyzes three basic mechanisms of MGrC, namely granularity optimization, granularity conversion, and multi-granularity joint computation, and discusses the potential of introducing MGrC into intelligent processing of brain big data.
Education Student Research Paradigms and Emerging Scholar Identities: A Mixed-Methods Study
ERIC Educational Resources Information Center
Hales, Patrick D.; Croxton, Rebecca A.; Kirkman, Christopher J.
2016-01-01
Using a mixed-methods approach, this study sought to understand a general sense of paradigm confidence and to see how this confidence relates to doctoral student identities as emerging scholars. Identity development was explored among 46 education doctoral students at a midsized public university in the Southeast. Researchers examined students'…
Mobile economics and pricing of health care services.
Huttin, Christine C
2012-01-01
This paper presents tools and concepts to analyze the business environment of the biopharmaceutical industry. It was presented at MEDETEL 2010. Emerging paradigms appear in that industry, and new ways to value life science technologies are being developed, especially using mobile economics analysis. At a time when mobile computing technologies are revolutionizing the field of health care, this paper shows how the value chain concept can be useful for analyzing the value system in a mobile computing environment. It is also a milestone for the design of future technology platforms and health care infrastructure, in order to retain enough value between innovators, new and traditional players from life science, IT, and other newcomers, in a fragmented global competitive environment.
Making Spatial Statistics Service Accessible On Cloud Platform
NASA Astrophysics Data System (ADS)
Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.
2014-04-01
Web services can bring together applications running on diverse platforms, allowing users to access and share data, information, and models more effectively and conveniently from a single web service platform. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rapid growth of massive data and the limitations of network bandwidth, traditional web service platforms face prominent problems such as computational efficiency, maintenance cost, and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that the spatial statistics service is conveniently accessible to the public with high processing efficiency.
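As a concrete example of the kind of statistic such a service might expose, the sketch below computes global Moran's I, a standard measure of spatial autocorrelation (the function, the toy transect data and the binary contiguity weights are illustrative assumptions, not the service's actual interface):

    import numpy as np

    def morans_i(values, weights):
        """Global Moran's I for observations y with spatial weight matrix W."""
        y = np.asarray(values, dtype=float)
        w = np.asarray(weights, dtype=float)
        z = y - y.mean()
        num = (w * np.outer(z, z)).sum()
        den = (z ** 2).sum()
        return (len(y) / w.sum()) * (num / den)

    # Four regions along a transect; adjacent regions are neighbours
    y = [3.0, 2.8, 1.1, 0.9]
    W = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]])
    print(morans_i(y, W))   # about 0.39: similar values cluster along the transect

Hosting such computations behind a cloud service lets the heavy lifting scale elastically while clients exchange only inputs and results.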
RNA nanotechnology for computer design and in vivo computation
Qiu, Meikang; Khisamutdinov, Emil; Zhao, Zhengyi; Pan, Cheryl; Choi, Jeong-Woo; Leontis, Neocles B.; Guo, Peixuan
2013-01-01
Molecular-scale computing has been explored since 1989 owing to the foreseeable limitation of Moore's law for silicon-based computation devices. With the potential of massive parallelism, low energy consumption and capability of working in vivo, molecular-scale computing promises a new computational paradigm. Inspired by the concepts from the electronic computer, DNA computing has realized basic Boolean functions and has progressed into multi-layered circuits. Recently, RNA nanotechnology has emerged as an alternative approach. Owing to the newly discovered thermodynamic stability of a special RNA motif (Shu et al. 2011 Nat. Nanotechnol. 6, 658–667 (doi:10.1038/nnano.2011.105)), RNA nanoparticles are emerging as another promising medium for nanodevice and nanomedicine as well as molecular-scale computing. Like DNA, RNA sequences can be designed to form desired secondary structures in a straightforward manner, but RNA is structurally more versatile and more thermodynamically stable owing to its non-canonical base-pairing, tertiary interactions and base-stacking property. A 90-nucleotide RNA can exhibit 4⁹⁰ nanostructures, and its loops and tertiary architecture can serve as a mounting dovetail that eliminates the need for external linking dowels. Its enzymatic and fluorogenic activity creates diversity in computational design. Varieties of small RNA can work cooperatively, synergistically or antagonistically to carry out computational logic circuits. The riboswitch and enzymatic ribozyme activities and its special in vivo attributes offer a great potential for in vivo computation. Unique features in transcription, termination, self-assembly, self-processing and acid resistance enable in vivo production of RNA nanoparticles that harbour various regulators for intracellular manipulation. With all these advantages, RNA computation is promising, but it is still in its infancy. Many challenges still exist. Collaborations between RNA nanotechnologists and computer scientists are necessary to advance this nascent technology. PMID:24000362
RNA nanotechnology for computer design and in vivo computation.
Qiu, Meikang; Khisamutdinov, Emil; Zhao, Zhengyi; Pan, Cheryl; Choi, Jeong-Woo; Leontis, Neocles B; Guo, Peixuan
2013-10-13
Molecular-scale computing has been explored since 1989 owing to the foreseeable limitation of Moore's law for silicon-based computation devices. With the potential of massive parallelism, low energy consumption and capability of working in vivo, molecular-scale computing promises a new computational paradigm. Inspired by the concepts from the electronic computer, DNA computing has realized basic Boolean functions and has progressed into multi-layered circuits. Recently, RNA nanotechnology has emerged as an alternative approach. Owing to the newly discovered thermodynamic stability of a special RNA motif (Shu et al. 2011 Nat. Nanotechnol. 6, 658-667 (doi:10.1038/nnano.2011.105)), RNA nanoparticles are emerging as another promising medium for nanodevice and nanomedicine as well as molecular-scale computing. Like DNA, RNA sequences can be designed to form desired secondary structures in a straightforward manner, but RNA is structurally more versatile and more thermodynamically stable owing to its non-canonical base-pairing, tertiary interactions and base-stacking property. A 90-nucleotide RNA can exhibit 4⁹⁰ nanostructures, and its loops and tertiary architecture can serve as a mounting dovetail that eliminates the need for external linking dowels. Its enzymatic and fluorogenic activity creates diversity in computational design. Varieties of small RNA can work cooperatively, synergistically or antagonistically to carry out computational logic circuits. The riboswitch and enzymatic ribozyme activities and its special in vivo attributes offer a great potential for in vivo computation. Unique features in transcription, termination, self-assembly, self-processing and acid resistance enable in vivo production of RNA nanoparticles that harbour various regulators for intracellular manipulation. With all these advantages, RNA computation is promising, but it is still in its infancy. Many challenges still exist. Collaborations between RNA nanotechnologists and computer scientists are necessary to advance this nascent technology.
ERIC Educational Resources Information Center
Heslin, J. Alexander, Jr.
In senior-level undergraduate research courses in Computer Information Systems (CIS), students are required to read and assimilate a large volume of current research literature. One course objective is to demonstrate to the student that there are patterns or models or paradigms of research. A new approach in identifying research paradigms is…
Changing the Cultural Paradigm to Meet Emerging Requirements
NASA Technical Reports Server (NTRS)
Robbins, William W.
2007-01-01
This viewgraph presentation reviews changes that are required in space transportation. The shift from reliance on the Space Shuttle to a mixture of space transportation vehicles, i.e., the Russian Progress vehicle, the Japanese HTV, the European ATV, and other commercial orbital transportation systems, requires a new cultural paradigm. This new paradigm is outlined and reviewed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demeure, I.M.
The research presented here is concerned with representation techniques and tools to support the design, prototyping, simulation, and evaluation of message-based parallel, distributed computations. The author describes ParaDiGM-Parallel, Distributed computation Graph Model-a visual representation technique for parallel, message-based distributed computations. ParaDiGM provides several views of a computation depending on the aspect of concern. It is made of two complementary submodels, the DCPG-Distributed Computing Precedence Graph-model, and the PAM-Process Architecture Model-model. DCPGs are precedence graphs used to express the functionality of a computation in terms of tasks, message-passing, and data. PAM graphs are used to represent the partitioning of a computation into schedulable units or processes, and the pattern of communication among those units. There is a natural mapping between the two models. He illustrates the utility of ParaDiGM as a representation technique by applying it to various computations (e.g., an adaptive global optimization algorithm, the client-server model). ParaDiGM representations are concise. They can be used in documenting the design and the implementation of parallel, distributed computations, in describing such computations to colleagues, and in comparing and contrasting various implementations of the same computation. He then describes VISA-VISual Assistant, a software tool to support the design, prototyping, and simulation of message-based parallel, distributed computations. VISA is based on the ParaDiGM model. In particular, it supports the editing of ParaDiGM graphs to describe the computations of interest, and the animation of these graphs to provide visual feedback during simulations. The graphs are supplemented with various attributes, simulation parameters, and interpretations which are procedures that can be executed by VISA.
The Emerging Paradigm in Probation and Parole in the United States
ERIC Educational Resources Information Center
Kimora
2008-01-01
There is an emerging paradigm in probation and parole in the United States. That new outlook encompasses a realization that these forms of supervision of offenders must meet the challenges of an increasing number of parolees and probationers. Recidivism continues to be the primary outcome measure for probation, as it is for all corrections…
ERIC Educational Resources Information Center
Mills, Steven C.; Ragan, Tillman J.
This paper examines a research paradigm that is particularly suited to experimentation-related computer-based instruction and integrated learning systems. The main assumption of the model is that one of the most powerful capabilities of computer-based instruction, and specifically of integrated learning systems, is the capacity to adapt…
ERIC Educational Resources Information Center
Siebler, Frank; Sabelus, Saskia; Bohner, Gerd
2008-01-01
A refined computer paradigm for assessing sexual harassment is presented, validated, and used for testing substantive hypotheses. Male participants were given an opportunity to send sexist jokes to a computer-simulated female chat partner. In Study 1 (N = 44), the harassment measure (number of sexist jokes sent) correlated positively with…
Spectral-temporal EEG dynamics of speech discrimination processing in infants during sleep.
Gilley, Phillip M; Uhler, Kristin; Watson, Kaylee; Yoshinaga-Itano, Christine
2017-03-22
Oddball paradigms are frequently used to study auditory discrimination by comparing event-related potential (ERP) responses to a standard, high-probability sound and to a deviant, low-probability sound. Previous research has established that such paradigms, such as the mismatch response or mismatch negativity, are useful for examining auditory processes in young children and infants across various sleep and attention states. The extent to which oddball ERP responses may reflect subtle discrimination effects, such as speech discrimination, is largely unknown, especially in infants that have not yet acquired speech and language. Mismatch responses for three contrasts (non-speech, vowel, and consonant) were computed as a spectral-temporal probability function in 24 infants, and analyzed at the group level by a modified multidimensional scaling. Immediately following an onset gamma response (30-50 Hz), the emergence of a beta oscillation (12-30 Hz) was temporally coupled with a lower frequency theta oscillation (2-8 Hz). The spectral-temporal probability of this coupling effect relative to a subsequent theta modulation corresponds with discrimination difficulty for non-speech, vowel, and consonant contrast features. The theta modulation effect suggests that unexpected sounds are encoded as a probabilistic measure of surprise. These results support the notion that auditory discrimination is driven by the development of brain networks for predictive processing, and can be measured in infants during sleep. The results presented here have implications for the interpretation of discrimination as a probabilistic process, and may provide a basis for the development of single-subject and single-trial classification in a clinically useful context. An infant's brain is processing information about the environment and performing computations, even during sleep. These computations reflect subtle differences in acoustic feature processing that are necessary for language-learning. Results from this study suggest that brain responses to deviant sounds in an oddball paradigm follow a cascade of oscillatory modulations. This cascade begins with a gamma response that later emerges as a beta synchronization, which is temporally coupled with a theta modulation, and followed by a second, subsequent theta modulation. The difference in frequency and timing of the theta modulations appears to reflect a measure of surprise. These insights into the neurophysiological mechanisms of auditory discrimination provide a basis for exploring the clinical utility of the MMR TF and other auditory oddball responses.
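A minimal sketch of the band-limited analysis implied by these results (illustrative only; the paper's spectral-temporal probability measure is considerably more involved than a single FFT) computes a deviant-minus-standard difference wave and its average power in the theta, beta, and gamma bands:

    import numpy as np

    fs = 500.0                                  # sampling rate in Hz (assumed)
    t = np.arange(0, 0.6, 1 / fs)               # 600 ms epoch
    rng = np.random.default_rng(0)
    standard = rng.normal(0, 1, t.size)         # stand-ins for averaged ERPs
    deviant = standard + 0.5 * np.sin(2 * np.pi * 5 * t)   # extra theta-band activity

    diff = deviant - standard                   # mismatch: deviant minus standard
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    power = np.abs(np.fft.rfft(diff)) ** 2

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return power[mask].mean()

    print("theta (2-8 Hz):  ", band_power(2, 8))
    print("beta (12-30 Hz): ", band_power(12, 30))
    print("gamma (30-50 Hz):", band_power(30, 50))

Time-resolved versions of this computation (for example short-time or wavelet transforms) are what allow the latency and coupling of the theta and beta modulations to be compared across contrast types.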
Paradigm shifts and the interplay between state, business and civil sectors.
Encarnação, Sara; Santos, Fernando P; Santos, Francisco C; Blass, Vered; Pacheco, Jorge M; Portugali, Juval
2016-12-01
The recent rise of the civil sector as a main player of socio-political actions, next to public and private sectors, has largely increased the complexity underlying the interplay between different sectors of our society. From urban planning to global governance, analysis of these complex interactions requires new mathematical and computational approaches. Here, we develop a novel framework, grounded on evolutionary game theory, to envisage situations in which each of these sectors is confronted with the dilemma of deciding between maintaining a status quo scenario or shifting towards a new paradigm. We consider multisector conflicts regarding environmentally friendly policies as an example of application, but the framework developed here has a considerably broader scope. We show that the public sector is crucial in initiating the shift, and determine explicitly under which conditions the civil sector-reflecting the emergent reality of civil society organizations playing an active role in modern societies-may influence the decision-making processes accruing to other sectors, while fostering new routes towards a paradigm shift of the society as a whole. Our results are shown to be robust to a wide variety of assumptions and model parametrizations.
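A minimal evolutionary-game sketch in this spirit (a single population with two strategies, status quo versus new paradigm, evolving under replicator dynamics; the payoff values are hypothetical and far simpler than the paper's three-sector model) is:

    import numpy as np

    # Payoff matrix: rows = focal strategy, columns = opponent strategy
    # strategy 0 = keep the status quo, strategy 1 = shift to the new paradigm
    A = np.array([[1.0, 0.2],
                  [0.4, 1.5]])   # coordination-like game: shifting pays once enough others shift

    x, dt = 0.35, 0.01           # initial fraction of "shifters" and integration step
    for _ in range(5000):
        p = np.array([1 - x, x])
        f = A @ p                # expected payoff of each strategy
        phi = p @ f              # population-average payoff
        x += dt * x * (f[1] - phi)   # replicator equation for the shifting strategy

    print(x)   # ends near 0 or 1, depending on whether the start exceeds the unstable mixed equilibrium

In the paper's multisector setting, the public, private, and civil sectors each follow coupled dynamics of broadly this kind, and the conditions for a society-wide shift depend on the interplay among the three populations.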
Paradigm shifts and the interplay between state, business and civil sectors
NASA Astrophysics Data System (ADS)
Encarnação, Sara; Santos, Fernando P.; Santos, Francisco C.; Blass, Vered; Pacheco, Jorge M.; Portugali, Juval
2016-12-01
The recent rise of the civil sector as a main player of socio-political actions, next to public and private sectors, has largely increased the complexity underlying the interplay between different sectors of our society. From urban planning to global governance, analysis of these complex interactions requires new mathematical and computational approaches. Here, we develop a novel framework, grounded on evolutionary game theory, to envisage situations in which each of these sectors is confronted with the dilemma of deciding between maintaining a status quo scenario or shifting towards a new paradigm. We consider multisector conflicts regarding environmentally friendly policies as an example of application, but the framework developed here has a considerably broader scope. We show that the public sector is crucial in initiating the shift, and determine explicitly under which conditions the civil sector-reflecting the emergent reality of civil society organizations playing an active role in modern societies-may influence the decision-making processes accruing to other sectors, while fostering new routes towards a paradigm shift of the society as a whole. Our results are shown to be robust to a wide variety of assumptions and model parametrizations.
Mountain hydrology, snow color, and the fourth paradigm
NASA Astrophysics Data System (ADS)
Dozier, Jeff
2011-10-01
The world's mountain ranges accumulate substantial snow, whose melt produces the bulk of runoff and often combines with rain to cause floods. Worldwide, inadequate understanding and a reliance on sparsely distributed observations limit our ability to predict seasonal and paroxysmal runoff as climate changes, ecosystems adapt, populations grow, land use evolves, and societies make choices. To improve assessments of snow accumulation, melt, and runoff, scientists and community planners can take advantage of two emerging trends: (1) an ability to remotely sense snow properties from satellites at a spatial scale appropriate for mountain regions (10- to 100-meter resolution, coverage of the order of 100,000 square kilometers) and a daily temporal scale appropriate for the dynamic nature of snow and (2) The Fourth Paradigm [Hey et al., 2009], which posits a new scientific approach in which insight is discovered through the manipulation of large data sets as the evolutionary step in scientific thinking beyond the first three paradigms: empiricism, analyses, and simulation. The inspiration for the book's title comes from pioneering computer scientist Jim Gray, based on a lecture he gave at the National Academy of Sciences 3 weeks before he disappeared at sea.
Beyond the “urge to move”: objective measures for the study of agency in the post-Libet era
Rowe, James B.
2014-01-01
The investigation of human volition is a longstanding endeavor from both philosophers and researchers. Yet because of the major challenges associated with capturing voluntary movements in an ecologically relevant state in the research environment, it is only in recent years that human agency has grown as a field of cognitive neuroscience. In particular, the seminal work of Libet et al. (1983) paved the way for a neuroscientific approach to agency. Over the past decade, new objective paradigms have been developed to study agency, drawing upon emerging concepts from cognitive and computational neuroscience. These include the chronometric approach of Libet’s study which is embedded in the “intentional binding” paradigm, optimal motor control theory and most recent insights from active inference theory. Here we review these principal methods and their application to the study of agency in health and the insights gained from their application to neurological and psychiatric disorders. We show that the neuropsychological paradigms that are based upon these new approaches have key advantages over traditional experimental designs. We propose that these advantages, coupled with advances in neuroimaging, create a powerful set of tools for understanding human agency and its neurobiological basis. PMID:24999325
Beyond the "urge to move": objective measures for the study of agency in the post-Libet era.
Wolpe, Noham; Rowe, James B
2014-01-01
The investigation of human volition is a longstanding endeavor from both philosophers and researchers. Yet because of the major challenges associated with capturing voluntary movements in an ecologically relevant state in the research environment, it is only in recent years that human agency has grown as a field of cognitive neuroscience. In particular, the seminal work of Libet et al. (1983) paved the way for a neuroscientific approach to agency. Over the past decade, new objective paradigms have been developed to study agency, drawing upon emerging concepts from cognitive and computational neuroscience. These include the chronometric approach of Libet's study which is embedded in the "intentional binding" paradigm, optimal motor control theory and most recent insights from active inference theory. Here we review these principal methods and their application to the study of agency in health and the insights gained from their application to neurological and psychiatric disorders. We show that the neuropsychological paradigms that are based upon these new approaches have key advantages over traditional experimental designs. We propose that these advantages, coupled with advances in neuroimaging, create a powerful set of tools for understanding human agency and its neurobiological basis.
Optimization and large scale computation of an entropy-based moment closure
NASA Astrophysics Data System (ADS)
Garrett, C. Kristopher; Hauck, Cory; Hill, Judith
2015-12-01
We present computational advances and results in the implementation of an entropy-based moment closure, M_N, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as P_N, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication-bound simulations, we present timing results at the largest computational scales currently available. These results show, in particular, load balancing issues in scaling the M_N algorithm that do not appear for the P_N algorithm. We also observe that in weak scaling tests, the ratio in time to solution of M_N to P_N decreases.
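For context, the M_N closure can be written in its generic Maxwell-Boltzmann-entropy form (standard background, independent of the implementation details reported here): given the moments u of the kinetic density with respect to the test functions m (spherical harmonics up to degree N), the closure reconstructs

    \hat{f} \;=\; \underset{g \ge 0}{\operatorname{argmin}}
      \left\{ \int_{\mathbb{S}^2} \big( g \log g - g \big)\, d\Omega
              \;:\; \int_{\mathbb{S}^2} \mathbf{m}\, g \, d\Omega = \mathbf{u} \right\},
    \qquad
    \hat{f}(\Omega) = \exp\!\big( \boldsymbol{\alpha}(\mathbf{u}) \cdot \mathbf{m}(\Omega) \big).

The Lagrange multipliers α(u) must be recovered by solving a convex dual optimization problem in every spatial cell at every time step, which is the per-cell cost that GPU acceleration and the exploitation of spherical-harmonic structure are aimed at reducing.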
Optimization and large scale computation of an entropy-based moment closure
Hauck, Cory D.; Hill, Judith C.; Garrett, C. Kristopher
2015-09-10
We present computational advances and results in the implementation of an entropy-based moment closure, M_N, in the context of linear kinetic equations, with an emphasis on heterogeneous and large-scale computing platforms. Entropy-based closures are known in several cases to yield more accurate results than closures based on standard spectral approximations, such as P_N, but the computational cost is generally much higher and often prohibitive. Several optimizations are introduced to improve the performance of entropy-based algorithms over previous implementations. These optimizations include the use of GPU acceleration and the exploitation of the mathematical properties of spherical harmonics, which are used as test functions in the moment formulation. To test the emerging high-performance computing paradigm of communication-bound simulations, we present timing results at the largest computational scales currently available. Lastly, these results show, in particular, load balancing issues in scaling the M_N algorithm that do not appear for the P_N algorithm. We also observe that in weak scaling tests, the ratio in time to solution of M_N to P_N decreases.
Toward an Implementation Paradigm of Educational Change.
ERIC Educational Resources Information Center
Berman, Paul
Besides identifying current problems in educational change research, this document describes the inadequate paradigm on which past research has been based and outlines a currently emerging paradigm of educational innovation. The author asserts that previous research findings have been inconsistent and not comparable. Basic to the research problems…
2012-01-01
Computerized stimulation paradigms for use during functional neuroimaging (i.e., MSIT). Accomplishments: • The following computer tasks were ... and Stability Test. • Programming of all computerized functional MRI stimulation paradigms and assessment tasks using E-Prime software was completed. • Computer stimulation paradigms were tested in the scanner environment to ensure that they could be presented and seen by subjects in the scanner.
Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.
Krishnamurthy, V; Krishnamurthy, E V
1999-03-01
A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, the computations are interpreted as the outcome arising out of the interaction of elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among the subsets of elements, so that the elements evolve toward an equilibrium or an unstable or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic. Also, it can handle a probabilistic mode of computation if we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-like query processing and transactions.
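A minimal executable sketch of the object-space idea (illustrative only; the rule syntax, probabilities and toy "chemistry" below are assumptions, not the paper's formal notation) applies probabilistic interaction rules to a multiset of elements until no rule applies:

    import random

    # Object space: a multiset of elements. Each rule consumes reactants and
    # produces products with a given probability (toy rules: A + B -> C, C -> A).
    rules = [
        {"consume": ("A", "B"), "produce": ("C",), "prob": 0.9},
        {"consume": ("C",),     "produce": ("A",), "prob": 0.1},
    ]

    space = ["A"] * 10 + ["B"] * 8
    random.seed(1)

    for _ in range(200):
        applicable = [r for r in rules
                      if all(space.count(e) >= r["consume"].count(e)
                             for e in set(r["consume"]))]
        if not applicable:
            break
        rule = random.choice(applicable)
        if random.random() < rule["prob"]:      # probabilistic rule application
            for e in rule["consume"]:
                space.remove(e)
            space.extend(rule["produce"])

    print(sorted(space))   # the population the interaction rules drive the object space toward

Running many such object spaces in parallel with different random seeds gives the ensemble-of-trajectories picture described above, from which bulk, Gibbsian-style statistics can be collected.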
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhen, Jacob; Imam, Neena
2007-01-01
Revolutionary computing technologies are defined in terms of technological breakthroughs, which leapfrog over near-term projected advances in conventional hardware and software to produce paradigm shifts in computational science. For underwater threat source localization using information provided by a dynamical sensor network, one of the most promising computational advances builds upon the emergence of digital optical-core devices. In this article, we present initial results of sensor network calculations that focus on the concept of signal wavefront time-difference-of-arrival (TDOA). The corresponding algorithms are implemented on the EnLight processing platform recently introduced by Lenslet Laboratories. This tera-scale digital optical core processor is optimized for array operations, which it performs in a fixed-point-arithmetic architecture. Our results (i) illustrate the ability to reach the required accuracy in the TDOA computation, and (ii) demonstrate that a considerable speed-up can be achieved when using the EnLight 64a prototype processor as compared to a dual Intel Xeon™ processor.
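A minimal sketch of the TDOA estimate itself (generic cross-correlation on synthetic signals; nothing here reflects the EnLight fixed-point architecture or the actual sensor data) is:

    import numpy as np

    fs = 10_000.0                             # sample rate in Hz (assumed)
    t = np.arange(0, 0.1, 1 / fs)
    rng = np.random.default_rng(3)
    source = np.sin(2 * np.pi * 400 * t) * np.exp(-50 * t)   # toy source waveform

    true_delay = 25                           # delay between the two sensors, in samples
    sensor_a = source + 0.05 * rng.normal(size=t.size)
    sensor_b = np.roll(source, true_delay) + 0.05 * rng.normal(size=t.size)

    # TDOA estimate: the lag that maximizes the cross-correlation of the two records
    corr = np.correlate(sensor_b, sensor_a, mode="full")
    lag = int(np.argmax(corr)) - (t.size - 1)
    print(lag / fs)                           # estimated time difference of arrival, in seconds

The appeal of an optical-core processor for this task is that the underlying correlations are dense array operations, which map naturally onto hardware optimized for fixed-point matrix and vector arithmetic.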
Prediction based proactive thermal virtual machine scheduling in green clouds.
Kinger, Supriya; Kumar, Rajesh; Sharma, Anju
2014-01-01
Cloud computing has rapidly emerged as a widely accepted computing paradigm, but the research on Cloud computing is still at an early stage. Cloud computing provides many advanced features but it still has some shortcomings, such as relatively high operating cost and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperature of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated.
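The scheduling rule can be pictured with a small sketch; the linear temperature predictor, the per-VM heat increment and the thresholds below are illustrative stand-ins, not the paper's model.

    def predict_temperature(current, rate_per_min, horizon_min=5.0):
        """Naive linear extrapolation of a server's temperature (illustrative only)."""
        return current + rate_per_min * horizon_min

    def pick_server(servers, vm_heat=2.0):
        """Choose the coolest server whose predicted temperature, plus the extra
        heat attributed to the new VM, stays below its maximum threshold."""
        feasible = []
        for s in servers:
            predicted = predict_temperature(s["temp"], s["rate"]) + vm_heat
            if predicted < s["max_temp"]:
                feasible.append((predicted, s["name"]))
        return min(feasible)[1] if feasible else None   # None -> defer scheduling

    servers = [
        {"name": "sm1", "temp": 68.0, "rate": 0.8, "max_temp": 75.0},
        {"name": "sm2", "temp": 55.0, "rate": 0.2, "max_temp": 75.0},
        {"name": "sm3", "temp": 71.0, "rate": 0.1, "max_temp": 75.0},
    ]
    print(pick_server(servers))   # -> 'sm2'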
ERIC Educational Resources Information Center
Martin, C. Dianne, Ed.; Murchie-Beyma, Eric, Ed.
This monograph includes nine papers delivered at a National Educational Computing Conference (NECC) preconference workshop, and a previously unpublished paper on gender and attitudes. The papers, which are presented in four categories, are: (1) "Report on the Workshop: In Search of Gender Free Paradigms for Computer Science Education"…
Linking consistency with object/thread semantics - An approach to robust computation
NASA Technical Reports Server (NTRS)
Chen, Raymond C.; Dasgupta, Partha
1989-01-01
This paper presents an object/thread based paradigm that links data consistency with object/thread semantics. The paradigm can be used to achieve a wide range of consistency semantics from strict atomic transactions to standard process semantics. The paradigm supports three types of data consistency. Object programmers indicate the type of consistency desired on a per-operation basis and the system performs automatic concurrency control and recovery management to ensure that those consistency requirements are met. This allows programmers to customize consistency and recovery on a per-application basis without having to supply complicated, custom recovery management schemes. The paradigm allows robust and nonrobust computation to operate concurrently on the same data in a well defined manner. The operating system needs to support only one vehicle of computation - the thread.
NAS-current status and future plans
NASA Technical Reports Server (NTRS)
Bailey, F. R.
1987-01-01
The Numerical Aerodynamic Simulation (NAS) has met its first major milestone, the NAS Processing System Network (NPSN) Initial Operating Configuration (IOC). The program has met its goal of providing a national supercomputer facility capable of greatly enhancing the Nation's research and development efforts. Furthermore, the program is fulfilling its pathfinder role by defining and implementing a paradigm for supercomputing system environments. The IOC is only the beginning, and the NAS Program will aggressively continue to develop and implement emerging supercomputer, communications, storage, and software technologies to strengthen computation as a critical element in supporting the Nation's leadership role in aeronautics.
AN INTELLIGENT REPRODUCTIVE AND DEVELOPMENTAL TESTING PARADIGM FOR THE 21ST CENTURY
Addressing the chemical evaluation bottleneck that currently exists can only be achieved through progressive changes to the current testing paradigm. The primary resources for addressing these issues lie in computational toxicology, a field enriched by recent advances in computer...
Nonlinear Real-Time Optical Signal Processing
1990-09-01
pattern recognition. Additional work concerns the relationship of parallel computation paradigms to optical computing and halftone screen techniques for implementing general nonlinear functions. The report also includes the section "Nonlinear Optical Processing with Halftones: Degradation and Compensation Models."
New Paradigms for Computer Aids to Invention.
ERIC Educational Resources Information Center
Langston, M. Diane
Many people are interested in computer aids to rhetorical invention and want to know how to evaluate an invention aid, what the criteria are for a good one, and how to assess the trade-offs involved in buying one product or another. The frame of reference for this evaluation is an "old paradigm," which treats the computer as if it were…
ERIC Educational Resources Information Center
Pavlik, John V.
2015-01-01
Emerging technologies are fueling a third paradigm of education. Digital, networked and mobile media are enabling a disruptive transformation of the teaching and learning process. This paradigm challenges traditional assumptions that have long characterized educational institutions and processes, including basic notions of space, time, content,…
Treatment of Children with Speech Oral Placement Disorders (OPDs): A Paradigm Emerges
ERIC Educational Resources Information Center
Bahr, Diane; Rosenfeld-Johnson, Sara
2010-01-01
Epidemiological research was used to develop the Speech Disorders Classification System (SDCS). The SDCS is an important speech diagnostic paradigm in the field of speech-language pathology. This paradigm could be expanded and refined to also address treatment while meeting the standards of evidence-based practice. The article assists that process…
Bukowski, Henryk; Hietanen, Jari K.; Samson, Dana
2015-01-01
Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations. PMID:26924936
Bukowski, Henryk; Hietanen, Jari K; Samson, Dana
2015-09-14
Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations.
All-optical reservoir computing.
Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge
2012-09-24
Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
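A software analogue of the single-node, delay-loop architecture can be sketched as follows; here tanh stands in for the saturating semiconductor optical amplifier, the virtual-node update is an echo-state-style approximation of the delay line, and the toy recall task and parameter values are our own choices rather than the paper's benchmarks.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 50, 3000                     # virtual nodes in the delay loop, time steps
    u = rng.uniform(-1.0, 1.0, T)       # scalar input stream
    target = np.roll(u, 1)              # toy task: recall the previous input sample
    mask = rng.uniform(-1.0, 1.0, N)    # input mask spreading u over the virtual nodes

    alpha, beta = 0.5, 0.9              # feedback strength and input scaling
    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):
        # tanh stands in for the saturating nonlinearity (the SOA in the paper)
        x = np.tanh(alpha * x + beta * mask * u[t])
        states[t] = x

    # Linear readout trained by ridge regression on a training segment
    train = slice(10, 2000)
    X, y = states[train], target[train]
    w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
    pred = states[2000:] @ w
    nmse = np.mean((pred - target[2000:]) ** 2) / np.var(target[2000:])
    print(f"test NMSE: {nmse:.3f}")     # well below 1.0 indicates usable memory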
Computationally guided discovery of thermoelectric materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorai, Prashun; Stevanović, Vladan; Toberer, Eric S.
The potential for advances in thermoelectric materials, and thus solid-state refrigeration and power generation, is immense. Progress so far has been limited by both the breadth and diversity of the chemical space and the serial nature of experimental work. In this Review, we discuss how recent computational advances are revolutionizing our ability to predict electron and phonon transport and scattering, as well as materials dopability, and we examine efficient approaches to calculating critical transport properties across large chemical spaces. When coupled with experimental feedback, these high-throughput approaches can stimulate the discovery of new classes of thermoelectric materials. Within smaller materials subsets, computations can guide the optimal chemical and structural tailoring to enhance materials performance and provide insight into the underlying transport physics. Beyond perfect materials, computations can be used for the rational design of structural and chemical modifications (such as defects, interfaces, dopants and alloys) to provide additional control on transport properties to optimize performance. Through computational predictions for both materials searches and design, a new paradigm in thermoelectric materials discovery is emerging.
Computationally guided discovery of thermoelectric materials
Gorai, Prashun; Stevanović, Vladan; Toberer, Eric S.
2017-08-22
The potential for advances in thermoelectric materials, and thus solid-state refrigeration and power generation, is immense. Progress so far has been limited by both the breadth and diversity of the chemical space and the serial nature of experimental work. In this Review, we discuss how recent computational advances are revolutionizing our ability to predict electron and phonon transport and scattering, as well as materials dopability, and we examine efficient approaches to calculating critical transport properties across large chemical spaces. When coupled with experimental feedback, these high-throughput approaches can stimulate the discovery of new classes of thermoelectric materials. Within smaller materials subsets, computations can guide the optimal chemical and structural tailoring to enhance materials performance and provide insight into the underlying transport physics. Beyond perfect materials, computations can be used for the rational design of structural and chemical modifications (such as defects, interfaces, dopants and alloys) to provide additional control on transport properties to optimize performance. Through computational predictions for both materials searches and design, a new paradigm in thermoelectric materials discovery is emerging.
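A toy example of the screening step in such high-throughput workflows is sketched below; the figure of merit zT = S^2 sigma T / kappa is the standard definition, while the candidate names and property values are invented purely for illustration.

    def zT(seebeck_V_per_K, sigma_S_per_m, kappa_W_per_mK, T_K=600.0):
        """Standard thermoelectric figure of merit zT = S^2 * sigma * T / kappa."""
        return seebeck_V_per_K**2 * sigma_S_per_m * T_K / kappa_W_per_mK

    # Hypothetical candidate entries (values are illustrative, not real data).
    candidates = {
        "mat_A": dict(seebeck_V_per_K=220e-6, sigma_S_per_m=8.0e4, kappa_W_per_mK=1.2),
        "mat_B": dict(seebeck_V_per_K=150e-6, sigma_S_per_m=2.0e4, kappa_W_per_mK=2.5),
        "mat_C": dict(seebeck_V_per_K=300e-6, sigma_S_per_m=5.0e4, kappa_W_per_mK=0.9),
    }

    # High-throughput-style screen: keep materials above a chosen zT threshold.
    shortlist = {name: zT(**p) for name, p in candidates.items() if zT(**p) > 1.0}
    print(shortlist)   # mat_A and mat_C pass the cut with these toy numbers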
Reprogrammable logic in memristive crossbar for in-memory computing
NASA Astrophysics Data System (ADS)
Cheng, Long; Zhang, Mei-Yun; Li, Yi; Zhou, Ya-Xiong; Wang, Zhuo-Rui; Hu, Si-Yu; Long, Shi-Bing; Liu, Ming; Miao, Xiang-Shui
2017-12-01
Memristive stateful logic has emerged as a promising next-generation in-memory computing paradigm to address escalating computing-performance pressures in traditional von Neumann architecture. Here, we present a nonvolatile reprogrammable logic method that can process data between different rows and columns in a memristive crossbar array based on material implication (IMP) logic. Arbitrary Boolean logic can be executed with a reprogrammable cell containing four memristors in a crossbar array. In the fabricated Ti/HfO2/W memristive array, some fundamental functions, such as universal NAND logic and data transfer, were experimentally implemented. Moreover, using eight memristors in a 2 × 4 array, a one-bit full adder was theoretically designed and verified by simulation to exhibit the feasibility of our method to accomplish complex computing tasks. In addition, some critical logic-related performances were further discussed, such as the flexibility of data processing, cascading problem and bit error rate. Such a method could be a step forward in developing IMP-based memristive nonvolatile logic for large-scale in-memory computing architecture.
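The logical core of the method, material implication (IMP) plus a cleared working cell being functionally complete, can be checked with a few lines of bit-level simulation; this sketch only verifies the Boolean identity NAND(p, q) = p IMP (q IMP 0) and says nothing about the device-level voltages or the crossbar layout.

    def imp(p, q):
        """Material implication: the target memristor ends up holding p IMP q.
        Truth table: 0,0 -> 1 ; 0,1 -> 1 ; 1,0 -> 0 ; 1,1 -> 1."""
        return 1 if (p == 0 or q == 1) else 0

    def nand(p, q):
        """NAND built from IMP plus a cleared (FALSE) working memristor."""
        s = 0                  # step 1: initialise the work cell to logic 0
        s = imp(q, s)          # step 2: s becomes NOT q
        s = imp(p, s)          # step 3: s becomes NOT p OR NOT q = NAND(p, q)
        return s

    for p in (0, 1):
        for q in (0, 1):
            print(p, q, nand(p, q))   # 0 0 1 / 0 1 1 / 1 0 1 / 1 1 0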
Method Engineering: A Service-Oriented Approach
NASA Astrophysics Data System (ADS)
Cauvet, Corine
In the past, a large variety of methods have been published, ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering. The meta-modeling approach provides means for building methods by instantiation, while the component-based approach aims at supporting the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of dynamic service composition support method construction and method adaptation to different development contexts.
Ambient intelligence in health care.
Riva, Giuseppe
2003-06-01
Ambient Intelligence (AmI) is a new paradigm in information technology, in which people are empowered through a digital environment that is aware of their presence and context, and is sensitive, adaptive, and responsive to their needs, habits, gestures and emotions. The most ambitious expression of AmI is Intelligent Mixed Reality (IMR), an evolution of traditional virtual reality environments. Using IMR, it is possible to integrate computer interfaces into the real environment, so that the user can interact with other individuals and with the environment itself in the most natural and intuitive way. How does the emergence of the AmI paradigm influence the future of health care? Using a scenario-based approach, this paper outlines the possible role of AmI in health care by focusing on both its technological and relational nature. In this sense, clinicians and health care providers who want to exploit AmI's potential need to pay significant attention to technology, ergonomics, project management, human factors and organizational changes in the structure of the relevant health service.
Summary: Special Session SpS15: Data Intensive Astronomy
NASA Astrophysics Data System (ADS)
Montmerle, Thierry
2015-03-01
A new paradigm in astronomical research has been emerging: "Data Intensive Astronomy," which utilizes large amounts of data combined with statistical data analyses. The first research method in astronomy was observation with the naked eye. It is well known that the invention of the telescope changed the human view of our Universe (although it was almost limited to the solar system), and led to Kepler's laws, which Newton later used to derive his mechanics. Newtonian mechanics then enabled astronomers to provide a theoretical explanation for the motion of the planets. Thus astronomers obtained the second paradigm, theoretical astronomy. Astronomers succeeded in applying various laws of physics to explain phenomena in the Universe; e.g., nuclear fusion was found to be the energy source of stars. Theoretical astronomy has been paired with observational astronomy to better understand the physics behind observed phenomena in the Universe. Although theoretical astronomy succeeded in providing good qualitative physical explanations, it was not easy to reach quantitative agreement with observations. With the advent of high-performance computers, however, astronomers gained a third research method, simulation, which yields better agreement with observations. Simulation astronomy has developed rapidly along with computer hardware (CPUs, GPUs, memories, storage systems, networks, and others) and simulation codes.
Information physics fundamentals of nanophotonics.
Naruse, Makoto; Tate, Naoya; Aono, Masashi; Ohtsu, Motoichi
2013-05-01
Nanophotonics has been extensively studied with the aim of unveiling and exploiting light-matter interactions that occur at a scale below the diffraction limit of light, and recent progress made in experimental technologies--both in nanomaterial fabrication and characterization--is driving further advancements in the field. From the viewpoint of information, on the other hand, novel architectures, design and analysis principles, and even novel computing paradigms should be considered so that we can fully benefit from the potential of nanophotonics. This paper examines the information physics aspects of nanophotonics. More specifically, we present some fundamental and emergent information properties that stem from optical excitation transfer mediated by optical near-field interactions and the hierarchical properties inherent in optical near-fields. We theoretically and experimentally investigate aspects such as unidirectional signal transfer, energy efficiency and networking effects, among others, and we present their basic theoretical formalisms and describe demonstrations of practical applications. A stochastic analysis of light-assisted material formation is also presented, where an information-based approach provides a deeper understanding of the phenomena involved, such as self-organization. Furthermore, the spatio-temporal dynamics of optical excitation transfer and its inherent stochastic attributes are utilized for solution searching, paving the way to a novel computing paradigm that exploits coherent and dissipative processes in nanophotonics.
A comparison among several P300 brain-computer interface speller paradigms.
Fazel-Rezai, Reza; Gavett, Scott; Ahmad, Waqas; Rabbi, Ahmed; Schneider, Eric
2011-10-01
Since the brain-computer interface (BCI) speller was first proposed by Farwell and Donchin, there have been modifications in the visual aspects of P300 paradigms. Most of the changes are based on the original matrix format, such as changes in the number of rows and columns, font size, flash/blank time, and flash order. The improvement in the resulting accuracy and speed of such systems has always been the ultimate goal. In this study, we have compared several different speller paradigms, including row/column, single character flashing, and two region-based paradigms which are not based on the matrix format. In the first region-based paradigm, at the first level, characters and symbols are distributed over seven regions alphabetically, while in the second region-based paradigm they are distributed in the most frequently used order. At the second level, each one of the regions is further subdivided into seven subsets. The experimental results showed that the average accuracy and user acceptability for the two region-based paradigms were higher than those for traditional paradigms such as row/column and single character.
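The two-level region idea can be sketched as a simple data-structure exercise; the alphabetical grouping below is an illustration of the first region-based paradigm and not the paper's exact on-screen layout.

    import string

    symbols = list(string.ascii_uppercase + string.digits) + ["_", "."]  # 38 items

    def build_regions(items, n_regions=7):
        """Split a symbol list into up to n_regions groups of (almost) equal size."""
        size = -(-len(items) // n_regions)          # ceiling division
        return [items[i:i + size] for i in range(0, len(items), size)]

    level1 = build_regions(symbols)                  # first-level regions
    target = "K"
    r1 = next(i for i, reg in enumerate(level1) if target in reg)
    level2 = build_regions(level1[r1])               # the chosen region is subdivided
    r2 = next(i for i, sub in enumerate(level2) if target in sub)
    print(f"'{target}' is reached with two choices: region {r1}, then subset {r2}")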
The Global Imperatives for an Education Paradigm Shift.
ERIC Educational Resources Information Center
Bright, Larry K.; And Others
The future role of education is covered in a discussion concerning the shifting of the dominant social paradigm of the United States. It is noted that the paradigm is changing from one that requires social institutions to seek and develop human resources to maintain a position of competitive dominance, to an emerging view of world interdependence.…
Cortex and Memory: Emergence of a New Paradigm
ERIC Educational Resources Information Center
Fuster, Joaquin M.
2009-01-01
Converging evidence from humans and nonhuman primates is obliging us to abandon conventional models in favor of a radically different, distributed-network paradigm of cortical memory. Central to the new paradigm is the concept of memory network or cognit--that is, a memory or an item of knowledge defined by a pattern of connections between neuron…
The Challenge of Infectious Diseases to the Biomedical Paradigm
ERIC Educational Resources Information Center
Foladori, Guillermo
2005-01-01
The resurgence of infectious diseases and the emergence of infectious diseases raise questions on how to cope with the situation. The germ or clinical approach is the hegemonic biomedical paradigm. In this article, the author argues that the spread of infectious diseases has posed a challenge to the biomedical paradigm and shows how lock-in…
Communities of Practice: A Research Paradigm for the Mixed Methods Approach
ERIC Educational Resources Information Center
Denscombe, Martyn
2008-01-01
The mixed methods approach has emerged as a "third paradigm" for social research. It has developed a platform of ideas and practices that are credible and distinctive and that mark the approach out as a viable alternative to quantitative and qualitative paradigms. However, there are also a number of variations and inconsistencies within the mixed…
Answer Set Programming and Other Computing Paradigms
ERIC Educational Resources Information Center
Meng, Yunsong
2013-01-01
Answer Set Programming (ASP) is one of the most prominent and successful knowledge representation paradigms. The success of ASP is due to its expressive non-monotonic modeling language and its efficient computational methods originating from building propositional satisfiability solvers. The wide adoption of ASP has motivated several extensions to…
Passive Motion Paradigm: An Alternative to Optimal Control
Mohan, Vishwanathan; Morasso, Pietro
2011-01-01
In the last years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the “degrees of freedom (DoFs) problem,” the common core of production, observation, reasoning, and learning of “actions.” OCT, directly derived from engineering design techniques of control systems, quantifies task goals as “cost functions” and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative, “softer” approach, the passive motion paradigm (PMP), which we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that “animates” the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints “at runtime,” hence solving the “DoFs problem” without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not only restricted to shaping motor output during action execution but also to provide the self with information on the feasibility, consequence, understanding and meaning of “potential actions.” In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures. PMID:22207846
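A minimal sketch of the attractor-dynamics idea for a planar two-link arm is given below: the goal induces a virtual force field at the end-effector, the field is mapped to joint space through the Jacobian transpose, and the configuration relaxes toward the goal without any explicit kinematic inversion. Link lengths, gains and step size are illustrative assumptions, not values from the paper.

    import numpy as np

    L1, L2 = 0.3, 0.25                       # link lengths (m)

    def forward(q):
        """End-effector position of a planar 2-link arm."""
        x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
        y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
        return np.array([x, y])

    def jacobian(q):
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    goal = np.array([0.35, 0.30])
    q = np.array([0.2, 0.4])                 # initial joint angles (rad)
    K, dt = 4.0, 0.01                        # field stiffness and integration step

    for _ in range(2000):                    # internal simulation "animating" the body schema
        force = K * (goal - forward(q))      # attractor force field induced by the goal
        q = q + dt * jacobian(q).T @ force   # joint motion follows the field, no inversion
    print("final:", forward(q), "goal:", goal)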
Are We Ready for Real-world Neuroscience?
Matusz, Pawel J; Dikker, Suzanne; Huth, Alexander G; Perrodin, Catherine
2018-06-19
Real-world environments are typically dynamic, complex, and multisensory in nature and require the support of top-down attention and memory mechanisms for us to be able to drive a car, make a shopping list, or pour a cup of coffee. Fundamental principles of perception and functional brain organization have been established by research utilizing well-controlled but simplified paradigms with basic stimuli. The last 30 years ushered in a revolution in computational power, brain mapping, and signal processing techniques. Drawing on those theoretical and methodological advances, over the years, research has departed more and more from traditional, rigorous, and well-understood paradigms to directly investigate cognitive functions and their underlying brain mechanisms in real-world environments. These investigations typically address the role of one or, more recently, multiple attributes of real-world environments. Fundamental assumptions about perception, attention, or brain functional organization have been challenged by studies adapting the traditional paradigms to emulate, for example, the multisensory nature or varying relevance of stimulation or dynamically changing task demands. Here, we present the state of the field within the emerging heterogeneous domain of real-world neuroscience. To be precise, the aim of this Special Focus is to bring together a variety of the emerging "real-world neuroscientific" approaches. These approaches differ in their principal aims, assumptions, or even definitions of "real-world neuroscience" research. Here, we showcase the commonalities and distinctive features of the different "real-world neuroscience" approaches. To do so, four early-career researchers and the speakers of the Cognitive Neuroscience Society 2017 Meeting symposium under the same title answer questions pertaining to the added value of such approaches in bringing us closer to accurate models of functional brain organization and cognitive functions.
Long Chen; Zhongpeng Wang; Feng He; Jiajia Yang; Hongzhi Qi; Peng Zhou; Baikun Wan; Dong Ming
2015-08-01
The hybrid brain-computer interface (hBCI) could provide a higher information transfer rate than classical BCIs. It combines more than one brain-computer or human-machine interaction paradigm, such as the combination of the P300 and SSVEP paradigms. We first constructed independent subsystems for three different paradigms and tested each of them in online experiments. We then constructed a serial hybrid BCI system which combined these paradigms to achieve the functions of typing letters, moving and clicking the cursor, and switching among them for the purpose of browsing webpages. Five subjects were involved in this study. They all successfully realized these functions in the online tests. The subjects could achieve an accuracy above 90% after training, which met the requirement for operating the system efficiently. The results demonstrated that it was an efficient and robust system, which provides an approach for clinical application.
Rieffel, John A.; Valero-Cuevas, Francisco J.; Lipson, Hod
2010-01-01
Traditional engineering approaches strive to avoid, or actively suppress, nonlinear dynamic coupling among components. Biological systems, in contrast, are often rife with these dynamics. Could there be, in some cases, a benefit to high degrees of dynamical coupling? Here we present a distributed robotic control scheme inspired by the biological phenomenon of tensegrity-based mechanotransduction. This emergence of morphology-as-information-conduit or ‘morphological communication’, enabled by time-sensitive spiking neural networks, presents a new paradigm for the decentralized control of large, coupled, modular systems. These results significantly bolster, both in magnitude and in form, the idea of morphological computation in robotic control. Furthermore, they lend further credence to ideas of embodied anatomical computation in biological systems, on scales ranging from cellular structures up to the tendinous networks of the human hand. PMID:19776146
Towards a flexible middleware for context-aware pervasive and wearable systems.
Muro, Marco; Amoretti, Michele; Zanichelli, Francesco; Conte, Gianni
2012-11-01
Ambient intelligence and wearable computing call for innovative hardware and software technologies, including a highly capable, flexible and efficient middleware, allowing for the reuse of existing pervasive applications when developing new ones. In the considered application domain, middleware should also support self-management, interoperability among different platforms, efficient communications, and context awareness. In the ongoing "everything is networked" scenario, scalability appears as a very important issue, for which the peer-to-peer (P2P) paradigm emerges as an appealing solution for connecting software components in an overlay network, allowing for efficient and balanced data distribution mechanisms. In this paper, we illustrate how all these concepts can be placed into a theoretical tool, called the networked autonomic machine (NAM), implemented into a NAM-based middleware, and evaluated against practical problems of pervasive computing.
The future: biomarkers, biosensors, neuroinformatics, and e-neuropsychiatry.
Lowe, Christopher R
2011-01-01
The emergence of molecular biomarkers for psychological, psychiatric, and neurodegenerative disorders is beginning to change current diagnostic paradigms for this debilitating family of mental illnesses. The development of new genomic, proteomic, and metabolomic tools has created the prospect of sensitive and specific biochemical tests to replace traditional pen-and-paper questionnaires. In the future, the realization of biosensor technologies, point-of-care testing, and the fusion of clinical biomarker data, electroencephalogram, and MRI data with the patient's past medical history, biopatterns, and prognosis may create personalized bioprofiles or fingerprints for brain disorders. Further, the application of mobile communications technology and grid computing to support data-, computation- and knowledge-based tasks will assist disease prediction, diagnosis, prognosis, and compliance monitoring. It is anticipated that, ultimately, mobile devices could become the next generation of personalized pharmacies. Copyright © 2011 Elsevier Inc. All rights reserved.
Artificial Intelligence and brain.
Shapshak, Paul
2018-01-01
From the start, Kurt Godel observed that computer and brain paradigms were considered on a par by researchers and that researchers had misunderstood his theorems. He hailed with displeasure that the brain transcends computers. In this brief article, we point out that Artificial Intelligence (AI) comprises multitudes of human-made methodologies, systems, and languages, and implemented with computer technology. These advances enhance development in the electron and quantum realms. In the biological realm, animal neurons function, also utilizing electron flow, and are products of evolution. Mirror neurons are an important paradigm in neuroscience research. Moreover, the paradigm shift proposed here - 'hall of mirror neurons' - is a potentially further productive research tactic. These concepts further expand AI and brain research.
CT-assisted agile manufacturing
NASA Astrophysics Data System (ADS)
Stanley, James H.; Yancey, Robert N.
1996-11-01
The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer-aided design, computer-aided manufacturing, and computer-aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.
A Lightweight Protocol for Secure Video Streaming
Morkevicius, Nerijus; Bagdonas, Kazimieras
2018-01-01
The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing “Fog Node-End Device” layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard. PMID:29757988
A Lightweight Protocol for Secure Video Streaming.
Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis
2018-05-14
The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
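The authentication idea can be sketched with Python's standard library: each datagram carries a sequence number and an HMAC-SHA256 tag computed with a pre-shared key, and the receiver discards packets whose tag does not verify. The field layout, key handling and the omission of the symmetric-cipher step are simplifications for illustration, not the paper's exact packet format.

    import hmac, hashlib, os, struct

    KEY = os.urandom(32)                       # pre-shared key (illustrative handling)

    def make_packet(seq, payload, key=KEY):
        """Prepend a sequence number and append an HMAC-SHA256 tag to the payload."""
        header = struct.pack("!I", seq)        # 4-byte big-endian sequence number
        tag = hmac.new(key, header + payload, hashlib.sha256).digest()
        return header + payload + tag          # would be carried in a UDP datagram

    def verify_packet(packet, key=KEY):
        """Return (seq, payload) if the tag is valid, else None."""
        header, payload, tag = packet[:4], packet[4:-32], packet[-32:]
        expected = hmac.new(key, header + payload, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return None                        # integrity or authenticity check failed
        return struct.unpack("!I", header)[0], payload

    pkt = make_packet(7, b"frame-data...")
    print(verify_packet(pkt))                            # (7, b'frame-data...')
    print(verify_packet(pkt[:-1] + bytes([pkt[-1] ^ 1])))  # None: tampered packet rejected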
Energy consumption analysis for various memristive networks under different learning strategies
NASA Astrophysics Data System (ADS)
Deng, Lei; Wang, Dong; Zhang, Ziyang; Tang, Pei; Li, Guoqi; Pei, Jing
2016-02-01
Recently, various memristive systems have emerged to emulate the efficient computing paradigm of the brain cortex; however, how to make them energy efficient still remains unclear, especially from an overall perspective. Here, a systematic and bottom-up energy consumption analysis is demonstrated, including the memristor device level and the network learning level. We propose an energy estimating methodology for modulating the memristive synapses, which is simulated in three typical neural networks with different synaptic structures and learning strategies for both offline and online learning. These results provide in-depth insight for creating energy-efficient brain-inspired neuromorphic devices in the future.
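To make the bottom-up accounting concrete, the toy tally below simply sums V x I x t over the pulses needed to move one synapse between conductance states; the pulse voltage, duration and conductance step are invented numbers, and this only illustrates the kind of estimate meant, not the paper's methodology.

    def programming_energy(g_start_S, g_target_S, v_pulse=1.2, t_pulse=50e-9,
                           dg_per_pulse=2e-6):
        """Estimate the energy to tune one memristive synapse between conductance
        states by summing V * I * t over the programming pulses (I = V * G)."""
        g, energy, pulses = g_start_S, 0.0, 0
        step = dg_per_pulse if g_target_S > g_start_S else -dg_per_pulse
        while abs(g_target_S - g) > abs(step):
            energy += v_pulse * (v_pulse * g) * t_pulse    # E += V * I * t per pulse
            g += step
            pulses += 1
        return energy, pulses

    e, n = programming_energy(10e-6, 60e-6)
    print(f"{n} pulses, {e * 1e12:.2f} pJ")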
An Instructional Paradigm for the Teaching of Computer-Mediated Communication
ERIC Educational Resources Information Center
Howard, Craig D.
2012-01-01
This article outlines an instructional paradigm that guides the design of interventions that build skills in computer-mediated communication (CMC). It is applicable to learning at multiple levels of communicative proficiency and aims to heighten awareness, the understanding of the impact of media configurations, the role of cultures and social…
Approaches to Cyber Intrusion Response. A Legal Foundations Study. Report 12 of 12
1997-01-01
Computers: Controlling Behavior in Cyberspace through a Contract Law Paradigm, 35 Jurimetrics J. 1-15 (1994); Ingrid Becker, Cybercrime: Cops Can't... • Pro: This approach allows the victim
ERIC Educational Resources Information Center
Reyes Alamo, Jose M.
2010-01-01
The Service Oriented Computing (SOC) paradigm, defines services as software artifacts whose implementations are separated from their specifications. Application developers rely on services to simplify the design, reduce the development time and cost. Within the SOC paradigm, different Service Oriented Architectures (SOAs) have been developed.…
NASA Astrophysics Data System (ADS)
Schejbal, D.
2013-12-01
There is a new paradigm emerging in higher education. Like all paradigm shifts, however, it is nearly impossible during the evolutionary process to tell what the final results will be. Nevertheless, we have indications of the directions of change, and they will be significant and transformative for many academic programs, colleges, and universities. In this session, we will explore some of the most notable changes that are impacting higher education, including the industrialization and commodification of education; the focus on accountability and its implications for teaching; the impact of external pressures from legislators, employers, students and parents; and the social, national, and global contexts that are forcing transformation. Geosciences, like many other disciplines, is impacted by these changes and must find new ways to navigate and adapt in order to survive and thrive in the emerging paradigm.
Optimization of sparse matrix-vector multiplication on emerging multicore platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Samuel; Oliker, Leonid; Vuduc, Richard
2007-01-01
We are witnessing a dramatic change in computer architecture due to the multicore paradigm shift, as every electronic device from cell phones to supercomputers confronts parallelism of unprecedented scale. To fully unleash the potential of these systems, the HPC community must develop multicore-specific optimization methodologies for important scientific computations. In this work, we examine sparse matrix-vector multiply (SpMV) - one of the most heavily used kernels in scientific computing - across a broad spectrum of multicore designs. Our experimental platform includes the homogeneous AMD dual-core and Intel quad-core designs, the heterogeneous STI Cell, as well as the first scientific study of the highly multithreaded Sun Niagara2. We present several optimization strategies especially effective for the multicore environment, and demonstrate significant performance improvements compared to existing state-of-the-art serial and parallel SpMV implementations. Additionally, we present key insights into the architectural tradeoffs of leading multicore design strategies, in the context of demanding memory-bound numerical algorithms.
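For reference, the kernel being tuned is the plain compressed sparse row (CSR) matrix-vector multiply shown below, in an unoptimized reference form; the blocking, prefetching and threading strategies discussed in the paper are not shown.

    import numpy as np

    def spmv_csr(values, col_idx, row_ptr, x):
        """y = A @ x with A stored in compressed sparse row (CSR) format."""
        n_rows = len(row_ptr) - 1
        y = np.zeros(n_rows)
        for i in range(n_rows):
            for k in range(row_ptr[i], row_ptr[i + 1]):
                y[i] += values[k] * x[col_idx[k]]
        return y

    # 3x3 example:  [[4, 0, 1],
    #                [0, 2, 0],
    #                [3, 0, 5]]
    values  = np.array([4.0, 1.0, 2.0, 3.0, 5.0])
    col_idx = np.array([0, 2, 1, 0, 2])
    row_ptr = np.array([0, 2, 3, 5])
    x = np.array([1.0, 2.0, 3.0])
    print(spmv_csr(values, col_idx, row_ptr, x))   # [ 7.  4. 18.]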
Graphical processors for HEP trigger systems
NASA Astrophysics Data System (ADS)
Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Vicini, P.
2017-02-01
General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to employ GPUs as accelerators in offline computations. With the steady decrease of GPU latencies and the increase in link and memory throughputs, time is ripe for real-time applications using GPUs in high-energy physics data acquisition and trigger systems. We will discuss the use of online parallel computing on GPUs for synchronous low level trigger systems, focusing on tests performed on the trigger of the CERN NA62 experiment. Latencies of all components need analysing, networking being the most critical. To keep it under control, we envisioned NaNet, an FPGA-based PCIe Network Interface Card (NIC) enabling GPUDirect connection. Moreover, we discuss how specific trigger algorithms can be parallelised and thus benefit from a GPU implementation, in terms of increased execution speed. Such improvements are particularly relevant for the foreseen LHC luminosity upgrade where highly selective algorithms will be crucial to maintain sustainable trigger rates with very high pileup.
Lambda Data Grid: Communications Architecture in Support of Grid Computing
2006-12-21
number of paradigm shifts in the 20th century, including the growth of large geographically dispersed teams and the use of simulations and computational...get results. The work in this thesis automates the orchestration of networks with other resources, better utilizing all resources in a time-efficient...domains, over transatlantic links in around a minute. The main goal of this thesis is to build a new grid-computing paradigm that fully harnesses the
Scholl, Jacqueline; Klein-Flügge, Miriam
2017-09-28
Recent research in cognitive neuroscience has begun to uncover the processes underlying increasingly complex voluntary behaviours, including learning and decision-making. This success has been possible partly by progressing from simple experimental tasks to paradigms that incorporate more ecological features. More specifically, the premise is that to understand cognitions and brain functions relevant for real life, we need to introduce some of the ecological challenges that we have evolved to solve. This often entails an increase in task complexity, which can be managed by using computational models to help parse complex behaviours into specific component mechanisms. Here we propose that using computational models with tasks that capture ecologically relevant learning and decision-making processes may provide a critical advantage for capturing the mechanisms underlying symptoms of disorders in psychiatry. As a result, it may help develop mechanistic approaches towards diagnosis and treatment. We begin this review by mapping out the basic concepts and models of learning and decision-making. We then move on to consider specific challenges that emerge in realistic environments and describe how they can be captured by tasks. These include changes of context, uncertainty, reflexive/emotional biases, cost-benefit decision-making, and balancing exploration and exploitation. Where appropriate we highlight future or current links to psychiatry. We particularly draw examples from research on clinical depression, a disorder that greatly compromises motivated behaviours in real-life, but where simpler paradigms have yielded mixed results. Finally, we highlight several paradigms that could be used to help provide new insights into the mechanisms of psychiatric disorders. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
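A minimal example of the kind of computational model used to parse choice behaviour into component mechanisms is a delta-rule learner with a softmax decision rule on a two-armed bandit, sketched below; the learning rate alpha, inverse temperature beta and reward probabilities are illustrative values, not fitted parameters.

    import numpy as np

    rng = np.random.default_rng(1)

    def simulate(alpha=0.3, beta=5.0, p_reward=(0.8, 0.2), n_trials=200):
        """Delta-rule (Rescorla-Wagner) learning with softmax choice."""
        q = np.zeros(2)                         # learned values of the two options
        choices, rewards = [], []
        for _ in range(n_trials):
            p_choose_0 = 1.0 / (1.0 + np.exp(-beta * (q[0] - q[1])))   # softmax, 2 options
            c = 0 if rng.random() < p_choose_0 else 1
            r = float(rng.random() < p_reward[c])
            q[c] += alpha * (r - q[c])          # prediction-error update
            choices.append(c)
            rewards.append(r)
        return np.array(choices), np.array(rewards)

    choices, rewards = simulate()
    print("chose the better option on", np.mean(choices == 0), "of trials")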
A novel paradigm for telemedicine using the personal bio-monitor.
Bhatikar, Sanjay R; Mahajan, Roop L; DeGroff, Curt
2002-01-01
The foray of solid-state technology into the medical field has yielded an arsenal of sophisticated healthcare tools. Personal, portable computing power coupled with the information superhighway opens up the possibility of sophisticated healthcare management that will impact the medical field just as much. The full synergistic potential of three interwoven technologies: (1) compact electronics, (2) World Wide Web, and (3) Artificial Intelligence is yet to be realized. The system presented in this paper integrates these technologies synergistically, providing a new paradigm for healthcare. Our idea is to deploy internet-enabled, intelligent, handheld personal computers for medical diagnosis. The salient features of the 'Personal Bio-Monitor' we envisage are: (1) utilization of the peripheral signals of the body, which may be acquired non-invasively and with ease, for diagnosis of medical conditions; (2) an Artificial Neural Network (ANN) based approach for diagnosis; (3) configuration of the diagnostic device as a handheld for personal use; (4) internet connectivity, following the emerging Bluetooth protocol, for prompt conveyance of information to a patient's health care provider via the World Wide Web. The proposal is substantiated with an intelligent handheld device developed by the investigators for pediatric cardiac auscultation. This device performed accurate diagnoses of cardiac abnormalities in pediatrics using an artificial neural network to process heart sounds acquired by a low-frequency microphone and transmitted its diagnosis to a desktop PC via infrared. The idea of the personal bio-monitor presented here has the potential to streamline healthcare by optimizing two valuable resources: physicians' time and sophisticated equipment time. We show that the elements of such a system are in place, with our prototype. Our novel contribution is the synergistic integration of compact electronics technology, artificial neural network methodology and the wireless web, resulting in a revolutionary new paradigm for healthcare management.
Comparison of two paradigms for distributed shared memory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levelt, W.G.; Kaashoek, M.F.; Bal, H.E.
1990-08-01
The paper compares two paradigms for Distributed Shared Memory on loosely coupled computing systems: the shared data-object model as used in Orca, a programming language specially designed for loosely coupled computing systems, and the Shared Virtual Memory model. For both paradigms the authors have implemented two systems, one using only point-to-point messages, the other using broadcasting as well. They briefly describe these two paradigms and their implementations. Then they compare their performance on four applications: the traveling salesman problem, alpha-beta search, matrix multiplication and the all-pairs shortest paths problem. The measurements show that both paradigms can be used efficiently for programming large-grain parallel applications. Significant speedups were obtained on all applications. The unstructured Shared Virtual Memory paradigm achieves the best absolute performance, although this is largely due to the preliminary nature of the Orca compiler used. The structured shared data-object model achieves the highest speedups and is much easier to program and to debug.
The emergence of spatial cyberinfrastructure.
Wright, Dawn J; Wang, Shaowen
2011-04-05
Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge.
The emergence of spatial cyberinfrastructure
Wright, Dawn J.; Wang, Shaowen
2011-01-01
Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge. PMID:21467227
Prediction Based Proactive Thermal Virtual Machine Scheduling in Green Clouds
Kinger, Supriya; Kumar, Rajesh; Sharma, Anju
2014-01-01
Cloud computing has rapidly emerged as a widely accepted computing paradigm, but the research on Cloud computing is still at an early stage. Cloud computing provides many advanced features but it still has some shortcomings, such as relatively high operating cost and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperature of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated. PMID:24737962
a Non-Overlapping Discretization Method for Partial Differential Equations
NASA Astrophysics Data System (ADS)
Rosas-Medina, A.; Herrera, I.
2013-05-01
Mathematical models of many systems of interest, including very important continuous systems in engineering and science, lead to a great variety of partial differential equations whose solution methods are based on the computational processing of large-scale algebraic systems. Furthermore, the remarkable expansion of computational hardware and software has made problems of ever increasing diversity and complexity, posed by engineering and scientific applications, amenable to effective treatment. The emergence of parallel computing prompted a continued and systematic effort on the part of the computational-modeling community to harness it for solving boundary-value problems (BVPs) of partial differential equations. Very early in this effort it was recognized that domain decomposition methods (DDMs) were the most effective technique for applying parallel computing to the solution of partial differential equations, since such an approach drastically simplifies the coordination of the many processors that carry out the different tasks and also greatly reduces the requirements for information transmission between them. Ideally, DDMs intend to produce algorithms that fulfill the DDM paradigm; i.e., such that "the global solution is obtained by solving local problems defined separately in each subdomain of the coarse mesh (or domain decomposition)". Stated in a simplistic manner, the basic idea is that, when the DDM paradigm is satisfied, full parallelization can be achieved by assigning each subdomain to a different processor. When intensive DDM research began, much attention was given to overlapping DDMs, but attention soon shifted to non-overlapping DDMs. This evolution seems natural when the DDM paradigm is taken into account: it is easier to uncouple the local problems when the subdomains are separated. However, an important limitation of non-overlapping domain decompositions, as that concept is usually understood today, is that interface nodes are shared by two or more subdomains of the coarse mesh and, therefore, even non-overlapping DDMs are actually overlapping when seen from the perspective of the nodes used in the discretization. In this talk we present and discuss a discretization method in which the nodes used are non-overlapping, in the sense that each one of them belongs to one and only one subdomain of the coarse mesh.
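For orientation, the sketch below shows a conventional non-overlapping (substructuring) solve of a 1D Poisson problem, in which the subdomain interiors are eliminated by local solves and a small interface system is solved; note that the single interface node is shared by both subdomains, which is exactly the limitation the abstract points out. The talk's fully non-overlapping discretization itself is not reproduced here.

    import numpy as np

    # 1D Poisson problem -u'' = 1 on (0,1), u(0)=u(1)=0, standard 3-point stencil.
    n = 9                                   # interior nodes
    h = 1.0 / (n + 1)
    A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    f = np.ones(n)

    m = n // 2                              # index of the shared interface node
    I1 = np.arange(0, m)                    # left subdomain interior
    I2 = np.arange(m + 1, n)                # right subdomain interior
    G = np.array([m])                       # interface

    def blk(rows, cols):
        return A[np.ix_(rows, cols)]

    # Eliminate the subdomain interiors (local solves) to get the interface system.
    S = (blk(G, G)
         - blk(G, I1) @ np.linalg.solve(blk(I1, I1), blk(I1, G))
         - blk(G, I2) @ np.linalg.solve(blk(I2, I2), blk(I2, G)))
    g = (f[G]
         - blk(G, I1) @ np.linalg.solve(blk(I1, I1), f[I1])
         - blk(G, I2) @ np.linalg.solve(blk(I2, I2), f[I2]))
    u = np.empty(n)
    u[G] = np.linalg.solve(S, g)            # interface unknown
    u[I1] = np.linalg.solve(blk(I1, I1), f[I1] - blk(I1, G) @ u[G])   # local back-substitution
    u[I2] = np.linalg.solve(blk(I2, I2), f[I2] - blk(I2, G) @ u[G])

    print(np.allclose(u, np.linalg.solve(A, f)))   # True: matches the global direct solve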
ERIC Educational Resources Information Center
Hendriksen, Dan
It is important that we reflect on the conceptual framework from which our study of language has emerged, since the problems, methods, and aims of what has been called modern linguistics are rapidly being replaced by the concerns of another framework or paradigm. Such new paradigms, to be viable, must not be distorted by starting points that…
A PACS archive architecture supported on cloud services.
Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis
2012-05-01
Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on the storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have the financial resources or requirements that justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, is used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a Cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent of the cloud provider, and the core modules were successfully instantiated in examples of two cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the Cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to set up medical data repositories and can have a significant impact on healthcare institutions by reducing IT infrastructures.
Charting the expansion of strategic exploratory behavior during adolescence.
Somerville, Leah H; Sasse, Stephanie F; Garrad, Megan C; Drysdale, Andrew T; Abi Akar, Nadine; Insel, Catherine; Wilson, Robert C
2017-02-01
Although models of exploratory decision making implicate a suite of strategies that guide the pursuit of information, the developmental emergence of these strategies remains poorly understood. This study takes an interdisciplinary perspective, merging computational decision making and developmental approaches to characterize age-related shifts in exploratory strategy from adolescence to young adulthood. Participants were 149 12-28-year-olds who completed a computational explore-exploit paradigm that manipulated reward value, information value, and decision horizon (i.e., the utility that information holds for future choices). Strategic directed exploration, defined as information seeking selective for long time horizons, emerged during adolescence and maintained its level through early adulthood. This age difference was partially driven by adolescents valuing immediate reward over new information. Strategic random exploration, defined as stochastic choice behavior selective for long time horizons, was invoked at comparable levels over the age range, and predicted individual differences in attitudes toward risk taking in daily life within the adolescent portion of the sample. Collectively, these findings reveal an expansion of the diversity of strategic exploration over development, implicate distinct mechanisms for directed and random exploratory strategies, and suggest novel mechanisms for adolescent-typical shifts in decision making. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
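As a rough illustration of the two strategies named above, the sketch below models directed exploration as a horizon-gated information bonus and random exploration as horizon-dependent decision noise. The functional form and parameter values are assumptions for illustration, not the authors' fitted model.

```python
# Illustrative sketch (not the authors' fitted model): choice between two options
# where directed exploration is an information bonus and random exploration is
# decision noise, both allowed to grow with the decision horizon.
import math

def p_choose_option1(mean1, mean2, info1, info2, horizon,
                     bonus=2.0, noise_short=1.0, noise_long=3.0):
    """Probability of choosing option 1.
    mean1/mean2: estimated rewards; info1/info2: 1 if the option is less known, else 0;
    horizon: 1 (one choice left) or 6 (many choices left), as in horizon-style tasks."""
    long_horizon = horizon > 1
    directed = bonus * (info1 - info2) if long_horizon else 0.0   # information bonus
    noise = noise_long if long_horizon else noise_short           # random exploration
    dv = (mean1 - mean2) + directed
    return 1.0 / (1.0 + math.exp(-dv / noise))

# With equal means, the less-known option is preferred only when the horizon is long:
print(p_choose_option1(50, 50, info1=1, info2=0, horizon=6))  # > 0.5
print(p_choose_option1(50, 50, info1=1, info2=0, horizon=1))  # = 0.5
```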
Bioinspired architecture approach for a one-billion transistor smart CMOS camera chip
NASA Astrophysics Data System (ADS)
Fey, Dietmar; Komann, Marcus
2007-05-01
In the paper we present a massively parallel VLSI architecture for future smart CMOS camera chips with up to one billion transistors. Traditional parallel architectures oriented toward central structures and based on MIMD or SIMD approaches will fail to exploit efficiently the potential offered by future micro- or nanoelectronic devices: they require too many and too long global interconnects for the distribution of code or the access to common memory. Nature, on the other hand, has developed self-organising and emergent principles that successfully manage complex structures built from many interacting simple elements. We therefore developed a new emergent computing paradigm, denoted Marching Pixels, based on a mixture of bio-inspired computing models such as cellular automata and artificial ants. In the paper we present different Marching Pixels algorithms and the corresponding VLSI array architecture. A detailed synthesis result for a 0.18 μm CMOS process shows that a 256×256 pixel image is processed in less than 10 ms, assuming a moderate 100 MHz clock rate for the processor array. Future higher integration densities and 3D chip-stacking technology will allow the integration and processing of megapixel images within the same time, since our architecture is fully scalable.
ERIC Educational Resources Information Center
Berrell, Michael M.; Macpherson, R. J. S.
1995-01-01
Traces the different paradigmatic pathways followed by educational sociology and educational administration. Educational sociology has followed ideostructural, interpretive, and psychosocial paradigms, with emergent holistic critical perspectives and sociobiological materialism. Educational administration has had one dominant tradition,…
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput, and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open-architecture embodiment of KeyWare enables a wide selection of client-node computing platforms, operating systems, transport protocols, radio modems, and infrastructures while maintaining application portability.
Cloud@Home: A New Enhanced Computing Paradigm
NASA Astrophysics Data System (ADS)
Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco
Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)) and Green computing (a new frontier of ethical computing starting from the assumption that in the near future energy costs will be related to environmental pollution).
Quantum computers: Definition and implementations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perez-Delgado, Carlos A.; Kok, Pieter
The DiVincenzo criteria for implementing a quantum computer have been seminal in focusing both experimental and theoretical research in quantum-information processing. These criteria were formulated specifically for the circuit model of quantum computing. However, several new models for quantum computing (paradigms) have been proposed that do not seem to fit the criteria well. Therefore, the question is what are the general criteria for implementing quantum computers. To this end, a formal operational definition of a quantum computer is introduced. It is then shown that, according to this definition, a device is a quantum computer if it obeys the following criteria: Any quantum computer must consist of a quantum memory, with an additional structure that (1) facilitates a controlled quantum evolution of the quantum memory; (2) includes a method for information theoretic cooling of the memory; and (3) provides a readout mechanism for subsets of the quantum memory. The criteria are met when the device is scalable and operates fault tolerantly. We discuss various existing quantum computing paradigms and how they fit within this framework. Finally, we present a decision tree for selecting an avenue toward building a quantum computer. This is intended to help experimentalists determine the most natural paradigm given a particular physical implementation.
Federated and Cloud Enabled Resources for Data Management and Utilization
NASA Astrophysics Data System (ADS)
Rankin, R.; Gordon, M.; Potter, R. G.; Satchwill, B.
2011-12-01
The emergence of cloud computing over the past three years has led to a paradigm shift in how data can be managed, processed and made accessible. Building on the federated data management system offered through the Canadian Space Science Data Portal (www.cssdp.ca), we demonstrate how heterogeneous and geographically distributed data sets and modeling tools have been integrated to form a virtual data center and computational modeling platform that has services for data processing and visualization embedded within it. We also discuss positive and negative experiences in utilizing Eucalyptus and OpenStack cloud applications, and job scheduling facilitated by Condor and Star Cluster. We summarize our findings by demonstrating use of these technologies in the Cloud Enabled Space Weather Data Assimilation and Modeling Platform CESWP (www.ceswp.ca), which is funded through Canarie's (canarie.ca) Network Enabled Platforms program in Canada.
EHR standards--A comparative study.
Blobel, Bernd; Pharow, Peter
2006-01-01
To ensure the quality and efficiency of patient care, the care paradigm is moving from organization-centered, through process-controlled, towards personal care. Such a health-system paradigm change leads to new paradigms for analyzing, designing, implementing and deploying supporting health information systems, including EHR systems as the core application in a distributed eHealth environment. The paper defines the architectural paradigm for future-proof EHR systems. It compares advanced EHR architectures, referencing them against the Generic Component Model. The paper introduces the evolving paradigm of autonomic computing for self-organizing health information systems.
NASA Astrophysics Data System (ADS)
Ding, Hongxia; Chen, Shangbin; Zeng, Shuai; Zeng, Shaoqun; Liu, Qian; Luo, Qingming
2008-12-01
Spreading depression (SD) manifests as a propagating suppression of electrical activity, and is related to migraine and focal cerebral ischaemia. The putative mechanism of SD is the reaction-diffusion hypothesis involving potassium ions. Inspired in part by optical imaging of the collision of two SD waves, we aimed to show, by experimental and computational study, a merged and enlarged wavefront rather than annihilation during collision. This paper modifies the bistable equation with recovery established by Reggia et al. to compute and visualize SD. First, the tissue medium of SD was assumed to be a one-dimensional continuum. The Crank-Nicolson method was used to solve the modified equations with the recovery term. The computational results were then extended to two-dimensional space by symmetry. An individual SD was visualized as a concentric wave initiating from the stimulation point. The mergence, but not annihilation, of two colliding SD waves was demonstrated. In addition, the dependence of SD dynamics on the model parameters was studied and presented. The results ally SD with the emerging concept of volume transmission. This work not only supplies a paradigm for computing and visualizing SD but also serves as a tool for exploring the mechanisms of SD.
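To illustrate the numerical approach named above, here is a minimal sketch of a 1D bistable reaction-diffusion system with a recovery variable, where diffusion is advanced with Crank-Nicolson and the nonlinear terms are treated explicitly. All coefficients and the exact kinetics are illustrative assumptions, not the modified Reggia et al. model.

```python
# Minimal semi-implicit Crank-Nicolson sketch for a generic bistable
# reaction-diffusion equation with recovery (illustrative parameters only).
import numpy as np

n, dx, dt, steps = 200, 0.5, 0.05, 2000
D, a, eps, gamma = 1.0, 0.1, 0.01, 2.0

u = np.zeros(n); u[:10] = 1.0         # local "stimulation" at the left edge
w = np.zeros(n)                       # recovery variable

# Crank-Nicolson matrices for the diffusion operator (no-flux boundaries)
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / dx**2
L[0, 1] = L[-1, -2] = 2.0 / dx**2     # reflecting (ghost-point) boundaries
I = np.eye(n)
A_impl = I - 0.5 * dt * D * L
A_expl = I + 0.5 * dt * D * L

for _ in range(steps):
    react = u * (1.0 - u) * (u - a) - w           # bistable kinetics minus recovery
    u = np.linalg.solve(A_impl, A_expl @ u + dt * react)
    w = w + dt * eps * (u - gamma * w)            # slow recovery variable

print("excited fraction:", float((u > 0.5).mean()))
```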
Information processing as a paradigm for decision making.
Oppenheimer, Daniel M; Kelso, Evan
2015-01-03
For decades, the dominant paradigm for studying decision making--the expected utility framework--has been burdened by an increasing number of empirical findings that question its validity as a model of human cognition and behavior. However, as Kuhn (1962) argued in his seminal discussion of paradigm shifts, an old paradigm cannot be abandoned until a new paradigm emerges to replace it. In this article, we argue that the recent shift in researcher attention toward basic cognitive processes that give rise to decision phenomena constitutes the beginning of that replacement paradigm. Models grounded in basic perceptual, attentional, memory, and aggregation processes have begun to proliferate. The development of this new approach closely aligns with Kuhn's notion of paradigm shift, suggesting that this is a particularly generative and revolutionary time to be studying decision science.
On-chip phase-change photonic memory and computing
NASA Astrophysics Data System (ADS)
Cheng, Zengguang; Ríos, Carlos; Youngblood, Nathan; Wright, C. David; Pernice, Wolfram H. P.; Bhaskaran, Harish
2017-08-01
The use of photonics in computing is a hot topic of interest, driven by the need for ever-increasing speed along with reduced power consumption. In existing computing architectures, photonic data storage would dramatically improve performance by reducing the latencies associated with electrical memories. At the same time, the rise of 'big data' and 'deep learning' is driving the quest for non-von Neumann and brain-inspired computing paradigms. To succeed in both aspects, we have demonstrated non-volatile multi-level photonic memory that avoids the von Neumann bottleneck of the existing computing paradigm, and a photonic synapse resembling biological synapses for brain-inspired computing, using phase-change materials (Ge2Sb2Te5).
ECOSYSTEM MANAGEMENT: DESPERATELY SEEKING A PARADIGM
Two competing views of ecosystem management have emerged. One is that ecosystem management is another stage in the continual evolution of the basic management paradigm - one that natural resource managers have followed in North America for a century. The other view is that ecosys...
Relate@IU>>>Share@IU: A New and Different Computer-Based Communications Paradigm.
ERIC Educational Resources Information Center
Frick, Theodore W.; Roberto, Joseph; Korkmaz, Ali; Oh, Jeong-En; Twal, Riad
The purpose of this study was to examine problems with the current computer-based electronic communication systems and to initially test and revise a new and different paradigm for e-collaboration, Relate@IU. Understanding the concept of sending links to resources, rather than sending the resource itself, is at the core of how Relate@IU differs…
An engineering paradigm in the biomedical sciences: Knowledge as epistemic tool.
Boon, Mieke
2017-10-01
In order to deal with the complexity of biological systems and attempts to generate applicable results, current biomedical sciences are adopting concepts and methods from the engineering sciences. Philosophers of science have interpreted this as the emergence of an engineering paradigm, in particular in systems biology and synthetic biology. This article aims at the articulation of the supposed engineering paradigm by contrast with the physics paradigm that supported the rise of biochemistry and molecular biology. This articulation starts from Kuhn's notion of a disciplinary matrix, which indicates what constitutes a paradigm. It is argued that the core of the physics paradigm is its metaphysical and ontological presuppositions, whereas the core of the engineering paradigm is the epistemic aim of producing useful knowledge for solving problems external to the scientific practice. Therefore, the two paradigms involve distinct notions of knowledge. Whereas the physics paradigm entails a representational notion of knowledge, the engineering paradigm involves the notion of 'knowledge as epistemic tool'. Copyright © 2017 Elsevier Ltd. All rights reserved.
Application of two neural network paradigms to the study of voluntary employee turnover.
Somers, M J
1999-04-01
Two neural network paradigms--multilayer perceptron and learning vector quantization--were used to study voluntary employee turnover with a sample of 577 hospital employees. The objectives of the study were twofold. The 1st was to assess whether neural computing techniques offered greater predictive accuracy than did conventional turnover methodologies. The 2nd was to explore whether computer models of turnover based on neural network technologies offered new insights into turnover processes. When compared with logistic regression analysis, both neural network paradigms provided considerably more accurate predictions of turnover behavior, particularly with respect to the correct classification of leavers. In addition, these neural network paradigms captured nonlinear relationships that are relevant for theory development. Results are discussed in terms of their implications for future research.
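The general workflow of comparing a neural-network classifier with logistic regression on a binary "stay vs. leave" outcome can be sketched as below. The data are synthetic and the feature names are assumptions; this is not the 577-employee hospital sample or the original models.

```python
# Illustrative sketch on synthetic data: multilayer perceptron vs. logistic
# regression on a binary turnover-style outcome with a nonlinear decision rule
# that a linear model cannot fully capture.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 2))                     # e.g., satisfaction, commitment
y = ((X[:, 0] ** 2 - X[:, 1]) > 0.5).astype(int)  # nonlinear rule -> "leaver"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

logit = LogisticRegression().fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

print("logistic regression accuracy:", logit.score(X_te, y_te))
print("multilayer perceptron accuracy:", mlp.score(X_te, y_te))
print(classification_report(y_te, mlp.predict(X_te), target_names=["stayer", "leaver"]))
```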
Symmetry and Asymmetry: New Contours, Paradigms, and Politics in African Academic Partnerships
ERIC Educational Resources Information Center
Obamba, Milton Odhiambo; Mwema, Jane Kimbwarata
2009-01-01
International partnership spanning various organizational and geographical boundaries has emerged as the dominant paradigm for organizing modern scientific research; and for undertaking international development policy. Academic collaboration has become ubiquitous, embedded in organizational cultures, and is increasingly organized in a wide…
Emergent Behavior in the Macro World: Rigidity of Granular Solids
NASA Astrophysics Data System (ADS)
Chakraborty, Bulbul
2015-03-01
Diversity in the natural world emerges from the collective behavior of large numbers of interacting objects. The origin of collectively organized structures over the vast range of length scales from the subatomic to colloidal is the competition between energy and entropy. Thermal motion provides the mechanism for organization by allowing particles to explore the space of configurations. This well-established paradigm of emergent behavior breaks down for collections of macroscopic objects ranging from grains of sand to asteroids. In this macro-world of particulate systems, thermal motion is absent, and mechanical forces are all important. We lack understanding of the basic, unifying principles that underlie the emergence of order in this world. In this talk, I will explore the origin of rigidity of granular solids, and present a new paradigm for emergence of order in these athermal systems. This work has been supported by NSF-DMR 1409093 and by the W. M. Keck foundation
New paradigm for drug developments--from emerging market statistical perspective.
Quan, Hui; Chen, Xun; Zhang, Ji; Zhao, Peng-Liang
2013-11-01
The paradigm for new drug development has changed dramatically over the last decade. Even though new technology increases efficiency in many respects, it now actually takes longer and costs more to develop a new drug, partly because of much more stringent regulatory requirements. To deal with this challenge, some initiatives have been taken by the pharmaceutical industry. These initiatives include exploring emerging markets, conducting global trials and building research and development centers in emerging markets to curb spending. A notable current trend is that major pharmaceutical companies offshore a part of their biostatistical support to China. In this paper, we first discuss the skill set for trial statisticians in the new era. We then elaborate on some of the approaches for acquiring statistical talent and capacity in general, particularly in emerging markets. We also make some recommendations on the use of the PDUFA strategy and collaborations among industry, health authorities and academia from an emerging market statistical perspective. © 2013.
A perspective of adaptation in healthcare.
Mezghani, Emna; Da Silveira, Marcos; Pruski, Cédric; Exposito, Ernesto; Drira, Khalil
2014-01-01
Emerging new technologies in healthcare have shown great promise for managing patient care. In recent years, the evolution of Information and Communication Technologies has pushed many research studies to consider treatment plan adaptation in this area. The main goal is to accelerate decision making by dynamically generating new treatments in response to unexpected situations. This paper portrays treatment adaptation from a new perspective inspired by the human nervous system, namely autonomic computing. The selected potential studies are thus classified according to the maturity levels of this paradigm. To guarantee optimal and accurate treatment adaptation, challenges related to medical knowledge and data are identified, and future directions to be explored in healthcare systems are discussed.
Automated inspection of turbine blades: Challenges and opportunities
NASA Technical Reports Server (NTRS)
Mehta, Manish; Marron, Joseph C.; Sampson, Robert E.; Peace, George M.
1994-01-01
Current inspection methods for complex shapes and contours exemplified by aircraft engine turbine blades are expensive, time-consuming and labor intensive. The logistics support of new manufacturing paradigms such as integrated product-process development (IPPD) for current and future engine technology development necessitates high speed, automated inspection of forged and cast jet engine blades, combined with a capability of retaining and retrieving metrology data for process improvements upstream (designer-level) and downstream (end-user facilities) at commercial and military installations. The paper presents the opportunities emerging from a feasibility study conducted using 3-D holographic laser radar in blade inspection. Requisite developments in computing technologies for systems integration of blade inspection in production are also discussed.
Text-based CAPTCHAs over the years
NASA Astrophysics Data System (ADS)
Chow, Y. W.; Susilo, W.
2017-11-01
The notion of CAPTCHAs has been around for more than two decades. Since its introduction, CAPTCHAs have now become a ubiquitous part of the Internet. Over the years, research on various aspects of CAPTCHAs has evolved and different design principles have emerged. This article discusses text-based CAPTCHAs in terms of their fundamental requirements, namely, security and usability. Practicality necessitates that humans must be able to correctly solve CAPTCHA challenges, while at the same time automated computer programs should have difficulty solving the challenges. This article also presents alternative paradigms to text-based CAPTCHA design that have been examined in previous work. With the advances in techniques to defeat CAPTCHAs, the future of automated Turing tests is an open question.
On emergence: a neo-psychoanalytic essay on change and science.
Whitehead, Clay C
2011-01-01
The neo-psychoanalytic paradigm re-establishes the connection between psychodynamics and evolution. This allows us to transcend the limitations of dualistic metapsychology, and to make seminal contributions to traditional science. The new paradigm employs the concept of emergence, the potential for change in the evolutionary and clinical process. Emergence is described as originating with the Big Bang, but also is reflected at much higher levels, for example, biochemistry, or the capacity of the evolved mind to produce insights in psychotherapy. The constraints of dualistic theories are examined. A neuron-based view of change illustrates the evolution of traditional science as well as the neuron, itself. The new mind paradigm recognizes individual, familial, communitarian, and global reciprocal influences mediated by culture and illustrated by the extended mind and the democratic spirit. Thus both traditional and psychodynamic sciences are undergoing revolutionary changes in their common efforts to better understand the mechanisms of knowledge, relationship and consciousness. The boundaries of the self and the consultation suite are also expanded in this view. Following a survey of invagination, the work is concluded by an application of emergence theory to the creationist controversy and Freud's views of religion.
NASA Astrophysics Data System (ADS)
Hill, L. C.
1999-12-01
The emergence of the largely silicate earth from a presumably cosmically normal, H-rich solar nebula 4.5 eons ago is an obviously important issue relevant to many disciplines of the physical sciences. The emergence of terrestrial life is an equally important issue for the biological sciences. Recent discoveries of isotopically light carbon (i.e. putative chemical fossils) in 3.85+ Ga Isua, Greenland sediments have reopened the issue of whether terrestrial life may have emerged prior to the earliest known rocks so that one might use biological records to deduce early terrestrial environments. In addition, recent advances in molecular genetics have suggested that all known ancestral life forms passed through an early hydrogen-rich environment which is more consistent with the now rejected Urey hypothesis of an early jovian atmosphere than with contemporary geological and planetological paradigms. In this essay, then, we examine possible limitations of contemporary paradigms of planetary science since a prima facie case will be made that life could not emerge in those environments which those paradigms now allow. Of necessity, the discussion will also address some hidden conflicts embedded in various disciplinary methodologies (e.g. astronomy, biology, geology).
Performance improvement of ERP-based brain-computer interface via varied geometric patterns.
Ma, Zheng; Qiu, Tianshuang
2017-12-01
Recently, many studies have been focusing on optimizing the stimulus of an event-related potential (ERP)-based brain-computer interface (BCI). However, little is known about the effect of increasing the stimulus unpredictability. We investigated a new stimulus type of varied geometric pattern in which both the complexity and the unpredictability of the stimulus are increased. The proposed and classical paradigms were compared in within-subject experiments with 16 healthy participants. Results showed that the BCI performance was significantly improved for the proposed paradigm, with an average online written symbol rate increasing by 138% compared with that of the classical paradigm. Amplitudes of primary ERP components, such as N1, P2a, P2b, N2, were also found to be significantly enhanced with the proposed paradigm. In this paper, a novel ERP BCI paradigm with a new stimulus type of varied geometric pattern is proposed. By jointly increasing the complexity and unpredictability of the stimulus, the performance of an ERP BCI could be considerably improved.
A Tutoring and Student Modelling Paradigm for Gaming Environments.
ERIC Educational Resources Information Center
Burton, Richard R.; Brown, John Seely
This paper describes a paradigm for tutorial systems capable of automatically providing feedback and hints in a game environment. The paradigm is illustrated by a tutoring system for the PLATO game "How the West Was Won." The system uses a computer-based "Expert" player to evaluate a student's moves and construct a "differential model" of the…
ERIC Educational Resources Information Center
Muwanga-Zake, J. W. F.
2009-01-01
This paper discusses how Ubuntu as a philosophy and a methodology was used among Bantu in South Africa together with participative Western paradigms in evaluating an educational computer game. The paper argues that research among Bantu has to articulate research experiences within Ubuntu paradigms if valid outcomes are to be realised. (Contains 1…
GPU-based Parallel Application Design for Emerging Mobile Devices
NASA Astrophysics Data System (ADS)
Gupta, Kshitij
A revolution is underway in the computing world that is causing a fundamental paradigm shift in device capabilities and form-factor, with a move from well-established legacy desktop/laptop computers to mobile devices in varying sizes and shapes. Amongst all the tasks these devices must support, graphics has emerged as the 'killer app' for providing a fluid user interface and high-fidelity game rendering, effectively making the graphics processor (GPU) one of the key components in (present and future) mobile systems. By utilizing the GPU as a general-purpose parallel processor, this dissertation explores the GPU computing design space from an applications standpoint, in the mobile context, by focusing on key challenges presented by these devices---limited compute, memory bandwidth, and stringent power consumption requirements---while improving the overall application efficiency of the increasingly important speech recognition workload for mobile user interaction. We broadly partition trends in GPU computing into four major categories. We analyze hardware and programming model limitations in current-generation GPUs and detail an alternate programming style called Persistent Threads, identify four use case patterns, and propose minimal modifications that would be required for extending native support. We show how, by manually extracting data locality and altering the speech recognition pipeline, we are able to achieve significant savings in memory bandwidth while simultaneously reducing the compute burden on GPU-like parallel processors. As we foresee GPU computing to evolve from its current 'co-processor' model into an independent 'applications processor' that is capable of executing complex work independently, we create an alternate application framework that enables the GPU to handle all control-flow dependencies autonomously at run-time while minimizing host involvement to just issuing commands, which facilitates an efficient application implementation. Finally, as compute and communication capabilities of mobile devices improve, we analyze energy implications of processing speech recognition locally (on-chip) and offloading it to servers (in-cloud).
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
Comparing the OpenMP, MPI, and Hybrid Programming Paradigm on an SMP Cluster
NASA Technical Reports Server (NTRS)
Jost, Gabriele; Jin, Haoqiang; anMey, Dieter; Hatay, Ferhat F.
2003-01-01
With the advent of parallel hardware and software technologies users are faced with the challenge to choose a programming paradigm best suited for the underlying computer architecture. With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors (SMP), parallel programming techniques have evolved to support parallelism beyond a single level. Which programming paradigm is the best will depend on the nature of the given problem, the hardware architecture, and the available software. In this study we will compare different programming paradigms for the parallelization of a selected benchmark application on a cluster of SMP nodes. We compare the timings of different implementations of the same CFD benchmark application employing the same numerical algorithm on a cluster of Sun Fire SMP nodes. The rest of the paper is structured as follows: In section 2 we briefly discuss the programming models under consideration. We describe our compute platform in section 3. The different implementations of our benchmark code are described in section 4 and the performance results are presented in section 5. We conclude our study in section 6.
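The hybrid idea of processes across nodes plus threads within a node can be sketched very loosely as follows. This is only a toy illustration of the concept using mpi4py and a thread pool, not the CFD benchmark or the Sun Fire study above; the file name and worker counts are assumptions.

```python
# Minimal sketch of the hybrid paradigm: MPI ranks split the work across nodes,
# while a thread pool exploits shared memory within each rank (mirroring the
# MPI + OpenMP combination). Run with, e.g.: mpiexec -n 4 python hybrid_sketch.py
from concurrent.futures import ThreadPoolExecutor
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

data = np.arange(1_000_000, dtype=np.float64)
chunk = np.array_split(data, size)[rank]          # "MPI level": one chunk per rank

def partial_sum(a):                               # "threading level" within a rank
    return float(np.sum(a))

with ThreadPoolExecutor(max_workers=4) as pool:
    local = sum(pool.map(partial_sum, np.array_split(chunk, 4)))

total = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print("global sum:", total)
```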
Synergetics in Science and Education
ERIC Educational Resources Information Center
Steklova, I.
2004-01-01
The natural crisis in contemporary culture, conditioned by the emergence of a new cultural paradigm, makes it essential to look for methodological and theoretical foundations of a possible new scientific paradigm, one closely linked to issues in education. In this article, the author presents basic conditions for the self-organization and…
Emerging Paradigms for Applied Drama and Theatre Practice in African Contexts
ERIC Educational Resources Information Center
Chinyowa, Kennedy C.
2009-01-01
The prevailing tendency in applied drama and theatre research and practice in African contexts has been for both critics and practitioners to apply the Freirian educational paradigm of "codification" and "decodification" in the interpretation of their work. Guarav Desai asserts that most of the theoretical premises of applied…
Victims of Domestic Violence and Front-Line Workers: A Helping Paradigm
ERIC Educational Resources Information Center
Peters, Scott W.; Trepal, Heather C.; de Vries, Sabina M.; Day, Sally W.; Leeth, Christopher
2009-01-01
Victims of domestic violence present a challenge to law enforcement and emergency room personnel. The authors propose a helping approach to assist these professionals. This paradigm is composed of: active and empathetic listening, acceptance without judgment, identifying victims' strengths, honoring victims as experts, and the process of leaving…
The paradigm compiler: Mapping a functional language for the connection machine
NASA Technical Reports Server (NTRS)
Dennis, Jack B.
1989-01-01
The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.
A study on strategic provisioning of cloud computing services.
Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah
2014-01-01
Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics, can be guaranteed. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.
Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective
Mattout, Jérémie
2012-01-01
A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291
NASA Astrophysics Data System (ADS)
Cunningham, Sally Jo
The current crop of digital libraries for the computing community is strongly grounded in the conventional library paradigm: they provide indexes to support searching of collections of research papers. As such, these digital libraries are relatively impoverished; the present computing digital libraries omit many of the documents and resources that are currently available to computing researchers, and offer few browsing structures. These computing digital libraries were built 'top down': the resources and collection contents are forced to fit an existing digital library architecture. A 'bottom up' approach to digital library development would begin with an investigation of a community's information needs and available documents, and then design a library to organize those documents in such a way as to fulfill the community's needs. The 'home grown', informal information resources developed by and for the machine learning community are examined as a case study, to determine the types of information and document organizations 'native' to this group of researchers. The insights gained in this type of case study can be used to inform construction of a digital library tailored to this community.
A new paradigm for atomically detailed simulations of kinetics in biophysical systems.
Elber, Ron
2017-01-01
The kinetics of biochemical and biophysical events determine the course of life processes and have attracted considerable interest and research. For example, modeling of biological networks and cellular responses relies on the availability of information on rate coefficients. Atomically detailed simulations hold the promise of supplementing experimental data to obtain a more complete kinetic picture. However, simulations at biological time scales are challenging. Typical computer resources are insufficient to provide the ensemble of trajectories of the required length for straightforward calculations of time scales. In recent years, new technologies have emerged that make atomically detailed simulations of rate coefficients possible. Instead of computing complete trajectories from reactants to products, these approaches launch a large number of short trajectories at different positions. Since the trajectories are short, they are computed trivially in parallel on modern computer architectures. The starting and termination positions of the short trajectories are chosen, following statistical mechanics theory, to enhance efficiency. These trajectories are then analyzed. The analysis produces accurate estimates of time scales as long as hours. The theory of Milestoning, which exploits the use of short trajectories, is discussed, and several applications are described.
The FuturICT education accelerator
NASA Astrophysics Data System (ADS)
Johnson, J.; Buckingham Shum, S.; Willis, A.; Bishop, S.; Zamenopoulos, T.; Swithenby, S.; MacKay, R.; Merali, Y.; Lorincz, A.; Costea, C.; Bourgine, P.; Louçã, J.; Kapenieks, A.; Kelley, P.; Caird, S.; Bromley, J.; Deakin Crick, R.; Goldspink, C.; Collet, P.; Carbone, A.; Helbing, D.
2012-11-01
Education is a major force for economic and social wellbeing. Despite high aspirations, education at all levels can be expensive and ineffective. Three Grand Challenges are identified: (1) enable people to learn orders of magnitude more effectively, (2) enable people to learn at orders of magnitude less cost, and (3) demonstrate success by exemplary interdisciplinary education in complex systems science. A ten year 'man-on-the-moon' project is proposed in which FuturICT's unique combination of Complexity, Social and Computing Sciences could provide an urgently needed transdisciplinary language for making sense of educational systems. In close dialogue with educational theory and practice, and grounded in the emerging data science and learning analytics paradigms, this will translate into practical tools (both analytical and computational) for researchers, practitioners and leaders; generative principles for resilient educational ecosystems; and innovation for radically scalable, yet personalised, learner engagement and assessment. The proposed Education Accelerator will serve as a 'wind tunnel' for testing these ideas in the context of real educational programmes, with an international virtual campus delivering complex systems education exploiting the new understanding of complex, social, computationally enhanced organisational structure developed within FuturICT.
An imperialist competitive algorithm for virtual machine placement in cloud computing
NASA Astrophysics Data System (ADS)
Jamali, Shahram; Malektaji, Sepideh; Analoui, Morteza
2017-05-01
Cloud computing, the recently emerged revolution in the IT industry, is empowered by virtualisation technology. In this paradigm, the user's applications run over some virtual machines (VMs). The process of selecting proper physical machines to host these virtual machines is called virtual machine placement. It plays an important role in the resource utilisation and power efficiency of a cloud computing environment. In this paper, we propose an imperialist competitive-based algorithm for the virtual machine placement problem called ICA-VMPLC. The base optimisation algorithm is chosen to be ICA because of its ease in neighbourhood movement, good convergence rate and suitable terminology. The proposed algorithm investigates the search space in a unique manner to efficiently obtain an optimal placement solution that simultaneously minimises power consumption and total resource wastage. Its final solution performance is compared with several existing methods, such as grouping genetic and ant colony-based algorithms, as well as a bin packing heuristic. The simulation results show that the proposed method is superior to the other tested algorithms in terms of power consumption, resource wastage, CPU usage efficiency and memory usage efficiency.
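A placement metaheuristic of this kind repeatedly scores candidate placements against the two objectives named above. The sketch below shows one plausible cost function; the linear power model, the wastage formula and the weighting are common simplifications assumed for illustration, not necessarily the exact cost functions of ICA-VMPLC.

```python
# Illustrative sketch of evaluating a candidate VM placement: power consumption
# of active hosts plus total resource wastage (imbalanced leftover capacity).
def placement_cost(placement, vms, hosts, p_idle=160.0, p_max=300.0):
    """placement[i] = index of the host assigned to VM i.
    vms: list of (cpu, mem) demands in [0, 1]; hosts: list of (cpu, mem) capacities."""
    power, wastage = 0.0, 0.0
    for h, (cpu_cap, mem_cap) in enumerate(hosts):
        cpu = sum(vms[i][0] for i, host in enumerate(placement) if host == h)
        mem = sum(vms[i][1] for i, host in enumerate(placement) if host == h)
        if cpu == 0 and mem == 0:
            continue                                  # idle hosts are switched off
        if cpu > cpu_cap or mem > mem_cap:
            return float("inf")                       # infeasible placement
        util = cpu / cpu_cap
        power += p_idle + (p_max - p_idle) * util     # linear power model
        leftover_cpu, leftover_mem = cpu_cap - cpu, mem_cap - mem
        wastage += abs(leftover_cpu - leftover_mem)   # imbalanced leftovers are wasted
    return power + 100.0 * wastage                    # weighted combination

vms = [(0.3, 0.2), (0.4, 0.5), (0.2, 0.3)]
hosts = [(1.0, 1.0), (1.0, 1.0)]
print(placement_cost([0, 0, 1], vms, hosts))          # spread across two hosts
print(placement_cost([0, 0, 0], vms, hosts))          # consolidated on one host
```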
Optimization of Sparse Matrix-Vector Multiplication on Emerging Multicore Platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Samuel; Oliker, Leonid; Vuduc, Richard
2008-10-16
We are witnessing a dramatic change in computer architecture due to the multicore paradigm shift, as every electronic device from cell phones to supercomputers confronts parallelism of unprecedented scale. To fully unleash the potential of these systems, the HPC community must develop multicore specific-optimization methodologies for important scientific computations. In this work, we examine sparse matrix-vector multiply (SpMV) - one of the most heavily used kernels in scientific computing - across a broad spectrum of multicore designs. Our experimental platform includes the homogeneous AMD quad-core, AMD dual-core, and Intel quad-core designs, the heterogeneous STI Cell, as well as one of the first scientific studies of the highly multithreaded Sun Victoria Falls (a Niagara2 SMP). We present several optimization strategies especially effective for the multicore environment, and demonstrate significant performance improvements compared to existing state-of-the-art serial and parallel SpMV implementations. Additionally, we present key insights into the architectural trade-offs of leading multicore design strategies, in the context of demanding memory-bound numerical algorithms.
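For readers unfamiliar with the kernel itself, the following is a plain serial reference for SpMV in the common compressed sparse row (CSR) format. It is only meant to make the memory-bound, indirect-access nature of the kernel concrete; it is not one of the multicore-optimized variants (blocking, prefetching, NUMA-aware allocation, etc.) studied in the work above.

```python
# A minimal CSR sparse matrix-vector multiply (serial reference implementation).
import numpy as np

def spmv_csr(values, col_idx, row_ptr, x):
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(row_ptr) - 1):           # one output element per row
        for jj in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[jj] * x[col_idx[jj]] # indirect access -> memory bound
    return y

# 3x3 example:  [[4, 0, 1],
#                [0, 3, 0],
#                [2, 0, 5]]
values  = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
col_idx = np.array([0, 2, 1, 0, 2])
row_ptr = np.array([0, 2, 3, 5])
x = np.array([1.0, 2.0, 3.0])
print(spmv_csr(values, col_idx, row_ptr, x))    # expected: [ 7.  6. 17.]
```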
Intelligence-Augmented Rat Cyborgs in Maze Solving.
Yu, Yipeng; Pan, Gang; Gong, Yongyue; Xu, Kedi; Zheng, Nenggan; Hua, Weidong; Zheng, Xiaoxiang; Wu, Zhaohui
2016-01-01
Cyborg intelligence is an emerging kind of intelligence paradigm. It aims to deeply integrate machine intelligence with biological intelligence by connecting machines and living beings via neural interfaces, enhancing strength by combining the biological cognition capability with the machine computational capability. Cyborg intelligence is considered to be a new way to augment living beings with machine intelligence. In this paper, we build rat cyborgs to demonstrate how they can expedite the maze escape task with integration of machine intelligence. We compare the performance of maze solving by computer, by individual rats, and by computer-aided rats (i.e. rat cyborgs). They were asked to find their way from a constant entrance to a constant exit in fourteen diverse mazes. Performance of maze solving was measured by steps, coverage rates, and time spent. The experimental results with six rats and their intelligence-augmented rat cyborgs show that rat cyborgs have the best performance in escaping from mazes. These results provide a proof-of-principle demonstration for cyborg intelligence. In addition, our novel cyborg intelligent system (rat cyborg) has great potential in various applications, such as search and rescue in complex terrains.
Using the Electrocorticographic Speech Network to Control a Brain-Computer Interface in Humans
Leuthardt, Eric C.; Gaona, Charles; Sharma, Mohit; Szrama, Nicholas; Roland, Jarod; Freudenberg, Zac; Solis, Jamie; Breshears, Jonathan; Schalk, Gerwin
2013-01-01
Electrocorticography (ECoG) has emerged as a new signal platform for brain-computer interface (BCI) systems. Classically, the cortical physiology that has been commonly investigated and utilized for device control in humans has been brain signals from sensorimotor cortex. Hence, it was unknown whether other neurophysiological substrates, such as the speech network, could be used to further improve on or complement existing motor-based control paradigms. We demonstrate here for the first time that ECoG signals associated with different overt and imagined phoneme articulation can enable invasively monitored human patients to control a one-dimensional computer cursor rapidly and accurately. This phonetic content was distinguishable within higher gamma frequency oscillations and enabled users to achieve final target accuracies between 68 and 91% within 15 minutes. Additionally, one of the patients achieved robust control using recordings from a microarray consisting of 1 mm spaced microwires. These findings suggest that the cortical network associated with speech could provide an additional cognitive and physiologic substrate for BCI operation and that these signals can be acquired from a cortical array that is small and minimally invasive. PMID:21471638
Brown, C H; Liao, J
1999-10-01
An emerging population-based paradigm is now being used to guide the design of preventive trials used to test developmental models. We discuss elements of the designs of several ongoing randomized preventive trials involving reduction of risk for children of divorce, for children who exhibit behavioral or learning problems, and for children whose parents are being treated for depression. To test developmental models using this paradigm, we introduce three classes of design issues: design for prerandomization, design for intervention, and design for postintervention. For each of these areas, we present quantitative results from power calculations. Both scientific and cost implications of these power calculations are discussed in terms of variation among subjects on preintervention measures, unit of intervention, assignment, balancing, number of pretest and posttest measures, and the examination of moderation effects.
Holonic Rationale and Bio-inspiration on Design of Complex Emergent and Evolvable Systems
NASA Astrophysics Data System (ADS)
Leitao, Paulo
Traditional centralized and rigid control structures are becoming inflexible to face the requirements of reconfigurability, responsiveness and robustness, imposed by customer demands in the current global economy. The Holonic Manufacturing Systems (HMS) paradigm, which was pointed out as a suitable solution to face these requirements, translates the concepts inherited from social organizations and biology to the manufacturing world. It offers an alternative way of designing adaptive systems where the traditional centralized control is replaced by decentralization over distributed and autonomous entities organized in hierarchical structures formed by intermediate stable forms. In spite of its enormous potential, methods regarding the self-adaptation and self-organization of complex systems are still missing. This paper discusses how the insights from biology in connection with new fields of computer science can be useful to enhance the holonic design aiming to achieve more self-adaptive and evolvable systems. Special attention is devoted to the discussion of emergent behavior and self-organization concepts, and the way they can be combined with the holonic rationale.
Tradition and Revolution in ESL Teaching.
ERIC Educational Resources Information Center
Raimes, Ann
1983-01-01
Explores the development of language teaching in light of Thomas Kuhn's theory of scientific revolution and briefly defines the positivist tradition in language teaching. Argues that the current emphasis on communication does not mark the emergence of a new paradigm, as it still operates in the positivist tradition, but rather a paradigm shift.…
NASA Astrophysics Data System (ADS)
Ceballos, G. A.; Hernández, L. F.
2015-04-01
Objective. The classical ERP-based speller, or P300 Speller, is one of the most commonly used paradigms in the field of Brain Computer Interfaces (BCI). Several alterations to the visual stimuli presentation system have been developed to avoid unfavorable effects elicited by adjacent stimuli. However, there has been little, if any, regard for the useful information about the spatial location of target symbols contained in responses to adjacent stimuli. This paper aims to demonstrate that combining the classification of non-target adjacent stimuli with standard classification (target versus non-target) significantly improves classical ERP-based speller efficiency. Approach. Four SWLDA classifiers were trained and combined with the standard classifier: the lower row, upper row, right column and left column classifiers. This new feature extraction procedure and the classification method were carried out on three open databases: the UAM P300 database (Universidad Autonoma Metropolitana, Mexico), BCI competition II (dataset IIb) and BCI competition III (dataset II). Main results. The inclusion of the classification of non-target adjacent stimuli improves target classification in the classical row/column paradigm. A gain in mean single trial classification of 9.6% and an overall improvement of 25% in simulated spelling speed were achieved. Significance. We have provided further evidence that the ERPs produced by adjacent stimuli present discriminable features, which could provide additional information about the spatial location of intended symbols. This work motivates searching for information in the peripheral stimulation responses to improve the performance of emerging visual ERP-based spellers.
Emergency strategies and trends in the management of liver trauma.
Jiang, Hongchi; Wang, Jizhou
2012-09-01
The liver is the most frequently injured organ during abdominal trauma. The management of hepatic trauma has undergone a paradigm shift over the past several decades, with mandatory operation giving way to nonoperative treatment. Better understanding of the mechanisms and grade of liver injury aids in the initial assessment and establishment of a management strategy. Hemodynamically unstable patients should undergo focused abdominal sonography for trauma, whereas stable patients may undergo computed tomography, the standard examination protocol. The grade of liver injury alone does not accurately predict the need for operation, and nonoperative management is rapidly becoming popular for high-grade injuries. Hemodynamic instability with positive focused abdominal sonography for trauma and peritonitis is an indicator of the need for emergent operative intervention. The damage control concept is appropriate for the treatment of major liver injuries and is associated with significant survival advantages compared with traditional prolonged surgical techniques. Although surgical intervention for hepatic trauma is not as common now as it was in the past, current trauma surgeons should be familiar with the emergency surgical skills necessary to manage complex hepatic injuries, such as packing, Pringle maneuver, selective vessel ligation, resectional debridement, and parenchymal sutures. The present review presents emergency strategies and trends in the management of liver trauma.
Paradigm Shift or Annoying Distraction
Spallek, H.; O’Donnell, J.; Clayton, M.; Anderson, P.; Krueger, A.
2010-01-01
Web 2.0 technologies, also known as social media or social technologies, have emerged into the mainstream. As they grow, these new technologies have the opportunity to influence the methods and procedures of many fields. This paper focuses on the clinical implications of the growing Web 2.0 technologies. Five developing trends are explored: information channels, augmented reality, location-based mobile social computing, virtual worlds and serious gaming, and collaborative research networks. Each trend is discussed based on its utilization and pattern of use by healthcare providers or healthcare organizations. In addition to explorative research for each trend, a vignette is presented which provides a future example of adoption. Lastly, each trend lists several research challenge questions for applied clinical informatics. PMID:23616830
Mondragón, Esther; Gray, Jonathan; Alonso, Eduardo; Bonardi, Charlotte; Jennings, Dómhnall J.
2014-01-01
This paper presents a novel representational framework for the Temporal Difference (TD) model of learning, which allows the computation of configural stimuli – cumulative compounds of stimuli that generate perceptual emergents known as configural cues. This Simultaneous and Serial Configural-cue Compound Stimuli Temporal Difference model (SSCC TD) can model both simultaneous and serial stimulus compounds, as well as compounds including the experimental context. This modification significantly broadens the range of phenomena which the TD paradigm can explain, and allows it to predict phenomena which traditional TD solutions cannot, particularly effects that depend on compound stimuli functioning as a whole, such as pattern learning and serial structural discriminations, and context-related effects. PMID:25054799
NASA Astrophysics Data System (ADS)
Kaufmann, Tobias; Kübler, Andrea
2014-10-01
Objective. The speed of brain-computer interfaces (BCI), based on event-related potentials (ERP), is inherently limited by the commonly used one-stimulus paradigm. In this paper, we introduce a novel paradigm that can double the spelling speed by extending the one-stimulus paradigm to a two-stimulus paradigm. Two different stimuli (a face and a symbol) are presented at the same time, superimposed on different characters, and ERPs are classified using a multi-class classifier. Here, we present a proof of principle obtained with healthy participants. Approach. Eight participants were confronted with the novel two-stimulus paradigm and, for comparison, with two one-stimulus paradigms that used either one of the stimuli. Classification accuracies (percentage of correctly predicted letters) and elicited ERPs from the three paradigms were compared in a comprehensive offline analysis. Main results. The accuracies slightly decreased with the novel system compared to the established one-stimulus face paradigm. However, the use of two stimuli allowed for spelling at twice the maximum speed of the one-stimulus paradigms, and participants still achieved an average accuracy of 81.25%. This study introduced an alternative way of increasing the spelling speed in ERP-BCIs and illustrated that ERP-BCIs may not yet have reached their speed limit. Future research is needed in order to improve the reliability of the novel approach, as some participants displayed reduced accuracies. Furthermore, a comparison to the most recent BCI systems with individually adjusted, rapid stimulus timing is needed to draw conclusions about the practical relevance of the proposed paradigm. Significance. We introduced a novel two-stimulus paradigm that might be of high value for users who have reached the speed limit with the current one-stimulus ERP-BCI systems.
Intelligent holographic databases
NASA Astrophysics Data System (ADS)
Barbastathis, George
Memory is a key component of intelligence. In the human brain, physical structure and functionality jointly provide diverse memory modalities at multiple time scales. How could we engineer artificial memories with similar faculties? In this thesis, we attack both hardware and algorithmic aspects of this problem. A good part is devoted to holographic memory architectures, because they meet high capacity and parallelism requirements. We develop and fully characterize shift multiplexing, a novel storage method that simplifies disk head design for holographic disks. We develop and optimize the design of compact refreshable holographic random access memories, showing several ways that 1 Tbit can be stored holographically in a volume of less than 1 m^3, with surface density more than 20 times higher than conventional silicon DRAM integrated circuits. To address the issue of photorefractive volatility, we further develop the two-lambda (dual wavelength) method for shift multiplexing, and combine electrical fixing with angle multiplexing to demonstrate 1,000 multiplexed fixed holograms. Finally, we propose a noise model and an information theoretic metric to optimize the imaging system of a holographic memory, in terms of storage density and error rate. Motivated by the problem of interfacing sensors and memories to a complex system with limited computational resources, we construct a computer game of Desert Survival, built as a high-dimensional non-stationary virtual environment in a competitive setting. The efficacy of episodic learning, implemented as a reinforced Nearest Neighbor scheme, and the probability of winning against a control opponent improve significantly by concentrating the algorithmic effort on the virtual desert neighborhood that emerges as most significant at any time. The generalized computational model combines the autonomous neural network and von Neumann paradigms through a compact, dynamic central representation, which contains the most salient features of the sensory inputs, fused with relevant recollections, reminiscent of the hypothesized cognitive function of awareness. The Declarative Memory is searched both by content and address, suggesting a holographic implementation. The proposed computer architecture may lead to a novel paradigm that solves 'hard' cognitive problems at low cost.
Computing with dynamical systems based on insulator-metal-transition oscillators
NASA Astrophysics Data System (ADS)
Parihar, Abhinav; Shukla, Nikhil; Jerry, Matthew; Datta, Suman; Raychowdhury, Arijit
2017-04-01
In this paper, we review recent work on novel computing paradigms using coupled oscillatory dynamical systems. We explore systems of relaxation oscillators based on linear state transitioning devices, which switch between two discrete states with hysteresis. By harnessing the dynamics of complex, connected systems, we embrace the philosophy of "let physics do the computing" and demonstrate how the complex phase and frequency dynamics of such systems can be controlled, programmed, and observed to solve computationally hard problems. Although our discussion in this paper is limited to insulator-to-metallic state transition devices, the general philosophy of such computing paradigms can be translated to other mediums, including optical systems. We present the mathematical treatment necessary to understand the time evolution of these systems and demonstrate, through recent experimental results, the potential of such computational primitives.
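As an abstract illustration of phase-based computing with coupled oscillators, the sketch below deliberately substitutes idealized Kuramoto-style phase oscillators for the paper's insulator-metal-transition devices: anti-phase (repulsive) coupling on the edges of a small graph drives the settled phases toward a two-way partition of the nodes, a MAX-CUT-style grouping. All names, parameters, and the phase-oscillator abstraction itself are illustrative assumptions, not the authors' device model.

    import numpy as np

    def oscillator_partition(adj, steps=4000, dt=0.01, coupling=1.0, seed=0):
        """Integrate phase oscillators with repulsive coupling on graph edges;
        nodes that settle into opposite phase groups form the two sides of an
        approximate partition. A toy abstraction, not a device-level model."""
        rng = np.random.default_rng(seed)
        n = adj.shape[0]
        theta = rng.uniform(0, 2 * np.pi, size=n)
        for _ in range(steps):
            phase_diff = theta[None, :] - theta[:, None]          # [i, j] = theta_j - theta_i
            # Negative (repulsive) coupling pushes connected oscillators apart in phase
            dtheta = -coupling * np.sum(adj * np.sin(phase_diff), axis=1)
            theta = (theta + dt * dtheta) % (2 * np.pi)
        # Group nodes by comparing each final phase with the first oscillator's phase
        return (np.cos(theta - theta[0]) > 0).astype(int)

    # Toy 4-node ring graph: an exact two-way partition exists (alternating nodes)
    adj = np.array([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=float)
    print("partition labels:", oscillator_partition(adj))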
Samant, Sanjiv S; Xia, Junyi; Muyan-Ozcelik, Pinar; Owens, John D
2008-08-01
The advent of readily available temporal imaging or time series volumetric (4D) imaging has become an indispensable component of treatment planning and adaptive radiotherapy (ART) at many radiotherapy centers. Deformable image registration (DIR) is also used in other areas of medical imaging, including motion corrected image reconstruction. Due to long computation times, clinical applications of DIR in radiation therapy and elsewhere have been limited and consequently relegated to offline analysis. With the recent advances in hardware and software, graphics processing unit (GPU) based computing is an emerging technology for general purpose computation, including DIR, and is suitable for highly parallelized computing. However, traditional general purpose computation on the GPU is limited because of the constraints of the available programming platforms. In addition, compared to the CPU, the GPU currently has reduced dedicated processor memory, which can limit the useful working data set for parallelized processing. We present an implementation of the demons algorithm using the NVIDIA 8800 GTX GPU and the new CUDA programming language. The GPU performance will be compared with single threading and multithreading CPU implementations on an Intel dual core 2.4 GHz CPU using the C programming language. CUDA provides a C-like language programming interface, and allows for direct access to the highly parallel compute units in the GPU. Comparisons for volumetric clinical lung images acquired using 4DCT were carried out. Computation times for 100 iterations in the range of 1.8-13.5 s were observed for the GPU with image sizes ranging from 2.0 x 10(exp 6) to 14.2 x 10(exp 6) pixels. The GPU registration was 55-61 times faster than the CPU for the single threading implementation, and 34-39 times faster for the multithreading implementation. For CPU based computing, the computational time generally has a linear dependence on image size for medical imaging data. Computational efficiency is characterized in terms of time per megapixel per iteration (TPMI), with units of seconds per megapixel per iteration (or spmi). For the demons algorithm, our CPU implementation yielded largely invariant values of TPMI. The mean TPMIs were 0.527 spmi and 0.335 spmi for the single threading and multithreading cases, respectively, with <2% variation over the considered image data range. For GPU computing, we achieved TPMI = 0.00916 spmi with 3.7% variation, indicating optimized memory handling under CUDA. The paradigm of GPU based real-time DIR opens up a host of clinical applications for medical imaging.
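A small sketch of the efficiency metric described above: time per megapixel per iteration (TPMI) and the resulting speedup, computed from timing numbers. The timings and image size below are illustrative placeholders, not the study's measured data.

    # Minimal sketch: TPMI (seconds per megapixel per iteration) and GPU speedup
    # for a deformable image registration run. Values are placeholders.

    def tpmi(total_seconds, num_pixels, iterations):
        """Seconds per megapixel per iteration."""
        megapixels = num_pixels / 1e6
        return total_seconds / (megapixels * iterations)

    cpu_time = 600.0   # hypothetical single-threaded CPU time for 100 iterations
    gpu_time = 10.0    # hypothetical GPU time for the same run
    pixels = 11.0e6    # hypothetical 4DCT volume size in pixels
    iters = 100

    print("CPU TPMI: %.4f spmi" % tpmi(cpu_time, pixels, iters))
    print("GPU TPMI: %.5f spmi" % tpmi(gpu_time, pixels, iters))
    print("Speedup:  %.1fx" % (cpu_time / gpu_time))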
Dodd, Lori E; Proschan, Michael A; Neuhaus, Jacqueline; Koopmeiners, Joseph S; Neaton, James; Beigel, John D; Barrett, Kevin; Lane, Henry Clifford; Davey, Richard T
2016-06-15
Unique challenges posed by emerging infectious diseases often expose inadequacies in the conventional phased investigational therapeutic development paradigm. The recent Ebola outbreak in West Africa presents a critical case study highlighting barriers to faster development. During the outbreak, clinical trials were implemented with unprecedented speed. Yet, in most cases, this fast-tracked approach proved too slow for the rapidly evolving epidemic. Controversy abounded as to the most appropriate study designs to yield safety and efficacy data, potentially causing delays in pivotal studies. Preparation for research during future outbreaks may require acceptance of a paradigm that circumvents, accelerates, or reorders traditional phases, without losing sight of the traditional benchmarks by which drug candidates must be assessed for activity, safety and efficacy. We present the design of an adaptive parent protocol, ongoing in West Africa until January 2016. The exigent circumstances of the outbreak and limited prior clinical experience with experimental treatments led to more direct bridging from preclinical studies to human trials than the conventional paradigm would typically have sanctioned, and required considerable design flexibility. Preliminary evaluation of the "barely Bayesian" design was provided through computer simulation studies. The understanding and public discussion of the study design will help its future implementation. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
ERIC Educational Resources Information Center
Brinkerhoff, Robert O.; Gill, Stephen J.
This book presents an approach that organizes the principles and processes of an emerging human resource development (HRD) paradigm requiring training to be everyone's business. Chapter 1 describes the paradigm and presents a practical approach for applying it. Chapter 2 draws a picture of highly effective training (HET), focusing particularly on…
ERIC Educational Resources Information Center
Neuman, Delia
2014-01-01
Over the past 30 years, qualitative research has emerged as a widely accepted alternative to the quantitative paradigm for performing research in educational communications and technology. As the new paradigm has evolved, it has spawned a variety of theoretical perspectives and methodological techniques that have both increased its potential…
ERIC Educational Resources Information Center
Sampson, Demetrios G.
2009-01-01
In the context of the emerging paradigm of Lifelong Learning, competence-based learning is gradually attracting the attention of the Technology-Enhanced Learning community, since it appears to meet the 21st Century learning and training expectations of both individuals and organisations. On the other hand, the paradigm of Learning Objects--as a…
ERIC Educational Resources Information Center
Huber, Tonya
1996-01-01
Explores the need for recognizing and respecting different learner perspectives in order to better develop culturally responsive pedagogy. It argues for a new constructionist paradigm for teaching within a multicultural setting, which does not emphasize assimilation into the dominant culture, but enhances the learner's potential for creating anew…
ERIC Educational Resources Information Center
Lightfoot, Thomas R.
The world is in the middle of a major paradigm shift, as the paradigm of dominion over nature is coming to an end with the acceptance of the arts and other subjectively oriented technologies as useful in our effort to live in the universe. Little by little, awareness of this fundamental change in world view is emerging. The importance of art in…
Science-Technology-Society (STS): A New Paradigm in Science Education
ERIC Educational Resources Information Center
Mansour, Nasser
2009-01-01
Changes in the past two decades of goals for science education in schools have induced new orientations in science education worldwide. One of the emerging complementary approaches was the science-technology-society (STS) movement. STS has been called the current megatrend in science education. Others have called it a paradigm shift for the field…
Paradigms and Problems: Thomas Kuhn and "Composition Revolutions".
ERIC Educational Resources Information Center
Tallman, John Gary
The arguments for and against Thomas Kuhn's paradigmatic thesis are presented in this paper as a means of discussing the appropriateness of Kuhn's thesis to the emerging composition discipline. Section one of the paper introduces the topic, noting how the term paradigm is used in composition to define a system of widely shared values, beliefs, and…
Turbomachinery computational fluid dynamics: asymptotes and paradigm shifts.
Dawes, W N
2007-10-15
This paper reviews the development of computational fluid dynamics (CFD) specifically for turbomachinery simulations and with a particular focus on application to problems with complex geometry. The review is structured by considering this development as a series of paradigm shifts, followed by asymptotes. The original S1-S2 blade-blade-throughflow model is briefly described, followed by the development of two-dimensional then three-dimensional blade-blade analysis. This in turn evolved from inviscid to viscous analysis and then from steady to unsteady flow simulations. This development trajectory led over a surprisingly small number of years to an accepted approach, a 'CFD orthodoxy'. A very important current area of intense interest and activity in turbomachinery simulation is in accounting for real geometry effects, not just in the secondary air and turbine cooling systems but also associated with the primary path. The requirements here are threefold: capturing and representing these geometries in a computer model; making rapid design changes to these complex geometries; and managing the very large associated computational models on PC clusters. Accordingly, the challenges in the application of the current CFD orthodoxy to complex geometries are described in some detail. The main aim of this paper is to argue that the current CFD orthodoxy is on a new asymptote and is not in fact suited for application to complex geometries and that a paradigm shift must be sought. In particular, the new paradigm must be geometry centric and inherently parallel without serial bottlenecks. The main contribution of this paper is to describe such a potential paradigm shift, inspired by the animation industry, based on a fundamental shift in perspective from explicit to implicit geometry and then illustrate this with a number of applications to turbomachinery.
Biologically based neural circuit modelling for the study of fear learning and extinction
NASA Astrophysics Data System (ADS)
Nair, Satish S.; Paré, Denis; Vicentic, Aleksandra
2016-11-01
The neuronal systems that promote protective defensive behaviours have been studied extensively using Pavlovian conditioning. In this paradigm, an initially neutral conditioned stimulus is paired with an aversive unconditioned stimulus, leading the subjects to display behavioural signs of fear. Decades of research into the neural bases of this simple behavioural paradigm have revealed that the amygdala, a complex structure comprised of several interconnected nuclei, is an essential part of the neural circuits required for the acquisition, consolidation and expression of fear memory. However, emerging evidence from the confluence of electrophysiological, tract tracing, imaging, molecular, optogenetic and chemogenetic methodologies reveals that fear learning is mediated by multiple connections between several amygdala nuclei and their distributed targets, dynamical changes in plasticity in local circuit elements as well as neuromodulatory mechanisms that promote synaptic plasticity. To uncover these complex relations and analyse multi-modal data sets acquired from these studies, we argue that biologically realistic computational modelling, in conjunction with experiments, offers an opportunity to advance our understanding of the neural circuit mechanisms of fear learning and to address how their dysfunction may lead to maladaptive fear responses in mental disorders.
A Hybrid Brain-Computer Interface Based on the Fusion of P300 and SSVEP Scores.
Yin, Erwei; Zeyl, Timothy; Saab, Rami; Chau, Tom; Hu, Dewen; Zhou, Zongtan
2015-07-01
The present study proposes a hybrid brain-computer interface (BCI) with 64 selectable items based on the fusion of P300 and steady-state visually evoked potential (SSVEP) brain signals. With this approach, row/column (RC) P300 and two-step SSVEP paradigms were integrated to create two hybrid paradigms, which we denote as the double RC (DRC) and 4-D spellers. In each hybrid paradigm, the target is simultaneously detected based on both P300 and SSVEP potentials as measured by the electroencephalogram. We further proposed a maximum-probability estimation (MPE) fusion approach to combine the P300 and SSVEP on a score level and compared this approach to other approaches based on linear discriminant analysis, a naïve Bayes classifier, and support vector machines. The experimental results obtained from thirteen participants indicated that the 4-D hybrid paradigm outperformed the DRC paradigm and that the MPE fusion achieved higher accuracy compared with the other approaches. Importantly, 12 of the 13 participants using the 4-D paradigm achieved an accuracy of over 90%, and the average accuracy was 95.18%. These promising results suggest that the proposed hybrid BCI system could be used in the design of a high-performance BCI-based keyboard.
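A minimal sketch of score-level fusion in the spirit described above: each modality's classifier scores are mapped to per-item probabilities, combined, and the item with the highest joint probability is selected. The softmax normalization, the independence assumption, and the function names are illustrative assumptions, not the authors' exact MPE procedure.

    import numpy as np

    def scores_to_probs(scores):
        """Softmax normalization of classifier scores into per-item probabilities."""
        e = np.exp(scores - scores.max())
        return e / e.sum()

    def fuse_and_select(p300_scores, ssvep_scores):
        """Score-level fusion: pick the item with the maximum joint probability,
        illustratively assuming the two modalities are conditionally independent."""
        p_p300 = scores_to_probs(np.asarray(p300_scores, dtype=float))
        p_ssvep = scores_to_probs(np.asarray(ssvep_scores, dtype=float))
        joint = p_p300 * p_ssvep
        return int(np.argmax(joint)), joint / joint.sum()

    # Toy example with 8 selectable items
    rng = np.random.default_rng(0)
    idx, probs = fuse_and_select(rng.normal(size=8), rng.normal(size=8))
    print("selected item:", idx)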
Open Source software and social networks: disruptive alternatives for medical imaging.
Ratib, Osman; Rosset, Antoine; Heuberger, Joris
2011-05-01
In recent decades several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture, providing new perspectives and an innovative approach to a traditionally conservative medical community. Disruptive technologies such as the world-wide-web, wireless networking, Open Source software and the recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. This paper reviews the impact and potential benefits of two major trends in consumer market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost effective alternative to traditional commercial software development, and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Written by developers who are themselves part of the user community, these tools are usually better adapted to users' needs and, being developed and tested by a large number of contributing users, are more robust than traditional software programs. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed social behavior and habits, adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks that allow groups of people to easily communicate and exchange information are a new model that is particularly suitable for specific groups of healthcare professionals and for physicians. This model has also changed the expectations of how patients wish to communicate with their physicians. Emerging disruptive technologies and innovative paradigms such as Open Source software are leading the way to a new generation of information systems that will slowly change the way physicians, healthcare providers and patients interact and communicate in the future. The impact of these new technologies is particularly evident in image communication, PACS and teleradiology. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Dai, Huanping; Micheyl, Christophe
2015-05-01
Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
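A minimal Monte Carlo sketch of the underlying idea (not the article's formula or its MATLAB code): for a two-interval forced-choice task, the optimal observer picks the interval whose observation has the higher likelihood ratio, and the maximum proportion correct is the probability that the signal interval wins. The distributions chosen here are illustrative.

    import numpy as np
    from scipy import stats

    def max_pc_2afc(signal_dist, noise_dist, n=200_000, seed=0):
        """Monte Carlo estimate of the optimal-observer proportion correct in a
        two-interval forced-choice task: respond 'signal' in the interval with
        the larger likelihood ratio signal_pdf(x) / noise_pdf(x)."""
        rng = np.random.default_rng(seed)
        xs = signal_dist.rvs(size=n, random_state=rng)   # observations from the signal interval
        xn = noise_dist.rvs(size=n, random_state=rng)    # observations from the noise interval

        def lr(x):
            return signal_dist.pdf(x) / noise_dist.pdf(x)

        wins = lr(xs) > lr(xn)
        ties = lr(xs) == lr(xn)
        return (wins.sum() + 0.5 * ties.sum()) / n

    # Gaussian case (d' = 1): the analytic maximum Pc is Phi(d'/sqrt(2)) ~= 0.760
    print(max_pc_2afc(stats.norm(1, 1), stats.norm(0, 1)))
    # An arbitrary non-Gaussian example with full support (logistic noise)
    print(max_pc_2afc(stats.logistic(1, 1), stats.logistic(0, 1)))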
ERIC Educational Resources Information Center
Conn, Samuel S.; Reichgelt, Han
2013-01-01
Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…
Advancing the Science of Community-Level Interventions
Beehler, Sarah; Deutsch, Charles; Green, Lawrence W.; Hawe, Penelope; McLeroy, Kenneth; Miller, Robin Lin; Rapkin, Bruce D.; Schensul, Jean J.; Schulz, Amy J.; Trimble, Joseph E.
2011-01-01
Community interventions are complex social processes that need to move beyond single interventions and outcomes at individual levels of short-term change. A scientific paradigm is emerging that supports collaborative, multilevel, culturally situated community interventions aimed at creating sustainable community-level impact. This paradigm is rooted in a deep history of ecological and collaborative thinking across public health, psychology, anthropology, and other fields of social science. The new paradigm makes a number of primary assertions that affect conceptualization of health issues, intervention design, and intervention evaluation. To elaborate the paradigm and advance the science of community intervention, we offer suggestions for promoting a scientific agenda, developing collaborations among professionals and communities, and examining the culture of science. PMID:21680923
Angelcare mobile system: homecare patient monitoring using bluetooth and GPRS.
Ribeiro, Anna G D; Maitelli, Andre L; Valentim, Ricardo A M; Brandao, Glaucio B; Guerreiro, Ana M G
2010-01-01
Rapid progress in technology has brought new paradigms to the computing area, bringing with them many benefits to society. The paradigm of ubiquitous computing applies computing to people's daily lives without being noticed, by combining several existing technologies such as wireless communication and sensors. Several of these benefits have reached the medical area, bringing new methods for surgery, appointments and examinations. This work presents telemedicine software that adds the idea of ubiquity to the medical area, transforming the relationship between doctor and patient. It also brings security and confidence to a patient being monitored in homecare.
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
New sensing applications need enhanced computing capabilities to handle the requirements of complex and huge data processing. The Internet of Things (IoT) concept brings processing and communication features to devices. In addition, the Cloud Computing paradigm provides resources and infrastructures for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, much research remains to be done to make all of these systems work together properly. This work proposes a collaborative model and an architecture to take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels which combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment is included to demonstrate that this approach can be used in diverse real applications. The results show the flexibility of the architecture to perform complex computational tasks of advanced applications.
NASA Astrophysics Data System (ADS)
Marinos, Alexandros; Briscoe, Gerard
Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.
Municipal resilience: A paradigm shift in emergency and continuity management.
Solecki, Greg; Luchia, Mike
More than a decade of emergency and continuity management vision was instrumental in providing the unprecedented level of response and recovery from the great flood of 2013. Earlier assessments, planning and validation drove the development of corporate continuity, emergency and contingency plans, along with tactical, strategic and recovery operations centres, all of which led to a reliable emergency management model that will continue to provide the backbone for municipal resilience.
Recent Developments in Toxico-Cheminformatics; Supporting a New Paradigm for Predictive Toxicology
EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through the harnessing of legacy toxicity data, creation of data linkages, and generation of new high-content and high-throughput screening d...
Advances in Toxico-Cheminformatics: Supporting a New Paradigm for Predictive Toxicology
EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through the harnessing of legacy toxicity data, creation of data linkages, and generation of new high-throughput screening (HTS) data. The D...
Region based Brain Computer Interface for a home control application.
Akman Aydin, Eda; Bay, Omer Faruk; Guler, Inan
2015-08-01
Environment control is one of the important challenges for disabled people who suffer from neuromuscular diseases. A Brain Computer Interface (BCI) provides a communication channel between the human brain and the environment without requiring any muscular activation. The most important expectations for a home control application are high accuracy and reliable control. The region-based paradigm is a stimulus paradigm based on the oddball principle that requires selection of a target at two levels. This paper presents an application of the region-based paradigm for smart home control for people with neuromuscular diseases. In this study, a region-based stimulus interface containing 49 commands was designed. Five non-disabled subjects participated in the experiments. Offline analysis of the experiments yielded 95% accuracy for five flashes. This result showed that the region-based paradigm can be used to successfully select commands of a smart home control application with high accuracy in a low number of repetitions. Furthermore, a statistically significant difference was not observed between the level accuracies.
ERIC Educational Resources Information Center
Luthans, Fred; Youssef, Carolyn M.; Rawski, Shannon L.
2011-01-01
This study drew from two distinct paradigms: the social cognitively based emerging field of positive organizational behavior or POB and the more established behaviorally based area of organizational behavior modification or OB Mod. The intent was to show that both can contribute to complex challenges facing today's organizations. Using a…
One Size Doesn't Fit All: Reopening Discussion of the Research-Practice Connection
ERIC Educational Resources Information Center
Schoonmaker, Frances
2007-01-01
Research in the early part of the 20th century focused on finding scientific proofs for justification of practice and drew on educational psychology for its methods. Critics charged the quantitative, experimental paradigm with being seriously flawed. By the end of the 20th century, a new paradigm emerged as a variety of methodologies drawn from…
ERIC Educational Resources Information Center
Brady, Shane R.; Moxley, David
2016-01-01
The authors examine how the profession's adaptation to Flexner's criteria influences the emergence of the neomedical model as the majoritarian paradigm within social work even in the face of a pluralism manifest by the existence of other competing paradigms. One endorses empowerment in social work, and the other embraces societal transformation in…
Keeping School in Rural America: A New Paradigm for Rural Education and Community Development.
ERIC Educational Resources Information Center
Haas, Toni
This paper differentiates between the "old story" of rural education and the emerging "new story." It describes the tradition (old story) in which rural education is related to the local and national economies and lays out fragments of the new story, a paradigm that combines rural education and the rural economy in a way that strengthens them…
Globalization, technological changes and the search for a new paradigm for women's work.
Mitter, S
1999-01-01
This paper comments on a conceptual paradigm that illustrates the context of the opportunities and challenges that the current technology-led globalization has brought to women's employment in Asia. The UN University Institute for New Technologies project provides the basis for this framework/paradigm. The paradigm helps in analyzing the consequences of an emerging techno-economic order that is imported and does not take into account the specific needs of women. It also acknowledges the liberating aspects of new technologies and modernization. At the same time, it emphasizes the role of the state, the family, and women workers' organizations in counteracting the negative consequences of current globalization. Finally, the paradigm is predicated on the belief that technologies, indigenous or foreign, are appropriate so long as the women concerned are given a voice in their countries' policy dialogue.
1-RAAP: An Efficient 1-Round Anonymous Authentication Protocol for Wireless Body Area Networks
Liu, Jingwei; Zhang, Lihuan; Sun, Rong
2016-01-01
Thanks to the rapid technological convergence of wireless communications, medical sensors and cloud computing, Wireless Body Area Networks (WBANs) have emerged as a novel networking paradigm enabling ubiquitous Internet services, allowing people to receive medical care, monitor health status in real-time, analyze sports data and even enjoy online entertainment remotely. However, because of the mobility and openness of wireless communications, WBANs are inevitably exposed to a large set of potential attacks, significantly undermining their utility and impeding their widespread deployment. To prevent attackers from threatening legitimate WBAN users or abusing WBAN services, an efficient and secure authentication protocol termed 1-Round Anonymous Authentication Protocol (1-RAAP) is proposed in this paper. In particular, 1-RAAP preserves anonymity, mutual authentication, non-repudiation and some other desirable security properties, while only requiring users to perform several low cost computational operations. More importantly, 1-RAAP is provably secure in the random oracle model thanks to its design basis. To validate the computational efficiency of 1-RAAP, a set of comprehensive comparative studies between 1-RAAP and other authentication protocols is conducted, and the results clearly show that 1-RAAP achieves the best performance in terms of computational overhead. PMID:27213384
A paradigm analysis of ecological sustainability: The emerging polycentric climate change publics
NASA Astrophysics Data System (ADS)
Taminiau, Job B.
Climate change poses significant complications to the development model employed by modern societies. Using paradigm analysis, the dissertation explains why, after 21 years, policy failure haunts the field: a key impediment is the unquestioned assumption that policy must adhere to an economic optimality principle. This results in policy models which fail to uphold sustainability, justice, and equality due to an emphasis on economic growth, technology, and technical and bureaucratic expertise. Unable to build consensus among low- and high-carbon economies, and searching for what one economist has called an oxymoron -- "sustainable growth" (Daly, 1997) -- the policy process has foundered with its only international convention (the Kyoto Protocol) having lost relevance. In the midst of this policy failure, the dissertation offers and defends the premise that alternative strategies have emerged which signal the prospect of a paradigm shift to ecological sustainability -- a paradigm in which social change takes place through commons-based management and community authorship in the form of network governance and where sustainability serves as governor of growth -- something unavailable in an optimality-guided world. In particular, a strategy of polycentricity is discussed in detail in order to elucidate the potential for a paradigm shift. This discussion is followed by an evaluation of two innovative concepts -- the Sustainable Energy Utility and the Solar City -- that might fit the polycentricity strategy and bring forth transformative change. The dissertation finds that considerable potential rests in these two concepts and argues for the critical importance of further development of innovative approaches to implement the ecological sustainability paradigm.
Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster, and it may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model that estimates cloud runtime based on the size and complexity of the genomes being compared and determines in advance the optimal order in which jobs should be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
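The scheduling idea can be illustrated with a small sketch (not the authors' model): estimate each comparison's runtime from a simple size-based proxy, then assign jobs to a fixed number of cloud workers longest-first so that nodes finish at roughly the same time and billed-but-idle capacity is minimized. The runtime model, constants, and genome sizes are assumptions for illustration.

    import heapq

    def estimate_runtime(genome_a_size, genome_b_size, k=1e-9):
        """Toy runtime model: proportional to the product of genome sizes."""
        return k * genome_a_size * genome_b_size

    def schedule_longest_first(jobs, n_workers):
        """Greedy longest-processing-time-first assignment to n_workers.
        jobs: list of (job_id, estimated_runtime). Returns per-worker loads."""
        loads = [(0.0, w, []) for w in range(n_workers)]
        heapq.heapify(loads)
        for job_id, t in sorted(jobs, key=lambda j: j[1], reverse=True):
            load, w, assigned = heapq.heappop(loads)   # worker with the least work so far
            assigned.append(job_id)
            heapq.heappush(loads, (load + t, w, assigned))
        return sorted(loads, key=lambda x: x[1])

    # Toy genome-to-genome comparisons (sizes in base pairs, invented)
    sizes = {"gA": 4.6e6, "gB": 12.0e6, "gC": 3.2e6, "gD": 25.0e6}
    names = list(sizes)
    jobs = [((a, b), estimate_runtime(sizes[a], sizes[b]))
            for i, a in enumerate(names) for b in names[i + 1:]]
    for load, w, assigned in schedule_longest_first(jobs, n_workers=2):
        print("worker", w, "load %.2fs" % load, assigned)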
Tit for tat in heterogeneous populations
NASA Astrophysics Data System (ADS)
Nowak, Martin A.; Sigmund, Karl
1992-01-01
The 'iterated prisoner's dilemma' is now the orthodox paradigm for the evolution of cooperation among selfish individuals. This viewpoint is strongly supported by Axelrod's computer tournaments, where 'tit for tat' (TFT) finished first [1]. This has stimulated interest in the role of reciprocity in biological societies [1-8]. Most theoretical investigations, however, assumed homogeneous populations (the setting for evolutionarily stable strategies [9,10]) and programs immune to errors. Here we try to come closer to the biological situation by following a program [6] that takes stochasticities into account and investigates representative samples. We find that a small fraction of TFT players is essential for the emergence of reciprocation in a heterogeneous population, but only paves the way for a more generous strategy. TFT is the pivot, rather than the aim, of an evolution towards cooperation.
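A minimal simulation sketch in the spirit of this line of work (the payoff values, error rate, and population makeup are illustrative choices, not the paper's): reactive strategies are represented by probabilities (p, q) of cooperating after the opponent's cooperation or defection, moves are perturbed by a small error rate, and average payoffs are compared for tit-for-tat, generous tit-for-tat, and always-defect players.

    import random

    PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

    def play(p1, p2, rounds=200, error=0.01, rng=random):
        """Iterated prisoner's dilemma between two reactive strategies.
        A strategy is (p, q): probability of cooperating after the opponent's
        C or D, respectively. Each move flips with probability `error`."""
        m1 = m2 = "C"
        s1 = s2 = 0
        for _ in range(rounds):
            a, b = PAYOFF[(m1, m2)]
            s1 += a; s2 += b
            n1 = "C" if rng.random() < (p1[0] if m2 == "C" else p1[1]) else "D"
            n2 = "C" if rng.random() < (p2[0] if m1 == "C" else p2[1]) else "D"
            if rng.random() < error: n1 = "D" if n1 == "C" else "C"
            if rng.random() < error: n2 = "D" if n2 == "C" else "C"
            m1, m2 = n1, n2
        return s1 / rounds, s2 / rounds

    strategies = {"TFT": (1.0, 0.0), "generous TFT": (1.0, 0.33),
                  "always defect": (0.0, 0.0)}
    for name_a, sa in strategies.items():
        for name_b, sb in strategies.items():
            pa, pb = play(sa, sb)
            print(f"{name_a:>14} vs {name_b:<14} payoffs: {pa:.2f} / {pb:.2f}")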
An overview of wireless structural health monitoring for civil structures.
Lynch, Jerome Peter
2007-02-15
Wireless monitoring has emerged in recent years as a promising technology that could greatly impact the field of structural monitoring and infrastructure asset management. This paper is a summary of research efforts that have resulted in the design of numerous wireless sensing unit prototypes explicitly intended for implementation in civil structures. Wireless sensing units integrate wireless communications and mobile computing with sensors to deliver a relatively inexpensive sensor platform. A key design feature of wireless sensing units is the collocation of computational power and sensors; the tight integration of computing with a wireless sensing unit provides sensors with the opportunity to self-interrogate measurement data. In particular, there is strong interest in using wireless sensing units to build structural health monitoring systems that interrogate structural data for signs of damage. After the hardware and the software designs of wireless sensing units are completed, the Alamosa Canyon Bridge in New Mexico is utilized to validate their accuracy and reliability. To improve the ability of low-cost wireless sensing units to detect the onset of structural damage, the wireless sensing unit paradigm is extended to include the capability to command actuators and active sensors.
Hybrid cloud: bridging of private and public cloud computing
NASA Astrophysics Data System (ADS)
Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol
2018-05-01
Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their businesses. However, most businesses' awareness of data security issues is low, since some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source in character, which makes it one of the more secure cloud computing models, and thus HCDM may help resolve data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build an actual server and node, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of HCDM were conducted successfully; besides offering good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open source character. Furthermore, this study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant for solving major security issues faced by IT-based startup companies, especially in Indonesia.
Stochastic Computations in Cortical Microcircuit Models
Maass, Wolfgang
2013-01-01
Experimental data from neuroscience suggest that a substantial amount of knowledge is stored in the brain in the form of probability distributions over network states and trajectories of network states. We provide a theoretical foundation for this hypothesis by showing that even very detailed models for cortical microcircuits, with data-based diverse nonlinear neurons and synapses, have a stationary distribution of network states and trajectories of network states to which they converge exponentially fast from any initial state. We demonstrate that this convergence holds in spite of the non-reversibility of the stochastic dynamics of cortical microcircuits. We further show that, in the presence of background network oscillations, separate stationary distributions emerge for different phases of the oscillation, in accordance with experimentally reported phase-specific codes. We complement these theoretical results by computer simulations that investigate resulting computation times for typical probabilistic inference tasks on these internally stored distributions, such as marginalization or marginal maximum-a-posteriori estimation. Furthermore, we show that the inherent stochastic dynamics of generic cortical microcircuits enables them to quickly generate approximate solutions to difficult constraint satisfaction problems, where stored knowledge and current inputs jointly constrain possible solutions. This provides a powerful new computing paradigm for networks of spiking neurons, that also throws new light on how networks of neurons in the brain could carry out complex computational tasks such as prediction, imagination, memory recall and problem solving. PMID:24244126
Changing computing paradigms towards power efficiency
Klavík, Pavel; Malossi, A. Cristiano I.; Bekas, Costas; Curioni, Alessandro
2014-01-01
Power awareness is fast becoming immensely important in computing, ranging from traditional high-performance computing applications to the new generation of data centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations, which finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications. PMID:24842033
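The combination of low- and high-precision arithmetic that the authors describe for linear-system solves is commonly realized as mixed-precision iterative refinement; a minimal NumPy/SciPy sketch of that general technique (not the authors' implementation) is shown below. The test matrix and tolerances are illustrative.

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    def mixed_precision_solve(A, b, tol=1e-12, max_iter=50):
        """Solve Ax = b by factorizing in float32 (cheap, lower power) and
        refining the residual in float64 until the high-precision residual
        is small: a sketch of mixed-precision iterative refinement."""
        A64 = np.asarray(A, dtype=np.float64)
        b64 = np.asarray(b, dtype=np.float64)
        lu32 = lu_factor(A64.astype(np.float32))             # low-precision factorization
        x = lu_solve(lu32, b64.astype(np.float32)).astype(np.float64)
        for _ in range(max_iter):
            r = b64 - A64 @ x                                 # residual in high precision
            if np.linalg.norm(r) <= tol * np.linalg.norm(b64):
                break
            dx = lu_solve(lu32, r.astype(np.float32))         # cheap correction solve
            x = x + dx.astype(np.float64)
        return x

    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 200)) + 200 * np.eye(200)   # well-conditioned test matrix
    b = rng.standard_normal(200)
    x = mixed_precision_solve(A, b)
    print("residual norm:", np.linalg.norm(b - A @ x))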
Langley Ground Facilities and Testing in the 21st Century
NASA Technical Reports Server (NTRS)
Ambur, Damodar R.; Kegelman, Jerome T.; Kilgore, William A.
2010-01-01
A strategic approach for retaining and more efficiently operating the essential Langley Ground Testing Facilities in the 21st Century is presented. This effort takes advantage of the previously completed and ongoing studies at the Agency and National levels. This integrated approach takes into consideration the overall decline in test business base within the nation and reduced utilization in each of the Langley facilities with capabilities to test in the subsonic, transonic, supersonic, and hypersonic speed regimes. The strategy accounts for capability needs to meet the Agency programmatic requirements and strategic goals and to execute test activities in the most efficient and flexible facility operating structure. The structure currently being implemented at Langley offers the agility to right-size our capability and capacity from a national perspective and to accommodate the dynamic nature of testing needs, and it will address the influence of existing and emerging analytical tools for design. The paradigm for testing in the retained facilities is to efficiently and reliably provide more accurate and high-quality test results at an affordable cost to support design information needs for flight regimes where the computational capability is not adequate and to verify and validate the existing and emerging computational tools. Each of the above goals is planned to be achieved, keeping in mind the increasing small industry customer base engaged in developing unpiloted aerial vehicles and commercial space transportation systems.
New Trends of Emerging Technologies in Digital Pathology.
Bueno, Gloria; Fernández-Carrobles, M Milagro; Deniz, Oscar; García-Rojo, Marcial
2016-01-01
The future paradigm of pathology will be digital. Instead of conventional microscopy, a pathologist will perform a diagnosis through interacting with images on computer screens and performing quantitative analysis. The fourth generation of virtual slide telepathology systems, so-called virtual microscopy and whole-slide imaging (WSI), has allowed for the storage and fast dissemination of image data in pathology and other biomedical areas. These novel digital imaging modalities encompass high-resolution scanning of tissue slides and derived technologies, including automatic digitization and computational processing of whole microscopic slides. Moreover, automated image analysis with WSI can extract specific diagnostic features of diseases and quantify individual components of these features to support diagnoses and provide informative clinical measures of disease. Therefore, the challenge is to apply information technology and image analysis methods to exploit the new and emerging digital pathology technologies effectively in order to process and model all the data and information contained in WSI. The final objective is to support the complex workflow from specimen receipt to anatomic pathology report transmission, that is, to improve diagnosis both in terms of pathologists' efficiency and with new information. This article reviews the main concerns about and novel methods of digital pathology discussed at the latest workshop in the field carried out within the European project AIDPATH (Academia and Industry Collaboration for Digital Pathology). © 2016 S. Karger AG, Basel.
Engaging TBR Faculty in Online Research Communities and Emerging Technologies
ERIC Educational Resources Information Center
Renner, Jasmine
2017-01-01
The growing impact of online research communities and emerging technologies is creating a significant paradigm shift and consequently changing the current research landscape of higher education. The rise of online research communities exemplifies a shift from traditional research engagements, to online research communities using "Web…
Rodríguez-Domínguez, Carlos; Benghazi, Kawtar; Noguera, Manuel; Garrido, José Luis; Rodríguez, María Luisa; Ruiz-López, Tomás
2012-01-01
The Request-Response (RR) paradigm is widely used in ubiquitous systems to exchange information in a secure, reliable and timely manner. Nonetheless, there is also an emerging need for adopting the Publish-Subscribe (PubSub) paradigm in this kind of system, due to the advantages that this paradigm offers in supporting mobility by means of asynchronous, non-blocking and one-to-many message distribution semantics for event notification. This paper analyzes the strengths and weaknesses of both the RR and PubSub paradigms for supporting communications in ubiquitous systems and proposes an abstract communication model to enable their seamless integration. Thus, developers can focus on communication semantics and the required quality properties, rather than being concerned with specific communication mechanisms. The aim is to provide developers with abstractions intended to decrease the complexity of integrating the different communication paradigms commonly needed in ubiquitous systems. The proposal has been applied to implement a middleware and a real home automation system to show its applicability and benefits.
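A minimal sketch of the kind of abstraction described above (the class and method names are invented for illustration and are not taken from the authors' middleware): a single communication facade exposes both request/response and publish/subscribe semantics so that application code depends on message semantics rather than on a concrete transport mechanism.

    from collections import defaultdict

    class CommunicationModel:
        """Illustrative facade unifying request-response and publish-subscribe."""

        def __init__(self):
            self._handlers = {}                    # endpoint name -> handler (Request-Response)
            self._subscribers = defaultdict(list)  # topic -> list of callbacks (Publish-Subscribe)

        # --- Request-Response: blocking, one-to-one, returns a reply ---
        def register(self, endpoint, handler):
            self._handlers[endpoint] = handler

        def request(self, endpoint, payload):
            return self._handlers[endpoint](payload)

        # --- Publish-Subscribe: non-blocking, one-to-many notification ---
        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, event):
            for callback in self._subscribers[topic]:
                callback(event)

    # Toy usage in a home-automation style scenario
    comm = CommunicationModel()
    comm.register("lamp/status", lambda _: {"on": True})
    comm.subscribe("presence", lambda event: print("presence event:", event))
    print(comm.request("lamp/status", None))
    comm.publish("presence", {"room": "kitchen", "occupied": True})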
Integration of drug dosing data with physiological data streams using a cloud computing paradigm.
Bressan, Nadja; James, Andrew; McGregor, Carolyn
2013-01-01
Many drugs are used during the provision of intensive care for the preterm newborn infant. Recommendations for drug dosing in newborns depend upon data from population based pharmacokinetic research. There is a need to be able to modify drug dosing in response to the preterm infant's response to the standard dosing recommendations. The real-time integration of physiological data with drug dosing data would facilitate individualised drug dosing for these immature infants. This paper proposes the use of a novel computational framework that employs real-time, temporal data analysis for this task. Deployment of the framework within the cloud computing paradigm will enable widespread distribution of individualized drug dosing for newborn infants.
Computational Prosodic Markers for Autism
ERIC Educational Resources Information Center
Van Santen, Jan P.H.; Prud'hommeaux, Emily T.; Black, Lois M.; Mitchell, Margaret
2010-01-01
We present results obtained with new instrumental methods for the acoustic analysis of prosody to evaluate prosody production by children with Autism Spectrum Disorder (ASD) and Typical Development (TD). Two tasks elicit focal stress--one in a vocal imitation paradigm, the other in a picture-description paradigm; a third task also uses a vocal…
Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta, J.; Gimenez, J.; Caubet, J.
2003-01-01
Parallel programming paradigms include process-level parallelism, thread-level parallelism, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architectures (SMA). The analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.
EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through harnessing of legacy toxicity data, creation of data linkages, and generation of new in vitro screening data. In association with EPA...
EPA’s Computational Toxicology Center is building capabilities to support a new paradigm for toxicity screening and prediction through harnessing of legacy toxicity data, creation of data linkages, and generation of new in vitro screening data. In association with EPA’s ToxCastTM...
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.
Since the latter part of the 19th century, a fervent debate has ensued about quantitative and qualitative research paradigms. From these disputes, purists have emerged on both sides. Quantitative purists express assumptions that are consistent with a positivist philosophy, whereas qualitative purists (i.e., post-positivists, post-structuralists,…
NASA Astrophysics Data System (ADS)
Kong, Yen P.; Jongpaiboonkit, Leena
2016-07-01
New regenerative paradigms are needed to address the growing global problem of heart failure as existing interventions are unsatisfactory. Outcomes from the current paradigm of cell transplantation have not been stellar but the mechanistic knowledge learned from them is instructive in the development of future paradigms. An emerging biomaterial-based approach incorporating key mechanisms and additional ones scrutinized from the process of natural heart regeneration in zebrafish may become the next evolution in cardiac repair. We highlight, with examples, tested key concepts and pivotal ones that may be integrated into a successful therapy.
Cyborgs, biotechnologies, and informatics in health care - new paradigms in nursing sciences.
Monteiro, Ana Paula Teixeira de Almeida Vieira
2016-01-01
Nursing Sciences are at a moment of paradigmatic transition. The aim of this paper is to reflect on the new epistemological paradigms of nursing science from a critical approach. In this paper, we identified and analysed some new research lines and trends which anticipate the reorganization of nursing sciences and the paradigms emerging from nursing care: biotechnology-centred knowledge; the interface between nursing knowledge and new information technologies; body care centred knowledge; the human body as a cyborg body; and the rediscovery of an aesthetic knowledge in nursing care. © 2015 John Wiley & Sons Ltd.
Cloud computing for context-aware enhanced m-Health services.
Fernandez-Llatas, Carlos; Pileggi, Salvatore F; Ibañez, Gema; Valero, Zoe; Sala, Pilar
2015-01-01
m-Health services are increasing their presence in our lives due to the high penetration of new smartphone devices. This new scenario poses challenges in terms of information accessibility that require new paradigms enabling applications to access data in a continuous and ubiquitous way, while ensuring the privacy required for the kind of data accessed. This paper proposes an architecture based on cloud computing paradigms in order to empower new m-Health applications to enrich their results by providing secure access to user data.
Hein, David W.
2009-01-01
Arylamine N-acetyltransferase 1 (NAT1) and 2 (NAT2) exhibit single nucleotide polymorphisms (SNPs) in human populations that modify drug and carcinogen metabolism. This paper updates the identity, location, and functional effects of these SNPs and then follows with emerging concepts for understanding why pharmacogenetic findings may not be replicated consistently. Using this paradigm as an example, laboratory-based mechanistic analyses can reveal complexities such that genetic polymorphisms become biologically and medically relevant when confounding factors are more fully understood and considered. As medical care moves to a more personalized approach, the implications of these confounding factors will be important in understanding the complexities of personalized medicine. PMID:19379125
NASA Astrophysics Data System (ADS)
Agrawal, Ankit; Choudhary, Alok
2016-05-01
Our ability to collect "big data" has greatly surpassed our capability to analyze it, underscoring the emergence of the fourth paradigm of science, which is data-driven discovery. The need for data informatics is also emphasized by the Materials Genome Initiative (MGI), further boosting the emerging field of materials informatics. In this article, we look at how data-driven techniques are playing a big role in deciphering processing-structure-property-performance relationships in materials, with illustrative examples of both forward models (property prediction) and inverse models (materials discovery). Such analytics can significantly reduce time-to-insight and accelerate cost-effective materials discovery, which is the goal of MGI.
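A minimal sketch of what such a forward model looks like in practice is shown below, using synthetic descriptors and scikit-learn; the descriptors, target property and screening step are invented for illustration and have nothing to do with any real materials dataset.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 4))                         # toy composition/processing descriptors
y = 3 * X[:, 0] - 2 * X[:, 1] ** 2 + 0.1 * rng.normal(size=500)  # toy "property"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
forward_model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out R^2:", round(forward_model.score(X_test, y_test), 3))

# Crude inverse step: screen random candidates for the highest predicted property
candidates = rng.random((10_000, 4))
best = candidates[np.argmax(forward_model.predict(candidates))]
print("most promising candidate descriptors:", best.round(3))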
Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems
ERIC Educational Resources Information Center
Bostandjiev, Svetlin Alex I.
2012-01-01
The evolution of the Web brought interesting new problems to computer scientists that we loosely classify in the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large number of people in a collective intelligence fashion (i.e. wikis), and performing computations on social…
Behavioural and computational varieties of response inhibition in eye movements.
Cutsuridis, Vassilis
2017-04-19
Response inhibition is the ability to override a planned or an already initiated response. It is the hallmark of executive control, as its deficits favour impulsive behaviours, which may be detrimental to an individual's life. This article reviews behavioural and computational guises of response inhibition, focusing only on inhibition of oculomotor responses. It first reviews behavioural paradigms of response inhibition in eye movement research, namely the countermanding and antisaccade paradigms, both proven to be useful tools for the study of response inhibition in cognitive neuroscience and psychopathology. It then briefly reviews the neural mechanisms of response inhibition in these two behavioural paradigms. Computational models that embody hypotheses and/or theories of the mechanisms underlying performance in both behavioural paradigms are discussed, along with a critical analysis of their strengths and weaknesses. All models assume a race between decision processes; which decision process wins the race in each paradigm depends on different mechanisms. Response latency has been shown to be a stochastic process and has proven to be an important measure of the cognitive control processes involved in response stopping in healthy and patient groups. Inhibitory deficits in different brain diseases are then reviewed, including schizophrenia and obsessive-compulsive disorder. Finally, new directions are suggested to improve the performance of models of response inhibition by drawing inspiration from successes of models in other domains. This article is part of the themed issue 'Movement suppression: brain mechanisms for stopping and stillness'. © 2017 The Author(s).
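To make the shared modelling assumption concrete, here is a minimal independent "horse race" simulation of the countermanding (stop-signal) task: a GO process and a STOP process race, and the response escapes inhibition when GO finishes first. The Gaussian finishing times and parameter values are generic textbook choices, not any particular model reviewed in the article.

import random

def trial(ssd, go_mu=250, go_sd=50, stop_mu=120, stop_sd=30):
    """ssd: stop-signal delay in ms; returns True if the response escapes inhibition."""
    go_finish = random.gauss(go_mu, go_sd)
    stop_finish = ssd + random.gauss(stop_mu, stop_sd)
    return go_finish < stop_finish

def p_respond(ssd, n=20_000):
    return sum(trial(ssd) for _ in range(n)) / n

for ssd in (50, 100, 150, 200):     # longer delays leave more responses uninhibited
    print(f"SSD = {ssd} ms  P(respond) = {p_respond(ssd):.2f}")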
Crayfish Self-Administer Amphetamine in a Spatially Contingent Task.
Datta, Udita; van Staaden, Moira; Huber, Robert
2018-01-01
Natural reward is an essential element of any organism's ability to adapt to environmental variation. Its underlying circuits and mechanisms guide the learning process as they help associate an event, or cue, with the perception of an outcome's value. More generally, natural reward serves as the fundamental generator of all motivated behavior. Addictive plant alkaloids are able to activate this circuitry in taxa ranging from planaria to humans. With modularly organized nervous systems and confirmed vulnerabilities to human drugs of abuse, crayfish have recently emerged as a compelling model for the study of the addiction cycle, including psychostimulant effects, sensitization, withdrawal, reinstatement, and drug reward in conditioned place preference paradigms. Here we extend this work with the demonstration of a spatially contingent, operant drug self-administration paradigm for amphetamine. When the animal enters a quadrant of the arena with a particular textured substrate, a computer-based control system delivers amphetamine through an indwelling fine-bore cannula. Resulting reward strength, dose-response, and the time course of operant conditioning were assessed. Individuals experiencing the drug contingent on their behavior displayed enhanced rates of operant responding compared to those of their yoked (non-contingent) counterparts. Application of amphetamine near the supra-esophageal ganglion elicited stronger and more robust increases in operant responding than did systemic infusions. This work demonstrates automated implementation of a spatially contingent self-administration paradigm in crayfish, which provides a powerful tool to explore comparative perspectives in drug-sensitive reward, the mechanisms of learning underlying the addictive cycle, and phylogenetically conserved vulnerabilities to psychostimulant compounds.
ERIC Educational Resources Information Center
Haegele, Justin A.; Hodge, Samuel R.
2015-01-01
Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…
Envisioning future cognitive telerehabilitation technologies: a co-design process with clinicians.
How, Tuck-Voon; Hwang, Amy S; Green, Robin E A; Mihailidis, Alex
2017-04-01
Purpose: Cognitive telerehabilitation is the concept of delivering cognitive assessment, feedback, or therapeutic intervention at a distance through technology. With the increase of mobile devices, wearable sensors, and novel human-computer interfaces, new possibilities are emerging to expand the cognitive telerehabilitation paradigm. This research aims to: (1) explore design opportunities and considerations when applying emergent pervasive computing technologies to cognitive telerehabilitation and (2) develop a generative co-design process for use with rehabilitation clinicians. Methods: We conducted a custom co-design process that used design cards, probes, and design sessions with traumatic brain injury (TBI) clinicians. All field notes and transcripts were analyzed qualitatively. Results: Potential opportunities for TBI cognitive telerehabilitation exist in the areas of communication competency, executive functioning, emotional regulation, energy management, assessment, and skill training. Designers of TBI cognitive telerehabilitation technologies should consider how technologies are adapted to a patient's physical/cognitive/emotional state, their changing rehabilitation trajectory, and their surrounding life context (e.g. social considerations). Clinicians were receptive to our co-design approach. Conclusion: Pervasive computing offers new opportunities for life-situated cognitive telerehabilitation. Convivial design methods, such as this co-design process, are a helpful way to explore new design opportunities and an important space for further methodological development. Implications for Rehabilitation: Designers of rehabilitation technologies should consider how to extend current design methods in order to facilitate the creative contribution of rehabilitation stakeholders. This co-design approach enables a fuller participation from rehabilitation clinicians at the front-end of design. Pervasive computing has the potential to: extend the duration and intensity of cognitive telerehabilitation training (including the delivery of 'booster' sessions or maintenance therapies); provide assessment and treatment in the context of a traumatic brain injury (TBI) patient's everyday life (thereby enhancing generalization); and permit time-sensitive interventions. Long-term use of pervasive computing for TBI cognitive telerehabilitation should take into account a patient's changing recovery trajectory, their meaningful goals, and their journey from loss to redefinition.
The Study of Surface Computer Supported Cooperative Work and Its Design, Efficiency, and Challenges
ERIC Educational Resources Information Center
Hwang, Wu-Yuin; Su, Jia-Han
2012-01-01
In this study, a Surface Computer Supported Cooperative Work paradigm is proposed. Recently, multitouch technology has become widely available for human-computer interaction. We found it has great potential to facilitate more awareness of human-to-human interaction than personal computers (PCs) in colocated collaborative work. However, other…
An Introductory Course on Service-Oriented Computing for High Schools
ERIC Educational Resources Information Center
Tsai, W. T.; Chen, Yinong; Cheng, Calvin; Sun, Xin; Bitter, Gary; White, Mary
2008-01-01
Service-Oriented Computing (SOC) is a new computing paradigm that has been adopted by major computer companies as well as government agencies such as the Department of Defense for mission-critical applications. SOC is being used for developing Web and electronic business applications, as well as robotics, gaming, and scientific applications. Yet,…
Changing computing paradigms towards power efficiency.
Klavík, Pavel; Malossi, A Cristiano I; Bekas, Costas; Curioni, Alessandro
2014-06-28
Power awareness is fast becoming immensely important in computing, ranging from traditional high-performance computing applications to the new generation of data-centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations, which finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
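The combination of low- and high-precision arithmetic can be illustrated with classic mixed-precision iterative refinement for a linear system: the solve is done cheaply in single precision, while the residual is computed and accumulated in double precision. This sketch is only an illustration of that general idea (a real implementation would factorize the low-precision matrix once and reuse it); it is not the authors' code, and it says nothing about their power-profiling tools.

import numpy as np

rng = np.random.default_rng(1)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned toy system
b = rng.standard_normal(n)

A32, b32 = A.astype(np.float32), b.astype(np.float32)
x = np.linalg.solve(A32, b32).astype(np.float64)   # low-precision initial solve

for it in range(5):
    r = b - A @ x                                   # residual in double precision
    print(f"iteration {it}: residual norm = {np.linalg.norm(r):.2e}")
    dx = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
    x += dx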
Emerging interdisciplinary fields in the coming intelligence/convergence era
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2012-09-01
Dramatic advances are on the horizon, resulting from the rapid pace of development of several technologies, including computing, communication, mobile, robotic, and interactive technologies. These advances, along with the trend towards convergence of traditional engineering disciplines with physical, life and other science disciplines, will result in the development of new interdisciplinary fields, as well as in new paradigms for engineering practice in the coming intelligence/convergence era (post-information age). The interdisciplinary fields include Cyber Engineering, Living Systems Engineering, Biomechatronics/Robotics Engineering, Knowledge Engineering, Emergent/Complexity Engineering, and Multiscale Systems Engineering. The paper identifies some of the characteristics of the intelligence/convergence era, gives a broad definition of convergence, describes some of the emerging interdisciplinary fields, and lists some of the academic and other organizations working in these disciplines. The need is described for establishing a Hierarchical Cyber-Physical Ecosystem for facilitating interdisciplinary collaborations and accelerating the development of a skilled workforce in the new fields. The major components of the ecosystem are listed. The new interdisciplinary fields will yield critical advances in engineering practice and help in addressing future challenges in a broad array of sectors, from manufacturing to energy, transportation, climate, and healthcare. They will also enable building large future complex adaptive systems-of-systems, such as intelligent multimodal transportation systems, optimized multi-energy systems, intelligent disaster prevention systems, and smart cities.
ERIC Educational Resources Information Center
Abdallah, Mahmoud M. S.; Wegerif, Rupert B.
2014-01-01
This article discusses educational design-based research (DBR) as an emerging paradigm/methodology in educational enquiry that can be used as a mixed-method, problem-oriented research framework, and thus can act as an alternative to other traditional paradigms/methodologies prominent within the Egyptian context of educational enquiry. DBR is often…
School on Cloud: Towards a Paradigm Shift
ERIC Educational Resources Information Center
Koutsopoulos, Kostis C.; Kotsanis, Yannis C.
2014-01-01
This paper presents the basic concept of the EU Network School on Cloud: Namely, that present conditions require a new teaching and learning paradigm based on the integrated dimension of education, when considering the use of cloud computing. In other words, it is suggested that there is a need for an integrated approach which is simultaneously…
Integrative Mental Health (IMH): paradigm, research, and clinical practice.
Lake, James; Helgason, Chanel; Sarris, Jerome
2012-01-01
This paper provides an overview of the rapidly evolving paradigm of "Integrative Mental Health (IMH)." The paradigm of contemporary biomedical psychiatry and its contrast to non-allopathic systems of medicine is initially reviewed, followed by an exploration of the emerging paradigm of IMH, which aims to reconcile the bio-psycho-socio-spiritual model with evidence-based methods from traditional healing practices. IMH is rapidly transforming conventional understandings of mental illness and has significant positive implications for the day-to-day practice of mental health care. IMH incorporates mainstream interventions such as pharmacologic treatments, psychotherapy, and psychosocial interventions, as well as alternative therapies such as acupuncture, herbal and nutritional medicine, dietary modification, meditation, etc. Two recent international conferences in Europe and the United States show that interest in integrative mental health care is growing rapidly. In response, the International Network of Integrative Mental Health (INIMH: www.INIMH.org) was established in 2010 with the objective of creating an international network of clinicians, researchers, and public health advocates to advance a global agenda for research, education, and clinical practice of evidence-based integrative mental health care. The paper concludes with a discussion of emerging opportunities for research in IMH, and an exploration of potential clinical applications of integrative mental health care. Copyright © 2012 Elsevier Inc. All rights reserved.
Emergence of life: Physical chemistry changes the paradigm.
Spitzer, Jan; Pielak, Gary J; Poolman, Bert
2015-06-10
Origin of life research has been slow to advance not only because of its complex evolutionary nature (Franklin Harold: In Search of Cell History, 2014) but also because of the lack of agreement on fundamental concepts, including the question of 'what is life?'. To re-energize the research and define a new experimental paradigm, we advance four premises to better understand the physicochemical complexities of life's emergence: (1) Chemical and Darwinian (biological) evolutions are distinct, but become continuous with the appearance of heredity. (2) Earth's chemical evolution is driven by energies of cycling (diurnal) disequilibria and by energies of hydrothermal vents. (3) Earth's overall chemical complexity must be high at the origin of life for a subset of (complex) chemicals to phase separate and evolve into living states. (4) Macromolecular crowding in aqueous electrolytes under confined conditions enables evolution of molecular recognition and cellular self-organization. We discuss these premises in relation to the current 'constructive' (non-evolutionary) paradigm of origins research: the process of complexification of chemical matter 'from the simple to the complex'. This paradigm artificially avoids planetary chemical complexity and the natural tendency of molecular compositions toward maximum disorder embodied in the second law of thermodynamics. Our four premises suggest an empirical program of experiments involving complex chemical compositions under cycling gradients of temperature, water activity and electromagnetic radiation.
Hayashibe, Mitsuhiro; Shimoda, Shingo
2014-01-01
A human motor system can improve its behavior toward optimal movement. The skeletal system has more degrees of freedom than the task dimensions, which incurs an ill-posed problem. The multijoint system involves complex interaction torques between joints. To produce optimal motion in terms of energy consumption, cost-function-based optimization has commonly been used in previous works. Even if an optimal motor pattern is employed phenomenologically, there is no evidence for a physiological process in our central nervous system that resembles such a mathematical optimization. In this study, we aim to find a more primitive computational mechanism with a modular configuration to realize adaptability and optimality without prior knowledge of system dynamics. We propose a novel motor control paradigm based on tacit learning with task-space feedback. The accumulation of motor commands during repetitive environmental interactions plays a major role in the learning process. The paradigm is applied to a vertical cyclic reaching task that involves complex interaction torques. We evaluated whether the proposed paradigm can learn how to optimize solutions with a 3-joint, planar biomechanical model. The results demonstrate that the proposed method was valid for acquiring motor synergy and resulted in energy-efficient solutions for different load conditions. With feedback control alone, the trajectory is largely affected by the interaction torques; in contrast, with tacit learning the trajectory is corrected over time toward optimal solutions. Energy-efficient solutions were obtained through the emergence of motor synergy. During learning, the contribution from the feedforward controller is augmented and that of the feedback controller is significantly reduced, down to 12% with no load at the hand and 16% with a 0.5 kg load. The proposed paradigm could provide an optimization process in a redundant system with a dynamic-model-free and cost-function-free approach. PMID:24616695
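The core idea, accumulating feedback commands into a feedforward term over repeated trials so that control gradually shifts from feedback to feedforward, can be sketched in one dimension as follows. This toy plant, gains and trial counts are invented for illustration; it is not the authors' 3-joint biomechanical model or their tacit-learning controller.

def one_trial(u_ff, gain=0.8, target=1.0, kp=0.5, steps=50):
    y, fb_sum = 0.0, 0.0
    for _ in range(steps):
        u_fb = kp * (target - y)      # feedback command from task-space error
        y = gain * (u_ff + u_fb)      # toy plant with unknown gain
        fb_sum += u_fb
    return y, fb_sum / steps

u_ff = 0.0                            # feedforward command, built up "tacitly"
for trial in range(15):
    y, fb_mean = one_trial(u_ff)
    print(f"trial {trial:2d}  final y = {y:.3f}  mean feedback = {fb_mean:.3f}")
    u_ff += fb_mean                   # accumulate feedback into feedforward

Over trials the feedback contribution shrinks toward zero while the feedforward term absorbs the control effort, which is the qualitative behaviour reported in the abstract.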
Guastello, Stephen J; Gorin, Hillary; Huschen, Samuel; Peters, Natalie E; Fabisch, Megan; Poston, Kirsten
2012-10-01
It has become well established in laboratory experiments that switching tasks, perhaps due to interruptions at work, incurs a cost in the response time to complete the next task. Conditions are also known that exaggerate or lessen these switching costs. Although switching costs can contribute to fatigue, task switching can also be an adaptive response to fatigue. The present study introduces a new research paradigm for studying the emergence of voluntary task switching regimes, self-organizing processes therein, and the possibly conflicting roles of switching costs and minimum entropy. Fifty-four undergraduates performed 7 different computer-based cognitive tasks, producing sets of 49 responses under instructional conditions requiring task quotas or no quotas. The sequences of task choices were analyzed using orbital decomposition to extract pattern types and lengths, which were then classified and compared with regard to Shannon entropy, topological entropy, number of task switches involved, and overall performance. Results indicated that similar but different patterns were generated under the two instructional conditions, and better performance was associated with lower topological entropy. Both entropy metrics were associated with the amount of voluntary task switching. Future research should explore conditions affecting the trade-off between switching costs and entropy, levels of automaticity between task elements, and the role of voluntary switching regimes on fatigue.
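As a small illustration of the entropy side of the analysis, the sketch below computes the Shannon entropy of fixed-length patterns in a task-choice sequence: a blocked sequence (each task repeated before switching) yields lower pattern entropy than a constantly rotating one. This shows only the entropy measure; orbital decomposition additionally selects the pattern length from recurrence statistics, which is not reproduced here.

import math
from collections import Counter

def pattern_entropy(sequence, length=2):
    patterns = [sequence[i:i + length] for i in range(len(sequence) - length + 1)]
    counts = Counter(patterns)
    n = len(patterns)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

blocked = "".join(task * 7 for task in "ABCDEFG")   # 49 responses, tasks in blocks
rotating = "ABCDEFG" * 7                            # 49 responses, constant switching
print("blocked sequence entropy :", round(pattern_entropy(blocked), 2))
print("rotating sequence entropy:", round(pattern_entropy(rotating), 2))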
Animal-Free Chemical Safety Assessment
Loizou, George D.
2016-01-01
The exponential growth of the Internet of Things and the global popularity and remarkable decline in cost of the mobile phone are driving the digital transformation of medical practice. The rapidly maturing digital, non-medical world of mobile (wireless) devices, cloud computing and social networking is coalescing with the emerging digital medical world of omics data, biosensors and advanced imaging, which offers the increasingly realistic prospect of personalized medicine. Described as a potential "seismic" shift from the current "healthcare" model to a "wellness" paradigm that is predictive, preventative, personalized and participatory, this change is based on the development of increasingly sophisticated biosensors which can track and measure key biochemical variables in people. Additional key drivers in this shift are metabolomic and proteomic signatures, which are increasingly being reported as pre-symptomatic, diagnostic and prognostic of toxicity and disease. These advancements also have profound implications for toxicological evaluation and safety assessment of pharmaceuticals and environmental chemicals. An approach based primarily on human in vivo and high-throughput in vitro human cell-line data is a distinct possibility. This would transform current chemical safety assessment practice from a human "data poor" to a human "data rich" environment. This could also lead to a seismic shift from the current animal-based to an animal-free chemical safety assessment paradigm. PMID:27493630
Pezzulo, Giovanni; Levin, Michael
2016-11-01
It is widely assumed in developmental biology and bioengineering that optimal understanding and control of complex living systems follows from models of molecular events. The success of reductionism has overshadowed attempts at top-down models and control policies in biological systems. However, other fields, including physics, engineering and neuroscience, have successfully used the explanations and models at higher levels of organization, including least-action principles in physics and control-theoretic models in computational neuroscience. Exploiting the dynamic regulation of pattern formation in embryogenesis and regeneration requires new approaches to understand how cells cooperate towards large-scale anatomical goal states. Here, we argue that top-down models of pattern homeostasis serve as proof of principle for extending the current paradigm beyond emergence and molecule-level rules. We define top-down control in a biological context, discuss the examples of how cognitive neuroscience and physics exploit these strategies, and illustrate areas in which they may offer significant advantages as complements to the mainstream paradigm. By targeting system controls at multiple levels of organization and demystifying goal-directed (cybernetic) processes, top-down strategies represent a roadmap for using the deep insights of other fields for transformative advances in regenerative medicine and systems bioengineering. © 2016 The Author(s).
Good practices in free-energy calculations.
Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christophe
2010-08-19
As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in a wide range of research areas. Yet, the reliability of these calculations can often be improved significantly if a number of precepts, or good practices, are followed. Although the theory upon which these good practices rely has largely been known for many years, it is often overlooked or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. In this contribution, the current best practices for carrying out free-energy calculations using free energy perturbation and nonequilibrium work methods are discussed, demonstrating that at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. Monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway, and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision.
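One of the monitored quantities discussed above, the free-energy difference estimated by free energy perturbation, reduces to an exponential average of the energy differences (work values) between the two states. The sketch below applies that estimator to synthetic Gaussian work values and compares it with the analytic Gaussian result; it is a toy numerical illustration, not a recipe for production free-energy calculations, which would also monitor overlap and combine forward and backward data (e.g. with BAR).

import numpy as np

kT = 0.596                                   # roughly kcal/mol at 300 K
rng = np.random.default_rng(0)

# Synthetic forward "work" samples W = U_B - U_A (Gaussian for illustration only)
W = rng.normal(loc=2.0, scale=1.0, size=5000)

# Forward FEP (Zwanzig) estimator: dF = -kT * ln < exp(-W / kT) >
dF = -kT * np.log(np.mean(np.exp(-W / kT)))
print("FEP estimate       :", round(dF, 3))
# For Gaussian work the exact answer is mean - variance / (2 kT)
print("Gaussian reference :", round(W.mean() - W.var() / (2 * kT), 3))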
Implications of the Java language on computer-based patient records.
Pollard, D; Kucharz, E; Hammond, W E
1996-01-01
The growing use of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently, the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While this at times provides an effective medium for the delivery of CBPR, it is a less than optimal solution: the server-centric dynamics and low levels of interactivity do not provide the robust applications required in a clinical environment. The emergence of Sun Microsystems' Java language offers a solution to this problem. In this paper we examine the Java language and its implications for the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1997-01-01
Economic stresses are forcing many industries to reduce cost and time-to-market and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce design and development times. Several computational tools and facilities have been developed to support the design process; some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities that have high potential for use in a future design environment for aerospace systems. The outline of the introductory remarks is as follows. First, the characteristics and design drivers of future aerospace systems are outlined; second, the simulation-based design environment and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center and JPL is presented, and the anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.
Paradigms for Realizing Machine Learning Algorithms.
Agneeswaran, Vijay Srinivas; Tonpay, Pranay; Tiwary, Jayati
2013-12-01
The article explains the three generations of machine learning algorithms, all of which aim to operate on big data. The first generation tools are SAS, SPSS, etc.; second generation realizations include Mahout and RapidMiner (which work over Hadoop); and third generation paradigms include Spark and GraphLab, among others. The essence of the article is that for a number of machine learning algorithms, it is important to look beyond Hadoop's MapReduce paradigm in order to make them work on big data. A number of promising contenders have emerged in the third generation that can be exploited to realize deep analytics on big data.
Vaccine development for emerging virulent infectious diseases.
Maslow, Joel N
2017-10-04
The recent outbreak of Zaire Ebola virus in West Africa altered the classical paradigm of vaccine development and that for emerging infectious diseases (EIDs) in general. In this paper, the precepts of vaccine discovery and advancement through pre-clinical and clinical assessment are discussed in the context of the recent Ebola virus, Middle East Respiratory Syndrome coronavirus (MERS-CoV), and Zika virus outbreaks. Clinical trial design for diseases with high mortality rates and/or high morbidity in the face of a global perception of immediate need and the factors that drive design in the face of a changing epidemiology are presented. Vaccines for EIDs thus present a unique paradigm to standard development precepts. Copyright © 2017 Elsevier Ltd. All rights reserved.
Ahmed, N; Zheng, Ziyi; Mueller, K
2012-12-01
Due to the inherent characteristics of the visualization process, most of the problems in this field have strong ties with human cognition and perception. This makes the human brain and sensory system the only truly appropriate platform for evaluating and fine-tuning a new visualization method or paradigm. However, getting humans to volunteer for these purposes has always been a significant obstacle, and thus this phase of the development process has traditionally formed a bottleneck, slowing down progress in visualization research. We propose to take advantage of the newly emerging field of Human Computation (HC) to overcome these challenges. HC promotes the idea that rather than considering humans as users of the computational system, they can be made part of a hybrid computational loop consisting of traditional computation resources and the human brain and sensory system. This approach is particularly successful in cases where part of the computational problem is considered intractable using known computer algorithms but is trivial given common-sense human knowledge. In this paper, we focus on HC from the perspective of solving visualization problems and also outline a framework by which humans can be easily enticed to volunteer their HC resources. We introduce a purpose-driven game titled "Disguise" which serves as a prototypical example of how the evaluation of visualization algorithms can be mapped into a fun and addicting activity, allowing this task to be accomplished in an extensive yet cost-effective way. Finally, we sketch out a framework that transcends the pure evaluation of existing visualization methods to support the design of new ones.
ERIC Educational Resources Information Center
Gates, Gordon S., Ed.; Wolverton, Mimi, Ed.; Gmelch, Walter H., Ed.
2007-01-01
This collection of chapters presents research focused on emerging strategies, paradigms, and theories on the sources, experiences, and consequences of stress, coping, and prevention pertaining to students, teachers and administrators. Studies analyze data collected through action research, program evaluation, surveys, qualitative interviewing,…
Teaching Business Strategy for an Emerging Economy: An Internet-Based Simulation.
ERIC Educational Resources Information Center
Miller, Van V.
2003-01-01
Describes an Internet-based simulation used in a course about business strategy in an emerging economy. The simulation, when coupled with today's dominant strategy paradigm, the Resource Based View, appears to yield a course design that attracts students while emphasizing the actual substance which is crucial in such a course. (EV)
ERIC Educational Resources Information Center
DePinto, Ross M.
2013-01-01
Much of the relevant literature in the domains of leadership development, succession planning, and cross-generational issues that discusses learning paradigms associated with emerging generational cohorts has been based on qualitative research and anecdotal evidence. In contrast, this study employed quantitative research methods using a validated…
Atlantoaxial Rotatory Subluxation: A Review for the Pediatric Emergency Physician.
Kinon, Merritt D; Nasser, Rani; Nakhla, Jonathan; Desai, Rupen; Moreno, Jessica R; Yassari, Reza; Bagley, Carlos A
2016-10-01
Pediatric emergency physicians must have a high clinical suspicion for atlantoaxial rotatory subluxation (AARS), particularly when a child presents with neck pain and an abnormal head posture without the ability to return to a neutral position. As shown in the neurosurgical literature, timely diagnosis and swift initiation of treatment have a greater chance of treatment success for the patient. However, timely treatment is complicated because torticollis can result from a variety of maladies, including congenital abnormalities involving the C1-C2 joint or the surrounding supporting muscles and ligaments, central nervous system abnormalities, obstetric palsies from brachial plexus injuries, clavicle fractures, head and neck surgery, and infection. The treating pediatrician must discern the etiology of the underlying problem to determine both timing and treatment paradigms, which vary widely between these illnesses. We present a comprehensive review of AARS that is intended for pediatric emergency physicians. Management of AARS can vary widely based on factors such as the duration of symptoms and the patient's history. The goal of this review is to streamline the management paradigms and provide an inclusive review for pediatric emergency first responders.
ERIC Educational Resources Information Center
Aydin, Emin
2005-01-01
The purpose of this study is to review the impact that computers have had on mathematics itself and on the mathematics curriculum. The study aims at investigating different applications of computers in education in general, and in mathematics education in particular, and their applications to the mathematics curriculum and to the teaching and learning of…
The evolution and future of minimalism in neurological surgery.
Liu, Charles Y; Wang, Michael Y; Apuzzo, Michael L J
2004-11-01
The evolution of the field of neurological surgery has been marked by a progressive minimalism. This has been evident in the development of an entire arsenal of modern neurosurgical enterprises, including microneurosurgery, neuroendoscopy, stereotactic neurosurgery, endovascular techniques, radiosurgical systems, intraoperative and navigational devices, and in the last decade, cellular and molecular adjuvants. In addition to reviewing the major developments and paradigm shifts in the cyclic reinvention of the field as it currently stands, this paper attempts to identify forces and developments that are likely to fuel the irresistible escalation of minimalism into the future. These forces include discoveries in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.
When fragments link: a bibliometric perspective on the development of fragment-based drug discovery.
Romasanta, Angelo K S; van der Sijde, Peter; Hellsten, Iina; Hubbard, Roderick E; Keseru, Gyorgy M; van Muijlwijk-Koezen, Jacqueline; de Esch, Iwan J P
2018-05-05
Fragment-based drug discovery (FBDD) is a highly interdisciplinary field, rich in ideas integrated from pharmaceutical sciences, chemistry, biology, and physics, among others. To enrich our understanding of the development of the field, we used bibliometric techniques to analyze 3642 publications in FBDD, complementing accounts by key practitioners. Mapping its core papers, we found the transfer of knowledge from academia to industry. Co-authorship analysis showed that university-industry collaboration has grown over time. Moreover, we show how ideas from other scientific disciplines have been integrated into the FBDD paradigm. Keyword analysis showed that the field is organized into four interconnected practices: library design, fragment screening, computational methods, and optimization. This study highlights the importance of interactions among various individuals and institutions from diverse disciplines in newly emerging scientific fields. Copyright © 2018. Published by Elsevier Ltd.
Interactions of timing and prediction error learning.
Kirkpatrick, Kimberly
2014-01-01
Timing and prediction error learning have historically been treated as independent processes, but growing evidence has indicated that they are not orthogonal. Timing emerges at the earliest time point when conditioned responses are observed, and temporal variables modulate prediction error learning in both simple conditioning and cue competition paradigms. In addition, prediction errors, through changes in reward magnitude or value alter timing of behavior. Thus, there appears to be a bi-directional interaction between timing and prediction error learning. Modern theories have attempted to integrate the two processes with mixed success. A neurocomputational approach to theory development is espoused, which draws on neurobiological evidence to guide and constrain computational model development. Heuristics for future model development are presented with the goal of sparking new approaches to theory development in the timing and prediction error fields. Copyright © 2013 Elsevier B.V. All rights reserved.
Early prediction of movie box office success based on Wikipedia activity big data.
Mestyán, Márton; Yasseri, Taha; Kertész, János
2013-01-01
Use of socially generated "big data" to access information about collective states of mind in human societies has become a new paradigm in the emerging field of computational social science. A natural application of this is the prediction of society's reaction to a new product in the sense of popularity and adoption rate. However, bridging the gap between "real-time monitoring" and "early predicting" remains a big challenge. Here we report on an endeavor to build a minimalistic predictive model for the financial success of movies based on collective activity data of online users. We show that the popularity of a movie can be predicted well before its release by measuring and analyzing the activity level of editors and viewers of the corresponding entry for the movie in Wikipedia, the well-known online encyclopedia.
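The statistical setup behind such a minimalistic model can be sketched as a regression of (log) opening revenue on pre-release activity features such as page views, number of distinct editors and edit counts. The data below are synthetic and the feature set is invented for illustration; this is not the authors' dataset or their exact model.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_movies = 200
views = rng.lognormal(10, 1, n_movies)        # toy pre-release page views
editors = rng.lognormal(3, 0.8, n_movies)     # toy number of distinct editors
edits = editors * rng.lognormal(1, 0.3, n_movies)

X = np.log(np.column_stack([views, editors, edits]))
log_revenue = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 1, n_movies)

model = LinearRegression().fit(X[:150], log_revenue[:150])
print("held-out R^2:", round(model.score(X[150:], log_revenue[150:]), 2))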
The fuzzy cube and causal efficacy: representation of concomitant mechanisms in stroke.
Jobe, Thomas H.; Helgason, Cathy M.
1998-04-01
Twentieth century medical science has embraced nineteenth century Boolean probability theory based upon two-valued Aristotelian logic. With the later addition of bit-based, von Neumann structured computational architectures, an epistemology based on randomness has led to a bivalent epidemiological methodology that dominates medical decision making. In contrast, fuzzy logic, based on twentieth century multi-valued logic, and computational structures that are content-addressed and adaptively modified, has advanced a new scientific paradigm for the twenty-first century. Diseases such as stroke involve multiple concomitant causal factors that are difficult to represent using conventional statistical methods. We tested which paradigm best represented this complex multi-causal clinical phenomenon, stroke. We show that the fuzzy logic paradigm better represented clinical complexity in cerebrovascular disease than current probability-theory-based methodology. We believe this finding is generalizable to all of clinical science, since multiple concomitant causal factors are involved in nearly all known pathological processes.
A novel brain-computer interface based on the rapid serial visual presentation paradigm.
Acqualagna, Laura; Treder, Matthias Sebastian; Schreuder, Martijn; Blankertz, Benjamin
2010-01-01
Most present-day visual brain computer interfaces (BCIs) suffer from the fact that they rely on eye movements, are slow-paced, or feature a small vocabulary. As a potential remedy, we explored a novel BCI paradigm consisting of a central rapid serial visual presentation (RSVP) of the stimuli. It has a large vocabulary and realizes a BCI system based on covert non-spatial selective visual attention. In an offline study, eight participants were presented sequences of rapid bursts of symbols. Two different speeds and two different color conditions were investigated. Robust early visual and P300 components were elicited time-locked to the presentation of the target. Offline classification revealed a mean accuracy of up to 90% for selecting the correct symbol out of 30 possibilities. The results suggest that RSVP-BCI is a promising new paradigm, also for patients with oculomotor impairments.
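The offline classification step described above can be sketched as follows: epochs time-locked to each symbol are labelled target or non-target, reduced to window-averaged amplitude features, and fed to a linear discriminant classifier. The single-channel epochs below are synthetic, with an artificial P300-like bump added to targets; this is not the study's EEG data or its actual processing pipeline.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_epochs, n_samples = 400, 100                      # e.g. 100 samples covering 0-800 ms
t = np.arange(n_samples)
labels = rng.integers(0, 2, n_epochs)               # 1 = target symbol, 0 = non-target

p300_shape = np.exp(-((t - 40) ** 2) / 60.0)        # crude bump around "300 ms"
epochs = rng.normal(0, 1, (n_epochs, n_samples)) + np.outer(labels, 2.0 * p300_shape)

features = epochs.reshape(n_epochs, 10, 10).mean(axis=2)   # mean amplitude per window
accuracy = cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5).mean()
print("cross-validated target vs non-target accuracy:", round(float(accuracy), 2))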
Pitfalls of implementing acute care surgery.
Kaplan, Lewis J; Frankel, Heidi; Davis, Kimberly A; Barie, Philip S
2007-05-01
Incorporating emergency general surgery into the current practice of the trauma and critical care surgeon carries sweeping implications for future practice and training. Herein, we examine the known benefits of the practice of emergency general surgery, contrast it with the emerging paradigm of acute care surgery, and examine pitfalls already encountered in integration of emergency general surgery into a traditional trauma/critical care surgery service. A MEDLINE literature search was supplemented with local experience and national presentations at major meetings to provide data for this review. Considerations including faculty complement, service structure, resident staffing, physician extenders, the decreased role of community hospitals in providing trauma and emergency general surgery care, and the effects on an elective operative schedule are inadequately explored at present. There are no firm recommendations as to how to incorporate emergency general surgery into a trauma/critical care practice that will satisfy both academic and community practice paradigms. The near future seems likely to embrace the expanded training and clinical care program termed acute care surgery. A host of essential elements have yet to be examined to undertake a critical analysis of the applicability, advisability, and appropriate structure of both emergency general surgery and acute care surgery in the United States. Proceeding along this pathway may be fraught with training, education, and implementation pitfalls that are ideally addressed before deploying acute care surgery as a national standard.
Unidata Cyberinfrastructure in the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Young, J. W.
2016-12-01
Data services, software, and user support are critical components of geosciences cyberinfrastructure that help researchers advance science. With the maturity of and significant advances in cloud computing, it has recently emerged as an alternative new paradigm for developing and delivering a broad array of services over the Internet. Cloud computing is now mature enough in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Given the enormous potential of cloud-based services, Unidata has been moving to augment its software, services, and data delivery mechanisms to align with the cloud-computing paradigm. To realize the above vision, Unidata has worked toward:
* Providing access to many types of data from a cloud (e.g., via the THREDDS Data Server, RAMADDA and EDEX servers);
* Deploying data-proximate tools to easily process, analyze, and visualize those data in a cloud environment for consumption by anyone, on any device, from anywhere, at any time;
* Developing and providing a range of pre-configured and well-integrated tools and services that can be deployed by any university in their own private or public cloud settings. Specifically, Unidata has adopted Docker for "containerized applications", making them easy to deploy. Docker helps to create "disposable" installs and eliminates many configuration challenges. Containerized applications include tools for data transport, access, analysis, and visualization: THREDDS Data Server, Integrated Data Viewer, GEMPAK, Local Data Manager, RAMADDA Data Server, and Python tools;
* Leveraging Jupyter as a central platform and hub, with its powerful set of interlinking tools, to connect interactively data servers, Python scientific libraries, scripts, and workflows;
* Exploring end-to-end modeling and prediction capabilities in the cloud;
* Partnering with NOAA and public cloud vendors (e.g., Amazon and OCC) on the NOAA Big Data Project to harness their capabilities and resources for the benefit of the academic community.
Vijayakumar, Nandita; Cheng, Theresa W; Pfeifer, Jennifer H
2017-06-01
Given the recent surge in functional neuroimaging studies on social exclusion, the current study employed activation likelihood estimation (ALE) based meta-analyses to identify brain regions that have consistently been implicated across different experimental paradigms used to investigate exclusion. We also examined the neural correlates underlying Cyberball, the most commonly used paradigm to study exclusion, as well as differences in exclusion-related activation between developing (7-18 years of age, from pre-adolescence up to late adolescence) and emerging adult (broadly defined as undergraduates, including late adolescence and young adulthood) samples. Results revealed involvement of the bilateral medial prefrontal and posterior cingulate cortices, right precuneus and left ventrolateral prefrontal cortex across the different paradigms used to examine social exclusion; similar activation patterns were identified when restricting the analysis to Cyberball studies. Investigations into age-related effects revealed that ventrolateral prefrontal activations identified in the full sample were driven by (i.e. present in) developmental samples, while medial prefrontal activations were driven by emerging adult samples. In addition, the right ventral striatum was implicated in exclusion, but only in developmental samples. Subtraction analysis revealed significantly greater activation likelihood in striatal and ventrolateral prefrontal clusters in the developmental samples as compared to emerging adults, though the opposite contrast failed to identify any significant regions. Findings integrate the knowledge accrued from functional neuroimaging studies on social exclusion to date, highlighting involvement of lateral prefrontal regions implicated in regulation and midline structures involved in social cognitive and self-evaluative processes across experimental paradigms and ages, as well as limbic structures in developing samples specifically. Copyright © 2017 Elsevier Inc. All rights reserved.
Automation of learning-set testing - The video-task paradigm
NASA Technical Reports Server (NTRS)
Washburn, David A.; Hopkins, William D.; Rumbaugh, Duane M.
1989-01-01
Researchers interested in studying discrimination learning in primates have typically utilized variations in the Wisconsin General Test Apparatus (WGTA). In the present experiment, a new testing apparatus for the study of primate learning is proposed. In the video-task paradigm, rhesus monkeys (Macaca mulatta) respond to computer-generated stimuli by manipulating a joystick. Using this apparatus, discrimination learning-set data for 2 monkeys were obtained. Performance on Trial 2 exceeded 80 percent within 200 discrimination learning problems. These data illustrate the utility of the video-task paradigm in comparative research. Additionally, the efficient learning and rich data that were characteristic of this study suggest several advantages of the present testing paradigm over traditional WGTA testing.
Happiness and the Family 2.0 Paradigm
NASA Astrophysics Data System (ADS)
Mocan, Rodica; Racorean, Stefana
Does new media technology have the potential to make us happier? This paper explores the influence of new information and communication technologies on family life satisfaction while analyzing some of the factors that determine changes in the way we live our lives in the information age. Family 2.0 is the new paradigm of family life, and the emergence of Web 2.0-type applications is at the very core of its existence.
Emerging treatment paradigms of ocular surface disease: proceedings of the Ocular Surface Workshop.
Rolando, M; Geerling, G; Dua, H S; Benítez-del-Castillo, J M; Creuzot-Garcher, C
2010-01-01
The objective of the Ocular Surface Workshop in Rome, Italy, on 6 February 2009, was to enhance the understanding of ocular surface disease (OSD) through an exploration of the nature of its complexities and current treatment paradigms across Europe. It was hoped that the peer-to-peer discussions and updates regarding common knowledge, clinical practices and shared experiences at this workshop would subsequently shape future treatment approaches to OSD.
Solving Math and Science Problems in the Real World with a Computational Mind
ERIC Educational Resources Information Center
Olabe, Juan Carlos; Basogain, Xabier; Olabe, Miguel Ángel; Maíz, Inmaculada; Castaño, Carlos
2014-01-01
This article presents a new paradigm for the study of Math and Sciences curriculum during primary and secondary education. A workshop for Education undergraduates at four different campuses (n = 242) was designed to introduce participants to the new paradigm. In order to make a qualitative analysis of the current school methodologies in…
The Paradigm Recursion: Is It More Accessible When Introduced in Middle School?
ERIC Educational Resources Information Center
Gunion, Katherine; Milford, Todd; Stege, Ulrike
2009-01-01
Recursion is a programming paradigm as well as a problem solving strategy thought to be very challenging to grasp for university students. This article outlines a pilot study, which expands the age range of students exposed to the concept of recursion in computer science through instruction in a series of interesting and engaging activities. In…
Instance-based learning: integrating sampling and repeated decisions from experience.
Gonzalez, Cleotilde; Dutt, Varun
2011-10-01
In decisions from experience, there are 2 experimental paradigms: sampling and repeated-choice. In the sampling paradigm, participants sample between 2 options as many times as they want (i.e., the stopping point is variable), observe the outcome with no real consequences each time, and finally select 1 of the 2 options that cause them to earn or lose money. In the repeated-choice paradigm, participants select 1 of the 2 options for a fixed number of times and receive immediate outcome feedback that affects their earnings. These 2 experimental paradigms have been studied independently, and different cognitive processes have often been assumed to take place in each, as represented in widely diverse computational models. We demonstrate that behavior in these 2 paradigms relies upon common cognitive processes proposed by the instance-based learning theory (IBLT; Gonzalez, Lerch, & Lebiere, 2003) and that the stopping point is the only difference between the 2 paradigms. A single cognitive model based on IBLT (with an added stopping point rule in the sampling paradigm) captures human choices and predicts the sequence of choice selections across both paradigms. We integrate the paradigms through quantitative model comparison, where IBLT outperforms the best models created for each paradigm separately. We discuss the implications for the psychology of decision making. © 2011 American Psychological Association
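To make the shared mechanism concrete, the toy Python sketch below (our illustration, not the published IBLT/ACT-R model, which uses activation- and recency-weighted blending) stores outcome instances per option, chooses by the mean of experienced outcomes, and differs between the two paradigms only in the stopping rule:

    # Toy illustration of the sampling vs. repeated-choice distinction.
    import random

    def choose(memory):
        """Pick the option with the higher mean of stored outcomes (ties broken at random)."""
        means = {k: (sum(v) / len(v) if v else 0.0) for k, v in memory.items()}
        best = max(means.values())
        return random.choice([k for k, m in means.items() if m == best])

    def run(paradigm, options, n_trials=50, stop_prob=0.1):
        memory = {k: [] for k in options}              # stored instances per option
        if paradigm == "sampling":
            # Sample without consequence until a variable stopping point, then choose once.
            while random.random() > stop_prob:
                k = random.choice(list(options))
                memory[k].append(options[k]())
            return choose(memory)
        earnings = 0.0                                  # "repeated-choice"
        for _ in range(n_trials):                       # fixed number of consequential choices
            k = choose(memory) if any(memory.values()) else random.choice(list(options))
            outcome = options[k]()
            memory[k].append(outcome)                   # immediate outcome feedback
            earnings += outcome
        return earnings

    # Two options: a sure payoff and a risky gamble (illustrative values only).
    opts = {"safe": lambda: 3.0, "risky": lambda: 4.0 if random.random() < 0.8 else 0.0}
    print(run("sampling", opts), run("repeated-choice", opts))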
A Whirlwind Tour of Computational Geometry.
ERIC Educational Resources Information Center
Graham, Ron; Yao, Frances
1990-01-01
Described is computational geometry, which uses concepts and results from classical geometry, topology, and combinatorics, as well as standard algorithmic techniques such as sorting and searching, graph manipulation, and linear programming. Also included are special techniques and paradigms. (KR)
Potential Paradigms and Possible Problems for CALL.
ERIC Educational Resources Information Center
Phillips, Martin
1987-01-01
Describes three models of CALL (computer assisted language learning) activity--games, the expert system, and the prosthetic approaches. A case is made for CALL development within a more instrumental view of the role of computers. (Author/CB)
Lord, Louis-David; Stevner, Angus B.; Kringelbach, Morten L.
2017-01-01
To survive in an ever-changing environment, the brain must seamlessly integrate a rich stream of incoming information into coherent internal representations that can then be used to efficiently plan for action. The brain must, however, balance its ability to integrate information from various sources with a complementary capacity to segregate information into modules which perform specialized computations in local circuits. Importantly, evidence suggests that imbalances in the brain's ability to bind together and/or segregate information over both space and time are a common feature of several neuropsychiatric disorders. Until recently, however, most studies have attempted to characterize the principles of integration and segregation only in static (i.e. time-invariant) representations of human brain networks, hence disregarding the complex spatio-temporal nature of these processes. In the present Review, we describe how the emerging discipline of whole-brain computational connectomics may be used to study the causal mechanisms of the integration and segregation of information on behaviourally relevant timescales. We emphasize how novel methods from network science and whole-brain computational modelling can expand beyond traditional neuroimaging paradigms and help to uncover the neurobiological determinants of the abnormal integration and segregation of information in neuropsychiatric disorders. This article is part of the themed issue ‘Mathematical methods in medicine: neuroscience, cardiology and pathology’. PMID:28507228
The DYNES Instrument: A Description and Overview
NASA Astrophysics Data System (ADS)
Zurawski, Jason; Ball, Robert; Barczyk, Artur; Binkley, Mathew; Boote, Jeff; Boyd, Eric; Brown, Aaron; Brown, Robert; Lehman, Tom; McKee, Shawn; Meekhof, Benjeman; Mughal, Azher; Newman, Harvey; Rozsa, Sandor; Sheldon, Paul; Tackett, Alan; Voicu, Ramiro; Wolff, Stephen; Yang, Xi
2012-12-01
Scientific innovation continues to increase requirements for the computing and networking infrastructures of the world. Collaborative partners, instrumentation, storage, and processing facilities are often geographically and topologically separated, as is the case with LHC virtual organizations. These separations challenge the technology used to interconnect available resources, often delivered by Research and Education (R&E) networking providers, and lead to complications in the overall process of end-to-end data management. Capacity and traffic management are key concerns of R&E network operators; a delicate balance is required to serve both long-lived, high capacity network flows, as well as more traditional end-user activities. The advent of dynamic circuit services, a technology that enables the creation of variable duration, guaranteed bandwidth networking channels, allows for the efficient use of common network infrastructures. These gains are seen particularly in locations where overall capacity is scarce compared to the (sustained peak) needs of user communities. Related efforts, including those of the LHCOPN [3] operations group and the emerging LHCONE [4] project, may take advantage of available resources by designating specific network activities as a “high priority”, allowing reservation of dedicated bandwidth or optimizing for deadline scheduling and predictable delivery patterns. This paper presents the DYNES instrument, an NSF funded cyberinfrastructure project designed to facilitate end-to-end dynamic circuit services [2]. This combination of hardware and software innovation is being deployed across R&E networks in the United States at selected end-sites located on university campuses. DYNES is peering with international efforts in other countries using similar solutions, and is increasing the reach of this emerging technology. This global data movement solution could be integrated into computing paradigms such as cloud and grid computing platforms, and through the use of APIs can be integrated into existing data movement software.
Understanding emerging treatment paradigms in rheumatoid arthritis
2011-01-01
Treatment strategies for rheumatoid arthritis (RA) will continue to evolve as new drugs are developed, as new data become available, and as our potential to achieve greater and more consistent outcomes becomes more routine. Many patients will find both symptom relief and modest control of their disease with disease-modifying antirheumatic drugs (DMARDs), yet this course of therapy is clearly not effective in all patients. In fact, despite strong evidence that intensive treatment in the early stages of RA can slow or stop disease progression and may prevent disability, many patients continue to be managed in a stepwise manner and are treated with an ongoing monotherapy regimen with DMARDs. There is now a large body of evidence demonstrating the success of treating RA patients with anti-TNF therapy, usually in combination with methotrexate. As a result of the increased use of anti-TNF therapy, treatment paradigms have changed – and our practice is beginning to reflect this change. In the present review, we summarize the salient points of several recently proposed and emerging treatment paradigms with an emphasis on how these strategies may impact future practice. PMID:21624182
Understanding emerging treatment paradigms in rheumatoid arthritis.
Breedveld, Ferdinand C; Combe, Bernard
2011-05-25
Treatment strategies for rheumatoid arthritis (RA) will continue to evolve as new drugs are developed, as new data become available, and as our potential to achieve greater and more consistent outcomes becomes more routine. Many patients will find both symptom relief and modest control of their disease with disease-modifying antirheumatic drugs (DMARDs), yet this course of therapy is clearly not effective in all patients. In fact, despite strong evidence that intensive treatment in the early stages of RA can slow or stop disease progression and may prevent disability, many patients continue to be managed in a stepwise manner and are treated with an ongoing monotherapy regimen with DMARDs. There is now a large body of evidence demonstrating the success of treating RA patients with anti-TNF therapy, usually in combination with methotrexate. As a result of the increased use of anti-TNF therapy, treatment paradigms have changed - and our practice is beginning to reflect this change. In the present review, we summarize the salient points of several recently proposed and emerging treatment paradigms with an emphasis on how these strategies may impact future practice.
Challenges of Big Data Analysis.
Fan, Jianqing; Han, Fang; Liu, Han
2014-06-01
Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottleneck, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive paradigm changes in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
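As a minimal numerical illustration of the spurious-correlation issue (our sketch, not an analysis from the article), the maximum absolute sample correlation between a fixed target and a growing number of completely unrelated noise features rises steadily with dimensionality:

    # Spurious correlation grows with dimensionality even when nothing is related.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 60
    y = rng.standard_normal(n)                      # target, independent of all features

    for p in (10, 1_000, 100_000):
        X = rng.standard_normal((n, p))             # p irrelevant features
        Xc = (X - X.mean(0)) / X.std(0)             # standardize columns
        yc = (y - y.mean()) / y.std()
        corr = Xc.T @ yc / n                        # sample correlations with y
        print(f"p = {p:>7}: max |corr| = {np.abs(corr).max():.2f}")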
Parallel Computing Strategies for Irregular Algorithms
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)
2002-01-01
Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.
MAX - An advanced parallel computer for space applications
NASA Technical Reports Server (NTRS)
Lewis, Blair F.; Bunker, Robert L.
1991-01-01
MAX is a fault-tolerant multicomputer hardware and software architecture designed to meet the needs of NASA spacecraft systems. It consists of conventional computing modules (computers) connected via a dual network topology. One network is used to transfer data among the computers and between computers and I/O devices. This network's topology is arbitrary. The second network operates as a broadcast medium for operating system synchronization messages and supports the operating system's Byzantine resilience. A fully distributed operating system supports multitasking in an asynchronous event and data driven environment. A large grain dataflow paradigm is used to coordinate the multitasking and provide easy control of concurrency. It is the basis of the system's fault tolerance and allows both static and dynamical location of tasks. Redundant execution of tasks with software voting of results may be specified for critical tasks. The dataflow paradigm also supports simplified software design, test and maintenance. A unique feature is a method for reliably patching code in an executing dataflow application.
Challenges of Big Data Analysis
Fan, Jianqing; Han, Fang; Liu, Han
2014-01-01
Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottleneck, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive paradigm changes in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469
Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu
2009-09-01
Our mobility is an important daily requirement so much so that any disruption to it severely degrades our perceived quality of life. Studies in gait and human movement sciences, therefore, play a significant role in maintaining the well-being of our mobility. Current gait analysis involves numerous interdependent gait parameters that are difficult to adequately interpret due to the large volume of recorded data and lengthy assessment times in gait laboratories. A proposed solution to these problems is computational intelligence (CI), which is an emerging paradigm in biomedical engineering most notably in pathology detection and prosthesis design. The integration of CI technology in gait systems facilitates studies in disorders caused by lower limb defects, cerebral disorders, and aging effects by learning data relationships through a combination of signal processing and machine learning techniques. Learning paradigms, such as supervised learning, unsupervised learning, and fuzzy and evolutionary algorithms, provide advanced modeling capabilities for biomechanical systems that in the past have relied heavily on statistical analysis. CI offers the ability to investigate nonlinear data relationships, enhance data interpretation, design more efficient diagnostic methods, and extrapolate model functionality. These are envisioned to result in more cost-effective, efficient, and easy-to-use systems, which would address global shortages in medical personnel and rising medical costs. This paper surveys current signal processing and CI methodologies followed by gait applications ranging from normal gait studies and disorder detection to artificial gait simulation. We review recent systems focusing on the existing challenges and issues involved in making them successful. We also examine new research in sensor technologies for gait that could be combined with these intelligent systems to develop more effective healthcare solutions.
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields. PMID:25383096
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
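For readers unfamiliar with the programming model, the self-contained Python sketch below illustrates the Map and Reduce tasks on a toy word-count problem; in Hadoop these two functions would run in parallel over data chunks stored in HDFS, but no Hadoop-specific API is used here:

    # Minimal illustration of the MapReduce programming model (no cluster required).
    from collections import defaultdict

    def map_phase(record):
        """Map: emit (key, value) pairs -- here one (word, 1) pair per word."""
        for word in record.lower().split():
            yield word, 1

    def reduce_phase(key, values):
        """Reduce: combine all values for one key -- here, sum the counts."""
        return key, sum(values)

    records = ["grid computing and cloud computing", "cloud computing paradigms"]

    # Shuffle/sort stage: group intermediate values by key.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)

    print(dict(reduce_phase(k, v) for k, v in groups.items()))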
Zapata-Fonseca, Leonardo; Froese, Tom; Schilbach, Leonhard; Vogeley, Kai; Timmermans, Bert
2018-02-08
Autism Spectrum Disorder (ASD) can be understood as a social interaction disorder. This makes the emerging "second-person approach" to social cognition a more promising framework for studying ASD than classical approaches focusing on mindreading capacities in detached, observer-based arrangements. According to the second-person approach, embodied, perceptual, and embedded or interactive capabilities are also required for understanding others, and these are hypothesized to be compromised in ASD. We therefore recorded the dynamics of real-time sensorimotor interaction in pairs of control participants and participants with High-Functioning Autism (HFA), using the minimalistic human-computer interface paradigm known as "perceptual crossing" (PC). We investigated whether HFA is associated with impaired detection of social contingency, i.e., a reduced sensitivity to the other's responsiveness to one's own behavior. Surprisingly, our analysis reveals that, at least under the conditions of this highly simplified, computer-mediated, embodied form of social interaction, people with HFA perform equally well as controls. This finding supports the increasing use of virtual reality interfaces for helping people with ASD to better compensate for their social disabilities. Further dynamical analyses are necessary for a better understanding of the mechanisms that are leading to the somewhat surprising results here obtained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costa, Pedro M.; Fadeel, Bengt, E-mail: Bengt.Fade
Engineered nanomaterials are being developed for a variety of technological applications. However, the increasing use of nanomaterials in society has led to concerns about their potential adverse effects on human health and the environment. During the first decade of nanotoxicological research, the realization has emerged that effective risk assessment of the multitudes of new nanomaterials would benefit from a comprehensive understanding of their toxicological mechanisms, which is difficult to achieve with traditional, low-throughput, single end-point oriented approaches. Therefore, systems biology approaches are being progressively applied within the nano(eco)toxicological sciences. This novel paradigm implies that the study of biological systems should be integrative, resulting in quantitative and predictive models of nanomaterial behaviour in a biological system. To this end, global ‘omics’ approaches with which to assess changes in genes, proteins, metabolites, etc. are deployed, allowing for computational modelling of the biological effects of nanomaterials. Here, we highlight omics and systems biology studies in nanotoxicology, aiming towards the implementation of a systems nanotoxicology and mechanism-based risk assessment of nanomaterials. - Highlights: • Systems nanotoxicology is a multi-disciplinary approach to quantitative modelling. • Transcriptomics, proteomics and metabolomics remain the most common methods. • Global “omics” techniques should be coupled to computational modelling approaches. • The discovery of nano-specific toxicity pathways and biomarkers is a prioritized goal. • Overall, experimental nanosafety research must strive for reproducibility and relevance.
"Serving Two Masters"--Academics' Perspectives on Working at an Offshore Campus in Malaysia
ERIC Educational Resources Information Center
Dobos, Katalin
2011-01-01
This paper explores the effects of the internationalisation of higher education on the working lives of academics at an offshore campus in eastern Malaysia. Using the interpretivist paradigm and grounded theory methods it investigates their perspectives on various themes as those emerge during a series of interviews. These emerging themes are:…
ERIC Educational Resources Information Center
Warner, Chantelle; Dupuy, Beatrice
2018-01-01
In recent years, literacy has emerged as a key critical term in foreign language (FL) teaching and learning. This essay reflects on the history of literacy and on current developments, in particular those related to the development of multiliteracies paradigms. The article concludes with a discussion of emergent topics related to literacy and…
ERIC Educational Resources Information Center
Butvilofsky, Sandra A.; Hopewell, Susan; Escamilla, Kathy; Sparrow, Wendy
2017-01-01
The purpose of this single-subject longitudinal study was to examine the Spanish and English biliterate development of U.S. Latino Spanish/English speaking students, who we call emerging bilingual students, as they participated in an innovative biliteracy instructional program titled Literacy Squared®. Findings indicate that across the three years…
Marriage, Parenting, and the Emergence of Early Self-Regulation in the Family System
ERIC Educational Resources Information Center
Volling, Brenda L.; Blandon, Alysia Y.; Kolak, Amy M.
2006-01-01
The early years of toddlerhood mark the emergence of self-regulation and the child's ability to comply with parental requests. The current study examined young children's compliance and noncompliance in a family context by observing mothers, fathers, and two children in a family clean-up paradigm. Marital conflict and mutual responsiveness in the…
Architectural Principles and Experimentation of Distributed High Performance Virtual Clusters
ERIC Educational Resources Information Center
Younge, Andrew J.
2016-01-01
With the advent of virtualization and Infrastructure-as-a-Service (IaaS), the broader scientific computing community is considering the use of clouds for their scientific computing needs. This is due to the relative scalability, ease of use, advanced user environment customization abilities, and the many novel computing paradigms available for…
A Semantic Based Policy Management Framework for Cloud Computing Environments
ERIC Educational Resources Information Center
Takabi, Hassan
2013-01-01
Cloud computing paradigm has gained tremendous momentum and generated intensive interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…
The Efficacy of the Internet-Based Blackboard Platform in Developmental Writing Classes
ERIC Educational Resources Information Center
Shudooh, Yusuf M.
2016-01-01
The application of computer-assisted platforms in writing classes is a relatively new paradigm in education. The adoption of computers-assisted writing classes is gaining ground in many western and non western universities. Numerous issues can be addressed when conducting computer-assisted classes (CAC). However, a few studies conducted to assess…
NASA Astrophysics Data System (ADS)
Larger, Laurent; Baylón-Fuentes, Antonio; Martinenghi, Romain; Udaltsov, Vladimir S.; Chembo, Yanne K.; Jacquot, Maxime
2017-01-01
Reservoir computing, originally referred to as an echo state network or a liquid state machine, is a brain-inspired paradigm for processing temporal information. It involves learning a "read-out" interpretation for nonlinear transients developed by high-dimensional dynamics when the latter is excited by the information signal to be processed. This novel computational paradigm is derived from recurrent neural network and machine learning techniques. It has recently been implemented in photonic hardware for a dynamical system, which opens the path to ultrafast brain-inspired computing. We report on a novel implementation involving an electro-optic phase-delay dynamics designed with off-the-shelf optoelectronic telecom devices, thus providing the targeted wide bandwidth. Computational efficiency is demonstrated experimentally with speech-recognition tasks. State-of-the-art speed performances reach one million words per second, with very low word error rate. In addition to record-speed processing, our investigations have revealed computing-efficiency improvements through yet-unexplored temporal-information-processing techniques, such as simultaneous multisample injection and pitched sampling at the read-out compared to information "write-in".
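A minimal software sketch of the reservoir-computing idea (a conventional numpy echo state network, not the optoelectronic phase-delay hardware reported in the paper) shows the two ingredients: a fixed random reservoir that develops nonlinear transients, and a linear read-out trained by ridge regression:

    # Minimal echo state network: fixed random reservoir + trained linear read-out.
    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_res = 1, 200

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius < 1

    def reservoir_states(u):
        """Collect the nonlinear transient states driven by the input sequence u."""
        x = np.zeros(n_res)
        states = []
        for u_t in u:
            x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
            states.append(x.copy())
        return np.array(states)

    # Toy task: predict a phase-shifted copy of a sine wave.
    t = np.linspace(0, 40, 2000)
    u, y = np.sin(t), np.sin(t + 0.5)
    X = reservoir_states(u)

    lam = 1e-6                                           # ridge-regression read-out
    W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
    print("train MSE:", np.mean((X @ W_out - y) ** 2))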
ERIC Educational Resources Information Center
Kynard, Carmen
2007-01-01
By revisiting the work of the Black Caucus and the radical rhetorics connected to Black Power and the black radical tradition, in this essay the author hopes to rebuild a frame where the picture of an African-American-vernacularized paradigm for critical literacy and social justice can emerge. She revisits the twinning of "Black Power/Black…
Integrating System Dynamics and Bayesian Networks with Application to Counter-IED Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, Kenneth D.; Brothers, Alan J.; Whitney, Paul D.
2010-06-06
The practice of choosing a single modeling paradigm for predictive analysis can limit the scope and relevance of predictions and their utility to decision-making processes. Considering multiple modeling methods simultaneously may improve this situation, but a better solution provides a framework for directly integrating different, potentially complementary modeling paradigms to enable more comprehensive modeling and predictions, and thus better-informed decisions. The primary challenges of this kind of model integration are to bridge language and conceptual gaps between modeling paradigms, and to determine whether natural and useful linkages can be made in a formal mathematical manner. To address these challenges inmore » the context of two specific modeling paradigms, we explore mathematical and computational options for linking System Dynamics (SD) and Bayesian network (BN) models and incorporating data into the integrated models. We demonstrate that integrated SD/BN models can naturally be described as either state space equations or Dynamic Bayes Nets, which enables the use of many existing computational methods for simulation and data integration. To demonstrate, we apply our model integration approach to techno-social models of insurgent-led attacks and security force counter-measures centered on improvised explosive devices.« less
Retinal vein occlusion: pathophysiology and treatment options.
Karia, Niral
2010-07-30
This paper reviews the current thinking about retinal vein occlusion. It gives an overview of its pathophysiology and discusses the evidence behind the various established and emerging treatment paradigms.
Facial approximation-from facial reconstruction synonym to face prediction paradigm.
Stephan, Carl N
2015-05-01
Facial approximation was first proposed as a synonym for facial reconstruction in 1987 due to dissatisfaction with the connotations the latter label held. Since its debut, facial approximation's identity has morphed as anomalies in face prediction have accumulated. Now underpinned by differences in what problems are thought to count as legitimate, facial approximation can no longer be considered a synonym for, or subclass of, facial reconstruction. Instead, two competing paradigms of face prediction have emerged, namely: facial approximation and facial reconstruction. This paper shines a Kuhnian lens across the discipline of face prediction to comprehensively review these developments and outlines the distinguishing features between the two paradigms. © 2015 American Academy of Forensic Sciences.
Emerging New Physics with Major Implications for Energy Technology, Biology, and Medicine
NASA Astrophysics Data System (ADS)
Mallove, Eugene F.
2003-03-01
In the past 15 years, reproducible experiments and prototype technological devices have emerged that may revolutionize much of physics and chemistry (despite the common perception that modern physics is on very solid ground and is nearing a "Theory of Everything"). This new physics has flourished despite very strong opposition by the entrenched foundational paradigms within physics and chemistry (not to forget vested financial interests within academia). In fact, beginning with "cold fusion" (more generically low-energy nuclear reactions, LENR), one of the most important discoveries of the late 20th Century has been the irrefutable proof of the failure of the physics establishment to deal ethically and appropriately with potential and real paradigm shifts, when its "sacred writ" (i.e., its textbooks) is threatened with the need for massive revision.
Holographic equipartition from first order action
NASA Astrophysics Data System (ADS)
Wang, Jingbo
2017-12-01
Recently, the idea that gravity is emergent has attracted considerable attention. The "Emergent Gravity Paradigm" is a program that develops this idea from a thermodynamic point of view. It expresses the Einstein equation in the language of thermodynamics. A key equation in this paradigm is holographic equipartition, which says that, in all static spacetimes, the degrees of freedom on the boundary equal those in the bulk. The time evolution of spacetime is driven by the departure from holographic equipartition. In this paper, we obtain holographic equipartition and its generalization from the first-order formalism, in which the connection and its conjugate momentum are taken as the canonical variables. The final results have a structure similar to those from the metric formalism. This gives another proof of holographic equipartition.
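For context, a commonly quoted form of holographic equipartition from the emergent-gravity literature (Padmanabhan's metric-based formulation, stated here as background rather than as the paper's first-order derivation) reads

$$N_{\mathrm{sur}}=\frac{A}{L_P^{2}},\qquad N_{\mathrm{bulk}}=\frac{2\,\lvert E_{\mathrm{Komar}}\rvert}{k_{B}T},\qquad \frac{dV}{dt}=L_P^{2}\,\bigl(N_{\mathrm{sur}}-N_{\mathrm{bulk}}\bigr),$$

so that in static spacetimes, where dV/dt = 0, the surface and bulk degrees of freedom are equal.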
An Efficient ERP-Based Brain-Computer Interface Using Random Set Presentation and Face Familiarity
Müller, Klaus-Robert; Lee, Seong-Whan
2014-01-01
Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup. PMID:25384045
An efficient ERP-based brain-computer interface using random set presentation and face familiarity.
Yeom, Seul-Ki; Fazli, Siamac; Müller, Klaus-Robert; Lee, Seong-Whan
2014-01-01
Event-related potential (ERP)-based P300 spellers are commonly used in the field of brain-computer interfaces as an alternative channel of communication for people with severe neuro-muscular diseases. This study introduces a novel P300 based brain-computer interface (BCI) stimulus paradigm using a random set presentation pattern and exploiting the effects of face familiarity. The effect of face familiarity is widely studied in the cognitive neurosciences and has recently been addressed for the purpose of BCI. In this study we compare P300-based BCI performances of a conventional row-column (RC)-based paradigm with our approach that combines a random set presentation paradigm with (non-) self-face stimuli. Our experimental results indicate stronger deflections of the ERPs in response to face stimuli, which are further enhanced when using the self-face images, thereby improving P300-based spelling performance. This led to a significant reduction in the number of stimulus sequences required for correct character classification. These findings demonstrate a promising new approach for improving the speed and thus fluency of BCI-enhanced communication with the widely used P300-based BCI setup.
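As a schematic of the classification step common to such spellers (simulated single-channel data and a plain LDA read-out, not the recordings or pipeline of the study), target epochs carrying a late positive deflection can be separated from non-target epochs:

    # Schematic P300-style epoch classification on simulated data.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(7)
    n_epochs, n_samples = 200, 100                    # one simulated post-stimulus window per epoch
    t = np.arange(n_samples)
    p300 = 2.0 * np.exp(-0.5 * ((t - 60) / 10) ** 2)  # idealized late positive component

    labels = rng.integers(0, 2, n_epochs)             # 1 = target, 0 = non-target
    epochs = rng.standard_normal((n_epochs, n_samples)) + np.outer(labels, p300)

    split = n_epochs // 2                             # simple train/test split
    clf = LinearDiscriminantAnalysis().fit(epochs[:split], labels[:split])
    print("test accuracy:", clf.score(epochs[split:], labels[split:]))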
Shan, Haijun; Xu, Haojie; Zhu, Shanan; He, Bin
2015-10-21
For sensorimotor rhythm-based brain-computer interface (BCI) systems, classification of different motor imageries (MIs) remains a crucial problem. An important aspect is how many scalp electrodes (channels) should be used in order to reach optimal performance classifying motor imaginations. While previous research on channel selection has mainly focused on MI task paradigms without feedback, the present work investigates optimal channel selection in MI task paradigms with real-time feedback (two-class control and four-class control paradigms). In the present study, three datasets, recorded respectively from an MI task experiment and two-class and four-class control experiments, were analyzed offline. Multiple frequency-spatial synthesized features were comprehensively extracted from every channel, and a new enhanced method, IterRelCen, was proposed to perform channel selection. IterRelCen is based on the Relief algorithm but is enhanced in two respects: a changed target-sample selection strategy and the adoption of iterative computation, making it more robust for feature selection. Finally, a multiclass support vector machine was applied as the classifier. The smallest number of channels that yielded the best classification accuracy was considered optimal. One-way ANOVA was employed to test the significance of the performance differences among using the optimal channels, all channels, and three typical MI channels (C3, C4, Cz). The results show that the proposed method outperformed other channel selection methods by achieving average classification accuracies of 85.2, 94.1, and 83.2 % for the three datasets, respectively. Moreover, the channel selection results reveal that the average numbers of optimal channels were significantly different among the three MI paradigms. It is demonstrated that IterRelCen has a strong ability for feature selection. In addition, the results show that the numbers of optimal channels in the three different motor imagery BCI paradigms are distinct. From the MI task paradigm, to the two-class control paradigm, and to the four-class control paradigm, the number of channels required to optimize classification accuracy increased. These findings may provide useful information to optimize EEG-based BCI systems and further improve the performance of noninvasive BCIs.
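For orientation, the sketch below implements the baseline Relief weighting scheme that IterRelCen builds on (the published enhancements, namely the changed target-sample selection strategy and the iterative computation, are not reproduced here):

    # Baseline Relief feature weighting (Kira & Rendell style), for illustration only.
    import numpy as np

    def relief(X, y, n_iter=100, seed=0):
        """Return one relevance weight per feature (higher = more discriminative)."""
        rng = np.random.default_rng(seed)
        X = (X - X.min(0)) / (X.max(0) - X.min(0) + 1e-12)   # scale features to [0, 1]
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(n_iter):
            i = rng.integers(n)
            dist = np.abs(X - X[i]).sum(1)
            dist[i] = np.inf                                  # exclude the sample itself
            same, other = (y == y[i]), (y != y[i])
            hit = np.argmin(np.where(same, dist, np.inf))     # nearest same-class sample
            miss = np.argmin(np.where(other, dist, np.inf))   # nearest other-class sample
            w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / n_iter
        return w

    # Toy data: only feature 0 carries class information.
    rng = np.random.default_rng(1)
    y = rng.integers(0, 2, 200)
    X = np.c_[y + 0.1 * rng.standard_normal(200), rng.standard_normal((200, 4))]
    print(relief(X, y).round(2))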
Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup
Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.
2010-01-01
Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure. PMID:21258651
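A generic illustration of the underlying scheduling idea (our sketch, not the published Roundup/RSD cost model) is to use predicted runtimes to order jobs longest-first and greedily assign each to the least-loaded node, which typically shortens the makespan and hence the billed cloud hours compared with submitting jobs in arrival order:

    # Longest-processing-time-first greedy scheduling versus arrival order.
    import heapq, random

    def makespan(runtimes, n_nodes, longest_first):
        jobs = sorted(runtimes, reverse=True) if longest_first else list(runtimes)
        loads = [0.0] * n_nodes
        heapq.heapify(loads)
        for r in jobs:
            heapq.heappush(loads, heapq.heappop(loads) + r)   # give job to least-loaded node
        return max(loads)

    random.seed(0)
    predicted = [random.lognormvariate(0, 1) for _ in range(500)]   # stand-in runtime estimates
    print("arrival order:", round(makespan(predicted, 20, False), 1))
    print("longest first:", round(makespan(predicted, 20, True), 1))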
New Toxico-Cheminformatics & Computational Toxicology Initiatives At EPA
EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...
Witchel, Selma F; Recabarren, Sergio E; Gonzalez, Frank; Diamanti-Kandarakis, Evanthia; Cheang, Kai I; Duleba, Antoni J; Legro, Richard S; Homburg, Roy; Pasquali, Renato; Lobo, Rogerio; Zouboulis, Christos C.; Kelestimur, Fahrettin; Fruzzetti, Franca; Futterweit, Walter; Norman, Robert J; Abbott, David H
2012-01-01
The interactive nature of the 8th Annual Meeting of the Androgen Excess & PCOS Society in Munich, Germany (AEPCOS 2010) and subsequent exchanges between speakers led to emerging concepts in PCOS regarding its genesis, metabolic dysfunction, and clinical treatment of inflammation, metabolic dysfunction, anovulation and hirsutism. Transition of care in congenital adrenal hyperplasia from pediatric to adult providers emerged as a potential model for care transition involving PCOS adolescents. PMID:22661293
The Development of Genetics in the Light of Thomas Kuhn's Theory of Scientific Revolutions.
Portin, Petter
2015-01-01
The concept of a paradigm is in the key position in Thomas Kuhn's theory of scientific revolutions. A paradigm is the framework within which the results, concepts, hypotheses and theories of scientific research work are understood. According to Kuhn, a paradigm guides the working and efforts of scientists during the time period which he calls the period of normal science. Before long, however, normal science leads to unexplained matters, a situation that then leads the development of the scientific discipline in question to a paradigm shift--a scientific revolution. When a new theory is born, it has either gradually emerged as an extension of the past theory, or the old theory has become a borderline case in the new theory. In the former case, one can speak of a paradigm extension. According to the present author, the development of modern genetics has, until very recent years, been guided by a single paradigm, the Mendelian paradigm which Gregor Mendel launched 150 years ago, and under the guidance of this paradigm the development of genetics has proceeded in a normal fashion in the spirit of logical positivism. Modern discoveries in genetics have, however, created a situation which seems to be leading toward a paradigm shift. The most significant of these discoveries are the findings of adaptive mutations, the phenomenon of transgenerational epigenetic inheritance, and, above all, the present deeply critical state of the concept of the gene.
The Fate of the Method of 'Paradigms' in Paleobiology.
Rudwick, Martin J S
2017-11-02
An earlier article described the mid-twentieth century origins of the method of "paradigms" in paleobiology, as a way of making testable hypotheses about the functional morphology of extinct organisms. The present article describes the use of "paradigms" through the 1970s and, briefly, to the end of the century. After I had proposed the paradigm method to help interpret the ecological history of brachiopods, my students developed it in relation to that and other invertebrate phyla, notably in Euan Clarkson's analysis of vision in trilobites. David Raup's computer-aided "theoretical morphology" was then combined with my functional or adaptive emphasis, in Adolf Seilacher's tripartite "constructional morphology." Stephen Jay Gould, who had strongly endorsed the method, later switched to criticizing the "adaptationist program" he claimed it embodied. Although the explicit use of paradigms in paleobiology had declined by the end of the century, the method was tacitly subsumed into functional morphology as "biomechanics."
A Computer Model for Red Blood Cell Chemistry
1996-10-01
There is a growing need for interactive computational tools for medical education and research. The most exciting...paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at...to a modern Windows, object-oriented interface. This development will provide students with a useful computational tool for learning. More important
Retinal vein occlusion: pathophysiology and treatment options
Karia, Niral
2010-01-01
This paper reviews the current thinking about retinal vein occlusion. It gives an overview of its pathophysiology and discusses the evidence behind the various established and emerging treatment paradigms. PMID:20689798
Are Allopathic and Holistic Medicine Incommensurable?
Evangelatos, Nikolaos; Eliadi, Irini
2016-01-01
The shift from the Aristotelian to the Newtonian scientific paradigm gave rise to progress in the natural, hard sciences and contributed to the emergence of modernity. Allopathic medicine gradually incorporated those advances, transforming itself into contemporary biomedicine. In the early 20th century, replacement of Newtonian physics by quantum mechanics and Einstein's theory of relativity resulted in a new paradigm shift in the natural, hard sciences. This shift gave birth to post-modern perceptions, which attempt to put those changes in context. Within this new context, holistic therapeutic approaches are considered more compatible with the new paradigm. Different paradigms in the natural, hard sciences are considered to be incommensurable (in the Kuhnian sense). This incommensurability is also transferred to the different societal contexts, the different «Weltanschauungen» that rely on different scientific paradigms. However, drawing on arguments that range from historical and philosophical to practical and sociological ones, we argue that, although based on different scientific paradigms, allopathic and holistic medicine are not incommensurable, but rather complementary. This may be related to the inherent attributes of medicine, a fact that reinforces the debate on its epistemological status. © 2016 S. Karger GmbH, Freiburg.
Computer modeling of prostate cancer treatment. A paradigm for oncologic management?
Miles, B J; Kattan, M W
1995-04-01
This article discusses the relevance of computer modeling to the management of prostate cancer. Several computer modeling techniques are reviewed and the advantages and disadvantages of each are discussed. An example that uses a computer model to compare alternative strategies for clinically localized prostate cancer is examined in detail. The quality of the data used in computer models is critical, and these models play an important role in medical decision making.
Energy challenges in optical access and aggregation networks.
Kilper, Daniel C; Rastegarfar, Houman
2016-03-06
Scalability is a critical issue for access and aggregation networks as they must support the growth in both the size of data capacity demands and the multiplicity of access points. The number of connected devices, the Internet of Things, is growing to the tens of billions. Prevailing communication paradigms are reaching physical limitations that make continued growth problematic. Challenges are emerging in electronic and optical systems and energy increasingly plays a central role. With the spectral efficiency of optical systems approaching the Shannon limit, increasing parallelism is required to support higher capacities. For electronic systems, as the density and speed increase, the total system energy, thermal density, and energy per bit are moving into regimes that become impractical to support, for example requiring single-chip processor powers above the 100 W limit common today. We examine communication network scaling and energy use from the Internet core down to the computer processor core and consider implications for optical networks. Optical switching in data centres is identified as a potential model from which scalable access and aggregation networks for the future Internet, with the application of integrated photonic devices and intelligent hybrid networking, will emerge. © 2016 The Author(s).
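The Shannon limit referred to above bounds the spectral efficiency of a single channel of bandwidth B and signal-to-noise ratio SNR (a textbook relation, not a result of the paper):

$$\frac{C}{B}=\log_{2}\bigl(1+\mathrm{SNR}\bigr)\ \ \text{bit}\,\mathrm{s}^{-1}\,\mathrm{Hz}^{-1},$$

which is why further capacity growth must increasingly come from parallelism (more fibres, cores, modes, or wavelengths) rather than from per-channel spectral efficiency alone.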
Terrestrial biogeochemical cycles - Global interactions with the atmosphere and hydrology
NASA Technical Reports Server (NTRS)
Schimel, David S.; Parton, William J.; Kittel, Timothy G. F.
1991-01-01
A review is presented of developments in ecosystem theory, remote sensing, and geographic information systems that support new endeavors in spatial modeling. A paradigm has emerged to predict ecosystem behavior based on understanding responses to multiple resources. Ecosystem models couple primary production to decomposition and nutrient availability utilizing this paradigm. It is indicated that coupling of transport and ecosystem processes alters the behavior of earth system components (terrestrial ecosystems, hydrology, and the atmosphere) from that of an uncoupled model.
On motion in a resisting medium: A historical perspective
NASA Astrophysics Data System (ADS)
Hackborn, William W.
2016-02-01
This paper examines, compares, and contrasts ideas about motion, especially the motion of a body in a resisting medium, proposed by Galileo, Newton, and Tartaglia, the author of the first text on exterior ballistics, within the context of the Aristotelian philosophy prevalent when these scholars developed their ideas. This historical perspective offers insights on the emergence of a scientific paradigm for motion, particularly with respect to the challenge of incorporating into this paradigm the role played by the medium.
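For reference, the Newtonian description that this historical development converged toward can be written in the standard form (a textbook statement, not a formula taken from the paper), with a resistive force opposing the velocity:

$$m\,\frac{d\mathbf{v}}{dt}=m\,\mathbf{g}-k\,\lvert\mathbf{v}\rvert\,\mathbf{v},$$

where the quadratic dependence on speed is one common model; Newton himself also analyzed resistance proportional to the speed and to combinations of the two.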
Ponterotto, Joseph G
2010-10-01
This article reviews the current and emerging status of qualitative research in psychology. The particular value of diverse philosophical paradigms and varied inquiry approaches to the advancement of psychology generally, and multicultural psychology specifically, is emphasized. Three specific qualitative inquiry approaches anchored in diverse philosophical research paradigms are highlighted: consensual qualitative research, grounded theory, and participatory action research. The article concludes by highlighting important ethical considerations in multicultural qualitative research. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Paradigm Paralysis and the Plight of the PC in Education.
ERIC Educational Resources Information Center
O'Neil, Mick
1998-01-01
Examines the varied factors involved in providing Internet access in K-12 education, including expense, computer installation and maintenance, and security, and explores how the network computer could be useful in this context. Operating systems and servers are discussed. (MSE)
Secure dissemination of electronic healthcare records in distributed wireless environments.
Belsis, Petros; Vassis, Dimitris; Skourlas, Christos; Pantziou, Grammati
2008-01-01
A new networking paradigm has emerged with the appearance of wireless computing. Ad-hoc networks and mobile and ubiquitous environments, among others, can boost the performance of the systems in which they are applied; medical environments are a convenient example of their applicability. With the utilisation of wireless infrastructures, medical data become accessible to healthcare practitioners, enabling continuous access to medical information. Due to the critical nature of medical information, the design and implementation of these infrastructures demand special treatment in order to meet specific requirements; in particular, special care should be taken to manage interoperability and security, and to deal with the bandwidth and hardware resource constraints that characterize the wireless topology. In this paper we present an architecture that attempts to deal with these issues; moreover, in order to prove the validity of our approach we have also evaluated the performance of our platform through simulation in different operating scenarios.
International disaster research
NASA Technical Reports Server (NTRS)
Silverstein, Martin Elliot
1991-01-01
No existing telecommunications system can be expected to provide strategy and tactics appropriate to the complex, many-faceted problem of disaster. Despite the exciting capabilities of space, communications, remote sensing, and the miracles of modern medicine, complete turnkey transfers to the disaster problem do not fit, and cannot be expected to do so. In 1980, a Presidential team assigned the mission of exploring disaster response within the U.S. Federal Government encountered an unanticipated obstacle: disaster was essentially undefined. In the absence of a scientifically based paradigm of disaster, there can be no measure of cost effectiveness, optimum design of manpower structure, or precise application of any technology. These problems spawned a 10-year, multidisciplinary study designed to define the origins, anatomy, and necessary management techniques for catastrophes. The design of the study necessarily reflects interests and expertise in disaster medicine, emergency medicine, telecommunications, computer communications, and forensic sciences. This study is described.
DIY-Bio - economic, epistemological and ethical implications and ambivalences.
Keulartz, Jozef; van den Belt, Henk
2016-12-01
Since 2008, we have witnessed the emergence of the Do-It-Yourself Biology movement, a global movement spreading the use of biotechnology beyond traditional academic and industrial institutions and into the lay public. Practitioners include a broad mix of amateurs, enthusiasts, students, and trained scientists. At this moment, the movement counts nearly 50 local groups, mostly in America and Europe, but also increasingly in Asia. Do-It-Yourself Bio represents a direct translation of hacking culture and practices from the realm of computers and software into the realm of genes and cells. Although the movement is still in its infancy, and it is even unclear whether it will ever reach maturity, the contours of a new paradigm of knowledge production are already becoming visible. We will subsequently sketch the economic, epistemological and ethical profile of Do-It-Yourself Bio, and discuss its implications as well as its ambivalences.
Parallel computing for probabilistic fatigue analysis
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.
1993-01-01
This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep large numbers of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
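A minimal sketch (not the authors' code) of the kind of embarrassingly parallel Monte Carlo workload described above, using Python's multiprocessing to spread fatigue-life samples over worker processes. The Basquin-style life model, its parameter values, and the design life are illustrative assumptions only.

```python
import multiprocessing as mp
import numpy as np

def fatigue_life_sample(seed):
    """Draw one random fatigue life (cycles) under an assumed Basquin-type model."""
    rng = np.random.default_rng(seed)
    stress = rng.normal(300.0, 30.0)                      # stress amplitude, MPa (assumed)
    coeff = rng.lognormal(mean=np.log(1e12), sigma=0.2)   # material scatter (assumed)
    exponent = 3.0                                        # Basquin exponent (assumed)
    return coeff * stress ** (-exponent)                  # cycles to failure

def failure_probability(n_samples=100_000, design_life=1e5, workers=4):
    """Estimate P(life < design_life) by distributing samples over processes."""
    with mp.Pool(workers) as pool:
        lives = pool.map(fatigue_life_sample, range(n_samples))
    return np.mean(np.asarray(lives) < design_life)

if __name__ == "__main__":
    print("Estimated probability of failure:", failure_probability())
```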
Objective breast tissue image classification using Quantitative Transmission ultrasound tomography
NASA Astrophysics Data System (ADS)
Malik, Bilal; Klock, John; Wiskin, James; Lenox, Mark
2016-12-01
Quantitative Transmission Ultrasound (QT) is a powerful and emerging imaging paradigm which has the potential to perform true three-dimensional image reconstruction of biological tissue. Breast imaging is an important application of QT and allows non-invasive, non-ionizing imaging of whole breasts in vivo. Here, we report the first demonstration of breast tissue image classification in QT imaging. We systematically assess the ability of the QT images’ features to differentiate between normal breast tissue types. The three QT features were used in Support Vector Machines (SVM) classifiers, and classification of breast tissue as either skin, fat, glands, ducts or connective tissue was demonstrated with an overall accuracy of greater than 90%. Finally, the classifier was validated on whole breast image volumes to provide a color-coded breast tissue volume. This study serves as a first step towards a computer-aided detection/diagnosis platform for QT.
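A hedged sketch of the classification step described above: a support vector machine trained on three per-voxel features. The feature names (e.g., speed of sound, attenuation, reflectivity), the synthetic data, and the classifier settings are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for three QT-derived features per voxel
# (e.g., speed of sound, attenuation, reflectivity -- assumed names).
rng = np.random.default_rng(0)
n_per_class = 200
classes = ["skin", "fat", "glands", "ducts", "connective"]
X = np.vstack([rng.normal(loc=i, scale=0.8, size=(n_per_class, 3))
               for i in range(len(classes))])
y = np.repeat(classes, n_per_class)

# RBF-kernel SVM with feature standardization, evaluated by cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print("Cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```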
Time to rethink the neural mechanisms of learning and memory
Gallistel, Charles R.; Balsam, Peter D
2014-01-01
Most studies in the neurobiology of learning assume that the underlying learning process is a pairing-dependent change in synaptic strength that requires repeated experience of events presented in close temporal contiguity. However, much learning is rapid and does not depend on temporal contiguity, which has never been precisely defined. These points are well illustrated by studies showing that temporal relationships between events are rapidly learned, even over long delays, and that this knowledge governs the form and timing of behavior. The speed with which anticipatory responses emerge in conditioning paradigms is determined by the information that cues provide about the timing of rewards. The challenge for understanding the neurobiology of learning is to understand the mechanisms in the nervous system that encode information from even a single experience, the nature of the memory mechanisms that can encode quantities such as time, and how the brain can flexibly perform computations based on this information. PMID:24309167
Early Prediction of Movie Box Office Success Based on Wikipedia Activity Big Data
Mestyán, Márton; Yasseri, Taha; Kertész, János
2013-01-01
Use of socially generated “big data” to access information about collective states of mind in human societies has become a new paradigm in the emerging field of computational social science. A natural application of this would be the prediction of the society's reaction to a new product in the sense of popularity and adoption rate. However, bridging the gap between “real-time monitoring” and “early predicting” remains a big challenge. Here we report on an endeavor to build a minimalistic predictive model for the financial success of movies based on collective activity data of online users. We show that the popularity of a movie can be predicted well before its release by measuring and analyzing the activity level of editors and viewers of the corresponding entry for the movie in Wikipedia, the well-known online encyclopedia. PMID:23990938
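A toy sketch of such a minimalistic predictor: a linear regression on pre-release activity features. The feature set (edits, distinct editors, page views), the synthetic data, and the log-log fit are assumptions for illustration; they are not the published model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical pre-release activity features per movie:
# [number of edits, number of distinct editors, page views] (assumed features).
rng = np.random.default_rng(1)
n_movies = 500
activity = rng.lognormal(mean=[5.0, 3.0, 10.0], sigma=0.6, size=(n_movies, 3))

# Synthetic "true" box office as a noisy function of activity (illustration only).
box_office = activity @ np.array([2.0e4, 1.0e5, 50.0]) * rng.lognormal(0.0, 0.3, n_movies)

# Fit in log-log space, as is common for heavy-tailed popularity data.
model = LinearRegression().fit(np.log(activity), np.log(box_office))
print("R^2 on the (synthetic) training data:",
      model.score(np.log(activity), np.log(box_office)))
```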
The Science DMZ: A Network Design Pattern for Data-Intensive Science
Dart, Eli; Rotman, Lauren; Tierney, Brian; ...
2014-01-01
The ever-increasing scale of scientific data has become a significant challenge for researchers who rely on networks to interact with remote computing systems and transfer results to collaborators worldwide. Despite the availability of high-capacity connections, scientists struggle with inadequate cyberinfrastructure that cripples data transfer performance and impedes scientific progress. The Science DMZ paradigm comprises a proven set of network design patterns that collectively address these problems for scientists. We explain the Science DMZ model, including network architecture, system configuration, cybersecurity, and performance tools, which together create an optimized network environment for science. We describe use cases from universities, supercomputing centers, and research laboratories, highlighting the effectiveness of the Science DMZ model in diverse operational settings. In all, the Science DMZ model is a solid platform that supports any science workflow and flexibly accommodates emerging network technologies. As a result, the Science DMZ vastly improves collaboration, accelerating scientific discovery.
The nonequilibrium quantum many-body problem as a paradigm for extreme data science
NASA Astrophysics Data System (ADS)
Freericks, J. K.; Nikolić, B. K.; Frieder, O.
2014-12-01
Generating big data pervades much of physics. But some problems, which we call extreme data problems, are too large to be treated within big data science. The nonequilibrium quantum many-body problem on a lattice is just such a problem, where the Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer (and can effectively be thought of as an infinite-sized data set). Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve them more accurately and for longer times. We review a number of these different ideas here.
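A short numerical illustration of why the lattice problem is "extreme data": for a spin-1/2 lattice the Hilbert-space dimension grows as 2^N, so even the state vector alone quickly exceeds any conceivable memory. The byte counts below assume double-precision complex amplitudes and are for illustration only.

```python
# Dimension of the spin-1/2 Hilbert space grows as 2**N; one complex
# double-precision amplitude takes 16 bytes.
for n_sites in (20, 30, 40, 50):
    dim = 2 ** n_sites
    bytes_needed = 16 * dim
    print(f"{n_sites} sites: dim = 2^{n_sites} = {dim:.3e}, "
          f"state vector ~ {bytes_needed / 1e12:.3g} TB")
```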
PIC codes for plasma accelerators on emerging computer architectures (GPUS, Multicore/Manycore CPUS)
NASA Astrophysics Data System (ADS)
Vincenti, Henri
2016-03-01
The advent of exascale computers will enable 3D simulations of new laser-plasma interaction regimes that were previously out of reach of current petascale computers. However, the paradigm used to write current PIC codes will have to change in order to fully exploit the potential of these new computing architectures. Indeed, achieving exascale computing facilities in the next decade will be a great challenge in terms of energy consumption and will imply hardware developments directly impacting our way of implementing PIC codes. As data movement (from die to network) is by far the most energy-consuming part of an algorithm, future computers will tend to increase memory locality at the hardware level and reduce energy consumption related to data movement by using more and more cores on each compute node (''fat nodes'') that will have a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, CPU vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. GPUs also have a reduced clock speed per core and can process Multiple Instructions on Multiple Data (MIMD). At the software level, Particle-In-Cell (PIC) codes will thus have to achieve both good memory locality and vectorization (for Multicore/Manycore CPUs) to fully take advantage of these upcoming architectures. In this talk, we present the portable solutions we implemented in our high-performance skeleton PIC code PICSAR to achieve both good memory locality and cache reuse as well as good vectorization on SIMD architectures. We also present the portable solutions used to parallelize the Pseudo-spectral quasi-cylindrical code FBPIC on GPUs using the Numba python compiler.
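A hedged illustration (not PICSAR code) of the vectorization point above: updating particle positions with a scalar Python loop versus a contiguous, vectorized NumPy expression that the underlying libraries can map to SIMD-friendly kernels.

```python
import time
import numpy as np

n_particles = 1_000_000
rng = np.random.default_rng(0)
x = rng.random(n_particles)            # particle positions (arbitrary units)
v = rng.standard_normal(n_particles)   # particle velocities
dt = 1e-3

def push_loop(x, v, dt):
    """Scalar loop: poor use of SIMD units and cache."""
    out = x.copy()
    for i in range(len(out)):
        out[i] += v[i] * dt
    return out

def push_vectorized(x, v, dt):
    """Vectorized update on contiguous arrays."""
    return x + v * dt

t0 = time.perf_counter(); push_loop(x, v, dt); t1 = time.perf_counter()
push_vectorized(x, v, dt); t2 = time.perf_counter()
print(f"loop: {t1 - t0:.2f} s, vectorized: {t2 - t1:.4f} s")
```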
ERIC Educational Resources Information Center
Motschnig-Pitrik, Renate; Mallich, Katharina
2004-01-01
Web-based technology increases the hours we spend sitting in front of the screens of our computers. But can it also be used in a way to improve our social skills? The blended learning paradigm of Person-Centered e-Learning (PCeL) precisely aims to achieve intellectual as well as social and personal development by combining the benefits of online…
Testing paradigms of ecosystem change under climate warming in Antarctica.
Melbourne-Thomas, Jessica; Constable, Andrew; Wotherspoon, Simon; Raymond, Ben
2013-01-01
Antarctic marine ecosystems have undergone significant changes as a result of human activities in the past and are now responding in varied and often complicated ways to climate change impacts. Recent years have seen the emergence of large-scale mechanistic explanations, or "paradigms of change", that attempt to synthesize our understanding of past and current changes. In many cases, these paradigms are based on observations that are spatially and temporally patchy. The West Antarctic Peninsula (WAP), one of Earth's most rapidly changing regions, has been an area of particular research focus. A recently proposed mechanistic explanation for observed changes in the WAP region relates changes in penguin populations to variability in krill biomass and regional warming. While this scheme is attractive for its simplicity and chronology, it may not account for complex spatio-temporal processes that drive ecosystem dynamics in the region. It might also be difficult to apply to other Antarctic regions that are experiencing some, though not all, of the changes documented for the WAP. We use qualitative network models of differing levels of complexity to test paradigms of change for the WAP ecosystem. Importantly, our approach captures the emergent effects of feedback processes in complex ecological networks and provides a means to identify and incorporate uncertain linkages between network elements. Our findings highlight key areas of uncertainty in the drivers of documented trends, and suggest that a greater level of model complexity is needed in devising explanations for ecosystem change in the Southern Ocean. We suggest that our network approach to evaluating a recent and widely cited paradigm of change for the Antarctic region could be broadly applied in hypothesis testing for other regions and research fields. PMID:23405116
Tsui, Chun Sing Louis; Gan, John Q; Roberts, Stephen J
2009-03-01
Due to the non-stationarity of EEG signals, online training and adaptation are essential to EEG based brain-computer interface (BCI) systems. Self-paced BCIs offer more natural human-machine interaction than synchronous BCIs, but it is a great challenge to train and adapt a self-paced BCI online because the user's control intention and timing are usually unknown. This paper proposes a novel motor imagery based self-paced BCI paradigm for controlling a simulated robot in a specifically designed environment which is able to provide user's control intention and timing during online experiments, so that online training and adaptation of the motor imagery based self-paced BCI can be effectively investigated. We demonstrate the usefulness of the proposed paradigm with an extended Kalman filter based method to adapt the BCI classifier parameters, with experimental results of online self-paced BCI training with four subjects.
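A simplified sketch of the adaptation idea described above: treating the linear classifier weights as a slowly drifting state and updating them with a Kalman-style recursion from streaming labeled trials. This is a linear stand-in for the extended Kalman filter method in the paper; the state model, noise variances, and toy data are assumptions.

```python
import numpy as np

class KalmanAdaptiveLinearClassifier:
    """Recursively adapt linear weights w (label ~ w . x) under a random-walk
    state model; a simplified stand-in for EKF-based classifier adaptation."""

    def __init__(self, n_features, process_var=1e-4, obs_var=1.0):
        self.w = np.zeros(n_features)
        self.P = np.eye(n_features)            # weight covariance
        self.Q = process_var * np.eye(n_features)
        self.R = obs_var

    def update(self, x, y):
        # Predict step: random-walk drift of the weights.
        self.P = self.P + self.Q
        # Update step: scalar observation y = w . x + noise.
        S = x @ self.P @ x + self.R
        K = self.P @ x / S
        self.w = self.w + K * (y - self.w @ x)
        self.P = self.P - np.outer(K, x) @ self.P
        return self.w

    def predict(self, x):
        return 1 if self.w @ x > 0 else -1

# Toy usage: labels in {-1, +1} generated from a fixed "true" weight vector.
rng = np.random.default_rng(0)
clf = KalmanAdaptiveLinearClassifier(n_features=3)
true_w = np.array([1.0, -0.5, 0.2])
for _ in range(500):
    x = rng.standard_normal(3)
    clf.update(x, np.sign(true_w @ x))
print("adapted weights:", np.round(clf.w, 2))
```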
The virtual mirror: a new interaction paradigm for augmented reality environments.
Bichlmeier, Christoph; Heining, Sandro Michael; Feuerstein, Marco; Navab, Nassir
2009-09-01
Medical augmented reality (AR) has been widely discussed within the medical imaging as well as computer aided surgery communities. Different systems for exemplary medical applications have been proposed. Some of them produced promising results. One major issue still hindering AR technology from being regularly used in medical applications is the interaction between the physician and the superimposed 3-D virtual data. Classical interaction paradigms, for instance with keyboard and mouse, are not adequate for interacting with visualized medical 3-D imaging data in an AR environment. This paper introduces the concept of a tangible/controllable Virtual Mirror for medical AR applications. This concept intuitively augments the direct view of the surgeon with all desired views on volumetric medical imaging data registered with the operation site, without moving around the operating table or displacing the patient. We selected two medical procedures to demonstrate and evaluate the potential of the Virtual Mirror for the surgical workflow. Results confirm the intuitiveness of this new paradigm and its perceptive advantages for AR-based computer aided interventions.
A new paradigm for retrieval medicine.
Moloney, John
2018-06-12
A number of new time-critical medical interventions are highly specialised. As such, they are not available in many hospitals and EDs. This necessitates transfer to another facility, which is often associated with some degree of delay. Processes to facilitate timely access to these interventions should aim to replicate or improve on the care that would have been available had the patient been in the community and been responded to, primarily, by an emergency medical service. © 2018 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Performance of the Heavy Flavor Tracker (HFT) detector in the STAR experiment at RHIC
NASA Astrophysics Data System (ADS)
Alruwaili, Manal
With the growth of technology, the number of processors is becoming massive. Current supercomputer processing will be available on desktops in the next decade. For mass-scale application software development on the massively parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massively parallel computing and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10 and UPC++, exploit distributed computing, data-parallel computing and thread-level parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) any extension for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; and 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object migration and object cloning; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work concurrently on different elements of distributed data using remote method invocations. I present the new constructs, their grammar and their behavior. The new constructs are explained using simple programs that utilize them.
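A hedged sketch of the MIDD idea in Python rather than the C++ extensions the thesis proposes: the same class is instantiated at several "places", each owning one partition of a distributed array, and the copies are invoked concurrently through remote-style calls, here emulated with multiprocessing. Class and function names are illustrative only.

```python
import multiprocessing as mp
import numpy as np

class PartitionWorker:
    """One 'place' owning a partition of a distributed array; the same class
    is instantiated at every place (the MIDD idea, emulated locally)."""

    def __init__(self, data):
        self.data = np.asarray(data)

    def scale_and_sum(self, factor):
        # Work on the locally owned elements only.
        return float(np.sum(self.data * factor))

def invoke(args):
    """Stand-in for a remote method invocation on one place."""
    partition, factor = args
    return PartitionWorker(partition).scale_and_sum(factor)

if __name__ == "__main__":
    data = np.arange(1_000_000, dtype=float)
    partitions = np.array_split(data, 4)           # distribute data over 4 places
    with mp.Pool(4) as pool:                       # concurrent remote-style invocations
        partials = pool.map(invoke, [(p, 2.0) for p in partitions])
    print("global result:", sum(partials))         # combine per-place results
```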
NASA Astrophysics Data System (ADS)
Vilotte, J.-P.; Atkinson, M.; Michelini, A.; Igel, H.; van Eck, T.
2012-04-01
Increasingly dense seismic and geodetic networks are continuously transmitting a growing wealth of data from around the world. The multi-use of these data led the seismological community to pioneer globally distributed open-access data infrastructures, standard services and formats, e.g., the Federation of Digital Seismograph Networks (FDSN) and the European Integrated Data Archives (EIDA). Our ability to acquire observational data outpaces our ability to manage, analyze and model them. Research in seismology is today facing a fundamental paradigm shift. Enabling advanced data-intensive analysis and modeling applications challenges conventional storage, computation and communication models and requires a new holistic approach. It is instrumental to exploit the cornucopia of data, and to guarantee optimal operation and design of the high-cost monitoring facilities. The strategy of VERCE is driven by the needs of the seismological data-intensive applications in data analysis and modeling. It aims to provide a comprehensive architecture and framework adapted to the scale and the diversity of those applications, integrating the data infrastructures with Grid, Cloud and HPC infrastructures. It will allow prototyping solutions for new use cases as they emerge within the European Plate Observing System (EPOS), the ESFRI initiative of the solid Earth community. Computational seismology, and information management, is increasingly revolving around massive amounts of data that stem from: (1) the flood of data from the observational systems; (2) the flood of data from large-scale simulations and inversions; (3) the ability to economically store petabytes of data online; (4) the evolving Internet and data-aware computing capabilities. As data-intensive applications rapidly increase in scale and complexity, they require additional service-oriented architectures offering virtualization-based flexibility for complex and re-usable workflows. Scientific information management poses computer-science challenges: acquisition, organization, query and visualization tasks scale almost linearly with the data volumes. The commonly used FTP-grep metaphor allows gigabyte-sized datasets to be scanned today, but will not work for scanning terabyte-sized continuous waveform datasets. New data analysis and modeling methods, exploiting the signal coherence within dense network arrays, are nonlinear. Pair-algorithms on N points scale as N^2. Waveform inversion and stochastic simulations raise computing and data handling challenges. These applications are unfeasible for tera-scale datasets without new parallel algorithms that use near-linear processing, storage and bandwidth, and that can exploit new computing paradigms enabled by the intersection of several technologies (HPC, parallel scalable database crawlers, data-aware HPC). These issues will be discussed based on a number of core pilot data-intensive applications and use cases retained in VERCE. These core applications are related to: (1) data processing and data analysis methods based on correlation techniques; (2) CPU-intensive applications such as large-scale simulation of synthetic waveforms in complex Earth systems, and full waveform inversion and tomography. We shall analyze their workflow and data flow, and their requirements for a new service-oriented architecture and a data-aware platform with services and tools.
Finally, we will outline the importance of a new collaborative environment between seismology and computer science, together with the need for the emergence and the recognition of 'research technologists' mastering the evolving data-aware technologies and the data-intensive research goals in seismology.
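A small sketch of the pairwise-correlation workload mentioned above: N stations give N(N-1)/2 pairs, which is the quadratic scaling noted in the abstract. The synthetic waveforms and the normalization are illustrative assumptions, not the VERCE processing chain.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n_stations, n_samples = 8, 4096
waveforms = rng.standard_normal((n_stations, n_samples))   # synthetic traces

def normalized_cross_correlation(a, b):
    """Full normalized cross-correlation of two equal-length traces."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full")

# N stations -> N*(N-1)/2 pairs: the quadratic cost noted in the abstract.
best_lags = {}
for i, j in combinations(range(n_stations), 2):
    cc = normalized_cross_correlation(waveforms[i], waveforms[j])
    best_lags[(i, j)] = int(np.argmax(cc)) - (n_samples - 1)   # lag of peak correlation
print(f"{len(best_lags)} station pairs processed; example lag:", best_lags[(0, 1)])
```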
Spacecraft Water Monitoring: Adapting to an Era of Emerging Scientific Challenges
NASA Technical Reports Server (NTRS)
McCoy, J. Torin
2009-01-01
This viewgraph presentation reviews spacecraft water monitoring, and the scientific challenges associated with spacecraft water quality. The contents include: 1) Spacecraft Water 101; 2) Paradigm Shift; and 3) Technology Needs.
ERIC Educational Resources Information Center
Alshihri, Bandar A.
2017-01-01
Cloud computing is a recent computing paradigm that has been integrated into the educational system. It provides numerous opportunities for delivering a variety of computing services in a way that has not been experienced before. The Google Company is among the top business companies that afford their cloud services by launching a number of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brier, Soren; Joslyn, Cliff A.
2013-04-01
This paper presents a critical analysis of code-semiotics, which we see as the latest attempt to create a paradigmatic foundation for solving the question of the emergence of life and consciousness. We view code semiotics as an attempt to revise the empirical scientific Darwinian paradigm, and to go beyond the complex systems, emergence, self-organization, and informational paradigms, as well as the selfish gene theory of Dawkins and the Peircean pragmaticist semiotic theory built on the simultaneous types of evolution. As such it is a new and bold attempt to use semiotics to solve the problems created by the evolutionary paradigm's commitment to produce a theory of how to connect the two sides of the Cartesian dualistic view of physical reality and consciousness in a consistent way.
A computational framework to empower probabilistic protein design
Fromer, Menachem; Yanover, Chen
2008-01-01
Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the distributions predicted significantly differ from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717
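A toy illustration of the structural paradigm described above: sequences are weighted by a Boltzmann distribution over their free energies, and positional amino-acid marginals are computed by brute-force enumeration on a tiny alphabet and length (which the paper's belief propagation approximates, since enumeration is exponential). The energy model here is entirely synthetic.

```python
import itertools
import numpy as np

alphabet = "ACDE"        # tiny 4-letter alphabet for illustration
length = 3               # tiny length so exhaustive enumeration is feasible
kT = 1.0
rng = np.random.default_rng(0)

# Synthetic per-position and nearest-neighbor pair energies (stand-in for a
# backbone-dependent free-energy model).
energy_table = rng.uniform(0.0, 2.0, size=(length, len(alphabet)))
pair_table = rng.uniform(0.0, 1.0, size=(len(alphabet), len(alphabet)))

def free_energy(seq):
    idx = [alphabet.index(aa) for aa in seq]
    single = sum(energy_table[i, a] for i, a in enumerate(idx))
    pairwise = sum(pair_table[idx[i], idx[i + 1]] for i in range(length - 1))
    return single + pairwise

seqs = list(itertools.product(alphabet, repeat=length))
energies = np.array([free_energy(s) for s in seqs])
weights = np.exp(-energies / kT)
probs = weights / weights.sum()                 # Boltzmann distribution over sequences

# Positional marginals p_i(aa) -- what BP approximates for realistic sizes.
marginals = np.zeros((length, len(alphabet)))
for p, seq in zip(probs, seqs):
    for i, aa in enumerate(seq):
        marginals[i, alphabet.index(aa)] += p
print(np.round(marginals, 3))
```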
Think globally, research locally: paradigms and place in agroecological research.
Reynolds, Heather L; Smith, Alex A; Farmer, James R
2014-10-01
Conducting science for practical ends implicates scientists, whether they wish it or not, as agents in social-ecological systems, raising ethical, economic, environmental, and political issues. Considering these issues helps scientists to increase the relevance and sustainability of research outcomes. As we rise to the worthy call to connect basic research with food production, scientists have the opportunity to evaluate alternative food production paradigms and consider how our research funds and efforts are best employed. In this contribution, we review some of the problems produced by science conducted in service of industrial agriculture and its associated economic growth paradigm. We discuss whether the new concept of "ecological intensification" can rescue the industrial agriculture/growth paradigm and present an emerging alternative paradigm of decentralized, localized, biodiversity-promoting agriculture for a steady-state economy. This "custom fit" agriculture engages constructively with complex and highly localized ecosystems, and we draw from examples of published work to demonstrate how ecologists can contribute by using approaches that acknowledge local agricultural practices and draw on community participation. © 2014 Botanical Society of America, Inc.
Blackford, Martha G; Falletta, Lynn; Andrews, David A; Reed, Michael D
2012-09-01
To fulfill Food and Drug Administration and Department of Health and Human Services emergency care research informed consent requirements, our burn center planned and executed a deferred consent strategy gaining Institutional Review Board (IRB) approval to proceed with the clinical study. These federal regulations dictate public disclosure and community consultation unique to acute care research. Our regional burn center developed and implemented a deferred consent public notification and community consultation paradigm appropriate for a burn study. Published accounts of deferred consent strategies focus on acute care resuscitation practices. We adapted those strategies to design and conduct a comprehensive public notification/community consultation plan to satisfy deferred consent requirements for burn center research. To implement a robust media campaign we engaged the hospital's public relations department, distributed media materials, recruited hospital staff for speaking engagements, enlisted community volunteers, and developed initiatives to inform "hard-to-reach" populations. The hospital's IRB determined we fulfilled our obligation to notify the defined community. Our communication strategy should provide a paradigm other burn centers may appropriate and adapt when planning and executing a deferred consent initiative. Copyright © 2012 Elsevier Ltd and ISBI. All rights reserved.
Jahn-Teller effect in molecular electronics: quantum cellular automata
NASA Astrophysics Data System (ADS)
Tsukerblat, B.; Palii, A.; Clemente-Juan, J. M.; Coronado, E.
2017-05-01
The article summarizes the main results of application of the theory of the Jahn-Teller (JT) and pseudo JT effects to the description of molecular quantum dot cellular automata (QCA), a new paradigm of quantum computing. The following issues are discussed: 1) QCA as a new paradigm of quantum computing, principles and advantages; 2) molecular implementation of QCA; 3) role of the JT effect in charge trapping, encoding of binary information in the quantum cell and non-linear cell-cell response; 4) spin-switching in molecular QCA based on mixed-valence cell; 5) intervalence optical absorption in tetrameric molecular mixed-valence cell through the symmetry assisted approach to the multimode/multilevel JT and pseudo JT problems.
Real-Time Joint Streaming Data Processing from Social and Physical Sensors
NASA Astrophysics Data System (ADS)
Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M.
2014-12-01
The technological breakthroughs in computing that have taken place over the last few decades make it possible to achieve emergency management objectives that focus on saving human lives and decreasing economic effects. In particular, the integration of a wide variety of information sources, including observations from spatially-referenced physical sensors and new social media sources, enables better real-time seismic hazard analysis through distributed computing networks. The main goal of this work is to utilize innovative computational algorithms for better real-time seismic risk analysis by integrating different data sources and processing tools into streaming and cloud computing applications. The Geological Survey of Canada operates the Canadian National Seismograph Network (CNSN) with over 100 high-gain instruments and 60 low-gain or strong motion seismographs. The processing of the continuous data streams from each station of the CNSN provides the opportunity to detect possible earthquakes in near real-time. The information from physical sources is combined to calculate a location and magnitude for an earthquake. The automatically calculated results are not always sufficiently precise and prompt, which can significantly delay the response to a felt or damaging earthquake. Social sensors, here represented as Twitter users, can provide information earlier to the general public and more rapidly to the emergency planning and disaster relief agencies. We introduce joint streaming data processing from social and physical sensors in real time, based on the idea that social media observations serve as proxies for physical sensors. By using the streams of data in the form of Twitter messages, each of which has an associated time and location, we can extract information related to a target event and perform enhanced analysis by combining it with physical sensor data. Results of this work suggest that the use of data from social media, in conjunction with the development of innovative computing algorithms and combined with sensor data, can provide a new paradigm for real-time earthquake detection in order to facilitate rapid and inexpensive natural risk reduction.
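A minimal sketch of the joint-stream idea: declare a candidate event only when a physical-sensor trigger is followed shortly by a burst in tweet rate. The synthetic trigger times, tweet stream, window length, and burst threshold are all illustrative assumptions, not the authors' detection algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
seconds = 600
# Synthetic physical-sensor trigger times and a tweet stream with one injected burst.
sensor_triggers = np.sort(rng.choice(seconds, size=5, replace=False)).astype(float)
tweet_times = np.sort(np.concatenate([
    rng.uniform(0, seconds, 1200),                       # background chatter
    rng.normal(sensor_triggers[2] + 45.0, 10.0, 800),    # burst after one trigger
]))

def tweet_rate(t, window=30.0):
    """Tweets per second in the window ending at time t."""
    return np.sum((tweet_times > t - window) & (tweet_times <= t)) / window

baseline = len(tweet_times) / seconds
for t in sensor_triggers:
    rate = tweet_rate(t + 60.0)            # look shortly after the physical trigger
    if rate > 2.0 * baseline:              # assumed burst threshold
        print(f"joint detection near t={t:.0f}s (tweet rate {rate:.2f}/s)")
    else:
        print(f"sensor-only trigger at t={t:.0f}s (tweet rate {rate:.2f}/s)")
```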
User-centered design in brain-computer interfaces-a case study.
Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael
2013-10-01
The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken; that of a user-centered design. It is the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered from an ischemic brain stem stroke, leading to a severe motor- and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies on both categories, the visual paradigm could be used with lower cognitive workload. Besides attention and working memory, several other neurophysiological and -psychological indicators - and the role they play in the BCIs at hand - are discussed. The user's performance on the first BCI paradigm would typically have excluded her from further ERP-based BCI studies. However, this study clearly shows that, with the numerous paradigms now at our disposal, the pursuit for a functioning BCI system should not be stopped after an initial failed attempt. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
Distributed GPU Computing in GIScience
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.
2013-12-01
Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges caused by the increasing amount of datasets from different domains, such as social media, earth observation, and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, with GPU-based technology maturing in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative microprocessor, has outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) for each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build up a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as graphics-targeted devices, are used to greatly improve the rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. Visualization and Computer Graphics, IEEE Transactions on, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
[The dimension of the paradigm of complexity in health systems].
Fajardo-Ortiz, Guillermo; Fernández-Ortega, Miguel Ángel; Ortiz-Montalvo, Armando; Olivares-Santos, Roberto Antonio
2015-01-01
This article presents elements for better understanding health systems from the complexity paradigm, an innovative perspective that offers alternatives to the prevailing linear conception of scientific knowledge and is characterized by the appearance of emergent, dissociative behaviors. Based on intra- and trans-disciplinary concepts, such knowledge explains and understands in a different way what happens in health systems, with a view to efficiency and effectiveness. The complexity paradigm means another way of conceptualizing knowledge: it differs from the prevailing epistemology, is still under construction, does not separate or isolate, is not reductionist or fixed, and does not solve problems but gives other bases for knowing and studying them. It is a different strategy, a perspective grounded in systems theory, informatics and cybernetics, beyond traditional knowledge, positivist logic, Newtonian physics and symmetric mathematics, in which everything is centered and balanced; it joins the "soft sciences and hard sciences" and takes into account the Social Determinants of Health and organizational culture. Under the complexity paradigm, health systems are identified with the following concepts: entropy, negentropy, the second law of thermodynamics, attractors, chaos theory, fractals, self-management and self-organization, emergent behaviors, percolation, uncertainty, networks and robustness; such expressions open new possibilities to improve the management and understanding of health systems, giving rise to considering health systems as complex adaptive systems. Copyright © 2015. Published by Masson Doyma México S.A.
O'Hare, Ann M.; Rodriguez, Rudolph A.; Bowling, Christopher Barrett
2016-01-01
The last several decades have witnessed the emergence of evidence-based medicine as the dominant paradigm for medical teaching, research and practice. Under an evidence-based approach, populations rather than individuals become the primary focus of investigation. Treatment priorities are largely shaped by the availability, relevance and quality of evidence and study outcomes and results are assumed to have more or less universal significance based on their implications at the population level. However, population-level treatment goals do not always align with what matters the most to individual patients—who may weigh the risks, benefits and harms of recommended treatments quite differently. In this article we describe the rise of evidence-based medicine in historical context. We discuss limitations of this approach for supporting real-world treatment decisions—especially in older adults with confluent comorbidity, functional impairment and/or limited life expectancy—and we describe the emergence of more patient-centered paradigms to address these limitations. We explain how the principles of evidence-based medicine have helped to shape contemporary approaches to defining, classifying and managing patients with chronic kidney disease. We discuss the limitations of this approach and the potential value of a more patient-centered paradigm, with a particular focus on the care of older adults with this condition. We conclude by outlining ways in which the evidence-base might be reconfigured to better support real-world treatment decisions in individual patients and summarize relevant ongoing initiatives. PMID:25637639
NASA Technical Reports Server (NTRS)
VanZandt, John
1994-01-01
The usage model of supercomputers for scientific applications, such as computational fluid dynamics (CFD), has changed over the years. Scientific visualization has moved scientists away from looking at numbers to looking at three-dimensional images, which capture the meaning of the data. This change has impacted the system models for computing. This report details the model which is used by scientists at NASA's research centers.
ERIC Educational Resources Information Center
Breeding, Marshall
1998-01-01
The Online Computer Library Center's (OCLC) access options have kept pace with the evolving trends in telecommunications and the library computing environment. As libraries deploy microcomputers and develop networks, OCLC offers access methods consistent with these environments. OCLC works toward reorienting its network paradigm through TCP/IP…
The evolving role of telecommunications switching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Personick, S.D.
1993-01-01
There are many forces impacting the evolution of switching vis-a-vis its role in telecommunications/information networking. Many of the technologies that in the past 15 years have enabled the cost reductions the industry has experienced in digital switches, and the emergence of intelligent networks, are now also enabling a wide range of new end-user applications. Many of these applications are rapidly emerging and evolving to meet the, as yet, uncertain needs of the marketplace. There is an explosion of new ideas for applications involving personalized, nomadic communications, multimedia communications, and information access. Some of these will succeed in the marketplace and some will not. There is a continuing emergence of new and improved underlying electronic and photonic technologies and, most recently, the emergence of reliable, secure distributed computing, communications, and management environments. End-user CPE and servers have become increasingly powerful and cost effective as places to locate session (call) management and session enabling objects such as user-interfaces, directories, agents, multimedia bridges, and storage/server subsystems. Not only are dramatically new paradigms for building networks to support existing applications possible, but there is a pressing need to support the emerging and evolving new applications in a timely way. Competition is accelerating the rate of introduction of new technologies, architectures, and telecommunication services. Every aspect of the business is being reexamined to find better ways of meeting customers' needs more efficiently. Meanwhile, as new applications become deployed, there are increasing pressures to provide for security, privacy, and network integrity. This article reviews the author's personal views (many of which are widely shared by others) of the implications of all of these forces on what we traditionally call telecommunications switching. 10 refs.
Performance of OVERFLOW-D Applications based on Hybrid and MPI Paradigms on IBM Power4 System
NASA Technical Reports Server (NTRS)
Djomehri, M. Jahed; Biegel, Bryan (Technical Monitor)
2002-01-01
This report briefly discusses our preliminary performance experiments with parallel versions of OVERFLOW-D applications. These applications are based on MPI and hybrid paradigms on the IBM Power4 system here at the NAS Division. This work is part of an effort to determine the suitability of the system and its parallel libraries (MPI/OpenMP) for specific scientific computing objectives.
2017-06-01
and Joint doctrine, to establish a new paradigm for capturing, storing and transmitting the informational elements of an operations order. We... developed a working technology demonstrator that incorporates a new object-oriented data structure into existing open-source mobile technology to provide... instructor and practitioner in conjunction with Marine Corps and Joint doctrine.
Tangible Landscape: Cognitively Grasping the Flow of Water
NASA Astrophysics Data System (ADS)
Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.
2016-06-01
Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing - an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it - aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.
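A schematic of the continuous scan, model, project cycle described above. The scanning, flow-simulation, and projection functions below are placeholders for illustration only; they are not the Tangible Landscape or GRASS GIS API.

```python
import time
import numpy as np

def scan_surface():
    """Placeholder for 3D-scanning the malleable physical model into a DEM grid."""
    return np.random.default_rng().random((100, 100))

def simulate_flow(dem):
    """Placeholder for a GIS flow-accumulation step; here a crude proxy in which
    lower cells receive more 'flow'."""
    return dem.max() - dem

def project(layer):
    """Placeholder for projecting the computed raster back onto the physical model."""
    print("projected layer with flow range",
          round(float(layer.min()), 3), "to", round(float(layer.max()), 3))

# Continuous coupling loop: scan -> model -> project.
for _ in range(3):                 # a few iterations instead of an endless loop
    dem = scan_surface()
    flow = simulate_flow(dem)
    project(flow)
    time.sleep(0.1)                # placeholder for waiting on the next scan
```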
A Gaze Independent Brain-Computer Interface Based on Visual Stimulation through Closed Eyelids
NASA Astrophysics Data System (ADS)
Hwang, Han-Jeong; Ferreria, Valeria Y.; Ulrich, Daniel; Kilic, Tayfun; Chatziliadis, Xenofon; Blankertz, Benjamin; Treder, Matthias
2015-10-01
A classical brain-computer interface (BCI) based on visual event-related potentials (ERPs) is of limited application value for paralyzed patients with severe oculomotor impairments. In this study, we introduce a novel gaze independent BCI paradigm that can be potentially used for such end-users because visual stimuli are administered on closed eyelids. The paradigm involved verbally presented questions with 3 possible answers. Online BCI experiments were conducted with twelve healthy subjects, where they selected one option by attending to one of three different visual stimuli. It was confirmed that typical cognitive ERPs can be evidently modulated by the attention of a target stimulus in eyes-closed and gaze independent condition, and further classified with high accuracy during online operation (74.58% ± 17.85 s.d.; chance level 33.33%), demonstrating the effectiveness of the proposed novel visual ERP paradigm. Also, stimulus-specific eye movements observed during stimulation were verified as reflex responses to light stimuli, and they did not contribute to classification. To the best of our knowledge, this study is the first to show the possibility of using a gaze independent visual ERP paradigm in an eyes-closed condition, thereby providing another communication option for severely locked-in patients suffering from complex ocular dysfunctions.
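A worked check of how an accuracy like the 74.58% reported above compares with the 33.33% chance level, assuming (purely for illustration) a binomial model and a hypothetical number of selections per subject; the trial count below is not stated in the abstract.

```python
from scipy.stats import binom

n_trials = 30           # assumed number of selections per subject (hypothetical)
chance = 1.0 / 3.0      # three possible answers
observed_acc = 0.7458
k = round(observed_acc * n_trials)

# One-sided probability of doing at least this well by random guessing.
p_value = binom.sf(k - 1, n_trials, chance)
print(f"{k}/{n_trials} correct; P(>= {k} correct by chance) = {p_value:.2e}")
```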
Cheng, Jiao; Jin, Jing; Daly, Ian; Zhang, Yu; Wang, Bei; Wang, Xingyu; Cichocki, Andrzej
2018-02-13
Brain-computer interface (BCI) systems can allow their users to communicate with the external world by recognizing intention directly from their brain activity without the assistance of the peripheral motor nervous system. The P300-speller is one of the most widely used visual BCI applications. In previous studies, a flip stimulus (rotating the background area of the character) that was based on apparent motion, suffered from less refractory effects. However, its performance was not improved significantly. In addition, a presentation paradigm that used a "zooming" action (changing the size of the symbol) has been shown to evoke relatively higher P300 amplitudes and obtain a better BCI performance. To extend this method of stimuli presentation within a BCI and, consequently, to improve BCI performance, we present a new paradigm combining both the flip stimulus with a zooming action. This new presentation modality allowed BCI users to focus their attention more easily. We investigated whether such an action could combine the advantages of both types of stimuli presentation to bring a significant improvement in performance compared to the conventional flip stimulus. The experimental results showed that the proposed paradigm could obtain significantly higher classification accuracies and bit rates than the conventional flip paradigm (p<0.01).
Shin, Jaeyoung; Müller, Klaus-R; Hwang, Han-Jeong
2016-01-01
We propose a near-infrared spectroscopy (NIRS)-based brain-computer interface (BCI) that can be operated in eyes-closed (EC) state. To evaluate the feasibility of NIRS-based EC BCIs, we compared the performance of an eye-open (EO) BCI paradigm and an EC BCI paradigm with respect to hemodynamic response and classification accuracy. To this end, subjects performed either mental arithmetic or imagined vocalization of the English alphabet as a baseline task with very low cognitive loading. The performances of two linear classifiers were compared; resulting in an advantage of shrinkage linear discriminant analysis (LDA). The classification accuracy of EC paradigm (75.6 ± 7.3%) was observed to be lower than that of EO paradigm (77.0 ± 9.2%), which was statistically insignificant (p = 0.5698). Subjects reported they felt it more comfortable (p = 0.057) and easier (p < 0.05) to perform the EC BCI tasks. The different task difficulty may become a cause of the slightly lower classification accuracy of EC data. From the analysis results, we could confirm the feasibility of NIRS-based EC BCIs, which can be a BCI option that may ultimately be of use for patients who cannot keep their eyes open consistently. PMID:27824089
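The abstract reports an advantage for shrinkage LDA; below is a minimal sketch of such a classifier on synthetic two-class data using scikit-learn's Ledoit-Wolf shrinkage option. The feature dimensions and toy data are assumptions; the real study used NIRS hemodynamic features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features = 60, 40          # few trials, many features: where shrinkage helps
X_task = rng.normal(0.3, 1.0, (n_trials // 2, n_features))    # e.g., mental arithmetic
X_base = rng.normal(0.0, 1.0, (n_trials // 2, n_features))    # baseline task
X = np.vstack([X_task, X_base])
y = np.array([1] * (n_trials // 2) + [0] * (n_trials // 2))

# Shrinkage LDA: 'lsqr' solver with automatic (Ledoit-Wolf) shrinkage.
lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("CV accuracy:", cross_val_score(lda, X, y, cv=5).mean())
```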
Special Issue "Biomaterials and Bioprinting".
Chua, Chee Kai; Yeong, Wai Yee; An, Jia
2016-09-14
The emergence of bioprinting in recent years represents a marvellous advancement in 3D printing technology. It expands the range of 3D printable materials from the world of non-living materials into the world of living materials. Biomaterials play an important role in this paradigm shift. This Special Issue focuses on biomaterials and bioprinting and contains eight articles covering a number of recent topics in this emerging area.
KODAMA and VPC based Framework for Ubiquitous Systems and its Experiment
NASA Astrophysics Data System (ADS)
Takahashi, Kenichi; Amamiya, Satoshi; Iwao, Tadashige; Zhong, Guoqiang; Kainuma, Tatsuya; Amamiya, Makoto
Recently, agent technologies have attracted a lot of interest as an emerging programming paradigm. With such agent technologies, services are provided through collaboration among agents. At the same time, the spread of mobile technologies and communication infrastructures has made it possible to access the network anytime and from anywhere. Using agents and mobile technologies to realize ubiquitous computing systems, we propose a new framework based on KODAMA and VPC. KODAMA provides distributed management mechanisms by using the concept of community and communication infrastructure to deliver messages among agents without agents being aware of the physical network. VPC provides a method of defining peer-to-peer services based on agent communication with policy packages. By merging the characteristics of both KODAMA and VPC functions, we propose a new framework for ubiquitous computing environments. It provides distributed management functions according to the concept of agent communities, agent communications which are abstracted from the physical environment, and agent collaboration with policy packages. Using our new framework, we conducted a large-scale experiment in shopping malls in Nagoya, which sent advertisement e-mails to users' cellular phones according to user location and attributes. The empirical results showed that our new framework worked effectively for sales in shopping malls.
A multiply-add engine with monolithically integrated 3D memristor crossbar/CMOS hybrid circuit.
Chakrabarti, B; Lastras-Montaño, M A; Adam, G; Prezioso, M; Hoskins, B; Payvand, M; Madhavan, A; Ghofrani, A; Theogarajan, L; Cheng, K-T; Strukov, D B
2017-02-14
Silicon (Si) based complementary metal-oxide semiconductor (CMOS) technology has been the driving force of the information-technology revolution. However, scaling of CMOS technology as per Moore's law has reached a serious bottleneck. Among the emerging technologies, memristive devices are promising for both memory and computing applications. Hybrid CMOS/memristor circuits with the CMOL (CMOS + "Molecular") architecture have been proposed to combine the extremely high density of memristive devices with the robustness of CMOS technology, leading to terabit-scale memory and an extremely efficient computing paradigm. In this work, we demonstrate a hybrid 3D CMOL circuit with two layers of memristive crossbars monolithically integrated on a pre-fabricated CMOS substrate. The integrated crossbars can be fully operated through the underlying CMOS circuitry. The memristive devices in both layers exhibit analog switching behavior with controlled tunability and stable multi-level operation. We perform dot-product operations with the 2D and 3D memristive crossbars to demonstrate the applicability of such 3D CMOL hybrid circuits as a multiply-add engine. To the best of our knowledge, this is the first demonstration of a functional 3D CMOL hybrid circuit.
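As a point of reference for the dot-product operation described above, the following numpy sketch shows the textbook crossbar idea: each cross-point stores a conductance, and applying read voltages to the rows yields column currents equal to a vector-matrix product. The array size and conductance range are hypothetical; this is a software illustration, not the fabricated circuit.

```python
# Illustrative sketch (not the authors' hardware): a memristor crossbar computes a
# vector-matrix product in the analog domain. Each cross-point stores a conductance
# G[i, j]; applying voltages V to the rows yields column currents I = V @ G
# (Ohm's law per device, Kirchhoff's current law per column).
import numpy as np

rng = np.random.default_rng(1)
G = rng.uniform(1e-6, 1e-4, size=(8, 4))   # hypothetical conductances, siemens
V = rng.uniform(0.0, 0.2, size=8)          # read voltages applied to the 8 rows, volts

I = V @ G                                  # one multiply-add per device, summed per column
print("column currents (A):", I)
```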
NASA Astrophysics Data System (ADS)
Srinivasan, Gopalakrishnan; Sengupta, Abhronil; Roy, Kaushik
2016-07-01
Spiking Neural Networks (SNNs) have emerged as a powerful neuromorphic computing paradigm to carry out classification and recognition tasks. Nevertheless, general-purpose computing platforms and custom hardware architectures implemented using standard CMOS technology have been unable to rival the power efficiency of the human brain. Hence, there is a need for novel nanoelectronic devices that can efficiently model the neurons and synapses constituting an SNN. In this work, we propose a heterostructure composed of a Magnetic Tunnel Junction (MTJ) and a heavy metal as a stochastic binary synapse. Synaptic plasticity is achieved by the stochastic switching of the MTJ conductance states, based on the temporal correlation between the spiking activities of the interconnecting neurons. Additionally, we present a significance-driven long-term short-term stochastic synapse comprising two unique binary synaptic elements, in order to improve the synaptic learning efficiency. We demonstrate the efficacy of the proposed synaptic configurations and the stochastic learning algorithm on an SNN trained to classify handwritten digits from the MNIST dataset, using a device-to-system-level simulation framework. The power efficiency of the proposed neuromorphic system stems from the ultra-low programming energy of the spintronic synapses.
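The following toy sketch illustrates the general idea of a stochastic binary synapse whose switching probability depends on the timing correlation between pre- and post-synaptic spikes; the time constant, peak probability, and update rule are illustrative assumptions, not the device physics or learning rule of the paper.

```python
# Toy sketch of a stochastic binary synapse (not the device model from the paper):
# the synapse has two conductance states, and the probability of switching on a
# potentiation/depression event decays with the pre-post spike time difference.
import numpy as np

rng = np.random.default_rng(2)
TAU_MS = 20.0          # hypothetical plasticity time constant
P_MAX = 0.3            # hypothetical peak switching probability

def update_synapse(state, t_pre_ms, t_post_ms):
    """Return the new binary state (0 = low, 1 = high conductance)."""
    dt = t_post_ms - t_pre_ms
    p_switch = P_MAX * np.exp(-abs(dt) / TAU_MS)
    if rng.random() < p_switch:
        return 1 if dt > 0 else 0   # post-after-pre potentiates, pre-after-post depresses
    return state

state = 0
for t_pre, t_post in [(0, 5), (10, 8), (20, 22)]:
    state = update_synapse(state, t_pre, t_post)
    print(f"pre={t_pre} ms, post={t_post} ms -> state={state}")
```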
Niamtu , J
2001-08-01
Carousel slide presentations have been used for academic and clinical presentations since the late 1950s. However, advances in computer technology have caused a paradigm shift, and digital presentations are quickly becoming standard for clinical presentations. The advantages of digital presentations include cost savings; portability; easy updating capability; Internet access; multimedia functions, such as animation, pictures, video, and sound; and customization to augment audience interest and attention. Microsoft PowerPoint has emerged as the most popular digital presentation software and is currently used by many practitioners with and without significant computer expertise. The user-friendly platform of PowerPoint enables even the novice presenter to incorporate digital presentations into his or her profession. PowerPoint offers many advanced options that, with a minimal investment of time, can be used to create more interactive and professional presentations for lectures, patient education, and marketing. Examples of advanced PowerPoint applications are presented in a stepwise manner to unveil the full power of PowerPoint. By incorporating these techniques, medical practitioners can easily personalize, customize, and enhance their PowerPoint presentations. Complications, pitfalls, and caveats are discussed to detour and prevent misadventures in digital presentations. Relevant Web sites are listed to further update, customize, and communicate PowerPoint techniques.
A multiply-add engine with monolithically integrated 3D memristor crossbar/CMOS hybrid circuit
Chakrabarti, B.; Lastras-Montaño, M. A.; Adam, G.; Prezioso, M.; Hoskins, B.; Cheng, K.-T.; Strukov, D. B.
2017-01-01
Silicon (Si) based complementary metal-oxide semiconductor (CMOS) technology has been the driving force of the information-technology revolution. However, scaling of CMOS technology as per Moore’s law has reached a serious bottleneck. Among the emerging technologies, memristive devices are promising for both memory and computing applications. Hybrid CMOS/memristor circuits with the CMOL (CMOS + “Molecular”) architecture have been proposed to combine the extremely high density of memristive devices with the robustness of CMOS technology, leading to terabit-scale memory and an extremely efficient computing paradigm. In this work, we demonstrate a hybrid 3D CMOL circuit with two layers of memristive crossbars monolithically integrated on a pre-fabricated CMOS substrate. The integrated crossbars can be fully operated through the underlying CMOS circuitry. The memristive devices in both layers exhibit analog switching behavior with controlled tunability and stable multi-level operation. We perform dot-product operations with the 2D and 3D memristive crossbars to demonstrate the applicability of such 3D CMOL hybrid circuits as a multiply-add engine. To the best of our knowledge, this is the first demonstration of a functional 3D CMOL hybrid circuit. PMID:28195239
Srinivasan, Gopalakrishnan; Sengupta, Abhronil; Roy, Kaushik
2016-07-13
Spiking Neural Networks (SNNs) have emerged as a powerful neuromorphic computing paradigm to carry out classification and recognition tasks. Nevertheless, general-purpose computing platforms and custom hardware architectures implemented using standard CMOS technology have been unable to rival the power efficiency of the human brain. Hence, there is a need for novel nanoelectronic devices that can efficiently model the neurons and synapses constituting an SNN. In this work, we propose a heterostructure composed of a Magnetic Tunnel Junction (MTJ) and a heavy metal as a stochastic binary synapse. Synaptic plasticity is achieved by the stochastic switching of the MTJ conductance states, based on the temporal correlation between the spiking activities of the interconnecting neurons. Additionally, we present a significance-driven long-term short-term stochastic synapse comprising two unique binary synaptic elements, in order to improve the synaptic learning efficiency. We demonstrate the efficacy of the proposed synaptic configurations and the stochastic learning algorithm on an SNN trained to classify handwritten digits from the MNIST dataset, using a device-to-system-level simulation framework. The power efficiency of the proposed neuromorphic system stems from the ultra-low programming energy of the spintronic synapses.
In Silico Chemogenomics Drug Repositioning Strategies for Neglected Tropical Diseases.
Andrade, Carolina Horta; Neves, Bruno Junior; Melo-Filho, Cleber Camilo; Rodrigues, Juliana; Silva, Diego Cabral; Braga, Rodolpho Campos; Cravo, Pedro Vitor Lemos
2018-03-08
Only ~1% of all drug candidates against Neglected Tropical Diseases (NTDs) have reached clinical trials in the last decades, underscoring the need for new, safe and effective treatments. In such a context, drug repositioning, which allows finding novel indications for approved drugs whose pharmacokinetic and safety profiles are already known, is emerging as a promising strategy for tackling NTDs. Chemogenomics is a direct descendent of the typical drug discovery process that involves the systematic screening of chemical compounds against drug targets in high-throughput screening (HTS) efforts, for the identification of lead compounds. However, in contrast to the one-drug-one-target paradigm, chemogenomics attempts to identify all potential ligands for all possible targets and diseases. In this review, we summarize current methodological development efforts in drug repositioning that use state-of-the-art computational ligand- and structure-based chemogenomics approaches. Furthermore, we highlight the recent progress in computational drug repositioning for some NTDs, based on curation and modeling of genomic, biological, and chemical data. Additionally, we also present in-house and other successful examples and suggest possible solutions to existing pitfalls. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Is Human-Computer Interaction Social or Parasocial?
ERIC Educational Resources Information Center
Sundar, S. Shyam
Conducted in the attribution-research paradigm of social psychology, a study examined whether human-computer interaction is fundamentally social (as in human-human interaction) or parasocial (as in human-television interaction). All 30 subjects (drawn from an undergraduate class on communication) were exposed to an identical interaction with…
The effects of semantic congruency: a research of audiovisual P300-speller.
Cao, Yong; An, Xingwei; Ke, Yufeng; Jiang, Jin; Yang, Hanjun; Chen, Yuqian; Jiao, Xuejun; Qi, Hongzhi; Ming, Dong
2017-07-25
Over the past few decades, there have been many studies of aspects of brain-computer interfaces (BCIs). Of particular interest are event-related potential (ERP)-based BCI spellers that aim to support mental typewriting. Audiovisual stimulus-based BCI systems have recently attracted much attention from researchers, and most existing studies of audiovisual BCIs were based on semantically incongruent stimulus paradigms. However, no related studies had reported whether there is a difference in system performance or participant comfort between a BCI based on a semantically congruent paradigm and one based on a semantically incongruent paradigm. The goal of this study was to investigate the effects of semantic congruency on system performance and participant comfort in an audiovisual BCI. Two audiovisual paradigms (semantically congruent and incongruent) were adopted, and 11 healthy subjects participated in the experiment. High-density electrical mapping of ERPs and behavioral data were measured for the two stimulus paradigms. The behavioral data indicated no significant difference between the congruent and incongruent paradigms in offline classification accuracy. Nevertheless, eight of the 11 participants reported a preference for the semantically congruent experiment, two reported no difference between the two conditions, and only one preferred the semantically incongruent paradigm. In addition, the results indicated that higher ERP amplitudes were found in the incongruent stimulus paradigm. In short, the semantically congruent paradigm offered better participant comfort and maintained the same recognition rate as the incongruent paradigm. Furthermore, our study suggests that the design of speller paradigms must take both system performance and user experience into consideration rather than merely pursuing a larger ERP response.
Combaz, Adrien; Van Hulle, Marc M
2015-01-01
We study the feasibility of a hybrid Brain-Computer Interface (BCI) combining simultaneous visual oddball and Steady-State Visually Evoked Potential (SSVEP) paradigms, where both types of stimuli are superimposed on a computer screen. Potentially, such a combination could result in a system able to operate faster than a purely P300-based BCI and encode more targets than a purely SSVEP-based BCI. We analyse the interactions between the brain responses of the two paradigms, and assess the possibility of simultaneously detecting the brain activity evoked by both paradigms, in a series of three experiments in which EEG data are analysed offline. Despite differences in the shape of the P300 response between the pure oddball and hybrid conditions, we observe that the classification accuracy of this P300 response is not affected by the SSVEP stimulation. Nor do we observe any effect of the oddball stimulation on the power of the SSVEP response at the stimulation frequency. Finally, results from the last experiment show the possibility of detecting both types of brain responses simultaneously and suggest not only the feasibility of such a hybrid BCI but also a gain over purely oddball- and purely SSVEP-based BCIs in terms of communication rate.
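As an illustration of the SSVEP side of such a hybrid system, the sketch below estimates spectral power at a few candidate stimulation frequencies and picks the strongest; the sampling rate, frequencies, and single-channel treatment are assumptions, and the authors' actual analysis pipeline is not reproduced.

```python
# Minimal sketch of SSVEP detection for a hybrid BCI (illustrative, not the authors' pipeline):
# estimate spectral power at each candidate stimulation frequency and pick the largest.
import numpy as np

FS = 250.0                           # hypothetical sampling rate, Hz
FREQS = [8.0, 10.0, 12.0, 15.0]      # hypothetical SSVEP stimulation frequencies

def ssvep_target(epoch):
    """epoch: 1-D EEG signal (e.g., one occipital channel) lasting a few seconds."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / FS)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in FREQS]
    return int(np.argmax(powers))

t = np.arange(0, 4.0, 1.0 / FS)
epoch = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.default_rng(3).normal(size=t.size)
print("detected target index:", ssvep_target(epoch))   # expected: 2 (the 12 Hz target)
```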
A Comparison of PETSC Library and HPF Implementations of an Archetypal PDE Computation
NASA Technical Reports Server (NTRS)
Hayder, M. Ehtesham; Keyes, David E.; Mehrotra, Piyush
1997-01-01
Two paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation: a nonlinear, structured-grid partial differential equation boundary value problem, solved with the same algorithm on the same hardware. Both paradigms, parallel libraries represented by Argonne's PETSc, and parallel languages represented by the Portland Group's HPF, are found to be easy to use for this problem class, and both are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required by the application programmer under either paradigm includes specification of the data partitioning (corresponding to a geometrically simple decomposition of the domain of the PDE). Programming in SPMD style for the PETSc library requires writing the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm, introducing concurrency through subdomain blocking (an effort similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Correctness and scalability are cross-validated on up to 32 nodes of an IBM SP2.
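The data-partitioning idea common to both paradigms can be pictured with a small serial sketch: a 1-D field is split into owned blocks, each updated from its own points plus one ghost value taken from each neighbouring block. This illustrates subdomain blocking and global-to-local indexing only; it is neither PETSc nor HPF code, and the block count and stencil are illustrative.

```python
# Serial sketch of subdomain blocking (not PETSc or HPF code): a 1-D Jacobi sweep in
# which each "processor" owns one block and reads a ghost value from each neighbour.
import numpy as np

N, NBLOCKS = 16, 4
u = np.zeros(N)
u[-1] = 1.0                                  # fixed boundary values u[0]=0, u[-1]=1
blocks = np.split(u, NBLOCKS)                # locally owned portions

for _ in range(50):                          # Jacobi iterations
    new_blocks = []
    for b, owned in enumerate(blocks):
        left = blocks[b - 1][-1] if b > 0 else u[0]             # ghost-cell exchange
        right = blocks[b + 1][0] if b < NBLOCKS - 1 else u[-1]
        padded = np.concatenate(([left], owned, [right]))
        new_blocks.append(0.5 * (padded[:-2] + padded[2:]))     # local stencil update
    new_blocks[0][0], new_blocks[-1][-1] = u[0], u[-1]          # re-impose boundary values
    blocks = new_blocks

print(np.concatenate(blocks))                # approaches the linear steady-state profile
```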
Neural network model for automatic traffic incident detection : executive summary.
DOT National Transportation Integrated Search
2001-04-01
Automatic freeway incident detection is an important component of advanced transportation management systems (ATMS) that provides information for emergency relief and traffic control and management purposes. In this research, a multi-paradigm intelli...
Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko
2013-06-18
Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this Article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoebalike spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers when the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
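For contrast with the amoeba-inspired dynamics, a conventional stochastic search of the kind the authors benchmark against can be sketched in a few lines: a random-walk local search over truth assignments that repeatedly flips a variable from an unsatisfied clause. The clause encoding and parameters below are illustrative and do not reproduce the paper's benchmark.

```python
# Random-walk local search for SAT (a conventional stochastic baseline, not the
# amoeba-inspired algorithm). Clauses are lists of signed literals: 2 means x2, -2 means NOT x2.
import random

def random_walk_sat(clauses, n_vars, max_flips=10_000, seed=0):
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                    # satisfying assignment found
        var = abs(rng.choice(rng.choice(unsat)))   # flip a variable from a random unsatisfied clause
        assign[var] = not assign[var]
    return None                              # gave up within the flip budget

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [[1, 2], [-1, 3], [-2, -3]]
print(random_walk_sat(clauses, n_vars=3))
```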
Grid today, clouds on the horizon
NASA Astrophysics Data System (ADS)
Shiers, Jamie
2009-04-01
By the time of CCP 2008, the largest scientific machine in the world - the Large Hadron Collider - had been cooled down as scheduled to its operational temperature of below 2 degrees Kelvin and injection tests were starting. Collisions of proton beams at 5+5 TeV were expected within one to two months of the initial tests, with data taking at design energy ( 7+7 TeV) foreseen for 2009. In order to process the data from this world machine, we have put our "Higgs in one basket" - that of Grid computing [The Worldwide LHC Computing Grid (WLCG), in: Proceedings of the Conference on Computational Physics 2006 (CCP 2006), vol. 177, 2007, pp. 219-223]. After many years of preparation, 2008 saw a final "Common Computing Readiness Challenge" (CCRC'08) - aimed at demonstrating full readiness for 2008 data taking, processing and analysis. By definition, this relied on a world-wide production Grid infrastructure. But change - as always - is on the horizon. The current funding model for Grids - which in Europe has been through 3 generations of EGEE projects, together with related projects in other parts of the world, including South America - is evolving towards a long-term, sustainable e-infrastructure, like the European Grid Initiative (EGI) [The European Grid Initiative Design Study, website at http://web.eu-egi.eu/]. At the same time, potentially new paradigms, such as that of "Cloud Computing" are emerging. This paper summarizes the results of CCRC'08 and discusses the potential impact of future Grid funding on both regional and international application communities. It contrasts Grid and Cloud computing models from both technical and sociological points of view. Finally, it discusses the requirements from production application communities, in terms of stability and continuity in the medium to long term.
Scaling, Similarity, and the Fourth Paradigm for Hydrology
NASA Technical Reports Server (NTRS)
Peters-Lidard, Christa D.; Clark, Martyn; Samaniego, Luis; Verhoest, Niko E. C.; van Emmerik, Tim; Uijlenhoet, Remko; Achieng, Kevin; Franz, Trenton E.; Woods, Ross
2017-01-01
In this synthesis paper addressing hydrologic scaling and similarity, we posit that roadblocks in the search for universal laws of hydrology are hindered by our focus on computational simulation (the third paradigm), and assert that it is time for hydrology to embrace a fourth paradigm of data-intensive science. Advances in information-based hydrologic science, coupled with an explosion of hydrologic data and advances in parameter estimation and modelling, have laid the foundation for a data-driven framework for scrutinizing hydrological scaling and similarity hypotheses. We summarize important scaling and similarity concepts (hypotheses) that require testing, describe a mutual information framework for testing these hypotheses, describe boundary condition, state flux, and parameter data requirements across scales to support testing these hypotheses, and discuss some challenges to overcome while pursuing the fourth hydrological paradigm. We call upon the hydrologic sciences community to develop a focused effort towards adopting the fourth paradigm and apply this to outstanding challenges in scaling and similarity.
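A minimal sketch of the kind of mutual-information estimate such a data-driven framework might employ is shown below: two hydrologic quantities are binned into a joint histogram and I(X;Y) is computed from it. The variables, bin count, and synthetic data are assumptions for illustration, not the framework proposed in the paper.

```python
# Sketch of a histogram-based mutual-information estimate, the kind of information-theoretic
# measure a data-driven scaling/similarity framework might use (variables are hypothetical).
import numpy as np

def mutual_information(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))   # in nats

rng = np.random.default_rng(4)
attribute = rng.normal(size=5000)                       # e.g., a catchment attribute
signature = 0.7 * attribute + rng.normal(size=5000)     # e.g., a related flux signature
print(f"I(attribute; signature) = {mutual_information(attribute, signature):.3f} nats")
```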
Exploring Cloud Computing for Distance Learning
ERIC Educational Resources Information Center
He, Wu; Cernusca, Dan; Abdous, M'hammed
2011-01-01
The use of distance courses in learning is growing exponentially. To better support faculty and students for teaching and learning, distance learning programs need to constantly innovate and optimize their IT infrastructures. The new IT paradigm called "cloud computing" has the potential to transform the way that IT resources are utilized and…
The Impact of Internet-Based Instruction on Teacher Education: The "Paradigm Shift."
ERIC Educational Resources Information Center
Lan, Jiang JoAnn
This study incorporated Internet-based instruction into two education technology courses for preservice teachers. One was a required, undergraduate, beginning-level educational computing course. The other was a graduate, advanced-level computing course. The experiment incorporated Internet-based instruction into course delivery in order to create…
Implementing Project Based Learning in Computer Classroom
ERIC Educational Resources Information Center
Asan, Askin; Haliloglu, Zeynep
2005-01-01
Project-based learning offers the opportunity to apply theoretical and practical knowledge, and to develop the student's group working, and collaboration skills. In this paper we presented a design of effective computer class that implements the well-known and highly accepted project-based learning paradigm. A pre-test/post-test control group…
Developing an Academic Information Resources Master Plan: A Paradigm.
ERIC Educational Resources Information Center
Gentry, H. R.
Since 1980, Johnson County Community College (JCCC) has rapidly expanded its use of personal computers in instruction and administration. Many computers were being acquired, however, before appropriate policies, plans, and procedures were developed for their effective management. In response to this problem, the following actions were identified…
Good Practices in Free-energy Calculations
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christopher
2013-01-01
As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in drug design. Yet, in a number of instances, the reliability of these calculations can be improved significantly if a number of precepts, or good practices, are followed. For the most part, the theory upon which these good practices rely has been known for many years, but is often overlooked or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. The current best practices for carrying out free-energy calculations will be reviewed, demonstrating that, at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. In free-energy perturbation and nonequilibrium work methods, monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision. In thermodynamic integration and probability distribution (histogramming) methods, properly designed adaptive techniques yield nearly uniform sampling of the relevant degrees of freedom and, by doing so, could markedly improve the efficiency and accuracy of free-energy calculations without incurring any additional computational expense.
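For reference, the identities underlying the perturbation and nonequilibrium-work methods mentioned above are the standard free-energy perturbation (Zwanzig) formula and the Jarzynski equality, written here in a common form (not quoted from the review itself):

```latex
\Delta A_{0\rightarrow 1} = -k_{B}T \,\ln \Bigl\langle \exp\!\bigl[-\bigl(U_{1}-U_{0}\bigr)/k_{B}T\bigr] \Bigr\rangle_{0},
\qquad
\exp\!\bigl(-\Delta A/k_{B}T\bigr) = \Bigl\langle \exp\!\bigl(-W/k_{B}T\bigr) \Bigr\rangle ,
```

where the first average is taken over configurations of the reference state 0 and the second over repetitions of a nonequilibrium transformation with work W.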
Grid computing technology for hydrological applications
NASA Astrophysics Data System (ADS)
Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.
2011-06-01
Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, either oriented to tackle scientific challenges or complex problem solving in engineering, are expected to converge together into the so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large, relative to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey, by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.
Promoting a culture of innovation: BJSP and the emergence of new paradigms in social psychology.
Reicher, Stephen
2011-09-01
In this paper, I start by describing the role played by British Journal of Social Psychology (BJSP) in nurturing two important new paradigms in social psychology - the social identity approach and discourse psychology. I then consider the forces in contemporary academia, in general, and psychology, in particular, that militate against innovation. I conclude by suggesting some ways in which individual social psychologists and our journals, particularly BJSP, can contribute to the development of an innovative and intellectually dynamic discipline. ©2011 The British Psychological Society.
Achieving High Performance on the i860 Microprocessor
NASA Technical Reports Server (NTRS)
Lee, King; Kutler, Paul (Technical Monitor)
1998-01-01
The i860 is a high performance microprocessor used in the Intel Touchstone project. This paper proposes a paradigm for programming the i860 that is modelled on the vector instructions of the Cray computers. Fortran callable assembler subroutines were written that mimic the concurrent vector instructions of the Cray. Cache takes the place of vector registers. Using this paradigm we have achieved twice the performance of compiled code on a traditional solve.
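A schematic sketch of the strip-mining idea is shown below: a long array is processed in fixed-size blocks that stay resident in cache, the cache playing the role of the Cray vector registers. The block size and the operation applied are illustrative, and this is not the Fortran-callable i860 assembler described in the paper.

```python
# Schematic sketch of strip-mining: process a long array in cache-sized blocks, with each
# block playing the role of a Cray vector register (block size and operation are illustrative).
import numpy as np

BLOCK = 1024                      # hypothetical "vector length" chosen to fit in cache
x = np.random.default_rng(5).normal(size=1_000_000)
y = np.empty_like(x)

for start in range(0, x.size, BLOCK):
    stop = min(start + BLOCK, x.size)
    chunk = x[start:stop]                          # load the strip once...
    y[start:stop] = 2.5 * chunk + np.sin(chunk)    # ...and apply all operations while it is hot
```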
ERIC Educational Resources Information Center
Kerawalla, Lucinda; Pearce, Darren; Yuill, Nicola; Luckin, Rosemary; Harris, Amanda
2008-01-01
We take a socio-cultural approach to comparing how dual control of a new user interface paradigm--Separate Control of Shared Space (SCOSS)--and dual control of a single user interface can work to mediate the collaborative decision-making process between pairs of children carrying out a multiple categorisation word task on a shared computer.…
NASA Astrophysics Data System (ADS)
Furuya, Haruhisa; Hiratsuka, Mitsuyoshi
This article reviews the historical transition of legal protection for computer software contracts in the United States and presents how such protection should function under the Uniform Commercial Code and its amended Article 2B, the Uniform Computer Information Transactions Act, and the recently approved “Principles of the Law of Software Contracts”.
Quantum Optical Implementations of Current Quantum Computing Paradigms
2005-05-01
Conferences and Proceedings: The results were presented at several conferences, including: 1. M. O. Scully, "Foundations of Quantum Mechanics", in... Applications have revealed a strong connection between the fundamental aspects of quantum mechanics that govern physical systems and the informational... could be solved in polynomial time using quantum computers. Another set of problems where quantum mechanics can carry out computations substantially
NASA Astrophysics Data System (ADS)
Salehi Fashami, Mohammad
Excessive energy dissipation in CMOS devices during switching is the primary threat to continued downscaling of computing devices in accordance with Moore's law. In the quest for alternatives to traditional transistor-based electronics, nanomagnet-based computing [1, 2] is emerging as an attractive alternative since: (i) nanomagnets are intrinsically more energy-efficient than transistors due to the correlated switching of spins [3], and (ii) unlike transistors, magnets have no leakage and hence have no standby power dissipation. However, large energy dissipation in the clocking circuit appears to be a barrier to the realization of ultra-low-power logic devices with such nanomagnets. To alleviate this issue, we propose the use of a hybrid spintronics-straintronics or straintronic nanomagnetic logic (SML) paradigm. This uses a piezoelectric layer elastically coupled to an elliptically shaped magnetostrictive nanomagnetic layer for both logic [4-6] and memory [7-8] and other information processing [9-10] applications that could potentially be 2-3 orders of magnitude more energy efficient than current CMOS-based devices. This dissertation focuses on studying the feasibility, performance and reliability of such nanomagnetic logic circuits by simulating the nanoscale magnetization dynamics of dipole-coupled nanomagnets clocked by stress. Specifically, the topics addressed are: 1. Theoretical study of multiferroic nanomagnetic arrays laid out in specific geometric patterns to implement a "logic wire" for unidirectional information propagation and a universal logic gate [4-6]. 2. Monte Carlo simulations of the magnetization trajectories in a simple system of dipole-coupled nanomagnets and a NAND gate described by the Landau-Lifshitz-Gilbert (LLG) equations simulated in the presence of random thermal noise to understand the dynamic switching error [11, 12] in such devices. 3. Arriving at a lower bound for energy dissipation as a function of switching error [13] for a practical nanomagnetic logic scheme. 4. Clocking of nanomagnetic logic with surface acoustic waves (SAW) to drastically decrease the lithographic burden needed to contact each multiferroic nanomagnet while maintaining pipelined information processing. 5. Nanomagnets with four (or more) states implemented with shape engineering. Two types of magnets that encode four states, (i) diamond and (ii) concave nanomagnets, are studied for coherence of the switching process.
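For reference, the Landau-Lifshitz-Gilbert equation mentioned above is commonly written in the Gilbert form below, with gyromagnetic ratio gamma, damping constant alpha, and an effective field that, for the devices described here, would collect shape anisotropy, dipole coupling, stress anisotropy, and thermal noise contributions (the paper's exact parameterization is not reproduced):

```latex
\frac{d\mathbf{m}}{dt} = -\gamma \,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
+ \alpha \,\mathbf{m}\times\frac{d\mathbf{m}}{dt},
\qquad \mathbf{m} = \mathbf{M}/M_{s}.
```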
Neural network model for automatic traffic incident detection : final report, August 2001.
DOT National Transportation Integrated Search
2001-08-01
Automatic freeway incident detection is an important component of advanced transportation management systems (ATMS) that provides information for emergency relief and traffic control and management purposes. In this research, a multi-paradigm intelli...
Vignes, Tore
2007-01-01
This study is a replication of Sundberg and Sundberg (1990) that compared topography-based verbal behavior with selection-based verbal behavior in terms of acquisition, accuracy, and testing for the emergence of a new verbal relation. Participants were three typical children and three developmentally disabled persons with autism. The study sought to determine which paradigm (topography-based or selection-based) resulted in more rapid acquisition of tacts and intraverbals, which was associated with the fewest errors, and which paradigm resulted in the emergence of the highest number of new verbal relations. The results of the study showed that the six participants performed quite differently from one another. Most importantly, the results from the person with autism contradicted previous findings favoring selection-based verbal behavior over topography-based approaches for teaching verbal behavior to low-functioning individuals. PMID:22477385
Towards oscillations-based simulation of social systems: a neurodynamic approach
NASA Astrophysics Data System (ADS)
Plikynas, Darius; Basinskas, Gytis; Laukaitis, Algirdas
2015-04-01
This multidisciplinary work presents a synopsis of theories in the search for common field-like fundamental principles of self-organisation and communication existing on quantum, cellular, and even social levels. Based on these fundamental principles, we formulate a conceptually novel social neuroscience paradigm (OSIMAS), which envisages social systems emerging from the coherent neurodynamical processes taking place in individual mind-fields. In this way, societies are understood as global processes emerging from the superposition of the conscious and subconscious mind-fields of individual members of society. For the experimental validation of the biologically inspired OSIMAS paradigm, we have designed a framework of EEG-based experiments. Initial baseline individual tests of spectral cross-correlations of EEG-recorded brainwave patterns for some mental states are provided in this paper. Preliminary experimental results do not refute the main OSIMAS postulates. This paper also provides some insights into the construction of OSIMAS-based simulation models.
Heuts, Samuel; Sardari Nia, Peyman; Maessen, Jos G
2016-01-01
For the past decades, surgeries have become more complex, due to the increasing age of the patient population referred for thoracic surgery, more complex pathology and the emergence of minimally invasive thoracic surgery. Together with the early detection of thoracic disease as a result of innovations in diagnostic possibilities and the paradigm shift to personalized medicine, preoperative planning is becoming an indispensable and crucial aspect of surgery. Several new techniques facilitating this paradigm shift have emerged. Pre-operative marking and staining of lesions are already a widely accepted method of preoperative planning in thoracic surgery. However, three-dimensional (3D) image reconstructions, virtual simulation and rapid prototyping (RP) are still in development phase. These new techniques are expected to become an important part of the standard work-up of patients undergoing thoracic surgery in the future. This review aims at graphically presenting and summarizing these new diagnostic and therapeutic tools.
Enhancing the hermeneutic single-case efficacy design: Bridging the research-practice gap.
Wall, Jessie M; Kwee, Janelle L; Hu, Monica; McDonald, Marvin J
2017-09-01
Systematic case study designs are emerging as alternative paradigm strategies for psychotherapy and social science research. Through enhanced sensitivity to context, these designs examine idiographic profiles of causal processes. We specifically advocate the use of the hermeneutic single-case efficacy design (HSCED). HSCED has recently been used to investigate the efficacy of an existing therapy with a new population (Observed and Experiential Integration for athlete performance barriers) and an emerging therapy (Lifespan Integration Therapy). We describe innovations in HSCED that were implemented for these studies. These developments include (a) integrating psychotherapists as case developers, (b) incorporating multiple cases in one investigation, and (c) tailoring the repertoire of assessment tools. These extensions strategically incorporated principles of contextual paradigms in HSCED, thus complementing single-case designs that neglect idiographic contexts. We discuss recommendations for using HSCED in practice-based research, highlighting its potential as a bridge to address the research-practice gap.
Philosophy of science and the emerging paradigm: implications for hypnosis.
Osowiec, Darlene A
2014-01-01
Within the hypnosis field, there is a disparity between clinical and research worldviews. Clinical practitioners work with patients who are dealing with serious, often unique, real-world problems-lived experience. Researchers adhere to objective measurements, standardization, data, and statistics. Although there is overlap, an ongoing divergence can be counterproductive to the hypnosis field and to the larger professional and social contexts. The purpose of this article is: (1) to examine some of the major assumptions, the history, and the philosophy that undergird the definition of science, which was constructed in the mid-17th century; (2) to discover how science is a product of prevailing social forces and is undergoing a paradigm shift; and (3) to understand the more encompassing, holistic paradigm with implications for the hypnosis field.
[Is the brain the creator of psychic phenomena or is a paradigm shift inevitable?].
Bonilla, Ernesto
2014-06-01
Every day new scientific information is appearing that cannot be explained using the classical Newtonian model and is calling for the emergence of a new paradigm that would include the explanation of such phenomena as telepathy, clairvoyance, presentiment, precognition, out of the body experiences, psychic healing, after-death communication, near-death experiences and reincarnation. The materialist paradigm which considers the brain as the sole cause of consciousness and psychic phenomena has been challenged by a new paradigm that seems to demonstrate that there is not a cause-effect relationship between brain activity and psychic phenomena but only a correlation between them, since these phenomena can be experienced without the body and appear to have an extra-cerebral origin (cosmic field, cosmic consciousness?). Of course, the brain is intensely involved in the manifestation of consciousness in our daily life but this is not equivalent to affirm that brain creates consciousness. Recent findings force us to consider a non-physical, spiritual and transpersonal aspect of reality.
Edoh, Thierry
2018-04-10
The risk of spreading diseases within (ad-hoc) crowds and the need to pervasively screen asymptomatic individuals to protect the population against emerging infectious diseases require permanent crowd surveillance, particularly in high-risk regions. The Ebola epidemic in West Africa in recent years has shown the need for pervasive screening. The current trend in disease surveillance consists of collecting epidemiological data about emerging infectious diseases using social media, wearable sensor systems, or mobile applications, and then analysing those data. This approach presents various limitations. This paper proposes a novel approach for disease monitoring and for preventing the risk of spreading infectious diseases. The proposed approach, which aims to overcome the limitations of existing disease surveillance approaches, combines the hybrid crowdsensing paradigm with optical-sensor-based sensing of individuals' bio-signals to monitor any risk of spreading emerging infectious diseases in (ad-hoc) crowds. A proof-of-concept was performed using a drone equipped with a Cat S60 smartphone featuring a Forward Looking Infra-Red (FLIR) camera. According to the results of the conducted experiment, the concept has the potential to improve conventional epidemiological data collection. The measurement is reliable, and the recorded data are valid. The measurement error rate is about 8%.
NASA Astrophysics Data System (ADS)
Frye, G. E.; Hauser, C. K.; Townsend, G.; Sellers, E. W.
2011-04-01
Since the introduction of the P300 brain-computer interface (BCI) speller by Farwell and Donchin in 1988, the speed and accuracy of the system have been significantly improved. Larger electrode montages and various signal processing techniques are responsible for most of the improvement in performance. New presentation paradigms have also led to improvements in bit rate and accuracy (e.g. Townsend et al (2010 Clin. Neurophysiol. 121 1109-20)). In particular, the checkerboard paradigm for online P300 BCI-based spelling performs well, has started to document what makes for a successful paradigm, and is a good platform for further experimentation. The current paper further examines the checkerboard paradigm by preventing items that surround the target from flashing during calibration (i.e. the suppression condition). In the online feedback mode, the standard checkerboard paradigm is used with a stepwise linear discriminant classifier derived from the suppression condition and one classifier derived from the standard checkerboard condition, counter-balanced. The results of this research demonstrate that using suppression during calibration produces significantly more character selections/min (6.46; time between selections included) than the standard checkerboard condition (5.55), and significantly fewer target flashes are needed per selection in the SUP condition (5.28) than in the RCP condition (6.17). Moreover, accuracy in the SUP and RCP conditions remained equivalent (~90%). Mean theoretical bit rate was 53.62 bits/min in the suppression condition and 46.36 bits/min in the standard checkerboard condition (ns). Waveform morphology also showed significant differences in amplitude and latency.
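The "theoretical bit rate" figures quoted above are presumably computed with the standard Wolpaw definition for N selectable items at accuracy P, multiplied by the selection rate; it is reproduced here for reference (the paper's exact formula is not quoted):

```latex
B = \log_{2} N + P\log_{2} P + (1-P)\,\log_{2}\!\frac{1-P}{N-1}
\quad \text{bits per selection},
\qquad \text{bit rate} = B \times \frac{\text{selections}}{\text{minute}}.
```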
Proxy-equation paradigm: A strategy for massively parallel asynchronous computations
NASA Astrophysics Data System (ADS)
Mittal, Ankita; Girimaji, Sharath
2017-09-01
Massively parallel simulations of transport equation systems call for a paradigm change in algorithm development to achieve efficient scalability. Traditional approaches require time synchronization of processing elements (PEs), which severely restricts scalability. Relaxing the synchronization requirement introduces error and slows down convergence. In this paper, we propose and develop a novel "proxy equation" concept for a general transport equation that (i) tolerates asynchrony with minimal added error, (ii) preserves convergence order, and thus (iii) is expected to scale efficiently on massively parallel machines. The central idea is to modify a priori the transport equation at the PE boundaries to offset asynchrony errors. Proof-of-concept computations are performed using a one-dimensional advection (convection) diffusion equation. The results demonstrate the promise and advantages of the present strategy.
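For reference, the one-dimensional advection (convection) diffusion model problem used for the proof-of-concept computations has the standard form below, with advection speed c and diffusivity nu; the specific parameter values and the proxy modification applied at the PE boundaries are not reproduced here:

```latex
\frac{\partial u}{\partial t} + c\,\frac{\partial u}{\partial x}
= \nu\,\frac{\partial^{2} u}{\partial x^{2}} .
```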
An Agent Inspired Reconfigurable Computing Implementation of a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Weir, John M.; Wells, B. Earl
2003-01-01
Many software systems have been successfully implemented using an agent paradigm which employs a number of independent entities that communicate with one another to achieve a common goal. The distributed nature of such a paradigm makes it an excellent candidate for use in high-speed reconfigurable computing hardware environments such as those present in modern FPGAs. In this paper, a distributed genetic algorithm that can be applied to the agent-based reconfigurable hardware model is introduced. The effectiveness of this new algorithm is evaluated by comparing the quality of the solutions found by the new algorithm with those found by traditional genetic algorithms. The performance of a reconfigurable hardware implementation of the new algorithm on an FPGA is compared to traditional single-processor implementations.
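The following compact sketch illustrates one common agent-like decomposition of a genetic algorithm, the island model, in which independent subpopulations evolve locally and periodically migrate their best individuals; it is an illustration of the distributed-GA idea only, not the authors' algorithm or its FPGA implementation, and all parameters are hypothetical.

```python
# Compact island-model GA sketch (an illustration of agent-like distributed evolution,
# not the authors' FPGA design). Each "island" evolves independently and periodically
# migrates its best individual to a neighbour. Toy fitness: maximize the number of 1-bits.
import random

rng = random.Random(6)
N_ISLANDS, POP, GENES, GENERATIONS = 4, 20, 32, 60

def fitness(ind):
    return sum(ind)

def evolve(pop):
    pop = sorted(pop, key=fitness, reverse=True)
    next_pop = pop[:2]                                   # elitism
    while len(next_pop) < POP:
        a, b = rng.sample(pop[:10], 2)                   # parent selection from the better half
        cut = rng.randrange(GENES)
        child = a[:cut] + b[cut:]                        # one-point crossover
        if rng.random() < 0.1:                           # point mutation
            i = rng.randrange(GENES)
            child[i] ^= 1
        next_pop.append(child)
    return next_pop

islands = [[[rng.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
           for _ in range(N_ISLANDS)]

for gen in range(GENERATIONS):
    islands = [evolve(pop) for pop in islands]
    if gen % 10 == 9:                                    # periodic migration between "agents"
        for k in range(N_ISLANDS):
            best = max(islands[k], key=fitness)
            islands[(k + 1) % N_ISLANDS][-1] = best[:]

print("best fitness:", max(fitness(max(p, key=fitness)) for p in islands))
```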
Photochromic molecular implementations of universal computation.
Chaplin, Jack C; Krasnogor, Natalio; Russell, Noah A
2014-12-01
Unconventional computing is an area of research in which novel materials and paradigms are utilised to implement computation. Previously we have demonstrated how registers, logic gates and logic circuits can be implemented, unconventionally, with a biocompatible molecular switch, NitroBIPS, embedded in a polymer matrix. NitroBIPS and related molecules have been shown elsewhere to be capable of modifying many biological processes in a manner that is dependent on its molecular form. Thus, one possible application of this type of unconventional computing is to embed computational processes into biological systems. Here we expand on our earlier proof-of-principle work and demonstrate that universal computation can be implemented using NitroBIPS. We have previously shown that spatially localised computational elements, including registers and logic gates, can be produced. We explain how parallel registers can be implemented, then demonstrate an application of parallel registers in the form of Turing machine tapes, and demonstrate both parallel registers and logic circuits in the form of elementary cellular automata. The Turing machines and elementary cellular automata utilise the same samples and same hardware to implement their registers, logic gates and logic circuits; and both represent examples of universal computing paradigms. This shows that homogeneous photochromic computational devices can be dynamically repurposed without invasive reconfiguration. The result represents an important, necessary step towards demonstrating the general feasibility of interfacial computation embedded in biological systems or other unconventional materials and environments. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
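As a software illustration of the elementary cellular automata realised with the photochromic registers and gates, the sketch below iterates rule 110, a rule known to be computationally universal; the optical write, erase, and read steps are, of course, not modelled, and the rule choice and lattice size are assumptions.

```python
# Software sketch of an elementary cellular automaton (rule 110), the kind of ECA that
# can be realised with photochromic registers and gates; the optical encoding is not modeled.
RULE = 110
RULE_TABLE = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
              for a in (0, 1) for b in (0, 1) for c in (0, 1)}

def step(cells):
    n = len(cells)
    return [RULE_TABLE[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

cells = [0] * 31 + [1] + [0] * 31          # single seed cell on a periodic lattice
for _ in range(16):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```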
A novel task-oriented optimal design for P300-based brain-computer interfaces.
Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen
2014-10-01
Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.
Yu, Yang; Zhou, Zongtan; Yin, Erwei; Jiang, Jun; Tang, Jingsheng; Liu, Yadong; Hu, Dewen
2016-10-01
This study presented a paradigm for controlling a car using an asynchronous electroencephalogram (EEG)-based brain-computer interface (BCI) and reported the experimental results of a simulation performed in an environment outside the laboratory. This paradigm uses two distinct motor imagery (MI) tasks, imaginary left- and right-hand movements, to generate a multi-task car control strategy consisting of starting the engine, moving forward, turning left, turning right, moving backward, and stopping the engine. Five healthy subjects participated in the online car control experiment, and all successfully controlled the car by following a previously outlined route. Subject S1 exhibited the most satisfactory BCI-based performance, which was comparable to the manual control-based performance. We hypothesize that the proposed self-paced car control paradigm based on EEG signals could potentially be used in car control applications, and we provide a complementary or alternative way for individuals with locked-in disorders to achieve more mobility in the future, as well as providing a supplementary car-driving strategy to assist healthy people in driving a car. Copyright © 2016 Elsevier Ltd. All rights reserved.
A novel task-oriented optimal design for P300-based brain-computer interfaces
NASA Astrophysics Data System (ADS)
Zhou, Zongtan; Yin, Erwei; Liu, Yang; Jiang, Jun; Hu, Dewen
2014-10-01
Objective. The number of items of a P300-based brain-computer interface (BCI) should be adjustable in accordance with the requirements of the specific tasks. To address this issue, we propose a novel task-oriented optimal approach aimed at increasing the performance of general P300 BCIs with different numbers of items. Approach. First, we proposed a stimulus presentation with variable dimensions (VD) paradigm as a generalization of the conventional single-character (SC) and row-column (RC) stimulus paradigms. Furthermore, an embedding design approach was employed for any given number of items. Finally, based on the score-P model of each subject, the VD flash pattern was selected by a linear interpolation approach for a certain task. Main results. The results indicate that the optimal BCI design consistently outperforms the conventional approaches, i.e., the SC and RC paradigms. Specifically, there is significant improvement in the practical information transfer rate for a large number of items. Significance. The results suggest that the proposed optimal approach would provide useful guidance in the practical design of general P300-based BCIs.
Sandak, Billie; Huss, Ephrat; Sarid, Orly; Harel, David
2015-01-01
Art therapy, as well as other arts-based therapies and interventions, is used to reduce pain, stress, depression, breathlessness and other symptoms in a wide variety of serious and chronic diseases, such as cancer, Alzheimer and schizophrenia. Arts-based approaches are also known to contribute to one’s well-being and quality of life. However, much research is required, since the mechanisms by which these non-pharmacological treatments exert their therapeutic and psychosocial effects are not adequately understood. A typical clinical setting utilizing the arts consists of the creation work itself, such as the artwork, as well as the therapist and the patient, all of which constitute a rich and dynamic environment of occurrences. The underlying complex, simultaneous and interwoven processes of this setting are often considered intractable to human observers, and as a consequence are usually interpreted subjectively and described verbally, which affect their subsequent analyses and understanding. We introduce a computational research method for elucidating and analyzing emergent expressive and social behaviors, aiming to understand how arts-based approaches operate. Our methodology, which centers on the visual language of Statecharts and tools for its execution, enables rigorous qualitative and quantitative tracking, analysis and documentation of the underlying creation and interaction processes. Also, it enables one to carry out exploratory, hypotheses-generating and knowledge discovery investigations, which are empirical-based. Furthermore, we illustrate our method’s use in a proof-of-principle study, applying it to a real-world artwork investigation with human participants. We explore individual and collective emergent behaviors impacted by diverse drawing tasks, yielding significant gender and age hypotheses, which may account for variation factors in response to art use. We also discuss how to gear our research method to systematic and mechanistic investigations, as we wish to provide a broad empirical evidence for the uptake of arts-based approaches, also aiming to ameliorate their use in clinical settings. PMID:26061736
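A toy sketch in the spirit of an executable statechart is shown below: hypothetical session events drive transitions of a small state machine whose visit counts can later be analysed quantitatively. Real Statecharts add hierarchy, orthogonal regions, and dedicated execution tools, none of which are modelled here, and the states and events are invented for illustration.

```python
# Toy sketch in the spirit of an executable statechart for tracking a creation session:
# events drive transitions, and the resulting trace can be analysed quantitatively.
# States and events are hypothetical; real Statecharts add hierarchy and orthogonal regions.
from collections import Counter

TRANSITIONS = {
    ("idle", "pick_up_tool"): "preparing",
    ("preparing", "start_drawing"): "drawing",
    ("drawing", "pause"): "reflecting",
    ("reflecting", "start_drawing"): "drawing",
    ("drawing", "put_down_tool"): "idle",
}

def run(events, state="idle"):
    visits = Counter([state])
    for event in events:
        state = TRANSITIONS.get((state, event), state)   # ignore events with no transition
        visits[state] += 1
    return state, visits

session = ["pick_up_tool", "start_drawing", "pause", "start_drawing", "put_down_tool"]
print(run(session))
```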
A practical approach to virtualization in HEP
NASA Astrophysics Data System (ADS)
Buncic, P.; Aguado Sánchez, C.; Blomer, J.; Harutyunyan, A.; Mudrinic, M.
2011-01-01
In the attempt to solve the problem of processing data coming from LHC experiments at CERN at a rate of 15 PB per year, for almost a decade the High Energy Physics (HEP) community has focused its efforts on the development of the Worldwide LHC Computing Grid. This generated great interest and expectations, promising to revolutionize computing. Meanwhile, having initially taken part in the Grid standardization process, industry has moved in a different direction and started promoting the Cloud Computing paradigm, which aims to solve problems on a similar scale and in an equally seamless way as was expected in the idealized Grid approach. A key enabling technology behind Cloud computing is server virtualization. In early 2008, an R&D project was established in the PH-SFT group at CERN to investigate how virtualization technology could be used to improve and simplify the daily interaction of physicists with experiment software frameworks and the Grid infrastructure. In this article we shall first briefly compare the Grid and Cloud computing paradigms and then summarize the results of the R&D activity, pointing out where and how virtualization technology could be effectively used in our field in order to maximize practical benefits whilst avoiding potential pitfalls.
Fillers as Signals: Evidence from a Question-Answering Paradigm
ERIC Educational Resources Information Center
Walker, Esther J.; Risko, Evan F.; Kingstone, Alan
2014-01-01
The present study examined the influence of a human or computer "partner" on the production of fillers ("um" and "uh") during a question and answer task. Experiment 1 investigated whether or not responding to a human partner as opposed to a computer partner results in a higher rate of filler production. Participants…
Running R Statistical Computing Environment Software on the Peregrine
R is a collaborative project for the development of new statistical methodologies and enjoys a large user base. The CRAN task view for High Performance Computing lists packages and programming paradigms that better leverage modern HPC systems.
ERIC Educational Resources Information Center
Tung, Fang-Wu; Deng, Yi-Shin
2006-01-01
The "computers are social actors" paradigm asserts that human-to-computer interactions are fundamentally social responses. Earlier research has shown that effective management of the social presence in user interface design can improve user engagement and motivation. Much of this research has focused on adult subjects. This study…
How Learning Logic Programming Affects Recursion Comprehension
ERIC Educational Resources Information Center
Haberman, Bruria
2004-01-01
Recursion is a central concept in computer science, yet it is difficult for beginners to comprehend. Israeli high-school students learn recursion in the framework of a special modular program in computer science (Gal-Ezer & Harel, 1999). Some of them are introduced to the concept of recursion in two different paradigms: the procedural…
Philosophy of Language. Course Notes for a Tutorial on Computational Semantics.
ERIC Educational Resources Information Center
Wilks, Yorick
This course was part of a tutorial focusing on the state of computational semantics, i.e., the state of work on natural language within the artificial intelligence (AI) paradigm. The discussion in the course centered on the philosophers Richard Montague and Ludwig Wittgenstein. The course was divided into three sections: (1)…
Choosing Learning Methods Suitable for Teaching and Learning in Computer Science
ERIC Educational Resources Information Center
Taylor, Estelle; Breed, Marnus; Hauman, Ilette; Homann, Armando
2013-01-01
Our aim is to determine which teaching methods students in Computer Science and Information Systems prefer. There are in total 5 different paradigms (behaviorism, cognitivism, constructivism, design-based and humanism) with 32 models between them. Each model is unique and states different learning methods. Recommendations are made on methods that…
Integrating a Music Curriculum into an External Degree Program Using Computer Assisted Instruction.
ERIC Educational Resources Information Center
Brinkley, Robert C.
This paper outlines the method and theoretical basis for establishing and implementing an independent study music curriculum. The curriculum combines practical and theoretical paradigms and leads to an external degree. The computer, in direct interaction with the student, is the primary instructional tool, and the teacher is involved in indirect…
Autonomous control systems: applications to remote sensing and image processing
NASA Astrophysics Data System (ADS)
Jamshidi, Mohammad
2001-11-01
One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear and interconnected, and information on the system is uncertain such that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, an integrated manufacturing plant, the Hubble Telescope, the International Space Station, etc. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms, and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems, the size of the soft computing control architecture will be nearly infinite. In this paper, new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and enhancement of analog and digital images.
GPU Accelerated Vector Median Filter
NASA Technical Reports Server (NTRS)
Aras, Rifat; Shen, Yuzhong
2011-01-01
Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive: for a window size of n x n, each of the n(sup 2) vectors has to be compared, in distance, with the other n(sup 2) - 1 vectors. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which to the best of our knowledge has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimization of the GPU algorithm.
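A minimal CPU reference sketch of the vector median filter described above (NumPy); the CUDA implementation in the paper parallelizes the same per-window search across GPU threads, which this sketch does not attempt to reproduce.

```python
# CPU reference sketch of a vector median filter on an RGB image (NumPy).
# The GPU version described in the abstract maps the same per-window search
# onto CUDA threads; this sketch only illustrates the underlying algorithm.
import numpy as np

def vector_median_filter(img, window=3):
    """Replace each pixel by the window vector with the smallest summed
    L2 distance to all other vectors in the window (the vector median)."""
    assert window % 2 == 1
    r = window // 2
    h, w, c = img.shape
    padded = np.pad(img.astype(np.float64), ((r, r), (r, r), (0, 0)), mode="edge")
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            block = padded[i:i + window, j:j + window].reshape(-1, c)  # n^2 vectors
            # For each vector, the sum of its distances to all others in the window.
            d = np.linalg.norm(block[:, None, :] - block[None, :, :], axis=2).sum(axis=1)
            out[i, j] = block[np.argmin(d)]
    return out

noisy = (np.random.rand(32, 32, 3) * 255).astype(np.uint8)
denoised = vector_median_filter(noisy, window=3)
```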
Barch, Deanna M; Gold, James M; Kring, Ann M
2017-07-01
Clinicians and researchers have long known that one of the debilitating aspects of psychotic disorders is the presence of "negative symptoms," which involve impairments in hedonic and motivational function, and/or alterations in expressive affect. We have a number of excellent clinical tools available for assessing the presence and severity of negative symptoms. However, to better understand the mechanisms that may give rise to negative symptoms, we need tools and methods that can help distinguish among different potential contributing causes, as a means to develop more targeted intervention pathways. Using such paradigms is particularly important if we wish to understand whether the causes are the same or different across disorders that may share surface features of negative symptoms. This approach is in line with the goals of the Research Diagnostic Criteria Initiative, which advocates understanding the nature of core dimensions of brain-behavior relationships transdiagnostically. Here we highlight some of the emerging measures and paradigms that may help us to parse the nature and causes of negative symptoms, illustrating both the research approaches from which they emerge and the types of constructs that they can help elucidate. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Teaching a changing paradigm in physiology: a historical perspective on gut interstitial cells.
Drumm, Bernard T; Baker, Salah A
2017-03-01
The study and teaching of gastrointestinal (GI) physiology necessitates an understanding of the cellular basis of contractile and electrical coupling behaviors in the muscle layers that comprise the gut wall. Our knowledge of the cellular origin of GI motility has drastically changed over the last 100 yr. While the pacing and coordination of GI contraction was once thought to be solely attributable to smooth muscle cells, it is now widely accepted that the motility patterns observed in the GI tract exist as a result of a multicellular system, consisting of not only smooth muscle cells but also enteric neurons and distinct populations of specialized interstitial cells that all work in concert to ensure proper GI functions. In this historical perspective, we focus on the emerging role of interstitial cells in GI motility and examine the key discoveries and experiments that led to a major shift in a paradigm of GI physiology regarding the role of interstitial cells in modulating GI contractile patterns. A review of these now classic experiments and papers will enable students and educators to fully appreciate the complex, multicellular nature of GI muscles as well as impart lessons on how shifting paradigms in physiology are fueled by new technologies that lead to new emerging discoveries. Copyright © 2017 the American Physiological Society.
Geia, Lynore K; Hayes, Barbara; Usher, Kim
2013-12-01
There is increasing recognition of Indigenous perspectives from various parts of the world in relation to storytelling, research and its effects on practice. The recent emergence of storytelling or yarning as a research method in Australian Aboriginal and Torres Strait Islander studies and among other Indigenous peoples of the world is gaining momentum. Narratives, stories, storytelling and yarning are emerging methods in research and have wide-ranging potential to shape conventional research discourse, making research more meaningful and accessible for researchers. In this paper we argue for the importance of Indigenous research methods and Indigenous method(ology), within collaborative, respectful partnerships with non-Indigenous researchers. It is imperative to take these challenging steps together towards better outcomes for Indigenous people and their communities. In the Australian context we as researchers cannot afford to allow the gap in health outcomes between Aboriginal and Torres Strait Islander peoples and mainstream Australia to grow even wider. One such pathway is the inclusion of Aboriginal storytelling or yarning from an Aboriginal and Torres Strait Islander perspective within Indigenous and non-Indigenous research paradigms. Utilising Aboriginal storytelling or yarning will provide deeper understanding, complementing a two-way research paradigm for collaborative research. Furthermore, it has significant social implications for research and clinical practice amongst Indigenous populations, thus complementing the biomedical paradigm.
Koda, Hiroki; Basile, Muriel; Olivier, Marion; Remeuf, Kevin; Nagumo, Sumiharu; Blois-Heulin, Catherine; Lemasson, Alban
2013-08-01
The central position and universality of music in human societies raises the question of its phylogenetic origin. One of the most important properties of music involves harmonic musical intervals, in response to which humans show a spontaneous preference for consonant over dissonant sounds starting from early infancy. Comparative studies conducted with organisms at different levels of the primate lineage are needed to understand the evolutionary scenario under which this phenomenon emerged. Although previous research found no preference for consonance in a New World monkey species, the question remained open for Old World monkeys. We used an experimental paradigm based on a sensory reinforcement procedure to test auditory preferences for consonant sounds in Campbell's monkeys (Cercopithecus campbelli campbelli), an Old World monkey species. Although a systematic preference for soft (70 dB) over loud (90 dB) control white noise was found, Campbell's monkeys showed no preference for either consonant or dissonant sounds. The preference for soft white noise validates our noninvasive experimental paradigm, which can easily be reused in any captive facility to test auditory preferences. This would suggest that the human preference for consonant sounds is not systematically shared with New and Old World monkeys. Sensitivity to harmonic musical intervals probably emerged very late in the primate lineage.
Modeling emergent border-crossing behaviors during pandemics
NASA Astrophysics Data System (ADS)
Santos, Eunice E.; Santos, Eugene; Korah, John; Thompson, Jeremy E.; Gu, Qi; Kim, Keum Joo; Li, Deqing; Russell, Jacob; Subramanian, Suresh; Zhang, Yuxi; Zhao, Yan
2013-06-01
Modeling real-world scenarios is a challenge for traditional social science researchers, as it is often hard to capture the intricacies and dynamism of real-world situations without making simplistic assumptions. This imposes severe limitations on the capabilities of such models and frameworks. Complex population dynamics during natural disasters such as pandemics is an area where computational social science can provide useful insights and explanations. In this paper, we employ a novel intent-driven modeling paradigm for such real-world scenarios by causally mapping beliefs, goals, and actions of individuals and groups to overall behavior using a probabilistic representation called Bayesian Knowledge Bases (BKBs). To validate our framework we examine emergent behavior occurring near a national border during pandemics, specifically the 2009 H1N1 pandemic in Mexico. The novelty of the work in this paper lies in representing the dynamism at multiple scales by including both coarse-grained (events at the national level) and fine-grained (events at two separate border locations) information. This is especially useful for analysts in disaster management and first responder organizations who need to understand both macro-level behavior and changes in the immediate vicinity, to help with planning, prevention, and mitigation. We demonstrate the capabilities of our framework in uncovering previously hidden connections and explanations by comparing independent models of the border locations with their fused model, to identify emergent behaviors not found in either of the independent location models or in a simple linear combination of those models.
Townsend, G.; LaPallo, B.K.; Boulay, C.B.; Krusienski, D.J.; Frye, G.E.; Hauser, C.K.; Schwartz, N.E.; Vaughan, T.M.; Wolpaw, J.R.; Sellers, E.W.
2010-01-01
Objective An electroencephalographic brain-computer interface (BCI) can provide a non-muscular means of communication for people with amyotrophic lateral sclerosis (ALS) or other neuromuscular disorders. We present a novel P300-based BCI stimulus presentation – the checkerboard paradigm (CBP). CBP performance is compared to that of the standard row/column paradigm (RCP) introduced by Farwell and Donchin (1988). Methods Using an 8×9 matrix of alphanumeric characters and keyboard commands, 18 participants used the CBP and RCP in counter-balanced fashion. With approximately 9 – 12 minutes of calibration data, we used a stepwise linear discriminant analysis for online classification of subsequent data. Results Mean online accuracy was significantly higher for the CBP, 92%, than for the RCP, 77%. Correcting for extra selections due to errors, mean bit rate was also significantly higher for the CBP, 23 bits/min, than for the RCP, 17 bits/min. Moreover, the two paradigms produced significantly different waveforms. Initial tests with three advanced ALS participants produced similar results. Furthermore, these individuals preferred the CBP to the RCP. Conclusions These results suggest that the CBP is markedly superior to the RCP in performance and user acceptability. Significance The CBP has the potential to provide a substantially more effective BCI than the RCP. This is especially important for people with severe neuromuscular disabilities. PMID:20347387
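The bit rates quoted above are commonly computed with the Wolpaw formula for selecting one of N equally likely symbols at accuracy P. A short sketch follows; the matrix size and accuracies are taken from the abstract, while the selections-per-minute figure is an illustrative assumption, since the abstract does not report selection timing.

```python
# Minimal sketch of the Wolpaw bit-rate computation commonly used for
# P300 spellers.  N = 72 matches the 8x9 matrix in the abstract; the
# accuracies are the reported means; the selections-per-minute value is
# purely illustrative, since the abstract does not state selection timing.
from math import log2

def bits_per_selection(n_choices, accuracy):
    """Information per selection for N equally likely targets at accuracy P."""
    p = accuracy
    if p <= 0.0 or p >= 1.0:
        return log2(n_choices) if p >= 1.0 else 0.0
    return (log2(n_choices)
            + p * log2(p)
            + (1 - p) * log2((1 - p) / (n_choices - 1)))

N = 72  # 8 x 9 matrix of characters and keyboard commands
for label, acc in [("CBP", 0.92), ("RCP", 0.77)]:
    b = bits_per_selection(N, acc)
    print(f"{label}: {b:.2f} bits/selection "
          f"-> {b * 4:.1f} bits/min at an assumed 4 selections/min")
```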
Field Computation and Nonpropositional Knowledge.
1987-09-01
The field computer is based on a generalization of Taylor's theorem to continuous-dimensional vector spaces. A number of field computations are illustrated, including several transformations… The "old" AI has been quite successful in performing a number of difficult tasks, such as theorem proving, chess playing, medical diagnosis and…
Attentional Selection in Object Recognition
1993-02-01
…order. It also affects the choice of strategies in both the filtering and arbiter stages. The set… such processing. In Treisman's model this was hidden in the concept of the selection filter. Later computational models of attention tried to… This thesis presents a novel approach to the selection problem by proposing a computational model of visual attentional selection as a paradigm for…
A dynamic, embodied paradigm to investigate the role of serotonin in decision-making
Asher, Derrik E.; Craig, Alexis B.; Zaldivar, Andrew; Brewer, Alyssa A.; Krichmar, Jeffrey L.
2013-01-01
Serotonin (5-HT) is a neuromodulator that has been attributed to cost assessment and harm aversion. In this review, we look at the role 5-HT plays in making decisions when subjects are faced with potential harmful or costly outcomes. We review approaches for examining the serotonergic system in decision-making. We introduce our group’s paradigm used to investigate how 5-HT affects decision-making. In particular, our paradigm combines techniques from computational neuroscience, socioeconomic game theory, human–robot interaction, and Bayesian statistics. We will highlight key findings from our previous studies utilizing this paradigm, which helped expand our understanding of 5-HT’s effect on decision-making in relation to cost assessment. Lastly, we propose a cyclic multidisciplinary approach that may aid in addressing the complexity of exploring 5-HT and decision-making by iteratively updating our assumptions and models of the serotonergic system through exhaustive experimentation. PMID:24319413
Tactile and bone-conduction auditory brain computer interface for vision and hearing impaired users.
Rutkowski, Tomasz M; Mori, Hiromu
2015-04-15
The paper presents a report on the recently developed BCI alternative for users suffering from impaired vision (lack of focus or eye-movements) or from the so-called "ear-blocking-syndrome" (limited hearing). We report on our recent studies of the extent to which vibrotactile stimuli delivered to the head of a user can serve as a platform for a brain computer interface (BCI) paradigm. In the proposed tactile and bone-conduction auditory BCI, multiple novel head positions are used to evoke combined somatosensory and auditory (via the bone conduction effect) P300 brain responses, in order to define a multimodal tactile and bone-conduction auditory brain computer interface (tbcaBCI). In order to further remove EEG interferences and to improve P300 response classification, the synchrosqueezing transform (SST) is applied. SST outperforms classical time-frequency analysis methods for non-linear and non-stationary signals such as EEG. The proposed method is also computationally more efficient compared to empirical mode decomposition. The SST filtering allows for online EEG preprocessing, which is essential in the case of BCI. Experimental results with healthy BCI-naive users performing online tbcaBCI validate the paradigm, while the feasibility of the concept is illustrated through information transfer rate case studies. We present a comparison of the proposed SST-based preprocessing method, combined with a logistic regression (LR) classifier, against classical preprocessing and LDA-based classification BCI techniques. The proposed tbcaBCI paradigm, together with data-driven preprocessing methods, is a step forward in robust BCI applications research. Copyright © 2014 Elsevier B.V. All rights reserved.
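A hedged sketch of the classification back end mentioned above: a logistic regression classifier scored with cross-validation on per-epoch feature vectors. The SST preprocessing itself is not reproduced; the synthetic features merely stand in for preprocessed target and non-target P300 epochs.

```python
# Sketch of the classification back end only: logistic regression on
# feature vectors extracted from preprocessed EEG epochs (the SST
# preprocessing step is not reproduced here; random features stand in
# for target / non-target P300 epochs).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_epochs, n_features = 200, 48          # e.g. channels x decimated samples
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, 2, size=n_epochs)   # 1 = attended (target) stimulus
X[y == 1] += 0.3                        # crude stand-in for a P300 deflection

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```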
Neurobiomimetic constructs for intelligent unmanned systems and robotics
NASA Astrophysics Data System (ADS)
Braun, Jerome J.; Shah, Danelle C.; DeAngelus, Marianne A.
2014-06-01
This paper discusses a paradigm we refer to as neurobiomimetic, which involves emulating aspects and processes of brain neuroanatomy and neurobiology. Neurobiomimetic constructs include rudimentary, down-scaled computational representations of brain regions, sub-regions, and synaptic connectivity. Many different instances of neurobiomimetic constructs are possible, depending on aspects such as the initial conditions of synaptic connectivity, the number of neuron elements in regions, connectivity specifics, and more; we refer to these instances as `animats'. While downscaled for computational feasibility, the animats are very large constructs; the animats implemented in this work contain over 47,000 neuron elements and over 720,000 synaptic connections. The paper outlines aspects of the implemented animats, a spatial memory and learning cognitive task, the virtual-reality environment constructed to study an animat performing that task, and a discussion of results. In a broad sense, we argue that the neurobiomimetic paradigm pursued in this work constitutes a particularly promising path to artificial cognition and intelligent unmanned systems. Biological brains readily cope with challenges of real-life tasks that consistently prove beyond even the most sophisticated algorithmic approaches known. At the cross-over point of neuroscience, cognitive science and computer science, paradigms such as the one pursued in this work aim to mimic the mechanisms of biological brains and, as such, we argue, may lead to machines with abilities closer to those of biological species.
Atlas : A library for numerical weather prediction and climate modelling
NASA Astrophysics Data System (ADS)
Deconinck, Willem; Bauer, Peter; Diamantakis, Michail; Hamrud, Mats; Kühnlein, Christian; Maciel, Pedro; Mengaldo, Gianmarco; Quintino, Tiago; Raoult, Baudouin; Smolarkiewicz, Piotr K.; Wedi, Nils P.
2017-11-01
The algorithms underlying numerical weather prediction (NWP) and climate models that have been developed in the past few decades face an increasing challenge caused by the paradigm shift imposed by hardware vendors towards more energy-efficient devices. In order to provide a sustainable path to exascale High Performance Computing (HPC), applications become increasingly restricted by energy consumption. As a result, the emerging diverse and complex hardware solutions have a large impact on the programming models traditionally used in NWP software, triggering a rethink of design choices for future massively parallel software frameworks. In this paper, we present Atlas, a new software library that is currently being developed at the European Centre for Medium-Range Weather Forecasts (ECMWF), with the scope of handling data structures required for NWP applications in a flexible and massively parallel way. Atlas provides a versatile framework for the future development of efficient NWP and climate applications on emerging HPC architectures. The applications range from full Earth system models, to specific tools required for post-processing weather forecast products. The Atlas library thus constitutes a step towards affordable exascale high-performance simulations by providing the necessary abstractions that facilitate the application in heterogeneous HPC environments by promoting the co-design of NWP algorithms with the underlying hardware.
Autonomous target tracking of UAVs based on low-power neural network hardware
NASA Astrophysics Data System (ADS)
Yang, Wei; Jin, Zhanpeng; Thiem, Clare; Wysocki, Bryant; Shen, Dan; Chen, Genshe
2014-05-01
Detecting and identifying targets in unmanned aerial vehicle (UAV) images and videos have been challenging problems due to various types of image distortion. Moreover, the significantly high processing overhead of existing image/video processing techniques and the limited computing resources available on UAVs force most of the processing tasks to be performed by the ground control station (GCS) in an off-line manner. In order to achieve fast and autonomous target identification on UAVs, it is thus imperative to investigate novel processing paradigms that can fulfill the real-time processing requirements while fitting the size, weight, and power (SWaP) constrained environment. In this paper, we present a new autonomous target identification approach on UAVs, leveraging emerging neuromorphic hardware which is capable of massively parallel pattern recognition processing and demands only a limited level of power consumption. A proof-of-concept prototype was developed based on a micro-UAV platform (Parrot AR Drone) and the CogniMem neural network chip, for processing the video data acquired from a UAV camera on the fly. The aim of this study was to demonstrate the feasibility and potential of incorporating emerging neuromorphic hardware into next-generation UAVs, and its superior performance and power advantages toward real-time, autonomous target tracking.
NASA Astrophysics Data System (ADS)
Mohan, C.
In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.
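For readers unfamiliar with the MapReduce paradigm mentioned above, the following toy, single-process sketch shows the map and reduce phases for a word count; production systems distribute the map, shuffle and reduce steps across many nodes, which is also where the inefficiencies referred to above tend to arise.

```python
# Toy single-process illustration of the two MapReduce phases (word count);
# real deployments distribute the map, shuffle and reduce steps across nodes.
from collections import defaultdict

def map_phase(documents):
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    grouped = defaultdict(int)
    for key, value in pairs:          # the "shuffle" groups values by key
        grouped[key] += value
    return dict(grouped)

docs = ["transaction processing and data analytics",
        "data analytics at scale"]
print(reduce_phase(map_phase(docs)))
# {'transaction': 1, 'processing': 1, 'and': 1, 'data': 2, 'analytics': 2, 'at': 1, 'scale': 1}
```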
Advances in oncological treatment: limitations of RECIST 1.1 criteria.
Grimaldi, Serena; Terroir, Marie; Caramella, Caroline
2018-06-01
RECIST 1.1 criteria are the standard for the response assessment of most solid tumors on computed tomography (CT). Nevertheless, the emergence of new classes of treatment in the last decades has brought new challenges in the evaluation of response. A PubMed online database literature search was performed in order to identify papers in English with full text available published up to September 2017. Some oncologic treatments, such as antiangiogenic agents, immunotherapy and local treatments, have proven to be effective despite atypical patterns of response. In patients undergoing these treatments, size-based evaluations, such as RECIST 1.1, show some limitations, since they often underestimate the response. Some modified criteria have been proposed to improve the response assessment in several specific settings, such as gastrointestinal stromal tumors treated with antiangiogenic agents, hepatocellular carcinoma treated with local ablation, or solid tumors treated with immunotherapy. New techniques of image analysis and imaging modalities other than CT, such as magnetic resonance imaging and positron emission tomography, may provide additional information and amend some of the limitations of size-based criteria. The emergence of new treatment paradigms and the increasing trend toward personalizing treatment should be associated with a concomitant evolution of response assessment, in both research and clinical settings.
Poisson's ratio over two centuries: challenging hypotheses
Greaves, G. Neville
2013-01-01
This article explores Poisson's ratio, starting with the controversy concerning its magnitude and uniqueness in the context of the molecular and continuum hypotheses competing in the development of elasticity theory in the nineteenth century, moving on to its place in the development of materials science and engineering in the twentieth century, and concluding with its recent re-emergence as a universal metric for the mechanical performance of materials on any length scale. During these episodes France lost its scientific pre-eminence as paradigms switched from mathematical to observational, and accurate experiments became the prerequisite for scientific advance. The emergence of the engineering of metals followed, and subsequently the invention of composites—both somewhat separated from the discovery of quantum mechanics and crystallography, and illustrating the bifurcation of technology and science. Nowadays disciplines are reconnecting in the face of new scientific demands. During the past two centuries, though, the shape versus volume concept embedded in Poisson's ratio has remained invariant, but its application has exploded from its origins in describing the elastic response of solids and liquids, into areas such as materials with negative Poisson's ratio, brittleness, glass formation, and a re-evaluation of traditional materials. Moreover, the two contentious hypotheses have been reconciled in their complementarity within the hierarchical structure of materials and through computational modelling. PMID:24687094
Small Universal Bacteria and Plasmid Computing Systems.
Wang, Xun; Zheng, Pan; Ma, Tongmao; Song, Tao
2018-05-29
Bacterial computing is a known candidate in natural computing, the aim being to construct "bacterial computers" for solving complex problems. In this paper, a new kind of bacterial computing system, named the bacteria and plasmid computing system (BP system), is proposed. We investigate the computational power of BP systems with finite numbers of bacteria and plasmids. Specifically, it is shown constructively that a BP system with 2 bacteria and 34 plasmids is Turing universal. The results provide a theoretical cornerstone for constructing powerful bacterial computers and demonstrate that such devices can, in principle, be built with a "reasonable" number of bacteria and plasmids.
Wilk, S; Michalowski, W; O'Sullivan, D; Farion, K; Sayyad-Shirabad, J; Kuziemsky, C; Kukawka, B
2013-01-01
The purpose of this study was to create a task-based support architecture for developing clinical decision support systems (CDSSs) that assist physicians in making decisions at the point-of-care in the emergency department (ED). The backbone of the proposed architecture was established by a task-based emergency workflow model for a patient-physician encounter. The architecture was designed according to an agent-oriented paradigm. Specifically, we used the O-MaSE (Organization-based Multi-agent System Engineering) method, which allows for iterative translation of functional requirements into architectural components (e.g., agents). The agent-oriented paradigm was extended with ontology-driven design to implement ontological models representing the knowledge required by specific agents to operate. The task-based architecture allows for the creation of a CDSS that is aligned with the task-based emergency workflow model. It facilitates decoupling of executable components (agents) from embedded domain knowledge (ontological models), thus supporting their interoperability, sharing, and reuse. The generic architecture was implemented as a pilot system, MET3-AE, a CDSS to help with the management of pediatric asthma exacerbation in the ED. The system was evaluated in a hospital ED. The architecture allows for the creation of a CDSS that integrates support for all tasks from the task-based emergency workflow model and interacts with hospital information systems. The proposed architecture also allows for reusing and sharing system components and knowledge across disease-specific CDSSs.
Mainsah, B O; Reeves, G; Collins, L M; Throckmorton, C S
2017-08-01
The role of a brain-computer interface (BCI) is to discern a user's intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.
NASA Astrophysics Data System (ADS)
Mainsah, B. O.; Reeves, G.; Collins, L. M.; Throckmorton, C. S.
2017-08-01
Objective. The role of a brain-computer interface (BCI) is to discern a user’s intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. Approach. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. Main results. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. Significance. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.
A Co-Adaptive Brain-Computer Interface for End Users with Severe Motor Impairment
Faller, Josef; Scherer, Reinhold; Costa, Ursula; Opisso, Eloy; Medina, Josep; Müller-Putz, Gernot R.
2014-01-01
Co-adaptive training paradigms for event-related desynchronization (ERD) based brain-computer interfaces (BCI) have proven effective for healthy users. As yet, it is not clear whether co-adaptive training paradigms can also benefit users with severe motor impairment. The primary goal of our paper was to evaluate a novel cue-guided, co-adaptive BCI training paradigm with severely impaired volunteers. The co-adaptive BCI supports a non-control state, which is an important step toward intuitive, self-paced control. A secondary aim was to have the same participants operate a specifically designed self-paced BCI training paradigm based on the auto-calibrated classifier. The co-adaptive BCI analyzed the electroencephalogram from three bipolar derivations (C3, Cz, and C4) online, while the 22 end users alternately performed right hand movement imagery (MI), left hand MI and relax with eyes open (non-control state). After less than five minutes, the BCI auto-calibrated and proceeded to provide visual feedback for the MI task that could be classified better against the non-control state. The BCI continued to recalibrate regularly. In every calibration step, the system performed trial-based outlier rejection and trained a linear discriminant analysis classifier based on one auto-selected logarithmic band-power feature. In 24 minutes of training, the co-adaptive BCI worked significantly (p = 0.01) better than chance for 18 of 22 end users. The self-paced BCI training paradigm worked significantly (p = 0.01) better than chance in 11 of 20 end users. The presented co-adaptive BCI complements existing approaches in that it supports a non-control state, requires very little setup time, requires no BCI expert and works online based on only two electrodes. The preliminary results from the self-paced BCI paradigm compare favorably to previous studies, and the collected data will make it possible to further improve self-paced BCI systems for disabled users. PMID:25014055
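A minimal sketch of the feature/classifier combination the abstract describes, a logarithmic band-power feature fed to a linear discriminant analysis classifier; the frequency band, sampling rate and synthetic single-channel trials are illustrative assumptions rather than the study's auto-selected settings.

```python
# Sketch of a logarithmic band-power feature fed to an LDA classifier.
# The band (10-13 Hz), sampling rate and synthetic signals are illustrative
# stand-ins, not the study's auto-selected settings.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 250                                   # sampling rate in Hz (assumed)
b, a = butter(4, [10 / (fs / 2), 13 / (fs / 2)], btype="band")

def log_bandpower(trial):
    """Log of the mean squared amplitude of the band-pass filtered trial."""
    filtered = filtfilt(b, a, trial)
    return np.log(np.mean(filtered ** 2))

rng = np.random.default_rng(1)
trials = rng.normal(size=(80, fs * 3))     # 80 three-second single-channel trials
labels = rng.integers(0, 2, size=80)       # 0 = non-control state, 1 = MI
trials[labels == 0] *= 1.3                 # stand-in for a stronger idle-state rhythm

X = np.array([[log_bandpower(t)] for t in trials])   # one feature per trial
lda = LinearDiscriminantAnalysis().fit(X, labels)
print("training accuracy:", lda.score(X, labels))
```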
Bats, emerging infectious diseases, and the rabies paradigm revisited
Kuzmin, Ivan V.; Bozick, Brooke; Guagliardo, Sarah A.; Kunkel, Rebekah; Shak, Joshua R.; Tong, Suxiang; Rupprecht, Charles E
2011-01-01
The significance of bats as sources of emerging infectious diseases has been increasingly appreciated, and new data have accumulated rapidly during recent years. For some emerging pathogens the bat origin has been confirmed (such as lyssaviruses, henipaviruses, coronaviruses); for others it has been suggested (filoviruses). Several recently identified viruses remain 'orphans' but have a potential for further emergence (such as Tioman, Menangle, and Pulau viruses). In the present review we summarize information on major bat-associated emerging infections and discuss specific characteristics of bats as carriers of pathogens (from evolutionary, ecological, and immunological positions). We also discuss drivers and forces of infectious disease emergence and describe various existing and potential approaches for control and prevention of such infections at the individual, population, and societal levels. PMID:24149032
Bats, emerging infectious diseases, and the rabies paradigm revisited.
Kuzmin, Ivan V; Bozick, Brooke; Guagliardo, Sarah A; Kunkel, Rebekah; Shak, Joshua R; Tong, Suxiang; Rupprecht, Charles E
2011-06-20
The significance of bats as sources of emerging infectious diseases has been increasingly appreciated, and new data have accumulated rapidly during recent years. For some emerging pathogens the bat origin has been confirmed (such as lyssaviruses, henipaviruses, coronaviruses); for others it has been suggested (filoviruses). Several recently identified viruses remain 'orphans' but have a potential for further emergence (such as Tioman, Menangle, and Pulau viruses). In the present review we summarize information on major bat-associated emerging infections and discuss specific characteristics of bats as carriers of pathogens (from evolutionary, ecological, and immunological positions). We also discuss drivers and forces of infectious disease emergence and describe various existing and potential approaches for control and prevention of such infections at the individual, population, and societal levels.
Ford, Shea A; Blanck, George
2015-01-01
Research in cancer biology has been largely driven by experimental approaches whereby discrete inputs are used to assess discrete outputs, for example, gene knockouts to assess cancer occurrence. However, cancer hallmarks are only rarely, if ever, exclusively dependent on discrete regulatory processes. Rather, cancer-related regulatory factors affect multiple cancer hallmarks. Thus, novel approaches and paradigms are needed for further advances. Signal pathway persistence and amplification, rather than signal pathway activation resulting from an on/off switch, represent emerging paradigms for cancer research, closely related to developmental regulatory paradigms. In this review, we address both the mechanisms and effects of signal pathway persistence and amplification in cancer settings, and address the possibility that hyper-activation of pro-proliferative signal pathways in certain cancer settings could be exploited for therapy. Copyright © 2014. Published by Elsevier B.V.
First-Principles Design of Novel Catalytic and Chemoresponsive Materials
NASA Astrophysics Data System (ADS)
Roling, Luke T.
An emerging trend in materials design is the use of computational chemistry tools to accelerate materials discovery and implementation. In particular, the parallel nature of computational models enables high-throughput screening approaches that would be laborious and time-consuming with experiments alone, and can be useful for identifying promising candidate materials for experimental synthesis and evaluation. Additionally, atomic-scale modeling allows researchers to obtain a detailed understanding of phenomena invisible to many current experimental techniques. In this thesis, we highlight mechanistic studies and successes in catalyst design for heterogeneous electrochemical reactions, discussing both anode and cathode chemistries. In particular, we evaluate the properties of a new class of Pd-Pt core-shell and hollow nanocatalysts toward the oxygen reduction reaction. We do not limit our study to electrochemical reactivity, but also consider these catalysts in a broader context by performing in-depth studies of their stability at elevated temperatures as well as investigating the mechanisms by which they are able to form. We also present fundamental surface science studies, investigating graphene formation and H2 dissociation, which are processes of both fundamental and practical interest in many catalytic applications. Finally, we extend our materials design paradigm outside the field of catalysis to develop and apply a model for the detection of small chemical analytes by chemoresponsive liquid crystals, and offer several predictions for improving the detection of small chemicals. A close connection between computation, synthesis, and experimental evaluation is essential to the work described herein, as computations are used to gain fundamental insight into experimental observations, and experiments and synthesis are in turn used to validate predictions of material activities from computational models.
SciDAC GSEP: Gyrokinetic Simulation of Energetic Particle Turbulence and Transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Zhihong
Energetic particle (EP) confinement is a key physics issue for the burning plasma experiment ITER, the crucial next step in the quest for clean and abundant energy, since ignition relies on self-heating by energetic fusion products (α-particles). Due to the strong coupling of EP with burning thermal plasmas, plasma confinement property in the ignition regime is one of the most uncertain factors when extrapolating from existing fusion devices to the ITER tokamak. The EP population in current tokamaks is mostly produced by auxiliary heating such as neutral beam injection (NBI) and radio frequency (RF) heating. Remarkable progress in developing comprehensive EP simulation codes and understanding basic EP physics has been made by two concurrent SciDAC EP projects (GSEP) funded by the Department of Energy (DOE) Office of Fusion Energy Science (OFES), which have successfully established gyrokinetic turbulence simulation as a necessary paradigm shift for studying EP confinement in burning plasmas. Verification and validation have rapidly advanced through close collaborations between simulation, theory, and experiment. Furthermore, productive collaborations with computational scientists have enabled EP simulation codes to effectively utilize current petascale computers and emerging exascale computers. We review here key physics progress in the GSEP projects regarding verification and validation of gyrokinetic simulations, nonlinear EP physics, EP coupling with thermal plasmas, and reduced EP transport models. Advances in high performance computing through collaborations with computational scientists that enable these large scale electromagnetic simulations are also highlighted. These results have been widely disseminated in numerous peer-reviewed publications including many Phys. Rev. Lett. papers and many invited presentations at prominent fusion conferences such as the biennial International Atomic Energy Agency (IAEA) Fusion Energy Conference and the annual meeting of the American Physics Society, Division of Plasma Physics (APS-DPP).
Synthetic Elucidation of Design Principles for Molecular Qubits
NASA Astrophysics Data System (ADS)
Graham, Michael James
Quantum information processing (QIP) is an emerging computational paradigm with the potential to enable a vast increase in computational power, fundamentally transforming fields from structural biology to finance. QIP employs qubits, or quantum bits, as its fundamental units of information, which can exist not just in the classical states of 0 or 1, but in a superposition of the two. In order to successfully perform QIP, this superposition state must be sufficiently long-lived. One promising paradigm for the implementation of QIP involves employing unpaired electrons in coordination complexes as qubits. This architecture is highly tunable and scalable; however, coordination complexes frequently suffer from short superposition lifetimes, or T2. In order to capitalize on the promise of molecular qubits, it is necessary to develop a set of design principles that allow the rational synthesis of complexes with sufficiently long values of T2. In this dissertation, I report efforts to use the synthesis of series of complexes to elucidate design principles for molecular qubits. Chapter 1 details previous work by our group and others in the field. Chapter 2 details the first efforts of our group to determine the impact of varying spin and spin-orbit coupling on T2. Chapter 3 examines the effect of removing nuclear spins on coherence time, and reports a series of vanadyl bis(dithiolene) complexes which exhibit extremely long coherence lifetimes, in excess of the 100 μs threshold for qubit viability. Chapters 4 and 5 form two complementary halves of a study to determine the exact relationship between electronic spin-nuclear spin distance and the effect of the nuclear spins on T2. Finally, chapter 6 suggests next directions for the field as a whole, including the potential for work in this field to impact the development of other technologies as diverse as quantum sensors and magnetic resonance imaging contrast agents.
Optimized Motor Imagery Paradigm Based on Imagining Chinese Characters Writing Movement.
Qiu, Zhaoyang; Allison, Brendan Z; Jin, Jing; Zhang, Yu; Wang, Xingyu; Li, Wei; Cichocki, Andrzej
2017-07-01
Motor imagery (MI) is a mental representation of motor behavior. MI-based brain computer interfaces (BCIs) can provide communication for the physically impaired. The performance of an MI-based BCI mainly depends on the subject's ability to self-modulate electroencephalogram signals. Proper training can help naive subjects learn to modulate brain activity proficiently. However, training typically involves abstract motor tasks and is time-consuming. To improve the performance of naive subjects during motor imagery, a novel paradigm was presented that guides naive subjects to modulate brain activity effectively. In this new paradigm, pictures of the left or right hand were used as cues for subjects to finish the motor imagery task. Fourteen healthy subjects (11 male, aged 22-25 years, mean 23.6±1.16) participated in this study. The task was to imagine writing a Chinese character. Specifically, subjects could imagine hand movements corresponding to the sequence of writing strokes in the Chinese character. This paradigm was meant to find an effective and familiar action for most Chinese people, to provide them with a specific, extensively practiced task and help them modulate brain activity. Results showed that the writing task paradigm yielded significantly better performance than the traditional arrow paradigm (p < 0.001). Questionnaire replies indicated that most subjects thought that the new paradigm was easier. The proposed new motor imagery paradigm could guide subjects to modulate brain activity effectively. Results showed that there were significant improvements using the new paradigm, both in classification accuracy and usability.
Photonic Design: From Fundamental Solar Cell Physics to Computational Inverse Design
NASA Astrophysics Data System (ADS)
Miller, Owen Dennis
Photonic innovation is becoming ever more important in the modern world. Optical systems are dominating shorter and shorter communications distances, LEDs are rapidly emerging for a variety of applications, and solar cells show potential to be a mainstream technology in the energy space. The need for novel, energy-efficient photonic and optoelectronic devices will only increase. This work unites fundamental physics and a novel computational inverse design approach towards such innovation. The first half of the dissertation is devoted to the physics of high-efficiency solar cells. As solar cells approach fundamental efficiency limits, their internal physics transforms. Photonic considerations, instead of electronic ones, are the key to reaching the highest voltages and efficiencies. Proper photon management led to Alta Devices' recent dramatic increase of the solar cell efficiency record to 28.3%. Moreover, approaching the Shockley-Queisser limit for any solar cell technology will require light extraction to become a part of all future designs. The second half of the dissertation introduces inverse design as a new computational paradigm in photonics. An assortment of techniques (FDTD, FEM, etc.) have enabled quick and accurate simulation of the "forward problem" of finding fields for a given geometry. However, scientists and engineers are typically more interested in the inverse problem: for a desired functionality, what geometry is needed? Answering this question breaks from the emphasis on the forward problem and forges a new path in computational photonics. The framework of shape calculus enables one to quickly find superior, non-intuitive designs. Novel designs for optical cloaking and sub-wavelength solar cell applications are presented.
Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long
2012-12-30
The Chinese Facial Emotion Recognition Database (CFERD), a computer-generated three-dimensional (3D) paradigm, was developed to measure the recognition of facial emotional expressions at different intensities. The stimuli consisted of 3D colour photographic images of six basic facial emotional expressions (happiness, sadness, disgust, fear, anger and surprise) and neutral Chinese faces. The purpose of the present study is to describe the development and validation of the CFERD with nonclinical healthy participants (N=100; 50 men; age ranging between 18 and 50 years), and to generate a normative data set. The results showed that the sensitivity index d' [d'=Z(hit rate)-Z(false alarm rate), where Z(p), p∈[0,1], is the inverse of the cumulative standard normal distribution]…
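A small sketch of the sensitivity index d' defined above, taking Z as the inverse of the standard normal cumulative distribution (scipy's norm.ppf); the clipping constant and example rates are illustrative.

```python
# Minimal sketch of the sensitivity index d' quoted in the abstract,
# with Z taken as the inverse of the standard normal CDF (scipy's ppf).
from scipy.stats import norm

def d_prime(hit_rate, false_alarm_rate, eps=1e-6):
    """d' = Z(hit rate) - Z(false alarm rate), with rates clipped away
    from 0 and 1 so the inverse normal stays finite."""
    h = min(max(hit_rate, eps), 1 - eps)
    fa = min(max(false_alarm_rate, eps), 1 - eps)
    return norm.ppf(h) - norm.ppf(fa)

print(d_prime(0.85, 0.20))   # e.g. 85% hits, 20% false alarms -> ~1.88
```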
Koorehdavoudi, Hana; Bogdan, Paul
2016-01-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparently random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state, with the relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity. PMID:27297496
NASA Astrophysics Data System (ADS)
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparently random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state, with the relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
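A hedged sketch of the state/transition-probability/free-energy pipeline the abstract outlines, using a synthetic state sequence; the three-state labeling, occupancy-based free energy (up to an additive constant) and Shannon-entropy "missing information" are illustrative assumptions about how such quantities can be estimated, not the authors' exact formulation.

```python
# Sketch: discretize the group's motion into states, estimate the transition
# matrix from counts, and take F_i proportional to -ln p_i over the empirical
# state occupancies.  The state sequence here is synthetic.
import numpy as np

rng = np.random.default_rng(2)
states = rng.choice(3, size=1000, p=[0.6, 0.3, 0.1])   # labeled motion states

n = states.max() + 1
counts = np.zeros((n, n))
for s, s_next in zip(states[:-1], states[1:]):
    counts[s, s_next] += 1
transition_probs = counts / counts.sum(axis=1, keepdims=True)

occupancy = np.bincount(states, minlength=n) / len(states)
free_energy = -np.log(occupancy)            # in units of kT, up to a constant
missing_info = -(occupancy * np.log2(occupancy)).sum()   # Shannon entropy, bits

print("transition matrix:\n", np.round(transition_probs, 2))
print("free energy per state:", np.round(free_energy, 2))
print("missing information (bits): %.2f" % missing_info)
```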
Cancer Vaccines: Moving Beyond Current Paradigms
Schlom, Jeffrey; Arlen, Philip M.; Gulley, James L.
2008-01-01
The field of cancer vaccines is currently in an active state of preclinical and clinical investigations. While no therapeutic cancer vaccine has to date been approved by the FDA, several new paradigms are emerging from recent clinical findings in both the use of combination therapy approaches and, perhaps more importantly, in clinical trial design and endpoint analyses. This paper will review recent clinical trials involving several different cancer vaccines from which data are emerging contrasting classical “tumor response” (RECIST) criteria with “patient response” in the manifestation of increased patient survival post-vaccine therapy. Also described are several strategies in which cancer vaccines can be exploited in combination with other agents and therapeutic modalities that are quite unique when compared with “conventional” combination therapies. This is most likely due to the phenomena that (a) cancer vaccines initiate a dynamic immune process that can be exploited in subsequent therapies, and (b) both radiation and certain chemotherapeutic agents have been shown to alter the phenotype of tumor cells as to render them more susceptible to T-cell–mediated killing. Consequently, evidence is emerging from several studies in which patient cohorts who first receive a cancer vaccine (as contrasted with control cohorts) benefit clinically from subsequent therapies. PMID:17606707
Determinism, chaos, self-organization and entropy.
Pontes, José
2016-01-01
We discuss two changes of paradigm that occurred in science during the twentieth century: the end of mechanistic determinism, and the end of the apparent incompatibility between biology, where emergence of order is law, and physics, postulating a progressive loss of order in natural systems. We recognize today that three mechanisms play a major role in the building of order: the nonlinear nature of most evolution laws, along with distance to equilibrium, and the new paradigm that emerged in the last forty years, as we recognize that networks present collective order properties not found in the individual nodes. We also address the result presented by Blumenfeld (L.A. Blumenfeld, Problems of Biological Physics, Springer, Berlin, 1981) showing that entropy decreases resulting from building one of the most complex biological structures, the human being, are small and may be trivially compensated to comply with thermodynamics. Life is made at very low thermodynamic cost, so thermodynamics does not pose major restrictions to the emergence of life. Besides, entropy does not capture our idea of order in biological systems. The above questions show that science is not free of conflicts and backlashes, often resulting from excessive extrapolations.
Performance analysis of a large-grain dataflow scheduling paradigm
NASA Technical Reports Server (NTRS)
Young, Steven D.; Wills, Robert W.
1993-01-01
A paradigm for scheduling computations on a network of multiprocessors using large-grain dataflow scheduling at run time is described and analyzed. The computations to be scheduled must follow a static flow graph, while the schedule itself is dynamic (i.e., determined at run time). Many applications characterized by static flow exist, including real-time control and digital signal processing. With the advent of computer-aided software engineering (CASE) tools for capturing software designs in dataflow-like structures, macro-dataflow scheduling becomes increasingly attractive, if not necessary. For parallel implementations, the macro-dataflow method allows the scheduling to be insulated from the application designer and enables maximum utilization of available resources. Further, by allowing multitasking, processor utilizations can approach 100 percent while maintaining maximum speedup. Extensive simulation studies are performed on 4-, 8-, and 16-processor architectures that reflect the effects of communication delays, scheduling delays, algorithm class, and multitasking on performance and speedup gains.
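A minimal sketch of the scheduling idea described above: a static task graph whose tasks are dispatched at run time to whichever processors are idle, with speedup measured against serial execution. The task graph, durations and greedy dispatch rule are illustrative assumptions, not the simulated 4-, 8-, and 16-processor architectures of the paper.

```python
# Minimal sketch of run-time, large-grain dataflow scheduling of a static
# task graph onto a pool of processors: a task becomes ready when all of
# its predecessors have finished, and idle processors pick ready tasks
# greedily.  The graph and task durations are illustrative.
import heapq

graph = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["C"]}
duration = {"A": 2.0, "B": 3.0, "C": 1.0, "D": 2.0, "E": 4.0}

def schedule(graph, duration, n_procs):
    remaining = {t: len(deps) for t, deps in graph.items()}
    ready = [t for t, k in remaining.items() if k == 0]
    running = []          # heap of (finish_time, task)
    busy, now, finish = 0, 0.0, {}
    while ready or running:
        while ready and busy < n_procs:       # dispatch onto idle processors
            task = ready.pop()
            heapq.heappush(running, (now + duration[task], task))
            busy += 1
        now, task = heapq.heappop(running)    # advance to the next completion
        busy -= 1
        finish[task] = now
        for succ, deps in graph.items():      # release successor tasks
            if task in deps:
                remaining[succ] -= 1
                if remaining[succ] == 0:
                    ready.append(succ)
    return finish

serial = sum(duration.values())
for p in (1, 2, 4):
    makespan = max(schedule(graph, duration, p).values())
    print(f"{p} processors: makespan {makespan:.1f}, speedup {serial / makespan:.2f}")
```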
NASA Astrophysics Data System (ADS)
Gatto, Paolo; Lipparini, Filippo; Stamm, Benjamin
2017-12-01
The domain-decomposition (dd) paradigm, originally introduced for the conductor-like screening model, has been recently extended to the dielectric Polarizable Continuum Model (PCM), resulting in the ddPCM method. We present here a complete derivation of the analytical derivatives of the ddPCM energy with respect to the positions of the solute's atoms and discuss their efficient implementation. As it is the case for the energy, we observe a quadratic scaling, which is discussed and demonstrated with numerical tests.
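Implementations of analytical derivatives such as these are typically validated against central finite differences; the following generic sketch illustrates that kind of check with a toy pairwise energy standing in for the ddPCM energy (the energy expression, step size and tolerance are illustrative assumptions).

```python
# Generic sanity check of analytical nuclear gradients against central
# finite differences.  A toy pairwise energy stands in for the ddPCM
# solvation energy; the step and tolerance are illustrative.
import numpy as np

def energy(coords):
    """Toy pairwise energy E = sum_{i<j} 1 / |r_i - r_j|."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    iu = np.triu_indices(len(coords), k=1)
    return (1.0 / dist[iu]).sum()

def analytical_gradient(coords):
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)          # exclude self-interaction terms
    return -(diff / dist[..., None] ** 3).sum(axis=1)

def numerical_gradient(coords, h=1e-5):
    grad = np.zeros_like(coords)
    for i in range(coords.shape[0]):
        for k in range(3):
            plus, minus = coords.copy(), coords.copy()
            plus[i, k] += h
            minus[i, k] -= h
            grad[i, k] = (energy(plus) - energy(minus)) / (2 * h)
    return grad

coords = np.random.default_rng(3).normal(size=(4, 3))
assert np.allclose(analytical_gradient(coords), numerical_gradient(coords), atol=1e-6)
print("analytical and finite-difference gradients agree")
```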
NASA Astrophysics Data System (ADS)
Andonov, Zdravko
This R&D represent innovative multidimensional 6D-N(6n)D Space-Time (S-T) Methodology, 6D-6nD Coordinate Systems, 6D Equations, new 6D strategy and technology for development of Planetary Space Sciences, S-T Data Management and S-T Computational To-mography. . . The Methodology is actual for brain new RS Microwaves' Satellites and Compu-tational Tomography Systems development, aimed to defense sustainable Earth, Moon, & Sun System evolution. Especially, extremely important are innovations for monitoring and protec-tion of strategic threelateral system H-OH-H2O Hydrogen, Hydroxyl and Water), correspond-ing to RS VHRS (Very High Resolution Systems) of 1.420-1.657-22.089GHz microwaves. . . One of the Greatest Paradox and Challenge of World Science is the "transformation" of J. L. Lagrange 4D Space-Time (S-T) System to H. Minkovski 4D S-T System (O-X,Y,Z,icT) for Einstein's "Theory of Relativity". As a global result: -In contemporary Advanced Space Sciences there is not real adequate 4D-6D Space-Time Coordinate System and 6D Advanced Cosmos Strategy & Methodology for Multidimensional and Multitemporal Space-Time Data Management and Tomography. . . That's one of the top actual S-T Problems. Simple and optimal nD S-T Methodology discovery is extremely important for all Universities' Space Sci-ences' Education Programs, for advances in space research and especially -for all young Space Scientists R&D!... The top ten 21-Century Challenges ahead of Planetary and Space Sciences, Space Data Management and Computational Space Tomography, important for successfully de-velopment of Young Scientist Generations, are following: 1. R&D of W. R. Hamilton General Idea for transformation all Space Sciences to Time Sciences, beginning with 6D Eukonal for 6D anisotropic mediums & velocities. Development of IERS Earth & Space Systems (VLBI; LLR; GPS; SLR; DORIS Etc.) for Planetary-Space Data Management & Computational Planetary & Space Tomography. 2. R&D of S. W. Hawking Paradigm for 2D Complex Time and Quan-tum Wave Cosmology Paradigm for Decision of the Main Problem of Contemporary Physics. 3. R&D of Einstein-Minkowski Geodesies' Paradigm in the 4D-Space-Time Continuum to 6D-6nD Space-Time Continuum Paradigms and 6D S-T Equations. . . 4. R&D of Erwin Schrüdinger 4D S-T Universe' Evolutional Equation; It's David Bohm 4D generalization for anisotropic mediums and innovative 6D -for instantaneously quantum measurement -Bohm-Schrüdinger 6D S-T Universe' Evolutional Equation. 5. R&D of brain new 6D Planning of S-T Experi-ments, brain new 6D Space Technicks and Space Technology Generalizations, especially for 6D RS VHRS Research, Monitoring and 6D Computational Tomography. 6. R&D of "6D Euler-Poisson Equations" and "6D Kolmogorov Turbulence Theory" for GeoDynamics and for Space Dynamics as evolution of Gauss-Riemann Paradigms. 7. R&D of N. Boneff NASA RD for Asteroid "Eros" & Space Science' Laws Evolution. 8. R&D of H. Poincare Paradigm for Nature and Cosmos as 6D Group of Transferences. 9. R&D of K. Popoff N-Body General Problem & General Thermodynamic S-T Theory as Einstein-Prigogine-Landau' Paradigms Development. ü 10. R&D of 1st GUT since 1958 by N. S. Kalitzin (Kalitzin N. S., 1958: Uber eine einheitliche Feldtheorie. ZAHeidelberg-ARI, WZHUmnR-B., 7 (2), 207-215) and "Multitemporal Theory of Relativity" -With special applications to Photon Rockets and all Space-Time R&D. 
GENERAL CONCLUSION: The multidimensional space-time methodology is an advance in space research, corresponding to the IAF-IAA-COSPAR innovative strategy and R&D programs (UNEP, UNDP, GEOSS, GMES, etc.).
Development of a Computer-Assisted Cranial Nerve Simulation from the Visible Human Dataset
ERIC Educational Resources Information Center
Yeung, Jeffrey C.; Fung, Kevin; Wilson, Timothy D.
2011-01-01
Advancements in technology and personal computing have allowed for the development of novel teaching modalities such as online web-based modules. These modules are currently being incorporated into medical curricula and, in some paradigms, have been shown to be superior to classroom instruction. We believe that these modules have the potential of…
Web-Based Seamless Migration for Task-Oriented Mobile Distance Learning
ERIC Educational Resources Information Center
Zhang, Degan; Li, Yuan-chao; Zhang, Huaiyu; Zhang, Xinshang; Zeng, Guangping
2006-01-01
As a new kind of computing paradigm, pervasive computing will meet the requirements of human being that anybody maybe obtain services in anywhere and at anytime, task-oriented seamless migration is one of its applications. Apparently, the function of seamless mobility is suitable for mobile services, such as mobile Web-based learning. In this…
Task Scheduling in Desktop Grids: Open Problems
NASA Astrophysics Data System (ADS)
Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny
2017-12-01
We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.
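As a hedged illustration of the task-grouping problem listed above (an assumed cost model for the sketch, not one taken from the survey): if each batch of small tasks carries a fixed scheduling and transfer overhead, and an entire batch must be re-run when any of its tasks fails on an unreliable node, the expected per-task cost is minimized at an intermediate batch size.

```python
# Illustrative (assumed) cost model for grouping small tasks into batches
# on an unreliable Desktop Grid node; not a model from the cited survey.

def expected_cost_per_task(n, overhead=30.0, task_time=5.0, p_fail=0.02):
    """Expected wall-clock cost per task for a batch of n tasks.

    overhead  -- fixed per-batch cost (scheduling, transfer), seconds
    task_time -- compute time of one task, seconds
    p_fail    -- independent failure probability of a single task;
                 the whole batch is re-executed if any task fails.
    """
    p_batch_ok = (1.0 - p_fail) ** n                     # batch succeeds only if all tasks do
    expected_batch_time = (overhead + n * task_time) / p_batch_ok
    return expected_batch_time / n

def best_batch_size(max_n=200, **kwargs):
    """Batch size minimizing the expected per-task cost under this toy model."""
    return min(range(1, max_n + 1), key=lambda n: expected_cost_per_task(n, **kwargs))

if __name__ == "__main__":
    n_opt = best_batch_size()
    print(n_opt, round(expected_cost_per_task(n_opt), 2))
```

With the default numbers the optimum falls around fifteen tasks per batch: large enough to amortize the overhead, small enough that a single failure does not waste much work.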
Physician leadership: influence on practice-based learning and improvement.
Prather, Stephen E; Jones, David N
2003-01-01
In response to the technology and information explosion, practice-based learning and improvement is emerging within the medical field to deliver systematic practice-linked improvements. However, its emergence has been inhibited by the slow acceptance of evidence-based medicine among physicians, who are reluctant to embrace proven high-performance leadership principles long established in other high-risk fields. This reluctance may be attributable to traditional medical training, which encourages controlling leadership styles that magnify the resistance common to all change efforts. To overcome this resistance, physicians must develop the same leadership skills that have proven to be critical to success in other service and high-performance industries. Skills such as self-awareness, shared authority, conflict resolution, and nonpunitive critique can emerge in practice only if they are taught. A dramatic shift away from control and blame has become a requirement for achieving success in other industries based on complex group process. This approach is so mainstream that the burden of proof that cooperative leadership is not a requirement for medical improvement falls to those institutions perpetuating the outmoded paradigm of the past. Cooperative leadership skills that have proven central to implementing change in the information era are suggested as a core cultural support for practice-based learning and improvement. Complex adaptive systems theory, long used as a way to understand evolutionary biology, and more recently computer science and economics, predicts that behavior emerging among some groups of providers will be selected for duplication by the competitive environment. A curriculum framework needed to teach leadership skills to expand the influence of practice-based learning and improvement is offered as a guide to accelerate change.
Emerging Neuromorphic Computing Architectures and Enabling...
2010-06-01
The highly cross-disciplinary emerging field of neuromorphic computing architectures for cognitive information processing applications... belief systems, software, computer engineering, etc. In our effort to develop cognitive systems atop a neuromorphic computing architecture, we explored...
Emergency managers as community change agents: an expanded vision of the profession.
Drabek, Thomas E
2014-01-01
Reflecting the historical evolution of attack preparedness, technological failures, and so-called natural disaster events, the profession of emergency management confronts new challenges today. In part, these reflect important cultural differences among stakeholder groups, especially local emergency managers, homeland security personnel, and those focused on public health threats and business continuity. An expanded and more strategic vision of the profession is required wherein fundamental assumption sets are placed into broader contexts. Contrary to the drift experienced in the US during the past decade, a major paradigm shift is required, reflecting new orientations and program priorities.
DOT National Transportation Integrated Search
2017-08-30
Transit oriented development (TOD) has emerged in recent years as a promising paradigm to promote public transportation, increase active transportation usage, mitigate congestion, and alleviate air pollution. However, there is a lack of analytic stud...
DOT National Transportation Integrated Search
2011-01-01
The increasing emphasis on the maintenance of existing infrastructure systems has led to greater use of advanced sensors and condition monitoring systems. Wireless sensors and sensor networks are emerging as sensing paradigms that the structural...
Using the Triad Approach to Improve the Cost-effectiveness of Hazardous Waste Site Cleanups
U.S. EPA's Office of Solid Waste and Emergency Response is promoting more effective strategies for characterizing, monitoring, and cleaning up hazardous waste sites. In particular, a paradigm based on using an integrated triad of systematic planning...
Computational pan-genomics: status, promises and challenges.
2018-01-01
Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In the case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few years. Simply scaling up established bioinformatics pipelines will not be sufficient for leveraging the full potential of such rich genomic data sets. Instead, novel, qualitatively different computational methods and paradigms are needed. We will witness the rapid extension of computational pan-genomics, a new sub-area of research in computational biology. In this article, we generalize existing definitions and understand a pan-genome as any collection of genomic sequences to be analyzed jointly or to be used as a reference. We examine already available approaches to construct and use pan-genomes, discuss the potential benefits of future technologies and methodologies and review open challenges from the vantage point of the above-mentioned biological disciplines. As a prominent example of a computational paradigm shift, we particularly highlight the transition from the representation of reference genomes as strings to representations as graphs. We outline how this and other challenges from different application domains translate into common computational problems, point out relevant bioinformatics techniques and identify open problems in computer science. With this review, we aim to increase awareness that a joint approach to computational pan-genomics can help address many of the problems currently faced in various domains. © The Author 2016. Published by Oxford University Press.
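As a hedged sketch of the string-to-graph transition highlighted above (a toy data structure under assumed conventions, not the format of any particular pan-genome tool): a variation graph stores sequence fragments on nodes, adjacencies as edges, and each genome or haplotype as a path through the nodes, so that shared sequence is stored only once.

```python
# Toy variation graph: nodes carry sequence, edges carry adjacency,
# and each genome is a path over node ids. Illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class VariationGraph:
    nodes: Dict[int, str] = field(default_factory=dict)        # node id -> sequence
    edges: set = field(default_factory=set)                     # (from_id, to_id)
    paths: Dict[str, List[int]] = field(default_factory=dict)   # genome name -> node ids

    def add_node(self, node_id: int, seq: str) -> None:
        self.nodes[node_id] = seq

    def add_edge(self, u: int, v: int) -> None:
        self.edges.add((u, v))

    def add_path(self, name: str, node_ids: List[int]) -> None:
        for u, v in zip(node_ids, node_ids[1:]):
            assert (u, v) in self.edges, "path must follow existing edges"
        self.paths[name] = node_ids

    def reconstruct(self, name: str) -> str:
        """Spell out the linear sequence of one genome from its path."""
        return "".join(self.nodes[i] for i in self.paths[name])

# A reference and a sample differing at one site share the flanking nodes.
g = VariationGraph()
g.add_node(1, "ACG"); g.add_node(2, "T"); g.add_node(3, "C"); g.add_node(4, "GGA")
g.add_edge(1, 2); g.add_edge(1, 3); g.add_edge(2, 4); g.add_edge(3, 4)
g.add_path("reference", [1, 2, 4])
g.add_path("sample", [1, 3, 4])
print(g.reconstruct("reference"), g.reconstruct("sample"))  # ACGTGGA ACGCGGA
```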
NASA Astrophysics Data System (ADS)
Tolzman, Jean M.
1993-03-01
The potential for expanded communication among researchers, scholars, and students is supported by growth in the capabilities for electronic communication as well as expanding access to various forms of electronic interchange and computing capabilities. Research supported by the National Aeronautics and Space Administration points to a future where workstations with audio and video monitors and screen-sharing protocols are used to support collaborations with colleagues located throughout the world. Instruments and sensors all over the world will produce data streams that will be brought together and analyzed to produce new findings, which in turn can be distributed electronically. New forms of electronic journals will emerge and provide opportunities for researchers and scientists to electronically and interactively exchange information in a wide range of structures and formats. Ultimately, the wide-scale use of these technologies in the dissemination of research results and the stimulation of collegial dialogue will change the way we represent and express our knowledge of the world. A new paradigm will evolve, perhaps a truly worldwide 'invisible college'.
Small Molecule Docking from Theoretical Structural Models
NASA Astrophysics Data System (ADS)
Novoa, Eva Maria; de Pouplana, Lluis Ribas; Orozco, Modesto
Structural approaches to rational drug design rely on the basic assumption that pharmacological activity requires, as a necessary but not sufficient condition, the binding of a drug to one or several cellular targets, proteins in most cases. The traditional paradigm assumes that drugs that interact only with a single cellular target are specific and accordingly have few secondary effects, while promiscuous molecules are more likely to generate undesirable side effects. However, current examples indicate that efficient drugs are often able to interact with several biological targets [1], and in fact some dirty drugs, such as chlorpromazine, dextromethorphan, and ibogaine, exhibit desired pharmacological properties [2]. These considerations highlight the tremendous difficulty of designing small molecules that both have satisfactory ADME properties and the ability to interact with a limited set of target proteins with high affinity, while at the same time avoiding undesirable interactions with other proteins. In this complex and challenging scenario, computer simulations emerge as the basic tool to guide medicinal chemists during the drug discovery process.
MoS2-on-MXene Heterostructures as Highly Reversible Anode Materials for Lithium-Ion Batteries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chi; Xie, Xiuqiang; Anasori, Babak
Two-dimensional (2D) heterostructured materials, combining the collective advantages of individual building blocks and synergistic properties, have spurred great interest as a new paradigm in materials science. The family of 2D transition-metal carbides and nitrides, MXenes, has emerged as an attractive platform to construct functional materials with enhanced performance for diverse applications. Here, we synthesized 2D MoS2-on-MXene heterostructures through in situ sulfidation of Mo2TiC2Tx MXene. The computational results show that MoS2-on-MXene heterostructures have metallic properties. Moreover, the presence of MXene leads to enhanced Li and Li2S adsorption during the intercalation and conversion reactions. These characteristics provide the as-prepared MoS2-on-MXene heterostructures with stable Li-ion storage performance. In conclusion, this work paves the way to use MXene to construct 2D heterostructures for energy storage applications.
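The enhanced Li and Li2S adsorption reported above is usually quantified in first-principles studies by an adsorption energy of the standard form below (a generic definition given for orientation; the exact convention used in this work may differ).

```latex
% Generic adsorption-energy definition used in DFT studies of 2D hosts
% (sign convention assumed: more negative = stronger binding); illustrative only.
\begin{equation}
  E_{\mathrm{ads}} \;=\; E_{\mathrm{host+adsorbate}} \;-\; E_{\mathrm{host}} \;-\; E_{\mathrm{adsorbate}}
\end{equation}
```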
Looking at the Disordered Proteins through the Computational Microscope.
Das, Payel; Matysiak, Silvina; Mittal, Jeetain
2018-05-23
Intrinsically disordered proteins (IDPs) have attracted wide interest over the past decade due to their surprising prevalence in the proteome and versatile roles in cell physiology and pathology. A large selection of IDPs has been identified as potential targets for therapeutic intervention. Characterizing the structure-function relationship of disordered proteins is therefore an essential but daunting task, as these proteins can adopt transient structure, necessitating a new paradigm for connecting structural disorder to function. Molecular simulation has emerged as a natural complement to experiments for atomic-level characterizations and mechanistic investigations of this intriguing class of proteins. The diverse range of length and time scales involved in IDP function requires performing simulations at multiple levels of resolution. In this Outlook, we focus on summarizing available simulation methods, along with a few interesting example applications. We also provide an outlook on how these simulation methods can be further improved in order to provide a more accurate description of IDP structure, binding, and assembly.
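A hedged illustration of the multi-resolution point (a generic one-bead-per-residue coarse-graining step, assumed for the sketch; not a method from the cited Outlook): mapping an atomistic chain to residue-level beads at their mass-weighted centers is the usual first step when IDP simulations need to reach longer length and time scales.

```python
# Generic one-bead-per-residue coarse-graining: each residue's atoms are
# replaced by a single bead at their mass-weighted center. Illustrative only.
import numpy as np

ATOM_MASSES = {"C": 12.011, "N": 14.007, "O": 15.999, "H": 1.008, "S": 32.06}

def coarse_grain(residues):
    """residues: list of lists of (element, xyz) tuples -> (n_res, 3) bead coordinates."""
    beads = []
    for atoms in residues:
        masses = np.array([ATOM_MASSES[el] for el, _ in atoms])
        coords = np.array([xyz for _, xyz in atoms])
        beads.append((masses[:, None] * coords).sum(axis=0) / masses.sum())
    return np.array(beads)

# Two tiny fake residues (coordinates in angstroms, purely illustrative)
residues = [
    [("N", (0.0, 0.0, 0.0)), ("C", (1.5, 0.0, 0.0)), ("O", (2.3, 1.1, 0.0))],
    [("N", (3.8, 1.2, 0.1)), ("C", (5.2, 1.3, 0.2)), ("H", (5.6, 2.2, 0.2))],
]
print(coarse_grain(residues))   # one bead (x, y, z) per residue
```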
Cho, Soojin; Park, Jong-Woong; Sim, Sung-Han
2015-01-01
Wireless sensor networks (WSNs) facilitate a new paradigm to structural identification and monitoring for civil infrastructure. Conventional structural monitoring systems based on wired sensors and centralized data acquisition systems are costly for installation as well as maintenance. WSNs have emerged as a technology that can overcome such difficulties, making deployment of a dense array of sensors on large civil structures both feasible and economical. However, as opposed to wired sensor networks in which centralized data acquisition and processing is common practice, WSNs require decentralized computing algorithms to reduce data transmission due to the limitation associated with wireless communication. In this paper, the stochastic subspace identification (SSI) technique is selected for system identification, and SSI-based decentralized system identification (SDSI) is proposed to be implemented in a WSN composed of Imote2 wireless sensors that measure acceleration. The SDSI is tightly scheduled in the hierarchical WSN, and its performance is experimentally verified in a laboratory test using a 5-story shear building model. PMID:25856325
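A hedged, minimal sketch of covariance-driven stochastic subspace identification on a single node's data (an assumed NumPy illustration; the paper's SDSI scheduling across Imote2 sensors is not reproduced here): output correlations are stacked into a block Hankel matrix whose SVD yields an observability matrix, from which the discrete state matrix, and hence the modal frequencies, are extracted.

```python
# Minimal covariance-driven SSI sketch for output-only modal identification.
# Assumed illustration with NumPy; not the cited SDSI implementation.
import numpy as np

def cov_ssi_frequencies(y, dt, order=4, num_blocks=20):
    """y: (channels, samples) acceleration data; returns natural frequencies in Hz."""
    n_ch, n_s = y.shape
    y = y - y.mean(axis=1, keepdims=True)

    # Output correlation matrices R_i = E[y_{k+i} y_k^T]
    R = [y[:, i:] @ y[:, : n_s - i].T / (n_s - i) for i in range(2 * num_blocks)]

    # Block Hankel matrix of correlations
    H = np.block([[R[i + j + 1] for j in range(num_blocks)] for i in range(num_blocks)])

    # SVD -> observability matrix O = U1 * S1^(1/2), truncated to the model order
    U, s, _ = np.linalg.svd(H)
    O = U[:, :order] * np.sqrt(s[:order])

    # Shift invariance: O_up A = O_down  =>  A = pinv(O_up) O_down
    O_up, O_down = O[:-n_ch, :], O[n_ch:, :]
    A = np.linalg.pinv(O_up) @ O_down

    # Natural frequencies from the discrete eigenvalues
    lam = np.linalg.eigvals(A)
    freqs = np.abs(np.log(lam)) / (2 * np.pi * dt)
    return np.unique(np.round(np.sort(freqs), 3))

# Example: two noisy sinusoids measured at two channels
dt, t = 0.01, np.arange(0, 60, 0.01)
y = np.vstack([np.sin(2 * np.pi * 1.5 * t), np.sin(2 * np.pi * 4.0 * t)])
y += 0.05 * np.random.randn(*y.shape)
print(cov_ssi_frequencies(y, dt, order=4))  # ~1.5 Hz and ~4.0 Hz expected
```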
Managing resource capacity using hybrid simulation
NASA Astrophysics Data System (ADS)
Ahmad, Norazura; Ghani, Noraida Abdul; Kamil, Anton Abdulbasah; Tahar, Razman Mat
2014-12-01
Due to the diversity of patient flows and the interdependency of the emergency department (ED) with other units in the hospital, analytical models are not practical for ED modeling. One effective approach to studying the dynamic complexity of ED problems is to develop a computer simulation model that can be used to understand the structure and behavior of the system. A holistic model built with discrete-event simulation (DES) alone would be too complex, while one built with system dynamics (SD) alone would lack the detailed characteristics of the system. This paper discusses the combination of DES and SD to obtain a better representation of the actual system than either modeling paradigm provides on its own. The model is developed using AnyLogic software, which enables us to study patient flows and the complex interactions among hospital resources in ED operations. Results from the model show that patients' length of stay is influenced by laboratory turnaround time, bed occupancy rate, and ward admission rate.
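A hedged toy sketch of coupling the two paradigms (an assumed minimal model, not the AnyLogic ED model from the paper): a discrete-event queue handles individual patient arrivals and treatments, while a system-dynamics style rate equation updates an aggregate ward-occupancy stock that feeds back on admissions.

```python
# Toy hybrid DES + SD sketch: event-driven ED patients coupled to an
# aggregate ward-occupancy stock. Illustrative assumptions throughout.
import heapq, random

random.seed(1)
SIM_END = 480.0            # minutes simulated
ward_occupancy = 20.0      # SD stock: occupied ward beds (aggregate)
WARD_BEDS = 40.0
DISCHARGE_RATE = 0.004     # SD flow: fraction of occupied beds discharged per minute
events = []                # DES event list: (time, kind)
waiting, treated, admitted = 0, 0, 0

def schedule(t, kind):
    heapq.heappush(events, (t, kind))

# Seed patient arrivals (Poisson-like, mean inter-arrival 12 min)
t = 0.0
while t < SIM_END:
    t += random.expovariate(1 / 12.0)
    schedule(t, "arrival")

now, last = 0.0, 0.0
while events:
    now, kind = heapq.heappop(events)
    if now > SIM_END:
        break

    # SD update: integrate the discharge flow over the elapsed interval
    ward_occupancy -= DISCHARGE_RATE * ward_occupancy * (now - last)
    last = now

    if kind == "arrival":
        waiting += 1
        schedule(now + random.uniform(20, 60), "treatment_done")
    elif kind == "treatment_done":
        waiting -= 1
        treated += 1
        # Feedback: admit only if the aggregate ward stock has spare capacity
        if random.random() < 0.3 and ward_occupancy < WARD_BEDS:
            ward_occupancy += 1
            admitted += 1

print(f"treated={treated}, admitted={admitted}, ward occupancy={ward_occupancy:.1f}")
```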
Time to rethink the neural mechanisms of learning and memory.
Gallistel, Charles R; Balsam, Peter D
2014-02-01
Most studies in the neurobiology of learning assume that the underlying learning process is a pairing-dependent change in synaptic strength that requires repeated experience of events presented in close temporal contiguity. However, much learning is rapid and does not depend on temporal contiguity, which has never been precisely defined. These points are well illustrated by studies showing that the temporal relations between events are rapidly learned, even over long delays, and that this knowledge governs the form and timing of behavior. The speed with which anticipatory responses emerge in conditioning paradigms is determined by the information that cues provide about the timing of rewards. The challenge for understanding the neurobiology of learning is to understand the mechanisms in the nervous system that encode information from even a single experience, the nature of the memory mechanisms that can encode quantities such as time, and how the brain can flexibly perform computations based on this information. Copyright © 2013 Elsevier Inc. All rights reserved.
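A hedged illustration of the quantitative claim about cue information and acquisition speed, stated as it is commonly formalized in the timing literature in terms of the cycle-to-trial ratio; the exact formulation in the cited work may differ.

```latex
% Assumed formalization (following the timing literature; illustrative only):
% C = average interval between rewards in the context (cycle time),
% T = cue-to-reward interval (trial time). The cue's informativeness is C/T,
% the information it conveys about reward timing is \log_2(C/T), and the
% number of cue--reward pairings, N, before anticipatory responding emerges
% is found to vary roughly as the inverse of the informativeness:
\begin{equation}
  I = \log_2\!\left(\frac{C}{T}\right), \qquad N \propto \left(\frac{C}{T}\right)^{-1}.
\end{equation}
```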
Using Open and Interoperable Ways to Publish and Access LANCE AIRS Near-Real Time Data
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Lynnes, Christopher; Vollmer, Bruce; Savtchenko, Andrey; Theobald, Michael; Yang, Wenli
2011-01-01
The Atmospheric Infrared Sounder (AIRS) Near-Real Time (NRT) data from the Land Atmosphere Near real-time Capability for EOS (LANCE) element at the Goddard Earth Sciences Data and Information Services Center (GES DISC) provides information on the global and regional atmospheric state, with very low temporal latency, to support climate research and improve weather forecasting. An open and interoperable platform is useful to facilitate access to, and integration of, LANCE AIRS NRT data. As Web services technology has matured in recent years, a new scalable Service-Oriented Architecture (SOA) is emerging as the basic platform for distributed computing and large networks of interoperable applications. Following the provide-register-discover-consume SOA paradigm, this presentation discusses how to use open-source geospatial software components to build Web services for publishing and accessing AIRS NRT data, explore the metadata relevant to registering and discovering data and services in the catalogue systems, and implement a Web portal to facilitate users' consumption of the data and services.
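A hedged sketch of the discover-consume half of the provide-register-discover-consume pattern described above. The endpoint URL, query parameters, and response fields below are hypothetical placeholders, not the actual GES DISC or LANCE service interfaces.

```python
# Hypothetical discover-then-consume client for a catalogue of NRT granules.
# All URLs, parameters, and response fields are illustrative assumptions.
import requests

CATALOG_URL = "https://example.org/catalog/search"       # hypothetical registry endpoint

def discover(keyword: str, max_records: int = 5) -> list:
    """Query the (hypothetical) catalogue for records matching a keyword."""
    resp = requests.get(CATALOG_URL, params={"q": keyword, "limit": max_records}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("records", [])                # assumed JSON layout

def consume(record: dict, out_path: str) -> None:
    """Download the data granule advertised by a discovered record."""
    data_url = record["access_url"]                       # assumed field name
    with requests.get(data_url, stream=True, timeout=60) as r:
        r.raise_for_status()
        with open(out_path, "wb") as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)

if __name__ == "__main__":
    # Shown for the call pattern only; would fail without a real endpoint.
    for rec in discover("AIRS near-real-time temperature"):
        print(rec.get("title"), rec.get("access_url"))
```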
Chatterjee, Dev Kumar; Wolfe, Tatiana; Lee, Jihyoun; Brown, Aaron P; Singh, Pankaj Kumar; Bhattarai, Shanta Raj; Diagaradjane, Parmeswaran; Krishnan, Sunil
2014-01-01
Improvements in accuracy and efficacy in treating tumors with radiation therapy (RT) over the years have been fueled by parallel technological and conceptual advances in imaging and image-guidance techniques, radiation treatment machines, computational methods, and the understanding of the biology of tumor response to RT. Recent advances in our understanding of the hallmarks of cancer and the emergence of strategies to combat these traits of cancer have resulted in an expanding repertoire of targeted therapeutics, many of which can be exploited for enhancing the efficacy of RT. Complementing this advent of new treatment options is the evolution of our knowledge of the interaction between nanoscale materials and human tissues (nanomedicine). As with the changes in RT paradigms when the field has encountered newer and maturing disciplines, the incorporation of nanotechnology innovations into radiation oncology has the potential to refine or redefine its principles and revolutionize its practice. This review provides a summary of the principles, applications, challenges and outlook for the use of metallic nanoparticles in RT. PMID:25279336
The basolateral amygdala in reward learning and addiction
Wassum, Kate M.; Izquierdo, Alicia
2015-01-01
Sophisticated behavioral paradigms partnered with the emergence of increasingly selective techniques to target the basolateral amygdala (BLA) have resulted in an enhanced understanding of the role of this nucleus in learning and using reward information. Due to the wide variety of behavioral approaches, many questions remain on the circumscribed role of BLA in appetitive behavior. In this review, we integrate conclusions of BLA function in reward-related behavior using traditional interference techniques (lesion, pharmacological inactivation) with those using newer methodological approaches in experimental animals that allow in vivo manipulation of cell type-specific populations and neural recordings. Secondly, from a review of appetitive behavioral tasks in rodents and monkeys and recent computational models of reward procurement, we derive evidence for BLA as a neural integrator of reward value, history, and cost parameters. Taken together, BLA codes specific and temporally dynamic outcome representations in a distributed network to orchestrate adaptive responses. We provide evidence that experiences with opiates and psychostimulants alter these outcome representations in BLA, resulting in long-term modified action. PMID:26341938