Sample records for significant computational challenges

  1. High-End Computing Challenges in Aerospace Design and Engineering

    NASA Technical Reports Server (NTRS)

    Bailey, F. Ronald

    2004-01-01

    High-End Computing (HEC) has had a significant impact on aerospace design and engineering and is poised to have an even greater impact in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses modeling capabilities needed for each challenge and presents projections of future near and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and the programming strategies necessary to achieve high real performance are presented.

  2. The power of an ontology-driven developmental toxicity database for data mining and computational modeling

    EPA Science Inventory

    Modeling of developmental toxicology presents a significant challenge to computational toxicology due to endpoint complexity and lack of data coverage. These challenges largely account for the relatively few modeling successes using the structure–activity relationship (SAR) parad...

  3. The computational challenges of Earth-system science.

    PubMed

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  4. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.

  5. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617
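
    The two OSG-GEM records above describe assembling per-sample quantification results into a single gene expression matrix (GEM). As a rough illustration of that final assembly step (not the Pegasus workflow itself), the sketch below merges hypothetical per-sample count files into one matrix; the file naming and two-column layout are assumptions for the example.

    ```python
    # Minimal sketch of the final GEM-assembly step: merge per-sample gene count
    # tables (hypothetical two-column files: gene <tab> count) into one matrix
    # with genes as rows and samples as columns.
    import glob
    import os
    import pandas as pd

    def build_gem(count_dir: str) -> pd.DataFrame:
        columns = {}
        for path in sorted(glob.glob(os.path.join(count_dir, "*.counts.tsv"))):
            sample = os.path.basename(path).replace(".counts.tsv", "")
            table = pd.read_csv(path, sep="\t", header=None, names=["gene", "count"])
            columns[sample] = table.set_index("gene")["count"]
        # Union of gene IDs across samples; genes missing in a sample become 0.
        gem = pd.DataFrame(columns).fillna(0).astype(int)
        gem.index.name = "gene"
        return gem

    if __name__ == "__main__":
        gem = build_gem("counts/")           # hypothetical directory of count files
        gem.to_csv("expression_matrix.tsv", sep="\t")
    ```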

  6. Implantable brain computer interface: challenges to neurotechnology translation.

    PubMed

    Konrad, Peter; Shanks, Todd

    2010-06-01

    This article reviews three concepts related to implantable brain computer interface (BCI) devices being designed for human use: neural signal extraction primarily for motor commands, signal insertion to restore sensation, and technological challenges that remain. A significant body of literature has accumulated over the past four decades regarding motor cortex signal extraction for upper extremity movement or computer interface. However, little is discussed regarding postural or ambulation command signaling. Auditory prosthesis research continues to represent the majority of literature on BCI signal insertion. Significant hurdles remain in the technological translation of BCI implants. These include developing a stable neural interface, significantly increasing signal processing capabilities, and methods of data transfer throughout the human body. The past few years, however, have provided extraordinary human examples of BCI implant potential. Despite technological hurdles, proof-of-concept animal and human studies provide significant encouragement that BCI implants may well find their way into mainstream medical practice in the foreseeable future.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas

    Achieving the next three orders of magnitude performance increase to move from petascale to exascale computing will require significant advancements in several fundamental areas. Recent studies have outlined many of the challenges in hardware and software that will be needed. In this paper, we examine these challenges with respect to high-performance networking. We describe the repercussions of anticipated changes to computing and networking hardware and discuss the impact that alternative parallel programming models will have on the network software stack. We also present some ideas on possible approaches that address some of these challenges.

  8. Fast normal mode computations of capsid dynamics inspired by resonance

    NASA Astrophysics Data System (ADS)

    Na, Hyuntae; Song, Guang

    2018-07-01

    Increasingly more and larger structural complexes are being determined experimentally. The sizes of these systems pose a formidable computational challenge to the study of their vibrational dynamics by normal mode analysis. To overcome this challenge, this work presents a novel resonance-inspired approach. Tests on large shell structures of protein capsids demonstrate that there is a strong resonance between the vibrations of a whole capsid and those of individual capsomeres. We then show how this resonance can be taken advantage of to significantly speed up normal mode computations.
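
    The record above is about making normal mode analysis tractable for very large capsid assemblies. For context, here is a minimal sketch of the conventional computation that becomes expensive at that scale: an anisotropic network model (ANM) Hessian built from pairwise contacts and diagonalized directly. It is not the authors' resonance-based method, and the coordinates, cutoff, and spring constant are illustrative.

    ```python
    # Baseline normal mode analysis with an anisotropic network model (ANM):
    # build the 3N x 3N Hessian from pairwise contacts within a cutoff, then
    # diagonalize and keep the lowest non-trivial modes. This brute-force
    # O(N^3) eigendecomposition is what becomes prohibitive for whole capsids.
    import numpy as np

    def anm_modes(coords, cutoff=15.0, gamma=1.0, n_modes=10):
        n = len(coords)
        hessian = np.zeros((3 * n, 3 * n))
        for i in range(n):
            for j in range(i + 1, n):
                d = coords[j] - coords[i]
                r2 = d @ d
                if r2 > cutoff ** 2:
                    continue
                block = -gamma * np.outer(d, d) / r2
                hessian[3*i:3*i+3, 3*j:3*j+3] = block
                hessian[3*j:3*j+3, 3*i:3*i+3] = block
                hessian[3*i:3*i+3, 3*i:3*i+3] -= block
                hessian[3*j:3*j+3, 3*j:3*j+3] -= block
        evals, evecs = np.linalg.eigh(hessian)
        # Skip the six zero-frequency rigid-body modes of a connected network.
        return evals[6:6 + n_modes], evecs[:, 6:6 + n_modes]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        toy_coords = rng.uniform(0.0, 30.0, size=(200, 3))  # illustrative pseudo-structure
        freqs, modes = anm_modes(toy_coords)
        print("lowest non-trivial eigenvalues:", np.round(freqs, 4))
    ```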

  9. Predictive Models and Computational Toxicology

    EPA Science Inventory

    Understanding the potential health risks posed by environmental chemicals is a significant challenge elevated by the large number of diverse chemicals with generally uncharacterized exposures, mechanisms, and toxicities. The ToxCast computational toxicology research program was l...

  10. Exploring English as a Foreign Language (EFL) Teacher Trainers' Perspectives on Challenges to Promoting Computer Literacy of EFL Teachers

    ERIC Educational Resources Information Center

    Dashtestani, Reza

    2014-01-01

    Computer literacy is a significant component of language teachers' computer-assisted language learning (CALL) knowledge. Despite its importance, limited research has been undertaken to analyze factors which might influence language teachers' computer literacy levels. This qualitative study explored the perspectives of 39 Iranian EFL teacher…

  11. Mistaking Identities: Challenging Representations of Language, Gender, and Race in High Tech Television Programs.

    ERIC Educational Resources Information Center

    Voithofer, R. J.

    Television programs are increasingly featuring information technologies like computers as significant narrative devices, including the use of computer-based technologies as virtual worlds or environments in which characters interact, the use of computers as tools in problem solving and confronting conflict, and characters that are part human, part…

  12. Computational Social Creativity.

    PubMed

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  13. Review of Affective Computing in Education/Learning: Trends and Challenges

    ERIC Educational Resources Information Center

    Wu, Chih-Hung; Huang, Yueh-Min; Hwang, Jan-Pan

    2016-01-01

    Affect can significantly influence education/learning. Thus, understanding a learner's affect throughout the learning process is crucial for understanding motivation. In conventional education/learning research, learner motivation can be known through postevent self-reported questionnaires. With the advance of affective computing technology,…

  14. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometer resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties, because imaging at micron or sub-micron resolution is only feasible for samples on the scale of a millimeter or less. We present here a recently developed computational workflow for the analysis of microstructures including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and its subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.

  15. Facial Animations: Future Research Directions & Challenges

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Computer facial animation is now used in a wide range of fields, including computer games, film, and interactive multimedia. Authoring complex and subtle facial expressions remains challenging and fraught with problems, and most content authored with general-purpose animation techniques is limited in both production quality and quantity. Despite growing computing power, advances in facial understanding, software sophistication, and emerging face-centric methods, the field is still immature. This paper therefore surveys current and emerging work and facial animation experts in order to characterize the recent state of the field, its observed bottlenecks, and developing techniques. It further presents a real-time simulation model of human worry and howling, with a detailed discussion of the perception of astonishment, sorrow, annoyance, and panic.

  16. Computational Systems Toxicology: recapitulating the logistical dynamics of cellular response networks in virtual tissue models (Eurotox_2017)

    EPA Science Inventory

    Translating in vitro data and biological information into a predictive model for human toxicity poses a significant challenge. This is especially true for complex adaptive systems such as the embryo where cellular dynamics are precisely orchestrated in space and time. Computer ce...

  17. Spintronic logic: from switching devices to computing systems

    NASA Astrophysics Data System (ADS)

    Friedman, Joseph S.

    2017-09-01

    Though numerous spintronic switching devices have been proposed or demonstrated, there has been significant difficulty in translating these advances into practical computing systems. The challenge of cascading has impeded the integration of multiple devices into a logic family, and several proposed solutions potentially overcome these challenges. Here, the cascading techniques by which the output of each spintronic device can drive the input of another device are described for several logic families, including spin-diode logic (in particular, all-carbon spin logic), complementary magnetic tunnel junction logic (CMAT), and emitter-coupled spin-transistor logic (ECSTL).

  18. Proteomics, lipidomics, metabolomics: a mass spectrometry tutorial from a computer scientist's point of view.

    PubMed

    Smith, Rob; Mathis, Andrew D; Ventura, Dan; Prince, John T

    2014-01-01

    For decades, mass spectrometry data has been analyzed to investigate a wide array of research interests, including disease diagnostics, biological and chemical theory, genomics, and drug development. Progress towards solving any of these disparate problems depends upon overcoming the common challenge of interpreting the large data sets generated. Despite interim successes, many data interpretation problems in mass spectrometry are still challenging. Further, though these challenges are inherently interdisciplinary in nature, the significant domain-specific knowledge gap between disciplines makes interdisciplinary contributions difficult. This paper provides an introduction to the burgeoning field of computational mass spectrometry. We illustrate key concepts, vocabulary, and open problems in MS-omics, as well as provide invaluable resources such as open data sets and key search terms and references. This paper will facilitate contributions from mathematicians, computer scientists, and statisticians to MS-omics that will fundamentally improve results over existing approaches and inform novel algorithmic solutions to open problems.

  19. A community computational challenge to predict the activity of pairs of compounds.

    PubMed

    Bansal, Mukesh; Yang, Jichen; Karan, Charles; Menden, Michael P; Costello, James C; Tang, Hao; Xiao, Guanghua; Li, Yajuan; Allen, Jeffrey; Zhong, Rui; Chen, Beibei; Kim, Minsoo; Wang, Tao; Heiser, Laura M; Realubit, Ronald; Mattioli, Michela; Alvarez, Mariano J; Shen, Yao; Gallahan, Daniel; Singer, Dinah; Saez-Rodriguez, Julio; Xie, Yang; Stolovitzky, Gustavo; Califano, Andrea

    2014-12-01

    Recent therapeutic successes have renewed interest in drug combinations, but experimental screening approaches are costly and often identify only small numbers of synergistic combinations. The DREAM consortium launched an open challenge to foster the development of in silico methods to computationally rank 91 compound pairs, from the most synergistic to the most antagonistic, based on gene-expression profiles of human B cells treated with individual compounds at multiple time points and concentrations. Using scoring metrics based on experimental dose-response curves, we assessed 32 methods (31 community-generated approaches and SynGen), four of which performed significantly better than random guessing. We highlight similarities between the methods. Although the accuracy of predictions was not optimal, we find that computational prediction of compound-pair activity is possible, and that community challenges can be useful to advance the field of in silico compound-synergy prediction.

  20. Image-guided tissue engineering

    PubMed Central

    Ballyns, Jeffrey J; Bonassar, Lawrence J

    2009-01-01

    Replication of anatomic shape is a significant challenge in developing implants for regenerative medicine. This has led to significant interest in using medical imaging techniques such as magnetic resonance imaging and computed tomography to design tissue engineered constructs. Implementation of medical imaging and computer-aided design in combination with technologies for rapid prototyping of living implants enables the generation of highly reproducible constructs with spatial resolution up to 25 μm. In this paper, we review the medical imaging modalities available and a paradigm for choosing a particular imaging technique. We also present fabrication techniques and methodologies for producing cellular engineered constructs. Finally, we comment on future challenges involved with image-guided tissue engineering and efforts to generate engineered constructs ready for implantation. PMID:19583811

  1. Flood Forecasting in Wales: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    How, Andrew; Williams, Christopher

    2015-04-01

    With steep, fast-responding river catchments, exposed coastal reaches with large tidal ranges and large population densities in some of the most at-risk areas, flood forecasting in Wales presents many varied challenges. Utilising advances in computing power and learning from best practice within the United Kingdom and abroad have brought significant improvements in recent years; however, many challenges still remain. Developments in computing and increased processing power come with a significant price tag; greater numbers of data sources and ensemble feeds bring a better understanding of uncertainty but the wealth of data needs careful management to ensure a clear message of risk is disseminated; new modelling techniques utilise better and faster computation, but lack the history of record and experience gained from the continued use of more established forecasting models. As a flood forecasting team we work to develop coastal and fluvial forecasting models, set them up for operational use and manage the duty role that runs the models in real time. An overview of our current operational flood forecasting system will be presented, along with a discussion on some of the solutions we have in place to address the challenges we face. These include: • real-time updating of fluvial models • rainfall forecasting verification • ensemble forecast data • longer range forecast data • contingency models • offshore to nearshore wave transformation • calculation of wave overtopping

  2. Dynamically allocating sets of fine-grained processors to running computations

    NASA Technical Reports Server (NTRS)

    Middleton, David

    1988-01-01

    Researchers explore an approach to using general purpose parallel computers which involves mapping hardware resources onto computations instead of mapping computations onto hardware. Problems such as processor allocation, task scheduling and load balancing, which have traditionally proven to be challenging, change significantly under this approach and may become amenable to new attacks. Researchers describe the implementation of this approach used by the FFP Machine whose computation and communication resources are repeatedly partitioned into disjoint groups that match the needs of available tasks from moment to moment. Several consequences of this system are examined.

  3. Paving the future: finding suitable ISMB venues

    PubMed Central

    Rost, Burkhard; Gaasterland, Terry; Lengauer, Thomas; Linial, Michal; Morrison McKay, B.J.; Schneider, Reinhard; Horton, Paul; Kelso, Janet

    2012-01-01

    The International Society for Computational Biology, ISCB, organizes the largest event in the field of computational biology and bioinformatics, namely the annual international conference on Intelligent Systems for Molecular Biology, the ISMB. This year at ISMB 2012 in Long Beach, ISCB celebrated the 20th anniversary of its flagship meeting. ISCB is a young, lean and efficient society that aspires to make a significant impact with only limited resources. Many constraints make the choice of venues for ISMB a tough challenge. Here, we describe those challenges and invite the contribution of ideas for solutions. Contact: assistant@rostlab.org PMID:22796959

  4. The Burn Medical Assistant: Developing Machine Learning Algorithms to Aid in the Estimation of Burn Wound Size

    DTIC Science & Technology

    2017-10-01

    hypothesis that a computer machine learning algorithm can analyze and classify burn injuries using multispectral imaging within 5% of an expert clinician...morbidity. In response to these challenges, the USAISR developed and obtained FDA 510(k) clearance of the Burn Navigator™, a computer decision support... computer decision support software (CDSS), can significantly change the CDSS algorithm’s recommendations and thus the total fluid administered to a

  5. Computational challenges of structure-based approaches applied to HIV.

    PubMed

    Forli, Stefano; Olson, Arthur J

    2015-01-01

    Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.

  6. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  7. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  8. Scaling an urban emergency evacuation framework: challenges and practices.

    DOT National Transportation Integrated Search

    2014-01-01

    Critical infrastructure disruption, caused by severe weather events, natural disasters, terrorist attacks, etc., has significant impacts on urban transportation systems. We built a computational framework to simulate urban transportation systems ...

  9. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch-free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Particularly significant among these were the substantial power savings, which resulted in strong, broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, and virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  10. Addressing Ethics and Technology in Business: Preparing Today's Students for the Ethical Challenges Presented by Technology in the Workplace

    ERIC Educational Resources Information Center

    Brooks, Rochelle

    2008-01-01

    The ethical development of information systems is but one of those sensitive scenarios associated with computer technology that has a tremendous impact on individuals and social life. The significance of these issues of concern cannot be overstated. However, since computer ethics is meant to be everybody's responsibility, the result can often be…

  11. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Shawn A.

    The Mercury Monte Carlo particle transport code is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. In the proposed Trinity Open Science calculations, I will investigate computer science aspects of the code which are relevant to convergence of the simulation quantities with increasing Monte Carlo particle counts.

  12. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.

  13. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307
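
    Both records above compute minor allele frequencies and chi-squared association statistics under secret sharing. As a plaintext reference for what those statistics are (without any of the secure-computation machinery from the paper), a minimal sketch follows; the genotype counts are invented for illustration.

    ```python
    # Plaintext reference computations for the two GWAS statistics mentioned above:
    # minor allele frequency (MAF) from genotype counts, and a 2x3 chi-squared
    # test of association between genotype and case/control status.
    import numpy as np
    from scipy.stats import chi2_contingency

    def minor_allele_frequency(n_aa, n_ab, n_bb):
        """Genotype counts for AA, AB, BB -> frequency of the rarer allele."""
        n_alleles = 2 * (n_aa + n_ab + n_bb)
        freq_a = (2 * n_aa + n_ab) / n_alleles
        return min(freq_a, 1.0 - freq_a)

    def genotype_chi_squared(case_counts, control_counts):
        """2x3 contingency table of genotype counts in cases vs controls."""
        table = np.array([case_counts, control_counts])
        chi2, p_value, dof, _ = chi2_contingency(table)
        return chi2, p_value, dof

    if __name__ == "__main__":
        print("MAF:", minor_allele_frequency(n_aa=640, n_ab=310, n_bb=50))
        chi2, p, dof = genotype_chi_squared([120, 230, 150], [180, 220, 100])
        print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3g}")
    ```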

  14. Intentions of hospital nurses to work with computers: based on the theory of planned behavior.

    PubMed

    Shoham, Snunith; Gonen, Ayala

    2008-01-01

    The purpose of this study was to determine registered nurses' attitudes related to intent to use computers in the hospital setting as a predictor of their future behavior. The study was further aimed at identifying the relationship between these attitudes and selected sociological, professional, and personal factors and to describe a research model integrating these various factors. The study was based on the theory of planned behavior. A random sample of 411 registered nurses was selected from a single large medical center in Israel. The study tool was a Likert-style questionnaire. Nine different indices were used: (1) behavioral intention toward computer use; (2) general attitudes toward computer use; (3) nursing attitudes toward computer use; (4) threat involved in computer use; (5) challenge involved in computer use; (6) organizational climate; (7) departmental climate; (8) attraction to technological innovations/innovativeness; (9) self-efficacy, ability to control behavior. Strong significant positive correlations were found between the nurses' attitudes (general attitudes and nursing attitudes), self-efficacy, innovativeness, and intentions to use computers. Higher correlations were found between departmental climate and attitudes than between organizational climate and attitudes. The threat and challenge that are involved in computer use were shown as important mediating variables to the understanding of the process of predicting attitudes and intentions toward using computers.

  15. Experimental Actinobacillus pleuropneumoniae challenge in swine: Comparison of computed tomographic and radiographic findings during disease

    PubMed Central

    2012-01-01

    Background In pigs, diseases of the respiratory tract like pleuropneumonia due to Actinobacillus pleuropneumoniae (App) infection have led to high economic losses for decades. Further research on disease pathogenesis, pathogen-host-interactions and new prophylactic and therapeutic approaches are needed. In most studies, a large number of experimental animals are required to assess lung alterations at different stages of the disease. In order to reduce the required number of animals but nevertheless gather information on the nature and extent of lung alterations in living pigs, a computed tomographic scoring system for quantifying gross pathological findings was developed. In this study, five healthy pigs served as control animals while 24 pigs were infected with App, the causative agent of pleuropneumonia in pigs, in an established model for respiratory tract disease. Results Computed tomographic (CT) findings during the course of App challenge were verified by radiological imaging, clinical, serological, gross pathology and histological examinations. Findings from clinical examinations and both CT and radiological imaging, were recorded on day 7 and day 21 after challenge. Clinical signs after experimental App challenge were indicative of acute to chronic disease. Lung CT findings of infected pigs comprised ground-glass opacities and consolidation. On day 7 and 21 the clinical scores significantly correlated with the scores of both imaging techniques. At day 21, significant correlations were found between clinical scores, CT scores and lung lesion scores. In 19 out of 22 challenged pigs the determined disease grades (not affected, slightly affected, moderately affected, severely affected) from CT and gross pathological examination were in accordance. Disease classification by radiography and gross pathology agreed in 11 out of 24 pigs. Conclusions High-resolution, high-contrast CT examination with no overlapping of organs is superior to radiography in the assessment of pneumonic lung lesions after App challenge. The new CT scoring system allows for quantification of gross pathological lung alterations in living pigs. However, computed tomographic findings are not informative of the etiology of respiratory disease. PMID:22546414

  16. Addressing the Challenges of a New Digital Technologies Curriculum: MOOCs as a Scalable Solution for Teacher Professional Development

    ERIC Educational Resources Information Center

    Vivian, Rebecca; Falkner, Katrina; Falkner, Nickolas

    2014-01-01

    England and Australia have introduced new learning areas, teaching computer science to children from the first year of school. This is a significant milestone that also raises a number of big challenges: the preparation of teachers and the development of resources "at a national scale." Curriculum change is not easy for teachers, in any…

  17. Critical infrastructure protection: significant challenges in developing national capabilities

    DOT National Transportation Integrated Search

    2001-04-01

    To address concerns about protecting the nation's critical computer-dependent infrastructure, this General Accounting Office (GAO) report describes the progress of the National Infrastructure Protection Center (NIPC) in (1) developing national ca...

  18. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGES

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  19. The expanded role of computers in Space Station Freedom real-time operations

    NASA Technical Reports Server (NTRS)

    Crawford, R. Paul; Cannon, Kathleen V.

    1990-01-01

    The challenges that NASA and its international partners face in their real-time operation of the Space Station Freedom necessitate an increased role on the part of computers. In building the operational concepts concerning the role of the computer, the Space Station program is drawing on lessons learned from past programs, knowledge of the needs of future space programs, and technical advances in the computer industry. The computer is expected to contribute most significantly in real-time operations by forming a versatile operating architecture, a responsive operations tool set, and an environment that promotes effective and efficient utilization of Space Station Freedom resources.

  20. Computational Methods for Stability and Control (COMSAC): The Time Has Come

    NASA Technical Reports Server (NTRS)

    Hall, Robert M.; Biedron, Robert T.; Ball, Douglas N.; Bogue, David R.; Chung, James; Green, Bradford E.; Grismer, Matthew J.; Brooks, Gregory P.; Chambers, Joseph R.

    2005-01-01

    Powerful computational fluid dynamics (CFD) tools have emerged that appear to offer significant benefits as an adjunct to the experimental methods used by the stability and control community to predict aerodynamic parameters. The decreasing cost and increasing availability of computing hours are making these applications increasingly viable as time goes on. This paper summarizes the efforts of four organizations to utilize high-end CFD tools to address the challenges of the stability and control arena. General motivation and the backdrop for these efforts will be summarized, as well as examples of current applications.

  1. solveME: fast and reliable solution of nonlinear ME models.

    PubMed

    Yang, Laurence; Ma, Ding; Ebrahim, Ali; Lloyd, Colton J; Saunders, Michael A; Palsson, Bernhard O

    2016-09-22

    Genome-scale models of metabolism and macromolecular expression (ME) significantly expand the scope and predictive capabilities of constraint-based modeling. ME models present considerable computational challenges: they are much (>30 times) larger than corresponding metabolic reconstructions (M models), are multiscale, and growth maximization is a nonlinear programming (NLP) problem, mainly due to macromolecule dilution constraints. Here, we address these computational challenges. We develop a fast and numerically reliable solution method for growth maximization in ME models using a quad-precision NLP solver (Quad MINOS). Our method was up to 45 % faster than binary search for six significant digits in growth rate. We also develop a fast, quad-precision flux variability analysis that is accelerated (up to 60× speedup) via solver warm-starts. Finally, we employ the tools developed to investigate growth-coupled succinate overproduction, accounting for proteome constraints. Just as genome-scale metabolic reconstructions have become an invaluable tool for computational and systems biologists, we anticipate that these fast and numerically reliable ME solution methods will accelerate the wide-spread adoption of ME models for researchers in these fields.
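
    The solveME abstract compares its quad-precision NLP solution against binary search on the growth rate to six significant digits. The sketch below shows that baseline bisection scheme in the abstract's terms; `lp_feasible_at` is a hypothetical stand-in for testing feasibility of an ME model's linear program at a fixed growth rate, not part of the published tool.

    ```python
    # Generic bisection on growth rate, the baseline that solveME's quad-precision
    # NLP approach is compared against. `lp_feasible_at(mu)` stands in for solving
    # the ME model's linear program with the growth rate fixed at mu; here it is a
    # hypothetical placeholder so the control flow can run on its own.
    def lp_feasible_at(mu: float) -> bool:
        # Placeholder oracle: pretend the model supports growth up to 0.7321 / h.
        return mu <= 0.7321

    def max_growth_by_bisection(mu_hi: float = 2.0, rel_tol: float = 1e-6) -> float:
        """Bisect [0, mu_hi] for the largest feasible growth rate."""
        lo, hi = 0.0, mu_hi
        while hi - lo > rel_tol * max(hi, 1.0):
            mid = 0.5 * (lo + hi)
            if lp_feasible_at(mid):
                lo = mid          # feasible: the optimum is at or above mid
            else:
                hi = mid          # infeasible: the optimum is below mid
        return lo

    if __name__ == "__main__":
        print(f"max feasible growth rate ~ {max_growth_by_bisection():.6f} per hour")
    ```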

  2. Micro-Adaptivity: Protecting Immersion in Didactically Adaptive Digital Educational Games

    ERIC Educational Resources Information Center

    Kickmeier-Rust, M. D.; Albert, D.

    2010-01-01

    The idea of utilizing the rich potential of today's computer games for educational purposes excites educators, scientists and technicians. Despite the significant hype over digital game-based learning, the genre is currently at an early stage. One of the most significant challenges for research and development in this area is establishing…

  3. Brain-computer interfacing under distraction: an evaluation study

    NASA Astrophysics Data System (ADS)

    Brandl, Stephanie; Frølich, Laura; Höhne, Johannes; Müller, Klaus-Robert; Samek, Wojciech

    2016-10-01

    Objective. While motor-imagery based brain-computer interfaces (BCIs) have been studied over many years by now, most of these studies have taken place in controlled lab settings. Bringing BCI technology into everyday life is still one of the main challenges in this field of research. Approach. This paper systematically investigates BCI performance under 6 types of distractions that mimic out-of-lab environments. Main results. We report results of 16 participants and show that the performance of the standard common spatial patterns (CSP) + regularized linear discriminant analysis classification pipeline drops significantly in this ‘simulated’ out-of-lab setting. We then investigate three methods for improving the performance: (1) artifact removal, (2) ensemble classification, and (3) a 2-step classification approach. While artifact removal does not enhance the BCI performance significantly, both ensemble classification and the 2-step classification combined with CSP significantly improve the performance compared to the standard procedure. Significance. Systematically analyzing out-of-lab scenarios is crucial when bringing BCI into everyday life. Algorithms must be adapted to overcome nonstationary environments in order to tackle real-world challenges.
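
    The standard pipeline evaluated above is common spatial patterns (CSP) followed by regularized linear discriminant analysis. A compact sketch of that generic pipeline on synthetic two-class epochs is given below; it is not the authors' code, and the data shapes, filter count, and shrinkage settings are illustrative assumptions.

    ```python
    # Standard CSP + shrinkage-LDA pipeline of the kind evaluated above, run on
    # synthetic two-class "EEG" epochs shaped (trials, channels, samples).
    import numpy as np
    from scipy.linalg import eigh
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def class_covariance(epochs):
        covs = [np.cov(trial) for trial in epochs]        # per-trial channel covariance
        return np.mean(covs, axis=0)

    def fit_csp(epochs_a, epochs_b, n_filters=3):
        ca, cb = class_covariance(epochs_a), class_covariance(epochs_b)
        # Generalized eigenvalue problem: filters maximizing the variance ratio.
        _, vecs = eigh(ca, ca + cb)
        return np.hstack([vecs[:, :n_filters], vecs[:, -n_filters:]])

    def log_var_features(epochs, filters):
        projected = np.einsum("ck,tcs->tks", filters, epochs)
        var = projected.var(axis=2)
        return np.log(var / var.sum(axis=1, keepdims=True))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        a = rng.normal(size=(40, 22, 250))                # illustrative class-A epochs
        b = 1.5 * rng.normal(size=(40, 22, 250))          # illustrative class-B epochs
        filters = fit_csp(a, b)
        X = np.vstack([log_var_features(a, filters), log_var_features(b, filters)])
        y = np.array([0] * 40 + [1] * 40)
        clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)
        print("training accuracy:", clf.score(X, y))
    ```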

  4. Assessment of Computer Literacy of Nurses in Lesotho.

    PubMed

    Mugomeri, Eltony; Chatanga, Peter; Maibvise, Charles; Masitha, Matseliso

    2016-11-01

    Health systems worldwide are moving toward use of information technology to improve healthcare delivery. However, this requires basic computer skills. This study assessed the computer literacy of nurses in Lesotho using a cross-sectional quantitative approach. A structured questionnaire with 32 standardized computer skills was distributed to 290 randomly selected nurses in Maseru District. Univariate and multivariate logistic regression analyses in Stata 13 were performed to identify factors associated with having inadequate computer skills. Overall, 177 (61%) nurses scored below 16 of the 32 skills assessed. Finding hyperlinks on Web pages (63%), use of advanced search parameters (60.2%), and downloading new software (60.1%) proved to be challenging to the highest proportions of nurses. Age, sex, year of obtaining latest qualification, computer experience, and work experience were significantly (P < .05) associated with inadequate computer skills in univariate analysis. However, in multivariate analyses, sex (P = .001), year of obtaining latest qualification (P = .011), and computer experience (P < .001) emerged as significant factors. The majority of nurses in Lesotho have inadequate computer skills, and this is significantly associated with having many years since obtaining their latest qualification, being female, and lack of exposure to computers. These factors should be considered during planning of training curriculum for nurses in Lesotho.

  5. A New Biogeochemical Computational Framework Integrated within the Community Land Model

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.

    2012-12-01

    Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate changes. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land model (CLM), however, faces three major challenges: 1) extensive efforts in modifying modeling structures and rewriting computer programs to incorporate biogeochemical processes with increasing complexity, 2) expensive computational cost to solve the governing equations due to numerical stiffness inherited from large variations in the rates of biogeochemical processes, and 3) lack of an efficient framework to systematically evaluate various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test out different mechanistic process representations and datasets and gain new insight on the behavior of the terrestrial ecosystems in response to climate change in a systematic way.
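
    The framework described above relieves numerical stiffness by replacing fast processes with algebraic equations and decoupling them from slow processes. The toy sketch below illustrates that idea on a hypothetical two-pool system (fast reversible exchange plus slow loss); the pool structure and rate constants are invented for the example and are not taken from CLM.

    ```python
    # Toy illustration of the stiffness-reduction idea described above: a fast
    # reversible exchange A <-> B (rates kf, kr) is coupled to a slow first-order
    # loss of B (rate kd). The full system is stiff when kf, kr >> kd; replacing
    # the fast exchange with its algebraic equilibrium decouples fast from slow.
    import numpy as np
    from scipy.integrate import solve_ivp

    kf, kr, kd = 1.0e4, 2.0e4, 1.0e-2   # hypothetical rate constants (1/day)
    inflow = 0.5                         # constant input to pool A

    def full_rhs(t, y):
        A, B = y
        exch = kf * A - kr * B
        return [inflow - exch, exch - kd * B]

    def reduced_rhs(t, y):
        # Fast exchange assumed equilibrated: B = T * kf / (kf + kr), T = A + B.
        T = y[0]
        B = T * kf / (kf + kr)
        return [inflow - kd * B]

    t_span, t_eval = (0.0, 200.0), np.linspace(0.0, 200.0, 201)
    full = solve_ivp(full_rhs, t_span, [1.0, 0.0], method="Radau", t_eval=t_eval)
    red = solve_ivp(reduced_rhs, t_span, [1.0], method="RK45", t_eval=t_eval)

    total_full = full.y[0] + full.y[1]
    print("max |T_full - T_reduced| =", np.max(np.abs(total_full - red.y[0])))
    ```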

  6. In Silico Dynamics: computer simulation in a Virtual Embryo (SOT)

    EPA Science Inventory

    Abstract: Utilizing cell biological information to predict higher order biological processes is a significant challenge in predictive toxicology. This is especially true for highly dynamical systems such as the embryo where morphogenesis, growth and differentiation require preci...

  7. Development and Validation of a Novel Fusion Algorithm for Continuous, Accurate and Automated R-wave Detection and Calculation of Signal-Derived Metrics

    DTIC Science & Technology

    2013-01-01

    Predicting the onset of atrial fibrillation: the Computers in Cardiology Challenge 2001. Comput Cardiol 2001;28:113-6. [22] Moody GB, Mark RG, Goldberger AL...Computers in Cardiology Challenge 2006: QT interval measurement. Comput Cardiol 2006;33:313-6. [18] Moody GB. Spontaneous termination of atrial fibrillation: a challenge from PhysioNet and Computers in Cardiology 2004. Comput Cardiol 2004;31:101-4. [19] Moody GB, Jager F. Distinguishing ischemic from non

  8. Rock climbing: A local-global algorithm to compute minimum energy and minimum free energy pathways.

    PubMed

    Templeton, Clark; Chen, Szu-Hua; Fathizadeh, Arman; Elber, Ron

    2017-10-21

    The calculation of minimum energy or minimum free energy paths is an important step in the quantitative and qualitative studies of chemical and physical processes. The computations of these coordinates present a significant challenge and have attracted considerable theoretical and computational interest. Here we present a new local-global approach to study reaction coordinates, based on a gradual optimization of an action. Like other global algorithms, it provides a path between known reactants and products, but it uses a local algorithm to extend the current path in small steps. The local-global approach does not require an initial guess to the path, a major challenge for global pathway finders. Finally, it provides an exact answer (the steepest descent path) at the end of the calculations. Numerical examples are provided for the Mueller potential and for a conformational transition in a solvated ring system.
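
    The abstract above uses the Mueller potential as a numerical example and notes that the method ultimately recovers the steepest descent path. For reference, the sketch below traces a plain steepest-descent path on the standard Mueller-Brown surface; it is only the reference object, not the paper's local-global action-based algorithm, and the starting point, step size, and tolerances are illustrative.

    ```python
    # Plain steepest-descent path tracing on the standard Mueller-Brown potential,
    # the two-dimensional test surface mentioned above. This shows the reference
    # object (a steepest descent path), not the paper's local-global method.
    import numpy as np

    A  = np.array([-200.0, -100.0, -170.0, 15.0])
    a  = np.array([-1.0, -1.0, -6.5, 0.7])
    b  = np.array([0.0, 0.0, 11.0, 0.6])
    c  = np.array([-10.0, -10.0, -6.5, 0.7])
    x0 = np.array([1.0, 0.0, -0.5, -1.0])
    y0 = np.array([0.0, 0.5, 1.5, 1.0])

    def mueller_gradient(p):
        dx, dy = p[0] - x0, p[1] - y0
        e = A * np.exp(a * dx**2 + b * dx * dy + c * dy**2)
        gx = np.sum(e * (2.0 * a * dx + b * dy))
        gy = np.sum(e * (b * dx + 2.0 * c * dy))
        return np.array([gx, gy])

    def steepest_descent_path(start, step=1e-5, grad_tol=1e-2, max_steps=200_000):
        path = [np.asarray(start, dtype=float)]
        for _ in range(max_steps):
            g = mueller_gradient(path[-1])
            if np.linalg.norm(g) < grad_tol:
                break                      # close enough to a minimum
            path.append(path[-1] - step * g)
        return np.array(path)

    if __name__ == "__main__":
        # Illustrative starting point displaced from a saddle region of the surface.
        path = steepest_descent_path([-0.8, 0.6])
        print("path points:", len(path), "endpoint:", np.round(path[-1], 3))
    ```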

  9. A pervasive parallel framework for visualization: final report for FWP 10-014707

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.

    2014-01-01

    We are on the threshold of a transformative change in the basic architecture of high-performance computing. The use of accelerator processors, characterized by large core counts, shared but asymmetrical memory, and heavy thread loading, is quickly becoming the norm in high performance computing. These accelerators represent significant challenges in updating our existing base of software. An intrinsic problem with this transition is a fundamental programming shift from message passing processes to much finer-grained thread scheduling with memory sharing. Another problem is the lack of stability in accelerator implementation; processor and compiler technology is currently changing rapidly. This report documents the results of our three-year ASCR project to address these challenges. Our project includes the development of the Dax toolkit, which contains the beginnings of new algorithms for a new generation of computers and the underlying infrastructure to rapidly prototype and build further algorithms as necessary.

  10. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  11. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  12. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  13. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  14. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    NASA Astrophysics Data System (ADS)

    Zeigler, R. A.; Blumenfeld, E. H.; Srinivasan, P.; McCubbin, F. M.; Evans, C. A.

    2018-04-01

    The Astromaterials Curation Office has recently begun incorporating X-ray CT data into the curation processes for lunar and meteorite samples, and long-term curation of that data and serving it to the public represent significant technical challenges.

  15. The implementation of AI technologies in computer wargames

    NASA Astrophysics Data System (ADS)

    Tiller, John A.

    2004-08-01

    Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much deeper about military situations before taking action. A good AI opponent would also include national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

  16. Computational Methods for MOF/Polymer Membranes.

    PubMed

    Erucar, Ilknur; Keskin, Seda

    2016-04-01

    Metal-organic framework (MOF)/polymer mixed matrix membranes (MMMs) have received significant interest in the last decade. MOFs are incorporated into polymers to make MMMs that exhibit improved gas permeability and selectivity compared with pure polymer membranes. The fundamental challenge in this area is to choose the appropriate MOF/polymer combinations for a gas separation of interest. Even if a single polymer is considered, there are thousands of MOFs that could potentially be used as fillers in MMMs. As a result, there has been a large demand for computational studies that can accurately predict the gas separation performance of MOF/polymer MMMs prior to experiments. We have developed computational approaches to assess gas separation potentials of MOF/polymer MMMs and used them to identify the most promising MOF/polymer pairs. In this Personal Account, we aim to provide a critical overview of current computational methods for modeling MOF/polymer MMMs. We give our perspective on the background, successes, and failures that led to developments in this area and discuss the opportunities and challenges of using computational methods for MOF/polymer MMMs. © 2016 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
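    This abstract does not spell out the screening equations it uses; a common starting point in the mixed matrix membrane literature (an assumption here, not necessarily the method of this Personal Account) is the Maxwell model, which estimates MMM permeability from the pure-polymer permeability, the MOF filler permeability, and the filler volume fraction. A minimal sketch:

    ```python
    def maxwell_permeability(p_polymer, p_filler, phi):
        """Maxwell-model estimate of mixed matrix membrane permeability.

        p_polymer : permeability of the continuous polymer phase (e.g., Barrer)
        p_filler  : permeability of the dispersed MOF filler (same units)
        phi       : filler volume fraction (model is usually trusted for phi <~ 0.3)
        """
        num = p_filler + 2 * p_polymer - 2 * phi * (p_polymer - p_filler)
        den = p_filler + 2 * p_polymer + phi * (p_polymer - p_filler)
        return p_polymer * num / den


    def mmm_selectivity(p_poly_a, p_poly_b, p_mof_a, p_mof_b, phi):
        """Ideal selectivity of the MMM for gas A over gas B."""
        return (maxwell_permeability(p_poly_a, p_mof_a, phi)
                / maxwell_permeability(p_poly_b, p_mof_b, phi))


    # Hypothetical numbers for illustration only (not from the paper):
    # a CO2/CH4 separation with a MOF filler at 20% loading.
    print(mmm_selectivity(p_poly_a=9.0, p_poly_b=0.3,
                          p_mof_a=4000.0, p_mof_b=800.0, phi=0.2))
    ```

    Applying the same estimate to two gases gives an ideal MMM selectivity, which is the kind of quantity by which large MOF libraries can be ranked against a single polymer before any synthesis is attempted.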

  17. Advancing Capabilities for Understanding the Earth System Through Intelligent Systems, the NSF Perspective

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Zanzerkia, E. E.; Munoz-Avila, H.

    2015-12-01

    The National Science Foundation (NSF) Directorate for Geosciences (GEO) and Directorate for Computer and Information Science (CISE) acknowledge the significant scientific challenges involved in understanding the fundamental processes of the Earth system, within the atmospheric and geospace, Earth, ocean and polar sciences, and across those boundaries. A broad view of the opportunities and directions for GEO is described in the report "Dynamic Earth: GEO imperative and Frontiers 2015-2020." Many aspects of geosciences research, highlighted both in this document and in other community grand challenges, pose novel problems for researchers in intelligent systems. Geosciences research will require solutions for data-intensive science, advanced computational capabilities, and transformative concepts for visualizing, using, analyzing and understanding geoscience phenomena and data. Opportunities for the scientific community to engage in addressing these challenges are available and being developed through NSF's portfolio of investments and activities. The NSF-wide initiative, Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21), looks to accelerate research and education through new capabilities in data, computation, software and other aspects of cyberinfrastructure. EarthCube, a joint program between GEO and the Advanced Cyberinfrastructure Division, aims to create a well-connected and facile environment to share data and knowledge in an open, transparent, and inclusive manner, thus accelerating our ability to understand and predict the Earth system. EarthCube's mission opens an opportunity for collaborative research on novel information systems enhancing and supporting geosciences research efforts. NSF encourages true, collaborative partnerships between scientists in computer sciences and the geosciences to meet these challenges.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael; Lethin, Richard

    Programming models and environments play the essential roles in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  19. Mesoscale Models of Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Boghosian, Bruce M.; Hadjiconstantinou, Nicolas G.

    During the last half century, enormous progress has been made in the field of computational materials modeling, to the extent that in many cases computational approaches are used in a predictive fashion. Despite this progress, modeling of general hydrodynamic behavior remains a challenging task. One of the main challenges stems from the fact that hydrodynamics manifests itself over a very wide range of length and time scales. On one end of the spectrum, one finds the fluid's "internal" scale characteristic of its molecular structure (in the absence of quantum effects, which we omit in this chapter). On the other end, the "outer" scale is set by the characteristic sizes of the problem's domain. The resulting scale separation or lack thereof, as well as the existence of intermediate scales, are key to determining the optimal approach. Successful treatments require a judicious choice of the level of description, which is a delicate balancing act between the conflicting requirements of fidelity and manageable computational cost: a coarse description typically requires models for underlying processes occurring at smaller length and time scales; on the other hand, a fine-scale model will incur a significantly larger computational cost.

  20. Discriminative Hierarchical K-Means Tree for Large-Scale Image Classification.

    PubMed

    Chen, Shizhi; Yang, Xiaodong; Tian, Yingli

    2015-09-01

    A key challenge in large-scale image classification is how to achieve efficiency in terms of both computation and memory without compromising classification accuracy. The learning-based classifiers achieve the state-of-the-art accuracies, but have been criticized for computational complexity that grows linearly with the number of classes. The nonparametric nearest neighbor (NN)-based classifiers naturally handle large numbers of categories, but incur prohibitively expensive computation and memory costs. In this brief, we present a novel classification scheme, i.e., discriminative hierarchical K-means tree (D-HKTree), which combines the advantages of both learning-based and NN-based classifiers. The complexity of the D-HKTree grows only sublinearly with the number of categories, which is much better than the recent hierarchical support vector machine-based methods. The memory requirement is an order of magnitude less than that of the recent Naïve Bayesian NN-based approaches. The proposed D-HKTree classification scheme is evaluated on several challenging benchmark databases and achieves state-of-the-art accuracies with significantly lower computation cost and memory requirements.
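    As a rough illustration of why tree-structured routing gives sublinear cost in the number of classes, the sketch below builds a plain hierarchical k-means tree over class centroids and answers queries along a single root-to-leaf path. It is an assumption-laden toy, not the paper's D-HKTree, which additionally learns discriminative classifiers at the nodes:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    class HKTree:
        """Toy hierarchical k-means tree over class centroids.  A query descends one
        root-to-leaf path, so it touches O(branching * depth) centroids instead of
        one per class.  Illustration only -- not the exact D-HKTree of the paper."""

        def __init__(self, branching=4, leaf_size=8):
            self.branching, self.leaf_size = branching, leaf_size

        def fit(self, centroids, labels):
            self.root = self._build(np.asarray(centroids, float), np.asarray(labels))
            return self

        def _build(self, cents, labs):
            if len(labs) <= self.leaf_size:
                return ("leaf", cents, labs)
            km = KMeans(n_clusters=self.branching, n_init=4).fit(cents)
            kept_centers, children = [], []
            for c in range(self.branching):
                mask = km.labels_ == c
                if mask.any():                      # skip empty clusters
                    kept_centers.append(km.cluster_centers_[c])
                    children.append(self._build(cents[mask], labs[mask]))
            return ("node", np.array(kept_centers), children)

        def predict(self, x):
            node = self.root
            while node[0] == "node":                # greedy descent to one leaf
                _, centers, children = node
                node = children[int(np.argmin(np.linalg.norm(centers - x, axis=1)))]
            _, cents, labs = node
            return labs[int(np.argmin(np.linalg.norm(cents - x, axis=1)))]


    # Usage with random "class centroids" (hypothetical data):
    rng = np.random.default_rng(0)
    C = rng.normal(size=(1000, 64))                 # 1000 classes, 64-D features
    tree = HKTree().fit(C, np.arange(1000))
    print(tree.predict(C[42]))                      # recovers label 42 via the tree
    ```

    With branching factor b and C classes, a query compares against roughly b·log_b(C) centroids instead of C, which is the source of the sublinear scaling claimed in the abstract.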

  1. An Efficient Computational Framework for the Analysis of Whole Slide Images: Application to Follicular Lymphoma Immunohistochemistry

    PubMed Central

    Samsi, Siddharth; Krishnamurthy, Ashok K.; Gurcan, Metin N.

    2012-01-01

    Follicular Lymphoma (FL) is one of the most common non-Hodgkin lymphomas in the United States. Diagnosis and grading of FL are based on the review of histopathological tissue sections under a microscope and are influenced by human factors such as fatigue and reader bias. Computer-aided image analysis tools can help improve the accuracy of diagnosis and grading and act as another tool at the pathologist’s disposal. Our group has been developing algorithms for identifying follicles in immunohistochemical images. These algorithms have been tested and validated on small images extracted from whole slide images. However, the use of these algorithms for analyzing the entire whole slide image requires significant changes to the processing methodology since the images are relatively large (on the order of 100k × 100k pixels). In this paper we discuss the challenges involved in analyzing whole slide images and propose potential computational methodologies for addressing these challenges. We discuss the use of parallel computing tools on commodity clusters and compare performance of the serial and parallel implementations of our approach. PMID:22962572
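    A natural decomposition for images of this size is to tile the slide and process the tiles in parallel, which is the general shape of the approach described above. The sketch below is a hedged, self-contained stand-in: read_region and the per-tile statistic are toy placeholders (a real pipeline would use a whole-slide reader such as OpenSlide and the paper's follicle-segmentation algorithm), and stitching detections across tile overlaps is only noted, not implemented:

    ```python
    import numpy as np
    from multiprocessing import Pool

    TILE, OVERLAP = 4096, 256

    def tile_coords(width, height, tile=TILE, overlap=OVERLAP):
        """Yield (x, y, w, h) tiles covering a width x height slide, overlapping so
        follicles that straddle tile borders are not cut in two."""
        step = tile - overlap
        for y in range(0, height, step):
            for x in range(0, width, step):
                yield x, y, min(tile, width - x), min(tile, height - y)

    def read_region(x, y, w, h):
        """Stand-in for a whole-slide reader (e.g. OpenSlide's read_region); here it
        just fabricates a random RGB tile so the sketch is runnable."""
        return np.random.randint(0, 255, size=(h, w, 3), dtype=np.uint8)

    def segment_tile(args):
        """Stand-in for the follicle-segmentation step: returns the fraction of
        'bright' pixels in the tile as a toy per-tile statistic."""
        x, y, w, h = args
        tile = read_region(x, y, w, h)
        return (x, y), float((tile[..., 0] > 200).mean())

    if __name__ == "__main__":
        coords = list(tile_coords(20_000, 20_000))   # small slide for the demo
        with Pool(processes=8) as pool:              # data-parallel over tiles
            results = pool.map(segment_tile, coords)
        # A real pipeline would now merge/stitch detections across tile overlaps.
        print(len(results), "tiles processed")
    ```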

  2. A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.

    PubMed

    Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus

    2016-11-01

    Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the rapid development of computer technologies, computer-aided surgery has been widely used to minimize risks and improve the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source on current and future developments in computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.

  3. CFD Prediction for Spin Rate of Fixed Canards on a Spinning Projectile

    NASA Astrophysics Data System (ADS)

    Ji, X. L.; Jia, Ch. Y.; Jiang, T. Y.

    2011-09-01

    A computational study performed for the spin rate of fixed canards on a spinning projectile is presented in this paper. The canard configurations pose challenges in determining the aerodynamic forces, moments, and flow-field changes, which can have a significant effect on stability, performance, and corrected-round accuracy. Advanced time-accurate Navier-Stokes computations have been performed to compute the spin rate associated with the spinning motion of the canard configurations at supersonic speed. The results show that the roll-damping moment of the canards varies linearly with the spin rate at supersonic velocity.

  4. This Rock 'n' Roll Video Teaches Math

    ERIC Educational Resources Information Center

    Niess, Margaret L.; Walker, Janet M.

    2009-01-01

    Mathematics is a discipline that has significantly advanced through the use of digital technologies with improved computational, graphical, and symbolic capabilities. Digital videos can be used to present challenging mathematical questions for students. Video clips offer instructional possibilities for moving students from a passive mode of…

  5. Exploring the Notion of Context in Medical Data.

    PubMed

    Mylonas, Phivos

    2017-01-01

    Scientific and technological knowledge and skills are becoming crucial for most data analysis activities. Two rather distinct, but at the same time collaborating, domains are those of computer science and medicine; the former offers significant aid towards a more efficient understanding of the latter's research trends. Still, the process of meaningfully analyzing and understanding medical information and data is a tedious one, subject to several challenges. One of them is the efficient utilization of contextual information in the process leading to optimized, context-aware data analysis results. Nowadays, researchers are provided with tools and opportunities to analytically study medical data, but at the same time significant and rather complex computational challenges are yet to be tackled, among others due to the humanistic nature and the increased rate of new content and information production imposed by related hardware and applications. So, the ultimate goal of this position paper is to provide interested parties with an overview of major contextual information types to be identified within the medical data processing framework.

  6. Multiscale Modeling in Computational Biomechanics: Determining Computational Priorities and Addressing Current Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tawhai, Merryn; Bischoff, Jeff; Einstein, Daniel R.

    2009-05-01

    In this article, we describe some current multiscale modeling issues in computational biomechanics from the perspective of the musculoskeletal and respiratory systems and mechanotransduction. First, we outline the necessity of multiscale simulations in these biological systems. Then we summarize challenges inherent to multiscale biomechanics modeling, regardless of the subdiscipline, followed by computational challenges that are system-specific. We discuss some of the current tools that have been utilized to aid research in multiscale mechanics simulations, and the priorities to further the field of multiscale biomechanics computation.

  7. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  8. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    ERIC Educational Resources Information Center

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  9. Software Systems for High-performance Quantum Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; Britt, Keith A

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  10. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, Daniel; Berzins, Martin; Pennington, Robert

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  11. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

    The National Space Transportation System (NSTS) is comprised of many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as Pen-Based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-Based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  12. High Performance Processors for Space Environments: A Subproject of the NASA Exploration Systems Mission Directorate "Radiation Hardened Electronics for Space Environments" Technology Development Program

    NASA Technical Reports Server (NTRS)

    Johnson, M.; Label, K.; McCabe, J.; Powell, W.; Bolotin, G.; Kolawa, E.; Ng, T.; Hyde, D.

    2007-01-01

    Implementation of challenging Exploration Systems Mission Directorate objectives and strategies can be constrained by onboard computing capabilities and power efficiencies. The Radiation Hardened Electronics for Space Environments (RHESE) High Performance Processors for Space Environments project will address this challenge by significantly advancing the sustained throughput and processing efficiency of high-performance radiation-hardened processors, targeting delivery of products by the end of FY12.

  13. Combining Modeling and Gaming for Predictive Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riensche, Roderick M.; Whitney, Paul D.

    2012-08-22

    Many of our most significant challenges involve people. While human behavior has long been studied, there are recent advances in computational modeling of human behavior. With advances in computational capabilities come increases in the volume and complexity of data that humans must understand in order to make sense of and capitalize on these modeling advances. Ultimately, models represent an encapsulation of human knowledge. One inherent challenge in modeling is efficient and accurate transfer of knowledge from humans to models, and subsequent retrieval. The simulated real-world environment of games presents one avenue for these knowledge transfers. In this paper we describe our approach of combining modeling and gaming disciplines to develop predictive capabilities, using formal models to inform game development, and using games to provide data for modeling.

  14. Final Report. Institute for Ultrascale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  15. Force fields and scoring functions for carbohydrate simulation.

    PubMed

    Xiong, Xiuming; Chen, Zhaoqiang; Cossins, Benjamin P; Xu, Zhijian; Shao, Qiang; Ding, Kai; Zhu, Weiliang; Shi, Jiye

    2015-01-12

    Carbohydrate dynamics plays a vital role in many biological processes, but we are not currently able to probe this with experimental approaches. The highly flexible nature of carbohydrate structures differs in many aspects from other biomolecules, posing significant challenges for studies employing computational simulation. Over past decades, computational study of carbohydrates has been focused on the development of structure prediction methods, force field optimization, molecular dynamics simulation, and scoring functions for carbohydrate-protein interactions. Advances in carbohydrate force fields and scoring functions can be largely attributed to enhanced computational algorithms, application of quantum mechanics, and the increasing number of experimental structures determined by X-ray and NMR techniques. The conformational analysis of carbohydrates is challenging and has been studied intensively to elucidate the anomeric, exo-anomeric, and gauche effects. Here, we review the issues associated with carbohydrate force fields and scoring functions, which will have a broad application in the field of carbohydrate-based drug design. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    NASA Astrophysics Data System (ADS)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  17. The National Shipbuilding Research Program. 1989 Ship Production Symposium. Paper No. 13: NIDDESC: Meeting the Data Exchange Challenge Through a Cooperative Effort

    DTIC Science & Technology

    1989-09-01

    … Computer Aided Design (CAD) and Manufacturing (CAM) techniques in the marine industry has increased significantly in recent years … somewhat from ship to ship. All of the activities and companies involved have improved this process by utilizing computer tools. For example, many …

  18. Memory management in genome-wide association studies

    PubMed Central

    2009-01-01

    Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047
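    The abstract does not name the memory management tool it evaluated; as a hedged illustration of the general idea, the sketch below keeps a packed genotype matrix on disk and memory-maps it so an association scan streams one SNP at a time instead of holding billions of genotypes in RAM. The synthetic data and the simple per-SNP linear regression are assumptions for illustration, not necessarily the approach used in the paper:

    ```python
    import numpy as np
    from scipy import stats

    N_SAMPLES, N_SNPS = 2_000, 10_000

    # One-time setup: write a packed genotype matrix (0/1/2 minor-allele counts)
    # to disk so that the analysis never needs the whole matrix in memory.
    genotypes = np.memmap("genotypes.int8", dtype=np.int8, mode="w+",
                          shape=(N_SNPS, N_SAMPLES))
    genotypes[:] = np.random.default_rng(0).integers(0, 3, size=genotypes.shape)
    genotypes.flush()

    phenotype = np.random.default_rng(1).normal(size=N_SAMPLES)

    # Analysis pass: the OS pages in one SNP row at a time, so peak memory stays
    # at roughly one row plus the phenotype vector, not the full matrix.
    geno = np.memmap("genotypes.int8", dtype=np.int8, mode="r",
                     shape=(N_SNPS, N_SAMPLES))
    pvals = np.empty(N_SNPS)
    for j in range(N_SNPS):
        slope, intercept, r, p, se = stats.linregress(geno[j].astype(float), phenotype)
        pvals[j] = p
    print("smallest p-value:", pvals.min())
    ```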

  19. Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing

    PubMed Central

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640

  20. Hybrid Quantum-Classical Approach to Quantum Optimal Control.

    PubMed

    Li, Jun; Yang, Xiaodong; Peng, Xinhua; Sun, Chang-Pu

    2017-04-14

    A central challenge in quantum computing is to identify more computational problems for which utilization of quantum resources can offer significant speedup. Here, we propose a hybrid quantum-classical scheme to tackle the quantum optimal control problem. We show that the most computationally demanding part of gradient-based algorithms, namely, computing the fitness function and its gradient for a control input, can be accomplished by the process of evolution and measurement on a quantum simulator. By posing queries to and receiving answers from the quantum simulator, classical computing devices update the control parameters until an optimal control solution is found. To demonstrate the quantum-classical scheme in experiment, we use a seven-qubit nuclear magnetic resonance system, on which we have succeeded in optimizing state preparation without involving classical computation of the large Hilbert space evolution.
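    The loop described above alternates quantum evaluations with classical parameter updates. The sketch below imitates that structure for a single qubit, with the quantum processor replaced by a small matrix-exponential surrogate and the measured gradient replaced by finite differences; the Hamiltonian, step sizes, and target state are assumptions for illustration, not the authors' seven-qubit NMR setup:

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Pauli operators for a single qubit (a stand-in for the NMR system).
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    N_STEPS, DT = 20, 0.1
    psi0 = np.array([1, 0], dtype=complex)                   # start in |0>
    target = np.array([1, 1], dtype=complex) / np.sqrt(2)    # prepare |+>

    def fidelity(controls):
        """'Quantum' evaluation of the fitness function: evolve under piecewise-
        constant controls and measure overlap with the target state.  On hardware,
        this value (and its gradient) would come from the quantum processor."""
        psi = psi0
        for u in controls:
            psi = expm(-1j * DT * (sz + u * sx)) @ psi
        return abs(np.vdot(target, psi)) ** 2

    def gradient(controls, eps=1e-4):
        """Finite-difference gradient; a stand-in for measured gradients."""
        g = np.zeros_like(controls)
        for k in range(len(controls)):
            up, dn = controls.copy(), controls.copy()
            up[k] += eps
            dn[k] -= eps
            g[k] = (fidelity(up) - fidelity(dn)) / (2 * eps)
        return g

    # Classical update loop: the optimizer only ever sees fitness/gradient values.
    controls = np.zeros(N_STEPS)
    for it in range(200):
        controls += 0.5 * gradient(controls)                 # gradient ascent
    print("final fidelity:", round(fidelity(controls), 4))
    ```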

  1. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    NASA Technical Reports Server (NTRS)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of DSN, and monitoring all multi-mission spacecraft tracking activities in real-time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements for the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  2. Introducing Programmable Logic to Undergraduate Engineering Students in a Digital Electronics Course

    ERIC Educational Resources Information Center

    Todorovich, E.; Marone, J. A.; Vazquez, M.

    2012-01-01

    Due to significant technological advances and industry requirements, many universities have introduced programmable logic and hardware description languages into undergraduate engineering curricula. This has led to a number of logistical and didactical challenges, in particular for computer science students. In this paper, the integration of some…

  3. Using Virtual Reality with and without Gaming Attributes for Academic Achievement

    ERIC Educational Resources Information Center

    Vogel, Jennifer J.; Greenwood-Ericksen, Adams; Cannon-Bowers, Jan; Bowers, Clint A.

    2006-01-01

    A subcategory of computer-assisted instruction (CAI), games have additional attributes such as motivation, reward, interactivity, score, and challenge. This study used a quasi-experimental design to determine if previous findings generalize to non simulation-based game designs. Researchers observed significant improvement in the overall population…

  4. Deep Learning for Computer Vision: A Brief Review

    PubMed Central

    Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last few years, deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619

  5. Fluid-Structure Interaction Modeling of Parachutes with Disreefing and Modified Geometric Porosity and Separation Aerodynamics of a Cover Jettisoned to the Spacecraft Wake

    NASA Astrophysics Data System (ADS)

    Fritze, Matthew D.

    Fluid-structure interaction (FSI) modeling of spacecraft parachutes involves a number of computational challenges. The canopy complexity created by the hundreds of gaps and slits and the design-related modification of that geometric porosity by removal of some of the sails and panels are among the formidable challenges. Disreefing from one stage to another when the parachute is used in multiple stages is another formidable challenge. This thesis addresses the computational challenges involved in disreefing of spacecraft parachutes and in fully-open and reefed stages of the parachutes with modified geometric porosity. The special techniques developed to address these challenges are described and the FSI computations are reported. The thesis also addresses the modeling and computation challenges involved in the very early stages, where the sudden separation of a cover jettisoned to the spacecraft wake needs to be modeled. Higher-order temporal representations used in modeling the separation motion are described, and the computed separation and wake-induced forces acting on the cover are reported.

  6. Challenges to Computational Aerothermodynamic Simulation and Validation for Planetary Entry Vehicle Analysis

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    Challenges to computational aerothermodynamic (CA) simulation and validation of hypersonic flow over planetary entry vehicles are discussed. Entry, descent, and landing (EDL) of high mass to Mars is a significant driver of new simulation requirements. These requirements include simulation of large deployable, flexible structures and interactions with reaction control system (RCS) and retro-thruster jets. Simulation of radiation and ablation coupled to the flow solver continues to be a high priority for planetary entry analyses, especially for return to Earth and outer planet missions. Three research areas addressing these challenges are emphasized. The first addresses the need to obtain accurate heating on unstructured tetrahedral grid systems to take advantage of flexibility in grid generation and grid adaptation. A multi-dimensional inviscid flux reconstruction algorithm is defined that is oriented with local flow topology as opposed to grid. The second addresses coupling of radiation and ablation to the hypersonic flow solver - flight- and ground-based data are used to provide limited validation of these multi-physics simulations. The third addresses the challenges of retro-propulsion simulation and the criticality of grid adaptation in this application. The evolution of CA to become a tool for innovation of EDL systems requires a successful resolution of these challenges.

  7. A Randomized Phase I Trial of a Brief Computer-Delivered Intervention for Alcohol Use During Pregnancy

    PubMed Central

    Sokol, Robert J.; Ondersma, Steven J.

    2011-01-01

    Background: Drinking alcohol during pregnancy has a range of negative consequences for the developing fetus. Screening and brief intervention approaches have significant promise, but their population impact may be limited by a range of challenges to implementation. We, therefore, conducted preliminary acceptability and feasibility evaluation of a computer-delivered brief intervention for alcohol use during pregnancy. Methods: Participants were 50 pregnant women who screened positive for risky drinking during a routine prenatal clinic visit and were randomly assigned to computer-delivered brief intervention or assessment-only conditions. Results: Ratings of intervention ease of use, helpfulness, and other factors were high (4.7–5.0 on a 1–5 scale). Participants in both conditions significantly decreased alcohol use at follow-up, with no group differences; however, birth weights for infants born to women in the intervention group were significantly higher (p<0.05, d = 0.62). Conclusions: Further development and study of computer-delivered screening and intervention for alcohol use during pregnancy are warranted. PMID:21823917

  8. A large-scale evaluation of computational protein function prediction

    PubMed Central

    Radivojac, Predrag; Clark, Wyatt T; Ronnen Oron, Tal; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kassner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Böhm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based Critical Assessment of protein Function Annotation (CAFA) experiment. Fifty-four methods representing the state-of-the-art for protein function prediction were evaluated on a target set of 866 proteins from eleven organisms. Two findings stand out: (i) today’s best protein function prediction algorithms significantly outperformed widely-used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is significant need for improvement of currently available tools. PMID:23353650

  9. The Future of Electronic Device Design: Device and Process Simulation Find Intelligence on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.

    1999-01-01

    We are on the path to meet the major challenges ahead for TCAD (technology computer aided design). The emerging computational grid will ultimately solve the challenge of limited computational power. The Modular TCAD Framework will solve the TCAD software challenge once TCAD software developers realize that there is no other way to meet industry's needs. The modular TCAD framework (MTF) also provides the ideal platform for solving the TCAD model challenge by rapid implementation of models in a partial differential solver.

  10. Protecting genomic data analytics in the cloud: state of the art and opportunities.

    PubMed

    Tang, Haixu; Jiang, Xiaoqian; Wang, Xiaofeng; Wang, Shuang; Sofia, Heidi; Fox, Dov; Lauter, Kristin; Malin, Bradley; Telenti, Amalio; Xiong, Li; Ohno-Machado, Lucila

    2016-10-13

    The outsourcing of genomic data into public cloud computing settings raises concerns over privacy and security. Significant advancements in secure computation methods have emerged over the past several years, but such techniques need to be rigorously evaluated for their ability to support the analysis of human genomic data in an efficient and cost-effective manner. With respect to public cloud environments, there are concerns about the inadvertent exposure of human genomic data to unauthorized users. In analyses involving multiple institutions, there is additional concern about data being used beyond the agreed research scope and being processed in untrusted computational environments, which may not satisfy institutional policies. To systematically investigate these issues, the NIH-funded National Center for Biomedical Computing iDASH (integrating Data for Analysis, 'anonymization' and SHaring) hosted the second Critical Assessment of Data Privacy and Protection competition to assess the capacity of cryptographic technologies for protecting computation over human genomes in the cloud and promoting cross-institutional collaboration. Data scientists were challenged to design and engineer practical algorithms for secure outsourcing of genome computation tasks in working software, whereby analyses are performed only on encrypted data. They were also challenged to develop approaches to enable secure collaboration on data from genomic studies generated by multiple organizations (e.g., medical centers) to jointly compute aggregate statistics without sharing individual-level records. The results of the competition indicated that secure computation techniques can enable comparative analysis of human genomes, but greater efficiency (in terms of compute time and memory utilization) is needed before they are sufficiently practical for real-world environments.
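    The cross-institution task above (computing aggregate statistics without sharing individual-level records) can be illustrated with additive secret sharing, one standard building block for secure aggregation. This is a hedged toy, not one of the competition entries, and the per-site allele counts are made up:

    ```python
    import secrets

    PRIME = 2**61 - 1   # all arithmetic is done modulo a large prime

    def share(value, n_parties):
        """Split an integer into n additive shares that sum to value mod PRIME."""
        shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % PRIME)
        return shares

    def secure_sum(per_site_values):
        """Each site splits its value into shares and sends one share to every party;
        each party adds the shares it received, and the partial sums are combined.
        No party ever sees another site's raw value."""
        n = len(per_site_values)
        all_shares = [share(v, n) for v in per_site_values]   # [site][party]
        partial = [sum(all_shares[i][j] for i in range(n)) % PRIME for j in range(n)]
        return sum(partial) % PRIME

    # Hypothetical minor-allele counts at one SNP held by three medical centers:
    counts = [412, 388, 455]
    print(secure_sum(counts), "==", sum(counts))   # aggregate equals the plain sum
    ```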

  11. DOE Advanced Scientific Advisory Committee (ASCAC): Workforce Subcommittee Letter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Calandra, Henri; Crivelli, Silvia

    2014-07-23

    Simulation and computing are essential to much of the research conducted at the DOE national laboratories. Experts in the ASCR-relevant Computing Sciences, which encompass a range of disciplines including Computer Science, Applied Mathematics, Statistics and domain Computational Sciences, are an essential element of the workforce in nearly all of the DOE national laboratories. This report seeks to identify the gaps and challenges facing DOE with respect to this workforce. This letter is ASCAC’s response to the charge of February 19, 2014 to identify disciplines in which significantly greater emphasis in workforce training at the graduate or postdoctoral levels is necessary to address workforce gaps in current and future Office of Science mission needs.

  12. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, archivable and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by alleviating the need to redundantly retrieve and integrate data sets and build modeling analysis codes. The EMC platform also offers users indirect assistance from experts through prefabricated compute environments, potentially reducing study “ramp up” times.

  13. Exploring the challenges faced by polytechnic students

    NASA Astrophysics Data System (ADS)

    Matore, Mohd Effendi @ Ewan Mohd; Khairani, Ahmad Zamri

    2015-02-01

    This study aims to identify further challenges faced by students in seven polytechnics in Malaysia, as a continuation of previous research that had identified 52 main challenges faced by students using the Rasch Model. The explorative study focuses on challenges that are not included in the Mooney Problem Checklist (MPCL). A total of 121 polytechnic students submitted 183 written responses through the open questions provided. Two hundred and fifty-two students responded, from the students' perspective, to the dichotomous questions regarding their view of the challenges faced. The data was analysed qualitatively using NVivo 8.0. The findings showed that students from Politeknik Seberang Perai (PSP) gave the highest response, which was 56 (30.6%), and Politeknik Metro Kuala Lumpur (PMKL) had the lowest response, of 2 (1.09%). Five dominant challenges were identified: the English language (32, 17.5%), learning (14, 7.7%), vehicles (13, 7.1%), information and communication technology (ICT) (13, 7.1%), and peers (11, 6.0%). This article, however, focuses on three apparent challenges, namely the English language, vehicles, and computers and ICT, as the challenges of learning and peers had been analysed in the previous MPCL work. The English language challenge concerned weaknesses in speech and fluency. The computer and ICT challenge covered weaknesses in mastering ICT and computers, as well as computer breakdowns and low-performance computers. The vehicle challenge emphasized the unavailability of vehicles to attend lectures and go elsewhere, the lack of a transportation service in the polytechnic, and not having a valid driving license. These challenges are very relevant and need to be discussed in an effort to prepare polytechnics to face the transformational process.

  14. Toward Interactive Scenario Analysis and Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gayle, Thomas R.; Summers, Kenneth Lee; Jungels, John

    2015-01-01

    As Modeling and Simulation (M&S) tools have matured, their applicability and importance have increased across many national security challenges. In particular, they provide a way to test how something may behave without the need for real-world testing. However, current and future changes across several factors including capabilities, policy, and funding are driving a need for rapid response or evaluation in ways that many M&S tools cannot address. Issues around large data, computational requirements, delivery mechanisms, and analyst involvement already exist and pose significant challenges. Furthermore, rising expectations, rising input complexity, and increasing depth of analysis will only increase the difficulty of these challenges. In this study we examine whether innovations in M&S software coupled with advances in "cloud" computing and "big-data" methodologies can overcome many of these challenges. In particular, we propose a simple, horizontally-scalable distributed computing environment that could provide the foundation (i.e., "cloud") for next-generation M&S-based applications based on the notion of "parallel multi-simulation". In our context, the goal of parallel multi-simulation is to consider as many simultaneous paths of execution as possible. Therefore, with sufficient resources, the complexity is dominated by the cost of single scenario runs as opposed to the number of runs required. We show the feasibility of this architecture through a stable prototype implementation coupled with the Umbra Simulation Framework [6]. Finally, we highlight the utility through multiple novel analysis tools and by showing the performance improvement compared to existing tools.
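    A hedged sketch of the parallel multi-simulation idea: expand a scenario space, dispatch every branch to its own worker, and let elapsed time be governed by the cost of one run rather than the number of runs. The run_scenario function and the parameter grid are hypothetical placeholders, not the Umbra-backed prototype:

    ```python
    from concurrent.futures import ProcessPoolExecutor
    from itertools import product

    def run_scenario(params):
        """Hypothetical stand-in for one scenario run; it just scores the parameter
        combination so the sketch is self-contained and runnable."""
        return params, params["speed"] * params["sensors"] - params["threats"]

    def multi_simulate(grid, max_workers=8):
        """Expand a parameter grid into scenarios and execute them concurrently;
        with enough workers, elapsed time ~ cost of one scenario, not the count."""
        keys = list(grid)
        scenarios = [dict(zip(keys, combo)) for combo in product(*grid.values())]
        with ProcessPoolExecutor(max_workers=max_workers) as pool:
            return list(pool.map(run_scenario, scenarios))

    if __name__ == "__main__":
        grid = {"speed": [1, 2, 3], "sensors": [2, 4], "threats": [0, 5, 10]}
        results = multi_simulate(grid)
        best = max(results, key=lambda r: r[1])
        print(len(results), "branches explored; best:", best)
    ```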

  15. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
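    The sequential loop described above (identify one influential epistemic variable, refine it, recompute, repeat) can be sketched with a variance-based sensitivity step driving the selection. The toy response model, interval half-widths, and refinement factor below are assumptions for illustration, and the Bayesian-inference step that would update each variable from subsystem data is reduced to simply shrinking its interval:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def system_response(e):
        """Toy system-level model in four epistemic variables (stand-in for the GTM)."""
        return 3 * e[:, 0] + e[:, 1] ** 2 + 0.5 * e[:, 2] * e[:, 3]

    def first_order_sobol(widths, n=20_000):
        """Crude first-order Sobol' indices via a pick-freeze estimator."""
        d = len(widths)
        A = rng.uniform(-1, 1, size=(n, d)) * widths
        B = rng.uniform(-1, 1, size=(n, d)) * widths
        yA, yB = system_response(A), system_response(B)
        S = np.empty(d)
        for i in range(d):
            ABi = B.copy()
            ABi[:, i] = A[:, i]            # freeze variable i from the A sample
            S[i] = np.mean(yA * (system_response(ABi) - yB)) / yA.var()
        return S

    widths, refined = np.ones(4), []
    for step in range(4):                  # refine one variable per pass
        S = first_order_sobol(widths)
        S[refined] = -np.inf               # never re-select an already refined variable
        k = int(np.argmax(S))
        widths[k] *= 0.25                  # "refinement" shrinks its epistemic interval
        refined.append(k)
        print(f"step {step}: refined variable {k} (indices {np.round(S, 3)})")
    ```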

  16. A 3D bioprinting exemplar of the consequences of the regulatory requirements on customized processes.

    PubMed

    Hourd, Paul; Medcalf, Nicholas; Segal, Joel; Williams, David J

    2015-01-01

    Computer-aided 3D printing approaches to the industrial production of customized 3D functional living constructs for restoration of tissue and organ function face significant regulatory challenges. Using the manufacture of a customized, 3D-bioprinted nasal implant as a well-informed but hypothetical exemplar, we examine how these products might be regulated. Existing EU and USA regulatory frameworks do not account for the differences between 3D printing and conventional manufacturing methods or the ability to create individual customized products using mechanized rather than craft approaches. Already subject to extensive regulatory control, issues related to control of the computer-aided design to manufacture process and the associated software system chain present additional scientific and regulatory challenges for manufacturers of these complex 3D-bioprinted advanced combination products.

  17. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  18. A world-wide databridge supported by a commercial cloud provider

    NASA Astrophysics Data System (ADS)

    Tat Cheung, Kwong; Field, Laurence; Furano, Fabrizio

    2017-10-01

    Volunteer computing has the potential to provide significant additional computing capacity for the LHC experiments. One of the challenges with exploiting volunteer computing is to support a global community of volunteers that provides heterogeneous resources. However, high energy physics applications require more data input and output than the CPU-intensive applications that are typically used by other volunteer computing projects. While the so-called databridge has already been successfully proposed as a method to span the untrusted and trusted domains of volunteer computing and Grid computing respectively, globally transferring data between potentially poor-performing residential networks and CERN could be unreliable, leading to wasted resource usage. The expectation is that by placing a storage endpoint that is part of a wider, flexible geographical databridge deployment closer to the volunteers, the transfer success rate and the overall performance can be improved. This contribution investigates the provision of a globally distributed databridge implemented upon a commercial cloud provider.

  19. Neuromorphic computing enabled by physics of electron spins: Prospects and perspectives

    NASA Astrophysics Data System (ADS)

    Sengupta, Abhronil; Roy, Kaushik

    2018-03-01

    “Spintronics” refers to the understanding of the physics of electron spin-related phenomena. While most of the significant advancements in this field have been driven primarily by memory, recent research has demonstrated that various facets of the underlying physics of spin transport and manipulation can directly mimic the functionalities of the computational primitives in neuromorphic computation, i.e., the neurons and synapses. Given the potential of these spintronic devices to implement bio-mimetic computations at very low terminal voltages, several spin-device structures have been proposed as the core building blocks of neuromorphic circuits and systems to implement brain-inspired computing. Such an approach is expected to play a key role in circumventing the problems of ever-increasing power dissipation and hardware requirements for implementing neuro-inspired algorithms in conventional digital CMOS technology. Perspectives on spin-enabled neuromorphic computing, its status, and challenges and future prospects are outlined in this review article.

  20. Automated Test for NASA CFS

    NASA Technical Reports Server (NTRS)

    McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy

    2015-01-01

    The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.

  1. Multimodal Research: Addressing the Complexity of Multimodal Environments and the Challenges for CALL

    ERIC Educational Resources Information Center

    Tan, Sabine; O'Halloran, Kay L.; Wignell, Peter

    2016-01-01

    Multimodality, the study of the interaction of language with other semiotic resources such as images and sound, has significant implications for computer assisted language learning (CALL) with regard to understanding the impact of digital environments on language teaching and learning. In this paper, we explore recent manifestations of…

  2. Biomimetic robots using EAP as artificial muscles - progress and challenges

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2004-01-01

    Biology offers a great model for emulation in areas ranging from tools, computational algorithms, materials science, mechanisms and information technology. In recent years, the field of biomimetics, namely mimicking biology, has blossomed with significant advances enabling the reverse engineering of many animals' functions and implementation of some of these capabilities.

  3. The Battle to Secure Our Public Access Computers

    ERIC Educational Resources Information Center

    Sendze, Monique

    2006-01-01

    Securing public access workstations should be a significant part of any library's network and information-security strategy because of the sensitive information patrons enter on these workstations. As the IT manager for the Johnson County Library in Kansas City, Kan., this author is challenged to make sure that thousands of patrons get the access…

  4. The Use of Computer Technology in Designing Appropriate Test Accommodations for English Language Learners

    ERIC Educational Resources Information Center

    Abedi, Jamal

    2014-01-01

    Among the several forms of accommodations used in the assessment of English language learners (ELLs), language-based accommodations are the most effective in making assessments linguistically accessible to these students. However, there are significant challenges associated with the implementation of many of these accommodations. This article…

  5. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    NASA Astrophysics Data System (ADS)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. The study also explores the possible challenges faced by academicians when adopting this new technology. The pilot study was conducted on 40 lecturers at Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability challenges, regulatory compliance concerns/user control, and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: a security and dependency factor, and a user control and mentality factor.

  6. BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations

    NASA Astrophysics Data System (ADS)

    Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I.; Strydis, Christos

    2017-12-01

    Objective. The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit for a homogeneous acceleration platform to effectively address the complete array of modeling requirements. Approach. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies, an Intel Xeon-Phi CPU, a NVidia GP-GPU and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload’s performance characteristics. Main results. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. Significance. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.
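
    To give a sense of the per-neuron workload behind such simulations, the sketch below integrates a classic single-compartment Hodgkin-Huxley neuron with forward Euler in NumPy. It is a simplified stand-in for illustration only, not the extended inferior-olive model or the PyNN/BrainFrame interface described in the paper.

        import numpy as np

        def hodgkin_huxley(t_max=50.0, dt=0.01, i_ext=10.0):
            """Forward-Euler integration of the classic Hodgkin-Huxley equations (mV, ms, uA/cm^2)."""
            c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
            e_na, e_k, e_l = 50.0, -77.0, -54.4

            # Rate functions for the gating variables m, h, n.
            a_m = lambda v: 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
            b_m = lambda v: 4.0 * np.exp(-(v + 65.0) / 18.0)
            a_h = lambda v: 0.07 * np.exp(-(v + 65.0) / 20.0)
            b_h = lambda v: 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
            a_n = lambda v: 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
            b_n = lambda v: 0.125 * np.exp(-(v + 65.0) / 80.0)

            steps = int(t_max / dt)
            v, m, h, n = -65.0, 0.05, 0.6, 0.32
            trace = np.empty(steps)
            for i in range(steps):
                i_na = g_na * m**3 * h * (v - e_na)
                i_k = g_k * n**4 * (v - e_k)
                i_l = g_l * (v - e_l)
                v += dt * (i_ext - i_na - i_k - i_l) / c_m
                m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
                h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
                n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
                trace[i] = v
            return trace

        # Count upward zero crossings as spikes under a constant stimulus.
        spikes = (np.diff((hodgkin_huxley() > 0).astype(int)) == 1).sum()
        print(f"spikes in 50 ms: {spikes}")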

  7. Women in computer science: An interpretative phenomenological analysis exploring common factors contributing to women's selection and persistence in computer science as an academic major

    NASA Astrophysics Data System (ADS)

    Thackeray, Lynn Roy

    The purpose of this study is to understand the meaning that women make of the social and cultural factors that influence their reasons for entering and remaining in the study of computer science. The twenty-first century presents many new challenges in career development and workforce choices for both men and women. Information technology has become the driving force behind many areas of the economy. As this trend continues, it has become essential for U.S. citizens to pursue careers in technology, including the computing sciences. Although computer science is a very lucrative profession, many Americans, especially women, are not choosing it. Recent studies have shown no significant differences in math, technical and science competency between men and women. Therefore, other factors, such as social, cultural, and environmental influences, seem to affect women's decisions in choosing an area of study and a career. A phenomenological method of qualitative research was used in this study, based on interviews of seven female students who are currently enrolled in a post-secondary computer science program. Their narratives provided insight into the social and cultural environments that contribute to their persistence in their technical studies, as well as identifying barriers and challenges faced by female students who choose to study computer science. It is hoped that the data collected from this study may provide recommendations for the recruiting, retention and support of women in computer science departments of U.S. colleges and universities, and thereby increase the number of women computer scientists in industry. Keywords: gender access, self-efficacy, culture, stereotypes, computer education, diversity.

  8. [Computational chemistry in structure-based drug design].

    PubMed

    Cao, Ran; Li, Wei; Sun, Han-Zi; Zhou, Yu; Huang, Niu

    2013-07-01

    Today, the understanding of the sequence and structure of biologically relevant targets is growing rapidly, and researchers from many disciplines, physics and computational science in particular, are making significant contributions to modern biology and drug discovery. However, it remains challenging to rationally design small molecular ligands with desired biological characteristics based on the structural information of the drug targets, which demands more accurate calculation of ligand binding free energy. With the rapid advances in computer power and extensive efforts in algorithm development, physics-based computational chemistry approaches have played increasingly important roles in structure-based drug design. Here we review newly developed computational chemistry methods in structure-based drug design as well as their elegant applications, including binding-site druggability assessment, large-scale virtual screening of chemical databases, and lead compound optimization. Importantly, we address the current bottlenecks and propose practical solutions.

  9. Challenges and Security in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Chang, Hyokyung; Choi, Euiin

    People want to solve problems as soon as they arise. Ubiquitous computing helps make such situations easier to handle, and cloud computing is the technology that makes it even more capable and powerful. Cloud computing, however, is still at an early stage of implementation and use, and it faces many challenges in technical matters and security issues. This paper looks at cloud computing security.

  10. Using the cloud to speed-up calibration of watershed-scale hydrologic models (Invited)

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Ercan, M. B.; Castronova, A. M.; Humphrey, M.; Beekwilder, N.; Steele, J.; Kim, I.

    2013-12-01

    This research focuses on using the cloud to address computational challenges associated with hydrologic modeling. One example is calibration of a watershed-scale hydrologic model, which can take days of execution time on typical computers. While parallel algorithms for model calibration exist and some researchers have used multi-core computers or clusters to run these algorithms, these solutions do not fully address the challenge because (i) calibration can still be too time consuming even on multicore personal computers and (ii) few in the community have the time and expertise needed to manage a compute cluster. Given this, another option that we are exploring through this work is the use of the cloud to speed up calibration of watershed-scale hydrologic models. The cloud used in this capacity provides a means for renting a specific number and type of machines for only the time needed to perform a calibration run. The cloud allows one to precisely balance the duration of the calibration with the financial costs so that, if the budget allows, the calibration can be performed more quickly by renting more machines. Focusing specifically on the SWAT hydrologic model and a parallel version of the DDS calibration algorithm, we show significant speed-ups across a range of watershed sizes using up to 256 cores to perform a model calibration. The tool provides a simple web-based user interface and the ability to monitor job submission and progress during the calibration. Finally, this talk concludes with initial work to leverage the cloud for other tasks associated with hydrologic modeling, including tasks related to preparing inputs for constructing place-based hydrologic models.
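
    The structure of such a calibration, many independent candidate evaluations that can be farmed out to rented machines, is sketched below. This is a hedged illustration under stated assumptions: the toy objective function stands in for a SWAT model run, the perturbation scheme is a simplified DDS-style step, and a local multiprocessing pool stands in for cloud workers; it is not the authors' tool.

        import numpy as np
        from multiprocessing import Pool

        LOWER = np.array([0.0, 0.0, 0.0])   # stand-in parameter bounds
        UPPER = np.array([1.0, 1.0, 1.0])

        def objective(params):
            """Placeholder for a SWAT run scored against observations (lower is better)."""
            return float(np.sum((params - 0.3) ** 2))

        def dds_candidate(args):
            """Perturb a random subset of dimensions, DDS-style, and evaluate."""
            best, iteration, max_iter, seed = args
            rng = np.random.default_rng(seed)
            p_perturb = max(1.0 - np.log(iteration) / np.log(max_iter), 1.0 / best.size)
            candidate = best.copy()
            for j in range(best.size):
                if rng.random() < p_perturb:
                    candidate[j] += rng.normal(0.0, 0.2 * (UPPER[j] - LOWER[j]))
            candidate = np.clip(candidate, LOWER, UPPER)
            return objective(candidate), candidate

        if __name__ == "__main__":
            max_iter, n_workers = 200, 8
            best = (LOWER + UPPER) / 2.0
            best_score = objective(best)
            with Pool(n_workers) as pool:
                for it in range(1, max_iter + 1):
                    jobs = [(best, it, max_iter, it * n_workers + k) for k in range(n_workers)]
                    for score, cand in pool.map(dds_candidate, jobs):
                        if score < best_score:
                            best_score, best = score, cand
            print("best parameters:", best, "score:", best_score)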

  11. Discretization of the induced-charge boundary integral equation.

    PubMed

    Bardhan, Jaydeep P; Eisenberg, Robert S; Gillespie, Dirk

    2009-07-01

    Boundary-element methods (BEMs) for solving integral equations numerically have been used in many fields to compute the induced charges at dielectric boundaries. In this paper, we consider a more accurate implementation of BEM in the context of ions in aqueous solution near proteins, but our results are applicable more generally. The ions that modulate protein function are often within a few angstroms of the protein, which leads to the significant accumulation of polarization charge at the protein-solvent interface. Computing the induced charge accurately and quickly poses a numerical challenge in solving a popular integral equation using BEM. In particular, the accuracy of simulations can depend strongly on seemingly minor details of how the entries of the BEM matrix are calculated. We demonstrate that when the dielectric interface is discretized into flat tiles, the qualocation method of Tausch [IEEE Trans Comput.-Comput.-Aided Des. 20, 1398 (2001)] to compute the BEM matrix elements is always more accurate than the traditional centroid-collocation method. Qualocation is not more expensive to implement than collocation and can save significant computational time by reducing the number of boundary elements needed to discretize the dielectric interfaces.

  12. Discretization of the induced-charge boundary integral equation

    NASA Astrophysics Data System (ADS)

    Bardhan, Jaydeep P.; Eisenberg, Robert S.; Gillespie, Dirk

    2009-07-01

    Boundary-element methods (BEMs) for solving integral equations numerically have been used in many fields to compute the induced charges at dielectric boundaries. In this paper, we consider a more accurate implementation of BEM in the context of ions in aqueous solution near proteins, but our results are applicable more generally. The ions that modulate protein function are often within a few angstroms of the protein, which leads to the significant accumulation of polarization charge at the protein-solvent interface. Computing the induced charge accurately and quickly poses a numerical challenge in solving a popular integral equation using BEM. In particular, the accuracy of simulations can depend strongly on seemingly minor details of how the entries of the BEM matrix are calculated. We demonstrate that when the dielectric interface is discretized into flat tiles, the qualocation method of Tausch [IEEE Trans Comput.-Comput.-Aided Des. 20, 1398 (2001)] to compute the BEM matrix elements is always more accurate than the traditional centroid-collocation method. Qualocation is not more expensive to implement than collocation and can save significant computational time by reducing the number of boundary elements needed to discretize the dielectric interfaces.

  13. Strategies, Challenges and Prospects for Active Learning in the Computer-Based Classroom

    ERIC Educational Resources Information Center

    Holbert, K. E.; Karady, G. G.

    2009-01-01

    The introduction of computer-equipped classrooms into engineering education has brought with it a host of opportunities and issues. Herein, some of the challenges and successes for creating an environment for active learning within computer-based classrooms are described. The particular teaching approach developed for undergraduate electrical…

  14. Change Detection of Mobile LIDAR Data Using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Liu, Kun; Boehm, Jan; Alis, Christian

    2016-06-01

    Change detection has long been a challenging problem, although a lot of research has been conducted in different fields such as remote sensing and photogrammetry, computer vision, and robotics. In this paper, we blend a voxel grid and Apache Spark together to propose an efficient method to address the problem in the context of big data. A voxel grid is a regular geometric representation consisting of voxels of the same size, which suits parallel computation well. Apache Spark is a popular distributed parallel computing platform which allows fault tolerance and memory caching. These features can significantly enhance the performance of Apache Spark and result in an efficient and robust implementation. In our experiments, both synthetic and real point cloud data are employed to demonstrate the quality of our method.
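
    A serial sketch of the voxel-grid idea is shown below: both epochs are quantized to voxel indices, and changes are the voxels occupied in one epoch but not the other. The Spark-based distribution described in the paper is replaced here by plain NumPy, and the voxel size and random test data are illustrative assumptions only.

        import numpy as np

        def voxel_keys(points, voxel_size):
            """Map each 3-D point to the integer index of the voxel containing it."""
            idx = np.floor(points / voxel_size).astype(np.int64)
            return set(map(tuple, idx))

        def changed_voxels(epoch_a, epoch_b, voxel_size=0.5):
            """Voxels occupied in exactly one of the two epochs (appeared or disappeared)."""
            keys_a = voxel_keys(epoch_a, voxel_size)
            keys_b = voxel_keys(epoch_b, voxel_size)
            return keys_a ^ keys_b  # symmetric difference

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            scan_1 = rng.uniform(0, 10, size=(100000, 3))
            scan_2 = np.vstack([scan_1[:90000], rng.uniform(0, 10, size=(10000, 3))])
            print(f"{len(changed_voxels(scan_1, scan_2))} voxels changed")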

  15. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
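
    The MapReduce pattern underlying such a framework can be shown with a small Hadoop-streaming style script that averages a variable per grid cell; the mapper and reducer read and write plain text on stdin/stdout. The input format (one "cell_id,value" pair per line) is an assumption for illustration, not the framework's actual schema.

        import sys

        def mapper():
            """Emit 'cell_id<TAB>value' for each input line of the form 'cell_id,value'."""
            for line in sys.stdin:
                cell_id, value = line.strip().split(",")
                print(f"{cell_id}\t{value}")

        def reducer():
            """Average values per cell; Hadoop streaming delivers keys already sorted."""
            current, total, count = None, 0.0, 0
            for line in sys.stdin:
                cell_id, value = line.strip().split("\t")
                if cell_id != current and current is not None:
                    print(f"{current}\t{total / count}")
                    total, count = 0.0, 0
                current = cell_id
                total += float(value)
                count += 1
            if current is not None:
                print(f"{current}\t{total / count}")

        if __name__ == "__main__":
            mapper() if sys.argv[1] == "map" else reducer()

    A script like this would typically be wired into a Hadoop streaming job, with the "map" and "reduce" invocations passed as the mapper and reducer commands; exact invocation details depend on the cluster setup.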

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Underwood, Keith D; Ulmer, Craig D.; Thompson, David

    Field programmable gate arrays (FPGAs) have been used as alternative computational devices for over a decade; however, they have not been used for traditional scientific computing due to their perceived lack of floating-point performance. In recent years, there has been a surge of interest in alternatives to traditional microprocessors for high performance computing. Sandia National Labs began two projects to determine whether FPGAs would be a suitable alternative to microprocessors for high performance scientific computing and, if so, how they should be integrated into the system. We present results that indicate that FPGAs could have a significant impact on future systems. FPGAs have the potential to deliver order-of-magnitude performance wins on several key algorithms; however, there are serious questions as to whether the system integration challenge can be met. Furthermore, there remain challenges in FPGA programming and system-level reliability when using FPGA devices. Acknowledgment: Arun Rodrigues provided valuable support and assistance in the use of the Structural Simulation Toolkit within an FPGA context. Curtis Janssen and Steve Plimpton provided valuable insights into the workings of two Sandia applications (MPQC and LAMMPS, respectively).

  17. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  18. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
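
    How probability enters can be illustrated with a tiny NumPy example: a qubit put into superposition by a Hadamard gate has measurement outcomes governed by the squared amplitudes, so repeated measurements yield a random bit stream. This is a generic textbook illustration, not the paper's own analysis.

        import numpy as np

        # Single qubit starting in |0>, written as a 2-component complex amplitude vector.
        state = np.array([1.0, 0.0], dtype=complex)

        # Hadamard gate: maps |0> to the equal superposition (|0> + |1>) / sqrt(2).
        H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
        state = H @ state

        # Born rule: outcome probabilities are the squared magnitudes of the amplitudes.
        probs = np.abs(state) ** 2
        print("P(0), P(1) =", probs)        # approximately 0.5, 0.5

        # Simulate repeated measurements; each collapses to 0 or 1 at random.
        rng = np.random.default_rng(42)
        samples = rng.choice([0, 1], size=1000, p=probs)
        print("measured ones:", samples.sum(), "of 1000")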

  19. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    NASA Astrophysics Data System (ADS)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.

  20. Best Performers Announced for the NCI-CPTAC DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Cancer Institute (NCI) Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce that teams led by Jaewoo Kang (Korea University) and Yuanfang Guan with Hongyang Li (University of Michigan) are the best performers of the NCI-CPTAC DREAM Proteogenomics Computational Challenge. Over 500 participants from 20 countries registered for the Challenge, which offered $25,000 in cash awards contributed by the NVIDIA Foundation through its Compute the Cure initiative.

  1. The Development of Computational Biology in South Africa: Successes Achieved and Lessons Learnt

    PubMed Central

    Mulder, Nicola J.; Christoffels, Alan; de Oliveira, Tulio; Gamieldien, Junaid; Hazelhurst, Scott; Joubert, Fourie; Kumuthini, Judit; Pillay, Ché S.; Snoep, Jacky L.; Tastan Bishop, Özlem; Tiffin, Nicki

    2016-01-01

    Bioinformatics is now a critical skill in many research and commercial environments as biological data are increasing in both size and complexity. South African researchers recognized this need in the mid-1990s and responded by working with the government as well as international bodies to develop initiatives to build bioinformatics capacity in the country. Significant injections of support from these bodies provided a springboard for the establishment of computational biology units at multiple universities throughout the country, which took on teaching, basic research and support roles. Several challenges were encountered, for example with unreliability of funding, lack of skills, and lack of infrastructure. However, the bioinformatics community worked together to overcome these, and South Africa is now arguably the leading country in bioinformatics on the African continent. Here we discuss how the discipline developed in the country, highlighting the challenges, successes, and lessons learnt. PMID:26845152

  2. The Development of Computational Biology in South Africa: Successes Achieved and Lessons Learnt.

    PubMed

    Mulder, Nicola J; Christoffels, Alan; de Oliveira, Tulio; Gamieldien, Junaid; Hazelhurst, Scott; Joubert, Fourie; Kumuthini, Judit; Pillay, Ché S; Snoep, Jacky L; Tastan Bishop, Özlem; Tiffin, Nicki

    2016-02-01

    Bioinformatics is now a critical skill in many research and commercial environments as biological data are increasing in both size and complexity. South African researchers recognized this need in the mid-1990s and responded by working with the government as well as international bodies to develop initiatives to build bioinformatics capacity in the country. Significant injections of support from these bodies provided a springboard for the establishment of computational biology units at multiple universities throughout the country, which took on teaching, basic research and support roles. Several challenges were encountered, for example with unreliability of funding, lack of skills, and lack of infrastructure. However, the bioinformatics community worked together to overcome these, and South Africa is now arguably the leading country in bioinformatics on the African continent. Here we discuss how the discipline developed in the country, highlighting the challenges, successes, and lessons learnt.

  3. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    NASA Astrophysics Data System (ADS)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to the details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than an order of magnitude speed-up and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer school on Bayesian Computing for Astronomical Data Analysis with support of the Penn State Center for Astrostatistics and Institute for CyberScience.

  4. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  5. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Kamesh; Im, Eun-Jin; Ibrahim, Khaled Z.

    The next decade of high-performance computing (HPC) systems will see a rapid evolution and divergence of multi- and manycore architectures as power and cooling constraints limit increases in microprocessor clock speeds. Understanding efficient optimization methodologies on diverse multicore designs in the context of demanding numerical methods is one of the greatest challenges faced today by the HPC community. In this paper, we examine the efficient multicore optimization of GTC, a petascale gyrokinetic toroidal fusion code for studying plasma microturbulence in tokamak devices. For GTC’s key computational components (charge deposition and particle push), we explore efficient parallelization strategies across a broad range of emerging multicore designs, including the recently-released Intel Nehalem-EX, the AMD Opteron Istanbul, and the highly multithreaded Sun UltraSparc T2+. We also present the first study on tuning gyrokinetic particle-in-cell (PIC) algorithms for graphics processors, using the NVIDIA C2050 (Fermi). Our work discusses several novel optimization approaches for gyrokinetic PIC, including mixed-precision computation, particle binning and decomposition strategies, grid replication, SIMDized atomic floating-point operations, and effective GPU texture memory utilization. Overall, we achieve significant performance improvements of 1.3–4.7× on these complex PIC kernels, despite the inherent challenges of data dependency and locality. Finally, our work also points to several architectural and programming features that could significantly enhance PIC performance and productivity on next-generation architectures.
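
    The charge-deposition kernel discussed above is essentially a scatter-add with data dependencies. The sketch below shows a serial NumPy version with linear (cloud-in-cell) weights, plus the particle-binning idea of sorting particles by cell to improve locality; it is a 1-D illustration of the pattern only, not GTC's gyrokinetic deposition.

        import numpy as np

        def deposit_charge(positions, charges, n_cells, length):
            """Cloud-in-cell deposition of particle charge onto a periodic 1-D grid."""
            dx = length / n_cells
            cell = np.floor(positions / dx).astype(np.int64) % n_cells
            frac = positions / dx - np.floor(positions / dx)   # distance past the left cell edge
            grid = np.zeros(n_cells)
            # np.add.at performs an unbuffered scatter-add (the serial analogue of
            # the atomic floating-point adds used on GPUs).
            np.add.at(grid, cell, charges * (1.0 - frac))
            np.add.at(grid, (cell + 1) % n_cells, charges * frac)
            return grid

        def bin_particles(positions, n_cells, length):
            """Sort particles by their grid cell to improve memory locality."""
            cell = np.floor(positions / (length / n_cells)).astype(np.int64) % n_cells
            return np.argsort(cell, kind="stable")

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            x = rng.uniform(0.0, 1.0, size=1_000_000)
            q = np.full(x.size, 1.0 / x.size)
            order = bin_particles(x, 128, 1.0)
            rho = deposit_charge(x[order], q[order], 128, 1.0)
            print("total deposited charge:", rho.sum())   # should be close to 1.0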

  7. Next Generation Distributed Computing for Cancer Research

    PubMed Central

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539

  8. Next generation distributed computing for cancer research.

    PubMed

    Agarwal, Pankaj; Owzar, Kouros

    2014-01-01

    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing.

  9. Assessment of immigrant certified nursing assistants' communication when responding to standardized care challenges.

    PubMed

    Massey, Meredith; Roter, Debra L

    2016-01-01

    Certified nursing assistants (CNAs) provide 80% of the hands-on care in US nursing homes; a significant portion of this work is performed by immigrants with limited English fluency. This study is designed to assess immigrant CNA's communication behavior in response to a series of virtual simulated care challenges. A convenience sample of 31 immigrant CNAs verbally responded to 9 care challenges embedded in an interactive computer platform. The responses were coded with the Roter Interaction Analysis System (RIAS), CNA instructors rated response quality and spoken English was rated. CNA communication behaviors varied across care challenges and a broad repertoire of communication was used; 69% of response content was characterized as psychosocial. Communication elements (both instrumental and psychosocial) were significant predictors of response quality for 5 of 9 scenarios. Overall these variables explained between 13% and 36% of the adjusted variance in quality ratings. Immigrant CNAs responded to common care challenges using a variety of communication strategies despite fluency deficits. Virtual simulation-based observation is a feasible, acceptable and low cost method of communication assessment with implications for supervision, training and evaluation of a para-professional workforce. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Reinforcement Learning and Episodic Memory in Humans and Animals: An Integrative Framework.

    PubMed

    Gershman, Samuel J; Daw, Nathaniel D

    2017-01-03

    We review the psychology and neuroscience of reinforcement learning (RL), which has experienced significant progress in the past two decades, enabled by the comprehensive experimental study of simple learning and decision-making tasks. However, one challenge in the study of RL is computational: The simplicity of these tasks ignores important aspects of reinforcement learning in the real world: (a) State spaces are high-dimensional, continuous, and partially observable; this implies that (b) data are relatively sparse and, indeed, precisely the same situation may never be encountered twice; furthermore, (c) rewards depend on the long-term consequences of actions in ways that violate the classical assumptions that make RL tractable. A seemingly distinct challenge is that, cognitively, theories of RL have largely involved procedural and semantic memory, the way in which knowledge about action values or world models extracted gradually from many experiences can drive choice. This focus on semantic memory leaves out many aspects of memory, such as episodic memory, related to the traces of individual events. We suggest that these two challenges are related. The computational challenge can be dealt with, in part, by endowing RL systems with episodic memory, allowing them to (a) efficiently approximate value functions over complex state spaces, (b) learn with very little data, and (c) bridge long-term dependencies between actions and rewards. We review the computational theory underlying this proposal and the empirical evidence to support it. Our proposal suggests that the ubiquitous and diverse roles of memory in RL may function as part of an integrated learning system.
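
    One concrete way of endowing an RL agent with episodic memory is episodic control: store the return obtained after each visited state and estimate values by averaging over the k most similar stored states. The sketch below, with a toy task and raw state vectors in place of learned embeddings, is a generic illustration of that idea rather than the authors' specific framework.

        import numpy as np

        class EpisodicValueMemory:
            """Nearest-neighbour value estimates from stored (state, return) traces."""

            def __init__(self, k=5):
                self.k = k
                self.states, self.returns = [], []

            def store(self, state, episodic_return):
                self.states.append(np.asarray(state, dtype=float))
                self.returns.append(float(episodic_return))

            def value(self, state):
                """Average the returns of the k most similar stored states."""
                if not self.states:
                    return 0.0
                dists = np.linalg.norm(np.array(self.states) - np.asarray(state), axis=1)
                nearest = np.argsort(dists)[: self.k]
                return float(np.mean(np.array(self.returns)[nearest]))

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            memory = EpisodicValueMemory(k=5)
            # Toy task: the return is high when the first feature of the state is high.
            for _ in range(500):
                s = rng.uniform(-1, 1, size=4)
                memory.store(s, episodic_return=s[0] + 0.1 * rng.normal())
            print(memory.value(np.array([0.9, 0, 0, 0])))   # high estimate
            print(memory.value(np.array([-0.9, 0, 0, 0])))  # low estimate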

  11. Progress and challenges in bioinformatics approaches for enhancer identification

    PubMed Central

    Kleftogiannis, Dimitrios; Kalnis, Panos

    2016-01-01

    Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration. PMID:26634919
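
    A minimal version of the sequence-based prediction framework described here is a k-mer bag-of-words classifier. The sketch below trains logistic regression on 4-mer counts using synthetic sequences with a planted motif standing in for real enhancer and background sets; the feature choice, motif, and labels are illustrative assumptions, not any published method.

        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        BASES = np.array(list("ACGT"))

        def random_seq(length=200):
            return "".join(rng.choice(BASES, size=length))

        def planted_seq(length=200, motif="GATAAG"):
            """Background sequence with a short motif inserted at a random position."""
            seq = random_seq(length)
            pos = rng.integers(0, length - len(motif))
            return seq[:pos] + motif + seq[pos + len(motif):]

        # Synthetic "enhancers" (label 1) contain the motif; "background" (label 0) does not.
        sequences = [planted_seq() for _ in range(300)] + [random_seq() for _ in range(300)]
        labels = np.array([1] * 300 + [0] * 300)

        # 4-mer counts as features, then a linear classifier evaluated by cross-validation.
        vectorizer = CountVectorizer(analyzer="char", ngram_range=(4, 4), lowercase=False)
        X = vectorizer.fit_transform(sequences)
        clf = LogisticRegression(max_iter=2000)
        scores = cross_val_score(clf, X, labels, cv=5, scoring="roc_auc")
        print("cross-validated AUC:", scores.mean())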

  12. Fragment informatics and computational fragment-based drug design: an overview and update.

    PubMed

    Sheng, Chunquan; Zhang, Wannian

    2013-05-01

    Fragment-based drug design (FBDD) is a promising approach for the discovery and optimization of lead compounds. Despite its successes, FBDD also faces some internal limitations and challenges. FBDD requires a high quality of target protein and good solubility of fragments. Biophysical techniques for fragment screening necessitate expensive detection equipment and the strategies for evolving fragment hits to leads remain to be improved. Regardless, FBDD is necessary for investigating larger chemical space and can be applied to challenging biological targets. In this scenario, cheminformatics and computational chemistry can be used as alternative approaches that can significantly improve the efficiency and success rate of lead discovery and optimization. Cheminformatics and computational tools assist FBDD in a very flexible manner. Computational FBDD can be used independently or in parallel with experimental FBDD for efficiently generating and optimizing leads. Computational FBDD can also be integrated into each step of experimental FBDD and help to play a synergistic role by maximizing its performance. This review will provide critical analysis of the complementarity between computational and experimental FBDD and highlight recent advances in new algorithms and successful examples of their applications. In particular, fragment-based cheminformatics tools, high-throughput fragment docking, and fragment-based de novo drug design will provide the focus of this review. We will also discuss the advantages and limitations of different methods and the trends in new developments that should inspire future research. © 2012 Wiley Periodicals, Inc.

  13. Computational challenges in modeling gene regulatory events.

    PubMed

    Pataskar, Abhijeet; Tiwari, Vijay K

    2016-10-19

    Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.

  14. Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward

    ERIC Educational Resources Information Center

    Miller, Randolph A.

    2009-01-01

    This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…

  15. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  16. Information Professionals as Intelligent Agents--Or When Is a Knowbot Only a Robot?

    ERIC Educational Resources Information Center

    Hey, Jessie

    With the explosion in information resources being developed by computer scientists, subject specialists, librarians, and commercial companies, the challenge for the information professional is to keep abreast of the most significant developments and to distill the information for a wide range of users. This paper looks at some of the developments…

  17. Using Cloud Computing Services in e-Learning Process: Benefits and Challenges

    ERIC Educational Resources Information Center

    El Mhouti, Abderrahim; Erradi, Mohamed; Nasseh, Azeddine

    2018-01-01

    In recent years, Information and Communication Technologies (ICT) have played a significant role in the field of education, and e-learning has become a very popular trend in education technology. However, with the huge growth in the number of users, data and educational resources generated, e-learning systems have become more and more…

  18. Innovating in the Cloud: Exploring Cloud Computing to Solve IT Challenges

    ERIC Educational Resources Information Center

    Sheard, Reed

    2010-01-01

    When the author was brought on as CIO of Westmont College in October 2008, the president, Board of Trustees, and campus environment made it clear that technology needed a major overhaul to meet the college's growing requirements. Also, these changes needed to happen without significantly increasing the IT budget or staff. Marketing Charts…

  19. Next Steps in Network Time Synchronization For Navy Shipboard Applications

    DTIC Science & Technology

    2008-12-01

    40th Annual Precise Time and Time Interval (PTTI) Meeting. ...dynamic manner than in previous designs. This new paradigm creates significant network time synchronization challenges. The Navy has been...deploying the Network Time Protocol (NTP) in shipboard computing infrastructures to meet the current network time synchronization requirements
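
    For context, the clock offset and network delay that NTP estimates can be queried from Python with the third-party ntplib package, as in the hedged sketch below; the public pool server name is a placeholder, not a shipboard time source.

        import ntplib  # third-party package: pip install ntplib
        from datetime import datetime, timezone

        client = ntplib.NTPClient()
        response = client.request("pool.ntp.org", version=3)

        # offset: estimated difference between the local clock and the server clock (seconds)
        # delay: round-trip network delay to the server (seconds)
        print(f"offset = {response.offset:+.6f} s, delay = {response.delay:.6f} s")
        print("server time:", datetime.fromtimestamp(response.tx_time, tz=timezone.utc))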

  20. The Chinese Input Challenges for Chinese as Second Language Learners in Computer-Mediated Writing: An Exploratory Study

    ERIC Educational Resources Information Center

    Wong, Lung-Hsiang; Chai, Ching-Sing; Gao, Ping

    2011-01-01

    This paper reports an exploratory study on Singapore secondary and primary school students' perceptions and behaviors on using a variety of Chinese input methods for Chinese composition writing. Significant behavioral patterns were uncovered and mapped into a cognitive process, which are potentially critical to the training of students in…

  1. Early Morning Challenge: The Potential Effects of Chronobiology on Taking the Scholastic Aptitude Test.

    ERIC Educational Resources Information Center

    Callan, Roger John

    1995-01-01

    Cites research to support the notion that the time of day in which the SAT is administered has a significant adverse impact on many students taking the test. Suggests that changes in testing procedures (making tests available via computer at any time of the day or year) will serve students. (RS)

  2. Providing Hearing-Impaired Students with Learning Care after Classes through Smart Phones and the GPRS Network

    ERIC Educational Resources Information Center

    Liu, Chen-Chung; Hong, Yi-Ching

    2007-01-01

    Although computers and network technology have been widely utilised to assist students learn, few technical supports have been developed to help hearing-impaired students learn in Taiwan. A significant challenge for teachers is to provide after-class learning care and assistance to hearing-impaired students that sustain their motivation to…

  3. Crosscut report: Exascale Requirements Reviews, March 9–10, 2017 – Tysons Corner, Virginia. An Office of Science review sponsored by: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics, Nuclear Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard; Hack, James; Riley, Katherine

    The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. To achieve these goals in today’s world requires investments in not only the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR’s mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today’s tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.

  4. COMPASS: A computational model to predict changes in MMSE scores 24-months after initial assessment of Alzheimer's disease.

    PubMed

    Zhu, Fan; Panwar, Bharat; Dodge, Hiroko H; Li, Hongdong; Hampstead, Benjamin M; Albin, Roger L; Paulson, Henry L; Guan, Yuanfang

    2016-10-05

    We present COMPASS, a COmputational Model to Predict the development of Alzheimer's diSease Spectrum, to model Alzheimer's disease (AD) progression. This was the best-performing method in a recent crowdsourcing benchmark study, the DREAM Alzheimer's Disease Big Data challenge, which predicted changes in Mini-Mental State Examination (MMSE) scores over 24 months using standardized data. In the present study, we conducted three additional analyses beyond the DREAM challenge question to improve the clinical contribution of our approach, including: (1) adding pre-validated baseline cognitive composite scores of ADNI-MEM and ADNI-EF, (2) identifying subjects with significant declines in MMSE scores, and (3) incorporating SNPs of the top 10 genes connected to APOE identified from a functional-relationship network. For (1), we significantly improved predictive accuracy, especially for the Mild Cognitive Impairment (MCI) group. For (2), we achieved an area under ROC of 0.814 in predicting significant MMSE decline: our model has 100% precision at 5% recall, and 91% accuracy at 10% recall. For (3), the "genetic only" model has a Pearson's correlation of 0.15 for predicting progression in the MCI group. Even though the addition of this limited genetic model to COMPASS did not improve prediction of progression in the MCI group, the predictive ability of SNP information extended beyond the well-known APOE allele.
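
    The kind of operating-point summary quoted above (AUC, precision at a fixed recall) can be computed with scikit-learn as in the sketch below; the simulated scores and labels are placeholders for the model's actual predictions of significant MMSE decline.

        import numpy as np
        from sklearn.metrics import roc_auc_score, precision_recall_curve

        rng = np.random.default_rng(0)
        # Placeholder data: 1 = significant MMSE decline, y_score = predicted risk.
        y_true = rng.binomial(1, 0.2, size=1000)
        y_score = 0.5 * y_true + rng.uniform(size=1000)

        print("AUC:", roc_auc_score(y_true, y_score))

        precision, recall, _ = precision_recall_curve(y_true, y_score)

        def precision_at_recall(target):
            """Best precision achievable at recall >= target."""
            return precision[recall >= target].max()

        for target in (0.05, 0.10):
            print(f"precision at {target:.0%} recall: {precision_at_recall(target):.3f}")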

  5. Building confidence and credibility amid growing model and computing complexity

    NASA Astrophysics Data System (ADS)

    Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.

    2017-12-01

    As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet still difficult to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model, addressed with a similar software philosophy, are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a Python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
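
    The ensemble-based equivalence check that EVE performs can be approximated in spirit by a two-sample statistical test on a climate statistic drawn from a baseline ensemble and a modified-build ensemble, as sketched below. The synthetic global-mean values and the choice of test are illustrative assumptions, not the tool's actual procedure.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)

        # Placeholder annual-mean global temperatures (K) from two 30-member ensembles:
        # a trusted baseline build and a build with new compiler/machine settings.
        baseline = rng.normal(loc=287.50, scale=0.08, size=30)
        candidate = rng.normal(loc=287.52, scale=0.08, size=30)

        statistic, p_value = ks_2samp(baseline, candidate)
        print(f"KS statistic = {statistic:.3f}, p = {p_value:.3f}")

        # Across many such statistics, a very small p-value flags a climate-changing
        # difference; large p-values are consistent with statistical equivalence.
        if p_value < 0.01:
            print("ensembles differ: investigate the new configuration")
        else:
            print("no evidence of a climate-changing difference")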

  6. Real-time multiple human perception with color-depth cameras on a mobile robot.

    PubMed

    Zhang, Hao; Reardon, Christopher; Parker, Lynne E

    2013-10-01

    The ability to perceive humans is an essential requirement for safe and efficient human-robot interaction. In real-world applications, the need for a robot to interact in real time with multiple humans in a dynamic, 3-D environment presents a significant challenge. The recent availability of commercial color-depth cameras allows for the creation of a system that makes use of the depth dimension, thus enabling a robot to observe its environment and perceive in 3-D space. Here we present a system for 3-D multiple human perception in real time from a moving robot equipped with a color-depth camera and a consumer-grade computer. Our approach reduces computation time to achieve real-time performance through a unique combination of new ideas and established techniques. We remove the ground and ceiling planes from the 3-D point cloud input to separate candidate point clusters. We introduce the novel information concept, depth of interest, which we use to identify candidates for detection, and which avoids the computationally expensive scanning-window methods of other approaches. We utilize a cascade of detectors to distinguish humans from objects, in which we make intelligent reuse of intermediary features in successive detectors to improve computation. Because of the high computational cost of some methods, we represent our candidate tracking algorithm with a decision directed acyclic graph, which allows us to use the most computationally intense techniques only where necessary. We detail the successful implementation of our novel approach on a mobile robot and examine its performance in scenarios with real-world challenges, including occlusion, robot motion, nonupright humans, humans leaving and reentering the field of view (i.e., the reidentification challenge), and human-object and human-human interaction. We conclude with the observation that, by incorporating depth information and using modern techniques in new ways, we are able to create an accurate system for real-time 3-D perception of humans by a mobile robot.
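
    The ground-plane removal step described above is commonly implemented with a RANSAC-style plane fit: sample three points, fit a plane, count inliers, keep the best plane, and discard its inliers. The sketch below is a generic illustration of that step on a synthetic point cloud, not the authors' implementation:

        # Sketch: RANSAC-style ground-plane removal from an N x 3 point cloud,
        # one common way to isolate candidate clusters before human detection.
        import numpy as np

        def fit_plane(p1, p2, p3):
            # Plane through three points: unit normal n and offset d (n.x + d = 0).
            n = np.cross(p2 - p1, p3 - p1)
            norm = np.linalg.norm(n)
            if norm < 1e-9:
                return None
            n = n / norm
            return n, -n.dot(p1)

        def remove_plane(points, n_iters=200, dist_thresh=0.05, rng=None):
            if rng is None:
                rng = np.random.default_rng(0)
            best_inliers = np.zeros(len(points), dtype=bool)
            for _ in range(n_iters):
                idx = rng.choice(len(points), size=3, replace=False)
                model = fit_plane(*points[idx])
                if model is None:
                    continue
                n, d = model
                inliers = np.abs(points @ n + d) < dist_thresh
                if inliers.sum() > best_inliers.sum():
                    best_inliers = inliers
            return points[~best_inliers]          # keep everything off the plane

        cloud = np.vstack([
            np.column_stack([np.random.rand(1000, 2) * 5, np.zeros(1000)]),  # "ground"
            np.random.rand(200, 3) * [1, 1, 2] + [2, 2, 0],                  # an object
        ])
        print(remove_plane(cloud).shape)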

  7. Editorial: Cognitive Architectures, Model Comparison and AGI

    NASA Astrophysics Data System (ADS)

    Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter

    2010-12-01

    Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.

  8. Accelerating Large Scale Image Analyses on Parallel, CPU-GPU Equipped Systems

    PubMed Central

    Teodoro, George; Kurc, Tahsin M.; Pan, Tony; Cooper, Lee A.D.; Kong, Jun; Widener, Patrick; Saltz, Joel H.

    2014-01-01

    The past decade has witnessed a major paradigm shift in high performance computing with the introduction of accelerators as general purpose processors. These computing devices make available very high parallel computing power at low cost and power consumption, transforming current high performance platforms into heterogeneous CPU-GPU equipped systems. Although the theoretical performance achieved by these hybrid systems is impressive, taking practical advantage of this computing power remains a very challenging problem. Most applications are still deployed to either GPU or CPU, leaving the other resource under- or un-utilized. In this paper, we propose, implement, and evaluate a performance aware scheduling technique along with optimizations to make efficient collaborative use of CPUs and GPUs on a parallel system. In the context of feature computations in large scale image analysis applications, our evaluations show that intelligently co-scheduling CPUs and GPUs can significantly improve performance over GPU-only or multi-core CPU-only approaches. PMID:25419545
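
    The co-scheduling idea can be illustrated with a simple greedy policy: estimate each task's CPU time and GPU speedup, then place each task on whichever device would finish it earliest given the work already queued there. This is a simplified, hypothetical illustration of performance-aware scheduling, not the scheduler evaluated in the paper:

        # Sketch: performance-aware greedy co-scheduling of feature-computation
        # tasks across one CPU pool and one GPU. Each task carries an estimated
        # CPU time and a GPU speedup; the task goes wherever it finishes first.
        from dataclasses import dataclass

        @dataclass
        class Task:
            name: str
            cpu_time: float     # estimated seconds on the CPU pool
            gpu_speedup: float  # estimated speedup if run on the GPU

        def co_schedule(tasks):
            cpu_clock, gpu_clock = 0.0, 0.0
            plan = []
            # Schedule the most GPU-friendly tasks first.
            for t in sorted(tasks, key=lambda t: t.gpu_speedup, reverse=True):
                cpu_finish = cpu_clock + t.cpu_time
                gpu_finish = gpu_clock + t.cpu_time / t.gpu_speedup
                if gpu_finish <= cpu_finish:
                    gpu_clock = gpu_finish
                    plan.append((t.name, "GPU"))
                else:
                    cpu_clock = cpu_finish
                    plan.append((t.name, "CPU"))
            return plan, max(cpu_clock, gpu_clock)

        tasks = [Task("color-texture", 4.0, 8.0), Task("morphometry", 6.0, 1.5),
                 Task("gradient-stats", 3.0, 5.0), Task("co-occurrence", 5.0, 2.0)]
        plan, makespan = co_schedule(tasks)
        print(plan, f"makespan = {makespan:.1f}s")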

  9. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features affect paradigm shifts in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
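
    The spurious-correlation point is easy to demonstrate numerically: with a fixed, modest sample size, the largest absolute sample correlation between a response and a growing set of completely independent features keeps increasing, so purely noise-driven "signals" start to look strong. A short illustrative sketch (simulated data):

        # Sketch: spurious correlation in high dimensions. With n = 60 samples
        # and p independent Gaussian features, the largest absolute sample
        # correlation with an independent response grows with p, even though
        # every true correlation is zero.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 60
        y = rng.normal(size=n)
        yc = (y - y.mean()) / y.std()
        for p in (10, 1_000, 100_000):
            X = rng.normal(size=(n, p))
            Xc = (X - X.mean(axis=0)) / X.std(axis=0)
            corr = Xc.T @ yc / n                     # Pearson correlation per column
            print(f"p = {p:>7}: max |corr| = {np.abs(corr).max():.2f}")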

  10. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features affect paradigm shifts in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469

  11. Supporting research sites in resource-limited settings: Challenges in implementing IT infrastructure

    PubMed Central

    Whalen, Christopher; Donnell, Deborah; Tartakovsky, Michael

    2014-01-01

    As Information and Communication Technology infrastructure becomes more reliable, new methods of Electronic Data Capture (EDC), data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on EDC and internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings (RLS). We describe examples of the practical and ethical/regulatory challenges raised by use of these newer technologies for data collection in multisite clinical studies. PMID:24321986

  12. Supporting research sites in resource-limited settings: challenges in implementing information technology infrastructure.

    PubMed

    Whalen, Christopher J; Donnell, Deborah; Tartakovsky, Michael

    2014-01-01

    As information and communication technology infrastructure becomes more reliable, new methods of electronic data capture, data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of Internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on electronic data capture and Internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings. We describe examples of the practical and ethical/regulatory challenges raised by the use of these newer technologies for data collection in multisite clinical studies.

  13. Leaderboard Now Open: CPTAC’s DREAM Proteogenomics Computational Challenge | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the opening of the leaderboard to its Proteogenomics Computational DREAM Challenge. The leaderboard remains open for submissions from September 25, 2017, through October 8, 2017, with the Challenge expected to run until November 17, 2017.

  14. 2005 White Paper on Institutional Capability Computing Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, B; McCoy, M; Seager, M

    This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management. The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network. Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.

  15. Computational Glycobiology: Mechanistic Studies of Carbohydrate-Active Enzymes and Implication for Inhibitor Design.

    PubMed

    Montgomery, Andrew P; Xiao, Kela; Wang, Xingyong; Skropeta, Danielle; Yu, Haibo

    2017-01-01

    Carbohydrate-active enzymes (CAZymes) are families of essential and structurally related enzymes, which catalyze the creation, modification, and degradation of glycosidic bonds in carbohydrates across essentially all kingdoms of life. CAZymes play a key role in many biological processes underpinning human health and diseases (e.g., cancer, diabetes, Alzheimer's disease, AIDS) and have thus emerged as important drug targets in the fight against pathogenesis. The realization of the full potential of CAZymes remains a significant challenge, relying on a deeper understanding of the molecular mechanisms of catalysis. Given the numerous unsettled questions in the literature and the large amount of structural, kinetic, and mutagenesis data available for CAZymes, there is a pressing need and an abundant opportunity for collaborative computational and experimental investigations aimed at unlocking the secrets of CAZyme catalysis at an atomic level. In this review, we briefly survey key methodology developments in computational studies of CAZyme catalysis. This is complemented by selected case studies highlighting mechanistic insights provided by computational glycobiology. Implications for inhibitor design by mimicking the transition state are also illustrated for both glycoside hydrolases and glycosyltransferases. The challenges for such studies will be noted and finally an outlook for future directions will be provided. © 2017 Elsevier Inc. All rights reserved.

  16. Multiscale Mechanics of Articular Cartilage: Potentials and Challenges of Coupling Musculoskeletal, Joint, and Microscale Computational Models

    PubMed Central

    Halloran, J. P.; Sibole, S.; van Donkelaar, C. C.; van Turnhout, M. C.; Oomens, C. W. J.; Weiss, J. A.; Guilak, F.; Erdemir, A.

    2012-01-01

    Articular cartilage experiences significant mechanical loads during daily activities. Healthy cartilage provides the capacity for load bearing and regulates the mechanobiological processes for tissue development, maintenance, and repair. Experimental studies at multiple scales have provided a fundamental understanding of macroscopic mechanical function, evaluation of the micromechanical environment of chondrocytes, and the foundations for mechanobiological response. In addition, computational models of cartilage have offered a concise description of experimental data at many spatial levels under healthy and diseased conditions, and have served to generate hypotheses for the mechanical and biological function. Further, modeling and simulation provides a platform for predictive risk assessment, management of dysfunction, as well as a means to relate multiple spatial scales. Simulation-based investigation of cartilage comes with many challenges including both the computational burden and often insufficient availability of data for model development and validation. This review outlines recent modeling and simulation approaches to understand cartilage function from a mechanical systems perspective, and illustrates pathways to associate mechanics with biological function. Computational representations at single scales are provided from the body down to the microstructure, along with attempts to explore multiscale mechanisms of load sharing that dictate the mechanical environment of the cartilage and chondrocytes. PMID:22648577

  17. Electro-textile garments for power and data distribution

    NASA Astrophysics Data System (ADS)

    Slade, Jeremiah R.; Winterhalter, Carole

    2015-05-01

    U.S. troops are increasingly being equipped with various electronic assets including flexible displays, computers, and communications systems. While these systems can significantly enhance operational capabilities, forming reliable connections between them poses a number of challenges in terms of comfort, weight, ergonomics, and operational security. IST has addressed these challenges by developing the technologies needed to integrate large-scale cross-seam electrical functionality into virtually any textile product, including the various garments and vests that comprise the warfighter's ensemble. Using this technology IST is able to develop textile products that do not simply support or accommodate a network but are the network.

  18. A platform for evolving intelligently interactive adversaries.

    PubMed

    Fogel, David B; Hays, Timothy J; Johnson, Douglas R

    2006-07-01

    Entertainment software developers face significant challenges in designing games with broad appeal. One of the challenges concerns creating nonplayer (computer-controlled) characters that can adapt their behavior in light of the current and prospective situation, possibly emulating human behaviors. This adaptation should be inherently novel, unrepeatable, yet within the bounds of realism. Evolutionary algorithms provide a suitable method for generating such behaviors. This paper provides background on the entertainment software industry, and details a prior and current effort to create a platform for evolving nonplayer characters with genetic and behavioral traits within a World War I combat flight simulator.
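
    As a generic illustration of the kind of evolutionary algorithm described above, the sketch below evolves a small vector of behavioral parameters (e.g., aggressiveness, evasion bias, preferred engagement range) against a scalar fitness function. The fitness function here is a hypothetical stand-in; in a game it would be computed from simulated engagements against the player:

        # Sketch: a tiny (mu + lambda) evolutionary algorithm evolving nonplayer-
        # character behavior parameters. Illustrative only, not the paper's system.
        import numpy as np

        rng = np.random.default_rng(7)
        TARGET = np.array([0.7, 0.2, 0.9])           # hypothetical "good" behavior

        def fitness(params):
            return -np.sum((params - TARGET) ** 2)   # higher is better

        def evolve(mu=5, lam=20, dims=3, sigma=0.1, generations=50):
            parents = rng.random((mu, dims))
            for _ in range(generations):
                # Each offspring is a mutated copy of a random parent.
                offspring = parents[rng.integers(0, mu, lam)] + rng.normal(0, sigma, (lam, dims))
                pool = np.vstack([parents, offspring])
                scores = np.array([fitness(p) for p in pool])
                parents = pool[np.argsort(scores)[-mu:]]   # keep the mu best
            return parents[-1]

        best = evolve()
        print("evolved behavior parameters:", np.round(best, 2))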

  19. Parallel Computational Fluid Dynamics: Current Status and Future Requirements

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)

    1994-01-01

    One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. Then we discuss the long term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.

  20. ADDRESSING ENVIRONMENTAL ENGINEERING CHALLENGES WITH COMPUTATIONAL FLUID DYNAMICS

    EPA Science Inventory

    This paper discusses the status and application of Computational Fluid Dynamics (CFD) models to address environmental engineering challenges for more detailed understanding of air pollutant source emissions, atmospheric dispersion and resulting human exposure. CFD simulations ...

  1. Beyond moore computing research challenge workshop report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huey, Mark C.; Aidun, John Bahram

    2013-10-01

    We summarize the presentations and breakout session discussions from the in-house workshop that was held on 11 July 2013 to acquaint a wider group of Sandians with the Beyond Moore Computing research challenge.

  2. Computer Programming in Middle School: How Pairs Respond to Challenges

    ERIC Educational Resources Information Center

    Denner, Jill; Werner, Linda

    2007-01-01

    Many believe that girls lack the confidence and motivation to persist with computers when they face a challenge. In order to increase the number of girls and women in information technology careers, we need a better understanding of how they think about and solve problems while working on the computer. In this article, we describe a qualitative…

  3. Computational challenges in modeling gene regulatory events

    PubMed Central

    Pataskar, Abhijeet; Tiwari, Vijay K.

    2016-01-01

    ABSTRACT Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating “omics” data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology. PMID:27390891

  4. A K-6 Computational Thinking Curriculum Framework: Implications for Teacher Knowledge

    ERIC Educational Resources Information Center

    Angeli, Charoula; Voogt, Joke; Fluck, Andrew; Webb, Mary; Cox, Margaret; Malyn-Smith, Joyce; Zagami, Jason

    2016-01-01

    Adding computer science as a separate school subject to the core K-6 curriculum is a complex issue with educational challenges. The authors herein address two of these challenges: (1) the design of the curriculum based on a generic computational thinking framework, and (2) the knowledge teachers need to teach the curriculum. The first issue is…

  5. Ising Processing Units: Potential and Challenges for Discrete Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffrin, Carleton James; Nagarajan, Harsha; Bent, Russell Whitford

    The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods to a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
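
    As background, an Ising processing unit minimizes an energy function of the form H(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j over spins s_i in {-1, +1}. The sketch below defines such an objective for a random instance and applies a plain simulated-annealing baseline of the sort such hardware is typically benchmarked against; it is illustrative only and not one of the paper's open-source tools:

        # Sketch: a random Ising instance and a simulated-annealing baseline.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 20
        h = rng.normal(size=n)
        J = np.triu(rng.normal(size=(n, n)), k=1)     # upper-triangular couplings

        def energy(s):
            return h @ s + s @ J @ s                  # sum_i h_i s_i + sum_{i<j} J_ij s_i s_j

        def anneal(steps=20_000, t_start=5.0, t_end=0.01):
            s = rng.choice([-1, 1], size=n)
            e = energy(s)
            for k in range(steps):
                t = t_start * (t_end / t_start) ** (k / steps)   # geometric cooling
                i = rng.integers(n)
                s[i] *= -1                                       # propose a spin flip
                e_new = energy(s)
                if e_new <= e or rng.random() < np.exp((e - e_new) / t):
                    e = e_new                                    # accept
                else:
                    s[i] *= -1                                   # reject, flip back
            return s, e

        spins, best_energy = anneal()
        print("best energy found:", round(float(best_energy), 3))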

  6. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  7. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  8. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    PubMed

    Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
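
    The core LSH idea for cosine similarity can be sketched with random-hyperplane ("sign random projection") hashing, the basic building block that multi-probe variants extend: vectors that hash to the same bit signature become candidate pairs, so only a small fraction of all pairs needs an exact similarity check. The sketch below is illustrative and is not the authors' Hadoop implementation:

        # Sketch: random-hyperplane LSH for cosine similarity on synthetic vectors.
        from collections import defaultdict
        import numpy as np

        rng = np.random.default_rng(5)
        n_vectors, dim, n_bits = 10_000, 64, 16
        X = rng.normal(size=(n_vectors, dim))

        planes = rng.normal(size=(dim, n_bits))           # random hyperplanes
        bits = (X @ planes) > 0                           # n_vectors x n_bits booleans
        signatures = np.packbits(bits, axis=1)            # compact per-vector signature

        buckets = defaultdict(list)
        for idx, sig in enumerate(signatures):
            buckets[sig.tobytes()].append(idx)

        # Candidate pairs share a bucket; only these need an exact cosine check.
        def cosine(a, b):
            return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

        n_candidates = sum(len(v) * (len(v) - 1) // 2 for v in buckets.values())
        print(f"{n_candidates} candidate pairs out of "
              f"{n_vectors * (n_vectors - 1) // 2} possible pairs")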

  9. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.

  10. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas) continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling, are discussed.
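
    A widely used continuum formulation of the kind surveyed in these reviews models tumor cell density u(x, t) with a reaction-diffusion (Fisher-KPP) equation, du/dt = D d2u/dx2 + rho*u*(1 - u). The sketch below integrates the 1-D version with an explicit finite-difference scheme; the parameter values are illustrative, not fitted to patient data:

        # Sketch: 1-D reaction-diffusion (Fisher-KPP) model of tumor cell density,
        #   du/dt = D d2u/dx2 + rho * u * (1 - u),
        # a standard continuum approach of the type reviewed above.
        import numpy as np

        D, rho = 0.01, 0.5          # diffusion (cm^2/day), proliferation (1/day), illustrative
        L, nx = 10.0, 201           # domain length (cm), grid points
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / D        # explicit-stability-limited time step
        u = np.zeros(nx)
        u[nx // 2] = 1.0            # small initial tumor focus at the center

        for step in range(int(40 / dt)):              # simulate ~40 days
            lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
            u = u + dt * (D * lap + rho * u * (1 - u))
            u[0] = u[-1] = 0.0                        # absorbing boundaries

        print("invaded length (u > 0.1):", round((u > 0.1).sum() * dx, 2), "cm")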

  11. SPECT/CT in imaging foot and ankle pathology-the demise of other coregistration techniques.

    PubMed

    Mohan, Hosahalli K; Gnanasegaran, Gopinath; Vijayanathan, Sanjay; Fogelman, Ignac

    2010-01-01

    Disorders of the ankle and foot are common and, given the complex anatomy and function of the foot, they present a significant clinical challenge. Imaging plays a crucial role in the management of these patients, with multiple imaging options available to the clinician. The American College of Radiology has set appropriateness criteria for the use of the available investigating modalities in the management of foot and ankle pathologies. These are broadly classified into anatomical and functional imaging modalities. Recently, combined single-photon emission computed tomography/computed tomography (SPECT/CT) scanners, which can elegantly combine functional and anatomical images, have been introduced, promising an exciting and important development. This review describes our clinical experience with SPECT/CT and discusses potential applications of these techniques.

  12. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review

    PubMed Central

    Ngoepe, Malebogo N.; Frangi, Alejandro F.; Byrne, James V.; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities. PMID:29670533

  13. Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review.

    PubMed

    Ngoepe, Malebogo N; Frangi, Alejandro F; Byrne, James V; Ventikos, Yiannis

    2018-01-01

    Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities.

  14. Changing the Paradigm: Preparing Students for the Computing Profession in the 21st Century

    NASA Technical Reports Server (NTRS)

    Robbins, Kay A.

    2003-01-01

    The dramatic technological developments of the past decade have led to a tremendous growth in the demand for computer science professionals well-versed in advanced technology and techniques. NASA, traditionally a haven for cutting-edge innovators, is now competing with every industrial and government sector for computer science talent. The computer science program at the University of Texas at San Antonio (UTSA) faces challenges beyond those intrinsically presented by rapid technological change, because a significant number of UTSA students come from low-income families with no Internet or computer access at home. An examination of enrollment statistics for the computer science program at UTSA showed that very few students who entered as freshmen successfully graduated. The upper division courses appeared to be populated by graduate students removing deficiencies and by transfer students. The faculty was also concerned that the students who did graduate from the program did not have the strong technical and programming skills that the CS program had been noted for in the community during the 1980s.

  15. Accurate de novo design of hyperstable constrained peptides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhardwaj, Gaurav; Mulligan, Vikram Khipple; Bahl, Christopher D.

    Covalently-crosslinked peptides present attractive opportunities for developing new therapeutics. Lying between small molecule and protein therapeutics in size, natural crosslinked peptides play critical roles in signaling, virulence and immunity. Engineering novel peptides with precise control over their three-dimensional structures is a significant challenge. Here we describe the development of computational methods for de novo design of conformationally-restricted peptides, and the use of these methods to design hyperstable disulfide-stabilized miniproteins, heterochiral peptides, and N-C cyclic peptides. Experimentally-determined X-ray and NMR structures for 12 of the designs are nearly identical to the computational models. The computational design methods and stable scaffolds provide the basis for a new generation of peptide-based drugs.

  16. Optimal subsystem approach to multi-qubit quantum state discrimination and experimental investigation

    NASA Astrophysics Data System (ADS)

    Xue, ShiChuan; Wu, JunJie; Xu, Ping; Yang, XueJun

    2018-02-01

    Quantum computing offers computational capability superior to classical computing because of its superposition feature. Distinguishing several quantum states from quantum algorithm outputs is often a vital computational task. In most cases, the quantum states tend to be non-orthogonal due to superposition; quantum mechanics has proved that perfect outcomes cannot be achieved by measurement, forcing repeated measurements. Hence, it is important to determine the optimum measuring method, which requires fewer repetitions and a lower error rate. However, extending current measurement approaches, which mainly target quantum cryptography, to multi-qubit situations for quantum computing confronts challenges, such as conducting global operations, which have considerable costs in the experimental realm. Therefore, in this study, we have proposed an optimal subsystem method to avoid these difficulties. We provide an analysis comparing the reduced subsystem method and the global minimum-error method for two-qubit problems; the conclusions have been verified experimentally. The results showed that the subsystem method could effectively discriminate non-orthogonal two-qubit states, such as separable states, entangled pure states, and mixed states; the cost of the experimental process was significantly reduced, in most circumstances, with an acceptable error rate. We believe the optimal subsystem method is the most valuable and promising approach for multi-qubit quantum computing applications.
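
    For context, the global minimum-error benchmark referred to above is, for two states, given by the Helstrom bound: the optimal joint measurement errs with probability P_err = (1/2)(1 - ||p1*rho1 - p2*rho2||_1). The sketch below evaluates this bound numerically for two illustrative non-orthogonal two-qubit pure states; it shows the benchmark, not the paper's subsystem method:

        # Sketch: Helstrom minimum-error probability for two equally likely
        # non-orthogonal two-qubit pure states (illustrative states only).
        import numpy as np

        def ket(*amps):
            v = np.array(amps, dtype=complex)
            return v / np.linalg.norm(v)

        # |00> and (|00> + |11>)/sqrt(2)
        psi1 = ket(1, 0, 0, 0)
        psi2 = ket(1, 0, 0, 1)

        rho1 = np.outer(psi1, psi1.conj())
        rho2 = np.outer(psi2, psi2.conj())
        p1 = p2 = 0.5

        diff = p1 * rho1 - p2 * rho2
        trace_norm = np.abs(np.linalg.eigvalsh(diff)).sum()   # sum of |eigenvalues|
        p_err = 0.5 * (1 - trace_norm)
        print(f"Helstrom minimum error probability: {p_err:.4f}")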

  17. Big questions, big science: meeting the challenges of global ecology

    Treesearch

    David Schimel; Michael Keller

    2015-01-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's lab or a group of investigators' labs, sustained for longer...

  18. High Performance Biocomputation

    DTIC Science & Technology

    2005-03-01

    in some other fields (e.g. computational hydrodynamics, lattice quantum chromodynamics, etc.) but appears wholly inappropriate here as pointed out...restrict the overall conformational space by putting the system on a lattice. These have been used to great effect to study folding kinetics. These...many important problems to be worked on, not a single unique challenge (contrast this to QCD, for example). " almost all problems require significant

  19. From an Executive Network to Executive Control: A Computational Model of the "n"-Back Task

    ERIC Educational Resources Information Center

    Chatham, Christopher H.; Herd, Seth A.; Brant, Angela M.; Hazy, Thomas E.; Miyake, Akira; O'Reilly, Randy; Friedman, Naomi P.

    2011-01-01

    A paradigmatic test of executive control, the n-back task, is known to recruit a widely distributed parietal, frontal, and striatal "executive network," and is thought to require an equally wide array of executive functions. The mapping of functions onto substrates in such a complex task presents a significant challenge to any theoretical…

  20. Role of Laboratory Plasma Experiments in exploring the Physics of Solar Eruptions

    NASA Astrophysics Data System (ADS)

    Tripathi, S.

    2017-12-01

    Solar eruptive events are triggered over a broad range of spatio-temporal scales by a variety of fundamental processes (e.g., force-imbalance, magnetic-reconnection, electrical-current driven instabilities) associated with arched magnetoplasma structures in the solar atmosphere. Contemporary research on solar eruptive events is at the forefront of solar and heliospheric physics due to its relevance to space weather. Details on the formation of magnetized plasma structures on the Sun, storage of magnetic energy in such structures over a long period (several Alfven transit times), and their impulsive eruptions have been recorded in numerous observations and simulated in computer models. Inherent limitations of space observations and the uncontrolled nature of solar eruptions pose significant challenges in testing theoretical models and developing the predictive capability for space weather. The pace of scientific progress in this area can be significantly boosted by tapping the potential of appropriately scaled laboratory plasma experiments to complement solar observations, theoretical models, and computer simulations. To give an example, recent results from a laboratory plasma experiment on arched magnetic flux ropes will be presented and future challenges will be discussed. (Work supported by National Science Foundation, USA under award number 1619551)

  1. Synthetic Vision Displays for Planetary and Lunar Lander Vehicles

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Williams, Steven P.; Shelton, Kevin J.; Kramer, Lynda J.; Bailey, Randall E.; Norman, Robert M.

    2008-01-01

    Aviation research has demonstrated that Synthetic Vision (SV) technology can substantially enhance situation awareness, reduce pilot workload, improve aviation safety, and promote flight path control precision. SV, and related flight deck technologies are currently being extended for application in planetary exploration vehicles. SV, in particular, holds significant potential for many planetary missions since the SV presentation provides a computer-generated view for the flight crew of the terrain and other significant environmental characteristics independent of the outside visibility conditions, window locations, or vehicle attributes. SV allows unconstrained control of the computer-generated scene lighting, terrain coloring, and virtual camera angles which may provide invaluable visual cues to pilots/astronauts, not available from other vision technologies. In addition, important vehicle state information may be conformally displayed on the view such as forward and down velocities, altitude, and fuel remaining to enhance trajectory control and vehicle system status. The paper accompanies a conference demonstration that introduced a prototype NASA Synthetic Vision system for lunar lander spacecraft. The paper will describe technical challenges and potential solutions to SV applications for the lunar landing mission, including the requirements for high-resolution lunar terrain maps, accurate positioning and orientation, and lunar cockpit display concepts to support projected mission challenges.

  2. Computer-assisted diagnostic decision support: history, challenges, and possible paths forward.

    PubMed

    Miller, Randolph A

    2009-09-01

    This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References indicate the original sources of many of these ideas.

  3. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption

    PubMed Central

    2015-01-01

    Background The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden in conducting genome-wide association studies (GWAS). However, data privacy has been a widespread concern when sharing the sensitive information in a cloud environment. Methods We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. Results The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides significant performance gain, but without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Conclusions Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics. PMID:26733391

  4. FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption.

    PubMed

    Zhang, Yuchen; Dai, Wenrui; Jiang, Xiaoqian; Xiong, Hongkai; Wang, Shuang

    2015-01-01

    The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden in conducting genome-wide association studies (GWAS). However, data privacy has been a widespread concern when sharing the sensitive information in a cloud environment. We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides significant performance gain, but without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Unlike many existing HME-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics.
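
    The plaintext statistic that FORESEE evaluates under encryption is the standard chi-square test of association between case/control status and allele counts. The sketch below computes it for a single SNP with a hypothetical 2 x 2 allelic contingency table (no encryption involved):

        # Sketch: chi-square association statistic for one SNP, using a
        # hypothetical 2 x 2 case/control allele-count table.
        import numpy as np
        from scipy.stats import chi2

        # rows = case/control, columns = allele A / allele a (hypothetical counts)
        table = np.array([[620, 380],
                          [540, 460]], dtype=float)

        row = table.sum(axis=1, keepdims=True)
        col = table.sum(axis=0, keepdims=True)
        expected = row @ col / table.sum()

        chi_sq = ((table - expected) ** 2 / expected).sum()
        p_value = chi2.sf(chi_sq, df=1)          # (2-1)*(2-1) = 1 degree of freedom
        print(f"chi-square = {chi_sq:.2f}, p = {p_value:.2e}")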

  5. Exploiting the Proteome to Improve the Genome-Wide Genetic Analysis of Epistasis in Common Human Diseases

    PubMed Central

    Pattin, Kristine A.; Moore, Jason H.

    2009-01-01

    One of the central goals of human genetics is the identification of loci with alleles or genotypes that confer increased susceptibility. The availability of dense maps of single-nucleotide polymorphisms (SNPs) along with high-throughput genotyping technologies has set the stage for routine genome-wide association studies that are expected to significantly improve our ability to identify susceptibility loci. Before this promise can be realized, there are some significant challenges that need to be addressed. We address here the challenge of detecting epistasis or gene-gene interactions in genome-wide association studies. Discovering epistatic interactions in high dimensional datasets remains a challenge due to the computational complexity resulting from the analysis of all possible combinations of SNPs. One potential way to overcome the computational burden of a genome-wide epistasis analysis would be to devise a logical way to prioritize the many SNPs in a dataset so that the data may be analyzed more efficiently and yet still retain important biological information. One of the strongest demonstrations of the functional relationship between genes is protein-protein interaction. Thus, it is plausible that the expert knowledge extracted from protein interaction databases may allow for a more efficient analysis of genome-wide studies as well as facilitate the biological interpretation of the data. In this review we will discuss the challenges of detecting epistasis in genome-wide genetic studies and the means by which we propose to apply expert knowledge extracted from protein interaction databases to facilitate this process. We explore some of the fundamentals of protein interactions and the databases that are publicly available. PMID:18551320
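
    The computational burden of exhaustive epistasis scans mentioned above is easy to quantify: the number of candidate interaction models grows combinatorially with the number of SNPs. A two-line illustration for a hypothetical genome-wide panel of 500,000 SNPs:

        # Sketch: pairwise and three-way SNP combination counts for a
        # hypothetical 500,000-SNP genome-wide panel.
        from math import comb

        n_snps = 500_000
        print(f"pairwise models : {comb(n_snps, 2):,}")      # ~1.25e11
        print(f"three-way models: {comb(n_snps, 3):.3e}")    # ~2.1e16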

  6. Continuing challenges for computer-based neuropsychological tests.

    PubMed

    Letz, Richard

    2003-08-01

    A number of issues critical to the development of computer-based neuropsychological testing systems that remain continuing challenges to their widespread use in occupational and environmental health are reviewed. Several computer-based neuropsychological testing systems have been developed over the last 20 years, and they have contributed substantially to the study of neurologic effects of a number of environmental exposures. However, many are no longer supported and do not run on contemporary personal computer operating systems. Issues that are continuing challenges for development of computer-based neuropsychological tests in environmental and occupational health are discussed: (1) some current technological trends that generally make test development more difficult; (2) lack of availability of usable speech recognition of the type required for computer-based testing systems; (3) implementing computer-based procedures and tasks that are improvements over, not just adaptations of, their manually-administered predecessors; (4) implementing tests of a wider range of memory functions than the limited range now available; (5) paying more attention to motivational influences that affect the reliability and validity of computer-based measurements; and (6) increasing the usability of and audience for computer-based systems. Partial solutions to some of these challenges are offered. The challenges posed by current technological trends are substantial and generally beyond the control of testing system developers. Widespread acceptance of the "tablet PC" and implementation of accurate small vocabulary, discrete, speaker-independent speech recognition would enable revolutionary improvements to computer-based testing systems, particularly for testing memory functions not covered in existing systems. Dynamic, adaptive procedures, particularly ones based on item-response theory (IRT) and computerized-adaptive testing (CAT) methods, will be implemented in new tests that will be more efficient, reliable, and valid than existing test procedures. These additional developments, along with implementation of innovative reporting formats, are necessary for more widespread acceptance of the testing systems.

  7. The assembly, collapse and restoration of food webs

    USGS Publications Warehouse

    Dobson, Andy; Allesina, Stefano; Lafferty, Kevin; Pascual, Mercedes

    2009-01-01

    Darwin chose the metaphor of a 'tangled bank' to conclude the 'Origin of species'. Two centuries after Darwin's birth, we are still untangling the complex ecological networks he pondered. In particular, studies of food webs provide important insights into how natural ecosystems function (Pascual & Dunne 2005). Although the nonlinear interactions between many species create challenges of scale, data resolution and significant computational constraints, the last 10 years have seen significant advances built on the earlier classic studies of Cohen, May, Pimm, Polis, Lawton and Yodzis (May 1974; Cohen 1978; Pimm 1982; Briand & Cohen 1984, 1987; Yodzis 1989; Cohen et al. 1990; Pimm et al. 1991; Yodzis & Innes 1992; Yodzis 1998). These gains stem from advances in computing power and the collation of more comprehensive data from a broader array of empirical food webs.

  8. Benchmarking neuromorphic vision: lessons learnt from computer vision

    PubMed Central

    Tan, Cheston; Lallee, Stephane; Orchard, Garrick

    2015-01-01

    Neuromorphic Vision sensors have improved greatly since the first silicon retina was presented almost three decades ago. They have recently matured to the point where they are commercially available and can be operated by laymen. However, despite improved availability of sensors, there remains a lack of good datasets, while algorithms for processing spike-based visual data are still in their infancy. On the other hand, frame-based computer vision algorithms are far more mature, thanks in part to widely accepted datasets which allow direct comparison between algorithms and encourage competition. We are presented with a unique opportunity to shape the development of Neuromorphic Vision benchmarks and challenges by leveraging what has been learnt from the use of datasets in frame-based computer vision. Taking advantage of this opportunity, in this paper we review the role that benchmarks and challenges have played in the advancement of frame-based computer vision, and suggest guidelines for the creation of Neuromorphic Vision benchmarks and challenges. We also discuss the unique challenges faced when benchmarking Neuromorphic Vision algorithms, particularly when attempting to provide direct comparison with frame-based computer vision. PMID:26528120

  9. Business aspects of cardiovascular computed tomography: tackling the challenges.

    PubMed

    Bateman, Timothy M

    2008-01-01

    The purpose of this article is to provide a comprehensive understanding of the business issues surrounding provision of dedicated cardiovascular computed tomographic imaging. Some of the challenges include high up-front costs, current low utilization relative to scanner capability, and inadequate payments. Cardiovascular computed tomographic imaging is a valuable clinical modality that should be offered by cardiovascular centers-of-excellence. With careful consideration of the business aspects, moderate-to-large size cardiology programs should be able to implement an economically viable cardiovascular computed tomographic service.

  10. Tackling some of the most intricate geophysical challenges via high-performance computing

    NASA Astrophysics Data System (ADS)

    Khosronejad, A.

    2016-12-01

    Recently, the world has been witnessing significant enhancements in the computing power of supercomputers. Computer clusters, in conjunction with advanced mathematical algorithms, have set the stage for developing and applying powerful numerical tools to tackle some of the most intricate geophysical challenges that today's engineers face. One such challenge is to understand how turbulent flows, in real-world settings, interact with (a) rigid and/or mobile complex bed bathymetry of waterways and sea-beds in the coastal areas; (b) objects with complex geometry that are fully or partially immersed; and (c) the free surface of waterways and water surface waves in the coastal area. This understanding is especially important because the turbulent flows in real-world environments are often bounded by geometrically complex boundaries, which dynamically deform and give rise to multi-scale and multi-physics transport phenomena, and are characterized by multi-lateral interactions among various phases (e.g. air/water/sediment phases). Herein, I present some of the multi-scale and multi-physics geophysical fluid mechanics processes that I have attempted to study using an in-house high-performance computational model, the so-called VFS-Geophysics. More specifically, I will present the simulation results of turbulence/sediment/solute/turbine interactions in real-world settings. Parts of the simulations I present are performed to gain scientific insights into processes such as sand wave formation (A. Khosronejad and F. Sotiropoulos (2014), Numerical simulation of sand waves in a turbulent open channel flow, Journal of Fluid Mechanics, 753:150-216), while others are carried out to predict the effects of climate change and large flood events on societal infrastructures (A. Khosronejad et al. (2016), Large eddy simulation of turbulence and solute transport in a forested headwater stream, Journal of Geophysical Research, doi: 10.1002/2014JF003423).

  11. Recycling potential of neodymium: the case of computer hard disk drives.

    PubMed

    Sprecher, Benjamin; Kleijn, Rene; Kramer, Gert Jan

    2014-08-19

    Neodymium, one of the more critically scarce rare earth metals, is often used in sustainable technologies. In this study, we investigate the potential contribution of neodymium recycling to reducing scarcity in supply, with a case study on computer hard disk drives (HDDs). We first review the literature on neodymium production and recycling potential. From this review, we find that recycling of computer HDDs is currently the most feasible pathway toward large-scale recycling of neodymium, even though HDDs do not represent the largest application of neodymium. We then use a combination of dynamic modeling and empirical experiments to conclude that within the application of NdFeB magnets for HDDs, the potential for loop-closing is significant: up to 57% in 2017. However, compared to the total NdFeB production capacity, the recovery potential from HDDs is relatively small (in the 1-3% range). The distributed nature of neodymium poses a significant challenge for recycling of neodymium.

  12. Nanoinformatics knowledge infrastructures: bringing efficient information management to nanomedical research.

    PubMed

    de la Iglesia, D; Cachau, R E; García-Remesal, M; Maojo, V

    2013-11-27

    Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts.

  13. Nanoinformatics knowledge infrastructures: bringing efficient information management to nanomedical research

    NASA Astrophysics Data System (ADS)

    de la Iglesia, D.; Cachau, R. E.; García-Remesal, M.; Maojo, V.

    2013-01-01

    Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts.

  14. Gyrokinetic particle-in-cell optimization on emerging multi- and manycore platforms

    DOE PAGES

    Madduri, Kamesh; Im, Eun-Jin; Ibrahim, Khaled Z.; ...

    2011-03-02

    The next decade of high-performance computing (HPC) systems will see a rapid evolution and divergence of multi- and manycore architectures as power and cooling constraints limit increases in microprocessor clock speeds. Understanding efficient optimization methodologies on diverse multicore designs in the context of demanding numerical methods is one of the greatest challenges faced today by the HPC community. In this paper, we examine the efficient multicore optimization of GTC, a petascale gyrokinetic toroidal fusion code for studying plasma microturbulence in tokamak devices. For GTC’s key computational components (charge deposition and particle push), we explore efficient parallelization strategies across a broad range of emerging multicore designs, including the recently-released Intel Nehalem-EX, the AMD Opteron Istanbul, and the highly multithreaded Sun UltraSparc T2+. We also present the first study on tuning gyrokinetic particle-in-cell (PIC) algorithms for graphics processors, using the NVIDIA C2050 (Fermi). Our work discusses several novel optimization approaches for gyrokinetic PIC, including mixed-precision computation, particle binning and decomposition strategies, grid replication, SIMDized atomic floating-point operations, and effective GPU texture memory utilization. Overall, we achieve significant performance improvements of 1.3–4.7× on these complex PIC kernels, despite the inherent challenges of data dependency and locality. Finally, our work also points to several architectural and programming features that could significantly enhance PIC performance and productivity on next-generation architectures.
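
    For readers unfamiliar with the charge-deposition step discussed above, the following minimal NumPy sketch illustrates the generic cloud-in-cell (linear) deposition and particle-binning ideas on a 1-D periodic grid. It is not GTC's implementation; all sizes and values are illustrative, and np.add.at merely stands in for the atomic scatter-add updates mentioned in the abstract.

```python
import numpy as np

def deposit_charge(positions, charges, n_cells, length):
    """Cloud-in-cell (linear) charge deposition onto a 1-D periodic grid.

    np.add.at performs an unbuffered scatter-add, playing the role of the
    atomic floating-point updates needed when particles share grid points.
    """
    dx = length / n_cells
    x = positions / dx
    left = np.floor(x).astype(int) % n_cells              # cell index to the left
    frac = x - np.floor(x)                                 # fractional position in cell
    rho = np.zeros(n_cells)
    np.add.at(rho, left, charges * (1.0 - frac))           # weight to left grid point
    np.add.at(rho, (left + 1) % n_cells, charges * frac)   # weight to right grid point
    return rho / dx

# Particle binning: sorting particles by cell index improves memory locality
# of the subsequent deposition and push phases.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, size=100_000)
q = np.full(pos.shape, 1e-3)
order = np.argsort((pos * 256).astype(int))                # bin/sort by cell index
rho = deposit_charge(pos[order], q[order], n_cells=256, length=1.0)
print(rho[:5])
```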

  15. Understanding the limits of animal models as predictors of human biology: lessons learned from the sbv IMPROVER Species Translation Challenge

    PubMed Central

    Mathis, Carole; Dulize, Rémi H. J.; Ivanov, Nikolai V.; Alexopoulos, Leonidas; Jeremy Rice, J.; Peitsch, Manuel C.; Stolovitzky, Gustavo; Meyer, Pablo; Hoeng, Julia

    2015-01-01

    Motivation: Inferring how humans respond to external cues such as drugs, chemicals, viruses or hormones is an essential question in biomedicine. Very often, however, this question cannot be addressed because it is not possible to perform experiments in humans. A reasonable alternative consists of generating responses in animal models and ‘translating’ those results to humans. The limitations of such translation, however, are far from clear, and systematic assessments of its actual potential are urgently needed. sbv IMPROVER (systems biology verification for Industrial Methodology for PROcess VErification in Research) was designed as a series of challenges to address translatability between humans and rodents. This collaborative crowd-sourcing initiative invited scientists from around the world to apply their own computational methodologies on a multilayer systems biology dataset composed of phosphoproteomics, transcriptomics and cytokine data derived from normal human and rat bronchial epithelial cells exposed in parallel to 52 different stimuli under identical conditions. Our aim was to understand the limits of species-to-species translatability at different levels of biological organization: signaling, transcriptional and release of secreted factors (such as cytokines). Participating teams submitted 49 different solutions across the sub-challenges, two-thirds of which were statistically significantly better than random. Additionally, similar computational methods were found to range widely in their performance within the same challenge, and no single method emerged as a clear winner across all sub-challenges. Finally, computational methods were able to effectively translate some specific stimuli and biological processes in the lung epithelial system, such as DNA synthesis, cytoskeleton and extracellular matrix, translation, immune/inflammation and growth factor/proliferation pathways, better than the expected response similarity between species. Contact: pmeyerr@us.ibm.com or Julia.Hoeng@pmi.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25236459

  16. Understanding the limits of animal models as predictors of human biology: lessons learned from the sbv IMPROVER Species Translation Challenge.

    PubMed

    Rhrissorrakrai, Kahn; Belcastro, Vincenzo; Bilal, Erhan; Norel, Raquel; Poussin, Carine; Mathis, Carole; Dulize, Rémi H J; Ivanov, Nikolai V; Alexopoulos, Leonidas; Rice, J Jeremy; Peitsch, Manuel C; Stolovitzky, Gustavo; Meyer, Pablo; Hoeng, Julia

    2015-02-15

    Inferring how humans respond to external cues such as drugs, chemicals, viruses or hormones is an essential question in biomedicine. Very often, however, this question cannot be addressed because it is not possible to perform experiments in humans. A reasonable alternative consists of generating responses in animal models and 'translating' those results to humans. The limitations of such translation, however, are far from clear, and systematic assessments of its actual potential are urgently needed. sbv IMPROVER (systems biology verification for Industrial Methodology for PROcess VErification in Research) was designed as a series of challenges to address translatability between humans and rodents. This collaborative crowd-sourcing initiative invited scientists from around the world to apply their own computational methodologies on a multilayer systems biology dataset composed of phosphoproteomics, transcriptomics and cytokine data derived from normal human and rat bronchial epithelial cells exposed in parallel to 52 different stimuli under identical conditions. Our aim was to understand the limits of species-to-species translatability at different levels of biological organization: signaling, transcriptional and release of secreted factors (such as cytokines). Participating teams submitted 49 different solutions across the sub-challenges, two-thirds of which were statistically significantly better than random. Additionally, similar computational methods were found to range widely in their performance within the same challenge, and no single method emerged as a clear winner across all sub-challenges. Finally, computational methods were able to effectively translate some specific stimuli and biological processes in the lung epithelial system, such as DNA synthesis, cytoskeleton and extracellular matrix, translation, immune/inflammation and growth factor/proliferation pathways, better than the expected response similarity between species. pmeyerr@us.ibm.com or Julia.Hoeng@pmi.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  17. BCM: toolkit for Bayesian analysis of Computational Models using samplers.

    PubMed

    Thijssen, Bram; Dijkstra, Tjeerd M H; Heskes, Tom; Wessels, Lodewyk F A

    2016-10-21

    Computational models in biology are characterized by a large degree of uncertainty. This uncertainty can be analyzed with Bayesian statistics, however, the sampling algorithms that are frequently used for calculating Bayesian statistical estimates are computationally demanding, and each algorithm has unique advantages and disadvantages. It is typically unclear, before starting an analysis, which algorithm will perform well on a given computational model. We present BCM, a toolkit for the Bayesian analysis of Computational Models using samplers. It provides efficient, multithreaded implementations of eleven algorithms for sampling from posterior probability distributions and for calculating marginal likelihoods. BCM includes tools to simplify the process of model specification and scripts for visualizing the results. The flexible architecture allows it to be used on diverse types of biological computational models. In an example inference task using a model of the cell cycle based on ordinary differential equations, BCM is significantly more efficient than existing software packages, allowing more challenging inference problems to be solved. BCM represents an efficient one-stop-shop for computational modelers wishing to use sampler-based Bayesian statistics.
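
    BCM itself is a compiled toolkit with its own model-specification format; as a generic illustration of sampler-based Bayesian analysis of a computational model, the sketch below runs a plain random-walk Metropolis sampler on a toy exponential-decay model. The model, priors, noise level and step size are all assumptions for this example, not BCM's API.

```python
import numpy as np

def log_posterior(theta, t, y):
    """Log posterior for a toy decay model y = A*exp(-k*t) + noise,
    with broad Gaussian priors on log A and log k (all choices illustrative)."""
    logA, logk = theta
    pred = np.exp(logA) * np.exp(-np.exp(logk) * t)
    log_lik = -0.5 * np.sum((y - pred) ** 2 / 0.05 ** 2)
    log_prior = -0.5 * (logA ** 2 + logk ** 2) / 10.0 ** 2
    return log_lik + log_prior

def metropolis(logp, theta0, n_steps=20_000, step=0.05, seed=1):
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    lp = logp(theta)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal(size=theta.size)   # random-walk proposal
        lp_prop = logp(prop)
        if np.log(rng.uniform()) < lp_prop - lp:             # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

t = np.linspace(0.0, 5.0, 20)
y = 2.0 * np.exp(-0.7 * t) + 0.05 * np.random.default_rng(2).normal(size=t.size)
chain = metropolis(lambda th: log_posterior(th, t, y), theta0=[0.0, 0.0])
print("posterior estimate of (A, k):", np.exp(chain[5000:].mean(axis=0)))
```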

  18. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  19. Functional Circuitry on Commercial Fabric via Textile-Compatible Nanoscale Film Coating Process for Fibertronics.

    PubMed

    Bae, Hagyoul; Jang, Byung Chul; Park, Hongkeun; Jung, Soo-Ho; Lee, Hye Moon; Park, Jun-Young; Jeon, Seung-Bae; Son, Gyeongho; Tcho, Il-Woong; Yu, Kyoungsik; Im, Sung Gap; Choi, Sung-Yool; Choi, Yang-Kyu

    2017-10-11

    Fabric-based electronic textiles (e-textiles) are the fundamental components of wearable electronic systems, which can provide convenient hands-free access to computer and electronics applications. However, e-textile technologies presently face significant technical challenges. These challenges include difficulties of fabrication due to the delicate nature of the materials, and limited operating time, a consequence of the conventional normally-on computing architecture, with volatile power-hungry electronic components and modest battery storage. Here, we report a novel poly(ethylene glycol dimethacrylate) (pEGDMA)-textile memristive nonvolatile logic-in-memory circuit, enabling normally-off computing, that can overcome those challenges. To form the metal electrode and resistive switching layer, strands of cotton yarn were coated with aluminum (Al) using a solution dip coating method, and the pEGDMA was conformally applied using an initiated chemical vapor deposition process. The intersection of two Al/pEGDMA coated yarns becomes a unit memristor in the lattice structure. The pEGDMA-Textile Memristor (ETM), a form of crossbar array, was interwoven using a grid of Al/pEGDMA coated yarns and untreated yarns. The former were employed in the active memristor and the latter suppressed cell-to-cell disturbance. We experimentally demonstrated for the first time that the basic Boolean functions, including a half adder as well as NOT, NOR, OR, AND, and NAND logic gates, are successfully implemented with the ETM crossbar array on a fabric substrate. This research may represent a breakthrough development for practical wearable and smart fibertronics.

  20. New Unintended Adverse Consequences of Electronic Health Records

    PubMed Central

    Wright, A.; Ash, J.; Singh, H.

    2016-01-01

    Summary Although the health information technology industry has made considerable progress in the design, development, implementation, and use of electronic health records (EHRs), the lofty expectations of the early pioneers have not been met. In 2006, the Provider Order Entry Team at Oregon Health & Science University described a set of unintended adverse consequences (UACs), or unpredictable, emergent problems associated with computer-based provider order entry implementation, use, and maintenance. Many of these originally identified UACs have not been completely addressed or alleviated, some have evolved over time, and some new ones have emerged as EHRs became more widely available. The rapid increase in the adoption of EHRs, coupled with the changes in the types and attitudes of clinical users, has led to several new UACs, specifically: complete clinical information unavailable at the point of care; lack of innovations to improve system usability leading to frustrating user experiences; inadvertent disclosure of large amounts of patient-specific information; increased focus on computer-based quality measurement negatively affecting clinical workflows and patient-provider interactions; information overload from marginally useful computer-generated data; and a decline in the development and use of internally-developed EHRs. While each of these new UACs poses significant challenges to EHR developers and users alike, they also offer many opportunities. The challenge for clinical informatics researchers is to continue to refine our current systems while exploring new methods of overcoming these challenges and developing innovations to improve EHR interoperability, usability, security, functionality, clinical quality measurement, and information summarization and display. PMID:27830226

  1. Fluid-Structure Interaction Modeling of the Reefed Stages of the Orion Spacecraft Main Parachutes

    NASA Astrophysics Data System (ADS)

    Boswell, Cody W.

    Spacecraft parachutes are typically used in multiple stages, starting with a "reefed" stage where a cable along the parachute skirt constrains the diameter to be less than the diameter in the subsequent stage. After a certain period of time during the descent, the cable is cut and the parachute "disreefs" (i.e. expands) to the next stage. Computing the parachute shape at the reefed stage and fluid-structure interaction (FSI) modeling during the disreefing involve computational challenges beyond those encountered in FSI modeling of fully-open spacecraft parachutes. These additional challenges are created by the increased geometric complexities and by the rapid changes in the parachute geometry. The computational challenges are further increased because of the added geometric porosity of the latest design, where the "windows" created by the removal of panels and the wider gaps created by the removal of sails compound the geometric and flow complexity. Orion spacecraft main parachutes will have three stages, with computation of the Stage 1 shape and FSI modeling of disreefing from Stage 1 to Stage 2 being the most challenging. We present the special modeling techniques we devised to address the computational challenges and the results from the computations carried out. We also present the methods we devised to calculate, for a parachute gore, the radius of curvature in the circumferential direction. The curvature values are intended for quick and simple engineering analysis in estimating the structural stresses.
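
    The abstract does not spell out the curvature calculation, but a common way to estimate a radius of curvature along a sampled circumferential line is the circumradius of three neighboring points. The sketch below shows that generic formula (R = abc / 4K); the coordinates are invented for illustration and are not taken from the Orion parachute geometry.

```python
import numpy as np

def circumradius(p1, p2, p3):
    """Radius of the circle through three 3-D points: R = |a||b||c| / (4 * area)."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    a = np.linalg.norm(p2 - p3)
    b = np.linalg.norm(p1 - p3)
    c = np.linalg.norm(p1 - p2)
    area = 0.5 * np.linalg.norm(np.cross(p2 - p1, p3 - p1))
    return np.inf if area == 0 else a * b * c / (4.0 * area)

# Three nodes sampled along a circumferential line of a gore (illustrative values):
print(circumradius([10.0, 0.0, 0.0], [9.8, 1.99, 0.1], [9.2, 3.9, 0.2]))
```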

  2. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

    We present a novel system for detection, localization and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: excellent computer-vision-based localization and strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for evaluating systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion of both systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485
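
    As a rough, generic illustration of tracking by identification (not the authors' fusion algorithm), the sketch below assigns anonymous vision detections to identity-carrying radio positions with a globally optimal one-to-one matching; the coordinates and tag names are invented.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Anonymous person detections from the multi-view vision system (world coordinates, metres)
vision_xy = np.array([[1.2, 3.4], [4.0, 0.9], [2.8, 2.2]])
# Coarser but identity-carrying positions from the radio localization system
radio = {"tag_17": [1.0, 3.0], "tag_42": [4.3, 1.2], "tag_08": [2.5, 2.6]}

tags = list(radio)
cost = np.linalg.norm(
    vision_xy[:, None, :] - np.array([radio[t] for t in tags])[None, :, :], axis=2
)
rows, cols = linear_sum_assignment(cost)        # globally optimal one-to-one matching
for r, c in zip(rows, cols):
    print(f"vision detection {r} at {vision_xy[r]} -> identity {tags[c]}")
```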

  3. Computational Analysis and Simulation of Empathic Behaviors: A Survey of Empathy Modeling with Behavioral Signal Processing Framework

    PubMed Central

    Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.

    2017-01-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830

  4. Accelerating Full Configuration Interaction Calculations for Nuclear Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chao; Sternberg, Philip; Maris, Pieter

    2008-04-14

    One of the emerging computational approaches in nuclear physics is the full configuration interaction (FCI) method for solving the many-body nuclear Hamiltonian in a sufficiently large single-particle basis space to obtain exact answers - either directly or by extrapolation. The lowest eigenvalues and corresponding eigenvectors for very large, sparse and unstructured nuclear Hamiltonian matrices are obtained and used to evaluate additional experimental quantities. These matrices pose a significant challenge to the design and implementation of efficient and scalable algorithms for obtaining solutions on massively parallel computer systems. In this paper, we describe the computational strategies employed in a state-of-the-art FCI code MFDn (Many Fermion Dynamics - nuclear) as well as techniques we recently developed to enhance the computational efficiency of MFDn. We will demonstrate the current capability of MFDn and report the latest performance improvement we have achieved. We will also outline our future research directions.
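
    The core numerical kernel described here, extracting the lowest eigenpairs of a very large sparse symmetric matrix, can be illustrated on a toy scale with SciPy's Lanczos-based eigensolver. The sketch below is not MFDn and uses a random symmetric sparse matrix purely as a stand-in for a nuclear Hamiltonian.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000                                   # toy dimension; realistic FCI bases are vastly larger
A = sp.random(n, n, density=1e-3, random_state=0, format="csr")
H = (A + A.T) * 0.5                        # symmetrize to mimic a real symmetric Hamiltonian
H = H - sp.diags(np.full(n, 2.0))          # shift the diagonal so the low end is well separated

# Lowest few eigenpairs via the implicitly restarted Lanczos method (ARPACK)
vals, vecs = eigsh(H, k=5, which="SA")     # "SA" = smallest algebraic eigenvalues
print("lowest eigenvalues:", np.sort(vals))
```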

  5. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs

    PubMed Central

    2018-01-01

    Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users’ queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with “vanilla” LSH, even when using the same amount of space. PMID:29346410
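
    As a minimal sketch of the underlying LSH idea (random-hyperplane signatures with multiple hash tables, then exact re-ranking of the small candidate set), assuming cosine similarity and synthetic data rather than the paper's Hadoop setup and query logs:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
dim, n_items, n_tables, bits = 64, 10_000, 8, 16
items = rng.normal(size=(n_items, dim))

# One table per set of random hyperplanes; the signature is the sign pattern of projections.
planes = rng.normal(size=(n_tables, bits, dim))
tables = [defaultdict(list) for _ in range(n_tables)]
for t in range(n_tables):
    sigs = (items @ planes[t].T > 0).astype(np.uint8)       # (n_items, bits)
    for idx, sig in enumerate(sigs):
        tables[t][sig.tobytes()].append(idx)

def query(x, top=5):
    """Gather candidates from every table, then rank the small candidate set exactly."""
    cand = set()
    for t in range(n_tables):
        sig = (planes[t] @ x > 0).astype(np.uint8)
        cand.update(tables[t].get(sig.tobytes(), []))
    cand = np.fromiter(cand, dtype=int)
    if cand.size == 0:
        return []
    sims = items[cand] @ x / (np.linalg.norm(items[cand], axis=1) * np.linalg.norm(x))
    return cand[np.argsort(-sims)[:top]]

print(query(items[123]))    # the item itself should appear among its own neighbours
```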

  6. Programming and Tuning a Quantum Annealing Device to Solve Real World Problems

    NASA Astrophysics Data System (ADS)

    Perdomo-Ortiz, Alejandro; O'Gorman, Bryan; Fluegemann, Joseph; Smelyanskiy, Vadim

    2015-03-01

    Solving real-world applications with quantum algorithms requires overcoming several challenges, ranging from translating the computational problem at hand into the quantum machine's language, to tuning parameters of the quantum algorithm that have a significant impact on the performance of the device. In this talk, we discuss these challenges, the strategies developed to enhance performance, and more efficient implementations of several applications. Although we focus on applications of interest to NASA's Quantum Artificial Intelligence Laboratory, the methods and concepts presented here apply to a broader family of hard discrete optimization problems, including those that occur in many machine-learning algorithms.
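
    Translating a problem into the machine's language typically means casting it as a QUBO/Ising model. As a hedged, self-contained example (a tiny Max-Cut instance solved by brute force rather than on an annealer), the following shows what such a translation looks like:

```python
import itertools
import numpy as np

# Max-Cut on a 4-node cycle, written as a QUBO: minimize x^T Q x over x in {0,1}^n.
# Each edge (i, j) contributes x_i + x_j - 2*x_i*x_j to the cut, so minimizing the
# negative cut gives Q[i, i] -= 1, Q[j, j] -= 1, Q[i, j] += 2 (upper triangle).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1
    Q[j, j] -= 1
    Q[i, j] += 2

best = min(itertools.product([0, 1], repeat=n), key=lambda x: np.array(x) @ Q @ np.array(x))
print("optimal partition:", best)   # e.g. (0, 1, 0, 1) cuts all four edges
```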

  7. Computer use in primary care and patient-physician communication.

    PubMed

    Sobral, Dilermando; Rosenbaum, Marcy; Figueiredo-Braga, Margarida

    2015-07-08

    This study evaluated how physicians and patients perceive the impact of computer use on clinical communication, and how a patient-centered orientation can influence this impact. The study followed a descriptive cross-sectional design and included 106 family physicians and 392 patients. An original questionnaire assessed computer use, participants' perspectives on its impact, and patient-centered strategies. Physicians reported spending 42% of consultation time in contact with the computer. Physicians reported a negative impact of computer use on patient-physician communication regarding consultation length, confidentiality, maintaining eye contact, active listening, and the ability to understand the patient, while patients reported a positive effect for all the items. Physicians considered that the usual computer placement in their consultation room was significantly unfavorable to patient-physician communication. In summary, physicians perceive the impact of computer use on patient-physician communication as negative, while patients perceive it as positive. Computer-based consultation support can represent a challenge to physicians who recognize its negative impact on patient-centered orientation. Medical education programs aiming to enhance specific communication skills and to better integrate computer use in primary care settings are needed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Computational pan-genomics: status, promises and challenges.

    PubMed

    2018-01-01

    Many disciplines, from human genetics and oncology to plant breeding, microbiology and virology, commonly face the challenge of analyzing rapidly increasing numbers of genomes. In case of Homo sapiens, the number of sequenced genomes will approach hundreds of thousands in the next few years. Simply scaling up established bioinformatics pipelines will not be sufficient for leveraging the full potential of such rich genomic data sets. Instead, novel, qualitatively different computational methods and paradigms are needed. We will witness the rapid extension of computational pan-genomics, a new sub-area of research in computational biology. In this article, we generalize existing definitions and understand a pan-genome as any collection of genomic sequences to be analyzed jointly or to be used as a reference. We examine already available approaches to construct and use pan-genomes, discuss the potential benefits of future technologies and methodologies and review open challenges from the vantage point of the above-mentioned biological disciplines. As a prominent example for a computational paradigm shift, we particularly highlight the transition from the representation of reference genomes as strings to representations as graphs. We outline how this and other challenges from different application domains translate into common computational problems, point out relevant bioinformatics techniques and identify open problems in computer science. With this review, we aim to increase awareness that a joint approach to computational pan-genomics can help address many of the problems currently faced in various domains. © The Author 2016. Published by Oxford University Press.
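
    To make the string-to-graph shift concrete, here is a deliberately tiny sketch of a pan-genome as a sequence graph, with genomes represented as paths over shared nodes; the sequences and structure are invented for illustration and no particular toolkit's data model is implied.

```python
# A pan-genome as a small sequence graph: nodes hold sequence segments, edges give
# adjacencies, and each genome (or the reference) is a path through the nodes.
nodes = {1: "ACGT", 2: "G", 3: "T", 4: "TTCA"}      # node 2 vs 3 is a single-base variant
edges = {1: [2, 3], 2: [4], 3: [4], 4: []}
paths = {
    "reference": [1, 2, 4],        # spells ACGT + G + TTCA
    "sample_A":  [1, 3, 4],        # carries the alternative allele T
}

def spell(path):
    """Reconstruct the linear sequence of one genome from its path."""
    return "".join(nodes[n] for n in path)

for name, p in paths.items():
    print(name, spell(p))
```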

  9. Perspectives on biological growth and remodeling

    PubMed Central

    Ambrosi, D.; Ateshian, G. A.; Arruda, E. M.; Cowin, S. C.; Dumais, J.; Goriely, A.; Holzapfel, G. A.; Humphrey, J. D.; Kemkemer, R.; Kuhl, E.; Olberding, J. E.; Taber, L. A.; Garikipati, K.

    2011-01-01

    The continuum mechanical treatment of biological growth and remodeling has attracted considerable attention over the past fifteen years. Many aspects of these problems are now well-understood, yet there remain areas in need of significant development from the standpoint of experiments, theory, and computation. In this perspective paper we review the state of the field and highlight open questions, challenges, and avenues for further development. PMID:21532929

  10. Health workers’ knowledge of and attitudes towards computer applications in rural African health facilities

    PubMed Central

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E.; Blank, Antje

    2014-01-01

    Background The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. Objective To report an assessment of health providers’ computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. Design A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA were used to describe the associations between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. Results A total of 108 providers responded; 63% were from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge (p<0.01). Most (95.3%) had positive attitudes towards computers – average score (±SD) of 37.2 (±4.9). Females had significantly lower scores than males. Interviews and group discussions showed that although most lacked computer knowledge and experience, they were optimistic about overcoming the challenges associated with the introduction of computers in their workplace. Conclusions Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology. PMID:25361721

  11. Health workers' knowledge of and attitudes towards computer applications in rural African health facilities.

    PubMed

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E; Blank, Antje

    2014-01-01

    The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. To report an assessment of health providers' computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA were used to describe the associations between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. A total of 108 providers responded; 63% were from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge (p<0.01). Most (95.3%) had positive attitudes towards computers - average score (±SD) of 37.2 (±4.9). Females had significantly lower scores than males. Interviews and group discussions showed that although most lacked computer knowledge and experience, they were optimistic about overcoming the challenges associated with the introduction of computers in their workplace. Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology.

  12. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E Zeynep; Cavuşoğlu, M Cenk

    2012-09-01

    Multiscale modeling and integration of physiological models pose challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework together with related analytical and computational tools that will facilitate integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of the physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches for multiscale physiological processes, and thus this paper focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  13. Energy Efficient Digital Logic Using Nanoscale Magnetic Devices

    NASA Astrophysics Data System (ADS)

    Lambson, Brian James

    Increasing demand for information processing in the last 50 years has been largely satisfied by the steadily declining price and improving performance of microelectronic devices. Much of this progress has been made by aggressively scaling the size of semiconductor transistors and metal interconnects that microprocessors are built from. As devices shrink to the size regime in which quantum effects pose significant challenges, new physics may be required in order to continue historical scaling trends. A variety of new devices and physics are currently under investigation throughout the scientific and engineering community to meet these challenges. One of the more drastic proposals on the table is to replace the electronic components of information processors with magnetic components. Magnetic components are already commonplace in computers for their information storage capability. Unlike most electronic devices, magnetic materials can store data in the absence of a power supply. Today's magnetic hard disk drives can routinely hold billions of bits of information and are in widespread commercial use. Their ability to function without a constant power source hints at an intrinsic energy efficiency. The question we investigate in this dissertation is whether or not this advantage can be extended from information storage to the notoriously energy intensive task of information processing. Several proof-of-concept magnetic logic devices were proposed and tested in the past decade. In this dissertation, we build on the prior work by answering fundamental questions about how magnetic devices achieve such high energy efficiency and how they can best function in digital logic applications. The results of this analysis are used to suggest and test improvements to nanomagnetic computing devices. Two of our results are seen as especially important to the field of nanomagnetic computing: (1) we show that it is possible to operate nanomagnetic computers at the fundamental thermodynamic limits of computation and (2) we develop a nanomagnet with a unique shape that is engineered to significantly improve the reliability of nanomagnetic logic.
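
    The fundamental thermodynamic limit referred to here is usually identified with the Landauer bound, k_B T ln 2 per erased bit; the snippet below simply evaluates that number at room temperature.

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # room temperature, K

# Landauer limit: minimum energy dissipated when erasing one bit of information.
E_min = k_B * T * math.log(2)
print(f"{E_min:.3e} J  (~{E_min / 1.602176634e-19 * 1000:.1f} meV) per bit erased")
```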

  14. Tensor methods for parameter estimation and bifurcation analysis of stochastic reaction networks

    PubMed Central

    Liao, Shuohao; Vejchodský, Tomáš; Erban, Radek

    2015-01-01

    Stochastic modelling of gene regulatory networks provides an indispensable tool for understanding how random events at the molecular level influence cellular functions. A common challenge of stochastic models is to calibrate a large number of model parameters against the experimental data. Another difficulty is to study how the behaviour of a stochastic model depends on its parameters, i.e. whether a change in model parameters can lead to a significant qualitative change in model behaviour (bifurcation). In this paper, tensor-structured parametric analysis (TPA) is developed to address these computational challenges. It is based on recently proposed low-parametric tensor-structured representations of classical matrices and vectors. This approach enables simultaneous computation of the model properties for all parameter values within a parameter space. The TPA is illustrated by studying the parameter estimation, robustness, sensitivity and bifurcation structure in stochastic models of biochemical networks. A Matlab implementation of the TPA is available at http://www.stobifan.org. PMID:26063822
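
    As a minimal illustration of why tensor-structured (low-rank) representations make whole-parameter-space computations feasible, the sketch below stores a quantity over a 3-D parameter grid in CP (canonical polyadic) format and compares the storage against the full tensor. The factors are random stand-ins, not the TPA implementation.

```python
import numpy as np

# Three reaction-rate parameters, each discretized on its own 1-D grid.
n1, n2, n3, rank = 50, 60, 70, 4
rng = np.random.default_rng(0)
A = rng.normal(size=(n1, rank))    # factor for parameter k1
B = rng.normal(size=(n2, rank))    # factor for parameter k2
C = rng.normal(size=(n3, rank))    # factor for parameter k3

# A model property stored for *all* parameter combinations at once, but assembled
# from the low-rank factors only when needed:
full = np.einsum("ir,jr,kr->ijk", A, B, C)

print("entries in the full parameter-space tensor:", full.size)                # 210000
print("numbers stored in the low-rank factors:   ", A.size + B.size + C.size)  # 720
```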

  15. Faster PET reconstruction with a stochastic primal-dual hybrid gradient method

    NASA Astrophysics Data System (ADS)

    Ehrhardt, Matthias J.; Markiewicz, Pawel; Chambolle, Antonin; Richtárik, Peter; Schott, Jonathan; Schönlieb, Carola-Bibiane

    2017-08-01

    Image reconstruction in positron emission tomography (PET) is computationally challenging due to Poisson noise, constraints and potentially non-smooth priors, let alone the sheer size of the problem. An algorithm that can cope well with the first three of the aforementioned challenges is the primal-dual hybrid gradient algorithm (PDHG) studied by Chambolle and Pock in 2011. However, PDHG updates all variables in parallel and is therefore computationally demanding on the large problem sizes encountered with modern PET scanners, where the number of dual variables easily exceeds 100 million. In this work, we numerically study the usage of SPDHG, a stochastic extension of PDHG that is still guaranteed to converge to a solution of the deterministic optimization problem with rates similar to PDHG. Numerical results on a clinical data set show that by introducing randomization into PDHG, results similar to those of the deterministic algorithm can be achieved using only around 10% of the operator evaluations, thus making significant progress towards the feasibility of sophisticated mathematical models in a clinical setting.
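
    For orientation, the deterministic PDHG iteration that the paper builds on looks as follows for a toy nonnegative least-squares problem (not the PET model); the comment marks the dual update that SPDHG replaces with an update of a randomly chosen block of dual variables. Step sizes follow the usual sigma*tau*||A||^2 <= 1 rule.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 100
A = rng.normal(size=(m, n))
x_true = np.maximum(rng.normal(size=n), 0.0)
b = A @ x_true + 0.01 * rng.normal(size=m)

# min_{x >= 0} 0.5*||Ax - b||^2, split as g(x) + f(Ax) with
# g = indicator of the nonnegative orthant, f(z) = 0.5*||z - b||^2.
L = np.linalg.norm(A, 2)           # operator norm; PDHG needs sigma*tau*L^2 <= 1
sigma = tau = 0.95 / L

x = np.zeros(n)
x_bar = x.copy()
y = np.zeros(m)
for _ in range(500):
    # Dual update: prox of sigma*f*, where f*(y) = <y, b> + 0.5*||y||^2.
    y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
    # Primal update: projection onto the nonnegative orthant.
    x_new = np.maximum(x - tau * (A.T @ y), 0.0)
    # Extrapolation step. (A stochastic variant would update only a random block
    # of the dual variables y per iteration and adjust this extrapolation.)
    x_bar = 2.0 * x_new - x
    x = x_new

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```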

  16. Tensor methods for parameter estimation and bifurcation analysis of stochastic reaction networks.

    PubMed

    Liao, Shuohao; Vejchodský, Tomáš; Erban, Radek

    2015-07-06

    Stochastic modelling of gene regulatory networks provides an indispensable tool for understanding how random events at the molecular level influence cellular functions. A common challenge of stochastic models is to calibrate a large number of model parameters against the experimental data. Another difficulty is to study how the behaviour of a stochastic model depends on its parameters, i.e. whether a change in model parameters can lead to a significant qualitative change in model behaviour (bifurcation). In this paper, tensor-structured parametric analysis (TPA) is developed to address these computational challenges. It is based on recently proposed low-parametric tensor-structured representations of classical matrices and vectors. This approach enables simultaneous computation of the model properties for all parameter values within a parameter space. The TPA is illustrated by studying the parameter estimation, robustness, sensitivity and bifurcation structure in stochastic models of biochemical networks. A Matlab implementation of the TPA is available at http://www.stobifan.org.

  17. Emissive flat panel displays: A challenge to the AMLCD

    NASA Astrophysics Data System (ADS)

    Walko, R. J.

    According to some sources, flat panel displays (FPD's) for computers will represent a 20-40 billion dollar industry by the end of the decade and could leverage up to 100-200 billion dollars in computer sales. Control of the flat panel display industry could be a significant factor in the global economy if FPD's manage to tap into the enormous audio/visual consumer market. Japan presently leads the world in active matrix liquid crystal display (AMLCD) manufacturing, the current leading FPD technology. The AMLCD is basically a light shutter which does not emit light on its own, but modulates the intensity of a separate backlight. However, other technologies, based on light emitting phosphors, could eventually challenge the AMLCD's lead position. These light-emissive technologies do not have the size, temperature and viewing angle limitations of AMLCD's. In addition, they could also be less expensive to manufacture, and require a smaller capital outlay for a manufacturing plant. An overview of these alternative technologies is presented.

  18. BigData and computing challenges in high energy and nuclear physics

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.

    2017-06-01

    In this contribution we discuss various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve in the future when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new supercomputing facilities, cloud computing and volunteer computing in the future is a big challenge, which we are successfully mastering with considerable contributions from many supercomputing centres around the world as well as academic and commercial cloud providers. We also discuss R&D computing projects recently started at the National Research Center "Kurchatov Institute".

  19. Towards Effective Non-Invasive Brain-Computer Interfaces Dedicated to Gait Rehabilitation Systems

    PubMed Central

    Castermans, Thierry; Duvinage, Matthieu; Cheron, Guy; Dutoit, Thierry

    2014-01-01

    In the last few years, significant progress has been made in the field of walk rehabilitation. Motor cortex signals in bipedal monkeys have been interpreted to predict walk kinematics. Epidural electrical stimulation in rats and in one young paraplegic has been realized to partially restore motor control after spinal cord injury. However, these experimental trials are far from being applicable to all patients suffering from motor impairments. Therefore, it is thought that more simple rehabilitation systems are desirable in the meanwhile. The goal of this review is to describe and summarize the progress made in the development of non-invasive brain-computer interfaces dedicated to motor rehabilitation systems. In the first part, the main principles of human locomotion control are presented. The paper then focuses on the mechanisms of supra-spinal centers active during gait, including results from electroencephalography, functional brain imaging technologies [near-infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI), positron-emission tomography (PET), single-photon emission-computed tomography (SPECT)] and invasive studies. The first brain-computer interface (BCI) applications to gait rehabilitation are then presented, with a discussion about the different strategies developed in the field. The challenges to raise for future systems are identified and discussed. Finally, we present some proposals to address these challenges, in order to contribute to the improvement of BCI for gait rehabilitation. PMID:24961699

  20. The research of computer multimedia assistant in college English listening

    NASA Astrophysics Data System (ADS)

    Zhang, Qian

    2012-04-01

    With the development of network information technology, education faces more and more serious challenges. The application of computer multimedia breaks with traditional foreign language teaching and brings new challenges and opportunities to education. Through multimedia, the teaching process can incorporate animation, images, voice, and text, which improves learners' initiative and motivation and greatly increases learning efficiency. Traditional foreign language teaching relies mainly on written text; with this method, theoretical performance is good but practical application is weak. Despite the long history of computer multimedia use in foreign language teaching, many teachers remain prejudiced against it, so the method has not achieved its full effect. For these reasons, this research is significant for improving the quality of foreign language teaching.

  1. Decision support methods for the detection of adverse events in post-marketing data.

    PubMed

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
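
    A typical reporting-frequency measure of the kind these algorithms compute is the proportional reporting ratio (PRR) from a 2x2 contingency table of reports; a minimal sketch with made-up counts:

```python
# Proportional reporting ratio (PRR), one of the reporting-frequency measures used to
# screen spontaneous-report databases for drug-event combinations.
#                     event of interest   all other events
# drug of interest            a                  b
# all other drugs             c                  d
a, b, c, d = 12, 988, 45, 98_955      # illustrative counts only

prr = (a / (a + b)) / (c / (c + d))
print(f"PRR = {prr:.1f}")             # values well above 1 flag a possible signal
```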

  2. General rigid motion correction for computed tomography imaging based on locally linear embedding

    NASA Astrophysics Data System (ADS)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

    Patient motion can degrade the quality of computed tomography images, which are typically acquired in cone-beam geometry. Rigid patient motion is characterized by six geometric parameters and is more challenging to correct in cone-beam geometry than in fan-beam geometry. We extend our previous rigid patient motion correction method based on the principle of locally linear embedding (LLE) from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based All Scale Tomographic Reconstruction Antwerp (ASTRA) toolbox. The major merit of our method is that we need neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that the LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, reducing patient motion artifacts significantly.
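
    The abstract does not detail the algorithm, but the central LLE ingredient, reconstructing a measurement as a constrained linear combination of nearby reference samples and reusing those weights to interpolate the unknown parameters, can be sketched generically as below. The 3-D feature vectors and single rotation angle are invented stand-ins for projection data and the six rigid-motion parameters.

```python
import numpy as np

def lle_weights(x, neighbors, reg=1e-3):
    """Weights w (summing to 1) that best reconstruct x from its neighbors."""
    diff = neighbors - x                       # (k, d)
    G = diff @ diff.T                          # local Gram matrix
    G += reg * np.trace(G) * np.eye(len(neighbors)) / len(neighbors)
    w = np.linalg.solve(G, np.ones(len(neighbors)))
    return w / w.sum()

# Reference "projections" simulated at known motion parameters (here 1-D features
# generated from a smooth function of a single rotation angle, purely illustrative).
angles = np.linspace(-5.0, 5.0, 11)                       # degrees
refs = np.stack([np.array([np.sin(a / 10), np.cos(a / 10), a / 10]) for a in angles])

a_true = 1.7
measured = np.array([np.sin(a_true / 10), np.cos(a_true / 10), a_true / 10])

k = 4
idx = np.argsort(np.linalg.norm(refs - measured, axis=1))[:k]   # nearest references
w = lle_weights(measured, refs[idx])
print("estimated angle:", float(w @ angles[idx]), "true:", a_true)
```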

  3. Mathematical and Computational Challenges in Population Biology and Ecosystems Science

    NASA Technical Reports Server (NTRS)

    Levin, Simon A.; Grenfell, Bryan; Hastings, Alan; Perelson, Alan S.

    1997-01-01

    Mathematical and computational approaches provide powerful tools in the study of problems in population biology and ecosystems science. The subject has a rich history intertwined with the development of statistics and dynamical systems theory, but recent analytical advances, coupled with the enhanced potential of high-speed computation, have opened up new vistas and presented new challenges. Key challenges involve ways to deal with the collective dynamics of heterogeneous ensembles of individuals, and to scale from small spatial regions to large ones. The central issues - understanding how detail at one scale makes its signature felt at other scales, and how to relate phenomena across scales - cut across scientific disciplines and go to the heart of algorithmic development of approaches to high-speed computation. Examples are given from ecology, genetics, epidemiology, and immunology.

  4. A Fine-Grained and Privacy-Preserving Query Scheme for Fog Computing-Enhanced Location-Based Service

    PubMed Central

    Yin, Fan; Tang, Xiaohu

    2017-01-01

    Location-based services (LBS), as one of the most popular location-awareness applications, have been further developed to achieve low latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure low latency, which outperforms existing state-of-the-art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching. PMID:28696395

  5. A Fine-Grained and Privacy-Preserving Query Scheme for Fog Computing-Enhanced Location-Based Service.

    PubMed

    Yang, Xue; Yin, Fan; Tang, Xiaohu

    2017-07-11

    Location-based services (LBS), as one of the most popular location-awareness applications, have been further developed to achieve low latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure low latency, which outperforms existing state-of-the-art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching.

  6. Modeling Cardiac Electrophysiology at the Organ Level in the Peta FLOPS Computing Age

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Lawrence; Bishop, Martin; Hoetzl, Elena

    2010-09-30

    Despite a steep increase in available compute power, in-silico experimentation with highly detailed models of the heart remains challenging due to the high computational cost involved. It is hoped that next-generation high performance computing (HPC) resources will lead to significant reductions in execution times and leverage a new class of in-silico applications. However, performance gains with these new platforms can only be achieved by engaging a much larger number of compute cores, necessitating strongly scalable numerical techniques. So far strong scalability has been demonstrated only for a moderate number of cores, orders of magnitude below the range required to achieve the desired performance boost. In this study, strong scalability of currently used techniques to solve the bidomain equations is investigated. Benchmark results suggest that scalability is limited to 512-4096 cores within the range of relevant problem sizes, even when systems are carefully load-balanced and advanced IO strategies are employed.

  7. Fully nonlocal inelastic scattering computations for spectroscopical transmission electron microscopy methods

    NASA Astrophysics Data System (ADS)

    Rusz, Ján; Lubk, Axel; Spiegelberg, Jakob; Tyutyunnikov, Dmitry

    2017-12-01

    The complex interplay of elastic and inelastic scattering amenable to different levels of approximation constitutes the major challenge for the computation and hence interpretation of TEM-based spectroscopical methods. The two major approaches to calculate inelastic scattering cross sections of fast electrons on crystals—Yoshioka-equations-based forward propagation and the reciprocal wave method—are founded in two conceptually differing schemes—a numerical forward integration of each inelastically scattered wave function, yielding the exit density matrix, and a computation of inelastic scattering matrix elements using elastically scattered initial and final states (double channeling). Here, we compare both approaches and show that the latter is computationally competitive to the former by exploiting analytical integration schemes over multiple excited states. Moreover, we show how to include full nonlocality of the inelastic scattering event, neglected in the forward propagation approaches, at no additional computing costs in the reciprocal wave method. Detailed simulations show in some cases significant errors due to the z -locality approximation and hence pitfalls in the interpretation of spectroscopical TEM results.

  8. Risk in the Clouds?: Security Issues Facing Government Use of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wyld, David C.

    Cloud computing is poised to become one of the most important and fundamental shifts in how computing is consumed and used. Forecasts show that government will play a lead role in adopting cloud computing - for data storage, applications, and processing power, as IT executives seek to maximize their returns on limited procurement budgets in these challenging economic times. After an overview of the cloud computing concept, this article explores the security issues facing public sector use of cloud computing and looks to the risk and benefits of shifting to cloud-based models. It concludes with an analysis of the challenges that lie ahead for government use of cloud resources.

  9. Comparing the Performance of Two Dynamic Load Distribution Methods

    NASA Technical Reports Server (NTRS)

    Kale, L. V.

    1987-01-01

    Parallel processing of symbolic computations on a message-passing multi-processor presents one challenge: to effectively utilize the available processors, the load must be distributed uniformly to all the processors. However, the structure of these computations cannot be predicted in advance, so static scheduling methods are not applicable. In this paper, we compare the performance of two dynamic, distributed load balancing methods with extensive simulation studies. The two schemes are: the Contracting Within a Neighborhood (CWN) scheme proposed by us, and the Gradient Model proposed by Lin and Keller. We conclude that although simpler, the CWN is significantly more effective at distributing the work than the Gradient Model.
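
    Neither the CWN scheme nor the Gradient Model is specified in the abstract, so the sketch below shows only a generic neighbor-diffusion baseline on a ring of processors, to illustrate why dynamic redistribution is needed when loads cannot be predicted statically; the processor count and loads are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
P = 16
load = rng.integers(0, 200, size=P).astype(float)   # unpredictable per-processor work

# Generic diffusion: each step, every processor averages part of its load with its
# two ring neighbours (this is neither CWN nor the Gradient Model, just a baseline).
alpha = 0.25
for step in range(20):
    left = np.roll(load, 1)
    right = np.roll(load, -1)
    load = load + alpha * (left - load) + alpha * (right - load)

print("final imbalance (max/mean):", load.max() / load.mean())
```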

  10. Overcoming challenges integrating patient-generated data into the clinical EHR: lessons from the CONtrolling Disease Using Inexpensive IT--Hypertension in Diabetes (CONDUIT-HID) Project.

    PubMed

    Marquard, Jenna L; Garber, Lawrence; Saver, Barry; Amster, Brian; Kelleher, Michael; Preusse, Peggy

    2013-10-01

    The CONDUIT-HID intervention integrates patients' electronic blood pressure measurements directly into the clinical EHR using Microsoft HealthVault as an intermediary data store. The goal of this paper is to describe generalizable categories of patient and technical challenges encountered in the development and implementation of this inexpensive, commercial off-the-shelf consumer health informatics intervention, examples of challenges within each category, and how the example challenges were resolved prior to conducting an RCT of the intervention. The research team logged all challenges and mediation strategies during the technical development of the intervention, conducted home visits to observe patients using the intervention, and conducted telephone calls with patients to understand challenges they encountered. We then used these data to iteratively refine the intervention. The research team identified a variety of generalizable categories of challenges associated with patients uploading data from their homes, patients uploading data from clinics because they did not have or were not comfortable using home computers, and patients establishing the connection between HealthVault and the clinical EHR. Specific challenges within these categories arose because: (1) the research team had little control over the device and application design, (2) multiple vendors needed to coordinate their actions and design changes, (3) the intervention use cases were not anticipated by the device and application designers, (4) PHI accessed on clinic computers needed to be kept secure, (5) the research team wanted the data in the clinical EHR to be valid and reliable, (6) patients needed the ability to share only the data they wanted, and (7) the development of some EHR functionalities was new to the organization. While these challenges were varied and complex, the research team was able to successfully resolve each one prior to the start of the RCT. By identifying these generalizable categories of challenges, we aim to help others proactively search for and remedy potential challenges associated with their interventions, rather than reactively responding to problems as they arise. We posit that this approach will significantly increase the likelihood that these types of interventions will be successful. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Challenges facing developers of CAD/CAM models that seek to predict human working postures

    NASA Astrophysics Data System (ADS)

    Wiker, Steven F.

    2005-11-01

    This paper outlines the need for development of human posture prediction models for Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) design applications in product, facility and work design. Challenges facing developers of posture prediction algorithms are presented and discussed.

  12. ADDRESSING HUMAN EXPOSURES TO AIR POLLUTANTS AROUND BUILDINGS IN URBAN AREAS WITH COMPUTATIONAL FLUID DYNAMICS MODELS

    EPA Science Inventory

    This paper discusses the status and application of Computational Fluid Dynamics (CFD) models to address challenges for modeling human exposures to air pollutants around urban building microenvironments. There are challenges for more detailed understanding of air pollutant sour...

  13. New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era (2010 JGI/ANL HPC Workshop)

    ScienceCinema

    Notredame, Cedric

    2018-05-02

    Cedric Notredame from the Centre for Genomic Regulation gives a presentation on New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era at the JGI/Argonne HPC Workshop on January 26, 2010.

  14. Naïve and Robust: Class-Conditional Independence in Human Classification Learning

    ERIC Educational Resources Information Center

    Jarecki, Jana B.; Meder, Björn; Nelson, Jonathan D.

    2018-01-01

    Humans excel in categorization. Yet from a computational standpoint, learning a novel probabilistic classification task involves severe computational challenges. The present paper investigates one way to address these challenges: assuming class-conditional independence of features. This feature independence assumption simplifies the inference…
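
    As a minimal sketch of what the class-conditional independence (naive Bayes) assumption buys computationally, the hypothetical example below scores a test item by multiplying per-feature likelihoods instead of estimating the full joint distribution; the priors and feature probabilities are invented for illustration.

    ```python
    # Naive Bayes: P(class | features) is proportional to P(class) * product of P(feature_i | class).
    priors = {"A": 0.5, "B": 0.5}
    likelihoods = {                      # P(feature_i = 1 | class), made-up numbers
        "A": [0.9, 0.2, 0.7],
        "B": [0.3, 0.8, 0.4],
    }

    def posterior(features):
        scores = {}
        for c in priors:
            score = priors[c]
            for p, x in zip(likelihoods[c], features):
                score *= p if x == 1 else 1.0 - p
            scores[c] = score
        total = sum(scores.values())
        return {c: s / total for c, s in scores.items()}

    print(posterior([1, 0, 1]))          # class "A" dominates for this item
    ```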

  15. Arrhythmic risk biomarkers for the assessment of drug cardiotoxicity: from experiments to computer simulations

    PubMed Central

    Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.

    2010-01-01

    In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918

  16. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
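
    A minimal sketch of the kind of scheme the brief describes, under the assumption that a challenge names a cell in a shared random code matrix and the countersign is the code stored there; the matrix size and code format are illustrative, not taken from the original design.

    ```python
    import secrets
    import string

    ALPHABET = string.ascii_uppercase + string.digits

    def make_code_matrix(rows=8, cols=8, code_len=4):
        """Shared secret: a 2-D matrix of random alphanumeric codes (e.g. stored on a card)."""
        return [[''.join(secrets.choice(ALPHABET) for _ in range(code_len))
                 for _ in range(cols)] for _ in range(rows)]

    matrix = make_code_matrix()

    def challenge(matrix):
        """The verifier picks a random cell as the challenge."""
        return secrets.randbelow(len(matrix)), secrets.randbelow(len(matrix[0]))

    def countersign(matrix, cell):
        """The user answers with the code stored at the challenged cell."""
        row, col = cell
        return matrix[row][col]

    cell = challenge(matrix)
    print("challenge:", cell, "countersign:", countersign(matrix, cell))
    ```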

  17. Computational modeling of electromechanical instabilities in dielectric elastomers (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Park, Harold

    2016-04-01

    Dielectric elastomers are a class of soft, active materials that have recently gained significant interest due to the fact that they can be electrostatically actuated into undergoing extremely large deformations. An ongoing challenge has been the development of robust and accurate computational models for elastomers, particularly those that can capture electromechanical instabilities that limit the performance of elastomers such as creasing, wrinkling, and snap-through. I discuss in this work a recently developed finite element model for elastomers that is dynamic, nonlinear, and fully electromechanically coupled. The model also significantly alleviates the volumetric locking that arises due to the incompressible nature of the elastomers, and incorporates viscoelasticity within a finite deformation framework. Numerical examples are shown that demonstrate the performance of the proposed method in capturing electromechanical instabilities (snap-through, creasing, cratering, wrinkling) that have been observed experimentally.

  18. Learning to assign binary weights to binary descriptor

    NASA Astrophysics Data System (ADS)

    Huang, Zhoudi; Wei, Zhenzhong; Zhang, Guangjun

    2016-10-01

    Constructing robust binary local feature descriptors is receiving increasing interest because their binary nature enables fast processing while requiring significantly less memory than their floating-point competitors. To bridge the performance gap between binary and floating-point descriptors without increasing the cost of computing and matching, optimal binary weights are learned and assigned to the binary descriptor, since each bit may contribute differently to distinctiveness and robustness. Technically, a large-scale regularized optimization method is applied to learn a float weight for each bit of the binary descriptor. Furthermore, a binary approximation of the float weights is obtained using an efficient alternating greedy strategy, which can significantly improve the discriminative power while preserving the fast matching advantage. Extensive experimental results on two challenging datasets (the Brown dataset and the Oxford dataset) demonstrate the effectiveness and efficiency of the proposed method.
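
    The snippet below is a hypothetical illustration of the matching side of such a scheme, not the learning procedure from the paper: binary descriptors are compared with a bit-wise weighted Hamming distance in which made-up binary weights select which bits count.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two 256-bit binary descriptors and one binary weight per bit (all made up here).
    d1 = rng.integers(0, 2, size=256, dtype=np.uint8)
    d2 = rng.integers(0, 2, size=256, dtype=np.uint8)
    weights = rng.integers(0, 2, size=256, dtype=np.uint8)   # 1 = bit counts, 0 = bit ignored

    def weighted_hamming(a, b, w):
        """Count mismatching bits, but only where the weight selects the bit."""
        return int(np.sum((a ^ b) & w))

    print("plain Hamming distance   :", int(np.sum(d1 ^ d2)))
    print("weighted Hamming distance:", weighted_hamming(d1, d2, weights))
    ```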

  19. Quantum chemical approaches to [NiFe] hydrogenase.

    PubMed

    Vaissier, Valerie; Van Voorhis, Troy

    2017-05-09

    The mechanism by which [NiFe] hydrogenase catalyses the oxidation of molecular hydrogen is a significant yet challenging topic in bioinorganic chemistry. With far-reaching applications in renewable energy and carbon mitigation, significant effort has been invested in the study of these complexes. In particular, computational approaches offer a unique perspective on how this enzyme functions at an electronic and atomistic level. In this article, we discuss state-of-the art quantum chemical methods and how they have helped deepen our comprehension of [NiFe] hydrogenase. We outline the key strategies that can be used to compute the (i) geometry, (ii) electronic structure, (iii) thermodynamics and (iv) kinetic properties associated with the enzymatic activity of [NiFe] hydrogenase and other bioinorganic complexes. © 2017 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  20. Exascale computing and big data

    DOE PAGES

    Reed, Daniel A.; Dongarra, Jack

    2015-06-25

    Scientific discovery and engineering innovation require unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.

  1. Exascale computing and big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, Daniel A.; Dongarra, Jack

    Scientific discovery and engineering innovation require unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.

  2. Intelligent Computational Systems. Opening Remarks: CFD Application Process Workshop

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    1994-01-01

    This discussion will include a short review of the challenges that must be overcome if computational physics technology is to have a larger impact on the design cycles of U.S. aerospace companies. Some of the potential solutions to these challenges may come from the information sciences fields. A few examples of potential computational physics/information sciences synergy will be presented, as motivation and inspiration for the Improving The CFD Applications Process Workshop.

  3. Has computational creativity successfully made it "Beyond the Fence" in musical theatre?

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2017-10-01

    A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.

  4. High Tech: A Place in Our Lives and in Our Schools.

    ERIC Educational Resources Information Center

    Roach, John V.

    1986-01-01

    Discusses various aspects of high technology: computers in cars, computer-assisted design and manufacturing, computers in telephones, video recorders, laser technology, home computers, job training, computer education, and the challenge to the technology teacher. (CT)

  5. Frontiers in Neuromorphics Workshop

    DTIC Science & Technology

    2017-04-14

    Policy: Nanotechnology-Inspired Grand Challenge for Future Computing. Our goal is to bring together scientific disciplines and... Dr. Helen Li – Pittsburgh University. Title: Embrace the BRAIN Century: Challenges in Nanotechnology Enabled Neuromorphic Computing Design

  6. Enhancement of Teaching-Learning Process through Multimedia Technology

    ERIC Educational Resources Information Center

    Charles, R.

    2011-01-01

    The Indian educational system has to meet the challenges of the knowledge explosion and the resulting demand for increased enrolment in higher education. Computers and technology play a predominant role in meeting these challenges. Recent innovative educational approaches recommend self-directed and sensory-oriented instruction. Computer-based multimedia is a tool…

  7. Scratch Your Brain Where It Itches: Math Games, Tricks and Quick Activities, Book C-1.

    ERIC Educational Resources Information Center

    Brumbaugh, Doug

    This resource book contains mathematical games, tricks, and quick activities for the classroom. Categories of activities include computation, manipulative challenges, puzzlers, picky puzzlers, patterns, measurement, money, and riddles. The computation section contains 13 classroom games and activities along with 4 manipulative challenges.…

  8. EPA and GSA Webinar: E Scrap Management, Computers for Learning and the Federal Green Challenge

    EPA Pesticide Factsheets

    EPA and the General Services Administration (GSA) are hosting a webinar on May 2, 2018. Topics will include policies and procedures on E Scrap management, a review of the Computers For Learning Program, and benefits of joining the Federal Green Challenge.

  9. A Challenge to Watson

    ERIC Educational Resources Information Center

    Detterman, Douglas K.

    2011-01-01

    Watson's Jeopardy victory raises the question of the similarity of artificial intelligence and human intelligence. Those of us who study human intelligence issue a challenge to the artificial intelligence community. We will construct a unique battery of tests for any computer that would provide an actual IQ score for the computer. This is the same…

  10. A Fifth Grader's Guide to the World

    ERIC Educational Resources Information Center

    Purcell, April D.; Ponomarenko, Alyson L.; Brown, Stephen C.

    2006-01-01

    The challenge for today's elementary teachers is not "whether" but rather "how" to use computers to effectively teach students essential skills and concepts. One exciting way of meeting this challenge is to use Geographic Information Systems (GIS), computer software that captures, manipulates, analyzes, and displays data on specialized layered…

  11. Manyscale Computing for Sensor Processing in Support of Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Schmalz, M.; Chapman, W.; Hayden, E.; Sahni, S.; Ranka, S.

    2014-09-01

    Increasing image and signal data burden associated with sensor data processing in support of space situational awareness implies continuing computational throughput growth beyond the petascale regime. In addition to growing applications data burden and diversity, the breadth, diversity and scalability of high performance computing architectures and their various organizations challenge the development of a single, unifying, practicable model of parallel computation. Therefore, models for scalable parallel processing have exploited architectural and structural idiosyncrasies, yielding potential misapplications when legacy programs are ported among such architectures. In response to this challenge, we have developed a concise, efficient computational paradigm and software called Manyscale Computing to facilitate efficient mapping of annotated application codes to heterogeneous parallel architectures. Our theory, algorithms, software, and experimental results support partitioning and scheduling of application codes for envisioned parallel architectures, in terms of work atoms that are mapped (for example) to threads or thread blocks on computational hardware. Because of the rigor, completeness, conciseness, and layered design of our manyscale approach, application-to-architecture mapping is feasible and scalable for architectures at petascales, exascales, and above. Further, our methodology is simple, relying primarily on a small set of primitive mapping operations and support routines that are readily implemented on modern parallel processors such as graphics processing units (GPUs) and hybrid multi-processors (HMPs). In this paper, we overview the opportunities and challenges of manyscale computing for image and signal processing in support of space situational awareness applications. We discuss applications in terms of a layered hardware architecture (laboratory > supercomputer > rack > processor > component hierarchy). Demonstration applications include performance analysis and results in terms of execution time as well as storage, power, and energy consumption for bus-connected and/or networked architectures. The feasibility of the manyscale paradigm is demonstrated by addressing four principal challenges: (1) architectural/structural diversity, parallelism, and locality, (2) masking of I/O and memory latencies, (3) scalability of design as well as implementation, and (4) efficient representation/expression of parallel applications. Examples will demonstrate how manyscale computing helps solve these challenges efficiently on real-world computing systems.
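
    As a rough, hypothetical sketch of the kind of mapping the abstract describes (not the Manyscale API itself), the code below partitions a list of "work atoms" into chunks and dispatches each chunk to a worker thread.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def process_atom(atom):
        """Stand-in for the per-atom kernel, e.g. filtering one block of pixels."""
        return atom * atom

    def partition(atoms, n_chunks):
        """Split the work atoms into roughly equal contiguous chunks."""
        size = (len(atoms) + n_chunks - 1) // n_chunks
        return [atoms[i:i + size] for i in range(0, len(atoms), size)]

    work_atoms = list(range(1000))                    # placeholder work items
    chunks = partition(work_atoms, n_chunks=8)

    with ThreadPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(lambda chunk: [process_atom(a) for a in chunk], chunks))

    print("processed", sum(len(r) for r in results), "atoms")
    ```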

  12. The future of PanDA in ATLAS distributed computing

    NASA Astrophysics Data System (ADS)

    De, K.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.

    2015-12-01

    Experiments at the Large Hadron Collider (LHC) face unprecedented computing challenges. Heterogeneous resources are distributed worldwide at hundreds of sites, thousands of physicists analyse the data remotely, the volume of processed data is beyond the exabyte scale, while data processing requires more than a few billion hours of computing usage per year. The PanDA (Production and Distributed Analysis) system was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. In the process, the old batch job paradigm of locally managed computing in HEP was discarded in favour of a far more automated, flexible and scalable model. The success of PanDA in ATLAS is leading to widespread adoption and testing by other experiments. PanDA is the first exascale workload management system in HEP, already operating at more than a million computing jobs per day, and processing over an exabyte of data in 2013. There are many new challenges that PanDA will face in the near future, in addition to new challenges of scale, heterogeneity and increasing user base. PanDA will need to handle rapidly changing computing infrastructure, will require factorization of code for easier deployment, will need to incorporate additional information sources including network metrics in decision making, be able to control network circuits, handle dynamically sized workload processing, provide improved visualization, and face many other challenges. In this talk we will focus on the new features, planned or recently implemented, that are relevant to the next decade of distributed computing workload management using PanDA.

  13. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  14. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    Quantum information processing aims to deliver revolutionary computing capability by harnessing the paradigm shift offered by quantum computing to solve classically hard and computationally challenging problems. Some of our computationally challenging problems of interest include: the capability for rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that the implementation of quantum computers is difficult to realize due to poor scalability and the prevalence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.

  15. DREAMTools: a Python package for scoring collaborative challenges

    PubMed Central

    Cokelaer, Thomas; Bansal, Mukesh; Bare, Christopher; Bilal, Erhan; Bot, Brian M.; Chaibub Neto, Elias; Eduati, Federica; de la Fuente, Alberto; Gönen, Mehmet; Hill, Steven M.; Hoff, Bruce; Karr, Jonathan R.; Küffner, Robert; Menden, Michael P.; Meyer, Pablo; Norel, Raquel; Pratap, Abhishek; Prill, Robert J.; Weirauch, Matthew T.; Costello, James C.; Stolovitzky, Gustavo; Saez-Rodriguez, Julio

    2016-01-01

    DREAM challenges are community competitions designed to advance computational methods and address fundamental questions in system biology and translational medicine. Each challenge asks participants to develop and apply computational methods to either predict unobserved outcomes or to identify unknown model parameters given a set of training data. Computational methods are evaluated using an automated scoring metric, scores are posted to a public leaderboard, and methods are published to facilitate community discussions on how to build improved methods. By engaging participants from a wide range of science and engineering backgrounds, DREAM challenges can comparatively evaluate a wide range of statistical, machine learning, and biophysical methods. Here, we describe DREAMTools, a Python package for evaluating DREAM challenge scoring metrics. DREAMTools provides a command line interface that enables researchers to test new methods on past challenges, as well as a framework for scoring new challenges. As of March 2016, DREAMTools includes more than 80% of completed DREAM challenges. DREAMTools complements the data, metadata, and software tools available at the DREAM website http://dreamchallenges.org and on the Synapse platform at https://www.synapse.org. Availability:  DREAMTools is a Python package. Releases and documentation are available at http://pypi.python.org/pypi/dreamtools. The source code is available at http://github.com/dreamtools/dreamtools. PMID:27134723

  16. Computational models of basal-ganglia pathway functions: focus on functional neuroanatomy

    PubMed Central

    Schroll, Henning; Hamker, Fred H.

    2013-01-01

    Over the past 15 years, computational models have had a considerable impact on basal-ganglia research. Most of these models implement multiple distinct basal-ganglia pathways and assume them to fulfill different functions. As there is now a multitude of different models, it has become complex to keep track of their various, sometimes just marginally different assumptions on pathway functions. Moreover, it has become a challenge to oversee to what extent individual assumptions are corroborated or challenged by empirical data. Focusing on computational, but also considering non-computational models, we review influential concepts of pathway functions and show to what extent they are compatible with or contradict each other. Moreover, we outline how empirical evidence favors or challenges specific model assumptions and propose experiments that allow testing assumptions against each other. PMID:24416002

  17. Unsteady Flow Simulation: A Numerical Challenge

    DTIC Science & Technology

    2003-03-01

    ...drive to convergence the numerical unsteady term. The time marching procedure is based on the approximate implicit Newton method for systems of non... computed through analytical derivatives of S. The linear system stemming from equation (3) is solved at each integration step by the same iterative method... significant reduction of memory usage, thanks to the reduced dimensions of the linear system matrix during the implicit marching of the solution.

  18. Towards precision medicine: from quantitative imaging to radiomics

    PubMed Central

    Acharya, U. Rajendra; Hagiwara, Yuki; Sudarshan, Vidya K.; Chan, Wai Yee; Ng, Kwan Hoong

    2018-01-01

    Radiology (imaging) and imaging-guided interventions, which provide multi-parametric morphologic and functional information, are playing an increasingly significant role in precision medicine. Radiologists are trained to understand the imaging phenotypes, transcribe those observations (phenotypes) to correlate with underlying diseases and to characterize the images. However, in order to understand and characterize the molecular phenotype (to obtain genomic information) of solid heterogeneous tumours, the advanced sequencing of those tissues using biopsy is required. Thus, radiologists image the tissues from various views and angles in order to have the complete image phenotypes, thereby acquiring a huge amount of data. Deriving meaningful details from all these radiological data becomes challenging and raises the big data issues. Therefore, interest in the application of radiomics has been growing in recent years as it has the potential to provide significant interpretive and predictive information for decision support. Radiomics is a combination of conventional computer-aided diagnosis, deep learning methods, and human skills, and thus can be used for quantitative characterization of tumour phenotypes. This paper discusses the overview of radiomics workflow, the results of various radiomics-based studies conducted using various radiological images such as computed tomography (CT), magnetic resonance imaging (MRI), and positron-emission tomography (PET), the challenges we are facing, and the potential contribution of radiomics towards precision medicine. PMID:29308604

  19. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Robert; Ang, James; Bergman, Keren

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.

  20. Terrain Hazard Detection and Avoidance During the Descent and Landing Phase of the Altair Mission

    NASA Technical Reports Server (NTRS)

    Strhan, Alan L.; Johnson, Andrew E.

    2010-01-01

    This paper describes some of the environmental challenges associated with landing a crewed or robotic vehicle at any certified location on the lunar surface (i.e. not a mountain peak, permanently dark crater floor or overly steep terrain), with a specific focus on how hazard detection technology may be incorporated to mitigate these challenges. For this discussion, the vehicle of interest is the Altair Lunar Lander, being the vehicle element of the NASA Constellation Program aimed at returning humans to the moon. Lunar environmental challenges for such global lunar access primarily involve terrain and lighting. These would include sizable rocks and slopes, which are more concentrated in highland areas; small craters, which are essentially everywhere independent of terrain type; and for polar regions, low-angle sunlight, which leaves significant terrain in shadow. To address these issues, as well as to provide for precision landing, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was charted by NASA Headquarters, and has since been making significant progress. The ALHAT team considered several sensors for real-time hazard detection, settling on the use of a Flash Lidar mounted to a high-speed gimbal, with computationally intense image processing and elevation interpretation software. The Altair Project has been working with the ALHAT team to understand the capabilities and limitations of their concept, and has incorporated much of the ALHAT hazard detection system into the Altair baseline design. This integration, along with open issues relating to computational performance, the need for system redundancy, and potential pilot interaction, will be explored further in this paper.
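
    The fragment below is only a schematic, hypothetical illustration of elevation-based hazard screening, not the ALHAT algorithms: given a synthetic digital elevation map, it estimates local slope with finite differences and flags cells steeper than a chosen threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic 100 x 100 elevation map (meters) on a 1 m grid: gentle tilt plus noise.
    x = np.linspace(0.0, 99.0, 100)
    dem = 0.05 * x[None, :] + 0.3 * rng.standard_normal((100, 100))

    cell_size = 1.0                                   # meters per grid cell
    dz_dy, dz_dx = np.gradient(dem, cell_size)        # finite-difference slopes
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    hazard_mask = slope_deg > 10.0                    # flag slopes steeper than 10 degrees
    print(f"hazardous cells: {hazard_mask.sum()} of {hazard_mask.size}")
    ```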

  1. Coarse Grained Model for Biological Simulations: Recent Refinements and Validation

    PubMed Central

    Vicatos, Spyridon; Rychkova, Anna; Mukherjee, Shayantani; Warshel, Arieh

    2014-01-01

    Exploring the free energy landscape of proteins and modeling the corresponding functional aspects presents a major challenge for computer simulation approaches. This challenge is due to the complexity of the landscape and the enormous computer time needed for converging simulations. The use of various simplified coarse grained (CG) models offers an effective way of sampling the landscape, but most current models are not expected to give a reliable description of protein stability and functional aspects. The main problem is associated with insufficient focus on the electrostatic features of the model. In this respect our recent CG model offers significant advantage as it has been refined while focusing on its electrostatic free energy. Here we review the current state of our model, describing recent refinement, extensions and validation studies while focusing on demonstrating key applications. These include studies of protein stability, extending the model to include membranes and electrolytes and electrodes as well as studies of voltage activated proteins, protein insertion through the translocon, the action of molecular motors and even the coupling of the stalled ribosome and the translocon. Our examples illustrate the general potential of our approach in overcoming major challenges in studies of structure function correlation in proteins and large macromolecular complexes. PMID:25050439

  2. Efficient computation of hashes

    NASA Astrophysics Data System (ADS)

    Lopes, Raul H. C.; Franqueira, Virginia N. L.; Hobson, Peter R.

    2014-06-01

    The sequential computation of hashes, which lies at the core of many distributed storage systems and is found, for example, in grid services, can hinder efficiency in service quality and even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgard engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
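
    The sketch below is a generic binary hash (Merkle) tree built with SHA-256 from the Python standard library, not the Keccak/SHA-3 tree mode prototyped by the authors; it illustrates why tree modes parallelize well, since the hashes on each level are independent and could be computed concurrently.

    ```python
    import hashlib

    def sha256(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(chunks):
        """Hash each chunk (independent, hence parallelizable), then repeatedly
        hash pairs of nodes until a single root digest remains."""
        level = [sha256(c) for c in chunks]
        while len(level) > 1:
            if len(level) % 2 == 1:          # duplicate the last node on odd levels
                level.append(level[-1])
            level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    data = b"some large object split into fixed-size chunks for hashing"
    chunks = [data[i:i + 8] for i in range(0, len(data), 8)]
    print(merkle_root(chunks).hex())
    ```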

  3. Designing Collaborative Learning Environments Mediated by Computer Conferencing: Issues and Challenges in the Asian Socio-Cultural Context.

    ERIC Educational Resources Information Center

    Gunawardena, Charlotte N.

    1998-01-01

    Explores issues related to the design of collaborative-learning environments mediated by computer conferencing from the perspective of challenges faced in the sociocultural context of the Indian sub-continent. Examines the impact of online features on social cohesiveness, group dynamics, interaction, communication anxiety, and participation.…

  4. Challenges in Integrating a Complex Systems Computer Simulation in Class: An Educational Design Research

    ERIC Educational Resources Information Center

    Loke, Swee-Kin; Al-Sallami, Hesham S.; Wright, Daniel F. B.; McDonald, Jenny; Jadhav, Sheetal; Duffull, Stephen B.

    2012-01-01

    Complex systems are typically difficult for students to understand and computer simulations offer a promising way forward. However, integrating such simulations into conventional classes presents numerous challenges. Framed within an educational design research, we studied the use of an in-house built simulation of the coagulation network in four…

  5. Automating a Massive Online Course with Cluster Computing

    ERIC Educational Resources Information Center

    Haas, Timothy C.

    2016-01-01

    Before massive numbers of students can take online courses for college credit, the challenges of providing tutoring support, answers to student-posed questions, and the control of cheating will need to be addressed. These challenges are taken up here by developing an online course delivery system that runs in a cluster computing environment and is…

  6. Investigating the Benefits and Challenges of Using Laptop Computers in Higher Education Classrooms

    ERIC Educational Resources Information Center

    Kay, Robin Holding; Lauricella, Sharon

    2014-01-01

    The purpose of this study was to investigate the benefits and challenges using laptop computers (hereafter referred to as laptops) inside and outside higher education classrooms. Quantitative and qualitative data were collected from 156 university students (54 males, 102 females) enrolled in either education or communication studies. Benefits of…

  7. 78 FR 50404 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... for the exascale challenges charge. Tentative Agenda: Agenda will include discussion of the following: Exascale Challenges Workshop and preliminary list of most critical challenges, and technical approaches to...

  8. The next generation of command post computing

    NASA Astrophysics Data System (ADS)

    Arnold, Ross D.; Lieb, Aaron J.; Samuel, Jason M.; Burger, Mitchell A.

    2015-05-01

    The future of command post computing demands an innovative new solution to address a variety of challenging operational needs. The Command Post of the Future is the Army's primary command and control decision support system, providing situational awareness and collaborative tools for tactical decision making, planning, and execution management from Corps to Company level. However, as the U.S. Army moves towards a lightweight, fully networked battalion, disconnected operations, thin client architecture and mobile computing become increasingly essential. The Command Post of the Future is not designed to support these challenges in the coming decade. Therefore, research into a hybrid blend of technologies is in progress to address these issues. This research focuses on a new command and control system utilizing the rich collaboration framework afforded by Command Post of the Future coupled with a new user interface consisting of a variety of innovative workspace designs. This new system is called Tactical Applications. This paper details a brief history of command post computing, presents the challenges facing the modern Army, and explores the concepts under consideration for Tactical Applications that meet these challenges in a variety of innovative ways.

  9. Challenges in design of Kitaev materials: Magnetic interactions from competing energy scales

    NASA Astrophysics Data System (ADS)

    Winter, Stephen M.; Li, Ying; Jeschke, Harald O.; Valentí, Roser

    2016-06-01

    In this study, we reanalyze the magnetic interactions in the Kitaev spin-liquid candidate materials Na2IrO3, α-RuCl3, and α-Li2IrO3 using nonperturbative exact diagonalization methods. These methods are more appropriate given the relatively itinerant nature of the systems suggested in previous works. We treat all interactions up to third neighbors on equal footing. The computed terms reveal significant long-range coupling, bond anisotropy, and/or off-diagonal couplings which we argue naturally explain the observed ordered phases in these systems. Given these observations, the potential for realizing the spin-liquid state in real materials is analyzed, and synthetic challenges are defined and explained.
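
    As a toy illustration of the exact-diagonalization approach, far smaller than the calculations in the paper and with arbitrary coupling values, the snippet below builds the Hamiltonian of a single Heisenberg-Kitaev bond for two spin-1/2 sites and diagonalizes it with numpy.

    ```python
    import numpy as np

    # Spin-1/2 operators.
    sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
    sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

    J, K = 1.0, -2.0   # Heisenberg and Kitaev couplings (illustrative values only)

    # Two-site Hamiltonian on a z-type bond: H = J * S1.S2 + K * S1^z S2^z
    H = sum(J * np.kron(s, s) for s in (sx, sy, sz)) + K * np.kron(sz, sz)

    energies = np.linalg.eigvalsh(H)
    print("eigenvalues:", np.round(energies, 4))
    ```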

  10. Opportunities and challenges of cloud computing to improve health care services.

    PubMed

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  11. Cloud computing security.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  12. Using CFD Surface Solutions to Shape Sonic Boom Signatures Propagated from Off-Body Pressure

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2013-01-01

    The conceptual design of a low-boom and low-drag supersonic aircraft remains a challenge despite significant progress in recent years. Inverse design using reversed equivalent area and adjoint methods have been demonstrated to be effective in shaping the ground signature propagated from computational fluid dynamics (CFD) off-body pressure distributions. However, there is still a need to reduce the computational cost in the early stages of design to obtain a baseline that is feasible for low-boom shaping, and in the search for a robust low-boom design over the entire sonic boom footprint. The proposed design method addresses the need to reduce the computational cost for robust low-boom design by using surface pressure distributions from CFD solutions to shape sonic boom ground signatures propagated from CFD off-body pressure.

  13. Current algorithmic solutions for peptide-based proteomics data generation and identification.

    PubMed

    Hoopmann, Michael R; Moritz, Robert L

    2013-02-01

    Peptide-based proteomic data sets are ever increasing in size and complexity. These data sets provide computational challenges when attempting to quickly analyze spectra and obtain correct protein identifications. Database search and de novo algorithms must consider high-resolution MS/MS spectra and alternative fragmentation methods. Protein inference is a tricky problem when analyzing large data sets of degenerate peptide identifications. Combining multiple algorithms for improved peptide identification puts significant strain on computational systems when investigating large data sets. This review highlights some of the recent developments in peptide and protein identification algorithms for analyzing shotgun mass spectrometry data when encountering the aforementioned hurdles. Also explored are the roles that analytical pipelines, public spectral libraries, and cloud computing play in the evolution of peptide-based proteomics. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Large eddy simulation applications in gas turbines.

    PubMed

    Menzies, Kevin

    2009-07-28

    The gas turbine presents significant challenges to any computational fluid dynamics techniques. The combination of a wide range of flow phenomena with complex geometry is difficult to model in the context of Reynolds-averaged Navier-Stokes (RANS) solvers. We review the potential for large eddy simulation (LES) in modelling the flow in the different components of the gas turbine during a practical engineering design cycle. We show that while LES has demonstrated considerable promise for reliable prediction of many flows in the engine that are difficult for RANS it is not a panacea and considerable application challenges remain. However, for many flows, especially those dominated by shear layer mixing such as in combustion chambers and exhausts, LES has demonstrated a clear superiority over RANS for moderately complex geometries although at significantly higher cost which will remain an issue in making the calculations relevant within the design cycle.

  15. Existential anxiety and growth: an exploration of computerized drawings and perspectives of children and adolescents with cancer.

    PubMed

    Woodgate, Roberta L; West, Christina H; Tailor, Ketan

    2014-01-01

    Until now, most existentially focused cancer research has been conducted within adult populations. Only a handful of qualitative investigations have captured the experiences of children with cancer relative to themes such as existential fear and finitude, meaning/meaninglessness, uncertainty, authenticity, and inauthenticity. This article aimed to provide a deeper understanding of the existential challenges faced by children living with cancer. An interpretive, descriptive qualitative research approach was used. Thirteen children (8-17 years) undergoing treatment for cancer participated. Children participated in individual open-ended interviews and also had the opportunity to journal their experiences in a computerized drawing tool. The 4 main themes that emerged in relation to the existential challenges experienced by children with cancer included (1) existential worry, (2) existential vacuum, (3) existential longing, and (4) existential growth. The drawing tool within the computer diary was found to be particularly beneficial in assisting children to express the existential challenges that they had previously been unable to articulate in words. Children moved between existential anxiety and existential growth within the cancer world. The expressive means of drawing pictures gave children a therapeutic space to explore and work at understanding the existential challenges experienced. This research provides evidence that the active engagement of children's imaginations through the use of a computer-drawing tool may have significant therapeutic value for children with cancer. As well, the findings support the importance of nurses "being there" for young patients with cancer in their time of despair.

  16. Heterogeneous scalable framework for multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Karla Vanessa

    2013-09-01

    Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for the current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computer platforms. This work created an application based on a software architecture where the physics and software concerns are separated in a way that adds flexibility to both. The developed spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other applications. The API allows object-oriented Fortran applications direct interaction with Trilinos to support memory management of distributed objects in central processing unit (CPU) and graphics processing unit (GPU) nodes for applications using C++.

  17. Nanoinformatics knowledge infrastructures: bringing efficient information management to nanomedical research

    PubMed Central

    de la Iglesia, D; Cachau, R E; García-Remesal, M; Maojo, V

    2014-01-01

    Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts. PMID:24932210

  18. Integrating numerical computation into the undergraduate education physics curriculum using spreadsheet excel

    NASA Astrophysics Data System (ADS)

    Fauzi, Ahmad

    2017-11-01

    Numerical computation has many pedagogical advantages: it develops analytical skills and problem-solving skills, helps to learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges. The main challenges are a dense curriculum that makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to examine how to integrate numerical computation into the undergraduate physics education curriculum. The participants of this research were 54 students in the fourth semester of the physics education department. As a result, we concluded that numerical computation could be integrated into the undergraduate physics education curriculum using the Excel spreadsheet combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.
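
    Below is a hedged sketch of the kind of numerical computation the paper has students carry out in a spreadsheet, transcribed into Python for concreteness: stepping a falling object with simple Euler updates, exactly as one would fill successive spreadsheet rows; the parameters are illustrative.

    ```python
    # Euler integration of free fall with linear drag, row by row as in a spreadsheet.
    g = 9.81      # gravitational acceleration (m/s^2)
    k = 0.1       # drag coefficient per unit mass (1/s), illustrative value
    dt = 0.1      # time step (s)

    t, v, y = 0.0, 0.0, 100.0            # start at rest, 100 m above the ground
    while y > 0.0:
        a = -g - k * v                   # acceleration from gravity and drag
        v += a * dt                      # next row: v_new = v + a*dt
        y += v * dt                      #           y_new = y + v*dt
        t += dt

    print(f"hits the ground after about {t:.1f} s at {abs(v):.1f} m/s")
    ```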

  19. Announcing the Launch of CPTAC’s Proteogenomics DREAM Challenge | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    This week, we are excited to announce the launch of the National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) Proteogenomics Computational DREAM Challenge.  The aim of this Challenge is to encourage the generation of computational methods for extracting information from the cancer proteome and for linking those data to genomic and transcriptomic information.  The specific goals are to predict proteomic and phosphoproteomic data from other multiple data types including transcriptomics and genetics.

  20. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  1. Engineering brain-computer interfaces: past, present and future.

    PubMed

    Hughes, M A

    2014-06-01

    Electricity governs the function of both nervous systems and computers. Whilst ions move in polar fluids to depolarize neuronal membranes, electrons move in the solid-state lattices of microelectronic semiconductors. Joining these two systems together, to create an iono-electric brain-computer interface, is an immense challenge. However, such interfaces offer (and in select clinical contexts have already delivered) a method of overcoming disability caused by neurological or musculoskeletal pathology. To fulfill their theoretical promise, several specific challenges demand consideration. Rate-limiting steps cover a diverse range of disciplines including microelectronics, neuro-informatics, engineering, and materials science. As those who work at the tangible interface between brain and outside world, neurosurgeons are well placed to contribute to, and inform, this cutting edge area of translational research. This article explores the historical background, status quo, and future of brain-computer interfaces; and outlines the challenges to progress and opportunities available to the clinical neurosciences community.

  2. Computational Aeroelastic Modeling of Airframes and TurboMachinery: Progress and Challenges

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.; Sayma, A. I.

    2006-01-01

    Computational analyses such as computational fluid dynamics and computational structural dynamics have made major advances toward maturity as engineering tools. Computational aeroelasticity is the integration of these disciplines. As computational aeroelasticity matures it too finds an increasing role in the design and analysis of aerospace vehicles. This paper presents a survey of the current state of computational aeroelasticity with a discussion of recent research, success and continuing challenges in its progressive integration into multidisciplinary aerospace design. This paper approaches computational aeroelasticity from the perspective of the two main areas of application: airframe and turbomachinery design. An overview will be presented of the different prediction methods used for each field of application. Differing levels of nonlinear modeling will be discussed with insight into accuracy versus complexity and computational requirements. Subjects will include current advanced methods (linear and nonlinear), nonlinear flow models, use of order reduction techniques and future trends in incorporating structural nonlinearity. Examples in which computational aeroelasticity is currently being integrated into the design of airframes and turbomachinery will be presented.

  3. Aerothermodynamics of Blunt Body Entry Vehicles. Chapter 3

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Borrelli, Salvatore

    2011-01-01

    In this chapter, the aerothermodynamic phenomena of blunt body entry vehicles are discussed. Four topics will be considered that present challenges to current computational modeling techniques for blunt body environments: turbulent flow, non-equilibrium flow, rarefied flow, and radiation transport. Examples of comparisons between computational tools to ground and flight-test data will be presented in order to illustrate the challenges existing in the numerical modeling of each of these phenomena and to provide test cases for evaluation of Computational Fluid Dynamics (CFD) code predictions.

  4. Aerothermodynamics of blunt body entry vehicles

    NASA Astrophysics Data System (ADS)

    Hollis, Brian R.; Borrelli, Salvatore

    2012-01-01

    In this chapter, the aerothermodynamic phenomena of blunt body entry vehicles are discussed. Four topics will be considered that present challenges to current computational modeling techniques for blunt body environments: turbulent flow, non-equilibrium flow, rarefied flow, and radiation transport. Examples of comparisons between computational tools to ground and flight-test data will be presented in order to illustrate the challenges existing in the numerical modeling of each of these phenomena and to provide test cases for evaluation of computational fluid dynamics (CFD) code predictions.

  5. Implementation of cascade logic gates and majority logic gate on a simple and universal molecular platform.

    PubMed

    Gao, Jinting; Liu, Yaqing; Lin, Xiaodong; Deng, Jiankang; Yin, Jinjin; Wang, Shuo

    2017-10-25

    Wiring a series of simple logic gates to process complex data is important but remains a major challenge for unconventional molecular computing systems. The programmable nature of DNA underpins its powerful application in molecular computing. In our investigation, it was found that DNA exhibits excellent peroxidase-like activity in a colorimetric system of TMB/H2O2/Hemin (TMB, 3,3',5,5'-tetramethylbenzidine) in the presence of K+ and Cu2+, which is significantly inhibited by the addition of an antioxidant. According to the modulated catalytic activity of this DNA-based catalyst, three cascade logic gates, including AND-OR-INH (INHIBIT), AND-INH and OR-INH, were successfully constructed. Interestingly, by only modulating the concentration of Cu2+, a majority logic gate with a single-vote veto function was realized following the same threshold value as that of the cascade logic gates. The strategy is straightforward and versatile and provides an instructive method for constructing multiple logic gates on a simple platform to implement complex molecular computing.
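
    A purely Boolean sketch of the gate logic described above (the chemistry is not modeled, and all function names and inputs below are illustrative) shows what the AND-OR-INH cascade and the majority gate with a single-vote veto compute:

```python
# Minimal sketch (not the paper's chemistry): Boolean models of the cascade
# gates described in the abstract. INH(a, b) is true when a is present and
# the inhibitor b is absent.

def INH(a: bool, b: bool) -> bool:
    return a and not b

def AND_OR_INH(x1: bool, x2: bool, x3: bool, inhibitor: bool) -> bool:
    # (x1 AND x2) OR x3, with the result inhibited by an antioxidant-like input
    return INH((x1 and x2) or x3, inhibitor)

def majority_with_veto(votes: list[bool], veto: bool) -> bool:
    # Majority vote, but a single veto input forces the output low
    return (not veto) and sum(votes) > len(votes) / 2

if __name__ == "__main__":
    print(AND_OR_INH(True, True, False, inhibitor=False))   # True
    print(majority_with_veto([True, True, False], veto=True))  # False: the veto wins
```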

  6. Using computer-assisted learning to engage diverse learning styles in understanding business management principles.

    PubMed

    Frost, Mary E; Derby, Dustin C; Haan, Andrea G

    2013-01-01

    Objective: Changes in small business and insurance present challenges for newly graduated chiropractors. Technology that reaches identified, diverse learning styles may assist the chiropractic student in business classes to meet course outcomes better. Thus, the purpose of our study is to determine if the use of technology-based instructional aids enhances students' mastery of course learning outcomes. Methods: Using convenience sampling, 86 students completed a survey assessing course learning outcomes, learning style, and the helpfulness of lecture and computer-assisted learning related to content mastery. Quantitative analyses were performed. Results: Although respondents reported not finding the computer-assisted learning as helpful as the lecture, significant relationships were found between pre- and post-assisted learning measures of the learning outcomes 1 and 2 for the visual and kinesthetic groups. Surprisingly, however, all learning style groups exhibited significant pre- and post-assisted learning appraisal relationships with learning outcomes 3 and 4. Conclusion: While evidence exists within the current study of a relationship between students' learning of the course content and the use of technologic instructional aids, the exact nature of the relationship remains unclear.

  7. Using computer-assisted learning to engage diverse learning styles in understanding business management principles.

    PubMed

    Frost, Mary E; Derby, Dustin C; Haan, Andrea G

    2013-06-27

    Objective: Changes in small business and insurance present challenges for newly graduated chiropractors. Technology that reaches identified, diverse learning styles may assist the chiropractic student in business classes to meet course outcomes better. Thus, the purpose of our study is to determine if the use of technology-based instructional aids enhances students' mastery of course learning outcomes. Methods: Using convenience sampling, 86 students completed a survey assessing course learning outcomes, learning style, and the helpfulness of lecture and computer-assisted learning related to content mastery. Quantitative analyses were performed. Results: Although respondents reported not finding the computer-assisted learning as helpful as the lecture, significant relationships were found between pre- and post-assisted learning measures of the learning outcomes 1 and 2 for the visual and kinesthetic groups. Surprisingly, however, all learning style groups exhibited significant pre- and post-assisted learning appraisal relationships with learning outcomes 3 and 4. Conclusion: While evidence exists within the current study of a relationship between students' learning of the course content and the use of technologic instructional aids, the exact nature of the relationship remains unclear.

  8. Phase unwrapping with graph cuts optimization and dual decomposition acceleration for 3D high-resolution MRI data.

    PubMed

    Dong, Jianwu; Chen, Feng; Zhou, Dong; Liu, Tian; Yu, Zhaofei; Wang, Yi

    2017-03-01

    Existence of low SNR regions and rapid-phase variations pose challenges to spatial phase unwrapping algorithms. Global optimization-based phase unwrapping methods are widely used, but are significantly slower than greedy methods. In this paper, dual decomposition acceleration is introduced to speed up a three-dimensional graph cut-based phase unwrapping algorithm. The phase unwrapping problem is formulated as a global discrete energy minimization problem, whereas the technique of dual decomposition is used to increase the computational efficiency by splitting the full problem into overlapping subproblems and enforcing the congruence of overlapping variables. Using three dimensional (3D) multiecho gradient echo images from an agarose phantom and five brain hemorrhage patients, we compared this proposed method with an unaccelerated graph cut-based method. Experimental results show up to 18-fold acceleration in computation time. Dual decomposition significantly improves the computational efficiency of 3D graph cut-based phase unwrapping algorithms. Magn Reson Med 77:1353-1358, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
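
    The toy example below is not the 3D graph-cut unwrapper itself; it only illustrates the dual-decomposition pattern named above (split into subproblems, enforce agreement through a dual price), with an invented objective and step size:

```python
# Toy sketch of dual decomposition only: minimize f1(x) + f2(x) with
# f1(x) = (x - 3)^2 and f2(x) = (x + 1)^2 by giving each subproblem its own
# copy of x and using a dual price lam to enforce agreement (x1 == x2),
# mirroring how the accelerated method splits the volume into overlapping
# subproblems and enforces congruence of the shared variables.

def dual_decomposition(step=0.1, iters=200):
    lam = 0.0                    # dual variable attached to the constraint x1 = x2
    for _ in range(iters):
        x1 = 3.0 - lam / 2.0     # argmin_x (x - 3)^2 + lam * x
        x2 = -1.0 + lam / 2.0    # argmin_x (x + 1)^2 - lam * x
        lam += step * (x1 - x2)  # subgradient step on the disagreement
    return x1, x2, lam

x1, x2, lam = dual_decomposition()
print(x1, x2)                    # both copies converge to the consensus minimizer x = 1
```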

  9. Modeling compressible multiphase flows with dispersed particles in both dense and dilute regimes

    NASA Astrophysics Data System (ADS)

    McGrath, T.; St. Clair, J.; Balachandar, S.

    2018-05-01

    Many important explosives and energetics applications involve multiphase formulations employing dispersed particles. While considerable progress has been made toward developing mathematical models and computational methodologies for these flows, significant challenges remain. In this work, we apply a mathematical model for compressible multiphase flows with dispersed particles to existing shock and explosive dispersal problems from the literature. The model is cast in an Eulerian framework, treats all phases as compressible, is hyperbolic, and satisfies the second law of thermodynamics. It directly applies the continuous-phase pressure gradient as a forcing function for particle acceleration and thereby retains relaxed characteristics for the dispersed particle phase that remove the constituent material sound velocity from the eigenvalues. This is consistent with the expected characteristics of dispersed particle phases and can significantly improve the stable time-step size for explicit methods. The model is applied to test cases involving the shock and explosive dispersal of solid particles and compared to data from the literature. Computed results compare well with experimental measurements, providing confidence in the model and computational methods applied.

  10. A Programming Framework for Scientific Applications on CPU-GPU Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, John

    2013-03-24

    At a high level, my research interests center around designing, programming, and evaluating computer systems that use new approaches to solve interesting problems. The rapid change of technology allows a variety of different architectural approaches to computationally difficult problems, and a constantly shifting set of constraints and trends makes the solutions to these problems both challenging and interesting. One of the most important recent trends in computing has been a move to commodity parallel architectures. This sea change is motivated by the industry’s inability to continue to profitably increase performance on a single processor and instead to move to multiple parallel processors. In the period of review, my most significant work has been leading a research group looking at the use of the graphics processing unit (GPU) as a general-purpose processor. GPUs can potentially deliver superior performance to their CPU counterparts on a broad range of problems, but effectively mapping complex applications to a parallel programming model with an emerging programming environment is a significant and important research problem.

  11. The Man computer Interactive Data Access System: 25 Years of Interactive Processing.

    NASA Astrophysics Data System (ADS)

    Lazzara, Matthew A.; Benson, John M.; Fox, Robert J.; Laitsch, Denise J.; Rueden, Joseph P.; Santek, David A.; Wade, Delores M.; Whittaker, Thomas M.; Young, J. T.

    1999-02-01

    12 October 1998 marked the 25th anniversary of the Man computer Interactive Data Access System (McIDAS). On that date in 1973, McIDAS was first used operationally by scientists as a tool for data analysis. Over the last 25 years, McIDAS has undergone numerous architectural changes in an effort to keep pace with changing technology. In its early years, significant technological breakthroughs were required to achieve the functionality needed by atmospheric scientists. Today McIDAS is challenged by new Internet-based approaches to data access and data display. The history and impact of McIDAS, along with some of the lessons learned, are presented here.

  12. Role of High-End Computing in Meeting NASA's Science and Engineering Challenges

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak

    2006-01-01

    High-End Computing (HEC) has always played a major role in meeting the modeling and simulation needs of various NASA missions. With NASA's newest 62 teraflops Columbia supercomputer, HEC is having an even greater impact within the Agency and beyond. Significant cutting-edge science and engineering simulations in the areas of space exploration, Shuttle operations, Earth sciences, and aeronautics research are already occurring on Columbia, demonstrating its ability to accelerate NASA's exploration vision. The talk will describe how the integrated supercomputing production environment is being used to reduce design cycle time, accelerate scientific discovery, conduct parametric analysis of multiple scenarios, and enhance safety during the life cycle of NASA missions.

  13. On the design of computer-based models for integrated environmental science.

    PubMed

    McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick

    2005-06-01

    The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.

  14. Big Data, Deep Learning and Tianhe-2 at Sun Yat-Sen University, Guangzhou

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; Dzwinel, W.; Liu, J.; Zhang, K.

    2014-12-01

    In this decade the big data revolution has permeated many fields, ranging from financial transactions and medical surveys to scientific endeavors, because of the big opportunities people see ahead. What to do with all this data remains an intriguing question. This is where computer scientists, together with applied mathematicians, have made significant inroads in developing deep learning techniques for unraveling new relationships among the different variables by means of correlation analysis and data-assimilation methods. Deep learning and big data taken together pose a grand challenge for high-performance computing, demanding both ultrafast speed and large memory. The Tianhe-2, recently installed at Sun Yat-Sen University in Guangzhou, is well positioned to take up this challenge because it is currently the world's fastest computer at 34 Petaflops. Each compute node of Tianhe-2 has two Intel Xeon E5-2600 CPUs and three Xeon Phi accelerators. The Tianhe-2 has a very large fast RAM of 88 Gigabytes on each node. The system has a total memory of 1,375 Terabytes. All of these technical features will allow very high-dimensional (more than 10) problems in deep learning to be explored carefully on the Tianhe-2. Problems in seismology which can be solved include three-dimensional seismic wave simulations of the whole Earth with a few km resolution and the recognition of new phases in seismic waveforms from assemblages of large data sets.
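
    A quick arithmetic cross-check of the quoted memory figures (assuming the 88 GB per node and the 1,375 TB total refer to the same memory and all nodes are identical) implies on the order of 15,000-16,000 compute nodes:

    $$
    \frac{1{,}375\ \text{TB total}}{88\ \text{GB per node}}
      = \frac{1{,}375{,}000\ \text{GB}}{88\ \text{GB per node}}
      \approx 15{,}625\ \text{nodes}.
    $$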

  15. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results.

  16. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results. PMID:29765315

  17. Designing and validating the joint battlespace infosphere

    NASA Astrophysics Data System (ADS)

    Peterson, Gregory D.; Alexander, W. Perry; Birdwell, J. Douglas

    2001-08-01

    Fielding and managing the dynamic, complex information systems infrastructure necessary for defense operations presents significant opportunities for revolutionary improvements in capabilities. An example of this technology trend is the creation and validation of the Joint Battlespace Infosphere (JBI) being developed by the Air Force Research Lab. The JBI is a system of systems that integrates, aggregates, and distributes information to users at all echelons, from the command center to the battlefield. The JBI is a key enabler of meeting the Air Force's Joint Vision 2010 core competencies such as Information Superiority, by providing increased situational awareness, planning capabilities, and dynamic execution. At the same time, creating this new operational environment introduces significant risk due to an increased dependency on computational and communications infrastructure combined with more sophisticated and frequent threats. Hence, the challenge facing the nation is the most effective means to exploit new computational and communications technologies while mitigating the impact of attacks, faults, and unanticipated usage patterns.

  18. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aderholdt, Ferrol; Caldwell, Blake A.; Hicks, Susan Elaine

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details the current state-of-the-art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  19. Computer-Delivered Social Norm Message Increases Pain Tolerance

    PubMed Central

    Pulvers, Kim; Schroeder, Jacquelyn; Limas, Eleuterio F.; Zhu, Shu-Hong

    2013-01-01

    Background Few experimental studies have been conducted on social determinants of pain tolerance. Purpose This study tests a brief, computer-delivered social norm message for increasing pain tolerance. Methods Healthy young adults (N=260; 44% Caucasian; 27% Hispanic) were randomly assigned into a 2 (social norm) × 2 (challenge) cold pressor study, stratified by gender. They received standard instructions or standard instructions plus a message that contained artificially elevated information about the typical performance of others. Results Those receiving a social norm message displayed significantly higher pain tolerance, F(1, 255) = 26.95, p < .001, ηp² = .10, and pain threshold, F(1, 244) = 9.81, p = .002, ηp² = .04, but comparable pain intensity, p > .05. There were no interactions between condition and gender on any outcome variables, p > .05. Conclusions Social norms can significantly increase pain tolerance, even with a brief verbal message delivered by a video. PMID:24146086

  20. Current Computational Challenges for CMC Processes, Properties, and Structures

    NASA Technical Reports Server (NTRS)

    DiCarlo, James

    2008-01-01

    In comparison to current state-of-the-art metallic alloys, ceramic matrix composites (CMC) offer a variety of performance advantages, such as higher temperature capability (greater than the approx.2100 F capability for best metallic alloys), lower density (approx.30-50% metal density), and lower thermal expansion. In comparison to other competing high-temperature materials, CMC are also capable of providing significantly better static and dynamic toughness than un-reinforced monolithic ceramics and significantly better environmental resistance than carbon-fiber reinforced composites. Because of these advantages, NASA, the Air Force, and other U.S. government agencies and industries are currently seeking to implement these advanced materials into hot-section components of gas turbine engines for both propulsion and power generation. For applications such as these, CMC are expected to result in many important performance benefits, such as reduced component cooling air requirements, simpler component design, reduced weight, improved fuel efficiency, reduced emissions, higher blade frequencies, reduced blade clearances, and higher thrust. Although much progress has been made recently in the development of CMC constituent materials and fabrication processes, major challenges still remain for implementation of these advanced composite materials into viable engine components. The objective of this presentation is to briefly review some of those challenges that are generally related to the need to develop physics-based computational approaches to allow CMC fabricators and designers to model (1) CMC processes for fiber architecture formation and matrix infiltration, (2) CMC properties of high technical interest such as multidirectional creep, thermal conductivity, matrix cracking stress, damage accumulation, and degradation effects in aggressive environments, and (3) CMC component life times when all of these effects are interacting in a complex stress and service environment. To put these computational issues in perspective, the various modeling needs within these three areas are briefly discussed in terms of their technical importance and their key controlling mechanistic factors as we know them today. Emphasis is placed primarily on the SiC/SiC ceramic composite system because of its higher temperature capability and enhanced development within the CMC industry. A brief summary is then presented concerning on-going property studies aimed at addressing these CMC modeling needs within NASA in terms of their computational approaches and recent important results. Finally an overview perspective is presented on those key areas where further CMC computational studies are needed today to enhance the viability of CMC structural components for high-temperature applications.

  1. Isolated gallbladder injury in a case of blunt abdominal trauma.

    PubMed

    Birn, Jeffrey; Jung, Melissa; Dearing, Mark

    2012-04-01

    The diagnosis of blunt injury to the gallbladder may constitute a significant challenge to the diagnostician. There is often a delay in presentation with non-specific clinical symptoms. In the absence of reliable clinical symptoms, diagnostic imaging becomes an invaluable tool in the rapid identification of gallbladder injury. We present a case of isolated gallbladder injury following blunt abdominal trauma which was diagnosed by computed tomography and subsequently confirmed by cholecystectomy.

  2. Cloud Computing in Support of Applied Learning: A Baseline Study of Infrastructure Design at Southern Polytechnic State University

    ERIC Educational Resources Information Center

    Conn, Samuel S.; Reichgelt, Han

    2013-01-01

    Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…

  3. Computing in the Curriculum: Challenges and Strategies from a Teacher's Perspective

    ERIC Educational Resources Information Center

    Sentance, Sue; Csizmadia, Andrew

    2017-01-01

    Computing is being introduced into the curriculum in many countries. Teachers' perspectives enable us to discover what challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where…

  4. Cloud Implementation in Organizations: Critical Success Factors, Challenges, and Impacts on the IT Function

    ERIC Educational Resources Information Center

    Suo, Shuguang

    2013-01-01

    Organizations have been forced to rethink business models and restructure facilities through IT innovation as they have faced the challenges arising from globalization, mergers and acquisitions, big data, and the ever-changing demands of customers. Cloud computing has emerged as a new computing paradigm that has fundamentally shaped the business…

  5. Image Processing Algorithms in the Secondary School Programming Education

    ERIC Educational Resources Information Center

    Gerják, István

    2017-01-01

    Learning computer programming for students of the age of 14-18 is difficult and requires endurance and engagement. Being familiar with the syntax of a computer language and writing programs in it are challenges for youngsters, not to mention that understanding algorithms is also a big challenge. To help students in the learning process, teachers…

  6. Enhancing Competence and Autonomy in Computer-Based Instruction Using a Skill-Challenge Balancing Strategy

    ERIC Educational Resources Information Center

    Kim, Jieun; Ryu, Hokyoung; Katuk, Norliza; Wang, Ruili; Choi, Gyunghyun

    2014-01-01

    The present study aims to show if a skill-challenge balancing (SCB) instruction strategy can assist learners to motivationally engage in computer-based learning. Csikszentmihalyi's flow theory (self-control, curiosity, focus of attention, and intrinsic interest) was applied to an account of the optimal learning experience in SCB-based learning…

  7. Challenges of Teaching Computer Science in Transition Countries: Albanian University Case

    ERIC Educational Resources Information Center

    Sotirofski, Kseanela; Kukeli, Agim; Kalemi, Edlira

    2010-01-01

    The main objective of our study is to determine the challenges faced during the process of teaching Computer Science in a university of a country in transition and make suggestions to improve this teaching process by perfecting the necessary conditions. Our survey builds on the thesis that we live in an information age; information technology is…

  8. Grid Generation for Multidisciplinary Design and Optimization of an Aerospace Vehicle: Issues and Challenges

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    The purpose of this paper is to discuss grid generation issues and to challenge the grid generation community to develop tools suitable for automated multidisciplinary analysis and design optimization of aerospace vehicles. Special attention is given to the grid generation issues of computational fluid dynamics and computational structural mechanics disciplines.

  9. Answering the Challenge of Teletext, Viewdata Systems and Other Fast Growing Communications, Such as Home Computers.

    ERIC Educational Resources Information Center

    Hall, Sandra K.

    Newspapers are facing challenges from the new media of teletext, viewdata systems, and home computers. Teletext, which provides formated pages of text broadcast for viewing on a television screen, provides news immediately, simply, conveniently, and inexpensively. However, it does not provide the browse and scan options of newspapers. Of greater…

  10. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on a SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.

  11. Communication analysis for feedback control of civil infrastructure using cochlea-inspired sensing nodes

    NASA Astrophysics Data System (ADS)

    Peckens, Courtney A.; Cook, Ireana; Lynch, Jerome P.

    2016-04-01

    Wireless sensor networks (WSNs) have emerged as a reliable, low-cost alternative to the traditional wired sensing paradigm. While such networks have made significant progress in the field of structural monitoring, significantly less development has occurred for feedback control applications. Previous work in WSNs for feedback control has highlighted many of the challenges of using this technology including latency in the wireless communication channel and computational inundation at the individual sensing nodes. This work seeks to overcome some of those challenges by drawing inspiration from the real-time sensing and control techniques employed by the biological central nervous system and in particular the mammalian cochlea. A novel bio-inspired wireless sensor node was developed that employs analog filtering techniques to perform time-frequency decomposition of a sensor signal, thus encompassing the functionality of the cochlea. The node then utilizes asynchronous sampling of the filtered signal to compress the signal prior to communication. This bio-inspired sensing architecture is extended to a feedback control application in order to overcome the traditional challenges currently faced by wireless control. In doing this, however, the network experiences high bandwidths of low-significance information exchange between nodes, resulting in some lost data. This study considers the impact of this lost data on the control capabilities of the bio-inspired control architecture and finds that it does not significantly impact the effectiveness of control.
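
    The sketch below is a minimal software analogue of the two ideas described above, a band-pass filter bank for time-frequency decomposition followed by level-crossing ("asynchronous") sampling; the sampling rate, band edges, and threshold are invented and do not describe the actual node hardware:

```python
# Minimal sketch (hypothetical parameters, not the node's analog front end):
# a band-pass filter bank performs time-frequency decomposition, and
# level-crossing sampling keeps only the samples where each band's output
# changes by more than a fixed threshold, compressing the signal.
import numpy as np
from scipy.signal import butter, sosfilt

fs = 1000.0                                   # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)

bands = [(1, 5), (5, 15), (15, 40)]           # Hz, assumed filter bank edges
delta = 0.2                                   # level-crossing threshold

events = []                                   # (band index, time, value) triples
for k, (lo, hi) in enumerate(bands):
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    y = sosfilt(sos, signal)
    last = y[0]
    for i, v in enumerate(y):
        if abs(v - last) >= delta:            # transmit only significant changes
            events.append((k, t[i], v))
            last = v

print(f"{len(events)} events vs {len(t) * len(bands)} raw samples")
```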

  12. A new method for enhancer prediction based on deep belief network.

    PubMed

    Bu, Hongda; Gan, Yanglan; Wang, Yang; Zhou, Shuigeng; Guan, Jihong

    2017-10-16

    Studies have shown that enhancers are significant regulatory elements that play crucial roles in gene expression regulation. Since enhancer activity is unrelated to the orientation of, and distance to, their target genes, accurately predicting distal enhancers remains a challenging task. In recent years, with the development of high-throughput ChIP-seq technologies, several computational techniques have emerged to predict enhancers using epigenetic or genomic features. Nevertheless, the inconsistency of computational models across different cell lines and the unsatisfactory prediction performance call for further research in this area. Here, we propose a new Deep Belief Network (DBN) based computational method for enhancer prediction, called EnhancerDBN. This method combines diverse features, composed of DNA sequence compositional features, DNA methylation and histone modifications. Our computational results indicate that 1) EnhancerDBN outperforms 13 existing methods in prediction, and 2) GC content and DNA methylation can serve as relevant features for enhancer prediction. Deep learning is effective in boosting the performance of enhancer prediction.
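
    The sketch below is not the authors' EnhancerDBN; it only illustrates the overall shape of such a model, approximating a deep belief network as stacked Bernoulli RBMs feeding a logistic-regression classifier, with random placeholder features standing in for sequence composition, methylation, and histone-modification signals:

```python
# Illustrative sketch only: a DBN-style classifier built from stacked
# Bernoulli RBMs plus logistic regression, trained on synthetic placeholder
# features and labels (no real enhancer data is used here).
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((500, 30))          # 500 candidate regions, 30 synthetic features in [0, 1)
y = rng.integers(0, 2, 500)        # 1 = enhancer, 0 = background (synthetic labels)

model = Pipeline([
    ("rbm1", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))  # ~0.5 here, since the labels are random
```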

  13. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm of solving these issues consists of in-house high performance computing infrastructures, which have drawbacks of high capital expenditures, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gains and costs outperform those of the in-house infrastructure.
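
    One way to illustrate the masking idea (the paper's actual transformation is more elaborate, and every number below is invented) is an affine change of variables plus positive row scaling that hides the LP data while letting the outsourced solution be mapped back:

```python
# Toy sketch of LP masking only (not the paper's framework): hide the LP data
# with a random invertible change of variables plus positive row scaling before
# sending it to an untrusted solver, then map the returned solution back.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)

# Private problem: minimize c @ x subject to A @ x <= b (x >= 0 folded into A)
c = np.array([-1.0, -2.0])                          # i.e. maximize x1 + 2*x2
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([4.0, 0.0, 0.0])

M = rng.normal(size=(2, 2))                         # random invertible mixing matrix
while abs(np.linalg.det(M)) < 1e-3:                 # re-draw if nearly singular
    M = rng.normal(size=(2, 2))
D = np.diag(rng.uniform(0.5, 2.0, size=3))          # positive row scaling

# Masked problem sent to the "cloud": minimize (M.T @ c) @ y  s.t.  D @ A @ M @ y <= D @ b
res = linprog(M.T @ c, A_ub=D @ A @ M, b_ub=D @ b, bounds=[(None, None)] * 2)

x = M @ res.x                                       # map the cloud's answer back
print("recovered solution:", x, "objective:", c @ x)  # expect x ~ (0, 4), objective -8
```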

  14. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  15. Security and Cloud Outsourcing Framework for Economic Dispatch

    DOE PAGES

    Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi; ...

    2017-04-24

    The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm of solving these issues consists of in-house high performance computing infrastructures, which have drawbacks of high capital expenditures, maintenance, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains in that the highly confidential grid data is susceptible to cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gains and costs outperform those of the in-house infrastructure.

  16. Computational tools for exact conditional logistic regression.

    PubMed

    Corcoran, C; Mehta, C; Patel, N; Senchaudhuri, P

    Logistic regression analyses are often challenged by the inability of unconditional likelihood-based approximations to yield consistent, valid estimates and p-values for model parameters. This can be due to sparseness or separability in the data. Conditional logistic regression, though useful in such situations, can also be computationally unfeasible when the sample size or number of explanatory covariates is large. We review recent developments that allow efficient approximate conditional inference, including Monte Carlo sampling and saddlepoint approximations. We demonstrate through real examples that these methods enable the analysis of significantly larger and more complex data sets. We find in this investigation that for these moderately large data sets Monte Carlo seems a better alternative, as it provides unbiased estimates of the exact results and can be executed in less CPU time than can the single saddlepoint approximation. Moreover, the double saddlepoint approximation, while computationally the easiest to obtain, offers little practical advantage. It produces unreliable results and cannot be computed when a maximum likelihood solution does not exist. Copyright 2001 John Wiley & Sons, Ltd.
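
    As a concrete illustration of the Monte Carlo flavor of approximate conditional inference (a simplified single-covariate case with invented data, not the reviewed saddlepoint or sampling algorithms themselves), conditioning on the total number of successes reduces the test of the covariate effect to a permutation null for the sufficient statistic T = sum(x * y):

```python
# Minimal Monte Carlo sketch of conditional inference for one binary covariate
# in a logistic model. Conditioning on the total number of successes eliminates
# the intercept, and under H0: beta = 0 all rearrangements of y with that total
# are equally likely, so permuting y samples the exact conditional null of
# T = sum(x * y).
import numpy as np

rng = np.random.default_rng(42)
x = np.array([1] * 20 + [0] * 20)                        # binary covariate (hypothetical)
y = np.array([1] * 14 + [0] * 6 + [1] * 7 + [0] * 13)    # binary outcome (hypothetical)

t_obs = int(np.sum(x * y))
n_mc = 20_000
t_null = np.array([np.sum(x * rng.permutation(y)) for _ in range(n_mc)])

# Two-sided Monte Carlo p-value, a valid estimate of the exact conditional one
p = (np.sum(np.abs(t_null - t_null.mean()) >= abs(t_obs - t_null.mean())) + 1) / (n_mc + 1)
print(f"T = {t_obs}, Monte Carlo conditional p ~= {p:.4f}")
```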

  17. From sequencer to supercomputer: an automatic pipeline for managing and processing next generation sequencing data.

    PubMed

    Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun

    2012-01-01

    Next Generation Sequencing is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource intensive nature of NGS secondary analysis built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
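
    A minimal sketch of the automation pattern described above follows; the directory layout, marker file, and scheduler command are assumptions for illustration, not the QUEST or Automation Server implementation:

```python
# Minimal sketch of run-folder automation (all paths and commands are
# hypothetical): poll for finished sequencing runs and hand each new one to a
# cluster scheduler for secondary analysis.
import subprocess
import time
from pathlib import Path

RUNS = Path("/data/runs")            # hypothetical instrument output directory
DONE_MARKER = "RTAComplete.txt"      # hypothetical "run finished" flag file
submitted = set()

def submit(run_dir: Path) -> None:
    # Hypothetical cluster submission; replace with the site's real command.
    subprocess.run(["sbatch", "process_run.sh", str(run_dir)], check=True)

while True:
    if RUNS.is_dir():
        for run_dir in RUNS.iterdir():
            if run_dir.is_dir() and (run_dir / DONE_MARKER).exists() and run_dir.name not in submitted:
                submit(run_dir)
                submitted.add(run_dir.name)
    time.sleep(300)                  # poll every five minutes
```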

  18. Massively parallel algorithms for real-time wavefront control of a dense adaptive optics system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fijany, A.; Milman, M.; Redding, D.

    1994-12-31

    In this paper massively parallel algorithms and architectures for real-time wavefront control of a dense adaptive optics system (SELENE) are presented. The authors have already shown that the computation of a near optimal control algorithm for SELENE can be reduced to the solution of a discrete Poisson equation on a regular domain. Although this represents an optimal computation, due to the large size of the system and the high sampling rate requirement, the implementation of this control algorithm poses a computationally challenging problem since it demands a sustained computational throughput of the order of 10 GFlops. They develop a novel algorithm, designated as the Fast Invariant Imbedding algorithm, which offers a massive degree of parallelism with simple communication and synchronization requirements. Due to these features, this algorithm is significantly more efficient than other Fast Poisson Solvers for implementation on massively parallel architectures. The authors also discuss two massively parallel, algorithmically specialized, architectures for low-cost and optimal implementation of the Fast Invariant Imbedding algorithm.
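
    The sketch below does not reproduce the Fast Invariant Imbedding algorithm; it only illustrates the kernel the abstract refers to, the solution of a discrete Poisson equation on a regular grid, using a standard FFT-based solver under the simplifying assumption of periodic boundary conditions:

```python
# Illustrative sketch only: solve the 5-point discrete Poisson equation
# Laplacian(u) = f on a periodic n x n grid with an FFT-based solver.
import numpy as np

def solve_poisson_fft(f, h):
    n = f.shape[0]
    k = np.arange(n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    # Eigenvalues of the periodic 5-point Laplacian for each Fourier mode
    denom = (2 * np.cos(2 * np.pi * kx / n) + 2 * np.cos(2 * np.pi * ky / n) - 4) / h**2
    f_hat = np.fft.fft2(f)
    u_hat = np.zeros_like(f_hat)
    mask = denom != 0
    u_hat[mask] = f_hat[mask] / denom[mask]           # zero-mean solution; k = 0 mode dropped
    return np.real(np.fft.ifft2(u_hat))

n, h = 128, 1.0 / 128
x = np.linspace(0.0, 1.0, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(2 * np.pi * X) * np.sin(2 * np.pi * Y)     # smooth, zero-mean source term
u = solve_poisson_fft(f, h)

# Check the discrete residual: apply the 5-point Laplacian and compare with f
lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / h**2
print("max residual:", np.abs(lap - f).max())         # near machine precision
```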

  19. Quantitative morphometrical characterization of human pronuclear zygotes.

    PubMed

    Beuchat, A; Thévenaz, P; Unser, M; Ebner, T; Senn, A; Urner, F; Germond, M; Sorzano, C O S

    2008-09-01

    Identification of embryos with high implantation potential remains a challenge in in vitro fertilization (IVF). Subjective pronuclear (PN) zygote scoring systems have been developed for that purpose. The aim of this work was to provide a software tool that enables objective measuring of morphological characteristics of the human PN zygote. A computer program was created to analyse zygote images semi-automatically, providing precise morphological measurements. The accuracy of this approach was first validated by comparing zygotes from two different IVF centres with computer-assisted measurements or subjective scoring. Computer-assisted measurement and subjective scoring were then compared for their ability to classify zygotes with high and low implantation probability by using a linear discriminant analysis. Zygote images coming from the two IVF centres were analysed with the software, resulting in a series of precise measurements of 24 variables. Using subjective scoring, the cytoplasmic halo was the only feature which was significantly different between the two IVF centres. Computer-assisted measurements revealed significant differences between centres in PN centring, PN proximity, cytoplasmic halo and features related to nucleolar precursor bodies distribution. The zygote classification error achieved with the computer-assisted measurements (0.363) was slightly inferior to that of the subjective ones (0.393). A precise and objective characterization of the morphology of human PN zygotes can be achieved by the use of an advanced image analysis tool. This computer-assisted analysis allows for a better morphological characterization of human zygotes and can be used for classification.
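
    A hedged sketch of the final classification step (linear discriminant analysis on synthetic measurements and labels, not the study's data) might look like this:

```python
# Illustrative sketch only: linear discriminant analysis classifying zygotes
# into high/low implantation probability from a handful of computer-measured
# morphological variables; all numbers here are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical features: PN centring, PN proximity, halo extent, NPB spread
X = rng.normal(size=(n, 4))
# Synthetic labels weakly linked to the features, to mimic a hard problem
y = (X @ np.array([0.8, 0.5, -0.6, 0.3]) + rng.normal(scale=2.0, size=n)) > 0

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y.astype(int), cv=5)
print("cross-validated accuracy:", scores.mean())
# 1 - accuracy gives a classification error comparable in spirit to the
# 0.363 / 0.393 figures quoted in the abstract.
```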

  20. Programming the social computer.

    PubMed

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  1. Computer-assisted innovations in craniofacial surgery.

    PubMed

    Rudman, Kelli; Hoekzema, Craig; Rhee, John

    2011-08-01

    Reconstructive surgery for complex craniofacial defects challenges even the most experienced surgeons. Preoperative reconstructive planning requires consideration of both functional and aesthetic properties of the mandible, orbit, and midface. Technological innovations allow for computer-assisted preoperative planning, computer-aided manufacturing of patient-specific implants (PSIs), and computer-assisted intraoperative navigation. Although many case reports discuss computer-assisted preoperative planning and creation of custom implants, a general overview of computer-assisted innovations is not readily available. This article reviews innovations in computer-assisted reconstructive surgery including anatomic considerations when using PSIs, technologies available for preoperative planning, work flow and process of obtaining a PSI, and implant materials available for PSIs. A case example follows illustrating the use of this technology in the reconstruction of an orbital-frontal-temporal defect with a PSI. Computer-assisted reconstruction of complex craniofacial defects provides the reconstructive surgeon with innovative options for challenging reconstructive cases. As technology advances, applications of computer-assisted reconstruction will continue to expand. © Thieme Medical Publishers.

  2. The Mechanics of Embodiment: A Dialog on Embodiment and Computational Modeling

    PubMed Central

    Pezzulo, Giovanni; Barsalou, Lawrence W.; Cangelosi, Angelo; Fischer, Martin H.; McRae, Ken; Spivey, Michael J.

    2011-01-01

    Embodied theories are increasingly challenging traditional views of cognition by arguing that conceptual representations that constitute our knowledge are grounded in sensory and motor experiences, and processed at this sensorimotor level, rather than being represented and processed abstractly in an amodal conceptual system. Given the established empirical foundation, and the relatively underspecified theories to date, many researchers are extremely interested in embodied cognition but are clamoring for more mechanistic implementations. What is needed at this stage is a push toward explicit computational models that implement sensorimotor grounding as intrinsic to cognitive processes. In this article, six authors from varying backgrounds and approaches address issues concerning the construction of embodied computational models, and illustrate what they view as the critical current and next steps toward mechanistic theories of embodiment. The first part has the form of a dialog between two fictional characters: Ernest, the “experimenter,” and Mary, the “computational modeler.” The dialog consists of an interactive sequence of questions, requests for clarification, challenges, and (tentative) answers, and touches the most important aspects of grounded theories that should inform computational modeling and, conversely, the impact that computational modeling could have on embodied theories. The second part of the article discusses the most important open challenges for embodied computational modeling. PMID:21713184

  3. The SAMPL4 host-guest blind prediction challenge: an overview.

    PubMed

    Muddana, Hari S; Fenley, Andrew T; Mobley, David L; Gilson, Michael K

    2014-04-01

    Prospective validation of methods for computing binding affinities can help assess their predictive power and thus set reasonable expectations for their performance in drug design applications. Supramolecular host-guest systems are excellent model systems for testing such affinity prediction methods, because their small size and limited conformational flexibility, relative to proteins, allow higher throughput and better numerical convergence. The SAMPL4 prediction challenge therefore included a series of host-guest systems, based on two hosts, cucurbit[7]uril and octa-acid. Binding affinities in aqueous solution were measured experimentally for a total of 23 guest molecules. Participants submitted 35 sets of computational predictions for these host-guest systems, based on methods ranging from simple docking, to extensive free energy simulations, to quantum mechanical calculations. Over half of the predictions provided better correlations with experiment than two simple null models, but most methods underperformed the null models in terms of root mean squared error and linear regression slope. Interestingly, the overall performance across all SAMPL4 submissions was similar to that for the prior SAMPL3 host-guest challenge, although the experimentalists took steps to simplify the current challenge. While some methods performed fairly consistently across both hosts, no single approach emerged as a consistent top performer, and the nonsystematic nature of the various submissions made it impossible to draw definitive conclusions regarding the best choices of energy models or sampling algorithms. Salt effects emerged as an issue in the calculation of absolute binding affinities of cucurbit[7]uril-guest systems, but were not expected to affect the relative affinities significantly. Useful directions for future rounds of the challenge might involve encouraging participants to carry out some calculations that replicate each other's studies, and to systematically explore parameter options.

  4. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly on the back of the country's boom in internet and information technology services. The bioinformatics sector in India has been transforming rapidly, carving out a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating on projects such as database development, sequence analysis, genomic prospecting and algorithm development. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, and the conferences and training undertaken by the corporate and government sectors. Nonetheless, this young field of science still faces many challenges.

  5. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    PubMed Central

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  6. Network gateway security method for enterprise Grid: a literature review

    NASA Astrophysics Data System (ADS)

    Sujarwo, A.; Tan, J.

    2017-03-01

    The computational Grid has brought large computational resources closer to scientists. It enables people to run large computational jobs anytime and anywhere, without physical borders. However, the large and dispersed population of participating computers, acting either as users or as computational providers, gives rise to security problems. The challenge is how the security system, especially the component that filters data at the gateway, can operate flexibly depending on the registered Grid participants. This paper surveys previous approaches to this challenge in order to identify a better, new method for the enterprise Grid. The finding of this paper is a dynamically controlled enterprise firewall that secures Grid resources from unwanted connections, based on a new firewall control method and components.
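
    As a toy illustration of the "dynamically controlled enterprise firewall" idea surveyed above, the sketch below derives gateway allow rules from a registry of Grid participants and pushes them with iptables. The registry file name, the choice of chain, and the port are assumptions made for this sketch, not details taken from the paper.

```python
# Toy sketch: rebuild gateway allow rules from a registry of Grid participants.
# Assumed inputs: a text file with one participant IP per line, and a service
# port (2811 is commonly used for GridFTP control, but adjust as needed).
import subprocess

GRID_PORT = "2811"
REGISTRY = "participants.txt"

def allow(ip: str) -> None:
    # Append an ACCEPT rule for this participant on the Grid service port.
    subprocess.run(
        ["iptables", "-A", "INPUT", "-s", ip,
         "-p", "tcp", "--dport", GRID_PORT, "-j", "ACCEPT"],
        check=True,
    )

def sync_rules() -> None:
    with open(REGISTRY) as fh:
        for line in fh:
            ip = line.strip()
            if ip:
                allow(ip)

if __name__ == "__main__":
    sync_rules()
```

    A production gateway would also flush stale rules, validate addresses, and default-deny the service port; the sketch only shows the registration-driven direction of the rule set.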

  7. Integrating Grid Services into the Cray XT4 Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy

    2009-05-01

    The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment, and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.

  8. Computational modeling of cardiac hemodynamics: Current status and future outlook

    NASA Astrophysics Data System (ADS)

    Mittal, Rajat; Seo, Jung Hee; Vedula, Vijay; Choi, Young J.; Liu, Hang; Huang, H. Howie; Jain, Saurabh; Younes, Laurent; Abraham, Theodore; George, Richard T.

    2016-01-01

    The proliferation of four-dimensional imaging technologies, increasing computational speeds, improved simulation algorithms, and the widespread availability of powerful computing platforms is enabling simulations of cardiac hemodynamics with unprecedented speed and fidelity. Since cardiovascular disease is intimately linked to cardiovascular hemodynamics, accurate assessment of the patient's hemodynamic state is critical for the diagnosis and treatment of heart disease. Unfortunately, while a variety of invasive and non-invasive approaches for measuring cardiac hemodynamics are in widespread use, they still only provide an incomplete picture of the hemodynamic state of a patient. In this context, computational modeling of cardiac hemodynamics presents as a powerful non-invasive modality that can fill this information gap, and significantly impact the diagnosis as well as the treatment of cardiac disease. This article reviews the current status of this field as well as the emerging trends and challenges in cardiovascular health, computing, modeling, and simulation that are expected to play a key role in its future development. Some recent advances in the modeling and simulation of cardiac flow are described using examples from our own work as well as the research of other groups.

  9. A New Overview of The Trilinos Project

    DOE PAGES

    Heroux, Michael A.; Willenbring, James M.

    2012-01-01

    Since An Overview of the Trilinos Project [ACM Trans. Math. Softw. 31(3) (2005), 397–423] was published in 2005, Trilinos has grown significantly. It now supports the development of a broad collection of libraries for scalable computational science and engineering applications, and a full-featured software infrastructure for rigorous lean/agile software engineering. This growth has created significant opportunities and challenges. This paper focuses on some of the most notable changes to the Trilinos project in the last few years. At the time of the writing of this article, the current release version of Trilinos was 10.12.2.

  10. COMSAC: Computational Methods for Stability and Control. Part 1

    NASA Technical Reports Server (NTRS)

    Fremaux, C. Michael (Compiler); Hall, Robert M. (Compiler)

    2004-01-01

    Work on stability and control included the following reports: Introductory Remarks; Introduction to Computational Methods for Stability and Control (COMSAC); Stability & Control Challenges for COMSAC: a NASA Langley Perspective; Emerging CFD Capabilities and Outlook: A NASA Langley Perspective; The Role for Computational Fluid Dynamics for Stability and Control: Is it Time?; Northrop Grumman Perspective on COMSAC; Boeing Integrated Defense Systems Perspective on COMSAC; Computational Methods in Stability and Control: WPAFB Perspective; Perspective: Raytheon Aircraft Company; A Greybeard's View of the State of Aerodynamic Prediction; Computational Methods for Stability and Control: A Perspective; Boeing TacAir Stability and Control Issues for Computational Fluid Dynamics; NAVAIR S&C Issues for CFD; An S&C Perspective on CFD; Issues, Challenges & Payoffs: A Boeing User's Perspective on CFD for S&C; and Stability and Control in Computational Simulations for Conceptual and Preliminary Design: the Past, Today, and Future?

  11. The challenge of computer mathematics.

    PubMed

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.

  12. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and available to download from http://strauto.popgen.org.
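
    To make the parallelization idea concrete, here is a minimal sketch of farming replicate STRUCTURE runs for several values of K over multiple CPU cores. This is not StrAuto itself, and the STRUCTURE command-line flags shown are illustrative assumptions that should be checked against the STRUCTURE documentation for a given installation.

```python
# Minimal sketch: distribute replicate STRUCTURE runs over local CPU cores.
# The "structure" invocation below is an assumption; mainparams/extraparams
# are expected to point at the actual input data.
import itertools
import os
import subprocess
from multiprocessing import Pool

K_RANGE = range(1, 6)       # values of K to evaluate
REPLICATES = range(1, 11)   # replicate runs per K, needed for the Evanno method

def run_structure(job):
    k, rep = job
    out = f"results/K{k}_rep{rep}"
    cmd = ["structure", "-m", "mainparams", "-e", "extraparams",
           "-K", str(k), "-o", out]
    subprocess.run(cmd, check=True)
    return out

if __name__ == "__main__":
    os.makedirs("results", exist_ok=True)
    jobs = list(itertools.product(K_RANGE, REPLICATES))
    with Pool(processes=8) as pool:   # distribute runs over 8 cores
        finished = pool.map(run_structure, jobs)
    print(f"{len(finished)} STRUCTURE runs completed")
```

    On a cluster, the same job list would instead be handed to the scheduler (one array task per (K, replicate) pair), which is the deployment scenario the abstract describes.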

  13. Cloud computing approaches to accelerate drug discovery value chain.

    PubMed

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges to computer scientists to offer matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SaaS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best suited to manage drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  14. Sequential computation of elementary modes and minimal cut sets in genome-scale metabolic networks using alternate integer linear programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh

    Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to the combinatorial explosion of EMs in complex networks. Often, however, only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed integer linear programming (MILP), the effectiveness of which deteriorates significantly as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Results: Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) problems to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions, the deletion of which disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP achieved significant time reduction in computing EMs by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs.
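
    A rough sketch of the alternation described above, on a toy five-reaction network and using the PuLP modeling library: an integer program finds a smallest reaction-deletion set hitting every mode found so far, and a linear program then looks for a steady-state flux that avoids those reactions. The network, solver settings, and tolerances are assumptions, and the sketch omits the constraints that the full AILP uses to guarantee elementarity; it only conveys the IP/LP alternation and the infeasible-LP/minimal-cut-set connection.

```python
# Sketch of the IP/LP alternation on a toy network (not the authors' AILP code).
import pulp

# Toy stoichiometric matrix S (metabolites x reactions), all reactions irreversible:
# R0: -> A, R1: A -> B, R2: B ->, R3: A -> C, R4: C ->
S = [
    [1, -1,  0, -1,  0],
    [0,  1, -1,  0,  0],
    [0,  0,  0,  1, -1],
]
n_met, n_rxn = len(S), len(S[0])
found_modes = []   # each stored as the support (set of active reactions) of a flux mode

for _ in range(5):
    # IP step: smallest reaction-deletion set that disables every known mode.
    ip = pulp.LpProblem("hitting_set", pulp.LpMinimize)
    y = [pulp.LpVariable(f"y{j}", cat="Binary") for j in range(n_rxn)]
    ip += pulp.lpSum(y)
    for mode in found_modes:
        ip += pulp.lpSum(y[j] for j in mode) >= 1
    ip.solve(pulp.PULP_CBC_CMD(msg=0))
    deleted = {j for j in range(n_rxn) if y[j].value() > 0.5}

    # LP step: steady-state flux distribution that avoids the deleted reactions.
    lp = pulp.LpProblem("flux", pulp.LpMinimize)
    v = [pulp.LpVariable(f"v{j}", lowBound=0) for j in range(n_rxn)]
    lp += pulp.lpSum(v)                      # harmless objective; we only need feasibility
    for i in range(n_met):
        lp += pulp.lpSum(S[i][j] * v[j] for j in range(n_rxn)) == 0
    for j in deleted:
        lp += v[j] == 0
    lp += pulp.lpSum(v) == 1                 # normalization excludes the trivial zero flux
    lp.solve(pulp.PULP_CBC_CMD(msg=0))

    if pulp.LpStatus[lp.status] != "Optimal":
        # Infeasibility means the deletion set disconnects the network: it is a
        # candidate minimal cut set (the full AILP would continue past this point).
        print("candidate minimal cut set:", sorted(deleted))
        break
    support = frozenset(j for j in range(n_rxn) if v[j].value() > 1e-9)
    print("candidate flux mode, active reactions:", sorted(support))
    found_modes.append(support)
```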

  15. Challenges and Advances in Validating Enzyme Design Proposals: The Case of the Kemp Eliminase Catalysis†

    PubMed Central

    Frushicheva, Maria P.; Cao, Jie; Warshel, Arieh

    2011-01-01

    One of the fundamental challenges in biotechnology and biochemistry is the ability to design effective enzymes. Despite recent progress, most of the advances on this front have been made by placing the reacting fragments in the proper places, rather than by optimizing the preorganization of the environment, which is the key factor in enzyme catalysis. Thus, rational improvement of the preorganization would require approaches capable of evaluating reliably the actual catalytic effect. This work considers the catalytic effects in different Kemp eliminases as a benchmark for computer-aided enzyme design. It is shown that the empirical valence bond provides a powerful screening tool, with significant advantages over current alternative strategies. The insights provided by the empirical valence bond calculations are discussed, emphasizing the ability to analyze the difference between the linear free energy relationships obtained in solution and those found in the enzymes. We also point out the trade-off between the reliability and speed of the calculations and try to determine what it takes to obtain reliable computer-aided screening. PMID:21443179

  16. Challenges and advances in validating enzyme design proposals: the case of kemp eliminase catalysis.

    PubMed

    Frushicheva, Maria P; Cao, Jie; Warshel, Arieh

    2011-05-10

    One of the fundamental challenges in biotechnology and biochemistry is the ability to design effective enzymes. Despite recent progress, most of the advances on this front have been made by placing the reacting fragments in the proper places, rather than by optimizing the preorganization of the environment, which is the key factor in enzyme catalysis. Thus, rational improvement of the preorganization would require approaches capable of evaluating reliably the actual catalytic effect. This work considers the catalytic effects in different Kemp eliminases as a benchmark for a computer-aided enzyme design. It is shown that the empirical valence bond provides a powerful screening tool, with significant advantages over current alternative strategies. The insights provided by the empirical valence bond calculations are discussed with an emphasis on the ability to analyze the difference between the linear free energy relationships obtained in solution and those found in the enzymes. We also point out the trade-off between the reliability and speed of the calculations and try to determine what it takes to realize reliable computer-aided screening.
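
    For reference, the quantity at the heart of the empirical valence bond (EVB) screening discussed in the two records above is the ground-state energy obtained by mixing two diabatic valence-bond states; this is the standard two-state EVB expression rather than anything specific to the Kemp eliminase systems:

```latex
% Ground-state energy of the standard two-state EVB Hamiltonian.
E_g \;=\; \tfrac{1}{2}\bigl(\varepsilon_1 + \varepsilon_2\bigr)
      \;-\; \tfrac{1}{2}\sqrt{\bigl(\varepsilon_1 - \varepsilon_2\bigr)^{2} + 4H_{12}^{2}}
```

    Here ε1 and ε2 are the diabatic energies of the reactant-like and product-like valence-bond states and H12 is their coupling; activation free energies are then obtained from the free-energy profile of E_g along the energy-gap coordinate Δε = ε1 - ε2, which is how the enzyme and solution reactions are compared.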

  17. Mesoscopic Lawlessness

    NASA Astrophysics Data System (ADS)

    Laughlin, R. B.

    2012-02-01

    Whether physics will contribute significantly to unraveling the secrets of life, the grandest challenge of them all, depends critically on whether proteins and other mesoscale objects exhibit emergent law. By this I mean quantitative relationships among their measured properties that are always true. The jury is still out on the matter, for there is evidence both for and against, but it is spotty, on account of the difficulty of measuring 100 nm - 1000 nm objects without damaging them quantum mechanically. It is therefore not clear that history will repeat itself. Physics contributed mightily to 20th century materials science through its identification and mastery of powerful macroscopic emergent laws such as crystalline rigidity, superconductivity and ferromagnetism, but it cannot do the same thing in biology, regardless of how powerful computers get, unless nature cooperates. The challenge before us as physicists is therefore not to amass more and more terabytes of data and computational output but rather to search for and, with luck, find operating principles at the scale of life greater than those of chemistry, which is to say, greater than a world ruled by nothing but miraculous accidents.

  18. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system.

    PubMed

    Marshall, Harry H; Griffiths, David J; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G F; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L; Thompson, Faye J; Vitikainen, Emma I K; Cant, Michael A

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by needing to accurately and efficiently collect and store multiple streams of data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed called Mongoose 2000 that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available and we detail how it might be used to aid data collection and storage in other long-term individual-based projects.

  19. BrainFrame: a node-level heterogeneous accelerator platform for neuron simulations.

    PubMed

    Smaragdos, Georgios; Chatzikonstantis, Georgios; Kukreja, Rahul; Sidiropoulos, Harry; Rodopoulos, Dimitrios; Sourdis, Ioannis; Al-Ars, Zaid; Kachris, Christoforos; Soudris, Dimitrios; De Zeeuw, Chris I; Strydis, Christos

    2017-12-01

    The advent of high-performance computing (HPC) in recent years has led to its increasing use in brain studies through computational models. The scale and complexity of such models are constantly increasing, leading to challenging computational requirements. Even though modern HPC platforms can often deal with such challenges, the vast diversity of the modeling field does not permit a homogeneous acceleration platform to effectively address the complete array of modeling requirements. In this paper we propose and build BrainFrame, a heterogeneous acceleration platform that incorporates three distinct acceleration technologies: an Intel Xeon-Phi CPU, an NVidia GP-GPU, and a Maxeler Dataflow Engine. The PyNN software framework is also integrated into the platform. As a challenging proof of concept, we analyze the performance of BrainFrame on different experiment instances of a state-of-the-art neuron model, representing the inferior-olivary nucleus using a biophysically-meaningful, extended Hodgkin-Huxley representation. The model instances take into account not only the neuronal-network dimensions but also different network-connectivity densities, which can drastically affect the workload's performance characteristics. The combined use of different HPC technologies demonstrates that BrainFrame is better able to cope with the modeling diversity encountered in realistic experiments while at the same time running on significantly lower energy budgets. Our performance analysis clearly shows that the model directly affects performance and all three technologies are required to cope with all the model use cases. The BrainFrame framework is designed to transparently configure and select the appropriate back-end accelerator technology for use per simulation run. The PyNN integration provides a familiar bridge to the vast number of models already available. Additionally, it gives a clear roadmap for extending the platform support beyond the proof of concept, with improved usability and directly useful features to the computational-neuroscience community, paving the way for wider adoption.

  20. The DoD's High Performance Computing Modernization Program - Ensuring the National Earth Systems Prediction Capability Becomes Operational

    NASA Astrophysics Data System (ADS)

    Burnett, W.

    2016-12-01

    The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and passing connecting data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology, Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability. The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.

  1. Joint Model and Parameter Dimension Reduction for Bayesian Inversion Applied to an Ice Sheet Flow Problem

    NASA Astrophysics Data System (ADS)

    Ghattas, O.; Petra, N.; Cui, T.; Marzouk, Y.; Benjamin, P.; Willcox, K.

    2016-12-01

    Model-based projections of the dynamics of the polar ice sheets play a central role in anticipating future sea level rise. However, a number of mathematical and computational challenges place significant barriers on improving predictability of these models. One such challenge is caused by the unknown model parameters (e.g., in the basal boundary conditions) that must be inferred from heterogeneous observational data, leading to an ill-posed inverse problem and the need to quantify uncertainties in its solution. In this talk we discuss the problem of estimating the uncertainty in the solution of (large-scale) ice sheet inverse problems within the framework of Bayesian inference. Computing the general solution of the inverse problem--i.e., the posterior probability density--is intractable with current methods on today's computers, due to the expense of solving the forward model (3D full Stokes flow with nonlinear rheology) and the high dimensionality of the uncertain parameters (which are discretizations of the basal sliding coefficient field). To overcome these twin computational challenges, it is essential to exploit problem structure (e.g., sensitivity of the data to parameters, the smoothing property of the forward model, and correlations in the prior). To this end, we present a data-informed approach that identifies low-dimensional structure in both parameter space and the forward model state space. This approach exploits the fact that the observations inform only a low-dimensional parameter space and allows us to construct a parameter-reduced posterior. Sampling this parameter-reduced posterior still requires multiple evaluations of the forward problem; therefore, we also aim to identify a low-dimensional state space to reduce the computational cost. To this end, we apply a proper orthogonal decomposition (POD) approach to approximate the state using a low-dimensional manifold constructed using "snapshots" from the parameter-reduced posterior, and the discrete empirical interpolation method (DEIM) to approximate the nonlinearity in the forward problem. We show that using only a limited number of forward solves, the resulting subspaces lead to an efficient method to explore the high-dimensional posterior.
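
    The proper orthogonal decomposition step mentioned above reduces, at its core, to a truncated SVD of a snapshot matrix. The sketch below shows only that step on a synthetic one-dimensional field; the snapshot generator and the energy threshold are placeholders standing in for forward-model solves drawn from the parameter-reduced posterior.

```python
# POD sketch: build a reduced basis from snapshots and project a new state onto it.
import numpy as np

# Synthetic snapshots: each column is a "state" at a different parameter value.
x = np.linspace(0.0, 1.0, 400)
params = np.linspace(0.5, 2.0, 60)
snapshots = np.column_stack([np.exp(-p * x) * np.sin(4 * np.pi * x) for p in params])

# POD basis from the thin SVD of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.9999)) + 1      # smallest basis capturing 99.99% energy
basis = U[:, :r]
print(f"retained {r} POD modes")

# A new state (parameter not in the snapshot set) approximated in the reduced basis.
new_state = np.exp(-1.3 * x) * np.sin(4 * np.pi * x)
approx = basis @ (basis.T @ new_state)
rel_err = np.linalg.norm(new_state - approx) / np.linalg.norm(new_state)
print("relative projection error:", rel_err)
```

    In the abstract's workflow, DEIM additionally approximates the nonlinear terms of the forward problem so that the reduced model can be evaluated cheaply.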

  2. Progress Toward Affordable High Fidelity Combustion Simulations Using Filtered Density Functions for Hypersonic Flows in Complex Geometries

    NASA Technical Reports Server (NTRS)

    Drozda, Tomasz G.; Quinlan, Jesse R.; Pisciuneri, Patrick H.; Yilmaz, S. Levent

    2012-01-01

    Significant progress has been made in the development of subgrid scale (SGS) closures based on a filtered density function (FDF) for large eddy simulations (LES) of turbulent reacting flows. The FDF is the counterpart of the probability density function (PDF) method, which has proven effective in Reynolds averaged simulations (RAS). However, while systematic progress is being made advancing the FDF models for relatively simple flows and lab-scale flames, the application of these methods in complex geometries and high speed, wall-bounded flows with shocks remains a challenge. The key difficulties are the significant computational cost associated with solving the FDF transport equation and numerically stiff finite rate chemistry. For LES/FDF methods to make a more significant impact in practical applications a pragmatic approach must be taken that significantly reduces the computational cost while maintaining high modeling fidelity. An example of one such ongoing effort is at the NASA Langley Research Center, where the first generation FDF models, namely the scalar filtered mass density function (SFMDF) are being implemented into VULCAN, a production-quality RAS and LES solver widely used for design of high speed propulsion flowpaths. This effort leverages internal and external collaborations to reduce the overall computational cost of high fidelity simulations in VULCAN by: implementing high order methods that allow reduction in the total number of computational cells without loss in accuracy; implementing first generation of high fidelity scalar PDF/FDF models applicable to high-speed compressible flows; coupling RAS/PDF and LES/FDF into a hybrid framework to efficiently and accurately model the effects of combustion in the vicinity of the walls; developing efficient Lagrangian particle tracking algorithms to support robust solutions of the FDF equations for high speed flows; and utilizing finite rate chemistry parametrization, such as flamelet models, to reduce the number of transported reactive species and remove numerical stiffness. This paper briefly introduces the SFMDF model (highlighting key benefits and challenges), and discusses particle tracking for flows with shocks, the hybrid coupled RAS/PDF and LES/FDF model, flamelet generated manifolds (FGM) model, and the Irregularly Portioned Lagrangian Monte Carlo Finite Difference (IPLMCFD) methodology for scalable simulation of high-speed reacting compressible flows.

  3. Barriers and facilitators to home computer and internet use among urban novice computer users of low socioeconomic position.

    PubMed

    Kontos, Emily Z; Bennett, Gary G; Viswanath, K

    2007-10-22

    Despite the increasing penetration of the Internet and amount of online health information, there are significant barriers that limit its widespread adoption as a source of health information. One is the "digital divide," with people of higher socioeconomic position (SEP) demonstrating greater access and usage compared to those from lower SEP groups. However, as the access gap narrows over time and more people use the Internet, a shift in research needs to occur to explore how one might improve Internet use as well as website design for a range of audiences. This is particularly important in the case of novice users who may not have the technical skills, experience, or social connections that could help them search for health information using the Internet. The focus of our research is to investigate the challenges in the implementation of a project to improve health information seeking among low SEP groups. The goal of the project is not to promote health information seeking as much as to understand the barriers and facilitators to computer and Internet use, beyond access, among members of lower SEP groups in an urban setting. The purpose was to qualitatively describe participants' self-identified barriers and facilitators to computer and Internet use during a 1-year pilot study as well as the challenges encountered by the research team in the delivery of the intervention. Between August and November 2005, 12 low-SEP urban individuals with no or limited computer and Internet experience were recruited through a snowball sampling. Each participant received a free computer system, broadband Internet access, monthly computer training courses, and technical support for 1 year as the intervention condition. Upon completion of the study, participants were offered the opportunity to complete an in-depth semistructured interview. Interviews were approximately 1 hour in length and were conducted by the project director. The interviews were held in the participants' homes and were tape recorded for accuracy. Nine of the 12 study participants completed the semistructured interviews. Members of the research team conducted a qualitative analysis based on the transcripts from the nine interviews using the crystallization/immersion method. Nine of the 12 participants completed the in-depth interview (75% overall response rate), with three men and six women agreeing to be interviewed. Major barriers to Internet use that were mentioned included time constraints and family conflict over computer usage. The monthly training classes and technical assistance components of the intervention surfaced as the most important facilitators to computer and Internet use. The concept of received social support from other study members, such as assistance with computer-related questions, also emerged as an important facilitator to overall computer usage. This pilot study offers important insights into the self-identified barriers and facilitators in computer and Internet use among urban low-SEP novice users as well as the challenges faced by the research team in implementing the intervention.

  4. The Computational Fluid Dynamics Rupture Challenge 2013—Phase I: prediction of rupture status in intracranial aneurysms.

    PubMed

    Janiga, G; Berg, P; Sugiyama, S; Kono, K; Steinman, D A

    2015-03-01

    Rupture risk assessment for intracranial aneurysms remains challenging, and risk factors, including wall shear stress, are discussed controversially. The primary purpose of the presented challenge was to determine how consistently aneurysm rupture status and rupture site could be identified on the basis of computational fluid dynamics. Two geometrically similar MCA aneurysms were selected, 1 ruptured, 1 unruptured. Participating computational fluid dynamics groups were blinded as to which case was ruptured. Participants were provided with digitally segmented lumen geometries and, for this phase of the challenge, were free to choose their own flow rates, blood rheologies, and so forth. Participants were asked to report which case had ruptured and the likely site of rupture. In parallel, lumen geometries were provided to a group of neurosurgeons for their predictions of rupture status and site. Of 26 participating computational fluid dynamics groups, 21 (81%) correctly identified the ruptured case. Although the known rupture site was associated with low and oscillatory wall shear stress, most groups identified other sites, some of which also experienced low and oscillatory shear. Of the 43 participating neurosurgeons, 39 (91%) identified the ruptured case. None correctly identified the rupture site. Geometric or hemodynamic considerations favor identification of rupture status; however, retrospective identification of the rupture site remains a challenge for both engineers and clinicians. A more precise understanding of the hemodynamic factors involved in aneurysm wall pathology is likely required for computational fluid dynamics to add value to current clinical decision-making regarding rupture risk. © 2015 by American Journal of Neuroradiology.

  5. Cloud Computing Fundamentals

    NASA Astrophysics Data System (ADS)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  6. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  7. Two-stage atlas subset selection in multi-atlas based image segmentation.

    PubMed

    Zhao, Tingting; Ruan, Dan

    2015-06-01

    Fast-growing access to large databases and cloud-stored data presents a unique opportunity for multi-atlas based image segmentation and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs in the face of large atlas collection with varied quality, so that high-accuracy segmentation can be achieved with low computational cost. An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross-validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates end-to-end segmentation performance comparable to the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, the proposed scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
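
    A schematic of the two-stage selection logic described above, with cheap stand-ins for registration and the relevance metrics (plain correlations on synthetic vectors). The subset sizes, the downsampling, and the metrics are illustrative assumptions, not the authors' configuration.

```python
# Two-stage atlas subset selection, schematically.
import numpy as np

rng = np.random.default_rng(1)

# Toy "target" and "atlases": flattened vectors standing in for MR volumes.
target = rng.normal(size=256)
atlases = [target + rng.normal(scale=s, size=256) for s in rng.uniform(0.2, 2.0, 200)]

def coarse_similarity(a, b):
    # Stage 1 metric: correlation on crudely downsampled data,
    # standing in for a low-cost registration configuration.
    return np.corrcoef(a[::8], b[::8])[0, 1]

def refined_similarity(a, b):
    # Stage 2 metric: correlation on the full data,
    # standing in for full-fledged registration plus a refined relevance metric.
    return np.corrcoef(a, b)[0, 1]

augmented_size, fusion_size = 30, 8

# Stage 1: rank all atlases cheaply and keep an augmented subset.
coarse = [coarse_similarity(a, target) for a in atlases]
augmented = np.argsort(coarse)[::-1][:augmented_size]

# Stage 2: re-rank only the augmented subset with the expensive metric.
refined = {int(i): refined_similarity(atlases[i], target) for i in augmented}
fusion_set = sorted(refined, key=refined.get, reverse=True)[:fusion_size]
print("atlases selected for label fusion:", fusion_set)
```

    The role of the inference model in the paper is to choose the augmented subset size so that the truly relevant atlases survive stage 1 with high probability; here it is simply hard-coded.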

  8. Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective

    PubMed Central

    Mattout, Jérémie

    2012-01-01

    A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291

  9. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds

    PubMed Central

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new capabilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources changes dynamically over time, provisioning resources so that a given budget is used efficiently while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction of up to 70% in the overall task slowdown rate by the proposed algorithms. PMID:27501046
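
    The flavor of budget-constrained provisioning can be illustrated with a simple greedy heuristic: give each task the fastest VM type that still fits the remaining budget, falling back to the cheapest option otherwise. The VM catalogue, prices, and workload below are invented for the sketch, and the heuristic is far simpler than the algorithms proposed in the paper.

```python
# Greedy sketch of budget-constrained VM selection (illustrative numbers only).
from dataclasses import dataclass

@dataclass
class VmType:
    name: str
    cores: int
    price_per_hour: float   # hypothetical on-demand prices

@dataclass
class Task:
    core_hours: float       # work expressed in single-core hours

vm_types = [VmType("small", 2, 0.10), VmType("medium", 4, 0.22), VmType("large", 8, 0.48)]
tasks = [Task(6.0), Task(3.0), Task(12.0), Task(1.5), Task(8.0)]
budget = 1.82

spent = 0.0
plan = []
for task in sorted(tasks, key=lambda t: t.core_hours, reverse=True):
    chosen = None
    # Prefer the largest (fastest) VM type that keeps us within budget.
    for vm in sorted(vm_types, key=lambda v: v.cores, reverse=True):
        hours = task.core_hours / vm.cores
        cost = hours * vm.price_per_hour
        if spent + cost <= budget:
            chosen = (vm, hours, cost)
            break
    if chosen is None:
        # Fall back to the cheapest option for this task (may overshoot a tight budget).
        vm = min(vm_types, key=lambda v: (task.core_hours / v.cores) * v.price_per_hour)
        hours = task.core_hours / vm.cores
        chosen = (vm, hours, hours * vm.price_per_hour)
    vm, hours, cost = chosen
    spent += cost
    plan.append((task.core_hours, vm.name, round(hours, 2), round(cost, 2)))

for row in plan:
    print(row)
print("total cost:", round(spent, 2), "of budget", budget)
```

    The paper's algorithms additionally account for elasticity (acquiring and releasing instances over time) and explicitly minimize slowdown, which this static sketch does not attempt.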

  10. Influence of savanna fire on Australian monsoon season precipitation and circulation as simulated using a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Lynch, Amanda H.; Abramson, David; Görgen, Klaus; Beringer, Jason; Uotila, Petteri

    2007-10-01

    Fires in the Australian savanna have been hypothesized to affect monsoon evolution, but the hypothesis is controversial and the effects have not been quantified. A distributed computing approach allows the development of a challenging experimental design that permits simultaneous variation of all fire attributes. The climate model simulations are distributed around multiple independent computer clusters in six countries, an approach that has potential for a range of other large simulation applications in the earth sciences. The experiment clarifies that savanna burning can shape the monsoon through two mechanisms. Boundary-layer circulation and large-scale convergence is intensified monotonically through increasing fire intensity and area burned. However, thresholds of fire timing and area are evident in the consequent influence on monsoon rainfall. In the optimal band of late, high intensity fires with a somewhat limited extent, it is possible for the wet season to be significantly enhanced.

  11. NASA Advanced Supercomputing Facility Expansion

    NASA Technical Reports Server (NTRS)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  12. Predictive wind turbine simulation with an adaptive lattice Boltzmann method for moving boundaries

    NASA Astrophysics Data System (ADS)

    Deiterding, Ralf; Wood, Stephen L.

    2016-09-01

    Operating horizontal axis wind turbines create large-scale turbulent wake structures that affect the power output of downwind turbines considerably. The computational prediction of this phenomenon is challenging as efficient low dissipation schemes are necessary that represent the vorticity production by the moving structures accurately and that are able to transport wakes without significant artificial decay over distances of several rotor diameters. We have developed a parallel adaptive lattice Boltzmann method for large eddy simulation of turbulent weakly compressible flows with embedded moving structures that considers these requirements rather naturally and enables first principle simulations of wake-turbine interaction phenomena at reasonable computational costs. The paper describes the employed computational techniques and presents validation simulations for the Mexnext benchmark experiments as well as simulations of the wake propagation in the Scaled Wind Farm Technology (SWIFT) array consisting of three Vestas V27 turbines in triangular arrangement.
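
    For readers unfamiliar with the underlying method, a minimal single-node D2Q9 lattice Boltzmann solver (BGK collision, periodic streaming) is sketched below. It deliberately omits the adaptive refinement, embedded moving boundaries, and parallelism that the work above is actually about; it only shows the stream-and-collide core.

```python
# Minimal D2Q9 lattice Boltzmann sketch: periodic domain, BGK collision.
import numpy as np

c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])          # discrete velocities
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)                # lattice weights
nx, ny, tau = 128, 128, 0.6                                 # grid size, relaxation time

def equilibrium(rho, ux, uy):
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

# Initial condition: uniform density with a small sinusoidal shear that decays viscously.
X, Y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
rho = np.ones((nx, ny))
ux = 0.05 * np.sin(2 * np.pi * Y / ny)
uy = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for step in range(500):
    # Streaming: shift each population along its lattice velocity (periodic wrap).
    for i in range(9):
        f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
    # Macroscopic moments.
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision toward local equilibrium.
    f += (equilibrium(rho, ux, uy) - f) / tau
    if step % 100 == 0:
        print(step, "kinetic energy:", float(0.5 * (rho * (ux**2 + uy**2)).sum()))
```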

  13. Information Architecture for Quality Management Support in Hospitals.

    PubMed

    Rocha, Álvaro; Freixo, Jorge

    2015-10-01

    Quality Management occupies a strategic role in organizations, and the adoption of computer tools within an aligned information architecture facilitates the challenge of doing more with less, promoting the development of a competitive edge and sustainability. A formal Information Architecture (IA) gives organizations enhanced knowledge but, above all, favours management. This simplifies the reinvention of processes, the reformulation of procedures, bridging and cooperation amongst the multiple actors of an organization. In the present investigation, we planned the IA for the Quality Management System (QMS) of a Hospital, which allowed us to develop and implement QUALITUS, the computer application developed to support Quality Management in a Hospital Unit. This solution translated into significant gains for the Hospital Unit under study, accelerating the quality management process and reducing the tasks, the number of documents, the information to be filled in and information errors, amongst others.

  14. Near-realtime simulations of biolelectric activity in small mammalian hearts using graphical processing units

    PubMed Central

    Vigmond, Edward J.; Boyle, Patrick M.; Leon, L. Joshua; Plank, Gernot

    2014-01-01

    Simulations of cardiac bioelectric phenomena remain a significant challenge despite continual advancements in computational machinery. Spanning large temporal and spatial ranges demands millions of nodes to accurately depict geometry, and a comparable number of timesteps to capture dynamics. This study explores a new hardware computing paradigm, the graphics processing unit (GPU), to accelerate cardiac models, and analyzes results in the context of simulating a small mammalian heart in real time. The ODEs associated with membrane ionic flow were computed on traditional CPU and compared to GPU performance, for one to four parallel processing units. The scalability of solving the PDE responsible for tissue coupling was examined on a cluster using up to 128 cores. Results indicate that the GPU implementation was between 9 and 17 times faster than the CPU implementation and scaled similarly. Solving the PDE was still 160 times slower than real time. PMID:19964295
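
    The data-parallel structure that makes the membrane ODEs attractive for GPUs is visible even in a CPU sketch: one small ODE system per mesh node, all advanced in lockstep. The snippet below uses the FitzHugh-Nagumo model as a lightweight stand-in for the ionic models in the study; on a GPU each node simply maps to a thread.

```python
# One membrane ODE system per node, advanced in lockstep with forward Euler.
# FitzHugh-Nagumo stands in for the (much larger) ionic models used in practice.
import numpy as np

n_nodes = 200_000
rng = np.random.default_rng(0)
v = rng.uniform(-1.0, 1.0, n_nodes)   # membrane potential (dimensionless)
w = np.zeros(n_nodes)                 # recovery variable
a, b, eps, stim = 0.7, 0.8, 0.08, 0.5
dt, n_steps = 0.05, 1000

for _ in range(n_steps):
    dv = v - v**3 / 3.0 - w + stim
    dw = eps * (v + a - b * w)
    v += dt * dv
    w += dt * dw

print("mean potential after integration:", float(v.mean()))
```

    The PDE (tissue coupling) step reported as the remaining bottleneck is not shown; it couples neighboring nodes and therefore parallelizes less trivially than the per-node ODE update above.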

  15. COMSAC: Computational Methods for Stability and Control. Part 2

    NASA Technical Reports Server (NTRS)

    Fremaux, C. Michael (Compiler); Hall, Robert M. (Compiler)

    2004-01-01

    The unprecedented advances being made in computational fluid dynamic (CFD) technology have demonstrated the powerful capabilities of codes in applications to civil and military aircraft. Used in conjunction with wind-tunnel and flight investigations, many codes are now routinely used by designers in diverse applications such as aerodynamic performance predictions and propulsion integration. Typically, these codes are most reliable for attached, steady, and predominantly turbulent flows. As a result of increasing reliability and confidence in CFD, wind-tunnel testing for some new configurations has been substantially reduced in key areas, such as wing trade studies for mission performance guarantees. Interest is now growing in the application of computational methods to other critical design challenges. One of the most important disciplinary elements for civil and military aircraft is prediction of stability and control characteristics. CFD offers the potential for significantly increasing the basic understanding, prediction, and control of flow phenomena associated with requirements for satisfactory aircraft handling characteristics.

  16. Enabling large-scale viscoelastic calculations via neural network acceleration

    NASA Astrophysics Data System (ADS)

    Robinson DeVries, P.; Thompson, T. B.; Meade, B. J.

    2017-12-01

    One of the most significant challenges involved in efforts to understand the effects of repeated earthquake cycle activity are the computational costs of large-scale viscoelastic earthquake cycle models. Deep artificial neural networks (ANNs) can be used to discover new, compact, and accurate computational representations of viscoelastic physics. Once found, these efficient ANN representations may replace computationally intensive viscoelastic codes and accelerate large-scale viscoelastic calculations by more than 50,000%. This magnitude of acceleration enables the modeling of geometrically complex faults over thousands of earthquake cycles across wider ranges of model parameters and at larger spatial and temporal scales than have been previously possible. Perhaps most interestingly from a scientific perspective, ANN representations of viscoelastic physics may lead to basic advances in the understanding of the underlying model phenomenology. We demonstrate the potential of artificial neural networks to illuminate fundamental physical insights with specific examples.
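
    Independent of the viscoelastic specifics, the surrogate idea looks roughly like the sketch below: sample an expensive response function, fit a small neural network to the samples, and evaluate the network in place of the original code. The response function, network size, and sample counts here are placeholders, not the authors' setup.

```python
# Train a small MLP surrogate for a placeholder "expensive" response function.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_response(params):
    # Placeholder for a costly viscoelastic calculation (two inputs, one output).
    x, t = params
    return np.exp(-t / (1.0 + x)) * np.sin(3.0 * x)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(2000, 2))
y = np.array([expensive_response(p) for p in X])

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X[:1500], y[:1500])
print("held-out R^2:", surrogate.score(X[1500:], y[1500:]))

# Once trained, surrogate.predict is what would replace the expensive code inside
# large parameter sweeps or long earthquake-cycle integrations.
```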

  17. Computational approaches for rational design of proteins with novel functionalities

    PubMed Central

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes. PMID:24688643

  18. New Treatment of Strongly Anisotropic Scattering Phase Functions: The Delta-M+ Method

    NASA Astrophysics Data System (ADS)

    Stamnes, K. H.; Lin, Z.; Chen, N.; Fan, Y.; Li, W.; Stamnes, S.

    2017-12-01

    The treatment of strongly anisotropic scattering phase functions is still a challenge for accurate radiance computations. The new Delta-M+ method resolves this problem by introducing a reliable, fast, accurate, and easy-to-use Legendre expansion of the scattering phase function with modified moments. Delta-M+ is an upgrade of the widely-used Delta-M method that truncates the forward scattering cone into a Dirac-delta-function (a direct beam), where the + symbol indicates that it essentially matches moments above the first 2M terms. Compared with the original Delta-M method, Delta-M+ has the same computational efficiency, but the accuracy has been increased dramatically. Tests show that the errors for strongly forward-peaked scattering phase functions are greatly reduced. Furthermore, the accuracy and stability of radiance computations are also significantly improved by applying the new Delta-M+ method.
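
    For context, the classic delta-M scaling on which Delta-M+ builds replaces the forward peak by a delta function and rescales the remaining expansion moments, optical depth, and single-scattering albedo; the specific moment adjustments that distinguish Delta-M+ are described in the record above and in the paper itself.

```latex
% Classic delta-M truncation and scaling (the baseline that Delta-M+ refines).
p(\cos\Theta) \;\approx\; 2f\,\delta(1-\cos\Theta) + (1-f)\,p^{*}(\cos\Theta),
\qquad f = \chi_M,
\qquad \chi^{*}_{l} = \frac{\chi_l - f}{1 - f}\quad (l = 0,\dots,M-1),
\qquad \tau^{*} = (1-\omega f)\,\tau,
\qquad \omega^{*} = \frac{(1-f)\,\omega}{1-\omega f}.
```

    Here χ_l are the normalized Legendre moments of the phase function, f = χ_M is the truncation fraction, ω the single-scattering albedo, and τ the optical depth.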

  19. An Intelligent Model for Pairs Trading Using Genetic Algorithms.

    PubMed

    Huang, Chien-Feng; Hsu, Chi-Jen; Chen, Chi-Chung; Chang, Bao Rong; Li, Chen-An

    2015-01-01

    Pairs trading is an important and challenging research area in computational finance, in which pairs of stocks are bought and sold in pair combinations for arbitrage opportunities. Traditional methods that solve this set of problems mostly rely on statistical methods such as regression. In contrast to the statistical approaches, recent advances in computational intelligence (CI) are leading to promising opportunities for solving problems in the financial applications more effectively. In this paper, we present a novel methodology for pairs trading using genetic algorithms (GA). Our results showed that the GA-based models are able to significantly outperform the benchmark and our proposed method is capable of generating robust models to tackle the dynamic characteristics in the financial application studied. Based upon the promising results obtained, we expect this GA-based method to advance the research in computational intelligence for finance and provide an effective solution to pairs trading for investment in practice.

  20. An Intelligent Model for Pairs Trading Using Genetic Algorithms

    PubMed Central

    Hsu, Chi-Jen; Chen, Chi-Chung; Li, Chen-An

    2015-01-01

    Pairs trading is an important and challenging research area in computational finance, in which pairs of stocks are bought and sold in pair combinations for arbitrage opportunities. Traditional methods that solve this set of problems mostly rely on statistical methods such as regression. In contrast to the statistical approaches, recent advances in computational intelligence (CI) are leading to promising opportunities for solving problems in the financial applications more effectively. In this paper, we present a novel methodology for pairs trading using genetic algorithms (GA). Our results showed that the GA-based models are able to significantly outperform the benchmark and our proposed method is capable of generating robust models to tackle the dynamic characteristics in the financial application studied. Based upon the promising results obtained, we expect this GA-based method to advance the research in computational intelligence for finance and provide an effective solution to pairs trading for investment in practice. PMID:26339236
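
    A stripped-down version of the GA idea on synthetic data: evolve entry and exit z-score thresholds for a mean-reverting spread, scoring each candidate by the profit of the resulting threshold strategy. Everything below (the data generator, the genetic operators, and the parameter ranges) is an illustrative assumption rather than the authors' model.

```python
# Toy GA for pairs-trading thresholds on a synthetic mean-reverting spread.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic spread: an AR(1) process standing in for the price gap of a stock pair.
n = 2000
spread = np.zeros(n)
for t in range(1, n):
    spread[t] = 0.95 * spread[t - 1] + rng.normal(0.0, 1.0)
z = (spread - spread.mean()) / spread.std()

def fitness(genes):
    entry_z, exit_z = genes
    if exit_z >= entry_z:                       # require a sensible threshold ordering
        return -np.inf
    pnl, pos, entry_level = 0.0, 0, 0.0
    for t in range(n):
        if pos == 0 and abs(z[t]) > entry_z:
            pos = -np.sign(z[t])                # short a rich spread, long a cheap one
            entry_level = spread[t]
        elif pos != 0 and abs(z[t]) < exit_z:
            pnl += pos * (spread[t] - entry_level)
            pos = 0
    return pnl

# Plain generational GA over (entry, exit) thresholds.
pop = rng.uniform([0.5, 0.0], [3.0, 1.0], size=(40, 2))
for _ in range(30):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[::-1][:20]]            # truncation selection
    children = []
    while len(children) < 20:
        p1, p2 = parents[rng.integers(0, 20, 2)]
        child = 0.5 * (p1 + p2) + rng.normal(0.0, 0.1, 2)   # blend crossover + mutation
        children.append(np.clip(child, [0.1, 0.0], [4.0, 2.0]))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(g) for g in pop])]
print("best (entry, exit) z-thresholds:", np.round(best, 2))
```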

  1. PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.

    PubMed

    Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang

    2017-07-26

    Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extensions (SGX)-based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with a state-of-the-art homomorphic encryption scheme, as well as a plaintext implementation. The experimental results demonstrated a significant performance improvement over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE framework provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple cryptographic protocols.

  2. Simulations of thermodynamics and kinetics on rough energy landscapes with milestoning.

    PubMed

    Bello-Rivas, Juan M; Elber, Ron

    2016-03-05

    We investigated by computational means the kinetics and stationary behavior of stochastic dynamics on an ensemble of rough two-dimensional energy landscapes. There are no obvious separations of temporal scales in these systems, which constitute a simple model for the behavior of glasses and some biomaterials. Even though there are significant computational challenges present in these systems due to the large number of metastable states, the Milestoning method is able to compute their kinetic and thermodynamic properties exactly. We observe two clearly distinguished regimes in the overall kinetics: one in which diffusive behavior dominates and another that follows an Arrhenius law (despite the absence of a dominant barrier). We compare our results with those obtained with an exactly-solvable one-dimensional model, and with the results from the rough one-dimensional energy model introduced by Zwanzig.
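
    The milestoning calculations themselves are beyond a short snippet, but the effect studied in the record, roughness slowing down kinetics, can be illustrated with two textbook results: the Lifson-Jackson effective diffusion coefficient for a periodic rough potential and Zwanzig's estimate for Gaussian roughness. This is an assumed illustration of the underlying physics, not the paper's method.

    ```python
    import numpy as np
    from scipy.special import i0

    def d_eff_ratio(eps_over_kT):
        """Lifson-Jackson effective diffusion on a periodic rough potential U = eps*cos(q*x):
        D_eff / D0 = 1 / (<exp(U/kT)> <exp(-U/kT)>) = 1 / I_0(eps/kT)**2."""
        return 1.0 / i0(eps_over_kT) ** 2

    for r in (0.5, 1.0, 2.0, 4.0):
        lj = d_eff_ratio(r)
        zwanzig = np.exp(-r ** 2)     # Zwanzig's estimate for Gaussian-distributed roughness
        print(f"eps/kT = {r:>3}: D_eff/D0 = {lj:.4f}   (Gaussian-roughness estimate {zwanzig:.4f})")
    ```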

  3. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds.

    PubMed

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources dynamically changes over time, provisioning resources in a way that efficiently utilizes a given budget while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction, up to 70%, in the overall task slowdown rate by the proposed algorithms.
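
    As a hedged illustration of budget-constrained provisioning (not the algorithms proposed in the record), the sketch below greedily assigns each task the fastest hypothetical VM type whose estimated cost still fits the remaining budget; tasks that cannot be afforded are deferred. The VMType catalogue and its prices are made-up parameters.

    ```python
    from dataclasses import dataclass

    @dataclass
    class VMType:
        name: str
        price_per_hour: float
        speedup: float          # relative processing rate

    # Hypothetical catalogue of instance types (illustrative numbers only).
    CATALOGUE = [VMType("small", 0.05, 1.0), VMType("medium", 0.10, 1.9), VMType("large", 0.20, 3.5)]

    def provision(task_hours, budget):
        """Greedy sketch: pick the fastest VM type whose estimated cost fits the remaining budget."""
        plan, spent = [], 0.0
        for hours in sorted(task_hours, reverse=True):            # longest tasks first
            for vm in sorted(CATALOGUE, key=lambda v: -v.speedup):
                cost = vm.price_per_hour * hours / vm.speedup
                if spent + cost <= budget:
                    plan.append((hours, vm.name, cost))
                    spent += cost
                    break
            else:
                plan.append((hours, "deferred", 0.0))             # no affordable VM: task is delayed
        return plan, spent

    plan, spent = provision([10, 4, 7, 2], budget=1.2)
    for hours, vm, cost in plan:
        print(f"{hours:>4}h task -> {vm:<8} (${cost:.2f})")
    print(f"total spend: ${spent:.2f} of $1.20")
    ```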

  4. An efficient technique for the numerical solution of the bidomain equations.

    PubMed

    Whiteley, Jonathan P

    2008-08-01

    Computing the numerical solution of the bidomain equations is widely accepted to be a significant computational challenge. In this study we extend a previously published semi-implicit numerical scheme with good stability properties that has been used to solve the bidomain equations (Whiteley, J.P. IEEE Trans. Biomed. Eng. 53:2139-2147, 2006). A new, efficient numerical scheme is developed which utilizes the observation that the only component of the ionic current that must be calculated on a fine spatial mesh and updated frequently is the fast sodium current. Other components of the ionic current may be calculated on a coarser mesh and updated less frequently, and then interpolated onto the finer mesh. Use of this technique to calculate the transmembrane potential and extracellular potential induces very little error in the solution. For the simulations presented in this study an increase in computational efficiency of over two orders of magnitude over standard numerical techniques is obtained.
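
    The key idea in the record, updating slow ionic currents less frequently than the fast sodium current, can be caricatured on a single cell. The sketch below uses a FitzHugh-Nagumo-like toy model in which the slow recovery right-hand side is recomputed only every tenth step; it is a multirate illustration under assumed parameters, not the bidomain scheme or the coarse-to-fine mesh interpolation described above.

    ```python
    # Toy two-current membrane model: a "fast" current updated every fine step and a
    # "slow" recovery current whose right-hand side is recomputed only every SLOW_EVERY steps.
    dt, steps, SLOW_EVERY = 0.01, 5000, 10
    v, w = -1.0, 0.0          # FitzHugh-Nagumo-like state: voltage v, recovery w
    slow_rhs = 0.0

    for n in range(steps):
        if n % SLOW_EVERY == 0:                    # slow current: recompute infrequently
            slow_rhs = 0.08 * (v + 0.7 - 0.8 * w)
        w += dt * slow_rhs
        fast = v - v**3 / 3 - w + 0.5              # fast current: recomputed every step
        v += dt * fast

    print(f"final state: v={v:.3f}, w={w:.3f}")
    ```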

  5. Accelerating Climate and Weather Simulations through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  6. Manycore Performance-Portability: Kokkos Multidimensional Array Library

    DOE PAGES

    Edwards, H. Carter; Sunderland, Daniel; Porter, Vicki; ...

    2012-01-01

    Large, complex scientific and engineering application codes have a significant investment in computational kernels that implement their mathematical models. Porting these computational kernels to the collection of modern manycore accelerator devices is a major challenge, in that these devices have diverse programming models, application programming interfaces (APIs), and performance requirements. The Kokkos Array programming model provides a library-based approach to implement computational kernels that are performance-portable to CPU-multicore and GPGPU accelerator devices. This programming model is based upon three fundamental concepts: (1) manycore compute devices, each with its own memory space, (2) data-parallel kernels, and (3) multidimensional arrays. Kernel execution performance is, especially for NVIDIA® devices, extremely dependent on data access patterns. The optimal data access pattern can be different for different manycore devices, potentially leading to different implementations of computational kernels specialized for different devices. The Kokkos Array programming model supports performance-portable kernels by (1) separating data access patterns from computational kernels through a multidimensional array API and (2) introducing device-specific data access mappings when a kernel is compiled. An implementation of Kokkos Array is available through Trilinos [Trilinos website, http://trilinos.sandia.gov/, August 2011].
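
    Kokkos itself is a C++ library, but the record's central point, that kernel performance hinges on matching the data access pattern to the memory layout, can be demonstrated even in Python: the same column-wise kernel typically runs noticeably faster on a column-major (Fortran-ordered) array than on a row-major one. This is an analogy only, not Kokkos code.

    ```python
    import numpy as np, time

    n = 3000
    a_row = np.random.rand(n, n)            # row-major (C) memory layout
    a_col = np.asfortranarray(a_row)        # same values, column-major (Fortran) layout

    def column_sums(a):
        """Traverse the matrix column by column, as a column-oriented kernel would."""
        return np.array([a[:, j].sum() for j in range(a.shape[1])])

    for name, a in (("C layout", a_row), ("Fortran layout", a_col)):
        t0 = time.perf_counter()
        column_sums(a)
        print(f"{name:>15}: {time.perf_counter() - t0:.3f} s")
    ```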

  7. Computational Investigations for Undergraduate Organic Chemistry: Predicting the Mechanism of the Ritter Reaction

    NASA Astrophysics Data System (ADS)

    Hessley, Rita K.

    2000-02-01

    In an effort to engage students more deeply in their laboratory work and provide them with valuable learning experiences in the applications and limitations of computational chemistry as a research tool, students are instructed to carry out a computational pre-lab exercise. Before carrying out a laboratory experiment that investigates the mechanism for the formation of N-t-butylbenzamide, students construct and obtain heats of formation for reactants, products, postulated reaction intermediates, and one transition state structure for each proposed mechanism. This is designed as a companion to an open-ended laboratory experiment that hones skills learned early in most traditional organic chemistry courses. The incorporation of a preliminary computational exercise enables students to move beyond guessing what the outcome of the reaction will be. It challenges them to test what they believe they "know" about such fundamental concepts as stability of carbocations, or the significance and utility of thermodynamic data relative to kinetic data. On the basis of their computations and their own experimental data, students then verify or dispute their hypothesis, finally arriving at a defensible and logical conclusion about the course of the reaction mechanism. The manner of implementation of the exercise and typical computational data are described.

  8. Computing in high-energy physics

    DOE PAGES

    Mount, Richard P.

    2016-05-31

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  9. Computing in high-energy physics

    NASA Astrophysics Data System (ADS)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  10. Computing in high-energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mount, Richard P.

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  11. PEPIS: A Pipeline for Estimating Epistatic Effects in Quantitative Trait Locus Mapping and Genome-Wide Association Studies.

    PubMed

    Zhang, Wenchao; Dai, Xinbin; Wang, Qishan; Xu, Shizhong; Zhao, Patrick X

    2016-05-01

    The term epistasis refers to interactions between multiple genetic loci. Genetic epistasis is important in regulating biological function and is considered to explain part of the 'missing heritability,' which involves marginal genetic effects that cannot be accounted for in genome-wide association studies. Thus, the study of epistasis is of great interest to geneticists. However, estimating epistatic effects for quantitative traits is challenging due to the large number of interaction effects that must be estimated, thus significantly increasing computing demands. Here, we present a new web server-based tool, the Pipeline for estimating EPIStatic genetic effects (PEPIS), for analyzing polygenic epistatic effects. The PEPIS software package is based on a new linear mixed model that has been used to predict the performance of hybrid rice. The PEPIS includes two main sub-pipelines: the first for kinship matrix calculation, and the second for polygenic component analyses and genome scanning for main and epistatic effects. To accommodate the demand for high-performance computation, the PEPIS utilizes C/C++ for mathematical matrix computing. In addition, the modules for kinship matrix calculations and main and epistatic-effect genome scanning employ parallel computing technology that effectively utilizes multiple computer nodes across our networked cluster, thus significantly improving the computational speed. For example, when analyzing the same immortalized F2 rice population genotypic data examined in a previous study, the PEPIS returned identical results at each analysis step to those of the original prototype R code, but the computational time was reduced from more than one month to about five minutes. These advances will help overcome the bottleneck frequently encountered in genome-wide epistatic genetic effect analysis and enable accommodation of the high computational demand. The PEPIS is publicly available at http://bioinfo.noble.org/PolyGenic_QTL/.
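
    PEPIS's first sub-pipeline computes kinship matrices. As a hedged sketch of that step only, the code below forms a simple additive genomic relationship matrix from 0/1/2 genotype codes using a centered cross-product; the exact kinship definitions used by PEPIS (including its epistatic kinship matrices) may differ.

    ```python
    import numpy as np

    def kinship(genotypes):
        """Simple additive genomic relationship matrix from 0/1/2 genotype codes.

        genotypes : (n_individuals, n_markers) array of minor-allele counts.
        Uses a centered cross-product normalized by the number of markers, which is
        one common convention; it is not necessarily PEPIS's exact definition.
        """
        Z = genotypes - genotypes.mean(axis=0)      # center each marker
        return Z @ Z.T / genotypes.shape[1]

    rng = np.random.default_rng(42)
    G = rng.integers(0, 3, size=(6, 500))           # toy F2-like genotype matrix
    K = kinship(G)
    print(np.round(K, 2))
    ```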

  12. The SGI/CRAY T3E: Experiences and Insights

    NASA Technical Reports Server (NTRS)

    Bernard, Lisa Hamet

    1999-01-01

    The focus of the HPCC Earth and Space Sciences (ESS) Project is capability computing - pushing highly scalable computing testbeds to their performance limits. The drivers of this focus are the Grand Challenge problems in Earth and space science: those that could not be addressed in a capacity computing environment where large jobs must continually compete for resources. These Grand Challenge codes require a high degree of communication, large memory, and very large I/O (throughout the duration of the processing, not just in loading initial conditions and saving final results). This set of parameters led to the selection of an SGI/Cray T3E as the current ESS Computing Testbed. The T3E at the Goddard Space Flight Center is a unique computational resource within NASA. As such, it must be managed to effectively support the diverse research efforts across the NASA research community yet still enable the ESS Grand Challenge Investigator teams to achieve their performance milestones, for which the system was intended. To date, all Grand Challenge Investigator teams have achieved the 10 GFLOPS milestone, eight of nine have achieved the 50 GFLOPS milestone, and three have achieved the 100 GFLOPS milestone. In addition, many technical papers have been published highlighting results achieved on the NASA T3E, including some at this Workshop. The successes enabled by the NASA T3E computing environment are best illustrated by the 512 PE upgrade funded by the NASA Earth Science Enterprise earlier this year. Never before has an HPCC computing testbed been so well received by the general NASA science community that it was deemed critical to the success of a core NASA science effort. NASA looks forward to many more success stories before the conclusion of the NASA-SGI/Cray cooperative agreement in June 1999.

  13. Cloud Based Educational Systems and Its Challenges and Opportunities and Issues

    ERIC Educational Resources Information Center

    Paul, Prantosh Kr.; Lata Dangwal, Kiran

    2014-01-01

    Cloud Computing (CC) is a set of hardware, software, networks, storage, services, and interfaces combined to deliver aspects of computing as a service. Cloud Computing (CC) uses central remote servers to maintain data and applications. In practice, Cloud Computing (CC) is an extension of Grid computing with independency and…

  14. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    ERIC Educational Resources Information Center

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  15. Introduction to computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    Computational aeroacoustics (CAA) is introduced, by presenting its definition, advantages, applications, and initial challenges. The effects of Mach number and Reynolds number on CAA are considered. The CAA method combines the methods of aeroacoustics and computational fluid dynamics.

  16. Systems Toxicology: Real World Applications and Opportunities.

    PubMed

    Hartung, Thomas; FitzGerald, Rex E; Jennings, Paul; Mirams, Gary R; Peitsch, Manuel C; Rostami-Hodjegan, Amin; Shah, Imran; Wilks, Martin F; Sturla, Shana J

    2017-04-17

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams ("big data"), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity.

  17. Systems Toxicology: Real World Applications and Opportunities

    PubMed Central

    2017-01-01

    Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized from empirical end points to describing modes of action as adverse outcome pathways and perturbed networks. Toward this aim, Systems Toxicology entails the integration of in vitro and in vivo toxicity data with computational modeling. This evolving approach depends critically on data reliability and relevance, which in turn depends on the quality of experimental models and bioanalysis techniques used to generate toxicological data. Systems Toxicology involves the use of large-scale data streams (“big data”), such as those derived from omics measurements that require computational means for obtaining informative results. Thus, integrative analysis of multiple molecular measurements, particularly acquired by omics strategies, is a key approach in Systems Toxicology. In recent years, there have been significant advances centered on in vitro test systems and bioanalytical strategies, yet a frontier challenge concerns linking observed network perturbations to phenotypes, which will require understanding pathways and networks that give rise to adverse responses. This summary perspective from a 2016 Systems Toxicology meeting, an international conference held in the Alps of Switzerland, describes the limitations and opportunities of selected emerging applications in this rapidly advancing field. Systems Toxicology aims to change the basis of how adverse biological effects of xenobiotics are characterized, from empirical end points to pathways of toxicity. This requires the integration of in vitro and in vivo data with computational modeling. Test systems and bioanalytical technologies have made significant advances, but ensuring data reliability and relevance is an ongoing concern. The major challenge facing the new pathway approach is determining how to link observed network perturbations to phenotypic toxicity. PMID:28362102

  18. Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping

    2015-09-15

    Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
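
    The record's optimization-transfer algorithm and voxel-pair prior are not reproduced here; instead, the sketch below shows the generic shape of statistical IR on a toy linear system: iterative gradient updates of a penalized least-squares objective for y = Ax + noise. The system matrix, penalty weight, and step size are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy linear tomography problem: y = A x + noise, reconstruct x iteratively.
    n_pix, n_rays = 64, 96
    A = rng.random((n_rays, n_pix))                 # stand-in system (projection) matrix
    x_true = rng.random(n_pix)
    y = A @ x_true + rng.normal(scale=0.01, size=n_rays)

    # Gradient-descent iterations on 0.5*||Ax - y||^2 + 0.5*lam*||x||^2, a generic
    # penalized least-squares IR update (not the optimization-transfer scheme in the record).
    lam, step = 0.1, 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n_pix)
    for _ in range(200):
        grad = A.T @ (A @ x - y) + lam * x
        x -= step * grad

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```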

  19. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  20. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    PubMed

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  1. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address computational challenges of both types of designs, and the interaction with the circulatory system with three representative case studies. In particular, we focus on recent advancements in finite element methodology that has increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  2. Coping with Computing Success.

    ERIC Educational Resources Information Center

    Breslin, Richard D.

    Elements of computing success of Iona College, the challenges it currently faces, and the strategies conceived to cope with future computing needs are discussed. The college has mandated computer literacy for students and offers nine degrees in the computerized information system/management information system areas. Since planning is needed in…

  3. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  4. CombiMotif: A new algorithm for network motifs discovery in protein-protein interaction networks

    NASA Astrophysics Data System (ADS)

    Luo, Jiawei; Li, Guanghui; Song, Dan; Liang, Cheng

    2014-12-01

    Discovering motifs in protein-protein interaction networks is becoming a major challenge in computational biology, since the distribution of the number of network motifs can reveal significant systemic differences among species. However, this task can be computationally expensive because it involves graph isomorphism detection. In this paper, we present a new algorithm (CombiMotif) that incorporates combinatorial techniques to count non-induced occurrences of subgraph topologies in the form of trees. The efficiency of our algorithm is demonstrated by comparing the obtained results with current state-of-the-art subgraph counting algorithms. We also show major differences between unicellular and multicellular organisms. The datasets and source code of CombiMotif are freely available upon request.
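
    For tree-shaped motifs, non-induced occurrences can often be counted combinatorially without any isomorphism testing. The sketch below counts 2-edge paths (and 3-edge stars) in a toy network directly from node degrees; it illustrates the flavor of such combinatorial counting, not CombiMotif's actual algorithm, and the edge list is an invented example.

    ```python
    from collections import defaultdict
    from math import comb

    # Toy undirected PPI-like network as an edge list.
    edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("C", "E")]

    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1

    # Non-induced 3-node paths (a 2-edge tree): every pair of edges sharing a center node.
    paths_len2 = sum(comb(d, 2) for d in degree.values())

    # Non-induced 4-node stars (a 3-edge tree centered at one node) use comb(d, 3).
    stars_4 = sum(comb(d, 3) for d in degree.values())

    print("non-induced 2-edge paths:", paths_len2)
    print("non-induced 3-edge stars:", stars_4)
    ```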

  5. In Pursuit of Improving Airburst and Ground Damage Predictions: Recent Advances in Multi-Body Aerodynamic Testing and Computational Tools Validation

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian

    2017-01-01

    An airburst from a large asteroid during entry can cause significant ground damage. The damage depends on the energy and the altitude of airburst. Breakup of asteroids into fragments and their lateral spread have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds and the spread of fragments is needed for a true predictive capability. Current models use heuristic arguments and assumptions such as pancaking or point source explosive energy release at pre-determined altitude or an assumed fragmentation spread rate to predict airburst damage. A multi-year collaboration between German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address the above challenge.

  6. An extended set of yeast-based functional assays accurately identifies human disease mutations

    PubMed Central

    Sun, Song; Yang, Fan; Tan, Guihong; Costanzo, Michael; Oughtred, Rose; Hirschman, Jodi; Theesfeld, Chandra L.; Bansal, Pritpal; Sahni, Nidhi; Yi, Song; Yu, Analyn; Tyagi, Tanya; Tie, Cathy; Hill, David E.; Vidal, Marc; Andrews, Brenda J.; Boone, Charles; Dolinski, Kara; Roth, Frederick P.

    2016-01-01

    We can now routinely identify coding variants within individual human genomes. A pressing challenge is to determine which variants disrupt the function of disease-associated genes. Both experimental and computational methods exist to predict pathogenicity of human genetic variation. However, a systematic performance comparison between them has been lacking. Therefore, we developed and exploited a panel of 26 yeast-based functional complementation assays to measure the impact of 179 variants (101 disease- and 78 non-disease-associated variants) from 22 human disease genes. Using the resulting reference standard, we show that experimental functional assays in a 1-billion-year diverged model organism can identify pathogenic alleles with significantly higher precision and specificity than current computational methods. PMID:26975778

  7. Enhanced Constraints for Accurate Lower Bounds on Many-Electron Quantum Energies from Variational Two-Electron Reduced Density Matrix Theory.

    PubMed

    Mazziotti, David A

    2016-10-07

    A central challenge of physics is the computation of strongly correlated quantum systems. The past ten years have witnessed the development and application of the variational calculation of the two-electron reduced density matrix (2-RDM) without the wave function. In this Letter we present an orders-of-magnitude improvement in the accuracy of 2-RDM calculations without an increase in their computational cost. The advance is based on a low-rank, dual formulation of an important constraint on the 2-RDM, the T2 condition. Calculations are presented for metallic chains and a cadmium-selenide dimer. The low-scaling T2 condition will have significant applications in atomic and molecular, condensed-matter, and nuclear physics.

  8. Enhanced Constraints for Accurate Lower Bounds on Many-Electron Quantum Energies from Variational Two-Electron Reduced Density Matrix Theory

    NASA Astrophysics Data System (ADS)

    Mazziotti, David A.

    2016-10-01

    A central challenge of physics is the computation of strongly correlated quantum systems. The past ten years have witnessed the development and application of the variational calculation of the two-electron reduced density matrix (2-RDM) without the wave function. In this Letter we present an orders-of-magnitude improvement in the accuracy of 2-RDM calculations without an increase in their computational cost. The advance is based on a low-rank, dual formulation of an important constraint on the 2-RDM, the T2 condition. Calculations are presented for metallic chains and a cadmium-selenide dimer. The low-scaling T2 condition will have significant applications in atomic and molecular, condensed-matter, and nuclear physics.

  9. Asynchronous sampled-data approach for event-triggered systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between control and event-condition design, optimisation of the available resources such as control, communication and computation power, and time delays due to computation and the communication network. With this motivation, the paper presents separate designs of the control and event-triggering mechanisms, thus simplifying the overall analysis; an asynchronous linear quadratic Gaussian controller that tackles delays and the aperiodic nature of transmissions; and a novel event mechanism that compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control, and the results show significant improvement against the periodic counterpart.
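
    The record's asynchronous LQG design is not reproduced here, but the basic event-triggering idea, transmitting the state only when it has drifted sufficiently from the last transmitted value, can be sketched on a scalar plant. All gains and thresholds below are assumed illustrative values, and the trigger is a simple relative-error condition rather than the paper's cost comparison.

    ```python
    import numpy as np

    # Event-triggered state feedback sketch: transmit (and update the control) only
    # when the state has drifted far enough from the last transmitted value.
    A, B, K = 1.02, 0.5, 0.8          # scalar plant x+ = A x + B u with feedback gain K
    sigma = 0.05                       # relative-error trigger threshold

    rng = np.random.default_rng(7)
    x, x_last_sent, transmissions = 1.0, 1.0, 0
    for t in range(200):
        if abs(x - x_last_sent) > sigma * abs(x):     # event condition
            x_last_sent = x
            transmissions += 1
        u = -K * x_last_sent                          # controller uses the last transmitted state
        x = A * x + B * u + rng.normal(scale=0.01)    # plant update with small process noise

    print(f"transmissions used: {transmissions}/200, final |x| = {abs(x):.3f}")
    ```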

  10. Personal Health Records: A Systematic Literature Review

    PubMed Central

    2017-01-01

    Background Information and communication technology (ICT) has transformed the health care field worldwide. One of the main drivers of this change is the electronic health record (EHR). However, there are still open issues and challenges because the EHR usually reflects the partial view of a health care provider without the ability for patients to control or interact with their data. Furthermore, with the growth of mobile and ubiquitous computing, the number of records regarding personal health is increasing exponentially. This movement has been characterized as the Internet of Things (IoT), including the widespread development of wearable computing technology and assorted types of health-related sensors. This leads to the need for an integrated method of storing health-related data, defined as the personal health record (PHR), which could be used by health care providers and patients. This approach could combine EHRs with data gathered from sensors or other wearable computing devices. This unified view of patients’ health could be shared with providers, who may not only use previous health-related records but also expand them with data resulting from their interactions. Another PHR advantage is that patients can interact with their health data, making decisions that may positively affect their health. Objective This work aimed to explore the recent literature related to PHRs by defining the taxonomy and identifying challenges and open questions. In addition, this study specifically sought to identify data types, standards, profiles, goals, methods, functions, and architecture with regard to PHRs. Methods The method to achieve these objectives consists of using the systematic literature review approach, which is guided by research questions using the population, intervention, comparison, outcome, and context (PICOC) criteria. Results As a result, we reviewed more than 5000 scientific studies published in the last 10 years, selected the most significant approaches, and thoroughly surveyed the health care field related to PHRs. We developed an updated taxonomy and identified challenges, open questions, and current data types, related standards, main profiles, input strategies, goals, functions, and architectures of the PHR. Conclusions All of these results contribute to the achievement of a significant degree of coverage regarding the technology related to PHRs. PMID:28062391

  11. The Role of Computer-Assisted Technology in Post-Traumatic Orbital Reconstruction: A PRISMA-driven Systematic Review.

    PubMed

    Wan, Kelvin H; Chong, Kelvin K L; Young, Alvin L

    2015-12-08

    Post-traumatic orbital reconstruction remains a surgical challenge and requires careful preoperative planning, sound anatomical knowledge and good intraoperative judgment. Computer-assisted technology has the potential to reduce error and subjectivity in the management of these complex injuries. A systematic review of the literature was conducted to explore the emerging role of computer-assisted technologies in post-traumatic orbital reconstruction, in terms of functional and safety outcomes. We searched for articles comparing computer-assisted procedures with conventional surgery and studied outcomes on diplopia, enophthalmos, or procedure-related complications. Six observational studies with 273 orbits at a mean follow-up of 13 months were included. Three out of 4 studies reported significantly fewer patients with residual diplopia in the computer-assisted group, while only 1 of the 5 studies reported better improvement in enophthalmos in the assisted group. Types and incidence of complications were comparable. Study heterogeneities limiting statistical comparison by meta-analysis will be discussed. This review highlights the scarcity of data on computer-assisted technology in orbital reconstruction. The result suggests that computer-assisted technology may offer potential advantage in treating diplopia while its role remains to be confirmed in enophthalmos. Additional well-designed and powered randomized controlled trials are much needed.

  12. Advanced Computational Methods in Bio-Mechanics.

    PubMed

    Al Qahtani, Waleed M S; El-Anwar, Mohamed I

    2018-04-15

    A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with fewer traumas, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have a similar impact on surgery to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research efforts funded by billions of dollars are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.

  13. Computational chemistry for NH3 synthesis, hydrotreating, and NOx reduction: Three topics of special interest to Haldor Topsøe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elnabawy, Ahmed O.; Rangarajan, Srinivas; Mavrikakis, Manos

    Computational chemistry, especially density functional theory, has experienced remarkable growth in terms of application over the last few decades. This is attributed to the improvements in theory and computing infrastructure that enable the analysis of systems of unprecedented size and detail at an affordable computational expense. In this perspective, we discuss recent progress and current challenges facing electronic structure theory in the context of heterogeneous catalysis. We specifically focus on the impact of computational chemistry in elucidating and designing catalytic systems in three topics of interest to Haldor Topsøe – ammonia synthesis, hydrotreating, and NOx reduction. Furthermore, we discuss the common tools and concepts in computational catalysis that underlie these topics and provide a perspective on the challenges and future directions of research in this area of catalysis.

  14. Computational chemistry for NH3 synthesis, hydrotreating, and NOx reduction: Three topics of special interest to Haldor Topsøe

    DOE PAGES

    Elnabawy, Ahmed O.; Rangarajan, Srinivas; Mavrikakis, Manos

    2015-06-05

    Computational chemistry, especially density functional theory, has experienced remarkable growth in terms of application over the last few decades. This is attributed to the improvements in theory and computing infrastructure that enable the analysis of systems of unprecedented size and detail at an affordable computational expense. In this perspective, we discuss recent progress and current challenges facing electronic structure theory in the context of heterogeneous catalysis. We specifically focus on the impact of computational chemistry in elucidating and designing catalytic systems in three topics of interest to Haldor Topsøe – ammonia synthesis, hydrotreating, and NOx reduction. Furthermore, we discuss the common tools and concepts in computational catalysis that underlie these topics and provide a perspective on the challenges and future directions of research in this area of catalysis.

  15. Bioinspired Tuning of Hydrogel Permeability-Rigidity Dependency for 3D Cell Culture

    NASA Astrophysics Data System (ADS)

    Lee, Min Kyung; Rich, Max H.; Baek, Kwanghyun; Lee, Jonghwi; Kong, Hyunjoon

    2015-03-01

    Hydrogels are being extensively used for three-dimensional immobilization and culture of cells in fundamental biological studies, biochemical processes, and clinical treatments. However, it is still a challenge to support viability and regulate phenotypic activities of cells in a structurally stable gel, because the gel becomes less permeable with increasing rigidity. To resolve this challenge, this study demonstrates a unique method to enhance the permeability of a cell-laden hydrogel while avoiding a significant change in rigidity of the gel. Inspired by the grooved skin textures of marine organisms, a hydrogel is assembled to present computationally optimized micro-sized grooves on the surface. Separately, a gel is engineered to preset aligned microchannels similar to a plant's vascular bundles through a uniaxial freeze-drying process. The resulting gel displays significantly increased water diffusivity with reduced changes of gel stiffness, exclusively when the microgrooves and microchannels are aligned together. No significant enhancement of rehydration is achieved when the microgrooves and microchannels are not aligned. Such material design greatly enhances viability and neural differentiation of stem cells and 3D neural network formation within the gel.

  16. SAIL: Summation-bAsed Incremental Learning for Information-Theoretic Text Clustering.

    PubMed

    Cao, Jie; Wu, Zhiang; Wu, Junjie; Xiong, Hui

    2013-04-01

    Information-theoretic clustering aims to exploit information-theoretic measures as the clustering criteria. A common practice on this topic is the so-called Info-Kmeans, which performs K-means clustering with KL-divergence as the proximity function. While expert efforts on Info-Kmeans have shown promising results, a remaining challenge is to deal with high-dimensional sparse data such as text corpora. Indeed, it is possible that the centroids contain many zero-value features for high-dimensional text vectors, which leads to infinite KL-divergence values and creates a dilemma in assigning objects to centroids during the iteration process of Info-Kmeans. To meet this challenge, in this paper, we propose a Summation-bAsed Incremental Learning (SAIL) algorithm for Info-Kmeans clustering. Specifically, by using an equivalent objective function, SAIL replaces the computation of KL-divergence by the incremental computation of Shannon entropy. This can avoid the zero-feature dilemma caused by the use of KL-divergence. To improve the clustering quality, we further introduce the variable neighborhood search scheme and propose the V-SAIL algorithm, which is then accelerated by a multithreaded scheme in PV-SAIL. Our experimental results on various real-world text collections have shown that, with SAIL as a booster, the clustering performance of Info-Kmeans can be significantly improved. Also, V-SAIL and PV-SAIL indeed help improve the clustering quality at a lower cost of computation.
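
    One standard identity behind entropy-based reformulations: for probability vectors assigned to their arithmetic-mean centroids, the total KL divergence equals a constant plus the sum over clusters of cluster size times centroid entropy, so minimizing the latter never has to evaluate KL against a centroid with zero-valued features. The sketch below computes that summation-based objective; it is offered as being in the spirit of SAIL's reformulation, though the paper's exact derivation and incremental updates may differ.

    ```python
    import numpy as np

    def cluster_entropy_objective(X, labels):
        """Sum over clusters of |C_k| * H(centroid_k). Up to an additive constant this
        equals the total KL divergence of members to their mean centroids, but it
        never produces the infinite values that KL gives for zero centroid features."""
        total = 0.0
        for k in np.unique(labels):
            members = X[labels == k]
            c = members.mean(axis=0)
            nz = c[c > 0]                                   # 0*log(0) is treated as 0
            total += len(members) * -(nz * np.log(nz)).sum()
        return total

    # Toy term-frequency vectors normalized to probability distributions.
    X = np.array([[3, 0, 1, 0], [2, 0, 2, 0], [0, 4, 0, 1], [0, 3, 0, 2]], dtype=float)
    X /= X.sum(axis=1, keepdims=True)
    print(cluster_entropy_objective(X, np.array([0, 0, 1, 1])))   # tight clustering: lower objective
    print(cluster_entropy_objective(X, np.array([0, 1, 0, 1])))   # mixed clustering: higher objective
    ```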

  17. Prospective Architectures for Onboard vs Cloud-Based Decision Making for Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Teubert, Christopher

    2017-01-01

    This paper investigates prospective architectures for decision-making in unmanned aerial systems. When these unmanned vehicles operate in urban environments, there are several sources of uncertainty that affect their behavior, and decision-making algorithms need to be robust to account for these different sources of uncertainty. It is important to account for the several risk factors that affect the flight of these unmanned systems and to facilitate decision-making that takes these risk factors into consideration. In addition, there are several technical challenges related to autonomous flight of unmanned aerial systems; these challenges include sensing, obstacle detection, path planning and navigation, trajectory generation and selection, etc. Many of these activities require significant computational power, and in many situations all of these activities need to be performed in real-time. In order to efficiently integrate these activities, it is important to develop a systematic architecture that can facilitate real-time decision-making. Four prospective architectures are discussed in this paper; on one end of the spectrum, the first architecture considers all activities/computations being performed onboard the vehicle, whereas on the other end of the spectrum, the fourth and final architecture considers all activities/computations being performed in the cloud, using a new service known as Prognostics as a Service that is being developed at NASA Ames Research Center. The four different architectures are compared, their advantages and disadvantages are explained, and conclusions are presented.

  18. 2014 Runtime Systems Summit. Runtime Systems Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarkar, Vivek; Budimlic, Zoran; Kulkani, Milind

    2016-09-19

    This report summarizes runtime system challenges for exascale computing that follow from the fundamental challenges for exascale systems that have been well studied in past reports, e.g., [6, 33, 34, 32, 24]. Some of the key exascale challenges that pertain to runtime systems include parallelism, energy efficiency, memory hierarchies, data movement, heterogeneous processors and memories, resilience, performance variability, dynamic resource allocation, performance portability, and interoperability with legacy code. In addition to summarizing these challenges, the report also outlines different approaches to addressing these significant challenges that have been pursued by research projects in the DOE-sponsored X-Stack and OS/R programs. Since there is often confusion as to what exactly the term “runtime system” refers to in the software stack, we include a section on taxonomy to clarify the terminology used by participants in these research projects. In addition, we include a section on deployment opportunities for vendors and government labs to build on the research results from these projects. Finally, this report is also intended to provide a framework for discussing future research and development investments for exascale runtime systems, and for clarifying the role of runtime systems in exascale software.

  19. Global Software Development with Cloud Platforms

    NASA Astrophysics Data System (ADS)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms - a compute cloud, a storage cloud, and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers, and other key stakeholders.

  20. Designing a Versatile Dedicated Computing Lab to Support Computer Network Courses: Insights from a Case Study

    ERIC Educational Resources Information Center

    Gercek, Gokhan; Saleem, Naveed

    2006-01-01

    Providing adequate computing lab support for Management Information Systems (MIS) and Computer Science (CS) programs is a perennial challenge for most academic institutions in the US and abroad. Factors, such as lack of physical space, budgetary constraints, conflicting needs of different courses, and rapid obsolescence of computing technology,…

  1. The main challenges that remain in applying high-throughput sequencing to clinical diagnostics.

    PubMed

    Loeffelholz, Michael; Fofanov, Yuriy

    2015-01-01

    Over the last 10 years, the quality, price and availability of high-throughput sequencing instruments have improved to the point that this technology may be close to becoming a routine tool in the diagnostic microbiology laboratory. Two groups of challenges, however, have to be resolved in order to move this powerful research technology into routine use in the clinical microbiology laboratory. The computational/bioinformatics challenges include data storage cost and privacy concerns, requiring analysis to be performed without access to cloud storage or expensive computational infrastructure. The logistical challenges include interpretation of complex results and acceptance and understanding of the advantages and limitations of this technology by the medical community. This article focuses on the approaches to address these challenges, such as file formats, algorithms, data collection, reporting and good laboratory practices.

  2. Experience with a UNIX based batch computing facility for H1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhards, R.; Kruener-Marquis, U.; Szkutnik, Z.

    1994-12-31

    A UNIX based batch computing facility for the H1 experiment at DESY is described. The ultimate goal is to replace the DESY IBM mainframe by a multiprocessor SGI Challenge series computer, using the UNIX operating system, for most of the computing tasks in H1.

  3. 48 CFR 227.7203-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227..., reproduce, release, or disclose computer software or computer software documentation do not, by themselves, determine the extent of the Government's rights in such software or documentation. The Government may...

  4. 48 CFR 227.7203-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227..., reproduce, release, or disclose computer software or computer software documentation do not, by themselves, determine the extent of the Government's rights in such software or documentation. The Government may...

  5. 48 CFR 227.7203-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227..., reproduce, release, or disclose computer software or computer software documentation do not, by themselves, determine the extent of the Government's rights in such software or documentation. The Government may...

  6. 48 CFR 227.7203-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227..., reproduce, release, or disclose computer software or computer software documentation do not, by themselves, determine the extent of the Government's rights in such software or documentation. The Government may...

  7. 48 CFR 227.7203-13 - Government right to review, verify, challenge and validate asserted restrictions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227..., reproduce, release, or disclose computer software or computer software documentation do not, by themselves, determine the extent of the Government's rights in such software or documentation. The Government may...

  8. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  9. Making Advanced Computer Science Topics More Accessible through Interactive Technologies

    ERIC Educational Resources Information Center

    Shao, Kun; Maher, Peter

    2012-01-01

    Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…

  10. A National Study of the Relationship between Home Access to a Computer and Academic Performance Scores of Grade 12 U.S. Science Students: An Analysis of the 2009 NAEP Data

    NASA Astrophysics Data System (ADS)

    Coffman, Mitchell Ward

    The purpose of this dissertation was to examine the relationship between student access to a computer at home and academic achievement. The 2009 National Assessment of Educational Progress (NAEP) dataset was probed using the National Data Explorer (NDE) to investigate correlations in the subsets of SES, Parental Education, Race, and Gender as they relate to access to a home computer and improved performance scores for U.S. public school grade 12 science students. A causal-comparative approach was employed to seek clarity on the relationship between home access and performance scores. The influence of home access cannot overcome the challenges faced by students of lower SES. The achievement gap, or a second digital divide, for underprivileged classes of students, including minorities, does not appear to contract via student access to a home computer. Nonetheless, in tests for significance, statistically significant improvement in science performance scores was reported for those having access to a computer at home compared to those without access. Additionally, regression models reported evidence of correlations between and among subsets of controls for the demographic factors gender, race, and socioeconomic status. Variability in these correlations was high, suggesting that unobserved factors may have more impact upon the dependent variable. Having access to a computer at home increases performance scores for grade 12 general science students of all races, genders, and socioeconomic levels. However, the performance gap is roughly equivalent to the existing gap in the national average for science scores, suggesting little influence from access to a computer on academic achievement. The variability of scores reported in the regression analysis models reflects a moderate to low effect, suggesting an absence of causation. These statistical results confirm the literature review: having access to a computer at home and the other predictor variables were found to have a significant impact on performance scores, although the data presented suggest that computer access at home is less influential upon performance scores than poverty and its correlates.

  11. Precision Medicine and PET/Computed Tomography: Challenges and Implementation.

    PubMed

    Subramaniam, Rathan M

    2017-01-01

    Precision Medicine is about selecting the right therapy for the right patient, at the right time, specific to the molecular targets expressed by the disease or tumor, in the context of the patient's environment and lifestyle. Some of the challenges for the delivery of precision medicine in oncology include biomarkers for patient selection for enrichment-precision diagnostics, mapping out the tumor heterogeneity that contributes to therapy failures, and early therapy assessment to identify resistance to therapies. PET/computed tomography offers solutions in these important areas of challenge and facilitates the implementation of precision medicine. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Satellite Tasking via a Tablet Computer

    DTIC Science & Technology

    2015-09-01

    ...connectivity have helped to overcome the challenges of information delivery, but there remains the challenge of real-time information. This thesis examines the...

  13. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those addressed during the Apollo program in the 1960s. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  14. Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock

    This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  15. Cognitive Model Exploration and Optimization: A New Challenge for Computational Science

    DTIC Science & Technology

    2010-03-01

    the generation and analysis of computational cognitive models to explain various aspects of cognition. Typically the behavior of these models...computational scale of a workstation, so we have turned to high performance computing (HPC) clusters and volunteer computing for large-scale...computational resources. The majority of applications on the Department of Defense HPC clusters focus on solving partial differential equations (Post

  16. Computational Psychiatry and the Challenge of Schizophrenia.

    PubMed

    Krystal, John H; Murray, John D; Chekroud, Adam M; Corlett, Philip R; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan

    2017-05-01

    Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center 2017.

  17. Computer Network Security- The Challenges of Securing a Computer Network

    NASA Technical Reports Server (NTRS)

    Scotti, Vincent, Jr.

    2011-01-01

    This article is intended to give the reader an overall perspective on what it takes to design, implement, enforce and secure a computer network in the federal and corporate world to ensure the confidentiality, integrity and availability of information. While we will be giving you an overview of network design and security, this article will concentrate on the technology and human factors of securing a network and the challenges faced by those doing so. It will cover the large number of policies and the limits of technology and physical efforts to enforce such policies.

  18. Deformation of Soft Tissue and Force Feedback Using the Smoothed Particle Hydrodynamics

    PubMed Central

    Liu, Xuemei; Wang, Ruiyi; Li, Yunhua; Song, Dongdong

    2015-01-01

    We study the deformation and haptic feedback of soft tissue in virtual surgery based on a liver model, using the PHANTOM OMNI force feedback device developed by SensAble in the USA. Although significant research effort has been dedicated to simulating the behavior of soft tissue and implementing force feedback, it remains a challenging problem. This paper introduces a meshfree method for soft tissue deformation simulation and force computation based on a viscoelastic mechanical model and smoothed particle hydrodynamics (SPH). First, the viscoelastic model captures the mechanical characteristics of soft tissue, which greatly improves realism. Second, SPH is a meshless, self-adaptive technique that provides higher precision for force-feedback computation than mesh-based methods. Finally, an SPH scheme based on a dynamic interaction area is proposed to improve the real-time performance of the simulation. The results show that SPH is suitable for simulating soft tissue deformation and computing force feedback, and that SPH with a dynamic local interaction area is significantly more computationally efficient than standard SPH. Our algorithm has a bright prospect in the area of virtual surgery. PMID:26417380
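
    For readers unfamiliar with SPH, the sketch below illustrates the kernel-weighted density estimate at the heart of such meshfree methods. It is a generic, brute-force toy with an illustrative poly6 kernel, not the authors' viscoelastic liver model; their dynamic local interaction area would replace the all-pairs loop with a restricted neighbour search.

    ```python
    import numpy as np

    def poly6(r, h):
        """Poly6 smoothing kernel, a common choice in graphics-oriented SPH codes."""
        w = np.zeros_like(r)
        inside = r < h
        w[inside] = (315.0 / (64.0 * np.pi * h**9)) * (h**2 - r[inside]**2)**3
        return w

    def sph_density(positions, masses, h):
        """Kernel-weighted density at every particle (O(N^2) toy version)."""
        diff = positions[:, None, :] - positions[None, :, :]
        r = np.linalg.norm(diff, axis=-1)
        return (masses[None, :] * poly6(r, h)).sum(axis=1)

    # Example: 100 random particles in a unit cube, smoothing length 0.2
    pts = np.random.rand(100, 3)
    rho = sph_density(pts, masses=np.full(100, 1e-3), h=0.2)
    print(rho.mean())
    ```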

  19. The present and future of de novo whole-genome assembly.

    PubMed

    Sohn, Jang-Il; Nam, Jin-Wu

    2018-01-01

    With the advent of next-generation sequencing (NGS) technology, various de novo assembly algorithms based on the de Bruijn graph have been developed to construct chromosome-level sequences. However, numerous technical or computational challenges in de novo assembly still remain, although many bright ideas and heuristics have been suggested to tackle the challenges in both experimental and computational settings. In this review, we categorize de novo assemblers on the basis of the type of de Bruijn graphs (Hamiltonian and Eulerian) and discuss the challenges of de novo assembly for short NGS reads regarding computational complexity and assembly ambiguity. Then, we discuss how the limitations of the short reads can be overcome by using a single-molecule sequencing platform that generates long reads of up to several kilobases. In fact, long-read assembly has caused a paradigm shift in whole-genome assembly in terms of algorithms and supporting steps. We also summarize (i) hybrid assemblies using both short and long reads and (ii) overlap-based assemblies for long reads and discuss their challenges and future prospects. This review provides guidelines to determine the optimal approach for a given input data type, computational budget or genome. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
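
    As a concrete illustration of the graph construction behind the Hamiltonian/Eulerian distinction above, the following minimal sketch builds a de Bruijn graph in which nodes are (k-1)-mers and every observed k-mer contributes an edge; assemblers then seek an Eulerian-style traversal of such a graph. The read set and k are illustrative only.

    ```python
    from collections import defaultdict

    def de_bruijn_graph(reads, k):
        """Nodes are (k-1)-mers; every k-mer in a read adds a prefix -> suffix edge."""
        graph = defaultdict(set)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].add(kmer[1:])
        return graph

    reads = ["ACGTACGA", "GTACGATT"]   # toy error-free reads
    for prefix, suffixes in sorted(de_bruijn_graph(reads, 4).items()):
        print(prefix, "->", ",".join(sorted(suffixes)))
    ```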

  20. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  1. Human white matter and knowledge representation

    PubMed Central

    2018-01-01

    Understanding how knowledge is represented in the human brain is a fundamental challenge in neuroscience. To date, most of the work on this topic has focused on knowledge representation in cortical areas and debated whether knowledge is represented in a distributed or localized fashion. Fang and colleagues provide evidence that brain connections and the white matter supporting such connections might play a significant role. The work opens new avenues of investigation, breaking through disciplinary boundaries across network neuroscience, computational neuroscience, cognitive science, and classical lesion studies. PMID:29698391

  2. Human white matter and knowledge representation.

    PubMed

    Pestilli, Franco

    2018-04-01

    Understanding how knowledge is represented in the human brain is a fundamental challenge in neuroscience. To date, most of the work on this topic has focused on knowledge representation in cortical areas and debated whether knowledge is represented in a distributed or localized fashion. Fang and colleagues provide evidence that brain connections and the white matter supporting such connections might play a significant role. The work opens new avenues of investigation, breaking through disciplinary boundaries across network neuroscience, computational neuroscience, cognitive science, and classical lesion studies.

  3. Accurate macromolecular structures using minimal measurements from X-ray free-electron lasers

    PubMed Central

    Hattne, Johan; Echols, Nathaniel; Tran, Rosalie; Kern, Jan; Gildea, Richard J.; Brewster, Aaron S.; Alonso-Mori, Roberto; Glöckner, Carina; Hellmich, Julia; Laksmono, Hartawan; Sierra, Raymond G.; Lassalle-Kaiser, Benedikt; Lampe, Alyssa; Han, Guangye; Gul, Sheraz; DiFiore, Dörte; Milathianaki, Despina; Fry, Alan R.; Miahnahri, Alan; White, William E.; Schafer, Donald W.; Seibert, M. Marvin; Koglin, Jason E.; Sokaras, Dimosthenis; Weng, Tsu-Chien; Sellberg, Jonas; Latimer, Matthew J.; Glatzel, Pieter; Zwart, Petrus H.; Grosse-Kunstleve, Ralf W.; Bogan, Michael J.; Messerschmidt, Marc; Williams, Garth J.; Boutet, Sébastien; Messinger, Johannes; Zouni, Athina; Yano, Junko; Bergmann, Uwe; Yachandra, Vittal K.; Adams, Paul D.; Sauter, Nicholas K.

    2014-01-01

    X-ray free-electron laser (XFEL) sources enable the use of crystallography to solve three-dimensional macromolecular structures under native conditions and free from radiation damage. Results to date, however, have been limited by the challenge of deriving accurate Bragg intensities from a heterogeneous population of microcrystals, while at the same time modeling the X-ray spectrum and detector geometry. Here we present a computational approach designed to extract statistically significant high-resolution signals from fewer diffraction measurements. PMID:24633409

  4. A large and aggressive fibromatosis in the axilla: a rare case report and review of the literature.

    PubMed

    Duan, Mingyue; Xing, Hua; Wang, Keren; Niu, Chunbo; Jiang, Chengwei; Zhang, Lijuan; Ezzat, Shereen; Zhang, Le

    2018-01-01

    Aggressive fibromatosis (AF) is a rare benign tumor, which occurs in the deep part of bone and muscle fibrous tissue. Clinical and pathological features can be challenging for definitive diagnosis. Here, we report a rare case of a large AF in the axilla. Interestingly, 18F-fluorodeoxyglucose-positron emission tomography/computed tomography showed a significant increase in standard uptake value. Surgical resection yielded a spindle cell tumor likely of fibromatosis origin which was positive for β-catenin expression.

  5. Beyond Fourier

    NASA Astrophysics Data System (ADS)

    Hoch, Jeffrey C.

    2017-10-01

    Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development.

  6. A generative, probabilistic model of local protein structure.

    PubMed

    Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas

    2008-07-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
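
    The abstract does not spell out the model's internals, but a natural ingredient of any continuous model of local backbone geometry is a directional distribution over dihedral angles. The hypothetical sketch below shows only that ingredient, drawing (phi, psi) pairs from von Mises distributions with made-up parameters; the published model additionally couples angles along the chain and conditions on sequence.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_backbone(n_residues, mu_phi=-1.1, mu_psi=-0.8, kappa=4.0):
        """Draw (phi, psi) dihedral angles, in radians, from independent von Mises laws."""
        phi = rng.vonmises(mu_phi, kappa, size=n_residues)
        psi = rng.vonmises(mu_psi, kappa, size=n_residues)
        return np.column_stack([phi, psi])

    angles = sample_backbone(10)
    print(np.degrees(angles).round(1))
    ```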

  7. Sulforaphane improves the bronchoprotective response in asthmatics through Nrf2-mediated gene pathways.

    PubMed

    Brown, Robert H; Reynolds, Curt; Brooker, Allison; Talalay, Paul; Fahey, Jed W

    2015-09-15

    It is widely recognized that deep inspiration (DI), either before methacholine (MCh) challenge (Bronchoprotection, BP) or after MCh challenge (Bronchodilation, BD) protects against this challenge in healthy individuals, but not in asthmatics. Sulforaphane, a dietary antioxidant and antiinflammatory phytochemical derived from broccoli, may affect the pulmonary bronchoconstrictor responses to MCh and the responses to DI in asthmatic patients. Forty-five moderate asthmatics were administered sulforaphane (100 μmol daily for 14 days), BP, BD, lung volumes by body-plethsmography, and airway morphology by computed tomography (CT) were measured pre- and post sulforaphane consumption. Sulforaphane ameliorated the bronchoconstrictor effects of MCh on FEV1 significantly (on average by 21 %; p = 0.01) in 60 % of these asthmatics. Interestingly, in 20 % of the asthmatics, sulforaphane aggravated the bronchoconstrictor effects of MCh and in a similar number was without effect, documenting the great heterogeneity of the responsiveness of these individuals to sulforaphane. Moreover, in individuals in whom the FEV1 response to MCh challenge decreased after sulforaphane administration, i.e., sulforaphane was protective, the activities of Nrf2-regulated antioxidant and anti-inflammatory genes decreased. In contrast, individuals in whom sulforaphane treatment enhanced the FEV1 response to MCh, had increased expression of the activities of these genes. High resolution CT scans disclosed that in asthmatics sulforaphane treatment resulted in a significant reduction in specific airway resistance and also increased small airway luminal area and airway trapping modestly but significantly. These findings suggest the potential value of blocking the bronchoconstrictor hyperresponsiveness in some types of asthmatics by phytochemicals such as sulforaphane.

  8. Multicore Programming Challenges

    NASA Astrophysics Data System (ADS)

    Perrone, Michael

    The computer industry is facing fundamental challenges that are driving a major change in the design of computer processors. Due to restrictions imposed by quantum physics, one historical path to higher computer processor performance - by increased clock frequency - has come to an end. Increasing clock frequency now leads to power consumption costs that are too high to justify. As a result, we have seen in recent years that the processor frequencies have peaked and are receding from their high point. At the same time, competitive market conditions are giving business advantage to those companies that can field new streaming applications, handle larger data sets, and update their models to market conditions faster. The desire for newer, faster and larger is driving continued demand for higher computer performance.

  9. Achievements and challenges in structural bioinformatics and computational biophysics.

    PubMed

    Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J

    2015-01-01

    The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years. These developments are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in the accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved along with their cross-validation with other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.

  10. Achievements and challenges in structural bioinformatics and computational biophysics

    PubMed Central

    Samish, Ilan; Bourne, Philip E.; Najmanovich, Rafael J.

    2015-01-01

    Motivation: The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years. These developments are captured annually through the 3DSIG meeting, upon which this article reflects. Results: An increase in the accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved along with their cross-validation with other computational and experimental results. Conclusion: The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. Contact: Rafael.Najmanovich@USherbrooke.ca PMID:25488929

  11. Grids: The Top Ten Questions

    DOE PAGES

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science point of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  12. Adapting Teaching Strategies To Encompass New Technologies.

    ERIC Educational Resources Information Center

    Oravec, Jo Ann

    2001-01-01

    The explosion of special-purpose computing devices--Internet appliances, handheld computers, wireless Internet, networked household appliances--challenges business educators attempting to provide computer literacy education. At a minimum, they should address connectivity, expanded applications, and social and public policy implications of these…

  13. Snowmass Computing Frontier: Computing for the Cosmic Frontier, Astrophysics, and Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connolly, A.; Habib, S.; Szalay, A.

    2013-11-12

    This document presents (off-line) computing requirements and challenges for Cosmic Frontier science, covering the areas of data management, analysis, and simulations. We invite contributions to extend the range of covered topics and to enhance the current descriptions.

  14. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  15. A Computer Science Educational Program for Establishing an Entry Point into the Computing Community of Practice

    ERIC Educational Resources Information Center

    Haberman, Bruria; Yehezkel, Cecile

    2008-01-01

    The rapid evolvement of the computing domain has posed challenges in attempting to bridge the gap between school and the contemporary world of computing, which is related to content, learning culture, and professional norms. We believe that the interaction of high-school students who major in computer science or software engineering with leading…

  16. Snow Leopard Cloud: A Multi-national Education Training and Experimentation Cloud and Its Security Challenges

    NASA Astrophysics Data System (ADS)

    Cayirci, Erdal; Rong, Chunming; Huiskamp, Wim; Verkoelen, Cor

    Military/civilian education training and experimentation networks (ETEN) are an important application area for the cloud computing concept. However, major security challenges have to be overcome to realize an ETEN. These challenges can be categorized as security challenges typical to any cloud and multi-level security challenges specific to an ETEN environment. The cloud approach for ETEN is introduced and its security challenges are explained in this paper.

  17. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    ERIC Educational Resources Information Center

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  18. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    ERIC Educational Resources Information Center

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

    Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements--is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  19. The Study of Surface Computer Supported Cooperative Work and Its Design, Efficiency, and Challenges

    ERIC Educational Resources Information Center

    Hwang, Wu-Yuin; Su, Jia-Han

    2012-01-01

    In this study, a Surface Computer Supported Cooperative Work paradigm is proposed. Recently, multitouch technology has become widely available for human-computer interaction. We found it has great potential to facilitate more awareness of human-to-human interaction than personal computers (PCs) in colocated collaborative work. However, other…

  20. Use of Failure in IS Development Statistics: Lessons for IS Curriculum Design

    ERIC Educational Resources Information Center

    Longenecker, Herbert H., Jr.; Babb, Jeffry; Waguespack, Leslie; Tastle, William; Landry, Jeff

    2016-01-01

    The evolution of computing education reflects the history of the professional practice of computing. Keeping computing education current has been a major challenge due to the explosive advances in technologies. Academic programs in Information Systems, a long-standing computing discipline, develop and refine the theory and practice of computing…

  1. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
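
    To make the contrast with accuracy-driven CFD concrete, here is a hedged sketch of the semi-Lagrangian advection step used in many grid-based animation solvers: it is unconditionally stable and visually plausible, but numerically diffusive. Grid handling and boundary treatment are deliberately simplistic and not taken from any particular paper surveyed above.

    ```python
    import numpy as np

    def advect(field, u, v, dt):
        """Semi-Lagrangian advection: trace back along the velocity, bilinearly sample."""
        ny, nx = field.shape
        jj, ii = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        # departure points, clamped to the grid (crude boundary handling)
        x = np.clip(ii - dt * u, 0, nx - 1)
        y = np.clip(jj - dt * v, 0, ny - 1)
        x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
        x1, y1 = np.minimum(x0 + 1, nx - 1), np.minimum(y0 + 1, ny - 1)
        sx, sy = x - x0, y - y0
        return ((1 - sx) * (1 - sy) * field[y0, x0] + sx * (1 - sy) * field[y0, x1] +
                (1 - sx) * sy * field[y1, x0] + sx * sy * field[y1, x1])

    # Example: carry a blob of smoke density through a uniform rightward wind
    density = np.zeros((64, 64)); density[30:34, 10:14] = 1.0
    u = np.full_like(density, 2.0); v = np.zeros_like(density)
    for _ in range(10):
        density = advect(density, u, v, dt=0.5)
    print(density.sum())
    ```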

  2. Reviews on Security Issues and Challenges in Cloud Computing

    NASA Astrophysics Data System (ADS)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service provided by a third party that allows the sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes how an organization's Information Technology (IT) is organized and managed. It provides many benefits such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services is increasing in this new era, their security issues have become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first outlines the architecture of cloud computing, then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  3. Clinical decision-making and secondary findings in systems medicine.

    PubMed

    Fischer, T; Brothers, K B; Erdmann, P; Langanke, M

    2016-05-21

    Systems medicine is the name for an assemblage of scientific strategies and practices that include bioinformatics approaches to human biology (especially systems biology); "big data" statistical analysis; and medical informatics tools. Whereas personalized and precision medicine involve similar analytical methods applied to genomic and medical record data, systems medicine draws on these as well as other sources of data. Given this distinction, the clinical translation of systems medicine poses a number of important ethical and epistemological challenges for researchers working to generate systems medicine knowledge and clinicians working to apply it. This article focuses on three key challenges. First, we discuss the conflicts in decision-making that can arise when healthcare providers committed to principles of experimental medicine or evidence-based medicine encounter individualized recommendations derived from computer algorithms. We explore in particular whether controlled experiments, such as comparative effectiveness trials, should mediate the translation of systems medicine, or if instead individualized findings generated through "big data" approaches can be applied directly in clinical decision-making. Second, we examine the case of the Riyadh Intensive Care Program Mortality Prediction Algorithm, pejoratively referred to as the "death computer," to demonstrate the ethical challenges that can arise when big-data-driven scoring systems are applied in clinical contexts. We argue that the uncritical use of predictive clinical algorithms, including those envisioned for systems medicine, challenges basic understandings of the doctor-patient relationship. Third, we build on the recent discourse on secondary findings in genomics and imaging to draw attention to the important implications of secondary findings derived from the joint analysis of data from diverse sources, including data recorded by patients in an attempt to realize their "quantified self." This paper examines possible ethical challenges that are likely to be raised as systems medicine is translated into clinical medicine. These include the epistemological challenges for clinical decision-making, the use of scoring systems optimized by big data techniques and the risk that incidental and secondary findings will significantly increase. While some ethical implications remain hypothetical, we should use the opportunity to prospectively identify challenges to avoid making foreseeable mistakes when systems medicine inevitably arrives in routine care.

  4. Barriers and Facilitators to Home Computer and Internet Use Among Urban Novice Computer Users of Low Socioeconomic Position

    PubMed Central

    Bennett, Gary G; Viswanath, K

    2007-01-01

    Background Despite the increasing penetration of the Internet and amount of online health information, there are significant barriers that limit its widespread adoption as a source of health information. One is the “digital divide,” with people of higher socioeconomic position (SEP) demonstrating greater access and usage compared to those from lower SEP groups. However, as the access gap narrows over time and more people use the Internet, a shift in research needs to occur to explore how one might improve Internet use as well as website design for a range of audiences. This is particularly important in the case of novice users who may not have the technical skills, experience, or social connections that could help them search for health information using the Internet. The focus of our research is to investigate the challenges in the implementation of a project to improve health information seeking among low SEP groups. The goal of the project is not to promote health information seeking as much as to understand the barriers and facilitators to computer and Internet use, beyond access, among members of lower SEP groups in an urban setting. Objective The purpose was to qualitatively describe participants’ self-identified barriers and facilitators to computer and Internet use during a 1-year pilot study as well as the challenges encountered by the research team in the delivery of the intervention. Methods Between August and November 2005, 12 low-SEP urban individuals with no or limited computer and Internet experience were recruited through a snowball sampling. Each participant received a free computer system, broadband Internet access, monthly computer training courses, and technical support for 1 year as the intervention condition. Upon completion of the study, participants were offered the opportunity to complete an in-depth semistructured interview. Interviews were approximately 1 hour in length and were conducted by the project director. The interviews were held in the participants’ homes and were tape recorded for accuracy. Nine of the 12 study participants completed the semistructured interviews. Members of the research team conducted a qualitative analysis based on the transcripts from the nine interviews using the crystallization/immersion method. Results Nine of the 12 participants completed the in-depth interview (75% overall response rate), with three men and six women agreeing to be interviewed. Major barriers to Internet use that were mentioned included time constraints and family conflict over computer usage. The monthly training classes and technical assistance components of the intervention surfaced as the most important facilitators to computer and Internet use. The concept of received social support from other study members, such as assistance with computer-related questions, also emerged as an important facilitator to overall computer usage. Conclusions This pilot study offers important insights into the self-identified barriers and facilitators in computer and Internet use among urban low-SEP novice users as well as the challenges faced by the research team in implementing the intervention. PMID:17951215

  5. Exploiting HPC Platforms for Metagenomics: Challenges and Opportunities (MICW - Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Canon, Shane

    2018-01-24

    DOE JGI's Zhong Wang, chair of the High-performance Computing session, gives a brief introduction before Berkeley Lab's Shane Canon talks about "Exploiting HPC Platforms for Metagenomics: Challenges and Opportunities" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  6. Efficient computation of the phylogenetic likelihood function on multi-gene alignments and multi-core architectures.

    PubMed

    Stamatakis, Alexandros; Ott, Michael

    2008-12-27

    The continuous accumulation of sequence data, for example, due to novel wet-laboratory techniques such as pyrosequencing, coupled with the increasing popularity of multi-gene phylogenies and emerging multi-core processor architectures that face problems of cache congestion, poses new challenges with respect to the efficient computation of the phylogenetic maximum-likelihood (ML) function. Here, we propose two approaches that can significantly speed up likelihood computations that typically represent over 95 per cent of the computational effort conducted by current ML or Bayesian inference programs. Initially, we present a method and an appropriate data structure to efficiently compute the likelihood score on 'gappy' multi-gene alignments. By 'gappy' we denote sampling-induced gaps owing to missing sequences in individual genes (partitions), i.e. not real alignment gaps. A first proof-of-concept implementation in RAXML indicates that this approach can accelerate inferences on large and gappy alignments by approximately one order of magnitude. Moreover, we present insights and initial performance results on multi-core architectures obtained during the transition from an OpenMP-based to a Pthreads-based fine-grained parallelization of the ML function.
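
    For orientation, the sketch below shows the kernel whose repeated evaluation dominates such analyses: Felsenstein's pruning recursion under the Jukes-Cantor model, written for a small binary tree. It is a didactic toy, not the RAxML implementation; production codes vectorize this over sites and, as the paper describes, avoid wasted work on partitions where taxa are missing (here, an absent taxon simply receives an uninformative tip vector).

    ```python
    import numpy as np

    BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

    def jc69_P(t):
        """Jukes-Cantor transition probabilities for a branch of length t."""
        same = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)
        diff = 0.25 - 0.25 * np.exp(-4.0 * t / 3.0)
        P = np.full((4, 4), diff)
        np.fill_diagonal(P, same)
        return P

    def tip_vector(base):
        """Observed bases pin one state; gaps or missing taxa get an all-ones vector."""
        v = np.ones(4)
        if base in BASES:
            v[:] = 0.0
            v[BASES[base]] = 1.0
        return v

    def partial(node, site, seqs):
        """Felsenstein pruning: conditional likelihoods of the subtree below `node`."""
        if isinstance(node, str):                              # leaf, named by taxon
            base = seqs[node][site] if node in seqs else "-"   # absent taxon -> missing data
            return tip_vector(base)
        (left, tl), (right, tr) = node                         # internal: ((child, branch), (child, branch))
        return (jc69_P(tl) @ partial(left, site, seqs)) * (jc69_P(tr) @ partial(right, site, seqs))

    def log_likelihood(tree, seqs, n_sites):
        return sum(np.log(0.25 * partial(tree, s, seqs).sum()) for s in range(n_sites))

    tree = (((("human", 0.1), ("chimp", 0.1)), 0.2), ((("mouse", 0.3), ("rat", 0.3)), 0.2))
    seqs = {"human": "ACGT", "chimp": "ACGT", "mouse": "ACTT"}   # "rat" missing in this partition
    print(log_likelihood(tree, seqs, 4))
    ```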

  7. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
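
    To make the forecasting idea concrete, the sketch below computes a conditional probability from a Weibull distribution expressed in "natural time", i.e. the count of small earthquakes since the last large one, which is the basic quantity the NTW method mines from the catalog. The shape and scale parameters here are placeholders; the published method calibrates them regionally from the data.

    ```python
    import math

    def weibull_survival(n, beta, tau):
        """Probability that no large event has occurred after n small earthquakes."""
        return math.exp(-((n / tau) ** beta))

    def conditional_large_quake_prob(n_elapsed, n_horizon, beta=1.4, tau=500.0):
        """
        P(large event within the next n_horizon small quakes | n_elapsed since the last one),
        using a Weibull hazard in natural (event-count) time.  beta and tau are illustrative.
        """
        s = weibull_survival
        return 1.0 - s(n_elapsed + n_horizon, beta, tau) / s(n_elapsed, beta, tau)

    print(conditional_large_quake_prob(n_elapsed=300, n_horizon=100))
    ```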

  8. Defining Computational Thinking for Mathematics and Science Classrooms

    NASA Astrophysics Data System (ADS)

    Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri

    2016-02-01

    Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.

  9. The role of assisted self-help in services for alcohol-related disorders.

    PubMed

    Kavanagh, David J; Proctor, Dawn M

    2011-06-01

    Potentially harmful substance use is common, but many affected people do not receive treatment. Brief face-to-face treatments show impact, as do strategies to assist self-help remotely, by using bibliotherapies, computers or mobile phones. Remotely delivered treatments offer more sustained and multifaceted support than brief interventions, and they show a substantial cost advantage as users increase in number. They may also build skills, confidence and treatment fidelity in providers who use them in sessions. Engagement and retention remain challenges, but electronic treatments show promise in engaging younger populations. Recruitment may be assisted by integration with community campaigns or brief opportunistic interventions. However, routine use of assisted self-help by standard services faces significant challenges. Strategies to optimize adoption are discussed. Copyright © 2011. Published by Elsevier Ltd.

  10. Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop

    NASA Technical Reports Server (NTRS)

    McConnell, M.; Weidman, S.

    2009-01-01

    Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.

  11. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  12. Performance of a Block Structured, Hierarchical Adaptive MeshRefinement Code on the 64k Node IBM BlueGene/L Computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenough, Jeffrey A.; de Supinski, Bronis R.; Yates, Robert K.

    2005-04-25

    We describe the performance of the block-structured Adaptive Mesh Refinement (AMR) code Raptor on the 32k node IBM BlueGene/L computer. This machine represents a significant step forward towards petascale computing. As such, it presents Raptor with many challenges for utilizing the hardware efficiently. In terms of performance, Raptor shows excellent weak and strong scaling when running in single level mode (no adaptivity). Hardware performance monitors show Raptor achieves an aggregate performance of 3.0 Tflops in the main integration kernel on the 32k system. Results from preliminary AMR runs on a prototype astrophysical problem demonstrate the efficiency of the current software when running at large scale. The BG/L system is enabling a physics problem to be considered that represents a factor of 64 increase in overall size compared to the largest ones of this type computed to date. Finally, we provide a description of the development work currently underway to address our inefficiencies.
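
    As a reminder of how the weak- and strong-scaling claims above are quantified, the following throwaway helpers compute parallel efficiency from wall-clock timings and back out the per-node rate implied by the reported 3.0 Tflops aggregate on 32k nodes. The timing numbers in the example are invented.

    ```python
    def strong_scaling_efficiency(t_serial, t_parallel, n_nodes):
        """Fixed total problem size: ideal runtime is t_serial / n_nodes."""
        return t_serial / (n_nodes * t_parallel)

    def weak_scaling_efficiency(t_base, t_scaled):
        """Problem size grows with node count: ideal runtime stays constant."""
        return t_base / t_scaled

    print(strong_scaling_efficiency(t_serial=1000.0, t_parallel=0.04, n_nodes=32768))
    print(weak_scaling_efficiency(t_base=12.0, t_scaled=13.1))
    print(3.0e12 / 32768 / 1e6, "Mflop/s per node implied by the aggregate figure")
    ```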

  13. A computational imaging target specific detectivity metric

    NASA Astrophysics Data System (ADS)

    Preece, Bradley L.; Nehmetallah, George

    2017-05-01

    Due to the large quantity of low-cost, high-speed computational processing available today, computational imaging (CI) systems are expected to have a major role in next generation multifunctional cameras. The purpose of this work is to quantify the performance of these CI systems in a standardized manner. Due to the diversity of CI system designs that are available today or proposed for the near future, there are significant challenges in modeling and calculating a standardized detection signal-to-noise ratio (SNR) to measure the performance of these systems. In this paper, we develop a path forward for a standardized detectivity metric for CI systems. The detectivity metric is designed to evaluate the performance of a CI system searching for a specific known target or signal of interest, and is defined as the optimal linear matched filter SNR, similar to the Hotelling SNR, calculated in computational space with special considerations for standardization. Therefore, the detectivity metric is designed to be flexible, in order to handle various types of CI systems and specific targets, while keeping the complexity and assumptions about the systems to a minimum.
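
    The metric described above reduces, at its core, to a prewhitened matched-filter (Hotelling-style) SNR. A minimal sketch, assuming a known target signature and stationary Gaussian noise with covariance C, is given below; the paper's contribution is the standardization needed to apply such a figure across very different CI architectures.

    ```python
    import numpy as np

    def matched_filter_snr(signal, noise_cov):
        """Optimal linear detection SNR: sqrt(s^T C^{-1} s) for target signature s."""
        s = np.asarray(signal, dtype=float).ravel()
        C = np.asarray(noise_cov, dtype=float)
        return float(np.sqrt(s @ np.linalg.solve(C, s)))

    # Example: a weak point-like target in white noise of variance 0.25
    s = np.zeros(256); s[100:104] = 0.3
    print(matched_filter_snr(s, 0.25 * np.eye(256)))
    ```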

  14. Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr

    2010-03-24

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
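
    The following toy sketch illustrates only the first of the three ingredients named above, a local orthogonal decomposition: eigen-analysis of a point neighbourhood yields a local frame whose strongest direction approximates the axis of a thin tubular structure. The ICC computation and the stability/consistency tests of the actual method are not reproduced here.

    ```python
    import numpy as np

    def local_orthogonal_frame(points, center, radius):
        """PCA of the neighbourhood of `center`: eigenvectors form a local orthogonal frame."""
        nbrs = points[np.linalg.norm(points - center, axis=1) < radius]
        cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
        evals, evecs = np.linalg.eigh(cov)      # ascending eigenvalues
        return evals, evecs                     # evecs[:, -1] ~ dominant (axial) direction

    # Example: a noisy straight "vessel" along x; the dominant eigenvector should be ~x
    rng = np.random.default_rng(1)
    tube = np.column_stack([np.linspace(0, 10, 500),
                            0.05 * rng.standard_normal(500),
                            0.05 * rng.standard_normal(500)])
    evals, evecs = local_orthogonal_frame(tube, tube[250], radius=1.0)
    print(evecs[:, -1])    # roughly (1, 0, 0)
    ```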

  15. A light-stimulated synaptic device based on graphene hybrid phototransistor

    NASA Astrophysics Data System (ADS)

    Qin, Shuchao; Wang, Fengqiu; Liu, Yujie; Wan, Qing; Wang, Xinran; Xu, Yongbing; Shi, Yi; Wang, Xiaomu; Zhang, Rong

    2017-09-01

    Neuromorphic chips refer to an unconventional computing architecture that is modelled on biological brains. They are increasingly employed for processing sensory data for machine vision, context cognition, and decision making. Despite rapid advances, neuromorphic computing has remained largely an electronic technology, making it a challenge to access the superior computing features provided by photons, or to directly process vision data that has increasing importance to artificial intelligence. Here we report a novel light-stimulated synaptic device based on a graphene-carbon nanotube hybrid phototransistor. Significantly, the device can respond to optical stimuli in a highly neuron-like fashion and exhibits flexible tuning of both short- and long-term plasticity. These features combined with the spatiotemporal processability make our device a capable counterpart to today’s electrically-driven artificial synapses, with superior reconfigurable capabilities. In addition, our device allows for generic optical spike processing, which provides a foundation for more sophisticated computing. The silicon-compatible, multifunctional photosensitive synapse opens up a new opportunity for neural networks enabled by photonics and extends current neuromorphic systems in terms of system complexities and functionalities.

  16. The impact of computer usage on the perceptions of hospital secretaries.

    PubMed

    Foner, C; Nour, M; Luo, X; Kim, J

    1991-09-01

    This study explored the perceptions of hospital unit secretaries regarding computer usage. Specifically, six attitudinal variables: performance, resistance, interpersonal relations, satisfaction, challenge, and work overload were examined. The study had two major findings: (1) hospital unit secretaries have positive perceptions of job performance, satisfaction, and challenge as a result of using the PHAMIS computer system and (2) hospital unit secretaries do not feel resistant to the system, overloaded with work, or inclined to increase their interpersonal interaction with coworkers. These two findings might appear contradictory on the surface, but in fact are consistent with overall positive perceptions about the PHAMIS system. The study also considered the impact of two independent variables--age and number of years at work--on the responses of subjects. The analysis indicated that together these two variables explained some variations in the values of at least two of the dependent variables--resistance and challenge. The authors therefore concluded that the installation of the hospital computer system has established a favorable working environment for those whose work is affected by it. The dramatic expansion of computer systems in nonprofit institutions as well as in profit-oriented institutions has made people more familiar with computer technology. This trend can account for the overall positive perception of the unit secretaries toward the new computer system. Moreover, training programs and the support of top management for the system may also have enhanced the positive attitude of the users.

  17. A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure

    PubMed Central

    Fontaine, Michael D.

    2013-01-01

    Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which means to integrate heterogeneous traffic data from different kinds of sensors and apply it for ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to the problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), by nature parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing. PMID:23766690

  18. A Cyber-ITS framework for massive traffic data analysis using cyber infrastructure.

    PubMed

    Xia, Yingjie; Hu, Jia; Fontaine, Michael D

    2013-01-01

    Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which aims to integrate heterogeneous traffic data from different kinds of sensors and apply it to ITS applications. Taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, this research focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), which is by nature a parallel-computing hardware and software platform, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized in a component-and-workflow-based model to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework, based on a traffic state estimation application that fuses massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data, is then presented. The results show that the Cyber-ITS-based implementation can achieve high accuracy in traffic state estimation and provide a significant computational speedup for the data fusion through parallel computing.
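
    The domain-decomposition and parallel-processing ideas described above can be illustrated with a small Python sketch: road segments are partitioned across worker processes, and each worker fuses a loop-detector (SCATS-like) and a GPS speed estimate with an inverse-variance weighted average. The data layout and fusion rule are illustrative assumptions only, not the Cyber-ITS implementation.

    # Minimal sketch of domain decomposition + parallel fusion (hypothetical data layout,
    # not the paper's Cyber-ITS implementation).
    from multiprocessing import Pool
    import numpy as np

    def fuse_segment(args):
        """Fuse loop-detector and GPS speed estimates for one road segment
        with a simple inverse-variance weighted average."""
        scats_speed, scats_var, gps_speed, gps_var = args
        w1, w2 = 1.0 / scats_var, 1.0 / gps_var
        return (w1 * scats_speed + w2 * gps_speed) / (w1 + w2)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n_segments = 10_000
        # Synthetic per-segment observations: (SCATS speed, variance, GPS speed, variance)
        tasks = [(rng.uniform(20, 60), 4.0, rng.uniform(20, 60), 9.0)
                 for _ in range(n_segments)]
        # Domain decomposition: the pool chunks the segment list across worker processes.
        with Pool() as pool:
            fused = pool.map(fuse_segment, tasks, chunksize=500)
        print(f"fused estimates for {len(fused)} segments, mean = {np.mean(fused):.1f} km/h")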

  19. Challenges to Software/Computing for Experimentation at the LHC

    NASA Astrophysics Data System (ADS)

    Banerjee, Sunanda

    The demands that future high energy physics experiments place on software and computing have led the experiments to plan the related activities as full-fledged projects and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understanding are broadly outlined.

  20. The Opportunity and Challenge of The Age of Big Data

    NASA Astrophysics Data System (ADS)

    Yunguo, Hong

    2017-11-01

    The arrival of the big data age has gradually expanded the scale of China's information industry, creating favorable conditions for the growth of information technology and computer networks. Built on big data, computer system services are becoming more capable and data processing within these systems more efficient, which provides an important guarantee for carrying out production plans across industries. At the same time, the rapid development of fields such as the Internet of Things, social tools, and cloud computing, together with the widening of information channels, is increasing the amount of data and extending the reach of the big data age; we therefore need to approach its opportunities and challenges correctly and use data resources effectively. On this basis, this paper studies the opportunities and challenges of the big data era.

  1. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
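
    To make the workflow idea concrete, the sketch below uses plain Luigi (which SciLuigi extends with flow-based, named-port wiring) to chain a feature-extraction task into a parameter-swept model-training task. The task names, file names, and regularisation parameter are hypothetical; this is not the SciLuigi API itself.

    # Minimal plain-Luigi sketch of a two-step modelling pipeline (hypothetical file names
    # and steps; SciLuigi layers a flow-based, named-port style on top of this API).
    import luigi

    class ExtractFeatures(luigi.Task):
        dataset = luigi.Parameter()

        def output(self):
            return luigi.LocalTarget(f"{self.dataset}.features.txt")

        def run(self):
            with self.output().open("w") as f:
                f.write("feature_vector_placeholder\n")

    class TrainModel(luigi.Task):
        dataset = luigi.Parameter()
        c = luigi.FloatParameter(default=1.0)  # e.g. a regularisation parameter to tune

        def requires(self):
            return ExtractFeatures(dataset=self.dataset)

        def output(self):
            return luigi.LocalTarget(f"{self.dataset}.model.c{self.c}.txt")

        def run(self):
            with self.input().open() as fin, self.output().open("w") as fout:
                fout.write(f"model trained on {fin.read().strip()} with C={self.c}\n")

    if __name__ == "__main__":
        # Parameter sweep: each (dataset, C) pair becomes an independent, resumable task.
        luigi.build([TrainModel(dataset="chembl_subset", c=c) for c in (0.1, 1.0, 10.0)],
                    local_scheduler=True)

    Because each task declares its outputs, re-running the sweep only executes tasks whose targets are missing, which is the kind of fault-tolerant, resumable behaviour the authors build on.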

  2. Acceleration of fluoro-CT reconstruction for a mobile C-Arm on GPU and FPGA hardware: a simulation study

    NASA Astrophysics Data System (ADS)

    Xue, Xinwei; Cheryauka, Arvi; Tubbs, David

    2006-03-01

    CT imaging in interventional and minimally-invasive surgery requires high-performance computing solutions that meet operational room demands, healthcare business requirements, and the constraints of a mobile C-arm system. The computational requirements of clinical procedures using CT-like data are increasing rapidly, mainly due to the need for rapid access to medical imagery during critical surgical procedures. The highly parallel nature of Radon transform and CT algorithms enables embedded computing solutions utilizing a parallel processing architecture to realize a significant gain of computational intensity with comparable hardware and program coding/testing expenses. In this paper, using a sample 2D and 3D CT problem, we explore the programming challenges and the potential benefits of embedded computing using commodity hardware components. The accuracy and performance results obtained on three computational platforms (a single CPU, a single GPU, and a solution based on FPGA technology) have been analyzed. We have shown that hardware-accelerated CT image reconstruction can achieve levels of noise and feature clarity similar to program execution on a CPU, while gaining a performance increase of one or more orders of magnitude. 3D cone-beam or helical CT reconstruction and a variety of volumetric image processing applications will benefit from similar accelerations.
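
    The parallelism mentioned above comes from the fact that each projection angle contributes independently to the reconstruction. The toy unfiltered parallel-beam backprojection below (NumPy/SciPy, with a simple block phantom) is only a sketch of that structure; a real C-arm system would use filtered or cone-beam algorithms and map the per-angle loop onto GPU or FPGA kernels.

    # Toy parallel-beam backprojection (unfiltered) illustrating why CT reconstruction
    # parallelises well: every projection angle is processed independently.
    import numpy as np
    from scipy.ndimage import rotate

    def forward_project(image, angles_deg):
        # Radon-transform approximation: rotate the image, then sum along columns.
        return np.stack([rotate(image, a, reshape=False, order=1).sum(axis=0)
                         for a in angles_deg])

    def back_project(sinogram, angles_deg):
        n = sinogram.shape[1]
        recon = np.zeros((n, n))
        for a, proj in zip(angles_deg, sinogram):
            # Smear each 1-D projection back across the image, then rotate into place.
            smear = np.tile(proj, (n, 1))
            recon += rotate(smear, -a, reshape=False, order=1)
        return recon / len(angles_deg)

    if __name__ == "__main__":
        phantom = np.zeros((128, 128))
        phantom[40:90, 50:80] = 1.0                     # simple block phantom
        angles = np.linspace(0.0, 180.0, 90, endpoint=False)
        sino = forward_project(phantom, angles)
        recon = back_project(sino, angles)
        print("reconstruction shape:", recon.shape, "peak:", recon.max().round(2))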

  3. A comparison of acceleration methods for solving the neutron transport k-eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Willert, Jeffrey; Park, H.; Knoll, D. A.

    2014-10-01

    Over the past several years a number of papers have been written describing modern techniques for numerically computing the dominant eigenvalue of the neutron transport criticality problem. These methods fall into two distinct categories. The first category of methods rewrite the multi-group k-eigenvalue problem as a nonlinear system of equations and solve the resulting system using either a Jacobian-Free Newton-Krylov (JFNK) method or Nonlinear Krylov Acceleration (NKA), a variant of Anderson Acceleration. These methods are generally successful in significantly reducing the number of transport sweeps required to compute the dominant eigenvalue. The second category of methods utilize Moment-Based Acceleration (or High-Order/Low-Order (HOLO) Acceleration). These methods solve a sequence of modified diffusion eigenvalue problems whose solutions converge to the solution of the original transport eigenvalue problem. This second class of methods is, in our experience, always superior to the first, as most of the computational work is eliminated by the acceleration from the LO diffusion system. In this paper, we review each of these methods. Our computational results support our claim that the choice of which nonlinear solver to use, JFNK or NKA, should be secondary. The primary computational savings result from the implementation of a HOLO algorithm. We display computational results for a series of challenging multi-dimensional test problems.
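
    For context, the unaccelerated baseline that these schemes improve upon is plain power iteration for the dominant eigenvalue. The sketch below applies it to a generic nonnegative matrix standing in for the transport/fission operator; it is not a transport solver and does not implement JFNK, NKA or HOLO acceleration.

    # Minimal power-iteration sketch for a dominant (k) eigenvalue problem A*phi = k*phi.
    # This is the unaccelerated baseline that accelerated schemes aim to beat; the matrix
    # here is a stand-in for the transport/fission operator, not a real solver.
    import numpy as np

    def power_iteration(A, tol=1e-10, max_iter=10_000):
        phi = np.ones(A.shape[0])
        k = 1.0
        for it in range(max_iter):
            psi = A @ phi                      # analogue of one transport sweep
            k_new = np.linalg.norm(psi) / np.linalg.norm(phi)
            psi /= np.linalg.norm(psi)
            if abs(k_new - k) < tol * abs(k_new):
                return k_new, psi, it + 1
            phi, k = psi, k_new
        return k, phi, max_iter

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        M = rng.random((200, 200))             # nonnegative matrix: Perron root is dominant
        k_eff, mode, iters = power_iteration(M)
        print(f"dominant eigenvalue ~ {k_eff:.6f} after {iters} iterations")
        print("numpy check:", np.max(np.abs(np.linalg.eigvals(M))).round(6))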

  4. A Survey of Recent Advances in Particle Filters and Remaining Challenges for Multitarget Tracking

    PubMed Central

    Wang, Xuedong; Sun, Shudong; Corchado, Juan M.

    2017-01-01

    We review some advances of the particle filtering (PF) algorithm that have been achieved in the last decade in the context of target tracking, with regard to either a single target or multiple targets in the presence of false or missing data. The first part of our review is on remarkable achievements that have been made for the single-target PF from several aspects including importance proposal, computing efficiency, particle degeneracy/impoverishment and constrained/multi-modal systems. The second part of our review is on analyzing the intractable challenges raised within the general multitarget (multi-sensor) tracking due to random target birth and termination, false alarm, misdetection, measurement-to-track (M2T) uncertainty and track uncertainty. The mainstream multitarget PF approaches consist of two main classes, one based on M2T association approaches and the other not such as the finite set statistics-based PF. In either case, significant challenges remain due to unknown tracking scenarios and integrated tracking management. PMID:29168772
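
    As a baseline reference for the surveyed advances, the following sketch implements a bootstrap (SIR) particle filter for a one-dimensional random-walk target observed in Gaussian noise. The motion model and noise levels are synthetic assumptions chosen only to show the predict-weight-resample structure that proposal design and degeneracy/impoverishment research tries to improve.

    # Minimal bootstrap (SIR) particle filter for a 1-D random-walk target observed in
    # Gaussian noise -- a sketch of the baseline algorithm the surveyed advances refine.
    import numpy as np

    rng = np.random.default_rng(42)
    T, N = 50, 1000                 # time steps, particles
    q, r = 0.5, 1.0                 # process and measurement noise std-devs

    # Simulate a ground-truth track and noisy measurements.
    x_true = np.cumsum(q * rng.standard_normal(T))
    z = x_true + r * rng.standard_normal(T)

    particles = rng.standard_normal(N)
    estimates = []
    for t in range(T):
        # Predict: propagate particles through the motion model.
        particles += q * rng.standard_normal(N)
        # Update: weight by the measurement likelihood.
        w = np.exp(-0.5 * ((z[t] - particles) / r) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))
        # Resample (multinomial) to fight degeneracy -- the step that impoverishment
        # and proposal-design research tries to improve.
        particles = rng.choice(particles, size=N, p=w)

    rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
    print(f"particle-filter RMSE over {T} steps: {rmse:.3f}")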

  5. Quantifying Scheduling Challenges for Exascale System Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mondragon, Oscar; Bridges, Patrick G.; Jones, Terry R

    2015-01-01

    The move towards high-performance computing (HPC) applications comprised of coupled codes and the need to dramatically reduce data movement is leading to a reexamination of time-sharing vs. space-sharing in HPC systems. In this paper, we discuss and begin to quantify the performance impact of a move away from strict space-sharing of nodes for HPC applications. Specifically, we examine the potential performance cost of time-sharing nodes between application components, we determine whether a simple coordinated scheduling mechanism can address these problems, and we research how suitable simple constraint-based optimization techniques are for solving scheduling challenges in this regime. Our results demonstrate that current general-purpose HPC system software scheduling and resource allocation systems are subject to significant performance deficiencies, which we quantify for six representative applications. Based on these results, we discuss areas in which additional research is needed to meet the scheduling challenges of next-generation HPC systems.

  6. WLCG scale testing during CMS data challenges

    NASA Astrophysics Data System (ADS)

    Gutsche, O.; Hajdu, C.

    2008-07-01

    The CMS computing model to process and analyze LHC collision data follows a data-location driven approach and is using the WLCG infrastructure to provide access to GRID resources. As a preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of the challenges is the test of the user analysis which poses a special challenge for the infrastructure with its random distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set its goal to test the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given about the outcome of the user analysis part of the challenge using both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between both GRID middlewares (resource broker vs. direct submission) will be discussed. In the end, an outlook for the 2007 data challenge is given.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    Emerging fossil energy power generation systems must operate with unprecedented efficiency and near-zero emissions, while optimizing profitably amid cost fluctuations for raw materials, finished products, and energy. To help address these challenges, the fossil energy industry will have to rely increasingly on the use of advanced computational tools for modeling and simulating complex process systems. In this paper, we present the computational research challenges and opportunities for the optimization of fossil energy power generation systems across the plant lifecycle, from process synthesis and design to plant operations. We also look beyond the plant gates to discuss research challenges and opportunities for enterprise-wide optimization, including planning, scheduling, and supply chain technologies.

  8. Secure Skyline Queries on Cloud Platform.

    PubMed

    Liu, Jinfei; Yang, Juncheng; Xiong, Li; Pei, Jian

    2017-04-01

    Outsourcing data and computation to cloud server provides a cost-effective way to support large scale data storage and query processing. However, due to security and privacy concerns, sensitive data (e.g., medical records) need to be protected from the cloud server and other unauthorized users. One approach is to outsource encrypted data to the cloud server and have the cloud server perform query processing on the encrypted data only. It remains a challenging task to support various queries over encrypted data in a secure and efficient way such that the cloud server does not gain any knowledge about the data, query, and query result. In this paper, we study the problem of secure skyline queries over encrypted data. The skyline query is particularly important for multi-criteria decision making but also presents significant challenges due to its complex computations. We propose a fully secure skyline query protocol on data encrypted using semantically-secure encryption. As a key subroutine, we present a new secure dominance protocol, which can be also used as a building block for other queries. Finally, we provide both serial and parallelized implementations and empirically study the protocols in terms of efficiency and scalability under different parameter settings, verifying the feasibility of our proposed solutions.
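
    The computation the secure protocol must reproduce is ordinary skyline (dominance) filtering. The plaintext sketch below shows that computation on synthetic two-attribute records (smaller is better in each dimension); the encryption, the secure dominance protocol, and the client-server interaction of the paper are deliberately omitted.

    # Plaintext skyline sketch (smaller is better in every dimension). The paper's
    # contribution is doing this dominance test over *encrypted* records; this only
    # illustrates the computation the secure protocol must reproduce.
    import numpy as np

    def dominates(p, q):
        """p dominates q if p is <= q in all dimensions and < in at least one."""
        return np.all(p <= q) and np.any(p < q)

    def skyline(points):
        return [p for p in points
                if not any(dominates(q, p) for q in points if not np.array_equal(q, p))]

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        records = rng.random((200, 2))        # e.g. (cost, risk) per medical record
        sky = skyline(list(records))
        print(f"{len(sky)} skyline points out of {len(records)}")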

  9. Acoustic Source Localization via Time Difference of Arrival Estimation for Distributed Sensor Networks Using Tera-Scale Optical Core Devices

    DOE PAGES

    Imam, Neena; Barhen, Jacob

    2009-01-01

    For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. These sensors rely heavily on battery-operated system components to achieve highly functional automation in signal and information processing. In order to keep communication requirements minimal, it is desirable to perform as much processing on the receiver platforms as possible. However, the complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be readily met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building block algorithms used in underwater source localization on the optical-core digital processing platform recently introduced by Lenslet Inc. This demonstration of considerably faster signal processing capability should be of substantial significance to the design and innovation of future generations of distributed sensor networks.
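
    One of the building-block kernels in TDOA-based localization is cross-correlation between sensor pairs, which is exactly the kind of dense arithmetic an optical core accelerates. The sketch below estimates a known synthetic delay between two noisy copies of a broadband signal; the signal parameters are assumptions, and nothing here reflects the Lenslet hardware implementation.

    # Cross-correlation sketch of time-difference-of-arrival (TDOA) estimation for a
    # two-sensor pair (synthetic signals, not the authors' optical-core implementation).
    import numpy as np

    fs = 8000.0                               # sample rate (Hz)
    true_delay = 37                           # samples between the two sensors
    rng = np.random.default_rng(3)

    source = rng.standard_normal(4096)        # broadband acoustic source
    s1 = source + 0.05 * rng.standard_normal(source.size)
    s2 = np.roll(source, true_delay) + 0.05 * rng.standard_normal(source.size)

    # Full cross-correlation; the argmax offset gives the delay estimate.
    xcorr = np.correlate(s2, s1, mode="full")
    lag = np.argmax(xcorr) - (source.size - 1)
    print(f"estimated delay: {lag} samples ({lag / fs * 1e3:.2f} ms), true: {true_delay}")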

  10. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system

    PubMed Central

    Griffiths, David J.; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G. F.; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L.; Thompson, Faye J.; Vitikainen, Emma I. K.; Cant, Michael A.

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by needing to accurately and efficiently collect and store multiple streams of the data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed called Mongoose 2000 that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available and we detail how it might be used to aid data collection and storage in other long-term individual-based projects. PMID:29315317

  11. Redesign of a cross-reactive antibody to dengue virus with broad-spectrum activity and increased in vivo potency

    PubMed Central

    Tharakaraman, Kannan; Robinson, Luke N.; Hatas, Andrew; Chen, Yi-Ling; Siyue, Liu; Raguram, S.; Sasisekharan, V.; Wogan, Gerald N.; Sasisekharan, Ram

    2013-01-01

    Affinity improvement of proteins, including antibodies, by computational chemistry broadly relies on physics-based energy functions coupled with refinement. However, achieving significant enhancement of binding affinity (>10-fold) remains a challenging exercise, particularly for cross-reactive antibodies. We describe here an empirical approach that captures key physicochemical features common to antigen–antibody interfaces to predict protein–protein interaction and mutations that confer increased affinity. We apply this approach to the design of affinity-enhancing mutations in 4E11, a potent cross-reactive neutralizing antibody to dengue virus (DV), without a crystal structure. Combination of predicted mutations led to a 450-fold improvement in affinity to serotype 4 of DV while preserving, or modestly increasing, affinity to serotypes 1–3 of DV. We show that increased affinity resulted in strong in vitro neutralizing activity to all four serotypes, and that the redesigned antibody has potent antiviral activity in a mouse model of DV challenge. Our findings demonstrate an empirical computational chemistry approach for improving protein–protein docking and engineering antibody affinity, which will help accelerate the development of clinically relevant antibodies. PMID:23569282

  12. Technical challenges, past and future, in implementing THERESA: a one million patient, one billion item computer-based patient record and decision support system

    NASA Astrophysics Data System (ADS)

    Camp, Henry N.

    1996-02-01

    Challenges in implementing a computer-based patient record (CPR)--such as absolute data integrity, high availability, permanent on-line storage of very large complex records, rapid search times, ease of use, commercial viability, and portability to other hospitals and doctors' offices--are given along with their significance, the solutions, and their successes. The THERESA CPR has been used since 1983 in direct patient care by a public hospital that is the primary care provider to 350,000 people. It has 1000 beds with 45,000 admissions and 750,000 outpatient visits annually. The system supports direct provider entry, including by physicians, of complete medical 'documents'. Its demonstration site currently contains 1.1 billion data items on 1 million patients. It is also a clinical decision-aiding tool used for quality assurance and cost containment, for teaching, as faculty and students can easily find and 'thumb through' all cases similar to a particular study, and for research, with over a billion medical items that can be searched and analyzed on-line within context and with continuity. The same software can also run on a desktop microcomputer managing a private practice physician's office.

  13. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  14. Scattering and radiative properties of complex soot and soot-containing particles

    NASA Astrophysics Data System (ADS)

    Liu, L.; Mishchenko, M. I.; Mackowski, D. W.; Dlugach, J.

    2012-12-01

    Tropospheric soot and soot-containing aerosols often exhibit nonspherical overall shapes and complex morphologies. They can externally, semi-externally, and internally mix with other aerosol species. This poses a tremendous challenge in particle characterization, remote sensing, and global climate modeling studies. To address these challenges, we used the new numerically exact public-domain Fortran-90 code based on the superposition T-matrix method (STMM) and other theoretical models to analyze the potential effects of aggregation and heterogeneity on light scattering and absorption by morphologically complex soot-containing particles. The parameters we computed include the full set of scattering matrix elements, linear depolarization ratios, optical cross-sections, asymmetry parameters, and single-scattering albedos. It is shown that the optical characteristics of soot and soot-containing aerosols depend strongly on particle sizes, compositions, and overall aerosol shapes. The soot particle configurations and heterogeneities can have a substantial effect that can result in a significant enhancement of extinction and absorption relative to those computed from the Lorenz-Mie theory. Meanwhile, the model-calculated information, combined with in-situ and remotely sensed data, can be used to constrain soot particle shapes and sizes, which are much needed in climate models.

  15. THE CHALLENGE OF QUALITY ASSURANCE FOR EMISSION FLUX MEASUREMENTS OF LARGE AREA SOURCES BY OPTICAL REMOTE SENSING

    EPA Science Inventory

    The paper examines the quality assurance challenges associated with open path Fourier transform infrared (OPFTIR) measurements of large area pollution sources with plume reconstruction by computed tomography (CT) and how each challenge may be met. Traditionally, pollutant concent...

  16. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    NASA Technical Reports Server (NTRS)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  17. Single-Trial Classification of Multi-User P300-Based Brain-Computer Interface Using Riemannian Geometry.

    PubMed

    Korczowski, L; Congedo, M; Jutten, C

    2015-08-01

    The classification of electroencephalographic (EEG) data recorded from multiple users simultaneously is an important challenge in the field of Brain-Computer Interface (BCI). In this paper we compare different approaches for classification of single-trial Event-Related Potentials (ERPs) on two subjects playing a collaborative BCI game. The minimum distance to mean (MDM) classifier in a Riemannian framework is extended to use the diversity of the inter-subject spatio-temporal statistics (MDM-hyper) or to merge multiple classifiers (MDM-multi). We show that both these classifiers significantly outperform the mean performance of the two users and analogous classifiers based on step-wise linear discriminant analysis. More importantly, the MDM-multi outperforms the performance of the best player within the pair.
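
    The core of the MDM classifier is a distance from a trial's covariance matrix to each class prototype under the affine-invariant Riemannian metric. The sketch below uses synthetic multichannel data and arithmetic-mean prototypes as a simplification (the paper uses proper geometric means and ERP-augmented spatio-temporal covariances), so it only illustrates the decision rule.

    # Sketch of a minimum-distance-to-mean (MDM) classifier on covariance matrices using
    # the affine-invariant Riemannian distance. Synthetic data, not EEG; arithmetic means
    # stand in for the geometric means used in practice.
    import numpy as np
    from scipy.linalg import sqrtm, logm, inv, norm

    def riemann_dist(A, B):
        """Affine-invariant distance between SPD matrices A and B."""
        A_isqrt = inv(sqrtm(A))
        return norm(logm(A_isqrt @ B @ A_isqrt), "fro")

    def covs(trials):
        return [np.cov(t) for t in trials]           # one (channels x channels) cov per trial

    rng = np.random.default_rng(0)
    n_ch, n_samp = 8, 256

    def make_trials(scale, n_trials=30):
        return [scale * rng.standard_normal((n_ch, n_samp)) for _ in range(n_trials)]

    class0, class1 = make_trials(1.0), make_trials(1.6)   # two synthetic "conditions"
    mean0 = np.mean(covs(class0), axis=0)                 # arithmetic-mean prototypes
    mean1 = np.mean(covs(class1), axis=0)

    test = 1.6 * rng.standard_normal((n_ch, n_samp))      # unseen trial drawn from class 1
    C = np.cov(test)
    pred = int(riemann_dist(C, mean1) < riemann_dist(C, mean0))
    print("predicted class:", pred)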

  18. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur; Modiano, David

    1995-01-01

    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  19. Sustainable mobile information infrastructures in low resource settings.

    PubMed

    Braa, Kristin; Purkayastha, Saptarshi

    2010-01-01

    Developing countries represent the fastest growing mobile markets in the world. For people with no computing access, a mobile will be their first computing device. Mobile technologies offer significant potential to strengthen health systems in developing countries with respect to community-based monitoring, reporting, feedback to service providers, and strengthening communication and coordination between different health functionaries, medical officers and the community. However, there are various challenges in realizing this potential, including technological issues such as lack of power, as well as social, institutional and use-related issues. In this paper a case study from India on mobile health implementation and use is reported. An underlying principle guiding this paper is to see mobile technology not as a "stand-alone device" but as a potentially integral component of an integrated mobile-supported health information infrastructure.

  20. Opportunities and choice in a new vector era

    NASA Astrophysics Data System (ADS)

    Nowak, A.

    2014-06-01

    This work discusses the significant changes in computing landscape related to the progression of Moore's Law, and the implications on scientific computing. Particular attention is devoted to the High Energy Physics domain (HEP), which has always made good use of threading, but levels of parallelism closer to the hardware were often left underutilized. Findings of the CERN openlab Platform Competence Center are reported in the context of expanding "performance dimensions", and especially the resurgence of vectors. These suggest that data oriented designs are feasible in HEP and have considerable potential for performance improvements on multiple levels, but will rarely trump algorithmic enhancements. Finally, an analysis of upcoming hardware and software technologies identifies heterogeneity as a major challenge for software, which will require more emphasis on scalable, efficient design.

  1. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182

  2. Security analysis of cyber-physical system

    NASA Astrophysics Data System (ADS)

    Li, Bo; Zhang, Lichen

    2017-05-01

    In recent years, the Cyber-Physical System (CPS) has become an important research direction in academic and scientific-technological circles at home and abroad, and is considered the third wave of world information technology after the computer and the Internet. A CPS is a multi-dimensional, heterogeneous, deeply integrated open system involving knowledge from computing, communication, control, and other disciplines. Because the theories and methods of these disciplines differ significantly, applying CPS brings great challenges. This paper introduces the definition and characteristics of CPS, analyzes its current situation and the security threats it faces, and gives security solutions for those threats. It also discusses CPS-specific security technology to promote the healthy development of CPS with respect to information security.

  3. Autocorrelated process control: Geometric Brownian Motion approach versus Box-Jenkins approach

    NASA Astrophysics Data System (ADS)

    Salleh, R. M.; Zawawi, N. I.; Gan, Z. F.; Nor, M. E.

    2018-04-01

    The existence of autocorrelation has a significant effect on the performance and accuracy of process control if it is not handled carefully. When dealing with an autocorrelated process, the Box-Jenkins method is often preferred because of its popularity. However, its computation is complicated, challenging, and time-consuming. Therefore, an alternative method known as Geometric Brownian Motion (GBM) is introduced to monitor the autocorrelated process. A real case study of furnace temperature data is used to compare the performance of the Box-Jenkins and GBM methods in monitoring an autocorrelated process. Both methods give the same results in terms of model accuracy and process-control monitoring, yet GBM is superior to the Box-Jenkins method due to its simplicity and practicality, with a shorter computational time.
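
    A minimal version of the GBM approach is to estimate drift and volatility from the log-increments of the positive, autocorrelated series and derive a one-step-ahead monitoring band. The sketch below does this on a simulated path; the parameter values and the 95% band are illustrative assumptions, not the furnace-temperature case study.

    # Geometric Brownian Motion (GBM) sketch: fit drift/volatility from log-increments of
    # an autocorrelated positive series and build an approximate monitoring band.
    import numpy as np

    rng = np.random.default_rng(5)
    dt = 1.0
    n = 500
    mu_true, sigma_true, s0 = 0.002, 0.01, 300.0

    # Simulate a GBM path: S_{t+1} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) Z)
    z = rng.standard_normal(n)
    log_s = np.log(s0) + np.cumsum((mu_true - 0.5 * sigma_true**2) * dt
                                   + sigma_true * np.sqrt(dt) * z)
    s = np.exp(log_s)

    # Parameter estimation from log-increments.
    r = np.diff(np.log(s))
    sigma_hat = r.std(ddof=1) / np.sqrt(dt)
    mu_hat = r.mean() / dt + 0.5 * sigma_hat**2
    print(f"mu_hat={mu_hat:.4f} (true {mu_true}), sigma_hat={sigma_hat:.4f} (true {sigma_true})")

    # One-step-ahead 95% band for monitoring the next observation.
    m = np.log(s[-1]) + (mu_hat - 0.5 * sigma_hat**2) * dt
    lo, hi = np.exp(m - 1.96 * sigma_hat * np.sqrt(dt)), np.exp(m + 1.96 * sigma_hat * np.sqrt(dt))
    print(f"next-step 95% band: [{lo:.1f}, {hi:.1f}]")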

  4. Economics and computer science of a radio spectrum reallocation.

    PubMed

    Leyton-Brown, Kevin; Milgrom, Paul; Segal, Ilya

    2017-07-11

    The recent "incentive auction" of the US Federal Communications Commission was the first auction to reallocate radio frequencies between two different kinds of uses: from broadcast television to wireless Internet access. The design challenge was not just to choose market rules to govern a fixed set of potential trades but also, to determine the broadcasters' property rights, the goods to be exchanged, the quantities to be traded, the computational procedures, and even some of the performance objectives. An essential and unusual challenge was to make the auction simple enough for human participants while still ensuring that the computations would be tractable and capable of delivering nearly efficient outcomes.

  5. Smart integrated microsystems: the energy efficiency challenge (Conference Presentation) (Plenary Presentation)

    NASA Astrophysics Data System (ADS)

    Benini, Luca

    2017-06-01

    The "internet of everything" envisions trillions of connected objects loaded with high-bandwidth sensors requiring massive amounts of local signal processing, fusion, pattern extraction and classification. From the computational viewpoint, the challenge is formidable and can be addressed only by pushing computing fabrics toward massive parallelism and brain-like energy efficiency levels. CMOS technology can still take us a long way toward this goal, but technology scaling is losing steam. Energy efficiency improvement will increasingly hinge on architecture, circuits, design techniques such as heterogeneous 3D integration, mixed-signal preprocessing, event-based approximate computing and non-Von-Neumann architectures for scalable acceleration.

  6. Where the Cloud Meets the Commons

    ERIC Educational Resources Information Center

    Ipri, Tom

    2011-01-01

    Changes presented by cloud computing--shared computing services, applications, and storage available to end users via the Internet--have the potential to seriously alter how libraries provide services, not only remotely, but also within the physical library, specifically concerning challenges facing the typical desktop computing experience.…

  7. The potential benefits of photonics in the computing platform

    NASA Astrophysics Data System (ADS)

    Bautista, Jerry

    2005-03-01

    The increase in computational requirements for real-time image processing, complex computational fluid dynamics, very large scale data mining in the health industry/Internet, and predictive models for financial markets is driving computer architects to consider new paradigms that rely upon very high speed interconnects within and between computing elements. Further challenges result from reduced power requirements, reduced transmission latency, and greater interconnect density. Optical interconnects may solve many of these problems, with the added benefit of extended reach. In addition, photonic interconnects provide relative EMI immunity, an increasingly valuable property given the greater dependence on wireless connectivity. However, to be truly functional, the optical interconnect mesh should be able to support arbitration, addressing, etc. completely in the optical domain with a BER that is more stringent than "traditional" communication requirements. Outlined are challenges in the advanced computing environment, some possible optical architectures and relevant platform technologies, as well as a rough sizing of these opportunities, which are quite large relative to the more "traditional" optical markets.

  8. Energy Efficiency Challenges of 5G Small Cell Networks.

    PubMed

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-05-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple outputs (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to which computation or transmission power is more important in the energy efficiency of 5G small cell networks. Thus, the main objective in this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of 5G small cell BS can approach 800 watt when the massive MIMO (e.g., 128 antennas) is deployed to transmit high volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks.

  9. Energy Efficiency Challenges of 5G Small Cell Networks

    PubMed Central

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-01-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple outputs (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to which computation or transmission power is more important in the energy efficiency of 5G small cell networks. Thus, the main objective in this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of 5G small cell BS can approach 800 watt when the massive MIMO (e.g., 128 antennas) is deployed to transmit high volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks. PMID:28757670
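
    The Landauer principle referenced above sets a floor of kT ln 2 joules per irreversibly processed bit. The back-of-the-envelope sketch below evaluates that bound and scales it by an assumed efficiency gap to show how baseband computation can plausibly reach the hundreds of watts discussed in the paper; the bit-operation rate and gap factor are illustrative assumptions, not the authors' power model.

    # Back-of-the-envelope Landauer-principle sketch: the thermodynamic lower bound on the
    # energy needed to (irreversibly) process bits, versus a hypothetical small-cell BS
    # compute load. Numbers are illustrative only, not the paper's power model.
    import math

    K_B = 1.380649e-23          # Boltzmann constant, J/K
    T = 300.0                   # operating temperature, K

    landauer_joules_per_bit = K_B * T * math.log(2)
    print(f"Landauer bound at {T:.0f} K: {landauer_joules_per_bit:.3e} J/bit")

    # Hypothetical baseband load: 10^18 bit operations per second at the BS.
    bit_ops_per_s = 1e18
    ideal_power_w = bit_ops_per_s * landauer_joules_per_bit
    print(f"ideal (Landauer-limited) power: {ideal_power_w*1e3:.3f} mW")

    # Real silicon sits many orders of magnitude above the bound; an assumed efficiency
    # gap of 1e5 turns milliwatts into hundreds of watts, the regime the paper studies.
    efficiency_gap = 1e5
    print(f"with a 1e5 efficiency gap: {ideal_power_w * efficiency_gap:.0f} W")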

  10. Technology, Pedagogy, and Epistemology: Opportunities and Challenges of Using Computer Modeling and Simulation Tools in Elementary Science Methods

    ERIC Educational Resources Information Center

    Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay

    2007-01-01

    This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…

  11. Improving Student Learning Using State of the Art IT Equipment

    ERIC Educational Resources Information Center

    Okur, Mehmet Cudi; Basarici, Samsun Mustafa; Rana, Tohid Ahmed

    2007-01-01

    Fast growth of computer related technology both in software-hardware and application areas, brings new challenges to be faced when using computers for supporting education. In this paper some experiences and the results of a survey are presented in teaching computer topics using computer as a teaching tool. Our teaching activities are related to…

  12. TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Jones, N.; Ames, D. P.

    2015-12-01

    Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leverage these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, that have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open source, computing-resource management, and job management software, HTCondor, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
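
    Under the hood, tools in this space reduce to generating HTCondor submit descriptions and handing them to the scheduler. The sketch below shows that bare mechanism with a hypothetical run_model.sh executable and output layout; it does not use the CondorPy or TethysCluster APIs themselves.

    # Minimal sketch of dispatching a batch of model runs through HTCondor from Python,
    # in the spirit of what CondorPy automates (hypothetical script/paths).
    import subprocess
    from pathlib import Path

    def submit_runs(n_runs, workdir="runs"):
        Path(workdir).mkdir(exist_ok=True)
        submit_text = f"""\
executable   = run_model.sh
arguments    = $(Process)
output       = {workdir}/run_$(Process).out
error        = {workdir}/run_$(Process).err
log          = {workdir}/jobs.log
request_cpus = 1
queue {n_runs}
"""
        submit_file = Path(workdir) / "model.sub"
        submit_file.write_text(submit_text)
        # condor_submit queues one job per $(Process) index and prints the cluster id.
        subprocess.run(["condor_submit", str(submit_file)], check=True)

    if __name__ == "__main__":
        submit_runs(10)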

  13. A Machine Learning Framework to Forecast Wave Conditions

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; James, S. C.; O'Donncha, F.

    2017-12-01

    Recently, significant effort has been undertaken to quantify and extract wave energy because it is renewable, environmentally friendly, abundant, and often close to population centers. However, a major challenge is the ability to accurately and quickly predict energy production, especially across a 48-hour cycle. Accurate forecasting of wave conditions is a challenging undertaking that typically involves solving the spectral action-balance equation on a discretized grid with high spatial resolution. The nature of the computations typically demands high-performance computing infrastructure. Using a case-study site at Monterey Bay, California, a machine learning framework was trained to replicate numerically simulated wave conditions at a fraction of the typical computational cost. Specifically, the physics-based Simulating WAves Nearshore (SWAN) model, driven by measured wave conditions, nowcast ocean currents, and wind data, was used to generate training data for machine learning algorithms. The model was run between April 1st, 2013 and May 31st, 2017, generating forecasts at three-hour intervals and yielding 11,078 distinct model outputs. SWAN-generated fields of 3,104 wave heights and a characteristic period could be replicated through simple matrix multiplications using the mapping matrices from machine learning algorithms. In fact, wave-height RMSEs from the machine learning algorithms (9 cm) were less than those for the SWAN model-verification exercise where those simulations were compared to buoy wave data within the model domain (>40 cm). The validated machine learning approach, which acts as an accurate surrogate for the SWAN model, can now be used to perform real-time forecasts of wave conditions for the next 48 hours using available forecasted boundary wave conditions, ocean currents, and winds. This solution has obvious applications to wave-energy generation as accurate wave conditions can be forecasted with over a three-order-of-magnitude reduction in computational expense. The low computational cost (and by association low computer-power requirement) means that the machine learning algorithms could be installed on a wave-energy converter as a form of "edge computing" where a device could forecast its own 48-hour energy production.
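
    The "mapping matrix" idea can be reduced to learning a linear map from forcing features to the gridded wave field and forecasting with a single matrix multiply. The sketch below fits such a map with ridge-regularised least squares on synthetic data standing in for the SWAN archive; only the grid size (3,104 points) comes from the abstract, while the feature count, regularisation, and learning method are assumptions, so the authors' actual algorithm may differ.

    # Sketch of the "mapping matrix" idea: learn a linear map from forcing inputs (boundary
    # waves, winds, currents) to a gridded wave-height field, then forecast with a single
    # matrix multiply. Synthetic data stand in for the SWAN training archive.
    import numpy as np

    rng = np.random.default_rng(11)
    n_runs, n_inputs, n_grid = 2000, 12, 3104          # model runs, forcing features, grid points

    # Synthetic "truth": wave fields are a hidden linear response to the forcings plus noise.
    W_true = rng.standard_normal((n_inputs, n_grid))
    X = rng.standard_normal((n_runs, n_inputs))         # forcing features per historical run
    Y = X @ W_true + 0.1 * rng.standard_normal((n_runs, n_grid))

    # Fit the mapping matrix with ridge-regularised least squares.
    lam = 1.0
    W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_inputs), X.T @ Y)

    # "Forecast" = one matrix multiplication per new forcing vector.
    x_new = rng.standard_normal(n_inputs)
    field = x_new @ W_hat
    rmse = np.sqrt(np.mean((field - x_new @ W_true) ** 2))
    print(f"surrogate field of {field.size} points, RMSE vs hidden truth: {rmse:.3f}")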

  14. Prosthetic reconstruction of a patient with an acquired nasal defect using extraoral implants and a CAD/CAM copy-milled bar.

    PubMed

    Vera, Carolina; Barrero, Carlos; Shockley, William; Rothenberger, Sandra; Minsley, Glenn; Drago, Carl

    2014-10-01

    Traditionally, patients with maxillofacial defects have been challenging to treat. A multitude of challenges associated with maxillofacial prosthetic treatment are not typically seen with patients who need conventional prosthodontic treatment. These patients generally require replacement of significantly greater amounts of hard and soft tissue than do conventional prosthodontic patients. Most maxillofacial patients also warrant more emotional support than do conventional prosthodontic patients. Successful maxillofacial prosthetics still need to embrace the traditional goals of prosthodontic treatment: stability, support, retention, and esthetics. It is unlikely that a maxillofacial prosthesis will exactly duplicate the anatomy and function of missing or damaged structures. Although craniofacial implants (CFI's) have lower cumulative survival rates (CSR's) than intraoral endosseous implants, osseointegrated CFI's have proven to be significant adjuncts to improving retention of maxillofacial prostheses. Lately, computer-assisted design and computer-assisted machining (CAD/CAM) has been used in dentistry to facilitate fabrication of implant-supported frameworks. CAD/CAM protocols have numerous advantages over conventional casting techniques, including improved accuracy and biocompatibility, and decreased costs. The purpose of this paper is to review the literature on cumulative survival rates (CSR's) reported for CFI's and to illustrate the treatment of a maxillofacial patient using CFI's and a CAD/CAM copy-milled framework for retention and support of a nasal prosthesis. © 2014 by the American College of Prosthodontists.

  15. Genome-wide gene–gene interaction analysis for next-generation sequencing

    PubMed Central

    Zhao, Jinying; Zhu, Yun; Xiong, Momiao

    2016-01-01

    The critical barrier in interaction analysis for next-generation sequencing (NGS) data is that the traditional pairwise interaction analysis that is suitable for common variants is difficult to apply to rare variants because of their prohibitive computational time, large number of tests and low power. The great challenges for successful detection of interactions with NGS data are (1) the demands in the paradigm of changes in interaction analysis; (2) severe multiple testing; and (3) heavy computations. To meet these challenges, we shift the paradigm of interaction analysis between two SNPs to interaction analysis between two genomic regions. In other words, we take a gene as a unit of analysis and use functional data analysis techniques as dimensional reduction tools to develop a novel statistic to collectively test interaction between all possible pairs of SNPs within two genome regions. By intensive simulations, we demonstrate that the functional logistic regression for interaction analysis has the correct type 1 error rates and higher power to detect interaction than the currently used methods. The proposed method was applied to a coronary artery disease dataset from the Wellcome Trust Case Control Consortium (WTCCC) study and the Framingham Heart Study (FHS) dataset, and the early-onset myocardial infarction (EOMI) exome sequence datasets with European origin from the NHLBI's Exome Sequencing Project. We discovered that 6 of 27 pairs of significantly interacted genes in the FHS were replicated in the independent WTCCC study and 24 pairs of significantly interacted genes after applying Bonferroni correction in the EOMI study. PMID:26173972
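
    The shift from SNP-pair tests to region-pair tests can be caricatured as: compress each gene region to a low-dimensional summary, then test an interaction between the summaries in a logistic model. The sketch below uses a single leading principal component per region and simulated genotypes; the paper's functional logistic regression uses functional-data expansions rather than plain PCA, so this is only a structural illustration.

    # Sketch of the region-level interaction idea: compress SNPs in each gene region to a
    # leading component, then test a component-by-component interaction in a logistic model.
    # Synthetic genotypes; not the paper's functional logistic regression.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2024)
    n, p1, p2 = 2000, 30, 40                    # subjects, SNPs in gene A, SNPs in gene B
    G1 = rng.binomial(2, 0.3, size=(n, p1)).astype(float)
    G2 = rng.binomial(2, 0.3, size=(n, p2)).astype(float)

    def leading_component(G):
        Gc = G - G.mean(axis=0)
        _, _, vt = np.linalg.svd(Gc, full_matrices=False)
        return Gc @ vt[0]                       # leading principal-component scores

    a, b = leading_component(G1), leading_component(G2)
    # Simulate disease status with a genuine A x B interaction effect.
    logit = -0.5 + 0.15 * a + 0.1 * b + 0.3 * a * b
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([a, b, a * b]))
    fit = sm.Logit(y, X).fit(disp=False)
    print("interaction term p-value:", fit.pvalues[3])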

  16. Facilities | Computational Science | NREL

    Science.gov Websites

    NREL's computational science facilities advance technology innovation by providing scientists and engineers the ability to tackle energy challenges and to take full advantage of advanced computing hardware and software resources.

  17. Big Data: Next-Generation Machines for Big Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hack, James J.; Papka, Michael E.

    Addressing the scientific grand challenges identified by the US Department of Energy's (DOE's) Office of Science's programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE's most powerful leadership-class machines are set to arrive in 2017 and 2018: the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory-industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.

  18. Identifying Challenges to the Integration of Computer-Based Surveillance Information Systems in a Large City Health Department: A Case Study.

    PubMed

    Jennings, Jacky M; Stover, Jeffrey A; Bair-Merritt, Megan H; Fichtenberg, Caroline; Munoz, Mary Grace; Maziad, Rafiq; Ketemepi, Sherry Johnson; Zenilman, Jonathan

    2009-01-01

    Integrated infectious disease surveillance information systems have the potential to provide important new surveillance capacities and business efficiencies for local health departments. We conducted a case study at a large city health department of the primary computer-based infectious disease surveillance information systems during a 10-year period to identify the major challenges for information integration across the systems. The assessment included key informant interviews and evaluations of the computer-based surveillance information systems used for acute communicable diseases, human immunodeficiency virus/acquired immunodeficiency syndrome, sexually transmitted diseases, and tuberculosis. Assessments were conducted in 1998 with a follow-up in 2008. Assessments specifically identified and described the primary computer-based surveillance information system, any duplicative information systems, and selected variables collected. Persistent challenges to information integration across the information systems included the existence of duplicative data systems, differences in the variables used to collect similar information, and differences in basic architecture. The assessments identified a number of challenges for information integration across the infectious disease surveillance information systems at this city health department. The results suggest that local disease control programs use computer-based surveillance information systems that were not designed for data integration. To the extent that integration provides important new surveillance capacities and business efficiencies, we recommend that patient-centric information systems be designed that provide all the epidemiologic, clinical, and research needs in one system. In addition, the systems should include a standard system of elements and fields across similar surveillance systems.

  19. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Mary

    2014-09-19

    Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.
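
    Empirical autotuning, at its simplest, is a measure-and-select loop over code variants or parameters. The toy sketch below times a tiled matrix multiply for several tile sizes and keeps the fastest; real autotuners such as ATLAS search far larger spaces of code transformations, so this only shows the search structure.

    # Toy empirical autotuning loop: time a tiled matrix multiply for several tile sizes
    # and keep the fastest variant.
    import time
    import numpy as np

    def tiled_matmul(A, B, tile):
        n = A.shape[0]
        C = np.zeros((n, n))
        for i in range(0, n, tile):
            for k in range(0, n, tile):
                C[i:i+tile] += A[i:i+tile, k:k+tile] @ B[k:k+tile]
        return C

    if __name__ == "__main__":
        n = 768
        rng = np.random.default_rng(9)
        A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
        best = None
        for tile in (32, 64, 128, 256, 768):
            t0 = time.perf_counter()
            tiled_matmul(A, B, tile)
            dt = time.perf_counter() - t0
            print(f"tile={tile:4d}: {dt*1e3:7.1f} ms")
            if best is None or dt < best[1]:
                best = (tile, dt)
        print("selected tile size:", best[0])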

  20. The challenge of ubiquitous computing in health care: technology, concepts and solutions. Findings from the IMIA Yearbook of Medical Informatics 2005.

    PubMed

    Bott, O J; Ammenwerth, E; Brigl, B; Knaup, P; Lang, E; Pilgram, R; Pfeifer, B; Ruderich, F; Wolff, A C; Haux, R; Kulikowski, C

    2005-01-01

    To review recent research efforts in the field of ubiquitous computing in health care. To identify current research trends and further challenges for medical informatics. Analysis of the contents of the Yearbook on Medical Informatics 2005 of the International Medical Informatics Association (IMIA). The Yearbook of Medical Informatics 2005 includes 34 original papers selected from 22 peer-reviewed scientific journals related to several distinct research areas: health and clinical management, patient records, health information systems, medical signal processing and biomedical imaging, decision support, knowledge representation and management, education and consumer informatics as well as bioinformatics. A special section on ubiquitous health care systems is devoted to recent developments in the application of ubiquitous computing in health care. Besides additional synoptical reviews of each of the sections the Yearbook includes invited reviews concerning E-Health strategies, primary care informatics and wearable healthcare. Several publications demonstrate the potential of ubiquitous computing to enhance effectiveness of health services delivery and organization. But ubiquitous computing is also a societal challenge, caused by the surrounding but unobtrusive character of this technology. Contributions from nearly all of the established sub-disciplines of medical informatics are demanded to turn the visions of this promising new research field into reality.
