Computational Aeroelastic Modeling of Airframes and Turbomachinery: Progress and Challenges
NASA Technical Reports Server (NTRS)
Bartels, R. E.; Sayma, A. I.
2006-01-01
Computational analyses such as computational fluid dynamics and computational structural dynamics have made major advances toward maturity as engineering tools. Computational aeroelasticity is the integration of these disciplines. As computational aeroelasticity matures it too finds an increasing role in the design and analysis of aerospace vehicles. This paper presents a survey of the current state of computational aeroelasticity with a discussion of recent research, success and continuing challenges in its progressive integration into multidisciplinary aerospace design. This paper approaches computational aeroelasticity from the perspective of the two main areas of application: airframe and turbomachinery design. An overview will be presented of the different prediction methods used for each field of application. Differing levels of nonlinear modeling will be discussed with insight into accuracy versus complexity and computational requirements. Subjects will include current advanced methods (linear and nonlinear), nonlinear flow models, use of order reduction techniques and future trends in incorporating structural nonlinearity. Examples in which computational aeroelasticity is currently being integrated into the design of airframes and turbomachinery will be presented.
ERIC Educational Resources Information Center
West, Patti; Rutstein, Daisy Wise; Mislevy, Robert J.; Liu, Junhui; Choi, Younyoung; Levy, Roy; Crawford, Aaron; DiCerbo, Kristen E.; Chappel, Kristina; Behrens, John T.
2010-01-01
A major issue in the study of learning progressions (LPs) is linking student performance on assessment tasks to the progressions. This report describes the challenges faced in making this linkage using Bayesian networks to model LPs in the field of computer networking. The ideas are illustrated with exemplar Bayesian networks built on Cisco…
ERIC Educational Resources Information Center
Johnson, James Nathaniel
2013-01-01
Ubiquitous computing is a near reality in both the private and public arena. Business and personal spaces are seeing a proliferation of mobile computing devices and pervasive computing technologies. This phenomenon is creating a unique set of challenges for organizational IT professionals, specifically in the numerous spillover effects of having…
USDA-ARS?s Scientific Manuscript database
The progressive improvement of computer science and development of auto-calibration techniques means that calibration of simulation models is no longer a major challenge for watershed planning and management. Modelers now increasingly focus on challenges such as improved representation of watershed...
Technology in Education. The Progress of Education Reform, 2006. Volume 6, Number 6
ERIC Educational Resources Information Center
Weiss, Suzanne
2006-01-01
For policymakers, educators and others interested in learning more about the one-to-one computing movement, this issue of "The Progress of Education Reform" spotlights three particularly useful resources: (1) a detailed review of the challenges faced by states and districts implementing laptop programs, and of lessons learned to date in…
Mesoscale Models of Fluid Dynamics
NASA Astrophysics Data System (ADS)
Boghosian, Bruce M.; Hadjiconstantinou, Nicolas G.
During the last half century, enormous progress has been made in the field of computational materials modeling, to the extent that in many cases computational approaches are used in a predictive fashion. Despite this progress, modeling of general hydrodynamic behavior remains a challenging task. One of the main challenges stems from the fact that hydrodynamics manifests itself over a very wide range of length and time scales. On one end of the spectrum, one finds the fluid's "internal" scale characteristic of its molecular structure (in the absence of quantum effects, which we omit in this chapter). On the other end, the "outer" scale is set by the characteristic sizes of the problem's domain. The resulting scale separation or lack thereof, as well as the existence of intermediate scales, are key to determining the optimal approach. Successful treatments require a judicious choice of the level of description, which is a delicate balancing act between the conflicting requirements of fidelity and manageable computational cost: a coarse description typically requires models for underlying processes occurring at smaller length and time scales; on the other hand, a fine-scale model will incur a significantly larger computational cost.
Durkin, Kevin; Conti-Ramsden, Gina
2012-01-01
Computer use draws on linguistic abilities. Using this medium thus presents challenges for young people with Specific Language Impairment (SLI) and raises questions of whether computer-based tasks are appropriate for them. We consider theoretical arguments predicting impaired performance and negative outcomes relative to peers without SLI versus the possibility of positive gains. We examine the relationship between frequency of computer use (for leisure and educational purposes) and educational achievement; in particular examination performance at the end of compulsory education and level of educational progress two years later. Participants were 49 young people with SLI and 56 typically developing (TD) young people. At around age 17, the two groups did not differ in frequency of educational computer use or leisure computer use. There were no associations between computer use and educational outcomes in the TD group. In the SLI group, after PIQ was controlled for, educational computer use at around 17 years of age contributed substantially to the prediction of educational progress at 19 years. The findings suggest that educational uses of computers are conducive to educational progress in young people with SLI. PMID:23300610
Report of the Panel on Computer and Information Technology
NASA Technical Reports Server (NTRS)
Lundstrom, Stephen F.; Larsen, Ronald L.
1984-01-01
Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.
Engineering brain-computer interfaces: past, present and future.
Hughes, M A
2014-06-01
Electricity governs the function of both nervous systems and computers. Whilst ions move in polar fluids to depolarize neuronal membranes, electrons move in the solid-state lattices of microelectronic semiconductors. Joining these two systems together, to create an iono-electric brain-computer interface, is an immense challenge. However, such interfaces offer (and in select clinical contexts have already delivered) a method of overcoming disability caused by neurological or musculoskeletal pathology. To fulfill their theoretical promise, several specific challenges demand consideration. Rate-limiting steps cover a diverse range of disciplines including microelectronics, neuro-informatics, engineering, and materials science. As those who work at the tangible interface between brain and outside world, neurosurgeons are well placed to contribute to, and inform, this cutting edge area of translational research. This article explores the historical background, status quo, and future of brain-computer interfaces; and outlines the challenges to progress and opportunities available to the clinical neurosciences community.
The challenge of computer mathematics.
Barendregt, Henk; Wiedijk, Freek
2005-10-15
Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
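The division of labour described above (the human supplies definitions, algorithms, and proofs; the machine checks that they are well formed and correct) can be made concrete with a tiny proof-assistant example. A minimal sketch in Lean 4, an illustrative system choice rather than one named in the abstract, assuming a recent toolchain in which the omega tactic is available:

```lean
-- The human supplies a definition, a computation, and a theorem with its proof;
-- the machine checks that the definition is well formed and the proof is correct.
def double (n : Nat) : Nat := n + n

#eval double 21   -- the machine computes: 42

theorem double_eq_two_mul (n : Nat) : double n = 2 * n := by
  unfold double
  omega             -- linear-arithmetic decision procedure closes the goal n + n = 2 * n
```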
Critical infrastructure protection : significant challenges in developing national capabilities
DOT National Transportation Integrated Search
2001-04-01
To address the concerns about protecting the nation's critical computer-dependent infrastructure, this General Accounting Office (GAO) report describes the progress of the National Infrastructure Protection Center (NIPC) in (1) developing national ca...
Progress in Computational Electron-Molecule Collisions
NASA Astrophysics Data System (ADS)
Rescigno, T. N.
1997-10-01
The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.
Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.
Stein, Lincoln D
2008-09-01
Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…
Language, Learning, and Identity in Social Networking Sites for Language Learning: The Case of Busuu
ERIC Educational Resources Information Center
Alvarez Valencia, Jose Aldemar
2014-01-01
Recent progress in the discipline of computer applications such as the advent of web-based communication, afforded by the Web 2.0, has paved the way for novel applications in language learning, namely, social networking. Social networking has challenged the area of Computer Mediated Communication (CMC) to expand its research palette in order to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert; Ang, James; Bergman, Keren
2014-02-10
Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable, exascale system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elnabawy, Ahmed O.; Rangarajan, Srinivas; Mavrikakis, Manos
2015-06-05
Computational chemistry, especially density functional theory, has experienced a remarkable growth in terms of application over the last few decades. This is attributed to the improvements in theory and computing infrastructure that enable the analysis of systems of unprecedented size and detail at an affordable computational expense. In this perspective, we discuss recent progress and current challenges facing electronic structure theory in the context of heterogeneous catalysis. We specifically focus on the impact of computational chemistry in elucidating and designing catalytic systems in three topics of interest to Haldor Topsøe – ammonia synthesis, hydrotreating, and NOx reduction. Furthermore, we then discuss the common tools and concepts in computational catalysis that underlie these topics and provide a perspective on the challenges and future directions of research in this area of catalysis.
Agent 2003 Conference on Challenges in Social Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Margaret Clemmons, ed.
Welcome to the Proceedings of the fourth in a series of agent simulation conferences cosponsored by Argonne National Laboratory and The University of Chicago. Agent 2003 is the second conference in which three Special Interest Groups from the North American Association for Computational Social and Organizational Science (NAACSOS) have been involved in planning the program--Computational Social Theory; Simulation Applications; and Methods, Toolkits and Techniques. The theme of Agent 2003, Challenges in Social Simulation, is especially relevant, as there seems to be no shortage of such challenges. Agent simulation has been applied with increasing frequency to social domains for several decades, and its promise is clear and increasingly visible. Like any nascent scientific methodology, however, it faces a number of problems or issues that must be addressed in order to progress. These challenges include: (1) Validating models relative to the social settings they are designed to represent; (2) Developing agents and interactions simple enough to understand but sufficiently complex to do justice to the social processes of interest; (3) Bridging the gap between empirically spare artificial societies and naturally occurring social phenomena; (4) Building multi-level models that span processes across domains; (5) Promoting a dialog among theoretical, qualitative, and empirical social scientists and area experts, on the one hand, and mathematical and computational modelers and engineers, on the other; (6) Using that dialog to facilitate substantive progress in the social sciences; and (7) Fulfilling the aspirations of users in business, government, and other application areas, while recognizing and addressing the preceding challenges. Although this list hardly exhausts the challenges the field faces, it does identify topics addressed throughout the presentations of Agent 2003. Agent 2003 is part of a much larger process in which new methods and techniques are applied to difficult social issues. Among the resources that give us the prospect of success is the innovative and transdisciplinary research community being built. We believe that Agent 2003 contributes to further progress in computational modeling of social processes, and we hope that you find these Proceedings to be stimulating and rewarding. As the horizons of this transdiscipline continue to emerge and converge, we hope to provide similar forums that will promote development of agent simulation modeling in the years to come.
The next generation of command post computing
NASA Astrophysics Data System (ADS)
Arnold, Ross D.; Lieb, Aaron J.; Samuel, Jason M.; Burger, Mitchell A.
2015-05-01
The future of command post computing demands an innovative new solution to address a variety of challenging operational needs. The Command Post of the Future is the Army's primary command and control decision support system, providing situational awareness and collaborative tools for tactical decision making, planning, and execution management from Corps to Company level. However, as the U.S. Army moves towards a lightweight, fully networked battalion, disconnected operations, thin client architecture and mobile computing become increasingly essential. The Command Post of the Future is not designed to support these challenges in the coming decade. Therefore, research into a hybrid blend of technologies is in progress to address these issues. This research focuses on a new command and control system utilizing the rich collaboration framework afforded by Command Post of the Future coupled with a new user interface consisting of a variety of innovative workspace designs. This new system is called Tactical Applications. This paper details a brief history of command post computing, presents the challenges facing the modern Army, and explores the concepts under consideration for Tactical Applications that meet these challenges in a variety of innovative ways.
Progress and Challenges in Coupled Hydrodynamic-Ecological Estuarine Modeling
Numerical modeling has emerged over the last several decades as a widely accepted tool for investigations in environmental sciences. In estuarine research, hydrodynamic and ecological models have moved along parallel tracks with regard to complexity, refinement, computational po...
Smith, Rob; Mathis, Andrew D; Ventura, Dan; Prince, John T
2014-01-01
For decades, mass spectrometry data has been analyzed to investigate a wide array of research interests, including disease diagnostics, biological and chemical theory, genomics, and drug development. Progress towards solving any of these disparate problems depends upon overcoming the common challenge of interpreting the large data sets generated. Despite interim successes, many data interpretation problems in mass spectrometry are still challenging. Further, though these challenges are inherently interdisciplinary in nature, the significant domain-specific knowledge gap between disciplines makes interdisciplinary contributions difficult. This paper provides an introduction to the burgeoning field of computational mass spectrometry. We illustrate key concepts, vocabulary, and open problems in MS-omics, as well as provide invaluable resources such as open data sets and key search terms and references. This paper will facilitate contributions from mathematicians, computer scientists, and statisticians to MS-omics that will fundamentally improve results over existing approaches and inform novel algorithmic solutions to open problems.
Efficient Learning of Continuous-Time Hidden Markov Models for Disease Progression
Liu, Yu-Ying; Li, Shuang; Li, Fuxin; Song, Le; Rehg, James M.
2016-01-01
The Continuous-Time Hidden Markov Model (CT-HMM) is an attractive approach to modeling disease progression due to its ability to describe noisy observations arriving irregularly in time. However, the lack of an efficient parameter learning algorithm for CT-HMM restricts its use to very small models or requires unrealistic constraints on the state transitions. In this paper, we present the first complete characterization of efficient EM-based learning methods for CT-HMM models. We demonstrate that the learning problem consists of two challenges: the estimation of posterior state probabilities and the computation of end-state conditioned statistics. We solve the first challenge by reformulating the estimation problem in terms of an equivalent discrete time-inhomogeneous hidden Markov model. The second challenge is addressed by adapting three approaches from the continuous time Markov chain literature to the CT-HMM domain. We demonstrate the use of CT-HMMs with more than 100 states to visualize and predict disease progression using a glaucoma dataset and an Alzheimer’s disease dataset. PMID:27019571
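A minimal sketch of the reformulation step described above, assuming a generator (rate) matrix Q and irregularly spaced observation times: the discrete time-inhomogeneous transition matrix for a gap dt is the matrix exponential expm(Q*dt), after which the standard forward recursion applies. The variable names and the toy two-state example are illustrative, not taken from the paper:

```python
# Sketch (not the authors' code): a CT-HMM with rate matrix Q, observed at
# irregular times, induces a discrete time-inhomogeneous HMM whose transition
# matrix for a gap dt is expm(Q * dt). Forward probabilities then follow the
# standard HMM recursion.
import numpy as np
from scipy.linalg import expm

def forward_loglik(Q, pi0, emission_logp, times):
    """Q: (S,S) generator (rows sum to 0); pi0: initial state distribution;
    emission_logp: (T,S) log-likelihood of each observation under each state;
    times: (T,) observation times, irregularly spaced."""
    log_alpha = np.log(pi0) + emission_logp[0]
    for t in range(1, len(times)):
        P = expm(Q * (times[t] - times[t - 1]))   # time-inhomogeneous transition matrix
        m = log_alpha.max()                       # log-sum-exp propagation of the forward message
        alpha = np.exp(log_alpha - m) @ P
        log_alpha = m + np.log(alpha) + emission_logp[t]
    return np.logaddexp.reduce(log_alpha)

# Hypothetical two-state example (healthy -> diseased); emission model omitted,
# only per-visit emission log-likelihoods are given.
Q = np.array([[-0.3, 0.3], [0.0, 0.0]])
pi0 = np.array([0.9, 0.1])
emission_logp = np.log(np.array([[0.8, 0.2], [0.5, 0.5], [0.2, 0.8]]))
print(forward_loglik(Q, pi0, emission_logp, times=np.array([0.0, 0.7, 2.5])))
```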
Evaluating and assessing impacts to development is an Agency priority (EPA’s Children’s Environmental Health Research Roadmap); however, the quantity of chemicals needing assessment and challenges of species extrapolation require alternative approaches to traditional animal studi...
Plank, G; Prassl, AJ; Augustin, C
2014-01-01
Despite the evident multiphysics nature of the heart – it is an electrically controlled mechanical pump – most modeling studies considered electrophysiology and mechanics in isolation. In no small part, this is due to the formidable modeling challenges involved in building strongly coupled, anatomically accurate and biophysically detailed multi-scale multi-physics models of cardiac electro-mechanics. Among the main challenges are the selection of model components and their adjustments to achieve integration into a consistent organ-scale model; dealing with technical difficulties such as the exchange of data between the electrophysiological and mechanical models, particularly when using different spatio-temporal grids for discretization; and, finally, the implementation of advanced numerical techniques to deal with the substantial computational burden. In this study we report on progress made in developing a novel modeling framework suited to tackle these challenges. PMID:24043050
Memory Corruption Mitigations and Their Implementation Progress in Third-Party Windows Applications
2012-09-01
coverage in the news, which helped the public recognize the importance of computers in everyday life and, more importantly, the challenges in securing... Media Players: Winamp, VLC Media Player, Quicktime Player, iTunes, Real Player; Instant Messaging Applications: mIRC, Yahoo Messenger, AIM, Nimbuzz... as cell phones and tablets, may boost this ratio in the upcoming years. In such a highly connected world, it is becoming more and more challenging
ERIC Educational Resources Information Center
Mellander, Gustavo A.; Mellander, Nelly
This document discusses the following critical issues in community college education: (1) computers and technology, (2) faculty retirements and the boomlets, (3) adjunct teachers, and (4) societal problems, including people being left behind. A digital divide has arisen in America and a national agenda is needed that will fully train all adults in…
Abou-Ayash, Samir; Boldt, Johannes; Vuck, Alexander
Full-arch rehabilitation of patients with severe tooth wear due to parafunctional behavior is a challenge for dentists and dental technicians, especially when a highly esthetic outcome is desired. A variety of different treatment options and prosthetic materials are available for such a clinical undertaking. The ongoing progress of computer-aided design/computer-assisted manufacture technologies in combination with all-ceramic materials provides a predictable workflow for these complex cases. This case history report describes a comprehensive, step-by-step treatment protocol leading to an optimally predictable treatment outcome for an esthetically compromised patient.
Computer-assisted cartography: an overview.
Guptill, S.C.; Starr, L.E.
1984-01-01
An assessment of the current status of computer-assisted cartography, in part, is biased by one's view of the cartographic process as a whole. From a traditional viewpoint we are concerned about automating the mapping process; from a progressive viewpoint we are concerned about using the tools of computer science to convey spatial information. On the surface these viewpoints appear to be in opposition. However, it is postulated that in the final analysis they share the same goal. This overview uses the perspectives of both viewpoints to depict the current state of computer-assisted cartography and to speculate on future goals, trends, and challenges. -Authors
Enabling Large-Scale Biomedical Analysis in the Cloud
Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen
2013-01-01
Recent progress in high-throughput instrumentations has led to an astonishing growth in both volume and complexity of biomedical data collected from various sources. The planet-size data brings serious challenges to the storage and computing technologies. Cloud computing is an alternative to crack the nut because it gives concurrent consideration to enable storage and high-performance computing on large-scale data. This work briefly introduces the data intensive computing system and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research to make the vast amount of diversification data meaningful and usable. PMID:24288665
1985-02-18
expected to double in each of the next five years. This growth of computer technology presents education with several challenges. One of the...provide some reassurance to educators who worry that the group discussion will slow down the progress of students' learning. The second issue examined...on learning from media. Review of Educational Research, 53, 445-459. Collins, A., Bruce, B., & Rubin, A. (1982). Microcomputer-based writing activities
Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling
NASA Astrophysics Data System (ADS)
Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.
2014-12-01
Microtomography provides detailed 3D internal structures of rocks in micro- to tens of nano-meter resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties as micron or sub-micron resolution can only be done on the sample-scale of millimeters or even less than a millimeter. We present here a recently developed computational workflow for the analysis of microstructures including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, biological and food science materials. We have also applied the technique on high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and its subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.
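As a generic illustration of the upscaling step (this is not the percolation-based renormalization procedure mentioned above), the sketch below coarse-grains a voxelized permeability field by 2x2x2 blocks and reports the classical harmonic-mean and arithmetic-mean bounds on each block's effective permeability; the field and function names are hypothetical:

```python
# A minimal, generic block-upscaling sketch for a voxelized permeability field
# (illustrative only). For each 2x2x2 block we report the harmonic-mean (series)
# and arithmetic-mean (parallel) bounds on the effective block permeability.
import numpy as np

def block_upscale(perm, factor=2):
    """perm: 3D array of voxel permeabilities (all > 0).
    Returns (lower, upper) coarse arrays: harmonic and arithmetic block means."""
    nx, ny, nz = (s // factor for s in perm.shape)
    p = perm[:nx * factor, :ny * factor, :nz * factor]
    blocks = p.reshape(nx, factor, ny, factor, nz, factor).transpose(0, 2, 4, 1, 3, 5)
    blocks = blocks.reshape(nx, ny, nz, -1)
    lower = 1.0 / np.mean(1.0 / blocks, axis=-1)   # harmonic mean bound
    upper = np.mean(blocks, axis=-1)               # arithmetic mean bound
    return lower, upper

rng = np.random.default_rng(0)
k = np.exp(rng.normal(size=(8, 8, 8)))             # hypothetical lognormal voxel field
lo, hi = block_upscale(k)
print(lo.shape, float(lo.mean()), float(hi.mean()))
```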
Bayesian Methods for Scalable Multivariate Value-Added Assessment
ERIC Educational Resources Information Center
Lockwood, J. R.; McCaffrey, Daniel F.; Mariano, Louis T.; Setodji, Claude
2007-01-01
There is increased interest in value-added models relying on longitudinal student-level test score data to isolate teachers' contributions to student achievement. The complex linkage of students to teachers as students progress through grades poses both substantive and computational challenges. This article introduces a multivariate Bayesian…
PDP-11 Meeting School District Administrative Data Processing Challenges.
ERIC Educational Resources Information Center
Technological Horizons in Education, 1983
1983-01-01
Elmhurst Community Unit School District 205 (Illinois) is currently using a Digital PDP-11/70 computer system and administrative software to handle administrative functions with ease and efficiency. These functions include production of reports and payroll, student scheduling, reporting grades and student progress, and maintaining student…
Adult Learning Innovations: Vehicles for Social and Economic Progress.
ERIC Educational Resources Information Center
Hilton, William J.
The cost-effective use of communications technologies to extend adult learning opportunities is one way state policymakers can face the challenge of responding to growing demands for essential public services. The major new educational technologies that are available are television, radio, telephone, computers, communications satellites, and…
Quantum Computation: Entangling with the Future
NASA Technical Reports Server (NTRS)
Jiang, Zhang
2017-01-01
Commercial applications of quantum computation have become viable due to the rapid progress of the field in recent years. Efficient quantum algorithms have been discovered to cope with the most challenging real-world problems that are too hard for classical computers. Manufactured quantum hardware has reached unprecedented precision and controllability, enabling fault-tolerant quantum computation. Here, I give a brief introduction on what principles in quantum mechanics promise its unparalleled computational power. I will discuss several important quantum algorithms that achieve exponential or polynomial speedup over any classical algorithm. Building a quantum computer is a daunting task, and I will talk about the criteria and various implementations of quantum computers. I conclude the talk with near-future commercial applications of a quantum computer.
Computational challenges of structure-based approaches applied to HIV.
Forli, Stefano; Olson, Arthur J
2015-01-01
Here, we review some of the opportunities and challenges that we face in computational modeling of HIV therapeutic targets and structural biology, both in terms of methodology development and structure-based drug design (SBDD). Computational methods have provided fundamental support to HIV research since the initial structural studies, helping to unravel details of HIV biology. Computational models have proved to be a powerful tool to analyze and understand the impact of mutations and to overcome their structural and functional influence in drug resistance. With the availability of structural data, in silico experiments have been instrumental in exploiting and improving interactions between drugs and viral targets, such as HIV protease, reverse transcriptase, and integrase. Issues such as viral target dynamics and mutational variability, as well as the role of water and estimates of binding free energy in characterizing ligand interactions, are areas of active computational research. Ever-increasing computational resources and theoretical and algorithmic advances have played a significant role in progress to date, and we envision a continually expanding role for computational methods in our understanding of HIV biology and SBDD in the future.
Constraint-based stoichiometric modelling from single organisms to microbial communities
Olivier, Brett G.; Bruggeman, Frank J.; Teusink, Bas
2016-01-01
Microbial communities are ubiquitously found in Nature and have direct implications for the environment, human health and biotechnology. The species composition and overall function of microbial communities are largely shaped by metabolic interactions such as competition for resources and cross-feeding. Although considerable scientific progress has been made towards mapping and modelling species-level metabolism, elucidating the metabolic exchanges between microorganisms and steering the community dynamics remain an enormous scientific challenge. In view of the complexity, computational models of microbial communities are essential to obtain systems-level understanding of ecosystem functioning. This review discusses the applications and limitations of constraint-based stoichiometric modelling tools, and in particular flux balance analysis (FBA). We explain this approach from first principles and identify the challenges one faces when extending it to communities, and discuss the approaches used in the field in view of these challenges. We distinguish between steady-state and dynamic FBA approaches extended to communities. We conclude that much progress has been made, but many of the challenges are still open. PMID:28334697
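A minimal sketch of the core FBA computation discussed above, on a toy three-reaction network rather than a real metabolic reconstruction: maximize a biomass objective c·v subject to the steady-state constraint S·v = 0 and flux bounds.

```python
# A minimal FBA sketch on a toy network (illustrative; real reconstructions use
# dedicated constraint-based modelling tools). Maximize c @ v subject to
# S @ v = 0 and lb <= v <= ub.
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass
S = np.array([
    [1, -1,  0],   # metabolite A: produced by uptake, consumed by conversion
    [0,  1, -1],   # metabolite B: produced by conversion, consumed by biomass
])
c = np.array([0, 0, 1])                    # objective: maximize the biomass flux
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 units

res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)   # expected: 10, limited by the uptake bound
```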
Security: Progress and Challenges
ERIC Educational Resources Information Center
Luker, Mark A.
2004-01-01
The Homepage column in the March/April 2003 issue of "EDUCAUSE Review" explained the national implication of security vulnerabilities in higher education and the role of the EDUCAUSE/Internet2 Computer and Network Security Task Force in representing the higher education sector in the development of the National Strategy to Secure Cyberspace. Among…
ToxCast and the Use of Human Relevant In Vitro Exposures ...
The path for incorporating new approach methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. These challenges include sufficient coverage of toxicological mechanisms to meaningfully interpret negative test results, development of increasingly relevant test systems, computational modeling to integrate experimental data, putting results in a dose and exposure context, characterizing uncertainty, and efficient validation of the test systems and computational models. The presentation will cover progress at the U.S. EPA in systematically addressing each of these challenges and delivering more human-relevant risk-based assessments. This abstract does not necessarily reflect U.S. EPA policy. Presentation at the British Toxicological Society Annual Congress on ToxCast and the Use of Human Relevant In Vitro Exposures: Incorporating high-throughput exposure and toxicity testing data for 21st century risk assessments.
Towards Effective Non-Invasive Brain-Computer Interfaces Dedicated to Gait Rehabilitation Systems
Castermans, Thierry; Duvinage, Matthieu; Cheron, Guy; Dutoit, Thierry
2014-01-01
In the last few years, significant progress has been made in the field of walk rehabilitation. Motor cortex signals in bipedal monkeys have been interpreted to predict walk kinematics. Epidural electrical stimulation in rats and in one young paraplegic has been realized to partially restore motor control after spinal cord injury. However, these experimental trials are far from being applicable to all patients suffering from motor impairments. Therefore, it is thought that more simple rehabilitation systems are desirable in the meanwhile. The goal of this review is to describe and summarize the progress made in the development of non-invasive brain-computer interfaces dedicated to motor rehabilitation systems. In the first part, the main principles of human locomotion control are presented. The paper then focuses on the mechanisms of supra-spinal centers active during gait, including results from electroencephalography, functional brain imaging technologies [near-infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI), positron-emission tomography (PET), single-photon emission-computed tomography (SPECT)] and invasive studies. The first brain-computer interface (BCI) applications to gait rehabilitation are then presented, with a discussion about the different strategies developed in the field. The challenges to raise for future systems are identified and discussed. Finally, we present some proposals to address these challenges, in order to contribute to the improvement of BCI for gait rehabilitation. PMID:24961699
Zhu, Fan; Panwar, Bharat; Dodge, Hiroko H; Li, Hongdong; Hampstead, Benjamin M; Albin, Roger L; Paulson, Henry L; Guan, Yuanfang
2016-10-05
We present COMPASS, a COmputational Model to Predict the development of Alzheimer's diSease Spectrum, to model Alzheimer's disease (AD) progression. This was the best-performing method in the recent crowdsourcing benchmark study, the DREAM Alzheimer's Disease Big Data challenge, to predict changes in Mini-Mental State Examination (MMSE) scores over 24 months using standardized data. In the present study, we conducted three additional analyses beyond the DREAM challenge question to improve the clinical contribution of our approach, including: (1) adding pre-validated baseline cognitive composite scores of ADNI-MEM and ADNI-EF, (2) identifying subjects with significant declines in MMSE scores, and (3) incorporating SNPs of the top 10 genes connected to APOE identified from a functional-relationship network. For (1) above, we significantly improved predictive accuracy, especially for the Mild Cognitive Impairment (MCI) group. For (2), we achieved an area under the ROC curve of 0.814 in predicting significant MMSE decline: our model has 100% precision at 5% recall, and 91% accuracy at 10% recall. For (3), the "genetic only" model has a Pearson's correlation of 0.15 in predicting progression in the MCI group. Even though the addition of this limited genetic model to COMPASS did not improve prediction of progression in the MCI group, the predictive ability of SNP information extended beyond the well-known APOE allele.
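The operating points quoted above (precision at a fixed recall, area under the ROC curve) can be read off standard curves computed from predicted risk scores; a minimal sketch on synthetic scores, not the COMPASS predictions:

```python
# A minimal sketch of reading precision at a target recall from predicted risk
# scores (synthetic data here, not the COMPASS output).
import numpy as np
from sklearn.metrics import precision_recall_curve, roc_auc_score

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=500)                      # 1 = significant MMSE decline
scores = y_true * 0.8 + rng.normal(scale=0.5, size=500)    # hypothetical model scores

precision, recall, _ = precision_recall_curve(y_true, scores)

def precision_at_recall(target):
    # best precision among operating points that meet the recall target
    return precision[recall >= target].max()

print("AUC:", roc_auc_score(y_true, scores))
print("precision at 5% recall:", precision_at_recall(0.05))
print("precision at 10% recall:", precision_at_recall(0.10))
```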
Thrifty: An Exascale Architecture for Energy Proportional Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torrellas, Josep
2014-12-23
The objective of this project is to design different aspects of a novel exascale architecture called Thrifty. Our goal is to focus on the challenges of power/energy efficiency, performance, and resiliency in exascale systems. The project includes work on computer architecture (Josep Torrellas from University of Illinois), compilation (Daniel Quinlan from Lawrence Livermore National Laboratory), runtime and applications (Laura Carrington from University of California San Diego), and circuits (Wilfred Pinfold from Intel Corporation). In this report, we focus on the progress at the University of Illinois during the last year of the grant (September 1, 2013 to August 31, 2014). We also point to the progress in the other collaborating institutions when needed.
Fermilab computing at the Intensity Frontier
Group, Craig; Fuess, S.; Gutsche, O.; ...
2015-12-23
The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.
Perspectives on the Future of CFD
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2000-01-01
This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time CFD has progressed along with computing power. Numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are routinely computed now, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources changed to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of the heuristic model, and the development of CFD and information technology (IT) tools.
Computational Hemodynamics Involving Artificial Devices
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kiris, Cetin; Feiereisen, William (Technical Monitor)
2001-01-01
This paper reports the progress being made towards developing complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges different from computing the blood flow in natural hearts and arteries. Many elements are needed, such as flow solvers, geometry modeling including flexible walls, moving boundary procedures, and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended in the recent past to the analysis and development of mechanical devices. The blood flow in these devices is practically incompressible and Newtonian, and thus various incompressible Navier-Stokes solution procedures can be selected depending on the choice of formulations, variables and numerical schemes. Two primitive variable formulations used are discussed, as well as the overset grid approach to handle complex moving geometry. This procedure has been applied to several artificial devices. Among these, recent progress made in developing the DeBakey axial-flow blood pump will be presented from a computational point of view. Computational and clinical issues will be discussed in detail, as well as additional work needed.
ERIC Educational Resources Information Center
Wang, Yinying
2017-01-01
Despite abundant data and increasing data availability brought by technological advances, there have been very few education policy studies that have capitalized on big data--characterized by large volume, wide variety, and high velocity. Drawing on the recent progress of using big data in public policy and computational social science…
ERIC Educational Resources Information Center
Umstead, Lyubov
2013-01-01
The number of English Language Learners (ELLs) is rapidly growing. Teachers continue facing challenges in providing effective content instruction to ELLs while helping them learn English. New and improved approaches are necessary to meet the individual learning needs of this diverse group of students and help them progress academically while…
Biomimetic robots using EAP as artificial muscles - progress and challenges
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2004-01-01
Biology offers a great model for emulation in areas ranging from tools, computational algorithms, materials science, mechanisms and information technology. In recent years, the field of biomimetics, namely mimicking biology, has blossomed with significant advances enabling the reverse engineering of many animals' functions and implementation of some of these capabilities.
Challenges in reusing transactional data for daily documentation in neonatal intensive care.
Kim, G R; Lawson, E E; Lehmann, C U
2008-11-06
The reuse of transactional data for clinical documentation requires navigation of computational, institutional and adaptive barriers. We describe organizational and technical issues in developing and deploying a daily progress note tool in a tertiary neonatal intensive care unit that reuses and aggregates data from a commercial integrated clinical information system.
ELM Meets Urban Big Data Analysis: Case Studies
Chen, Huajun; Chen, Jiaoyan
2016-01-01
In recent years, the rapid progress of urban computing has engendered big data issues, which create both opportunities and challenges. The heterogeneity and large volume of data and the large gap between the physical and virtual worlds have made it difficult to quickly solve practical problems in urban computing. In this paper, we propose a general application framework of ELM for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
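A minimal sketch of the ELM building block assumed by such a framework: hidden-layer weights are drawn at random and fixed, and only the output weights are solved in a regularized least-squares (pseudoinverse-style) step. The toy regression data below are illustrative, not the urban data sets discussed above:

```python
# A minimal Extreme Learning Machine (ELM) sketch: random, fixed hidden-layer
# weights; only the output weights are fit via ridge-regularized least squares.
import numpy as np

class ELM:
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y, reg=1e-3):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                   # random feature map
        # closed-form output weights
        self.beta = np.linalg.solve(H.T @ H + reg * np.eye(self.n_hidden), H.T @ y)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Toy regression example
X = np.random.default_rng(2).uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])
model = ELM(n_hidden=50).fit(X, y)
print("train RMSE:", float(np.sqrt(np.mean((model.predict(X) - y) ** 2))))
```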
Computational approaches to identify functional genetic variants in cancer genomes
Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris; Ritchie, Graham R.S.; Creixell, Pau; Karchin, Rachel; Vazquez, Miguel; Fink, J. Lynn; Kassahn, Karin S.; Pearson, John V.; Bader, Gary; Boutros, Paul C.; Muthuswamy, Lakshmi; Ouellette, B.F. Francis; Reimand, Jüri; Linding, Rune; Shibata, Tatsuhiro; Valencia, Alfonso; Butler, Adam; Dronov, Serge; Flicek, Paul; Shannon, Nick B.; Carter, Hannah; Ding, Li; Sander, Chris; Stuart, Josh M.; Stein, Lincoln D.; Lopez-Bigas, Nuria
2014-01-01
The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype. PMID:23900255
Scientific Discovery through Advanced Computing in Plasma Science
NASA Astrophysics Data System (ADS)
Tang, William
2005-03-01
Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research during the 21st Century. For example, the Department of Energy's ``Scientific Discovery through Advanced Computing'' (SciDAC) Program was motivated in large measure by the fact that formidable scientific challenges in its research portfolio could best be addressed by utilizing the combination of the rapid advances in super-computing technology together with the emergence of effective new algorithms and computational methodologies. The imperative is to translate such progress into corresponding increases in the performance of the scientific codes used to model complex physical systems such as those encountered in high temperature plasma research. If properly validated against experimental measurements and analytic benchmarks, these codes can provide reliable predictive capability for the behavior of a broad range of complex natural and engineered systems. This talk reviews recent progress and future directions for advanced simulations with some illustrative examples taken from the plasma science applications area. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by the combination of access to powerful new computational resources together with innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning a huge range in time and space scales. In particular, the plasma science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to the computational science area.
Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.
Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo
2017-01-01
The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the biological data amount is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are highly needed as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
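For reference, the classical global alignment dynamic program behind the case study mentioned above, as a minimal single-threaded Needleman-Wunsch sketch; the survey's interest is in how this kind of dynamic program maps onto different HPC platforms:

```python
# A minimal Needleman-Wunsch global alignment sketch (single-threaded reference).
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    # score[i][j] = best score aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m]

print(needleman_wunsch("GATTACA", "GCATGCU"))
```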
ERIC Educational Resources Information Center
Beavis, Catherine; Muspratt, Sandy; Thompson, Roberta
2015-01-01
There is considerable enthusiasm in many quarters for the incorporation of digital games into the classroom, and the capacity of games to engage and challenge players, present complex representations and experiences, foster collaborative learning, and promote deep learning. But while there is increasing research documenting the progress and…
Jedynak, Bruno M.; Liu, Bo; Lang, Andrew; Gel, Yulia; Prince, Jerry L.
2014-01-01
Understanding the time-dependent changes of biomarkers related to Alzheimer’s disease (AD) is a key to assessing disease progression and to measuring the outcomes of disease-modifying therapies. In this paper, we validate an Alzheimer’s disease progression score model which uses multiple biomarkers to quantify the AD progression of subjects following three assumptions: (1) there is a unique disease progression for all subjects, (2) each subject has a different age of onset and rate of progression, and (3) each biomarker is sigmoidal as a function of disease progression. Fitting the parameters of this model is a challenging problem which we approach using an alternating least squares optimization algorithm. In order to validate this optimization scheme under realistic conditions, we use the Alzheimer’s Disease Neuroimaging Initiative (ADNI) cohort. With the help of Monte Carlo simulations, we show that most of the global parameters of the model are tightly estimated, thus enabling an ordering of the biomarkers that fit the model well: the Rey auditory verbal learning test with 30-minute delay, the sum of the two lateral hippocampal volumes divided by the intra-cranial volume, followed by (in no particular order) the clinical dementia rating sum of boxes score and the mini mental state examination score, and lastly the Alzheimer’s disease assessment scale-cognitive subscale. PMID:25444605
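A minimal sketch of the model structure described above, with illustrative parameter names rather than the paper's code: each subject maps visit age to a progression score through a subject-specific rate and offset, each biomarker is a sigmoid of that score, and alternating least squares would alternate the per-subject fit shown here with an analogous per-biomarker fit.

```python
# Sketch of the progression-score model (illustrative, not the authors' code):
# subject i maps visit age t to a score s = a_i * t + b_i; biomarker k is a
# sigmoid of s. Shown: the per-subject half-step of an alternating fit.
import numpy as np
from scipy.optimize import least_squares

def sigmoid(s, lo, hi, center, slope):
    return lo + (hi - lo) / (1.0 + np.exp(-slope * (s - center)))

def subject_residuals(params, ages, y, biomarker_params):
    """Residuals for one subject: params = (a_i, b_i); y is (visits, biomarkers)."""
    a_i, b_i = params
    s = a_i * ages + b_i
    pred = np.column_stack([sigmoid(s, *bp) for bp in biomarker_params])
    return (pred - y).ravel()

# Hypothetical two-biomarker setup and one subject's noisy visits
biomarker_params = [(0.0, 1.0, 5.0, 0.8), (1.0, 0.0, 7.0, 0.5)]
ages = np.array([70.0, 72.0, 74.0, 76.0])
true_s = 0.9 * ages - 60.0
y = np.column_stack([sigmoid(true_s, *bp) for bp in biomarker_params])
y += np.random.default_rng(3).normal(scale=0.02, size=y.shape)

fit = least_squares(subject_residuals, x0=[1.0, -65.0], args=(ages, y, biomarker_params))
print("estimated (rate, offset):", fit.x)
```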
Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena
2018-01-01
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun the models just because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including the models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to comprehend if the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tool used to simulate and develop computational models. Constant improvement on experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards which underline replicability and reproducibility of research results. PMID:29765315
[Facing the challenges of ubiquitous computing in the health care sector].
Georgieff, Peter; Friedewald, Michael
2010-01-01
The steady progress of microelectronics, communications and information technology will enable the realisation of the vision for "ubiquitous computing" where the Internet extends into the real world embracing everyday objects. The necessary technical basis is already in place. Due to their diminishing size, constantly falling price and declining energy consumption, processors, communications modules and sensors are being increasingly integrated into everyday objects today. This development is opening up huge opportunities for both the economy and individuals. In the present paper we discuss possible applications, but also technical, social and economic barriers to a widespread use of ubiquitous computing in the health care sector.
Successes and Challenges for Flow Control Simulations
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.
2008-01-01
A survey is made of recent computations published for synthetic jet flow control cases from a CFD workshop held in 2004. The three workshop cases were originally chosen to represent different aspects of flow control physics: nominally 2-D synthetic jet into quiescent air, 3-D circular synthetic jet into turbulent boundary-layer crossflow, and nominally 2-D flow-control (both steady suction and oscillatory zero-net-mass-flow) for separation control on a simple wall-mounted aerodynamic hump shape. The purpose of this survey is to summarize the progress as related to these workshop cases, particularly noting successes and remaining challenges for computational methods. It is hoped that this summary will also by extension serve as an overview of the state-of-the-art of CFD for these types of flow-controlled flow fields in general.
NASA Astrophysics Data System (ADS)
Baker, Kevin; Yu, Zhaoning; Ebert, Matthew; Sun, Yuan; Saffman, Mark
2016-05-01
One of the outstanding challenges facing neutral atom qubit approaches to quantum computation is suppression of crosstalk between proximal qubits due to scattered light that is generated during optical pumping and measurement operations. We have recently proposed a dual species approach to solving this challenge whereby computational qubits encoded in Cs atoms are entangled with Rb atoms via an interspecies Rydberg interaction. The quantum state of a Cs atom can then be read out by measuring the state of a Rb atom. The difference in resonant wavelengths of the two species effectively suppresses crosstalk. We will present progress towards experimental demonstration of dual species entanglement using Rb and Cs atoms cotrapped in a single beam optical trap. Work supported by the ARL CDQI.
An Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.
Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C
2016-01-01
Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model, and also produces a multiscale representation which allows one to progressively simplify, or regularize, all skeleton types. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, then to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface, or curve, skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.
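As a toy illustration of the thresholding idea (a field defined on the skeleton is cut at increasing levels to obtain progressively simpler skeletons), the sketch below prunes a 2D medial axis using the distance transform as a stand-in importance measure. The stand-in measure, the synthetic shape, and the use of scikit-image are my assumptions; the paper's actual framework uses a transport-based density, not the distance transform.

```python
import numpy as np
from skimage.morphology import medial_axis

# Synthetic binary shape: a rectangle with a small protrusion that produces a
# spurious skeleton branch.
shape = np.zeros((80, 120), dtype=bool)
shape[20:60, 10:110] = True
shape[35:45, 105:118] = True

skeleton, distance = medial_axis(shape, return_distance=True)
importance = distance * skeleton      # stand-in importance field on the skeleton

for tau in (0, 3, 6):                 # increasing threshold -> simpler skeleton
    simplified = importance > tau
    print(f"threshold {tau}: {int(simplified.sum())} skeleton pixels kept")
```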
Shen, Hong-Bin; Yi, Dong-Liang; Yao, Li-Xiu; Yang, Jie; Chou, Kuo-Chen
2008-10-01
In the postgenomic age, with the avalanche of protein sequences generated and relatively slow progress in determining their structures by experiments, it is important to develop automated methods to predict the structure of a protein from its sequence. The membrane proteins are a special group in the protein family that accounts for approximately 30% of all proteins; however, solved membrane protein structures only represent less than 1% of known protein structures to date. Although great success has been achieved in developing computational intelligence techniques to predict secondary structures in both globular and membrane proteins, there is still much challenging work in this regard. In this review article, we first summarize the recent progress of automation methodology development in predicting protein secondary structures, especially in membrane proteins; we will then give some future directions in this research field.
Ab initio calculations for industrial materials engineering: successes and challenges.
Wimmer, Erich; Najafabadi, Reza; Young, George A; Ballard, Jake D; Angeliu, Thomas M; Vollmer, James; Chambers, James J; Niimi, Hiroaki; Shaw, Judy B; Freeman, Clive; Christensen, Mikael; Wolf, Walter; Saxe, Paul
2010-09-29
Computational materials science based on ab initio calculations has become an important partner to experiment. This is demonstrated here for the effect of impurities and alloying elements on the strength of a Zr twist grain boundary, the dissociative adsorption and diffusion of iodine on a zirconium surface, the diffusion of oxygen atoms in a Ni twist grain boundary and in bulk Ni, and the dependence of the work function of a TiN-HfO(2) junction on the replacement of N by O atoms. In all of these cases, computations provide atomic-scale understanding as well as quantitative materials property data of value to industrial research and development. There are two key challenges in applying ab initio calculations, namely a higher accuracy in the electronic energy and the efficient exploration of large parts of the configurational space. While progress in these areas is fueled by advances in computer hardware, innovative theoretical concepts combined with systematic large-scale computations will be needed to realize the full potential of ab initio calculations for industrial applications.
Advances in the management of orbital fractures.
Nguyen, P N; Sullivan, P
1992-01-01
Great progress has been made in both the basic science and the clinical knowledge base used in orbital reconstruction. With this, increasingly complex orbital reconstructive problems are better managed. The diagnosis, treatment plan, and the actual reconstruction have evolved to a higher level. Several areas of progress are of note: the greater appreciation of the intimate relation between the bony orbit's shape and the position of the globe; application of computer technology in orbital injuries; effect of rigid fixation on autogenous and alloplastic grafts; and the use of advanced biocompatible synthetic materials in orbital reconstruction. Although this progress has great impact on treatment of orbital injuries, there are many unanswered challenges in the treatment of the fragile frame of the window to the human soul.
The application of quantum mechanics in structure-based drug design.
Mucs, Daniel; Bryce, Richard A
2013-03-01
Computational chemistry has become an established and valuable component in structure-based drug design. However the chemical complexity of many ligands and active sites challenges the accuracy of the empirical potentials commonly used to describe these systems. Consequently, there is a growing interest in utilizing electronic structure methods for addressing problems in protein-ligand recognition. In this review, the authors discuss recent progress in the development and application of quantum chemical approaches to modeling protein-ligand interactions. The authors specifically consider the development of quantum mechanics (QM) approaches for studying large molecular systems pertinent to biology, focusing on protein-ligand docking, protein-ligand binding affinities and ligand strain on binding. Although computation of binding energies remains a challenging and evolving area, current QM methods can underpin improved docking approaches and offer detailed insights into ligand strain and into the nature and relative strengths of complex active site interactions. The authors envisage that QM will become an increasingly routine and valued tool of the computational medicinal chemist.
Challenges of Future High-End Computing
NASA Technical Reports Server (NTRS)
Bailey, David; Kutler, Paul (Technical Monitor)
1998-01-01
The next major milestone in high performance computing is a sustained rate of one Pflop/s (also written one petaflops, or 10^15 floating-point operations per second). In addition to prodigiously high computational performance, such systems must of necessity feature very large main memories, as well as comparably high I/O bandwidth and huge mass storage facilities. The current consensus of scientists who have studied these issues is that "affordable" petaflops systems may be feasible by the year 2010, assuming that certain key technologies continue to progress at current rates. One important question is whether applications can be structured to perform efficiently on such systems, which are expected to incorporate many thousands of processors and deeply hierarchical memory systems. To answer these questions, advanced performance modeling techniques, including simulation of future architectures and applications, may be required. It may also be necessary to formulate "latency tolerant algorithms" and other completely new algorithmic approaches for certain applications. This talk will give an overview of these challenges.
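For scale, a quick back-of-the-envelope figure for a sustained petaflops machine; the workload size of 10^18 operations is just an illustrative choice, not taken from the talk.

```latex
\[
1~\mathrm{Pflop/s} = 10^{15}~\mathrm{flop/s},
\qquad
\frac{10^{18}~\mathrm{flop}}{10^{15}~\mathrm{flop/s}} = 10^{3}~\mathrm{s} \approx 17~\mathrm{minutes}.
\]
```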
ISCB: past-present perspective for the International Society for Computational Biology.
Rost, Burkhard
2014-01-01
Since its establishment in 1997, International Society for Computational Biology (ISCB) has contributed importantly toward advancing the understanding of living systems through computation. The ISCB represents nearly 3000 members working in >70 countries. It has doubled the number of members since 2007. At the same time, the number of meetings organized by the ISCB has increased from two in 2007 to eight in 2013, and the society has cemented many lasting alliances with regional societies and specialist groups. ISCB is ready to grow into a challenging and promising future. The progress over the past 7 years has resulted from the vision, and possibly more importantly, the passion and hard working dedication of many individuals.
ERIC Educational Resources Information Center
Edwards, Virginia B., Ed.
2015-01-01
Lofty ed-tech visions are always tempered by reality. Unexpected problems that arose during a launch of a 1-to-1 computing program and ambitious digital curriculum initiative in Los Angeles led to the dialing back of the effort. Financial, legal, and managerial repercussions continue to swirl in the wake. These problems should not prevent schools…
Rejniak, Katarzyna A.; Gerlee, Philip
2013-01-01
In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than culmination of cancer progression producing cells with the “invasive phenotype”. In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624
Computational Social Science: Exciting Progress and Future Challenges
NASA Astrophysics Data System (ADS)
Watts, Duncan
The past 15 years have witnessed a remarkable increase in both the scale and scope of social and behavioral data available to researchers, leading some to herald the emergence of a new field: "computational social science." Against these exciting developments stands a stubborn fact: that in spite of many thousands of published papers, there has been surprisingly little progress on the "big" questions that motivated the field in the first place--questions concerning systemic risk in financial systems, problem solving in complex organizations, and the dynamics of epidemics or social movements, among others. In this talk I highlight some examples of research that would not have been possible just a handful of years ago and that illustrate the promise of CSS. At the same time, they illustrate its limitations. I then conclude with some thoughts on how CSS can bridge the gap between its current state and its potential.
Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.
Bian, Yuemin; Xie, Xiang-Qun Sean
2018-04-09
Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of Pharmaceutical Science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and potent compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. Particularly, in this review, specifications and advantages are compared between experimental and computational FBDD, and additionally, limitations and future prospects are discussed and emphasized.
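As a small illustration of the fragment library selection step mentioned above, the sketch below applies a "Rule of Three"-style property filter to a toy library. RDKit, the example SMILES strings, and the exact cutoffs are assumptions chosen for illustration; the review does not prescribe a particular toolkit or filter.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_rule_of_three(smiles):
    """Keep fragments satisfying a common 'Rule of Three'-style filter."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False
    return (Descriptors.MolWt(mol) <= 300
            and Descriptors.MolLogP(mol) <= 3
            and Lipinski.NumHDonors(mol) <= 3
            and Lipinski.NumHAcceptors(mol) <= 3)

# Hypothetical mini-library: phenol, acetaminophen, palmitic acid.
library = ["c1ccccc1O", "CC(=O)Nc1ccc(O)cc1", "CCCCCCCCCCCCCCCC(=O)O"]
fragments = [s for s in library if passes_rule_of_three(s)]
print("fragments retained:", fragments)
```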
Laboratory directed research and development annual report 2004.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This report summarizes progress from the Laboratory Directed Research and Development (LDRD) program during fiscal year 2004. In addition to a programmatic and financial overview, the report includes progress reports from 352 individual R and D projects in 15 categories. The 15 categories are: (1) Advanced Concepts; (2) Advanced Manufacturing; (3) Biotechnology; (4) Chemical and Earth Sciences; (5) Computational and Information Sciences; (6) Differentiating Technologies; (7) Electronics and Photonics; (8) Emerging Threats; (9) Energy and Critical Infrastructures; (10) Engineering Sciences; (11) Grand Challenges; (12) Materials Science and Technology; (13) Nonproliferation and Materials Control; (14) Pulsed Power and High Energy Density Sciences; and (15) Corporate Objectives.
NASA Astrophysics Data System (ADS)
Marhadi, Kun Saptohartyadi
Structural optimization for damage tolerance under various unforeseen damage scenarios is computationally challenging. It couples non-linear progressive failure analysis with sampling-based stochastic analysis of random damage. The goal of this research was to understand the relationship between alternate load paths available in a structure and its damage tolerance, and to use this information to develop computationally efficient methods for designing damage tolerant structures. Progressive failure of a redundant truss structure subjected to small random variability was investigated to identify features that correlate with robustness and predictability of the structure's progressive failure. The identified features were used to develop numerical surrogate measures that permit computationally efficient deterministic optimization to achieve robustness and predictability of progressive failure. Analysis of damage tolerance on designs with robust progressive failure indicated that robustness and predictability of progressive failure do not guarantee damage tolerance. Damage tolerance requires a structure to redistribute its load to alternate load paths. In order to investigate the load distribution characteristics that lead to damage tolerance in structures, designs with varying degrees of damage tolerance were generated using brute force stochastic optimization. A method based on principal component analysis was used to describe load distributions (alternate load paths) in the structures. Results indicate that a structure that can develop alternate paths is not necessarily damage tolerant. The alternate load paths must have a required minimum load capability. Robustness analysis of damage tolerant optimum designs indicates that designs are tailored to specified damage. A design optimized under one damage specification can be sensitive to other damages not considered. The effectiveness of existing load path definitions and characterizations was investigated for continuum structures. A load path definition using a relative compliance change measure (U* field) was demonstrated to be the most useful measure of load path. This measure provides quantitative information on load path trajectories and qualitative information on the effectiveness of the load path. The use of the U* description of load paths in optimizing structures for effective load paths was investigated.
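A minimal sketch of the principal component analysis step described above, applied to synthetic member-force vectors rather than the dissertation's actual truss responses; the data generation and dimensions are illustrative assumptions.

```python
import numpy as np

# Each row: internal member forces of a redundant truss under one random
# damage case (synthetic stand-in data).
rng = np.random.default_rng(1)
member_forces = rng.standard_normal((200, 12)) @ rng.standard_normal((12, 12))

X = member_forces - member_forces.mean(axis=0)       # center
U, S, Vt = np.linalg.svd(X, full_matrices=False)      # PCA via SVD
explained = S**2 / np.sum(S**2)

print("variance explained by first 3 load-distribution modes:",
      np.round(explained[:3], 3))
# Vt[0] is the dominant pattern of force redistribution (an alternate load path).
```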
Progress on the diagnosis and evaluation of brain tumors
Gao, Huile
2013-01-01
Brain tumors are one of the most challenging disorders encountered, and early and accurate diagnosis is essential for the management and treatment of these tumors. In this article, diagnostic modalities including single-photon emission computed tomography, positron emission tomography, magnetic resonance imaging, and optical imaging are reviewed. We mainly focus on the newly emerging, specific imaging probes, and their potential use in animal models and clinical settings. PMID:24334439
epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.
Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa
2016-12-01
Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact.
Computational fluid dynamics: Transition to design applications
NASA Technical Reports Server (NTRS)
Bradley, R. G.; Bhateley, I. C.; Howell, G. A.
1987-01-01
The development of aerospace vehicles, over the years, was an evolutionary process in which engineering progress in the aerospace community was based, generally, on prior experience and data bases obtained through wind tunnel and flight testing. Advances in the fundamental understanding of flow physics, wind tunnel and flight test capability, and mathematical insights into the governing flow equations were translated into improved air vehicle design. The modern day field of Computational Fluid Dynamics (CFD) is a continuation of the growth in analytical capability and the digital mathematics needed to solve the more rigorous form of the flow equations. Some of the technical and managerial challenges that result from rapidly developing CFD capabilities, some of the steps being taken by the Fort Worth Division of General Dynamics to meet these challenges, and some of the specific areas of application for high performance air vehicles are presented.
Editorial: Cognitive Architectures, Model Comparison and AGI
NASA Astrophysics Data System (ADS)
Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter
2010-12-01
Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.
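For readers unfamiliar with the benchmark, the toy below simulates the core stock-update dynamics of a Dynamic Stocks and Flows-style control task with a simple proportional control policy. The environmental flow distributions and the controller are illustrative assumptions, not any of the cognitive models compared in the editorial.

```python
import random

# Keep the stock at a target level while the environment adds and removes
# material each period; the "user" responds with a proportional correction.
random.seed(0)
target, stock = 4.0, 2.0
for t in range(10):
    env_inflow = random.uniform(0, 2)
    env_outflow = random.uniform(0, 2)
    user_flow = 0.5 * (target - stock)             # user inflow (+) or outflow (-)
    stock += env_inflow - env_outflow + user_flow  # discrete-time stock update
    print(f"t={t}  stock={stock:.2f}")
```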
Nanotechnology: Opportunities and Challenges
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya
2003-01-01
Nanotechnology seeks to exploit novel physical, chemical, biological, mechanical, electrical, and other properties, which arise primarily due to the nanoscale nature of certain materials. A key example is carbon nanotubes (CNTs) which exhibit unique electrical and extraordinary mechanical properties and offer remarkable potential for revolutionary applications in electronic devices, computing, and data storage technology, sensors, composites, nanoelectromechanical systems (NEMS), and as tips in scanning probe microscopy (SPM) for imaging and nanolithography. Thus the CNT synthesis, characterization, and applications touch upon all disciplines of science and engineering. This presentation will provide an overview and progress report on this and other major research candidates in Nanotechnology and address opportunities and challenges ahead.
(Extreme) Core-collapse Supernova Simulations
NASA Astrophysics Data System (ADS)
Mösta, Philipp
2017-01-01
In this talk I will present recent progress on modeling core-collapse supernovae with massively parallel simulations on the largest supercomputers available. I will discuss the unique challenges in both input physics and computational modeling that come with a problem involving all four fundamental forces and relativistic effects and will highlight recent breakthroughs overcoming these challenges in full 3D simulations. I will pay particular attention to how these simulations can be used to reveal the engines driving some of the most extreme explosions and conclude by discussing what remains to be done in simulation work to maximize what we can learn from current and future time-domain astronomy transient surveys.
Understanding Shale Gas: Recent Progress and Remaining Challenges
Striolo, Alberto; Cole, David R.
2017-08-27
Because of a number of technological advancements, unconventional hydrocarbons, and in particular shale gas, have transformed the US economy. Much is being learned, as demonstrated by the reduced cost of extracting shale gas in the US over the past five years. However, a number of challenges still need to be addressed. Many of these challenges represent grand scientific and technological tasks, overcoming which will have a number of positive impacts, ranging from the reduction of the environmental footprint of shale gas production to improvements and leaps forward in diverse sectors, including chemical manufacturing and catalytic transformations. This review addresses recent advancements in computational and experimental approaches, which led to improved understanding of, in particular, structure and transport of fluids, including hydrocarbons, electrolytes, water, and CO2 in heterogeneous subsurface rocks such as those typically found in shale formations. Finally, the narrative is concluded with a suggestion of a few research directions that, by synergistically combining computational and experimental advances, could allow us to overcome some of the hurdles that currently hinder the production of hydrocarbons from shale formations.
Computational modeling of brain tumors: discrete, continuum or hybrid?
NASA Astrophysics Data System (ADS)
Wang, Zhihui; Deisboeck, Thomas S.
2008-04-01
In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas), continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling are discussed.
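As a minimal example of the continuum class of models discussed here, the sketch below integrates a 1D Fisher-KPP reaction-diffusion equation for tumor cell density, a formulation widely used in the glioma modeling literature; the parameter values and grid settings are illustrative only and not taken from this review.

```python
import numpy as np

# 1D Fisher-KPP model for glioma cell density c(x, t):
#   dc/dt = D * d2c/dx2 + rho * c * (1 - c)
D, rho = 0.05, 0.2              # diffusion (cm^2/day), proliferation rate (1/day)
L, nx, dt, steps = 10.0, 200, 0.02, 2500
dx = L / nx                     # dt < dx^2 / (2 D) keeps explicit Euler stable
c = np.zeros(nx)
c[nx // 2 - 2: nx // 2 + 2] = 0.5   # small initial tumor in the middle

for _ in range(steps):
    lap = (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2  # periodic Laplacian
    c = c + dt * (D * lap + rho * c * (1 - c))              # explicit Euler step

print("fraction of domain invaded (c > 0.1):", float(np.mean(c > 0.1)))
```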
Liu, Jun; Zhang, Liqun; Cao, Dapeng; Wang, Wenchuan
2009-12-28
Polymer nanocomposites (PNCs) often exhibit excellent mechanical, thermal, electrical and optical properties, because they combine the performances of both polymers and inorganic or organic nanoparticles. Recently, computer modeling and simulation are playing an important role in exploring the reinforcement mechanism of the PNCs and even the design of functional PNCs. This report provides an overview of the progress made in the past decades in the investigation of the static, rheological and mechanical properties of polymer nanocomposites studied by computer modeling and simulation. Emphases are placed on exploring the mechanisms at the molecular level for the dispersion of nanoparticles in nanocomposites, the effects of nanoparticles on chain conformation and glass transition temperature (Tg), as well as viscoelastic and mechanical properties. Finally, some future challenges and opportunities in computer modeling and simulation of PNCs are addressed.
NASA Astrophysics Data System (ADS)
Jain, A. K.; Dorai, C.
Computer vision has emerged as a challenging and important area of research, both as an engineering and a scientific discipline. The growing importance of computer vision is evident from the fact that it was identified as one of the "Grand Challenges" and also from its prominent role in the National Information Infrastructure. While the design of a general-purpose vision system continues to be elusive, machine vision systems are being used successfully in specific application domains. Building a practical vision system requires a careful selection of appropriate sensors, extraction and integration of information from available cues in the sensed data, and evaluation of system robustness and performance. The authors discuss and demonstrate advantages of (1) multi-sensor fusion, (2) combination of features and classifiers, (3) integration of visual modules, and (4) admissibility and goal-directed evaluation of vision algorithms. The requirements of several prominent real world applications such as biometry, document image analysis, image and video database retrieval, and automatic object model construction offer exciting problems and new opportunities to design and evaluate vision algorithms.
Light and redox switchable molecular components for molecular electronics.
Browne, Wesley R; Feringa, Ben L
2010-01-01
The field of molecular and organic electronics has seen rapid progress in recent years, developing from concept and design to actual demonstration devices in which both single molecules and self-assembled monolayers are employed as light-responsive components. Research in this field has seen numerous unexpected challenges that have slowed progress and the initial promise of complex molecular-based computers has not yet been realised. Primarily this has been due to the realisation at an early stage that molecular-based nano-electronics brings with it the interface between the hard (semiconductor) and soft (molecular) worlds and the challenges which accompany working in such an environment. Issues such as addressability, cross-talk, molecular stability and perturbation of molecular properties (e.g., inhibition of photochemistry) have nevertheless driven development in molecular design and synthesis as well as our ability to interface molecular components with bulk metal contacts to a very high level of sophistication. Numerous groups have played key roles in progressing this field not least teams such as those led by Whitesides, Aviram, Ratner, Stoddart and Heath. In this short review we will however focus on the contributions from our own group and those of our collaborators, in employing diarylethene based molecular components.
Progress on Intelligent Guidance and Control for Wind Shear Encounter
NASA Technical Reports Server (NTRS)
Stratton, D. Alexander
1990-01-01
Low altitude wind shear poses a serious threat to air safety. Avoiding severe wind shear challenges the ability of flight crews, as it involves assessing risk from uncertain evidence. A computerized intelligent cockpit aid can increase flight crew awareness of wind shear, improving avoidance decisions. The primary functions of a cockpit advisory expert system for wind shear avoidance are discussed. Also introduced are computational techniques being implemented to enable these primary functions.
Density functional theory across chemistry, physics and biology.
van Mourik, Tanja; Bühl, Michael; Gaigeot, Marie-Pierre
2014-03-13
The past decades have seen density functional theory (DFT) evolve from a rising star in computational quantum chemistry to one of its major players. This Theme Issue, which comes half a century after the publication of the Hohenberg-Kohn theorems that laid the foundations of modern DFT, reviews progress and challenges in present-day DFT research. Rather than trying to be comprehensive, this Theme Issue attempts to give a flavour of selected aspects of DFT.
Big Computing in Astronomy: Perspectives and Challenges
NASA Astrophysics Data System (ADS)
Pankratius, Victor
2014-06-01
Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy for instance, the current generation of antenna arrays produces data at Tbits per second, and forthcoming instruments will expand these rates much further. As instruments are increasingly becoming software-based, astronomers will get more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome these challenges. Major problems are emerging due to increases in data rates that are much larger than in storage and transmission capacity, as well as humans being cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing.
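To put the quoted antenna-array data rates in perspective, a sustained terabit per second corresponds roughly to the following daily volume (a simple unit conversion, not a figure from the talk):

```latex
\[
1~\mathrm{Tbit/s} = 0.125~\mathrm{TB/s}
\;\Rightarrow\;
0.125~\mathrm{TB/s} \times 86\,400~\mathrm{s/day} \approx 1.1\times10^{4}~\mathrm{TB/day} \approx 11~\mathrm{PB/day}.
\]
```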
Gradient Models in Molecular Biophysics: Progress, Challenges, Opportunities
Bardhan, Jaydeep P.
2014-01-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g. molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding features such as nonlocal dielectric response, and nonlinearities resulting from dielectric saturation. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost forty years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The paper concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics. PMID:25505358
Gradient models in molecular biophysics: progress, challenges, opportunities
NASA Astrophysics Data System (ADS)
Bardhan, Jaydeep P.
2013-12-01
In the interest of developing a bridge between researchers modeling materials and those modeling biological molecules, we survey recent progress in developing nonlocal-dielectric continuum models for studying the behavior of proteins and nucleic acids. As in other areas of science, continuum models are essential tools when atomistic simulations (e.g., molecular dynamics) are too expensive. Because biological molecules are essentially all nanoscale systems, the standard continuum model, involving local dielectric response, has basically always been dubious at best. The advanced continuum theories discussed here aim to remedy these shortcomings by adding nonlocal dielectric response. We begin by describing the central role of electrostatic interactions in biology at the molecular scale, and motivate the development of computationally tractable continuum models using applications in science and engineering. For context, we highlight some of the most important challenges that remain, and survey the diverse theoretical formalisms for their treatment, highlighting the rigorous statistical mechanics that support the use and improvement of continuum models. We then address the development and implementation of nonlocal dielectric models, an approach pioneered by Dogonadze, Kornyshev, and their collaborators almost 40 years ago. The simplest of these models is just a scalar form of gradient elasticity, and here we use ideas from gradient-based modeling to extend the electrostatic model to include additional length scales. The review concludes with a discussion of open questions for model development, highlighting the many opportunities for the materials community to leverage its physical, mathematical, and computational expertise to help solve one of the most challenging questions in molecular biology and biophysics.
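For readers coming from the materials side, the contrast between the standard local model and the nonlocal (Lorentzian, Dogonadze-Kornyshev-type) dielectric response discussed in these abstracts can be summarized as follows; the notation is a conventional choice, not taken verbatim from the paper.

```latex
\[
\nabla\cdot\bigl[\epsilon(\mathbf{r})\,\nabla\phi(\mathbf{r})\bigr] = -\rho(\mathbf{r})
\quad\text{(local model)},
\qquad
\mathbf{D}(\mathbf{r}) = \int \epsilon(\mathbf{r},\mathbf{r}')\,\mathbf{E}(\mathbf{r}')\,d\mathbf{r}'
\quad\text{(nonlocal response)},
\]
\[
\hat{\epsilon}(k) = \epsilon_\infty + \frac{\epsilon_s - \epsilon_\infty}{1 + \lambda^{2}k^{2}}
\quad\text{(Lorentzian kernel with correlation length } \lambda\text{)}.
\]
```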
NASA Technical Reports Server (NTRS)
Fisher, Kevin; Chang, Chein-I
2009-01-01
Progressive band selection (PBS) reduces spectral redundancy without significant loss of information, thereby reducing hyperspectral image data volume and processing time. Used onboard a spacecraft, it can also reduce image downlink time. PBS prioritizes an image's spectral bands according to priority scores that measure their significance to a specific application. Then it uses one of three methods to select an appropriate number of the most useful bands. Key challenges for PBS include selecting an appropriate criterion to generate band priority scores, and determining how many bands should be retained in the reduced image. The image's Virtual Dimensionality (VD), once computed, is a reasonable estimate of the latter. We describe the major design details of PBS and test PBS in a land classification experiment.
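A minimal sketch of the prioritize-and-truncate step described above, using band variance as a stand-in priority criterion and a fixed k in place of a computed VD estimate (both assumptions; the paper leaves the priority score application-dependent).

```python
import numpy as np

rng = np.random.default_rng(2)
cube = rng.random((64, 64, 100))                        # rows x cols x bands (synthetic)

scores = cube.reshape(-1, cube.shape[-1]).var(axis=0)   # one priority score per band
priority = np.argsort(scores)[::-1]                     # bands, most useful first
k = 12                                                  # e.g., an estimated VD
kept = np.sort(priority[:k])                            # preserve spectral order
reduced = cube[..., kept]

print("kept bands:", kept)
print("reduced cube shape:", reduced.shape)
```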
General overview on structure prediction of twilight-zone proteins.
Khor, Bee Yin; Tye, Gee Jun; Lim, Theam Soon; Choong, Yee Siew
2015-09-04
Protein structure prediction from amino acid sequence has been one of the most challenging aspects in computational structural biology despite significant progress in recent years shown by critical assessment of protein structure prediction (CASP) experiments. When experimentally determined structures are unavailable, the predicted structures may serve as starting points to study a protein. If the target protein contains a homologous region, a high-resolution (typically <1.5 Å) model can be built via comparative modelling. However, when confronted with low sequence similarity of the target protein (also known as a twilight-zone protein, where sequence identity with available templates is less than 30%), the protein structure prediction has to be initiated from scratch. Traditionally, twilight-zone proteins can be predicted via threading or ab initio methods. Based on the current trend, combination of different methods brings an improved success in the prediction of twilight-zone proteins. In this mini review, the methods, progresses and challenges for the prediction of twilight-zone proteins were discussed.
New Insights into Perfluorinated Sulfonic-Acid Ionomers
Kusoglu, Ahmet; Weber, Adam Z.
2017-01-23
In this comprehensive review, recent progress and developments on perfluorinated sulfonic-acid (PFSA) membranes have been summarized on many key topics. Although quite well investigated for decades, PFSA ionomers’ complex behavior, along with their key role in many emerging technologies, have presented significant scientific challenges but also helped create a unique cross-disciplinary research field to overcome such challenges. Research and progress on PFSAs, especially when considered with their applications, are at the forefront of bridging electrochemistry and polymer (physics), which have also opened up development of state-of-the-art in situ characterization techniques as well as multiphysics computation models. Topics reviewed stem from correlating the various physical (e.g., mechanical) and transport properties with morphology and structure across time and length scales. In addition, topics of recent interest such as structure/transport correlations and modeling, composite PFSA membranes, degradation phenomena, and PFSA thin films are presented. Throughout, the impact of PFSA chemistry and side-chain is also discussed to present a broader perspective.
The Mock LISA Data Challenges: History, Status, Prospects
NASA Technical Reports Server (NTRS)
Vallisneri, Michele; Babak, Stas; Baker, John; Benacquista, Matt; Cornish, Neil; Crowder, Jeff; Cutler, Curt; Larson, Shane; Littenberg, Tyson; Porter, Edward;
2007-01-01
This slide presentation reviews the importance of the Mock LISA Data Challenges (MLDC). The Laser Interferometer Space Antenna (LISA) is a gravitational wave (GW) observatory that will return data such that data analysis is integral to the measurement concept. Further rationales for the MLDC are to kickstart the development of a LISA data-analysis computational infrastructure, and to encourage, track, and compare progress in LISA data-analysis development in the open community. The MLDCs are a coordinated, voluntary effort in the GW community that periodically issues datasets with synthetic noise and GW signals from sources of undisclosed parameters, with increasing difficulty. The challenge participants return parameter estimates and descriptions of search methods. Some of the challenges and the resultant entries are reviewed. The aim is to show that LISA data analysis is possible, and to develop new techniques, using multiple international teams for the development of LISA core analysis tools.
eHealth in cardiovascular medicine: A clinical update.
Saner, Hugo; van der Velde, Enno
2016-10-01
Demographic changes, progress in medical technology and regional problems in providing healthcare to low-density populations are posing great challenges to our healthcare systems. Rapid progress in computer sciences and information technologies has a great impact on the way healthcare will be delivered in the near future. This article describes opportunities and challenges of eHealth and telemedicine in the framework of our health systems and, in particular, in the context of today's cardiology services. The most promising applications of eHealth and telemedicine include: (a) prevention and lifestyle interventions; (b) chronic disease management including hypertension, diabetes and heart failure; (c) arrhythmia detection including early detection of atrial fibrillation and telemonitoring of devices such as pacemakers, internal cardioverter defibrillators and implantable rhythm monitoring devices; (d) telerehabilitation. Major obstacles to the integration of eHealth and telemedicine into daily clinical practice include limited large-scale evidence, in particular, for cost-effectiveness, as well as lack of interoperability, inadequate or fragmented legal frameworks and lack of reimbursement. An important challenge for those involved in these new technologies will be to keep the main focus on patients' individual needs and to carefully evaluate the evidence behind the practice.
Sociotechnical Challenges and Progress in Using Social Media for Health
Cavusoglu, Hasan; Frisch, Larry; Fels, Sidney
2013-01-01
Social media tools that connect patients, caregivers, and health providers offer great potential for helping people access health advice, receive and give social support, manage or cope with chronic conditions, and make day-to-day health decisions. These systems have seen widespread adoption, but often fail to support the goals as fully as designers and users would like. Through Ackerman’s lens of the “sociotechnical gap” and computer supported cooperative work (CSCW) as a science of the artificial, we review contemporary sociotechnical challenges and progress for using social media to support health. These challenges include a tension between privacy and sharing, policy information credibility, accessibility, and tailoring in social spaces. Those studying, building, deploying, and using social media systems to further health goals will benefit from approaching this work by borrowing from Ackerman’s framing of CSCW. In particular, this requires acknowledgment that technical systems will not fully meet our social goals, and then adopting design and educational approaches that are appropriate to fill this gap, building less-nuanced systems as partial solutions and tools for advancing our understanding, and by working with the CSCW research community to develop and pursue key lines of inquiry. PMID:24148206
Internet-based computer technology on radiotherapy.
Chow, James C L
2017-01-01
Recent rapid development of Internet-based computer technologies has made possible many novel applications in radiation dose delivery. However, the translational speed of applying these new technologies in radiotherapy can hardly keep up, due to the complex commissioning process and quality assurance protocols. Implementing novel Internet-based technology in radiotherapy requires corresponding design of the algorithm and infrastructure of the application, set-up of related clinical policies, purchase and development of software and hardware, computer programming and debugging, and national to international collaboration. Although such implementation processes are time consuming, some recent computer advancements in radiation dose delivery are still noticeable. In this review, we will present the background and concept of some recent Internet-based computer technologies such as cloud computing, big data processing and machine learning, followed by their potential applications in radiotherapy, such as treatment planning and dose delivery. We will also discuss the current progress of these applications and their impacts on radiotherapy. We will explore and evaluate the expected benefits and challenges in implementation as well.
Optimized Materials From First Principles Simulations: Are We There Yet?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galli, G; Gygi, F
2005-07-26
In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.
A Computing Infrastructure for Supporting Climate Studies
NASA Astrophysics Data System (ADS)
Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team
2011-12-01
Climate change is one of the major challenges facing the Earth in the 21st century. Scientists build many models to simulate the past and predict climate change over the next decades or century. Most of the models are at low resolution, with some targeting high resolution in linkage to practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA Climate@Home project effort to build a supercomputer based on advanced computing technologies, such as cloud computing, grid computing, and others. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching and collecting model run results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations, and to collect simulation results and contribution statistics; 3) a portal serves as the entry point to the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow the project on Twitter and Facebook for the latest updates. This paper will introduce the latest progress of the project and demonstrate the operational system during the AGU fall meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences, and share how the challenges in computation and software integration were solved.
Tadmor, Brigitta; Tidor, Bruce
2005-09-01
Progress in the life sciences, including genome sequencing and high-throughput experimentation, offers an opportunity for understanding biology and medicine from a systems perspective. This 'new view', which complements the more traditional component-based approach, involves the integration of biological research with approaches from engineering disciplines and computer science. The result is more than a new set of technologies. Rather, it promises a fundamental reconceptualization of the life sciences based on the development of quantitative and predictive models to describe crucial processes. To achieve this change, learning communities are being formed at the interface of the life sciences, engineering and computer science. Through these communities, research and education will be integrated across disciplines and the challenges associated with multidisciplinary team-based science will be addressed.
On the convergence of nanotechnology and Big Data analysis for computer-aided diagnosis.
Rodrigues, Jose F; Paulovich, Fernando V; de Oliveira, Maria Cf; de Oliveira, Osvaldo N
2016-04-01
An overview is provided of the challenges involved in building computer-aided diagnosis systems capable of precise medical diagnostics based on integration and interpretation of data from different sources and formats. The availability of massive amounts of data and computational methods associated with the Big Data paradigm has brought hope that such systems may soon be available in routine clinical practices, which is not the case today. We focus on visual and machine learning analysis of medical data acquired with varied nanotech-based techniques and on methods for Big Data infrastructure. Because diagnosis is essentially a classification task, we address machine learning techniques for supervised and unsupervised classification, making a critical assessment of the progress already made in the medical field and the prospects for the near future. We also advocate that successful computer-aided diagnosis requires a merge of methods and concepts from nanotechnology and Big Data analysis.
NASA Technical Reports Server (NTRS)
Varsi, Giulio
1989-01-01
The problem of the remote control of space operations is addressed by identifying the key technical challenge: the management of contact forces and the principal performance parameters. Three principal classes of devices for remote operation are identified: anthropomorphic exoskeletons, computer aided teleoperators, and supervised telerobots. Their fields of application are described, and areas in which progress has reached the level of system or subsystem laboratory demonstrations are indicated. Key test results, indicating performance at a level useful for design tradeoffs, are reported.
NASA Technical Reports Server (NTRS)
Andrews, Alison E.
1987-01-01
An approach to analyzing CFD knowledge-based systems is proposed which is based, in part, on the concept of knowledge-level analysis. Consideration is given to the expert cooling fan design system, the PAN AIR knowledge system, grid adaptation, and expert zonal grid generation. These AI/CFD systems demonstrate that current AI technology can be successfully applied to well-formulated problems that are solved by means of classification or selection of preenumerated solutions.
Robotic insects: Manufacturing, actuation, and power considerations
NASA Astrophysics Data System (ADS)
Wood, Robert
2015-12-01
As the characteristic size of a flying robot decreases, the challenges for successful flight revert to basic questions of fabrication, actuation, fluid mechanics, stabilization, and power - whereas such questions have in general been answered for larger aircraft. When developing a robot on the scale of a housefly, all hardware must be developed from scratch as there is nothing "off-the-shelf" which can be used for mechanisms, sensors, or computation that would satisfy the extreme mass and power limitations. With these challenges in mind, this talk will present progress in the essential technologies for insect-like robots with an emphasis on multi-scale manufacturing methods, high power density actuation, and energy-efficient power distribution.
Computational characterization of ordered nanostructured surfaces
NASA Astrophysics Data System (ADS)
Mohieddin Abukhdeir, Nasser
2016-08-01
A vital and challenging task for materials researchers is to determine relationships between material characteristics and desired properties. While the measurement and assessment of material properties can be complex, quantitatively characterizing their structure is frequently a more challenging task. This issue is magnified for materials researchers in the areas of nanoscience and nanotechnology, where material structure is further complicated by phenomena such as self-assembly, collective behavior, and measurement uncertainty. Recent progress has been made in this area for both self-assembled and nanostructured surfaces due to increasing accessibility of imaging techniques at the nanoscale. In this context, recent advances in nanomaterial surface structure characterization are reviewed including the development of new theory and image processing methods.
Mono- and binuclear non-heme iron chemistry from a theoretical perspective.
Rokob, Tibor András; Chalupský, Jakub; Bím, Daniel; Andrikopoulos, Prokopis C; Srnec, Martin; Rulíšek, Lubomír
2016-09-01
In this minireview, we provide an account of the current state-of-the-art developments in the area of mono- and binuclear non-heme enzymes (NHFe and NHFe2) and the smaller NHFe(2) synthetic models, mostly from a theoretical and computational perspective. The sheer complexity, and at the same time the beauty, of the NHFe(2) world represents a challenge for experimental as well as theoretical methods. We emphasize that concerted progress on both the theoretical and experimental sides is a conditio sine qua non for future understanding, exploration and utilization of the NHFe(2) systems. After briefly discussing the current challenges and advances in the computational methodology, we review the recent spectroscopic and computational studies of NHFe(2) enzymatic and inorganic systems and highlight the correlations between various experimental data (spectroscopic, kinetic, thermodynamic, electrochemical) and computations. Throughout, we attempt to keep in mind the most fascinating and attractive phenomenon in the NHFe(2) chemistry, which is the fact that despite the strong oxidative power of many reactive intermediates, the NHFe(2) enzymes perform catalysis with high selectivity. We conclude with our personal viewpoint that further developments in quantum chemistry, and especially in the field of multireference wave function methods, are needed to provide a solid theoretical basis for NHFe(2) studies, mostly by enabling benchmarking and calibration of the computationally efficient and easy-to-use DFT methods.
Abstractions for DNA circuit design.
Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew
2012-03-07
DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.
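To make the chemical-reaction-network target of such strand displacement designs concrete, the sketch below integrates the deterministic mass-action kinetics of a cyclic "rock-paper-scissors" network, a standard example of a sustained chemical oscillator. This is not the buffered architecture or language described above; the rate constants and initial concentrations are illustrative assumptions.

```python
# A minimal sketch (not the paper's buffered architecture): deterministic
# mass-action kinetics for the cyclic reaction network
#   A + B -> 2B,   B + C -> 2C,   C + A -> 2A,
# a textbook example of a chemical reaction network with sustained
# oscillations that strand-displacement cascades can, in principle, emulate.
# Rate constants and initial concentrations are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

k1 = k2 = k3 = 1.0  # assumed rate constants (arbitrary units)

def rhs(t, y):
    a, b, c = y
    return [-k1 * a * b + k3 * c * a,   # A consumed by A+B, produced by C+A
             k1 * a * b - k2 * b * c,   # B produced by A+B, consumed by B+C
             k2 * b * c - k3 * c * a]   # C produced by B+C, consumed by C+A

sol = solve_ivp(rhs, (0.0, 60.0), [1.2, 1.0, 0.8], max_step=0.05)
print("final concentrations:", sol.y[:, -1])  # total A+B+C is conserved by the ODE
```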
Davatzikos, Christos
2016-10-01
The past 20 years have seen a mushrooming growth of the field of computational neuroanatomy. Much of this work has been enabled by the development and refinement of powerful, high-dimensional image warping methods, which have enabled detailed brain parcellation, voxel-based morphometric analyses, and multivariate pattern analyses using machine learning approaches. The evolution of these 3 types of analyses over the years has overcome many challenges. We present the evolution of our work in these 3 directions, which largely follows the evolution of this field. We discuss the progression from single-atlas, single-registration brain parcellation work to current ensemble-based parcellation; from relatively basic mass-univariate t-tests to optimized regional pattern analyses combining deformations and residuals; and from basic application of support vector machines to generative-discriminative formulations of multivariate pattern analyses, and to methods dealing with heterogeneity of neuroanatomical patterns. We conclude with discussion of some of the future directions and challenges. Copyright © 2016. Published by Elsevier B.V.
Faster PET reconstruction with a stochastic primal-dual hybrid gradient method
NASA Astrophysics Data System (ADS)
Ehrhardt, Matthias J.; Markiewicz, Pawel; Chambolle, Antonin; Richtárik, Peter; Schott, Jonathan; Schönlieb, Carola-Bibiane
2017-08-01
Image reconstruction in positron emission tomography (PET) is computationally challenging due to Poisson noise, constraints and potentially non-smooth priors, let alone the sheer size of the problem. An algorithm that can cope well with the first three of the aforementioned challenges is the primal-dual hybrid gradient algorithm (PDHG) studied by Chambolle and Pock in 2011. However, PDHG updates all variables in parallel and is therefore computationally demanding on the large problem sizes encountered with modern PET scanners, where the number of dual variables easily exceeds 100 million. In this work, we numerically study the use of SPDHG, a stochastic extension of PDHG that is still guaranteed to converge to a solution of the deterministic optimization problem with rates similar to PDHG. Numerical results on a clinical data set show that by introducing randomization into PDHG, results similar to those of the deterministic algorithm can be achieved using only around 10% of the operator evaluations, thus making significant progress towards the feasibility of sophisticated mathematical models in a clinical setting.
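As a rough illustration of the primal-dual machinery referenced above, the following is a minimal deterministic PDHG sketch on a toy L1-regularized least-squares problem; it is not the authors' PET reconstruction code, and the operator, data, and step sizes are assumptions. SPDHG would differ by updating only a randomly chosen subset of the dual variables at each iteration, with correspondingly adjusted step sizes.

```python
# A minimal sketch of the (deterministic) primal-dual hybrid gradient method
# on a toy problem  min_x 0.5*||Ax - b||^2 + lam*||x||_1 ; this is NOT the
# authors' PET code. SPDHG would update only a randomly selected subset of
# the dual variables y at each iteration.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))               # assumed toy forward operator
x_true = rng.standard_normal(50) * (rng.random(50) < 0.2)
b = A @ x_true + 0.01 * rng.standard_normal(200)
lam = 0.1

L = np.linalg.norm(A, 2)                         # operator norm
tau = sigma = 0.95 / L                           # step sizes with tau*sigma*L^2 < 1
x = np.zeros(50); x_bar = x.copy(); y = np.zeros(200)

for _ in range(2000):
    # dual step: prox of the conjugate of 0.5*||. - b||^2
    y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
    # primal step: soft-thresholding (prox of lam*||.||_1)
    x_new = x - tau * (A.T @ y)
    x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * lam, 0.0)
    x_bar = 2.0 * x_new - x                      # over-relaxation (theta = 1)
    x = x_new

print("objective:", 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x)))
```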
A Bridge for Accelerating Materials by Design
Sumpter, Bobby G.; Vasudevan, Rama K.; Potok, Thomas E.; ...
2015-11-25
Recent technical advances in the area of nanoscale imaging, spectroscopy, and scattering/diffraction have led to unprecedented capabilities for investigating materials' structural, dynamical and functional characteristics. In addition, recent advances in computational algorithms and computer capacities that are orders of magnitude larger/faster have enabled large-scale simulations of materials properties starting with nothing but the identity of the atomic species and the basic principles of quantum- and statistical-mechanics and thermodynamics. Along with these advances, an explosion of high-resolution data has emerged. This confluence of capabilities and rise of big data offer grand opportunities for advancing materials sciences but also introduce several challenges. In this editorial we identify challenges impeding progress towards advancing materials by design (e.g., the design/discovery of materials with improved properties/performance), propose possible solutions, and provide examples of scientific issues that can be addressed by using a tightly integrated approach where theory and experiments are linked through big-deep data.
NASA Astrophysics Data System (ADS)
Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.
2017-12-01
Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, which in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replication, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively, thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. Our system is now available for general use in the community through both Docker and Anaconda.
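The progressive, interruptible style of computation described above can be illustrated with a small sketch; this is not the authors' EDSL or runtime, and the coarse-to-fine subsampling below is an assumed stand-in for their multiresolution data access.

```python
# A minimal sketch of progressive, interruptible computation in the spirit of
# the runtime described above (not the authors' system): a generator yields
# successively refined estimates of a statistic computed on coarse-to-fine
# subsamples of a large array, so a client can render partial results
# immediately and stop whenever the answer is "good enough".
import numpy as np

def progressive_mean(data, levels=6):
    """Yield (stride, estimate) pairs from coarse to fine."""
    for level in range(levels, -1, -1):
        stride = 2 ** level                # coarse subsampling stride
        estimate = float(np.mean(data[::stride]))
        yield stride, estimate             # caller may break at any point

data = np.random.default_rng(1).standard_normal(1_000_000)
for stride, est in progressive_mean(data):
    print(f"stride {stride:3d}: mean estimate {est:+.5f}")
    if stride <= 4:                        # client-side stopping rule
        break
```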
Progress and challenges in bioinformatics approaches for enhancer identification
Kleftogiannis, Dimitrios; Kalnis, Panos
2016-01-01
Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration. PMID:26634919
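As a hedged illustration of the supervised route to enhancer prediction surveyed above, the sketch below trains a generic classifier on synthetic per-region features; the feature semantics (e.g., histone-mark signal) and the data are assumptions, not any specific published method.

```python
# A hedged sketch of supervised enhancer prediction: train a classifier on
# per-region epigenomic features. The feature meanings and the synthetic data
# are illustrative assumptions only; real pipelines derive features from
# ChIP-seq/ATAC-seq signal over candidate regions and use curated labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
labels = rng.integers(0, 2, n)                            # 1 = enhancer, 0 = background
signal = labels[:, None] * rng.gamma(4.0, 1.0, (n, 3))    # enhancers: elevated marks
noise = rng.gamma(1.5, 1.0, (n, 3))
X = signal + noise   # columns could stand for e.g. H3K4me1, H3K27ac, DNase signal

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("5-fold CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```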
Scheduling Earth Observing Fleets Using Evolutionary Algorithms: Problem Description and Approach
NASA Technical Reports Server (NTRS)
Globus, Al; Crawford, James; Lohn, Jason; Morris, Robert; Clancy, Daniel (Technical Monitor)
2002-01-01
We describe work in progress concerning multi-instrument, multi-satellite scheduling. Most, although not all, Earth observing instruments currently in orbit are unique. In the relatively near future, however, we expect to see fleets of Earth observing spacecraft, many carrying nearly identical instruments. This presents a substantially new scheduling challenge. Inspired by successful commercial applications of evolutionary algorithms in scheduling domains, this paper presents work in progress regarding the use of evolutionary algorithms to solve a set of Earth observing related model problems. Both the model problems and the software are described. Since the larger problems will require substantial computation and evolutionary algorithms are embarrassingly parallel, we discuss our parallelization techniques using dedicated and cycle-scavenged workstations.
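A toy sketch of the evolutionary-algorithm approach described above follows; the chromosome encoding, fitness function, and problem sizes are assumptions for illustration and do not reproduce the paper's model problems or its parallelization.

```python
# A toy evolutionary-algorithm sketch for the kind of scheduling problem
# described above (not the paper's model problems): each gene assigns an
# observation request to a (satellite, time-slot) pair, and fitness counts
# requests scheduled without double-booking a satellite's slot.
import random
random.seed(0)

N_REQ, N_SAT, N_SLOT = 60, 4, 10          # assumed problem size
POP, GENS, MUT = 80, 200, 0.05            # assumed GA parameters

def random_schedule():
    return [(random.randrange(N_SAT), random.randrange(N_SLOT)) for _ in range(N_REQ)]

def fitness(sched):
    used, score = set(), 0
    for assignment in sched:
        if assignment not in used:        # slot still free on that satellite
            used.add(assignment)
            score += 1
    return score

pop = [random_schedule() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[: POP // 2]
    children = []
    while len(children) < POP - len(survivors):
        a, b = random.sample(survivors, 2)
        cut = random.randrange(1, N_REQ)
        child = a[:cut] + b[cut:]                              # one-point crossover
        child = [(random.randrange(N_SAT), random.randrange(N_SLOT))
                 if random.random() < MUT else g for g in child]  # mutation
        children.append(child)
    pop = survivors + children

# capacity is N_SAT * N_SLOT = 40 slots, so at most 40 of the 60 requests fit
print("best scheduled requests:", fitness(max(pop, key=fitness)), "of", N_REQ)
```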
Challenges to the development of complex virtual reality surgical simulations.
Seymour, N E; Røtnes, J S
2006-11-01
Virtual reality simulation in surgical training has become more widely used and intensely investigated in an effort to develop safer, more efficient, measurable training processes. The development of virtual reality simulation of surgical procedures has begun, but well-described technical obstacles must be overcome to permit varied training in a clinically realistic computer-generated environment. These challenges include development of realistic surgical interfaces and physical objects within the computer-generated environment, modeling of realistic interactions between objects, rendering of the surgical field, and development of signal processing for complex events associated with surgery. Of these, the realistic modeling of tissue objects that are fully responsive to surgical manipulations is the most challenging. Threats to early success include relatively limited resources for development and procurement, as well as smaller potential for return on investment than in other simulation industries that face similar problems. Despite these difficulties, steady progress continues to be made in these areas. If executed properly, virtual reality offers inherent advantages over other training systems in creating a realistic surgical environment and facilitating measurement of surgeon performance. Once developed, complex new virtual reality training devices must be validated for their usefulness in formative training and assessment of skill to be established.
Zhou, Hengbo; Costello, James C
2017-10-01
Identifying key factors that regulate the transition from primary to metastatic cancer is a fundamental challenge. Walsh et al. took a systems biology approach integrating computational, in vitro, and in vivo experiments to identify TRIM25 (tripartite motif containing 25) as a key factor that regulates metastatic gene signatures both at the transcriptional and post-transcriptional level in breast cancer. Targeting TRIM25 therapeutically is attractive because it governs a broad set of coordinated transcriptional modules that dictate metastatic progression. Copyright © 2017 Elsevier Inc. All rights reserved.
The difficult legacy of Turing's wager.
Thwaites, Andrew; Soltan, Andrew; Wieser, Eric; Nimmo-Smith, Ian
2017-08-01
Describing the human brain in mathematical terms is an important ambition of neuroscience research, yet the challenges remain considerable. It was Alan Turing, writing in 1950, who first sought to demonstrate how time-consuming such an undertaking would be. Through analogy to the computer program, Turing argued that arriving at a complete mathematical description of the mind would take well over a thousand years. In this opinion piece, we argue that - despite seventy years of progress in the field - his arguments remain both prescient and persuasive.
Reinforcement Learning and Episodic Memory in Humans and Animals: An Integrative Framework.
Gershman, Samuel J; Daw, Nathaniel D
2017-01-03
We review the psychology and neuroscience of reinforcement learning (RL), which has experienced significant progress in the past two decades, enabled by the comprehensive experimental study of simple learning and decision-making tasks. However, one challenge in the study of RL is computational: The simplicity of these tasks ignores important aspects of reinforcement learning in the real world: (a) State spaces are high-dimensional, continuous, and partially observable; this implies that (b) data are relatively sparse and, indeed, precisely the same situation may never be encountered twice; furthermore, (c) rewards depend on the long-term consequences of actions in ways that violate the classical assumptions that make RL tractable. A seemingly distinct challenge is that, cognitively, theories of RL have largely involved procedural and semantic memory, the way in which knowledge about action values or world models extracted gradually from many experiences can drive choice. This focus on semantic memory leaves out many aspects of memory, such as episodic memory, related to the traces of individual events. We suggest that these two challenges are related. The computational challenge can be dealt with, in part, by endowing RL systems with episodic memory, allowing them to (a) efficiently approximate value functions over complex state spaces, (b) learn with very little data, and (c) bridge long-term dependencies between actions and rewards. We review the computational theory underlying this proposal and the empirical evidence to support it. Our proposal suggests that the ubiquitous and diverse roles of memory in RL may function as part of an integrated learning system.
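A minimal sketch of the episodic-memory proposal above, assuming a simple Gaussian-kernel weighting over stored (state, return) pairs in the spirit of episodic control; the kernel, bandwidth, and toy task are assumptions, not the authors' specification.

```python
# A minimal sketch of the episodic-memory idea: estimate the value of a novel
# state by a kernel-weighted average of returns stored from individual past
# episodes, which generalizes from very little data over a continuous state
# space. The Gaussian kernel and its bandwidth are assumptions.
import numpy as np

class EpisodicValueMemory:
    def __init__(self, bandwidth=0.5):
        self.states, self.returns = [], []
        self.bandwidth = bandwidth

    def store(self, state, ret):
        """Record a visited state and the return that followed it."""
        self.states.append(np.asarray(state, dtype=float))
        self.returns.append(float(ret))

    def value(self, state):
        """Kernel-weighted average of stored returns."""
        if not self.states:
            return 0.0
        S = np.stack(self.states)
        d2 = np.sum((S - np.asarray(state, dtype=float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.bandwidth ** 2))
        return float(np.dot(w, self.returns) / (np.sum(w) + 1e-12))

mem = EpisodicValueMemory()
rng = np.random.default_rng(0)
for _ in range(50):                      # toy task: returns are higher near the origin
    s = rng.uniform(-1, 1, size=2)
    mem.store(s, ret=1.0 - np.linalg.norm(s))
print("estimated V([0, 0]):", round(mem.value([0.0, 0.0]), 3))
```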
Using CFD Surface Solutions to Shape Sonic Boom Signatures Propagated from Off-Body Pressure
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu
2013-01-01
The conceptual design of a low-boom and low-drag supersonic aircraft remains a challenge despite significant progress in recent years. Inverse design using reversed equivalent area and adjoint methods have been demonstrated to be effective in shaping the ground signature propagated from computational fluid dynamics (CFD) off-body pressure distributions. However, there is still a need to reduce the computational cost in the early stages of design to obtain a baseline that is feasible for low-boom shaping, and in the search for a robust low-boom design over the entire sonic boom footprint. The proposed design method addresses the need to reduce the computational cost for robust low-boom design by using surface pressure distributions from CFD solutions to shape sonic boom ground signatures propagated from CFD off-body pressure.
Lattice QCD Calculations in Nuclear Physics towards the Exascale
NASA Astrophysics Data System (ADS)
Joo, Balint
2017-01-01
The combination of algorithmic advances and new highly parallel computing architectures is enabling lattice QCD calculations to tackle ever more complex problems in nuclear physics. In this talk I will review some computational challenges that are encountered in large scale cold nuclear physics campaigns such as those in hadron spectroscopy calculations. I will discuss progress in addressing these with algorithmic improvements such as multi-grid solvers and with software for recent hardware architectures such as GPUs and the Intel Xeon Phi (Knights Landing). Finally, I will highlight some current topics for research and development as we head towards the Exascale era. This material is funded by the U.S. Department of Energy, Office of Science, Offices of Nuclear Physics, High Energy Physics and Advanced Scientific Computing Research, as well as the Office of Nuclear Physics under contract DE-AC05-06OR23177.
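To illustrate the multigrid idea mentioned above in its simplest setting, the sketch below applies a recursive V-cycle to the one-dimensional Poisson equation; it is a textbook toy, not a lattice QCD solver, and the smoother, transfer operators, and grid sizes are assumptions.

```python
# A minimal illustration of the multigrid idea, applied to the 1D Poisson
# problem -u'' = f with zero boundary values (a textbook sketch, not a
# lattice-QCD solver; stencil and parameters are assumptions).
import numpy as np

def v_cycle(u, f, h, n_smooth=3):
    n = len(u)
    for _ in range(n_smooth):                        # pre-smoothing: weighted Jacobi
        u[1:-1] += (2.0 / 3.0) * 0.5 * (h * h * f[1:-1] + u[:-2] + u[2:] - 2.0 * u[1:-1])
    if n <= 3:
        return u
    r = np.zeros_like(u)                             # residual of the 3-point stencil
    r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    rc = r[::2].copy()                               # restrict to the coarse grid
    ec = v_cycle(np.zeros_like(rc), rc, 2.0 * h)     # coarse-grid error correction
    e = np.zeros_like(u)                             # prolong by linear interpolation
    e[::2] = ec
    e[1:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    u += e
    for _ in range(n_smooth):                        # post-smoothing
        u[1:-1] += (2.0 / 3.0) * 0.5 * (h * h * f[1:-1] + u[:-2] + u[2:] - 2.0 * u[1:-1])
    return u

n = 129                                              # grid points including boundaries
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi ** 2 * np.sin(np.pi * x)                   # exact solution is sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
print("max error vs exact solution:", np.max(np.abs(u - np.sin(np.pi * x))))
```

The same two ingredients, cheap local smoothing plus coarse-grid correction, are what multigrid-preconditioned solvers exploit at vastly larger scale in lattice QCD.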
Orion Absolute Navigation System Progress and Challenge
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher
2012-01-01
The absolute navigation design of NASA's Orion vehicle is described. It has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary onboard measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudo-range and delta-range, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, and cold start are discussed, as are strategies for whole and partial state updates as well as covariance considerations. Strategies are given for dealing with latent measurements and high-rate propagation using a multi-rate architecture. The details of the rate groups and the data flow between the elements are discussed and evaluated.
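As a rough illustration of the U-D covariance factorization mentioned above, the sketch below factors a covariance as P = U D U^T and applies a rank-one update by naive refactorization; the efficient in-place Agee-Turner recursion used in flight-style filters avoids forming P explicitly, and nothing here is Orion flight software.

```python
# A naive sketch of the U-D covariance idea: factor a covariance as
# P = U D U^T (U unit upper-triangular, D diagonal) and apply a rank-one
# update c*v*v^T by simply refactoring. The flight-style Agee-Turner
# algorithm updates U and D in place without forming P; this sketch only
# shows what that update accomplishes.
import numpy as np

def udu_factor(P):
    """Return (U, d) with P = U @ diag(d) @ U.T and U unit upper-triangular."""
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j] - np.sum(d[j + 1:] * U[j, j + 1:] ** 2)
        for i in range(j):
            U[i, j] = (P[i, j] - np.sum(d[j + 1:] * U[i, j + 1:] * U[j, j + 1:])) / d[j]
    return U, d

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
P = A @ A.T + 5.0 * np.eye(5)                 # assumed SPD covariance
U, d = udu_factor(P)
assert np.allclose(U @ np.diag(d) @ U.T, P)

v = rng.standard_normal(5)
c = 0.3                                       # e.g. a process-noise strength
U2, d2 = udu_factor(P + c * np.outer(v, v))   # rank-one update via refactorization
assert np.allclose(U2 @ np.diag(d2) @ U2.T, P + c * np.outer(v, v))
print("updated D diagonal:", np.round(d2, 3))
```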
NASA Technical Reports Server (NTRS)
Drozda, Tomasz G.; Quinlan, Jesse R.; Pisciuneri, Patrick H.; Yilmaz, S. Levent
2012-01-01
Significant progress has been made in the development of subgrid scale (SGS) closures based on a filtered density function (FDF) for large eddy simulations (LES) of turbulent reacting flows. The FDF is the counterpart of the probability density function (PDF) method, which has proven effective in Reynolds averaged simulations (RAS). However, while systematic progress is being made advancing the FDF models for relatively simple flows and lab-scale flames, the application of these methods in complex geometries and high speed, wall-bounded flows with shocks remains a challenge. The key difficulties are the significant computational cost associated with solving the FDF transport equation and numerically stiff finite rate chemistry. For LES/FDF methods to make a more significant impact in practical applications, a pragmatic approach must be taken that significantly reduces the computational cost while maintaining high modeling fidelity. An example of one such ongoing effort is at the NASA Langley Research Center, where the first generation FDF models, namely the scalar filtered mass density function (SFMDF), are being implemented into VULCAN, a production-quality RAS and LES solver widely used for design of high speed propulsion flowpaths. This effort leverages internal and external collaborations to reduce the overall computational cost of high fidelity simulations in VULCAN by: implementing high order methods that allow reduction in the total number of computational cells without loss in accuracy; implementing the first generation of high fidelity scalar PDF/FDF models applicable to high-speed compressible flows; coupling RAS/PDF and LES/FDF into a hybrid framework to efficiently and accurately model the effects of combustion in the vicinity of the walls; developing efficient Lagrangian particle tracking algorithms to support robust solutions of the FDF equations for high speed flows; and utilizing finite rate chemistry parametrization, such as flamelet models, to reduce the number of transported reactive species and remove numerical stiffness. This paper briefly introduces the SFMDF model (highlighting key benefits and challenges), and discusses particle tracking for flows with shocks, the hybrid coupled RAS/PDF and LES/FDF model, the flamelet generated manifolds (FGM) model, and the Irregularly Portioned Lagrangian Monte Carlo Finite Difference (IPLMCFD) methodology for scalable simulation of high-speed reacting compressible flows.
Rocksalt nitride metal/semiconductor superlattices: A new class of artificially structured materials
NASA Astrophysics Data System (ADS)
Saha, Bivas; Shakouri, Ali; Sands, Timothy D.
2018-06-01
Artificially structured materials in the form of superlattice heterostructures enable the search for exotic new physics and novel device functionalities, and serve as tools to push the fundamentals of scientific and engineering knowledge. Semiconductor heterostructures are the most celebrated and widely studied artificially structured materials, having led to the development of quantum well lasers, quantum cascade lasers, measurements of the fractional quantum Hall effect, and numerous other scientific concepts and practical device technologies. However, combining metals with semiconductors at the atomic scale to develop metal/semiconductor superlattices and heterostructures has remained a profoundly difficult scientific and engineering challenge. Though the potential applications of metal/semiconductor heterostructures could range from energy conversion to photonic computing to high-temperature electronics, materials challenges primarily had severely limited progress in this pursuit until very recently. In this article, we detail the progress that has taken place over the last decade to overcome the materials engineering challenges to grow high quality epitaxial, nominally single crystalline metal/semiconductor superlattices based on transition metal nitrides (TMN). The epitaxial rocksalt TiN/(Al,Sc)N metamaterials are the first pseudomorphic metal/semiconductor superlattices to the best of our knowledge, and their physical properties promise a new era in superlattice physics and device engineering.
ChIP-seq: advantages and challenges of a maturing technology.
Park, Peter J
2009-10-01
Chromatin immunoprecipitation followed by sequencing (ChIP-seq) is a technique for genome-wide profiling of DNA-binding proteins, histone modifications or nucleosomes. Owing to the tremendous progress in next-generation sequencing technology, ChIP-seq offers higher resolution, less noise and greater coverage than its array-based predecessor ChIP-chip. With the decreasing cost of sequencing, ChIP-seq has become an indispensable tool for studying gene regulation and epigenetic mechanisms. In this Review, I describe the benefits and challenges in harnessing this technique with an emphasis on issues related to experimental design and data analysis. ChIP-seq experiments generate large quantities of data, and effective computational analysis will be crucial for uncovering biological mechanisms.
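As a simplified illustration of one core ChIP-seq analysis step, the sketch below calls enriched windows against a genome-wide Poisson background; it is not a real peak caller (it ignores local background estimation, duplicate filtering, mappability, and formal multiple-testing control), and the window size, genome length, and read counts are assumptions.

```python
# A simplified sketch of window-based peak detection against a Poisson
# background. This is not a real peak caller; window size, genome length and
# counts below are illustrative assumptions on simulated data.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
genome_len, window = 1_000_000, 200
n_windows = genome_len // window

counts = rng.poisson(3.0, n_windows)             # background coverage per window
true_peaks = rng.choice(n_windows, 40, replace=False)
counts[true_peaks] += rng.poisson(25.0, 40)      # enriched (bound) regions

lam = counts.mean()                              # genome-wide background rate
pvals = poisson.sf(counts - 1, lam)              # P(X >= count) under background
called = np.where(pvals < 1e-6)[0]               # simple fixed significance threshold

print(f"called {len(called)} windows; "
      f"{len(set(called) & set(true_peaks))} overlap the simulated peaks")
```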
NASA Technical Reports Server (NTRS)
Luthcke, Scott; Rowlands, David; Lemoine, Frank; Zelensky, Nikita; Beckley, Brian; Klosko, Steve; Chinn, Doug
2006-01-01
Although satellite altimetry has been around for thirty years, the last fifteen, beginning with the launch of TOPEX/Poseidon (TP), have yielded an abundance of significant results including: monitoring of ENSO events, detection of internal tides, determination of accurate global tides, unambiguous delineation of Rossby waves and their propagation characteristics, accurate determination of geostrophic currents, and a multi-decadal time series of mean sea level trend and dynamic ocean topography variability. While the high level of accuracy being achieved is a result of both instrument maturity and the quality of models and correction algorithms applied to the data, improving the quality of the Climate Data Records produced from altimetry is highly dependent on concurrent progress being made in fields such as orbit determination. The precision orbits form the reference frame from which the radar altimeter observations are made. Therefore, the accuracy of the altimetric mapping is limited to a great extent by the accuracy to which a satellite orbit can be computed. The TP mission represents the first time that the radial component of an altimeter orbit was routinely computed with an accuracy of 2 cm. Recently it has been demonstrated that it is possible to compute the radial component of Jason orbits with an accuracy of better than 1 cm. Additionally, still further improvements in TP orbits are being achieved with new techniques and algorithms largely developed from combined Jason and TP data analysis. While these recent POD achievements are impressive, the new accuracies are now revealing subtle systematic orbit errors that manifest as both intra- and inter-annual ocean topography errors. Additionally, the construction of inter-decadal time series of climate data records requires the removal of systematic differences across multiple missions. Current and future efforts must focus on the understanding and reduction of these errors in order to generate a complete and consistent time series of improved orbits across multiple missions and decades required for the most stringent climate-related research. This presentation discusses the POD progress and achievements made over nearly three decades, and presents the future challenges, goals and their impact on altimetry-derived ocean sciences.
Accomplishments and challenges of surgical simulation.
Satava, R M
2001-03-01
For nearly a decade, advanced computer technologies have created extraordinary educational tools using three-dimensional (3D) visualization and virtual reality. Pioneering efforts in surgical simulation with these tools have resulted in a first generation of simulators for surgical technical skills. Accomplishments include simulations with 3D models of anatomy for practice of surgical tasks, initial assessment of student performance in technical skills, and awareness by professional societies of potential in surgical education and certification. However, enormous challenges remain, which include improvement of technical fidelity, standardization of accurate metrics for performance evaluation, integration of simulators into a robust educational curriculum, stringent evaluation of simulators for effectiveness and value added to surgical training, determination of simulation application to certification of surgical technical skills, and a business model to implement and disseminate simulation successfully throughout the medical education community. This review looks at the historical progress of surgical simulators, their accomplishments, and the challenges that remain.
NASA Astrophysics Data System (ADS)
Walton, A. L.
2015-12-01
In 2016, the National Science Foundation (NSF) will support a portfolio of activities and investments focused upon challenges in data access, interoperability, and sustainability. These topics are fundamental to science questions of increasing complexity that require multidisciplinary approaches and expertise. Progress has become tractable because of (and sometimes complicated by) unprecedented growth in data (both simulations and observations) and rapid advances in technology (such as instrumentation in all aspects of the discovery process, together with ubiquitous cyberinfrastructure to connect, compute, visualize, store, and discover). The goal is an evolution of capabilities for the research community based on these investments, scientific priorities, technology advances, and policies. Examples from multiple NSF directorates, including investments by the Advanced Cyberinfrastructure Division, are aimed at these challenges and can provide the geosciences research community with models and opportunities for participation. Implications for the future are highlighted, along with the importance of continued community engagement on key issues.
Computational simulation of progressive fracture in fiber composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.
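A schematic illustration of progressive fracture, not the modular code described above (which couples composite mechanics, finite element analysis, and fracture criteria), is the equal-load-sharing fiber-bundle model sketched below; the strength distribution and loading parameters are assumptions.

```python
# A schematic illustration of progressive fracture (not the modular code
# described above): an equal-load-sharing fiber-bundle model in which the
# applied strain is increased, fibers whose randomly assigned strength is
# exceeded fail, and the surviving fibers carry the load until global
# failure. Strength statistics and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_fibers = 10_000
E = 1.0                                           # normalized fiber stiffness
strengths = rng.weibull(5.0, n_fibers)            # assumed strength distribution

alive = np.ones(n_fibers, dtype=bool)
strains = np.linspace(0.0, 1.5, 151)
for i, strain in enumerate(strains):
    alive &= (E * strain) < strengths             # progressive fiber failure
    bundle_stress = E * strain * alive.mean()     # load carried by survivors
    if alive.mean() < 0.01:                       # imminent global fracture
        print(f"global failure near strain {strain:.2f}")
        break
    if i % 25 == 0:
        print(f"strain {strain:.2f}: {alive.mean():6.1%} fibers intact, "
              f"bundle stress {bundle_stress:.3f}")
```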
OPENING REMARKS: SciDAC: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2005-01-01
Good morning. Welcome to SciDAC 2005 and San Francisco. SciDAC is all about computational science and scientific discovery. In a large sense, computational science characterizes SciDAC and its intent is change. It transforms both our approach and our understanding of science. It opens new doors and crosses traditional boundaries while seeking discovery. In terms of twentieth century methodologies, computational science may be said to be transformational. There are a number of examples to this point. First are the sciences that encompass climate modeling. The application of computational science has in essence created the field of climate modeling. This community is now international in scope and has provided precision results that are challenging our understanding of our environment. A second example is that of lattice quantum chromodynamics. Lattice QCD, while adding precision and insight to our fundamental understanding of strong interaction dynamics, has transformed our approach to particle and nuclear science. The individual investigator approach has evolved to teams of scientists from different disciplines working side-by-side towards a common goal. SciDAC is also undergoing a transformation. This meeting is a prime example. Last year it was a small programmatic meeting tracking progress in SciDAC. This year, we have a major computational science meeting with a variety of disciplines and enabling technologies represented. SciDAC 2005 should position itself as a new corner stone for Computational Science and its impact on science. As we look to the immediate future, FY2006 will bring a new cycle to SciDAC. Most of the program elements of SciDAC will be re-competed in FY2006. The re-competition will involve new instruments for computational science, new approaches for collaboration, as well as new disciplines. There will be new opportunities for virtual experiments in carbon sequestration, fusion, and nuclear power and nuclear waste, as well as collaborations with industry and virtual prototyping. New instruments of collaboration will include institutes and centers while summer schools, workshops and outreach will invite new talent and expertise. Computational science adds new dimensions to science and its practice. Disciplines of fusion, accelerator science, and combustion are poised to blur the boundaries between pure and applied science. As we open the door into FY2006 we shall see a landscape of new scientific challenges: in biology, chemistry, materials, and astrophysics to name a few. The enabling technologies of SciDAC have been transformational as drivers of change. Planning for major new software systems assumes a base line employing Common Component Architectures and this has become a household word for new software projects. While grid algorithms and mesh refinement software have transformed applications software, data management and visualization have transformed our understanding of science from data. The Gordon Bell prize now seems to be dominated by computational science and solvers developed by TOPS ISIC. The priorities of the Office of Science in the Department of Energy are clear. The 20 year facilities plan is driven by new science. High performance computing is placed amongst the two highest priorities. Moore's law says that by the end of the next cycle of SciDAC we shall have peta-flop computers. The challenges of petascale computing are enormous. These and the associated computational science are the highest priorities for computing within the Office of Science. 
Our effort in Leadership Class computing is just a first step towards this goal. Clearly, computational science at this scale will face enormous challenges and possibilities. Performance evaluation and prediction will be critical to unraveling the needed software technologies. We must not lose sight of our overarching goal—that of scientific discovery. Science does not stand still and the landscape of science discovery and computing holds immense promise. In this environment, I believe it is necessary to institute a system of science based performance metrics to help quantify our progress towards science goals and scientific computing. As a final comment I would like to reaffirm that the shifting landscapes of science will force changes to our computational sciences, and leave you with the quote from Richard Hamming, 'The purpose of computing is insight, not numbers'.
Computational methods for 2D materials: discovery, property characterization, and application design
NASA Astrophysics Data System (ADS)
Paul, J. T.; Singh, A. K.; Dong, Z.; Zhuang, H.; Revard, B. C.; Rijal, B.; Ashton, M.; Linscheid, A.; Blonsky, M.; Gluhovic, D.; Guo, J.; Hennig, R. G.
2017-11-01
The discovery of two-dimensional (2D) materials comes at a time when computational methods are mature and can predict novel 2D materials, characterize their properties, and guide the design of 2D materials for applications. This article reviews the recent progress in computational approaches for 2D materials research. We discuss the computational techniques and provide an overview of the ongoing research in the field. We begin with an overview of known 2D materials, common computational methods, and available cyber infrastructures. We then move onto the discovery of novel 2D materials, discussing the stability criteria for 2D materials, computational methods for structure prediction, and interactions of monolayers with electrochemical and gaseous environments. Next, we describe the computational characterization of the 2D materials’ electronic, optical, magnetic, and superconducting properties and the response of the properties under applied mechanical strain and electrical fields. From there, we move on to discuss the structure and properties of defects in 2D materials, and describe methods for 2D materials device simulations. We conclude by providing an outlook on the needs and challenges for future developments in the field of computational research for 2D materials.
Structural behavior of composites with progressive fracture
NASA Technical Reports Server (NTRS)
Minnetyan, L.; Murthy, P. L. N.; Chamis, C. C.
1989-01-01
The objective of the study is to unify several computational tools developed for the prediction of progressive damage and fracture with efforts for the prediction of the overall response of damaged composite structures. In particular, a computational finite element model for the damaged structure is developed using a computer program as a byproduct of the analysis of progressive damage and fracture. Thus, a single computational investigation can predict progressive fracture and the resulting variation in structural properties of angleplied composites.
New Unintended Adverse Consequences of Electronic Health Records
Wright, A.; Ash, J.; Singh, H.
2016-01-01
Summary Although the health information technology industry has made considerable progress in the design, development, implementation, and use of electronic health records (EHRs), the lofty expectations of the early pioneers have not been met. In 2006, the Provider Order Entry Team at Oregon Health & Science University described a set of unintended adverse consequences (UACs), or unpredictable, emergent problems associated with computer-based provider order entry implementation, use, and maintenance. Many of these originally identified UACs have not been completely addressed or alleviated, some have evolved over time, and some new ones have emerged as EHRs became more widely available. The rapid increase in the adoption of EHRs, coupled with the changes in the types and attitudes of clinical users, has led to several new UACs, specifically: complete clinical information unavailable at the point of care; lack of innovations to improve system usability leading to frustrating user experiences; inadvertent disclosure of large amounts of patient-specific information; increased focus on computer-based quality measurement negatively affecting clinical workflows and patient-provider interactions; information overload from marginally useful computer-generated data; and a decline in the development and use of internally-developed EHRs. While each of these new UACs poses significant challenges to EHR developers and users alike, they also offer many opportunities. The challenge for clinical informatics researchers is to continue to refine our current systems while exploring new methods of overcoming these challenges and developing innovations to improve EHR interoperability, usability, security, functionality, clinical quality measurement, and information summarization and display. PMID:27830226
Integrating CFD, CAA, and Experiments Towards Benchmark Datasets for Airframe Noise Problems
NASA Technical Reports Server (NTRS)
Choudhari, Meelan M.; Yamamoto, Kazuomi
2012-01-01
Airframe noise corresponds to the acoustic radiation due to turbulent flow in the vicinity of airframe components such as high-lift devices and landing gears. The combination of geometric complexity, high Reynolds number turbulence, multiple regions of separation, and a strong coupling with adjacent physical components makes the problem of airframe noise highly challenging. Since 2010, the American Institute of Aeronautics and Astronautics has organized an ongoing series of workshops devoted to Benchmark Problems for Airframe Noise Computations (BANC). The BANC workshops are aimed at enabling a systematic progress in the understanding and high-fidelity predictions of airframe noise via collaborative investigations that integrate state of the art computational fluid dynamics, computational aeroacoustics, and in depth, holistic, and multifacility measurements targeting a selected set of canonical yet realistic configurations. This paper provides a brief summary of the BANC effort, including its technical objectives, strategy, and selective outcomes thus far.
Mechanical Computing Redux: Limitations at the Nanoscale
NASA Astrophysics Data System (ADS)
Liu, Tsu-Jae King
2014-03-01
Technology solutions for overcoming the energy efficiency limits of nanoscale complementary metal oxide semiconductor (CMOS) technology ultimately will be needed in order to address the growing issue of integrated-circuit chip power density. Off-state leakage current sets a fundamental lower limit in energy per operation for any voltage-level-based digital logic implemented with transistors (CMOS and beyond), which leads to practical limits for device density (i.e. cost) and operating frequency (i.e. system performance). Mechanical switches have zero off-state leakage and hence can overcome this fundamental limit. Contact adhesive force sets a lower limit for the switching energy of a mechanical switch, however, and also directly impacts its performance. This paper will review recent progress toward the development of nano-electro-mechanical relay technology and discuss remaining challenges for realizing the promise of mechanical computing for ultra-low-power computing. Supported by the Center for Energy Efficient Electronics Science (NSF Award 0939514).
Uncertainty Quantification in Aeroelasticity
NASA Astrophysics Data System (ADS)
Beran, Philip; Stanford, Bret; Schrock, Christopher
2017-01-01
Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.
Milestones on the road to independence for the blind
NASA Astrophysics Data System (ADS)
Reed, Kenneth
1997-02-01
Ken will talk about his experiences as an end user of technology. Even moderate technological progress in the field of pattern recognition and artificial intelligence can be, often surprisingly, of great help to the blind. An example is the providing of portable bar code scanners so that a blind person knows what he is buying and what color it is. In this age of microprocessors controlling everything, how can a blind person find out what his VCR is doing? Is there some technique that will allow a blind musician to convert print music into midi files to drive a synthesizer? Can computer vision help the blind cross a road including predictions of where oncoming traffic will be located? Can computer vision technology provide spoken description of scenes so a blind person can figure out where doors and entrances are located, and what the signage on the building says? He asks 'can computer vision help me flip a pancake?' His challenge to those in the computer vision field is 'where can we go from here?'
Images as drivers of progress in cardiac computational modelling
Lamata, Pablo; Casero, Ramón; Carapella, Valentina; Niederer, Steve A.; Bishop, Martin J.; Schneider, Jürgen E.; Kohl, Peter; Grau, Vicente
2014-01-01
Computational models have become a fundamental tool in cardiac research. Models are evolving to cover multiple scales and physical mechanisms. They are moving towards mechanistic descriptions of personalised structure and function, including effects of natural variability. These developments are underpinned to a large extent by advances in imaging technologies. This article reviews how novel imaging technologies, or the innovative use and extension of established ones, integrate with computational models and drive novel insights into cardiac biophysics. In terms of structural characterization, we discuss how imaging is allowing a wide range of scales to be considered, from cellular levels to whole organs. We analyse how the evolution from structural to functional imaging is opening new avenues for computational models, and in this respect we review methods for measurement of electrical activity, mechanics and flow. Finally, we consider ways in which combined imaging and modelling research is likely to continue advancing cardiac research, and identify some of the main challenges that remain to be solved. PMID:25117497
Digital pathology in nephrology clinical trials, research, and pathology practice.
Barisoni, Laura; Hodgin, Jeffrey B
2017-11-01
In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.
Thrombosis in Cerebral Aneurysms and the Computational Modeling Thereof: A Review
Ngoepe, Malebogo N.; Frangi, Alejandro F.; Byrne, James V.; Ventikos, Yiannis
2018-01-01
Thrombosis is a condition closely related to cerebral aneurysms and controlled thrombosis is the main purpose of endovascular embolization treatment. The mechanisms governing thrombus initiation and evolution in cerebral aneurysms have not been fully elucidated and this presents challenges for interventional planning. Significant effort has been directed towards developing computational methods aimed at streamlining the interventional planning process for unruptured cerebral aneurysm treatment. Included in these methods are computational models of thrombus development following endovascular device placement. The main challenge with developing computational models for thrombosis in disease cases is that there exists a wide body of literature that addresses various aspects of the clotting process, but it may not be obvious what information is of direct consequence for what modeling purpose (e.g., for understanding the effect of endovascular therapies). The aim of this review is to present the information so it will be of benefit to the community attempting to model cerebral aneurysm thrombosis for interventional planning purposes, in a simplified yet appropriate manner. The paper begins by explaining current understanding of physiological coagulation and highlights the documented distinctions between the physiological process and cerebral aneurysm thrombosis. Clinical observations of thrombosis following endovascular device placement are then presented. This is followed by a section detailing the demands placed on computational models developed for interventional planning. Finally, existing computational models of thrombosis are presented. This last section begins with description and discussion of physiological computational clotting models, as they are of immense value in understanding how to construct a general computational model of clotting. This is then followed by a review of computational models of clotting in cerebral aneurysms, specifically. Even though some progress has been made towards computational predictions of thrombosis following device placement in cerebral aneurysms, many gaps still remain. Answering the key questions will require the combined efforts of the clinical, experimental and computational communities. PMID:29670533
Back to the future: virtualization of the computing environment at the W. M. Keck Observatory
NASA Astrophysics Data System (ADS)
McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.
2014-07-01
Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch-free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these were the substantial power savings, which resulted in strong, broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.
NASA Astrophysics Data System (ADS)
Tang, William M., Dr.
2006-01-01
The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.
TOPICAL REVIEW: Advances and challenges in computational plasma science
NASA Astrophysics Data System (ADS)
Tang, W. M.; Chan, V. S.
2005-02-01
Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.
Computer Simulations of Ion Transport in Polymer Electrolyte Membranes.
Mogurampelly, Santosh; Borodin, Oleg; Ganesan, Venkat
2016-06-07
Understanding the mechanisms and optimizing ion transport in polymer membranes have been the subject of active research for more than three decades. We present an overview of the progress and challenges involved with the modeling and simulation aspects of the ion transport properties of polymer membranes. We are concerned mainly with atomistic and coarser level simulation studies and discuss some salient work in the context of pure binary and single ion conducting polymer electrolytes, polymer nanocomposites, block copolymers, and ionic liquid-based hybrid electrolytes. We conclude with an outlook highlighting future directions.
Roskams, Jane; Popović, Zoran
2016-11-02
Global neuroscience projects are producing big data at an unprecedented rate that informatics and artificial intelligence (AI) analytics simply cannot handle. Online games, like Foldit, Eterna, and Eyewire, and now a new neuroscience game, Mozak, are fueling a people-powered research science (PPRS) revolution, creating a global community of "new experts" that over time synergize with computational efforts to accelerate scientific progress, empowering us to use our collective cerebral talents to drive our understanding of our brain. Copyright © 2016 Elsevier Inc. All rights reserved.
The quantum computer game: citizen science
NASA Astrophysics Data System (ADS)
Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob
2013-05-01
Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context with a global high-score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.
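As a toy illustration of the kind of parameter search the game exposes (and not the game's actual physical model or code), the sketch below integrates the Schrödinger equation for a resonantly driven two-level atom and scans the gate duration for the best population-transfer fidelity; the drive strength, time grid, and units are assumptions made purely for the example.

```python
import numpy as np

# Pauli-x and the computational basis states |0>, |1>.
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

def transfer_fidelity(rabi_freq, gate_time, n_steps=200):
    """Integrate i d|psi>/dt = H |psi| with H = (rabi_freq/2) * sigma_x
    (hbar = 1) by repeated short-time propagators; return |<1|psi(T)>|^2."""
    dt = gate_time / n_steps
    half = 0.5 * rabi_freq * dt
    # Exact short-time propagator exp(-i H dt) for the constant Hamiltonian.
    step = np.cos(half) * np.eye(2) - 1j * np.sin(half) * sx
    psi = ket0.copy()
    for _ in range(n_steps):
        psi = step @ psi
    return abs(ket1.conj() @ psi) ** 2

# Scan gate durations for a fixed drive strength, as a player would scan
# control parameters in the game interface.
rabi_freq = 2 * np.pi * 1.0                # illustrative units
times = np.linspace(0.1, 1.0, 46)
fidelities = [transfer_fidelity(rabi_freq, t) for t in times]
best = times[int(np.argmax(fidelities))]
print(f"Best scanned gate time: {best:.3f} (fidelity {max(fidelities):.4f})")
```

A player does essentially this search by hand, with the harder-to-automate extra goals of keeping the gate short and robust to experimental errors.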
50 Years of Educational Progress and Challenge in Ghana. Research Monograph No. 33
ERIC Educational Resources Information Center
Akyeampong, Kwame
2010-01-01
In 2007 Ghana celebrated 50 years of independence from British colonial rule. The golden jubilee offered an opportunity to take stock of how the country had progressed in expanding education and the challenges for the future. This paper offers a critique of the journey, highlighting the challenges and progress. What reforms in education has taught…
Challenges and advances in validating enzyme design proposals: the case of kemp eliminase catalysis.
Frushicheva, Maria P; Cao, Jie; Warshel, Arieh
2011-05-10
One of the fundamental challenges in biotechnology and biochemistry is the ability to design effective enzymes. Despite recent progress, most of the advances on this front have been made by placing the reacting fragments in the proper places, rather than by optimizing the preorganization of the environment, which is the key factor in enzyme catalysis. Thus, rational improvement of the preorganization would require approaches capable of evaluating reliably the actual catalytic effect. This work considers the catalytic effects in different Kemp eliminases as a benchmark for a computer-aided enzyme design. It is shown that the empirical valence bond provides a powerful screening tool, with significant advantages over current alternative strategies. The insights provided by the empirical valence bond calculations are discussed with an emphasis on the ability to analyze the difference between the linear free energy relationships obtained in solution and those found in the enzymes. We also point out the trade-off between the reliability and speed of the calculations and try to determine what it takes to realize reliable computer-aided screening.
Facing the challenges of multiscale modelling of bacterial and fungal pathogen–host interactions
Schleicher, Jana; Conrad, Theresia; Gustafsson, Mika; Cedersund, Gunnar; Guthke, Reinhard
2017-01-01
Recent and rapidly evolving progress on high-throughput measurement techniques and computational performance has led to the emergence of new disciplines, such as systems medicine and translational systems biology. At the core of these disciplines lies the desire to produce multiscale models: mathematical models that integrate multiple scales of biological organization, ranging from molecular, cellular and tissue models to organ, whole-organism and population scale models. Using such models, hypotheses can systematically be tested. In this review, we present state-of-the-art multiscale modelling of bacterial and fungal infections, considering both the pathogen and host as well as their interaction. Multiscale modelling of the interactions of bacteria, especially Mycobacterium tuberculosis, with the human host is quite advanced. In contrast, models for fungal infections are still in their infancy, in particular regarding infections with the most important human pathogenic fungi, Candida albicans and Aspergillus fumigatus. We reflect on the current availability of computational approaches for multiscale modelling of host–pathogen interactions and point out current challenges. Finally, we provide an outlook for future requirements of multiscale modelling. PMID:26857943
A second golden age of aeroacoustics?
Lele, Sanjiva K; Nichols, Joseph W
2014-08-13
In 1992, Sir James Lighthill foresaw the dawn of a second golden age in aeroacoustics enabled by computer simulations (Hardin JC, Hussaini MY (eds) 1993 Computational aeroacoustics, New York, NY: Springer (doi:10.1007/978-1-4613-8342-0)). This review traces the progress in large-scale computations to resolve the noise-source processes and the methods devised to predict the far-field radiated sound using this information. Keeping the focus on aviation-related noise sources, a brief account of the progress in simulations of jet noise, fan noise and airframe noise is given, highlighting the key technical issues and challenges. The complex geometry of nozzle elements and airframe components as well as the high Reynolds number of target applications require careful assessment of the discretization algorithms on unstructured grids and modelling compromises. High-fidelity simulations with 200-500 million points are not uncommon today and are used to improve scientific understanding of the noise generation process in specific situations. We attempt to discern where the future might take us, especially if exascale computing becomes a reality in 10 years. A pressing question in this context concerns the role of modelling in the coming era. While the sheer scale of the data generated by large-scale simulations will require new methods for data analysis and data visualization, it is our view that suitable theoretical formulations and reduced models will be even more important in future. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Advances in medical image computing.
Tolxdorff, T; Deserno, T M; Handels, H; Meinzer, H-P
2009-01-01
Medical image computing has become a key technology in high-tech applications in medicine and a ubiquitous part of modern imaging systems and the related processes of clinical diagnosis and intervention. Over the past years, significant progress has been made in the field, both on the methodological and on the application level. Despite this progress, there are still big challenges to meet in order to establish image processing routinely in health care. In this issue, selected contributions of the German Conference on Medical Image Processing (BVM) are assembled to present latest advances in the field of medical image computing. The winners of scientific awards of the German Conference on Medical Image Processing (BVM) 2008 were invited to submit a manuscript on their latest developments and results for possible publication in Methods of Information in Medicine. Finally, seven excellent papers were selected to describe important aspects of recent advances in the field of medical image processing. The selected papers give an impression of the breadth and heterogeneity of new developments. New methods for improved image segmentation, non-linear image registration and modeling of organs are presented together with applications of image analysis methods in different medical disciplines. Furthermore, state-of-the-art tools and techniques to support the development and evaluation of medical image processing systems in practice are described. The selected articles describe different aspects of the intense development in medical image computing. The image processing methods presented enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv
2013-12-01
Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI 2013 Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and the HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Modeling and Simulation of Nanoindentation
NASA Astrophysics Data System (ADS)
Huang, Sixie; Zhou, Caizhi
2017-11-01
Nanoindentation is a hardness test method applied to small volumes of material; it can reveal unique small-scale effects and has sparked many related research activities. To fully understand the phenomena observed during nanoindentation tests, modeling and simulation methods have been developed to predict the mechanical response of materials during nanoindentation. However, challenges remain with those computational approaches, because of their length scale, predictive capability, and accuracy. This article reviews recent progress and challenges for modeling and simulation of nanoindentation, including an overview of molecular dynamics, the quasicontinuum method, discrete dislocation dynamics, and the crystal plasticity finite element method, and discusses how to integrate multiscale modeling approaches seamlessly with experimental studies to understand the length-scale effects and microstructure evolution during nanoindentation tests, creating a unique opportunity to establish new calibration procedures for the nanoindentation technique.
NASA Astrophysics Data System (ADS)
Li, Guoliang; Xing, Lining; Chen, Yingwu
2017-11-01
The autonomy of self-scheduling on Earth observation satellites and the increasing scale of satellite networks have attracted much attention from researchers in recent decades. In reality, the limited onboard computational resource presents a challenge for online scheduling algorithms. This study considered the online scheduling problem for a single autonomous Earth observation satellite within a satellite network environment. It especially addressed the case in which urgent tasks arrive stochastically during the scheduling horizon. We described the problem and proposed a hybrid online scheduling mechanism with revision and progressive techniques to solve it. The mechanism includes two decision policies: a when-to-schedule policy combining periodic scheduling with critical-cumulative-number-based event-driven rescheduling, and a how-to-schedule policy combining progressive and revision approaches to accommodate two categories of task, normal tasks and urgent tasks. On this basis, we developed two heuristic (re)scheduling algorithms and compared them with other commonly used techniques. Computational experiments indicated that the percentage of urgent tasks brought into the schedule by the proposed mechanism is much higher than that of a purely periodic scheduling mechanism, and that the specific performance depends strongly on several mechanism-relevant and task-relevant factors. For the online scheduling, the modified weighted-shortest-imaging-time-first and dynamic-profit-system-benefit heuristics outperformed the others on total profit and on the percentage of successfully scheduled urgent tasks.
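A minimal sketch of the when-to-schedule side of such a hybrid mechanism is given below, assuming (purely for illustration) a fixed rescheduling period and a critical cumulative number of urgent arrivals; the class name, thresholds, and time units are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class WhenToSchedulePolicy:
    """Hybrid trigger: reschedule either at fixed intervals or as soon as the
    cumulative count of newly arrived urgent tasks reaches a critical number.
    Parameter values are illustrative assumptions."""
    period: float = 600.0        # seconds between periodic reschedules
    critical_count: int = 3      # urgent arrivals that force a reschedule
    last_reschedule: float = 0.0
    pending_urgent: int = 0

    def on_urgent_arrival(self) -> None:
        self.pending_urgent += 1

    def should_reschedule(self, now: float) -> bool:
        periodic_due = (now - self.last_reschedule) >= self.period
        event_due = self.pending_urgent >= self.critical_count
        return periodic_due or event_due

    def mark_rescheduled(self, now: float) -> None:
        self.last_reschedule = now
        self.pending_urgent = 0

# Example: two urgent arrivals do not trigger a reschedule, a third does.
policy = WhenToSchedulePolicy()
for t in (50.0, 120.0, 130.0):
    policy.on_urgent_arrival()
    if policy.should_reschedule(t):
        print(f"t={t:.0f}s: reschedule (urgent tasks pending: {policy.pending_urgent})")
        policy.mark_rescheduled(t)
```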
Climate Reanalysis: Progress and Future Prospects
NASA Technical Reports Server (NTRS)
Gelaro, Ron
2018-01-01
Reanalysis is the process whereby an unchanging data assimilation system is used to provide a consistent reprocessing of observations, typically spanning an extended segment of the historical data record. The process relies on an underlying model to combine often-disparate observations in a physically consistent manner, enabling production of gridded data sets for a broad range of applications including the study of historical weather events, preparation of climatologies, business sector development and, more recently, climate monitoring. Over the last few decades, several generations of reanalyses of the global atmosphere have been produced by various operational and research centers, focusing more or less on the period of regular conventional and satellite observations beginning in the mid to late twentieth century. There have also been successful efforts to extend atmospheric reanalyses back to the late nineteenth and early twentieth centuries, using mostly surface observations. Much progress has resulted from (and contributed to) advancements in numerical weather prediction, especially improved models and data assimilation techniques, increased computing capacity, the availability of new observation types and efforts to recover and improve the quality of historical ones. The recent extension of forecast systems that allow integrated modeling of meteorological, oceanic, land surface, and chemical variables provide the basic elements for coupled data assimilation. This has opened the door to the development of a new generation of coupled reanalyses of the Earth system, or integrated Earth system analyses (IESA). Evidence so far suggests that this approach can improve the analysis of currently uncoupled components of the Earth system, especially at their interface, and lead to increased predictability. However, extensive analysis coupling as envisioned for IESA, while progressing, still presents significant challenges. These include model biases that can be exacerbated when coupled, component systems with different physical characteristics and different spatial and temporal scales, and component observations in different media with different spatial and temporal frequencies and different latencies. Quantification of uncertainty in reanalyses is also a critical challenge and is important for expanding their utility as a tool for climate change assessment. This talk provides a brief overview of the progress of reanalysis development during recent decades, and describes remaining challenges in the progression toward coupled Earth system reanalyses.
Opportunities and choice in a new vector era
NASA Astrophysics Data System (ADS)
Nowak, A.
2014-06-01
This work discusses the significant changes in the computing landscape related to the progression of Moore's Law, and their implications for scientific computing. Particular attention is devoted to the High Energy Physics (HEP) domain, which has always made good use of threading, but where levels of parallelism closer to the hardware were often left underutilized. Findings of the CERN openlab Platform Competence Center are reported in the context of expanding "performance dimensions", and especially the resurgence of vectors. These suggest that data-oriented designs are feasible in HEP and have considerable potential for performance improvements on multiple levels, but will rarely trump algorithmic enhancements. Finally, an analysis of upcoming hardware and software technologies identifies heterogeneity as a major challenge for software, which will require more emphasis on scalable, efficient design.
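As a generic illustration of the data-oriented, vector-friendly designs alluded to above (not the HEP codes studied at openlab), the sketch below contrasts an array-of-structures layout processed in a scalar loop with a structure-of-arrays layout processed by a single whole-array expression; the particle fields and sizes are arbitrary.

```python
import numpy as np

# Array-of-structures style: one record per particle, processed in a scalar
# loop -- convenient, but opaque to vector hardware.
particles_aos = [{"px": 1.0, "py": 2.0, "pz": 0.5} for _ in range(100_000)]
pt_scalar = [(p["px"] ** 2 + p["py"] ** 2) ** 0.5 for p in particles_aos]

# Structure-of-arrays style: one contiguous array per field, one vectorized
# expression over all particles.  NumPy evaluates this in tight compiled
# loops (often SIMD-vectorized); in compiled HEP code the same layout lets
# the compiler emit vector instructions directly.
px = np.full(100_000, 1.0)
py = np.full(100_000, 2.0)
pt_vector = np.sqrt(px ** 2 + py ** 2)

assert np.allclose(pt_scalar, pt_vector)
```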
The challenge of understanding the brain: where we stand in 2015
Lisman, John
2015-01-01
Starting with the work of Cajal more than 100 years ago, neuroscience has sought to understand how the cells of the brain give rise to cognitive functions. How far has neuroscience progressed in this endeavor? This Perspective assesses progress in elucidating five basic brain processes: visual recognition, long-term memory, short-term memory, action selection, and motor control. Each of these processes entails several levels of analysis: the behavioral properties, the underlying computational algorithm, and the cellular/network mechanisms that implement that algorithm. At this juncture, while many questions remain unanswered, achievements in several areas of research have made it possible to relate specific properties of brain networks to cognitive functions. What has been learned reveals, at least in rough outline, how cognitive processes can be an emergent property of neurons and their connections. PMID:25996132
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems from various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of scalable applications in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework to automate the problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control of such systems in parallel mode at various degrees of detail.
Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Fukae, Jun; Katayama, Kou; Aoki, Yuko; Okubo, Takanobu; Okino, Taichi; Kaneda, Takahiko; Takagi, Satoshi; Tanimura, Kazuhide
2017-10-01
We have developed a refined computer-based method to detect joint space narrowing (JSN) progression with the joint space narrowing progression index (JSNPI) by superimposing sequential hand radiographs. The purpose of this study is to assess the validity of the computer-based method in rheumatoid arthritis (RA) patients using images obtained from multiple institutions. Sequential hand radiographs of 42 patients (37 females and 5 males) with RA from two institutions were analyzed by the computer-based method and by visual scoring systems as a standard of reference. A JSNPI above the smallest detectable difference (SDD) defined JSN progression at the joint level. The sensitivity and specificity of the computer-based method for JSN progression were calculated using the SDD and a receiver operating characteristic (ROC) curve. Out of 314 metacarpophalangeal joints, 34 joints progressed based on the SDD, while 11 joints widened. Twenty-one joints progressed in the computer-based method, 11 joints in the scoring systems, and 13 joints in both methods. Based on the SDD, we found lower sensitivity and higher specificity, at 54.2 and 92.8%, respectively. At the most discriminant cutoff point according to the ROC curve, the sensitivity and specificity were 70.8 and 81.7%, respectively. The proposed computer-based method provides a quantitative measurement of JSN progression using sequential hand radiographs and may be a useful tool for follow-up assessment of joint damage in RA patients.
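The evaluation step described above can be sketched as follows; the JSNPI values, the SDD, and the reference flags here are random placeholders rather than the study's measurements, and the function names are invented for the example.

```python
import numpy as np

def progression_flags(jsnpi, sdd):
    """Flag JSN progression at the joint level: JSNPI above the SDD."""
    return np.asarray(jsnpi) > sdd

def sensitivity_specificity(predicted, reference):
    """Sensitivity and specificity of the computer-based flags against the
    visual-scoring reference (boolean arrays, one entry per joint)."""
    predicted = np.asarray(predicted, dtype=bool)
    reference = np.asarray(reference, dtype=bool)
    tp = np.sum(predicted & reference)
    fn = np.sum(~predicted & reference)
    tn = np.sum(~predicted & ~reference)
    fp = np.sum(predicted & ~reference)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative placeholder data (not the study's measurements).
rng = np.random.default_rng(1)
jsnpi = rng.normal(loc=0.0, scale=0.15, size=314)   # change per joint
sdd = 0.25                                          # assumed SDD
reference = jsnpi + rng.normal(scale=0.05, size=314) > 0.2

pred = progression_flags(jsnpi, sdd)
sens, spec = sensitivity_specificity(pred, reference)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```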
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, cleaner, more fuel-efficient cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
Genetic Toxicology in the 21st Century: Reflections and Future Directions
Mahadevan, Brinda; Snyder, Ronald D.; Waters, Michael D.; Benz, R. Daniel; Kemper, Raymond A.; Tice, Raymond R.; Richard, Ann M.
2011-01-01
A symposium at the 40th anniversary of the Environmental Mutagen Society, held from October 24–28, 2009 in St. Louis, MO, surveyed the current status and future directions of genetic toxicology. This article summarizes the presentations and provides a perspective on the future. An abbreviated history is presented, highlighting the current standard battery of genotoxicity assays and persistent challenges. Application of computational toxicology to safety testing within a regulatory setting is discussed as a means for reducing the need for animal testing and human clinical trials, and current approaches and applications of in silico genotoxicity screening approaches across the pharmaceutical industry were surveyed and are reported here. The expanded use of toxicogenomics to illuminate mechanisms and bridge genotoxicity and carcinogenicity, and new public efforts to use high-throughput screening technologies to address lack of toxicity evaluation for the backlog of thousands of industrial chemicals in the environment are detailed. The Tox21 project involves coordinated efforts of four U.S. Government regulatory/research entities to use new and innovative assays to characterize key steps in toxicity pathways, including genotoxic and nongenotoxic mechanisms for carcinogenesis. Progress to date, highlighting preliminary test results from the National Toxicology Program is summarized. Finally, an overview is presented of ToxCast™, a related research program of the U.S. Environmental Protection Agency, using a broad array of high throughput and high content technologies for toxicity profiling of environmental chemicals, and computational toxicology modeling. Progress and challenges, including the pressing need to incorporate metabolic activation capability, are summarized. PMID:21538556
Acquisition of Real-Time Operation Analytics for an Automated Serial Sectioning System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madison, Jonathan D.; Underwood, O. D.; Poulter, Gregory A.
Mechanical serial sectioning is a highly repetitive technique employed in metallography for the rendering of 3D reconstructions of microstructure. While alternate techniques such as ultrasonic detection, micro-computed tomography, and focused ion beam milling have progressed much in recent years, few alternatives provide equivalent opportunities for comparatively high resolutions over significantly sized cross-sectional areas and volumes. To that end, the introduction of automated serial sectioning systems has greatly heightened repeatability and increased data collection rates while diminishing opportunity for mishandling and other user-introduced errors. Unfortunately, even among current, state-of-the-art automated serial sectioning systems, challenges in data collection have not been fully eradicated. Therefore, this paper highlights two specific advances to assist in this area: a non-contact laser triangulation method for assessment of material removal rates and a newly developed graphical user interface providing real-time monitoring of experimental progress. Furthermore, both are shown to be helpful in the rapid identification of anomalies and interruptions, while also providing comparable and less error-prone measures of removal rate over the course of these long-term, challenging, and innately destructive characterization experiments.
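A minimal sketch of the removal-rate bookkeeping behind such monitoring is shown below; the height values, target removal, tolerance, and function names are illustrative assumptions, not the paper's instrument readings or software.

```python
import statistics

def removal_rates(heights_um):
    """Per-slice material removal (um) from successive non-contact height
    measurements taken before each new section."""
    return [h_prev - h_next for h_prev, h_next in zip(heights_um, heights_um[1:])]

def flag_anomalies(rates_um, target_um, tolerance_um):
    """Indices of slices whose removal deviates from the target by more than
    the allowed tolerance -- the kind of interruption/anomaly check a
    real-time monitoring interface would surface."""
    return [i for i, r in enumerate(rates_um) if abs(r - target_um) > tolerance_um]

# Illustrative measurements (um of remaining stack height after each slice).
heights = [1000.0, 996.9, 993.8, 993.7, 990.6, 987.5]
rates = removal_rates(heights)
print("mean removal per slice:", round(statistics.mean(rates), 2), "um")
print("anomalous slices:", flag_anomalies(rates, target_um=3.0, tolerance_um=0.5))
```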
NASA Astrophysics Data System (ADS)
Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.
Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of the potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part of this potential rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common industrially-relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for their potential practical use. The book is also suitable for advanced-level students in computer science.
NASA Astrophysics Data System (ADS)
Kerschbaum, M.; Hopmann, C.
2016-06-01
The computationally efficient simulation of the progressive damage behaviour of continuous fibre reinforced plastics is still a challenging task with currently available computer-aided engineering methods. This paper presents an original approach for an energy-based continuum damage model which accounts for stress/strain nonlinearities, transverse and shear stress interaction phenomena, quasi-plastic shear strain components, strain rate effects, regularised damage evolution and consideration of load reversal effects. The physically based modelling approach enables experimental determination of all parameters on the ply level to avoid expensive inverse analysis procedures. The modelling strategy, implementation and verification of this model using commercially available explicit finite element software are detailed. The model is then applied to simulate the impact and penetration of carbon fibre reinforced cross-ply specimens with variation of the impact speed. The simulation results show that the presented approach enables a good representation of the force/displacement curves and especially good agreement with the experimentally observed fracture patterns. In addition, the mesh dependency of the results was assessed for one impact case, showing only very little change in the simulation results, which emphasises the general applicability of the presented method.
Construction of a single atom trap for quantum information protocols
NASA Astrophysics Data System (ADS)
Shea, Margaret E.; Baker, Paul M.; Gauthier, Daniel J.; Duke Physics Department Team
2016-05-01
The field of quantum information science addresses outstanding problems such as achieving fundamentally secure communication and solving computationally hard problems. Great progress has been made in the field, particularly using photons coupled to ions and superconducting qubits. Neutral atoms are also interesting for these applications and though the technology for control of neutrals lags behind that of trapped ions, they offer some key advantages: primarily coupling to optical frequencies closer to the telecom band than trapped ions or superconducting qubits. Here we report progress on constructing a single atom trap for 87Rb. This system is a promising platform for studying the technical problems facing neutral atom quantum computing. For example, most protocols destroy the trap when reading out the neutral atom's state; we will investigate an alternative non-destructive state detection scheme. We detail the experimental systems involved and the challenges addressed in trapping a single atom. All of our hardware components are off the shelf and relatively inexpensive. Unlike many other systems, we place a high numerical aperture lens inside our vacuum system to increase photon collection efficiency. We gratefully acknowledge the financial support of the ARO through Grant # W911NF1520047.
Numerical simulations of merging black holes for gravitational-wave astronomy
NASA Astrophysics Data System (ADS)
Lovelace, Geoffrey
2014-03-01
Gravitational waves from merging binary black holes (BBHs) are among the most promising sources for current and future gravitational-wave detectors. Accurate models of these waves are necessary to maximize the number of detections and our knowledge of the waves' sources; near the time of merger, the waves can only be computed using numerical-relativity simulations. For optimal application to gravitational-wave astronomy, BBH simulations must achieve sufficient accuracy and length, and all relevant regions of the BBH parameter space must be covered. While great progress toward these goals has been made in the almost nine years since BBH simulations became possible, considerable challenges remain. In this talk, I will discuss current efforts to meet these challenges, and I will present recent BBH simulations produced using the Spectral Einstein Code, including a catalog of publicly available gravitational waveforms [black-holes.org/waveforms]. I will also discuss simulations of merging black holes with high mass ratios and with spins nearly as fast as possible, the most challenging regions of the BBH parameter space.
NASA Astrophysics Data System (ADS)
Karasiev, V. V.
2017-10-01
Free-energy density functional theory (DFT) is one of the standard tools in high-energy-density physics used to determine the fundamental properties of dense plasmas, especially in cold and warm regimes where quantum effects are essential. DFT is usually implemented via the orbital-dependent Kohn-Sham (KS) procedure. There are two challenges with the conventional implementation: (1) the KS computational cost becomes prohibitively expensive at high temperatures; and (2) ground-state exchange-correlation (XC) functionals do not take into account XC thermal effects. This talk will address both challenges and report details of the formal development of a new generalized gradient approximation (GGA) XC free-energy functional which bridges the low-temperature (ground-state) and high-temperature (plasma) limits. Recent progress on the development of functionals for orbital-free DFT as a way to address the second challenge will also be discussed. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.
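Purely as an illustration of what bridging the two limits means (a schematic form, not the functional developed in this work), one can picture an interpolation in the reduced temperature t = T/T_F between the ground-state XC energy and a high-temperature limit:

f_{xc}(n,T) \approx w(t)\, e_{xc}^{T=0}(n) + \bigl[1 - w(t)\bigr]\, f_{xc}^{\mathrm{high}\text{-}T}(n,T),
\qquad t = T/T_F, \qquad w(0) = 1, \qquad w(t) \to 0 \ \text{as}\ t \to \infty,

with w(t) a smooth switching function of the reduced temperature; an actual GGA free-energy functional additionally carries density-gradient dependence in both regimes.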
Spyrakis, Francesca; Cavasotto, Claudio N
2015-10-01
Structure-based virtual screening is currently an established tool in drug lead discovery projects. Although in recent years the field has seen impressive progress in terms of algorithm development, computational performance, and retrospective and prospective applications in ligand identification, there are still long-standing challenges where further improvement is needed. In this review, we consider the conceptual framework, state of the art and recent developments of three critical "structural" issues in structure-based drug lead discovery: the use of homology modeling to accurately model the binding site when no experimental structures are available, the necessity of accounting for the dynamics of intrinsically flexible systems such as proteins, and the importance of considering active-site water molecules in lead identification and optimization campaigns. Copyright © 2015 Elsevier Inc. All rights reserved.
Phan, Philippe; Mezghani, Neila; Aubin, Carl-Éric; de Guise, Jacques A; Labelle, Hubert
2011-07-01
Adolescent idiopathic scoliosis (AIS) is a complex spinal deformity whose assessment and treatment present many challenges. Computer applications have been developed to assist clinicians. A literature review of computer applications used in AIS evaluation and treatment was undertaken. The algorithms used, their accuracy and their clinical usability were analyzed. Computer applications have been used to create new classifications for AIS based on 2D and 3D features, to assess scoliosis severity or risk of progression, and to assist bracing and surgical treatment. It was found that classification accuracy could be improved using computer algorithms, that AIS patient follow-up and screening could be done using surface topography (thereby limiting radiation exposure), and that bracing and surgical treatment could be optimized using simulations. Yet few computer applications are routinely used in clinics. With the development of 3D imaging and databases, huge amounts of clinical and geometrical data need to be taken into consideration when researching and managing AIS. Computer applications based on advanced algorithms will be able to handle tasks that could otherwise not be done, which can possibly improve the management of AIS patients. Clinically oriented applications, and evidence that they can improve current care, will be required for their integration into the clinical setting.
Analysis of New Composite Architectures
NASA Technical Reports Server (NTRS)
Whitcomb, John D.
1996-01-01
Efficient and accurate specialty finite element methods to analyze textile composites were developed and are described. Textile composites present unique challenges to the analyst because of their large, complex 'microstructure'. The geometry of the microstructure is difficult to model and it introduces unusual free-surface effects. The size of the microstructure complicates the use of traditional homogenization methods. The methods developed constitute considerable progress in addressing these modeling difficulties. The details of the methods, and the results obtained with them, are described in the various chapters of Part 1 of the report. Specific conclusions and the computer codes generated are included in Part 2 of the report.
A data mining method to facilitate SAR transfer.
Wassermann, Anne Mai; Bajorath, Jürgen
2011-08-22
A challenging practical problem in medicinal chemistry is the transfer of SAR information from one chemical series to another. Currently, there are no computational methods available to rationalize or support this process. Herein, we present a data mining approach that enables the identification of alternative analog series with different core structures, corresponding substitution patterns, and comparable potency progression. Scaffolds can be exchanged between these series and new analogs suggested that incorporate preferred R-groups. The methodology can be applied to search for alternative analog series if one series is known or, alternatively, to systematically assess SAR transfer potential in compound databases.
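As a toy illustration of the underlying idea (not the authors' data-mining method; the series and potency values below are invented), SAR transfer potential between two analog series can be gauged by how similarly their shared R-groups rank by potency:

from scipy.stats import spearmanr

# Hypothetical analog series: R-group -> potency (pIC50). All values are invented.
series_a = {"H": 6.1, "F": 6.8, "Cl": 7.2, "OMe": 7.9, "CF3": 8.4}
series_b = {"H": 5.5, "F": 6.0, "Cl": 6.7, "OMe": 7.1, "CF3": 7.8, "Br": 6.9}

shared = sorted(set(series_a) & set(series_b))
rho, p = spearmanr([series_a[r] for r in shared], [series_b[r] for r in shared])

# A high rank correlation over shared substituents suggests comparable potency
# progression, i.e. SAR knowledge might transfer from one core scaffold to the other.
print(f"shared R-groups: {shared}, Spearman rho = {rho:.2f} (p = {p:.3f})")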
NASA Astrophysics Data System (ADS)
Waite, C. T.
2013-04-01
Moonwalk is a stroll on the Moon through time and space, a lyrical history of humanity's scientific and allegorical relationship with the Moon from the beginnings of culture to the Space Age and the memories of the Cold-War generation. It is an experimental film in both genre and form, a computer animation designed for projection on a planetarium cupola. A hemispherical film, Moonwalk creates an immersive experience. The fulldome format presents aesthetic and technical challenges to create a new form of imagery and spatial montage. A seven-minute excerpt of the work-in-progress was shown at INSAPV in the Adler Planetarium, Chicago.
Women's decision to major in STEM fields
NASA Astrophysics Data System (ADS)
Conklin, Stephanie
This paper explores the lived experiences of high school female students who chose to enter STEM fields, and describes the influencing factors that steered these women towards majors in computer science, engineering and biology. Utilizing phenomenological methodology, this study seeks to understand the essence of women's decisions to enter STEM fields and further describes how the decision-making process varies for women in high-female-enrollment fields, like biology, as compared with low-enrollment fields, like computer science and engineering. Using Bloom's 3-Stage Theory, this study analyzes how relationships, experiences and barriers influenced women towards, and possibly away from, STEM fields. An analysis of women's experiences highlights that support of family, sustained experience in a STEM program during high school, and the presence of an influential teacher were all salient factors in steering women towards STEM fields. Participants explained that the influential teacher worked individually with them, modified and extended assignments, and steered participants towards coursework and experiences. This study also identifies factors, like guidance counselors as well as personal challenges, which inhibited participants' paths to STEM fields. Further, through analyzing all six participants' experiences, it is clear that a linear model like Bloom's 3-Stage Model, with limited ability to include potential barriers, could not capture the essence of each participant's decision-making process. Therefore, a revised, non-linear model that allows for emerging factors, like personal challenges, has been proposed; this model focuses on how interest in STEM fields begins to develop, is honed and is then mastered. This study also sought to identify key differences in the paths of female students pursuing different majors. The findings of this study suggest that the path to computer science and engineering is limited. Computer science majors faced few, if any, challenges, hoped to use computers as a tool to innovate, and had all participated in the same computer science program. For female engineering students, the essence of their experience centered on interaction at a young age with an expert in an engineering-related field as well as a strong desire to help solve world problems using engineering. These participants were able to clearly articulate future career plans. In contrast, biology majors faced more challenges and were undecided about their future career goals. These results suggest that a longitudinal study focused on women pursuing engineering and computer science is warranted; this would allow the findings to be substantiated and the revised theoretical model to be refined.
NASA Astrophysics Data System (ADS)
Pereira, Manuel S.; Ambrosio, Jorge A. C.
1993-07-01
During the last few years, major scientific progress has been achieved in fields related to the computer-aided analysis of multibody systems. In view of this progress and of recent developments in computer hardware and general-purpose software, there is a need to assess the current state of the art and the results from different schools of thought, with the objective of identifying trends for future research. Going back to 1983, when an important NATO-NSF-ARO Advanced Study Institute on Computer Aided Analysis and Optimization of Mechanical Systems was held at the University of Iowa, one may notice that less than 10 years ago the state of the art dwelt mainly on rigid body dynamics. The interest in the dynamic simulation of mechanical systems has steadily increased in recent years, coming mainly from the aerospace and automotive industries. The development of multibody system analysis formulations has more recently been motivated by the need to include several features such as real-time simulation capabilities, highly nonlinear control devices, workspace path planning, active control of machine flexibilities, and reliability and accuracy of the analysis results. The need for accurate and efficient analysis tools for the design of large and lightweight mechanical systems has drawn many research groups to the challenging problem of flexible systems, with increasing interaction with finite element methodologies. Basic approaches to the dynamic analysis of mechanical systems have recently been presented in several new textbooks. These publications demonstrate that both recursive and absolute methods still have their proponents for resolving the redundancy encountered in most mechanical systems.
NASA Technical Reports Server (NTRS)
Kiris, Cetin C.; Kwak, Dochan; Rogers, Stuart E.
2002-01-01
This paper reviews recent progress made in incompressible Navier-Stokes simulation procedures and their application to problems of engineering interest. Discussions are focused on methods designed for complex geometry applications in three dimensions, and thus are limited to primitive-variable formulations. A summary of efforts in flow solver development is given, followed by numerical studies of a few example problems of current interest. Both steady and unsteady solution algorithms and their salient features are discussed. The solvers discussed here are based on a structured-grid approach using either a finite-difference or a finite-volume framework. As a grand-challenge application of these solvers, an unsteady turbopump flow simulation procedure has been developed which utilizes high-performance computing platforms. In the paper, progress toward a complete simulation capability for the turbopump of a liquid rocket engine is reported. The Space Shuttle Main Engine (SSME) turbopump is used as a test case for evaluation of two parallel computing algorithms that have been implemented in the INS3D code. The relative motion of the grid systems for the rotor-stator interaction was obtained using overset grid techniques. Unsteady computations for the SSME turbopump, which contains 114 zones with 34.5 million grid points, are carried out on SGI Origin 3000 systems at NASA Ames Research Center. The same procedure has been extended to the development of the NASA-DeBakey Ventricular Assist Device (VAD), which is based on an axial blood pump. Computational and clinical analyses of this device are presented.
Neo-Symbiosis: The Next Stage in the Evolution of Human Information Interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, Douglas; Greitzer, Frank L.
We re-address the vision of human-computer symbiosis expressed by J. C. R. Licklider nearly a half-century ago, when he wrote: “The hope is that in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” (Licklider, 1960). Unfortunately, little progress was made toward this vision over the four decades following Licklider’s challenge, despite significant advancements in the fields of human factors and computer science. Licklider’s vision was largely forgotten. However, recent advances in information science and technology, psychology, and neuroscience have rekindled the potential of making Licklider’s vision a reality. This paper provides a historical context for the vision and updates it, and it argues that such a vision is needed as a unifying framework for advancing IS&T.
Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.
2017-01-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625
Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M
2017-04-01
The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.
Challenges and Opportunities for Extracting Cardiovascular Risk Biomarkers from Imaging Data
NASA Astrophysics Data System (ADS)
Kakadiaris, I. A.; Mendizabal-Ruiz, E. G.; Kurkure, U.; Naghavi, M.
Complications attributed to cardiovascular disease (CVD) are the leading cause of death worldwide. In the United States, sudden heart attack remains the number one cause of death and accounts for the majority of the $280 billion burden of cardiovascular disease. In spite of the advancements in cardiovascular imaging techniques, the rate of deaths due to unpredicted heart attacks remains high. Thus, novel computational tools are critically needed to mine quantitative parameters from imaging data for the early detection of persons with a high likelihood of developing a heart attack in the near future (vulnerable patients). In this paper, we present our progress in research on computational methods for the extraction of cardiovascular risk biomarkers from cardiovascular imaging data. In particular, we focus on the methods developed for the analysis of intravascular ultrasound (IVUS) data.
2017-04-01
The reporting of research in a manner that allows reproduction in subsequent investigations is important for scientific progress. Several details of the recent study by Patrizi et al., 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics', are absent from the published manuscript and make reproduction of findings impossible. As new and complex technologies with great promise for ergonomics develop, new but surmountable challenges for reporting investigations using these technologies in a reproducible manner arise. Practitioner Summary: As with traditional methods, scientific reporting of new and complex ergonomics technologies should be performed in a manner that allows reproduction in subsequent investigations and supports scientific advancement.
Non-operative management of cerebral abscess
NASA Astrophysics Data System (ADS)
Batubara, C. A.
2018-03-01
Cerebral abscess is a focal intracerebral infection that begins as a localized area of cerebritis and develops into a collection of pus surrounded by a well-vascularized capsule. Patients typically present with varying combinations of headache, progressive neurologic deficits, seizures, and evidence of infection. Computed tomography and magnetic resonance imaging are the most important diagnostic tools in diagnosing cerebral abscess. The treatment of cerebral abscess has been a challenge. Small cerebral abscesses (< 2.5 cm) have been treated empirically with antibiotics. Elevation of intracranial pressure and threatening herniation can be managed by the use of intravenous mannitol (or hypertonic saline) and dexamethasone. Acute seizures should be terminated with the administration of intravenous benzodiazepines or intravenous fosphenytoin. Anticonvulsant prophylaxis must be initiated immediately and continued for at least one year due to the high risk of seizures associated with cerebral abscesses. Easier detection of underlying conditions, monitoring of therapeutic progress, and recognition of complications have probably contributed to the improved prognosis.
Identifying critical transitions and their leading biomolecular networks in complex diseases.
Liu, Rui; Li, Meiyi; Liu, Zhi-Ping; Wu, Jiarui; Chen, Luonan; Aihara, Kazuyuki
2012-01-01
Identifying a critical transition and its leading biomolecular network during the initiation and progression of a complex disease is a challenging task, but holds the key to early diagnosis and further elucidation of the essential mechanisms of disease deterioration at the network level. In this study, we developed a novel computational method for identifying early-warning signals of the critical transition and its leading network during a disease progression, based on high-throughput data using a small number of samples. The leading network makes the first move from the normal state toward the disease state during a transition, and thus is causally related with disease-driving genes or networks. Specifically, we first define a state-transition-based local network entropy (SNE), and prove that SNE can serve as a general early-warning indicator of any imminent transitions, regardless of specific differences among systems. The effectiveness of this method was validated by functional analysis and experimental data.
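For intuition only, the sketch below computes a related dynamical-network-biomarker-style composite score (not the SNE defined in this work): within a candidate leading module, fluctuations and internal correlations rise while correlation with the rest of the network falls as a critical transition approaches.

import numpy as np

def early_warning_index(expr, module_idx):
    # expr: genes x samples expression matrix for one stage; module_idx: candidate leading-network genes.
    module = expr[module_idx]
    rest = np.delete(expr, module_idx, axis=0)
    sd_in = module.std(axis=1).mean()                # average fluctuation inside the module
    corr_in = np.abs(np.corrcoef(module)).mean()     # average |correlation| inside the module (diagonal included)
    cross = np.abs(np.corrcoef(module, rest)[:len(module_idx), len(module_idx):]).mean()
    return sd_in * corr_in / (cross + 1e-9)          # rises when the module fluctuates and decouples

# Toy usage with random data; real input would be stage-wise high-throughput samples.
rng = np.random.default_rng(0)
expr = rng.normal(size=(50, 12))
print(early_warning_index(expr, [0, 1, 2, 3, 4]))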
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
Data issues in the life sciences.
Thessen, Anne E; Patterson, David J
2011-01-01
We review technical and sociological issues facing the Life Sciences as they transform into more data-centric disciplines - the "Big New Biology". Three major challenges are: 1) lack of comprehensive standards; 2) lack of incentives for individual scientists to share data; 3) lack of appropriate infrastructure and support. Technological advances with standards, bandwidth, distributed computing, exemplar successes, and a strong presence in the emerging world of Linked Open Data are sufficient to conclude that technical issues will be overcome in the foreseeable future. While the community is motivated to have a shared open infrastructure and data pool, and is pressured by funding agencies to move in this direction, it is the sociological issues that determine progress. Major sociological issues include our lack of understanding of the heterogeneous data cultures within the Life Sciences, and the impediments to progress include a lack of incentives to build appropriate infrastructures into projects and institutions or to encourage scientists to make data openly available.
Data issues in the life sciences
Thessen, Anne E.; Patterson, David J.
2011-01-01
We review technical and sociological issues facing the Life Sciences as they transform into more data-centric disciplines - the “Big New Biology”. Three major challenges are: 1) lack of comprehensive standards; 2) lack of incentives for individual scientists to share data; 3) lack of appropriate infrastructure and support. Technological advances with standards, bandwidth, distributed computing, exemplar successes, and a strong presence in the emerging world of Linked Open Data are sufficient to conclude that technical issues will be overcome in the foreseeable future. While the community is motivated to have a shared open infrastructure and data pool, and is pressured by funding agencies to move in this direction, it is the sociological issues that determine progress. Major sociological issues include our lack of understanding of the heterogeneous data cultures within the Life Sciences, and the impediments to progress include a lack of incentives to build appropriate infrastructures into projects and institutions or to encourage scientists to make data openly available. PMID:22207805
Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...
2017-03-28
The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.
Dickler, Howard B; McCoy, J Philip; Nussenblatt, Robert; Perl, Shira; Schwartzberg, Pamela A; Tsang, John S; Wang, Ena; Young, Neil S
2013-05-01
The Center for Human Immunology, Autoimmunity, and Inflammation (CHI) is an exciting initiative of the NIH intramural program begun in 2009. It is uniquely trans-NIH in support (multiple institutes) and leadership (senior scientists from several institutes who donate their time). Its goal is an in-depth assessment of the human immune system using high-throughput multiplex technologies for examination of immune cells and their products, the genome, gene expression, and epigenetic modulation obtained from individuals both before and after interventions, adding information from in-depth clinical phenotyping, and then applying advanced biostatistical and computer modeling methods for mining these diverse data. The aim is to develop a comprehensive picture of the human "immunome" in health and disease, elucidate common pathogenic pathways in various diseases, identify and validate biomarkers that predict disease progression and responses to new interventions, and identify potential targets for new therapeutic modalities. Challenges, opportunities, and progress are detailed. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
Imaging inflammatory acne: lesion detection and tracking
NASA Astrophysics Data System (ADS)
Cula, Gabriela O.; Bargo, Paulo R.; Kollias, Nikiforos
2010-02-01
It is known that the effectiveness of acne treatment increases when lesions are detected earlier, before they progress into mature wound-like lesions, which lead to scarring and discoloration. However, little is known about the evolution of acne from its early signs until after the lesion heals. In this work we computationally characterize the evolution of inflammatory acne lesions, based on analyzing cross-polarized images that document acne-prone facial skin over time. Taking skin images over time, and being able to follow skin features in these images, present serious challenges due to changes in the appearance of skin, difficulty in repositioning the subject, and involuntary movement such as breathing. A computational technique for automatic detection of lesions by separating the background of normal skin from the acne lesions, based on fitting Gaussian distributions to the intensity histograms, is presented. In order to track and quantify the evolution of lesions, in terms of the degree of progress or regress, we designed a study to capture facial skin images of an acne-prone young individual followed over three time points. Based on the behavior of the lesions between two consecutive time points, the automatically detected lesions are classified into four categories: new lesions, resolved lesions (i.e. lesions that disappear completely), lesions that are progressing, and lesions that are regressing (i.e. lesions in the process of healing). The classification our methods achieve correlates well with visual inspection by a trained human grader.
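A minimal sketch of the histogram-separation step (assuming a two-component Gaussian fit and that lesions map to the darker component of the cross-polarized channel; the published method's exact formulation may differ):

import numpy as np
from sklearn.mixture import GaussianMixture

def lesion_mask(cross_polarized, random_state=0):
    # Fit two Gaussians to the pixel-intensity histogram and label pixels from the
    # darker component as lesion candidates (assumption: lesions are darker than skin).
    pixels = cross_polarized.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=2, random_state=random_state).fit(pixels)
    lesion_component = int(np.argmin(gmm.means_.ravel()))
    labels = gmm.predict(pixels)
    return (labels == lesion_component).reshape(cross_polarized.shape)

# Toy usage: mostly bright "skin" with a dark patch standing in for a lesion.
img = np.full((64, 64), 200.0) + np.random.default_rng(1).normal(0, 5, (64, 64))
img[20:28, 30:38] = 120.0
print(lesion_mask(img).sum(), "pixels flagged as lesion")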
Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah
2012-01-01
Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of child-rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers' occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data were analyzed using exploratory factor analysis, Cronbach's alpha coefficient and path analysis (in LISREL). We found self-control to have a linking role in the relationship between four parenting styles and educational progress. Mothers' occupation status was directly and significantly correlated with addiction to computer games. Although four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between four parenting styles and educational progress. In agreement with previous studies, the current research reflected the impact of four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, the authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use an authoritative parenting style to support both the self-management and the psychological health of their children. Employed mothers are also recommended to exercise more supervision and control over the amount and type of computer games their children select.
Abedini, Yasamin; Zamani, Bibi Eshrat; Kheradmand, Ali; Rajabizadeh, Ghodratollah
2012-01-01
Background Addiction to computer (video) games in adolescents and its relationship with educational progress has recently attracted the attention of child-rearing and education experts as well as organizations and institutes involved in physical and mental health. The current research attempted to propose a structural model of the relationships between parenting styles, mothers’ occupation status, and addiction to computer games, self-control, and educational progress of secondary school students. Methods Using multistage cluster random sampling, 500 female and male secondary school students in Kerman (Iran) were selected and studied. The research tools included self-control, parenting styles, and addiction to computer games questionnaires and a self-made questionnaire containing demographic details. The data were analyzed using exploratory factor analysis, Cronbach’s alpha coefficient and path analysis (in LISREL). Findings We found self-control to have a linking role in the relationship between four parenting styles and educational progress. Mothers’ occupation status was directly and significantly correlated with addiction to computer games. Although four parenting styles directly and significantly affected addiction to computer games, the findings did not support the linking role of addiction to computer games in the relationship between four parenting styles and educational progress. Conclusion In agreement with previous studies, the current research reflected the impact of four parenting styles on self-control, addiction to computer games, and educational progress of students. Among the parenting styles, the authoritative style can affect the severity of addiction to computer games through self-control development. It can thus indirectly influence the educational progress of students. Parents are recommended to use an authoritative parenting style to support both the self-management and the psychological health of their children. Employed mothers are also recommended to exercise more supervision and control over the amount and type of computer games their children select. PMID:24494143
The Statistical Mechanics of Ideal MHD Turbulence
NASA Technical Reports Server (NTRS)
Shebalin, John V.
2003-01-01
Turbulence is a universal, nonlinear phenomenon found in all energetic fluid and plasma motion. In particular, understanding magnetohydrodynamic (MHD) turbulence and incorporating its effects in the computation and prediction of the flow of ionized gases in space, for example, are great challenges that must be met if such computations and predictions are to be meaningful. Although a general solution to the "problem of turbulence" does not exist in closed form, numerical integrations allow us to explore the phase space of solutions for both ideal and dissipative flows. For homogeneous, incompressible turbulence, Fourier methods are appropriate, and phase space is defined by the Fourier coefficients of the physical fields. In the case of ideal MHD flows, a fairly robust statistical mechanics has been developed, in which the symmetry and ergodic properties of phase space are understood. A discussion of these properties will illuminate our principal discovery: coherent structure and randomness co-exist in ideal MHD turbulence. For dissipative flows, as opposed to ideal flows, progress beyond the dimensional analysis of Kolmogorov has been difficult. Here, some possible future directions that draw on the ideal results will also be discussed. Our conclusion will be that while ideal turbulence is now well understood, real turbulence still presents great challenges.
Role of Laboratory Plasma Experiments in exploring the Physics of Solar Eruptions
NASA Astrophysics Data System (ADS)
Tripathi, S.
2017-12-01
Solar eruptive events are triggered over a broad range of spatio-temporal scales by a variety of fundamental processes (e.g., force-imbalance, magnetic-reconnection, electrical-current driven instabilities) associated with arched magnetoplasma structures in the solar atmosphere. Contemporary research on solar eruptive events is at the forefront of solar and heliospheric physics due to its relevance to space weather. Details on the formation of magnetized plasma structures on the Sun, storage of magnetic energy in such structures over a long period (several Alfven transit times), and their impulsive eruptions have been recorded in numerous observations and simulated in computer models. Inherent limitations of space observations and the uncontrolled nature of solar eruptions pose significant challenges in testing theoretical models and developing predictive capability for space weather. The pace of scientific progress in this area can be significantly boosted by tapping the potential of appropriately scaled laboratory plasma experiments to complement solar observations, theoretical models, and computer simulations. To give an example, recent results from a laboratory plasma experiment on arched magnetic flux ropes will be presented and future challenges will be discussed. (Work supported by National Science Foundation, USA under award number 1619551)
Albanese, Chris; Rodriguez, Olga C; VanMeter, John; Fricke, Stanley T; Rood, Brian R; Lee, YiChien; Wang, Sean S; Madhavan, Subha; Gusev, Yuriy; Petricoin, Emanuel F; Wang, Yue
2013-02-01
Biologically accurate mouse models of human cancer have become important tools for the study of human disease. The anatomical location of various target organs, such as brain, pancreas, and prostate, makes determination of disease status difficult. Imaging modalities, such as magnetic resonance imaging, can greatly enhance diagnosis, and longitudinal imaging of tumor progression is an important source of experimental data. Even in models where the tumors arise in areas that permit visual determination of tumorigenesis, longitudinal anatomical and functional imaging can enhance the scope of studies by facilitating the assessment of biological alterations (such as changes in angiogenesis, metabolism, and cellular invasion) as well as tissue perfusion and diffusion. One of the challenges in preclinical imaging is the development of infrastructural platforms required for integrating in vivo imaging and therapeutic response data with ex vivo pathological and molecular data using a more systems-based multiscale modeling approach. Further challenges exist in integrating these data for computational modeling to better understand the pathobiology of cancer and to better effect its cure. We review the current applications of preclinical imaging and discuss the implications of applying functional imaging to visualize cancer progression and treatment. Finally, we provide new data from an ongoing preclinical drug study demonstrating how multiscale modeling can lead to a more comprehensive understanding of cancer biology and therapy. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Buongiorno Nardelli, Marco
High-throughput quantum-mechanics computation of materials properties by ab initio methods has become the foundation of an effective approach to materials design, discovery and characterization. This data-driven approach to materials science currently presents the most promising path to the development of advanced technological materials that could solve or mitigate important social and economic challenges of the 21st century. In particular, the rapid proliferation of computational data on materials properties presents the possibility to complement and extend materials property databases where experimental data are lacking and difficult to obtain. Enhanced repositories such as AFLOWLIB open novel opportunities for structure discovery and optimization, including the uncovering of unsuspected compounds, metastable structures and correlations between various properties. The practical realization of these opportunities depends almost exclusively on the design of efficient algorithms for electronic structure simulations of realistic material systems beyond the limitations of the current standard theories. In this talk, I will review recent progress in theoretical and computational tools and, in particular, discuss the development and validation of novel functionals within Density Functional Theory and of local basis representations for effective ab initio tight-binding schemes. Marco Buongiorno Nardelli is a pioneer in the development of computational platforms for theory/data/applications integration, rooted in his profound and extensive expertise in the design of electronic structure codes and in his vision for sustainable and innovative software development for high-performance materials simulations. His research activities range from the design and discovery of novel materials for 21st-century applications in renewable energy, environment, nano-electronics and devices, the development of advanced electronic structure theories and high-throughput techniques in materials genomics and computational materials design, to an active role as a community scientific software developer (QUANTUM ESPRESSO, WanT, AFLOWpi).
Workplace Charging Challenge Progress Update 2016: A New Sustainable Commute
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
In the 2016 Workplace Charging Challenge annual survey, partners shared how their efforts were making an impact in their communities and helped identify best practices for workplace charging. The Workplace Charging Challenge Progress Update highlights the findings from this survey and recognizes leading employers for their workplace charging efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
In June 2016, the Workplace Charging Challenge distributed its third annual survey to 295 partners with the goal of tracking partners' progress and identifying trends in workplace charging. This document summarizes findings from the survey and highlights accomplishments of the EV Everywhere Workplace Charging Challenge.
2017-01-01
Computational modeling has been applied to simulate the heterogeneity of cancer behavior. The development of Cervical Cancer (CC) is a process in which the cell acquires dynamic behavior from non-deleterious and deleterious mutations, exhibiting chromosomal alterations as a manifestation of this dynamic. To further determine the progression of chromosomal alterations in precursor lesions and CC, we introduce a computational model to study the dynamics of deleterious and non-deleterious mutations as an outcome of tumor progression. The analysis of chromosomal alterations mediated by our model reveals that multiple deleterious mutations are more frequent in precursor lesions than in CC. Cells with lethal deleterious mutations would be eliminated, which would mitigate cancer progression; on the other hand, cells with non-deleterious mutations would become dominant, which could predispose them to cancer progression. The study of somatic alterations through computer simulations of cancer progression provides a feasible pathway for insights into the transformation of cell mechanisms in humans. During cancer progression, tumors may acquire new phenotype traits, such as the ability to invade and metastasize or to become clinically important when they develop drug resistance. Non-deleterious chromosomal alterations contribute to this progression. PMID:28723940
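The following toy simulation conveys the qualitative dynamics described above (rates, population size and the bookkeeping are invented, not taken from the model in the paper): cells that take a lethal deleterious hit are eliminated, while surviving lineages accumulate non-deleterious mutations and come to dominate.

import random

random.seed(42)

# Each cell is a (deleterious, non_deleterious) mutation count.
cells = [(0, 0)] * 100
P_DEL, P_NON, P_LETHAL = 0.02, 0.05, 0.3   # invented per-division probabilities

for generation in range(50):
    next_gen = []
    for d, n in cells:
        for _ in range(2):                      # each cell attempts two daughter cells
            dd = d + (random.random() < P_DEL)
            nn = n + (random.random() < P_NON)
            if dd > d and random.random() < P_LETHAL:
                continue                        # lethal deleterious hit: daughter is eliminated
            next_gen.append((dd, nn))
    if not next_gen:
        break
    cells = random.sample(next_gen, min(100, len(next_gen)))   # keep the population bounded

del_load = sum(d for d, _ in cells) / len(cells)
non_load = sum(n for _, n in cells) / len(cells)
print(f"mean deleterious load {del_load:.2f}, mean non-deleterious load {non_load:.2f}")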
ceRNAs in plants: computational approaches and associated challenges for target mimic research.
Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F
2017-05-30
The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery as well as important computational and experimental advances have been made in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for the study of miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented, with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analysis of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or from different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach (MIM166, MIM171 and MIM159/319), the last has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges that will have to be faced in future ceRNA studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Single-molecule techniques in biophysics: a review of the progress in methods and applications.
Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J M; Leake, Mark C
2018-02-01
Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple lengths from the nanoscale of single molecules to up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale kBT, where kB is the Boltzmann constant and T the absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in 'force spectroscopy' techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. Increasingly, combinatorial techniques are now used, including correlative atomic force microscopy and fluorescence imaging, to probe questions closer to native physiological behaviour. We identify the trade-offs, limitations and applications of these techniques, and discuss exciting new directions.
Single-molecule techniques in biophysics: a review of the progress in methods and applications
NASA Astrophysics Data System (ADS)
Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J. M.; Leake, Mark C.
2018-02-01
Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple lengths from the nanoscale of single molecules to up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale kBT, where kB is the Boltzmann constant and T the absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in ‘force spectroscopy’ techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. Increasingly, combinatorial techniques are now used, including correlative atomic force microscopy and fluorescence imaging, to probe questions closer to native physiological behaviour. We identify the trade-offs, limitations and applications of these techniques, and discuss exciting new directions.
Damage progression in Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
1996-01-01
A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in composites testing. Iosipescu shear testing using the V-notched beam specimen is a convenient method to measure both shear strength and shear stiffness simultaneously. The evaluation of composite test response can be made more productive and informative via computational simulation of progressive damage and fracture. Computational simulation performs a complete evaluation of laminated composite fracture via assessment of ply and subply level damage/fracture processes.
Reference Structures: Stagnation, Progress, and Future Challenges.
ERIC Educational Resources Information Center
Greenberg, Jane
1997-01-01
Assesses the current state of reference structures in online public access catalogs (OPACs) in a framework defined by stagnation, progress, and future challenges. Outlines six areas for reference structure development. Twenty figures provide illustrations. (AEF)
Outlook and Challenges for Hydrogen Storage in Nanoporous Materials
Broom, D. P.; Webb, C. J.; Hurst, Katherine E.; ...
2016-02-16
Considerable progress has been made recently in the use of nanoporous materials for hydrogen storage. In our article, the current status of the field and future challenges are discussed, ranging from important open fundamental questions, such as the density and volume of the adsorbed phase and its relationship to overall storage capacity, to the development of new functional materials and complete storage system design. With regard to fundamentals, the use of neutron scattering to study adsorbed H2, suitable adsorption isotherm equations, and the accurate computational modelling and simulation of H2 adsorption are discussed. We cover new materials and they include flexible metal–organic frameworks, core–shell materials, and porous organic cage compounds. The article concludes with a discussion of the experimental investigation of real adsorptive hydrogen storage tanks, the improvement in the thermal conductivity of storage beds, and new storage system concepts and designs.
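As a small, self-contained illustration of the isotherm-equation question (invented data, and a simple Langmuir form rather than any particular equation advocated in the article; real supercritical excess isotherms generally need modified expressions):

import numpy as np
from scipy.optimize import curve_fit

def langmuir(p, n_max, b):
    # Simple Langmuir isotherm: absolute uptake saturating at n_max.
    return n_max * b * p / (1.0 + b * p)

# Invented H2 uptake data for a nanoporous sorbent: pressure [bar] vs. uptake [wt%].
p = np.array([1, 5, 10, 20, 40, 60, 80, 100], dtype=float)
uptake = np.array([0.6, 2.0, 3.1, 4.2, 5.0, 5.3, 5.5, 5.6])

(n_max, b), _ = curve_fit(langmuir, p, uptake, p0=[6.0, 0.1])
print(f"fitted n_max = {n_max:.2f} wt%, b = {b:.3f} 1/bar")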
Stern, Andrew M.; Schurdak, Mark E.; Bahar, Ivet; Berg, Jeremy M.; Taylor, D. Lansing
2016-01-01
Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)–driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. PMID:26962875
Downie, H F; Adu, M O; Schmidt, S; Otten, W; Dupuy, L X; White, P J; Valentine, T A
2015-07-01
The morphology of roots and root systems influences the efficiency with which plants acquire nutrients and water, anchor themselves and provide stability to the surrounding soil. Plant genotype and the biotic and abiotic environment significantly influence root morphology, growth and ultimately crop yield. The challenge for researchers interested in phenotyping root systems is, therefore, not just to measure roots and link their phenotype to the plant genotype, but also to understand how the growth of roots is influenced by their environment. This review discusses progress in quantifying root system parameters (e.g. in terms of size, shape and dynamics) using imaging and image analysis technologies and also discusses their potential for providing a better understanding of root:soil interactions. Significant progress has been made in image acquisition techniques; however, trade-offs exist between sample throughput, sample size, image resolution and information gained. All of these factors impact on downstream image analysis processes. While there have been significant advances in computational power, limitations still exist in the statistical processes involved in image analysis. Utilizing and combining different imaging systems, integrating measurements and image analysis where possible, and amalgamating data will allow researchers to gain a better understanding of root:soil interactions. © 2014 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kippen, Karen Elizabeth; Bourke, Mark Andrew M.
Our challenge derives from the fact that in metals or explosives, grains, interfaces and defects control engineering performance in ways that are neither amenable to continuum codes (which fail to rigorously describe the heterogeneities derived from microstructure) nor computationally tractable to first-principles atomistic calculations. This is a region called the mesoscale, which stands at the frontier of our desire to translate fundamental science insights into confidence in aging system performance over the range of extreme conditions relevant in a nuclear weapon. For dynamic problems, the phenomena of interest can require extremely good temporal resolution. A shock wave traveling at 1000 m/s (or 1 mm/μs) passes through a grain with a diameter of 1 micron in a nanosecond (10^-9 sec). Thus, to observe the mesoscale phenomena—such as dislocations or phase transformations—as the shock passes, temporal resolution better than picoseconds (10^-12 sec) may be needed. As we anticipate the science challenges over the next decade, experimental insights on material performance at the micron spatial scale with picosecond temporal resolution—at the mesoscale—are a clear challenge. This is a challenge fit for Los Alamos in partnership with our sister labs and academia. Mesoscale Connections will draw attention to our progress as we tackle the mesoscale challenge. We hope you like it and encourage suggestions of content you are interested in.
Crashworthiness simulations with DYNA3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schauer, D.A.; Hoover, C.G.; Kay, G.J.
1996-04-01
Current progress in parallel algorithm research and applications in vehicle crash simulation is described for the explicit finite element algorithms in DYNA3D. Problem partitioning methods and parallel algorithms for contact at material interfaces are the two challenging algorithm research problems that are addressed. Two prototype parallel contact algorithms have been developed for treating the cases of local and arbitrary contact. Demonstration problems for local contact are crashworthiness simulations with 222 locally defined contact surfaces and a vehicle/barrier collision modeled with arbitrary contact. A simulation of crash tests conducted for a vehicle impacting a U-channel small sign post embedded in soil has been run on both the serial and parallel versions of DYNA3D. A significant reduction in computational time has been observed when running these problems on the parallel version. However, to achieve maximum efficiency, complex problems must be appropriately partitioned, especially when contact dominates the computation.
Patel, Shyamal; McGinnis, Ryan S; Silva, Ikaro; DiCristofaro, Steve; Mahadevan, Nikhil; Jortberg, Elise; Franco, Jaime; Martin, Albert; Lust, Joseph; Raj, Milan; McGrane, Bryan; DePetrillo, Paolo; Aranyosi, A J; Ceruolo, Melissa; Pindado, Jesus; Ghaffari, Roozbeh
2016-08-01
Wearable sensors have the potential to enable clinical-grade ambulatory health monitoring outside the clinic. Technological advances have enabled development of devices that can measure vital signs with great precision and significant progress has been made towards extracting clinically meaningful information from these devices in research studies. However, translating measurement accuracies achieved in the controlled settings such as the lab and clinic to unconstrained environments such as the home remains a challenge. In this paper, we present a novel wearable computing platform for unobtrusive collection of labeled datasets and a new paradigm for continuous development, deployment and evaluation of machine learning models to ensure robust model performance as we transition from the lab to home. Using this system, we train activity classification models across two studies and track changes in model performance as we go from constrained to unconstrained settings.
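The abstract does not include the authors' training pipeline; the following is a minimal illustrative sketch (synthetic data, hypothetical feature and label names) of the general pattern it describes: fit an activity classifier on labeled data from a constrained lab setting, then measure how its accuracy changes on data from an unconstrained home setting.

```python
# Illustrative sketch only (not the authors' pipeline): quantify the lab-to-home
# performance gap of an activity classification model trained on lab data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical feature matrices: rows are windows of wearable-sensor data,
# columns are summary features (e.g., per-axis accelerometer mean/variance).
X_lab = rng.normal(size=(500, 12))
y_lab = rng.integers(0, 4, size=500)            # four hypothetical activity classes
X_home = rng.normal(loc=0.3, size=(300, 12))    # shifted distribution mimics the home setting
y_home = rng.integers(0, 4, size=300)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_lab, y_lab)

# The gap between these two numbers is the degradation the paper aims to manage.
print("lab accuracy (in-domain):     ", accuracy_score(y_lab, clf.predict(X_lab)))
print("home accuracy (out-of-domain):", accuracy_score(y_home, clf.predict(X_home)))
```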
The computational future for climate and Earth system models: on the path to petaflop and beyond.
Washington, Warren M; Buja, Lawrence; Craig, Anthony
2009-03-13
The development of the climate and Earth system models has had a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. The early researchers were much aware of the long-term goal of building the Earth system models that would go beyond what is usually included in the climate models by adding interactive biogeochemical interactions. In the early days, the progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been much improved knowledge, better observations for validation and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.
Bassett, Danielle S; Sporns, Olaf
2017-01-01
Despite substantial recent progress, our understanding of the principles and mechanisms underlying complex brain function and cognition remains incomplete. Network neuroscience proposes to tackle these enduring challenges. Approaching brain structure and function from an explicitly integrative perspective, network neuroscience pursues new ways to map, record, analyze and model the elements and interactions of neurobiological systems. Two parallel trends drive the approach: the availability of new empirical tools to create comprehensive maps and record dynamic patterns among molecules, neurons, brain areas and social systems; and the theoretical framework and computational tools of modern network science. The convergence of empirical and computational advances opens new frontiers of scientific inquiry, including network dynamics, manipulation and control of brain networks, and integration of network processes across spatiotemporal domains. We review emerging trends in network neuroscience and attempt to chart a path toward a better understanding of the brain as a multiscale networked system. PMID:28230844
A Hybrid Human-Computer Approach to the Extraction of Scientific Facts from the Literature.
Tchoua, Roselyne B; Chard, Kyle; Audus, Debra; Qin, Jian; de Pablo, Juan; Foster, Ian
2016-01-01
A wealth of valuable data is locked within the millions of research articles published each year. Reading and extracting pertinent information from those articles has become an unmanageable task for scientists. This problem hinders scientific progress by making it hard to build on results buried in literature. Moreover, these data are loosely structured, encoded in manuscripts of various formats, embedded in different content types, and are, in general, not machine accessible. We present a hybrid human-computer solution for semi-automatically extracting scientific facts from literature. This solution combines an automated discovery, download, and extraction phase with a semi-expert crowd assembled from students to extract specific scientific facts. To evaluate our approach we apply it to a challenging molecular engineering scenario, extraction of a polymer property: the Flory-Huggins interaction parameter. We demonstrate useful contributions to a comprehensive database of polymer properties.
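The paper's extraction pipeline is not reproduced in the abstract; the sketch below only illustrates the automated filtering step of a hybrid human-computer approach, flagging candidate sentences that mention the Flory-Huggins interaction parameter so they can be routed to a semi-expert crowd for verification. The regular expression and example sentences are hypothetical.

```python
# Illustrative sketch: machine code flags candidate sentences that appear to
# report a Flory-Huggins interaction parameter (often written "chi"); the
# flagged snippets would then be verified by human annotators.
import re

CANDIDATE = re.compile(
    r"(flory[\s-]?huggins|interaction parameter|\bchi\b)[^.]*?(-?\d+\.\d+)",
    re.IGNORECASE,
)

sentences = [  # hypothetical text snippets, not taken from a real article
    "The Flory-Huggins interaction parameter chi was estimated as 0.45 at 300 K.",
    "Samples were annealed for 24 h before scattering measurements.",
]

for s in sentences:
    m = CANDIDATE.search(s)
    if m:
        print(f"candidate value {m.group(2)} -> send to crowd for review: {s}")
```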
Security policies and trust in ubiquitous computing.
Joshi, Anupam; Finin, Tim; Kagal, Lalana; Parker, Jim; Patwardhan, Anand
2008-10-28
Ubiquitous environments comprise resource-constrained mobile and wearable devices and computational elements embedded in everyday artefacts. These are connected to each other using both infrastructure-based as well as short-range ad hoc networks. Limited Internet connectivity limits the use of conventional security mechanisms such as public key infrastructures and other forms of server-centric authentication. Under these circumstances, peer-to-peer interactions are well suited for not just information interchange, but also managing security and privacy. However, practical solutions for protecting mobile devices, preserving privacy, evaluating trust and determining the reliability and accuracy of peer-provided data in such interactions are still in their infancy. Our research is directed towards providing stronger assurances of the reliability and trustworthiness of information and services, and the use of declarative policy-driven approaches to handle the open and dynamic nature of such systems. This paper provides an overview of some of the challenges and issues, and points out directions for progress.
Zhang, Zhuhua; Liu, Xiaofei; Yu, Jin; Hang, Yang; Li, Yao; Guo, Yufeng; Xu, Ying; Sun, Xu; Zhou, Jianxin; Guo, Wanlin
2016-01-01
Low-dimensional materials exhibit many exceptional properties and functionalities which can be efficiently tuned by externally applied force or fields. Here we review the current status of research on tuning the electronic and magnetic properties of low-dimensional carbon, boron nitride, metal-dichalcogenide, and phosphorene nanomaterials by applied engineering strain, external electric field and interaction with substrates, etc., with particular focus on the progress of computational methods and studies. We highlight the similarities and differences of the property modulation among one- and two-dimensional nanomaterials. Recent breakthroughs in experimental demonstration of the tunable functionalities in typical nanostructures are also presented. Finally, prospects and challenges for applying the tunable properties in functional devices are discussed. WIREs Comput Mol Sci 2016, 6:324-350. doi: 10.1002/wcms.1251.
Pharmacoinformatic approaches to understand complexation of dendrimeric nanoparticles with drugs
NASA Astrophysics Data System (ADS)
Jain, Vaibhav; Bharatam, Prasad V.
2014-02-01
Nanoparticle based drug delivery systems are gaining popularity due to their wide spectrum advantages over traditional drug delivery systems; among them, dendrimeric nano-vectors are the most widely explored carriers for pharmaceutical and biomedical applications. The precise mechanism of encapsulation of drug molecules inside the dendritic matrix, delivery of drugs into specific cells, interactions of nano-formulation with biological targets and proteins, etc. present a substantial challenge to the scientific understanding of the subject. Computational methods complement experimental techniques in the design and optimization of drug delivery systems, thus minimizing the investment in drug design and development. Significant progress in computer simulations could facilitate an understanding of the precise mechanism of encapsulation of bioactive molecules and their delivery. This review summarizes the pharmacoinformatic studies spanning from quantum chemical calculations to coarse-grained simulations, aimed at providing better insight into dendrimer-drug interactions and the physicochemical parameters influencing the binding and release mechanism of drugs.
Wilson, Wouter; Isaksson, Hanna; Jurvelin, Jukka S.; Herzog, Walter; Korhonen, Rami K.
2013-01-01
The function of articular cartilage depends on its structure and composition, sensitively impaired in disease (e.g. osteoarthritis, OA). Responses of chondrocytes to tissue loading are modulated by the structure. Altered cell responses as an effect of OA may regulate cartilage mechanotransduction and cell biosynthesis. To be able to evaluate cell responses and factors affecting the onset and progression of OA, local tissue and cell stresses and strains in cartilage need to be characterized. This is extremely challenging with the presently available experimental techniques and therefore computational modeling is required. Modern models of articular cartilage are inhomogeneous and anisotropic, and they include many aspects of the real tissue structure and composition. In this paper, we provide an overview of the computational applications that have been developed for modeling the mechanics of articular cartilage at the tissue and cellular level. We concentrate on the use of fibril-reinforced models of cartilage. Furthermore, we introduce practical considerations for modeling applications, including also experimental tests that can be combined with the modeling approach. At the end, we discuss the prospects for patient-specific models when aiming to use finite element modeling analysis and evaluation of articular cartilage function, cellular responses, failure points, OA progression, and rehabilitation. PMID:23653665
Digital information management: a progress report on the National Digital Mammography Archive
NASA Astrophysics Data System (ADS)
Beckerman, Barbara G.; Schnall, Mitchell D.
2002-05-01
Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians in terms of accessible data with which to make a diagnosis and to researchers performing studies on breast cancer. Mammography was chosen for the project, because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, front-end portal and archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites, Universities of Pennsylvania, Chicago, North Carolina and Toronto, and BWXT Y12, are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.
Performance Portability Strategies for Grid C++ Expression Templates
NASA Astrophysics Data System (ADS)
Boyle, Peter A.; Clark, M. A.; DeTar, Carleton; Lin, Meifeng; Rana, Verinder; Vaquero Avilés-Casco, Alejandro
2018-03-01
One of the key requirements for the Lattice QCD Application Development as part of the US Exascale Computing Project is performance portability across multiple architectures. Using the Grid C++ expression template as a starting point, we report on the progress made with regards to the Grid GPU offloading strategies. We present both the successes and issues encountered in using CUDA, OpenACC and Just-In-Time compilation. Experimentation and performance on GPUs with a SU(3)×SU(3) streaming test will be reported. We will also report on the challenges of using current OpenMP 4.x for GPU offloading in the same code.
A generative, probabilistic model of local protein structure.
Boomsma, Wouter; Mardia, Kanti V; Taylor, Charles C; Ferkinghoff-Borg, Jesper; Krogh, Anders; Hamelryck, Thomas
2008-07-01
Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state. Our method represents a significant theoretical and practical improvement over the widely used fragment assembly technique by avoiding the drawbacks associated with a discrete and nonprobabilistic approach.
Nutrition and the science of disease prevention: a systems approach to support metabolic health
Bennett, Brian J.; Hall, Kevin D.; Hu, Frank B.; McCartney, Anne L.; Roberto, Christina
2017-01-01
Progress in nutritional science, genetics, computer science, and behavioral economics can be leveraged to address the challenge of noncommunicable disease. This report highlights the connection between nutrition and the complex science of preventing disease and discusses the promotion of optimal metabolic health, building on input from several complementary disciplines. The discussion focuses on (1) the basic science of optimal metabolic health, including data from gene–diet interactions, microbiome, and epidemiological research in nutrition, with the goal of defining better targets and interventions, and (2) how nutrition, from pharma to lifestyle, can build on systems science to address complex issues. PMID:26415028
Direct numerical simulation of reactor two-phase flows enabled by high-performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Jun; Cambareri, Joseph J.; Brown, Cameron S.
Nuclear reactor two-phase flows remain a great engineering challenge, where the high-resolution two-phase flow database which can inform practical model development is still sparse due to the extreme reactor operation conditions and measurement difficulties. Owing to the rapid growth of computing power, the direct numerical simulation (DNS) is enjoying a renewed interest in investigating the related flow problems. A combination between DNS and an interface tracking method can provide a unique opportunity to study two-phase flows based on first principles calculations. More importantly, state-of-the-art high-performance computing (HPC) facilities are helping unlock this great potential. This paper reviews the recent research progress of two-phase flow DNS related to reactor applications. The progress in large-scale bubbly flow DNS has been focused not only on the sheer size of those simulations in terms of resolved Reynolds number, but also on the associated advanced modeling and analysis techniques. Specifically, the current areas of active research include modeling of sub-cooled boiling, bubble coalescence, as well as the advanced post-processing toolkit for bubbly flow simulations in reactor geometries. A novel bubble tracking method has been developed to track the evolution of bubbles in two-phase bubbly flow. Also, spectral analysis of DNS database in different geometries has been performed to investigate the modulation of the energy spectrum slope due to bubble-induced turbulence. In addition, the single- and two-phase analysis results are presented for turbulent flows within the pressurized water reactor (PWR) core geometries. The related simulations are possible to carry out only with the world leading HPC platforms. These simulations are allowing more complex turbulence model development and validation for use in 3D multiphase computational fluid dynamics (M-CFD) codes.
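As a rough illustration of the spectral analysis mentioned above (not the authors' post-processing toolkit), the sketch below estimates a one-dimensional energy spectrum of a synthetic velocity signal and fits its log-log slope, the quantity whose modulation by bubble-induced turbulence is of interest. The signal, domain, and fitting range are all hypothetical.

```python
# Illustrative sketch: 1-D energy spectrum of a synthetic velocity signal and a
# log-log slope fit over an inertial-like range of wavenumbers.
import numpy as np

rng = np.random.default_rng(1)
n, L = 4096, 2.0 * np.pi                       # samples and domain length (synthetic)
x = np.linspace(0.0, L, n, endpoint=False)
u = sum(k ** (-5.0 / 6.0) * np.cos(k * x + rng.uniform(0, 2 * np.pi))
        for k in range(1, 200))                # amplitudes chosen to give ~k^-5/3 energy

u_hat = np.fft.rfft(u - u.mean())
k = np.fft.rfftfreq(n, d=L / n) * 2.0 * np.pi  # angular wavenumbers
E = (np.abs(u_hat) ** 2) / n                   # un-normalized 1-D energy spectrum

mask = (k > 5) & (k < 100)                     # fit only an inertial-like range
slope, _ = np.polyfit(np.log(k[mask]), np.log(E[mask]), 1)
print(f"fitted spectral slope: {slope:.2f}  (about -5/3 for this synthetic signal)")
```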
Marino, Simeone; Gideon, Hannah P.; Gong, Chang; ...
2016-04-11
Identifying biomarkers for tuberculosis (TB) is an ongoing challenge in developing immunological correlates of infection outcome and protection. Biomarker discovery is also necessary for aiding design and testing of new treatments and vaccines. To effectively predict biomarkers for infection progression in any disease, including TB, large amounts of experimental data are required to reach statistical power and make accurate predictions. We took a two-pronged approach using both experimental and computational modeling to address this problem. We first collected 200 blood samples over a 2-year period from 28 non-human primates (NHP) infected with a low dose of Mycobacterium tuberculosis. We identified T cells and the cytokines that they were producing (single and multiple) from each sample along with monkey status and infection progression data. Machine learning techniques were used to interrogate the experimental NHP datasets without identifying any potential TB biomarker. In parallel, we used our extensive novel NHP datasets to build and calibrate a multi-organ computational model that combines what is occurring at the site of infection (e.g., lung) at a single granuloma scale with blood level readouts that can be tracked in monkeys and humans. We then generated a large in silico repository of in silico granulomas coupled to lymph node and blood dynamics and developed an in silico tool to scale granuloma level results to a full host scale to identify what best predicts Mycobacterium tuberculosis (Mtb) infection outcomes. The analysis of in silico blood measures identifies Mtb-specific frequencies of effector T cell phenotypes at various time points post infection as promising indicators of infection outcome. As a result, we emphasize that pairing wetlab and computational approaches holds great promise to accelerate TB biomarker discovery.
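The authors' exact machine learning analysis is not given in the abstract; the following generic sketch (synthetic data, hypothetical marker counts) shows one common way to interrogate such a dataset: cross-validated ranking of candidate blood readouts by their predictive value for infection outcome.

```python
# Generic sketch of ML-based biomarker interrogation (not the authors' analysis):
# rank candidate blood readouts by cross-validated predictive value for outcome.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_markers = 200, 25                 # hypothetical: 200 blood samples
X = rng.normal(size=(n_samples, n_markers))    # cytokine / T-cell frequency readouts
y = rng.integers(0, 2, size=n_samples)         # binary infection outcome label

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("full-panel CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Rank individual markers by how well each one predicts outcome on its own.
scores = [cross_val_score(clf, X[:, [j]], y, cv=5).mean() for j in range(n_markers)]
for j in np.argsort(scores)[::-1][:3]:
    print(f"marker {j}: single-feature CV accuracy {scores[j]:.2f}")
```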
ERIC Educational Resources Information Center
Anderson, O. Roger
2014-01-01
Modern neuroscientific research has substantially enhanced our understanding of the human brain. However, many challenges remain in developing a strong, brain-based theory of human learning, especially in complex environments such as educational settings. Some of the current issues and challenges in our progress toward developing comprehensive…
ERIC Educational Resources Information Center
Kirby, Gretchen; Caronongan, Pia; Esposito, Andrea Mraz; Murphy, Lauren; Shoji, Megan; Del Grosso, Patricia; Kiambuthi, Wamaitha; Clark, Melissa; Dragoset, Lisa
2017-01-01
This report focuses on the progress made and challenges faced by the nine states in achieving the first three objectives for which they were held accountable for use of their Race to the Top Early Learning Challenge (RTTELC) funds. It discusses discrete findings related to these objectives in relevant chapters of the report: Tiered Quality Rating…
Unique Challenges for Modeling Defect Dynamics in Concentrated Solid-Solution Alloys
NASA Astrophysics Data System (ADS)
Zhao, Shijun; Weber, William J.; Zhang, Yanwen
2017-11-01
Recently developed concentrated solid solution alloys (CSAs) are shown to have improved performance under irradiation that depends strongly on the number of alloying elements, alloying species, and their concentrations. In contrast to conventional dilute alloys, CSAs are composed of multiple principal elements situated randomly in a simple crystalline lattice. As a result, the intrinsic disorder has a profound influence on energy dissipation pathways and defect evolution when these CSAs are subjected to energetic particle irradiation. Extraordinary irradiation resistance, including suppression of void formation by two orders of magnitude at an elevated temperature, has been achieved with increasing compositional complexity in CSAs. Unfortunately, the loss of translational invariance associated with the intrinsic chemical disorder poses great challenges to theoretical modeling at the electronic and atomic levels. Based on recent computer simulation results for a set of novel Ni-containing, face-centered cubic CSAs, we review theoretical modeling progress in handling disorder in CSAs and underscore the impact of disorder on defect dynamics. We emphasize in particular the unique challenges associated with the description of defect dynamics in CSAs.
Progressive fracture of fiber composites
NASA Technical Reports Server (NTRS)
Irvin, T. B.; Ginty, C. A.
1983-01-01
Refined models and procedures are described for determining progressive composite fracture in graphite/epoxy angleplied laminates. Lewis Research Center capabilities are utilized including the Real Time Ultrasonic C Scan (RUSCAN) experimental facility and the Composite Durability Structural Analysis (CODSTRAN) computer code. The CODSTRAN computer code is used to predict the fracture progression based on composite mechanics, finite element stress analysis, and fracture criteria modules. The RUSCAN facility, CODSTRAN computer code, and scanning electron microscope are used to determine durability and identify failure mechanisms in graphite/epoxy composites.
Boxes of Model Building and Visualization.
Turk, Dušan
2017-01-01
Macromolecular crystallography and electron microscopy (single-particle and in situ tomography) are merging into a single approach used by the two coalescing scientific communities. The merger is a consequence of technical developments that enabled determination of atomic structures of macromolecules by electron microscopy. Technological progress in experimental methods of macromolecular structure determination, computer hardware, and software changed and continues to change the nature of model building and visualization of molecular structures. However, the increase in automation and availability of structure validation are reducing interactive manual model building to fiddling with details. On the other hand, interactive modeling tools increasingly rely on search and complex energy calculation procedures, which make manually driven changes in geometry increasingly powerful and at the same time less demanding. Thus, the need for accurate manual positioning of a model is decreasing. The user's push only needs to be sufficient to bring the model within the increasing convergence radius of the computing tools. It seems that we can now better than ever determine an average single structure. The tools work better, requirements for engagement of human brain are lowered, and the frontier of intellectual and scientific challenges has moved on. The quest for resolution of new challenges requires out-of-the-box thinking. A few issues such as model bias and correctness of structure, ongoing developments in parameters defining geometric restraints, limitations of the ideal average single structure, and limitations of Bragg spot data are discussed here, together with the challenges that lie ahead.
González-Nilo, Fernando; Pérez-Acle, Tomás; Guínez-Molinos, Sergio; Geraldo, Daniela A; Sandoval, Claudia; Yévenes, Alejandro; Santos, Leonardo S; Laurie, V Felipe; Mendoza, Hegaly; Cachau, Raúl E
2011-01-01
After the progress made during the genomics era, bioinformatics was tasked with supporting the flow of information generated by nanobiotechnology efforts. This challenge requires adapting classical bioinformatic and computational chemistry tools to store, standardize, analyze, and visualize nanobiotechnological information. Thus, old and new bioinformatic and computational chemistry tools have been merged into a new sub-discipline: nanoinformatics. This review takes a second look at the development of this new and exciting area as seen from the perspective of the evolution of nanobiotechnology applied to the life sciences. The knowledge obtained at the nano-scale level implies answers to new questions and the development of new concepts in different fields. The rapid convergence of technologies around nanobiotechnologies has spun off collaborative networks and web platforms created for sharing and discussing the knowledge generated in nanobiotechnology. The implementation of new database schemes suitable for storage, processing and integrating physical, chemical, and biological properties of nanoparticles will be a key element in achieving the promises in this convergent field. In this work, we will review some applications of nanobiotechnology to life sciences in generating new requirements for diverse scientific fields, such as bioinformatics and computational chemistry.
Blood Pump Development Using Rocket Engine Flow Simulation Technology
NASA Technical Reports Server (NTRS)
Kwak, Dochan; Kiris, Cetin
2001-01-01
This paper reports the progress made towards developing complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges different from computing the blood flow in natural hearts and arteries. There are many elements needed to quantify the flow in these devices, such as flow solvers, geometry modeling including flexible walls, moving boundary procedures and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended to the analysis and development of a ventricular assist device (VAD), i.e., a blood pump. The blood flow in a VAD is practically incompressible and Newtonian, and thus an incompressible Navier-Stokes solution procedure can be applied. A primitive variable formulation is used in conjunction with the overset grid approach to handle complex moving geometry. The primary purpose of developing the incompressible flow analysis capability was to quantify the flow in advanced turbopumps for space propulsion systems. The same procedure has been extended to the development of the NASA-DeBakey VAD that is based on an axial blood pump. Due to massive computing requirements, high-end computing is necessary for simulating three-dimensional flow in these pumps. Computational, experimental, and clinical results are presented.
ERIC Educational Resources Information Center
Worrell, Jamie; Duffy, Mary Lou; Brady, Michael P.; Dukes, Charles; Gonzalez-DeHass, Alyssa
2016-01-01
Many schools use computer-based testing to measure students' progress for end-of-the-year and statewide assessments. There is little research to support whether computer-based testing accurately reflects student progress, particularly among students with learning, performance, and generalization difficulties. This article summarizes an…
Graphical Man/Machine Communications
Progress is reported concerning the use of computer-controlled graphical displays in the areas of radiation diffusion and hydrodynamics, general...ventricular dynamics. Progress is continuing on the use of computer graphics in architecture. Some progress in halftone graphics is reported with no basic...developments presented. Colored halftone perspective pictures are being used to represent multivariable situations. Nonlinear waveform processing is
NASA Technical Reports Server (NTRS)
Miller, Ronald L.; Garcia-Pando, Carlos Perez; Perlwitz, Jan; Ginoux, Paul
2015-01-01
Past decades have seen an accelerating increase in computing efficiency, while climate models are representing a rapidly widening set of physical processes. Yet simulations of some fundamental aspects of climate like precipitation or aerosol forcing remain highly uncertain and resistant to progress. Dust aerosol modeling of soil particles lofted by wind erosion has seen a similar conflict between increasing model sophistication and remaining uncertainty. Dust aerosols perturb the energy and water cycles by scattering radiation and acting as ice nuclei, while mediating atmospheric chemistry and marine photosynthesis (and thus the carbon cycle). These effects take place across scales from the dimensions of an ice crystal to the planetary-scale circulation that disperses dust far downwind of its parent soil. Representing this range leads to several modeling challenges. Should we limit complexity in our model, which consumes computer resources and inhibits interpretation? How do we decide if a process involving dust is worthy of inclusion within our model? Can we identify a minimal representation of a complex process that is efficient yet retains the physics relevant to climate? Answering these questions about the appropriate degree of representation is guided by model evaluation, which presents several more challenges. How do we proceed if the available observations do not directly constrain our process of interest? (This could result from competing processes that influence the observed variable and obscure the signature of our process of interest.) Examples will be presented from dust modeling, with lessons that might be more broadly applicable. The end result will either be clinical depression or the reassuring promise of continued gainful employment as the community confronts these challenges.
RNA nanotechnology for computer design and in vivo computation.
Qiu, Meikang; Khisamutdinov, Emil; Zhao, Zhengyi; Pan, Cheryl; Choi, Jeong-Woo; Leontis, Neocles B; Guo, Peixuan
2013-10-13
Molecular-scale computing has been explored since 1989 owing to the foreseeable limitation of Moore's law for silicon-based computation devices. With the potential of massive parallelism, low energy consumption and capability of working in vivo, molecular-scale computing promises a new computational paradigm. Inspired by the concepts from the electronic computer, DNA computing has realized basic Boolean functions and has progressed into multi-layered circuits. Recently, RNA nanotechnology has emerged as an alternative approach. Owing to the newly discovered thermodynamic stability of a special RNA motif (Shu et al. 2011 Nat. Nanotechnol. 6, 658-667 (doi:10.1038/nnano.2011.105)), RNA nanoparticles are emerging as another promising medium for nanodevice and nanomedicine as well as molecular-scale computing. Like DNA, RNA sequences can be designed to form desired secondary structures in a straightforward manner, but RNA is structurally more versatile and more thermodynamically stable owing to its non-canonical base-pairing, tertiary interactions and base-stacking property. A 90-nucleotide RNA can exhibit 4⁹⁰ nanostructures, and its loops and tertiary architecture can serve as a mounting dovetail that eliminates the need for external linking dowels. Its enzymatic and fluorogenic activity creates diversity in computational design. Varieties of small RNA can work cooperatively, synergistically or antagonistically to carry out computational logic circuits. The riboswitch and enzymatic ribozyme activities and its special in vivo attributes offer a great potential for in vivo computation. Unique features in transcription, termination, self-assembly, self-processing and acid resistance enable in vivo production of RNA nanoparticles that harbour various regulators for intracellular manipulation. With all these advantages, RNA computation is promising, but it is still in its infancy. Many challenges still exist. Collaborations between RNA nanotechnologists and computer scientists are necessary to advance this nascent technology.
Modeling high-temperature superconductors and metallic alloys on the Intel IPSC/860
NASA Astrophysics Data System (ADS)
Geist, G. A.; Peyton, B. W.; Shelton, W. A.; Stocks, G. M.
Oak Ridge National Laboratory has embarked on several computational Grand Challenges, which require the close cooperation of physicists, mathematicians, and computer scientists. One of these projects is the determination of the material properties of alloys from first principles and, in particular, the electronic structure of high-temperature superconductors. While the present focus of the project is on superconductivity, the approach is general enough to permit study of other properties of metallic alloys such as strength and magnetic properties. This paper describes the progress to date on this project. We include a description of a self-consistent KKR-CPA method, parallelization of the model, and the incorporation of a dynamic load balancing scheme into the algorithm. We also describe the development and performance of a consolidated KKR-CPA code capable of running on CRAYs, workstations, and several parallel computers without source code modification. Performance of this code on the Intel iPSC/860 is also compared to a CRAY 2, CRAY YMP, and several workstations. Finally, some density of state calculations of two perovskite superconductors are given.
The Role of Computer Simulation in Nanoporous Metals—A Review
Xia, Re; Wu, Run Ni; Liu, Yi Lun; Sun, Xiao Yu
2015-01-01
Nanoporous metals (NPMs) have proven to be all-round candidates in versatile and diverse applications. In this decade, interest has grown in the fabrication, characterization and applications of these intriguing materials. Most existing reviews focus on the experimental and theoretical works rather than the numerical simulation. Actually, with numerous experiments and theory analysis, studies based on computer simulation, which may model complex microstructure in more realistic ways, play a key role in understanding and predicting the behaviors of NPMs. In this review, we present a comprehensive overview of the computer simulations of NPMs, which are prepared through chemical dealloying. Firstly, we summarize the various simulation approaches to preparation, processing, and the basic physical and chemical properties of NPMs. In this part, the emphasis is attached to works involving dealloying, coarsening and mechanical properties. Then, we conclude with the latest progress as well as the future challenges in simulation studies. We believe that highlighting the importance of simulations will help to better understand the properties of novel materials and help with new scientific research on these materials. PMID:28793491
Modeling the effect of nano-sized polymer particles on the properties of lipid membranes
NASA Astrophysics Data System (ADS)
Rossi, Giulia; Monticelli, Luca
2014-12-01
The interaction between polymers and biological membranes has recently gained significant interest in several research areas. On the biomedical side, dendrimers, linear polyelectrolytes, and neutral copolymers find application as drug and gene delivery agents, as biocidal agents, and as platforms for biological sensors. On the environmental side, plastic debris is often disposed of in the oceans and gets degraded into small particles; therefore concern is raising about the interaction of small plastic particles with living organisms. From both perspectives, it is crucial to understand the processes driving the interaction between polymers and cell membranes. In recent times progress in computer technology and simulation methods has allowed computational predictions on the molecular mechanism of interaction between polymeric materials and lipid membranes. Here we review the computational studies on the interaction between lipid membranes and different classes of polymers: dendrimers, linear charged polymers, polyethylene glycol (PEG) and its derivatives, polystyrene, and some generic models of polymer chains. We conclude by discussing some of the technical challenges in this area and future developments.
Radhakrishnan, Ravi; Yu, Hsiu-Yu; Eckmann, David M.; Ayyaswamy, Portonovo S.
2017-01-01
Traditionally, the numerical computation of particle motion in a fluid is resolved through computational fluid dynamics (CFD). However, resolving the motion of nanoparticles poses additional challenges due to the coupling between the Brownian and hydrodynamic forces. Here, we focus on the Brownian motion of a nanoparticle coupled to adhesive interactions and confining-wall-mediated hydrodynamic interactions. We discuss several techniques that are founded on the basis of combining CFD methods with the theory of nonequilibrium statistical mechanics in order to simultaneously conserve thermal equipartition and to show correct hydrodynamic correlations. These include the fluctuating hydrodynamics (FHD) method, the generalized Langevin method, the hybrid method, and the deterministic method. Through the examples discussed, we also show a top-down multiscale progression of temporal dynamics from the colloidal scales to the molecular scales, and the associated fluctuations and hydrodynamic correlations. While the motivation and the examples discussed here pertain to nanoscale fluid dynamics and mass transport, the methodologies presented are rather general and can be easily adapted to applications in convective heat transfer. PMID:28035168
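As a minimal illustration of the Langevin-type methods surveyed above (not the fluctuating hydrodynamics formulation, and with hypothetical particle parameters), the sketch below integrates overdamped Brownian motion of a nanoparticle with the noise amplitude set by the fluctuation-dissipation relation, so that thermal equipartition is respected.

```python
# Minimal Euler-Maruyama sketch of overdamped Brownian motion of a nanoparticle.
# The random-force amplitude is tied to the Stokes drag through the
# fluctuation-dissipation relation; no wall or adhesive forces are included.
import numpy as np

kB_T = 4.11e-21        # J, thermal energy near room temperature
radius = 50e-9         # m, hypothetical nanoparticle radius
eta = 1.0e-3           # Pa*s, viscosity of water
gamma = 6.0 * np.pi * eta * radius     # Stokes drag coefficient
D = kB_T / gamma                       # Einstein diffusion coefficient

dt, n_steps = 1.0e-6, 20000
rng = np.random.default_rng(0)
x = np.zeros(3)                        # particle position (m)
traj = np.empty((n_steps, 3))
for i in range(n_steps):
    # No deterministic force here; the step variance 2*D*dt per axis encodes
    # the balance between drag and thermal noise.
    x = x + np.sqrt(2.0 * D * dt) * rng.normal(size=3)
    traj[i] = x

# Crude consistency check: time-averaged |x|^2 should be of order 6*D*t.
msd = np.mean(np.sum(traj ** 2, axis=1))
print(f"D = {D:.3e} m^2/s, MSD ratio (order unity expected): "
      f"{msd / (3.0 * D * dt * n_steps):.2f}")
```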
Modeling compressible multiphase flows with dispersed particles in both dense and dilute regimes
NASA Astrophysics Data System (ADS)
McGrath, T.; St. Clair, J.; Balachandar, S.
2018-05-01
Many important explosives and energetics applications involve multiphase formulations employing dispersed particles. While considerable progress has been made toward developing mathematical models and computational methodologies for these flows, significant challenges remain. In this work, we apply a mathematical model for compressible multiphase flows with dispersed particles to existing shock and explosive dispersal problems from the literature. The model is cast in an Eulerian framework, treats all phases as compressible, is hyperbolic, and satisfies the second law of thermodynamics. It directly applies the continuous-phase pressure gradient as a forcing function for particle acceleration and thereby retains relaxed characteristics for the dispersed particle phase that remove the constituent material sound velocity from the eigenvalues. This is consistent with the expected characteristics of dispersed particle phases and can significantly improve the stable time-step size for explicit methods. The model is applied to test cases involving the shock and explosive dispersal of solid particles and compared to data from the literature. Computed results compare well with experimental measurements, providing confidence in the model and computational methods applied.
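The paper's governing equations are not reproduced in the abstract; the one-dimensional sketch below (hypothetical coefficients, frozen gas field) only illustrates the stated coupling, in which the continuous-phase pressure gradient acts, together with a drag term, as a forcing for the acceleration of the dispersed particle phase.

```python
# Illustrative 1-D sketch (not the paper's full model): advance the dispersed-phase
# velocity with a Stokes-type drag term plus the continuous-phase pressure gradient
# as a forcing, the coupling mechanism described above.
import numpy as np

rho_p = 2500.0        # kg/m^3, particle material density (hypothetical)
d_p = 50e-6           # m, particle diameter (hypothetical)
mu_g = 1.8e-5         # Pa*s, gas viscosity
tau_p = rho_p * d_p ** 2 / (18.0 * mu_g)   # Stokes particle response time

n, dx, dt = 200, 1.0e-3, 1.0e-6
x = np.arange(n) * dx
p = 1.0e5 + 5.0e4 * np.exp(-((x - 0.05) / 0.01) ** 2)  # smoothed pressure pulse
u_g = np.full(n, 50.0)       # continuous-phase velocity (held frozen in this sketch)
u_p = np.zeros(n)            # dispersed-phase velocity

dpdx = np.gradient(p, dx)
for _ in range(1000):
    # du_p/dt = (u_g - u_p)/tau_p - (1/rho_p) * dp/dx   (explicit Euler update)
    u_p = u_p + dt * ((u_g - u_p) / tau_p - dpdx / rho_p)

print("max dispersed-phase velocity after 1 ms:", u_p.max())
```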
Orion Absolute Navigation System Progress and Challenges
NASA Technical Reports Server (NTRS)
Holt, Greg N.; D'Souza, Christopher
2011-01-01
The Orion spacecraft is being designed as NASA's next-generation exploration vehicle for crewed missions beyond Low-Earth Orbit. The navigation system for the Orion spacecraft is being designed in a Multi-Organizational Design Environment (MODE) team including contractor and NASA personnel. The system uses an Extended Kalman Filter to process measurements and determine the state. The design of the navigation system has undergone several iterations and modifications since its inception, and continues as a work-in-progress. This paper seeks to benchmark the current state of the design and some of the rationale and analysis behind it. There are specific challenges to address when preparing a timely and effective design for the Exploration Flight Test (EFT-1), while still looking ahead and providing software extensibility for future exploration missions. The primary measurements in a Near-Earth or Mid-Earth environment consist of GPS pseudorange and deltarange, but for future exploration missions the use of star-tracker and optical navigation sources needs to be considered. Discussions are presented for state size and composition, processing techniques, and consider states. A presentation is given for the processing technique using the computationally stable and robust UDU formulation with an Agee-Turner Rank-One update. This allows for computational savings when dealing with many parameters which are modeled as slowly varying Gauss-Markov processes. Preliminary analysis shows up to a 50% reduction in computation versus a more traditional formulation. Several state elements are discussed and evaluated, including position, velocity, attitude, clock bias/drift, and GPS measurement biases in addition to bias, scale factor, misalignment, and non-orthogonalities of the accelerometers and gyroscopes. Another consideration is the initialization of the EKF in various scenarios. Scenarios such as single-event upset, ground command, pad alignment, and cold start are discussed, as are strategies for whole and partial state updates as well as covariance considerations. Strategies are given for dealing with latent measurements and high-rate propagation using a multi-rate architecture. The details of the rate groups and the data flow between the elements are discussed and evaluated.
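The flight software itself is of course not shown here; as a small illustration of one element mentioned above, the sketch below propagates a sensor bias modeled as a slowly varying first-order Gauss-Markov process, with hypothetical correlation time and steady-state sigma.

```python
# Generic sketch of a first-order Gauss-Markov bias model of the kind mentioned
# above for slowly varying sensor parameters (illustrative only, not Orion code).
# Discrete form: b[k+1] = phi * b[k] + w[k], with phi = exp(-dt/tau) and process
# noise variance chosen so the steady-state variance remains sigma^2.
import numpy as np

tau = 3600.0          # s, correlation time (hypothetical)
sigma = 1.0e-4        # steady-state 1-sigma bias level (hypothetical units)
dt = 1.0              # s, propagation step

phi = np.exp(-dt / tau)
q = sigma ** 2 * (1.0 - phi ** 2)     # discrete process-noise variance

rng = np.random.default_rng(0)
b, history = 0.0, []
for _ in range(20000):
    b = phi * b + rng.normal(scale=np.sqrt(q))
    history.append(b)

print("sample std of bias:", np.std(history))   # approaches sigma for long runs
```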
Optical correction of refractive error for preventing and treating eye symptoms in computer users.
Heus, Pauline; Verbeek, Jos H; Tikka, Christina
2018-04-10
Computer users frequently complain about problems with seeing and functioning of the eyes. Asthenopia is a term generally used to describe symptoms related to (prolonged) use of the eyes like ocular fatigue, headache, pain or aching around the eyes, and burning and itchiness of the eyelids. The prevalence of asthenopia during or after work on a computer ranges from 46.3% to 68.5%. Uncorrected or under-corrected refractive error can contribute to the development of asthenopia. A refractive error is an error in the focusing of light by the eye and can lead to reduced visual acuity. There are various possibilities for optical correction of refractive errors including eyeglasses, contact lenses and refractive surgery. To examine the evidence on the effectiveness, safety and applicability of optical correction of refractive error for reducing and preventing eye symptoms in computer users. We searched the Cochrane Central Register of Controlled Trials (CENTRAL); PubMed; Embase; Web of Science; and OSH update, all to 20 December 2017. Additionally, we searched trial registries and checked references of included studies. We included randomised controlled trials (RCTs) and quasi-randomised trials of interventions evaluating optical correction for computer workers with refractive error for preventing or treating asthenopia and their effect on health related quality of life. Two authors independently assessed study eligibility and risk of bias, and extracted data. Where appropriate, we combined studies in a meta-analysis. We included eight studies with 381 participants. Three were parallel group RCTs, three were cross-over RCTs and two were quasi-randomised cross-over trials. All studies evaluated eyeglasses; there were no studies that evaluated contact lenses or surgery. Seven studies evaluated computer glasses with at least one focal area for the distance of the computer screen with or without additional focal areas in presbyopic persons. Six studies compared computer glasses to other types of glasses, and one study compared them to an ergonomic workplace assessment. The eighth study compared optimal correction of refractive error with the actual spectacle correction in use. Two studies evaluated computer glasses in persons with asthenopia but for the others the glasses were offered to all workers regardless of symptoms. The risk of bias was unclear in five, high in two and low in one study. Asthenopia was measured as eyestrain or a summary score of symptoms but there were no studies on health-related quality of life. Adverse events were measured as headache, nausea or dizziness. Median asthenopia scores at baseline were about 30% of the maximum possible score.
Progressive computer glasses versus monofocal glasses: One study found no considerable difference in asthenopia between various progressive computer glasses and monofocal computer glasses after one-year follow-up (mean difference (MD) change scores 0.23, 95% confidence interval (CI) -5.0 to 5.4 on a 100 mm VAS scale, low quality evidence). For headache the results were in favour of progressive glasses.
Progressive computer glasses with an intermediate focus in the upper part of the glasses versus other glasses: In two studies progressive computer glasses with intermediate focus led to a small decrease in asthenopia symptoms (SMD -0.49, 95% CI -0.75 to -0.23, low-quality evidence) but not in headache score in the short-term compared to general purpose progressive glasses. There were similar small decreases in dizziness.
At medium term follow-up, in one study the effect size was not statistically significant (SMD -0.64, 95% CI -1.40 to 0.12). The study did not assess adverse events. Another study found no considerable difference in asthenopia between progressive computer glasses and monofocal computer glasses after one-year follow-up (MD change scores 1.44, 95% CI -6.95 to 9.83 on a 100 mm VAS scale, very low quality evidence). For headache the results were inconsistent.
Progressive computer glasses with far-distance focus in the upper part of the glasses versus other glasses: One study found no considerable difference in number of persons with asthenopia between progressive computer glasses with far-distance focus and bifocal computer glasses after four weeks' follow-up (OR 1.00, 95% CI 0.40 to 2.50, very low quality evidence). The number of persons with headache, nausea and dizziness was also not different between groups. Another study found no considerable difference in asthenopia between progressive computer glasses with far-distance focus and monofocal computer glasses after one-year follow-up (MD change scores -1.79, 95% CI -11.60 to 8.02 on a 100 mm VAS scale, very low quality evidence). The effects on headaches were inconsistent. One study found no difference between progressive far-distance focus computer glasses and trifocal glasses in effect on eyestrain severity (MD -0.50, 95% CI -1.07 to 0.07, very low quality evidence) or on eyestrain frequency (MD -0.75, 95% CI -1.61 to 0.11, very low quality evidence).
Progressive computer glasses versus ergonomic assessment with habitual (computer) glasses: One study found that computer glasses optimised for individual needs reduced asthenopia sum score more than an ergonomic assessment and habitual (computer) glasses (MD -8.9, 95% CI -16.47 to -1.33, scale 0 to 140, very low quality evidence) but there was no effect on the frequency of eyestrain (OR 1.08, 95% CI 0.38 to 3.11, very low quality evidence).
We rated the quality of the evidence as low or very low due to risk of bias in the included studies, inconsistency in the results and imprecision. There is low to very low quality evidence that providing computer users with progressive computer glasses does not lead to a considerable decrease in problems with the eyes or headaches compared to other computer glasses. Progressive computer glasses might be slightly better than progressive glasses for daily use in the short term but not in the intermediate term and there is no data on long-term follow-up. The quality of the evidence is low or very low and therefore we are uncertain about this conclusion. Larger studies with several hundreds of participants are needed with proper randomisation, validated outcome measurement methods, and longer follow-up of at least one year to improve the quality of the evidence.
Jin, Miaomiao; Cheng, Long; Li, Yi; Hu, Siyu; Lu, Ke; Chen, Jia; Duan, Nian; Wang, Zhuorui; Zhou, Yaxiong; Chang, Ting-Chang; Miao, Xiangshui
2018-06-27
Owing to the capability of integrating information storage and computing in the same physical location, in-memory computing with memristors has become a research hotspot as a promising route for non-von Neumann architecture. However, it is still a challenge to develop high performance devices as well as optimized logic methodologies to realize energy-efficient computing. Herein, a filamentary Cu/GeTe/TiN memristor is reported to show satisfactory properties with nanosecond switching speed (< 60 ns), low voltage operation (< 2 V), high endurance (>10⁴ cycles) and good retention (>10⁴ s at 85 °C). It is revealed that the charge carrier conduction mechanisms in high resistance and low resistance states are Schottky emission and hopping transport between the adjacent Cu clusters, respectively, based on the analysis of current-voltage behaviors and resistance-temperature characteristics. An intuitive picture is given to describe the dynamic processes of resistive switching. Moreover, based on the basic material implication (IMP) logic circuit, we proposed a reconfigurable logic method and experimentally implemented IMP, NOT, OR, and COPY logic functions. Design of a one-bit full adder with reduction in computational sequences and its validation in simulation further demonstrate the potential practical application. The results provide important progress towards understanding of the resistive switching mechanism and realization of energy-efficient in-memory computing architecture. © 2018 IOP Publishing Ltd.
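The device-level circuits are not reproduced here; the truth-table-level sketch below (ignoring all memristor physics) only shows the logical basis referenced above: with material implication (IMP) and a constant FALSE, NOT and OR follow directly, and from those a one-bit full adder can be composed and checked exhaustively.

```python
# Truth-table-level sketch of IMP-based logic (no device physics): with
# IMP(p, q) = (NOT p) OR q and a constant FALSE, NOT and OR follow, and a
# one-bit full adder can be composed from the derived gates.
def IMP(p, q):          # material implication
    return (not p) or q

def NOT(p):             # p IMP FALSE
    return IMP(p, False)

def OR(p, q):           # (NOT p) IMP q
    return IMP(NOT(p), q)

def AND(p, q):          # De Morgan on the IMP-derived NOT/OR
    return NOT(OR(NOT(p), NOT(q)))

def XOR(p, q):
    return AND(OR(p, q), NOT(AND(p, q)))

def full_adder(a, b, cin):
    s = XOR(XOR(a, b), cin)
    cout = OR(AND(a, b), AND(cin, XOR(a, b)))
    return s, cout

# Exhaustive check of the one-bit full adder against integer addition.
for a in (False, True):
    for b in (False, True):
        for cin in (False, True):
            s, cout = full_adder(a, b, cin)
            assert int(s) + 2 * int(cout) == int(a) + int(b) + int(cin)
print("full adder built from IMP + FALSE verified on all 8 input combinations")
```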
Clinical Information Systems - From Yesterday to Tomorrow.
Gardner, R M
2016-06-30
To review the history of clinical information systems over the past twenty-five years and project anticipated changes to those systems over the next twenty-five years. Over 250 Medline references about clinical information systems, quality of patient care, and patient safety were reviewed. Books, Web resources, and the author's personal experience with developing the HELP system were also used. There have been dramatic improvements in the use and acceptance of clinical computing systems and Electronic Health Records (EHRs), especially in the United States. Although there are still challenges with the implementation of such systems, the rate of progress has been remarkable. Over the next twenty-five years, there will remain many important opportunities and challenges. These opportunities include understanding complex clinical computing issues that must be studied, understood and optimized. Dramatic improvements in quality of care and patient safety must be anticipated as a result of the use of clinical information systems. These improvements will result from a closer involvement of clinical informaticians in the optimization of patient care processes. Clinical information systems and computerized clinical decision support have made contributions to medicine in the past. Therefore, by using better medical knowledge, optimized clinical information systems, and computerized clinical decision support, we will enable dramatic improvements in both the quality and safety of patient care in the next twenty-five years.
Yang, Shuai; Zhang, Xinlei; Diao, Lihong; Guo, Feifei; Wang, Dan; Liu, Zhongyang; Li, Honglei; Zheng, Junjie; Pan, Jingshan; Nice, Edouard C; Li, Dong; He, Fuchu
2015-09-04
The Chromosome-centric Human Proteome Project (C-HPP) aims to catalog genome-encoded proteins using a chromosome-by-chromosome strategy. As the C-HPP proceeds, the increasing requirement for data-intensive analysis of the MS/MS data poses a challenge to the proteomic community, especially small laboratories lacking computational infrastructure. To address this challenge, we have updated the previous CAPER browser into a higher version, CAPER 3.0, which is a scalable cloud-based system for data-intensive analysis of C-HPP data sets. CAPER 3.0 uses cloud computing technology to facilitate MS/MS-based peptide identification. In particular, it can use both public and private cloud, facilitating the analysis of C-HPP data sets. CAPER 3.0 provides a graphical user interface (GUI) to help users transfer data, configure jobs, track progress, and visualize the results comprehensively. These features enable users without programming expertise to easily conduct data-intensive analysis using CAPER 3.0. Here, we illustrate the usage of CAPER 3.0 with four specific mass spectral data-intensive problems: detecting novel peptides, identifying single amino acid variants (SAVs) derived from known missense mutations, identifying sample-specific SAVs, and identifying exon-skipping events. CAPER 3.0 is available at http://prodigy.bprc.ac.cn/caper3.
NASA Astrophysics Data System (ADS)
Santos, Olga C.; Saneiro, Mar; Boticario, Jesus G.; Rodriguez-Sanchez, M. C.
2016-01-01
This work explores the benefits of supporting learners affectively in a context-aware learning situation. It addresses a new challenge in the related literature: providing affective educational recommendations that take advantage of ambient intelligence and are delivered through actuators available in the environment, thus going beyond previous approaches that provided computer-based recommendations presenting some text or telling the learner aloud what to do. To address this open issue, we have applied the TORMES elicitation methodology, which has been used to investigate the potential of ambient intelligence for making more interactive recommendations in an emotionally challenging scenario (i.e. preparing for the oral examination of a second language learning course). The Arduino open source electronics prototyping platform is used both to sense changes in the learners' affective state and to deliver the recommendation in a more interactive way through different complementary sensory communication channels (sight, hearing, touch), in keeping with universal design. An Ambient Intelligence Context-aware Affective Recommender Platform (AICARP) has been built to support the whole experience, which represents progress in the state of the art. In particular, we have produced what is most likely the first interactive context-aware affective educational recommendation. The value of this contribution lies in discussing the methodological and practical issues involved.
Protein alignment algorithms with an efficient backtracking routine on multiple GPUs.
Blazewicz, Jacek; Frohmberg, Wojciech; Kierzynka, Michal; Pesch, Erwin; Wojciechowski, Pawel
2011-05-20
Pairwise sequence alignment methods are widely used in biological research. The increasing number of sequences is perceived as one of the upcoming challenges for sequence alignment methods in the near future. To overcome this challenge, several GPU (Graphics Processing Unit) computing approaches have been proposed lately. These solutions show the great potential of the GPU platform, but in most cases address the problem of sequence database scanning and compute only the alignment score, whereas the alignment itself is omitted. Thus, the need arose to implement the global and semiglobal Needleman-Wunsch and the Smith-Waterman algorithms with a backtracking procedure, which is needed to construct the alignment. In this paper we present a solution that performs the alignment of every given sequence pair, which is a required step for progressive multiple sequence alignment methods, as well as for DNA recognition at the DNA assembly stage. Performed tests show that the implementation, with performance up to 6.3 GCUPS on a single GPU for affine gap penalties, is very efficient in comparison to other CPU- and GPU-based solutions. Moreover, multiple-GPU support with load balancing makes the application very scalable. The article shows that the backtracking procedure of the sequence alignment algorithms may be designed to fit in with the GPU architecture. Therefore, our algorithm, apart from scores, is able to compute pairwise alignments. This opens a wide range of new possibilities, allowing other methods from the area of molecular biology to take advantage of the new computational architecture. Tests confirm that the efficiency of the implementation is excellent, and the speed of our GPU-based algorithms can be increased almost linearly when using more than one graphics card.
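For reference, a compact CPU implementation of global Needleman-Wunsch alignment with traceback is sketched below; it uses a linear gap penalty and illustrative scoring parameters for brevity, whereas the GPU implementation described above uses affine gap penalties and processes many sequence pairs in parallel.

```python
# Reference (CPU) Needleman-Wunsch global alignment with backtracking.
# Linear gap penalty and toy scores; illustrative only.

def needleman_wunsch(a, b, match=2, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    H = [[0] * (m + 1) for _ in range(n + 1)]          # DP score matrix
    for i in range(1, n + 1):
        H[i][0] = i * gap
    for j in range(1, m + 1):
        H[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(H[i - 1][j - 1] + s,          # (mis)match
                          H[i - 1][j] + gap,            # gap in b
                          H[i][j - 1] + gap)            # gap in a
    # Backtracking from the bottom-right corner reconstructs the alignment.
    top, bottom, i, j = [], [], n, m
    while i > 0 or j > 0:
        s = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
        if i > 0 and j > 0 and H[i][j] == H[i - 1][j - 1] + s:
            top.append(a[i - 1]); bottom.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and H[i][j] == H[i - 1][j] + gap:
            top.append(a[i - 1]); bottom.append('-'); i -= 1
        else:
            top.append('-'); bottom.append(b[j - 1]); j -= 1
    return H[n][m], ''.join(reversed(top)), ''.join(reversed(bottom))

score, s1, s2 = needleman_wunsch("GATTACA", "GCATGCU")
print(score); print(s1); print(s2)
```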
Duan, Qiangde; Lee, Kuo Hao; Nandre, Rahul M; Garcia, Carolina; Chen, Jianhan; Zhang, Weiping
2017-01-01
Vaccine development often encounters the challenge of virulence heterogeneity. Enterotoxigenic Escherichia coli (ETEC) bacteria producing immunologically heterogeneous virulence factors are a leading cause of children’s diarrhea and travelers’ diarrhea. Currently, we do not have licensed vaccines against ETEC bacteria. While conventional methods continue to make progress, they encounter challenges; new computational and structure-based approaches are being explored to accelerate ETEC vaccine development. In this study, we applied a structural vaccinology concept to construct a structure-based multiepitope fusion antigen (MEFA) carrying representative epitopes of the seven most important ETEC adhesins [CFA/I, CFA/II (CS1–CS3), CFA/IV (CS4–CS6)], simulated the antigenic structure of the CFA/I/II/IV MEFA with computational atomistic modeling and simulation, characterized immunogenicity in mouse immunization, and examined the potential of structure-informed vaccine design for ETEC vaccine development. A tag-less recombinant MEFA protein (CFA/I/II/IV MEFA) was effectively expressed and extracted. Molecular dynamics simulations indicated that this MEFA immunogen maintained a stable secondary structure and presented epitopes on the protein surface. Empirical data showed that mice immunized with the tag-less CFA/I/II/IV MEFA developed strong antigen-specific antibody responses, and mouse serum antibodies significantly inhibited in vitro adherence of bacteria expressing these seven adhesins. These results revealed congruence of antigen immunogenicity between computational simulation and empirical mouse immunization and indicated that this tag-less CFA/I/II/IV MEFA is potentially an antigen for a broadly protective ETEC vaccine, suggesting a potential application of MEFA-based structural vaccinology for vaccine design against ETEC and likely other pathogens. PMID:28944092
Walsh, Aron
2015-03-19
The performance of solar cells based on hybrid halide perovskites has seen an unparalleled rate of progress, while our understanding of the underlying physical chemistry of these materials trails behind. Superficially, CH3NH3PbI3 is similar to other thin-film photovoltaic materials: a semiconductor with an optical band gap in the optimal region of the electromagnetic spectrum. Microscopically, the material is more unconventional. Progress in our understanding of the local and long-range chemical bonding of hybrid perovskites is discussed here, drawing from a series of computational studies involving electronic structure, molecular dynamics, and Monte Carlo simulation techniques. The orientational freedom of the dipolar methylammonium ion gives rise to temperature-dependent dielectric screening and the possibility for the formation of polar (ferroelectric) domains. The ability to independently substitute on the A, B, and X lattice sites provides the means to tune the optoelectronic properties. Finally, ten critical challenges and opportunities for physical chemists are highlighted.
Recent Progress in the Development of a Multi-Layer Green's Function Code for Ion Beam Transport
NASA Technical Reports Server (NTRS)
Tweed, John; Walker, Steven A.; Wilson, John W.; Tripathi, Ram K.
2008-01-01
To meet the challenge of future deep space programs, an accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy radiation is needed. To address this need, a new Green's function code capable of simulating high charge and energy ions with either laboratory or space boundary conditions is currently under development. The computational model consists of combinations of physical perturbation expansions based on the scales of atomic interaction, multiple scattering, and nuclear reactive processes with use of the Neumann-asymptotic expansions with non-perturbative corrections. The code contains energy loss due to straggling, nuclear attenuation, nuclear fragmentation with energy dispersion and downshifts. Previous reports show that the new code accurately models the transport of ion beams through a single slab of material. Current research efforts are focused on enabling the code to handle multiple layers of material and the present paper reports on progress made towards that end.
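As a vastly simplified, back-of-the-envelope illustration of multi-layer transport, the sketch below propagates only an exponential nuclear-attenuation factor through a stack of layers; it omits straggling, fragmentation, energy dispersion and downshifts, all of which the Green's function code described above treats explicitly. The cross sections and thicknesses are made-up placeholders.

```python
# Toy multi-layer attenuation of a primary ion beam:
# survival = exp(-sum(sigma_i * t_i)).  Illustrative only; this is not the
# Green's function transport model described above.
import math

def surviving_fraction(layers):
    """layers: list of (macroscopic cross section per cm, thickness in cm)."""
    optical_depth = sum(sigma * t for sigma, t in layers)
    return math.exp(-optical_depth)

shield = [(0.012, 5.0),   # placeholder aluminium-like layer
          (0.020, 2.0)]   # placeholder polyethylene-like layer
print(f"primary-ion surviving fraction: {surviving_fraction(shield):.3f}")
```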
ERIC Educational Resources Information Center
Nelson, Peter M.; Van Norman, Ethan R.; Klingbeil, Dave A.; Parker, David C.
2017-01-01
Although extensive research exists on the use of curriculum-based measures for progress monitoring, little is known about using computer adaptive tests (CATs) for progress-monitoring purposes. The purpose of this study was to evaluate the impact of the frequency of data collection on individual and group growth estimates using a CAT. Data were…
Liu, Gang; Neelamegham, Sriram
2015-01-01
The glycome constitutes the entire complement of free carbohydrates and glycoconjugates expressed on whole cells or tissues. ‘Systems Glycobiology’ is an emerging discipline that aims to quantitatively describe and analyse the glycome. Here, instead of developing a detailed understanding of single biochemical processes, a combination of computational and experimental tools is used to seek an integrated or ‘systems-level’ view. This can explain how multiple biochemical reactions and transport processes interact with each other to control glycome biosynthesis and function. Computational methods in this field commonly build in silico reaction network models to describe experimental data derived from structural studies that measure cell-surface glycan distribution. While considerable progress has been made, several challenges remain due to the complex and heterogeneous nature of this post-translational modification. First, for the in silico models to be standardized and shared among laboratories, it is necessary to integrate glycan structure information and glycosylation-related enzyme definitions into the mathematical models. Second, as glycoinformatics resources grow, it would be attractive to utilize ‘Big Data’ stored in these repositories for model construction and validation. Third, while the technology for profiling the glycome at the whole-cell level has been standardized, there is a need to integrate mass spectrometry derived site-specific glycosylation data into the models. The current review discusses progress that is being made to resolve the above bottlenecks. The focus is on how computational models can bridge the gap between ‘data’ generated in wet-laboratory studies and ‘knowledge’ that can enhance our understanding of the glycome. PMID:25871730
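As a toy illustration of the kind of in silico reaction-network model mentioned above, the sketch below integrates a hypothetical two-step glycosylation pathway as mass-action ODEs; the species, rate constants, and single linear pathway are assumptions made for the example, not data from the review.

```python
# Minimal mass-action ODE sketch of a hypothetical glycosylation pathway
# S0 -> S1 -> S2, with substrate influx and export of the mature glycoform.
from scipy.integrate import solve_ivp

k1, k2, influx, efflux = 0.8, 0.3, 1.0, 0.2      # assumed rate constants (1/min)

def rhs(t, y):
    s0, s1, s2 = y
    return [influx - k1 * s0,        # substrate entering and being modified
            k1 * s0 - k2 * s1,       # first glycoform converted to the second
            k2 * s1 - efflux * s2]   # mature glycoform exported to the surface

sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0, 0.0])
print("glycoform levels at t = 60 min:", sol.y[:, -1])
```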
Storming the Citadel: The Fundamental Revolution Against Progressive Education.
ERIC Educational Resources Information Center
Vetterli, Richard
The first four chapters ("Progressive Education," "The Impact of Progressive Education," "The Remedies: Focusing on the Wrong Problems," and "Progressive Education Challenged") examine the deleterious effect that progressive education has had on student achievement and on society as a whole. The last five…
Kane, J.S.
1991-01-01
A brief history of programs to develop geochemical reference samples and certified reference samples for use in geochemical analysis is presented. While progress has been made since G-1 and W-1 were issued, many challenges remain. © 1991.
Nuclear Data Covariances in the Indian Context – Progress, Challenges, Excitement and Perspectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganesan, S., E-mail: ganesan555@gmail.com
We present a brief overview of progress, challenges, excitement and perspectives in developing nuclear data covariances in the Indian context in relation to target accuracies and sensitivity studies that are of great importance to Bhabha's 3-stage nuclear programme for energy and non-energy applications.
Nonclassical light sources for silicon photonics
NASA Astrophysics Data System (ADS)
Bajoni, Daniele; Galli, Matteo
2017-09-01
Quantum photonics has recently attracted a lot of attention for its disruptive potential in emerging technologies like quantum cryptography, quantum communication and quantum computing. Driven by the impressive development in nanofabrication technologies and nanoscale engineering, silicon photonics has rapidly become the platform of choice for on-chip integration of high-performing photonic devices, now extending their functionalities towards quantum-based applications. Focusing on quantum information technology (QIT) as a key application area, we review recent progress in integrated silicon-based sources of nonclassical states of light. We assess the state of the art in this growing field and highlight the challenges that need to be overcome to make quantum photonics a reliable and widespread technology.
Milestones toward Majorana-based quantum computing
NASA Astrophysics Data System (ADS)
Alicea, Jason
Experiments on nanowire-based Majorana platforms now appear poised to move beyond the preliminary problem of zero-mode detection and towards loftier goals of realizing non-Abelian statistics and quantum information applications. Using an approach that synthesizes recent materials growth breakthroughs with tools long successfully deployed in quantum-dot research, I will outline a number of relatively modest milestones that progressively bridge the gap between the current state of the art and these grand longer-term challenges. The intermediate Majorana experiments surveyed in this talk should be broadly adaptable to other approaches as well. Supported by the National Science Foundation (DMR-1341822), Institute for Quantum Information and Matter, and Walter Burke Institute at Caltech.
Stern, Andrew M; Schurdak, Mark E; Bahar, Ivet; Berg, Jeremy M; Taylor, D Lansing
2016-07-01
Drug candidates exhibiting well-defined pharmacokinetic and pharmacodynamic profiles that are otherwise safe often fail to demonstrate proof-of-concept in phase II and III trials. Innovation in drug discovery and development has been identified as a critical need for improving the efficiency of drug discovery, especially through collaborations between academia, government agencies, and industry. To address the innovation challenge, we describe a comprehensive, unbiased, integrated, and iterative quantitative systems pharmacology (QSP)-driven drug discovery and development strategy and platform that we have implemented at the University of Pittsburgh Drug Discovery Institute. Intrinsic to QSP is its integrated use of multiscale experimental and computational methods to identify mechanisms of disease progression and to test predicted therapeutic strategies likely to achieve clinical validation for appropriate subpopulations of patients. The QSP platform can address biological heterogeneity and anticipate the evolution of resistance mechanisms, which are major challenges for drug development. The implementation of this platform is dedicated to gaining an understanding of mechanism(s) of disease progression to enable the identification of novel therapeutic strategies as well as repurposing drugs. The QSP platform will help promote the paradigm shift from reactive population-based medicine to proactive personalized medicine by focusing on the patient as the starting and the end point. © 2016 Society for Laboratory Automation and Screening.
Preclinical Magnetic Resonance Imaging and Systems Biology in Cancer Research
Albanese, Chris; Rodriguez, Olga C.; VanMeter, John; Fricke, Stanley T.; Rood, Brian R.; Lee, YiChien; Wang, Sean S.; Madhavan, Subha; Gusev, Yuriy; Petricoin, Emanuel F.; Wang, Yue
2014-01-01
Biologically accurate mouse models of human cancer have become important tools for the study of human disease. The anatomical location of various target organs, such as brain, pancreas, and prostate, makes determination of disease status difficult. Imaging modalities, such as magnetic resonance imaging, can greatly enhance diagnosis, and longitudinal imaging of tumor progression is an important source of experimental data. Even in models where the tumors arise in areas that permit visual determination of tumorigenesis, longitudinal anatomical and functional imaging can enhance the scope of studies by facilitating the assessment of biological alterations (such as changes in angiogenesis, metabolism, and cellular invasion) as well as tissue perfusion and diffusion. One of the challenges in preclinical imaging is the development of infrastructural platforms required for integrating in vivo imaging and therapeutic response data with ex vivo pathological and molecular data using a more systems-based multiscale modeling approach. Further challenges exist in integrating these data for computational modeling to better understand the pathobiology of cancer and to better effect its cure. We review the current applications of preclinical imaging and discuss the implications of applying functional imaging to visualize cancer progression and treatment. Finally, we provide new data from an ongoing preclinical drug study demonstrating how multiscale modeling can lead to a more comprehensive understanding of cancer biology and therapy. PMID:23219428
Distributed Computing Environment for Mine Warfare Command
1993-06-01
…based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May of 1992. The building blocks of a…
Toward high-resolution computational design of helical membrane protein structure and function
Barth, Patrick; Senes, Alessandro
2016-01-01
The computational design of α-helical membrane proteins is still in its infancy but has made important progress. De novo design has produced stable, specific and active minimalistic oligomeric systems. Computational re-engineering can improve stability and modulate the function of natural membrane proteins. Currently, the major hurdle for the field is not computational, but the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress. PMID:27273630
Terashima, Taiko; Yoshimura, Sadako
2018-03-01
To determine whether nurses can accurately assess the skin colour of replanted fingers displayed as digital images on a computer screen. Colour measurement and clinical diagnostic methods for medical digital images have been studied, but reproducing skin colour on a computer screen remains difficult. The inter-rater reliability of skin colour assessment scores was evaluated. In May 2014, 21 nurses who worked on a trauma ward in Japan participated in testing. Six digital images with different skin colours were used. Colours were scored both from digital images and from direct observation of the patient. The score from a digital image was defined as the test score, and its difference from the direct assessment score as the difference score. Intraclass correlation coefficients were calculated. Nurses' opinions were classified and summarised. The intraclass correlation coefficients for the test scores were fair. Although the intraclass correlation coefficients for the difference scores were poor, they improved to good when three images that might have contributed to poor reliability were excluded. Most nurses stated that it is difficult to assess skin colour in digital images; they did not think it could be a substitute for direct visual assessment. However, most nurses were in favour of including images in nursing progress notes. Although the inter-rater reliability was fairly high, the reliability of colour reproduction in digital images, as indicated by the difference scores, was poor. Nevertheless, nurses expect the incorporation of digital images into nursing progress notes to be useful. This gap between the reliability of digital colour reproduction and nurses' expectations of it must be addressed. High inter-rater reliability for digital images in nursing progress notes was not observed. Assessments of future improvements in colour reproduction technologies are required. Further digitisation and visualisation of nursing records might pose challenges. © 2017 John Wiley & Sons Ltd.
Moris, Demetrios; Damaskos, Christos; Spartalis, Eleftherios; Papalampros, Alexandros; Vernadakis, Spyridon; Dimitroulis, Dimitrios; Griniatsos, John; Felekouras, Evangelos; Nikiteas, Nikolaos
2017-05-01
Intraductal papillary mucinous neoplasms (IPMNs) are presumed to evolve from low-grade dysplasia to high-grade dysplasia to invasive carcinoma. Resection of lesions before the development of pancreatic cancer may prevent an incurable process as, once IPMNs progress to invasive cancer, the prognosis may be as poor as that of resected conventional pancreatic ductal adenocarcinoma. Resection of IPMNs, particularly in the setting of high-grade dysplasia, is presumed to provide a survival benefit. IPMNs also present many challenges, as the identification of high-grade dysplasia and early invasive carcinoma and the timing and frequency of malignant progression are not yet established. The limited predictive accuracy presents a challenge because pancreatic resection is associated with a risk of substantial morbidity and mortality (20-30% and 2-4%, respectively). The diagnostic armamentarium includes pancreas-protocol computed tomography (CT), gadolinium-enhanced magnetic resonance imaging (MRI) with or without magnetic resonance cholangiopancreatography (MRCP) and endoscopic ultrasound (EUS). The most promising method is endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA), as this technique allows analysis of cyst fluid using biomarkers. Until now, in clinical practice, we have utilized two biomarkers, carcinoembryonic antigen (CEA) and carbohydrate antigen 19-9 (CA19-9); however, DNA analysis of pancreatic cystic fluid and genomic analysis could offer new tools for the diagnosis and management of IPMNs. Novel genomic and serum biomarkers could play an important future role in identifying those individuals who will benefit from an early operation and those who will benefit from a watchful-waiting approach. More prospective studies are needed. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
2016 Institutional Computing Progress Report for w14_firetec
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Judith W.; Linn, Rodman
2016-07-14
This is a computing progress report for w14_firetec. FIRETEC simulations will explore prescribed-fire ignition methods that achieve burning objectives (understory reduction and ecosystem health) while minimizing the risk of escaped fire.
78 FR 25482 - Notice of Revised Determination on Reconsideration
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
…PSCI-Progressive Software Computing, Quality Testing Services, Inc., Railroad Construction Co. of South Jersey, Inc., …, Anderson Construction Services, Baker Petrolite, BakerCorp, Bell-Fast Fire Protection Inc., Bolttech Inc., …
Marciano, Roberta; Servetto, Alberto; Bianco, Cataldo; Bianco, Roberto
2017-09-26
Intrahepatic cholangiocarcinoma is an aggressive tumor originating in the epithelium of the bile duct, often associated with distant dissemination. The prognosis is poor and treatment is challenging due to low response rate to standard chemotherapy and lack of targeted therapies. Here we report the case of a 74-year-old white woman affected by intrahepatic cholangiocarcinoma with metastatic involvement of spleen, lung, peritoneum, and intra-abdominal lymph nodes. As first-line chemotherapy, she was given cisplatin-gemcitabine chemotherapy. The treatment was well tolerated with the exception of grade 1 constipation and a single episode of grade 4 thrombocytopenia occurring after the fourth course. After the first three courses of chemotherapy a computed tomography scan evaluation demonstrated no change; her CA19-9 levels were slightly decreased. However, after the sixth course of chemotherapy a computed tomography scan revealed a dimensional enlargement of the lung metastases; her CA19-9 levels increased. She was then treated with gemcitabine alone. After 2 months of gemcitabine monotherapy a significant regression of lung and spleen metastases, as well a CA19-9 level reduction, occurred. Eight months after the start of gemcitabine monotherapy no signs of progression were reported. Treatment of metastatic intrahepatic cholangiocarcinoma with gemcitabine as maintenance therapy after first-line chemotherapy could be continued until clear evidence of disease progression since delayed responses are possible.
Energy Efficient Digital Logic Using Nanoscale Magnetic Devices
NASA Astrophysics Data System (ADS)
Lambson, Brian James
Increasing demand for information processing in the last 50 years has been largely satisfied by the steadily declining price and improving performance of microelectronic devices. Much of this progress has been made by aggressively scaling the size of semiconductor transistors and metal interconnects that microprocessors are built from. As devices shrink to the size regime in which quantum effects pose significant challenges, new physics may be required in order to continue historical scaling trends. A variety of new devices and physics are currently under investigation throughout the scientific and engineering community to meet these challenges. One of the more drastic proposals on the table is to replace the electronic components of information processors with magnetic components. Magnetic components are already commonplace in computers for their information storage capability. Unlike most electronic devices, magnetic materials can store data in the absence of a power supply. Today's magnetic hard disk drives can routinely hold billions of bits of information and are in widespread commercial use. Their ability to function without a constant power source hints at an intrinsic energy efficiency. The question we investigate in this dissertation is whether or not this advantage can be extended from information storage to the notoriously energy-intensive task of information processing. Several proof-of-concept magnetic logic devices were proposed and tested in the past decade. In this dissertation, we build on the prior work by answering fundamental questions about how magnetic devices achieve such high energy efficiency and how they can best function in digital logic applications. The results of this analysis are used to suggest and test improvements to nanomagnetic computing devices. Two of our results are seen as especially important to the field of nanomagnetic computing: (1) we show that it is possible to operate nanomagnetic computers at the fundamental thermodynamic limits of computation and (2) we develop a nanomagnet with a unique shape that is engineered to significantly improve the reliability of nanomagnetic logic.
Modeling Effects of RNA on Capsid Assembly Pathways via Coarse-Grained Stochastic Simulation
Smith, Gregory R.; Xie, Lu; Schwartz, Russell
2016-01-01
The environment of a living cell is vastly different from that of an in vitro reaction system, an issue that presents great challenges to the use of in vitro models, or computer simulations based on them, for understanding biochemistry in vivo. Virus capsids make an excellent model system for such questions because they typically have few distinct components, making them amenable to in vitro and modeling studies, yet their assembly can involve complex networks of possible reactions that cannot be resolved in detail by any current experimental technology. We previously fit kinetic simulation parameters to bulk in vitro assembly data to yield a close match between simulated and real data, and then used the simulations to study features of assembly that cannot be monitored experimentally. The present work seeks to project how assembly in these simulations, fit to in vitro data, would be altered by computationally adding features of the cellular environment to the system, specifically the presence of nucleic acid about which many capsids assemble. The major challenge of such work is computational: simulating fine-scale assembly pathways on the scale and in the parameter domains of real viruses is far too computationally costly to allow for explicit models of nucleic acid interaction. We bypass that limitation by applying analytical models of nucleic acid effects to adjust kinetic rate parameters learned from in vitro data to see how these adjustments, singly or in combination, might affect fine-scale assembly progress. The resulting simulations exhibit surprising behavioral complexity, with distinct effects often acting synergistically to drive efficient assembly and alter pathways relative to the in vitro model. The work demonstrates how computer simulations can help us understand how assembly might differ between the in vitro and in vivo environments and what features of the cellular environment account for these differences. PMID:27244559
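A toy Gillespie-style sketch of the general idea, assuming a single growing capsid that adds or loses one subunit at a time and a simple rate-scaling factor standing in for a nucleic-acid effect on elongation; all rates, sizes, and the one-pathway growth model are illustrative assumptions, not the coarse-grained model used in the study.

```python
# Toy stochastic (Gillespie-style) capsid growth by single-subunit addition.
import math
import random

def simulate(k_on=1.0e4, k_off=1.0, monomers=1000, capsid_size=60,
             rate_factor=1.0, t_end=10.0, seed=1):
    random.seed(seed)
    t, free, size = 0.0, monomers, 1                 # one nucleus, growing capsid
    while t < t_end and size < capsid_size:
        a_on = rate_factor * k_on * free * 1e-6      # pseudo-first-order addition
        a_off = k_off if size > 1 else 0.0           # loss of the last-added subunit
        a_tot = a_on + a_off
        if a_tot == 0.0:
            break
        t += -math.log(random.random()) / a_tot      # exponential waiting time
        if random.random() < a_on / a_tot:
            size += 1; free -= 1
        else:
            size -= 1; free += 1
    return t, size

for factor in (1.0, 5.0):    # 5x stands in for nucleic-acid-enhanced elongation
    t, size = simulate(rate_factor=factor)
    print(f"rate factor {factor}: size {size} reached at t = {t:.3f}")
```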
How to Build a Quantum Computer
NASA Astrophysics Data System (ADS)
Sanders, Barry C.
2017-11-01
Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.
The implementation of AI technologies in computer wargames
NASA Astrophysics Data System (ADS)
Tiller, John A.
2004-08-01
Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much more deeply about military situations before taking action. A good AI opponent would also include the national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.
A Systematic Review of Wearable Systems for Cancer Detection: Current State and Challenges.
Ray, Partha Pratim; Dash, Dinesh; De, Debashis
2017-10-02
The rapid growth of sensor and computing platforms has given rise to wearable systems. In recent years, wearable systems have led to new applications across all medical fields. The aim of this review is to present the current state-of-the-art approaches in the field of wearable-system-based cancer detection and to identify the key challenges that prevent their clinical adoption. A total of 472 records were screened and 11 were finally included in this study. Two types of records were studied: research articles (45%) and manufactured products (55%). The review was performed per PRISMA guidelines, with consideration given to records published or reported between 2009 and 2017. The identified records covered wearable systems for four cancer types: breast cancer (36.3%), skin cancer (36.3%), prostate cancer (18.1%), and multi-type cancer (9%). Most works involved sensor-based smart systems comprising a microcontroller, a Bluetooth module, and a smart phone. A few demonstrated ultra-wideband (UWB) antenna-based wearable systems. Skin cancer detecting wearable systems were the most comprehensive. Current works are gradually progressing towards seamless integration of sensory units with smart networking, but they lack cloud computing and long-range communication paradigms. Artificial intelligence and machine learning are key capabilities that need to be integrated with current wearable systems. Further, clinical inertia, lack of awareness, and high cost are together holding back the growth of such systems. If the identified challenges are properly addressed, wearable systems could emerge as a vital alternative for future cancer detection.
Terrain Hazard Detection and Avoidance During the Descent and Landing Phase of the Altair Mission
NASA Technical Reports Server (NTRS)
Strhan, Alan L.; Johnson, Andrew E.
2010-01-01
This paper describes some of the environmental challenges associated with landing a crewed or robotic vehicle at any certified location on the lunar surface (i.e. not a mountain peak, a permanently dark crater floor or overly steep terrain), with a specific focus on how hazard detection technology may be incorporated to mitigate these challenges. For this discussion, the vehicle of interest is the Altair Lunar Lander, the vehicle element of the NASA Constellation Program aimed at returning humans to the moon. Lunar environmental challenges for such global lunar access primarily involve terrain and lighting. These include sizable rocks and slopes, which are more concentrated in highland areas; small craters, which are essentially everywhere independent of terrain type; and, for polar regions, low-angle sunlight, which leaves significant terrain in shadow. To address these issues, as well as to provide for precision landing, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was chartered by NASA Headquarters, and has since been making significant progress. The ALHAT team considered several sensors for real-time hazard detection, settling on the use of a Flash Lidar mounted to a high-speed gimbal, with computationally intense image processing and elevation interpretation software. The Altair Project has been working with the ALHAT team to understand the capabilities and limitations of their concept, and has incorporated much of the ALHAT hazard detection system into the Altair baseline design. This integration, along with open issues relating to computational performance, the need for system redundancy, and potential pilot interaction, will be explored further in this paper.
Theoretical models for coronary vascular biomechanics: Progress & challenges
Waters, Sarah L.; Alastruey, Jordi; Beard, Daniel A.; Bovendeerd, Peter H.M.; Davies, Peter F.; Jayaraman, Girija; Jensen, Oliver E.; Lee, Jack; Parker, Kim H.; Popel, Aleksander S.; Secomb, Timothy W.; Siebes, Maria; Sherwin, Spencer J.; Shipley, Rebecca J.; Smith, Nicolas P.; van de Vosse, Frans N.
2013-01-01
A key aim of the cardiac Physiome Project is to develop theoretical models to simulate the functional behaviour of the heart under physiological and pathophysiological conditions. Heart function is critically dependent on the delivery of an adequate blood supply to the myocardium via the coronary vasculature. Key to this critical function of the coronary vasculature is system dynamics that emerge via the interactions of the numerous constituent components at a range of spatial and temporal scales. Here, we focus on several components for which theoretical approaches can be applied, including vascular structure and mechanics, blood flow and mass transport, flow regulation, angiogenesis and vascular remodelling, and vascular cellular mechanics. For each component, we summarise the current state of the art in model development, and discuss areas requiring further research. We highlight the major challenges associated with integrating the component models to develop a computational tool that can ultimately be used to simulate the responses of the coronary vascular system to changing demands and to diseases and therapies. PMID:21040741
Bharucha, Ashok J.; Anand, Vivek; Forlizzi, Jodi; Dew, Mary Amanda; Reynolds, Charles F.; Stevens, Scott; Wactlar, Howard
2009-01-01
The number of older Americans afflicted by Alzheimer disease and related dementias will triple to 13 million persons by 2050, thus greatly increasing healthcare needs. An approach to this emerging crisis is the development and deployment of intelligent assistive technologies that compensate for the specific physical and cognitive deficits of older adults with dementia, and thereby also reduce caregiver burden. The authors conducted an extensive search of the computer science, engineering, and medical databases to review intelligent cognitive devices, physiologic and environmental sensors, and advanced integrated sensor networks that may find future applications in dementia care. Review of the extant literature reveals an overwhelming focus on the physical disability of younger persons with typically nonprogressive anoxic and traumatic brain injuries, with few clinical studies specifically involving persons with dementia. A discussion of the specific capabilities, strengths, and limitations of each technology is followed by an overview of research methodological challenges that must be addressed to achieve measurable progress to meet the healthcare needs of an aging America. PMID:18849532
The rise of organic electrode materials for energy storage.
Schon, Tyler B; McAllister, Bryony T; Li, Peng-Fei; Seferos, Dwight S
2016-11-07
Organic electrode materials are very attractive for electrochemical energy storage devices because they can be flexible, lightweight, low cost, benign to the environment, and used in a variety of device architectures. They are not mere alternatives to more traditional energy storage materials, rather, they have the potential to lead to disruptive technologies. Although organic electrode materials for energy storage have progressed in recent years, there are still significant challenges to overcome before reaching large-scale commercialization. This review provides an overview of energy storage systems as a whole, the metrics that are used to quantify the performance of electrodes, recent strategies that have been investigated to overcome the challenges associated with organic electrode materials, and the use of computational chemistry to design and study new materials and their properties. Design strategies are examined to overcome issues with capacity/capacitance, device voltage, rate capability, and cycling stability in order to guide future work in the area. The use of low cost materials is highlighted as a direction towards commercial realization.
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
The Abbott Districts in 2005-06: Progress and Challenges, Spring 2006
ERIC Educational Resources Information Center
Hirsch, Lesley
2006-01-01
New Jersey's urban--or "Abbott"--schools have improved at the preschool and elementary school level, but lag when it comes to middle and high school performance. These are the key findings of an Abbott Indicators Project report entitled, "The Abbott Districts in 2005-06: Progress and Challenges." The report was prepared by…
Progressive Damage and Fracture in Composites Under Dynamic Loading
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
1994-01-01
A computational simulation tool is used to evaluate the various stages of damage progression in composite materials during Iosipescu shear testing. Unidirectional composite specimens with either the major or minor material axis in the load direction are considered. Damage progression characteristics are described for each specimen using two types of boundary conditions. A procedure is outlined regarding the use of computational simulation in the testing of composite materials.
Nonlinear water waves: introduction and overview
NASA Astrophysics Data System (ADS)
Constantin, A.
2017-12-01
For more than two centuries progress in the study of water waves proved to be interdependent with innovative and deep developments in theoretical and experimental directions of investigation. In recent years, considerable progress has been achieved towards the understanding of waves of large amplitude. Within this setting one cannot rely on linear theory as nonlinearity becomes an essential feature. Various analytic methods have been developed and adapted to come to terms with the challenges encountered in settings where approximations (such as those provided by linear or weakly nonlinear theory) are ineffective. Without relying on simpler models, progress becomes contingent upon the discovery of structural properties, the exploitation of which requires a combination of creative ideas and state-of-the-art technical tools. The successful quest for structure often reveals unexpected patterns and confers aesthetic value on some of these studies. The topics covered in this issue are both multi-disciplinary and interdisciplinary: there is a strong interplay between mathematical analysis, numerical computation and experimental/field data, interacting with each other via mutual stimulation and feedback. This theme issue reflects some of the new important developments that were discussed during the programme `Nonlinear water waves' that took place at the Isaac Newton Institute for Mathematical Sciences (Cambridge, UK) from 31st July to 25th August 2017. A cross-section of the experts in the study of water waves who participated in the programme authored the collected papers. These papers illustrate the diversity, intensity and interconnectivity of the current research activity in this area. They offer new insight, present emerging theoretical methodologies and computational approaches, and describe sophisticated experimental results. This article is part of the theme issue 'Nonlinear water waves'.
Progressive Fracture of Fiber Composite Build-Up Structures
NASA Technical Reports Server (NTRS)
Gotsis, Pascal K.; Chamis, C. C.; Minnetyan, Levon
1997-01-01
Damage progression and fracture of built-up composite structures is evaluated by using computational simulation. The objective is to examine the behavior and response of a stiffened composite (0/±45/90)s6 laminate panel by simulating the damage initiation, growth, accumulation, progression and propagation to structural collapse. An integrated computer code, CODSTRAN, was augmented for the simulation of the progressive damage and fracture of built-up composite structures under mechanical loading. Results show that damage initiation and progression have a significant effect on the structural response. The influence of the type of loading on the damage initiation, propagation and final fracture of the built-up composite panel is investigated.
Progressive Fracture of Fiber Composite Build-Up Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Gotsis, Pascal K.; Chamis, C. C.
1997-01-01
Damage progression and fracture of built-up composite structures is evaluated by using computational simulation. The objective is to examine the behavior and response of a stiffened composite (0/±45/90)s6 laminate panel by simulating the damage initiation, growth, accumulation, progression and propagation to structural collapse. An integrated computer code, CODSTRAN, was augmented for the simulation of the progressive damage and fracture of built-up composite structures under mechanical loading. Results show that damage initiation and progression have a significant effect on the structural response. The influence of the type of loading on the damage initiation, propagation and final fracture of the built-up composite panel is investigated.
Computational fluid dynamics modelling in cardiovascular medicine
Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2016-01-01
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards ‘digital patient’ or ‘virtual physiological human’ representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. PMID:26512019
Benchmark Problems Used to Assess Computational Aeroacoustics Codes
NASA Technical Reports Server (NTRS)
Dahl, Milo D.; Envia, Edmane
2005-01-01
The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.
Computational understanding of Li-ion batteries
NASA Astrophysics Data System (ADS)
Urban, Alexander; Seo, Dong-Hwa; Ceder, Gerbrand
2016-03-01
Over the last two decades, computational methods have made tremendous advances, and today many key properties of lithium-ion batteries can be accurately predicted by first principles calculations. For this reason, computations have become a cornerstone of battery-related research by providing insight into fundamental processes that are not otherwise accessible, such as ionic diffusion mechanisms and electronic structure effects, as well as a quantitative comparison with experimental results. The aim of this review is to provide an overview of state-of-the-art ab initio approaches for the modelling of battery materials. We consider techniques for the computation of equilibrium cell voltages, 0-Kelvin and finite-temperature voltage profiles, ionic mobility and thermal and electrolyte stability. The strengths and weaknesses of different electronic structure methods, such as DFT+U and hybrid functionals, are discussed in the context of voltage and phase diagram predictions, and we review the merits of lattice models for the evaluation of finite-temperature thermodynamics and kinetics. With such a complete set of methods at hand, first principles calculations of ordered, crystalline solids, i.e., of most electrode materials and solid electrolytes, have become reliable and quantitative. However, the description of molecular materials and disordered or amorphous phases remains an important challenge. We highlight recent exciting progress in this area, especially regarding the modelling of organic electrolytes and solid-electrolyte interfaces.
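As a small worked example of one quantity mentioned above, the average intercalation voltage follows directly from total energies: V = -[E(Li_n Host) - E(Host) - n·E(Li metal)] / (n·e). The sketch below evaluates this with placeholder energies; they are not computed DFT values.

```python
# Average intercalation voltage from total energies (eV per formula unit).
# The numbers are placeholders, not DFT results.

def average_voltage(E_lithiated, E_delithiated, E_li_metal, n_li):
    """V = -[E(Li_n Host) - E(Host) - n*E(Li)] / (n*e); volts when energies are in eV."""
    return -(E_lithiated - E_delithiated - n_li * E_li_metal) / n_li

# Hypothetical energies for a LiMO2 / MO2 couple and bcc Li metal.
print(average_voltage(E_lithiated=-45.10, E_delithiated=-39.20,
                      E_li_metal=-1.90, n_li=1))   # prints 4.0 (volts)
```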
QM/MM free energy simulations: recent progress and challenges
Lu, Xiya; Fang, Dong; Ito, Shingo; Okamoto, Yuko; Ovchinnikov, Victor
2016-01-01
Due to the higher computational cost relative to pure molecular mechanical (MM) simulations, hybrid quantum mechanical/molecular mechanical (QM/MM) free energy simulations particularly require a careful balance of computational cost and accuracy. Here we review several recent developments in free energy methods most relevant to QM/MM simulations and discuss several topics motivated by these developments using simple but informative examples that involve processes in water. For chemical reactions, we highlight the value of invoking enhanced sampling techniques (e.g., replica exchange) in umbrella sampling calculations and the value of including collective environmental variables (e.g., hydration level) in metadynamics simulations; we also illustrate the sensitivity of string calculations, especially of the free energy along the path, to various parameters in the computation. Alchemical free energy simulations with a specific thermodynamic cycle are used to probe the effect of including the first solvation shell in the QM region when computing solvation free energies. For cases where high-level QM/MM potential functions are needed, we analyze two different approaches: the QM/MM-MFEP method of Yang and co-workers and perturbative corrections to low-level QM/MM free energy results. For the examples analyzed here, both approaches seem productive, although care needs to be exercised when analyzing the perturbative corrections. PMID:27563170
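A minimal sketch of the perturbative (Zwanzig-type) correction mentioned above, which reweights snapshots sampled on a low-level potential to estimate the free energy change to a high-level potential: ΔA = -kT ln⟨exp(-(E_high - E_low)/kT)⟩_low. The snapshot energies below are synthetic stand-ins, not QM/MM data.

```python
# Zwanzig-type perturbative correction from a low-level to a high-level potential.
import numpy as np

def fep_correction(E_low, E_high, kT=0.593):           # kT ~ 0.593 kcal/mol at 298 K
    dE = np.asarray(E_high) - np.asarray(E_low)
    dE_min = dE.min()                                   # shift for numerical stability
    return dE_min - kT * np.log(np.mean(np.exp(-(dE - dE_min) / kT)))

rng = np.random.default_rng(0)
E_low = rng.normal(0.0, 1.0, size=2000)                 # synthetic snapshot energies
E_high = E_low + rng.normal(1.5, 0.8, size=2000)        # high level differs by ~1.5
print("perturbative correction (kcal/mol):", round(fep_correction(E_low, E_high), 3))
```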
Knowledge-based computer systems for radiotherapy planning.
Kalet, I J; Paluszynski, W
1990-08-01
Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.
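To make the rule-based idea concrete, here is a toy forward-chaining example in which a few hand-written rules select a prototype beam arrangement from case facts; the rules, facts, and plan names are entirely hypothetical and are not drawn from any clinical system.

```python
# Toy rule-based selection of a prototype treatment plan (hypothetical rules).

RULES = [
    (lambda f: f["site"] == "prostate" and f.get("stage") in ("T1", "T2"),
     "four-field box prototype"),
    (lambda f: f["site"] == "prostate",
     "conformal multi-field prototype"),
    (lambda f: f["site"] == "larynx" and not f.get("nodes_involved", False),
     "parallel-opposed lateral fields prototype"),
]

def select_prototype(facts):
    for condition, plan in RULES:        # first matching rule wins
        if condition(facts):
            return plan
    return "manual planning required"

case = {"site": "prostate", "stage": "T2", "nodes_involved": False}
print(select_prototype(case))            # -> "four-field box prototype"
```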
RNA 3D Modules in Genome-Wide Predictions of RNA 2D Structure
Theis, Corinna; Zirbel, Craig L.; zu Siederdissen, Christian Höner; Anthon, Christian; Hofacker, Ivo L.; Nielsen, Henrik; Gorodkin, Jan
2015-01-01
Recent experimental and computational progress has revealed a large potential for RNA structure in the genome. This has been driven by computational strategies that exploit multiple genomes of related organisms to identify common sequences and secondary structures. However, these computational approaches have two main challenges: they are computationally expensive and they have a relatively high false discovery rate (FDR). Simultaneously, RNA 3D structure analysis has revealed modules composed of non-canonical base pairs which occur in non-homologous positions, apparently by independent evolution. These modules can, for example, occur inside structural elements which in RNA 2D predictions appear as internal loops. Hence one question is whether the use of such RNA 3D information can improve the prediction accuracy of RNA secondary structure at a genome-wide level. Here, we use RNAz in combination with 3D module prediction tools and apply them to a 13-way vertebrate sequence-based alignment. We find that RNA 3D modules predicted by metaRNAmodules and JAR3D are significantly enriched in the screened windows compared to their shuffled counterparts. The initially estimated FDR of 47.0% is lowered to below 25% when certain 3D module predictions are present in the window of the 2D prediction. We discuss the implications and prospects for further development of computational strategies for detection of RNA 2D structure in genomic sequence. PMID:26509713
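The way 3D module predictions tighten the 2D screen can be pictured as a simple overlap filter on predicted windows. The sketch below uses hypothetical window and module-hit tuples and a plain interval-overlap test; it only illustrates the filtering idea, not the RNAz/JAR3D pipeline itself.

```python
# Hedged illustration: keep a predicted 2D-structure window only when a 3D
# module prediction (e.g., a JAR3D hit) overlaps it. The tuple layouts and
# the overlap criterion are hypothetical simplifications.
def overlaps(a_start, a_end, b_start, b_end):
    return a_start < b_end and b_start < a_end

def filter_windows(rnaz_windows, module_hits):
    """rnaz_windows: list of (chrom, start, end, p_score)
    module_hits:  list of (chrom, start, end, module_id)"""
    kept = []
    for chrom, start, end, p in rnaz_windows:
        if any(c == chrom and overlaps(start, end, s, e)
               for c, s, e, _ in module_hits):
            kept.append((chrom, start, end, p))
    return kept

windows = [("chr1", 100, 220, 0.92), ("chr1", 5000, 5120, 0.81)]
hits = [("chr1", 150, 180, "kink-turn")]
print(filter_windows(windows, hits))  # keeps only the first window
```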
Karmonik, C; Bismuth, J; Shah, D J; Davies, M G; Purdy, D; Lumsden, A B
2011-08-01
Outcome prediction in DeBakey Type III aortic dissections (ADs) remains challenging. Large variations in AD morphology, physiology and treatment exist. Here, we investigate whether computational fluid dynamics (CFD) can provide an initial understanding of pressure changes in an AD computational model when covering entry and exit tears and removing the intra-arterial septum (IS). A computational mesh was constructed from magnetic resonance images from one patient (one entrance and one exit tear) and CFD simulations were performed (scenario #1). Additional meshes were derived by virtually (1) covering the exit tear (false lumen (FL) thrombus progression) (scenario #2), (2) covering the entrance tear (thoracic endovascular treatment, TEVAR) (scenario #3) and (3) completely removing the IS (fenestration) (scenario #4). Changes in flow patterns and pressures were quantified relative to the initial mesh. Systolic pressures increased for #2 (300 Pa increase), with the largest inter-luminal differences distally (2500 Pa). In #3, false lumen pressure decreased essentially to zero. In #4, systolic pressure in the combined lumen was reduced from 2400 to 800 Pa. CFD results from computational models of a DeBakey type III AD representing separate coverage of entrance and exit tears correlated with clinical experience. The reported results present a preliminary look at a complex clinical problem. Copyright © 2011 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
Progress on the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha
2015-12-01
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.
A survey of body sensor networks.
Lai, Xiaochen; Liu, Quanli; Wei, Xin; Wang, Wei; Zhou, Guoqiao; Han, Guangyi
2013-04-24
Sensor technology, pervasive computing, and intelligent information processing are widely used in Body Sensor Networks (BSNs), which are a branch of wireless sensor networks (WSNs). BSNs are playing an increasingly important role in the fields of medical treatment, social welfare and sports, and are changing the way humans use computers. Existing surveys have placed emphasis on the concept and architecture of BSNs, signal acquisition, context-aware sensing, and system technology, while this paper focuses on sensors, data fusion, and network communication. We introduce the research status of BSNs, analyze current research hotspots and future development trends, and discuss the major challenges and technical problems currently faced. Typical research projects and practical applications of BSNs are introduced as well. BSNs are progressing in the direction of multi-technology integration and intelligence. Although there are still many problems, the future of BSNs is fundamentally promising, profoundly changing human-machine relationships and improving the quality of people's lives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gehl, Michael; Gibson, Ricky; Zandbergen, Sander
Currently, superconducting qubits lead the way in potential candidates for quantum computing. This is a result of the robust nature of superconductivity and the non-linear Josephson effect which make possible many types of qubits. At the same time, transferring quantum information over long distances typically relies on the use of photons as the elementary qubit. Converting between stationary electronic qubits in superconducting systems and traveling photonic qubits is a challenging yet necessary goal for the interface of quantum computing and communication. The most promising path to achieving this goal appears to be the integration of superconductivity with optically active semiconductors, with quantum information being transferred between the two by means of the superconducting proximity effect. Obtaining good interfaces between superconductor and semiconductor is the next obvious step for improving these hybrid systems. As a result, we report on our observation of superconductivity in self-assembled indium structures grown epitaxially on the surface of semiconductor material.
A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing
Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain
2014-01-01
Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies restricting access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880
A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.
Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul
2014-01-01
Network forensics enables investigation and identification of network attacks through the retrieved digital content. The proliferation of smartphones and the cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a congenital target for network attacks. However, carrying out forensics in MCC is constrained by the autonomous cloud hosting companies and their policies restricting access to the digital content in the back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to the MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.
The Unified Medical Language System
Humphreys, Betsy L.; Lindberg, Donald A. B.; Schoolman, Harold M.; Barnett, G. Octo
1998-01-01
In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago. PMID:9452981
The Unified Medical Language System: an informatics research collaboration.
Humphreys, B L; Lindberg, D A; Schoolman, H M; Barnett, G O
1998-01-01
In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago.
Zhang, Zhuhua; Liu, Xiaofei; Yu, Jin; Hang, Yang; Li, Yao; Guo, Yufeng; Xu, Ying; Sun, Xu; Zhou, Jianxin
2016-01-01
Low‐dimensional materials exhibit many exceptional properties and functionalities which can be efficiently tuned by externally applied force or fields. Here we review the current status of research on tuning the electronic and magnetic properties of low‐dimensional carbon, boron nitride, metal‐dichalcogenide, and phosphorene nanomaterials by applied engineering strain, external electric field and interaction with substrates, with particular focus on the progress of computational methods and studies. We highlight the similarities and differences of the property modulation among one‐ and two‐dimensional nanomaterials. Recent breakthroughs in experimental demonstration of the tunable functionalities in typical nanostructures are also presented. Finally, prospects and challenges for applying the tunable properties in functional devices are discussed. WIREs Comput Mol Sci 2016, 6:324–350. doi: 10.1002/wcms.1251 PMID:27818710
A Survey of Body Sensor Networks
Lai, Xiaochen; Liu, Quanli; Wei, Xin; Wang, Wei; Zhou, Guoqiao; Han, Guangyi
2013-01-01
Sensor technology, pervasive computing, and intelligent information processing are widely used in Body Sensor Networks (BSNs), which are a branch of wireless sensor networks (WSNs). BSNs are playing an increasingly important role in the fields of medical treatment, social welfare and sports, and are changing the way humans use computers. Existing surveys have placed emphasis on the concept and architecture of BSNs, signal acquisition, context-aware sensing, and system technology, while this paper focuses on sensors, data fusion, and network communication. We introduce the research status of BSNs, analyze current research hotspots and future development trends, and discuss the major challenges and technical problems currently faced. Typical research projects and practical applications of BSNs are introduced as well. BSNs are progressing in the direction of multi-technology integration and intelligence. Although there are still many problems, the future of BSNs is fundamentally promising, profoundly changing human-machine relationships and improving the quality of people's lives. PMID:23615581
Microfabricated Microwave-Integrated Surface Ion Trap
NASA Astrophysics Data System (ADS)
Revelle, Melissa C.; Blain, Matthew G.; Haltli, Raymond A.; Hollowell, Andrew E.; Nordquist, Christopher D.; Maunz, Peter
2017-04-01
Quantum information processing holds the key to solving computational problems that are intractable with classical computers. Trapped ions are a physical realization of a quantum information system in which qubits are encoded in hyperfine energy states. Coupling the qubit states to ion motion, as needed for two-qubit gates, is typically accomplished using Raman laser beams. Alternatively, this coupling can be achieved with strong microwave gradient fields. While microwave radiation is easier to control than a laser, it is challenging to precisely engineer the radiated microwave field. Taking advantage of Sandia's microfabrication techniques, we created a surface ion trap with integrated microwave electrodes with sub-wavelength dimensions. This multi-layered device permits co-location of the microwave antennae and the ion trap electrodes to create localized microwave gradient fields and necessary trapping fields. Here, we characterize the trap design and present simulated microwave performance with progress towards experimental results. This research was funded, in part, by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA).
Toward a new culture in verified quantum operations
NASA Astrophysics Data System (ADS)
Flammia, Steve
Measuring error rates of quantum operations has become an indispensable component in any aspiring platform for quantum computation. As the quality of controlled quantum operations increases, the demands on the accuracy and precision with which we measure these error rates also grow. However, well-meaning scientists who report these error measures are faced with a sea of non-standardized methodologies and are often asked during publication for only coarse information about how their estimates were obtained. Moreover, there are serious incentives to use methodologies and measures that will continually produce numbers that improve with time to show progress. These problems will only be exacerbated as our typical error rates go from 1 in 100 to 1 in 1000 or less. This talk will survey existing challenges presented by the current paradigm and offer some suggestions for solutions that can help us move toward fair and standardized methods for error metrology in quantum computing experiments, and towards a culture that values full disclosure of methodologies and higher standards for data analysis.
Gehl, Michael; Gibson, Ricky; Zandbergen, Sander; ...
2016-02-01
Currently, superconducting qubits lead the way in potential candidates for quantum computing. This is a result of the robust nature of superconductivity and the non-linear Josephson effect which make possible many types of qubits. At the same time, transferring quantum information over long distances typically relies on the use of photons as the elementary qubit. Converting between stationary electronic qubits in superconducting systems and traveling photonic qubits is a challenging yet necessary goal for the interface of quantum computing and communication. The most promising path to achieving this goal appears to be the integration of superconductivity with optically active semiconductors, with quantum information being transferred between the two by means of the superconducting proximity effect. Obtaining good interfaces between superconductor and semiconductor is the next obvious step for improving these hybrid systems. As a result, we report on our observation of superconductivity in self-assembled indium structures grown epitaxially on the surface of semiconductor material.
A Research Agenda and Vision for Data Science
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2014-12-01
Big Data has emerged as a first-class citizen in the research community, spanning disciplines in the domain sciences. Astronomy is pushing velocity with new ground-based instruments such as the Square Kilometre Array (SKA) and its unprecedented data rates (700 TB/sec!); Earth science is pushing the boundaries of volume, with the international Intergovernmental Panel on Climate Change (IPCC), climate modeling, and remote sensing communities increasing the size of their total archives into the exabyte scale; and airborne missions from NASA such as the JPL Airborne Snow Observatory (ASO) are increasing velocity while decreasing the overall turnaround time required to receive products and to make them available to water managers and decision makers. Proteomics and the computational biology community are sequencing genomes and providing near-real-time answers to clinicians, researchers, and ultimately to patients, helping to process, understand, and create diagnoses. Data complexity is on the rise, and the norm is no longer hundreds of metadata attributes, but thousands to hundreds of thousands, including complex interrelationships between data, metadata, and knowledge. I published a vision for data science in Nature in 2013 that encapsulates four thrust areas and foci that I believe the computer science, Big Data, and data science communities need to attack over the next decade to make fundamental progress on the data volume, velocity and complexity challenges arising from the domain sciences such as those described above. These areas include: (1) rapid and unobtrusive algorithm integration; (2) intelligent and automatic data movement; (3) automated and rapid extraction of text, metadata and language from heterogeneous file formats; and (4) participation and people power via open source communities. In this talk I will revisit these four areas and describe current progress, future work and challenges ahead as we move forward in this exciting age of Data Science.
Biomolecular computing systems: principles, progress and potential.
Benenson, Yaakov
2012-06-12
The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.
NASA Astrophysics Data System (ADS)
Saghafian, Amirreza; Pitsch, Heinz
2012-11-01
A compressible flamelet/progress variable approach (CFPV) has been devised for high-speed flows. Temperature is computed from the transported total energy and tabulated species mass fractions and the source term of the progress variable is rescaled with pressure and temperature. The combustion is thus modeled by three additional scalar equations and a chemistry table that is computed in a pre-processing step. Three-dimensional direct numerical simulation (DNS) databases of reacting supersonic turbulent mixing layer with detailed chemistry are analyzed to assess the underlying assumptions of CFPV. Large eddy simulations (LES) of the same configuration using the CFPV method have been performed and compared with the DNS results. The LES computations are based on the presumed subgrid PDFs of mixture fraction and progress variable, beta function and delta function respectively, which are assessed using DNS databases. The flamelet equation budget is also computed to verify the validity of CFPV method for high-speed flows.
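The rescaling of the tabulated progress-variable source term with local pressure and temperature can be sketched as a small correction applied at lookup time. The power-law and Arrhenius-like factors and all constants below are illustrative assumptions for a generic compressible correction, not the exact form derived in the work above.

```python
# Minimal sketch of rescaling a tabulated progress-variable source term for
# compressible conditions. Functional form and constants are illustrative.
import numpy as np

def rescale_source_term(omega_tab, p, T, p_tab, T_tab,
                        pressure_exponent=2.0, activation_temperature=15000.0):
    """Correct a tabulated source term for local pressure and temperature.

    omega_tab    : source term from the precomputed chemistry table
    p, T         : local pressure [Pa] and temperature [K] from the flow solver
    p_tab, T_tab : pressure and temperature at which the table was built
    """
    pressure_factor = (p / p_tab) ** pressure_exponent
    temperature_factor = np.exp(activation_temperature * (1.0 / T_tab - 1.0 / T))
    return omega_tab * pressure_factor * temperature_factor

# Example: table built at 1 atm / 1500 K, local state at 2 atm / 1800 K
print(rescale_source_term(50.0, 2.0e5, 1800.0, 1.013e5, 1500.0))
```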
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tawhai, Merryn; Bischoff, Jeff; Einstein, Daniel R.
2009-05-01
In this article, we describe some current multiscale modeling issues in computational biomechanics from the perspective of the musculoskeletal and respiratory systems and mechanotransduction. First, we outline the necessity of multiscale simulations in these biological systems. Then we summarize challenges inherent to multiscale biomechanics modeling, regardless of the subdiscipline, followed by computational challenges that are system-specific. We discuss some of the current tools that have been utilized to aid research in multiscale mechanics simulations, and the priorities to further the field of multiscale biomechanics computation.
Useful Sensor Web Capabilities to Enable Progressive Mission Autonomy
NASA Technical Reports Server (NTRS)
Mandl, Dan
2007-01-01
This viewgraph presentation reviews using the Sensor Web capabilities as an enabling technology to allow for progressive autonomy of NASA space missions. The presentation reviews technical challenges for future missions, and some of the capabilities that exist to meet those challenges. To establish the ability of the technology to meet the challenges, experiments were conducted on three missions: Earth Observing 1 (EO-1), Cosmic Hot Interstellar Plasma Spectrometer (CHIPS) and Space Technology 5 (ST-5). These experiments are reviewed.
NASA Astrophysics Data System (ADS)
Le, Anh H.; Park, Young W.; Ma, Kevin; Jacobs, Colin; Liu, Brent J.
2010-03-01
Multiple Sclerosis (MS) is a progressive neurological disease affecting myelin pathways in the brain. Multiple lesions in the white matter can cause paralysis and severe motor disabilities in affected patients. To address the inconsistency and user-dependency of manual lesion measurement in MRI, we have proposed a 3-D automated lesion quantification algorithm to enable objective and efficient lesion volume tracking. The computer-aided detection (CAD) of MS, written in MATLAB, utilizes the K-Nearest Neighbors (KNN) method to compute the probability of lesions on a per-voxel basis. Despite the highly optimized image-processing algorithm used in CAD development, MS CAD integration and evaluation in the clinical workflow is technically challenging because the recursive nature of the algorithm requires high computation rates and memory bandwidth. In this paper, we present the development and evaluation of a computing engine for segmentation of MS lesions that uses the graphical processing unit (GPU) with MATLAB. The paper investigates the utilization of a high-end GPU for parallel computing of KNN in the MATLAB environment to improve algorithm performance. The integration is accomplished using NVIDIA's CUDA developmental toolkit for MATLAB. The results of this study will validate the practicality and effectiveness of the prototype MS CAD in a clinical setting. The GPU method may allow MS CAD to integrate rapidly into an electronic patient record or any disease-centric health care system.
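The per-voxel KNN probability computation can be sketched compactly. The following Python/scikit-learn example is a stand-in for the MATLAB/CUDA implementation described above; the feature construction (voxel intensities from co-registered MRI channels) and training labels are hypothetical.

```python
# Hedged sketch of per-voxel KNN lesion probability estimation.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def lesion_probability_map(train_features, train_labels, volume_features, k=10):
    """train_features : (n_train, n_channels) voxel intensity features
    train_labels   : (n_train,) 1 = lesion, 0 = background
    volume_features: (n_voxels, n_channels) features for the target volume
    Returns per-voxel lesion probabilities of shape (n_voxels,)."""
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(train_features, train_labels)
    return knn.predict_proba(volume_features)[:, 1]

# Tiny synthetic example: 2 channels (e.g., FLAIR, T2), 1000 target voxels
rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 2)) + np.repeat([[0, 0], [3, 3]], 250, axis=0)
y_train = np.repeat([0, 1], 250)
X_vol = rng.normal(size=(1000, 2)) + 1.5
prob = lesion_probability_map(X_train, y_train, X_vol)
print(prob.shape, float(prob.mean()))
```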
Recent progress of quantum annealing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Sei
2015-03-10
We review the recent progress of quantum annealing. Quantum annealing was proposed as a method to solve generic optimization problems. Recently a Canadian company has drawn a great deal of attention, as it has commercialized a quantum computer based on quantum annealing. Although the performance of quantum annealing is not sufficiently understood, it is likely that quantum annealing will be a practical method both on a conventional computer and on a quantum computer.
Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.
ERIC Educational Resources Information Center
Pierre, Samuel
2001-01-01
Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…
Software Systems for High-performance Quantum Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; Britt, Keith A
Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for the quantum programming and execution models, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.
Unfinished Agendas: New and Continuing Gender Challenges in Higher Education
ERIC Educational Resources Information Center
Glazer-Raymo, Judith, Ed.
2008-01-01
This revealing volume examines the current role and status of women in higher education--and suggests a direction for the future. Judith Glazer-Raymo and other distinguished scholars and administrators assess the progress of women in academe using three lenses: the feminist agenda as a work in progress, growing internal and external challenges to…
Early Childhood Education in Pakistan--Progress and Challenges
ERIC Educational Resources Information Center
Shami, Pervez A.
2009-01-01
This paper on Early Childhood Education (ECE)--a goal of Education For All (EFA)--assesses the progress of Pakistan's efforts and strategies, mainly during the post-Dakar period 2001-02 to 2005-06, and the major challenges being encountered. The information includes analysis of National Educational Data on ECE in the light of policies and plans, EFA…
Compartments, Customers, or Convergence?: Evolving Challenges to IT Progress
ERIC Educational Resources Information Center
Jackson, Gregory A.
2007-01-01
For most of the past decade, IT leaders in higher education felt on top of things. Challenges were usually comprehensible and addressable, doing reasonable things yielded reasonable results, and for the most part, IT was a substantial contributor to institutional progress. But these days, IT leaders rarely feel on top of things. Instead, some days…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Simulations & Measurements of Airframe Noise: A BANC Workshops Perspective
NASA Technical Reports Server (NTRS)
Choudhari, Meelan; Lockard, David
2016-01-01
Airframe noise corresponds to the acoustic radiation due to turbulent flow in the vicinity of airframe components such as high-lift devices and landing gears. Since 2010, the American Institute of Aeronautics and Astronautics has organized an ongoing series of workshops devoted to Benchmark Problems for Airframe Noise Computations (BANC). The BANC workshops are aimed at enabling systematic progress in the understanding and high-fidelity prediction of airframe noise via collaborative investigations that integrate computational fluid dynamics, computational aeroacoustics, and in-depth measurements targeting a selected set of canonical yet realistic configurations that advance the current state of the art in multiple respects. Unique features of the BANC workshops include: an intrinsically multi-disciplinary focus involving both fluid dynamics and aeroacoustics; a holistic rather than predictive emphasis; the concurrent, long-term evolution of experiments and simulations with a powerful interplay between the two; and a strongly integrative nature by virtue of multi-team, multi-facility, multiple-entry measurements. This paper illustrates these features in the context of the BANC problem categories and outlines some of the challenges involved and how they were addressed. A brief summary of the BANC effort, including its technical objectives, strategy, and selective outcomes thus far, is also included.
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
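The global sensitivity analysis step can be illustrated with a basic variance-based (Sobol) estimator. The pick-and-freeze Monte Carlo scheme and toy model below are a hedged sketch of how first-order indices rank uncertain inputs; they are not the authors' scramjet UQ workflow, which relies on far more expensive flow simulations.

```python
# Hedged sketch: first-order Sobol indices via a pick-and-freeze Monte Carlo
# estimator, used here to rank inputs of a cheap toy model.
import numpy as np

def first_order_sobol(model, n_samples, n_dims, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    A = rng.uniform(size=(n_samples, n_dims))
    B = rng.uniform(size=(n_samples, n_dims))
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))
    indices = np.empty(n_dims)
    for i in range(n_dims):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # freeze all inputs except the i-th
        indices[i] = np.mean(fB * (model(ABi) - fA)) / var_y
    return indices

# Toy model: only the first two of five inputs matter
def toy_model(x):
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.01 * x[:, 4]

print(np.round(first_order_sobol(toy_model, 20000, 5), 3))
```

Inputs with indices near zero contribute little output variance and are candidates for being fixed at nominal values, which is how such an analysis shrinks the stochastic dimension of the problem.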
Density functional theory in the solid state
Hasnip, Philip J.; Refson, Keith; Probert, Matt I. J.; Yates, Jonathan R.; Clark, Stewart J.; Pickard, Chris J.
2014-01-01
Density functional theory (DFT) has been used in many fields of the physical sciences, but none so successfully as in the solid state. From its origins in condensed matter physics, it has expanded into materials science, high-pressure physics and mineralogy, solid-state chemistry and more, powering entire computational subdisciplines. Modern DFT simulation codes can calculate a vast range of structural, chemical, optical, spectroscopic, elastic, vibrational and thermodynamic phenomena. The ability to predict structure–property relationships has revolutionized experimental fields, such as vibrational and solid-state NMR spectroscopy, where it is the primary method to analyse and interpret experimental spectra. In semiconductor physics, great progress has been made in the electronic structure of bulk and defect states despite the severe challenges presented by the description of excited states. Studies are no longer restricted to known crystallographic structures. DFT is increasingly used as an exploratory tool for materials discovery and computational experiments, culminating in ex nihilo crystal structure prediction, which addresses the long-standing difficult problem of how to predict crystal structure polymorphs from nothing but a specified chemical composition. We present an overview of the capabilities of solid-state DFT simulations in all of these topics, illustrated with recent examples using the CASTEP computer program. PMID:24516184
NASA Astrophysics Data System (ADS)
Schrooyen, Pierre; Chatelain, Philippe; Hillewaert, Koen; Magin, Thierry E.
2014-11-01
The atmospheric entry of spacecraft presents several challenges in simulating the aerothermal flow around the heat shield. Predicting an accurate heat flux is a complex task, especially regarding the interaction between the flow in the free stream and the erosion of the thermal protection material. To capture this interaction, a continuum approach is developed to go progressively from the region fully occupied by fluid to a receding porous medium. The volume-averaged Navier-Stokes equations are used to model both phases in the same computational domain, considering a single set of conservation laws. The porosity is itself a variable of the computation, allowing volumetric ablation to be taken into account through adequate source terms. This approach is implemented within a computational tool based on a high-order discontinuous Galerkin discretization. The multi-dimensional tool has already been validated and has demonstrated an efficient parallel implementation. Within this platform, a fully implicit method was developed to simulate multi-phase reacting flows. Numerical results to verify and validate the methodology are considered within this work. Interactions between the flow and the ablated geometry are also presented. Supported by the Fund for Research Training in Industry and Agriculture.
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...
2018-02-09
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Progressive Fracture of Fiber Composite Builtup Structures
NASA Technical Reports Server (NTRS)
Gotsis, Pascal K.; Chamis, Christos C.; Minnetyan, Levon
1996-01-01
The damage progression and fracture of builtup composite structures was evaluated by using computational simulation to examine the behavior and response of a stiffened composite [0/±45/90]s6 laminate panel subjected to a bending load. The damage initiation, growth, accumulation, progression, and propagation to structural collapse were simulated. An integrated computer code (CODSTRAN) was augmented for the simulation of the progressive damage and fracture of builtup composite structures under mechanical loading. Results showed that damage initiation and progression have a significant effect on the structural response. Also investigated was the influence of different types of bending load on the damage initiation, propagation, and final fracture of the builtup composite panel.
Clinical Information Systems – From Yesterday to Tomorrow
2016-01-01
Objectives: To review the history of clinical information systems over the past twenty-five years and project anticipated changes to those systems over the next twenty-five years. Methods: Over 250 Medline references about clinical information systems, quality of patient care, and patient safety were reviewed. Books, Web resources, and the author’s personal experience with developing the HELP system were also used. Results: There have been dramatic improvements in the use and acceptance of clinical computing systems and Electronic Health Records (EHRs), especially in the United States. Although there are still challenges with the implementation of such systems, the rate of progress has been remarkable. Over the next twenty-five years, there will remain many important opportunities and challenges. These opportunities include understanding complex clinical computing issues that must be studied, understood and optimized. Dramatic improvements in quality of care and patient safety must be anticipated as a result of the use of clinical information systems. These improvements will result from a closer involvement of clinical informaticians in the optimization of patient care processes. Conclusions: Clinical information systems and computerized clinical decision support have made contributions to medicine in the past. Therefore, by using better medical knowledge, optimized clinical information systems, and computerized clinical decision support, we will enable dramatic improvements in both the quality and safety of patient care in the next twenty-five years. PMID:27362589
Trelease, Robert B
2016-11-01
Until the late-twentieth century, primary anatomical sciences education was relatively unenhanced by advanced technology and dependent on the mainstays of printed textbooks, chalkboard- and photographic projection-based classroom lectures, and cadaver dissection laboratories. But over the past three decades, diffusion of innovations in computer technology transformed the practices of anatomical education and research, along with other aspects of work and daily life. Increasing adoption of first-generation personal computers (PCs) in the 1980s paved the way for the first practical educational applications, and visionary anatomists foresaw the usefulness of computers for teaching. While early computers lacked high-resolution graphics capabilities and interactive user interfaces, applications with video discs demonstrated the practicality of programming digital multimedia linking descriptive text with anatomical imaging. Desktop publishing established that computers could be used for producing enhanced lecture notes, and commercial presentation software made it possible to give lectures using anatomical and medical imaging, as well as animations. Concurrently, computer processing supported the deployment of medical imaging modalities, including computed tomography, magnetic resonance imaging, and ultrasound, that were subsequently integrated into anatomy instruction. Following its public birth in the mid-1990s, the World Wide Web became the ubiquitous multimedia networking technology underlying the conduct of contemporary education and research. Digital video, structural simulations, and mobile devices have been more recently applied to education. Progressive implementation of computer-based learning methods interacted with waves of ongoing curricular change, and such technologies have been deemed crucial for continuing medical education reforms, providing new challenges and opportunities for anatomical sciences educators. Anat Sci Educ 9: 583-602. © 2016 American Association of Anatomists.
Computer Technology Standards of Learning for Virginia's Public Schools
ERIC Educational Resources Information Center
Virginia Department of Education, 2005
2005-01-01
The Computer/Technology Standards of Learning identify and define the progressive development of essential knowledge and skills necessary for students to access, evaluate, use, and create information using technology. They provide a framework for technology literacy and demonstrate a progression from physical manipulation skills for the use of…
NASA Astrophysics Data System (ADS)
Narayanaswami, Chandra; Raghunath, Mandayam T.
2004-09-01
We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.
Synthetic Analog and Digital Circuits for Cellular Computation and Memory
Purcell, Oliver; Lu, Timothy K.
2014-01-01
Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536
Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping
2014-01-01
EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system. PMID:24917804
Zao, John K; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping
2014-01-01
EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system.
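A minimal sketch can convey the multi-tier idea: the front-end device streams samples to a near-end fog node, which buffers them and periodically ships compact batches to the far-end cloud tier for heavier processing. The class name, batch size, and upload callback below are hypothetical, not the pilot system's API.

```python
# Hedged sketch of a fog-tier relay that batches EEG samples for the cloud tier.
import time
from typing import Callable, List, Sequence

class FogEEGRelay:
    def __init__(self, upload: Callable[[dict], None], batch_size: int = 256):
        self.upload = upload          # e.g., an HTTPS POST to the cloud tier
        self.batch_size = batch_size
        self.buffer: List[List[float]] = []

    def on_sample(self, channels: Sequence[float]) -> None:
        """Called by the headset driver for every multi-channel sample."""
        self.buffer.append(list(channels))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if not self.buffer:
            return
        batch = {"timestamp": time.time(), "samples": self.buffer}
        self.upload(batch)            # cloud tier runs the heavier models
        self.buffer = []

# Usage: print instead of a real cloud call
relay = FogEEGRelay(upload=lambda b: print("uploaded", len(b["samples"])), batch_size=4)
for i in range(10):
    relay.on_sample([0.1 * i, -0.1 * i])
relay.flush()
```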
Ahmed, N; Zheng, Ziyi; Mueller, K
2012-12-01
Due to the inherent characteristics of the visualization process, most of the problems in this field have strong ties with human cognition and perception. This makes the human brain and sensory system the only truly appropriate platform for evaluating and fine-tuning a new visualization method or paradigm. However, getting humans to volunteer for these purposes has always been a significant obstacle, and thus this phase of the development process has traditionally formed a bottleneck, slowing down progress in visualization research. We propose to take advantage of the newly emerging field of Human Computation (HC) to overcome these challenges. HC promotes the idea that rather than considering humans as users of the computational system, they can be made part of a hybrid computational loop consisting of traditional computation resources and the human brain and sensory system. This approach is particularly successful in cases where part of the computational problem is considered intractable using known computer algorithms but is trivial to common-sense human knowledge. In this paper, we focus on HC from the perspective of solving visualization problems and also outline a framework by which humans can be easily seduced to volunteer their HC resources. We introduce a purpose-driven game titled "Disguise" which serves as a prototypical example of how the evaluation of visualization algorithms can be mapped into a fun and addicting activity, allowing this task to be accomplished in an extensive yet cost-effective way. Finally, we sketch out a framework that transcends the pure evaluation of existing visualization methods and extends to the design of new ones.
Interdisciplinary and physics challenges of network theory
NASA Astrophysics Data System (ADS)
Bianconi, Ginestra
2015-09-01
Network theory has unveiled the underlying structure of complex systems such as the Internet or the biological networks in the cell. It has identified universal properties of complex networks, and the interplay between their structure and dynamics. After almost twenty years of the field, new challenges lie ahead. These challenges concern the multilayer structure of most of the networks, the formulation of a network geometry and topology, and the development of a quantum theory of networks. Making progress on these aspects of network theory can open new venues to address interdisciplinary and physics challenges including progress on brain dynamics, new insights into quantum technologies, and quantum gravity.
Computer Graphics Research Laboratory Quarterly Progress Report Number 49, July-September 1993
1993-11-22
[Excerpt from the report's table of contents and section summaries] Texture Sampling and Strength Guided Motion: Jeffry S. Nimeroff. Radiosity: Min-Zhi Shao. Blended Shape Primitives: Douglas DeCarlo. Topics include extensions of radiosity rendering, a discussion of blended shape primitives and their applications in computer vision, and an improved version of the radiosity renderer that uses a fast over-relaxation progressive refinement algorithm.
Johnston, W S; Mendelson, Y
2006-01-01
Despite steady progress in the miniaturization of pulse oximeters over the years, significant challenges remain, since advanced signal processing must be implemented efficiently in real time by a relatively small wearable device. The goal of this study was to investigate several potential digital signal processing algorithms for computing arterial oxygen saturation (SpO2) and heart rate (HR) in a battery-operated wearable reflectance pulse oximeter that is being developed in our laboratory for use by medics and first responders in the field. We found that a differential measurement approach, combined with a low-pass filter (LPF), yielded the most suitable signal processing technique for estimating SpO2, while a signal derivative approach produced the most accurate HR measurements.
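For orientation, the kind of processing involved can be sketched with the classic ratio-of-ratios SpO2 estimate combined with low-pass filtering and peak detection for HR. The calibration line, sampling rate, and synthetic signals below are illustrative assumptions, not the algorithms or calibration of the device described above.

```python
# Hedged sketch: ratio-of-ratios SpO2 plus low-pass filtering and peak-based HR.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 100.0  # assumed sampling rate [Hz]

def lowpass(x, cutoff_hz=5.0, fs=FS, order=4):
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, x)

def spo2_and_hr(red, infrared, fs=FS):
    red_f, ir_f = lowpass(red), lowpass(infrared)
    # Ratio of ratios: (AC/DC)_red over (AC/DC)_ir
    r = (np.std(red_f) / np.mean(red_f)) / (np.std(ir_f) / np.mean(ir_f))
    spo2 = 110.0 - 25.0 * r            # illustrative empirical calibration only
    peaks, _ = find_peaks(ir_f - np.mean(ir_f), distance=fs * 0.4)
    hr = 60.0 * fs / np.median(np.diff(peaks))
    return spo2, hr

# Synthetic 10 s photoplethysmogram at ~75 beats per minute
t = np.arange(0, 10, 1 / FS)
ir = 1.0 + 0.02 * np.sin(2 * np.pi * 1.25 * t)
red = 1.0 + 0.012 * np.sin(2 * np.pi * 1.25 * t)
print(spo2_and_hr(red, ir))
```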
The need and potential for building an integrated knowledge-base of the Earth-Human system
NASA Astrophysics Data System (ADS)
Jacobs, Clifford
2011-03-01
The pursuit of scientific understanding is increasingly based on interdisciplinary research. Understanding the planet and its interactions more deeply requires a progressively more holistic approach, exploring knowledge from all scientific and engineering disciplines including, but not limited to, biology, chemistry, computer science, geosciences, materials science, mathematics, physics, cyberinfrastructure, and the social sciences. Nowhere is such an approach more critical than in the study of global climate change, in which one of the major challenges is the development of next-generation Earth System Models that include coupled and interactive representations of ecosystems, agricultural working lands and forests, urban environments, biogeochemistry, atmospheric chemistry, ocean and atmospheric currents, the water cycle, land ice, and human activities.
Chang, Wei-Ting; Hsieh, Tung-Han; Liu, Ming-Fei
2013-04-01
Systemic lupus erythematosus (SLE) poses great difficulty in making an early diagnosis in elderly males, often presenting with atypical manifestations. Acute onset of empyematous pleural effusion has rarely been seen. Herein, we report a 66-year-old man with SLE presenting with rapid progression of bilateral pleural effusion. Diagnostic thoracocentesis disclosed neutrophil-predominant exudates and chest computed tomography revealed multiple loculated pleural effusions. Nevertheless, optimal antibiotic therapy plus surgical decortication of the pleura did not improve his condition. The diagnosis of SLE was readily established after LE cells were accidentally found in the pleural effusion. Large amounts of pleural effusion subsided soon after high dose corticosteroid therapy. Copyright © 2011. Published by Elsevier B.V.
Progress in atherosclerotic plaque imaging
Soloperto, Giulia; Casciaro, Sergio
2012-01-01
Cardiovascular diseases are the primary cause of mortality in the industrialized world, and arterial obstruction, triggered by rupture-prone atherosclerotic plaques, leads to myocardial infarction and cerebral stroke. Vulnerable plaques do not necessarily occur with flow-limiting stenosis; thus, conventional luminographic assessment of the pathology fails to identify unstable lesions. In this review we discuss the currently available imaging modalities used to investigate morphological features and biological characteristics of the atherosclerotic plaque. The different imaging modalities such as ultrasound, magnetic resonance imaging, computed tomography, nuclear imaging and their intravascular applications are illustrated, highlighting their specific diagnostic potential. Clinically available and upcoming methodologies are also reviewed along with the related challenges in their clinical translation, concerning the specific invasiveness, accuracy and cost-effectiveness of these methods. PMID:22937215
Transmission and visualization of large geographical maps
NASA Astrophysics Data System (ADS)
Zhang, Liqiang; Zhang, Liang; Ren, Yingchao; Guo, Zhifeng
Transmission and visualization of large geographical maps have become a challenging research issue in GIS applications. This paper presents an efficient and robust way to simplify large geographical maps using frame buffers and Voronoi diagrams. The topological relationships are preserved during simplification by removing the Voronoi diagram's self-overlapping regions. From the simplified vector maps, we establish levels of detail (LOD) models of these maps. We then introduce a client/server architecture that integrates our out-of-core algorithm, progressive transmission, and a rendering scheme based on computer graphics hardware. The architecture allows viewers to explore different regions interactively at different LODs over the network. Experimental results show that the proposed scheme provides an effective way to transmit and manipulate large maps efficiently.
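The authors' pipeline relies on frame buffers and Voronoi diagrams to preserve topology, which is not reproduced here. Purely as an illustration of the level-of-detail idea that underlies progressive transmission, the sketch below builds a coarse-to-fine pyramid of a polyline with the classic Douglas-Peucker simplification; the tolerances and the test "coastline" are arbitrary assumptions and this is not the paper's algorithm.

import math

def _point_line_dist(p, a, b):
    # Perpendicular distance from point p to the line through a and b.
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    # Classic recursive polyline simplification (always keeps endpoints).
    if len(points) < 3:
        return list(points)
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], tol)
    right = douglas_peucker(points[idx:], tol)
    return left[:-1] + right

def build_lods(polyline, tolerances=(0.5, 2.0, 8.0)):
    # Precompute a coarse-to-fine pyramid; a server would stream the
    # coarsest LOD first and progressively refine it on the client.
    return {tol: douglas_peucker(polyline, tol) for tol in tolerances}

coastline = [(x, math.sin(x / 3.0) * 5.0 + (x % 7) * 0.3) for x in range(0, 200)]
for tol, lod in sorted(build_lods(coastline).items(), reverse=True):
    print(f"tolerance {tol:>4}: {len(lod):>4} vertices")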
Parallel Programming Strategies for Irregular Adaptive Applications
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
Achieving scalable performance for dynamic irregular applications is eminently challenging. Traditional message-passing approaches have been making steady progress towards this goal; however, they suffer from complex implementation requirements. The use of a global address space greatly simplifies the programming task, but can degrade the performance for such computations. In this work, we examine two typical irregular adaptive applications, Dynamic Remeshing and N-Body, under competing programming methodologies and across various parallel architectures. The Dynamic Remeshing application simulates flow over an airfoil, and refines localized regions of the underlying unstructured mesh. The N-Body experiment models two neighboring Plummer galaxies that are about to undergo a merger. Both problems demonstrate dramatic changes in processor workloads and interprocessor communication with time; thus, dynamic load balancing is a required component.
ERIC Educational Resources Information Center
Gaines, Gale F.
2004-01-01
Is your state making progress toward having a high-quality teacher in every classroom? How will you know? One indicator of progress in the Challenge to Lead goals refers to teacher compensation: "Salaries, benefits and incentives are competitive in the marketplace. They are aimed at recognized expertise, student performance, state needs and taking…
ERIC Educational Resources Information Center
Southern Regional Education Board (SREB), 2014
2014-01-01
"Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP [National Assessment of Educational…
ERIC Educational Resources Information Center
Southern Regional Education Board (SREB), 2014
2014-01-01
"Taking Stock and Pushing Forward" (2014) reports states' progress toward the Challenge to Lead 2020 Goals for Education. State-specific documents report on student achievement as well as essential state policies to improve it. Among the many metrics: how states are improving achievement on NAEP, high school graduation rates, and access…
NASA Astrophysics Data System (ADS)
Valencia, J.; Muñoz-Nieto, A.; Rodriguez-Gonzalvez, P.
2015-02-01
3D virtual modeling, visualization, dissemination and management of urban areas is one of the most exciting challenges that geomatics must face in the coming years. This paper reviews, compares and analyzes the emerging technologies, policies and software tools for managing urban 3D information. It is assumed that the third dimension increases the quality of the model provided, allowing new approaches to urban planning and to the conservation and management of architectural and archaeological areas. Although displaying 3D urban environments is now a largely solved problem, geomatics still faces several challenges in the near future. The first is displaying georeferenced linked information. Another is meeting the technical requirements for showing this georeferenced information in real time. Are available software tools ready for this challenge? Are they suitable for providing the services required in smart cities? Throughout this paper, many practical examples that require 3D georeferenced information and linked data are presented. Advances in 3D spatial databases and in software being developed to turn rendered virtual environments into environments enriched with linked information are also analyzed. Finally, the standards that the Open Geospatial Consortium has adopted and developed for three-dimensional geographic information are reviewed, with particular emphasis on KML, LandXML, CityGML and the new IndoorGML.
Progress on the FabrIc for Frontier Experiments project at Fermilab
Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...
2015-12-23
The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments with varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high-throughput computing, data management, database access, and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating several services into experiment computing operations, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion to additional projects are discussed. FIFE has taken a leading role in defining the computing model for Fermilab experiments, has aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high-throughput computing for future physics experiments worldwide.
NASA Astrophysics Data System (ADS)
Fritze, Matthew D.
Fluid-structure interaction (FSI) modeling of spacecraft parachutes involves a number of computational challenges. The canopy complexity created by the hundreds of gaps and slits, and the design-related modification of that geometric porosity by removal of some of the sails and panels, are among the formidable challenges. Disreefing from one stage to another when the parachute is used in multiple stages is another formidable challenge. This thesis addresses the computational challenges involved in the disreefing of spacecraft parachutes and in the fully-open and reefed stages of parachutes with modified geometric porosity. The special techniques developed to address these challenges are described and the FSI computations are reported. The thesis also addresses the modeling and computational challenges involved in the very early stages, where the sudden separation of a cover jettisoned into the spacecraft wake needs to be modeled. Higher-order temporal representations used in modeling the separation motion are described, and the computed separation and wake-induced forces acting on the cover are reported.
Nelson, Michelle; Salguero, Francisco J; Dean, Rachel E; Ngugi, Sarah A; Smither, Sophie J; Atkins, Timothy P; Lever, Mark S
2014-01-01
Glanders and melioidosis are caused by two distinct Burkholderia species and have generally been considered to have similar disease progression. While both of these pathogens are HHS/CDC Tier 1 agents, natural infection with both pathogens is primarily through skin inoculation. The common marmoset (Callithrix jacchus) was used to compare disease following experimental subcutaneous challenge. Acute, lethal disease was observed in marmosets following challenge with between 26 and 1.2 × 10^8 cfu of Burkholderia pseudomallei within 22–85 h. The reproducibility and progression of the disease were assessed following a challenge of 1 × 10^2 cfu of B. pseudomallei. Melioidosis was characterised by high levels of bacteraemia, focal microgranulomas progressing to non-necrotic multifocal solid lesions in the livers and spleens, and multi-organ failure. Lethal disease was observed in 93% of animals challenged with Burkholderia mallei, occurring between 5 and 10.6 days. Following challenge with 1 × 10^2 cfu of B. mallei, glanders was characterised by lymphatic spread of the bacteria and non-necrotic, multifocal solid lesions progressing to a multifocal lesion with severe necrosis and pneumonia. The experimental results confirmed that the disease pathology and presentation are strikingly different between the two pathogens. The marmoset provides a model of the human syndrome for both diseases, facilitating the development of medical countermeasures. PMID:25477002
Ancient Biomolecules and Evolutionary Inference.
Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando; Welker, Frido; Pedersen, Mikkel Winther; Allentoft, Morten E; de Barros Damgaard, Peter; Gutenbrunner, Petra; Dunne, Julie; Hammann, Simon; Roffet-Salque, Mélanie; Ilardo, Melissa; Moreno-Mayar, J Víctor; Wang, Yucheng; Sikora, Martin; Vinner, Lasse; Cox, Jürgen; Evershed, Richard P; Willerslev, Eske
2018-04-25
Over the last decade, studies of ancient biomolecules-particularly ancient DNA, proteins, and lipids-have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleotide and amino acid sequences, as well as lipid signatures, from progressively older samples, originating from geographic areas and depositional environments that, until recently, were regarded as hostile to long-term preservation of biomolecules. Sampling frequencies and the spatial and temporal scope of studies have also increased markedly, and with them the size and quality of the data sets generated. This progress has been made possible by continuous technical innovations in analytical methods, enhanced criteria for the selection of ancient samples, integrated experimental methods, and advanced computational approaches. Here, we discuss the history and current state of ancient biomolecule research, its applications to evolutionary inference, and future directions for this young and exciting field. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
YU, YI-SHAN; LI, WAN-HU; LI, MING-HUAN; MENG, XUE; KONG, LI; YU, JIN-MING
2015-01-01
Bone metastases are common in prostate cancer. However, differentiating neoplastic from non-neoplastic alterations of bone on images is challenging. In the present report, a rare case of bone marrow reconversion on magnetic resonance imaging (MRI) assessment, which may lead to a false-positive diagnosis of disease progression of bone metastases in hormone-resistant prostate cancer, is presented. Furthermore, a review of the literature regarding the pitfalls of images for response assessment, including the ‘flare’ phenomenon on bone scintigraphy, computed tomography (CT), positron emission tomography/CT and marrow reconversion on MRI is also provided. These inaccuracies, which may lead to a premature termination of an efficacious treatment, should be carefully considered by the radiologists and oncologists involved in clinical trials. The case reported in the present study showed how to assess the early therapeutic response and select the appropriate treatment for the patient when these pitfalls are encountered on clinical images. PMID:26788174
Advanced Imaging in Osteoarthritis
Li, Qi; Amano, Keiko; Link, Thomas M.; Ma, C. Benjamin
2016-01-01
Context: Radiography is widely accepted as the gold standard for diagnosing osteoarthritis (OA), but it has limitations when assessing early stage OA and monitoring progression. While there are improvements in the treatment of OA, the challenge is early recognition. Evidence Acquisition: MEDLINE and PubMed as well as professional orthopaedic and imaging websites were reviewed from 2006 to 2016. Study Design: Clinical review. Level of Evidence: Level 4. Results: Magnetic resonance imaging (MRI) can provide the most comprehensive assessment of joint injury and OA with the advantages of being noninvasive and multiplanar with excellent soft tissue contrast. However, MRI is expensive, time consuming, and not widely used for monitoring OA clinically. Computed tomography (CT) and CT arthrography (CTA) can also be used to evaluate OA, but these are also invasive and require radiation exposure. Ultrasound is particularly useful for evaluation of synovitis but not for progression of OA. Conclusion: MRI, CT, and CTA are available for the diagnosis and monitoring of OA. Improvement in techniques and decrease in cost can allow some of these modalities to be effective methods of detecting early OA. PMID:27510507
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The EV Everywhere Workplace Charging Challenge aims to have 500 U.S. employers offering workplace charging by 2018. These reports describe the progress made in the Challenge. In 2015, the Workplace Charging Challenge celebrated a major milestone – it reached the halfway point to its goal of 500 Challenge partners committed to installing workplace charging by 2018. More than 250 employers have joined as Challenge partners and the installation of workplace charging as a sustainable business practice is growing across the country. Their efforts have resulted in more than 600 workplaces with over 5,500 charging stations accessible to nearly one million employees. In 2015, more than 9,000 PEV-driving employees charged at these worksites on a regular basis. Our Workplace Charging Challenge Mid-Program Review reports this progress and other statistics related to workplace charging, including employee satisfaction and charger usage.
ERIC Educational Resources Information Center
Darabi, Aubteen; Nelson, David W.; Meeker, Richard; Liang, Xinya; Boulware, Wilma
2010-01-01
In a diagnostic problem solving operation of a computer-simulated chemical plant, chemical engineering students were randomly assigned to two groups: one studying product-oriented worked examples, the other practicing conventional problem solving. Effects of these instructional strategies on the progression of learners' mental models were examined…
Progress in computational toxicology.
Ekins, Sean
2014-01-01
Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition, various publications that use machine learning methods are highlighted. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amount of data for defined toxicology endpoints has enabled machine learning models that are increasingly used for predictions. It is shown that, across many different models, Bayesian and SVM methods perform similarly based on cross-validation data. Considerable progress has been made in computational toxicology in a decade, in both model development and the availability of larger-scale or 'big data' models. Future efforts in toxicology data generation will likely provide hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
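The review compares Bayesian and SVM models on previously published toxicology datasets, which are not reproduced here. A minimal, hypothetical sketch of such a cross-validated comparison on synthetic binary fingerprint data, using scikit-learn (with BernoulliNB standing in for the Bayesian models), is shown below; the dataset, endpoint and any numbers it prints are illustrative only.

import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for a toxicology dataset: 500 compounds described by
# 1024-bit binary fingerprints, with a toxic/non-toxic label that depends
# weakly on a handful of hypothetical "structural alert" bits.
n_compounds, n_bits = 500, 1024
X = rng.integers(0, 2, size=(n_compounds, n_bits))
alert_bits = rng.choice(n_bits, size=10, replace=False)
y = (X[:, alert_bits].sum(axis=1) + rng.normal(0, 1, n_compounds) > 5).astype(int)

models = {
    "Bayesian (Bernoulli naive Bayes)": BernoulliNB(),
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0, gamma="scale"),
}

# Five-fold cross-validation, reporting mean ROC AUC for each model,
# analogous to the cross-validation comparisons described in the review.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name:35s} AUC = {scores.mean():.3f} +/- {scores.std():.3f}")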
Barth, Patrick; Senes, Alessandro
2016-06-07
The computational design of α-helical membrane proteins is still in its infancy but has already made great progress. De novo design allows stable, specific and active minimal oligomeric systems to be obtained. Computational reengineering can improve the stability and function of naturally occurring membrane proteins. Currently, the major hurdle for the field is the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress.
NASA Technical Reports Server (NTRS)
Biegel, Bryan A.
1999-01-01
We are on the path to meet the major challenges ahead for TCAD (technology computer aided design). The emerging computational grid will ultimately solve the challenge of limited computational power. The Modular TCAD Framework will solve the TCAD software challenge once TCAD software developers realize that there is no other way to meet industry's needs. The modular TCAD framework (MTF) also provides the ideal platform for solving the TCAD model challenge by rapid implementation of models in a partial differential solver.
NASA Astrophysics Data System (ADS)
Neighbour, Gordon
2013-04-01
In 2012, Computing and Information Technology was disapplied from the English National Curriculum and therefore no longer has a compulsory programme of study. Data logging and data modelling are still essential components of the curriculum in the Computing and Information Technology classroom. Once the students have mastered the basics of both spreadsheet and information handling software, they need to be further challenged. All too often the data used for data-logging and data-handling is not realistic enough to really challenge very able students. However, using data from seismology allows students to manipulate "real" data and enhances their experience of geo-science, developing their skills and then allowing them to build on this work in both the science and geography classroom. This new scheme of work, "Seismology at School", has allowed the students to work and develop skills beyond those normally expected for their age group and has allowed them to better appreciate their learning experience of "Natural Hazards" in the science and geography classroom in later years. The students undertake research to help them develop their understanding of earthquakes. This includes using materials from other nations within the European Economic Area, which also develops and challenges their use of Modern Foreign Languages. They are then challenged to create their own seismometers using simple kits and 'free' software - this "problem-solving" approach to their work is designed to enhance team-work and to extend the challenge they experience in the classroom. The students are then asked to manipulate a "real" data set using international earthquake data from the most recent whole year. This allows the students to make use of many of the analytical and statistical functions of both spreadsheet software and information handling software in a meaningful way. The students need to develop a hypothesis that their work either supports or refutes. They are required to document their progress throughout the project and submit their work as an electronic portfolio for marking, which challenges their organisational abilities. Finally, through the project it is hoped to develop and extend partnerships with other schools in the European Economic Area so that the students can work with students from these areas to further appreciate the teaching of "Natural Hazards" in other cultures within the EEA.
Synthetic biology as it relates to CAM photosynthesis: challenges and opportunities.
DePaoli, Henrique C; Borland, Anne M; Tuskan, Gerald A; Cushman, John C; Yang, Xiaohan
2014-07-01
To meet future food and energy security needs, which are amplified by increasing population growth and reduced natural resource availability, metabolic engineering efforts have moved from manipulating single genes/proteins to introducing multiple genes and novel pathways to improve photosynthetic efficiency in a more comprehensive manner. Biochemical carbon-concentrating mechanisms such as crassulacean acid metabolism (CAM), which improves photosynthetic, water-use, and possibly nutrient-use efficiency, represent a strategic target for synthetic biology to engineer more productive C3 crops for a warmer and drier world. One key challenge for introducing multigene traits like CAM onto a background of C3 photosynthesis is to gain a better understanding of the dynamic spatial and temporal regulatory events that underpin photosynthetic metabolism. With the aid of systems and computational biology, vast amounts of experimental data encompassing transcriptomics, proteomics, and metabolomics can be related in a network to create dynamic models. Such models can undergo simulations to discover key regulatory elements in metabolism and suggest strategic substitution or augmentation by synthetic components to improve photosynthetic performance and water-use efficiency in C3 crops. Another key challenge in the application of synthetic biology to photosynthesis research is to develop efficient systems for multigene assembly and stacking. Here, we review recent progress in computational modelling as applied to plant photosynthesis, with attention to the requirements for CAM, and recent advances in synthetic biology tool development. Lastly, we discuss possible options for multigene pathway construction in plants with an emphasis on CAM-into-C3 engineering. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
The IS-GEO RCN: Fostering Collaborations for Intelligent Systems Research to Support Geosciences
NASA Astrophysics Data System (ADS)
Gil, Y.; Pierce, S. A.
2016-12-01
Geoscience problems are complex and often involve data that changes across space and time. Frequently geoscience knowledge and understanding provides valuable information and insight for problems related to energy, water, climate, mineral resources, and our understanding of how the Earth evolves through time. Simultaneously, many grand challenges in the geosciences cannot be addressed without the aid of computational support and innovations. Intelligent and Information Systems (IS) research includes a broad range of computational methods and topics such as knowledge representation, information integration, machine learning, robotics, adaptive sensors, and intelligent interfaces. IS research has a very important role to play in accelerating the speed of scientific discovery in geosciences and thus in solving challenges in geosciences. Many aspects of geosciences (GEO) research pose novel open problems for intelligent systems researchers. To develop intelligent systems with sound knowledge of theory and practice, it is important that GEO and IS experts collaborate. The EarthCube Research Coordination Network for Intelligent Systems for Geosciences (IS-GEO RCN) represents an emerging community of interdisciplinary researchers producing fundamental new capabilities for understanding Earth systems. Furthermore, the educational component aims to identify new approaches to teaching students in this new interdisciplinary area, seeking to raise a new generation of scientists that are better able to apply IS methods and tools to geoscience challenges of the future. By providing avenues for IS and GEO researchers to work together, the IS-GEO RCN will serve as both a point of contact, as well as an avenue for educational outreach across the disciplines for the nascent community of research and practice. The initial efforts are focused on connecting the communities in ways that help researchers understand opportunities and challenges that can benefit from IS-GEO collaborations. The IS-GEO RCN will jumpstart interdisciplinary research collaborations in this emerging new area so that progress across both disciplines can be accelerated.
ERIC Educational Resources Information Center
Balfanz, Robert; Bridgeland, John M.; Fox, Joanna Hornig; Moore, Laura A.
2011-01-01
America continues to make progress in meeting its high school dropout challenge. Leaders in education, government, nonprofits and business have awakened to the individual, social and economic costs of the dropout crisis and are working together to solve it. This year, all states, districts, and schools are required by law to calculate high school…
Synthetic analog and digital circuits for cellular computation and memory.
Purcell, Oliver; Lu, Timothy K
2014-10-01
Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
Exploring the challenges faced by polytechnic students
NASA Astrophysics Data System (ADS)
Matore, Mohd Effendi @ Ewan Mohd; Khairani, Ahmad Zamri
2015-02-01
This study aims to identify challenges beyond those previously documented among students in seven polytechnics in Malaysia, as a continuation of earlier research that identified 52 main challenges faced by students using the Rasch Model. This explorative study focuses on challenges not included in the Mooney Problem Checklist (MPCL). A total of 121 polytechnic students submitted 183 written responses to the open questions provided, and 252 students responded to dichotomous questions about their views on the challenges they face. The data were analysed qualitatively using NVivo 8.0. The findings showed that students from Politeknik Seberang Perai (PSP) gave the highest number of responses, 56 (30.6%), while Politeknik Metro Kuala Lumpur (PMKL) gave the lowest, 2 (1.09%). Five dominant challenges were identified: the English language (32, 17.5%), learning (14, 7.7%), vehicles (13, 7.1%), information and communication technology (ICT) (13, 7.1%), and peers (11, 6.0%). This article, however, focuses on three of these challenges, namely English language, vehicles, and computers and ICT, as the challenges of learning and peers had already been analysed in the previous MPCL work. The English language challenge concerned weaknesses in speech and fluency. The computer and ICT challenge covered weaknesses in mastering ICT and computers, as well as computer breakdowns and low-performance computers. The vehicle challenge emphasized the unavailability of vehicles to attend lectures and travel elsewhere, the lack of a transportation service at the polytechnic, and not having a valid driving licence. These challenges are highly relevant and need to be addressed to prepare polytechnics for their ongoing transformation.
Challenges Facing Design and Analysis Tools
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)
2001-01-01
The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which, when coupled with current CPU power, enable new insight through high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.
Computational fluid dynamics modelling in cardiovascular medicine.
Morris, Paul D; Narracott, Andrew; von Tengg-Kobligk, Hendrik; Silva Soto, Daniel Alejandro; Hsiao, Sarah; Lungu, Angela; Evans, Paul; Bressloff, Neil W; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2016-01-01
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems, which increasingly is being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific (incorporating data unique to the individual) and multi-scale (combining models of different length- and time-scales) modelling enables individualised risk prediction and virtual treatment planning. This represents a significant departure from traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Challenges in Scale-Resolving Simulations of turbulent wake flows with coherent structures
NASA Astrophysics Data System (ADS)
Pereira, Filipe S.; Eça, Luís; Vaz, Guilherme; Girimaji, Sharath S.
2018-06-01
The objective of this work is to investigate the challenges encountered in Scale-Resolving Simulations (SRS) of turbulent wake flows driven by spatially-developing coherent structures. SRS of practical interest are expressly intended for efficiently computing such flows by resolving only the most important features of the coherent structures and modelling the remainder as a stochastic field. The success of SRS methods depends upon three important factors: i) the ability to identify the key flow mechanisms responsible for the generation of coherent structures; ii) determining the optimum range of resolution required to adequately capture the key elements of the coherent structures; and iii) ensuring that the modelled part consists almost exclusively of fully-developed stochastic turbulence. This study considers the canonical case of the flow around a circular cylinder to address the aforementioned three key issues. It is first demonstrated using experimental evidence that the vortex-shedding instability and flow-structure development involve four important stages. A series of SRS computations of progressively increasing resolution (decreasing cut-off length) is performed. An a priori basis for locating the origin of coherent-structure development is proposed and examined. The criterion is based on the fact that the coherent structures are generated by the Kelvin-Helmholtz (KH) instability. The most important finding is that the key aspects of coherent structures can be resolved only if the effective computational Reynolds number (based on total viscosity) exceeds the critical value of the KH instability in laminar flows. Finally, a quantitative criterion assessing the nature of the unresolved field based on the strain-rate ratio of mean and unresolved fields is examined. The two proposed conditions and rationale offer a quantitative basis for developing "good practice" guidelines for SRS of complex turbulent wake flows with coherent structures.
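The abstract does not state the exact definition of the effective computational Reynolds number. On the common assumption that it is formed with the total (molecular plus modelled eddy) viscosity, the resolution criterion described above can be sketched in LaTeX as follows; the symbols are assumed here, not quoted from the paper.

% Hedged sketch of the resolution criterion: the effective Reynolds number,
% built on the total (molecular + modelled eddy) viscosity, should exceed the
% critical Reynolds number of the Kelvin-Helmholtz instability.
\[
  \mathrm{Re}_{\mathrm{eff}}
    = \frac{U_{\infty}\,D}{\nu + \nu_{t}}
    \;>\; \mathrm{Re}_{\mathrm{crit,\,KH}}
\]
% Here U_infty is the free-stream velocity, D the cylinder diameter, nu the
% molecular viscosity and nu_t the modelled (unresolved) eddy viscosity;
% refining the resolution lowers nu_t and therefore raises Re_eff.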
Method and system for benchmarking computers
Gustafson, John L.
1993-09-14
A testing system and method for benchmarking computer systems. The system includes a store containing a scalable set of tasks to be performed to produce a solution in ever-increasing degrees of resolution as a larger number of the tasks are performed. A timing and control module allots to each computer a fixed benchmarking interval in which to perform the stored tasks. Means are provided for determining, after completion of the benchmarking interval, the degree of progress through the scalable set of tasks and for producing a benchmarking rating relating to the degree of progress for each computer.
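As a hypothetical illustration of the fixed-interval, scalable-workload idea only (not the patented system itself), the Python sketch below treats progressively finer trapezoidal integration of a test function as the scalable task set and reports the resolution reached within the allotted benchmarking interval as the rating; the task choice and time budget are assumptions.

import time
import math

def scalable_task(n):
    # One task from the scalable set: trapezoidal integration of sin(x)
    # over [0, pi] with n panels; larger n means finer resolution.
    h = math.pi / n
    return h * (0.5 * (math.sin(0.0) + math.sin(math.pi))
                + sum(math.sin(i * h) for i in range(1, n)))

def benchmark(interval_s=1.0):
    # Run ever-larger tasks until the fixed benchmarking interval expires;
    # the rating is the degree of progress (finest resolution completed).
    # Note: the last task started before the deadline is allowed to finish.
    deadline = time.perf_counter() + interval_s
    n, finest, result = 1000, 0, None
    while time.perf_counter() < deadline:
        result = scalable_task(n)
        finest = n
        n *= 2  # double the resolution for the next task
    return finest, result

finest, approx = benchmark(1.0)
print(f"rating: completed {finest} panels in the interval; integral ~ {approx:.6f} (exact value is 2)")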
High-speed multiple sequence alignment on a reconfigurable platform.
Oliver, Tim; Schmidt, Bertil; Maskell, Douglas; Nathan, Darran; Clemens, Ralf
2006-01-01
Progressive alignment is a widely used approach to computing multiple sequence alignments (MSAs). However, aligning several hundred sequences with popular progressive alignment tools requires hours on sequential computers. Due to the rapid growth of sequence databases, biologists have to compute MSAs in a far shorter time. In this paper we present a new approach to MSA on reconfigurable hardware platforms to gain high performance at low cost. We have constructed a linear systolic array to perform pairwise sequence distance computations using dynamic programming. This results in an implementation with significant runtime savings on a standard FPGA.
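The FPGA systolic array itself cannot be shown here. As a plain software reference for the pairwise-distance stage it accelerates, the following sketch computes a normalised edit-distance matrix with standard dynamic programming; the scoring scheme is an assumption for illustration and is not the paper's distance measure.

def edit_distance(a, b):
    # Classic O(len(a)*len(b)) dynamic-programming edit distance,
    # the kind of pairwise computation mapped onto the systolic array.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # match/substitution
        prev = curr
    return prev[-1]

def distance_matrix(seqs):
    # All-against-all pairwise distances of the kind used to build the
    # guide tree in progressive alignment (normalised to [0, 1]).
    n = len(seqs)
    d = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d[i][j] = d[j][i] = edit_distance(seqs[i], seqs[j]) / max(len(seqs[i]), len(seqs[j]))
    return d

seqs = ["GATTACA", "GACTATA", "ATTACCA", "GATTACCA"]
for row in distance_matrix(seqs):
    print(["%.2f" % v for v in row])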
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crabtree, George; Glotzer, Sharon; McCurdy, Bill
This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness.
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dahlburg, Jill; Corones, James; Batchelor, Donald
Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality, which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas, and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world's energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC integrated planning document (IPPA, 2000), represents a significant opportunity for the DOE Office of Science to further the understanding of fusion plasmas to a level unparalleled worldwide.
Imaging biomarkers in multiple sclerosis: From image analysis to population imaging.
Barillot, Christian; Edan, Gilles; Commowick, Olivier
2016-10-01
The production of imaging data in medicine increases more rapidly than the capacity of computing models to extract information from it. The grand challenges of better understanding the brain, offering better care for neurological disorders, and stimulating new drug design will not be achieved without significant advances in computational neuroscience. The road to success is to develop a new, generic, computational methodology and to confront and validate this methodology on relevant diseases with adapted computational infrastructures. This new concept underscores the need to build new research paradigms to better understand the natural history of the pathology at the early phase, and to better aggregate data that will provide the most complete representation of the pathology in order to better correlate imaging with other relevant features such as clinical, biological or genetic data. In this context, one of the major challenges of neuroimaging in clinical neurosciences is to detect quantitative signs of pathological evolution as early as possible to prevent disease progression, evaluate therapeutic protocols or even better understand and model the natural history of a given neurological pathology. Many diseases involve brain alterations that are often not visible on conventional MRI sequences, especially in normal appearing brain tissues (NABT). MRI often has low specificity for differentiating between possible pathological changes that could help discriminate between the different pathological stages or grades. The objective of medical image analysis procedures is to define new quantitative neuroimaging biomarkers to track the evolution of the pathology at different levels. This paper illustrates this issue in one acute neuro-inflammatory pathology: Multiple Sclerosis (MS). It presents the current medical image analysis approaches and explains how this field of research will evolve in the next decade to integrate larger scales of information at the temporal, cellular, structural and morphological levels. Copyright © 2016 Elsevier B.V. All rights reserved.
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit scientific progress in the still evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity and so is, unfortunately, the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
ERIC Educational Resources Information Center
Van Norman, Ethan R.; Nelson, Peter M.; Parker, David C.
2017-01-01
Computer adaptive tests (CATs) hold promise to monitor student progress within multitiered systems of support. However, the relationship between how long and how often data are collected and the technical adequacy of growth estimates from CATs has not been explored. Given CAT administration times, it is important to identify optimal data…
Multicore job scheduling in the Worldwide LHC Computing Grid
NASA Astrophysics Data System (ADS)
Forti, A.; Pérez-Calero Yzquierdo, A.; Hartmann, T.; Alef, M.; Lahiff, A.; Templon, J.; Dal Pra, S.; Gila, M.; Skipsey, S.; Acosta-Silva, C.; Filipcic, A.; Walker, R.; Walker, C. J.; Traynor, D.; Gadrat, S.
2015-12-01
After the successful first run of the LHC, data taking is scheduled to restart in Summer 2015 with experimental conditions leading to increased data volumes and event complexity. In order to process the data generated in such a scenario and exploit the multicore architectures of current CPUs, the LHC experiments have developed parallelized software for data reconstruction and simulation. However, a good fraction of their computing effort is still expected to be executed as single-core tasks. Therefore, jobs with diverse resource requirements will be distributed across the Worldwide LHC Computing Grid (WLCG), making workload scheduling a complex problem in itself. In response to this challenge, the WLCG Multicore Deployment Task Force has been created in order to coordinate the joint effort from experiments and WLCG sites. The main objective is to ensure the convergence of approaches from the different LHC Virtual Organizations (VOs) to make the best use of the shared resources in order to satisfy their new computing needs, minimizing any inefficiency originating from the scheduling mechanisms, and without imposing unnecessary complexities in the way sites manage their resources. This paper describes the activities and progress of the Task Force related to the aforementioned topics, including experiences from key sites on how to best use different batch system technologies, the evolution of workload submission tools by the experiments, and the knowledge gained from scale tests of the different proposed job submission strategies.
Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges
Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.
2010-01-01
In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434
A Computational Approach for Model Update of an LS-DYNA Energy Absorbing Cell
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Jackson, Karen E.; Kellas, Sotiris
2008-01-01
NASA and its contractors are working on structural concepts for absorbing impact energy of aerospace vehicles. Recently, concepts in the form of multi-cell honeycomb-like structures designed to crush under load have been investigated for both space and aeronautics applications. Efforts to understand these concepts are progressing from tests of individual cells to tests of systems with hundreds of cells. Because of fabrication irregularities, geometry irregularities, and material properties uncertainties, the problem of reconciling analytical models, in particular LS-DYNA models, with experimental data is a challenge. A first look at the correlation results between single cell load/deflection data with LS-DYNA predictions showed problems which prompted additional work in this area. This paper describes a computational approach that uses analysis of variance, deterministic sampling techniques, response surface modeling, and genetic optimization to reconcile test with analysis results. Analysis of variance provides a screening technique for selection of critical parameters used when reconciling test with analysis. In this study, complete ignorance of the parameter distribution is assumed and, therefore, the value of any parameter within the range that is computed using the optimization procedure is considered to be equally likely. Mean values from tests are matched against LS-DYNA solutions by minimizing the square error using a genetic optimization. The paper presents the computational methodology along with results obtained using this approach.
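The reconciliation step described above can be sketched generically: a genetic-style optimizer searches uniformly weighted (equally likely) parameter ranges to minimize the squared error between measured means and a surrogate response-surface prediction. The sketch below uses SciPy's differential_evolution as the genetic-style optimizer; the quadratic-free surrogate, test values and parameter bounds are placeholders, not the values used in the study.

```python
# Hedged sketch of test/analysis reconciliation: minimize the squared error between
# hypothetical measured crush loads and a stand-in response surface, with uniform
# (equally likely) parameter ranges as described in the abstract.
import numpy as np
from scipy.optimize import differential_evolution

test_mean = np.array([12.0, 18.5, 22.0])          # measured mean loads (hypothetical)

def response_surface(params):
    """Stand-in for the response surface fit to LS-DYNA runs (illustrative only)."""
    e_mod, thickness = params
    return np.array([0.5 * e_mod + 40.0 * thickness,
                     0.7 * e_mod + 55.0 * thickness,
                     0.8 * e_mod + 65.0 * thickness])

def squared_error(params):
    return float(np.sum((response_surface(params) - test_mean) ** 2))

bounds = [(10.0, 30.0),      # elastic-modulus range (assumed)
          (0.05, 0.20)]      # wall-thickness range (assumed)
result = differential_evolution(squared_error, bounds, seed=0)
print(result.x, result.fun)
```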
HSR Aerodynamic Performance Status and Challenges
NASA Technical Reports Server (NTRS)
Gilbert, William P.; Antani, Tony; Ball, Doug; Calloway, Robert L.; Snyder, Phil
1999-01-01
This paper describes HSR (High Speed Research) Aerodynamic Performance Status and Challenges. The topics include: 1) Aero impact on HSR; 2) Goals and Targets; 3) Progress and Status; and 4) Remaining Challenges. This paper is presented in viewgraph form.
The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia
NASA Astrophysics Data System (ADS)
Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh
2017-09-01
This preliminary study aims to investigate awareness of cloud computing adoption among academicians in tertiary education in Malaysia. It also explores the possible challenges faced by academicians when adopting this new technology. The pilot study was conducted on 40 lecturers at Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability, regulatory compliance concerns/user control and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: a security and dependency factor, and a user control and mentality factor.
Yamada, Kazunori D.; Tomii, Kentaro; Katoh, Kazutaka
2016-01-01
Motivation: Large multiple sequence alignments (MSAs), consisting of thousands of sequences, are becoming more and more common, due to advances in sequencing technologies. The MAFFT MSA program has several options for building large MSAs, but their performances have not been sufficiently assessed yet, because realistic benchmarking of large MSAs has been difficult. Recently, such assessments have been made possible through the HomFam and ContTest benchmark protein datasets. Along with the development of these datasets, an interesting theory was proposed: chained guide trees increase the accuracy of MSAs of structurally conserved regions. This theory challenges the basis of progressive alignment methods and needs to be examined by being compared with other known methods including computationally intensive ones. Results: We used HomFam, ContTest and OXFam (an extended version of OXBench) to evaluate several methods enabled in MAFFT: (1) a progressive method with approximate guide trees, (2) a progressive method with chained guide trees, (3) a combination of an iterative refinement method and a progressive method and (4) a less approximate progressive method that uses a rigorous guide tree and consistency score. Other programs, Clustal Omega and UPP, available for large MSAs, were also included into the comparison. The effect of method 2 (chained guide trees) was positive in ContTest but negative in HomFam and OXFam. Methods 3 and 4 increased the benchmark scores more consistently than method 2 for the three datasets, suggesting that they are safer to use. Availability and Implementation: http://mafft.cbrc.jp/alignment/software/ Contact: katoh@ifrec.osaka-u.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27378296
IFE Chamber Technology - Status and Future Challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meier, W.R.; Raffray, A.R.; Abdel-Khalik, S.I.
2003-07-15
Significant progress has been made on addressing critical issues for inertial fusion energy (IFE) chambers for heavy-ion, laser and Z-pinch drivers. A variety of chamber concepts are being investigated including drywall (currently favored for laser IFE), wetted-wall (applicable to both laser and ion drivers), and thick-liquid-wall (favored by heavy ion and z-pinch drivers). Recent progress and remaining challenges in developing IFE chambers are reviewed.
Leavy, Breiffni; Roaldsen, Kirsti Skavberg; Nylund, Kamilla; Hagströmer, Maria; Franzén, Erika
2017-01-01
There is growing evidence for the positive effects of exercise training programs on balance control in Parkinson disease (PD). To be effective, balance training needs to be specific, progressive, and highly challenging. Little evidence exists, however, for how people with PD-related balance impairments perceive highly challenging and progressive balance training programs with dual-task components. The purpose of this study was to explore and describe perceptions of a highly challenging balance training program among people with mild to moderate PD. This study was qualitative in nature. In-depth interviews were conducted with 13 individuals with mild to moderate PD who had participated in a highly challenging balance training program. Interview transcripts were analyzed using qualitative content analysis, with an inductive approach. The analysis revealed 3 subthemes concerning participants' perceptions of highly challenging and progressive balance training: (1) movement to counter the disease, (2) dual-task training in contrast to everyday strategies, and (3) the struggle to maintain positive effects. The first subtheme reflects how physical activity was used as a short-term and long-term strategy for counteracting PD symptoms and their progression. The second subtheme incorporates the described experiences of being maximally challenged in a secure and supportive group environment, circumstances that stood in contrast to participants' everyday lives. The third subtheme describes participants' long-term struggle to maintain program effects on cognitive and physical function in the face of disease progression. Interpretation of the underlying patterns of these subthemes resulted in one overarching theme: training at the limits of balance capacity causes a rethinking of motor and cognitive resources. The findings of this study cannot be considered to reflect the views of those with weaker or negative beliefs concerning physical activity, nor be transferred to those at more severe stages of the disease. Findings from this study suggest that being pushed to the limits of balance capacity provoked people with mild to moderate PD to rethink their individual motor and cognitive resources, a process that was further enabled by the PD-specific group setting. © 2017 American Physical Therapy Association
Ability evaluation by binary tests: Problems, challenges & recent advances
NASA Astrophysics Data System (ADS)
Bashkansky, E.; Turetsky, V.
2016-11-01
Binary tests designed to measure abilities of objects under test (OUTs) are widely used in different fields of measurement theory and practice. The number of test items in such tests is usually very limited. The response to each test item provides only one bit of information per OUT. The problem of correct ability assessment is even more complicated, when the levels of difficulty of the test items are unknown beforehand. This fact makes the search for effective ways of planning and processing the results of such tests highly relevant. In recent years, there has been some progress in this direction, generated by both the development of computational tools and the emergence of new ideas. The latter are associated with the use of so-called “scale invariant item response models”. Together with maximum likelihood estimation (MLE) approach, they helped to solve some problems of engineering and proficiency testing. However, several issues related to the assessment of uncertainties, replications scheduling, the use of placebo, as well as evaluation of multidimensional abilities still present a challenge for researchers. The authors attempt to outline the ways to solve the above problems.
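A minimal example of the kind of scale-invariant item response modeling and maximum likelihood estimation mentioned above is the textbook Rasch model. The sketch below estimates a single ability from binary responses assuming known item difficulties; it is illustrative only and not the authors' estimator, and all numbers are invented.

```python
# Generic Rasch-model sketch (not the authors' specific method): maximum likelihood
# estimate of one ability theta from binary responses to items with assumed known
# difficulties, via Newton-Raphson on the log-likelihood.
import math

def rasch_mle(responses, difficulties, iters=50):
    theta = 0.0
    for _ in range(iters):
        # P(correct) for each item under the Rasch model
        p = [1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties]
        grad = sum(x - pi for x, pi in zip(responses, p))      # dlogL/dtheta
        hess = -sum(pi * (1.0 - pi) for pi in p)               # d2logL/dtheta2
        if abs(hess) < 1e-12:
            break
        theta -= grad / hess
    return theta

# Example: five items of increasing assumed difficulty, three answered correctly.
print(rasch_mle(responses=[1, 1, 1, 0, 0], difficulties=[-1.0, -0.5, 0.0, 0.5, 1.0]))
```

Note that, as the abstract points out, the limited number of items means each response carries only one bit of information, so such point estimates carry substantial uncertainty.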
Polylactides in additive biomanufacturing.
Poh, Patrina S P; Chhaya, Mohit P; Wunner, Felix M; De-Juan-Pardo, Elena M; Schilling, Arndt F; Schantz, Jan-Thorsten; van Griensven, Martijn; Hutmacher, Dietmar W
2016-12-15
New advanced manufacturing technologies under the alias of additive biomanufacturing allow the design and fabrication of a range of products from pre-operative models, cutting guides and medical devices to scaffolds. The process of printing in 3 dimensions of cells, extracellular matrix (ECM) and biomaterials (bioinks, powders, etc.) to generate in vitro and/or in vivo tissue analogue structures has been termed bioprinting. To further advance in additive biomanufacturing, there are many aspects that we can learn from the wider additive manufacturing (AM) industry, which have progressed tremendously since its introduction into the manufacturing sector. First, this review gives an overview of additive manufacturing and both industry and academia efforts in addressing specific challenges in the AM technologies to drive toward AM-enabled industrial revolution. After which, considerations of poly(lactides) as a biomaterial in additive biomanufacturing are discussed. Challenges in wider additive biomanufacturing field are discussed in terms of (a) biomaterials; (b) computer-aided design, engineering and manufacturing; (c) AM and additive biomanufacturing printers hardware; and (d) system integration. Finally, the outlook for additive biomanufacturing was discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
Understanding immunology via engineering design: the role of mathematical prototyping.
Klinke, David J; Wang, Qing
2012-01-01
A major challenge in immunology is how to translate data into knowledge given the inherent complexity and dynamics of human physiology. Both the physiology and engineering communities have rich histories in applying computational approaches to translate data obtained from complex systems into knowledge of system behavior. However, there are some differences in how disciplines approach problems. By referring to mathematical models as mathematical prototypes, we aim to highlight aspects related to the process (i.e., prototyping) rather than the product (i.e., the model). The objective of this paper is to review how two related engineering concepts, specifically prototyping and "fitness for use," can be applied to overcome the pressing challenge in translating data into improved knowledge of basic immunology that can be used to improve therapies for disease. These concepts are illustrated using two immunology-related examples. The prototypes presented focus on the beta cell mass at the onset of type 1 diabetes and the dynamics of dendritic cells in the lung. This paper is intended to illustrate some of the nuances associated with applying mathematical modeling to improve understanding of the dynamics of disease progression in humans.
Electronic imaging of the human body
NASA Astrophysics Data System (ADS)
Vannier, Michael W.; Yates, Randall E.; Whitestone, Jennifer J.
1992-09-01
The Human Engineering Division of the Armstrong Laboratory (USAF); the Mallinckrodt Institute of Radiology; the Washington University School of Medicine; and the Lister-Hill National Center for Biomedical Communication, National Library of Medicine are sponsoring a working group on electronic imaging of the human body. Electronic imaging of the surface of the human body has been pursued and developed by a number of disciplines including radiology, forensics, surgery, engineering, medical education, and anthropometry. The applications range from reconstructive surgery to computer-aided design (CAD) of protective equipment. Although these areas appear unrelated, they have a great deal of commonality. All the organizations working in this area are faced with the challenges of collecting, reducing, and formatting the data in an efficient and standard manner; storing this data in a computerized database to make it readily accessible; and developing software applications that can visualize, manipulate, and analyze the data. This working group is being established to encourage effective use of the resources of all the various groups and disciplines involved in electronic imaging of the human body surface by providing a forum for discussing progress and challenges with these types of data.
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)
2001-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten
2002-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Nelson, Michelle; Salguero, Francisco J; Dean, Rachel E; Ngugi, Sarah A; Smither, Sophie J; Atkins, Timothy P; Lever, Mark S
2014-12-01
Glanders and melioidosis are caused by two distinct Burkholderia species and have generally been considered to have similar disease progression. Both pathogens are HHS/CDC Tier 1 agents, and natural infection with either is primarily through skin inoculation. The common marmoset (Callithrix jacchus) was used to compare disease following experimental subcutaneous challenge. Acute, lethal disease was observed in marmosets following challenge with between 26 and 1.2 × 10⁸ cfu Burkholderia pseudomallei within 22-85 h. The reproducibility and progression of the disease were assessed following a challenge of 1 × 10² cfu of B. pseudomallei. Melioidosis was characterised by high levels of bacteraemia, focal microgranuloma progressing to non-necrotic multifocal solid lesions in the livers and spleens, and multi-organ failure. Lethal disease was observed in 93% of animals challenged with Burkholderia mallei, occurring between 5 and 10.6 days. Following challenge with 1 × 10² cfu of B. mallei, glanders was characterised by lymphatic spread of the bacteria and non-necrotic, multifocal solid lesions progressing to a multifocal lesion with severe necrosis and pneumonia. The experimental results confirmed that the disease pathology and presentation are strikingly different between the two pathogens. The marmoset provides a model of the human syndrome for both diseases, facilitating the development of medical countermeasures. © 2014 Crown copyright. International Journal of Experimental Pathology © 2014 Company of the International Journal of Experimental Pathology (CIJEP).
Race to the Top - Early Learning Challenge: 2015 Annual Performance Report. Maryland
ERIC Educational Resources Information Center
Race to the Top - Early Learning Challenge, 2016
2016-01-01
This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2015 describes Maryland's accomplishments, lessons learned, challenges, and strategies Maryland will implement to address those challenges. Maryland's remarkable progress in increasing participation in their tiered quality rating and improvement…
Race to the Top - Early Learning Challenge: 2015 Annual Performance Report. Vermont
ERIC Educational Resources Information Center
Race to the Top - Early Learning Challenge, 2016
2016-01-01
This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2015 describes Vermont's accomplishments, lessons learned, challenges, and strategies Vermont will implement to address those challenges. Vermont's remarkable progress in increasing participation in their tiered quality rating and improvement system,…
Race to the Top - Early Learning Challenge: 2015 Annual Performance Report. Pennsylvania
ERIC Educational Resources Information Center
Race to the Top - Early Learning Challenge, 2016
2016-01-01
This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2015 describes Pennsylvania's accomplishments, lessons learned, challenges, and strategies Pennsylvania will implement to address those challenges. Pennsylvania's remarkable progress in increasing participation in their tiered quality rating and…
Race to the Top - Early Learning Challenge: 2015 Annual Performance Report. Minnesota
ERIC Educational Resources Information Center
Race to the Top - Early Learning Challenge, 2016
2016-01-01
This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2015 describes Minnesota's accomplishments, lessons learned, challenges, and strategies Minnesota will implement to address those challenges. Minnesota's remarkable progress in increasing participation in their tiered quality rating and improvement…
ERIC Educational Resources Information Center
Williamson, Jacob
2007-01-01
Each year, students in Kittanning Senior High School's Principles of Technology (POT) classes have a design challenge, usually beginning with CO2-powered or Maglev-powered vehicles and progressing to such challenges as soap-box cars or radio-controlled airplanes. The author's recent POT class challenged them with the most exciting challenge yet:…
Challenges and Security in Cloud Computing
NASA Astrophysics Data System (ADS)
Chang, Hyokyung; Choi, Euiin
People want to solve problems as soon as they arise. Ubiquitous computing is an IT technology intended to make such situations easier to handle, and cloud computing is a technology that makes it even more capable and powerful. Cloud computing, however, is only at the beginning stage of implementation and use, and it faces many challenges in technical matters and security issues. This paper looks at cloud computing security.
1920x1080 pixel color camera with progressive scan at 50 to 60 frames per second
NASA Astrophysics Data System (ADS)
Glenn, William E.; Marcinka, John W.
1998-09-01
For over a decade, the broadcast industry, the film industry and the computer industry have had a long-range objective of originating high-definition images with progressive scan. This produces images with better vertical resolution and far fewer artifacts than interlaced scan. Computers almost universally use progressive scan. The broadcast industry has resisted switching from interlace to progressive because no cameras were available in that format with the 1920 X 1080 resolution that had gained international acceptance for high-definition program production. The camera described in this paper produces an output in that format derived from two 1920 X 1080 CCD sensors produced by Eastman Kodak.
A Multi-Level Parallelization Concept for High-Fidelity Multi-Block Solvers
NASA Technical Reports Server (NTRS)
Hatay, Ferhat F.; Jespersen, Dennis C.; Guruswamy, Guru P.; Rizk, Yehia M.; Byun, Chansup; Gee, Ken; VanDalsem, William R. (Technical Monitor)
1997-01-01
The integration of high-fidelity Computational Fluid Dynamics (CFD) analysis tools with the industrial design process benefits greatly from robust implementations that are portable across a wide range of computer architectures. In the present work, a hybrid domain-decomposition and parallelization concept was developed and implemented in the widely used NASA multi-block CFD packages ENSAERO and OVERFLOW. The new parallel solver concept, PENS (Parallel Euler Navier-Stokes Solver), employs both fine and coarse granularity in data partitioning as well as data coalescing to obtain the desired load-balance characteristics on the available computer platforms. This multi-level parallelism implementation introduces no changes to the numerical results, hence the original fidelity of the packages is preserved exactly. The present implementation uses the Message Passing Interface (MPI) library for interprocessor message passing and memory accessing. By choosing an appropriate combination of the available partitioning and coalescing capabilities only during the execution stage, the PENS solver becomes adaptable to different computer architectures, from shared-memory to distributed-memory platforms with varying degrees of parallelism. The PENS implementation on the IBM SP2 distributed-memory environment at the NASA Ames Research Center obtains 85 percent scalable parallel performance using fine-grain partitioning of single-block CFD domains on up to 128 wide computational nodes. Multi-block CFD simulations of complete aircraft configurations achieve 75 percent of perfect load-balanced execution using data coalescing and the two levels of parallelism. The SGI PowerChallenge, SGI Origin 2000, and a cluster of workstations are the other platforms on which the robustness of the implementation is tested. The performance behavior on these other computer platforms with a variety of realistic problems will be reported as this on-going study progresses.
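The coarse-grain level of parallelism described above, assigning whole blocks to processors while coalescing small blocks for load balance, can be illustrated with a simple greedy rule. The sketch below uses mpi4py with invented block names and cell counts; it is not the PENS implementation.

```python
# Illustrative sketch only (not the PENS source): assign multi-block CFD blocks to
# MPI ranks with a greedy load-balance rule, coalescing small blocks onto the same
# rank, as the coarse-grain level of parallelism described above. Requires mpi4py.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

blocks = {"wing": 1_200_000, "fuselage": 900_000, "tail": 300_000,
          "nacelle": 250_000, "wake": 150_000}        # cells per block (hypothetical)

# Greedy assignment: largest block first, always to the least-loaded rank.
load = [0] * size
owner = {}
for name, cells in sorted(blocks.items(), key=lambda kv: -kv[1]):
    target = load.index(min(load))
    owner[name] = target
    load[target] += cells

my_blocks = [n for n, r in owner.items() if r == rank]
print(f"rank {rank}: blocks {my_blocks}, cells {load[rank]}")
```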
An automated method for accurate vessel segmentation.
Yang, Xin; Liu, Chaoyue; Le Minh, Hung; Wang, Zhiwei; Chien, Aichi; Cheng, Kwang-Ting Tim
2017-05-07
Vessel segmentation is a critical task for various medical applications, such as diagnosis assistance of diabetic retinopathy, quantification of cerebral aneurysm's growth, and guiding surgery in neurosurgical procedures. Despite technology advances in image segmentation, existing methods still suffer from low accuracy for vessel segmentation in the two challenging while common scenarios in clinical usage: (1) regions with a low signal-to-noise-ratio (SNR), and (2) at vessel boundaries disturbed by adjacent non-vessel pixels. In this paper, we present an automated system which can achieve highly accurate vessel segmentation for both 2D and 3D images even under these challenging scenarios. Three key contributions achieved by our system are: (1) a progressive contrast enhancement method to adaptively enhance contrast of challenging pixels that were otherwise indistinguishable, (2) a boundary refinement method to effectively improve segmentation accuracy at vessel borders based on Canny edge detection, and (3) a content-aware region-of-interests (ROI) adjustment method to automatically determine the locations and sizes of ROIs which contain ambiguous pixels and demand further verification. Extensive evaluation of our method is conducted on both 2D and 3D datasets. On a public 2D retinal dataset (named DRIVE (Staal 2004 IEEE Trans. Med. Imaging 23 501-9)) and our 2D clinical cerebral dataset, our approach achieves superior performance to the state-of-the-art methods including a vesselness based method (Frangi 1998 Int. Conf. on Medical Image Computing and Computer-Assisted Intervention) and an optimally oriented flux (OOF) based method (Law and Chung 2008 European Conf. on Computer Vision). An evaluation on 11 clinical 3D CTA cerebral datasets shows that our method can achieve 94% average accuracy with respect to the manual segmentation reference, which is 23% to 33% better than the five baseline methods (Yushkevich 2006 Neuroimage 31 1116-28; Law and Chung 2008 European Conf. on Computer Vision; Law and Chung 2009 IEEE Trans. Image Process. 18 596-612; Wang 2015 J. Neurosci. Methods 241 30-6) with manually optimized parameters. Our system has also been applied clinically for cerebral aneurysm development analysis. Experimental results on 10 patients' data, with two 3D CT scans per patient, show that our system's automatic diagnosis outcomes are consistent with clinicians' manual measurements.
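Two of the ingredients described above, adaptive contrast enhancement of low-SNR regions and edge-based boundary refinement, can be approximated with standard OpenCV operators. The sketch below uses CLAHE as a stand-in for the paper's progressive contrast enhancement; the thresholds and file names are placeholders, not the authors' settings.

```python
# Minimal sketch (not the authors' system): adaptive local contrast enhancement
# followed by Canny-based refinement of a rough vessel mask.
import cv2

img = cv2.imread("retina_slice.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input
assert img is not None, "replace with a real image path"

# Adaptive local contrast enhancement to recover low-SNR vessel pixels.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)

# Rough vessel mask from adaptive thresholding, then edge-based boundary refinement.
mask = cv2.adaptiveThreshold(enhanced, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                             cv2.THRESH_BINARY_INV, blockSize=15, C=5)
edges = cv2.Canny(enhanced, threshold1=30, threshold2=90)
refined = cv2.bitwise_and(mask, cv2.bitwise_not(edges))       # trim edge-adjacent pixels

cv2.imwrite("vessel_mask.png", refined)
```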
An automated method for accurate vessel segmentation
NASA Astrophysics Data System (ADS)
Yang, Xin; Liu, Chaoyue; Le Minh, Hung; Wang, Zhiwei; Chien, Aichi; Cheng, Kwang-Ting Tim
2017-05-01
Vessel segmentation is a critical task for various medical applications, such as diagnosis assistance of diabetic retinopathy, quantification of cerebral aneurysm’s growth, and guiding surgery in neurosurgical procedures. Despite technology advances in image segmentation, existing methods still suffer from low accuracy for vessel segmentation in the two challenging while common scenarios in clinical usage: (1) regions with a low signal-to-noise-ratio (SNR), and (2) at vessel boundaries disturbed by adjacent non-vessel pixels. In this paper, we present an automated system which can achieve highly accurate vessel segmentation for both 2D and 3D images even under these challenging scenarios. Three key contributions achieved by our system are: (1) a progressive contrast enhancement method to adaptively enhance contrast of challenging pixels that were otherwise indistinguishable, (2) a boundary refinement method to effectively improve segmentation accuracy at vessel borders based on Canny edge detection, and (3) a content-aware region-of-interests (ROI) adjustment method to automatically determine the locations and sizes of ROIs which contain ambiguous pixels and demand further verification. Extensive evaluation of our method is conducted on both 2D and 3D datasets. On a public 2D retinal dataset (named DRIVE (Staal 2004 IEEE Trans. Med. Imaging 23 501-9)) and our 2D clinical cerebral dataset, our approach achieves superior performance to the state-of-the-art methods including a vesselness based method (Frangi 1998 Int. Conf. on Medical Image Computing and Computer-Assisted Intervention) and an optimally oriented flux (OOF) based method (Law and Chung 2008 European Conf. on Computer Vision). An evaluation on 11 clinical 3D CTA cerebral datasets shows that our method can achieve 94% average accuracy with respect to the manual segmentation reference, which is 23% to 33% better than the five baseline methods (Yushkevich 2006 Neuroimage 31 1116-28; Law and Chung 2008 European Conf. on Computer Vision; Law and Chung 2009 IEEE Trans. Image Process. 18 596-612; Wang 2015 J. Neurosci. Methods 241 30-6) with manually optimized parameters. Our system has also been applied clinically for cerebral aneurysm development analysis. Experimental results on 10 patients’ data, with two 3D CT scans per patient, show that our system’s automatic diagnosis outcomes are consistent with clinicians’ manual measurements.
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
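The core of the paradigm, an analytic that emits meaningful partial results and accepts steering between chunks, can be sketched with a simple generator. The example below counts adjacent-event patterns in synthetic sequences; it is illustrative only and not the Progressive Insights implementation.

```python
# Illustrative sketch of the progressive paradigm: the analytic yields a meaningful
# partial result after each chunk, and the consumer can steer it between chunks,
# e.g. by narrowing the set of event types of interest.
from collections import Counter

def progressive_pattern_counts(event_sequences, chunk_size=1000, keep=None):
    counts = Counter()
    for start in range(0, len(event_sequences), chunk_size):
        for seq in event_sequences[start:start + chunk_size]:
            for a, b in zip(seq, seq[1:]):            # adjacent-event patterns
                if keep is None or a in keep or b in keep:
                    counts[(a, b)] += 1
        yield start + chunk_size, counts.most_common(5)   # partial result

# Consumer: render partial results as they arrive; an analyst could re-launch with a
# narrowed `keep` set once an interesting event type shows up.
data = [["admit", "lab", "med", "discharge"], ["admit", "med", "lab", "discharge"]] * 2000
for processed, top in progressive_pattern_counts(data):
    print(f"after {processed} sequences: {top}")
```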
ERIC Educational Resources Information Center
American Federation of Teachers, 2005
2005-01-01
This document suggests changes to the No Child Left Behind Act (NCLB) in four targeted areas. Regarding Adequate Yearly Progress (AYP), The American Federation of Teachers (AFT) supports: (1) Setting challenging but demonstrably attainable student progress goals; (2) Judging school effectiveness by measuring progress of the same students over…
Technological Innovations from NASA
NASA Technical Reports Server (NTRS)
Pellis, Neal R.
2006-01-01
The challenge of human space exploration places demands on technology that push concepts and development to the leading edge. In biotechnology and biomedical equipment development, NASA science has been the seed for numerous innovations, many of which are in the commercial arena. The biotechnology effort has led to rational drug design, analytical equipment, and cell culture and tissue engineering strategies. Biomedical research and development has resulted in medical devices that enable diagnosis and treatment advances. NASA Biomedical developments are exemplified in the new laser light scattering analysis for cataracts, the axial flow left ventricular-assist device, non contact electrocardiography, and the guidance system for LASIK surgery. Many more developments are in progress. NASA will continue to advance technologies, incorporating new approaches from basic and applied research, nanotechnology, computational modeling, and database analyses.
NASA Astrophysics Data System (ADS)
Doursat, René
Exploding growth in computational systems forces us to gradually replace rigid design and control with decentralization and autonomy. Information technologies will progress, instead, by "meta-designing" mechanisms of system self-assembly, self-regulation and evolution. Nature offers a great variety of efficient complex systems, in which numerous small elements form large-scale, adaptive patterns. The new engineering challenge is to recreate this self-organization and let it freely generate innovative designs under guidance. This article presents an original model of artificial system growth inspired by embryogenesis. A virtual organism is a lattice of cells that proliferate, migrate and self-pattern into differentiated domains. Each cell's fate is controlled by an internal gene regulatory network. Embryomorphic engineering emphasizes hyperdistributed architectures and their development as a prerequisite of evolutionary design.
Emerging pathogens in the fish farming industry and sequencing-based pathogen discovery.
Tengs, Torstein; Rimstad, Espen
2017-10-01
The use of large scale DNA/RNA sequencing has become an integral part of biomedical research. Reduced sequencing costs and the availability of efficient computational resources has led to a revolution in how problems concerning genomics and transcriptomics are addressed. Sequencing-based pathogen discovery represents one example of how genetic data can now be used in ways that were previously considered infeasible. Emerging pathogens affect both human and animal health due to a multitude of factors, including globalization, a shifting environment and an increasing human population. Fish farming represents a relevant, interesting and challenging system to study emerging pathogens. This review summarizes recent progress in pathogen discovery using sequence data, with particular emphasis on viruses in Atlantic salmon (Salmo salar). Copyright © 2017 Elsevier Ltd. All rights reserved.
Progress in hyperspectral imaging of vegetation
NASA Astrophysics Data System (ADS)
Goetz, Alexander F. H.
2001-03-01
Computer-related technologies, such as the Internet, have posed new challenges for intellectual property law. Legislation and court decisions impacting patents, copyrights, trade secrets and trademarks have adapted intellectual property law to address new issues brought about by such emerging technologies. As the pace of technological change continues to increase, intellectual property law will need to keep up. Accordingly, the balance struck by intellectual property laws today will likely be set askew by technological changes in the future. Engineers need to consider not only the law as it exists today, but also how it might change in the future. Likewise, lawyers and judges need to consider legal issues not only in view of the current state of the art in technology, but also with an eye to technologies yet to come.
Successful generation of structural information for fragment-based drug discovery.
Öster, Linda; Tapani, Sofia; Xue, Yafeng; Käck, Helena
2015-09-01
Fragment-based drug discovery relies upon structural information for efficient compound progression, yet it is often challenging to generate structures with bound fragments. A summary of recent literature reveals that a wide repertoire of experimental procedures is employed to generate ligand-bound crystal structures successfully. We share in-house experience from setting up and executing fragment crystallography in a project that resulted in 55 complex structures. The ligands span five orders of magnitude in affinity and the resulting structures are made available to be of use, for example, for development of computational methods. Analysis of the results revealed that ligand properties such as potency, ligand efficiency (LE) and, to some degree, clogP influence the success of complex structure generation. Copyright © 2015 Elsevier Ltd. All rights reserved.
Immunocompetent patient with cerebral cryptococcosis: a case report [Paciente inmunocompetente con criptococosis cerebral: reporte de un caso].
Becerra-Pedraza, Luis Cuitláhuac; Martínez-Piña, Daniel Arturo; Calles-Carmona, Génesis Rocío; San-Juan, Daniel
2017-01-01
A 14-year-old female presenting with sudden, progressive holocranial headache and intractable vomiting arrived at the emergency room. An acute confusional state and a meningoencephalitis syndrome were identified. A brain computed tomography scan was performed, with normal results. Lumbar puncture yielded crystal-clear cerebrospinal fluid: low glucose, elevated proteins and a cell count of 15/mm³. India-ink staining and Cryptococcus neoformans culture were both positive. Viral, lupus-anticoagulant and HIV tests were negative. Fluconazole 200 mg/kg/day, amphotericin-B 0.7 mg/kg/day and dexamethasone 1 mg/kg/day were prescribed. Forty-eight hours later the patient progressed to cerebral edema, multiple organ failure and death. We present a Cryptococcus spp. infection case report, addressing the public health challenge and the vulnerability of immunocompromised patients in Mexico. Copyright: © 2017 Secretaría de Salud.
Virtual reality-assisted robotic surgery simulation.
Albani, Justin M; Lee, David I
2007-03-01
For more than a decade, advancing computer technologies have allowed incorporation of virtual reality (VR) into surgical training. This has become especially important in training for laparoscopic procedures, which often are complex and leave little room for error. With the advent of robotic surgery and the development and prevalence of a commercial surgical system (da Vinci robot; Intuitive Surgical, Sunnyvale, CA), a valid VR-assisted robotic surgery simulator could minimize the steep learning curve associated with many of these complex procedures and thus enable better outcomes. To date, such simulation does not exist; however, several agencies and corporations are involved in making this dream a reality. We review the history and progress of VR simulation in surgical training, its promising applications in robotic-assisted surgery, and the remaining challenges to implementation.
Recent Advances in Transferable Coarse-Grained Modeling of Proteins
Kar, Parimal; Feig, Michael
2017-01-01
Computer simulations are indispensable tools for studying the structure and dynamics of biological macromolecules. Biochemical processes occur on different scales of length and time. Atomistic simulations cannot cover the relevant spatiotemporal scales at which cellular processes occur. To address this challenge, coarse-grained (CG) models of biological systems are employed. Over the last few years, many CG models for proteins have continued to be developed. However, many of them are not transferable with respect to different systems and different environments. In this review, we discuss those CG protein models that are transferable and that retain chemical specificity. We restrict ourselves to CG models of soluble proteins only. We also briefly review recent progress made in multi-scale hybrid all-atom/coarse-grained simulations of proteins. PMID:25443957
NASA Astrophysics Data System (ADS)
Bijeljic, Branko; Icardi, Matteo; Prodanović, Maša
2018-05-01
Substantial progress has been made over the last few decades in understanding the physics of multiphase flow and reactive transport phenomena in subsurface porous media. The confluence of advances in experimental techniques (including micromodels, X-ray microtomography and Nuclear Magnetic Resonance (NMR)) and in computational power has made it possible to observe static and dynamic multi-scale flow, transport and reactive processes, thus stimulating the development of a new generation of modelling tools from the pore to the field scale. One of the key challenges is to make experiments and models as complementary as possible, with continuously improving experimental methods, in order to increase the predictive capabilities of theoretical models across scales. This creates a need to establish rigorous benchmark studies of flow, transport and reaction in porous media, which can then serve as the basis for introducing more complex phenomena in future developments.
Obtaining lower bounds from the progressive hedging algorithm for stochastic mixed-integer programs
Gade, Dinakar; Hackebeil, Gabriel; Ryan, Sarah M.; ...
2016-04-02
We present a method for computing lower bounds in the progressive hedging algorithm (PHA) for two-stage and multi-stage stochastic mixed-integer programs. Computing lower bounds in the PHA allows one to assess the quality of the solutions generated by the algorithm contemporaneously. The lower bounds can be computed in any iteration of the algorithm by using dual prices that are calculated during execution of the standard PHA. In conclusion, we report computational results on stochastic unit commitment and stochastic server location problem instances, and explore the relationship between key PHA parameters and the quality of the resulting lower bounds.
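In the two-stage case the bound takes the standard Lagrangian form sketched below; the notation is assumed for illustration rather than copied from the paper.

```latex
% Two-stage sketch: scenarios s with probabilities p_s, scenario constraint sets Q_s,
% first-stage cost c, second-stage costs f_s, and PH weights w_s^k satisfying
% \sum_s p_s w_s^k = 0 at iteration k.
\[
  D\!\left(w^{k}\right) \;=\; \sum_{s} p_{s}\,
  \min_{(x_{s},\,y_{s})\in Q_{s}}
  \Bigl[\, \bigl(c + w_{s}^{k}\bigr)^{\!\top} x_{s} \;+\; f_{s}^{\top} y_{s} \,\Bigr]
  \;\le\; z^{*} .
\]
```

That is, solving the scenario subproblems with the current dual prices $w_{s}^{k}$ yields a valid lower bound on the optimal value $z^{*}$ at any iteration, which is what allows solution quality to be assessed as the algorithm runs.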
Current Computational Challenges for CMC Processes, Properties, and Structures
NASA Technical Reports Server (NTRS)
DiCarlo, James
2008-01-01
In comparison to current state-of-the-art metallic alloys, ceramic matrix composites (CMC) offer a variety of performance advantages, such as higher temperature capability (greater than the approx.2100 F capability for best metallic alloys), lower density (approx.30-50% metal density), and lower thermal expansion. In comparison to other competing high-temperature materials, CMC are also capable of providing significantly better static and dynamic toughness than un-reinforced monolithic ceramics and significantly better environmental resistance than carbon-fiber reinforced composites. Because of these advantages, NASA, the Air Force, and other U.S. government agencies and industries are currently seeking to implement these advanced materials into hot-section components of gas turbine engines for both propulsion and power generation. For applications such as these, CMC are expected to result in many important performance benefits, such as reduced component cooling air requirements, simpler component design, reduced weight, improved fuel efficiency, reduced emissions, higher blade frequencies, reduced blade clearances, and higher thrust. Although much progress has been made recently in the development of CMC constituent materials and fabrication processes, major challenges still remain for implementation of these advanced composite materials into viable engine components. The objective of this presentation is to briefly review some of those challenges that are generally related to the need to develop physics-based computational approaches to allow CMC fabricators and designers to model (1) CMC processes for fiber architecture formation and matrix infiltration, (2) CMC properties of high technical interest such as multidirectional creep, thermal conductivity, matrix cracking stress, damage accumulation, and degradation effects in aggressive environments, and (3) CMC component life times when all of these effects are interacting in a complex stress and service environment. To put these computational issues in perspective, the various modeling needs within these three areas are briefly discussed in terms of their technical importance and their key controlling mechanistic factors as we know them today. Emphasis is placed primarily on the SiC/SiC ceramic composite system because of its higher temperature capability and enhanced development within the CMC industry. A brief summary is then presented concerning on-going property studies aimed at addressing these CMC modeling needs within NASA in terms of their computational approaches and recent important results. Finally an overview perspective is presented on those key areas where further CMC computational studies are needed today to enhance the viability of CMC structural components for high-temperature applications.
John G. Michopoulos; John Hermanson; Athanasios Iliopoulos
2014-01-01
The research areas of multiaxial robotic testing and design optimization have recently been utilized for the purpose of data-driven constitutive characterization of anisotropic material systems. This effort has been enabled by progress in the areas of computers and information in engineering as well as progress in computational automation. Although our...
ERIC Educational Resources Information Center
HANKIN, EDWARD K.; AND OTHERS
This technical progress report covers the first three months of a project to develop computer-assisted prevocational reading and arithmetic courses for disadvantaged youths and adults. During the first month of operation, project personnel concentrated on such administrative matters as training staff and preparing facilities. An arithmetic program…
Research, Development and Validation of the Daily Demand Computer Schedule 360/50. Final Report.
ERIC Educational Resources Information Center
Ovard, Glen F.; Rowley, Vernon C.
A study was designed to further the research, development and validation of the Daily Demand Computer Schedule (DDCS), a system by which students can be rescheduled daily for facilitating their individual continuous progress through the curriculum. It will allow teachers to regroup students as needed based upon that progress, and will make time a…
ERIC Educational Resources Information Center
Forster, Natalie; Souvignier, Elmar
2011-01-01
The purpose of this study was to examine the technical adequacy of a computer-based assessment instrument which is based on hierarchical models of text comprehension for monitoring student reading progress following the Curriculum-Based Measurement (CBM) approach. At intervals of two weeks, 120 third-grade students finished eight CBM tests. To…
Strategies, Challenges and Prospects for Active Learning in the Computer-Based Classroom
ERIC Educational Resources Information Center
Holbert, K. E.; Karady, G. G.
2009-01-01
The introduction of computer-equipped classrooms into engineering education has brought with it a host of opportunities and issues. Herein, some of the challenges and successes for creating an environment for active learning within computer-based classrooms are described. The particular teaching approach developed for undergraduate electrical…
Current Challenges for HTCMC Aero-Propulsion Components
NASA Technical Reports Server (NTRS)
DiCarlo, James A.; Bansal, Narottam P.
2007-01-01
In comparison to the best metallic materials, HTCMC aero-propulsion engine components offer the opportunity of reduced weight and higher temperature operation, with corresponding improvements in engine cooling requirements, emissions, thrust, and specific fuel consumption. Although much progress has been made in the development of advanced HTCMC constituent materials and processes, major challenges still remain for their implementation into these components. The objectives of this presentation are to briefly review (1) potential HTCMC aero-propulsion components and their generic material performance requirements, (2) recent progress at NASA and elsewhere concerning advanced constituents and processes for meeting these requirements, (3) key HTCMC component implementation challenges that are currently being encountered, and (4) on-going activities within the new NASA Fundamental Aeronautics Program that are addressing these challenges.
ERIC Educational Resources Information Center
Gerwien, Johannes; Flecken, Monique
2015-01-01
The acquisition of linguistic structures that require "perspective-taking" at the level of "message generation" is challenging. We investigate use of "progressive aspect" in L2 event encoding, using a sentence priming paradigm. We focus on Dutch, in which use of progressive aspect is optional. The progressive consists…
Coarse-Graining of Polymer Dynamics via Energy Renormalization
NASA Astrophysics Data System (ADS)
Xia, Wenjie; Song, Jake; Phelan, Frederick; Douglas, Jack; Keten, Sinan
The computational prediction of the properties and performance of polymeric materials, as needed for materials design, is a grand challenge due to the prohibitive computational cost of all-atomistic (AA) simulations. Coarse-grained (CG) modeling is an essential strategy for making progress on this problem. While there has been intense activity in this area, effective methods of coarse-graining have been slow to develop. Our approach to this fundamental problem starts from the observation that integrating out degrees of freedom of the AA model strongly modifies the configurational entropy and the cohesive interaction. Based on this observation, we propose a temperature-dependent, systematic renormalization of the cohesive interaction in the CG model to recover the thermodynamic modifications in the system and the dynamics of the AA model. Here, we show that this energy renormalization approach to coarse-graining can faithfully reproduce the diffusive, segmental and glassy dynamics of the AA model over a large temperature range spanning from the Arrhenius melt to the non-equilibrium glassy state. The proposed approach offers a promising route toward thermodynamically consistent CG models with temperature transferability.
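For a generic pair potential, the temperature-dependent renormalization described above can be written schematically as below; the Lennard-Jones form and the Debye-Waller matching condition are assumptions for illustration, not the authors' exact expressions.

```latex
% Schematic only: cohesive strength epsilon is rescaled at each temperature so that a
% chosen dynamical observable of the CG model matches its all-atom counterpart.
\[
  U_{\mathrm{CG}}(r;T) \;=\; 4\,\epsilon(T)\!\left[\left(\frac{\sigma}{r}\right)^{12}
  - \left(\frac{\sigma}{r}\right)^{6}\right],
  \qquad
  \epsilon(T)\ \text{chosen such that}\quad
  \langle u^{2}\rangle_{\mathrm{CG}}\bigl(T;\epsilon(T)\bigr)
  \;=\; \langle u^{2}\rangle_{\mathrm{AA}}(T).
\]
```

Here $\langle u^{2}\rangle$ stands in for whichever dynamical observable (Debye-Waller factor, segmental relaxation time, or diffusivity) is used to calibrate the cohesive interaction at each temperature.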
Automated Generation of Tabular Equations of State with Uncertainty Information
NASA Astrophysics Data System (ADS)
Carpenter, John H.; Robinson, Allen C.; Debusschere, Bert J.; Mattsson, Ann E.
2015-06-01
As computational science pushes toward higher fidelity prediction, understanding the uncertainty associated with closure models, such as the equation of state (EOS), has become a key focus. Traditional EOS development often involves a fair amount of art, where expert modelers may appear as magicians, providing what is felt to be the closest possible representation of the truth. Automation of the development process gives a means by which one may demystify the art of EOS, while simultaneously obtaining uncertainty information in a manner that is both quantifiable and reproducible. We describe our progress on the implementation of such a system to provide tabular EOS tables with uncertainty information to hydrocodes. Key challenges include encoding the artistic expert opinion into an algorithmic form and preserving the analytic models and uncertainty information in a manner that is both accurate and computationally efficient. Results are demonstrated on a multi-phase aluminum model. *Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology
Salazar, Brittany M.; Balczewski, Emily A.; Ung, Choong Yong; Zhu, Shizhen
2016-01-01
Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring “big data” applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which “big data” and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases. PMID:28035989
Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology.
Salazar, Brittany M; Balczewski, Emily A; Ung, Choong Yong; Zhu, Shizhen
2016-12-27
Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.
Dynamical Modeling of Surface Tension
NASA Technical Reports Server (NTRS)
Brackbill, Jeremiah U.; Kothe, Douglas B.
1996-01-01
A recent review observed that free-surface flows 'represent some of the difficult remaining challenges in computational fluid dynamics'. Progress has come from new approaches to treating interfaces, such as the level-set method, and from improvements to older methods such as the volume-of-fluid (VOF) method. A common theme of many of these developments has been the regularization of discontinuities at the interface. One example of this approach is the continuum surface force (CSF) formulation for surface tension, which replaces the surface stress given by Laplace's equation with an equivalent volume force. Here, we describe how the CSF formulation might be made more useful. Specifically, we consider a derivation of the CSF equations from a minimization of surface energy, as outlined by Jacqmin (1996). This reformulation suggests that if one eliminates the computation of curvature in terms of a unit normal vector, parasitic currents may be eliminated. For this reformulation to work, the thickness of the transition region must be controlled. Several means of doing so, in addition to the one proposed by Jacqmin (1996), are considered.
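For context, the classical CSF construction (Brackbill, Kothe and Zemach, 1992) regularizes the Laplace surface stress into a volume force concentrated in a finite-thickness transition region; a commonly quoted form, included here as background rather than as the reformulation described in the abstract, is

```latex
% Classical CSF volume force (background sketch; notation assumed, not taken from the abstract):
% c is a smoothed color (indicator) function, [c] its jump across the interface,
% sigma the surface tension coefficient, and kappa the interface curvature.
\mathbf{F}_{sv}(\mathbf{x}) \;=\; \sigma\,\kappa(\mathbf{x})\,\frac{\nabla c(\mathbf{x})}{[c]},
\qquad
\kappa \;=\; -\,\nabla\cdot\hat{\mathbf{n}},
\qquad
\hat{\mathbf{n}} \;=\; \frac{\nabla c}{|\nabla c|}.
```

The dependence of kappa on the computed unit normal is precisely the ingredient the energy-based reformulation seeks to remove, since errors in that normal are a known source of parasitic currents.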
Challenges for Rule Systems on the Web
NASA Astrophysics Data System (ADS)
Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang
The RuleML Challenge started in 2007 with the objective of stimulating work on the implementation issues of managing, integrating, interoperating, and interchanging rules in an open distributed environment, such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules. Reactive rules are further divided into event-condition-action (ECA) rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once a computer-executable policy is viewed as a declarative set of rules and ontologies that guides the behavior of entities within a system, real-world policies can be implemented flexibly without rewriting program code, as was done before. Fortunately, de facto rule markup languages such as RuleML and RIF provide portability and interchange of rules across different rule systems; without them, executing real-life rule-based applications on the Web would be nearly impossible. Several commercial and open-source rule engines are available for rule-based applications. However, a standard rule language and benchmark are still needed, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.
Integration of language and sensor information
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.; Weijers, Bertus
2003-04-01
The talk describes the development of basic technologies for intelligent systems that fuse data from multiple domains and lead to automated computational techniques for understanding data contents. Understanding involves inferring appropriate decisions and recommending proper actions, which in turn requires fusion of data and knowledge about objects, situations, and actions. Data might include sensory data, verbal reports, intelligence intercepts, or public records, whereas knowledge ought to encompass the whole range of objects, situations, people and their behavior, and knowledge of languages. In the past, a fundamental difficulty in combining knowledge with data was the combinatorial complexity of the computations: too many combinations of data and knowledge pieces had to be evaluated. Recent progress in understanding natural intelligent systems, including the human mind, has led to neurophysiologically motivated architectures for solving these challenging problems, in particular by exploiting the role of emotional neural signals in overcoming the combinatorial complexity of older logic-based approaches. Whereas past approaches based on logic tended to identify logic with language and thinking, recent studies in cognitive linguistics have led to an appreciation of the more complicated nature of linguistic models. Little is known about the details of the brain mechanisms integrating language and thinking. The understanding and fusion of linguistic information with sensory data represent a novel and challenging aspect of developing integrated fusion systems. The presentation will describe a non-combinatorial approach to this problem and outline techniques that can be used for fusing diverse and uncertain knowledge with sensory and linguistic data.
HammerCloud: A Stress Testing System for Distributed Analysis
NASA Astrophysics Data System (ADS)
van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo
2011-12-01
Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress-testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain a predefined number of jobs running at steady state at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
Shaping the light for the investigation of depth-extended scattering media
NASA Astrophysics Data System (ADS)
Osten, W.; Frenner, K.; Pedrini, G.; Singh, A. K.; Schindler, J.; Takeda, M.
2018-02-01
Scattering media are an ongoing challenge for all kinds of imaging technologies, both coherent and incoherent. Inspired by new approaches in computational imaging and supported by the availability of powerful computers, spatial light modulators, light sources, and detectors, a variety of new methods, ranging from holography to time-of-flight imaging, phase conjugation, phase recovery using iterative algorithms, and correlation techniques, have been introduced and applied to different types of objects. However, despite the obvious progress in this field, several problems are still a matter of investigation, and their solution could open new doors for the inspection and application of scattering media. In particular, these open questions include the possibility of extending the 2D approach to the inspection of depth-extended objects, the direct use of a scattering medium as a simple imaging tool for complex objects, and the improvement of coherent inspection techniques for the dimensional characterization of incoherently radiating spots embedded in scattering media. In this paper we show our recent findings in coping with these challenges. First we describe how to explore depth-extended objects by means of a scattering medium. Afterwards, we extend this approach by implementing a new type of microscope making use of a simple scatter plate as a kind of flat and unconventional imaging lens. Finally, we introduce our shearing interferometer in combination with structured illumination for retrieving the axial position of fluorescent light-emitting spots embedded in scattering media.
Reinforcement learning and episodic memory in humans and animals: an integrative framework
Gershman, Samuel J.; Daw, Nathaniel D.
2018-01-01
We review the psychology and neuroscience of reinforcement learning (RL), which has witnessed significant progress in the last two decades, enabled by the comprehensive experimental study of simple learning and decision-making tasks. However, the simplicity of these tasks misses important aspects of reinforcement learning in the real world: (i) State spaces are high-dimensional, continuous, and partially observable; this implies that (ii) data are relatively sparse: indeed precisely the same situation may never be encountered twice; and also that (iii) rewards depend on long-term consequences of actions in ways that violate the classical assumptions that make RL tractable. A seemingly distinct challenge is that, cognitively, these theories have largely connected with procedural and semantic memory: how knowledge about action values or world models extracted gradually from many experiences can drive choice. This misses many aspects of memory related to traces of individual events, such as episodic memory. We suggest that these two gaps are related. In particular, the computational challenges can be dealt with, in part, by endowing RL systems with episodic memory, allowing them to (i) efficiently approximate value functions over complex state spaces, (ii) learn with very little data, and (iii) bridge long-term dependencies between actions and rewards. We review the computational theory underlying this proposal and the empirical evidence to support it. Our proposal suggests that the ubiquitous and diverse roles of memory in RL may function as part of an integrated learning system. PMID:27618944
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
NASA Technical Reports Server (NTRS)
Simoneau, Robert J.; Strazisar, Anthony J.; Sockol, Peter M.; Reid, Lonnie; Adamczyk, John J.
1987-01-01
The discipline research in turbomachinery, which is directed toward building the tools needed to understand such a complex flow phenomenon, is based on the fact that flow in turbomachinery is fundamentally unsteady or time dependent. Success in building a reliable inventory of analytic and experimental tools will depend on how the time and time-averages are treated, as well as on how the space and space-averages are treated. The raw tools at our disposal (both experimental and computational) are truly powerful, and their numbers are growing at a staggering pace. As a result of this power, a case can be made that information is outstripping understanding. The challenge is to develop a set of computational and experimental tools which genuinely increase understanding of the fluid flow and heat transfer in a turbomachine. Viewgraphs outline a philosophy based on working on a stairstep hierarchy of mathematical and experimental complexity to build a system of tools which enables one to aggressively design the turbomachinery of the next century. Examples of the types of computational and experimental tools under current development at Lewis, with progress to date, are examined. The examples include work in both the time-resolved and time-averaged domains. Finally, an attempt is made to identify the proper place for Lewis in this continuum of research.
Semiautomatic tumor segmentation with multimodal images in a conditional random field framework.
Hu, Yu-Chi; Grossberg, Michael; Mageras, Gikas
2016-04-01
Volumetric medical images of a single subject can be acquired using different imaging modalities, such as computed tomography, magnetic resonance imaging (MRI), and positron emission tomography. In this work, we present a semiautomatic segmentation algorithm that can leverage the synergies between different image modalities while integrating interactive human guidance. The algorithm provides a statistical segmentation framework that partly automates the segmentation task while still maintaining critical human oversight. The statistical models are trained interactively, using simple brush strokes to indicate tumor and nontumor tissues and using intermediate results within a patient's image study. To accomplish the segmentation, we construct the energy function in the conditional random field (CRF) framework. For each slice, the energy function is set using the estimated probabilities from both the user's brush-stroke data and previously approved segmented slices within the patient study. The progressive segmentation is obtained using a graph-cut-based minimization. Although no similar semiautomated algorithm is currently available, we evaluated our method on an MRI data set from the Medical Image Computing and Computer Assisted Intervention Society multimodal brain segmentation challenge (BRATS 2012 and 2013) against a similar fully automatic method based on CRF and a semiautomatic method based on grow-cut, and our method shows superior performance.
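As a rough illustration of the final step described above (not the authors' implementation), a binary CRF energy with probability-derived unary terms and a simple smoothness prior can be minimized by graph cut using the third-party PyMaxflow package; the probability map, smoothness weight, and choice of library here are all assumptions made for the sketch.

```python
# Minimal sketch of binary CRF segmentation by graph cut (assumes the PyMaxflow package).
# prob_tumor is a per-pixel tumor probability for one slice, e.g. estimated from brush strokes.
import numpy as np
import maxflow

def segment_slice(prob_tumor: np.ndarray, smoothness: float = 2.0) -> np.ndarray:
    eps = 1e-6
    # Unary terms: negative log-likelihoods for "tumor" and "not tumor".
    cost_tumor = -np.log(prob_tumor + eps)
    cost_background = -np.log(1.0 - prob_tumor + eps)

    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(prob_tumor.shape)
    # Pairwise (smoothness) terms on the 4-connected pixel grid.
    g.add_grid_edges(nodes, smoothness)
    # Terminal edges encode the unary terms.
    g.add_grid_tedges(nodes, cost_tumor, cost_background)

    g.maxflow()
    # True where the pixel is assigned the "tumor" label under this convention.
    return g.get_grid_segments(nodes)

# Toy usage: a 64x64 probability map with a high-probability blob in the middle.
prob = np.full((64, 64), 0.1)
prob[24:40, 24:40] = 0.9
mask = segment_slice(prob)
print(mask.sum(), "pixels labeled tumor")
```

In the semiautomatic setting described in the abstract, the probability map would come from the interactively trained statistical model rather than being fixed in advance, and the approved segmentation of one slice would feed the unary terms of the next.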
Edge co-occurrences can account for rapid categorization of natural versus animal images
NASA Astrophysics Data System (ADS)
Perrinet, Laurent U.; Bednar, James A.
2015-06-01
Making a judgment about the semantic category of a visual scene, such as whether it contains an animal, is typically assumed to involve high-level associative brain areas. Previous explanations require progressively analyzing the scene hierarchically at increasing levels of abstraction, from edge extraction to mid-level object recognition and then object categorization. Here we show that the statistics of edge co-occurrences alone are sufficient to perform a rough yet robust (translation, scale, and rotation invariant) scene categorization. We first extracted the edges from images using a scale-space analysis coupled with a sparse coding algorithm. We then computed the “association field” for different categories (natural, man-made, or containing an animal) by computing the statistics of edge co-occurrences. These differed strongly, with animal images having more curved configurations. We show that this geometry alone is sufficient for categorization, and that the pattern of errors made by humans is consistent with this procedure. Because these statistics could be measured as early as the primary visual cortex, the results challenge widely held assumptions about the flow of computations in the visual system. The results also suggest new algorithms for image classification and signal processing that exploit correlations between low-level structure and the underlying semantic category.
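The core statistic described here, a histogram over the relative geometry of edge pairs, is simple to write down; the sketch below is an illustration using assumed edge attributes (position and orientation), not the authors' code, and accumulates co-occurrence counts over relative distance and relative orientation, which is the kind of "association field" the abstract refers to.

```python
# Sketch: second-order statistics of edge co-occurrences (assumed input format).
# Each edge is (x, y, theta): position and orientation, e.g. from a sparse-coding edge extractor.
import numpy as np

def edge_cooccurrence_histogram(edges, n_dist_bins=8, n_angle_bins=12, max_dist=50.0):
    xy = np.array([(e[0], e[1]) for e in edges], dtype=float)
    theta = np.array([e[2] for e in edges], dtype=float)
    hist = np.zeros((n_dist_bins, n_angle_bins))
    for i in range(len(edges)):
        d = np.hypot(xy[:, 0] - xy[i, 0], xy[:, 1] - xy[i, 1])
        dtheta = np.mod(theta - theta[i], np.pi)          # relative orientation in [0, pi)
        keep = (d > 0) & (d < max_dist)                   # exclude self-pairs and distant edges
        d_bin = np.minimum((d[keep] / max_dist * n_dist_bins).astype(int), n_dist_bins - 1)
        a_bin = np.minimum((dtheta[keep] / np.pi * n_angle_bins).astype(int), n_angle_bins - 1)
        np.add.at(hist, (d_bin, a_bin), 1.0)
    return hist / max(hist.sum(), 1.0)                    # normalized co-occurrence statistics

# Toy usage: random edges stand in for the output of the scale-space + sparse-coding stage.
rng = np.random.default_rng(0)
edges = np.column_stack([rng.uniform(0, 100, 500),
                         rng.uniform(0, 100, 500),
                         rng.uniform(0, np.pi, 500)])
print(edge_cooccurrence_histogram(edges).shape)
```

Per-category histograms of this kind can then be compared (for example with a chi-squared or Kullback-Leibler distance) to perform the rough categorization the authors report; the full association field in the paper additionally tracks the azimuthal arrangement of edge pairs, which the sketch omits.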
High-End Computing Challenges in Aerospace Design and Engineering
NASA Technical Reports Server (NTRS)
Bailey, F. Ronald
2004-01-01
High-End Computing (HEC) has had a significant impact on aerospace design and engineering and is poised to have an even greater one in the future. In this paper we describe four aerospace design and engineering challenges: Digital Flight, Launch Simulation, Rocket Fuel System and Digital Astronaut. The paper discusses the modeling capabilities needed for each challenge and presents projections of near- and far-term HEC computing requirements. NASA's HEC Project Columbia is described, and programming strategies necessary to achieve high real performance are presented.
The National Cancer Institute (NCI) Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce that teams led by Jaewoo Kang (Korea University) and by Yuanfang Guan with Hongyang Li (University of Michigan) are the best performers of the NCI-CPTAC DREAM Proteogenomics Computational Challenge. Over 500 participants from 20 countries registered for the Challenge, which offered $25,000 in cash awards contributed by the NVIDIA Foundation through its Compute the Cure initiative.
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2014 CFR
2014-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
Looking from Within: Prospects and Challenges for Progressive Education in Indonesia
ERIC Educational Resources Information Center
Zulfikar, Teuku
2013-01-01
Many Indonesian scholars (Azra, 2002; Darmaningtyas, 2004; Yunus, 2004), have attempted to bring progressive education to their country. They believe that progressive practices such as critical thinking, critical dialogue and child-centered instruction will help students learn better. However, this implementation is resisted because of cultural…
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.
Race to the Top - Early Learning Challenge: 2015 Annual Performance Report. New Jersey
ERIC Educational Resources Information Center
Race to the Top - Early Learning Challenge, 2016
2016-01-01
This Race to the Top - Early Learning Challenge (RTT-ELC) annual performance report for the year 2015 describes New Jersey's accomplishments, lessons learned, challenges, and strategies New Jersey will implement to address those challenges. New Jersey's remarkable progress in increasing participation in their tiered quality rating and improvement…
EDITORIAL: XXVI IUPAP Conference on Computational Physics (CCP2014)
NASA Astrophysics Data System (ADS)
Sandvik, A. W.; Campbell, D. K.; Coker, D. F.; Tang, Y.
2015-09-01
The 26th IUPAP Conference on Computational Physics, CCP2014, was held in Boston, Massachusetts, during August 11-14, 2014. Almost 400 participants from 38 countries convened at the George Sherman Union at Boston University for four days of plenary and parallel sessions spanning a broad range of topics in computational physics and related areas. The first meeting in the series that developed into the annual Conference on Computational Physics (CCP) was held in 1989, also on the campus of Boston University and chaired by our colleague Claudio Rebbi. The express purpose of that meeting was to discuss the progress, opportunities and challenges of common interest to physicists engaged in computational research. The conference having returned to the site of its inception, it is interesting to reflect on the development of the field during the intervening years. Though 25 years is a short time for mankind, computational physics has taken giant leaps during these years, not only because of the enormous increases in computer power but especially because of the development of new methods and algorithms, and the growing awareness of the opportunities the new technologies and methods can offer. Computational physics now represents a "third leg" of research alongside analytical theory and experiments in almost all subfields of physics, and because of this there is also increasing specialization within the community of computational physicists. It is therefore a challenge to organize a meeting such as CCP, which must have sufficient depth in different areas to hold the interest of experts while at the same time being broad and accessible. Still, at a time when computational research continues to gain in importance, the CCP series is critical in the way it fosters cross-fertilization among fields, with many participants specifically attending in order to get exposure to new methods in fields outside their own. As organizers and editors of these Proceedings, we are very pleased with the high quality of the papers provided by the participants. These articles represent a good cross-section of what was presented at the meeting, and it is our hope that they will not only be useful individually for their specific scientific content but will also serve collectively as a historical snapshot of the state of computational physics at the time. The remainder of this Preface contains lists detailing the organizational structure of CCP2014, endorsers and sponsors of the meeting, plenary and invited talks, and a presentation of the 2014 IUPAP C20 Young Scientist Prize. We would like to take the opportunity to again thank all those who contributed to the success of CCP2014, as organizers, sponsors, presenters, exhibitors, and participants. Anders Sandvik, David Campbell, David Coker, Ying Tang
Next Generation Distributed Computing for Cancer Research
Agarwal, Pankaj; Owzar, Kouros
2014-01-01
Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing. PMID:25983539
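As background on the MapReduce pattern that Hadoop implements (a generic illustration, not the benchmarking setup described in this review), a Hadoop Streaming job can be driven by a plain Python script acting as mapper and reducer; the hypothetical example below counts aligned reads per reference sequence from SAM-formatted alignment output.

```python
#!/usr/bin/env python3
"""Generic Hadoop Streaming mapper/reducer sketch (illustration only): counts aligned
NGS reads per reference sequence from SAM-formatted input. Invoke as
'sam_counts.py map' for the mapper and 'sam_counts.py reduce' for the reducer."""
import sys
from itertools import groupby

def mapper(stream):
    for line in stream:
        if line.startswith("@"):              # skip SAM header lines
            continue
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 3:
            continue
        flag, rname = int(fields[1]), fields[2]
        if flag & 4 or rname == "*":          # skip unmapped reads
            continue
        print(f"{rname}\t1")                  # emit (reference, 1) pairs

def reducer(stream):
    # Hadoop sorts mapper output by key before the reducer; for a local test,
    # pipe the mapper output through 'sort' to mimic the shuffle phase.
    parsed = (line.rstrip("\n").split("\t", 1) for line in stream)
    for rname, group in groupby(parsed, key=lambda kv: kv[0]):
        print(f"{rname}\t{sum(int(v) for _, v in group)}")

if __name__ == "__main__":
    (mapper if sys.argv[1:] == ["map"] else reducer)(sys.stdin)
```

On a cluster the same file would typically be supplied as the mapper and reducer of a Hadoop Streaming job; the proof-of-concept described in the review instead wraps a full read aligner in the map phase, but the data-flow pattern is the same.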
Challenges at the Frontiers of Matter and Energy: Transformative Opportunities for Discovery Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemminger, John C.; Sarrao, John; Crabtree, George
FIVE TRANSFORMATIVE OPPORTUNITIES FOR DISCOVERY SCIENCE
As a result of this effort, it has become clear that the progress made to date on the five Grand Challenges has created a springboard for seizing five new Transformative Opportunities that have the potential to further transform key technologies involving matter and energy. These five new Transformative Opportunities and the evidence supporting them are discussed in this new report, “Challenges at the Frontiers of Matter and Energy: Transformative Opportunities for Discovery Science.”
Mastering Hierarchical Architectures and Beyond-Equilibrium Matter
Complex materials and chemical processes transmute matter and energy, for example from CO2 and water to chemical fuel in photosynthesis, from visible light to electricity in solar cells, and from electricity to light in light-emitting diodes (LEDs). Such functionality requires complex assemblies of heterogeneous materials in hierarchical architectures that display time-dependent, away-from-equilibrium behaviors. Much of the foundation of our understanding of such transformations, however, is based on monolithic single-phase materials operating at or near thermodynamic equilibrium. The emergent functionalities enabling next-generation disruptive energy technologies require mastering the design, synthesis, and control of complex hierarchical materials employing dynamic far-from-equilibrium behavior. A key guide in this pursuit is nature, for biological systems prove the power of hierarchical assembly and far-from-equilibrium behavior. The challenges here are many: a description of the functionality of hierarchical assemblies in terms of their constituent parts, a blueprint of atomic and molecular positions for each constituent part, and a synthesis strategy for (a) placing the atoms and molecules in the proper positions for the component parts and (b) arranging the component parts into the required hierarchical structure. Targeted functionality will open the door to significant advances in harvesting, transforming (e.g., reducing CO2, splitting water, and fixing nitrogen), storing, and using energy to create new materials, manufacturing processes, and technologies, the lifeblood of human societies and economic growth.
Beyond Ideal Materials and Systems: Understanding the Critical Roles of Heterogeneity, Interfaces, and Disorder
Real materials, both natural ones and those we engineer, are usually a complex mixture of compositional and structural heterogeneities, interfaces, and disorder across all spatial and temporal scales. It is the fluctuations and disorderly states of these heterogeneities and interfaces that often determine the system’s properties and functionality. Much of our fundamental scientific knowledge is based on “ideal” systems, meaning materials that are observed in “frozen” states or represented by spatially or temporally averaged states. Too often, this approach has yielded overly simplistic models that hide important nuances and do not capture the complex behaviors of materials under realistic conditions. These behaviors drive vital chemical transformations such as catalysis, which initiates most industrial manufacturing processes, and friction and corrosion, the parasitic effects of which cost the U.S. economy billions of dollars annually.
Expanding our scientific knowledge from the relative simplicity of ideal, perfectly ordered, or structurally averaged materials to the true complexity of real-world heterogeneities, interfaces, and disorder should enable us to realize enormous benefits in the materials and chemical sciences, which translates to the energy sciences, including solar and nuclear power, hydraulic fracturing, power conversion, airframes, and batteries.
Harnessing Coherence in Light and Matter
Quantum coherence in light and matter is a measure of the extent to which a wave field vibrates in unison with itself at neighboring points in space and time. Although this phenomenon is expressed at the atomic and electronic scales, it can dominate the macroscopic properties of materials and chemical reactions such as superconductivity and efficient photosynthesis. In recent years, enormous progress has been made in recognizing, manipulating, and exploiting quantum coherence. This progress has already elucidated the role that symmetry plays in protecting coherence in key materials, taught us how to use light to manipulate atoms and molecules, and provided us with increasingly sophisticated techniques for controlling and probing the charges and spins of quantum coherent systems. With the arrival of new sources of coherent light and electron beams, thanks in large part to investments by the U.S. Department of Energy’s Office of Basic Energy Sciences (BES), there is now an opportunity to engineer coherence in heterostructures that incorporate multiple types of materials and to control complex, multistep chemical transformations. This approach will pave the way for quantum information processing and next-generation photovoltaic cells and sensors.
Revolutionary Advances in Models, Mathematics, Algorithms, Data, and Computing
Science today is benefiting from a convergence of theoretical, mathematical, computational, and experimental capabilities that put us on the brink of greatly accelerating our ability to predict, synthesize, and control new materials and chemical processes, and to understand the complexities of matter across a range of scales. Imagine being able to chart a path through a vast sea of possible new materials to find a select few with desired properties. Instead of the time-honored forward approach, in which materials with desired properties are found through either trial-and-error experiments or lucky accidents, we have the opportunity to inversely design and create new materials that possess the properties we desire. The traditional approach has allowed us to make only a tiny fraction of all the materials that are theoretically possible. The inverse design approach, through the harmonious convergence of theoretical, mathematical, computational, and experimental capabilities, could usher in a virtual cornucopia of new materials with functionalities far beyond what nature can provide. Similarly, enhanced mathematical and computational capabilities significantly enhance our ability to extract physical and chemical insights from vastly larger data streams gathered during multimodal and multidimensional experiments using advanced characterization facilities.
Exploiting Transformative Advances in Imaging Capabilities across Multiple Scales
Historically, improvements in imaging capabilities have always resulted in improved understanding of scientific phenomena. A prime challenge today is finding ways to reconstruct raw data, obtained by probing and mapping matter across multiple scales, into analyzable images.
BES investments in new and improved imaging facilities, most notably synchrotron x-ray sources, free-electron lasers, electron microscopes, and neutron sources, have greatly advanced our powers of observation, as have substantial improvements in laboratory-scale technologies. Furthermore, BES is now planning or actively discussing exciting new capabilities. Taken together, these advances in imaging capabilities provide an opportunity to expand our ability to observe and study matter from the 3D spatial perspectives of today to true “4D” spatially and temporally resolved maps of dynamics that allow quantitative predictions of time-dependent material properties and chemical processes. The knowledge gained will impact data storage, catalyst design, drug delivery, structural materials, and medical implants, to name just a few key technologies.
ENABLING SUCCESS
Seizing each of these five Transformative Opportunities, as well as accelerating further progress on Grand Challenge research, will require specific, targeted investments from BES in the areas of synthesis, meaning the ability to make the materials and architectures that are envisioned; instrumentation and tools, a category that includes theory and computation; and human capital, the most important asset for advancing the Grand Challenges and Transformative Opportunities. While “Challenges at the Frontiers of Matter and Energy: Transformative Opportunities for Discovery Science” could be viewed as a sequel to the original Grand Challenges report, it breaks much new ground in its assessment of the scientific landscape today versus the scientific landscape just a few years ago. In the original Grand Challenges report, it was noted that if the five Grand Challenges were met, our ability to direct matter and energy would be measured only by the limits of human imagination. This new report shows that, prodded by those challenges, the scientific community is positioned today to seize new opportunities whose impacts promise to be transformative for science and society, as well as to dramatically accelerate progress in the pursuit of the original Grand Challenges.
Results of the First National Assessment of Computer Competence (The Printout).
ERIC Educational Resources Information Center
Balajthy, Ernest
1988-01-01
Discusses the findings of the National Assessment of Educational Progress 1985-86 survey of American students' computer competence, focusing on findings of interest to reading teachers who use computers. (MM)
Computational challenges in modeling gene regulatory events.
Pataskar, Abhijeet; Tiwari, Vijay K
2016-10-19
Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating "omics" data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology.
Computer-Assisted Diagnostic Decision Support: History, Challenges, and Possible Paths Forward
ERIC Educational Resources Information Center
Miller, Randolph A.
2009-01-01
This paper presents a brief history of computer-assisted diagnosis, including challenges and future directions. Some ideas presented in this article on computer-assisted diagnostic decision support systems (CDDSS) derive from prior work by the author and his colleagues (see list in Acknowledgments) on the INTERNIST-1 and QMR projects. References…
NASA Astrophysics Data System (ADS)
Marinos, Alexandros; Briscoe, Gerard
Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community and utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.
Advanced computations in plasma physics
NASA Astrophysics Data System (ADS)
Tang, W. M.
2002-05-01
Scientific simulation in tandem with theory and experiment is an essential tool for understanding complex plasma behavior. In this paper we review recent progress and future directions for advanced simulations in magnetically confined plasmas with illustrative examples chosen from magnetic confinement research areas such as microturbulence, magnetohydrodynamics, magnetic reconnection, and others. Significant recent progress has been made in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics, giving increasingly good agreement between experimental observations and computational modeling. This was made possible by innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales together with access to powerful new computational resources. In particular, the fusion energy science community has made excellent progress in developing advanced codes for which computer run-time and problem size scale well with the number of processors on massively parallel machines (MPP's). A good example is the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPP's to produce three-dimensional, general geometry, nonlinear particle simulations which have accelerated progress in understanding the nature of turbulence self-regulation by zonal flows. It should be emphasized that these calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In general, results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. The associated scientific excitement should serve to stimulate improved cross-cutting collaborations with other fields and also to help attract bright young talent to plasma science.
Domanskyi, Sergii; Nicholatos, Justin W; Schilling, Joshua E; Privman, Vladimir; Libert, Sergiy
2017-11-01
Apoptosis is essential for numerous processes, such as development, resistance to infections, and suppression of tumorigenesis. Here, we investigate the influence of the nutrient-sensing and longevity-assuring enzyme SIRT6 on the dynamics of apoptosis triggered by serum starvation. Specifically, we characterize the progression of apoptosis in wild-type and SIRT6-deficient mouse embryonic fibroblasts using time-lapse flow cytometry and computational modelling based on rate equations and cell distribution analysis. We find that SIRT6-deficient cells resist apoptosis by delaying its initiation. Interestingly, once apoptosis is initiated, the rate of its progression is higher in SIRT6-null cells compared to identically cultured wild-type cells. However, SIRT6-null cells succumb to apoptosis more slowly, not only in response to nutrient deprivation but also in response to other stresses. Our data suggest that SIRT6 plays a role in several distinct steps of apoptosis. Overall, we demonstrate the utility of our computational model for describing stages of apoptosis progression and the integrity of the cellular membrane. Such measurements will be useful in a broad range of biological applications.
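Abstractly, a rate-equation description of this kind tracks the flow of cells through stages (for example live, apoptosis-initiated, membrane-compromised); the toy model below is a generic sketch of that idea with made-up stage names and rate constants, not the specific model fitted in the study.

```python
# Toy rate-equation model of apoptosis progression (illustrative stages and rates only).
# L: live cells, A: cells that have initiated apoptosis, D: cells with compromised membranes.
import numpy as np
from scipy.integrate import solve_ivp

def apoptosis_rates(t, y, k_init, k_prog):
    L, A, D = y
    return [-k_init * L,                   # initiation: live -> apoptotic
            k_init * L - k_prog * A,       # progression: apoptotic -> membrane-compromised
            k_prog * A]

y0 = [1.0, 0.0, 0.0]                       # start with all cells live (normalized fractions)
t_eval = np.linspace(0, 48, 97)            # hours of serum starvation

# Hypothetical parameter sets: delayed initiation but faster progression mimics the
# qualitative SIRT6-deficient behaviour described in the abstract.
for label, k_init, k_prog in [("wild-type", 0.10, 0.20), ("SIRT6-deficient", 0.05, 0.35)]:
    sol = solve_ivp(apoptosis_rates, (0, 48), y0, t_eval=t_eval, args=(k_init, k_prog))
    print(label, "fraction membrane-compromised at 48 h:", round(sol.y[2, -1], 3))
```

Fitting such rate constants to time-lapse flow-cytometry distributions is what allows initiation delay and progression speed to be separated, which is the distinction the abstract draws between the two genotypes.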
Batey, Michael A.; Almeida, Gilberto S.; Wilson, Ian; Dildey, Petra; Sharma, Abhishek; Blair, Helen; Hide, I. Geoff; Heidenreich, Olaf; Vormoor, Josef; Maxwell, Ross J.; Bacon, Chris M.
2014-01-01
Ewing sarcoma and osteosarcoma represent the two most common primary bone tumours in childhood and adolescence, with bone metastases being the most adverse prognostic factor. In prostate cancer, osseous metastasis poses a major clinical challenge. We developed a preclinical orthotopic model of Ewing sarcoma, reflecting the biology of the tumour-bone interactions in human disease and allowing in vivo monitoring of disease progression, and compared this with models of osteosarcoma and prostate carcinoma. Human tumour cell lines were transplanted into non-obese diabetic/severe combined immunodeficient (NSG) and Rag2−/−/γc−/− mice by intrafemoral injection. For Ewing sarcoma, minimal cell numbers (1000–5000) injected in small volumes were able to induce orthotopic tumour growth. Tumour progression was studied using positron emission tomography, computed tomography, magnetic resonance imaging and bioluminescent imaging. Tumours and their interactions with bones were examined by histology. Each tumour induced bone destruction and outgrowth of extramedullary tumour masses, together with characteristic changes in bone that were well visualised by computed tomography, which correlated with post-mortem histology. Ewing sarcoma and, to a lesser extent, osteosarcoma cells induced prominent reactive new bone formation. Osteosarcoma cells produced osteoid and mineralised “malignant” bone within the tumour mass itself. Injection of prostate carcinoma cells led to osteoclast-driven osteolytic lesions. Bioluminescent imaging of Ewing sarcoma xenografts allowed easy and rapid monitoring of tumour growth and detection of tumour dissemination to lungs, liver and bone. Magnetic resonance imaging proved useful for monitoring soft tissue tumour growth and volume. Positron emission tomography proved to be of limited use in this model. Overall, we have developed an orthotopic in vivo model for Ewing sarcoma and other primary and secondary human bone malignancies, which resemble the human disease. We have shown the utility of small animal bioimaging for tracking disease progression, making this model a useful assay for preclinical drug testing. PMID:24409320
CAGI: Computer Aided Grid Interface. A work in progress
NASA Technical Reports Server (NTRS)
Soni, Bharat K.; Yu, Tzu-Yi; Vaughn, David
1992-01-01
Progress realized in the development of the Computer Aided Grid Interface (CAGI) software system in integrating CAD/CAM geometric system output and/or Initial Graphics Exchange Specification (IGES) files, geometry manipulations associated with grid generation, and robust grid generation methodologies is presented. CAGI is being developed in a modular fashion and will offer a fast, efficient, and economical response to geometry/grid preparation, allowing basic geometry to be upgraded step by step, interactively and under permanent visual control, while minimizing the differences between the actual hardware surface descriptions and the corresponding numerical analog. The computer code GENIE is used as a basis. The Non-Uniform Rational B-Spline (NURBS) representation of sculptured surfaces is utilized for surface grid redistribution. The computer-aided analysis system PATRAN is adapted as a CAD/CAM system. The progress realized in NURBS surface grid generation, the development of an IGES transformer, and geometry adaptation using PATRAN will be presented, along with their applicability to grid generation associated with rocket propulsion applications.
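For reference, the NURBS surface representation mentioned here has the standard rational tensor-product form (notation assumed for this note: P_ij are control points, w_ij weights, and N_{i,p}, N_{j,q} B-spline basis functions of degrees p and q):

```latex
\mathbf{S}(u,v) \;=\;
\frac{\displaystyle\sum_{i=0}^{n}\sum_{j=0}^{m}
      N_{i,p}(u)\, N_{j,q}(v)\, w_{ij}\, \mathbf{P}_{ij}}
     {\displaystyle\sum_{i=0}^{n}\sum_{j=0}^{m}
      N_{i,p}(u)\, N_{j,q}(v)\, w_{ij}}
```

Redistributing a surface grid then amounts to re-evaluating S(u, v) on a new distribution of parameter-space points, which is why an exact NURBS description of the CAD surface keeps the regenerated grid on the true geometry rather than on a faceted approximation.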
How controllers compensate for the lack of flight progress strips.
DOT National Transportation Integrated Search
1996-02-01
The role of the Flight Progress Strip, currently used to display important flight data, has been debated because of long-range plans to automate the air traffic control (ATC) human-computer interface. Currently, the Flight Progress Strip is viewed by ...
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Astrophysics Data System (ADS)
Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.
2011-12-01
Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also research and application programs being launched in academia and government to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, an Infrastructure as a Service (IaaS) offering that delivers on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) migrated several of its applications to the Nebula as a proof of concept, including: a) the Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data processing workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis of, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data processing workflow. This workflow consists of a series of algorithms used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer. The results show that Nebula has significantly better performance than the local machine. Much of the difference was due to newer equipment in the Nebula than in the legacy computer, which is suggestive of a potential economic advantage beyond elastic capacity, i.e., access to up-to-date hardware versus legacy hardware that must be maintained past its prime to amortize the cost. In addition to a trade study of the advantages and challenges of porting complex processing to the cloud, a tutorial was developed to enable further progress in utilizing the Nebula for Earth Science applications and to better understand the potential of Cloud Computing for further data- and computing-intensive Earth Science research. In particular, highly bursty computing such as that experienced in the user-demand-driven Giovanni system may become more tractable in a Cloud environment. Our future work will continue to focus on migrating more GES DISC applications and instances, e.g. Giovanni instances, to the Nebula platform and on putting the mature migrated applications into operation on the Nebula.
Paesani, Francesco
2016-09-20
The central role played by water in fundamental processes relevant to different disciplines, including chemistry, physics, biology, materials science, geology, and climate research, cannot be overemphasized. It is thus not surprising that, since the pioneering work by Stillinger and Rahman, many theoretical and computational studies have attempted to develop a microscopic description of the unique properties of water under different thermodynamic conditions. Consequently, numerous molecular models based on either molecular mechanics or ab initio approaches have been proposed over the years. However, despite continued progress, the correct prediction of the properties of water from small gas-phase clusters to the liquid phase and ice through a single molecular model remains challenging. To a large extent, this is due to the difficulties encountered in the accurate modeling of the underlying hydrogen-bond network, in which both the number and the strength of the hydrogen bonds vary continuously as a result of a subtle interplay between energetic, entropic, and nuclear quantum effects. In the past decade, the development of efficient algorithms for correlated electronic structure calculations of small molecular complexes, accompanied by tremendous progress in the analytical representation of multidimensional potential energy surfaces, opened the doors to the design of highly accurate potential energy functions built upon rigorous representations of the many-body expansion (MBE) of the interaction energies. This Account provides a critical overview of the performance of the MB-pol many-body potential energy function through a systematic analysis of energetic, structural, thermodynamic, and dynamical properties as well as of vibrational spectra of water from the gas to the condensed phase. It is shown that MB-pol achieves unprecedented accuracy across all phases of water through a quantitative description of each individual term of the MBE, with a physically correct representation of both short- and long-range many-body contributions. Comparisons with experimental data probing different regions of the water potential energy surface from clusters to bulk demonstrate that MB-pol represents a major step toward the long-sought-after "universal model" capable of accurately describing the molecular properties of water under different conditions and in different environments. Along this path, future challenges include the extension of the many-body scheme adopted by MB-pol to the description of generic solutes as well as the integration of MB-pol in an efficient theoretical and computational framework to model acid-base reactions in aqueous environments. In this context, given the nontraditional form of the MB-pol energy and force expressions, synergistic efforts by theoretical/computational chemists/physicists and computer scientists will be critical for the development of high-performance software for many-body molecular dynamics simulations.
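The many-body expansion (MBE) referred to above can be written compactly as follows (standard notation, shown here for orientation rather than quoted from the Account): the total energy of N monomers is decomposed into one-body, two-body, three-body, and higher-order contributions.

```latex
E_N \;=\; \sum_{i} \varepsilon^{(1)}(i)
      \;+\; \sum_{i<j} \varepsilon^{(2)}(i,j)
      \;+\; \sum_{i<j<k} \varepsilon^{(3)}(i,j,k)
      \;+\; \cdots \;+\; \varepsilon^{(N)}(1,\dots,N)
```

In practice the low-order terms are fit to high-level electronic-structure data for small clusters, while the slowly varying higher-order contributions are represented by classical polarization; this truncation is what makes the expansion tractable for condensed-phase simulations.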
Challenges of Big Data Analysis.
Fan, Jianqing; Han, Fang; Liu, Han
2014-06-01
Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and of how these features drive a change of paradigm in statistical and computational methods as well as in computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions underlying most statistical methods for Big Data cannot be validated because of incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
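One common way to formalize "the sparsest solution in a high-confidence set" (a sketch of the idea, not necessarily the exact construction used in the article) is an l1-minimization over a set that contains the true parameter with high probability:

```latex
\hat{\boldsymbol\beta} \;=\; \arg\min_{\boldsymbol\beta}\ \|\boldsymbol\beta\|_1
\quad\text{subject to}\quad
\Big\|\tfrac{1}{n}\,\mathbf{X}^{\top}\big(\mathbf{y}-\mathbf{X}\boldsymbol\beta\big)\Big\|_\infty \le \gamma_n ,
```

where the tolerance gamma_n is calibrated so that the constraint set contains the true coefficient vector with probability close to one; the Dantzig selector is the best-known estimator of this form.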
Teaching 1H NMR Spectrometry Using Computer Modeling.
ERIC Educational Resources Information Center
Habata, Yoichi; Akabori, Sadatoshi
2001-01-01
Molecular modeling by computer is used to display stereochemistry, molecular orbitals, structure of transition states, and progress of reactions. Describes new ideas for teaching 1H NMR spectroscopy using computer modeling. (Contains 12 references.) (ASK)
NASA Astrophysics Data System (ADS)
Jaworski, Allan
1993-08-01
The Earth Observing System (EOS) Data and Information System (EOSDIS) will serve as a major resource for the earth science community, supporting both command and control of complex instruments onboard the EOS spacecraft and the archiving, distribution, and analysis of data. The scale of EOSDIS and the volume of multidisciplinary research to be conducted using EOSDIS resources will produce unparalleled needs for technology transparency, data integration, and system interoperability. The scale of this effort far exceeds that of any previous scientific data system in its breadth and in its operational and performance needs. Modern hardware technology can meet the EOSDIS technical challenge. Multiprocessing speeds of many gigaflops are being realized by modern computers. Online disk, optical disk, and videocassette libraries with storage capacities of many terabytes are now commercially available. Radio-frequency and fiber-optic communications networks with gigabit rates are demonstrable today. It remains, of course, to perform the system engineering to establish the requirements, architectures, and designs that will implement the EOSDIS systems. Software technology, however, has not enjoyed the price/performance advances of hardware. Although we have learned to engineer hardware systems which have several orders of magnitude greater complexity and performance than those built in the 1960's, we have not made comparable progress in dramatically reducing the cost of software development. This lack of progress may significantly reduce our capability to economically achieve the types of highly interoperable, responsive, integrated, and productive environments which are needed by the earth science community. This paper describes some of the EOSDIS software requirements and current activities in the software community that are applicable to meeting the EOSDIS challenge. Some of these areas include intelligent user interfaces, software reuse libraries, and domain engineering. Also included are discussions of applicable standards in the areas of operating system interfaces, user interfaces, communications interfaces, data transport, and science algorithm support, and their role in supporting the software development process.
Global Gravity Field Determination by Combination of terrestrial and Satellite Gravity Data
NASA Astrophysics Data System (ADS)
Fecher, T.; Pail, R.; Gruber, T.
2011-12-01
A multitude of impressive results document the success of the satellite gravity field mission GOCE, with a wide field of applications in geodesy, geophysics and oceanography. The high performance of GOCE gravity field models can be further improved by combination with GRACE data, which contributes the long-wavelength signal content of the gravity field with very high accuracy. Examples of such a consistent combination of satellite gravity data are the satellite-only models GOCO01S and GOCO02S. However, only the further combination with terrestrial and altimetric gravity data makes it possible to expand gravity field models up to very high spherical harmonic degrees and thus to achieve a spatial resolution down to 20-30 km. First numerical studies for high-resolution global gravity field models combining GOCE, GRACE and terrestrial/altimetric data on the basis of the DTU10 model have already been presented. Computations up to degree/order 600 based on full normal equation systems, which preserve the full variance-covariance information resulting mainly from the different weights of the individual terrestrial/altimetric data sets, have been performed successfully. We could show that such large normal equation systems (degree/order 600 corresponds to a memory demand of almost 1 TByte), which represent an immense computational challenge because computation time and memory requirements put high demands on computational resources, can be handled. The DTU10 model includes gravity anomalies computed from the global model EGM08 in continental areas. Therefore, the main focus of this presentation lies on the computation of high-resolution combined gravity field models based on real terrestrial gravity anomaly data sets. This is a challenge due to the inconsistency of these data sets, which also include systematic error components, but it is a further step toward a truly independent gravity field model. This contribution will present our recent developments and progress in using independent data sets over certain land areas, combined with DTU10 in the ocean areas as well as with satellite gravity data. Investigations have been made concerning the preparation and optimum weighting of the different data sources. The results, which should be a major step towards a GOCO-C model, will be validated using external gravity field data and by applying different validation methods.
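The quoted memory demand is easy to verify with a back-of-the-envelope count (a rough check that ignores the handful of low-degree coefficients usually excluded from the estimation):

```latex
u \;=\; (L_{\max}+1)^2 \;=\; 601^2 \;=\; 361{,}201 \ \text{unknowns for } L_{\max}=600,
\qquad
M_{\text{full}} \;=\; u^2 \times 8\ \text{bytes} \;\approx\; 1.04\times10^{12}\ \text{bytes} \;\approx\; 1\ \text{TByte}.
```

Storing only the upper triangle of the symmetric normal-equation matrix roughly halves this figure, but the system still has to be assembled, stored, and solved on a parallel machine, which is the computational challenge the abstract refers to.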
The National Cancer Institute’s Clinical Proteomic Tumor Analysis Consortium (CPTAC) is pleased to announce the opening of the leaderboard to its Proteogenomics Computational DREAM Challenge. The leaderboard remains open for submissions from September 25, 2017, through October 8, 2017, with the Challenge expected to run until November 17, 2017.
NASA Astrophysics Data System (ADS)
Wei, Tzu-Chieh; Huang, Ching-Yu
2017-09-01
Recent progress in the characterization of gapped quantum phases has also triggered the search for a universal resource for quantum computation in symmetric gapped phases. Prior works in one dimension suggest that it is a feature more common than previously thought, in that nontrivial one-dimensional symmetry-protected topological (SPT) phases provide quantum computational power characterized by the algebraic structure defining these phases. Progress in two and higher dimensions so far has been limited to special fixed points. Here we provide two families of two-dimensional Z2 symmetric wave functions such that there exists a finite region of the parameter space in the SPT phases that supports universal quantum computation. The quantum computational power appears to lose its universality at the boundary between the SPT and the symmetry-breaking phases.
76 FR 13984 - Cloud Computing Forum & Workshop III
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... public workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop III to be held on April 7... provide information on the NIST strategic and tactical Cloud Computing program, including progress on the...
Progress and challenges of disaster health management in China: a scoping review.
Zhong, Shuang; Clark, Michele; Hou, Xiang-Yu; Zang, Yuli; FitzGerald, Gerard
2014-01-01
Despite the importance of an effective health system response to various disasters, relevant research is still in its infancy, especially in middle- and low-income countries. This paper provides an overview of the status of disaster health management in China, with the aim of promoting the effectiveness of the health response in reducing disaster-related mortality and morbidity. A scoping review method was used to examine the recent progress of and challenges to disaster health management in China. Major health electronic databases were searched to identify English and Chinese literature relevant to the research aims. The review found that considerable progress has been achieved in China's health disaster response system since 2003. However, challenges remain that hinder effective health disaster responses, including low standards of disaster-resistant infrastructure safety, the lack of specific disaster plans, poor emergency coordination between hospitals, a lack of portable diagnostic equipment, and underdeveloped triage skills, surge capacity, and psychological interventions. Additional challenges include the fragmentation of the emergency health service system, a lack of specific legislation for emergencies, disparities in the distribution of funding, and inadequate cost-effectiveness considerations for disaster rescue. One solution identified to address these challenges appears to be corresponding policy strategies at multiple levels (e.g., community, hospital, and healthcare system).
ERIC Educational Resources Information Center
Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton
2011-01-01
The educational advantages of inquiry learning environments that incorporate modelling facilities are often challenged by students' poor inquiry skills. This study examined two types of model progression as means to compensate for these skill deficiencies. Model order progression (MOP), the predicted optimal variant, gradually increases the…
Case management considerations of progressive dementia in a home setting.
Pierce, Mary Ellen
2010-01-01
Nursing theory, research, and best practice guidelines contribute substantially to the field of dementia care. Interventional plans are challenged most by those dementias considered progressive and deteriorative in nature, requiring ongoing reassessment and modification of care practices as the clinical course changes. The purpose of this article is to provide guidelines for case managers in the development of effective, individualized care plans for clients with progressive dementia residing in a home setting. The application of these guidelines is illustrated through the presentation of an actual case. The practice setting is a private home in the Pacific Northwest. Geriatric case management is provided by an RN case manager. Progressive dementia presents challenges to home care. Professional case management using comprehensive, holistic assessment, collaborative approaches, and best practice fundamentals serve to create an effective, individualized plan of care. The increasing geriatric population presents great opportunities for case managers in strategic management for creating successful home care models in clients with progressive dementia. Use of nursing diagnoses, dementia research, and collaborative approaches with families and other medical providers creates a viable alternative for clients with progressive dementia.
Ozgul, Betul Memis; Orhan, Kaan; Oz, Firdevs Tulga
2015-09-01
We investigated inhibition of lesion progression in artificial enamel lesions. Lesions were created on primary and permanent anterior teeth (n = 10 each) and were divided randomly into two groups with two windows: Group 1 (window A: resin infiltration; window B: negative control) and Group 2 (window A: resin infiltration + fluoride varnish; window B: fluoride varnish). After pH cycling, micro-computed tomography was used to analyze progression of lesion depth and changes in mineral density. Resin infiltration and resin infiltration + fluoride varnish significantly inhibited progression of lesion depth in primary teeth (P < 0.05). Inhibition of lesion depth progression in permanent teeth was significantly greater after treatment with resin infiltration + fluoride varnish than in the negative control (P < 0.05). Change in mineral density was smaller in the resin infiltration and resin infiltration + fluoride varnish groups; however, the difference was not significant for either group (P > 0.05). Resin infiltration is a promising method of inhibiting progression of caries lesions.
Parallel Computational Fluid Dynamics: Current Status and Future Requirements
NASA Technical Reports Server (NTRS)
Simon, Horst D.; VanDalsem, William R.; Dagum, Leonardo; Kutler, Paul (Technical Monitor)
1994-01-01
One of the key objectives of the Applied Research Branch in the Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is the accelerated introduction of highly parallel machines into a full operational environment. In this report we discuss the performance results obtained from the implementation of some computational fluid dynamics (CFD) applications on the Connection Machine CM-2 and the Intel iPSC/860. We summarize some of the experience gained so far with the parallel testbed machines at the NAS Applied Research Branch. We then discuss the long-term computational requirements for accomplishing some of the grand challenge problems in computational aerosciences. We argue that only massively parallel machines will be able to meet these grand challenge requirements, and we outline the computer science and algorithm research challenges ahead.
ADDRESSING ENVIRONMENTAL ENGINEERING CHALLENGES WITH COMPUTATIONAL FLUID DYNAMICS
This paper discusses the status and application of Computational Fluid Dynamics (CFD) models to address environmental engineering challenges for a more detailed understanding of air pollutant source emissions, atmospheric dispersion, and resulting human exposure. CFD simulations ...
Beyond Moore Computing research challenge workshop report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huey, Mark C.; Aidun, John Bahram
2013-10-01
We summarize the presentations and break out session discussions from the in-house workshop that was held on 11 July 2013 to acquaint a wider group of Sandians with the Beyond Moore Computing research challenge.
Annual Research Briefs - 2000: Center for Turbulence Research
NASA Technical Reports Server (NTRS)
2000-01-01
This report contains the 2000 annual progress reports of the postdoctoral Fellows and visiting scholars of the Center for Turbulence Research (CTR). It summarizes the research efforts undertaken under the core CTR program. Last year, CTR sponsored sixteen resident Postdoctoral Fellows, nine Research Associates, and two Senior Research Fellows, hosted seven short-term visitors, and supported four doctoral students. The Research Associates are supported by the Departments of Defense and Energy. The reports in this volume are divided into five groups. The first group largely consists of the new areas of interest at CTR. It includes efficient algorithms for molecular dynamics, stability in protoplanetary disks, and experimental and numerical applications of evolutionary optimization algorithms for jet flow control. The next group of reports is in experimental, theoretical, and numerical modeling efforts in turbulent combustion. As more challenging computations are attempted, the need for additional theoretical and experimental studies in combustion has emerged. A pacing item for computation of nonpremixed combustion is the prediction of extinction and re-ignition phenomena, which is currently being addressed at CTR. The third group of reports is in the development of accurate and efficient numerical methods, which has always been an important part of CTR's work. This is the tool development part of the program which supports our high fidelity numerical simulations in such areas as turbulence in complex geometries, hypersonics, and acoustics. The final two groups of reports are concerned with LES and RANS prediction methods. There has been significant progress in wall modeling for LES of high Reynolds number turbulence and in validation of the v^2-f model for industrial applications.
3D printing of soft-matter to open a new era of soft-matter MEMS/robotics (Conference Presentation)
NASA Astrophysics Data System (ADS)
Furukawa, Hidemitsu
2017-04-01
3D printing technology is becoming increasingly useful and broadly applicable thanks to progress in information and communication technology (ICT). In effect, a 3D printer is a robot for additive manufacturing controlled by a computer with human-friendly software. Once users begin 3D printing soft matter, they quickly recognize that computer-aided design (CAD) and computer-aided engineering (CAE) technologies will become more important and more applicable to soft-matter systems: users can easily design soft-matter objects, 3D-print them, and apply the printed objects to new research and applications in MEMS and robotics. Here we introduce recent progress in 3D printing (i.e., additive manufacturing), focusing on our 3D gel printing. We are developing new research directions and applications for the 3D gel printer, including GEL-MECHANICS, GEL-PHOTONICS, and GEL-ROBOTICS. In gel mechanics, we are developing new gel materials for mechanical engineering; some gels exhibit high mechanical strength and shape-memory properties. In gel photonics, we apply our original characterization system, named 'Scanning Microscopic Light Scattering (SMILS)', to analyze 3D-printed gel materials. In gel robotics, we focus on 3D printing of soft parts for soft robotics made from gel materials, such as a gel finger. We are also working to apply 3D gel printing to found a new company, to create new businesses in rural areas, and to develop new 3D-printed foods.
Computer Programming in Middle School: How Pairs Respond to Challenges
ERIC Educational Resources Information Center
Denner, Jill; Werner, Linda
2007-01-01
Many believe that girls lack the confidence and motivation to persist with computers when they face a challenge. In order to increase the number of girls and women in information technology careers, we need a better understanding of how they think about and solve problems while working on the computer. In this article, we describe a qualitative…
Computational challenges in modeling gene regulatory events
Pataskar, Abhijeet; Tiwari, Vijay K.
2016-01-01
Cellular transcriptional programs driven by genetic and epigenetic mechanisms could be better understood by integrating “omics” data and subsequently modeling the gene-regulatory events. Toward this end, computational biology should keep pace with evolving experimental procedures and data availability. This article gives an exemplified account of the current computational challenges in molecular biology. PMID:27390891
A K-6 Computational Thinking Curriculum Framework: Implications for Teacher Knowledge
ERIC Educational Resources Information Center
Angeli, Charoula; Voogt, Joke; Fluck, Andrew; Webb, Mary; Cox, Margaret; Malyn-Smith, Joyce; Zagami, Jason
2016-01-01
Adding computer science as a separate school subject to the core K-6 curriculum is a complex issue with educational challenges. The authors herein address two of these challenges: (1) the design of the curriculum based on a generic computational thinking framework, and (2) the knowledge teachers need to teach the curriculum. The first issue is…
Mathematical challenges from theoretical/computational chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-12-31
The committee believes that this report has relevance and potentially valuable suggestions for a wide range of readers. Target audiences include: graduate departments in the mathematical and chemical sciences; federal and private agencies that fund research in the mathematical and chemical sciences; selected industrial and government research and development laboratories; developers of software and hardware for computational chemistry; and selected individual researchers. Chapter 2 of this report covers some history of computational chemistry for the nonspecialist, while Chapter 3 illustrates the fruits of some past successful cross-fertilization between mathematical scientists and computational/theoretical chemists. In Chapter 4 the committee has assembled a representative, but not exhaustive, survey of research opportunities. Most of these are descriptions of important open problems in computational/theoretical chemistry that could gain much from the efforts of innovative mathematical scientists, written so as to be accessible introductions to the nonspecialist. Chapter 5 is an assessment, necessarily subjective, of cultural differences that must be overcome if collaborative work is to be encouraged between the mathematical and the chemical communities. Finally, the report ends with a brief list of conclusions and recommendations that, if followed, could promote accelerated progress at this interface. Recognizing that bothersome language issues can inhibit prospects for collaborative research at the interface between distinctive disciplines, the committee has attempted throughout to maintain an accessible style, in part by using illustrative boxes, and has included at the end of the report a glossary of technical terms that may be familiar to only a subset of the target audiences listed above.
A collaborative institutional model for integrating computer applications in the medical curriculum.
Friedman, C. P.; Oxford, G. S.; Juliano, E. L.
1991-01-01
The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM) to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty-initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first- and second-year students do much of their academic work, and (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705
Geometry Modeling and Grid Generation for Design and Optimization
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.
1998-01-01
Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structural Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.
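To make the fourth requirement concrete, the sketch below is a minimal, hypothetical illustration (not taken from the paper) of a parametric geometry that exposes both its shape and an analytic sensitivity derivative with respect to a design parameter, here a NACA 4-digit-style thickness distribution differentiated with respect to the maximum-thickness parameter:

```python
import math

def naca_thickness(x: float, t: float) -> float:
    """Half-thickness of a NACA 4-digit airfoil at chordwise station x (0..1)
    for maximum thickness ratio t (e.g. t = 0.12 for a 12%-thick section)."""
    return 5.0 * t * (0.2969 * math.sqrt(x) - 0.1260 * x
                      - 0.3516 * x**2 + 0.2843 * x**3 - 0.1015 * x**4)

def d_thickness_dt(x: float, t: float) -> float:
    """Analytic sensitivity d(thickness)/d(t): the profile is linear in t,
    so the derivative is simply the shape function evaluated at t = 1."""
    return naca_thickness(x, 1.0)

# Example: thickness and its sensitivity at mid-chord for a 12% section.
x, t = 0.5, 0.12
print(naca_thickness(x, t), d_thickness_dt(x, t))
```

Because this profile is linear in the thickness parameter, the sensitivity is exact; for general CAD-based parameterizations, such derivatives typically have to be obtained by differentiating the geometry modeler itself or by finite differences.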
Christensen, Signe; Horowitz, Scott; Bardwell, James C.A.; Olsen, Johan G.; Willemoës, Martin; Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper; Hamelryck, Thomas; Winther, Jakob R.
2017-01-01
Despite the development of powerful computational tools, the full-sequence design of proteins still remains a challenging task. To investigate the limits and capabilities of computational tools, we conducted a study of the ability of the program Rosetta to predict sequences that recreate the authentic fold of thioredoxin. Focusing on the influence of conformational details in the template structures, we based our study on 8 experimentally determined template structures and generated 120 designs from each. For experimental evaluation, we chose six sequences from each of the eight templates by objective criteria. The 48 selected sequences were evaluated based on their progressive ability to (1) produce soluble protein in Escherichia coli, (2) yield stable monomeric protein, and (3) adopt the target fold as stable, soluble proteins. Of the 48 designs, we were able to synthesize 32, 20 of which resulted in soluble protein. Of these, only two were sufficiently stable to be purified. An X-ray crystal structure was solved for one of the designs, revealing a close resemblance to the target structure. We found significant differences among the eight template structures in their ability to realize the above three criteria, despite their high structural similarity. Thus, in order to improve the success rate of computational full-sequence design methods, we recommend that multiple template structures be used. Furthermore, this study shows that special care should be taken when optimizing the geometry of a structure prior to computational design when using a method that is based on rigid conformations. PMID:27659562
Applicability Analysis of Validation Evidence for Biomedical Computational Models
Pathmanathan, Pras; Gray, Richard A.; Romero, Vicente J.; ...
2017-09-07
Computational modeling has the potential to revolutionize medicine the way it transformed engineering. However, despite decades of work, there has only been limited progress to successfully translate modeling research to patient care. One major difficulty which often occurs with biomedical computational models is an inability to perform validation in a setting that closely resembles how the model will be used. For example, for a biomedical model that makes in vivo clinically relevant predictions, direct validation of predictions may be impossible for ethical, technological, or financial reasons. Unavoidable limitations inherent to the validation process lead to challenges in evaluating the credibility of biomedical model predictions. Therefore, when evaluating biomedical models, it is critical to rigorously assess applicability, that is, the relevance of the computational model and its validation evidence to the proposed context of use (COU). However, there are no well-established methods for assessing applicability. In this paper, we present a novel framework for performing applicability analysis and demonstrate its use with a medical device computational model. The framework provides a systematic, step-by-step method for breaking down the broad question of applicability into a series of focused questions, which may be addressed using supporting evidence and subject matter expertise. The framework can be used for model justification, model assessment, and validation planning. While motivated by biomedical models, it is relevant to a broad range of disciplines and underlying physics. Finally, the proposed applicability framework could help overcome some of the barriers inherent to validation of, and aid clinical implementation of, biomedical models.
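As a purely illustrative sketch (the context of use, questions, and evidence below are hypothetical and not taken from the paper's framework), the idea of breaking the broad applicability question into focused questions, each tied to supporting evidence and an expert judgment, can be recorded in a simple data structure:

```python
from dataclasses import dataclass, field

@dataclass
class ApplicabilityQuestion:
    """One focused question derived from the broad applicability question."""
    question: str
    evidence: list[str] = field(default_factory=list)  # validation data, literature, expertise
    judgment: str = "undetermined"                      # e.g. "supported", "partially supported"

# Hypothetical context of use (COU) for a medical-device computational model.
cou = "Predict in vivo lead stress in adult patients under normal cardiac motion"
questions = [
    ApplicabilityQuestion(
        "Do the bench-test loading conditions span the loads expected in the COU?",
        evidence=["bench fatigue tests", "published in vivo motion data"],
        judgment="partially supported"),
    ApplicabilityQuestion(
        "Are the material models validated over the strain range relevant to the COU?",
        evidence=["coupon tests up to 8% strain"],
        judgment="supported"),
]

for q in questions:
    print(f"- {q.question}\n    evidence: {', '.join(q.evidence)}\n    judgment: {q.judgment}")
```

A record of this kind could support the model-justification and validation-planning uses mentioned above, with each open judgment pointing to the evidence still needed.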