Sample records for computational challenges posed

  1. Efficient computation of hashes

    NASA Astrophysics Data System (ADS)

    Lopes, Raul H. C.; Franqueira, Virginia N. L.; Hobson, Peter R.

    2014-06-01

    The sequential computation of hashes at the core of many distributed storage systems, found for example in grid services, can hinder service quality and even pose security challenges that can only be addressed by the use of parallel hash-tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgård engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
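    The tree mode the paper argues for can be sketched in a few lines. Below is a minimal illustration of hashing chunks as Merkle-tree leaves with SHA-3: each leaf digest can be computed independently (hence in parallel), unlike the strictly sequential Merkle-Damgård chaining. This is a toy sketch, not the authors' Keccak prototype, and all names are illustrative.

```python
import hashlib

def sha3(data: bytes) -> bytes:
    return hashlib.sha3_256(data).digest()

def merkle_root(chunks) -> bytes:
    # Hash each leaf independently -- this level is trivially
    # parallelizable, unlike sequential Merkle-Damgard chaining.
    level = [sha3(c) for c in chunks]
    while len(level) > 1:
        if len(level) % 2:                     # duplicate last node on odd levels
            level.append(level[-1])
        level = [sha3(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

data = [b"block-%d" % i for i in range(8)]
root = merkle_root(data)
```

    Verifying a single chunk then only requires the sibling digests along its path to the root, which is one reason distributed storage systems favor tree modes.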

  2. Using physics-based pose predictions and free energy perturbation calculations to predict binding poses and relative binding affinities for FXR ligands in the D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Athanasiou, Christina; Vasilakaki, Sofia; Dellis, Dimitris; Cournia, Zoe

    2018-01-01

    Computer-aided drug design has become an integral part of drug discovery and development in the pharmaceutical and biotechnology industry, and is nowadays extensively used in the lead identification and lead optimization phases. The Drug Design Data Resource (D3R) organizes challenges against blinded experimental data to prospectively test computational methodologies as an opportunity for improved methods and algorithms to emerge. We participated in Grand Challenge 2 to predict the crystallographic poses of 36 Farnesoid X Receptor (FXR)-bound ligands and the relative binding affinities for two designated subsets of 18 and 15 FXR-bound ligands. Here, we present our methodology for pose and affinity predictions and its evaluation after the release of the experimental data. For predicting the crystallographic poses, we used docking and physics-based pose prediction methods guided by the binding poses of native ligands. For FXR ligands with known chemotypes in the PDB, we accurately predicted their binding modes, while for those with unknown chemotypes the predictions were more challenging. Our group ranked 1st (based on the median RMSD) out of the 46 groups that submitted complete entries for the binding pose prediction challenge. For the relative binding affinity prediction challenge, we performed free energy perturbation (FEP) calculations coupled with molecular dynamics (MD) simulations. FEP/MD calculations displayed a high success rate in identifying compounds with better or worse binding affinity than the reference (parent) compound. Our studies suggest that when ligands with chemical precedent are available in the literature, binding pose predictions using docking and physics-based methods are reliable; however, predictions are challenging for ligands with completely unknown chemotypes. We also show that FEP/MD calculations hold predictive value and can nowadays be used in a high-throughput mode in a lead optimization project, provided that crystal structures of sufficiently high quality are available.
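    The "better or worse than the parent compound" success rate reported above can be expressed as a simple sign-agreement metric over relative binding free energies. A hedged sketch; the function name and toy inputs are illustrative, not the paper's protocol or data:

```python
def sign_success_rate(pred_ddg, expt_ddg):
    """Fraction of compounds whose predicted relative binding free energy
    (delta-delta-G versus a reference/parent compound) has the same sign as
    the experimental value, i.e. better/worse is called correctly."""
    hits = sum((p > 0) == (e > 0) for p, e in zip(pred_ddg, expt_ddg))
    return hits / len(pred_ddg)

# Toy numbers only: three compounds, all classified correctly.
rate = sign_success_rate([1.2, -0.4, 2.0], [0.8, -1.1, 0.3])
```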

  3. A Problem Posing-Based Practicing Strategy for Facilitating Students' Computer Programming Skills in the Team-Based Learning Mode

    ERIC Educational Resources Information Center

    Wang, Xiao-Ming; Hwang, Gwo-Jen

    2017-01-01

    Computer programming is a subject that requires problem-solving strategies and involves a great number of programming logic activities which pose challenges for learners. Therefore, providing learning support and guidance is important. Collaborative learning is widely believed to be an effective teaching approach; it can enhance learners' social…

  4. Automating a Massive Online Course with Cluster Computing

    ERIC Educational Resources Information Center

    Haas, Timothy C.

    2016-01-01

    Before massive numbers of students can take online courses for college credit, the challenges of providing tutoring support, answers to student-posed questions, and the control of cheating will need to be addressed. These challenges are taken up here by developing an online course delivery system that runs in a cluster computing environment and is…

  5. Head pose estimation in computer vision: a survey.

    PubMed

    Murphy-Chutorian, Erik; Trivedi, Mohan Manubhai

    2009-04-01

    The capacity to estimate the head pose of another person is a common human ability that presents a unique challenge for computer vision systems. Compared to face detection and recognition, which have been the primary foci of face-related vision research, identity-invariant head pose estimation has fewer rigorously evaluated systems or generic solutions. In this paper, we discuss the inherent difficulties in head pose estimation and present an organized survey describing the evolution of the field. Our discussion focuses on the advantages and disadvantages of each approach and spans 90 of the most innovative and characteristic papers that have been published on this topic. We compare these systems by focusing on their ability to estimate coarse and fine head pose, highlighting approaches that are well suited for unconstrained environments.

  6. Predictive Models and Computational Toxicology

    EPA Science Inventory

    Understanding the potential health risks posed by environmental chemicals is a significant challenge elevated by the large number of diverse chemicals with generally uncharacterized exposures, mechanisms, and toxicities. The ToxCast computational toxicology research program was l...

  7. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  8. A Computer Science Educational Program for Establishing an Entry Point into the Computing Community of Practice

    ERIC Educational Resources Information Center

    Haberman, Bruria; Yehezkel, Cecile

    2008-01-01

    The rapid evolution of the computing domain has posed challenges in attempting to bridge the gap between school and the contemporary world of computing, related to content, learning culture, and professional norms. We believe that the interaction of high-school students who major in computer science or software engineering with leading…

  9. Combining in silico and in cerebro approaches for virtual screening and pose prediction in SAMPL4.

    PubMed

    Voet, Arnout R D; Kumar, Ashutosh; Berenger, Francois; Zhang, Kam Y J

    2014-04-01

    The SAMPL challenges provide an ideal opportunity for unbiased evaluation and comparison of different approaches used in computational drug design. During the fourth round of this SAMPL challenge, we participated in the virtual screening and binding pose prediction on inhibitors targeting the HIV-1 integrase enzyme. For virtual screening, we used well known and widely used in silico methods combined with personal in cerebro insights and experience. Regular docking only performed slightly better than random selection, but the performance was significantly improved upon incorporation of additional filters based on pharmacophore queries and electrostatic similarities. The best performance was achieved when logical selection was added. For the pose prediction, we utilized a similar consensus approach that amalgamated the results of the Glide-XP docking with structural knowledge and rescoring. The pose prediction results revealed that docking displayed reasonable performance in predicting the binding poses. However, prediction performance can be improved utilizing scientific experience and rescoring approaches. In both the virtual screening and pose prediction challenges, the top performance was achieved by our approaches. Here we describe the methods and strategies used in our approaches and discuss the rationale of their performances.
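    Virtual screening performance of the kind discussed here is commonly judged by early enrichment. A minimal sketch of the enrichment factor at a given fraction of the ranked list; names and numbers are illustrative, not the SAMPL4 data:

```python
def enrichment_factor(scores, actives, top_frac=0.1):
    """Early-enrichment metric: how many known actives land in the top
    fraction of the ranked list, relative to random selection.
    scores: {ligand: score} with lower = better; actives: set of ligands."""
    ranked = sorted(scores, key=scores.get)
    n_top = max(1, int(len(ranked) * top_frac))
    hits_top = sum(1 for lig in ranked[:n_top] if lig in actives)
    return (hits_top / n_top) / (len(actives) / len(ranked))

# Toy screen: 10 ligands, the 2 actives happen to score best.
scores = {f"lig{i}": float(i) for i in range(10)}
actives = {"lig0", "lig1"}
ef = enrichment_factor(scores, actives, top_frac=0.2)
```

    An enrichment factor of 1 means no better than random; the filters described in the abstract aim to push this number up.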

  10. Combining in silico and in cerebro approaches for virtual screening and pose prediction in SAMPL4

    NASA Astrophysics Data System (ADS)

    Voet, Arnout R. D.; Kumar, Ashutosh; Berenger, Francois; Zhang, Kam Y. J.

    2014-04-01

    The SAMPL challenges provide an ideal opportunity for unbiased evaluation and comparison of different approaches used in computational drug design. During the fourth round of this SAMPL challenge, we participated in the virtual screening and binding pose prediction on inhibitors targeting the HIV-1 integrase enzyme. For virtual screening, we used well known and widely used in silico methods combined with personal in cerebro insights and experience. Regular docking only performed slightly better than random selection, but the performance was significantly improved upon incorporation of additional filters based on pharmacophore queries and electrostatic similarities. The best performance was achieved when logical selection was added. For the pose prediction, we utilized a similar consensus approach that amalgamated the results of the Glide-XP docking with structural knowledge and rescoring. The pose prediction results revealed that docking displayed reasonable performance in predicting the binding poses. However, prediction performance can be improved utilizing scientific experience and rescoring approaches. In both the virtual screening and pose prediction challenges, the top performance was achieved by our approaches. Here we describe the methods and strategies used in our approaches and discuss the rationale of their performances.

  11. Incrementally Dissociating Syntax and Semantics

    ERIC Educational Resources Information Center

    Brennan, Jonathan R.

    2010-01-01

    A basic challenge for research into the neurobiology of language is understanding how the brain combines words to make complex representations. Linguistic theory divides this task into several computations including syntactic structure building and semantic composition. The close relationship between these computations, however, poses a strong…

  12. Prediction, Error, and Adaptation during Online Sentence Comprehension

    ERIC Educational Resources Information Center

    Fine, Alex Brabham

    2013-01-01

    A fundamental challenge for human cognition is perceiving and acting in a world in which the statistics that characterize available sensory data are non-stationary. This thesis focuses on this problem specifically in the domain of sentence comprehension, where linguistic variability poses computational challenges to the processes underlying…

  13. Lessons learned in induced fit docking and metadynamics in the Drug Design Data Resource Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Baumgartner, Matthew P.; Evans, David A.

    2018-01-01

    Two of the major ongoing challenges in computational drug discovery are predicting the binding pose and affinity of a compound to a protein. The Drug Design Data Resource Grand Challenge 2 was developed to address these problems and to drive development of new methods. The challenge provided the 2D structures of compounds for which the organizers held blinded data in the form of 35 X-ray crystal structures and 102 binding affinity measurements, and challenged participants to predict the binding pose and affinity of the compounds. We tested a number of pose prediction methods as part of the challenge; we found that docking methods that incorporate protein flexibility (Induced Fit Docking) outperformed methods that treated the protein as rigid. We also found that using binding pose metadynamics, a molecular dynamics based method, to score docked poses provided the best predictions of our methods, with an average RMSD of 2.01 Å. We tested both structure-based (e.g. docking) and ligand-based (e.g. QSAR) methods in the affinity prediction portion of the competition. We found that our structure-based methods, based on docking with Smina (Spearman ρ = 0.614), performed slightly better than our ligand-based methods (ρ = 0.543), and had equivalent performance with the other top methods in the competition. Despite the overall good performance of our methods in comparison to other participants in the challenge, there remains significant room for improvement, especially in cases such as these where protein flexibility plays a large role.
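    Pose predictions like those above are judged by the RMSD of the predicted ligand atoms against the crystal structure (the 2.01 Å figure). A minimal sketch of that metric, assuming atoms are already paired and the structures aligned (no Kabsch superposition is performed here):

```python
import math

def rmsd(coords_a, coords_b):
    # Root-mean-square deviation (in the same units as the inputs, e.g. A)
    # between matched atom coordinates of two poses.
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))
```

    By the common convention in docking studies, poses within roughly 2 Å RMSD of the crystal pose are counted as correct.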

  14. Computational Systems Toxicology: recapitulating the logistical dynamics of cellular response networks in virtual tissue models (Eurotox_2017)

    EPA Science Inventory

    Translating in vitro data and biological information into a predictive model for human toxicity poses a significant challenge. This is especially true for complex adaptive systems such as the embryo where cellular dynamics are precisely orchestrated in space and time. Computer ce...

  15. Fast normal mode computations of capsid dynamics inspired by resonance

    NASA Astrophysics Data System (ADS)

    Na, Hyuntae; Song, Guang

    2018-07-01

    Increasingly, more and larger structural complexes are being determined experimentally. The sizes of these systems pose a formidable computational challenge to the study of their vibrational dynamics by normal mode analysis. To overcome this challenge, this work presents a novel resonance-inspired approach. Tests on large shell structures of protein capsids demonstrate that there is a strong resonance between the vibrations of a whole capsid and those of individual capsomeres. We then show how this resonance can be exploited to significantly speed up normal mode computations.
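    Normal mode analysis of the kind being accelerated here diagonalizes a Hessian (stiffness) matrix: the modes are its eigenvectors and the squared vibrational frequencies its eigenvalues. A toy sketch on a 1-D bead-spring chain standing in for an elastic network model of a capsid (purely illustrative; not the paper's resonance method):

```python
import numpy as np

def chain_hessian(n, k=1.0):
    # Hessian of a free 1-D bead-spring chain with spring constant k;
    # its eigenvectors are the normal modes.
    h = np.zeros((n, n))
    for i in range(n - 1):
        h[i, i] += k
        h[i + 1, i + 1] += k
        h[i, i + 1] -= k
        h[i + 1, i] -= k
    return h

# Eigenvalues come back sorted ascending from eigh.
evals, evecs = np.linalg.eigh(chain_hessian(5))
```

    For a free chain the lowest eigenvalue is numerically zero (the rigid-body translation mode); a full 3-D capsid model would have six such zero modes, and the cost of this diagonalization is what grows prohibitively with system size.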

  16. US EPA - A*Star Partnership - Accelerating the Acceptance of Next-Generation Sciences and Their Application to Regulatory Risk Assessment (A*Star Symposium, Singapore)

    EPA Science Inventory

    The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate...

  17. The structural science of functional materials.

    PubMed

    Catlow, C Richard A

    2018-01-01

    The growing complexity of functional materials and the major challenges this poses to structural science are discussed. The diversity of structural materials science and the contributions that computation is making to the field are highlighted.

  18. Fostering Critical Reflection in a Computer-Based, Asynchronously Delivered Diversity Training Course

    ERIC Educational Resources Information Center

    Givhan, Shawn T.

    2013-01-01

    This dissertation study chronicles the creation of a computer-based, asynchronously delivered diversity training course for a state agency. The course format enabled efficient delivery of a mandatory curriculum to the Massachusetts Department of State Police workforce. However, the asynchronous format posed a challenge to achieving the learning…

  19. Integrated Speech and Language Technology for Intelligence, Surveillance, and Reconnaissance (ISR)

    DTIC Science & Technology

    2017-07-01

    applying submodularity techniques to address computing challenges posed by large datasets in speech and language processing. MT and speech tools were … aforementioned research-oriented activities, the IT system administration team provided necessary support to laboratory computing and network operations … operations of SCREAM Lab computer systems and networks. Other miscellaneous activities in relation to Task Order 29 are presented in an additional fourth …

  20. Ranking docking poses by graph matching of protein-ligand interactions: lessons learned from the D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    da Silva Figueiredo Celestino Gomes, Priscila; Da Silva, Franck; Bret, Guillaume; Rognan, Didier

    2018-01-01

    A novel docking challenge has been set by the Drug Design Data Resource (D3R) in order to predict the pose and affinity ranking of a set of Farnesoid X Receptor (FXR) agonists, prior to the public release of their bound X-ray structures and potencies. In a first phase, 36 agonists were docked to 26 Protein Data Bank (PDB) structures of the FXR receptor, and then rescored using the in-house developed GRIM method. GRIM aligns protein-ligand interaction patterns of docked poses to those of available PDB templates for the target protein, and rescores poses by a graph-matching method. In agreement with results obtained during the previous 2015 docking challenge, we clearly show that GRIM rescoring improves the overall quality of top-ranked poses by prioritizing interaction patterns already visited in the PDB. Importantly, this challenge enables us to refine the applicability domain of the method by better defining the conditions of its success. We notably show that rescoring apolar ligands in hydrophobic pockets leads to frequent GRIM failures. In the second phase, 102 FXR agonists were ranked by decreasing affinity according to the Gibbs free energy of the corresponding GRIM-selected poses, computed by the HYDE scoring function. Interestingly, this fast and simple rescoring scheme provided the third most accurate ranking method among 57 contributions. Although the obtained ranking is still unsuitable for hit-to-lead optimization, the GRIM-HYDE scoring scheme is accurate and fast enough to post-process virtual screening data.

  1. Predicting the affinity of Farnesoid X Receptor ligands through a hierarchical ranking protocol: a D3R Grand Challenge 2 case study

    NASA Astrophysics Data System (ADS)

    Réau, Manon; Langenfeld, Florent; Zagury, Jean-François; Montes, Matthieu

    2018-01-01

    The Drug Design Data Resource (D3R) Grand Challenges are blind contests organized to assess the accuracy of state-of-the-art methods in predicting binding modes and relative binding free energies of experimentally validated ligands for a given target. The second stage of the D3R Grand Challenge 2 (GC2) was focused on ranking 102 compounds according to their predicted affinity for the Farnesoid X Receptor. In this task, our workflow was ranked 5th out of the 77 submissions in the structure-based category. Our strategy consisted of (1) a combination of molecular docking using AutoDock 4.2 and manual editing of available structures for binding pose generation using SeeSAR, (2) the use of HYDE scoring for pose selection, and (3) a hierarchical ranking using HYDE and MM/GBSA. In this report, we detail our pose generation and ligand ranking protocols and provide guidelines to be used in a prospective computer-aided drug design program.

  2. Motivating Computer Engineering Freshmen through Mathematical and Logical Puzzles

    ERIC Educational Resources Information Center

    Parhami, B.

    2009-01-01

    As in many other fields of science and technology, college students in computer engineering do not come into full contact with the key ideas and challenges of their chosen discipline until the third year of their studies. This situation poses a problem in terms of keeping the students motivated as they labor through their foundational, basic…

  3. DOCKSCORE: a webserver for ranking protein-protein docked poses.

    PubMed

    Malhotra, Sony; Mathew, Oommen K; Sowdhamini, Ramanathan

    2015-04-24

    Proteins interact with a variety of other molecules inside the cell, such as nucleic acids, small molecules and other proteins. Structure determination of protein-protein complexes is challenging for several reasons, such as the large molecular weights of these macromolecular complexes, their dynamic nature, and the difficulty of purification and sample preparation. Computational docking permits an early understanding of the feasibility and mode of protein-protein interactions. However, docking algorithms propose a number of solutions, and it is a challenging task to select the native or near-native pose(s) from this pool. DockScore is an objective scoring scheme that can be used to rank protein-protein docked poses. It considers several interface parameters, namely surface area, evolutionary conservation, hydrophobicity, short contacts and spatial clustering at the interface, for scoring. We have implemented DockScore in the form of a webserver for use by the scientific community. The DockScore webserver can be employed, subsequent to docking, to score the docked solutions, starting from multiple poses as inputs. The results, with scores and ranks for all the poses, can be downloaded as a CSV file, and a graphical view of the interface of the best-ranking poses is available. The webserver for DockScore is made freely available for the scientific community at: http://caps.ncbs.res.in/dockscore/.
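    The interface-parameter idea can be sketched as a weighted sum over per-pose descriptors, with clashes penalized. The weights and feature names below are illustrative placeholders, not DockScore's actual fitted terms:

```python
def score_pose(features, weights=None):
    # Toy weighted-sum scoring of one docked pose from interface
    # descriptors; short contacts (steric clashes) count against the pose.
    if weights is None:
        weights = {"surface_area": 1.0, "conservation": 1.0,
                   "hydrophobicity": 1.0, "short_contacts": -1.0,
                   "clustering": 1.0}
    return sum(w * features[name] for name, w in weights.items())

def rank_poses(poses):
    # poses: {pose_id: feature dict}; higher score ranks first.
    return sorted(poses, key=lambda p: score_pose(poses[p]), reverse=True)
```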

  4. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  5. Grids: The Top Ten Questions

    DOE PAGES

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science points of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  6. Deterrence of device counterfeiting, cloning, and subversion by substitution using hardware fingerprinting

    DOEpatents

    Hamlet, Jason R; Bauer, Todd M; Pierson, Lyndon G

    2014-09-30

    Deterrence of device subversion by substitution may be achieved by including a cryptographic fingerprint unit within a computing device for authenticating a hardware platform of the computing device. The cryptographic fingerprint unit includes a physically unclonable function ("PUF") circuit disposed in or on the hardware platform. The PUF circuit is used to generate a PUF value. A key generator is coupled to generate a private key and a public key based on the PUF value while a decryptor is coupled to receive an authentication challenge posed to the computing device and encrypted with the public key and coupled to output a response to the authentication challenge decrypted with the private key.
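    The enroll-then-challenge flow behind such hardware fingerprinting can be sketched compactly. The patent derives an asymmetric key pair from the PUF value; the sketch below collapses that to a keyed hash purely to show the challenge-response structure (illustrative only, with no security claim):

```python
import hashlib

def puf_response(device_secret: bytes, challenge: bytes) -> bytes:
    # Stand-in for a physically unclonable function: in hardware the
    # response comes from device-unique physical variation; here a keyed
    # hash models that device-specific behaviour.
    return hashlib.sha256(device_secret + challenge).digest()

# Enrollment: the verifier records the genuine device's response to a
# challenge it will later reuse.
secret = b"device-0-silicon-variation"     # what the PUF "is", physically
challenge = b"nonce-123"
enrolled = puf_response(secret, challenge)

def authenticate(device_secret: bytes, challenge: bytes, expected: bytes) -> bool:
    # Authentication: only the genuine hardware reproduces the response.
    return puf_response(device_secret, challenge) == expected
```

    A cloned or substituted device lacks the physical variation behind the response, so it fails the challenge even if its firmware is copied bit-for-bit.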

  7. Virtual screening of integrase inhibitors by large scale binding free energy calculations: the SAMPL4 challenge

    PubMed Central

    Gallicchio, Emilio; Deng, Nanjie; He, Peng; Wickstrom, Lauren; Perryman, Alexander L.; Santiago, Daniel N.; Forli, Stefano; Olson, Arthur J.; Levy, Ronald M.

    2014-01-01

    As part of the SAMPL4 blind challenge, filtered AutoDock Vina ligand docking predictions and large scale binding energy distribution analysis method binding free energy calculations have been applied to the virtual screening of a focused library of candidate binders to the LEDGF site of the HIV integrase protein. The computational protocol leveraged docking and high level atomistic models to improve enrichment. The enrichment factor of our blind predictions ranked best among all of the computational submissions, and second best overall. This work represents to our knowledge the first example of the application of an all-atom physics-based binding free energy model to large scale virtual screening. A total of 285 parallel Hamiltonian replica exchange molecular dynamics absolute protein-ligand binding free energy simulations were conducted starting from docked poses. The setup of the simulations was fully automated, calculations were distributed on multiple computing resources and were completed in a 6-weeks period. The accuracy of the docked poses and the inclusion of intramolecular strain and entropic losses in the binding free energy estimates were the major factors behind the success of the method. Lack of sufficient time and computing resources to investigate additional protonation states of the ligands was a major cause of mispredictions. The experiment demonstrated the applicability of binding free energy modeling to improve hit rates in challenging virtual screening of focused ligand libraries during lead optimization. PMID:24504704

  8. In-the-wild facial expression recognition in extreme poses

    NASA Astrophysics Data System (ADS)

    Yang, Fei; Zhang, Qian; Zheng, Chi; Qiu, Guoping

    2018-04-01

    Facial expression recognition is an active problem in computer vision research. In recent years, the research has moved from the lab environment to in-the-wild circumstances, which is challenging, especially under extreme head poses. Current expression detection systems typically try to factor out pose effects in pursuit of general applicability. In this work, we take the opposite approach: we consider head poses explicitly and detect expressions within specific head poses. Our work includes two parts: detecting the head pose and grouping it into one of the pre-defined head pose classes, and performing facial expression recognition within each pose class. Our experiments show that recognition results with pose-class grouping are much better than those of direct recognition without considering poses. We combine hand-crafted features (SIFT, LBP and geometric features) with deep learning features as the representation of the expressions; the hand-crafted features are added into the deep learning framework alongside the high-level deep learning features. As a comparison, we implement SVM and random forest as the prediction models. To train and test our methodology, we labeled the face dataset with the 6 basic expressions.

  9. Exploring the Relationships between Students' Ability of Computer-Based Chinese Input and Other Variables Associated to Their Performances in Composition Writing

    ERIC Educational Resources Information Center

    Chai, Ching Sing; Wong, Lung-Hsiang; Sim, Seok Hwa; Deng, Feng

    2012-01-01

    Computer-based writing is already a norm to a large extent in social communication for any major language around the world. From this perspective, it would be pedagogically sound for students to master the Chinese input system as early as possible. This poses some challenges to students in Singapore, most of which are learning Chinese as a second…

  10. Effectiveness of a Computer-Tailored Print-Based Physical Activity Intervention among French Canadians with Type 2 Diabetes in a Real-Life Setting

    ERIC Educational Resources Information Center

    Boudreau, Francois; Godin, Gaston; Poirier, Paul

    2011-01-01

    The promotion of regular physical activity for people with type 2 diabetes poses a challenge for public health authorities. The purpose of this study was to evaluate the efficiency of a computer-tailoring print-based intervention to promote the adoption of regular physical activity among people with type 2 diabetes. An experimental design was…

  11. Bayesian Methods for Scalable Multivariate Value-Added Assessment

    ERIC Educational Resources Information Center

    Lockwood, J. R.; McCaffrey, Daniel F.; Mariano, Louis T.; Setodji, Claude

    2007-01-01

    There is increased interest in value-added models relying on longitudinal student-level test score data to isolate teachers' contributions to student achievement. The complex linkage of students to teachers as students progress through grades poses both substantive and computational challenges. This article introduces a multivariate Bayesian…

  12. Analysis of Cross-Cultural Online Collaborative Learning with Social Software

    ERIC Educational Resources Information Center

    Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu

    2010-01-01

    Purpose: The rising popularity of social software poses challenges to the design and evaluation of pedagogically sound cross-cultural online collaborative learning environments (OCLEs). In the literature of computer-mediated communications, there exist only a limited number of related empirical studies, indicating that it is still an emergent…

  13. Internet and Information Control: The Case of China.

    ERIC Educational Resources Information Center

    Xiaoming, Hao; Zhang, Kewen; Yu, Huang

    1996-01-01

    Examines the potential impact of computer-mediated communication (CMC) on government controls of information in China. Examines the current development of CMC in China; challenges posed by new media technologies to government information control; and the current policy of the Chinese government towards on-line communication and its implications.…

  14. Rational Approximations to Rational Models: Alternative Algorithms for Category Learning

    ERIC Educational Resources Information Center

    Sanborn, Adam N.; Griffiths, Thomas L.; Navarro, Daniel J.

    2010-01-01

    Rational models of cognition typically consider the abstract computational problems posed by the environment, assuming that people are capable of optimally solving those problems. This differs from more traditional formal models of cognition, which focus on the psychological processes responsible for behavior. A basic challenge for rational models…

  15. The Effect of Password Management Procedures on the Entropy of User Selected Passwords

    ERIC Educational Resources Information Center

    Enamait, John D.

    2012-01-01

    Maintaining the security of information contained within computer systems poses challenges for users and administrators. Attacks on information systems continue to rise. Specifically, attacks that target user authentication are increasingly popular. These attacks are based on the common perception that traditional alphanumeric passwords are weak…
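    Password entropy of the kind studied here is usually estimated with a back-of-envelope model: length times log2 of the assumed character pool. A hedged sketch of that bound (real user-chosen passwords have far less effective entropy than this model suggests):

```python
import math
import string

def password_entropy_bits(password: str) -> float:
    # Naive entropy estimate: assume independent, uniform choice from the
    # union of the character classes actually present in the password.
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)
    return len(password) * math.log2(pool) if pool else 0.0
```

    Management procedures that force mixed character classes raise the pool size in this model, which is exactly the effect the dissertation measures against what users actually choose.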

  16. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  17. Comparative assessment of techniques for initial pose estimation using monocular vision

    NASA Astrophysics Data System (ADS)

    Sharma, Sumant; D'Amico, Simone

    2016-06-01

    This work addresses the comparative assessment of initial pose estimation techniques for monocular navigation to enable formation-flying and on-orbit servicing missions. Monocular navigation relies on finding an initial pose, i.e., a coarse estimate of the attitude and position of the space resident object with respect to the camera, based on a minimum number of features from a three-dimensional computer model and a single two-dimensional image. The initial pose is estimated without the use of fiducial markers, without any range measurements or any a priori relative motion information. Prior work has been done to compare different pose estimators for terrestrial applications, but there is a lack of functional and performance characterization of such algorithms in the context of missions involving rendezvous operations in the space environment. Use of state-of-the-art pose estimation algorithms designed for terrestrial applications is challenging in space due to factors such as limited on-board processing power, low carrier-to-noise ratio, and high image contrasts. This paper focuses on performance characterization of three initial pose estimation algorithms in the context of such missions and suggests improvements.

  18. Robust head pose estimation via supervised manifold learning.

    PubMed

    Wang, Chao; Song, Xubo

    2014-05-01

    Head poses can be automatically estimated using manifold learning algorithms, with the assumption that, with the pose being the only variable, the face images should lie in a smooth and low-dimensional manifold. However, this estimation approach is challenging due to other appearance variations related to identity, head location in image, background clutter, facial expression, and illumination. To address the problem, we propose to incorporate supervised information (pose angles of training samples) into the process of manifold learning. The process has three stages: neighborhood construction, graph weight computation and projection learning. For the first two stages, we redefine the inter-point distance for neighborhood construction as well as the graph weight by constraining them with the pose angle information. For the third stage, we present a supervised neighborhood-based linear feature transformation algorithm to keep the data points with similar pose angles close together but the data points with dissimilar pose angles far apart. The experimental results show that our method has higher estimation accuracy than other state-of-the-art algorithms and is robust to identity and illumination variations. Copyright © 2014 Elsevier Ltd. All rights reserved.
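    The pose-supervised weighting idea from the first two stages can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the Gaussian product form, the bandwidths, and all names are assumptions made for the example.

```python
import numpy as np

def supervised_weights(X, angles, sigma_x=1.0, sigma_a=10.0):
    """Graph weights coupling appearance distance with pose-angle distance.

    Illustrative form: w_ij = exp(-||x_i - x_j||^2 / sigma_x^2)
                             * exp(-(a_i - a_j)^2 / sigma_a^2),
    so only neighbors that are close in BOTH appearance and pose get
    strong edges in the manifold-learning graph.
    """
    dx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1) ** 2
    da = (angles[:, None] - angles[None, :]) ** 2
    return np.exp(-dx / sigma_x**2) * np.exp(-da / sigma_a**2)

X = np.array([[0.0, 0.0], [0.1, 0.0], [0.1, 0.1]])  # toy feature vectors
angles = np.array([0.0, 5.0, 60.0])                 # pose angles (degrees)
W = supervised_weights(X, angles)
# Samples 0 and 1 are close in appearance AND pose -> strong edge;
# samples 1 and 2 are close in appearance but far in pose -> weak edge.
```

    Without the pose term, samples 1 and 2 would be near-neighbors purely on appearance, which is exactly the failure mode the supervision is meant to suppress.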

  19. A review of cooperative and uncooperative spacecraft pose determination techniques for close-proximity operations

    NASA Astrophysics Data System (ADS)

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2017-08-01

    The capability of an active spacecraft to accurately estimate its relative position and attitude (pose) with respect to an active/inactive, artificial/natural space object (target) orbiting in close proximity is required to carry out various activities like formation flying, on-orbit servicing, active debris removal, and space exploration. According to the specific mission scenario, the pose determination task involves both theoretical and technological challenges related to the search for the most suitable algorithmic solution and sensor architecture, respectively. As regards the latter aspect, electro-optical sensors represent the best option as their use is compatible with the mass and power limitations of micro and small satellites, and their measurements can be processed to estimate all the pose parameters. Overall, the degree of complexity of the challenges related to pose determination largely varies depending on the nature of the targets, which may be actively/passively cooperative, uncooperative but known, or uncooperative and unknown space objects. In this respect, while cooperative pose determination has been successfully demonstrated in orbit, the uncooperative case is still under study by universities, research centers, space agencies and private companies. However, in both cases, the demand for space applications involving relative navigation maneuvers, also in close proximity, for which pose determination capabilities are mandatory, is significantly increasing. In this framework, a review of state-of-the-art techniques and algorithms developed in the last decades for cooperative and uncooperative pose determination by processing data provided by electro-optical sensors is herein presented. Specifically, their main advantages and drawbacks in terms of achieved performance, computational complexity, and sensitivity to variability of pose and target geometry are highlighted.

  20. Exploring the stability of ligand binding modes to proteins by molecular dynamics simulations.

    PubMed

    Liu, Kai; Watanabe, Etsurou; Kokubo, Hironori

    2017-02-01

    Binding mode prediction is of great importance to structure-based drug design. Discriminating among the various binding poses of a ligand generated by docking is a great challenge not only for docking score functions but also for the relatively expensive free energy calculation methods. Here we systematically analyzed the stability of various ligand poses under molecular dynamics (MD) simulation. First, a data set of 120 complexes was built based on the typical physicochemical properties of drug-like ligands. Three potential binding poses (one correct pose and two decoys) were selected for each ligand from self-docking, in addition to the experimental pose. Then, five independent MD simulations for each pose were performed with different initial velocities for statistical analysis. Finally, the stabilities of the ligand poses under MD were evaluated and compared with the native pose from the crystal structure. We found that about 94% of the native poses remained stable during the simulations, which suggests that MD simulations are accurate enough to properly judge most experimental binding poses as stable. Interestingly, incorrect decoy poses were much less stable: 38-44% of decoys could be excluded just by performing equilibrium MD simulations, though 56-62% remained stable. The computationally heavy binding free energy calculation then needs to be performed only for these surviving poses.
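    The stability criterion described above can be sketched as a ligand-RMSD check over trajectory frames. This is a minimal sketch with illustrative names and an assumed 2 Å cutoff; the paper's actual analysis runs five full MD replicas per pose rather than a simple coordinate drift.

```python
import numpy as np

def rmsd(a, b):
    """Root-mean-square deviation between two conformations (N x 3 arrays)."""
    return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

def pose_is_stable(frames, reference, threshold=2.0):
    """Keep a pose if its RMSD to the starting coordinates stays below
    `threshold` (in Angstrom, an assumed cutoff) in every sampled frame."""
    return all(rmsd(f, reference) < threshold for f in frames)

ref = np.zeros((4, 3))                        # toy 4-atom ligand
tight = [ref + 0.1 * i for i in range(5)]     # small drift: judged stable
loose = [ref + 1.5 * i for i in range(5)]     # walks away: excluded as decoy
```

    A decoy whose trajectory "walks away" from its docked pose fails the check, which is the cheap filter applied before any free energy calculation.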

  1. Denuded Data! Grounded Theory Using the NUDIST Computer Analysis Program: In Researching the Challenge to Teacher Self-Efficacy Posed by Students with Learning Disabilities in Australian Education.

    ERIC Educational Resources Information Center

    Burroughs-Lange, Sue G.; Lange, John

    This paper evaluates the effects of using the NUDIST (Non-numerical, Unstructured Data Indexing, Searching and Theorising) computer program to organize coded, qualitative data. The use of the software is discussed within the context of the study for which it was used: an Australian study that aimed to develop a theoretical understanding of the…

  2. Open source tools for large-scale neuroscience.

    PubMed

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  3. Pulse sequences for suppressing leakage in single-qubit gate operations

    NASA Astrophysics Data System (ADS)

    Ghosh, Joydip; Coppersmith, S. N.; Friesen, Mark

    2017-06-01

    Many realizations of solid-state qubits involve couplings to leakage states lying outside the computational subspace, posing a threat to high-fidelity quantum gate operations. Mitigating leakage errors is especially challenging when the coupling strength is unknown, e.g., when it is caused by noise. Here we show that simple pulse sequences can be used to strongly suppress leakage errors for a qubit embedded in a three-level system. As an example, we apply our scheme to the recently proposed charge quadrupole (CQ) qubit for quantum dots. These results provide a solution to a key challenge for fault-tolerant quantum computing with solid-state elements.

  4. Computational Social Creativity.

    PubMed

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  5. Incorporating CAD Instruction into the Drafting Curriculum.

    ERIC Educational Resources Information Center

    Yuen, Steve Chi-Yin

    1990-01-01

    If education is to meet the challenges posed by the U.S. productivity crisis and the large number of computer-assisted design (CAD) workstations forecast as necessary in the future, schools must integrate CAD into the drafting curriculum and become aggressive in providing CAD training. Teachers need to maintain close contact with local industries…

  6. Technology Teaching or Mediated Learning, Part II, 1990s: Literacy Linkages and Intervention Contexts.

    ERIC Educational Resources Information Center

    Coufal, Kathy L.

    2002-01-01

    Themes during the 1990s included the bootstrapping effects between oral and literate language, importance of supporting emergent literacy, parallels between oral language impairment and academic failure, and challenges in facilitating language learning. This article addresses questions posed in Part I related to use of computer technology for…

  7. Schools Facing the Expiration of Windows XP

    ERIC Educational Resources Information Center

    Cavanagh, Sean

    2013-01-01

    Microsoft's plans to end support for Windows XP, believed to be the dominant computer operating system in K-12 education, could pose big technological and financial challenges for districts nationwide--issues that many school systems have yet to confront. The giant software company has made it clear for years that it plans to stop supporting XP…

  8. Computer-Based Aids for Learning, Job Performance, and Decision Making in Military Applications: Emergent Technology and Challenges

    DTIC Science & Technology

    2003-10-01

    paper, which addresses the following questions: Is it worth it? What do we know about the value of technology applications in learning (education and…)? … Technology-based systems for education, training, and performance aiding (including decision aiding) may pose the…

  9. FPGA-Based Laboratory Assignments for NoC-Based Manycore Systems

    ERIC Educational Resources Information Center

    Ttofis, C.; Theocharides, T.; Michael, M. K.

    2012-01-01

    Manycore systems have emerged as being one of the dominant architectural trends in next-generation computer systems. These highly parallel systems are expected to be interconnected via packet-based networks-on-chip (NoC). The complexity of such systems poses novel and exciting challenges in academia, as teaching their design requires the students…

  10. Robust Head-Pose Estimation Based on Partially-Latent Mixture of Linear Regressions.

    PubMed

    Drouard, Vincent; Horaud, Radu; Deleforge, Antoine; Ba, Sileye; Evangelidis, Georgios

    2017-03-01

    Head-pose estimation has many applications, such as social event analysis, human-robot and human-computer interaction, driving assistance, and so forth. Head-pose estimation is challenging, because it must cope with changing illumination conditions, variabilities in face orientation and in appearance, partial occlusions of facial landmarks, as well as bounding-box-to-face alignment errors. We propose to use a mixture of linear regressions with partially-latent output. This regression method learns to map high-dimensional feature vectors (extracted from bounding boxes of faces) onto the joint space of head-pose angles and bounding-box shifts, such that they are robustly predicted in the presence of unobservable phenomena. We describe in detail the mapping method that combines the merits of unsupervised manifold learning techniques and of mixtures of regressions. We validate our method with three publicly available data sets and we thoroughly benchmark four variants of the proposed algorithm with several state-of-the-art head-pose estimation methods.

  11. Information Assurance and Forensic Readiness

    NASA Astrophysics Data System (ADS)

    Pangalos, Georgios; Katos, Vasilios

    Egalitarianism and justice are amongst the core attributes of a democratic regime and should also be secured in an e-democratic setting. As such, the rise of computer-related offenses poses a threat to the fundamental aspects of e-democracy and e-governance. Digital forensics is a key component for protecting and enabling the underlying (e-)democratic values, and therefore forensic readiness should be considered in an e-democratic setting. This position paper commences from the observation that the density of compliance and potential litigation activities is monotonically increasing in modern organizations, as rules, legislative regulations and policies are constantly added to the corporate environment. Forensic practices seem to be departing from the niche of law enforcement and are becoming a business function and infrastructural component, posing new challenges to security professionals. Having no a priori knowledge of whether a security-related event or corporate policy violation will lead to litigation, we advocate that computer forensics be applied to all investigatory, monitoring and auditing activities. This results in an inflation of the responsibilities of the Information Security Officer. After exploring some commonalities and differences between IS audit and computer forensics, we present a list of strategic challenges that the organization and, in effect, the IS security and audit practitioner will face.

  12. Energy Efficiency Challenges of 5G Small Cell Networks.

    PubMed

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-05-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to whether computation or transmission power is more important in the energy efficiency of 5G small cell networks. Thus, the main objective in this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks.
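    The Landauer principle invoked here sets a temperature-dependent lower bound of k_B·T·ln 2 joules per irreversible bit operation. The sketch below shows only that arithmetic; the paper's BS power model is far more detailed, and real hardware dissipates many orders of magnitude more than this thermodynamic floor.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_kelvin=300.0):
    """Minimum energy dissipated per irreversible bit operation
    (Landauer principle): k_B * T * ln 2."""
    return K_B * temp_kelvin * math.log(2.0)

def min_compute_power_watts(bit_ops_per_second, temp_kelvin=300.0):
    """Thermodynamic floor on computation power for a given rate of
    irreversible bit operations (illustrative helper, not the paper's model)."""
    return bit_ops_per_second * landauer_limit_joules(temp_kelvin)

# At room temperature, erasing one bit costs at least ~2.9e-21 J, so even
# 1e18 bit operations per second has a floor of only a few milliwatts;
# the hundreds of watts measured at a BS reflect hardware inefficiency,
# not the physics bound.
```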

  13. Energy Efficiency Challenges of 5G Small Cell Networks

    PubMed Central

    Ge, Xiaohu; Yang, Jing; Gharavi, Hamid; Sun, Yang

    2017-01-01

    The deployment of a large number of small cells poses new challenges to energy efficiency, which has often been ignored in fifth generation (5G) cellular networks. While massive multiple-input multiple-output (MIMO) will reduce the transmission power at the expense of higher computational cost, the question remains as to whether computation or transmission power is more important in the energy efficiency of 5G small cell networks. Thus, the main objective in this paper is to investigate the computation power based on the Landauer principle. Simulation results reveal that more than 50% of the energy is consumed by the computation power at 5G small cell base stations (BSs). Moreover, the computation power of a 5G small cell BS can approach 800 watts when massive MIMO (e.g., 128 antennas) is deployed to transmit high volume traffic. This clearly indicates that computation power optimization can play a major role in the energy efficiency of small cell networks. PMID:28757670

  14. Pose tracking for augmented reality applications in outdoor archaeological sites

    NASA Astrophysics Data System (ADS)

    Younes, Georges; Asmar, Daniel; Elhajj, Imad; Al-Harithy, Howayda

    2017-01-01

    In recent years, agencies around the world have invested huge amounts of effort toward digitizing many aspects of the world's cultural heritage. Of particular importance is the digitization of outdoor archaeological sites. In the spirit of valorization of this digital information, many groups have developed virtual or augmented reality (AR) computer applications themed around a particular archaeological object. The problem of pose tracking in outdoor AR applications is addressed. Different positional systems are analyzed, resulting in the selection of a monocular camera-based user tracker. The limitations that challenge this technique from map generation, scale, anchoring, to lighting conditions are analyzed and systematically addressed. Finally, as a case study, our pose tracking system is implemented within an AR experience in the Byblos Roman theater in Lebanon.

  15. 3D face recognition under expressions, occlusions, and pose variations.

    PubMed

    Drira, Hassen; Ben Amor, Boulbaba; Srivastava, Anuj; Daoudi, Mohamed; Slama, Rim

    2013-09-01

    We propose a novel geometric framework for analyzing 3D faces, with the specific goals of comparing, matching, and averaging their shapes. Here we represent facial surfaces by radial curves emanating from the nose tips and use elastic shape analysis of these curves to develop a Riemannian framework for analyzing shapes of full facial surfaces. This representation, along with the elastic Riemannian metric, seems natural for measuring facial deformations and is robust to challenges such as large facial expressions (especially those with open mouths), large pose variations, missing parts, and partial occlusions due to glasses, hair, and so on. This framework is shown to be promising from both empirical and theoretical perspectives. In terms of the empirical evaluation, our results match or improve upon the state-of-the-art methods on three prominent databases: FRGCv2, GavabDB, and Bosphorus, each posing a different type of challenge. From a theoretical perspective, this framework allows for formal statistical inferences, such as the estimation of missing facial parts using PCA on tangent spaces and computing average shapes.

  16. The role of synergies within generative models of action execution and recognition: A computational perspective. Comment on "Grasping synergies: A motor-control approach to the mirror neuron mechanism" by A. D'Ausilio et al.

    NASA Astrophysics Data System (ADS)

    Pezzulo, Giovanni; Donnarumma, Francesco; Iodice, Pierpaolo; Prevete, Roberto; Dindo, Haris

    2015-03-01

    Controlling the body - given its huge number of degrees of freedom - poses severe computational challenges. Mounting evidence suggests that the brain alleviates this problem by exploiting "synergies", or patterns of muscle activities (and/or movement dynamics and kinematics) that can be combined to control action, rather than controlling individual muscles or joints [1-10].
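    The synergy idea can be illustrated as a low-dimensional linear recombination: a few fixed activation patterns are mixed by recruitment coefficients instead of commanding each muscle separately. The numbers and dimensions below are purely illustrative, not drawn from the commented paper.

```python
import numpy as np

# Two fixed synergies: activation patterns over four muscles.
W = np.array([[1.0, 0.5, 0.0, 0.2],
              [0.0, 0.3, 1.0, 0.6]])   # shape: (n_synergies, n_muscles)

def muscle_commands(recruitment):
    """Mix synergy recruitment coefficients into per-muscle commands,
    m = c @ W: the controller sets 2 coefficients instead of 4 muscle
    activations directly, which is the dimensionality reduction the
    synergy hypothesis proposes."""
    return np.asarray(recruitment) @ W

m_cmd = muscle_commands([2.0, 1.0])
# m_cmd -> [2.0, 1.3, 1.0, 1.0]
```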

  17. Memory management in genome-wide association studies

    PubMed Central

    2009-01-01

    Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047

  18. Advantages of Computer Simulation in Enhancing Students' Learning about Landform Evolution: A Case Study Using the Grand Canyon

    ERIC Educational Resources Information Center

    Luo, Wei; Pelletier, Jon; Duffin, Kirk; Ormand, Carol; Hung, Wei-chen; Shernoff, David J.; Zhai, Xiaoming; Iverson, Ellen; Whalley, Kyle; Gallaher, Courtney; Furness, Walter

    2016-01-01

    The long geological time needed for landform development and evolution poses a challenge for understanding and appreciating the processes involved. The Web-based Interactive Landform Simulation Model--Grand Canyon (WILSIM-GC, http://serc.carleton.edu/landform/) is an educational tool designed to help students better understand such processes,…

  19. The Teaching of Anthropogenic Climate Change and Earth Science via Technology-Enabled Inquiry Education

    ERIC Educational Resources Information Center

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2016-01-01

    A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models,…

  20. A Framework for Representing and Jointly Reasoning over Linguistic and Non-Linguistic Knowledge

    ERIC Educational Resources Information Center

    Murugesan, Arthi

    2009-01-01

    Natural language poses several challenges to developing computational systems for modeling it. Natural language is not a precise problem but is rather ridden with a number of uncertainties in the form of either alternate words or interpretations. Furthermore, natural language is a generative system where the problem size is potentially infinite.…

  1. Toxcast and the Use of Human Relevant In Vitro Exposures ...

    EPA Pesticide Factsheets

    The path for incorporating new approach methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. These challenges include sufficient coverage of toxicological mechanisms to meaningfully interpret negative test results, development of increasingly relevant test systems, computational modeling to integrate experimental data, putting results in a dose and exposure context, characterizing uncertainty, and efficient validation of the test systems and computational models. The presentation will cover progress at the U.S. EPA in systematically addressing each of these challenges and delivering more human-relevant risk-based assessments. This abstract does not necessarily reflect U.S. EPA policy. Presentation at the British Toxicological Society Annual Congress on ToxCast and the Use of Human Relevant In Vitro Exposures: Incorporating high-throughput exposure and toxicity testing data for 21st century risk assessments.

  2. WLCG scale testing during CMS data challenges

    NASA Astrophysics Data System (ADS)

    Gutsche, O.; Hajdu, C.

    2008-07-01

    The CMS computing model to process and analyze LHC collision data follows a data-location driven approach and is using the WLCG infrastructure to provide access to GRID resources. As a preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of the challenges is the test of the user analysis which poses a special challenge for the infrastructure with its random distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set its goal to test the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given about the outcome of the user analysis part of the challenge using both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between both GRID middlewares (resource broker vs. direct submission) will be discussed. In the end, an outlook for the 2007 data challenge is given.

  3. Spatiotemporal Domain Decomposition for Massive Parallel Computation of Space-Time Kernel Density

    NASA Astrophysics Data System (ADS)

    Hohl, A.; Delmelle, E. M.; Tang, W.

    2015-07-01

    Accelerated processing capabilities are deemed critical when conducting analysis on spatiotemporal datasets of increasing size, diversity and availability. High-performance parallel computing offers the capacity to solve computationally demanding problems in a limited timeframe, but likewise poses the challenge of preventing processing inefficiency due to workload imbalance between computing resources. Therefore, when designing new algorithms capable of implementing parallel strategies, careful spatiotemporal domain decomposition is necessary to account for heterogeneity in the data. In this study, we perform octree-based adaptive decomposition of the spatiotemporal domain for parallel computation of space-time kernel density. In order to avoid edge effects near subdomain boundaries, we establish spatiotemporal buffers to include adjacent data points that are within the spatial and temporal kernel bandwidths. Then, we quantify the computational intensity of each subdomain to balance workloads among processors. We illustrate the benefits of our methodology using a space-time epidemiological dataset of Dengue fever, an infectious vector-borne disease that poses a severe threat to communities in tropical climates. Our parallel implementation of kernel density reaches substantial speedup compared to sequential processing, and achieves high levels of workload balance among processors due to great accuracy in quantifying computational intensity. Our approach is portable to other space-time analytical tests.
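    The buffering step described above can be sketched as a subdomain selection that extends the bounds by the kernel bandwidths, so density estimates near subdomain edges still see their full neighborhoods. A minimal sketch with illustrative names; the actual implementation operates on adaptive octree leaves rather than a single box.

```python
import numpy as np

def buffered_mask(points, times, xy_bounds, t_bounds, hs, ht):
    """Boolean mask of events belonging to a spatiotemporal subdomain
    extended by the spatial (hs) and temporal (ht) kernel bandwidths.
    Points in the buffer zone are included so edge densities are not
    biased (illustrative names, not the paper's API)."""
    (xmin, xmax), (ymin, ymax) = xy_bounds
    t0, t1 = t_bounds
    return ((points[:, 0] >= xmin - hs) & (points[:, 0] <= xmax + hs) &
            (points[:, 1] >= ymin - hs) & (points[:, 1] <= ymax + hs) &
            (times >= t0 - ht) & (times <= t1 + ht))

pts = np.array([[0.5, 0.5],    # inside the subdomain
                [1.2, 0.5],    # outside, but within the spatial buffer
                [5.0, 5.0]])   # far outside
ts = np.array([0.5, 0.5, 0.5])
mask = buffered_mask(pts, ts, ((0.0, 1.0), (0.0, 1.0)), (0.0, 1.0),
                     hs=0.5, ht=1.0)
# mask -> [True, True, False]
```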

  4. SU-G-JeP3-03: Effect of Robot Pose On Beam Blocking for Ultrasound Guided SBRT of the Prostate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerlach, S; Schlaefer, A; Kuhlemann, I

    Purpose: Ultrasound presents a fast, volumetric image modality for real-time tracking of abdominal organ motion. However, ultrasound transducer placement during radiation therapy is challenging. Recently, approaches using robotic arms for intra-treatment ultrasound imaging have been proposed. Good and reliable imaging requires placing the transducer close to the PTV. We studied the effect of a seven-degrees-of-freedom robot on the feasible beam directions. Methods: For five CyberKnife prostate treatment plans we established viewports for the transducer, i.e., points on the patient surface with a soft-tissue view towards the PTV. Choosing a feasible transducer pose and using the kinematic redundancy of the KUKA LBR iiwa robot, we considered three robot poses. Poses 1 to 3 had the elbow point anterior, superior, and inferior, respectively. For each pose and each beam starting point, the projections of robot and PTV were computed. We added a 20 mm margin accounting for organ/beam motion. The number of nodes for which the PTV was partially or fully blocked was established. Moreover, the cumulative overlap for each of the poses and the minimum overlap over all poses were computed. Results: The fully and partially blocked nodes ranged from 12% to 20% and 13% to 27%, respectively. Typically, pose 3 caused the fewest blocked nodes. The cumulative overlap ranged from 19% to 29%. Taking the minimum overlap, i.e., considering moving the robot's elbow while maintaining the transducer pose, the cumulative overlap was reduced to 16% to 18% and was 3% to 6% lower than for the best individual pose. Conclusion: Our results indicate that it is possible to identify feasible ultrasound transducer poses and to use the kinematic redundancy of a 7 DOF robot to minimize the impact of the imaging subsystem on the feasible beam directions for ultrasound-guided and motion-compensated SBRT. Research partially funded by DFG grants ER 817/1-1 and SCHL 1844/3-1.

  5. [Medical cooperation on the internet].

    PubMed

    Meier, N; Lenzen, H; Renger, B C

    1998-01-01

    Post-1999, the economically united EEC will pose new challenges to European business, industry and citizens. It is a key objective that in the domain of European "infostructure" these problems are addressed and overcome, and that "advanced communications technologies and services" (ACTS) become the cement which binds the Community together. Within ACTS, 130 different projects are building new services. The consortium Emerald develops a telemedicine platform, setting up teleworking with teleconferencing, computer-supported cooperative work (CSCW, joint editing), demonstration and teleteaching for radiology, cardiology, nuclear medicine and radiosurgery working environments.

  6. Translational Biomedical Informatics in the Cloud: Present and Future

    PubMed Central

    Chen, Jiajia; Qian, Fuliang; Yan, Wenying; Shen, Bairong

    2013-01-01

    Next generation sequencing and other high-throughput experimental techniques of recent decades have driven the exponential growth in publicly available molecular and clinical data. This information explosion has prepared the ground for the development of translational bioinformatics. The scale and dimensionality of data, however, pose obvious challenges in data mining, storage, and integration. In this paper we demonstrated the utility and promise of cloud computing for tackling the big data problems. We also outline our vision that cloud computing could be an enabling tool to facilitate translational bioinformatics research. PMID:23586054

  7. Virtualization in education: Information Security lab in your hands

    NASA Astrophysics Data System (ADS)

    Karlov, A. A.

    2016-09-01

    The growing demand for qualified specialists in advanced information technologies poses serious challenges to the education and training of young personnel for science, industry and social problems. Virtualization as a way to isolate the user from the physical characteristics of computing resources (processors, servers, operating systems, networks, applications, etc.), has, in particular, an enormous influence in the field of education, increasing its efficiency, reducing the cost, making it more widely and readily available. The study of Information Security of computer systems is considered as an example of use of virtualization in education.

  8. A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.

    PubMed

    Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus

    2016-11-01

    Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the great development of computer technologies, computer-aided surgery has been widely used for minimizing the risks and improving the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source on current and future developments in computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.

  9. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.
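
    The secret-sharing approach described above can be illustrated with a minimal sketch (not the authors' actual iDASH protocol): additive secret sharing lets three parties jointly compute an allele-count total, and hence a minor allele frequency, without any single party seeing individual genotypes.

    ```python
    import random

    P = 2**61 - 1  # large prime modulus (illustrative choice)

    def share(value, n_parties=3):
        """Split an integer into n additive shares modulo P."""
        shares = [random.randrange(P) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % P)
        return shares

    def reconstruct(shares):
        return sum(shares) % P

    # Each record's minor-allele count (0, 1, or 2) is shared across parties.
    genotypes = [0, 1, 2, 1, 0, 2, 1]
    per_party = [share(g) for g in genotypes]

    # Each party locally sums the shares it holds; only the total is revealed.
    party_sums = [sum(col) % P for col in zip(*per_party)]
    total = reconstruct(party_sums)

    maf = total / (2 * len(genotypes))  # minor allele frequency
    ```

    A production protocol would use cryptographically strong randomness and secure channels; this sketch only shows the additive-sharing arithmetic.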

  10. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  11. Performance of multiple docking and refinement methods in the pose prediction D3R prospective Grand Challenge 2016

    NASA Astrophysics Data System (ADS)

    Fradera, Xavier; Verras, Andreas; Hu, Yuan; Wang, Deping; Wang, Hongwu; Fells, James I.; Armacost, Kira A.; Crespo, Alejandro; Sherborne, Brad; Wang, Huijun; Peng, Zhengwei; Gao, Ying-Duo

    2018-01-01

    We describe the performance of multiple pose prediction methods for the D3R 2016 Grand Challenge. The pose prediction challenge includes 36 ligands, which represent 4 chemotypes and some miscellaneous structures against the FXR ligand binding domain. In this study we use a mix of fully automated methods as well as human-guided methods with considerations of both the challenge data and publicly available data. The methods include ensemble docking, colony entropy pose prediction, target selection by molecular similarity, molecular dynamics guided pose refinement, and pose selection by visual inspection. We evaluated the success of our predictions by method, chemotype, and relevance of publicly available data. For the overall data set, ensemble docking, visual inspection, and molecular dynamics guided pose prediction performed the best with overall mean RMSDs of 2.4, 2.2, and 2.2 Å respectively. For several individual challenge molecules, the best performing method is evaluated in light of that particular ligand. We also describe the protein, ligand, and public information data preparations that are typical of our binding mode prediction workflow.
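
    The mean RMSD values quoted above measure how far a predicted pose lies from the crystallographic one. As a minimal sketch (assuming a matched atom ordering, which real evaluations relax to handle symmetry-equivalent atoms):

    ```python
    import math

    def pose_rmsd(coords_a, coords_b):
        """RMSD (in Å) between two ligand poses given matched atom coordinates."""
        assert len(coords_a) == len(coords_b)
        sq = sum((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2
                 for (xa, ya, za), (xb, yb, zb) in zip(coords_a, coords_b))
        return math.sqrt(sq / len(coords_a))

    # Toy two-atom ligand: every atom displaced by 1 Å along z.
    predicted = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    crystal   = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
    rmsd = pose_rmsd(predicted, crystal)
    ```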

  12. Combining self- and cross-docking as benchmark tools: the performance of DockBench in the D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Salmaso, Veronica; Sturlese, Mattia; Cuzzolin, Alberto; Moro, Stefano

    2018-01-01

    Molecular docking is a powerful tool in the field of computer-aided molecular design. In particular, it is the technique of choice for the prediction of a ligand pose within its target binding site. A multitude of docking methods is available nowadays, whose performance may vary depending on the data set. Therefore, some non-trivial choices should be made before starting a docking simulation. In the same framework, the selection of the target structure to use could be challenging, since the number of available experimental structures is increasing. Both issues have been explored within this work. The pose prediction of a pool of 36 compounds provided by D3R Grand Challenge 2 organizers was preceded by a pipeline to choose the best protein/docking-method couple for each blind ligand. An integrated benchmark approach including ligand shape comparison and cross-docking evaluations was implemented inside our DockBench software. The results are encouraging and show that bringing attention to the choice of the docking simulation fundamental components improves the results of the binding mode predictions.

  13. Technology and the Longer View. Proceedings of the Annual National Conference on Technology and Education (1st, Ft. Worth, Texas, April 19-20, 1984).

    ERIC Educational Resources Information Center

    Klier, Betje, Ed.; And Others

    To address the challenges to education posed by the new information technologies, participants from 41 states--publishers, administrators, teachers, subject specialists, and leaders from the field of educational computing--gathered to share ideas and visions. This report of the conference proceedings includes introductory comments by John Roach of…

  14. Early BHs: simulations and observations

    NASA Astrophysics Data System (ADS)

    Cappelluti, Nico; di-Matteo, Tiziana; Schawinski, Kevin; Fragos, Tassos

    We report recent investigations in the field of early black holes. We summarize recent theoretical and observational efforts to understand how black holes formed and eventually evolved into supermassive black holes at high redshift. This paper makes use of state-of-the-art computer simulations and multiwavelength surveys. Although not conclusive, we present results and hypotheses that pose exciting challenges to modern astrophysics and to future facilities.

  15. Modeling the spatial distribution of forest crown biomass and effects on fire behavior with FUEL3D and WFDS

    Treesearch

    Russell A. Parsons; William Mell; Peter McCauley

    2010-01-01

    Crown fire poses challenges to fire managers and can endanger fire fighters. Understanding of how fire interacts with tree crowns is essential to informed decisions about crown fire. Current operational crown fire predictions in the United States assume homogeneous crown fuels. While a new class of research fire models, which model fire behavior with computational...

  16. Tissue Engineered Bone Using Polycaprolactone Scaffolds Made by Selective Laser Sintering

    DTIC Science & Technology

    2005-01-01

    temporomandibular joint (TMJ) pose many challenges for bone tissue engineering. Adverse reactions to alloplastic, non-biological materials result in...producing a prototype mandibular condyle scaffold based on an actual pig condyle. INTRODUCTION Repair and reconstruction of complex joints such as the...computed tomography (CT) data with a designed porous architecture to build a complex scaffold that mimics a mandibular condyle. Results show that

  17. Towards Modeling False Memory With Computational Knowledge Bases.

    PubMed

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.

  18. Deep Learning for Computer Vision: A Brief Review

    PubMed Central

    Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios

    2018-01-01

    Over the last few years, deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, namely Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein. PMID:29487619
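
    The convolutional layers at the heart of the CNNs reviewed above apply small learned filters across the image. A minimal "valid" 2D cross-correlation (no padding or stride, plain Python for clarity):

    ```python
    def conv2d(image, kernel):
        """'Valid' 2D cross-correlation, as used in CNN layers (no padding/stride)."""
        ih, iw = len(image), len(image[0])
        kh, kw = len(kernel), len(kernel[0])
        out = []
        for i in range(ih - kh + 1):
            row = []
            for j in range(iw - kw + 1):
                row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                               for di in range(kh) for dj in range(kw)))
            out.append(row)
        return out

    # A vertical-edge filter responding at the boundary of a bright region.
    img = [[0, 0, 1, 1],
           [0, 0, 1, 1],
           [0, 0, 1, 1]]
    edge = conv2d(img, [[-1, 1], [-1, 1]])
    ```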

  19. Electro-textile garments for power and data distribution

    NASA Astrophysics Data System (ADS)

    Slade, Jeremiah R.; Winterhalter, Carole

    2015-05-01

    U.S. troops are increasingly being equipped with various electronic assets including flexible displays, computers, and communications systems. While these systems can significantly enhance operational capabilities, forming reliable connections between them poses a number of challenges in terms of comfort, weight, ergonomics, and operational security. IST has addressed these challenges by developing the technologies needed to integrate large-scale cross-seam electrical functionality into virtually any textile product, including the various garments and vests that comprise the warfighter's ensemble. Using this technology IST is able to develop textile products that do not simply support or accommodate a network but are the network.

  20. Multi-Cone Model for Estimating GPS Ionospheric Delays

    NASA Technical Reports Server (NTRS)

    Sparks, Lawrence; Komjathy, Attila; Mannucci, Anthony

    2009-01-01

    The multi-cone model is a computational model for estimating ionospheric delays of Global Positioning System (GPS) signals. It is a direct descendant of the conical-domain model. A primary motivation for the development of this model is the need to find alternatives for modeling slant delays at low latitudes, where ionospheric behavior poses an acute challenge for GPS signal-delay estimates based upon the thin-shell model of the ionosphere.

  1. Hybrid Quantum-Classical Approach to Quantum Optimal Control.

    PubMed

    Li, Jun; Yang, Xiaodong; Peng, Xinhua; Sun, Chang-Pu

    2017-04-14

    A central challenge in quantum computing is to identify more computational problems for which utilization of quantum resources can offer significant speedup. Here, we propose a hybrid quantum-classical scheme to tackle the quantum optimal control problem. We show that the most computationally demanding part of gradient-based algorithms, namely, computing the fitness function and its gradient for a control input, can be accomplished by the process of evolution and measurement on a quantum simulator. By posing queries to and receiving answers from the quantum simulator, classical computing devices update the control parameters until an optimal control solution is found. To demonstrate the quantum-classical scheme in experiment, we use a seven-qubit nuclear magnetic resonance system, on which we have succeeded in optimizing state preparation without involving classical computation of the large Hilbert space evolution.
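
    The hybrid loop described above can be sketched classically: the optimizer repeatedly queries an oracle for fitness values and updates the control parameters by gradient ascent. Here a quadratic stand-in replaces the quantum simulator's evolution-and-measurement step (the peak location is an assumption for illustration).

    ```python
    def fitness(theta):
        """Stand-in for the fidelity measured on a quantum simulator.
        Peaks at theta = (0.3, -0.7); in the scheme above this value would
        come from evolution and measurement on hardware."""
        return 1.0 - (theta[0] - 0.3) ** 2 - (theta[1] + 0.7) ** 2

    def gradient(f, theta, eps=1e-5):
        """Finite-difference gradient estimated purely from oracle queries."""
        g = []
        for k in range(len(theta)):
            up = list(theta); up[k] += eps
            dn = list(theta); dn[k] -= eps
            g.append((f(up) - f(dn)) / (2 * eps))
        return g

    theta = [0.0, 0.0]           # initial control parameters
    for _ in range(200):         # classical update loop
        g = gradient(fitness, theta)
        theta = [t + 0.1 * gk for t, gk in zip(theta, g)]
    ```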

  2. Exploring the Stability of Ligand Binding Modes to Proteins by Molecular Dynamics Simulations: A Cross-docking Study.

    PubMed

    Liu, Kai; Kokubo, Hironori

    2017-10-23

    Docking has become an indispensable approach in drug discovery research to predict the binding mode of a ligand. One great challenge in docking is to efficiently refine the correct pose from various putative docking poses through scoring functions. We recently examined the stability of self-docking poses under molecular dynamics (MD) simulations and showed that equilibrium MD simulations have some capability to discriminate between correct and decoy poses. Here, we have extended our previous work to cross-docking studies for practical applications. Three target proteins (thrombin, heat shock protein 90-alpha, and cyclin-dependent kinase 2) of pharmaceutical interest were selected. Three comparable poses (one correct pose and two decoys) for each ligand were then selected from the docking poses. To obtain the docking poses for the three target proteins, we used three different protocols, namely: normal docking, induced fit docking (IFD), and IFD against the homology model. Finally, five parallel MD equilibrium runs were performed on each pose for the statistical analysis. The results showed that the correct poses were generally more stable than the decoy poses under MD. The discrimination capability of MD depends on the strategy. The safest way was to judge a pose as being stable if any one run among five parallel runs was stable under MD. In this case, 95% of the correct poses were retained under MD, and about 25-44% of the decoys could be excluded by the simulations for all cases. On the other hand, if we judge a pose as being stable when any two or three runs were stable, with the risk of incorrectly excluding some correct poses, approximately 31-53% or 39-56% of the two decoys could be excluded by MD, respectively. Our results suggest that simple equilibrium simulations can serve as an effective filter to exclude decoy poses that cannot be distinguished by docking scores from the computationally expensive free-energy calculations.
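
    The "any one run / any two runs / any three runs" decision rules above reduce to a simple vote over the parallel MD outcomes. An illustrative sketch of that rule (the stability booleans would come from per-run RMSD analysis):

    ```python
    def keep_pose(run_stable, k=1):
        """Retain a docking pose if at least k of the parallel MD runs stayed
        stable (e.g. ligand RMSD below a cutoff throughout the trajectory)."""
        return sum(run_stable) >= k

    runs = [True, False, False, True, False]  # outcomes of 5 parallel runs

    keep_safe   = keep_pose(runs, k=1)  # permissive: retains most correct poses
    keep_strict = keep_pose(runs, k=3)  # strict: excludes more decoys
    ```

    Raising k excludes more decoys at the risk of discarding some correct poses, matching the trade-off reported in the abstract.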

  3. Spectral method for a kinetic swarming model

    DOE PAGES

    Gamba, Irene M.; Haack, Jeffrey R.; Motsch, Sebastien

    2015-04-28

    Here we present the first numerical method for a kinetic description of the Vicsek swarming model. The kinetic model poses a unique challenge, as there is a distribution dependent collision invariant to satisfy when computing the interaction term. We use a spectral representation linked with a discrete constrained optimization to compute these interactions. To test the numerical scheme we investigate the kinetic model at different scales and compare the solution with the microscopic and macroscopic descriptions of the Vicsek model. Lastly, we observe that the kinetic model captures key features such as vortex formation and traveling waves.
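
    The constraint described above — that the computed interaction term must respect a collision invariant — can be illustrated with a minimal discrete Lagrange-style projection (a sketch of the idea, not the paper's spectral scheme):

    ```python
    def project_to_invariant(Q, psi, weights):
        """Correct a computed interaction term Q so its weighted moment against
        the collision invariant psi vanishes: sum_i w_i * psi_i * Q_i = 0."""
        moment = sum(w * p * q for w, p, q in zip(weights, psi, Q))
        norm = sum(w * p * p for w, p in zip(weights, psi))
        return [q - (moment / norm) * p for q, p in zip(Q, psi)]

    Q   = [0.2, -0.1, 0.4, 0.3]     # raw interaction term on velocity nodes
    psi = [1.0, 1.0, 1.0, 1.0]      # invariant (mass-like, for illustration)
    w   = [0.25, 0.25, 0.25, 0.25]  # quadrature weights

    Q_fixed = project_to_invariant(Q, psi, w)
    check = sum(wi * pi * qi for wi, pi, qi in zip(w, psi, Q_fixed))  # ≈ 0
    ```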

  4. Color engineering in the age of digital convergence

    NASA Astrophysics Data System (ADS)

    MacDonald, Lindsay W.

    1998-09-01

    Digital color imaging has developed over the past twenty years from specialized scientific applications into the mainstream of computing. In addition to the phenomenal growth of computer processing power and storage capacity, great advances have been made in the capabilities and cost-effectiveness of color imaging peripherals. The majority of imaging applications, including the graphic arts, video and film have made the transition from analogue to digital production methods. Digital convergence of computing, communications and television now heralds new possibilities for multimedia publishing and mobile lifestyles. Color engineering, the application of color science to the design of imaging products, is an emerging discipline that poses exciting challenges to the international color imaging community for training, research and standards.

  5. Multi-object segmentation using coupled nonparametric shape and relative pose priors

    NASA Astrophysics Data System (ADS)

    Uzunbas, Mustafa Gökhan; Soldea, Octavian; Çetin, Müjdat; Ünal, Gözde; Erçil, Aytül; Unay, Devrim; Ekin, Ahmet; Firat, Zeynep

    2009-02-01

    We present a new method for multi-object segmentation in a maximum a posteriori estimation framework. Our method is motivated by the observation that neighboring or coupling objects in images generate configurations and co-dependencies which could potentially aid in segmentation if properly exploited. Our approach employs coupled shape and inter-shape pose priors that are computed using training images in a nonparametric multi-variate kernel density estimation framework. The coupled shape prior is obtained by estimating the joint shape distribution of multiple objects and the inter-shape pose priors are modeled via standard moments. Based on such statistical models, we formulate an optimization problem for segmentation, which we solve by an algorithm based on active contours. Our technique provides significant improvements in the segmentation of weakly contrasted objects in a number of applications. In particular for medical image analysis, we use our method to extract brain Basal Ganglia structures, which are members of a complex multi-object system posing a challenging segmentation problem. We also apply our technique to the problem of handwritten character segmentation. Finally, we use our method to segment cars in urban scenes.
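
    The nonparametric priors above are built by kernel density estimation over training examples. A minimal one-dimensional Parzen-window sketch of the idea (the actual method estimates joint densities over shapes and inter-shape pose moments):

    ```python
    import math

    def kde(x, samples, h=0.5):
        """Parzen-window density estimate with a Gaussian kernel of bandwidth h."""
        norm = 1.0 / (len(samples) * h * math.sqrt(2 * math.pi))
        return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

    # Training 'shape parameters' cluster around 1.0; the prior should favour
    # configurations near the training data over outliers.
    train = [0.9, 1.0, 1.1, 1.05, 0.95]
    p_near = kde(1.0, train)
    p_far  = kde(4.0, train)
    ```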

  6. Continuing challenges for computer-based neuropsychological tests.

    PubMed

    Letz, Richard

    2003-08-01

    A number of issues critical to the development of computer-based neuropsychological testing systems that remain continuing challenges to their widespread use in occupational and environmental health are reviewed. Several computer-based neuropsychological testing systems have been developed over the last 20 years, and they have contributed substantially to the study of neurologic effects of a number of environmental exposures. However, many are no longer supported and do not run on contemporary personal computer operating systems. Issues that are continuing challenges for development of computer-based neuropsychological tests in environmental and occupational health are discussed: (1) some current technological trends that generally make test development more difficult; (2) lack of availability of usable speech recognition of the type required for computer-based testing systems; (3) implementing computer-based procedures and tasks that are improvements over, not just adaptations of, their manually-administered predecessors; (4) implementing tests of a wider range of memory functions than the limited range now available; (5) paying more attention to motivational influences that affect the reliability and validity of computer-based measurements; and (6) increasing the usability of and audience for computer-based systems. Partial solutions to some of these challenges are offered. The challenges posed by current technological trends are substantial and generally beyond the control of testing system developers. Widespread acceptance of the "tablet PC" and implementation of accurate small vocabulary, discrete, speaker-independent speech recognition would enable revolutionary improvements to computer-based testing systems, particularly for testing memory functions not covered in existing systems. 
Dynamic, adaptive procedures, particularly ones based on item-response theory (IRT) and computerized-adaptive testing (CAT) methods, will be implemented in new tests that will be more efficient, reliable, and valid than existing test procedures. These additional developments, along with implementation of innovative reporting formats, are necessary for more widespread acceptance of the testing systems.
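
    The IRT/CAT procedures mentioned above select each next item to be maximally informative at the examinee's current ability estimate. Under the one-parameter (Rasch) model the Fisher information is p(1-p), which peaks where item difficulty matches ability — a minimal sketch:

    ```python
    import math

    def p_correct(theta, b):
        """Rasch (1PL) probability of a correct response: ability theta, difficulty b."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def item_information(theta, b):
        p = p_correct(theta, b)
        return p * (1.0 - p)  # Fisher information for the 1PL model

    def next_item(theta, difficulties):
        """Adaptive selection: administer the most informative remaining item."""
        return max(range(len(difficulties)),
                   key=lambda i: item_information(theta, difficulties[i]))

    bank = [-2.0, -1.0, 0.2, 1.5]   # hypothetical item difficulty parameters
    choice = next_item(0.0, bank)   # information peaks where b is nearest theta
    ```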

  7. The investigation and implementation of real-time face pose and direction estimation on mobile computing devices

    NASA Astrophysics Data System (ADS)

    Fu, Deqian; Gao, Lisheng; Jhang, Seong Tae

    2012-04-01

    Mobile computing devices have many limitations, such as a relatively small user interface and slow computing speed. Augmented reality applications typically require face pose estimation, which can serve as both a human-computer interaction (HCI) and an entertainment tool. For real-time implementation of head pose estimation on relatively resource-limited mobile platforms, various constraints must be satisfied while preserving sufficient pose estimation accuracy. The proposed face pose estimation method meets this objective. Experimental results on a test Android mobile device show satisfactory performance in both speed and accuracy.

  8. Progress on Intelligent Guidance and Control for Wind Shear Encounter

    NASA Technical Reports Server (NTRS)

    Stratton, D. Alexander

    1990-01-01

    Low altitude wind shear poses a serious threat to air safety. Avoiding severe wind shear challenges the ability of flight crews, as it involves assessing risk from uncertain evidence. A computerized intelligent cockpit aid can increase flight crew awareness of wind shear, improving avoidance decisions. The primary functions of a cockpit advisory expert system for wind shear avoidance are discussed. Also introduced are computational techniques being implemented to enable these primary functions.

  9. New strategy for protein interactions and application to structure-based drug design

    NASA Astrophysics Data System (ADS)

    Zou, Xiaoqin

    One of the greatest challenges in computational biophysics is to predict interactions between biological molecules, which play critical roles in biological processes and rational design of therapeutic drugs. Biomolecular interactions involve delicate interplay between multiple interactions, including electrostatic interactions, van der Waals interactions, solvent effect, and conformational entropic effect. Accurate determination of these complex and subtle interactions is challenging. Moreover, a biological molecule such as a protein usually consists of thousands of atoms, and thus occupies a huge conformational space. The large degrees of freedom pose further challenges for accurate prediction of biomolecular interactions. Here, I will present our development of physics-based theory and computational modeling on protein interactions with other molecules. The major strategy is to extract microscopic energetics from the information embedded in the experimentally-determined structures of protein complexes. I will also present applications of the methods to structure-based therapeutic design. Supported by NSF CAREER Award DBI-0953839, NIH R01GM109980, and the American Heart Association (Midwest Affiliate) [13GRNT16990076].
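
    A classic route for "extracting microscopic energetics" from experimentally determined complex structures is an inverse-Boltzmann (potential-of-mean-force) conversion of observed contact frequencies into scores; the following sketch is illustrative and not this group's exact formulation:

    ```python
    import math

    KT = 0.593  # kcal/mol at ~298 K

    def knowledge_based_energy(n_observed, n_reference):
        """Inverse-Boltzmann score: E = -kT ln(P_obs / P_ref).
        Contacts seen more often than in the reference state score favourably."""
        return -KT * math.log(n_observed / n_reference)

    e_favoured  = knowledge_based_energy(200, 100)  # enriched contact -> negative
    e_penalised = knowledge_based_energy(50, 100)   # depleted contact -> positive
    ```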

  10. US EPA - A*Star Partnership - Accelerating the Acceptance of ...

    EPA Pesticide Factsheets

    The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate and extrapolate experimental data, and rapid characterization and acceptance of these systems and models. The series of presentations will highlight a collaborative effort between the U.S. Environmental Protection Agency (EPA) and the Agency for Science, Technology and Research (A*STAR) that is focused on developing and applying experimental and computational models for predicting chemical-induced liver and kidney toxicity, brain angiogenesis, and blood-brain-barrier formation. In addressing some of these challenges, the U.S. EPA and A*STAR collaboration will provide a glimpse of what chemical risk assessments could look like in the 21st century. Presentation on US EPA – A*STAR Partnership at international symposium on Accelerating the acceptance of next-generation sciences and their application to regulatory risk assessment in Singapore.

  11. Aspects of Unstructured Grids and Finite-Volume Solvers for the Euler and Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    1992-01-01

    One of the major achievements in engineering science has been the development of computer algorithms for solving nonlinear differential equations such as the Navier-Stokes equations. In the past, limited computer resources have motivated the development of efficient numerical schemes in computational fluid dynamics (CFD) utilizing structured meshes. The use of structured meshes greatly simplifies the implementation of CFD algorithms on conventional computers. Unstructured grids on the other hand offer an alternative to modeling complex geometries. Unstructured meshes have irregular connectivity and usually contain combinations of triangles, quadrilaterals, tetrahedra, and hexahedra. The generation and use of unstructured grids poses new challenges in CFD. The purpose of this note is to present recent developments in the unstructured grid generation and flow solution technology.

  12. Force fields and scoring functions for carbohydrate simulation.

    PubMed

    Xiong, Xiuming; Chen, Zhaoqiang; Cossins, Benjamin P; Xu, Zhijian; Shao, Qiang; Ding, Kai; Zhu, Weiliang; Shi, Jiye

    2015-01-12

    Carbohydrate dynamics plays a vital role in many biological processes, but we are not currently able to probe it with experimental approaches. The highly flexible nature of carbohydrate structures differs in many respects from other biomolecules, posing significant challenges for studies employing computational simulation. Over past decades, computational study of carbohydrates has focused on the development of structure prediction methods, force field optimization, molecular dynamics simulation, and scoring functions for carbohydrate-protein interactions. Advances in carbohydrate force fields and scoring functions can be largely attributed to enhanced computational algorithms, the application of quantum mechanics, and the increasing number of experimental structures determined by X-ray and NMR techniques. Conformational analysis of carbohydrates is challenging and has been studied intensively to elucidate the anomeric, exo-anomeric, and gauche effects. Here, we review the issues associated with carbohydrate force fields and scoring functions, which will have broad application in the field of carbohydrate-based drug design. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. The promise and challenge of personalized medicine: aging populations, complex diseases, and unmet medical need

    PubMed Central

    Henney, Adriano M

    2012-01-01

    Abstract The concept of personalized medicine is not new. It is being discussed with increasing interest in the medical, scientific, and general media because of the availability of advanced scientific and computational technologies, and the promise of the potential to improve the targeting and delivery of novel medicines. It is also being seen as one approach that may have a beneficial impact on reducing health care budgets. But what are the challenges that need to be addressed in its implementation in the clinic? This article poses some provocative questions and suggests some things that need to be considered. PMID:22661132

  14. Role of neural networks for avionics

    NASA Astrophysics Data System (ADS)

    Bowman, Christopher L.; DeYong, Mark R.; Eskridge, Thomas C.

    1995-08-01

    Neural network (NN) architectures provide a thousand-fold speed-up in computational power per watt, along with the flexibility to learn and adapt so as to reduce software life-cycle costs. Thus NNs are poised to provide a key supporting role in meeting the avionics upgrade challenge for affordable, improved mission capability, especially near hardware where flexible and powerful smart processing is needed. This paper summarizes the trends for air combat and the resulting avionics needs. A paradigm for information fusion and response management is then described, from which viewpoint the role for NNs as a complementary technology in meeting these avionics challenges is explained, along with the key obstacles for NNs.

  15. A cross docking pipeline for improving pose prediction and virtual screening performance

    NASA Astrophysics Data System (ADS)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2018-01-01

    Pose prediction and virtual screening performance of a molecular docking method depend on the choice of protein structures used for docking. Multiple structures for a target protein are often used to take into account the receptor flexibility and problems associated with a single receptor structure. However, the use of multiple receptor structures is computationally expensive when docking a large library of small molecules. Here, we propose a new cross-docking pipeline suitable to dock a large library of molecules while taking advantage of multiple target protein structures. Our method involves the selection of a suitable receptor for each ligand in a screening library utilizing ligand 3D shape similarity with crystallographic ligands. We have prospectively evaluated our method in D3R Grand Challenge 2 and demonstrated that our cross-docking pipeline can achieve similar or better performance than using either single or multiple-receptor structures. Moreover, our method displayed not only decent pose prediction performance but also better virtual screening performance over several other methods.
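
    The receptor-selection step above matches each query ligand to the receptor structure whose crystallographic ligand it most resembles. A minimal sketch using Tanimoto similarity on feature sets as a stand-in for 3D shape comparison (receptor names and fingerprints are hypothetical):

    ```python
    def tanimoto(a, b):
        """Tanimoto coefficient between two feature sets (fingerprints)."""
        return len(a & b) / len(a | b)

    def pick_receptor(query_fp, crystal_ligand_fps):
        """Dock into the receptor whose bound crystal ligand best matches the query."""
        return max(crystal_ligand_fps,
                   key=lambda pdb: tanimoto(query_fp, crystal_ligand_fps[pdb]))

    crystal_ligand_fps = {          # hypothetical receptor structures
        "receptor_A": {1, 2, 3, 4},
        "receptor_B": {3, 4, 5, 6, 7},
    }
    best = pick_receptor({3, 4, 5, 6}, crystal_ligand_fps)
    ```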

  16. ATLAS and LHC computing on CRAY

    NASA Astrophysics Data System (ADS)

    Sciacca, F. G.; Haug, S.; ATLAS Collaboration

    2017-10-01

    Access and exploitation of large scale computing resources, such as those offered by general purpose HPC centres, is one important measure for ATLAS and the other Large Hadron Collider experiments in order to meet the challenge posed by the full exploitation of the future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems do not only offer very efficient hardware, cooling and highly competent operators, but also have large backfill potentials due to size and multidisciplinary usage and potential gains due to economy at scale. Technical solutions, performance, expected return and future plans are discussed.

  17. Parallelisation study of a three-dimensional environmental flow model

    NASA Astrophysics Data System (ADS)

    O'Donncha, Fearghal; Ragnoli, Emanuele; Suits, Frank

    2014-03-01

    There are many simulation codes in the geosciences that are serial and cannot take advantage of the parallel computational resources commonly available today. One model important for our work in coastal ocean current modelling is EFDC, a Fortran 77 code configured for optimal deployment on vector computers. In order to take advantage of our cache-based, blade computing system we restructured EFDC from serial to parallel, thereby allowing us to run existing models more quickly, and to simulate larger and more detailed models that were previously impractical. Since the source code for EFDC is extensive and involves detailed computation, it is important to do such a port in a manner that limits changes to the files, while achieving the desired speedup. We describe a parallelisation strategy involving surgical changes to the source files to minimise error-prone alteration of the underlying computations, while allowing load-balanced domain decomposition for efficient execution on a commodity cluster. The use of conjugate gradient posed particular challenges, since its implicit non-local communication hinders standard domain partitioning schemes; a number of techniques are discussed to address this in a feasible, computationally efficient manner. The parallel implementation demonstrates good scalability in combination with a novel domain partitioning scheme that specifically handles mixed water/land regions commonly found in coastal simulations. The approach presented here represents a practical methodology to rejuvenate legacy code on a commodity blade cluster with reasonable effort; our solution has direct application to other similar codes in the geosciences.
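
    The mixed water/land partitioning idea above can be sketched in one dimension: split grid columns into contiguous blocks holding roughly equal numbers of active (water) cells, rather than equal numbers of columns, so that land-heavy regions do not leave ranks idle. This is an illustrative balancing heuristic, not the paper's exact scheme.

    ```python
    def partition_columns(water_per_column, n_ranks):
        """Cut a column range into contiguous blocks with roughly equal counts
        of active (water) cells; returns the start columns of blocks 2..n."""
        total = sum(water_per_column)
        target = total / n_ranks
        cuts, running, rank = [], 0, 1
        for i, w in enumerate(water_per_column):
            running += w
            if running >= rank * target and rank < n_ranks:
                cuts.append(i + 1)  # column index where the next block starts
                rank += 1
        return cuts

    # Mostly-land columns (0 water cells) on the left, open water on the right.
    water = [0, 0, 1, 1, 8, 8, 8, 8]
    cuts = partition_columns(water, 2)  # naive halving would give 2 vs 32 cells
    ```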

  18. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a diverse selection of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards that emphasize replicability and reproducibility of research results.

  19. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, there are thousands of models available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a diverse selection of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models for intracellular signal transduction of neurons and glial cells, in addition to single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need for developing recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, for constructing multiscale models or extending the models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, enabling progressive building of advanced computational models and tools, can be achieved only through adopting publishing standards that emphasize replicability and reproducibility of research results. PMID:29765315

  20. A combined vision-inertial fusion approach for 6-DoF object pose estimation

    NASA Astrophysics Data System (ADS)

    Li, Juan; Bernardos, Ana M.; Tarrío, Paula; Casar, José R.

    2015-02-01

    The estimation of the 3D position and orientation of moving objects (`pose' estimation) is a critical process for many applications in robotics, computer vision and mobile services. Although major research efforts have been carried out to design accurate, fast and robust indoor pose estimation systems, providing a low-cost, easy-to-deploy and reliable solution remains an open challenge. Addressing this issue, this paper describes a hybrid approach for 6 degrees of freedom (6-DoF) pose estimation that fuses acceleration data and stereo vision to overcome the respective weaknesses of single-technology approaches. The system relies on COTS technologies (standard webcams, accelerometers) and printable colored markers. It uses a set of infrastructure cameras, positioned so that the object to be tracked is visible most of the operation time; the target object has to include an embedded accelerometer and be tagged with a fiducial marker. This simple marker has been designed for easy detection and segmentation, and it may be adapted to different service scenarios (in shape and colors). Experimental results show that the proposed system provides high accuracy while satisfactorily dealing with the real-time constraints.
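As a toy illustration of the fusion idea (not the paper's actual filter; all names are hypothetical), one can blend a vision-derived orientation, which is drift-free but intermittent when the marker is occluded, with an accelerometer-derived tilt computed from the gravity direction:

```python
import math

def accel_tilt(ax, ay, az):
    """Roll and pitch (radians) from the gravity vector measured
    by a static accelerometer."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def fuse(vision_angle, accel_angle, alpha=0.9):
    """Weighted blend of the two estimates: trust vision when the
    marker is visible, fall back to the inertial value when it is not."""
    if vision_angle is None:  # marker occluded this frame
        return accel_angle
    return alpha * vision_angle + (1 - alpha) * accel_angle
```

A full 6-DoF system would fuse complete rotation and translation states, but the complementary principle, combining a low-noise intermittent source with an always-available noisy one, is the same.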

  1. Implementing a High-Assurance Smart-Card OS

    NASA Astrophysics Data System (ADS)

    Karger, Paul A.; Toll, David C.; Palmer, Elaine R.; McIntosh, Suzanne K.; Weber, Samuel; Edwards, Jonathan W.

    Building a high-assurance, secure operating system for memory-constrained systems, such as smart cards, introduces many challenges. The increasing power of smart cards has made their use feasible in applications such as electronic passports, military and public-sector identification cards, and cell-phone-based financial and entertainment applications. Such applications require a secure environment, which can only be provided with sufficient hardware and a secure operating system. We argue that smart cards pose additional security challenges when compared to traditional computer platforms. We discuss our design for a secure smart-card operating system, named Caernarvon, and show that it addresses these challenges, which include secure application download, protection of cryptographic functions from malicious applications, resolution of covert channels, and assurance of both security and data integrity in the face of arbitrary power losses.

  2. Bird song: in vivo, in vitro, in silico

    NASA Astrophysics Data System (ADS)

    Mukherjee, Aryesh; Mandre, Shreyas; Mahadevan, Lakshminarayan

    2010-11-01

    Bird song, long an inspiration for artists, writers and poets, also poses challenges for scientists interested in dissecting the mechanisms underlying the neural, motor, learning and behavioral systems behind the beak and brain, as a way to recreate and synthesize it. We use a combination of quantitative visualization experiments with physical models and computational theories to understand the simplest aspects of these complex musical boxes, focusing on using controllable elastohydrodynamic interactions to mimic aural gestures and simple songs.

  3. Beyond Fourier

    NASA Astrophysics Data System (ADS)

    Hoch, Jeffrey C.

    2017-10-01

    Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development.

  4. WFIRST: Microlensing Analysis Data Challenge

    NASA Astrophysics Data System (ADS)

    Street, Rachel; WFIRST Microlensing Science Investigation Team

    2018-01-01

    WFIRST will produce thousands of high-cadence, high-photometric-precision lightcurves of microlensing events, from which a wealth of planetary and stellar systems will be discovered. However, the analysis of such lightcurves has historically been very time consuming and expensive in both labor and computing facilities. This poses a potential bottleneck to deriving the full science potential of the WFIRST mission. To address this problem, the WFIRST Microlensing Science Investigation Team is designing a series of data challenges to stimulate research into outstanding problems of microlensing analysis. These range from the classification and modeling of triple-lens events to methods to efficiently yet thoroughly search a high-dimensional parameter space for the best-fitting models.

  5. Advantages and Challenges of 10-Gbps Transmission on High-Density Interconnect Boards

    NASA Astrophysics Data System (ADS)

    Yee, Chang Fei; Jambek, Asral Bahari; Al-Hadi, Azremi Abdullah

    2016-06-01

    This paper provides a brief introduction to high-density interconnect (HDI) technology and its implementation on printed circuit boards (PCBs). The advantages and challenges of implementing 10-Gbps signal transmission on high-density interconnect boards are discussed in detail. The advantages (e.g., smaller via dimension and via stub removal) and challenges (e.g., crosstalk due to smaller interpair separation) of HDI are studied by analyzing the S-parameter, time-domain reflectometry (TDR), and transmission-line eye diagrams obtained by three-dimensional electromagnetic modeling (3DEM) and two-dimensional electromagnetic modeling (2DEM) using Mentor Graphics HyperLynx and Keysight Advanced Design System (ADS) electronic computer-aided design (ECAD) software. HDI outperforms conventional PCB technology in terms of signal integrity, but proper routing topology should be applied to overcome the challenge posed by crosstalk due to the tight spacing between traces.

  6. Multi-omics and metabolic modelling pipelines: challenges and tools for systems microbiology.

    PubMed

    Fondi, Marco; Liò, Pietro

    2015-02-01

    Integrated omics approaches are quickly spreading across microbiology research labs, leading to (i) the possibility of detecting previously hidden features of microbial cells like multi-scale spatial organization and (ii) tracing molecular components across multiple cellular functional states. This promises to reduce the knowledge gap between genotype and phenotype and poses new challenges for computational microbiologists. We underline how the capability to unravel the complexity of microbial life will strongly depend on the integration of the huge and diverse amount of information that can be derived today from omics experiments. In this work, we present opportunities and challenges of multi-omics data integration in current systems biology pipelines. We discuss which layers of biological information are important for biotechnological and clinical purposes, with a special focus on bacterial metabolism and modelling procedures. A general review of the most recent computational tools for performing large-scale dataset integration is also presented, together with a possible framework to guide the design of systems biology experiments by microbiologists. Copyright © 2015. Published by Elsevier GmbH.

  7. A scientific workflow framework for (13)C metabolic flux analysis.

    PubMed

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Automatic Parameterization Strategy for Cardiac Electrophysiology Simulations.

    PubMed

    Costa, Caroline Mendonca; Hoetzl, Elena; Rocha, Bernardo Martins; Prassl, Anton J; Plank, Gernot

    2013-10-01

    Driven by recent advances in medical imaging, image segmentation and numerical techniques, computer models of ventricular electrophysiology account for increasingly finer levels of anatomical and biophysical detail. However, considering the large number of model parameters involved, parameterization poses a major challenge. A minimum requirement in combined experimental and modeling studies is to achieve good agreement in activation and repolarization sequences between model and experiment or patient data. In this study, we propose basic techniques which aid in determining bidomain parameters to match activation sequences. An iterative parameterization algorithm is implemented which determines appropriate bulk conductivities that yield prescribed velocities. In addition, a method is proposed for splitting the computed bulk conductivities into individual bidomain conductivities by prescribing anisotropy ratios.
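A minimal sketch of such an iterative parameterization loop, assuming the approximate bidomain relation that conduction velocity scales with the square root of the bulk conductivity (all function names here are illustrative, not from the paper):

```python
import math

def tune_conductivity(v_target, measure_velocity, sigma0=1.0,
                      tol=1e-6, max_iter=50):
    """Rescale a bulk conductivity until the simulated conduction
    velocity matches the prescribed one, exploiting the approximate
    relation velocity ~ sqrt(conductivity)."""
    sigma = sigma0
    for _ in range(max_iter):
        v = measure_velocity(sigma)            # one forward simulation
        if abs(v - v_target) <= tol * v_target:
            break
        sigma *= (v_target / v) ** 2           # quadratic rescaling
    return sigma
```

Because the velocity-conductivity relation is close to a square root, the quadratic update converges in very few forward simulations when the relation holds exactly.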

  9. Parallel Processing of Images in Mobile Devices using BOINC

    NASA Astrophysics Data System (ADS)

    Curiel, Mariela; Calle, David F.; Santamaría, Alfredo S.; Suarez, David F.; Flórez, Leonardo

    2018-04-01

    Medical image processing helps health professionals make decisions for the diagnosis and treatment of patients. Since some algorithms for processing images require substantial amounts of resources, one could take advantage of distributed or parallel computing. A mobile grid can be an adequate computing infrastructure for this problem. A mobile grid is a grid that includes mobile devices as resource providers. In a previous step of this research, we selected BOINC as the infrastructure to build our mobile grid. However, parallel processing of images on mobile devices poses at least two important challenges: executing standard image-processing libraries and obtaining performance comparable to that of desktop computer grids. By the time we started our research, the use of BOINC on mobile devices also involved two issues: a) executing programs on mobile devices required modifying the code to insert calls to the BOINC API, and b) dividing the image among the mobile devices, as well as merging the results, required additional code in some BOINC components. This article presents answers to these four challenges.
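The image-division and merging step can be sketched as follows; this is an illustrative row-strip decomposition in Python, not the actual BOINC component code:

```python
def split_rows(height, n_workers):
    """Divide image rows into contiguous strips, one per device."""
    base, extra = divmod(height, n_workers)
    strips, start = [], 0
    for k in range(n_workers):
        stop = start + base + (1 if k < extra else 0)
        strips.append((start, stop))
        start = stop
    return strips

def merge(processed_strips):
    """Concatenate the per-strip results back into one image
    (images represented as lists of pixel rows)."""
    return [row for strip in processed_strips for row in strip]
```

For a pixel-wise operation, processing the strips independently and merging gives the same result as processing the whole image, which is what makes this decomposition attractive for a grid of devices.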

  10. Machine Learning in the Big Data Era: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas Rangan

    In this paper, we discuss the machine learning challenges of the Big Data era. We observe that recent innovations in being able to collect, access, organize, integrate, and query massive amounts of data from a wide variety of data sources have brought statistical machine learning under more scrutiny and evaluation for gleaning insights from the data than ever before. In that context, we pose and debate the question: are machine learning algorithms scaling with the ability to store and compute? If yes, how? If not, why not? We survey recent developments in the state of the art to discuss emerging and outstanding challenges in the design and implementation of machine learning algorithms at scale. We leverage experience from real-world Big Data knowledge discovery projects across the domains of national security and healthcare to suggest our efforts be focused along the following axes: (i) the data science challenge: designing scalable and flexible computational architectures for machine learning (beyond just data retrieval); (ii) the science of data challenge: the ability to understand characteristics of data before applying machine learning algorithms and tools; and (iii) the scalable predictive functions challenge: the ability to construct, learn and infer with increasing sample size, dimensionality, and categories of labels. We conclude with a discussion of opportunities and directions for future research.

  11. Multi-Directional Multi-Level Dual-Cross Patterns for Robust Face Recognition.

    PubMed

    Ding, Changxing; Choi, Jonghyun; Tao, Dacheng; Davis, Larry S

    2016-03-01

    To perform unconstrained face recognition robust to variations in illumination, pose and expression, this paper presents a new scheme to extract "Multi-Directional Multi-Level Dual-Cross Patterns" (MDML-DCPs) from face images. Specifically, the MDML-DCPs scheme exploits the first derivative of the Gaussian operator to reduce the impact of differences in illumination and then computes the DCP feature at both the holistic and component levels. DCP is a novel face image descriptor inspired by the unique textural structure of human faces. It is computationally efficient, only doubling the cost of computing local binary patterns, yet is extremely robust to pose and expression variations. MDML-DCPs comprehensively yet efficiently encodes the invariant characteristics of a face image from multiple levels into patterns that are highly discriminative of inter-personal differences but robust to intra-personal variations. Experimental results on the FERET, CAS-PEAL-R1, FRGC 2.0, and LFW databases indicate that DCP outperforms state-of-the-art local descriptors (e.g., LBP, LTP, LPQ, POEM, tLBP, and LGXP) for both face identification and face verification tasks. More impressively, the best performance is achieved on the challenging LFW and FRGC 2.0 databases by deploying MDML-DCPs in a simple recognition scheme.
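For reference, the local binary pattern (LBP) baseline against which DCP's cost is measured can be sketched in a few lines; this is the standard 8-neighbour LBP code of a 3x3 patch, not the MDML-DCPs descriptor itself:

```python
def lbp_code(patch):
    """8-neighbour local binary pattern code of a 3x3 patch:
    each neighbour whose value is >= the centre contributes one bit."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                  patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for bit, n in enumerate(neighbours):
        if n >= c:
            code |= 1 << bit
    return code
```

A full descriptor histograms these codes over image regions; DCP-style schemes similarly threshold local comparisons, but along cross-shaped sampling directions.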

  12. Accelerating Full Configuration Interaction Calculations for Nuclear Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chao; Sternberg, Philip; Maris, Pieter

    2008-04-14

    One of the emerging computational approaches in nuclear physics is the full configuration interaction (FCI) method for solving the many-body nuclear Hamiltonian in a sufficiently large single-particle basis space to obtain exact answers, either directly or by extrapolation. The lowest eigenvalues and corresponding eigenvectors of very large, sparse and unstructured nuclear Hamiltonian matrices are obtained and used to evaluate additional experimental quantities. These matrices pose a significant challenge to the design and implementation of efficient and scalable algorithms for obtaining solutions on massively parallel computer systems. In this paper, we describe the computational strategies employed in a state-of-the-art FCI code, MFDn (Many Fermion Dynamics - nuclear), as well as techniques we recently developed to enhance the computational efficiency of MFDn. We demonstrate the current capability of MFDn and report the latest performance improvements we have achieved. We also outline our future research directions.
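Large sparse eigensolvers of the kind such FCI codes rely on access the Hamiltonian only through matrix-vector products. A minimal sketch of that access pattern, using plain power iteration rather than the Lanczos-type methods production codes actually employ:

```python
import math

def power_iteration(matvec, n, iters=500):
    """Largest-magnitude eigenpair of an n x n operator, using only
    matrix-vector products -- the same access pattern that large sparse
    eigensolvers (e.g. Lanczos) are built around."""
    v = [1.0 / math.sqrt(n)] * n
    lam = 0.0
    for _ in range(iters):
        w = matvec(v)
        lam = sum(vi * wi for vi, wi in zip(v, w))  # Rayleigh quotient
        norm = math.sqrt(sum(wi * wi for wi in w))
        v = [wi / norm for wi in w]
    return lam, v
```

Because only `matvec` touches the matrix, the matrix itself can be distributed across thousands of nodes; the eigensolver never needs it assembled in one place.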

  13. Open Research Challenges with Big Data - A Data Scientist's Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R

    In this paper, we discuss the data-driven discovery challenges of the Big Data era. We observe that recent innovations in being able to collect, access, organize, integrate, and query massive amounts of data from a wide variety of data sources have brought statistical data mining and machine learning under more scrutiny and evaluation for gleaning insights from the data than ever before. In that context, we pose and debate the question: are data mining algorithms scaling with the ability to store and compute? If yes, how? If not, why not? We survey recent developments in the state of the art to discuss emerging and outstanding challenges in the design and implementation of machine learning algorithms at scale. We leverage experience from real-world Big Data knowledge discovery projects across the domains of national security, healthcare and manufacturing to suggest our efforts be focused along the following axes: (i) the data science challenge: designing scalable and flexible computational architectures for machine learning (beyond just data retrieval); (ii) the science of data challenge: the ability to understand characteristics of data before applying machine learning algorithms and tools; and (iii) the scalable predictive functions challenge: the ability to construct, learn and infer with increasing sample size, dimensionality, and categories of labels. We conclude with a discussion of opportunities and directions for future research.

  14. Advancing Capabilities for Understanding the Earth System Through Intelligent Systems, the NSF Perspective

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Zanzerkia, E. E.; Munoz-Avila, H.

    2015-12-01

    The National Science Foundation (NSF) Directorate for Geosciences (GEO) and Directorate for Computer and Information Science and Engineering (CISE) acknowledge the significant scientific challenges involved in understanding the fundamental processes of the Earth system, within the atmospheric and geospace, Earth, ocean and polar sciences, and across those boundaries. A broad view of the opportunities and directions for GEO is described in the report "Dynamic Earth: GEO imperative and Frontiers 2015-2020." Many aspects of geosciences research, highlighted both in this document and in other community grand challenges, pose novel problems for researchers in intelligent systems. Geosciences research will require solutions for data-intensive science, advanced computational capabilities, and transformative concepts for visualizing, using, analyzing and understanding geoscience phenomena and data. Opportunities for the scientific community to engage in addressing these challenges are available and being developed through NSF's portfolio of investments and activities. The NSF-wide initiative Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21) seeks to accelerate research and education through new capabilities in data, computation, software and other aspects of cyberinfrastructure. EarthCube, a joint program between GEO and the Advanced Cyberinfrastructure Division, aims to create a well-connected and facile environment to share data and knowledge in an open, transparent, and inclusive manner, thus accelerating our ability to understand and predict the Earth system. EarthCube's mission opens an opportunity for collaborative research on novel information systems enhancing and supporting geosciences research efforts. NSF encourages true, collaborative partnerships between scientists in computer science and the geosciences to meet these challenges.

  15. Face pose tracking using the four-point algorithm

    NASA Astrophysics Data System (ADS)

    Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen

    2017-06-01

    In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.

  16. Structural Interface Parameters Are Discriminatory in Recognising Near-Native Poses of Protein-Protein Interactions

    PubMed Central

    Malhotra, Sony; Sankar, Kannan; Sowdhamini, Ramanathan

    2014-01-01

    Interactions at the molecular level in the cellular environment play a crucial role in maintaining the physiological functioning of the cell. These molecular interactions exist at varied levels, viz. protein-protein, protein-nucleic acid and protein-small molecule interactions, and their mechanisms are presently intensively studied. Molecular interactions can also be studied computationally using the approach known as molecular docking. Molecular docking employs search algorithms to predict the possible conformations of interacting partners and then calculates interaction energies. However, docking proposes a number of solutions as different docked poses and hence presents a serious challenge: identifying the native (or near-native) structures from the pool of these docked poses. Here, we propose a rigorous scoring scheme called DockScore which can be used to rank the docked poses and identify the best docked pose among the many proposed by the docking algorithm employed. The scoring identifies the optimal interactions between the two protein partners utilising various features of the putative interface: area, short contacts, conservation, spatial clustering and the presence of positively charged and hydrophobic residues. DockScore was first trained on a set of 30 protein-protein complexes to determine the weights for the different parameters. Subsequently, we tested the scoring scheme on 30 different protein-protein complexes, and the native or near-native structure was assigned the top rank from a pool of docked poses in 26 of the tested cases. We tested the ability of DockScore to discriminate likely dimer interactions that differ substantially within a homologous family and also demonstrate that DockScore can distinguish the correct pose for all 10 recent CAPRI targets. PMID:24498255

  17. Structural interface parameters are discriminatory in recognising near-native poses of protein-protein interactions.

    PubMed

    Malhotra, Sony; Sankar, Kannan; Sowdhamini, Ramanathan

    2014-01-01

    Interactions at the molecular level in the cellular environment play a crucial role in maintaining the physiological functioning of the cell. These molecular interactions exist at varied levels, viz. protein-protein, protein-nucleic acid and protein-small molecule interactions, and their mechanisms are presently intensively studied. Molecular interactions can also be studied computationally using the approach known as molecular docking. Molecular docking employs search algorithms to predict the possible conformations of interacting partners and then calculates interaction energies. However, docking proposes a number of solutions as different docked poses and hence presents a serious challenge: identifying the native (or near-native) structures from the pool of these docked poses. Here, we propose a rigorous scoring scheme called DockScore which can be used to rank the docked poses and identify the best docked pose among the many proposed by the docking algorithm employed. The scoring identifies the optimal interactions between the two protein partners utilising various features of the putative interface: area, short contacts, conservation, spatial clustering and the presence of positively charged and hydrophobic residues. DockScore was first trained on a set of 30 protein-protein complexes to determine the weights for the different parameters. Subsequently, we tested the scoring scheme on 30 different protein-protein complexes, and the native or near-native structure was assigned the top rank from a pool of docked poses in 26 of the tested cases. We tested the ability of DockScore to discriminate likely dimer interactions that differ substantially within a homologous family and also demonstrate that DockScore can distinguish the correct pose for all 10 recent CAPRI targets.
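The ranking step described above amounts to a weighted sum of interface descriptors. A hypothetical sketch of such a scheme (feature names and weights are illustrative, not DockScore's trained values):

```python
def interface_score(features, weights):
    """Weighted sum of interface descriptors for one docked pose.
    Feature names and weights here are illustrative only."""
    return sum(weights[name] * value for name, value in features.items())

def rank_poses(poses, weights):
    """Return poses ordered best-first by their interface score."""
    return sorted(poses,
                  key=lambda p: interface_score(p["features"], weights),
                  reverse=True)
```

In a trained scheme, the weights would be fitted on complexes with known native structures so that the native-like pose receives the top rank.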

  18. Atomic Detail Visualization of Photosynthetic Membranes with GPU-Accelerated Ray Tracing

    PubMed Central

    Vandivort, Kirby L.; Barragan, Angela; Singharoy, Abhishek; Teo, Ivan; Ribeiro, João V.; Isralewitz, Barry; Liu, Bo; Goh, Boon Chong; Phillips, James C.; MacGregor-Chatwin, Craig; Johnson, Matthew P.; Kourkoutis, Lena F.; Hunter, C. Neil

    2016-01-01

    The cellular process responsible for providing energy for most life on Earth, namely photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers. PMID:27274603

  19. Algorithmic aspects for the reconstruction of spatio-spectral data cubes in the perspective of the SKA

    NASA Astrophysics Data System (ADS)

    Mary, D.; Ferrari, A.; Ferrari, C.; Deguignet, J.; Vannier, M.

    2016-12-01

    With millions of receivers leading to terabyte-scale data cubes, the story of the giant SKA telescope is also that of collaborative efforts across radioastronomy, signal processing, optimization and computer science. Reconstructing SKA cubes poses two challenges. First, the majority of existing algorithms work in 2D and cannot be directly translated into 3D. Second, the reconstruction implies solving an inverse problem, and it is not clear what ultimate limit we can expect on the error of this solution. This study addresses, albeit partially, both challenges. We consider an extremely simple data acquisition model, and we focus on strategies making it possible to implement 3D reconstruction algorithms that use state-of-the-art image/spectral regularization. The proposed approach has two main features: (i) reduced memory storage with respect to a previous approach; (ii) efficient parallelization and distribution of the computational load over the spectral bands. This work will allow us to implement and compare various 3D reconstruction approaches in a large-scale framework.

  20. Systems engineering for very large systems

    NASA Technical Reports Server (NTRS)

    Lewkowicz, Paul E.

    1993-01-01

    Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.

  2. Adding question answering to an e-tutor for programming languages

    NASA Astrophysics Data System (ADS)

    Taylor, Kate; Moore, Simon

    Control over a closed domain of textual material removes many question answering issues, as does an ontology that is closely intertwined with its sources. This pragmatic, shallow approach to many challenging areas of research in adaptive hypermedia, question answering, intelligent tutoring and human-computer interaction has been put into practice at Cambridge in the Computer Science undergraduate course to teach the hardware description language Verilog. This language itself poses many challenges as it crosses the interdisciplinary boundary between hardware and software engineers, giving rise to several human ontologies as well as the programming language itself. We present further results from our formal and informal surveys. We look at further work to increase the dialogue between student and tutor and export our knowledge to the Semantic Web.

  3. Security Planning and Policies to Meet the Challenges of Climate Change

    DTIC Science & Technology

    2010-07-01

    Security Climate change poses challenges to societies and governments that go far beyond the alteration of our environment. The physical impacts of...capacity of governments to respond. In this sense, the growing likelihood of events such as mass migrations, crop failures, economic shocks, public...riots and violence, floods and other natural disasters, widespread epidemics, and competition for resources pose serious challenges for governments and

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harms, Kevin; Oral, H. Sarp; Atchley, Scott

    The Oak Ridge and Argonne Leadership Computing Facilities are both receiving new systems under the Collaboration of Oak Ridge, Argonne, and Livermore (CORAL) program. Because they are both part of the INCITE program, applications need to be portable between these two facilities. However, the Summit and Aurora systems will be vastly different architectures, including their I/O subsystems. While both systems will have POSIX-compliant parallel file systems, their Burst Buffer technologies will be different. This difference may pose challenges to application portability between facilities. Application developers need to pay attention to specific burst buffer implementations to maximize code portability.

  5. A novel smart lighting clinical testbed.

    PubMed

    Gleason, Joseph D; Oishi, Meeko; Simkulet, Michelle; Tuzikas, Arunas; Brown, Lee K; Brueck, S R J; Karlicek, Robert F

    2017-07-01

    A real-time, feedback-capable, variable spectrum lighting system was recently installed at the University of New Mexico Hospital to facilitate biomedical research on the health impacts of lighting. The system consists of variable spectrum troffers, color sensors, occupancy sensors, and computing and communication infrastructure, and is the only such clinical facility in the US. The clinical environment posed special challenges for installation as well as for ongoing maintenance and operations. Pilot studies are currently underway to evaluate the effectiveness of the system to regulate circadian phase in subjects with delayed sleep-wake phase disorder.

  6. Beyond Fourier.

    PubMed

    Hoch, Jeffrey C

    2017-10-01

    Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development.

  7. Imaging of thymic disorders

    PubMed Central

    Bogot, Naama R; Quint, Leslie E

    2005-01-01

    Evaluation of the thymus poses a challenge to the radiologist. In addition to age-related changes in thymic size, shape, and tissue composition, there is considerable variability in the normal adult thymic appearance within any age group. Many different types of disorders may affect the thymus, including hyperplasia, cysts, and benign and malignant neoplasms, both primary and secondary; clinical and imaging findings typical for each disease process are described in this article. Whereas computed tomography is the mainstay for imaging the thymus, other imaging modalities may occasionally provide additional structural or functional information. PMID:16361143

  8. Human body segmentation via data-driven graph cut.

    PubMed

    Li, Shifeng; Lu, Huchuan; Shao, Xingqing

    2014-11-01

    Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior knowledge learning, with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is first to exploit human kinematics to search for body part candidates via dynamic programming, yielding high-level evidence. Then, body-part classifiers provide bottom-up cues of the human body distribution as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experimental results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.
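
    The abstract mentions a dynamic-programming search for body part candidates. The sketch below (assumed costs, not the paper's model) shows the generic form of such a search: one candidate location is chosen per body part along a kinematic chain, minimizing a part-classifier cost plus a pairwise kinematic-consistency cost, Viterbi-style:

```python
# Generic chain dynamic program: pick one candidate per part so that
# unary (classifier) cost plus pairwise (kinematic) cost is minimal.
def best_part_chain(unary, pair_cost):
    """unary[p][c]: cost of candidate c for part p;
    pair_cost(p, c_prev, c): cost of linking consecutive parts."""
    n = len(unary)
    cost = [list(unary[0])]          # running best cost per candidate
    back = []                        # backpointers, one row per link
    for p in range(1, n):
        row, ptr = [], []
        for c in range(len(unary[p])):
            best = min(range(len(unary[p - 1])),
                       key=lambda k: cost[-1][k] + pair_cost(p, k, c))
            row.append(cost[-1][best] + pair_cost(p, best, c) + unary[p][c])
            ptr.append(best)
        cost.append(row)
        back.append(ptr)
    # Backtrack the optimal candidate assignment.
    c = min(range(len(cost[-1])), key=cost[-1].__getitem__)
    chain = [c]
    for ptr in reversed(back):
        c = ptr[c]
        chain.append(c)
    return chain[::-1]
```

    In the paper's setting, the candidates would be body-part hypotheses scored by the classifiers, and the selected chain supplies the top-down evidence fed into the graph cut.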

  9. Prospective evaluation of shape similarity based pose prediction method in D3R Grand Challenge 2015

    NASA Astrophysics Data System (ADS)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2016-09-01

    Evaluation of ligand three-dimensional (3D) shape similarity is one of the commonly used approaches to identify ligands similar to one or more known active compounds from a library of small molecules. Apart from using ligand shape similarity as a virtual screening tool, its role in pose prediction and pose scoring has also been reported. We have recently developed a method that utilizes ligand 3D shape similarity with known crystallographic ligands to predict binding poses of query ligands. Here, we report the prospective evaluation of our pose prediction method through participation in the Drug Design Data Resource (D3R) Grand Challenge 2015. Our pose prediction method was used to predict binding poses of heat shock protein 90 (HSP90) and mitogen-activated protein kinase kinase kinase kinase (MAP4K4) ligands, and it was able to predict the pose within 2 Å root mean square deviation (RMSD), either as the top pose or among the best of five poses, in a majority of cases. Specifically, for the HSP90 protein, median RMSDs of 0.73 and 0.68 Å were obtained for the top and the best-of-five predictions, respectively. For the MAP4K4 target, although the median RMSD for our top prediction was 2.87 Å, the median RMSD of 1.67 Å for the best of five predictions was well within the limit for successful prediction. Furthermore, the performance of our pose prediction method for HSP90 and MAP4K4 ligands was always among the top five groups. In particular, for the MAP4K4 protein our pose prediction method was ranked number one both in terms of mean and median RMSD when the best of five predictions were considered. Overall, our D3R Grand Challenge 2015 results demonstrated that ligand 3D shape similarity with the crystal ligand is sufficient to predict binding poses of new ligands with acceptable accuracy.
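
    The 2 Å success criterion used above is easy to state in code. A minimal sketch (coordinates in Å, atom correspondences assumed already established; real pose evaluations typically also handle ligand symmetry, which is omitted here):

```python
import math

# RMSD between predicted and crystallographic ligand coordinates, with
# the pose counted as successful when it falls within a cutoff (2 A).
def rmsd(pred, ref):
    assert len(pred) == len(ref)
    sq = sum((px - rx) ** 2 + (py - ry) ** 2 + (pz - rz) ** 2
             for (px, py, pz), (rx, ry, rz) in zip(pred, ref))
    return math.sqrt(sq / len(pred))

def pose_success(pred, ref, cutoff=2.0):
    """A prediction is successful if its RMSD from the crystal pose
    is within the cutoff."""
    return rmsd(pred, ref) <= cutoff
```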

  10. Endodontic management of maxillary third molar with MB2 (Vertucci type IV) canal configuration diagnosed with Cone Beam Computed Tomography - a case report.

    PubMed

    Jain, Pradeep; Patni, Pallav; Yogesh, Pant; Anup, Vyas

    2017-01-01

    The endodontic treatment of a maxillary third molar often poses a challenge even to an experienced endodontist because of its posterior-most location in the dental arch, aberrant occlusal anatomy, abnormal root canal configuration and eruption patterns. Owing to these anatomical limitations, extraction remains the treatment of choice for many clinicians. As we know, retaining every functional component of the dental arch is of prime importance in contemporary dental practice. This clinical case report discusses the endodontic treatment of a maxillary third molar with an MB2 root canal separated throughout its length and exiting at two separate apical foramina (Vertucci type IV), diagnosed with Cone Beam Computed Tomography (CBCT).

  11. Automatic Parameterization Strategy for Cardiac Electrophysiology Simulations

    PubMed Central

    Costa, Caroline Mendonca; Hoetzl, Elena; Rocha, Bernardo Martins; Prassl, Anton J; Plank, Gernot

    2014-01-01

    Driven by recent advances in medical imaging, image segmentation and numerical techniques, computer models of ventricular electrophysiology account for increasingly finer levels of anatomical and biophysical detail. However, considering the large number of model parameters involved, parameterization poses a major challenge. A minimum requirement in combined experimental and modeling studies is to achieve good agreement in activation and repolarization sequences between model and experiment or patient data. In this study, we propose basic techniques which aid in determining bidomain parameters to match activation sequences. An iterative parameterization algorithm is implemented which determines appropriate bulk conductivities that yield prescribed velocities. In addition, a method is proposed for splitting the computed bulk conductivities into individual bidomain conductivities by prescribing anisotropy ratios. PMID:24729986
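
    The iterative idea can be illustrated with a toy model: in simple cable models, conduction velocity scales roughly as the square root of conductivity, so the update g ← g·(v_target/v(g))² drives the simulated velocity to the prescribed one. The "simulator" below is a stand-in, not a bidomain solver, and all numbers are illustrative:

```python
import math

# Toy fixed-point parameterization: tune a bulk conductivity g until a
# (stand-in) simulator reproduces a prescribed conduction velocity.
def simulate_velocity(g, c=0.5):
    """Stand-in for a bidomain simulation: v ~ c * sqrt(g)."""
    return c * math.sqrt(g)

def tune_conductivity(v_target, g=1.0, tol=1e-6, max_iter=50):
    for _ in range(max_iter):
        v = simulate_velocity(g)
        if abs(v - v_target) < tol:
            break
        g *= (v_target / v) ** 2   # exact in one step for the toy model
    return g
```

    With a real simulator the velocity/conductivity relation is only approximately square-root, so the update is repeated until the prescribed velocity is matched within tolerance.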

  12. Asynchronous sampled-data approach for event-triggered systems

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Memon, Azhar M.

    2017-11-01

    While aperiodically triggered network control systems save a considerable amount of communication bandwidth, they also pose challenges such as coupling between control and event-condition design, optimisation of the available resources such as control, communication and computation power, and time-delays due to computation and communication network. With this motivation, the paper presents separate designs of control and event-triggering mechanism, thus simplifying the overall analysis, asynchronous linear quadratic Gaussian controller which tackles delays and aperiodic nature of transmissions, and a novel event mechanism which compares the cost of the aperiodic system against a reference periodic implementation. The proposed scheme is simulated on a linearised wind turbine model for pitch angle control and the results show significant improvement against the periodic counterpart.
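
    For illustration, a minimal event-triggering mechanism (a plain norm-threshold rule, simpler than the paper's cost comparison against a periodic reference implementation) can be sketched as follows; the threshold is an assumed design parameter:

```python
# Norm-based event trigger: the sensor transmits only when the state has
# drifted from the last transmitted value by more than a threshold,
# saving communication bandwidth between transmissions.
def run_event_triggered(states, threshold):
    """Return the time indices at which a transmission is triggered."""
    sent_value = states[0]
    events = [0]                      # initial transmission
    for k, x in enumerate(states[1:], start=1):
        if abs(x - sent_value) > threshold:
            sent_value = x
            events.append(k)
    return events
```

    A periodic implementation would transmit at every index; the trigger above transmits only when the deviation warrants it, which is the bandwidth saving the abstract refers to.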

  13. Toward Interactive Scenario Analysis and Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gayle, Thomas R.; Summers, Kenneth Lee; Jungels, John

    2015-01-01

    As Modeling and Simulation (M&S) tools have matured, their applicability and importance have increased across many national security challenges. In particular, they provide a way to test how something may behave without the need to do real-world testing. However, current and future changes across several factors including capabilities, policy, and funding are driving a need for rapid response or evaluation in ways that many M&S tools cannot address. Issues around large data, computational requirements, delivery mechanisms, and analyst involvement already exist and pose significant challenges. Furthermore, rising expectations, rising input complexity, and increasing depth of analysis will only increase the difficulty of these challenges. In this study we examine whether innovations in M&S software coupled with advances in "cloud" computing and "big-data" methodologies can overcome many of these challenges. In particular, we propose a simple, horizontally-scalable distributed computing environment that could provide the foundation (i.e. "cloud") for next-generation M&S-based applications based on the notion of "parallel multi-simulation". In our context, the goal of parallel multi-simulation is to consider as many simultaneous paths of execution as possible. Therefore, with sufficient resources, the complexity is dominated by the cost of single scenario runs as opposed to the number of runs required. We show the feasibility of this architecture through a stable prototype implementation coupled with the Umbra Simulation Framework [6]. Finally, we highlight the utility through multiple novel analysis tools and by showing the performance improvement compared to existing tools.

  14. Mining the Quantified Self: Personal Knowledge Discovery as a Challenge for Data Science.

    PubMed

    Fawcett, Tom

    2015-12-01

    The last several years have seen an explosion of interest in wearable computing, personal tracking devices, and the so-called quantified self (QS) movement. Quantified self involves ordinary people recording and analyzing numerous aspects of their lives to understand and improve themselves. This is now a mainstream phenomenon, attracting a great deal of attention, participation, and funding. As more people are attracted to the movement, companies are offering various new platforms (hardware and software) that allow ever more aspects of daily life to be tracked. Nearly every aspect of the QS ecosystem is advancing rapidly, except for analytic capabilities, which remain surprisingly primitive. With increasing numbers of quantified self participants collecting ever greater amounts and types of data, many people literally have more data than they know what to do with. This article reviews the opportunities and challenges posed by the QS movement. Data science provides well-tested techniques for knowledge discovery. But making these useful for the QS domain poses unique challenges that derive from the characteristics of the data collected as well as the specific types of actionable insights that people want from the data. Using a small sample of QS time series data containing information about personal health, we provide a formulation of the QS problem that connects data to the decisions of interest to the user.
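
    As a concrete, if simplistic, example of the kind of analysis QS participants might want, the sketch below flags days in a personal time series (say, daily resting heart rate) that deviate from a trailing-window baseline; the window and threshold are illustrative choices, not recommendations:

```python
# Flag indices whose value deviates from the trailing-window mean by
# more than a fraction `frac` of that baseline.
def flag_anomalies(series, window=7, frac=0.20):
    flags = []
    for i in range(window, len(series)):
        baseline = sum(series[i - window:i]) / window
        if abs(series[i] - baseline) > frac * baseline:
            flags.append(i)
    return flags
```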

  15. Sustainable Materials Management (SMM) Web Academy Webinar: An Introduction to Lithium Batteries and the Challenges that they Pose to the Waste and Recycling Industry

    EPA Pesticide Factsheets

    This is a webinar page for the Sustainable Management of Materials (SMM) Web Academy webinar titled, An Introduction to Lithium Batteries and the Challenges that they Pose to the Waste and Recycling Industry.

  16. Career Counseling and the Information Highway: Heeding the Road Signs.

    ERIC Educational Resources Information Center

    O'Halloran, Theresa M.; Fahr, Alicia V.; Keller, Jenny R.

    2002-01-01

    Traveling the "information highway" in the process of career counseling, or providing career counseling services via the Internet, poses additional challenges for counselors. The authors use current ethical guidelines to guide discussion of, and possible resolutions to, challenges posed by incorporating the Internet into career counseling. (Contains…

  17. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. We therefore developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is built on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosted cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with spatial properties.
    We tested the performance of the platform with a taxi trajectory analysis. Results suggested that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
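
    A sketch of the kind of aggregation behind such a taxi-trajectory benchmark (plain Python, not GISpark's API): hash GPS points into fixed-size grid cells and count points per cell. Each data partition can be counted independently and the partial counts merged, which is exactly what makes the job distribute well across a Spark-style cluster:

```python
import math
from collections import Counter

# Grid-cell point counting: a per-partition map step (count cells in
# one partition) followed by a reduce step (merge the Counters).
def cell_of(lon, lat, size=0.01):
    """Map a coordinate to its (col, row) grid cell of width `size` deg."""
    return (math.floor(lon / size), math.floor(lat / size))

def count_cells(partitions, size=0.01):
    total = Counter()
    for part in partitions:                 # in a cluster: one task each
        total.update(Counter(cell_of(lon, lat, size) for lon, lat in part))
    return total
```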

  18. Nanoinformatics knowledge infrastructures: bringing efficient information management to nanomedical research.

    PubMed

    de la Iglesia, D; Cachau, R E; García-Remesal, M; Maojo, V

    2013-11-27

    Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts.

  20. Bio-inspired nano tools for neuroscience.

    PubMed

    Das, Suradip; Carnicer-Lombarte, Alejandro; Fawcett, James W; Bora, Utpal

    2016-07-01

    Research and treatment in the nervous system are challenged by many physiological barriers, posing a major hurdle for neurologists. The CNS is protected by a formidable blood-brain barrier (BBB), which limits surgical, therapeutic and diagnostic interventions. The hostile environment created by reactive astrocytes in the CNS, along with the limited regeneration capacity of the PNS, makes functional recovery after tissue damage difficult and inefficient. Nanomaterials have the unique ability to interface with neural tissue at the nanoscale and are capable of influencing the function of a single neuron. The ability of nanoparticles to transcend the BBB through surface modifications has been exploited in various neuro-imaging techniques and for targeted drug delivery. The tunable topography of nanofibers provides accurate spatio-temporal guidance to regenerating axons. This review is an attempt to comprehend the progress in understanding the obstacles posed by the complex physiology of the nervous system, and the innovations in design and fabrication of advanced nanomaterials drawing inspiration from natural phenomena. We also discuss the development of nanomaterials for use in neuro-diagnostics and neuro-therapy, and the fabrication of advanced nano-devices for use in opto-electronic and ultrasensitive electrophysiological applications. The energy-efficient and parallel computing ability of the human brain has inspired the design of advanced nanotechnology-based computational systems. However, extensive use of nanomaterials in neuroscience also raises serious toxicity issues as well as ethical concerns regarding nano implants in the brain. In conclusion, we summarize these challenges and provide an insight into the huge potential of nanotechnology platforms in neuroscience.

  1. Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, John E.; Sener, Melih; Vandivort, Kirby L.

    The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. In this paper, we present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. Finally, we describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.

  3. Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing

    DOE PAGES

    Stone, John E.; Sener, Melih; Vandivort, Kirby L.; ...

    2015-12-12

    The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. In this paper, we present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. Finally, we describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.

  4. Large Eddy Simulation of Ducted Propulsors in Crashback

    NASA Astrophysics Data System (ADS)

    Jang, Hyunchul; Mahesh, Krishnan

    2009-11-01

    Flow around a ducted marine propulsor is computed using the large eddy simulation methodology under crashback conditions. Crashback is an operating condition where a propulsor rotates in the reverse direction while the vessel moves in the forward direction. It is characterized by massive flow separation and highly unsteady propeller loads, which affect both blade life and maneuverability. The simulations are performed on unstructured grids using the discrete kinetic energy conserving algorithm developed by Mahesh et al. (2004, J. Comput. Phys. 197). Numerical challenges posed by sharp blade edges and small blade tip clearances are discussed. The flow is computed at the advance ratio J=-0.7 and Reynolds number Re=480,000 based on the propeller diameter. Average and RMS values of the unsteady loads such as thrust, torque, and side force on the blades and duct are compared to experiment, and the effect of the duct on crashback is discussed.
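
    The quoted operating point can be unpacked from the standard definitions J = U/(nD) and Re = UD/ν. The diameter and fluid viscosity below are assumed values for illustration (water, D = 0.25 m), not taken from the paper:

```python
# Worked check of the quoted operating point. Definitions: J = U/(n*D),
# Re = U*D/nu. Diameter and viscosity are assumed for illustration.
nu = 1.0e-6        # m^2/s, kinematic viscosity of water
D = 0.25           # m, assumed propeller diameter
Re = 480_000
J = -0.7           # negative: the propeller is reversed in crashback

U = Re * nu / D                  # advance speed implied by Re
n = U / (abs(J) * D)             # rotation rate (rev/s) implied by J
print(round(U, 3), round(n, 2))  # 1.92 10.97
```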

  5. Bed Bug Epidemic: A Challenge to Public Health

    ERIC Educational Resources Information Center

    Ratnapradipa, Dhitinut; Ritzel, Dale O.; Haramis, Linn D.; Bliss, Kadi R.

    2011-01-01

    In recent years, reported cases of bed bug infestations in the U.S. and throughout the world have escalated dramatically, posing a global public health problem. Although bed bugs are not known to transmit disease to humans, they pose both direct and indirect public health challenges in terms of health effects, treatment, cost, and resource…

  6. POSE Algorithms for Automated Docking

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Howard, Richard T.

    2011-01-01

    POSE (relative position and attitude) can be computed in many different ways. Given a sensor that measures bearing to a finite number of spots corresponding to known features (such as a target) of a spacecraft, a number of different algorithms can be used to compute the POSE. NASA has sponsored the development of a flash LIDAR proximity sensor called the Vision Navigation Sensor (VNS) for use by the Orion capsule in future docking missions. This sensor generates data that can be used by a variety of algorithms to compute POSE solutions inside of 15 meters, including at the critical docking range of approximately 1-2 meters. Previously NASA participated in a DARPA program called Orbital Express that achieved the first automated docking for the American space program. During this mission a large set of high quality mated sensor data was obtained at what is essentially the docking distance. This data set is perhaps the most accurate truth data in existence for docking proximity sensors in orbit. In this paper, the flight data from Orbital Express is used to test POSE algorithms at 1.22 meters range. Two different POSE algorithms are tested for two different Fields-of-View (FOVs) and two different pixel noise levels. The results of the analysis are used to predict future performance of the POSE algorithms with VNS data.
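
    As a simplified illustration of computing relative pose from matched target features (a planar Procrustes fit, not one of the flight algorithms evaluated in the paper), rotation and translation follow in closed form from the cross/dot sums of centered point pairs:

```python
import math

# Planar pose estimate: given matched 2D feature points on the target
# model and their observed locations, recover the rotation angle and
# translation by a least-squares Procrustes fit.
def pose_2d(model, observed):
    n = len(model)
    mx = sum(p[0] for p in model) / n
    my = sum(p[1] for p in model) / n
    ox = sum(p[0] for p in observed) / n
    oy = sum(p[1] for p in observed) / n
    s_cross = s_dot = 0.0
    for (ax, ay), (bx, by) in zip(model, observed):
        ax, ay, bx, by = ax - mx, ay - my, bx - ox, by - oy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)          # best-fit rotation
    tx = ox - (mx * math.cos(theta) - my * math.sin(theta))
    ty = oy - (mx * math.sin(theta) + my * math.cos(theta))
    return theta, (tx, ty)
```

    The full 3D problem solved by docking sensors follows the same least-squares pattern (e.g., an SVD-based fit over 3D point sets), with attitude represented as a rotation matrix or quaternion.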

  7. Fast non-Abelian geometric gates via transitionless quantum driving.

    PubMed

    Zhang, J; Kyaw, Thi Ha; Tong, D M; Sjöqvist, Erik; Kwek, Leong-Chuan

    2015-12-21

    A practical quantum computer must be capable of performing high fidelity quantum gates on a set of quantum bits (qubits). In the presence of noise, the realization of such gates poses daunting challenges. Geometric phases, which possess intrinsic noise-tolerant features, hold the promise for performing robust quantum computation. In particular, quantum holonomies, i.e., non-Abelian geometric phases, naturally lead to universal quantum computation due to their non-commutativity. Although quantum gates based on adiabatic holonomies have already been proposed, the slow evolution eventually compromises qubit coherence and computational power. Here, we propose a general approach to speed up an implementation of adiabatic holonomic gates by using transitionless driving techniques and show how such a universal set of fast geometric quantum gates in a superconducting circuit architecture can be obtained in an all-geometric approach. Compared with standard non-adiabatic holonomic quantum computation, the holonomies obtained in our approach tend asymptotically to those of the adiabatic approach in the long run-time limit and thus might open up a new horizon for realizing a practical quantum computer.

  8. Fast non-Abelian geometric gates via transitionless quantum driving

    PubMed Central

    Zhang, J.; Kyaw, Thi Ha; Tong, D. M.; Sjöqvist, Erik; Kwek, Leong-Chuan

    2015-01-01

    A practical quantum computer must be capable of performing high fidelity quantum gates on a set of quantum bits (qubits). In the presence of noise, the realization of such gates poses daunting challenges. Geometric phases, which possess intrinsic noise-tolerant features, hold the promise for performing robust quantum computation. In particular, quantum holonomies, i.e., non-Abelian geometric phases, naturally lead to universal quantum computation due to their non-commutativity. Although quantum gates based on adiabatic holonomies have already been proposed, the slow evolution eventually compromises qubit coherence and computational power. Here, we propose a general approach to speed up an implementation of adiabatic holonomic gates by using transitionless driving techniques and show how such a universal set of fast geometric quantum gates in a superconducting circuit architecture can be obtained in an all-geometric approach. Compared with standard non-adiabatic holonomic quantum computation, the holonomies obtained in our approach tend asymptotically to those of the adiabatic approach in the long run-time limit and thus might open up a new horizon for realizing a practical quantum computer. PMID:26687580

  9. Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop

    NASA Technical Reports Server (NTRS)

    McConnell, M.; Weidman, S.

    2009-01-01

    Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.

  10. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  11. Massively parallel algorithms for real-time wavefront control of a dense adaptive optics system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fijany, A.; Milman, M.; Redding, D.

    1994-12-31

    In this paper, massively parallel algorithms and architectures for real-time wavefront control of a dense adaptive optics system (SELENE) are presented. The authors have already shown that the computation of a near-optimal control algorithm for SELENE can be reduced to the solution of a discrete Poisson equation on a regular domain. Although this represents an optimal computation, due to the large size of the system and the high sampling-rate requirement, the implementation of this control algorithm poses a computationally challenging problem since it demands a sustained computational throughput on the order of 10 GFlops. They develop a novel algorithm, designated the Fast Invariant Imbedding algorithm, which offers a massive degree of parallelism with simple communication and synchronization requirements. Due to these features, this algorithm is significantly more efficient than other Fast Poisson Solvers for implementation on massively parallel architectures. The authors also discuss two massively parallel, algorithmically specialized architectures for low-cost and optimal implementation of the Fast Invariant Imbedding algorithm.
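
    The reduction of the control step to a discrete Poisson equation is what makes fast solvers attractive. As an illustration only (a textbook sine-transform Poisson solver, not the Fast Invariant Imbedding algorithm of the paper), the 5-point Laplacian on a regular grid with zero Dirichlet boundaries can be diagonalized by the discrete sine transform; a dense sine matrix is used here for clarity, whereas FFT-based transforms give O(n^2 log n):

```python
import numpy as np

def poisson_solve(f, h):
    """Solve the 5-point discrete Poisson equation lap(u) = f on an n-by-n
    interior grid with zero Dirichlet boundaries, by diagonalizing the 1-D
    Laplacian with the discrete sine transform (DST-I)."""
    n = f.shape[0]
    j = np.arange(1, n + 1)
    S = np.sin(np.outer(j, j) * np.pi / (n + 1))        # DST-I matrix; S @ S = (n+1)/2 * I
    mu = (2.0 * np.cos(np.pi * j / (n + 1)) - 2.0) / h**2  # 1-D Laplacian eigenvalues
    fhat = S @ f @ S * (2.0 / (n + 1)) ** 2             # forward transform of f
    return S @ (fhat / (mu[:, None] + mu[None, :])) @ S  # divide by 2-D eigenvalues, invert

# consistency check: the 5-point Laplacian of the solution reproduces f
n, h = 31, 1.0 / 32
x = np.arange(1, n + 1) * h
f = -2 * np.pi**2 * np.outer(np.sin(np.pi * x), np.sin(np.pi * x))
u = poisson_solve(f, h)
up = np.pad(u, 1)  # zero Dirichlet boundary
lap = (up[2:, 1:-1] + up[:-2, 1:-1] + up[1:-1, 2:] + up[1:-1, :-2] - 4 * u) / h**2
assert np.allclose(lap, f)
```

    The transform-based structure is also what exposes the massive parallelism the paper exploits: each eigenmode is updated independently.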

  12. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur; Modiano, David

    1995-01-01

    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  13. Computer- and Internet-related intellectual property issues

    NASA Astrophysics Data System (ADS)

    Meyer, Stuart P.

    2001-05-01

    Computer-related technologies, such as the Internet, have posed new challenges for intellectual property law. Legislation and court decisions impacting patents, copyrights, trade secrets and trademarks have adapted intellectual property law to address new issues brought about by such emerging technologies. As the pace of technological change continues to increase, intellectual property law will need to keep up. Accordingly, the balance struck by intellectual property laws today will likely be set askew by technological changes in the future. Engineers need to consider not only the law as it exists today, but also how it might change in the future. Likewise, lawyers and judges need to consider legal issues not only in view of the current state of the art in technology, but also with an eye to technologies yet to come.

  14. What Is an Activity? Appropriating an Activity-Centric System

    NASA Astrophysics Data System (ADS)

    Yarosh, Svetlana; Matthews, Tara; Moran, Thomas P.; Smith, Barton

    Activity-Centric Computing (ACC) systems seek to address the fragmentation of office work across tools and documents by allowing users to organize work around the computational construct of an Activity. Defining and structuring appropriate Activities within a system poses a challenge for users that must be overcome in order to benefit from ACC support. We know little about how knowledge workers appropriate the Activity construct. To address this, we studied users’ appropriation of a production-quality ACC system, Lotus Activities, for everyday work by employees in a large corporation. We contribute to a better understanding of how users articulate their individual and collaborative work in the system by providing empirical evidence of their patterns of appropriation. We conclude by discussing how our findings can inform the design of other ACC systems for the workplace.

  15. Benchmarking real-time RGBD odometry for light-duty UAVs

    NASA Astrophysics Data System (ADS)

    Willis, Andrew R.; Sahawneh, Laith R.; Brink, Kevin M.

    2016-06-01

    This article describes the theoretical and implementation challenges associated with generating 3D odometry estimates (delta-pose) from RGBD sensor data in real-time to facilitate navigation in cluttered indoor environments. The underlying odometry algorithm applies to general 6DoF motion; however, the computational platforms, trajectories, and scene content are motivated by their intended use on indoor, light-duty UAVs. Discussion outlines the overall software pipeline for sensor processing and details how algorithm choices for the underlying feature detection and correspondence computation impact the real-time performance and accuracy of the estimated odometry and associated covariance. This article also explores the consistency of odometry covariance estimates and the correlation between successive odometry estimates. The analysis is intended to provide users information needed to better leverage RGBD odometry within the constraints of their systems.
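
    A standard statistic for examining the consistency of odometry covariance estimates, one focus of the article, is the normalized estimation error squared (NEES): for a consistent estimator the NEES of the error is chi-square distributed with dim(error) degrees of freedom, so its average over many estimates should be near that dimension. The sketch below is a generic illustration with a hypothetical 3-DoF delta-pose covariance, not the article's RGBD pipeline:

```python
import numpy as np

def nees(est, truth, cov):
    """Normalized estimation error squared for one error vector: e^T cov^{-1} e."""
    err = est - truth
    return float(err @ np.linalg.solve(cov, err))

# synthetic check: when errors are truly drawn from the reported covariance,
# the average NEES over many trials approaches the state dimension (3 here)
rng = np.random.default_rng(1)
cov = np.diag([0.01, 0.01, 0.02])  # hypothetical 3-DoF delta-pose covariance
errs = rng.multivariate_normal(np.zeros(3), cov, size=5000)
avg = np.mean([nees(e, np.zeros(3), cov) for e in errs])
assert abs(avg - 3.0) < 0.2
```

    An average NEES well above the state dimension indicates overconfident covariances; well below, pessimistic ones.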

  16. Lessons about Climate Change Pose Many Challenges for Science Teachers

    ERIC Educational Resources Information Center

    Cavanagh, Sean

    2007-01-01

    This article reports on lessons about climate change which pose many challenges for science teachers. The natural world today offers a broad--and dire--catalog of scientific phenomena for teachers wanting to craft classroom lessons on the topic of climate change. As public concern about global warming increases, teachers are carving out a larger…

  17. Space-Time Conservation Element and Solution Element Method Being Developed

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Himansu, Ananda; Jorgenson, Philip C. E.; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Sheng-Tao

    1999-01-01

    The engineering research and design requirements of today pose great computer-simulation challenges to engineers and scientists who are called on to analyze phenomena in continuum mechanics. The future will bring even more daunting challenges, when increasingly complex phenomena must be analyzed with increased accuracy. Traditionally used numerical simulation methods have evolved to their present state by repeated incremental extensions to broaden their scope. They are reaching the limits of their applicability and will need to be radically revised, at the very least, to meet future simulation challenges. At the NASA Lewis Research Center, researchers have been developing a new numerical framework for solving conservation laws in continuum mechanics, namely, the Space-Time Conservation Element and Solution Element Method, or the CE/SE method. This method has been built from fundamentals and is not a modification of any previously existing method. It has been designed with generality, simplicity, robustness, and accuracy as cornerstones. The CE/SE method has thus far been applied in the fields of computational fluid dynamics, computational aeroacoustics, and computational electromagnetics. Computer programs based on the CE/SE method have been developed for calculating flows in one, two, and three spatial dimensions. Results have been obtained for numerous problems and phenomena, including various shock-tube problems, ZND detonation waves, an implosion and explosion problem, shocks over a forward-facing step, a blast wave discharging from a nozzle, various acoustic waves, and shock/acoustic-wave interactions. The method can clearly resolve shock/acoustic-wave interactions, wherein the acoustic wave and the shock can differ in magnitude by up to six orders. In two-dimensional flows, the reflected shock is as crisp as the leading shock. 
CE/SE schemes are currently being used for advanced applications to jet and fan noise prediction and to chemically reacting flows.

  18. Optimizing care for the obese patient in interventional radiology

    PubMed Central

    Aberle, Dwight; Charles, Hearns; Hodak, Steven; O’Neill, Daniel; Oklu, Rahmi; Deipolyi, Amy R.

    2017-01-01

    With the rising epidemic of obesity, interventional radiologists are treating increasing numbers of obese patients, as comorbidities associated with obesity preclude more invasive treatments. These patients are at heightened risk of vascular and oncologic disease, both of which often require interventional radiology care. Obese patients pose unique challenges in imaging, technical feasibility, and periprocedural monitoring. This review describes the technical and clinical challenges posed by this population, with proposed methods to mitigate these challenges and optimize care. PMID:28082253

  19. Geophysical Tools, Challenges and Perspectives Related to Natural Hazards, Climate Change and Food Security

    NASA Astrophysics Data System (ADS)

    Fucugauchi, J. U.

    2013-05-01

    In the coming decades a changing climate and natural hazards will likely increase the vulnerability of agricultural and other food production infrastructures, posing increasing threats to industrialized and developing economies. While food security concerns affect us globally, the huge differences among countries in stocks, population size, poverty levels, economy, technologic development, transportation, health care systems and basic infrastructure will pose a much larger burden on populations in the developing and less developed world. In these economies, increase in the magnitude, duration and frequency of droughts, floods, hurricanes, rising sea levels, heat waves, thunderstorms, freezing events and other phenomena will pose severe costs on the population. For this presentation, we concentrate on a geophysical perspective of the problems, tools available, challenges and short and long-term perspectives. In many instances, natural hazards are treated as unforeseen catastrophes that strike without warning, resulting in major losses. Although the forecasting capacity in the different situations arising from climate change and natural hazards is still limited, there are a range of tools available to assess scenarios and forecast models for developing and implementing better mitigation strategies and prevention programs. Earth observation systems, geophysical instrumental networks, satellite observatories, improved understanding of phenomena, expanded global and regional databases, geographic information systems, higher capacity for computer modeling, numerical simulations, etc., provide a scientific-technical framework for developing strategies. 
Hazard prevention and mitigation programs will result in high costs globally; however, the major costs and challenges concentrate on the less developed economies already affected by poverty, famines, health problems, social inequalities, poor infrastructure, low life expectancy, high population growth, inadequate education systems, immigration, economic crises, conflicts and other issues. Case history analyses and proposals for collaboration programs, know-how transfer and better use of geophysical tools, data, observatories and monitoring networks will be discussed.

  20. CDOCKER and λ-dynamics for prospective prediction in D3R Grand Challenge 2

    NASA Astrophysics Data System (ADS)

    Ding, Xinqiang; Hayes, Ryan L.; Vilseck, Jonah Z.; Charles, Murchtricia K.; Brooks, Charles L.

    2018-01-01

    The opportunity to prospectively predict ligand bound poses and free energies of binding to the Farnesoid X Receptor in the D3R Grand Challenge 2 provided a useful exercise to evaluate CHARMM based docking (CDOCKER) and λ-dynamics methodologies for use in "real-world" applications in computer aided drug design. In addition to measuring their current performance, several recent methodological developments have been analyzed retrospectively to highlight best procedural practices in future applications. For pose prediction with CDOCKER, when the protein structure used for rigid receptor docking was close to the crystallographic holo structure, reliable poses were obtained. Benzimidazoles, with a known holo receptor structure, were successfully docked with an average RMSD of 0.97 Å. Other non-benzimidazole ligands displayed less accuracy largely because the receptor structures we chose for docking were too different from the experimental holo structures. However, retrospective analysis has shown that when these ligands were re-docked into their holo structures, the average RMSD dropped to 1.18 Å for all ligands. When sulfonamides and spiros were docked with the apo structure, which agrees more with their holo structure than the structures we chose, five out of six ligands were correctly docked. These docking results emphasize the need for flexible receptor docking approaches. For λ-dynamics techniques, including multisite λ-dynamics (MSλD), reasonable agreement with experiment was observed for the 33 ligands investigated; root mean square errors of 2.08 and 1.67 kcal/mol were obtained for free energy sets 1 and 2, respectively. Retrospectively, soft-core potentials, adaptive landscape flattening, and biasing potential replica exchange (BP-REX) algorithms were critical to model large substituent perturbations with sufficient precision and within restrictive timeframes, such as was required with participation in Grand Challenge 2. 
These developments, their associated benefits, and proposed procedures for their use in future applications are discussed.
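
    Free energy perturbation in its simplest one-step (Zwanzig) form, ΔF = -kT ln⟨exp(-ΔU/kT)⟩₀, underlies estimators of relative binding free energies; λ-dynamics and MSλD as used in the paper are considerably more sophisticated. The toy sketch below illustrates only the basic FEP identity on a pair of harmonic "states" with a known analytic answer; all parameters are illustrative:

```python
import numpy as np

kT = 0.596  # kcal/mol near 300 K

def zwanzig(dU, kT):
    """One-step free energy perturbation (Zwanzig) estimate:
    dF = -kT * ln < exp(-dU/kT) >_0, with dU = U1 - U0 sampled in state 0."""
    return -kT * np.log(np.mean(np.exp(-dU / kT)))

# toy alchemical perturbation between two 1-D harmonic states U_i = k_i x^2 / 2,
# where the exact answer is (kT/2) ln(k1/k0)
rng = np.random.default_rng(2)
k0, k1 = 1.0, 2.0
x = rng.normal(0.0, np.sqrt(kT / k0), size=200_000)  # Boltzmann samples of state 0
dF = zwanzig(0.5 * (k1 - k0) * x**2, kT)
exact = 0.5 * kT * np.log(k1 / k0)
assert abs(dF - exact) < 0.01
```

    For the large substituent perturbations in the challenge, this naive estimator would converge poorly; that is precisely why the soft-core, landscape-flattening, and BP-REX enhancements described above were needed.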

  1. CDOCKER and λ-dynamics for prospective prediction in D3R Grand Challenge 2.

    PubMed

    Ding, Xinqiang; Hayes, Ryan L; Vilseck, Jonah Z; Charles, Murchtricia K; Brooks, Charles L

    2018-01-01

    The opportunity to prospectively predict ligand bound poses and free energies of binding to the Farnesoid X Receptor in the D3R Grand Challenge 2 provided a useful exercise to evaluate CHARMM based docking (CDOCKER) and λ-dynamics methodologies for use in "real-world" applications in computer aided drug design. In addition to measuring their current performance, several recent methodological developments have been analyzed retrospectively to highlight best procedural practices in future applications. For pose prediction with CDOCKER, when the protein structure used for rigid receptor docking was close to the crystallographic holo structure, reliable poses were obtained. Benzimidazoles, with a known holo receptor structure, were successfully docked with an average RMSD of 0.97 Å. Other non-benzimidazole ligands displayed less accuracy largely because the receptor structures we chose for docking were too different from the experimental holo structures. However, retrospective analysis has shown that when these ligands were re-docked into their holo structures, the average RMSD dropped to 1.18 Å for all ligands. When sulfonamides and spiros were docked with the apo structure, which agrees more with their holo structure than the structures we chose, five out of six ligands were correctly docked. These docking results emphasize the need for flexible receptor docking approaches. For λ-dynamics techniques, including multisite λ-dynamics (MSλD), reasonable agreement with experiment was observed for the 33 ligands investigated; root mean square errors of 2.08 and 1.67 kcal/mol were obtained for free energy sets 1 and 2, respectively. 
Retrospectively, soft-core potentials, adaptive landscape flattening, and biasing potential replica exchange (BP-REX) algorithms were critical to model large substituent perturbations with sufficient precision and within restrictive timeframes, such as was required with participation in Grand Challenge 2. These developments, their associated benefits, and proposed procedures for their use in future applications are discussed.

  2. Computational Hemodynamics Involving Artificial Devices

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Feiereisen, William (Technical Monitor)

    2001-01-01

    This paper reports the progress being made towards developing complete blood-flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges beyond those of computing the blood flow in natural hearts and arteries. Many elements are needed, such as flow solvers, geometry modeling including flexible walls, moving-boundary procedures, and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended in the recent past to the analysis and development of mechanical devices. The blood flow in these devices is practically incompressible and Newtonian, and thus various incompressible Navier-Stokes solution procedures can be selected depending on the choice of formulations, variables, and numerical schemes. Two primitive-variable formulations used are discussed, as well as the overset grid approach to handle complex moving geometry. This procedure has been applied to several artificial devices. Among these, recent progress made in developing the DeBakey axial-flow blood pump will be presented from a computational point of view. Computational and clinical issues will be discussed in detail, as well as additional work needed.

  3. Progress in Computational Electron-Molecule Collisions

    NASA Astrophysics Data System (ADS)

    Rescigno, Tn

    1997-10-01

    The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their capabilities. Since extrapolation of existing methods to increasingly larger targets will ultimately fail, as it would require computational resources beyond anything imaginable, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.

  4. Introductory Geophysics at Colorado College: A Research-Driven Course

    NASA Astrophysics Data System (ADS)

    Bank, C.

    2003-12-01

    Doing research during an undergraduate course provides stimulus for students and instructor. Students learn to appreciate the scientific method and get hands-on experience, while the instructor remains thrilled about teaching her/his discipline. The introductory geophysics course taught at Colorado College is made up of four units (gravity, seismic, resistivity, and magnetic) using available geophysical equipment. Within each unit students learn the physical background of the method, and then tackle a small research project selected by the instructor. Students pose a research question (or formulate a hypothesis), collect near-surface data in the field, process it using personal computers, and analyse it by creating computer models and running simple inversions. Computer work is done using the programming language Matlab, with several pre-coded scripts to make the programming experience more comfortable. Students then interpret the data and answer the question posed at the beginning. The unit ends with students writing a summary report, creating a poster, or presenting their findings orally. First evaluations of the course show that students appreciate the emphasis on field work and applications to real problems, as well as developing and testing their own hypotheses. The main challenge for the instructor is to find feasible projects, given the time constraints of a course and availability of field sites with new questions to answer. My presentation will feature a few projects done by students during the course and will discuss the experience students and I have had with this approach.

  5. D3R grand challenge 2015: Evaluation of protein-ligand pose and affinity predictions

    NASA Astrophysics Data System (ADS)

    Gathiaka, Symon; Liu, Shuai; Chiu, Michael; Yang, Huanwang; Stuckey, Jeanne A.; Kang, You Na; Delproposto, Jim; Kubish, Ginger; Dunbar, James B.; Carlson, Heather A.; Burley, Stephen K.; Walters, W. Patrick; Amaro, Rommie E.; Feher, Victoria A.; Gilson, Michael K.

    2016-09-01

    The Drug Design Data Resource (D3R) ran Grand Challenge 2015 between September 2015 and February 2016. Two targets served as the framework to test community docking and scoring methods: (1) HSP90, donated by AbbVie and the Community Structure Activity Resource (CSAR), and (2) MAP4K4, donated by Genentech. The challenges for both target datasets were conducted in two stages, with the first stage testing pose predictions and the capacity to rank compounds by affinity with minimal structural data; and the second stage testing methods for ranking compounds with knowledge of at least a subset of the ligand-protein poses. An additional sub-challenge provided small groups of chemically similar HSP90 compounds amenable to alchemical calculations of relative binding free energy. Unlike previous blinded Challenges, we did not provide cognate receptors or receptors prepared with hydrogens and likewise did not require a specified crystal structure to be used for pose or affinity prediction in Stage 1. Given the freedom to select from over 200 crystal structures of HSP90 in the PDB, participants employed workflows that tested not only core docking and scoring technologies, but also methods for addressing water-mediated ligand-protein interactions, binding pocket flexibility, and the optimal selection of protein structures for use in docking calculations. Nearly 40 participating groups submitted over 350 prediction sets for Grand Challenge 2015. This overview describes the datasets and the organization of the challenge components, summarizes the results across all submitted predictions, and considers broad conclusions that may be drawn from this collaborative community endeavor.
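
    Pose-prediction submissions in challenges like this are typically scored by the heavy-atom RMSD between predicted and crystallographic ligand coordinates. The sketch below is a minimal version that assumes a known 1:1 atom correspondence and poses already in the same receptor frame, with no symmetry-equivalence handling, which real evaluations must add:

```python
import numpy as np

def pose_rmsd(pred, xtal):
    """Heavy-atom RMSD (in angstroms) between a predicted and a crystallographic
    ligand pose: sqrt of the mean squared per-atom displacement. No fitting is
    applied, since both poses are assumed to share the receptor frame."""
    diff = pred - xtal
    return float(np.sqrt(np.mean(np.sum(diff**2, axis=1))))

# toy check: displacing every atom by 1 A along x gives an RMSD of exactly 1 A
xtal = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
pred = xtal + np.array([1.0, 0.0, 0.0])
assert abs(pose_rmsd(pred, xtal) - 1.0) < 1e-12
```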

  6. D3R Grand Challenge 2015: Evaluation of Protein-Ligand Pose and Affinity Predictions

    PubMed Central

    Gathiaka, Symon; Liu, Shuai; Chiu, Michael; Yang, Huanwang; Stuckey, Jeanne A; Kang, You Na; Delproposto, Jim; Kubish, Ginger; Dunbar, James B.; Carlson, Heather A.; Burley, Stephen K.; Walters, W. Patrick; Amaro, Rommie E.; Feher, Victoria A.; Gilson, Michael K.

    2017-01-01

    The Drug Design Data Resource (D3R) ran Grand Challenge 2015 between September 2015 and February 2016. Two targets served as the framework to test community docking and scoring methods: (i) HSP90, donated by AbbVie and the Community Structure Activity Resource (CSAR), and (ii) MAP4K4, donated by Genentech. The challenges for both target datasets were conducted in two stages, with the first stage testing pose predictions and the capacity to rank compounds by affinity with minimal structural data; and the second stage testing methods for ranking compounds with knowledge of at least a subset of the ligand-protein poses. An additional sub-challenge provided small groups of chemically similar HSP90 compounds amenable to alchemical calculations of relative binding free energy. Unlike previous blinded Challenges, we did not provide cognate receptors or receptors prepared with hydrogens and likewise did not require a specified crystal structure to be used for pose or affinity prediction in Stage 1. Given the freedom to select from over 200 crystal structures of HSP90 in the PDB, participants employed workflows that tested not only core docking and scoring technologies, but also methods for addressing water-mediated ligand-protein interactions, binding pocket flexibility, and the optimal selection of protein structures for use in docking calculations. Nearly 40 participating groups submitted over 350 prediction sets for Grand Challenge 2015. This overview describes the datasets and the organization of the challenge components, summarizes the results across all submitted predictions, and considers broad conclusions that may be drawn from this collaborative community endeavor. PMID:27696240

  7. Pose-Invariant Face Recognition via RGB-D Images.

    PubMed

    Sang, Gaoli; Li, Jing; Zhao, Qijun

    2016-01-01

    Three-dimensional (3D) face models can intrinsically handle the large-pose face recognition problem. In this paper, we propose a novel pose-invariant face recognition method via RGB-D images. By employing depth, our method is able to handle self-occlusion and deformation, both of which are challenging problems in two-dimensional (2D) face recognition. Texture images in the gallery can be rendered to the same view as the probe via depth. Meanwhile, depth is also used for similarity measurement via frontalization and symmetric filling. Finally, both texture and depth contribute to the final identity estimation. Experiments on the Bosphorus, CurtinFaces, Eurecom, and Kiwi databases demonstrate that the additional depth information improves the performance of face recognition with large pose variations and under even more challenging conditions.

  8. New Unintended Adverse Consequences of Electronic Health Records

    PubMed Central

    Wright, A.; Ash, J.; Singh, H.

    2016-01-01

    Summary Although the health information technology industry has made considerable progress in the design, development, implementation, and use of electronic health records (EHRs), the lofty expectations of the early pioneers have not been met. In 2006, the Provider Order Entry Team at Oregon Health & Science University described a set of unintended adverse consequences (UACs), or unpredictable, emergent problems associated with computer-based provider order entry implementation, use, and maintenance. Many of these originally identified UACs have not been completely addressed or alleviated, some have evolved over time, and some new ones have emerged as EHRs became more widely available. The rapid increase in the adoption of EHRs, coupled with the changes in the types and attitudes of clinical users, has led to several new UACs, specifically: complete clinical information unavailable at the point of care; lack of innovations to improve system usability leading to frustrating user experiences; inadvertent disclosure of large amounts of patient-specific information; increased focus on computer-based quality measurement negatively affecting clinical workflows and patient-provider interactions; information overload from marginally useful computer-generated data; and a decline in the development and use of internally-developed EHRs. While each of these new UACs poses significant challenges to EHR developers and users alike, they also offer many opportunities. The challenge for clinical informatics researchers is to continue to refine our current systems while exploring new methods of overcoming these challenges and developing innovations to improve EHR interoperability, usability, security, functionality, clinical quality measurement, and information summarization and display. PMID:27830226

  9. Vision Based SLAM in Dynamic Scenes

    DTIC Science & Technology

    2012-12-20

    the correct relative poses between cameras at frame F. For this purpose, we detect and match SURF features between cameras in different groups, and...all cameras in such a challenging case. For a comparison, we disabled the ’inter-camera pose estimation’ and applied the ’intra-camera pose esti

  10. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE PAGES

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.
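    The first of the two ingredients described above can be illustrated in miniature: the sketch below runs self-normalized importance sampling with a Gaussian-mixture proposal against a bimodal target. The bimodal density, the mixture parameters, and the sample size are illustrative assumptions rather than the paper's setup, and the PC surrogate and the adaptive construction of the proposal are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized bimodal "posterior": equal-weight Gaussians at -3 and +3.
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

# Gaussian-mixture proposal roughly matched to the two modes (assumed known here;
# the paper constructs this adaptively).
means = np.array([-3.0, 3.0])
sigmas = np.array([1.2, 1.2])
weights = np.array([0.5, 0.5])

def sample_proposal(n):
    comp = rng.choice(2, size=n, p=weights)
    return rng.normal(means[comp], sigmas[comp])

def log_proposal(x):
    comps = np.log(weights) - np.log(sigmas) - 0.5 * ((x[:, None] - means) / sigmas) ** 2
    return np.logaddexp.reduce(comps, axis=1)

x = sample_proposal(200_000)
logw = log_target(x) - log_proposal(x)   # unnormalized log importance weights
w = np.exp(logw - logw.max())
w /= w.sum()                             # self-normalization
second_moment = np.sum(w * x ** 2)       # E[x^2] under the target; exact value is 10
print(round(second_moment, 2))
```

Because the weights are self-normalized, any normalization constants dropped from the target and proposal densities cancel, which is what makes the scheme usable with an unnormalized posterior.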

  11. Nanoinformatics knowledge infrastructures: bringing efficient information management to nanomedical research

    PubMed Central

    de la Iglesia, D; Cachau, R E; García-Remesal, M; Maojo, V

    2014-01-01

    Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts. PMID:24932210

  12. Rational approximations to rational models: alternative algorithms for category learning.

    PubMed

    Sanborn, Adam N; Griffiths, Thomas L; Navarro, Daniel J

    2010-10-01

    Rational models of cognition typically consider the abstract computational problems posed by the environment, assuming that people are capable of optimally solving those problems. This differs from more traditional formal models of cognition, which focus on the psychological processes responsible for behavior. A basic challenge for rational models is thus explaining how optimal solutions can be approximated by psychological processes. We outline a general strategy for answering this question, namely to explore the psychological plausibility of approximation algorithms developed in computer science and statistics. In particular, we argue that Monte Carlo methods provide a source of rational process models that connect optimal solutions to psychological processes. We support this argument through a detailed example, applying this approach to Anderson's (1990, 1991) rational model of categorization (RMC), which involves a particularly challenging computational problem. Drawing on a connection between the RMC and ideas from nonparametric Bayesian statistics, we propose 2 alternative algorithms for approximate inference in this model. The algorithms we consider include Gibbs sampling, a procedure appropriate when all stimuli are presented simultaneously, and particle filters, which sequentially approximate the posterior distribution with a small number of samples that are updated as new data become available. Applying these algorithms to several existing datasets shows that a particle filter with a single particle provides a good description of human inferences.
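    As a rough illustration of the particle-filter idea invoked above (sequentially approximating a posterior with a small set of weighted, resampled samples), here is a minimal bootstrap particle filter on a toy random-walk state-space model; the model and its noise levels are invented for illustration and are not Anderson's RMC.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model: the latent mean follows a random walk and each
# observation is that mean plus Gaussian noise.
T, n_particles = 100, 50
q, r = 0.1, 0.5                     # process and observation noise std devs
true_x = np.cumsum(rng.normal(0.0, q, T))
obs = true_x + rng.normal(0.0, r, T)

particles = np.zeros(n_particles)
estimates = []
for y in obs:
    particles = particles + rng.normal(0.0, q, n_particles)   # propagate
    logw = -0.5 * ((y - particles) / r) ** 2                  # likelihood weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(n_particles, size=n_particles, p=w)      # resample
    particles = particles[idx]
    estimates.append(particles.mean())

rmse = float(np.sqrt(np.mean((np.array(estimates) - true_x) ** 2)))
print(round(rmse, 3))
```

Shrinking `n_particles` toward 1 degrades the approximation gracefully, which is the regime the paper exploits when comparing single-particle filters with human inference.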

  13. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lin, Guang, E-mail: guanglin@purdue.edu

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.

  14. Cloud computing approaches for prediction of ligand binding poses and pathways.

    PubMed

    Lawrenz, Morgan; Shukla, Diwakar; Pande, Vijay S

    2015-01-22

    We describe an innovative protocol for ab initio prediction of ligand crystallographic binding poses and highly effective analysis of large datasets generated for protein-ligand dynamics. We include a procedure for setup and performance of distributed molecular dynamics simulations on cloud computing architectures, a model for efficient analysis of simulation data, and a metric for evaluation of model convergence. We give accurate binding pose predictions for five ligands ranging in affinity from 7 nM to > 200 μM for the immunophilin protein FKBP12, for expedited results in cases where experimental structures are difficult to produce. Our approach goes beyond single, low energy ligand poses to give quantitative kinetic information that can inform protein engineering and ligand design.

  15. Gradient-based stochastic estimation of the density matrix

    NASA Astrophysics Data System (ADS)

    Wang, Zhentao; Chern, Gia-Wei; Batista, Cristian D.; Barros, Kipton

    2018-03-01

    Fast estimation of the single-particle density matrix is key to many applications in quantum chemistry and condensed matter physics. The best numerical methods leverage the fact that the density matrix elements f(H)ij decay rapidly with distance rij between orbitals. This decay is usually exponential. However, for the special case of metals at zero temperature, algebraic decay of the density matrix appears and poses a significant numerical challenge. We introduce a gradient-based probing method to estimate all local density matrix elements at a computational cost that scales linearly with system size. For zero-temperature metals, the stochastic error scales like S-(d+2)/2d, where d is the dimension and S is a prefactor to the computational cost. The convergence becomes exponential if the system is at finite temperature or is insulating.
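    The paper's gradient-based probing is more elaborate, but the core idea of stochastically estimating local density-matrix elements can be sketched with a plain Hutchinson-style diagonal estimator. The random Hamiltonian, Fermi-function parameters, and probe count below are illustrative assumptions; also, the reference f(H) is formed by exact diagonalization here, whereas a linear-scaling method would only ever apply f(H) to vectors (e.g. via polynomial expansion).

```python
import numpy as np

rng = np.random.default_rng(2)

# Small random symmetric matrix stands in for a tight-binding Hamiltonian.
n = 40
H = rng.normal(size=(n, n))
H = (H + H.T) / 2

def fermi(H, mu=0.0, beta=4.0):
    # Reference density matrix f(H) by exact diagonalization (O(n^3); used here
    # only to check the stochastic estimate against the exact diagonal).
    e, U = np.linalg.eigh(H)
    return (U * (1.0 / (1.0 + np.exp(beta * (e - mu))))) @ U.T

P = fermi(H)
exact_diag = np.diag(P).copy()

# Hutchinson-style probing: diag(P) ~= E[ z * (P z) ] for Rademacher vectors z.
S = 20000
Z = rng.choice([-1.0, 1.0], size=(S, n))   # probe vectors, one per row
est = (Z * (Z @ P)).mean(axis=0)           # elementwise z * (P z), averaged (P symmetric)

err = float(np.max(np.abs(est - exact_diag)))
```

The per-entry variance of this estimator is the off-diagonal row mass of P, which is why the decay of density-matrix elements with distance, discussed in the abstract, controls how quickly such stochastic estimates converge.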

  16. A Primer on Infectious Disease Bacterial Genomics

    PubMed Central

    Petkau, Aaron; Knox, Natalie; Graham, Morag; Van Domselaar, Gary

    2016-01-01

    SUMMARY The number of large-scale genomics projects is increasing due to the availability of affordable high-throughput sequencing (HTS) technologies. The use of HTS for bacterial infectious disease research is attractive because one whole-genome sequencing (WGS) run can replace multiple assays for bacterial typing, molecular epidemiology investigations, and more in-depth pathogenomic studies. The computational resources and bioinformatics expertise required to accommodate and analyze the large amounts of data pose new challenges for researchers embarking on genomics projects for the first time. Here, we present a comprehensive overview of a bacterial genomics projects from beginning to end, with a particular focus on the planning and computational requirements for HTS data, and provide a general understanding of the analytical concepts to develop a workflow that will meet the objectives and goals of HTS projects. PMID:28590251

  17. Mandibular second molar exhibiting a unique "Y"- and "J"-shaped root canal anatomy diagnosed using cone-beam computed tomographic scanning: A case report.

    PubMed

    Parashar, Saumya-Rajesh; Kowsky, R Dinesh; Natanasabapathy, Velmurugan

    2017-01-01

    This article reports a unique case of aberrant root canal anatomy exhibiting a "Y"- and "J"-shaped canal pattern in a mandibular second molar. Anatomic complexities may pose challenges for endodontic treatment. Before performing endodontic treatment, the clinician should be aware of the internal anatomy of the tooth being treated and should recognize anatomic aberrations if present. The presence of unusual anatomy may call for modifications in treatment planning. This report describes in detail a mandibular second molar associated with two paramolar tubercles and a peculiar "Y"- and "J"-shaped canal anatomy, detected with the aid of cone-beam computed tomography, which has not previously been reported in the dental literature. The proposed protocol for its endodontic management is also discussed.

  18. HGML: a hypertext guideline markup language.

    PubMed Central

    Hagerty, C. G.; Pickens, D.; Kulikowski, C.; Sonnenberg, F.

    2000-01-01

    Existing text-based clinical practice guidelines can be difficult to put into practice. While a growing number of such documents have gained acceptance in the medical community and contain a wealth of valuable information, the time required to digest them is substantial. Yet the expressive power, subtlety and flexibility of natural language pose challenges when designing computer tools that will help in their application. At the same time, formal computer languages typically lack such expressiveness and the effort required to translate existing documents into these languages may be costly. We propose a method based on the mark-up concept for converting text-based clinical guidelines into a machine-operable form. This allows existing guidelines to be manipulated by machine, and viewed in different formats at various levels of detail according to the needs of the practitioner, while preserving their originally published form. PMID:11079898

  19. Real-time scalable visual analysis on mobile devices

    NASA Astrophysics Data System (ADS)

    Pattath, Avin; Ebert, David S.; May, Richard A.; Collins, Timothy F.; Pike, William

    2008-02-01

    Interactive visual presentation of information can help an analyst gain faster and better insight from data. When combined with situational or context information, visualization on mobile devices is invaluable to in-field responders and investigators. However, several challenges are posed by the form-factor of mobile devices in developing such systems. In this paper, we classify these challenges into two broad categories - issues in general mobile computing and issues specific to visual analysis on mobile devices. Using NetworkVis and Infostar as example systems, we illustrate some of the techniques that we employed to overcome many of the identified challenges. NetworkVis is an OpenVG-based real-time network monitoring and visualization system developed for Windows Mobile devices. Infostar is a flash-based interactive, real-time visualization application intended to provide attendees access to conference information. Linked time-synchronous visualization, stylus/button-based interactivity, vector graphics, overview-context techniques, details-on-demand and statistical information display are some of the highlights of these applications.

  20. Modelling Glacial Lake Outburst Floods: Key Considerations and Challenges Posed By Climatic Change

    NASA Astrophysics Data System (ADS)

    Westoby, M.

    2014-12-01

    The number and size of moraine-dammed supraglacial and proglacial lakes is increasing as a result of contemporary climatic change. Moraine-dammed lakes are capable of impounding volumes of water in excess of 107 m3, and often represent a very real threat to downstream communities and infrastructure, should the bounding moraine fail and produce a catastrophic Glacial Lake Outburst Flood (GLOF). Modelling the individual components of a GLOF, including a triggering event, the complex dam-breaching process and downstream propagation of the flood is incredibly challenging, not least because direct observation and instrumentation of such high-magnitude flows is virtually impossible. We briefly review the current state-of-the-art in numerical GLOF modelling, with a focus on the theoretical and computational challenges associated with reconstructing or predicting GLOF dynamics in the face of rates of cryospheric change that have no historical precedent, as well as various implications for researchers and professionals tasked with the production of hazard maps and disaster mitigation strategies.

  1. Discretization of the induced-charge boundary integral equation.

    PubMed

    Bardhan, Jaydeep P; Eisenberg, Robert S; Gillespie, Dirk

    2009-07-01

    Boundary-element methods (BEMs) for solving integral equations numerically have been used in many fields to compute the induced charges at dielectric boundaries. In this paper, we consider a more accurate implementation of BEM in the context of ions in aqueous solution near proteins, but our results are applicable more generally. The ions that modulate protein function are often within a few angstroms of the protein, which leads to the significant accumulation of polarization charge at the protein-solvent interface. Computing the induced charge accurately and quickly poses a numerical challenge in solving a popular integral equation using BEM. In particular, the accuracy of simulations can depend strongly on seemingly minor details of how the entries of the BEM matrix are calculated. We demonstrate that when the dielectric interface is discretized into flat tiles, the qualocation method of Tausch [IEEE Trans. Comput.-Aided Des. 20, 1398 (2001)] to compute the BEM matrix elements is always more accurate than the traditional centroid-collocation method. Qualocation is not more expensive to implement than collocation and can save significant computational time by reducing the number of boundary elements needed to discretize the dielectric interfaces.

  2. Discretization of the induced-charge boundary integral equation

    NASA Astrophysics Data System (ADS)

    Bardhan, Jaydeep P.; Eisenberg, Robert S.; Gillespie, Dirk

    2009-07-01

    Boundary-element methods (BEMs) for solving integral equations numerically have been used in many fields to compute the induced charges at dielectric boundaries. In this paper, we consider a more accurate implementation of BEM in the context of ions in aqueous solution near proteins, but our results are applicable more generally. The ions that modulate protein function are often within a few angstroms of the protein, which leads to the significant accumulation of polarization charge at the protein-solvent interface. Computing the induced charge accurately and quickly poses a numerical challenge in solving a popular integral equation using BEM. In particular, the accuracy of simulations can depend strongly on seemingly minor details of how the entries of the BEM matrix are calculated. We demonstrate that when the dielectric interface is discretized into flat tiles, the qualocation method of Tausch [IEEE Trans. Comput.-Aided Des. 20, 1398 (2001)] to compute the BEM matrix elements is always more accurate than the traditional centroid-collocation method. Qualocation is not more expensive to implement than collocation and can save significant computational time by reducing the number of boundary elements needed to discretize the dielectric interfaces.

  3. OCCAM: a flexible, multi-purpose and extendable HPC cluster

    NASA Astrophysics Data System (ADS)

    Aldinucci, M.; Bagnasco, S.; Lusso, S.; Pasteris, P.; Rabellino, S.; Vallero, S.

    2017-10-01

    The Open Computing Cluster for Advanced data Manipulation (OCCAM) is a multipurpose flexible HPC cluster designed and operated by a collaboration between the University of Torino and the Sezione di Torino of the Istituto Nazionale di Fisica Nucleare. It is aimed at providing a flexible, reconfigurable and extendable infrastructure to cater to a wide range of different scientific computing use cases, including ones from solid-state chemistry, high-energy physics, computer science, big data analytics, computational biology, genomics and many others. Furthermore, it will serve as a platform for R&D activities on computational technologies themselves, with topics ranging from GPU acceleration to Cloud Computing technologies. A heterogeneous and reconfigurable system like this poses a number of challenges related to the frequency at which heterogeneous hardware resources might change their availability and shareability status, which in turn affect methods and means to allocate, manage, optimize, bill, monitor VMs, containers, virtual farms, jobs, interactive bare-metal sessions, etc. This work describes some of the use cases that prompted the design and construction of the HPC cluster, its architecture and resource provisioning model, along with a first characterization of its performance by some synthetic benchmark tools and a few realistic use-case tests.

  4. A finite element method to compute three-dimensional equilibrium configurations of fluid membranes: Optimal parameterization, variational formulation and applications

    NASA Astrophysics Data System (ADS)

    Rangarajan, Ramsharan; Gao, Huajian

    2015-09-01

    We introduce a finite element method to compute equilibrium configurations of fluid membranes, identified as stationary points of a curvature-dependent bending energy functional under certain geometric constraints. The reparameterization symmetries in the problem pose a challenge in designing parametric finite element methods, and existing methods commonly resort to Lagrange multipliers or penalty parameters. In contrast, we exploit these symmetries by representing solution surfaces as normal offsets of given reference surfaces and entirely bypass the need for artificial constraints. We then resort to a Galerkin finite element method to compute discrete C1 approximations of the normal offset coordinate. The variational framework presented is suitable for computing deformations of three-dimensional membranes subject to a broad range of external interactions. We provide a systematic algorithm for computing large deformations, wherein solutions at subsequent load steps are identified as perturbations of previously computed ones. We discuss the numerical implementation of the method in detail and demonstrate its optimal convergence properties using examples. We discuss applications of the method to studying adhesive interactions of fluid membranes with rigid substrates and to investigating the influence of membrane tension on tether formation.

  5. Teachers Implementing Mathematical Problem Posing in the Classroom: Challenges and Strategies

    ERIC Educational Resources Information Center

    Leung, Shuk-kwan S.

    2013-01-01

    This paper reports a study about how a teacher educator shared knowledge with teachers when they worked together to implement mathematical problem posing (MPP) in the classroom. It includes feasible methods for getting practitioners to use research-based tasks aligned to the curriculum in order to encourage children to pose mathematical problems.…

  6. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of the replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high-performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available to download from http://strauto.popgen.org.
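    The core pattern of distributing replicate runs over workers and aggregating per-K results can be sketched as follows. Since STRUCTURE is an external program, `run_structure` here is a hypothetical stand-in (a real pipeline would launch the binary, e.g. via `subprocess.run`, and parse its output file), and the fake log-likelihoods are invented so the example is self-contained.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_structure(job):
    # Hypothetical stand-in for one STRUCTURE run; returns a fabricated
    # log-likelihood that peaks at K = 3 so the aggregation step has real input.
    K, rep = job
    fake_loglik = -1000.0 - 10.0 * abs(K - 3) + 0.1 * rep
    return (K, rep, fake_loglik)

jobs = list(product(range(1, 6), range(3)))        # K = 1..5, three replicates each
with ThreadPoolExecutor(max_workers=4) as pool:    # replicate runs in parallel
    results = list(pool.map(run_structure, jobs))

# Aggregate replicate log-likelihoods per K, as input to an Evanno-style analysis.
grouped = {}
for K, rep, ll in results:
    grouped.setdefault(K, []).append(ll)
mean_ll = {K: sum(v) / len(v) for K, v in grouped.items()}
best_k = max(mean_ll, key=mean_ll.get)
print(best_k)
```

Threads are a reasonable choice when each job blocks on an external process; CPU-bound Python work would instead use a process pool.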

  7. Cloud computing approaches to accelerate drug discovery value chain.

    PubMed

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges for computer scientists to offer matching hardware and software infrastructure while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, the integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good to have' tool for researchers, providing them significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be well suited to managing drug discovery and clinical development data generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  8. Efficient electromagnetic source imaging with adaptive standardized LORETA/FOCUSS.

    PubMed

    Schimpf, Paul H; Liu, Hesheng; Ramon, Ceon; Haueisen, Jens

    2005-05-01

    Functional brain imaging and source localization based on the scalp's potential field require a solution to an ill-posed inverse problem with many solutions. This makes it necessary to incorporate a priori knowledge in order to select a particular solution. A computational challenge for some subject-specific head models is that many inverse algorithms require a comprehensive sampling of the candidate source space at the desired resolution. In this study, we present an algorithm that can accurately reconstruct details of localized source activity from a sparse sampling of the candidate source space. Forward computations are minimized through an adaptive procedure that increases source resolution as the spatial extent is reduced. With this algorithm, we were able to compute inverses using only 6% to 11% of the full resolution lead-field, with a localization accuracy that was not significantly different than an exhaustive search through a fully-sampled source space. The technique is, therefore, applicable for use with anatomically-realistic, subject-specific forward models for applications with spatially concentrated source activity.

  9. Removal of a foreign body from the skull base using a customized computer-designed guide bar.

    PubMed

    Wei, Ran; Xiang-Zhen, Liu; Bing, Guo; Da-Long, Shu; Ze-Ming, Tan

    2010-06-01

    Foreign bodies located at the base of the skull pose a surgical challenge. Here, a customized computer-designed surgical guide bar was designed to facilitate removal of a skull base foreign body. Within 24 h of the patient's presentation, a guide bar and mounting platform were designed to remove a foreign body located adjacent to the transverse process of the atlas and pressing against the internal carotid artery. The foreign body was successfully located and removed using the custom-designed guide bar and computer operative planning. Ten months postoperatively the patient was free of complaints and had no complications such as restricted mouth opening or false aneurysm. The inferior alveolar nerve damage noted immediately postoperatively (a consequence of mandibular osteotomy) was slightly reduced at follow-up, but labial numbness persisted. The navigation tools described herein were successfully employed to aid foreign body removal from the skull base. Copyright (c) 2009 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  10. A cell-phone-based brain-computer interface for communication in daily life

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Te; Wang, Yijun; Jung, Tzyy-Ping

    2011-04-01

    Moving a brain-computer interface (BCI) system from a laboratory demonstration to real-life applications still poses severe challenges to the BCI community. This study aims to integrate a mobile and wireless electroencephalogram (EEG) system and a signal-processing platform based on a cell phone into a truly wearable and wireless online BCI. Its practicality and implications in a routine BCI are demonstrated through the realization and testing of a steady-state visual evoked potential (SSVEP)-based BCI. This study implemented and tested online signal processing methods in both time and frequency domains for detecting SSVEPs. The results of this study showed that the performance of the proposed cell-phone-based platform was comparable, in terms of the information transfer rate, with other BCI systems using bulky commercial EEG systems and personal computers. To the best of our knowledge, this study is the first to demonstrate a truly portable, cost-effective and miniature cell-phone-based platform for online BCIs.
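    A minimal frequency-domain SSVEP detector of the kind described (pick the candidate stimulus frequency with the highest spectral power) can be sketched on synthetic data; the sampling rate, epoch length, SNR, and candidate frequencies below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

fs, dur = 256, 4.0                       # sampling rate (Hz) and epoch length (s)
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(3)

target = 12.0                            # flicker frequency the subject attends to
eeg = np.sin(2 * np.pi * target * t) + rng.normal(0.0, 1.0, t.size)

power = np.abs(np.fft.rfft(eeg)) ** 2    # one-sided power spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)

candidates = [8.0, 10.0, 12.0, 15.0]     # assumed on-screen stimulus frequencies
scores = [power[np.argmin(np.abs(freqs - f))] for f in candidates]
detected = candidates[int(np.argmax(scores))]
print(detected)
```

With a 4 s epoch the spectral resolution is 0.25 Hz, so each candidate frequency falls exactly on an FFT bin; in practice one would also sum power at harmonics or use canonical correlation.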

  11. Ab initio calculations of the concentration dependent band gap reduction in dilute nitrides

    NASA Astrophysics Data System (ADS)

    Rosenow, Phil; Bannow, Lars C.; Fischer, Eric W.; Stolz, Wolfgang; Volz, Kerstin; Koch, Stephan W.; Tonner, Ralf

    2018-02-01

    While being of persistent interest for the integration of lattice-matched laser devices with silicon circuits, the electronic structure of dilute nitride III/V-semiconductors has presented a challenge to ab initio computational approaches. The origin of the computational problems is the strong distortion exerted by the N atoms on most host materials. Here, these issues are resolved by combining density functional theory calculations based on the meta-GGA functional presented by Tran and Blaha (TB09) with a supercell approach for the dilute nitride Ga(NAs). Exploring the requirements posed to supercells, it is shown that the distortion field of a single N atom must be allowed to decrease so far that it does not overlap with its periodic images. This also prevents spurious electronic interactions between translational symmetric atoms, allowing us to compute band gaps in very good agreement with experimentally derived reference values. In addition to existing approaches, these results offer a promising ab initio avenue to the electronic structure of dilute nitride semiconductor compounds.

  12. A cell-phone-based brain-computer interface for communication in daily life.

    PubMed

    Wang, Yu-Te; Wang, Yijun; Jung, Tzyy-Ping

    2011-04-01

    Moving a brain-computer interface (BCI) system from a laboratory demonstration to real-life applications still poses severe challenges to the BCI community. This study aims to integrate a mobile and wireless electroencephalogram (EEG) system and a signal-processing platform based on a cell phone into a truly wearable and wireless online BCI. Its practicality and implications in a routine BCI are demonstrated through the realization and testing of a steady-state visual evoked potential (SSVEP)-based BCI. This study implemented and tested online signal processing methods in both time and frequency domains for detecting SSVEPs. The results of this study showed that the performance of the proposed cell-phone-based platform was comparable, in terms of the information transfer rate, with other BCI systems using bulky commercial EEG systems and personal computers. To the best of our knowledge, this study is the first to demonstrate a truly portable, cost-effective and miniature cell-phone-based platform for online BCIs.

  13. Modeling Early-Stage Processes of U-10 Wt.%Mo Alloy Using Integrated Computational Materials Engineering Concepts

    NASA Astrophysics Data System (ADS)

    Wang, Xiaowo; Xu, Zhijie; Soulami, Ayoub; Hu, Xiaohua; Lavender, Curt; Joshi, Vineet

    2017-12-01

    Low-enriched uranium alloyed with 10 wt.% molybdenum (U-10Mo) has been identified as a promising alternative to high-enriched uranium. Manufacturing U-10Mo alloy involves multiple complex thermomechanical processes that pose challenges for computational modeling. This paper describes the application of integrated computational materials engineering (ICME) concepts to integrate three individual modeling components, viz. homogenization, microstructure-based finite element method for hot rolling, and carbide particle distribution, to simulate the early-stage processes of U-10Mo alloy manufacture. The resulting integrated model enables information to be passed between different model components and leads to improved understanding of the evolution of the microstructure. This ICME approach is then used to predict the variation in the thickness of the Zircaloy-2 barrier as a function of the degree of homogenization and to analyze the carbide distribution, which can affect the recrystallization, hardness, and fracture properties of U-10Mo in subsequent processes.

  14. Kinetic barriers in the isomerization of substituted ureas: implications for computer-aided drug design.

    PubMed

    Loeffler, Johannes R; Ehmki, Emanuel S R; Fuchs, Julian E; Liedl, Klaus R

    2016-05-01

    Urea derivatives are ubiquitously found in many chemical disciplines. N,N'-substituted ureas may show different conformational preferences depending on their substitution pattern. The high energetic barrier for isomerization between the cis and trans states poses additional challenges for computational simulation techniques aiming to reproduce the biological properties of urea derivatives. Herein, we investigate the energetics of urea conformations and their interconversion using a broad spectrum of methodologies, ranging from data mining, via quantum chemistry, to molecular dynamics simulation and free energy calculations. We find that the inversion of urea conformations is inherently slow and beyond the time scale of typical simulation protocols. Therefore, extra care needs to be taken by computational chemists to work with appropriate model systems. We find that both knowledge-driven approaches and physics-based methods may guide molecular modelers towards accurate starting structures for expensive calculations, to ensure that conformations of urea derivatives are modeled as adequately as possible.

  15. Computational clustering for viral reference proteomes

    PubMed Central

    Chen, Chuming; Huang, Hongzhan; Mazumder, Raja; Natale, Darren A.; McGarvey, Peter B.; Zhang, Jian; Polson, Shawn W.; Wang, Yuqi; Wu, Cathy H.

    2016-01-01

    Motivation: The enormous number of redundant sequenced genomes has hindered efforts to analyze and functionally annotate proteins. As the taxonomy of viruses is not uniformly defined, viral proteomes pose special challenges in this regard. Grouping viruses based on the similarity of their proteins at proteome scale can normalize against potential taxonomic nomenclature anomalies. Results: We present Viral Reference Proteomes (Viral RPs), which are computed from complete virus proteomes within UniProtKB. Viral RPs based on 95, 75, 55, 35 and 15% co-membership in proteome similarity based clusters are provided. Comparison of our computational Viral RPs with UniProt’s curator-selected Reference Proteomes indicates that the two sets are consistent and complementary. Furthermore, each Viral RP represents a cluster of virus proteomes that was consistent with virus or host taxonomy. We provide BLASTP search and FTP download of Viral RP protein sequences, and a browser to facilitate the visualization of Viral RPs. Availability and implementation: http://proteininformationresource.org/rps/viruses/ Contact: chenc@udel.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153712
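    The co-membership clustering at a fixed similarity threshold can be illustrated with a greedy single-linkage pass. The Jaccard similarity over toy protein-identifier sets is a stand-in; the actual Viral RP pipeline computes proteome-scale similarity differently:

```python
def cluster_by_similarity(items, sim, threshold):
    """Greedy single-linkage clustering: each item joins the first existing
    cluster containing a member whose similarity meets the threshold,
    otherwise it starts a new cluster. Illustrative only."""
    clusters = []
    for item in items:
        for cluster in clusters:
            if any(sim(item, member) >= threshold for member in cluster):
                cluster.append(item)
                break
        else:
            clusters.append([item])
    return clusters

# Toy "proteomes" as sets of protein identifiers; similarity = Jaccard index
def jaccard(a, b):
    return len(a & b) / len(a | b)

proteomes = [{"p1", "p2", "p3"}, {"p1", "p2", "p4"}, {"p7", "p8"}]
print(cluster_by_similarity(proteomes, jaccard, 0.35))  # two clusters
```

Raising the threshold toward 0.95 fragments the clustering into many near-identical groups, which is the intuition behind the 95/75/55/35/15% Viral RP levels.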

  16. Recycling potential of neodymium: the case of computer hard disk drives.

    PubMed

    Sprecher, Benjamin; Kleijn, Rene; Kramer, Gert Jan

    2014-08-19

    Neodymium, one of the more critically scarce rare earth metals, is often used in sustainable technologies. In this study, we investigate the potential contribution of neodymium recycling to reducing scarcity in supply, with a case study on computer hard disk drives (HDDs). We first review the literature on neodymium production and recycling potential. From this review, we find that recycling of computer HDDs is currently the most feasible pathway toward large-scale recycling of neodymium, even though HDDs do not represent the largest application of neodymium. We then use a combination of dynamic modeling and empirical experiments to conclude that within the application of NdFeB magnets for HDDs, the potential for loop-closing is significant: up to 57% in 2017. However, compared to the total NdFeB production capacity, the recovery potential from HDDs is relatively small (in the 1-3% range). The distributed nature of neodymium poses a significant challenge for recycling of neodymium.

  17. Fast human pose estimation using 3D Zernike descriptors

    NASA Astrophysics Data System (ADS)

    Berjón, Daniel; Morán, Francisco

    2012-03-01

    Markerless video-based human pose estimation algorithms face a high-dimensional problem that is frequently broken down into several lower-dimensional ones by estimating the pose of each limb separately. However, in order to do so they need to reliably locate the torso, for which they typically rely on time coherence and tracking algorithms. Their losing track usually results in catastrophic failure of the process, requiring human intervention and thus precluding their usage in real-time applications. We propose a very fast rough pose estimation scheme based on global shape descriptors built on 3D Zernike moments. Using an articulated model that we configure in many poses, a large database of descriptor/pose pairs can be computed off-line. Thus, the only steps that must be done on-line are the extraction of the descriptors for each input volume and a search against the database to get the most likely poses. While the result of such process is not a fine pose estimation, it can be useful to help more sophisticated algorithms to regain track or make more educated guesses when creating new particles in particle-filter-based tracking schemes. We have achieved a performance of about ten fps on a single computer using a database of about one million entries.
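    The off-line database / on-line lookup scheme can be sketched as a nearest-neighbor search over descriptor vectors. The random 32-D vectors below stand in for 3D Zernike moment descriptors of the articulated model; the dimensionality and distance metric are assumptions:

```python
import numpy as np

# Off-line: descriptor/pose pairs computed from an articulated model in
# many configurations (random vectors stand in for 3D Zernike moments)
rng = np.random.default_rng(1)
descriptors = rng.standard_normal((1000, 32))  # hypothetical 32-D descriptors
poses = np.arange(1000)                        # one pose label per entry

def most_likely_poses(query, descriptors, poses, k=5):
    """On-line: rank database entries by Euclidean distance to the query
    descriptor and return the k closest pose labels."""
    d = np.linalg.norm(descriptors - query, axis=1)
    return poses[np.argsort(d)[:k]]

# A query volume whose descriptor nearly matches database entry 42
query = descriptors[42] + 0.01 * rng.standard_normal(32)
print(most_likely_poses(query, descriptors, poses))  # entry 42 ranks first
```

Only the descriptor extraction and this ranked lookup run on-line, which is what keeps the rough estimate fast enough for track recovery.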

  18. Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr

    2010-03-24

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.

  19. A component-based software environment for visualizing large macromolecular assemblies.

    PubMed

    Sanner, Michel F

    2005-03-01

    The interactive visualization of large biological assemblies poses a number of challenging problems, including the development of multiresolution representations and new interaction methods for navigating and analyzing these complex systems. An additional challenge is the development of flexible software environments that will facilitate the integration and interoperation of computational models and techniques from a wide variety of scientific disciplines. In this paper, we present a component-based software development strategy centered on the high-level, object-oriented, interpretive programming language: Python. We present several software components, discuss their integration, and describe some of their features that are relevant to the visualization of large molecular assemblies. Several examples are given to illustrate the interoperation of these software components and the integration of structural data from a variety of experimental sources. These examples illustrate how combining visual programming with component-based software development facilitates the rapid prototyping of novel visualization tools.

  20. Psychophysics and Neuronal Bases of Sound Localization in Humans

    PubMed Central

    Ahveninen, Jyrki; Kopco, Norbert; Jääskeläinen, Iiro P.

    2013-01-01

    Localization of sound sources is a considerable computational challenge for the human brain. Whereas the visual system can process basic spatial information in parallel, the auditory system lacks a straightforward correspondence between external spatial locations and sensory receptive fields. Consequently, the question how different acoustic features supporting spatial hearing are represented in the central nervous system is still open. Functional neuroimaging studies in humans have provided evidence for a posterior auditory “where” pathway that encompasses non-primary auditory cortex areas, including the planum temporale (PT) and posterior superior temporal gyrus (STG), which are strongly activated by horizontal sound direction changes, distance changes, and movement. However, these areas are also activated by a wide variety of other stimulus features, posing a challenge for the interpretation that the underlying areas are purely spatial. This review discusses behavioral and neuroimaging studies on sound localization, and some of the competing models of representation of auditory space in humans. PMID:23886698

  1. Serious Games for Health: Features, Challenges, Next Steps.

    PubMed

    Blumberg, Fran C.; Burke, Lauren C. (Moderators); Hodent, Celia; Evans, Michael A.; Lane, H. Chad; Schell, Jesse (Participants)

    2014-10-01

    As articles in this journal have demonstrated over the past 3 years, serious game development continues to flourish as a vehicle for formal and informal health education. How best to characterize a "serious" game remains somewhat elusive in the literature. Many researchers and practitioners view serious games as capitalizing on computer technology and state-of-the-art video graphics as an enjoyable means by which to provide and promote instruction and training, or to facilitate attitude change among its players. We invited four distinguished researchers and practitioners to further discuss with us how they view the characteristics of serious games for health, how those characteristics differ from those for academic purposes, the challenges posed for serious game development among players of different ages, and next steps for the development and empirical examination of the effectiveness of serious games for players' psychological and physical well-being.

  2. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    NASA Astrophysics Data System (ADS)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data also pose a grand challenge to climatologists, who must manage and analyze them efficiently. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multiple-dimensional, array-based datasets in various geoscience domains.

  3. Efficient computation of the phylogenetic likelihood function on multi-gene alignments and multi-core architectures.

    PubMed

    Stamatakis, Alexandros; Ott, Michael

    2008-12-27

    The continuous accumulation of sequence data, for example, due to novel wet-laboratory techniques such as pyrosequencing, coupled with the increasing popularity of multi-gene phylogenies and emerging multi-core processor architectures that face problems of cache congestion, poses new challenges with respect to the efficient computation of the phylogenetic maximum-likelihood (ML) function. Here, we propose two approaches that can significantly speed up likelihood computations that typically represent over 95 per cent of the computational effort conducted by current ML or Bayesian inference programs. Initially, we present a method and an appropriate data structure to efficiently compute the likelihood score on 'gappy' multi-gene alignments. By 'gappy' we denote sampling-induced gaps owing to missing sequences in individual genes (partitions), i.e. not real alignment gaps. A first proof-of-concept implementation in RAxML indicates that this approach can accelerate inferences on large and gappy alignments by approximately one order of magnitude. Moreover, we present insights and initial performance results on multi-core architectures obtained during the transition from an OpenMP-based to a Pthreads-based fine-grained parallelization of the ML function.
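    The saving from treating sampling-induced gaps as truly missing data, rather than padding every gene with alignment gaps, can be illustrated by counting the per-taxon, per-site likelihood cells each strategy must evaluate (a bookkeeping sketch, not RAxML's actual data structure):

```python
def likelihood_work(partitions):
    """Count per-taxon, per-site likelihood evaluations when partitions are
    restricted to the taxa that have data for them ('gappy'), versus naively
    padding every gene so that all taxa appear in all partitions ('naive')."""
    all_taxa = set()
    for part in partitions:
        all_taxa |= set(part["taxa"])
    gappy = sum(len(part["taxa"]) * part["n_sites"] for part in partitions)
    naive = sum(len(all_taxa) * part["n_sites"] for part in partitions)
    return gappy, naive

# Two genes sampled for mostly different taxa (sampling-induced gaps)
parts = [
    {"taxa": {"A", "B"}, "n_sites": 100},
    {"taxa": {"C", "D", "E"}, "n_sites": 200},
]
print(likelihood_work(parts))  # (800, 1500): the gappy scheme does less work
```

The sparser the gene sampling across taxa, the larger the gap between the two counts, which is where the reported order-of-magnitude speedup comes from.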

  4. On the design of computer-based models for integrated environmental science.

    PubMed

    McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick

    2005-06-01

    The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.

  5. Vision-guided gripping of a cylinder

    NASA Technical Reports Server (NTRS)

    Nicewarner, Keith E.; Kelley, Robert B.

    1991-01-01

    The motivation for vision-guided servoing is taken from tasks in automated or telerobotic space assembly and construction. Vision-guided servoing requires the ability to perform rapid pose estimates and provide predictive feature tracking. Monocular information from a gripper-mounted camera is used to servo the gripper to grasp a cylinder. The procedure is divided into recognition and servo phases. The recognition stage verifies the presence of a cylinder in the camera field of view. Then an initial pose estimate is computed and uncluttered scan regions are selected. The servo phase processes only the selected scan regions of the image. Given the knowledge, from the recognition phase, that there is a cylinder in the image and knowing the radius of the cylinder, 4 of the 6 pose parameters can be estimated with minimal computation. The relative motion of the cylinder is obtained by using the current pose and prior pose estimates. The motion information is then used to generate a predictive feature-based trajectory for the path of the gripper.

  6. A Probabilistic Risk Assessment of Groundwater-Related Risks at Excavation Sites

    NASA Astrophysics Data System (ADS)

    Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Tartakovsky, D. M.; Bolster, D.

    2010-12-01

    Excavation sites such as those associated with the construction of subway lines, railways and highway tunnels are hazardous places, posing risks to workers, machinery and surrounding buildings. Many of these risks can be groundwater related. In this work we develop a general framework based on a probabilistic risk assessment (PRA) to quantify such risks. This approach is compatible with standard PRA practices and it employs many well-developed risk analysis tools, such as fault trees. The novelty and computational challenges of the proposed approach stem from the reliance on stochastic differential equations, rather than reliability databases, to compute the probabilities of basic events. The general framework is applied to a specific case study in Spain. It is used to estimate and minimize risks for a potential construction site of an underground station for the new subway line in the Barcelona metropolitan area.
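    The reliance on stochastic models, rather than reliability databases, for basic-event probabilities can be sketched with a Monte Carlo evaluation of a small fault tree. The tree layout, event names, and probability ranges below are illustrative, not taken from the Barcelona case study:

```python
import random

def top_event_probability(n_samples, sample_basic_events, seed=0):
    """Monte Carlo estimate of a top-event probability for a toy fault tree:
    top = (flooding AND pump_failure) OR soil_instability. Basic events are
    drawn per realization from a stochastic-model stand-in rather than
    looked up in a reliability database."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        flooding, pump_failure, soil = sample_basic_events(rng)
        if (flooding and pump_failure) or soil:
            hits += 1
    return hits / n_samples

def sample_events(rng):
    # Hypothetical per-realization flooding probability with parameter
    # uncertainty, standing in for a stochastic groundwater model output
    p_flood = rng.uniform(0.1, 0.3)
    return (rng.random() < p_flood, rng.random() < 0.05, rng.random() < 0.01)

p = top_event_probability(100_000, sample_events)
print(round(p, 3))  # close to 0.2 * 0.05 + 0.01
```

In the PRA framework of the abstract, each sampled realization would come from solving stochastic differential equations for the groundwater system.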

  7. 3DRISM-HI-D2MSA: an improved analytic theory to compute solvent structure around hydrophobic solutes with proper treatment of solute–solvent electrostatic interactions

    NASA Astrophysics Data System (ADS)

    Cao, Siqin; Zhu, Lizhe; Huang, Xuhui

    2018-04-01

    The 3D reference interaction site model (3DRISM) is a powerful tool to study the thermodynamic and structural properties of liquids. However, for hydrophobic solutes, the inhomogeneity of the solvent density around them poses a great challenge to the 3DRISM theory. To address this issue, we have previously introduced the hydrophobic-induced density inhomogeneity theory (HI) for purely hydrophobic solutes. To further consider the complex hydrophobic solutes containing partial charges, here we propose the D2MSA closure to incorporate the short-range and long-range interactions with the D2 closure and the mean spherical approximation, respectively. We demonstrate that our new theory can compute the solvent distributions around real hydrophobic solutes in water and complex organic solvents that agree well with the explicit solvent molecular dynamics simulations.

  8. Simulation and optimization of an experimental membrane wastewater treatment plant using computational intelligence methods.

    PubMed

    Ludwig, T; Kern, P; Bongards, M; Wolf, C

    2011-01-01

    The optimization of relaxation and filtration times of submerged microfiltration flat modules in membrane bioreactors used for municipal wastewater treatment is essential for efficient plant operation. However, the optimization and control of such plants and their filtration processes is a challenging problem due to the underlying highly nonlinear and complex processes. This paper presents the use of genetic algorithms for this optimization problem in conjunction with a fully calibrated simulation model, as computational intelligence methods are perfectly suited to the nonconvex, multi-objective nature of the optimization problems posed by these complex systems. The simulation model is developed and calibrated using membrane modules from the wastewater simulation software GPS-X based on the Activated Sludge Model No. 1 (ASM1). Simulation results have been validated at a technical reference plant. They clearly show that filtration process costs for cleaning and energy can be reduced significantly by intelligent process optimization.
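    The genetic-algorithm optimization loop can be sketched as follows. The quadratic cost function is a smooth surrogate standing in for the calibrated GPS-X/ASM1 simulation model, and all operator settings (population size, mutation scale) are assumptions:

```python
import random

def genetic_optimize(cost, bounds, pop_size=30, generations=60, seed=0):
    """Minimal real-valued genetic algorithm: keep the best half as elites,
    refill the population with midpoint crossover plus Gaussian mutation,
    clipped to the variable bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    def clip(x, i):
        lo, hi = bounds[i]
        return min(max(x, lo), hi)
    pop = [[rng.uniform(*bounds[i]) for i in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=cost)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            children.append([clip((a[i] + b[i]) / 2 + rng.gauss(0, 0.3), i)
                             for i in range(dim)])
        pop = elite + children
    return min(pop, key=cost)

# Surrogate cost over (filtration time, relaxation time), hypothetical
# optimum at 8.0 and 2.0 minutes
def cost(x):
    filt, relax = x
    return (filt - 8.0) ** 2 + (relax - 2.0) ** 2

best = genetic_optimize(cost, [(1.0, 15.0), (0.5, 5.0)])
print([round(v, 1) for v in best])  # near [8.0, 2.0]
```

With the real simulator in place of `cost`, each evaluation is a full plant simulation, which is why the nonconvex search benefits from a population-based method.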

  9. RACORO Extended-Term Aircraft Observations of Boundary-Layer Clouds

    NASA Technical Reports Server (NTRS)

    Vogelmann, Andrew M.; McFarquhar, Greg M.; Ogren, John A.; Turner, David D.; Comstock, Jennifer M.; Feingold, Graham; Long, Charles N.; Jonsson, Haflidi H.; Bucholtz, Anthony; Collins, Don R.

    2012-01-01

    Small boundary-layer clouds are ubiquitous over many parts of the globe and strongly influence the Earth's radiative energy balance. However, our understanding of these clouds is insufficient to solve pressing scientific problems. For example, cloud feedback represents the largest uncertainty amongst all climate feedbacks in general circulation models (GCMs). Several issues complicate understanding boundary-layer clouds and simulating them in GCMs. The high spatial variability of boundary-layer clouds poses an enormous computational challenge, since their horizontal dimensions and internal variability occur at spatial scales much finer than the computational grids used in GCMs. Aerosol-cloud interactions further complicate boundary-layer cloud measurement and simulation. Additionally, aerosols influence processes such as precipitation and cloud lifetime. An added complication is that at small scales (on the order of meters to tens of meters) distinguishing cloud from aerosol is increasingly difficult, due to the effects of aerosol humidification, cloud fragments and photon scattering between clouds.

  10. Real-time Accurate Surface Reconstruction Pipeline for Vision Guided Planetary Exploration Using Unmanned Ground and Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Almeida, Eduardo DeBrito

    2012-01-01

    This report discusses work completed over the summer at the Jet Propulsion Laboratory (JPL), California Institute of Technology. A system is presented to guide ground or aerial unmanned robots using computer vision. The system performs accurate camera calibration, camera pose refinement and surface extraction from images collected by a camera mounted on the vehicle. The application motivating the research is planetary exploration and the vehicles are typically rovers or unmanned aerial vehicles. The information extracted from imagery is used primarily for navigation, as robot location is the same as the camera location and the surfaces represent the terrain that rovers traverse. The processed information must be very accurate and acquired very fast in order to be useful in practice. The main challenge being addressed by this project is to achieve high estimation accuracy and high computation speed simultaneously, a difficult task due to many technical reasons.

  11. Secure Enclaves: An Isolation-centric Approach for Creating Secure High Performance Computing Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aderholdt, Ferrol; Caldwell, Blake A.; Hicks, Susan Elaine

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details current state-of-the-art in virtualization, reconfigurable network enclaving via Software Defined Networking (SDN), and storage architectures and bridging techniques for creating secure enclaves in HPC environments.

  12. Learning gestures for customizable human-computer interaction in the operating room.

    PubMed

    Schwarz, Loren Arthur; Bigdelou, Ali; Navab, Nassir

    2011-01-01

    Interaction with computer-based medical devices in the operating room is often challenging for surgeons due to sterility requirements and the complexity of interventional procedures. Typical solutions, such as delegating the interaction task to an assistant, can be inefficient. We propose a method for gesture-based interaction in the operating room that surgeons can customize to personal requirements and interventional workflow. Given training examples for each desired gesture, our system learns low-dimensional manifold models that enable recognizing gestures and tracking particular poses for fine-grained control. By capturing the surgeon's movements with a few wireless body-worn inertial sensors, we avoid issues of camera-based systems, such as sensitivity to illumination and occlusions. Using a component-based framework implementation, our method can easily be connected to different medical devices. Our experiments show that the approach is able to robustly recognize learned gestures and to distinguish these from other movements.

  13. Deep brain stimulation with a pre-existing cochlear implant: Surgical technique and outcome.

    PubMed

    Eddelman, Daniel; Wewel, Joshua; Wiet, R Mark; Metman, Leo V; Sani, Sepehr

    2017-01-01

    Patients with previously implanted cranial devices pose a special challenge in deep brain stimulation (DBS) surgery. We report the implantation of bilateral DBS leads in a patient with a cochlear implant. Technical nuances and long-term interdevice functionality are presented. A 70-year-old patient with advancing Parkinson's disease and a previously placed cochlear implant for sensorineural hearing loss was referred for placement of bilateral DBS in the subthalamic nucleus (STN). Prior to DBS, the patient underwent surgical removal of the subgaleal cochlear magnet, followed by stereotactic MRI, frame placement, stereotactic computed tomography (CT), and merging of imaging studies. This technique allowed for successful computational merging, MRI-guided targeting, and lead implantation with acceptable accuracy. Formal testing and programming of both devices were successful, without electrical interference. Successful DBS implantation with high-resolution MRI-guided targeting is technically feasible in patients with previously implanted cochlear implants, provided proper precautions are followed.

  14. Autonomous facial recognition system inspired by human visual system based logarithmical image visualization technique

    NASA Astrophysics Data System (ADS)

    Wan, Qianwen; Panetta, Karen; Agaian, Sos

    2017-05-01

    Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel, robust autonomous facial recognition system inspired by the human visual system and based on a so-called logarithmical image visualization technique. In this paper, the proposed method, for the first time, couples the logarithmical image visualization technique with local binary patterns to perform discriminative feature extraction for a facial recognition system. The Yale database, the Yale-B database, and the AT&T database are used to test accuracy and efficiency in computer simulations. The extensive computer simulations demonstrate the method's efficiency, accuracy, and robustness to illumination variation.

  15. The Importance of Proving the Null

    PubMed Central

    Gallistel, C. R.

    2010-01-01

    Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? PMID:19348549
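    The sensitivity analysis can be sketched for the "are two means the same?" case with a normal sample mean, a point null, and a uniform alternative on [-L, L]. The grid integration and the example numbers are illustrative, not Gallistel's exact procedure:

```python
import math

def bayes_factor_null(xbar, n, sigma, max_effect, grid=2000):
    """Bayes factor for H0: mu = 0 against an alternative with mu uniform
    on [-max_effect, max_effect], for a normal sample mean with known sigma.
    Values above 1 favor the null; the vaguer the alternative (larger
    max_effect), the more the null is favored."""
    se = sigma / math.sqrt(n)
    def lik(mu):
        return math.exp(-0.5 * ((xbar - mu) / se) ** 2)
    step = 2.0 * max_effect / grid
    # Average likelihood under the uniform prior (midpoint rule)
    alt = sum(lik(-max_effect + (i + 0.5) * step) for i in range(grid)) / grid
    return lik(0.0) / alt

# Odds on the null as a function of the assumed limit on the effect size:
# a small observed effect (xbar = 0.05, n = 25, sigma = 1) favors the null
# more and more as the alternative is allowed to be vaguer
for L in (0.5, 1.0, 2.0, 4.0):
    print(L, round(bayes_factor_null(xbar=0.05, n=25, sigma=1.0, max_effect=L), 2))
```

Plotting the odds against L is the graphic representation the abstract refers to: if the curve stays above 1 as L shrinks toward 0, the data favor the null over any vaguer alternative.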

  16. Rigid-Docking Approaches to Explore Protein-Protein Interaction Space.

    PubMed

    Matsuzaki, Yuri; Uchikoga, Nobuyuki; Ohue, Masahito; Akiyama, Yutaka

    Protein-protein interactions play core roles in living cells, especially in regulatory systems. As information on proteins has rapidly accumulated in publicly available databases, much effort has been made to obtain a better picture of protein-protein interaction networks using protein tertiary structure data. Predicting relevant interacting partners from their tertiary structures is a challenging task, and computer science methods have the potential to assist with this. Protein-protein rigid docking has been utilized by several projects. Docking-based approaches have two advantages: they can suggest binding poses of predicted binding partners, which helps in understanding interaction mechanisms, and comparing docking results of binders and non-binders can lead to understanding the specificity of protein-protein interactions from a structural viewpoint. In this review, we focus on explaining current computational prediction methods for pairwise direct protein-protein interactions that form protein complexes.

  17. A Multi-Sensorial Simultaneous Localization and Mapping (SLAM) System for Low-Cost Micro Aerial Vehicles in GPS-Denied Environments

    PubMed Central

    López, Elena; García, Sergio; Barea, Rafael; Bergasa, Luis M.; Molinos, Eduardo J.; Arroyo, Roberto; Romera, Eduardo; Pardo, Samuel

    2017-01-01

    One of the main challenges of aerial robots navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot, by integrating different state-of-the art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is supposed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. It allows to improve the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by solving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can be easily incorporated to the SLAM system, obtaining a local 2.5D map and a footprint estimation of the robot position that improves the 6D pose estimation through the EKF. We present some experimental results with two different commercial platforms, and validate the system by applying it to their position control. PMID:28397758
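    The scale-ambiguity correction that the altimeter enables can be sketched with a scalar Kalman filter on the unknown monocular-SLAM scale s, with measurement model z_alt = s * z_slam + noise. This toy filter stands in for the full EKF, which also carries pose and inertial states:

```python
import random

def estimate_scale(z_slam, z_alt, s0=1.0, p0=1.0, r=0.04):
    """Scalar Kalman filter on the scale s relating unscaled monocular-SLAM
    altitude to metric altimeter readings (z_alt = s * z_slam + noise)."""
    s, p = s0, p0
    for zs, za in zip(z_slam, z_alt):
        h = zs                       # measurement Jacobian d(z_alt)/d(s)
        k = p * h / (h * h * p + r)  # Kalman gain
        s += k * (za - s * zs)       # innovation update
        p *= (1 - k * h)
    return s

# Synthetic climb: SLAM reports altitude up to an unknown scale of 2.5,
# the altimeter reports metric altitude with noise
rng = random.Random(3)
true_scale = 2.5
z_slam = [0.5 + 0.1 * i for i in range(50)]
z_alt = [true_scale * z + rng.gauss(0, 0.2) for z in z_slam]
print(round(estimate_scale(z_slam, z_alt), 2))  # close to 2.5
```

Once the scale converges, the monocular map and trajectory can be reported in meters and fused with the laser footprint estimate in the same EKF.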

  18. Computer-aided biochemical programming of synthetic microreactors as diagnostic devices.

    PubMed

    Courbet, Alexis; Amar, Patrick; Fages, François; Renard, Eric; Molina, Franck

    2018-04-26

    Biological systems have evolved efficient sensing and decision-making mechanisms to maximize fitness in changing molecular environments. Synthetic biologists have exploited these capabilities to engineer control on information and energy processing in living cells. While engineered organisms pose important technological and ethical challenges, de novo assembly of non-living biomolecular devices could offer promising avenues toward various real-world applications. However, assembling biochemical parts into functional information processing systems has remained challenging due to extensive multidimensional parameter spaces that must be sampled comprehensively in order to identify robust, specification-compliant molecular implementations. We introduce a systematic methodology based on automated computational design and microfluidics enabling the programming of synthetic cell-like microreactors embedding biochemical logic circuits, or protosensors, to perform accurate biosensing and biocomputing operations in vitro according to temporal logic specifications. We show that proof-of-concept protosensors integrating diagnostic algorithms detect specific patterns of biomarkers in human clinical samples. Protosensors may enable novel approaches to medicine and represent a step toward autonomous micromachines capable of precisely interfacing with human physiology or other complex biological environments, ecosystems, or industrial bioprocesses. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.

  19. Role of Laboratory Plasma Experiments in exploring the Physics of Solar Eruptions

    NASA Astrophysics Data System (ADS)

    Tripathi, S.

    2017-12-01

    Solar eruptive events are triggered over a broad range of spatio-temporal scales by a variety of fundamental processes (e.g., force imbalance, magnetic reconnection, electrical-current-driven instabilities) associated with arched magnetoplasma structures in the solar atmosphere. Contemporary research on solar eruptive events is at the forefront of solar and heliospheric physics due to its relevance to space weather. Details on the formation of magnetized plasma structures on the Sun, the storage of magnetic energy in such structures over long periods (several Alfven transit times), and their impulsive eruptions have been recorded in numerous observations and simulated in computer models. The inherent limitations of space observations and the uncontrolled nature of solar eruptions pose significant challenges in testing theoretical models and developing predictive capability for space weather. The pace of scientific progress in this area can be significantly boosted by tapping the potential of appropriately scaled laboratory plasma experiments to complement solar observations, theoretical models, and computer simulations. As an example, recent results from a laboratory plasma experiment on arched magnetic flux ropes will be presented and future challenges will be discussed. (Work supported by National Science Foundation, USA under award number 1619551)

  20. Scattering and radiative properties of complex soot and soot-containing particles

    NASA Astrophysics Data System (ADS)

    Liu, L.; Mishchenko, M. I.; Mackowski, D. W.; Dlugach, J.

    2012-12-01

    Tropospheric soot and soot-containing aerosols often exhibit nonspherical overall shapes and complex morphologies. They can mix externally, semi-externally, and internally with other aerosol species. This poses a tremendous challenge for particle characterization, remote sensing, and global climate modeling studies. To address these challenges, we used the new numerically exact public-domain Fortran-90 code based on the superposition T-matrix method (STMM) and other theoretical models to analyze the potential effects of aggregation and heterogeneity on light scattering and absorption by morphologically complex soot-containing particles. The parameters we computed include the full set of scattering matrix elements, linear depolarization ratios, optical cross-sections, asymmetry parameters, and single-scattering albedos. It is shown that the optical characteristics of soot and soot-containing aerosols depend strongly on particle size, composition, and overall shape. The soot particle configurations and heterogeneities can have a substantial effect, resulting in a significant enhancement of extinction and absorption relative to values computed from the Lorenz-Mie theory. Meanwhile, the modeled information, combined with in-situ and remotely sensed data, can be used to constrain soot particle shapes and sizes, which are much needed in climate models.

  1. Orion EFT-1 Cavity Heating Tile Experiments and Environment Reconstruction

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Amar, Adam; Oliver, Brandon; Hyatt, Andrew; Rezin, Marc

    2016-01-01

    Developing aerothermodynamic environments for deep cavities, such as those produced by micrometeoroids and orbital debris impacts, poses a great challenge for engineers. In order to assess existing cavity heating models, two one-inch diameter cavities were flown on the Orion Multi-Purpose Crew Vehicle during Exploration Flight Test 1 (EFT1). These cavities were manufactured with depths of 1.0 in and 1.4 in, and they were both instrumented. Instrumentation included surface thermocouples upstream, downstream and within the cavities, and additional thermocouples at the TPS-structure interface. This paper will present the data obtained, and comparisons with computational predictions will be shown. Additionally, the development of a 3D material thermal model will be described, which will be used to account for the three-dimensionality of the problem when interpreting the data. Furthermore, using a multi-dimensional inverse heat conduction approach, a reconstruction of a time- and space-dependent flight heating distribution during EFT1 will be presented. Additional discussions will focus on instrumentation challenges and calibration techniques specific to these experiments. The analysis shown will highlight the accuracies and/or deficiencies of current computational techniques to model cavity flows during hypersonic re-entry.

  2. Testing Scientific Software: A Systematic Literature Review.

    PubMed

    Kanewala, Upulee; Bieman, James M

    2014-10-01

    Scientific software plays an important role in critical decision making, for example in making weather predictions based on climate models and in computing evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify the specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that the challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges, along with their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider the special challenges posed by scientific software, such as oracle problems, when developing testing techniques.
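
    One technique from this literature for easing the oracle problem is metamorphic testing: instead of comparing an output against a known ground truth, tests check relations that must hold between related outputs. A minimal sketch, with an illustrative numerical routine and tolerances standing in for scientific code whose exact answer is hard to know in advance:

```python
import math

def simpson_integrate(f, a, b, n=100):
    """Composite Simpson's rule; stands in for scientific code whose exact
    output is hard to know in advance (the 'oracle problem')."""
    if n % 2:
        n += 1
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += f(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3.0

def check_additivity(f, a, b, tol=1e-8):
    """Metamorphic relation: integral over [a, b] = integral over [a, m] + [m, b]."""
    m = 0.5 * (a + b)
    return abs(simpson_integrate(f, a, b)
               - (simpson_integrate(f, a, m) + simpson_integrate(f, m, b))) < tol

def check_linearity(f, a, b, c=3.0, tol=1e-8):
    """Metamorphic relation: scaling the integrand by c scales the integral by c."""
    return abs(simpson_integrate(lambda x: c * f(x), a, b)
               - c * simpson_integrate(f, a, b)) < tol
```

    Neither check needs the true value of the integral, which is what makes the approach attractive when no oracle exists.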

  3. Dual Quaternions as Constraints in 4D-DPM Models for Pose Estimation.

    PubMed

    Martinez-Berti, Enrique; Sánchez-Salmerón, Antonio-José; Ricolfe-Viala, Carlos

    2017-08-19

    The goal of this research work is to improve the accuracy of human pose estimation using the Deformation Part Model (DPM) without increasing computational complexity. First, the proposed method seeks to improve pose estimation accuracy by adding the depth channel to the DPM, which was formerly defined only on red-green-blue (RGB) channels, in order to obtain a four-dimensional DPM (4D-DPM). In addition, computational complexity is controlled by reducing the number of joints modeled, yielding a reduced 4D-DPM. Finally, complete solutions are obtained by recovering the omitted joints with inverse kinematics models. In this context, the main goal of this paper is to analyze the effect on pose estimation timing cost of using dual quaternions to solve the inverse kinematics.
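
    The dual-quaternion machinery referred to in the title can be sketched compactly. This is an illustrative implementation of the standard representation (dual part d = ½·t·r), not the authors' code:

```python
# Minimal dual-quaternion rigid-transform sketch. A transform with rotation
# quaternion r and translation t is packed as (r, d) with d = 0.5 * t_quat * r.

def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def dq_from_rt(r, t):
    """Build a dual quaternion from rotation quaternion r and translation t."""
    d = tuple(0.5 * c for c in qmul((0.0,) + tuple(t), r))
    return (r, d)

def dq_mul(A, B):
    """Compose transforms (B applied first): real parts multiply, dual parts distribute."""
    (ra, da), (rb, db) = A, B
    return (qmul(ra, rb),
            tuple(x + y for x, y in zip(qmul(ra, db), qmul(da, rb))))

def dq_translation(A):
    """Recover the translation: t_quat = 2 * d * conj(r)."""
    r, d = A
    conj = (r[0], -r[1], -r[2], -r[3])
    return tuple(2.0 * c for c in qmul(d, conj)[1:])

# Two pure translations compose additively.
A = dq_from_rt((1.0, 0.0, 0.0, 0.0), (1.0, 2.0, 3.0))
B = dq_from_rt((1.0, 0.0, 0.0, 0.0), (4.0, 5.0, 6.0))
t = dq_translation(dq_mul(A, B))
```

    Compared with 4x4 matrices, dual quaternions use eight numbers, compose cheaply, and interpolate smoothly, which is why they are attractive as kinematic constraints in pose models.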

  4. The Challenges of Career and Technical Education Concurrent Enrollment: An Administrative Perspective

    ERIC Educational Resources Information Center

    Haag, Patricia W.

    2015-01-01

    Career and technical education concurrent enrollment may pose unique challenges in programming and enrollment for program administrators, and this chapter describes the experiences and challenges of a CTE concurrent enrollment administrator.

  5. Neuroinformatics challenges to the structural, connectomic, functional and electrophysiological multimodal imaging of human traumatic brain injury

    PubMed Central

    Goh, S. Y. Matthew; Irimia, Andrei; Torgerson, Carinna M.; Horn, John D. Van

    2014-01-01

    Throughout the past few decades, the ability to treat and rehabilitate traumatic brain injury (TBI) patients has become critically reliant upon the use of neuroimaging to acquire adequate knowledge of injury-related effects upon brain function and recovery. As a result, the need for TBI neuroimaging analysis methods has increased in recent years due to the recognition that spatiotemporal computational analyses of TBI evolution are useful for capturing the effects of TBI dynamics. At the same time, however, the advent of such methods has brought about the need to analyze, manage, and integrate TBI neuroimaging data using informatically inspired approaches which can take full advantage of their large dimensionality and informational complexity. Given this perspective, we here discuss the neuroinformatics challenges for TBI neuroimaging analysis in the context of structural, connectivity, and functional paradigms. Within each of these, the availability of a wide range of neuroimaging modalities can be leveraged to fully understand the heterogeneity of TBI pathology; consequently, large-scale computer hardware resources and next-generation processing software are often required for efficient data storage, management, and analysis of TBI neuroimaging data. However, each of these paradigms poses challenges in the context of informatics such that the ability to address them is critical for augmenting current capabilities to perform neuroimaging analysis of TBI and to improve therapeutic efficacy. PMID:24616696

  6. Improved pose and affinity predictions using different protocols tailored on the basis of data availability

    NASA Astrophysics Data System (ADS)

    Prathipati, Philip; Nagao, Chioko; Ahmad, Shandar; Mizuguchi, Kenji

    2016-09-01

    The D3R 2015 grand drug design challenge provided a set of blinded challenges for evaluating the applicability of our protocols for pose and affinity prediction. In the present study, we report the application of two different strategies to the two D3R protein targets, HSP90 and MAP4K4. HSP90 is a well-studied target system with numerous co-crystal structures and SAR data. Furthermore, the D3R HSP90 test compounds showed high structural similarity to existing HSP90 inhibitors in BindingDB. Thus, we adopted an integrated docking and scoring approach involving a combination of pharmacophoric and heavy-atom similarity alignments, local minimization and quantitative structure-activity relationship modeling, resulting in reasonable predictions of pose [with root mean square deviation (RMSD) values of 1.75 Å for mean pose 1, 1.417 Å for the mean best pose and 1.85 Å for mean all poses] and affinity (ROC AUC = 0.702 at a 7.5 pIC50 cut-off and R = 0.45 for 180 compounds). The second protein, MAP4K4, represents a novel system with limited SAR and co-crystal structure data and little structural similarity of the D3R MAP4K4 test compounds to known MAP4K4 ligands. For this system, we implemented an exhaustive pose and affinity prediction protocol involving docking and scoring with the PLANTS software, which considers side-chain flexibility, together with protein-ligand fingerprint analysis to assist in pose prioritization. This protocol, however, fared poorly in pose prediction (with RMSD values of 4.346 Å for mean pose 1, 4.69 Å for the mean best pose and 4.75 Å for mean all poses) but produced reasonable affinity predictions (AUC = 0.728 at a 7.5 pIC50 cut-off and R = 0.67 for 18 compounds, ranked 1st among 80 submissions).
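
    The RMSD metric used to score pose predictions above is simple to state in code. This sketch assumes pre-aligned heavy-atom coordinates in the receptor frame (as is standard for docking poses) and ignores symmetry-equivalent atom matching, which production tools handle:

```python
import math

def rmsd(coords_a, coords_b):
    """Heavy-atom RMSD between two pre-aligned poses given as lists of (x, y, z)."""
    if len(coords_a) != len(coords_b):
        raise ValueError("poses must have the same number of atoms")
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# A two-atom pose shifted 1 Å along z has an RMSD of exactly 1 Å.
ref  = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0)]
pose = [(0.0, 0.0, 1.0), (1.5, 0.0, 1.0)]
val = rmsd(ref, pose)
```
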

  7. Large-scale 3D geoelectromagnetic modeling using parallel adaptive high-order finite element method

    DOE PAGES

    Grayver, Alexander V.; Kolev, Tzanio V.

    2015-11-01

    Here, we have investigated the use of the adaptive high-order finite-element method (FEM) for geoelectromagnetic modeling. Because high-order FEM is challenging from the numerical and computational points of view, most published finite-element studies in geoelectromagnetics use the lowest order formulation. Solution of the resulting large system of linear equations poses the main practical challenge. We have developed a fully parallel and distributed robust and scalable linear solver based on the optimal block-diagonal and auxiliary space preconditioners. The solver was found to be efficient for high finite element orders, unstructured and nonconforming locally refined meshes, a wide range of frequencies, large conductivity contrasts, and number of degrees of freedom (DoFs). Furthermore, the presented linear solver is in essence algebraic; i.e., it acts on the matrix-vector level and thus requires no information about the discretization, boundary conditions, or physical source used, making it readily efficient for a wide range of electromagnetic modeling problems. To get accurate solutions at reduced computational cost, we have also implemented goal-oriented adaptive mesh refinement. The numerical tests indicated that if highly accurate modeling results were required, the high-order FEM in combination with the goal-oriented local mesh refinement required less computational time and DoFs than the lowest order adaptive FEM.
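
    The block-diagonal and auxiliary-space preconditioners used by the authors are beyond a short sketch, but the algebraic idea they rely on — preconditioned conjugate gradients acting only through matrix-vector products — can be illustrated with a simple Jacobi (diagonal) preconditioner. This is an illustrative stand-in, not the authors' solver:

```python
def pcg(A, b, tol=1e-10, max_iter=200):
    """Jacobi-preconditioned conjugate gradients for a dense SPD matrix A."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    minv = [1.0 / A[i][i] for i in range(n)]   # diagonal (Jacobi) preconditioner
    x = [0.0] * n
    r = b[:]                                   # residual for x0 = 0
    z = [m * ri for m, ri in zip(minv, r)]
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [m * ri for m, ri in zip(minv, r)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]   # small SPD test matrix
b = [1.0, 2.0, 3.0]
x = pcg(A, b)
resid = max(abs(sum(A[i][j] * x[j] for j in range(3)) - b[i]) for i in range(3))
```

    Note that `pcg` touches A only through `matvec` and the diagonal, which is the sense in which such solvers are "in essence algebraic": swapping in a stronger preconditioner changes nothing else.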

  8. Large-scale 3D geoelectromagnetic modeling using parallel adaptive high-order finite element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grayver, Alexander V.; Kolev, Tzanio V.

    Here, we have investigated the use of the adaptive high-order finite-element method (FEM) for geoelectromagnetic modeling. Because high-order FEM is challenging from the numerical and computational points of view, most published finite-element studies in geoelectromagnetics use the lowest order formulation. Solution of the resulting large system of linear equations poses the main practical challenge. We have developed a fully parallel and distributed robust and scalable linear solver based on the optimal block-diagonal and auxiliary space preconditioners. The solver was found to be efficient for high finite element orders, unstructured and nonconforming locally refined meshes, a wide range of frequencies, large conductivity contrasts, and number of degrees of freedom (DoFs). Furthermore, the presented linear solver is in essence algebraic; i.e., it acts on the matrix-vector level and thus requires no information about the discretization, boundary conditions, or physical source used, making it readily efficient for a wide range of electromagnetic modeling problems. To get accurate solutions at reduced computational cost, we have also implemented goal-oriented adaptive mesh refinement. The numerical tests indicated that if highly accurate modeling results were required, the high-order FEM in combination with the goal-oriented local mesh refinement required less computational time and DoFs than the lowest order adaptive FEM.

  9. Pitting corrosion as a mixed system: coupled deterministic-probabilistic simulation of pit growth

    NASA Astrophysics Data System (ADS)

    Ibrahim, Israr B. M.; Fonna, S.; Pidaparti, R.

    2018-05-01

    The stochastic behavior of pitting corrosion poses a unique challenge for its computational analysis, yet it stems from the same electrochemical activity that causes general corrosion. In this paper, a framework for corrosion pit growth simulation based on the coupling of the Cellular Automaton (CA) and Boundary Element Methods (BEM) is presented. The framework assumes that pitting corrosion is controlled by electrochemical activity inside the pit cavity. The BEM provides the prediction of electrochemical activity given the geometrical data and polarization curves, while the CA is used to simulate the evolution of pit shapes based on the electrochemical activity provided by the BEM. To demonstrate the methodology, a sample case of local corrosion cells formed in pitting corrosion with varied dimensions and polarization functions is considered. Results show that certain shapes tend to grow in certain types of environments. Some pit shapes appear to pose a higher risk, either as potentially significant stress raisers or by potentially increasing the rate of corrosion under the surface. Furthermore, these pits are comparable to pit shapes commonly observed in general corrosion environments.
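
    The CA half of such a coupling can be illustrated with a toy model. Here the BEM step is replaced by an assumed dissolution rate that decays with depth, standing in for the local current density a BEM solve would supply; the grid size, rate law, and seed are all illustrative:

```python
import random

def grow_pit(steps=50, width=21, seed=1):
    """Toy cellular automaton for pit growth on a 2D metal cross-section.

    Solid cells on the pit boundary dissolve with a probability that stands
    in for the local anodic current density a BEM solve would supply
    (here: an assumed rate that decays with depth y)."""
    rng = random.Random(seed)
    solid = {(x, y) for x in range(width) for y in range(1, width)}
    pit = {(width // 2, 0)}                       # initial surface defect
    for _ in range(steps):
        boundary = {(x + dx, y + dy) for (x, y) in pit
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))} & solid
        for cell in sorted(boundary):             # sorted for reproducibility
            rate = 0.5 / (1.0 + 0.2 * cell[1])    # "current density" decays with depth
            if rng.random() < rate:
                solid.discard(cell)
                pit.add(cell)
    return pit

pit = grow_pit()
```

    In the paper's framework the per-cell rate would come from the BEM electrochemistry at each step rather than from a fixed formula, which is exactly where the deterministic and probabilistic parts couple.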

  10. Accurate estimation of human body orientation from RGB-D sensors.

    PubMed

    Liu, Wu; Zhang, Yongdong; Tang, Sheng; Tang, Jinhui; Hong, Richang; Li, Jintao

    2013-10-01

    Accurate estimation of human body orientation can significantly enhance the analysis of human behavior, which is a fundamental task in the field of computer vision. However, existing orientation estimation methods cannot handle the wide variety of body poses and appearances. In this paper, we propose an innovative RGB-D-based orientation estimation method to address these challenges. By utilizing RGB-D information, which can be acquired in real time by RGB-D sensors, our method is robust to cluttered environments, illumination changes and partial occlusions. Specifically, efficient static and motion cue extraction methods are proposed based on RGB-D superpixels to reduce the noise of depth data. Since it is hard to discriminate the full 360° of orientation using static cues or motion cues independently, we propose to utilize a dynamic Bayesian network system (DBNS) to effectively exploit the complementary nature of both static and motion cues. In order to verify our proposed method, we build an RGB-D-based human body orientation dataset that covers a wide diversity of poses and appearances. Our extensive experimental evaluations on this dataset demonstrate the effectiveness and efficiency of the proposed method.

  11. Uncertainty quantification of Antarctic contribution to sea-level rise using the fast Elementary Thermomechanical Ice Sheet (f.ETISh) model

    NASA Astrophysics Data System (ADS)

    Bulthuis, Kevin; Arnst, Maarten; Pattyn, Frank; Favier, Lionel

    2017-04-01

    Uncertainties in sea-level rise projections are mostly due to uncertainties in Antarctic ice-sheet predictions (IPCC AR5 report, 2013), because key parameters related to the current state of the Antarctic ice sheet (e.g. sub-ice-shelf melting) and future climate forcing are poorly constrained. Here, we propose to improve the predictions of Antarctic ice-sheet behaviour using new uncertainty quantification methods. As opposed to ensemble modelling (Bindschadler et al., 2013), which provides a rather limited view of input and output dispersion, new stochastic methods (Le Maître and Knio, 2010) can provide deeper insight into the impact of uncertainties on complex system behaviour. Such stochastic methods usually begin by deducing a probabilistic description of the input parameter uncertainties from the available data. The impact of these input parameter uncertainties on output quantities is then assessed by estimating the probability distribution of the outputs by means of uncertainty propagation methods such as Monte Carlo methods or stochastic expansion methods. The use of such uncertainty propagation methods in glaciology may be computationally costly because of the high computational complexity of ice-sheet models. This challenge emphasises the importance of developing reliable and computationally efficient ice-sheet models such as the f.ETISh ice-sheet model (Pattyn, 2015), a new fast thermomechanically coupled ice sheet/ice shelf model capable of handling complex and critical processes such as the marine ice-sheet instability mechanism. Here, we apply these methods to investigate the role of uncertainties in sub-ice-shelf melting, calving rates and climate projections in assessing the Antarctic contribution to sea-level rise over the next centuries using the f.ETISh model. We detail the methods and show results that provide nominal values and uncertainty bounds for future sea-level rise, reflecting the impact of the input parameter uncertainties under consideration, as well as a ranking of the input parameter uncertainties by the significance of their contribution to uncertainty in future sea-level rise. In addition, we discuss how the poorly constrained available data pose challenges that motivate our current research.
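
    The Monte Carlo propagation step mentioned above can be sketched with a toy stand-in for the ice-sheet model; the linear response function, parameter ranges, and units below are illustrative, not f.ETISh:

```python
import random

def toy_sea_level(melt_rate, calving, forcing):
    """Illustrative stand-in for an ice-sheet model: maps uncertain inputs
    (arbitrary units) to a sea-level contribution in metres."""
    return 0.05 * melt_rate + 0.02 * calving + 0.1 * forcing

def monte_carlo(n=20000, seed=42):
    """Propagate input uncertainty by brute-force sampling; returns the mean
    and a 5th-95th percentile uncertainty band of the output."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        melt = rng.uniform(0.0, 10.0)   # poorly constrained: wide uniform prior
        calv = rng.gauss(5.0, 1.0)
        forc = rng.uniform(1.0, 3.0)
        samples.append(toy_sea_level(melt, calv, forc))
    samples.sort()
    mean = sum(samples) / n
    return mean, (samples[int(0.05 * n)], samples[int(0.95 * n)])

mean, bounds = monte_carlo()
```

    The cost problem the abstract raises is visible even here: each of the n samples is a full model run, which is why a fast model like f.ETISh (or a stochastic expansion surrogate) matters.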

  12. Cockpit Accommodation Assessment of the Bell 412CF Helicopter (Evaluation du Poste de Pilotage de l’Hellcoptere Bell 412CF)

    DTIC Science & Technology

    2009-12-01

    Research, Ottawa, 1998, as issued jointly by the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council of Canada...IFR (Instrument Flight Rules) skills. Nine Bell 412CF helicopters were produced from existing CH-146 Griffons, which posed technical challenges. One such compromise was...

  13. Ferroelectric symmetry-protected multibit memory cell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baudry, Laurent; Lukyanchuk, Igor; Vinokur, Valerii M.

    Here, the tunability of electrical polarization in ferroelectrics is instrumental to their applications in information-storage devices. The existing ferroelectric memory cells are based on the two-level storage capacity with the standard binary logics. However, the latter has reached its fundamental limitations. Here we propose ferroelectric multibit cells (FMBC) utilizing the ability of multiaxial ferroelectric materials to pin the polarization at a sequence of the multistable states. Employing the catastrophe theory principles we show that these states are symmetry-protected against the information loss and thus realize novel topologically-controlled access memory (TAM). Our findings enable developing a platform for the emergent many-valued non-Boolean information technology and target challenges posed by needs of quantum and neuromorphic computing.

  14. 3D reconstruction from non-uniform point clouds via local hierarchical clustering

    NASA Astrophysics Data System (ADS)

    Yang, Jiaqi; Li, Ruibo; Xiao, Yang; Cao, Zhiguo

    2017-07-01

    Raw scanned 3D point clouds are usually irregularly distributed due to inherent shortcomings of laser sensors, which poses a great challenge for high-quality 3D surface reconstruction. This paper tackles this problem by proposing a local hierarchical clustering (LHC) method to improve the consistency of the point distribution. Specifically, LHC consists of two steps: 1) adaptive octree-based decomposition of 3D space, and 2) hierarchical clustering. The former aims at reducing the computational complexity and the latter transforms the non-uniform point set into a uniform one. Experimental results on real-world scanned point clouds validate the effectiveness of our method from both qualitative and quantitative aspects.
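
    The uniformization idea can be illustrated in miniature with a single-level voxel clustering that replaces each occupied cell by its centroid; the paper's LHC adds adaptive octree decomposition and hierarchical merging on top of this, and the cell size here is illustrative:

```python
from collections import defaultdict

def uniformize(points, cell=1.0):
    """Cluster 3D points by voxel cell and keep one centroid per occupied
    cell, evening out a non-uniform sampling density (simplified stand-in
    for local hierarchical clustering)."""
    cells = defaultdict(list)
    for p in points:
        key = tuple(int(c // cell) for c in p)   # integer voxel coordinates
        cells[key].append(p)
    return [tuple(sum(q[i] for q in pts) / len(pts) for i in range(3))
            for pts in cells.values()]

# A dense cluster near the origin plus one sparse far point: the dense
# cluster collapses to a single representative, the sparse point survives.
dense = [(0.1 * i, 0.0, 0.0) for i in range(10)]   # all within one unit cell
pts = dense + [(5.5, 5.5, 5.5)]
out = uniformize(pts)
```
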

  15. Informatics and computational strategies for the study of lipids.

    PubMed

    Yetukuri, Laxman; Ekroos, Kim; Vidal-Puig, Antonio; Oresic, Matej

    2008-02-01

    Recent advances in mass spectrometry (MS)-based techniques for lipidomic analysis have empowered us with the tools that afford studies of lipidomes at the systems level. However, these techniques pose a number of challenges for lipidomic raw data processing, lipid informatics, and the interpretation of lipidomic data in the context of lipid function and structure. Integration of lipidomic data with other systemic levels, such as genomic or proteomic, in the context of molecular pathways and biophysical processes provides a basis for the understanding of lipid function at the systems level. The present report, based on the limited literature, is an update on a young but rapidly emerging field of lipid informatics and related pathway reconstruction strategies.

  16. The economic and social context of special populations.

    PubMed

    Ashford, N A

    1999-01-01

    Changes in both technology and international trade are altering the world economy and hence are affecting the demand for and supply of labor and the nature of work and working conditions. New materials, faster and more powerful computers, electronic and mobile communications, alternative energy systems, miniaturization, robotics, and biotechnology pose new opportunities, problems, and challenges. The tremendous expansion of information-based technologies in both manufacturing and services has resulted in impressive increases in productivity and demand for new skills, but it has also brought about the displacement and de-skilling of some labor by capital, the lowering of wages, and the increase of contingent, part-time, and temporary work. Special populations, in particular, may be differentially impacted.

  17. Ferroelectric symmetry-protected multibit memory cell

    NASA Astrophysics Data System (ADS)

    Baudry, Laurent; Lukyanchuk, Igor; Vinokur, Valerii M.

    2017-02-01

    The tunability of electrical polarization in ferroelectrics is instrumental to their applications in information-storage devices. The existing ferroelectric memory cells are based on the two-level storage capacity with the standard binary logics. However, the latter has reached its fundamental limitations. Here we propose ferroelectric multibit cells (FMBC) utilizing the ability of multiaxial ferroelectric materials to pin the polarization at a sequence of the multistable states. Employing the catastrophe theory principles we show that these states are symmetry-protected against the information loss and thus realize novel topologically-controlled access memory (TAM). Our findings enable developing a platform for the emergent many-valued non-Boolean information technology and target challenges posed by needs of quantum and neuromorphic computing.

  18. Automated microscopy for high-content RNAi screening

    PubMed Central

    2010-01-01

    Fluorescence microscopy is one of the most powerful tools to investigate complex cellular processes such as cell division, cell motility, or intracellular trafficking. The availability of RNA interference (RNAi) technology and automated microscopy has opened the possibility to perform cellular imaging in functional genomics and other large-scale applications. Although imaging often dramatically increases the content of a screening assay, it poses new challenges to achieve accurate quantitative annotation and therefore needs to be carefully adjusted to the specific needs of individual screening applications. In this review, we discuss principles of assay design, large-scale RNAi, microscope automation, and computational data analysis. We highlight strategies for imaging-based RNAi screening adapted to different library and assay designs. PMID:20176920

  19. ERMHAN: A Context-Aware Service Platform to Support Continuous Care Networks for Home-Based Assistance

    PubMed Central

    Paganelli, Federica; Spinicci, Emilio; Giuli, Dino

    2008-01-01

    Continuous care models for chronic diseases pose several technology-oriented challenges for home-based continuous care, where assistance services rely on close collaboration among different stakeholders such as health operators, patient relatives, and social community members. Here we describe the Emilia Romagna Mobile Health Assistance Network (ERMHAN), a multichannel context-aware service platform designed to support care networks in cooperating and sharing information with the goal of improving patient quality of life. In order to meet extensibility and flexibility requirements, this platform has been developed through ontology-based context-aware computing and a service-oriented approach. We also provide some preliminary results of performance analysis and a user survey activity. PMID:18695739

  20. Single leg balancing in ballet: effects of shoe conditions and poses.

    PubMed

    Lobo da Costa, Paula H; Azevedo Nora, Fernanda G S; Vieira, Marcus Fraga; Bosch, Kerstin; Rosenbaum, Dieter

    2013-03-01

    The purpose of this study was to describe the effects of lower limb positioning and shoe conditions on the stability of selected single leg ballet poses performed in demi-pointe position. Fourteen female non-professional ballet dancers (mean age of 18.4±2.8 years and mean body mass index of 21.5±2.8 kg/m²) who had practiced ballet for at least seven years, without any musculoskeletal impairment, volunteered to participate in this study. A capacitive pressure platform allowed for the assessment of center of pressure (COP) variables related to the execution of three single leg ballet poses in demi-pointe position: attitude devant, attitude derrière, and attitude à la seconde. Peak pressures, contact areas, COP oscillation areas, anterior-posterior and medio-lateral COP oscillations and velocities were compared between two shoe conditions (barefoot versus slippers) and among the different poses. Barefoot performances produced more stable poses with significantly higher plantar contact areas, smaller COP oscillation areas and smaller anterior-posterior COP oscillations. COP oscillation areas, anterior-posterior COP oscillations and medio-lateral COP velocities indicated that attitude à la seconde is the least challenging and attitude derrière the most challenging pose. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Global Climate Change and the Mitigation Challenge

    EPA Science Inventory

    Book edited by Frank Princiotta titled Global Climate Change--The Technology Challenge. Transparent modeling tools and the most recent literature are used to quantify the challenge posed by climate change and potential technological remedies. The chapter examines forces driving ...

  2. Unique Challenges for Modeling Defect Dynamics in Concentrated Solid-Solution Alloys

    NASA Astrophysics Data System (ADS)

    Zhao, Shijun; Weber, William J.; Zhang, Yanwen

    2017-11-01

    Recently developed concentrated solid solution alloys (CSAs) are shown to have improved performance under irradiation that depends strongly on the number of alloying elements, alloying species, and their concentrations. In contrast to conventional dilute alloys, CSAs are composed of multiple principal elements situated randomly in a simple crystalline lattice. As a result, the intrinsic disorder has a profound influence on energy dissipation pathways and defect evolution when these CSAs are subjected to energetic particle irradiation. Extraordinary irradiation resistance, including suppression of void formation by two orders of magnitude at an elevated temperature, has been achieved with increasing compositional complexity in CSAs. Unfortunately, the loss of translational invariance associated with the intrinsic chemical disorder poses great challenges to theoretical modeling at the electronic and atomic levels. Based on recent computer simulation results for a set of novel Ni-containing, face-centered cubic CSAs, we review theoretical modeling progress in handling disorder in CSAs and underscore the impact of disorder on defect dynamics. We emphasize in particular the unique challenges associated with the description of defect dynamics in CSAs.

  3. Challenges in industrial fermentation technology research.

    PubMed

    Formenti, Luca Riccardo; Nørregaard, Anders; Bolic, Andrijana; Hernandez, Daniela Quintanilla; Hagemann, Timo; Heins, Anna-Lena; Larsson, Hilde; Mears, Lisa; Mauricio-Iglesias, Miguel; Krühne, Ulrich; Gernaey, Krist V

    2014-06-01

    Industrial fermentation processes are increasingly popular, and are considered an important technological asset for reducing our dependence on chemicals and products produced from fossil fuels. However, despite their increasing popularity, fermentation processes have not yet reached the same maturity as traditional chemical processes, particularly when it comes to using engineering tools such as mathematical models and optimization techniques. This perspective starts with a brief overview of these engineering tools. However, the main focus is on a description of some of the most important engineering challenges: scaling up and scaling down fermentation processes, the influence of morphology on broth rheology and mass transfer, and establishing novel sensors to measure and control key process parameters. The greatest emphasis is on the challenges posed by filamentous fungi, because of their wide applications as cell factories and therefore their relevance in a White Biotechnology context. Computational fluid dynamics (CFD) is introduced as a promising tool that can be used to support the scaling up and scaling down of bioreactors, and for studying mixing and the potential occurrence of gradients in a tank. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Perceiving emotion: towards a realistic understanding of the task.

    PubMed

    Cowie, Roddy

    2009-12-12

    A decade ago, perceiving emotion was generally equated with taking a sample (a still photograph or a few seconds of speech) that unquestionably signified an archetypal emotional state, and attaching the appropriate label. Computational research has shifted that paradigm in multiple ways. Concern with realism is key. Emotion generally colours ongoing action and interaction: describing that colouring is a different problem from categorizing brief episodes of relatively pure emotion. Multiple challenges flow from that. Describing emotional colouring is a challenge in itself. One approach is to use everyday categories describing states that are partly emotional and partly cognitive. Another approach is to use dimensions. Both approaches need ways to deal with gradual changes over time and mixed emotions. Attaching target descriptions to a sample poses problems of both procedure and validation. Cues are likely to be distributed both in time and across modalities, and key decisions may depend heavily on context. The usefulness of acted data is limited because it tends not to reproduce these features. By engaging with these challenging issues, research is not only achieving impressive results, but also offering a much deeper understanding of the problem.

  5. Will learning to solve one-step equations pose a challenge to 8th grade students?

    NASA Astrophysics Data System (ADS)

    Ngu, Bing Hiong; Phan, Huy P.

    2017-08-01

    Assimilating multiple interactive elements simultaneously in working memory to allow understanding to occur while solving an equation would impose a high cognitive load. Element interactivity arises from the interaction between elements within and across operational and relational lines. Moreover, operating with special features (e.g. a negative pronumeral) poses an additional challenge to mastering equation-solving skills. In an experiment, 41 8th grade students (girls = 16, boys = 25) sat for a pre-test, attended a session about equation solving, completed an acquisition phase which constituted the main intervention, and were tested again in a post-test. The results showed that at post-test, students performed better on one-step equations tapping low rather than high element interactivity knowledge. In addition, students performed better on those one-step equations that contained no special features. Thus, both the degree of element interactivity and the operation with special features affect the challenge posed to 8th grade students when learning how to solve one-step equations.

  6. Removing the center from computing: biology's new mode of digital knowledge production.

    PubMed

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's centralized character posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.

  7. Docking pose selection by interaction pattern graph similarity: application to the D3R grand challenge 2015

    NASA Astrophysics Data System (ADS)

    Slynko, Inna; Da Silva, Franck; Bret, Guillaume; Rognan, Didier

    2016-09-01

    High affinity ligands for a given target tend to share key molecular interactions with important anchoring amino acids and therefore often present quite conserved interaction patterns. This simple concept was formalized in a topological knowledge-based scoring function (GRIM) for selecting the most appropriate docking poses from previously X-rayed interaction patterns. GRIM first converts protein-ligand atomic coordinates (docking poses) into a simple 3D graph describing the corresponding interaction pattern. In a second step, the proposed graphs are compared to those found in template structures from the Protein Data Bank. Last, all docking poses are rescored according to an empirical score (GRIMscore) accounting for overlap of maximum common subgraphs. Taking advantage of the public D3R Grand Challenge 2015, GRIM was used to rescore docking poses for 36 ligands (6 HSP90α inhibitors, 30 MAP4K4 inhibitors) prior to the release of the corresponding protein-ligand X-ray structures. When applied to the HSP90α dataset, for which many protein-ligand X-ray structures are already available, GRIM provided very high quality solutions (mean rmsd = 1.06 Å, n = 6) as top-ranked poses, and significantly outperformed a state-of-the-art scoring function. In the case of MAP4K4 inhibitors, for which preexisting 3D knowledge is scarce and chemical diversity is much larger, the accuracy of GRIM poses declines (mean rmsd = 3.18 Å, n = 30), although GRIM still outperforms an energy-based scoring function. GRIM rescoring appears to be quite robust in comparison with the other approaches competing in the same challenge (42 submissions for the HSP90 dataset, 27 for the MAP4K4 dataset), as it ranked 3rd and 2nd, respectively, for the two investigated datasets. The rescoring method is quite simple to implement, independent of the docking engine, and applicable to any target for which at least one holo X-ray structure is available.
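
    The pattern-comparison idea can be caricatured with a much simpler set-overlap score: reduce each pose to (residue, interaction-type) features and rank poses by similarity to template patterns. Everything below (residue names, Jaccard overlap in place of maximum-common-subgraph matching) is a hypothetical simplification, not the published GRIM algorithm:

```python
def pattern_similarity(pose, template):
    """Jaccard similarity between two interaction-feature sets."""
    pose, template = set(pose), set(template)
    if not pose and not template:
        return 0.0
    return len(pose & template) / len(pose | template)

def rescore(poses, templates):
    """Score each candidate pose by its best match over all template
    patterns and return (score, name) pairs sorted best-first."""
    scored = [(max(pattern_similarity(p, t) for t in templates), name)
              for name, p in poses.items()]
    return sorted(scored, reverse=True)

# Hypothetical template pattern from an X-rayed complex
template = [("ASP93", "hbond"), ("PHE138", "hydrophobic"), ("LYS58", "ionic")]
poses = {
    "pose_a": [("ASP93", "hbond"), ("PHE138", "hydrophobic")],  # conserved anchors
    "pose_b": [("GLY97", "hbond")],                             # unusual pattern
}
ranking = rescore(poses, [template])
```

    The pose reproducing the conserved anchoring interactions ranks first, which is the intuition behind rescoring by interaction-pattern overlap.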

  8. Prediction of binding poses to FXR using multi-targeted docking combined with molecular dynamics and enhanced sampling

    NASA Astrophysics Data System (ADS)

    Bhakat, Soumendranath; Åberg, Emil; Söderhjelm, Pär

    2018-01-01

    Advanced molecular docking methods often aim at capturing the flexibility of the protein upon binding to the ligand. In this study, we investigate whether instead a simple rigid docking method can be applied, if combined with multiple target structures to model the backbone flexibility and molecular dynamics simulations to model the sidechain and ligand flexibility. The methods are tested for the binding of 35 ligands to FXR as part of the first stage of the Drug Design Data Resource (D3R) Grand Challenge 2 blind challenge. The results show that the multiple-target docking protocol performs surprisingly well, with correct poses found for 21 of the ligands. MD simulations started on the docked structures are remarkably stable, but show almost no tendency of refining the structure closer to the experimentally found binding pose. Reconnaissance metadynamics enhances the exploration of new binding poses, but additional collective variables involving the protein are needed to exploit the full potential of the method.
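
    Pose-prediction accuracy in such challenges is reported as RMSD to the crystallographic pose, and a pose within roughly 2 Å heavy-atom RMSD is commonly counted as correct. A minimal RMSD sketch (matched atom ordering assumed, no alignment or symmetry correction; coordinates are synthetic):

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Plain RMSD between two (N, 3) coordinate arrays with matched atom
    ordering (no superposition or symmetry correction)."""
    d = np.asarray(coords_a, float) - np.asarray(coords_b, float)
    return float(np.sqrt((d ** 2).sum(axis=1).mean()))

crystal = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.0, 0.0]])
docked = crystal + np.array([1.0, 0.0, 0.0])   # rigidly shifted by 1 Angstrom
```

    Here the shifted pose is 1.0 Å from the crystal pose, so under the common 2 Å criterion it would count as a correct prediction.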

  9. Prediction of binding poses to FXR using multi-targeted docking combined with molecular dynamics and enhanced sampling.

    PubMed

    Bhakat, Soumendranath; Åberg, Emil; Söderhjelm, Pär

    2018-01-01

    Advanced molecular docking methods often aim at capturing the flexibility of the protein upon binding to the ligand. In this study, we investigate whether instead a simple rigid docking method can be applied, if combined with multiple target structures to model the backbone flexibility and molecular dynamics simulations to model the sidechain and ligand flexibility. The methods are tested for the binding of 35 ligands to FXR as part of the first stage of the Drug Design Data Resource (D3R) Grand Challenge 2 blind challenge. The results show that the multiple-target docking protocol performs surprisingly well, with correct poses found for 21 of the ligands. MD simulations started on the docked structures are remarkably stable, but show almost no tendency of refining the structure closer to the experimentally found binding pose. Reconnaissance metadynamics enhances the exploration of new binding poses, but additional collective variables involving the protein are needed to exploit the full potential of the method.

  10. High resolution flow field prediction for tail rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Bliss, Donald B.

    1989-01-01

    The prediction of tail rotor noise due to the impingement of the main rotor wake poses a significant challenge to current analysis methods in rotorcraft aeroacoustics. This paper describes the development of a new treatment of the tail rotor aerodynamic environment that permits highly accurate resolution of the incident flow field with modest computational effort relative to alternative models. The new approach incorporates an advanced full-span free wake model of the main rotor in a scheme which reconstructs high-resolution flow solutions from preliminary, computationally inexpensive simulations with coarse resolution. The heart of the approach is a novel method for using local velocity correction terms to capture the steep velocity gradients characteristic of the vortex-dominated incident flow. Sample calculations have been undertaken to examine the principal types of interactions between the tail rotor and the main rotor wake and to examine the performance of the new method. The results of these sample problems confirm the success of this approach in capturing the high-resolution flows necessary for analysis of rotor-wake/rotor interactions with dramatically reduced computational cost. Computations of radiated sound are also carried out that explore the role of various portions of the main rotor wake in generating tail rotor noise.
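
    The incident flow in such analyses is dominated by vortex filaments, whose induced velocities follow the Biot-Savart law. As a generic illustration (not the paper's reconstruction or correction scheme), the velocity induced by a straight vortex filament segment can be sketched as:

```python
import numpy as np

def segment_induced_velocity(p, p1, p2, gamma):
    """Velocity induced at point p by a straight vortex filament of
    circulation gamma running from p1 to p2 (Biot-Savart law for a
    finite segment)."""
    p, p1, p2 = (np.asarray(v, float) for v in (p, p1, p2))
    r1, r2, r0 = p - p1, p - p2, p2 - p1
    cross = np.cross(r1, r2)
    denom = np.dot(cross, cross)
    if denom < 1e-12:          # evaluation point on the filament axis
        return np.zeros(3)
    k = gamma / (4 * np.pi * denom) * np.dot(
        r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))
    return k * cross

# A very long segment along z approximates an infinite line vortex,
# whose induced speed at distance d is gamma / (2 pi d)
v = segment_induced_velocity([1.0, 0.0, 0.0], [0, 0, -1e3], [0, 0, 1e3],
                             gamma=1.0)
```

    The steep 1/d growth of this induced velocity near the filament is exactly the kind of gradient that coarse grids miss and that local correction terms must capture.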

  11. Characterization and classification of vegetation canopy structure and distribution within the Great Smoky Mountains National Park using LiDAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Jitendra; HargroveJr., William Walter; Norman, Steven P

    Vegetation canopy structure is a critically important habitat characteristic for many threatened and endangered birds and other animal species, and it is key information needed by forest and wildlife managers for monitoring and managing forest resources, conservation planning, and fostering biodiversity. Advances in Light Detection and Ranging (LiDAR) technologies have enabled remote sensing-based studies of vegetation canopies by capturing three-dimensional structures, yielding information not available in the two-dimensional images of the landscape provided by traditional multi-spectral remote sensing platforms. However, the large-volume data sets produced by airborne LiDAR instruments pose a significant computational challenge, requiring algorithms to identify and analyze patterns of interest buried within LiDAR point clouds in a computationally efficient manner, utilizing state-of-the-art computing infrastructure. We developed and applied a computationally efficient approach to analyze a large volume of LiDAR data and to characterize and map the vegetation canopy structures for 139,859 hectares (540 sq. miles) in the Great Smoky Mountains National Park. This study helps improve our understanding of the distribution of vegetation and animal habitats in this extremely diverse ecosystem.
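
    As an illustration of the kind of per-cell canopy summary such an analysis might compute (the paper's actual algorithm is not detailed in the abstract, and the cell size and points below are hypothetical), LiDAR returns can be binned into grid cells and characterized by height percentiles:

```python
import numpy as np

def canopy_grid_metrics(points, cell=10.0):
    """Bin LiDAR returns (x, y, height-above-ground) into square cells and
    compute per-cell canopy metrics: maximum height and 95th-percentile
    height. `points` is an (N, 3) array with heights already normalized
    to ground level."""
    pts = np.asarray(points, float)
    ix = (pts[:, 0] // cell).astype(int)
    iy = (pts[:, 1] // cell).astype(int)
    metrics = {}
    for key in set(zip(ix, iy)):
        h = pts[(ix == key[0]) & (iy == key[1]), 2]
        metrics[key] = {"max_h": h.max(), "p95_h": np.percentile(h, 95)}
    return metrics

# Two cells: tall canopy around (5, 5), low vegetation around (15, 5)
pts = np.array([[5.0, 5.0, 30.0], [6.0, 4.0, 25.0], [4.0, 6.0, 2.0],
                [15.0, 5.0, 1.0], [16.0, 4.0, 0.5]])
m = canopy_grid_metrics(pts, cell=10.0)
```

    Real pipelines replace the per-cell Python loop with vectorized or distributed aggregation to cope with billions of returns.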

  12. Decentralized State Estimation and Remedial Control Action for Minimum Wind Curtailment Using Distributed Computing Platform

    DOE PAGES

    Liu, Ren; Srivastava, Anurag K.; Bakken, David E.; ...

    2017-08-17

    Intermittency of wind energy poses a great challenge for power system operation and control. Wind curtailment might be necessary at certain operating conditions to keep line flows within their limits. A Remedial Action Scheme (RAS) offers a quick control mechanism to maintain the reliability and security of power system operation under high wind energy integration. In this paper, a new RAS is developed to maximize wind energy integration without compromising the security and reliability of the power system, based on specific utility requirements. A new Distributed Linear State Estimation (DLSE) is also developed to provide fast and accurate input data for the proposed RAS. A distributed computational architecture is designed to guarantee the robustness of the cyber system supporting the RAS and DLSE implementation. The proposed RAS and DLSE are validated using the modified IEEE 118-bus system. Simulation results demonstrate the satisfactory performance of the DLSE and the effectiveness of the RAS. A real-time cyber-physical testbed has been utilized to validate the cyber-resiliency of the developed RAS against computational node failure.
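
    The DLSE formulation itself is not given in the abstract. As a generic illustration, linear state estimation is classically posed as a weighted least-squares problem over a linear measurement model; the sketch below uses a hypothetical toy system, not the modified IEEE 118-bus case:

```python
import numpy as np

def wls_state_estimate(H, z, w):
    """Weighted least-squares estimate for a linear measurement model
    z = H x + noise, with per-measurement weights w (1/variance):
    x_hat = (H^T W H)^-1 H^T W z."""
    H, z = np.asarray(H, float), np.asarray(z, float)
    W = np.diag(np.asarray(w, float))
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ z)

# Toy 2-state example: redundant measurements of x1, x2, and x1 - x2
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, -1.0]])
x_true = np.array([1.05, 0.98])   # e.g. per-unit bus quantities
z = H @ x_true                    # noiseless measurements for the sketch
x_hat = wls_state_estimate(H, z, w=[1.0, 1.0, 2.0])
```

    A distributed estimator partitions H and z by area and exchanges boundary variables, which is what makes the fast, decentralized implementation possible.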

  13. An Improved Lattice Boltzmann Model for Non-Newtonian Flows with Applications to Solid-Fluid Interactions in External Flows

    NASA Astrophysics Data System (ADS)

    Adam, Saad; Premnath, Kannan

    2016-11-01

    Fluid mechanics of non-Newtonian fluids, which arise in numerous settings, are characterized by non-linear constitutive models that pose certain unique challenges for computational methods. Here, we consider the lattice Boltzmann method (LBM), which offers some computational advantages due to its kinetic basis and its simpler stream-and-collide procedure enabling efficient simulations. However, further development is necessary to improve its numerical stability and accuracy for computations involving broader parameter ranges. Hence, in this study, we extend the cascaded LBM formulation by modifying its moment equilibria and relaxation parameters to handle a variety of non-Newtonian constitutive equations, including power-law and Bingham fluids, with improved stability. In addition, we include corrections to the moment equilibria to obtain an inertial frame invariant scheme without cubic-velocity defects. After performing a validation study for various benchmark flows, we study the physics of non-Newtonian flow over pairs of circular and square cylinders in a tandem arrangement, especially the wake structure interactions and their effects on the resulting forces on each cylinder, and elucidate the effect of the various characteristic parameters.
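
    A common ingredient of non-Newtonian LBM schemes (though not the authors' full cascaded formulation) is a local relaxation time recomputed each step from the power-law apparent viscosity. A sketch in lattice units:

```python
import numpy as np

def power_law_relaxation(shear_rate, K=1.0, n=0.5):
    """Local BGK relaxation time for a power-law fluid in lattice units
    (c_s^2 = 1/3): apparent viscosity nu = K * |shear rate|^(n-1), and
    tau = 3 nu + 0.5. n < 1 is shear-thinning, n > 1 shear-thickening."""
    gdot = np.maximum(np.abs(shear_rate), 1e-12)   # regularize zero shear
    nu = K * gdot ** (n - 1.0)
    return 3.0 * nu + 0.5

tau_low = power_law_relaxation(0.01, n=0.5)   # shear-thinning fluid is
tau_high = power_law_relaxation(1.0, n=0.5)   # effectively thicker at low shear
```

    The blow-up of tau at low shear for shear-thinning fluids (and its collapse toward 0.5 at high shear) is precisely the stability problem that motivates more robust collision models such as the cascaded scheme.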

  14. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    PubMed

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
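
    The firefly component of such a hybrid can be sketched in its basic form: each firefly moves toward every brighter (lower-objective) one with distance-decaying attractiveness, plus a shrinking random step. This is the standard algorithm with hypothetical parameter values, not the paper's hybrid with Differential Evolution:

```python
import numpy as np

def firefly_minimize(f, bounds, n=15, iters=60, beta0=1.0, gamma=1.0,
                     alpha=0.2, seed=0):
    """Minimal firefly algorithm minimizing f over a box. Attractiveness
    of a brighter firefly decays as beta0 * exp(-gamma * r^2) with
    distance r; alpha scales a decaying random walk."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n, dim))
    for it in range(iters):
        fx = np.array([f(xi) for xi in x])
        step = alpha * (1 - it / iters)          # shrinking random step
        for i in range(n):
            for j in range(n):
                if fx[j] < fx[i]:                # j is brighter: attract i
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + step * rng.normal(size=dim)
            x[i] = np.clip(x[i], lo, hi)
    fx = np.array([f(xi) for xi in x])
    return x[fx.argmin()], fx.min()

# Minimize the 2D sphere function as a stand-in for a model-fitting objective
best_x, best_f = firefly_minimize(lambda v: float(np.sum(v ** 2)),
                                  bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

    In a parameter-estimation setting, f would be the misfit between model predictions and experimental data; the hybrid in the paper additionally applies Differential Evolution-style neighbourhood moves.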

  15. An Evolutionary Firefly Algorithm for the Estimation of Nonlinear Biological Model Parameters

    PubMed Central

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N. V.

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test. PMID:23469172

  16. Verification of Electromagnetic Physics Models for Parallel Computing Architectures in the GeantV Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amadio, G.; et al.

    An intensive R&D and programming effort is required to meet the new challenges posed by future experimental high-energy particle physics (HEP) programs. The GeantV project aims to narrow the gap between the performance of the existing HEP detector simulation software and the ideal performance achievable, exploiting the latest advances in computing technology. The project has developed a particle detector simulation prototype capable of transporting particles in parallel through complex geometries, exploiting instruction-level microparallelism (SIMD and SIMT), task-level parallelism (multithreading), and high-level parallelism (MPI), leveraging both multi-core and many-core opportunities. We present preliminary verification results concerning the electromagnetic (EM) physics models developed for parallel computing architectures within the GeantV project. In order to exploit the potential of vectorization and accelerators and to make the physics models effectively parallelizable, advanced sampling techniques have been implemented and tested. In this paper we introduce a set of automated statistical tests to verify the vectorized models by checking their consistency with the corresponding Geant4 models and to validate them against experimental data.
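
    The abstract does not specify which statistical tests are used. One common way to check consistency between two simulated event-count histograms is a two-sample chi-square statistic; the sketch below uses that standard form as an assumption, with made-up bin counts:

```python
import numpy as np

def chi2_two_histograms(h1, h2):
    """Pearson chi-square statistic and degrees of freedom for comparing
    two event-count histograms with possibly different totals (standard
    two-sample form). A value near dof indicates consistency."""
    h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
    n1, n2 = h1.sum(), h2.sum()
    mask = (h1 + h2) > 0
    chi2 = np.sum(
        (np.sqrt(n2 / n1) * h1[mask] - np.sqrt(n1 / n2) * h2[mask]) ** 2
        / (h1[mask] + h2[mask]))
    return chi2, int(mask.sum() - 1)

# Identical histograms are perfectly consistent (chi2 = 0) ...
a = np.array([100, 200, 150, 50])
chi2_same, dof = chi2_two_histograms(a, a)
# ... while a shifted spectrum yields a statistic far above dof
b = np.array([50, 150, 200, 100])
chi2_diff, _ = chi2_two_histograms(a, b)
```

    In an automated verification suite, the vectorized and reference (Geant4-style) samples would be histogrammed per observable and flagged whenever the statistic exceeds a chosen p-value threshold.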

  17. Computational Models for Nanoscale Fluid Dynamics and Transport Inspired by Nonequilibrium Thermodynamics

    PubMed Central

    Radhakrishnan, Ravi; Yu, Hsiu-Yu; Eckmann, David M.; Ayyaswamy, Portonovo S.

    2017-01-01

    Traditionally, the numerical computation of particle motion in a fluid is resolved through computational fluid dynamics (CFD). However, resolving the motion of nanoparticles poses additional challenges due to the coupling between the Brownian and hydrodynamic forces. Here, we focus on the Brownian motion of a nanoparticle coupled to adhesive interactions and confining-wall-mediated hydrodynamic interactions. We discuss several techniques that are founded on the basis of combining CFD methods with the theory of nonequilibrium statistical mechanics in order to simultaneously conserve thermal equipartition and to show correct hydrodynamic correlations. These include the fluctuating hydrodynamics (FHD) method, the generalized Langevin method, the hybrid method, and the deterministic method. Through the examples discussed, we also show a top-down multiscale progression of temporal dynamics from the colloidal scales to the molecular scales, and the associated fluctuations and hydrodynamic correlations. While the motivation and the examples discussed here pertain to nanoscale fluid dynamics and mass transport, the methodologies presented are rather general and can be easily adopted to applications in convective heat transfer. PMID:28035168
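
    A minimal example of the fluctuation-dissipation bookkeeping these methods must respect is the overdamped Langevin equation for a free Brownian particle, where the noise amplitude is tied to the friction and temperature so that free diffusion reproduces the Einstein relation D = kT/gamma. A sketch (a textbook illustration, not any of the cited methods in full):

```python
import numpy as np

def overdamped_langevin(n_steps, dt, gamma, kT, n_traj=2000, seed=0):
    """Overdamped Langevin dynamics of a free 1D Brownian particle for an
    ensemble of trajectories. The noise amplitude sqrt(2 kT dt / gamma)
    is fixed by the fluctuation-dissipation theorem, so the mean-square
    displacement grows as 2 D t with D = kT / gamma."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_traj)
    for _ in range(n_steps):
        x += np.sqrt(2.0 * kT * dt / gamma) * rng.normal(size=n_traj)
    return x

kT, gamma, dt, n = 1.0, 1.0, 1e-3, 1000
x = overdamped_langevin(n, dt, gamma, kT)
msd = float(np.mean(x ** 2))
expected = 2.0 * (kT / gamma) * n * dt   # 2 D t
```

    Getting this thermal-equipartition/diffusion balance right in the presence of walls and hydrodynamic memory is precisely what the FHD, generalized Langevin, and hybrid methods are designed to do.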

  18. Phase unwrapping with graph cuts optimization and dual decomposition acceleration for 3D high-resolution MRI data.

    PubMed

    Dong, Jianwu; Chen, Feng; Zhou, Dong; Liu, Tian; Yu, Zhaofei; Wang, Yi

    2017-03-01

    Low-SNR regions and rapid phase variations pose challenges to spatial phase unwrapping algorithms. Global optimization-based phase unwrapping methods are widely used, but are significantly slower than greedy methods. In this paper, dual decomposition acceleration is introduced to speed up a three-dimensional graph cut-based phase unwrapping algorithm. The phase unwrapping problem is formulated as a global discrete energy minimization problem, and the technique of dual decomposition is used to increase computational efficiency by splitting the full problem into overlapping subproblems and enforcing the congruence of the overlapping variables. Using three-dimensional (3D) multiecho gradient echo images from an agarose phantom and five brain hemorrhage patients, we compared the proposed method with an unaccelerated graph cut-based method. Experimental results show up to 18-fold acceleration in computation time. Dual decomposition significantly improves the computational efficiency of 3D graph cut-based phase unwrapping algorithms. Magn Reson Med 77:1353-1358, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
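
    For intuition, one-dimensional unwrapping has a simple greedy solution (Itoh's method): wrap successive differences back into (-pi, pi] and integrate. It is exactly this local approach that breaks down on noisy, rapidly varying 3D data and motivates the global graph-cut energy formulation. A sketch:

```python
import numpy as np

def unwrap_1d(phase):
    """Itoh's 1D phase unwrapping: wrap successive differences into the
    principal interval and integrate. Fails when the true phase jumps by
    more than pi between samples, which is why noisy 3D MRI data needs a
    global optimization instead."""
    phase = np.asarray(phase, float)
    d = np.diff(phase)
    d = (d + np.pi) % (2 * np.pi) - np.pi   # principal-value differences
    return np.concatenate(([phase[0]], phase[0] + np.cumsum(d)))

# A smooth ramp over three full cycles, observed modulo 2*pi
true_phase = np.linspace(0.0, 6 * np.pi, 200)
wrapped = (true_phase + np.pi) % (2 * np.pi) - np.pi
recovered = unwrap_1d(wrapped)
```

    The graph-cut method instead chooses integer multiples of 2*pi per voxel by minimizing a global energy, and dual decomposition splits that minimization into overlapping subvolumes.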

  19. Decentralized State Estimation and Remedial Control Action for Minimum Wind Curtailment Using Distributed Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ren; Srivastava, Anurag K.; Bakken, David E.

    Intermittency of wind energy poses a great challenge for power system operation and control. Wind curtailment might be necessary at certain operating conditions to keep line flows within their limits. A Remedial Action Scheme (RAS) offers a quick control mechanism to maintain the reliability and security of power system operation under high wind energy integration. In this paper, a new RAS is developed to maximize wind energy integration without compromising the security and reliability of the power system, based on specific utility requirements. A new Distributed Linear State Estimation (DLSE) is also developed to provide fast and accurate input data for the proposed RAS. A distributed computational architecture is designed to guarantee the robustness of the cyber system supporting the RAS and DLSE implementation. The proposed RAS and DLSE are validated using the modified IEEE 118-bus system. Simulation results demonstrate the satisfactory performance of the DLSE and the effectiveness of the RAS. A real-time cyber-physical testbed has been utilized to validate the cyber-resiliency of the developed RAS against computational node failure.

  20. Model Order Reduction Algorithm for Estimating the Absorption Spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Beeumen, Roel; Williams-Young, David B.; Kasper, Joseph M.

    The ab initio description of the spectral interior of the absorption spectrum poses both a theoretical and computational challenge for modern electronic structure theory. Due to the often spectrally dense character of this domain in the quantum propagator's eigenspectrum for medium-to-large sized systems, traditional approaches based on the partial diagonalization of the propagator often encounter oscillatory and stagnating convergence. Electronic structure methods which solve the molecular response problem through the solution of spectrally shifted linear systems, such as the complex polarization propagator, offer an alternative approach which is agnostic to the underlying spectral density or domain location. This generality comes at a seemingly high computational cost associated with solving a large linear system for each spectral shift in some discretization of the spectral domain of interest. In this work, we present a novel, adaptive solution to this high computational overhead based on model order reduction techniques via interpolation. Model order reduction reduces the computational complexity of mathematical models and is ubiquitous in the simulation of dynamical systems and control theory. The efficiency and effectiveness of the proposed algorithm in the ab initio prediction of X-ray absorption spectra are demonstrated using a test set of challenging water clusters which are spectrally dense in the neighborhood of the oxygen K-edge. On the basis of a single, user-defined tolerance we automatically determine the order of the reduced models and approximate the absorption spectrum up to the given tolerance. We also illustrate that, for the systems studied, the automatically determined model order increases logarithmically with the problem dimension, compared to a linear increase of the number of eigenvalues within the energy window. Furthermore, we observed that the computational cost of the proposed algorithm scales only quadratically with respect to the problem dimension.
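
    The cost structure can be illustrated with a toy resolvent calculation: the spectrum requires one shifted linear solve per frequency, while a projection-based reduced model (a simplified stand-in for the paper's interpolation approach, using a synthetic symmetric operator) solves the full system only at a few sample shifts. A Galerkin-projected model reproduces the full response exactly at the sampled shifts and interpolates in between:

```python
import numpy as np

def spectrum_direct(A, b, omegas, eta=0.05):
    """Absorption-like spectrum Im <b, x(w)> with ((w + i eta) I - A) x = b,
    solved independently at every spectral shift w (the expensive route)."""
    n = A.shape[0]
    return np.array([np.imag(b @ np.linalg.solve(
        (w + 1j * eta) * np.eye(n) - A, b)) for w in omegas])

def spectrum_reduced(A, b, omegas, sample_omegas, eta=0.05):
    """Model-order-reduction sketch: solve the full system only at a few
    sample shifts, orthonormalize those solutions into a basis V, then
    evaluate the small projected system at every shift of interest."""
    n = A.shape[0]
    S = np.column_stack([np.linalg.solve((w + 1j * eta) * np.eye(n) - A, b)
                         for w in sample_omegas])
    V, _ = np.linalg.qr(S)                 # orthonormal reduced basis
    A_r = V.conj().T @ A @ V
    b_r = V.conj().T @ b
    k = A_r.shape[0]
    return np.array([np.imag(b_r.conj() @ np.linalg.solve(
        (w + 1j * eta) * np.eye(k) - A_r, b_r)) for w in omegas])

rng = np.random.default_rng(1)
Q = np.linalg.qr(rng.normal(size=(40, 40)))[0]
A = Q @ np.diag(np.linspace(0.5, 4.0, 40)) @ Q.T   # synthetic symmetric operator
b = rng.normal(size=40)
b /= np.linalg.norm(b)
samples = [1.0, 1.5, 2.0]
full = spectrum_direct(A, b, samples)
reduced = spectrum_reduced(A, b, samples, samples)
```

    The adaptive algorithm in the paper goes further: it grows the reduced basis automatically until the interpolated spectrum meets the user-defined tolerance.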

  1. Visual Odometry for Autonomous Deep-Space Navigation Project

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Autonomous rendezvous and docking (AR&D) is a critical need for manned spaceflight, especially in deep space where communication delays essentially leave crews on their own for critical operations like docking. Previously developed AR&D sensors have been large, heavy, power-hungry, and may still require further development (e.g. Flash LiDAR). Other approaches to vision-based navigation are not computationally efficient enough to operate quickly on slower, flight-like computers. The key technical challenge for visual odometry is to adapt it from the current terrestrial applications it was designed for to function in the harsh lighting conditions of space. This effort leveraged Draper Laboratory’s considerable prior development and expertise, benefitting both parties. The algorithm Draper has created is unique from other pose estimation efforts as it has a comparatively small computational footprint (suitable for use onboard a spacecraft, unlike alternatives) and potentially offers the accuracy and precision needed for docking. This presents a solution to the AR&D problem that only requires a camera, which is much smaller, lighter, and requires far less power than competing AR&D sensors. We have demonstrated the algorithm’s performance and ability to process ‘flight-like’ imagery formats with a ‘flight-like’ trajectory, positioning ourselves to easily process flight data from the upcoming ‘ISS Selfie’ activity and then compare the algorithm’s quantified performance to the simulated imagery. This will bring visual odometry beyond TRL 5, proving its readiness to be demonstrated as part of an integrated system. Once beyond TRL 5, visual odometry will be poised to be demonstrated as part of a system in an in-space demo where relative pose is critical, like Orion AR&D, ISS robotic operations, asteroid proximity operations, and more.
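
    Underlying any visual-odometry pipeline is two-view epipolar geometry: the essential matrix E = [t]x R relates corresponding normalized image coordinates between frames. The sketch below (generic textbook geometry with synthetic points, not Draper's algorithm) builds E from a known relative pose and verifies the epipolar constraint x2^T E x1 = 0:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential_matrix(R, t):
    """E = [t]x R for the camera motion X2 = R X1 + t."""
    return skew(t) @ R

def project(P, R, t):
    """Normalized (pinhole, unit focal length) homogeneous image
    coordinates of 3D points P seen by a camera with pose (R, t)."""
    X = (R @ P.T).T + t
    return np.column_stack([X[:, 0] / X[:, 2], X[:, 1] / X[:, 2],
                            np.ones(len(X))])

# Synthetic scene; camera 2 rotated about y and translated along x
rng = np.random.default_rng(0)
P = rng.uniform([-1, -1, 4], [1, 1, 8], size=(20, 3))   # points in front
ang = 0.1
R = np.array([[np.cos(ang), 0.0, np.sin(ang)],
              [0.0, 1.0, 0.0],
              [-np.sin(ang), 0.0, np.cos(ang)]])
t = np.array([0.5, 0.0, 0.0])
x1 = project(P, np.eye(3), np.zeros(3))
x2 = project(P, R, t)
E = essential_matrix(R, t)
residuals = np.abs(np.einsum('ni,ij,nj->n', x2, E, x1))
```

    A real odometry system runs this in reverse: it estimates E from tracked feature correspondences and decomposes it into the relative rotation and (scale-free) translation between frames.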

  3. Plant-based FRET biosensor discriminates environmental zinc levels

    USDA-ARS?s Scientific Manuscript database

    Heavy metal accumulation in the environment poses great risks to flora and fauna. However, monitoring sites prone to accumulation poses scale and economic challenges. In this study, we present and test a method for monitoring these sites using fluorescence resonance energy transfer (FRET) change in r...

  4. Security, development and human rights: normative, legal and policy challenges for the international drug control system.

    PubMed

    Barrett, Damon

    2010-03-01

    This commentary addresses some of the challenges posed by the broader normative, legal and policy framework of the United Nations for the international drug control system. The 'purposes and principles' of the United Nations are presented and set against the threat-based rhetoric of the drug control system and the negative consequences of that system. Some of the challenges posed by human rights law and norms to the international drug control system are also described, and the need for an impact assessment of the current system alongside alternative policy options is highlighted as a necessary consequence of these analyses. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  5. Testing Scientific Software: A Systematic Literature Review

    PubMed Central

    Kanewala, Upulee; Bieman, James M.

    2014-01-01

    Context: Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and in the computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective: This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method: We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results: We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software, such as oracle problems, and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community, such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally, we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions: Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software, make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software, such as oracle problems, when developing testing techniques. PMID:25125798
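
    The oracle problem highlighted above is often attacked with metamorphic testing: instead of checking an output against a known correct answer, one checks a relation that must hold between related outputs. A minimal sketch (the routine under test is a stand-in, not code from any of the surveyed studies):

```python
import math

def simulate(x):
    """Stand-in for a scientific routine whose exact output is hard to verify.
    (Here it is simply sin, so the sketch can check itself.)"""
    return math.sin(x)

def metamorphic_check(x, tol=1e-12):
    """Metamorphic relation for this routine: sin(pi - x) == sin(x).
    The relation is tested without ever knowing the 'true' answer for x."""
    return abs(simulate(math.pi - x) - simulate(x)) < tol

# Exercise the relation over a range of inputs.
results = [metamorphic_check(k / 10.0) for k in range(1, 20)]
```

    A real metamorphic suite would use domain-specific relations (conservation laws, scaling symmetries, permutation invariance) rather than a trigonometric identity.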

  6. Blood Pump Development Using Rocket Engine Flow Simulation Technology

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2001-01-01

    This paper reports the progress made towards developing a complete blood flow simulation capability in humans, especially in the presence of artificial devices such as valves and ventricular assist devices. Device modeling poses unique challenges different from computing the blood flow in natural hearts and arteries. Many elements are needed to quantify the flow in these devices, such as flow solvers, geometry modeling including flexible walls, moving boundary procedures, and physiological characterization of blood. As a first step, computational technology developed for aerospace applications was extended to the analysis and development of a ventricular assist device (VAD), i.e., a blood pump. The blood flow in a VAD is practically incompressible and Newtonian, and thus an incompressible Navier-Stokes solution procedure can be applied. A primitive variable formulation is used in conjunction with the overset grid approach to handle complex moving geometry. The primary purpose of developing the incompressible flow analysis capability was to quantify the flow in advanced turbopumps for space propulsion systems. The same procedure has been extended to the development of the NASA-DeBakey VAD, which is based on an axial blood pump. Due to massive computing requirements, high-end computing is necessary for simulating three-dimensional flow in these pumps. Computational, experimental, and clinical results are presented.

  7. Toward real-time endoscopically-guided robotic navigation based on a 3D virtual surgical field model

    NASA Astrophysics Data System (ADS)

    Gong, Yuanzheng; Hu, Danying; Hannaford, Blake; Seibel, Eric J.

    2015-03-01

    The challenge is to accurately guide the surgical tool within the three-dimensional (3D) surgical field for robotically-assisted operations such as tumor margin removal from a debulked brain tumor cavity. The proposed technique is 3D image-guided surgical navigation based on matching intraoperative video frames to a 3D virtual model of the surgical field. A small laser-scanning endoscopic camera was attached to a mock minimally-invasive surgical tool that was manipulated toward a region of interest (residual tumor) within a phantom of a debulked brain tumor. Video frames from the endoscope provided features that were matched to the 3D virtual model, which was reconstructed earlier by raster scanning over the surgical field. Camera pose (position and orientation) is recovered by implementing a constrained bundle adjustment algorithm. Navigational error during the approach to the fluorescence target (residual tumor) is determined by comparing the calculated camera pose to the measured camera pose using a micro-positioning stage. From these preliminary results, the computational efficiency of the algorithm in MATLAB code is near real-time (2.5 s for each estimation of pose), which can be improved by implementation in C++. Error analysis produced 3-mm distance error and 2.5 degrees of orientation error on average. The sources of these errors are 1) inaccuracy of the 3D virtual model, generated on a calibrated RAVEN robotic platform with stereo tracking; 2) inaccuracy of endoscope intrinsic parameters, such as focal length; and 3) any endoscopic image distortion from scanning irregularities. This work demonstrates the feasibility of micro-camera 3D guidance of a robotic surgical tool.
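
    Pose recovery by minimising pixel reprojection error, the idea at the heart of bundle adjustment, can be sketched compactly. The following is an illustrative pose-only Gauss-Newton solver with a numeric Jacobian, not the constrained algorithm of the paper; the camera intrinsics and the synthetic pose are made-up values.

```python
import numpy as np

def rodrigues(r):
    """Axis-angle vector -> 3x3 rotation matrix (Rodrigues formula)."""
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)
    k = r / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(pose, points, f=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of 3D points under pose = (rx, ry, rz, tx, ty, tz)."""
    R, t = rodrigues(pose[:3]), pose[3:]
    cam = points @ R.T + t
    uv = cam[:, :2] / cam[:, 2:3]
    return uv * f + np.array([cx, cy])

def estimate_pose(points, observed, iters=20):
    """Gauss-Newton minimisation of the stacked pixel reprojection error."""
    pose = np.zeros(6)
    for _ in range(iters):
        res = (project(pose, points) - observed).ravel()
        J = np.empty((res.size, 6))
        for i in range(6):  # numeric Jacobian, one column per pose parameter
            step = np.zeros(6)
            step[i] = 1e-6
            J[:, i] = ((project(pose + step, points) - observed).ravel() - res) / 1e-6
        delta = np.linalg.lstsq(J, -res, rcond=None)[0]
        pose = pose + delta
        if np.linalg.norm(delta) < 1e-10:
            break
    return pose

# Synthetic check: recover a known camera pose from noiseless observations.
pts = np.array([[x, y, z] for x in (-1.0, 0.0, 1.0)
                          for y in (-1.0, 0.0, 1.0)
                          for z in (4.0, 6.0)])
true_pose = np.array([0.10, -0.05, 0.08, 0.20, -0.10, 0.30])
obs = project(true_pose, pts)
found = estimate_pose(pts, obs)
```

    A full bundle adjustment would additionally refine the 3D structure and constrain the solution with prior model information, as in the paper.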

  8. Aneurysm flow characteristics in realistic carotid artery aneurysm models induced by proximal virtual stenotic plaques: a computational hemodynamics study

    NASA Astrophysics Data System (ADS)

    Castro, Marcelo A.; Peloc, Nora L.; Chien, Aichi; Goldberg, Ezequiel; Putman, Christopher M.; Cebral, Juan R.

    2015-03-01

    Cerebral aneurysms may rarely coexist with a proximal artery stenosis. In that small percentage of patients, such coexistence poses a challenge for interventional neuroradiologists and neurosurgeons to make the best treatment decision. According to previous studies, the incidence of cerebral aneurysms in patients with internal carotid artery stenosis is no greater than five percent, with the aneurysm usually detected incidentally, and is about two percent for aneurysms and stenoses in the same cerebral circulation. Those cases pose a difficult management decision for the physician. Case reports showed patients who died due to aneurysm rupture months after endarterectomy but before aneurysm clipping, while others did not show any change in the aneurysm after plaque removal and had optimal outcomes after aneurysm coiling. The aim of this study is to investigate the intra-aneurysmal hemodynamic changes before and after treatment of the stenotic plaque. Virtually created moderate stenoses in vascular models of internal carotid artery aneurysm patients were considered in a number of cases reconstructed from three-dimensional rotational angiography images. The strategy to create those plaques was based on parameters analyzed in a previous work where idealized models were considered, including relative distance and stenosis grade. Ipsilateral and contralateral plaques were modeled. Wall shear stress and velocity patterns were computed from finite element pulsatile blood flow simulations. The results may suggest that wall shear stress changes depend on the relative angular position between the aneurysm and the plaque.

  9. Learning toward practical head pose estimation

    NASA Astrophysics Data System (ADS)

    Sang, Gaoli; He, Feixiang; Zhu, Rong; Xuan, Shibin

    2017-08-01

    Head pose is useful information for many face-related tasks, such as face recognition, behavior analysis, human-computer interfaces, etc. Existing head pose estimation methods usually assume that the face images have been well aligned or that sufficient and precise training data are available. In practical applications, however, these assumptions are very likely to be invalid. This paper first investigates the impact of the failure of these assumptions, i.e., misalignment of face images, uncertainty and undersampling of training data, on the head pose estimation accuracy of state-of-the-art methods. A learning-based approach is then designed to enhance the robustness of head pose estimation to these factors. To cope with misalignment, instead of using hand-crafted features, it seeks suitable features by learning from a set of training data with a deep convolutional neural network (DCNN), such that the training data can be best classified into the correct head pose categories. To handle uncertainty and undersampling, it employs multivariate labeling distributions (MLDs) with dense sampling intervals to represent the head pose attributes of face images. The correlation between the features and the dense MLD representations of face images is approximated by a maximum entropy model, whose parameters are optimized on the given training data. To estimate the head pose of a face image, its MLD representation is first computed according to the model based on the features extracted from the image by the trained DCNN, and its head pose is then assumed to be the one corresponding to the peak in its MLD. Evaluation experiments on the Pointing'04, FacePix, Multi-PIE, and CAS-PEAL databases demonstrate the effectiveness and efficiency of the proposed method.
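
    The label-distribution idea can be illustrated with a one-dimensional toy: a single yaw label is smeared into a soft distribution over densely sampled bins, and the pose is decoded from the peak or the expected value. This is a simplified single-axis sketch, not the full multivariate MLD model of the paper; the bin spacing and sigma are illustrative.

```python
import numpy as np

# Dense sampling of the yaw range, as in a label-distribution representation.
bins = np.arange(-90, 91, 3, dtype=float)  # 3-degree intervals

def label_distribution(pose_deg, sigma=6.0):
    """Turn a single (possibly uncertain) pose label into a soft
    distribution over the dense bins, instead of one hard class."""
    w = np.exp(-0.5 * ((bins - pose_deg) / sigma) ** 2)
    return w / w.sum()

dist = label_distribution(22.0)
peak_estimate = bins[np.argmax(dist)]        # decode by the distribution peak
mean_estimate = float(np.sum(bins * dist))   # or by the expected value
```

    The soft labels let a label at 22 degrees contribute training signal to neighbouring bins, which is what mitigates undersampled pose categories.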

  10. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

    Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distribution of scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics, and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth science observing satellites and the magnitude of data from climate model output are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while managing the uncertainties of scientific conclusions derived from such capabilities. This talk will provide an overview of JPL's efforts in developing a comprehensive architectural approach to data science.

  11. A Computer Security Course in the Undergraduate Computer Science Curriculum.

    ERIC Educational Resources Information Center

    Spillman, Richard

    1992-01-01

    Discusses the importance of computer security and considers criminal, national security, and personal privacy threats posed by security breakdown. Several examples are given, including incidents involving computer viruses. Objectives, content, instructional strategies, resources, and a sample examination for an experimental undergraduate computer…

  12. Fan-out Estimation in Spin-based Quantum Computer Scale-up.

    PubMed

    Nguyen, Thien; Hill, Charles D; Hollenberg, Lloyd C L; James, Matthew R

    2017-10-17

    Solid-state spin-based qubits offer good prospects for scaling based on their long coherence times and nexus to large-scale electronic scale-up technologies. However, high-threshold quantum error correction requires a two-dimensional qubit array operating in parallel, posing significant challenges in fabrication and control. While architectures incorporating distributed quantum control meet this challenge head-on, most designs rely on individual control and readout of all qubits with high gate densities. We analysed the fan-out routing overhead of a dedicated control line architecture, basing the analysis on a generalised solid-state spin qubit platform parameterised to encompass Coulomb confined (e.g. donor based spin qubits) or electrostatically confined (e.g. quantum dot based spin qubits) implementations. The spatial scalability under this model is estimated using standard electronic routing methods and present-day fabrication constraints. Based on reasonable assumptions for qubit control and readout, we estimate that 10^2-10^5 physical qubits, depending on the quantum interconnect implementation, can be integrated and fanned out independently. Assuming relatively long control-free interconnects, the scalability can be extended. Ultimately, universal quantum computation may necessitate a much higher number of integrated qubits, indicating that higher-dimensional electronics fabrication and/or multiplexed, distributed control and readout schemes may be the preferred strategy for large-scale implementation.

  13. epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.

    PubMed

    Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa

    2016-12-01

    Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
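
    The kind of ensemble epiDMS is built to manage can be illustrated with a deliberately tiny example: a forward-Euler SIR model swept over one transmission-rate parameter. All parameter values here are illustrative; the real system handles far richer, multi-layer geospatial simulation ensembles.

```python
def sir_trajectory(beta, gamma=0.1, days=365, s0=0.99, i0=0.01):
    """Forward-Euler SIR compartment model; returns the daily infectious fraction."""
    s, i, r = s0, i0, 0.0
    infectious = []
    for _ in range(days):
        new_inf = beta * s * i   # new infections this day
        new_rec = gamma * i      # new recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        infectious.append(i)
    return infectious

# A small ensemble over the transmission-rate parameter, then a summary query
# of the sort an ensemble-management system would serve.
ensemble = {beta: sir_trajectory(beta) for beta in (0.15, 0.25, 0.35)}
peak_prevalence = {beta: max(traj) for beta, traj in ensemble.items()}
```

    Searching and comparing such summaries (peak size, peak timing) across thousands of parameter combinations is exactly where scalable ensemble management becomes necessary.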

  14. CAPER 3.0: A Scalable Cloud-Based System for Data-Intensive Analysis of Chromosome-Centric Human Proteome Project Data Sets.

    PubMed

    Yang, Shuai; Zhang, Xinlei; Diao, Lihong; Guo, Feifei; Wang, Dan; Liu, Zhongyang; Li, Honglei; Zheng, Junjie; Pan, Jingshan; Nice, Edouard C; Li, Dong; He, Fuchu

    2015-09-04

    The Chromosome-centric Human Proteome Project (C-HPP) aims to catalog genome-encoded proteins using a chromosome-by-chromosome strategy. As the C-HPP proceeds, the increasing requirement for data-intensive analysis of the MS/MS data poses a challenge to the proteomic community, especially small laboratories lacking computational infrastructure. To address this challenge, we have updated the previous CAPER browser into a higher version, CAPER 3.0, which is a scalable cloud-based system for data-intensive analysis of C-HPP data sets. CAPER 3.0 uses cloud computing technology to facilitate MS/MS-based peptide identification. In particular, it can use both public and private cloud, facilitating the analysis of C-HPP data sets. CAPER 3.0 provides a graphical user interface (GUI) to help users transfer data, configure jobs, track progress, and visualize the results comprehensively. These features enable users without programming expertise to easily conduct data-intensive analysis using CAPER 3.0. Here, we illustrate the usage of CAPER 3.0 with four specific mass spectral data-intensive problems: detecting novel peptides, identifying single amino acid variants (SAVs) derived from known missense mutations, identifying sample-specific SAVs, and identifying exon-skipping events. CAPER 3.0 is available at http://prodigy.bprc.ac.cn/caper3.

  15. Intercollegiate Athletics Today and Tomorrow: The President's Challenge

    ERIC Educational Resources Information Center

    Hanford, George H.

    1977-01-01

    Colleges and university presidents must come to terms with the challenges posed by intercollegiate athletics. The author offers concrete ideas and suggestions aimed at breaking through the confusion surrounding intercollegiate athletics. (Editor/LBH)

  16. Strategies to improve learning of all students in a class

    NASA Astrophysics Data System (ADS)

    Suraishkumar, G. K.

    2018-05-01

    The statistical distribution of the student learning abilities in a typical undergraduate engineering class poses a significant challenge to simultaneously improve the learning of all the students in the class. With traditional instruction styles, the students with significantly high learning abilities are not satisfied due to a feeling of unfulfilled potential, and the students with significantly low learning abilities feel lost. To address the challenge in an undergraduate core/required course on 'transport phenomena in biological systems', a combination of learning strategies such as active learning including co-operative group learning, challenge exercises, and others were employed in a pro-advising context. The short-term and long-term impacts were evaluated through student course performances and input, respectively. The results show that it is possible to effectively address the challenge posed by the distribution of student learning abilities in a class.

  17. ACTIVIS: Visual Exploration of Industry-Scale Deep Neural Network Models.

    PubMed

    Kahng, Minsuk; Andrews, Pierre Y; Kalro, Aditya; Polo Chau, Duen Horng

    2017-08-30

    While deep learning models have achieved state-of-the-art accuracies for many prediction tasks, understanding these models remains a challenge. Despite the recent interest in developing visual tools to help users interpret deep learning models, the complexity and wide variety of models deployed in industry, and the large-scale datasets they use, pose unique design challenges that are inadequately addressed by existing work. Through participatory design sessions with over 15 researchers and engineers at Facebook, we have developed, deployed, and iteratively improved ACTIVIS, an interactive visualization system for interpreting large-scale deep learning models and results. By tightly integrating multiple coordinated views, such as a computation graph overview of the model architecture and a neuron activation view for pattern discovery and comparison, users can explore complex deep neural network models at both the instance and subset level. ACTIVIS has been deployed on Facebook's machine learning platform. We present case studies with Facebook researchers and engineers, and usage scenarios of how ACTIVIS may work with different models.

  18. The Feasibility of Wearables in an Enterprise Environment and Their Impact on IT Security

    NASA Technical Reports Server (NTRS)

    Scotti, Vincent, Jr.

    2015-01-01

    This paper is intended to explore the usability and feasibility of wearables in an enterprise environment and their impact on IT security. In this day and age, with the advent of the Internet of Things, we must explore all the new technology emerging from the minds of the new inventors. This means exploring the use of wearables with regard to their benefits, limitations, and the new challenges they pose to securing computer networks in the Federal environment. We will explore the design of the wearables, the interfaces needed to connect them, and what it will take to connect personal devices in the Federal enterprise network environment. We will provide an overview of the wearable design, concerns of ensuring the confidentiality, integrity, and availability of information, and the challenges faced by those doing so. We will also review the implications and limitations of the policies governing wearable technology and the physical efforts to enforce them.

  19. New Platforms for Characterization of Biological Material Failure and Resilience Properties

    NASA Astrophysics Data System (ADS)

    Brown, Katherine; Butler, Benjamin J.; Nguyen, Thuy-Tien N.; Sorry, David; Williams, Alun; Proud, William G.

    2017-06-01

    Obtaining information about the material responses of viscoelastic soft matter, such as polymers and foams, has required adaptation of techniques traditionally used with hard condensed matter. More recently it has been recognized that understanding the strain-rate behavior of natural and synthetic soft biological materials poses even greater challenges for materials research due to their heterogeneous composition and structural complexity. Expanding fundamental knowledge about how these classes of biomaterials function under different loading regimes is of considerable interest in both fundamental and applied research. A comparative overview of methods, developed in our laboratory or elsewhere, for determining material responses of cells and soft tissues over a wide range of strain rates (quasi-static to blast loading) will be presented. Examples will illustrate how data are obtained for studying material responses of cells and tissues. Strengths and weaknesses of current approaches will be discussed, with particular emphasis on challenges associated with the development of realistic experimental and computational models for trauma and other disease indications.

  20. Diffuse-Interface Modelling of Flow in Porous Media

    NASA Astrophysics Data System (ADS)

    Addy, Doug; Pradas, Marc; Schmuck, Marcus; Kalliadasis, Serafim

    2016-11-01

    Multiphase flows are ubiquitous in a wide spectrum of scientific and engineering applications, and their computational modelling often poses many challenges associated with the presence of free boundaries and interfaces. Interfacial flows in porous media encounter additional challenges and complexities due to their inherently multiscale behaviour. Here we investigate the dynamics of interfaces in porous media using an effective convective Cahn-Hilliard (CH) equation recently derived from a Stokes-CH equation for microscopic heterogeneous domains by means of a homogenization methodology, where the microscopic details are taken into account as effective tensor coefficients which are given by a Poisson equation. The equations are decoupled under appropriate assumptions and solved in series using a classic finite-element formulation with the open-source software FEniCS. We investigate the effects of different microscopic geometries, including periodic and non-periodic, on the bulk fluid flow, and find that our model is able to describe the effective macroscopic behaviour without the need to resolve the microscopic details.
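
    For intuition about the CH dynamics themselves (not the homogenized finite-element formulation used in the paper), here is a minimal one-dimensional explicit finite-difference sketch of spinodal decomposition, with illustrative parameters and grid spacing h = 1.

```python
import numpy as np

def laplacian(u):
    """Second difference on a periodic 1D grid (spacing h = 1)."""
    return np.roll(u, 1) + np.roll(u, -1) - 2.0 * u

def cahn_hilliard_1d(c, kappa=1.0, dt=0.02, steps=5000):
    """Explicit time stepping of dc/dt = lap(mu), with chemical potential
    mu = c^3 - c - kappa * lap(c).  dt is kept small for stability of the
    fourth-order explicit scheme."""
    for _ in range(steps):
        mu = c**3 - c - kappa * laplacian(c)
        c = c + dt * laplacian(mu)
    return c

rng = np.random.default_rng(0)
c0 = 0.05 * rng.standard_normal(64)   # small perturbation of the mixed state
c_final = cahn_hilliard_1d(c0.copy())
```

    Starting from near-uniform noise, the field separates into domains near the two phases c = -1 and c = +1 while the mean of c (the conserved "mass") stays fixed, which is the defining property of CH dynamics.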

  1. A Self-Assisting Protein Folding Model for Teaching Structural Molecular Biology.

    PubMed

    Davenport, Jodi; Pique, Michael; Getzoff, Elizabeth; Huntoon, Jon; Gardner, Adam; Olson, Arthur

    2017-04-04

    Structural molecular biology is now becoming part of the high school science curriculum, thus posing a challenge for teachers who need to convey three-dimensional (3D) structures with conventional text and pictures. In many cases even interactive computer graphics does not go far enough to address these challenges. We have developed a flexible model of the polypeptide backbone using 3D printing technology. With this model we have produced a polypeptide assembly kit to create an idealized model of the triosephosphate isomerase enzyme (TIM), which forms a structure known as the TIM barrel. This kit has been used in a laboratory practical where students perform a step-by-step investigation into the nature of protein folding, starting with the handedness of amino acids and proceeding to the formation of secondary and tertiary structure. Based on the classroom evidence we collected, we conclude that these models are a valuable and inexpensive resource for teaching structural molecular biology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Population-based imaging biobanks as source of big data.

    PubMed

    Gatidis, Sergios; Heber, Sophia D; Storz, Corinna; Bamberg, Fabian

    2017-06-01

    Advances of computational sciences over the last decades have enabled the introduction of novel methodological approaches in biomedical research. Acquiring extensive and comprehensive data about a research subject and subsequently extracting significant information has opened new possibilities in gaining insight into biological and medical processes. This so-called big data approach has recently found entrance into medical imaging and numerous epidemiological studies have been implementing advanced imaging to identify imaging biomarkers that provide information about physiological processes, including normal development and aging but also on the development of pathological disease states. The purpose of this article is to present existing epidemiological imaging studies and to discuss opportunities, methodological and organizational aspects, and challenges that population imaging poses to the field of big data research.

  3. Systemic lupus erythematosus with initial presentation of empyematous pleural effusion in an elderly male patient: a diagnostic challenge.

    PubMed

    Chang, Wei-Ting; Hsieh, Tung-Han; Liu, Ming-Fei

    2013-04-01

    Systemic lupus erythematosus (SLE) poses great difficulty in making an early diagnosis in elderly males, often presenting with atypical manifestations. Acute onset of empyematous pleural effusion has rarely been seen. Herein, we report a 66-year-old man with SLE presenting with rapid progression of bilateral pleural effusion. Diagnostic thoracocentesis disclosed neutrophil-predominant exudates and chest computed tomography revealed multiple loculated pleural effusions. Nevertheless, optimal antibiotic therapy plus surgical decortication of the pleura did not improve his condition. The diagnosis of SLE was readily established after LE cells were accidentally found in the pleural effusion. Large amounts of pleural effusion subsided soon after high dose corticosteroid therapy. Copyright © 2011. Published by Elsevier B.V.

  4. Solving the quantum many-body problem with artificial neural networks

    NASA Astrophysics Data System (ADS)

    Carleo, Giuseppe; Troyer, Matthias

    2017-02-01

    The challenge posed by the many-body problem in quantum physics originates from the difficulty of describing the nontrivial correlations encoded in the exponential complexity of the many-body wave function. Here we demonstrate that systematic machine learning of the wave function can reduce this complexity to a tractable computational form for some notable cases of physical interest. We introduce a variational representation of quantum states based on artificial neural networks with a variable number of hidden neurons. We demonstrate a reinforcement-learning scheme that is capable of both finding the ground state and describing the unitary time evolution of complex interacting quantum systems. Our approach achieves high accuracy in describing prototypical interacting spin models in one and two dimensions.
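
    The variational ansatz described above can be written down directly for a system small enough to enumerate exactly. The sketch below evaluates restricted-Boltzmann-machine amplitudes psi(s) = exp(a.s) * prod_j 2cosh(b_j + W_j.s) for random (untrained) parameters and a 4-spin Ising chain; it illustrates the representation only, not the reinforcement-learning optimization of the paper.

```python
import numpy as np
from itertools import product

def rbm_amplitude(s, a, b, W):
    """Unnormalised RBM wave-function amplitude for a spin configuration
    s in {-1, +1}^n:  psi(s) = exp(a.s) * prod_j 2*cosh(b_j + W_j . s)."""
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

rng = np.random.default_rng(1)
n_vis, n_hid = 4, 8                      # tiny system: exact enumeration is possible
a = 0.1 * rng.standard_normal(n_vis)
b = 0.1 * rng.standard_normal(n_hid)
W = 0.1 * rng.standard_normal((n_hid, n_vis))

configs = np.array(list(product([-1.0, 1.0], repeat=n_vis)))
psi = np.array([rbm_amplitude(s, a, b, W) for s in configs])
probs = psi**2 / np.sum(psi**2)          # Born probabilities

# Variational energy of a periodic 1D Ising chain, H = -sum_i s_i s_{i+1}.
energies = np.array([-np.sum(s * np.roll(s, -1)) for s in configs])
e_var = float(np.sum(probs * energies))
```

    Training would adjust (a, b, W) to lower e_var toward the ground-state energy; for larger systems the exact sums are replaced by Monte Carlo sampling over configurations.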

  5. Requirements for facilities and measurement techniques to support CFD development for hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Sellers, William L., III; Dwoyer, Douglas L.

    1992-01-01

    The design of a hypersonic aircraft poses unique challenges to the engineering community. Problems with duplicating flight conditions in ground-based facilities have made performance predictions risky. Computational fluid dynamics (CFD) has been proposed as an additional means of providing design data. At the present time, CFD codes are being validated based on sparse experimental data and then used to predict performance at flight conditions with generally unknown levels of uncertainty. This paper will discuss the facility and measurement techniques that are required to support CFD development for the design of hypersonic aircraft. Illustrations are given of recent success in combining experimental and direct numerical simulation in CFD model development and validation for hypersonic perfect gas flows.

  6. Distributed computation: the new wave of synthetic biology devices.

    PubMed

    Macía, Javier; Posas, Francesc; Solé, Ricard V

    2012-06-01

    Synthetic biology (SB) offers a unique opportunity for designing complex molecular circuits able to perform predefined functions. But the goal of achieving a flexible toolbox of reusable molecular components has been shown to be limited due to circuit unpredictability, incompatible parts or random fluctuations. Many of these problems arise from the challenges posed by engineering the molecular circuitry: multiple wires are usually difficult to implement reliably within one cell and the resulting systems cannot be reused in other modules. These problems are solved by means of a nonstandard approach to single cell devices, using cell consortia and allowing the output signal to be distributed among different cell types, which can be combined in multiple, reusable and scalable ways. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Sample Delivery and Computer Control Systems for Detecting Leaks in the Main Engines of the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Hritz, Richard J.; Barrett, Carolyn A.

    1997-01-01

The main engines of the Space Shuttle use hydrogen and oxygen as the fuel and oxidant. The explosive and fire hazards associated with these two components pose a serious danger to personnel and equipment. Therefore, prior to use, the main engines undergo extensive leak tests. Instead of using hazardous gases, these tests utilize helium as the tracer element. This results in a need to monitor helium at the ppm level continuously for hours. The major challenge in developing such a low-level gas monitor is the sample delivery system. This paper discusses a system developed to meet the requirements while also being mobile. Also shown are the calibration technique and the stability and accuracy results for the system.

  8. Ferroelectric symmetry-protected multibit memory cell

    DOE PAGES

    Baudry, Laurent; Lukyanchuk, Igor; Vinokur, Valerii M.

    2017-02-08

Here, the tunability of electrical polarization in ferroelectrics is instrumental to their applications in information-storage devices. The existing ferroelectric memory cells are based on two-level storage capacity with standard binary logic. However, the latter has reached its fundamental limitations. Here we propose ferroelectric multibit cells (FMBC) utilizing the ability of multiaxial ferroelectric materials to pin the polarization at a sequence of multistable states. Employing catastrophe theory principles, we show that these states are symmetry-protected against information loss and thus realize a novel topologically-controlled access memory (TAM). Our findings enable developing a platform for the emergent many-valued, non-Boolean information technology and target challenges posed by the needs of quantum and neuromorphic computing.

  9. D3R Grand Challenge 2: blind prediction of protein-ligand poses, affinity rankings, and relative binding free energies

    NASA Astrophysics Data System (ADS)

    Gaieb, Zied; Liu, Shuai; Gathiaka, Symon; Chiu, Michael; Yang, Huanwang; Shao, Chenghua; Feher, Victoria A.; Walters, W. Patrick; Kuhn, Bernd; Rudolph, Markus G.; Burley, Stephen K.; Gilson, Michael K.; Amaro, Rommie E.

    2018-01-01

    The Drug Design Data Resource (D3R) ran Grand Challenge 2 (GC2) from September 2016 through February 2017. This challenge was based on a dataset of structures and affinities for the nuclear receptor farnesoid X receptor (FXR), contributed by F. Hoffmann-La Roche. The dataset contained 102 IC50 values, spanning six orders of magnitude, and 36 high-resolution co-crystal structures with representatives of four major ligand classes. Strong global participation was evident, with 49 participants submitting 262 prediction submission packages in total. Procedurally, GC2 mimicked Grand Challenge 2015 (GC2015), with a Stage 1 subchallenge testing ligand pose prediction methods and ranking and scoring methods, and a Stage 2 subchallenge testing only ligand ranking and scoring methods after the release of all blinded co-crystal structures. Two smaller curated sets of 18 and 15 ligands were developed to test alchemical free energy methods. This overview summarizes all aspects of GC2, including the dataset details, challenge procedures, and participant results. We also consider implications for progress in the field, while highlighting methodological areas that merit continued development. Similar to GC2015, the outcome of GC2 underscores the pressing need for methods development in pose prediction, particularly for ligand scaffolds not currently represented in the Protein Data Bank (http://www.pdb.org), and in affinity ranking and scoring of bound ligands.

  10. Using Computational and Mechanical Models to Study Animal Locomotion

    PubMed Central

    Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas

    2012-01-01

    Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026

  11. Manufacturing and Security Challenges in 3D Printing

    NASA Astrophysics Data System (ADS)

    Zeltmann, Steven Eric; Gupta, Nikhil; Tsoutsos, Nektarios Georgios; Maniatakos, Michail; Rajendran, Jeyavijayan; Karri, Ramesh

    2016-07-01

    As the manufacturing time, quality, and cost associated with additive manufacturing (AM) continue to improve, more and more businesses and consumers are adopting this technology. Some of the key benefits of AM include customizing products, localizing production and reducing logistics. Due to these and numerous other benefits, AM is enabling a globally distributed manufacturing process and supply chain spanning multiple parties, and hence raises concerns about the reliability of the manufactured product. In this work, we first present a brief overview of the potential risks that exist in the cyber-physical environment of additive manufacturing. We then evaluate the risks posed by two different classes of modifications to the AM process which are representative of the challenges that are unique to AM. The risks posed are examined through mechanical testing of objects with altered printing orientation and fine internal defects. Finite element analysis and ultrasonic inspection are also used to demonstrate the potential for decreased performance and for evading detection. The results highlight several scenarios, intentional or unintentional, that can affect the product quality and pose security challenges for the additive manufacturing supply chain.

  12. A Computational/Experimental Platform for Investigating Three-Dimensional Puzzle Solving of Comminuted Articular Fractures

    PubMed Central

    Thomas, Thaddeus P.; Anderson, Donald D.; Willis, Andrew R.; Liu, Pengcheng; Frank, Matthew C.; Marsh, J. Lawrence; Brown, Thomas D.

    2011-01-01

Reconstructing highly comminuted articular fractures poses a difficult surgical challenge, akin to solving a complicated three-dimensional (3D) puzzle. Pre-operative planning using CT is critically important, given the desirability of less invasive surgical approaches. The goal of this work is to advance 3D puzzle-solving methods toward use as a pre-operative tool for reconstructing these complex fractures. Methodology for generating typical fragmentation/dispersal patterns was developed. Five identical replicas of human distal tibia anatomy were machined from blocks of high-density polyetherurethane foam (a bone fragmentation surrogate) and were fractured using an instrumented drop tower. Pre- and post-fracture geometries were obtained using laser scans and CT. A semi-automatic virtual reconstruction computer program aligned fragment native (non-fracture) surfaces to a pre-fracture template. The tibias were precisely reconstructed, with alignment accuracies ranging from 0.03 to 0.4 mm. This novel technology has the potential to significantly enhance surgical techniques for reconstructing comminuted intra-articular fractures, as illustrated for a representative clinical case. PMID:20924863
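
    The core geometric operation in such a virtual reconstruction, aligning a fragment's native surface to the pre-fracture template, can be sketched as a rigid point-set registration. The snippet below is a minimal illustration using the Kabsch/SVD method on points with known correspondences; it is not the paper's semi-automatic program, and all names and values are illustrative only.

    ```python
    import numpy as np

    def rigid_align(fragment, template):
        """Best-fit rotation R and translation t mapping `fragment` points
        onto `template` points (corresponding rows) via Kabsch/SVD."""
        cf, ct = fragment.mean(axis=0), template.mean(axis=0)
        H = (fragment - cf).T @ (template - ct)   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = ct - R @ cf
        return R, t

    # Toy check: displace a "fragment" by a known rigid motion, then recover it.
    rng = np.random.default_rng(0)
    fragment = rng.random((50, 3))
    angle = 0.3
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([1.0, -2.0, 0.5])
    template = fragment @ R_true.T + t_true
    R, t = rigid_align(fragment, template)
    print(np.abs(fragment @ R.T + t - template).max())
    ```

    In practice the correspondences are unknown and the alignment is iterated (e.g. ICP-style), but each iteration reduces to this closed-form step.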

  13. FPGA cluster for high-performance AO real-time control system

    NASA Astrophysics Data System (ADS)

    Geng, Deli; Goodsell, Stephen J.; Basden, Alastair G.; Dipper, Nigel A.; Myers, Richard M.; Saunter, Chris D.

    2006-06-01

Whilst the high throughput and low latency requirements of next-generation AO real-time control systems have posed a significant challenge to von Neumann architecture processor systems, the Field Programmable Gate Array (FPGA) has emerged as a long-term solution, with high performance on throughput and excellent predictability on latency. Moreover, FPGA devices have highly capable programmable interfacing, which leads to a more highly integrated system. Nevertheless, a single FPGA is still not enough: multiple FPGA devices need to be clustered to perform the required subaperture processing and the reconstruction computation. In an AO real-time control system, the memory bandwidth is often the bottleneck of the system, simply because a vast amount of supporting data, e.g. pixel calibration maps and the reconstruction matrix, needs to be accessed within a short period. The cluster, as a general computing architecture, has excellent scalability in processing throughput, memory bandwidth, memory capacity, and communication bandwidth. Problems such as task distribution, node communication, and system verification are discussed.

  14. Methods for extracting social network data from chatroom logs

    NASA Astrophysics Data System (ADS)

    Osesina, O. Isaac; McIntire, John P.; Havig, Paul R.; Geiselman, Eric E.; Bartley, Cecilia; Tudoreanu, M. Eduard

    2012-06-01

Identifying social network (SN) links within computer-mediated communication platforms without explicit relations among users poses challenges to researchers. Our research aims to extract SN links in internet chat with multiple users engaging in synchronous overlapping conversations, all displayed in a single stream. We approached this problem using three methods that build on previous research: response-time analysis builds on the temporal proximity of chat messages; word context usage builds on keyword analysis; and direct addressing infers links by identifying the intended message recipient from the screen name (nickname) referenced in the message [1]. Our analysis of word usage within the chat stream also provides contexts for the extracted SN links. To test the capability of our methods, we used publicly available data from Internet Relay Chat (IRC), a real-time computer-mediated communication (CMC) tool used by millions of people around the world. The extraction performances of the individual methods and their hybrids were assessed relative to a ground truth (determined a priori via manual scoring).
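
    Of the three methods, response-time analysis is the simplest to sketch: treat a message posted shortly after another as a candidate reply, and keep the links that recur. The snippet below is a hypothetical illustration of that idea; the window and count thresholds are invented, not taken from the paper.

    ```python
    from collections import Counter

    def temporal_links(messages, window=10.0, min_count=2):
        """Infer candidate social links from temporal proximity alone.
        `messages` is a list of (timestamp_seconds, screen_name) pairs,
        assumed sorted by time. If B posts within `window` seconds of A's
        message, count it as a possible reply from B to A; directed links
        seen at least `min_count` times are kept."""
        counts = Counter()
        for i, (t_i, who_i) in enumerate(messages):
            for t_j, who_j in messages[i + 1:]:
                if t_j - t_i > window:
                    break                      # messages are time-sorted
                if who_j != who_i:
                    counts[(who_j, who_i)] += 1  # who_j likely replying to who_i
        return {pair for pair, n in counts.items() if n >= min_count}

    chat = [(0, "ann"), (4, "bob"), (12, "ann"), (15, "bob"), (40, "eve")]
    print(temporal_links(chat))
    ```

    A real pipeline would combine these counts with the keyword and direct-addressing evidence before thresholding.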

  15. GENOME-WIDE GENETIC INTERACTION ANALYSIS OF GLAUCOMA USING EXPERT KNOWLEDGE DERIVED FROM HUMAN PHENOTYPE NETWORKS

    PubMed Central

    HU, TING; DARABOS, CHRISTIAN; CRICCO, MARIA E.; KONG, EMILY; MOORE, JASON H.

    2014-01-01

The large volume of GWAS data poses great computational challenges for analyzing genetic interactions associated with common human diseases. We propose a computational framework for characterizing epistatic interactions among large sets of genetic attributes in GWAS data. We build the human phenotype network (HPN) and focus on a disease of interest. In this study, we use the GLAUGEN glaucoma GWAS dataset and apply the HPN as a biological knowledge-based filter to prioritize genetic variants. Then, we use the statistical epistasis network (SEN) to identify a significant connected network of pairwise epistatic interactions among the prioritized SNPs. These clearly highlight the complex genetic basis of glaucoma. Furthermore, we identify key SNPs by quantifying structural network characteristics. Through functional annotation of these key SNPs using Biofilter, a software tool accessing multiple publicly available human genetic data sources, we find supporting biomedical evidence linking glaucoma to an array of genetic diseases, providing a proof of concept. We conclude by suggesting hypotheses for a better understanding of the disease. PMID:25592582

  16. Investigation of smoothness-increasing accuracy-conserving filters for improving streamline integration through discontinuous fields.

    PubMed

    Steffen, Michael; Curtis, Sean; Kirby, Robert M; Ryan, Jennifer K

    2008-01-01

Streamline integration of fields produced by computational fluid mechanics simulations is a commonly used tool for the investigation and analysis of fluid flow phenomena. Integration is often accomplished through the application of ordinary differential equation (ODE) integrators, whose error characteristics are predicated on the smoothness of the field through which the streamline is being integrated; such smoothness is not available at the inter-element level of finite volume and finite element data. Adaptive error control techniques are often used to ameliorate the challenge posed by inter-element discontinuities. As the root of the difficulties is the discontinuous nature of the data, we present a complementary approach of applying smoothness-increasing accuracy-conserving filters to the data prior to streamline integration. We investigate whether such an approach applied to uniform quadrilateral discontinuous Galerkin (high-order finite volume) data can be used to augment current adaptive error control approaches. We discuss and demonstrate through numerical examples the computational trade-offs exhibited when one applies such a strategy.
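
    The adaptive error control the abstract refers to can be illustrated with a step-doubling integrator tracing dx/dt = v(x) through a toy piecewise-constant field: where the field jumps, one full step and two half steps disagree, and the step size is refined. This is a generic sketch under invented parameters, not the paper's discontinuous Galerkin setup.

    ```python
    import numpy as np

    def velocity(x):
        """Toy piecewise-constant field with a jump across y = 0, mimicking
        an inter-element discontinuity (trajectories converge onto the jump)."""
        return np.array([1.0, -0.5]) if x[1] >= 0 else np.array([1.0, 0.5])

    def rk4_step(v, x, h):
        k1 = v(x)
        k2 = v(x + 0.5 * h * k1)
        k3 = v(x + 0.5 * h * k2)
        k4 = v(x + h * k3)
        return x + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    def streamline(v, x0, t_end, h0=0.1, tol=1e-6, h_min=1e-3):
        """Trace dx/dt = v(x) with step-doubling error control: one full RK4
        step is compared against two half steps, and the step size shrinks
        where they disagree -- which is exactly where the field jumps."""
        x, t, h = np.asarray(x0, dtype=float), 0.0, h0
        path = [x.copy()]
        while t_end - t > 1e-12:
            h = min(h, t_end - t)
            full = rk4_step(v, x, h)
            half = rk4_step(v, rk4_step(v, x, h / 2), h / 2)
            if np.linalg.norm(full - half) > tol and h > h_min:
                h /= 2                    # suspected discontinuity: refine
                continue
            x, t = half, t + h            # accept the finer estimate
            path.append(x.copy())
            h = min(h0, 2 * h)            # relax the step size again
        return np.array(path)

    path = streamline(velocity, [0.0, -0.3], t_end=1.0)
    print(path[-1])
    ```

    The many tiny accepted steps near y = 0 show the cost the paper's pre-filtering approach aims to reduce.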

  17. Action and language integration: from humans to cognitive robots.

    PubMed

    Borghi, Anna M; Cangelosi, Angelo

    2014-07-01

    The topic is characterized by a highly interdisciplinary approach to the issue of action and language integration. Such an approach, combining computational models and cognitive robotics experiments with neuroscience, psychology, philosophy, and linguistic approaches, can be a powerful means that can help researchers disentangle ambiguous issues, provide better and clearer definitions, and formulate clearer predictions on the links between action and language. In the introduction we briefly describe the papers and discuss the challenges they pose to future research. We identify four important phenomena the papers address and discuss in light of empirical and computational evidence: (a) the role played not only by sensorimotor and emotional information but also of natural language in conceptual representation; (b) the contextual dependency and high flexibility of the interaction between action, concepts, and language; (c) the involvement of the mirror neuron system in action and language processing; (d) the way in which the integration between action and language can be addressed by developmental robotics and Human-Robot Interaction. Copyright © 2014 Cognitive Science Society, Inc.

  18. Trigger learning and ECG parameter customization for remote cardiac clinical care information system.

    PubMed

    Bashir, Mohamed Ezzeldin A; Lee, Dong Gyu; Li, Meijing; Bae, Jang-Whan; Shon, Ho Sun; Cho, Myung Chan; Ryu, Keun Ho

    2012-07-01

Coronary heart disease has been identified as the largest single cause of death around the world. The aim of a cardiac clinical information system is to achieve the best possible diagnosis of cardiac arrhythmias by electronic data processing. A cardiac information system designed to offer remote monitoring of patients who need continuous follow-up is in demand. However, intra- and interpatient electrocardiogram (ECG) morphological descriptors vary over time, and computational limits pose significant challenges for practical implementations. The former requires that the classification model be adjusted continuously, and the latter requires a reduction in the number and types of ECG features, and thus the computational burden, necessary to classify different arrhythmias. We propose the use of adaptive learning to automatically train the classifier on up-to-date ECG data, and employ adaptive feature selection to define unique feature subsets pertinent to different types of arrhythmia. Experimental results show that this hybrid technique outperforms conventional approaches and is, therefore, a promising new intelligent diagnostic tool.

  19. Principles and Overview of Sampling Methods for Modeling Macromolecular Structure and Dynamics

    PubMed Central

    Moffatt, Ryan; Ma, Buyong; Nussinov, Ruth

    2016-01-01

    Investigation of macromolecular structure and dynamics is fundamental to understanding how macromolecules carry out their functions in the cell. Significant advances have been made toward this end in silico, with a growing number of computational methods proposed yearly to study and simulate various aspects of macromolecular structure and dynamics. This review aims to provide an overview of recent advances, focusing primarily on methods proposed for exploring the structure space of macromolecules in isolation and in assemblies for the purpose of characterizing equilibrium structure and dynamics. In addition to surveying recent applications that showcase current capabilities of computational methods, this review highlights state-of-the-art algorithmic techniques proposed to overcome challenges posed in silico by the disparate spatial and time scales accessed by dynamic macromolecules. This review is not meant to be exhaustive, as such an endeavor is impossible, but rather aims to balance breadth and depth of strategies for modeling macromolecular structure and dynamics for a broad audience of novices and experts. PMID:27124275

  20. Cybersim: geographic, temporal, and organizational dynamics of malware propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan

    2010-01-01

Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (c) allows designated initial nodes where malware gets introduced; (d) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; and (e) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc., to the analyst. We present sample simulations on a national-level network with millions of computers.
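
    Two of CyberSim's stages, scale-free network generation and vulnerability-gated propagation, can be caricatured in a few lines: a Barabási-Albert preferential-attachment graph stands in for the email-contact network, and a per-host Bernoulli flag stands in for the installed-software vulnerability lists. All function names and parameters here are an invented sketch, not CyberSim code.

    ```python
    import random

    def scale_free_graph(n, m=2, seed=42):
        """Barabasi-Albert preferential attachment: each new host links to
        m existing hosts chosen in proportion to their current degree."""
        rng = random.Random(seed)
        targets = list(range(m))   # nodes the next newcomer attaches to
        repeated = []              # each node appears once per edge end
        edges = set()
        for v in range(m, n):
            for u in set(targets):
                edges.add((min(u, v), max(u, v)))
                repeated.extend([u, v])
            targets = rng.sample(repeated, m)  # degree-biased choice
        return edges

    def spread(edges, n, patient_zero=0, p_vulnerable=0.5, seed=1):
        """Toy malware propagation: infection crosses an edge only if the
        neighboring host runs the vulnerable software."""
        rng = random.Random(seed)
        vulnerable = [rng.random() < p_vulnerable for _ in range(n)]
        adj = {i: [] for i in range(n)}
        for a, b in edges:
            adj[a].append(b)
            adj[b].append(a)
        infected, frontier = {patient_zero}, [patient_zero]
        while frontier:
            frontier = [nb for host in frontier for nb in adj[host]
                        if nb not in infected and vulnerable[nb]]
            infected.update(frontier)
        return infected

    edges = scale_free_graph(200)
    print(len(edges), len(spread(edges, 200)))
    ```

    CyberSim adds what this sketch omits: geolocation, per-host software inventories, packet delays, and user online-behavior models on top of the discrete event engine.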

  1. Structure Prediction of the Second Extracellular Loop in G-Protein-Coupled Receptors

    PubMed Central

    Kmiecik, Sebastian; Jamroz, Michal; Kolinski, Michal

    2014-01-01

G-protein-coupled receptors (GPCRs) play key roles in living organisms. Therefore, it is important to determine their functional structures. The second extracellular loop (ECL2) is a functionally important region of GPCRs that poses a significant challenge for computational structure prediction methods. In this work, we evaluated CABS, a well-established protein modeling tool, for predicting ECL2 structure in 13 GPCRs. The ECL2s (of between 13 and 34 residues) are predicted with the other extracellular loops fully flexible and the transmembrane domain fixed in its x-ray conformation. The modeling procedure used theoretical predictions of ECL2 secondary structure and experimental constraints on disulfide bridges. Our approach yielded ensembles of low-energy conformers, and the most populated conformers contained models close to the available x-ray structures. The level of similarity between the predicted models and x-ray structures is comparable to that of other state-of-the-art computational methods. Our results extend other studies by including newly crystallized GPCRs. PMID:24896119

  2. Design Sketches For Optical Crossbar Switches Intended For Large-Scale Parallel Processing Applications

    NASA Astrophysics Data System (ADS)

    Hartmann, Alfred; Redfield, Steve

    1989-04-01

This paper discusses the design of large-scale (1000 x 1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performances of alternative multiple access channel communications protocols (unslotted and slotted ALOHA and carrier sense multiple access, CSMA) are compared with the performance of the classic arbitrated bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication to arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.

  3. Principles and Overview of Sampling Methods for Modeling Macromolecular Structure and Dynamics.

    PubMed

    Maximova, Tatiana; Moffatt, Ryan; Ma, Buyong; Nussinov, Ruth; Shehu, Amarda

    2016-04-01

    Investigation of macromolecular structure and dynamics is fundamental to understanding how macromolecules carry out their functions in the cell. Significant advances have been made toward this end in silico, with a growing number of computational methods proposed yearly to study and simulate various aspects of macromolecular structure and dynamics. This review aims to provide an overview of recent advances, focusing primarily on methods proposed for exploring the structure space of macromolecules in isolation and in assemblies for the purpose of characterizing equilibrium structure and dynamics. In addition to surveying recent applications that showcase current capabilities of computational methods, this review highlights state-of-the-art algorithmic techniques proposed to overcome challenges posed in silico by the disparate spatial and time scales accessed by dynamic macromolecules. This review is not meant to be exhaustive, as such an endeavor is impossible, but rather aims to balance breadth and depth of strategies for modeling macromolecular structure and dynamics for a broad audience of novices and experts.

  4. The importance of proving the null.

    PubMed

    Gallistel, C R

    2009-04-01

Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive? © 2009 APA, all rights reserved.
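
    The sensitivity analysis can be sketched for the simplest case: a normal sample with known sigma, a point null at mu = 0, and a uniform alternative whose half-width L sets its vagueness. For data consistent with the null, the odds on the null grow as L grows, which is the behavior the abstract describes. The function below is an illustrative reconstruction, not code from the paper; all parameter values are invented.

    ```python
    import math

    def bayes_factor_null(xbar, n, sigma, L, grid=2001):
        """Odds for the null (mu = 0) over a vague alternative
        mu ~ Uniform(-L, L), for a normal sample with known sigma,
        using the sufficient statistic xbar."""
        se = sigma / math.sqrt(n)

        def like(mu):  # likelihood of xbar given mu
            z = (xbar - mu) / se
            return math.exp(-0.5 * z * z) / (se * math.sqrt(2 * math.pi))

        null_ml = like(0.0)
        # marginal likelihood under the alternative: average the likelihood
        # over the uniform prior with a simple grid quadrature
        step = 2 * L / (grid - 1)
        alt_ml = sum(like(-L + i * step) for i in range(grid)) * step / (2 * L)
        return null_ml / alt_ml

    # Data consistent with the null: the odds on it rise as the
    # alternative is allowed to be vaguer (larger L).
    for L in (0.5, 1.0, 2.0):
        print(L, bayes_factor_null(xbar=0.02, n=50, sigma=1.0, L=L))
    ```

    Plotting the returned odds against L gives exactly the sensitivity curve the paper recommends inspecting.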

  5. Human action recognition based on kinematic similarity in real time

    PubMed Central

    Chen, Longting; Luo, Ailing; Zhang, Sicong

    2017-01-01

Human action recognition using 3D pose data has gained growing interest in the fields of computer-robot interfaces and pattern recognition since hardware to capture human pose became available. In this paper, we propose a fast, simple, and powerful method of human action recognition based on human kinematic similarity. The key to this method is that the action descriptor consists of joint positions, angular velocity, and angular acceleration, which can accommodate different individual sizes and eliminate complex normalization. The angular parameters of joints within a short sliding time window (approximately 5 frames) around the current frame are used to express each pose frame of a human action sequence. Moreover, three modified KNN (k-nearest-neighbors) classifiers are employed in our method: one for achieving the confidence of every frame in the training step, one for estimating the frame label of each descriptor, and one for classifying actions. Additionally estimating the frame's time label makes it possible to address single input frames. This approach can be used on difficult, unsegmented sequences. The proposed method is efficient and can be run in real time. The research shows that many public datasets are irregularly segmented, and a simple method is provided to regularize the datasets. The approach is tested on challenging datasets such as MSR-Action3D, MSRDailyActivity3D, and UTD-MHAD. The results indicate our method achieves higher accuracy. PMID:29073131
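
    The building block shared by the three classifiers, a k-nearest-neighbors vote over frame descriptors, can be sketched as follows. Descriptors, labels, and k are toy values, not the paper's; a real descriptor would concatenate joint positions, angular velocities, and angular accelerations over the sliding window.

    ```python
    import math
    from collections import Counter

    def knn_classify(query, train, k=3):
        """Minimal k-nearest-neighbors vote over frame descriptors.
        `train` is a list of (descriptor, label) pairs; descriptors are
        equal-length tuples compared by Euclidean distance."""
        nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    # Toy 3-D descriptors standing in for per-frame kinematic features.
    train = [((0.0, 0.1, 0.0), "wave"), ((0.1, 0.2, 0.0), "wave"),
             ((0.9, 0.0, 0.5), "kick"), ((1.0, 0.1, 0.6), "kick")]
    print(knn_classify((0.05, 0.15, 0.0), train))  # prints "wave"
    ```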

  6. A Single Camera Motion Capture System for Human-Computer Interaction

    NASA Astrophysics Data System (ADS)

    Okada, Ryuzo; Stenger, Björn

This paper presents a method for markerless human motion capture using a single camera. It uses tree-based filtering to efficiently propagate a probability distribution over poses of a 3D body model. The pose vectors and associated shapes are arranged in a tree, which is constructed by hierarchical pairwise clustering, in order to efficiently evaluate the likelihood in each frame. A new likelihood function based on silhouette matching is proposed that improves the pose estimation of thinner body parts, i.e., the limbs. The dynamic model takes self-occlusion into account by increasing the variance of occluded body parts, thus allowing for recovery when the body part reappears. We present two applications of our method that work in real-time on a Cell Broadband Engine™: a computer game and a virtual clothing application.

  7. Head Pose Estimation on Eyeglasses Using Line Detection and Classification Approach

    NASA Astrophysics Data System (ADS)

    Setthawong, Pisal; Vannija, Vajirasak

This paper proposes a unique approach for head pose estimation of subjects with eyeglasses by using a combination of line detection and classification approaches. Head pose estimation is considered an important non-verbal form of communication and could also be used in the area of human-computer interfaces. A major improvement of the proposed approach is that it allows estimation of head poses at high yaw/pitch angles when compared with existing geometric approaches, does not require expensive data preparation and training, and is generally fast when compared with other approaches.

  8. SMEs and their E-Commerce: Implications for Training in Wellington, New Zealand

    ERIC Educational Resources Information Center

    Beal, Tim; Abdullah, Moha Asri

    2005-01-01

One of the greatest challenges facing traditional small and medium-sized enterprises (SMEs) throughout the world is that posed by the Internet. While the Internet offers great potential to SMEs, from improving and lowering the cost of production processes through to reaching global customers, it also poses great problems. SMEs' resources, human and…

  9. Hadfield poses with MSL FLSS in the Node 2

    NASA Image and Video Library

    2012-12-23

View of Canada Space Agency (CSA) Chris Hadfield, Expedition 34 Flight Engineer (FE), poses with a Materials Science Laboratory (MSL) Furnace Launch Support Structure (FLSS) in the U.S. Laboratory. Tom Marshburn (background), Expedition 34 FE, uses a laptop computer. Photo was taken during Expedition 34.

  10. Preconditioner and convergence study for the Quantum Computer Aided Design (QCAD) nonlinear poisson problem posed on the Ottawa Flat 270 design geometry.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalashnikova, Irina

    2012-05-01

A numerical study aimed at evaluating different preconditioners within the Trilinos Ifpack and ML packages for the Quantum Computer Aided Design (QCAD) non-linear Poisson problem, implemented within the Albany code base and posed on the Ottawa Flat 270 design geometry, is performed. This study led to some new development of Albany that allows the user to select an ML preconditioner with Zoltan repartitioning based on nodal coordinates, which is summarized. Convergence of the numerical solutions computed within the QCAD computational suite with successive mesh refinement is examined in two metrics: the mean value of the solution (an L1 norm) and the field integral of the solution (an L2 norm).

  11. The Marihuana Dilemma: Challenge to Commanders.

    DTIC Science & Technology

    The marihuana dilemma poses a major challenge to commanders in the US Army today. The problem was analyzed as to the characteristics of the drug...available to commanders to meet the challenge. The essay concludes that marihuana should not be legalized; drug users or former drug users should not be

  12. Student Media Production to Meet Challenges in Climate Change Science Education

    ERIC Educational Resources Information Center

    Rooney-Varga, Juliette N.; Brisk, Angelica Allende; Adams, Elizabeth; Shuldman, Elizabeth; Rath, Kenneth

    2014-01-01

    While the need for effective climate change education is growing, this area of geoscience also poses unique educational challenges. These challenges include the politicization of climate change, the psychological and affective responses it elicits, and common misconceptions, which can all create barriers to learning. Here, we present an…

  13. Transforming Teaching Challenges into Learning Opportunities: Interdisciplinary Reflective Collaboration

    ERIC Educational Resources Information Center

    Callaghan, Ronel

    2015-01-01

    Teaching in higher education poses unique sets of challenges, especially for academics in the engineering, built sciences and information science education disciplines. This article focuses on how reflective collaboration can support academics in their quest to find unique solutions to challenges in different academic contexts. A reflective…

  14. Tracking Climate Change through the Spatiotemporal Dynamics of the Teletherms, the Statistically Hottest and Coldest Days of the Year

    DOE PAGES

    Dodds, Peter Sheridan; Mitchell, Lewis; Reagan, Andrew J.; ...

    2016-05-11

    Instabilities and long-term shifts in seasons, whether induced by natural drivers or human activities, pose great disruptive threats to ecological, agricultural, and social systems. Here, we propose, measure, and explore two fundamental markers of location-sensitive seasonal variations: the Summer and Winter Teletherms, the on-average annual dates of the hottest and coldest days of the year. We analyze daily temperature extremes recorded at 1218 stations across the contiguous United States from 1853–2012, and observe large regional variation, with the Summer Teletherm falling up to 90 days after the Summer Solstice and the Winter Teletherm up to 50 days after the Winter Solstice. We show that Teletherm temporal dynamics are substantive, with clear and in some cases dramatic shifts reflective of system bifurcations. We also compare recorded daily temperature extremes with output from two regional climate models, finding considerable though relatively unbiased error. In conclusion, our work demonstrates that Teletherms are an intuitive, powerful, and statistically sound measure of local climate change, and that they pose detailed, stringent challenges for future theoretical and computational models.
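
    The Teletherm reduces to a simple computation: average each calendar day's temperature extreme across the record, then take the day with the largest mean. A minimal sketch on synthetic data (the station records and any smoothing used in the paper are not reproduced here):

```python
import numpy as np

def summer_teletherm(day_of_year, temps):
    """Estimate the Summer Teletherm: the day of year whose
    multi-year average daily temperature is highest."""
    days = np.asarray(day_of_year)
    temps = np.asarray(temps, dtype=float)
    # Average temperature for each calendar day across all years.
    means = np.array([temps[days == d].mean() for d in range(1, 366)])
    return int(np.argmax(means)) + 1  # 1-indexed day of year

# Synthetic 10-year record whose temperature peaks ~30 days after
# the Summer Solstice (day 172), i.e. around day 202.
years = 10
days = np.tile(np.arange(1, 366), years)
temps = 20 + 10 * np.cos(2 * np.pi * (days - 202) / 365)
print(summer_teletherm(days, temps))  # 202
```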

  15. Local binary pattern variants-based adaptive texture features analysis for posed and nonposed facial expression recognition

    NASA Astrophysics Data System (ADS)

    Sultana, Maryam; Bhatti, Naeem; Javed, Sajid; Jung, Soon Ki

    2017-09-01

    Facial expression recognition (FER) is an important task for various computer vision applications. The task becomes challenging when it requires the detection and encoding of macro- and micropatterns of facial expressions. We present a two-stage texture feature extraction framework based on the local binary pattern (LBP) variants and evaluate its significance in recognizing posed and nonposed facial expressions. We focus on the parametric limitations of the LBP variants and investigate their effects for optimal FER. The size of the local neighborhood is an important parameter of the LBP technique for its extraction in images. To make the LBP adaptive, we exploit the granulometric information of the facial images to find the local neighborhood size for the extraction of center-symmetric LBP (CS-LBP) features. Our two-stage texture representations consist of an LBP variant and the adaptive CS-LBP features. Among the presented two-stage texture feature extractions, the binarized statistical image features and adaptive CS-LBP features were found to yield high FER rates. Evaluation of the adaptive texture features shows performance competitive with the nonadaptive features and higher than that of other state-of-the-art approaches.
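
    The CS-LBP operator mentioned above compares center-symmetric pairs of neighbors rather than each neighbor against the center pixel, halving the code length. A minimal sketch for a single 3x3 patch (the adaptive, granulometry-driven neighborhood sizing described in the abstract is not modeled here):

```python
import numpy as np

def cs_lbp_code(patch, threshold=0.0):
    """Center-symmetric LBP code for a 3x3 patch: compare the four
    center-symmetric neighbor pairs, yielding a 4-bit code (0..15)."""
    # The 8 neighbors of the center pixel, in circular order.
    n = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
         patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    code = 0
    for i in range(4):  # pairs (n0,n4), (n1,n5), (n2,n6), (n3,n7)
        if n[i] - n[i + 4] > threshold:
            code |= 1 << i
    return code

patch = np.array([[9, 2, 7],
                  [4, 5, 4],
                  [1, 8, 3]], dtype=float)
print(cs_lbp_code(patch))  # 5: pairs 0 and 2 fire, pairs 1 and 3 do not
```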

  16. Docking pose selection by interaction pattern graph similarity: application to the D3R grand challenge 2015.

    PubMed

    Slynko, Inna; Da Silva, Franck; Bret, Guillaume; Rognan, Didier

    2016-09-01

    High affinity ligands for a given target tend to share key molecular interactions with important anchoring amino acids and therefore often present quite conserved interaction patterns. This simple concept was formalized in a topological knowledge-based scoring function (GRIM) for selecting the most appropriate docking poses from previously X-rayed interaction patterns. GRIM first converts protein-ligand atomic coordinates (docking poses) into a simple 3D graph describing the corresponding interaction pattern. In a second step, the resulting graphs are compared to those found in template structures in the Protein Data Bank. Last, all docking poses are rescored according to an empirical score (GRIMscore) accounting for overlap of maximum common subgraphs. Taking advantage of the public D3R Grand Challenge 2015, GRIM was used to rescore docking poses for 36 ligands (6 HSP90α inhibitors, 30 MAP4K4 inhibitors) prior to the release of the corresponding protein-ligand X-ray structures. When applied to the HSP90α dataset, for which many protein-ligand X-ray structures are already available, GRIM provided very high quality solutions (mean rmsd = 1.06 Å, n = 6) as top-ranked poses, and significantly outperformed a state-of-the-art scoring function. In the case of MAP4K4 inhibitors, for which preexisting 3D knowledge is scarce and chemical diversity is much larger, the accuracy of GRIM poses declines (mean rmsd = 3.18 Å, n = 30), although GRIM still outperforms an energy-based scoring function. GRIM rescoring appears to be quite robust in comparison with the other approaches competing in the same challenge (42 submissions for the HSP90 dataset, 27 for the MAP4K4 dataset), ranking 3rd and 2nd, respectively, for the two investigated datasets. The rescoring method is quite simple to implement, independent of the docking engine, and applicable to any target for which at least one holo X-ray structure is available.
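
    As a rough illustration of pattern-based rescoring (not the actual GRIM algorithm, which compares full 3D interaction graphs via maximum common subgraphs), poses can be ranked by how much of a template's interaction pattern they reproduce. Residue names and interaction types below are invented:

```python
def pattern_overlap(pose_pattern, template_pattern):
    """Toy stand-in for a GRIM-like score: fraction of the template's
    interaction pattern reproduced by a docking pose. Patterns here
    are sets of (residue, interaction_type) pairs rather than graphs."""
    if not template_pattern:
        return 0.0
    return len(pose_pattern & template_pattern) / len(template_pattern)

# Hypothetical template pattern from an X-rayed complex.
template = {("ASP93", "hbond"), ("PHE138", "aromatic"), ("LYS58", "ionic")}
pose_a = {("ASP93", "hbond"), ("PHE138", "aromatic"), ("LEU48", "hydrophobic")}
pose_b = {("LEU48", "hydrophobic")}

# Rank poses by how well they reproduce the template pattern.
ranked = sorted([("pose_a", pose_a), ("pose_b", pose_b)],
                key=lambda p: pattern_overlap(p[1], template), reverse=True)
print([name for name, _ in ranked])  # ['pose_a', 'pose_b']
```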

  17. Hadfield poses with MSL FLSS in the Node 2

    NASA Image and Video Library

    2012-12-23

    ISS034-E-010603 (28 Dec. 2012) --- Canadian Space Agency astronaut Chris Hadfield, Expedition 34 flight engineer, poses with a Materials Science Laboratory (MSL) Furnace Launch Support Structure (FLSS) in the Destiny laboratory of the International Space Station. NASA astronaut Tom Marshburn, flight engineer, uses a computer in the background.

  18. Joint Model and Parameter Dimension Reduction for Bayesian Inversion Applied to an Ice Sheet Flow Problem

    NASA Astrophysics Data System (ADS)

    Ghattas, O.; Petra, N.; Cui, T.; Marzouk, Y.; Benjamin, P.; Willcox, K.

    2016-12-01

    Model-based projections of the dynamics of the polar ice sheets play a central role in anticipating future sea level rise. However, a number of mathematical and computational challenges place significant barriers on improving predictability of these models. One such challenge is caused by the unknown model parameters (e.g., in the basal boundary conditions) that must be inferred from heterogeneous observational data, leading to an ill-posed inverse problem and the need to quantify uncertainties in its solution. In this talk we discuss the problem of estimating the uncertainty in the solution of (large-scale) ice sheet inverse problems within the framework of Bayesian inference. Computing the general solution of the inverse problem--i.e., the posterior probability density--is intractable with current methods on today's computers, due to the expense of solving the forward model (3D full Stokes flow with nonlinear rheology) and the high dimensionality of the uncertain parameters (which are discretizations of the basal sliding coefficient field). To overcome these twin computational challenges, it is essential to exploit problem structure (e.g., sensitivity of the data to parameters, the smoothing property of the forward model, and correlations in the prior). To this end, we present a data-informed approach that identifies low-dimensional structure in both parameter space and the forward model state space. This approach exploits the fact that the observations inform only a low-dimensional parameter space and allows us to construct a parameter-reduced posterior. Sampling this parameter-reduced posterior still requires multiple evaluations of the forward problem, therefore we also aim to identify a low dimensional state space to reduce the computational cost. 
To this end, we apply a proper orthogonal decomposition (POD) approach to approximate the state using a low-dimensional manifold constructed using "snapshots" from the parameter-reduced posterior, and the discrete empirical interpolation method (DEIM) to approximate the nonlinearity in the forward problem. We show that using only a limited number of forward solves, the resulting subspaces lead to an efficient method to explore the high-dimensional posterior.
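
    The POD step mentioned above can be sketched with a plain SVD of the snapshot matrix; the snapshots here are synthetic, not output of the ice sheet model:

```python
import numpy as np

def pod_basis(snapshots, r):
    """Proper orthogonal decomposition: the leading r left singular
    vectors of the snapshot matrix (columns = state snapshots)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r], s

rng = np.random.default_rng(0)
# Synthetic snapshots living (up to small noise) in a 3-dimensional
# subspace of a 200-dimensional state space.
modes = rng.standard_normal((200, 3))
coeffs = rng.standard_normal((3, 40))
X = modes @ coeffs + 1e-6 * rng.standard_normal((200, 40))

U, s = pod_basis(X, 3)
# The first 3 singular values dominate: 3 POD modes capture the state.
energy = (s[:3] ** 2).sum() / (s ** 2).sum()
print(U.shape, energy > 0.999)
```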

  19. Security model for VM in cloud

    NASA Astrophysics Data System (ADS)

    Kanaparti, Venkataramana; Naveen K., R.; Rajani, S.; Padmvathamma, M.; Anitha, C.

    2013-03-01

    Cloud computing is a new approach that emerged to meet the ever-increasing demand for computing resources and to reduce operational costs and capital expenditure for IT services. As this new way of computation allows data and applications to be stored away from the corporate server, it brings more security issues, such as virtualization security, distributed computing, application security, identity management, access control and authentication. Even though virtualization forms the basis for cloud computing, it poses many threats in securing the cloud. As most security threats lie at the virtualization layer in the cloud, we propose a new Security Model for Virtual Machine in Cloud (SMVC), in which every process is authenticated by a Trusted-Agent (TA) in the hypervisor as well as in the VM. Our proposed model is designed to withstand attacks by unauthorized processes that pose a threat to applications related to data mining, OLAP systems and image processing, which require huge resources in the cloud deployed on one or more VMs.

  20. Overview of Variable-Speed Power-Turbine Research

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.

    2011-01-01

    The vertical take-off and landing (VTOL) and high-speed cruise capability of the NASA Large Civil Tilt-Rotor (LCTR) notional vehicle is envisaged to enable increased throughput in the national airspace. A key challenge of the LCTR is the requirement to vary the main rotor speeds from 100% at take-off to near 50% at cruise as required to minimize mission fuel burn. The variable-speed power-turbine (VSPT), driving a fixed gear-ratio transmission, provides one approach for effecting this wide speed variation. The key aerodynamic and rotordynamic challenges of the VSPT were described in the FAP Conference presentation. The challenges include maintaining high turbine efficiency at high work factor, wide (60 deg.) incidence variation in all blade rows due to the speed variation, and operation at low Reynolds numbers (with transitional flow). The PT-shaft of the VSPT must be designed for safe operation in the wide speed range required, and therefore poses challenges associated with rotordynamics. The technical challenges drive research activities underway at NASA. An overview of the NASA SRW VSPT research activities was provided. These activities included conceptual and preliminary aero and mechanical (rotordynamics) design of the VSPT for the LCTR application, experimental and computational research supporting the development of incidence-tolerant blading, and steps toward component-level testing of a variable-speed power-turbine of relevance to the LCTR application.

  1. Development of vaccines for bio-warfare agents.

    PubMed

    Rosenthal, S R; Clifford, J C M

    2002-01-01

    There is a recognized need for the development of new vaccines (as well as other biologicals and drugs) to counteract the effects of a potential bio-terrorist or bio-warfare event in the U.S. domestic population and military forces. Regulation of products to protect against potential bio-warfare agents poses unique challenges since the usual measures of efficacy that require exposure to natural disease may not currently be possible, for epidemiological and ethical reasons. To help to address this issue, the FDA has published and requested comments on a proposed animal rule intended to address certain efficacy issues for new agents for use against lethal or permanently disabling toxic substances. Recent product development activity has focused on Bacillus anthracis (anthrax) and variola major (smallpox), agents that are regarded as highest priority in posing a risk to national security. FDA resources exist to assist vaccine developers with regard to the novel challenges posed in the clinical development of these products.

  2. eHealth in cardiovascular medicine: A clinical update.

    PubMed

    Saner, Hugo; van der Velde, Enno

    2016-10-01

    Demographic changes, progress in medical technology and regional problems in providing healthcare to low-density populations are posing great challenges to our healthcare systems. Rapid progress in computer science and information technology has a great impact on the way healthcare will be delivered in the near future. This article describes opportunities and challenges of eHealth and telemedicine in the framework of our health systems and, in particular, in the context of today's cardiology services. The most promising applications of eHealth and telemedicine include: (a) prevention and lifestyle interventions; (b) chronic disease management including hypertension, diabetes and heart failure; (c) arrhythmia detection including early detection of atrial fibrillation and telemonitoring of devices such as pacemakers, internal cardioverter defibrillators and implantable rhythm monitoring devices; (d) telerehabilitation. Major obstacles to the integration of eHealth and telemedicine into daily clinical practice include limited large-scale evidence, in particular for cost-effectiveness, as well as lack of interoperability, inadequate or fragmented legal frameworks and lack of reimbursement. An important challenge for those involved in these new technologies will be to keep the main focus on the patient's individual needs and to carefully evaluate the evidence behind the practice. © The European Society of Cardiology 2016.

  3. A General-purpose Framework for Parallel Processing of Large-scale LiDAR Data

    NASA Astrophysics Data System (ADS)

    Li, Z.; Hodgson, M.; Li, W.

    2016-12-01

    Light detection and ranging (LiDAR) technologies have proven efficient at quickly obtaining very detailed Earth surface data over large spatial extents. Such data are important for scientific discoveries in the Earth and ecological sciences and for natural disaster and environmental applications. However, handling LiDAR data poses grand geoprocessing challenges due to both data intensity and computational intensity. Previous studies achieved notable success in parallel processing of LiDAR data to address these challenges. However, these studies either relied on high performance computers and specialized hardware (GPUs) or focused mostly on finding customized solutions for specific algorithms. We developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle big LiDAR data. Specifically, 1) a tile-based spatial index is proposed to manage big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, 2) two spatial decomposition techniques are developed to enable efficient parallelization of different types of LiDAR processing tasks, and 3) by coupling existing LiDAR processing tools with Hadoop, the framework is able to conduct a variety of LiDAR data processing tasks in parallel in a highly scalable distributed computing environment. The performance and scalability of the framework are evaluated with a series of experiments conducted on a real LiDAR dataset using a proof-of-concept prototype system. The results show that the proposed framework 1) is able to handle massive LiDAR data more efficiently than standalone tools; and 2) provides almost linear scalability in terms of either increased workload (data volume) or increased computing nodes with both spatial decomposition strategies. We believe the proposed framework provides valuable references for developing a collaborative cyberinfrastructure for processing big earth science data in a highly scalable environment.
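
    The tile-based spatial index at the heart of such a framework can be sketched as a floor division of point coordinates into a regular grid; tile size and coordinates below are illustrative:

```python
import numpy as np

def tile_ids(x, y, tile_size):
    """Map point coordinates to (row, col) tile indices for a
    regular tile-based spatial index over a LiDAR point cloud."""
    col = np.floor(np.asarray(x) / tile_size).astype(int)
    row = np.floor(np.asarray(y) / tile_size).astype(int)
    return list(zip(row.tolist(), col.tolist()))

# Points grouped by tile can be processed independently in parallel,
# e.g. one map task per tile in a Hadoop-style framework.
xs = [12.0, 955.5, 1001.0]
ys = [88.0, 120.2, 2500.0]
print(tile_ids(xs, ys, tile_size=1000.0))  # [(0, 0), (0, 0), (2, 1)]
```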

  4. NASA Space Rocket Logistics Challenges

    NASA Technical Reports Server (NTRS)

    Bramon, Chris; Neeley, James R.; Jones, James V.; Watson, Michael D.; Inman, Sharon K.; Tuttle, Loraine

    2014-01-01

    The Space Launch System (SLS) is the new NASA heavy lift launch vehicle in development and is scheduled for its first mission in 2017. SLS has many of the same logistics challenges as any other large-scale program. However, SLS also faces unique challenges. This presentation will address the SLS challenges, along with the analysis and decisions to mitigate the threats posed by each.

  5. Communicating about smoke from wildland fire: challenges and ways to address them

    Treesearch

    Christine S. Olsen; Danielle K. Mazzotta; Eric Toman; A. Paige Fischer

    2014-01-01

    Wildland fire and associated management efforts are dominant topics in natural resource fields. Smoke from fires can be a nuisance and pose serious health risks and aggravate pre-existing health conditions. When it results in reduced visibility near roadways, smoke can also pose hazardous driving conditions and reduce the scenic value of vistas. Communicating about...

  6. Tracking the Careers of Graduates: A New Agenda for Graduate Schools

    ERIC Educational Resources Information Center

    Stewart, Debra W.

    2013-01-01

    As candidates in the 2012 election debated issues raised by the state of the US economy, unemployment statistics and job creation took center stage. The problems under discussion posed (and continue to pose) a particularly clear and pressing challenge to the nation's graduate schools. While the US enjoys a reputation for having the most dynamic…

  7. Investigating power capping toward energy-efficient scientific applications: Investigating Power Capping toward Energy-Efficient Scientific Applications

    DOE PAGES

    Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...

    2018-03-22

    The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis will enable us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
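
    The trade-off such studies measure can be illustrated with invented numbers: under a tighter power cap, a compute-bound kernel typically loses throughput while a memory-bound one barely changes, so energy efficiency (performance per watt) can improve even as raw performance drops. All figures below are hypothetical, not measurements from the paper:

```python
def efficiency(gflops, watts):
    """Energy efficiency in GFLOP/s per watt."""
    return gflops / watts

# Hypothetical sustained performance (GFLOP/s) for a compute-bound
# kernel and a memory-bound kernel under two power caps (watts).
runs = {
    ("compute-bound", 250): 900.0,
    ("compute-bound", 150): 600.0,
    ("memory-bound", 250): 120.0,
    ("memory-bound", 150): 118.0,
}
for (kernel, cap), perf in runs.items():
    print(kernel, cap, round(efficiency(perf, cap), 2))
```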

  9. Guest editorial: Introduction to the special issue on modern control for computer games.

    PubMed

    Argyriou, Vasileios; Kotsia, Irene; Zafeiriou, Stefanos; Petrou, Maria

    2013-12-01

    A typical gaming scenario, as developed in the past 20 years, involves a player interacting with a game using a specialized input device, such as a joystick, a mouse, a keyboard, etc. Recent technological advances and new sensors (for example, low cost commodity depth cameras) have enabled the introduction of more elaborate approaches in which the player is now able to interact with the game using his body pose, facial expressions, actions, and even his physiological signals. A new era of games has already started, employing computer vision techniques, brain-computer interface systems, haptic and wearable devices. The future lies in games that will be intelligent enough not only to extract the player's commands provided by his speech and gestures but also his behavioral cues, as well as his/her emotional states, and adjust their game plot accordingly in order to ensure a more realistic and satisfying gameplay experience. This special issue on modern control for computer games discusses several interdisciplinary factors that influence a user's input to a game, something directly linked to the gaming experience. These include, but are not limited to, the following: behavioral affective gaming, user satisfaction and perception, motion capture and scene modeling, and complete software frameworks that address several challenges arising in such scenarios.

  10. Development and refinement of computer-assisted planning and execution system for use in face-jaw-teeth transplantation to improve skeletal and dento-occlusal outcomes.

    PubMed

    Hashemi, Sepehr; Armand, Mehran; Gordon, Chad R

    2016-10-01

    To describe the development and refinement of the computer-assisted planning and execution (CAPE) system for use in face-jaw-teeth transplants (FJTTs). Although successful, some maxillofacial transplants result in suboptimal hybrid occlusion and may require subsequent surgical orthognathic revisions. Unfortunately, the use of traditional dental casts and splints poses several shortcomings in the context of FJTT and hybrid occlusion. Computer-assisted surgery may overcome these challenges. Therefore, the use of computer-assisted orthognathic techniques and functional planning may prevent the need for such revisions and improve facial-skeletal outcomes. A comprehensive CAPE system for use in FJTT was developed through a multicenter collaboration and refined using plastic models, live miniature swine surgery, and human cadaver models. The system marries preoperative surgical planning and intraoperative execution by allowing on-table navigation of the donor fragment relative to the recipient cranium, and real-time reporting of the patient's cephalometric measurements relative to a desired dental-skeletal outcome. FJTTs using live-animal and cadaveric models demonstrate the CAPE system to be accurate in navigation and beneficial in improving hybrid occlusion and other craniofacial outcomes. Future refinement of the CAPE system includes integration of more commonly performed orthognathic/maxillofacial procedures.

  11. Binding pose and affinity prediction in the 2016 D3R Grand Challenge 2 using the Wilma-SIE method

    NASA Astrophysics Data System (ADS)

    Hogues, Hervé; Sulea, Traian; Gaudreault, Francis; Corbeil, Christopher R.; Purisima, Enrico O.

    2018-01-01

    The Farnesoid X receptor (FXR) exhibits significant backbone movement in response to the binding of various ligands and can be a challenge for pose prediction algorithms. As part of the D3R Grand Challenge 2, we tested Wilma-SIE, a rigid-protein docking method, on a set of 36 FXR ligands for which the crystal structures had originally been blinded. These ligands covered several classes of compounds. To overcome the rigid protein limitations of the method, we used an ensemble of publicly available structures for FXR from the PDB. The use of the ensemble allowed Wilma-SIE to predict poses with average and median RMSDs of 2.3 and 1.4 Å, respectively. It was quite clear, however, that had we used a single structure for the receptor the success rate would have been much lower. The most successful predictions were obtained on chemical classes for which one or more crystal structures of the receptor bound to a molecule of the same class was available. In the absence of a crystal structure for the class, observing a consensus binding mode for the ligands of the class using one or more receptor structures of other classes seemed to be indicative of a reasonable pose prediction. Affinity prediction proved to be more challenging with generally poor correlation with experimental IC50s (Kendall tau 0.3). Even when the 36 crystal structures were used the accuracy of the predicted affinities was not appreciably improved. A possible cause of difficulty is the internal energy strain arising from conformational differences in the receptor across complexes, which may need to be properly estimated and incorporated into the SIE scoring function.

  12. Incorporating Qualitative Evidence in Systematic Reviews: Strategies and Challenges

    ERIC Educational Resources Information Center

    Caracelli, Valerie J.; Cooksy, Leslie J.

    2013-01-01

    The quality of mixed methods systematic reviews relies on the quality of primary-level studies. The synthesis of qualitative evidence and the recent development of synthesizing mixed methods studies hold promise, but also pose challenges to evidence synthesis.

  13. Variation in Transpiration Efficiency in Sorghum

    USDA-ARS?s Scientific Manuscript database

    Declining freshwater resources, increasing population, and growing demand for biofuels pose new challenges for agriculture research. To meet these challenges, the concept "Blue Revolution" (more crop per drop) was proposed to improve water productivity in agriculture. We have identified several so...

  14. The Challenge of Teacher Retention in Urban Schools: Evidence of Variation from a Cross-Site Analysis

    ERIC Educational Resources Information Center

    Papay, John P.; Bacher-Hicks, Andrew; Page, Lindsay C.; Marinell, William H.

    2017-01-01

    Substantial teacher turnover poses a challenge to staffing public schools with effective teachers. The scope of the teacher retention challenge across school districts, however, remains poorly defined. Applying consistent data practices and analytical techniques to administrative data sets from 16 urban districts, we document substantial…

  15. The Enduring Challenges in Public Management: Surviving and Excelling in a Changing World.

    ERIC Educational Resources Information Center

    Halachmi, Arie, Ed.; Bouckaert, Geert, Ed.

    This book addresses in different ways the prospect of improving the performance of government and nonprofit organizations. The chapters are clustered around enduring management challenges that may influence productivity in the public sector. Fifteen chapters discuss the challenges to productivity posed by immediate and distant changes in the…

  16. Influenza Vaccines: Challenges and Solutions

    PubMed Central

    Houser, Katherine; Subbarao, Kanta

    2015-01-01

    Vaccination is the best method for the prevention and control of influenza. Vaccination can reduce illness and lessen severity of infection. This review focuses on how currently licensed influenza vaccines are generated in the U.S., why the biology of influenza poses vaccine challenges, and vaccine approaches on the horizon that address these challenges. PMID:25766291

  17. A study on facial expressions recognition

    NASA Astrophysics Data System (ADS)

    Xu, Jingjing

    2017-09-01

    In terms of communication, postures and facial expressions of feelings like happiness, anger and sadness play important roles in conveying information. With the development of the technology, a number of algorithms dealing with face alignment, face landmark detection, classification, facial landmark localization and pose estimation have recently been put forward. However, there are still many challenges and problems that need to be addressed. In this paper, a few technologies are summarized and analyzed, all relating to facial expression recognition and pose handling: the pose-indexed based multi-view method for face alignment, robust facial landmark detection under significant head pose and occlusion, partitioning the input domain for classification, and robust statistical face frontalization.

  18. Human pose tracking from monocular video by traversing an image motion mapped body pose manifold

    NASA Astrophysics Data System (ADS)

    Basu, Saurav; Poulin, Joshua; Acton, Scott T.

    2010-01-01

    Tracking human pose from monocular video sequences is a challenging problem due to the large number of independent parameters affecting image appearance and the nonlinear relationships between generating parameters and the resultant images. Unlike the current practice of fitting interpolation functions to point correspondences between underlying pose parameters and image appearance, we exploit the relationship between pose parameters and image motion flow vectors in a physically meaningful way. Change in image appearance due to pose change is realized as navigating a low-dimensional submanifold of the infinite-dimensional Lie group of diffeomorphisms of the two-dimensional sphere S². For small changes in pose, image motion flow vectors lie on the tangent space of the submanifold. Any observed image motion flow vector field is decomposed into the basis motion flow fields on the tangent space, and the combination weights are used to update corresponding pose changes in the different dimensions of the pose parameter space. Image motion flow vectors are largely invariant to style changes in experiments with synthetic and real data where the subjects exhibit variation in appearance and clothing. The experiments demonstrate the robustness of our method (within +/-4° of ground truth) to style variance.
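
    The decomposition step described above, expressing an observed flow field in a basis of tangent-space flows, amounts to a least-squares fit. A sketch with synthetic uniform flows (the actual basis fields would come from the learned pose manifold, not constructed by hand):

```python
import numpy as np

def decompose_flow(observed, basis_flows):
    """Least-squares weights expressing an observed image motion flow
    field as a combination of basis flow fields (tangent vectors of
    the pose submanifold). Flow fields are flattened to 1-D vectors."""
    B = np.column_stack([b.ravel() for b in basis_flows])
    w, *_ = np.linalg.lstsq(B, observed.ravel(), rcond=None)
    return w

# Two synthetic basis flows on a 4x4 grid, one per pose dimension.
b1 = np.zeros((4, 4, 2)); b1[..., 0] = 1.0   # uniform horizontal flow
b2 = np.zeros((4, 4, 2)); b2[..., 1] = 1.0   # uniform vertical flow
observed = 0.5 * b1 - 2.0 * b2               # a known combination

w = decompose_flow(observed, [b1, b2])
print(np.round(w, 3))  # recovers the weights [0.5, -2.0]
```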

  19. Numerical simulations of flying and swimming of biological systems with the viscous vortex particle method

    NASA Astrophysics Data System (ADS)

    Eldredge, Jeff

    2005-11-01

    Many biological mechanisms of locomotion involve the interaction of a fluid with a deformable surface undergoing large unsteady motion. Analysis of such problems poses a significant challenge to conventional grid-based computational approaches. Particularly in the moderate Reynolds number regime where many insects and fish function, viscous and inertial processes are both important, and vorticity serves a crucial role. In this work, the viscous vortex particle method is shown to provide an efficient, intuitive simulation approach for investigation of these biological systems. In contrast with a grid-based approach, the method solves the Navier--Stokes equations by tracking computational particles that carry smooth blobs of vorticity and exchange strength with one another to account for viscous diffusion. Thus, computational resources are focused on the physically relevant features of the flow, and there is no need for artificial boundary conditions. Building from previously-developed techniques for the creation of vorticity to enforce no-throughflow and no-slip conditions, the present method is extended to problems of coupled fluid--body dynamics by enforcement of global conservation of momenta. The application to several two-dimensional model problems is demonstrated, including single and multiple flapping wings and free swimming of a three-linkage fish.

  20. Planning/scheduling techniques for VQ-based image compression

    NASA Technical Reports Server (NTRS)

    Short, Nicholas M., Jr.; Manohar, Mareboyana; Tilton, James C.

    1994-01-01

    The enormous size of the data holdings and the complexity of the information system resulting from the EOS system pose several challenges to computer scientists, one of which is data archival and dissemination. More than ninety percent of NASA's data holdings are in the form of images, which will be accessed by users across computer networks. Accessing the image data at its full resolution creates data traffic problems. Image browsing using lossy compression reduces this data traffic, as well as storage, by a factor of 30-40. Of the several image compression techniques, VQ is most appropriate for this application since decompression of VQ-compressed images is a table-lookup process that makes minimal additional demands on the user's computational resources. Lossy compression of image data generally requires expert-level knowledge and is not straightforward to use; this is especially true of VQ, which involves selecting appropriate codebooks for a given data set, vector dimensions for each compression ratio, and so on. A planning and scheduling system is described for using the VQ compression technique in the data access and ingest of raw satellite data.
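    The asymmetry that makes VQ attractive here — expensive encoding, trivial table-lookup decoding — can be sketched as follows (codebook size, block size and data are invented for illustration):

```python
import numpy as np

# Minimal vector-quantization sketch (illustrative, not the paper's system):
# compression maps each image vector to the index of its nearest codeword;
# decompression is a pure table lookup into the codebook.
rng = np.random.default_rng(1)
codebook = rng.standard_normal((256, 16))        # 256 codewords for 4x4 blocks
blocks = rng.standard_normal((100, 16))          # image split into 4x4 vectors

# Encode: nearest codeword index per block (the expensive step).
d2 = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
indices = d2.argmin(axis=1)                      # one byte replaces 16 pixels

# Decode: table lookup only -- minimal demand on the user's machine.
reconstructed = codebook[indices]
print(indices.shape, reconstructed.shape)        # (100,) (100, 16)
```

    The browsing client only ever performs the final lookup, which is why VQ suits low-resource network access.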

  1. Hybrid RANS-LES using high order numerical methods

    NASA Astrophysics Data System (ADS)

    Henry de Frahan, Marc; Yellapantula, Shashank; Vijayakumar, Ganesh; Knaus, Robert; Sprague, Michael

    2017-11-01

    Understanding the impact of wind turbine wake dynamics on downstream turbines is particularly important for the design of efficient wind farms. Due to their tractable computational cost, hybrid RANS/LES models are an attractive framework for simulating separated flows such as the wake dynamics behind a wind turbine. High-order numerical methods can be computationally efficient and provide increased accuracy in simulating complex flows. In the context of LES, high-order numerical methods have shown some success in predictions of turbulent flows. However, the specifics of hybrid RANS-LES models, including the transition region between both modeling frameworks, pose unique challenges for high-order numerical methods. In this work, we study the effect of increasing the order of accuracy of the numerical scheme in simulations of canonical turbulent flows using RANS, LES, and hybrid RANS-LES models. We describe the interactions between filtering, model transition, and order of accuracy and their effect on turbulence quantities such as kinetic energy spectra, boundary layer evolution, and dissipation rate. This work was funded by the U.S. Department of Energy, Exascale Computing Project, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.

  2. Toward real-time virtual biopsy of oral lesions using confocal laser endomicroscopy interfaced with embedded computing.

    PubMed

    Thong, Patricia S P; Tandjung, Stephanus S; Movania, Muhammad Mobeen; Chiew, Wei-Ming; Olivo, Malini; Bhuvaneswari, Ramaswamy; Seah, Hock-Soon; Lin, Feng; Qian, Kemao; Soo, Khee-Chee

    2012-05-01

    Oral lesions are conventionally diagnosed using white light endoscopy and histopathology. This can pose a challenge because the lesions may be difficult to visualise under white light illumination. Confocal laser endomicroscopy can be used for confocal fluorescence imaging of surface and subsurface cellular and tissue structures. To move toward real-time "virtual" biopsy of oral lesions, we interfaced an embedded computing system to a confocal laser endomicroscope to achieve a prototype three-dimensional (3-D) fluorescence imaging system. A field-programmable gate array computing platform was programmed to enable synchronization of cross-sectional image grabbing and Z-depth scanning, automate the acquisition of confocal image stacks and perform volume rendering. Fluorescence imaging of the human and murine oral cavities was carried out using the fluorescent dyes fluorescein sodium and hypericin. Volume rendering of cellular and tissue structures from the oral cavity demonstrates the potential of the system for 3-D fluorescence visualization of the oral cavity in real time. We aim toward achieving a real-time virtual biopsy technique that can complement current diagnostic techniques and aid in targeted biopsy for better clinical outcomes.

  3. PCA-based artifact removal algorithm for stroke detection using UWB radar imaging.

    PubMed

    Ricci, Elisa; di Domenico, Simone; Cianca, Ernestina; Rossi, Tommaso; Diomedi, Marina

    2017-06-01

    Stroke patients should be dispatched to the highest level of care available in the shortest time. In this context, a transportable system in specialized ambulances, able to evaluate the presence of an acute brain lesion in a short time interval (i.e., a few minutes), could shorten the delay to treatment. UWB radar imaging is an emerging diagnostic branch that has great potential for the implementation of a transportable and low-cost device. Transportability, low cost and short response time pose challenges to the signal processing algorithms for the backscattered signals, as they should guarantee good performance with a reasonably low number of antennas and low computational complexity, tightly related to the response time of the device. The paper shows that a PCA-based preprocessing algorithm can: (1) achieve good performance already with a computationally simple beamforming algorithm; (2) outperform state-of-the-art preprocessing algorithms; (3) enable a further improvement in the performance (and/or a decrease in the number of antennas) by using a multistatic approach with just a modest increase in computational complexity. This is an important result toward the implementation of such a diagnostic device that could play an important role in emergency scenarios.
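    The PCA preprocessing idea — suppressing the strong artifact common to all antenna channels so the weak target response survives — can be sketched as follows. The data layout and component count are assumptions for illustration, not the authors' configuration:

```python
import numpy as np

# Backscattered signals from N antennas are stacked as rows; the dominant
# singular component captures the clutter common to all channels, and
# subtracting it leaves the weaker per-channel response for beamforming.
rng = np.random.default_rng(2)
n_antennas, n_samples = 16, 512
artifact = rng.standard_normal(n_samples)                        # common clutter
signals = np.tile(artifact, (n_antennas, 1)) \
          + 0.05 * rng.standard_normal((n_antennas, n_samples))  # weak residual signal

# PCA via SVD; remove the first k principal components.
U, s, Vt = np.linalg.svd(signals, full_matrices=False)
k = 1
cleaned = signals - U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print(np.abs(cleaned).max() < np.abs(signals).max())
```

    The SVD of a modest antenna array is cheap, consistent with the low-complexity requirement discussed in the abstract.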

  4. Incorporating contact angles in the surface tension force with the ACES interface curvature scheme

    NASA Astrophysics Data System (ADS)

    Owkes, Mark

    2017-11-01

    In simulations of gas-liquid flows interacting with solid boundaries, the contact line dynamics affect the interface motion and flow field through the surface tension force. The surface tension force is directly proportional to the interface curvature, and the problem of accurately imposing a contact angle must be incorporated into the interface curvature calculation. Many commonly used algorithms to compute interface curvatures (e.g., the height function method) require extrapolating the interface, with a defined contact angle, into the solid to allow for the calculation of a curvature near a wall. Extrapolating can be an ill-posed problem, especially in three dimensions or when multiple contact lines are near each other. We have developed an accurate methodology to compute interface curvatures that allows for contact angles to be easily incorporated while avoiding extrapolation and the associated challenges. The method, known as Adjustable Curvature Evaluation Scale (ACES), leverages a least squares fit of a polynomial to points computed on the volume-of-fluid (VOF) representation of the gas-liquid interface. The method is tested by simulating canonical test cases and then applied to simulate the injection and motion of water droplets in a channel (relevant to PEM fuel cells).
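    The curvature-from-polynomial-fit idea at the heart of the method can be sketched in a simplified 2-D setting (the details below — patch size, fit order, sampling — are assumptions for illustration, not the ACES algorithm itself):

```python
import numpy as np

# Fit a least-squares polynomial to points sampled on the interface, then
# evaluate curvature analytically from the fit. Points on a circle of
# radius R should give curvature close to 1/R.
R = 2.0
theta = np.linspace(-0.3, 0.3, 9)                 # local patch of the interface
x, y = R * np.sin(theta), R * np.cos(theta)       # arc near the top of the circle

a, b, c = np.polyfit(x, y, 2)                     # y ~ a x^2 + b x + c
yp, ypp = b, 2 * a                                # y', y'' at x = 0
kappa = abs(ypp) / (1 + yp**2) ** 1.5             # kappa = |y''| / (1 + y'^2)^(3/2)
print(kappa)  # close to 0.5 = 1/R
```

    Because the fit uses only points on the interface itself, no extrapolated values inside the solid are needed — the property that motivates the method near walls.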

  5. Six-dimensional real and reciprocal space small-angle X-ray scattering tomography

    NASA Astrophysics Data System (ADS)

    Schaff, Florian; Bech, Martin; Zaslansky, Paul; Jud, Christoph; Liebi, Marianne; Guizar-Sicairos, Manuel; Pfeiffer, Franz

    2015-11-01

    When used in combination with raster scanning, small-angle X-ray scattering (SAXS) has proven to be a valuable imaging technique of the nanoscale, for example of bone, teeth and brain matter. Although two-dimensional projection imaging has been used to characterize various materials successfully, its three-dimensional extension, SAXS computed tomography, poses substantial challenges, which have yet to be overcome. Previous work using SAXS computed tomography was unable to preserve oriented SAXS signals during reconstruction. Here we present a solution to this problem and obtain a complete SAXS computed tomography, which preserves oriented scattering information. By introducing virtual tomography axes, we take advantage of the two-dimensional SAXS information recorded on an area detector and use it to reconstruct the full three-dimensional scattering distribution in reciprocal space for each voxel of the three-dimensional object in real space. The presented method could be of interest for a combined six-dimensional real and reciprocal space characterization of mesoscopic materials with hierarchically structured features with length scales ranging from a few nanometres to a few millimetres—for example, biomaterials such as bone or teeth, or functional materials such as fuel-cell or battery components.

  6. Six-dimensional real and reciprocal space small-angle X-ray scattering tomography.

    PubMed

    Schaff, Florian; Bech, Martin; Zaslansky, Paul; Jud, Christoph; Liebi, Marianne; Guizar-Sicairos, Manuel; Pfeiffer, Franz

    2015-11-19

    When used in combination with raster scanning, small-angle X-ray scattering (SAXS) has proven to be a valuable imaging technique of the nanoscale, for example of bone, teeth and brain matter. Although two-dimensional projection imaging has been used to characterize various materials successfully, its three-dimensional extension, SAXS computed tomography, poses substantial challenges, which have yet to be overcome. Previous work using SAXS computed tomography was unable to preserve oriented SAXS signals during reconstruction. Here we present a solution to this problem and obtain a complete SAXS computed tomography, which preserves oriented scattering information. By introducing virtual tomography axes, we take advantage of the two-dimensional SAXS information recorded on an area detector and use it to reconstruct the full three-dimensional scattering distribution in reciprocal space for each voxel of the three-dimensional object in real space. The presented method could be of interest for a combined six-dimensional real and reciprocal space characterization of mesoscopic materials with hierarchically structured features with length scales ranging from a few nanometres to a few millimetres--for example, biomaterials such as bone or teeth, or functional materials such as fuel-cell or battery components.

  7. Live interactive computer music performance practice

    NASA Astrophysics Data System (ADS)

    Wessel, David

    2002-05-01

    A live-performance musical instrument can be assembled around current lap-top computer technology. One adds a controller such as a keyboard or other gestural input device, a sound diffusion system, some form of connectivity processor(s) providing for audio I/O and gestural controller input, and reactive real-time native signal processing software. A system consisting of a hand gesture controller; software for gesture analysis and mapping, machine listening, composition, and sound synthesis; and a controllable radiation pattern loudspeaker are described. Interactivity begins in the set up wherein the speaker-room combination is tuned with an LMS procedure. This system was designed for improvisation. It is argued that software suitable for carrying out an improvised musical dialog with another performer poses special challenges. The processes underlying the generation of musical material must be very adaptable, capable of rapid changes in musical direction. Machine listening techniques are used to help the performer adapt to new contexts. Machine learning can play an important role in the development of such systems. In the end, as with any musical instrument, human skill is essential. Practice is required not only for the development of musically appropriate human motor programs but for the adaptation of the computer-based instrument as well.

  8. Contributions of numerical simulation data bases to the physics, modeling and measurement of turbulence

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Spalart, Philippe R.

    1987-01-01

    The use of simulation data bases for the examination of turbulent flows is an effective research tool. Studies of the structure of turbulence have been hampered by the limited number of probes and the impossibility of measuring all desired quantities. Also, flow visualization is confined to the observation of passive markers with a limited field of view and contamination caused by time-history effects. Computed flow fields are a new resource for turbulence research, providing all the instantaneous flow variables in three-dimensional space. Simulation data bases also provide much-needed information for phenomenological turbulence modeling. Three-dimensional velocity and pressure fields from direct simulations can be used to compute all the terms in the transport equations for the Reynolds stresses and the dissipation rate. However, only a few geometrically simple flows have been computed by direct numerical simulation, and the inventory of simulations does not fully address the current modeling needs in complex turbulent flows. The availability of three-dimensional flow fields also poses challenges in developing new techniques for their analysis, techniques based on experimental methods, some of which are used here for the analysis of direct-simulation data bases in studies of the mechanics of turbulent flows.
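    Computing Reynolds stresses directly from full 3-D fields, as the abstract describes, reduces to averaging products of velocity fluctuations. The grid size and fields below are invented placeholders for DNS data:

```python
import numpy as np

# With complete instantaneous velocity fields from a direct simulation,
# Reynolds stresses are computed by averaging products of fluctuations --
# something a finite set of physical probes cannot do at every point.
rng = np.random.default_rng(3)
u = rng.standard_normal((32, 32, 32)) + 1.0   # instantaneous streamwise velocity
v = rng.standard_normal((32, 32, 32))         # instantaneous wall-normal velocity

up = u - u.mean()                             # fluctuations about the mean
vp = v - v.mean()
reynolds_stress_uv = (up * vp).mean()         # <u'v'> component
tke = 0.5 * ((up**2).mean() + (vp**2).mean()) # (partial) turbulent kinetic energy
print(reynolds_stress_uv, tke)
```

    The same averaging, applied to gradients and pressure terms, yields every budget term in the Reynolds-stress transport equations.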

  9. Exploring the Issues: Humans and Computers.

    ERIC Educational Resources Information Center

    Walsh, Huber M.

    This presentation addresses three basic social issues generated by the computer revolution. The first section, "Money Matters," focuses on the economic effects of computer technology. These include the replacement of workers by fully automated machines, the threat to professionals posed by expanded access to specialized information, and the…

  10. RECRUITMENT STRATEGIES FOR AN EXPOSURE MEASUREMENT STUDY OF PRESCHOOL CHILDREN

    EPA Science Inventory

    Recruiting study participants is always a challenge for researchers. It poses an even bigger challenge for researchers to recruit participants for a study involving intrusive, burdensome data collection activities. A study of preschool children's exposure to persistent organic ...

  11. Sustainable Materials Management (SMM) Web Academy Webinar: Management Challenges for Lithium Batteries at Electronics Recyclers

    EPA Pesticide Factsheets

    This is a webinar page for the Sustainable Management of Materials (SMM) Web Academy webinar titled, An Introduction to Lithium Batteries and the Challenges that they Pose to the Waste and Recycling Industry.

  12. Real-time quasi-3D tomographic reconstruction

    NASA Astrophysics Data System (ADS)

    Buurlage, Jan-Willem; Kohr, Holger; Palenstijn, Willem Jan; Joost Batenburg, K.

    2018-06-01

    Developments in acquisition technology and a growing need for time-resolved experiments pose great computational challenges in tomography. In addition, access to reconstructions in real time is a highly demanded feature but has so far been out of reach. We show that by exploiting the mathematical properties of filtered backprojection-type methods, having access to real-time reconstructions of arbitrarily oriented slices becomes feasible. Furthermore, we present software for visualization and on-demand reconstruction of slices. A user can interactively shift and rotate slices in a GUI, while the software updates the slice in real time. For certain use cases, the possibility to study arbitrarily oriented slices in real time directly from the measured data provides sufficient visual and quantitative insight. Two such applications are discussed in this article.
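    The mathematical property being exploited is that filtered backprojection is linear and acts independently per slice, so a single slice can be reconstructed on demand. A toy parallel-beam 2-D sketch (invented geometry and phantom, not the paper's software):

```python
import numpy as np

# Toy parallel-beam filtered backprojection: ramp-filter each projection,
# then smear (backproject) it over the image grid.
def fbp(sinogram, angles, n):
    n_det = sinogram.shape[1]
    freqs = np.fft.fftfreq(n_det)                                 # Ram-Lak ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * np.abs(freqs), axis=1))
    xs = np.arange(n) - n / 2
    X, Y = np.meshgrid(xs, xs)
    img = np.zeros((n, n))
    for proj, th in zip(filtered, angles):
        t = X * np.cos(th) + Y * np.sin(th) + n_det / 2           # detector coordinate
        img += np.interp(t.ravel(), np.arange(n_det), proj).reshape(n, n)
    return img * np.pi / len(angles)

# Analytic sinogram of a centered disc of radius 10 (chord lengths).
angles = np.linspace(0, np.pi, 90, endpoint=False)
n = 64
t = np.arange(n) - n / 2
sino = np.tile(2 * np.sqrt(np.clip(10.0**2 - t**2, 0, None)), (len(angles), 1))
rec = fbp(sino, angles, n)
print(rec[32, 32] > rec[5, 5])  # disc interior reconstructs brighter than background
```

    Because each output pixel depends on the data only through this per-slice sum, arbitrary slices can be filtered and backprojected independently as data arrive.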

  13. Using a MaxEnt Classifier for the Automatic Content Scoring of Free-Text Responses

    NASA Astrophysics Data System (ADS)

    Sukkarieh, Jana Z.

    2011-03-01

    Criticisms against multiple-choice item assessments in the USA have prompted researchers and organizations to move towards constructed-response (free-text) items. Constructed-response (CR) items pose many challenges to the education community—one of which is that they are expensive for humans to score. At the same time, there has been widespread movement towards computer-based assessment and hence, assessment organizations are competing to develop automatic content scoring engines for such item types—which we view as a textual entailment task. This paper describes how MaxEnt Modeling is used to help solve the task. MaxEnt has been used in many natural language tasks but this is the first application of the MaxEnt approach to textual entailment and automatic content scoring.
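    A maximum-entropy model over discrete features is equivalent to multinomial logistic regression, which can be sketched with gradient ascent on the log-likelihood. The features and data below are invented stand-ins (e.g., binary overlap indicators between a response and the scoring key), not the paper's feature set:

```python
import numpy as np

# Tiny MaxEnt (multinomial logistic) classifier trained by gradient ascent.
rng = np.random.default_rng(4)
X = rng.integers(0, 2, size=(200, 6)).astype(float)   # binary features
w_true = np.array([2.0, -1.5, 1.0, 0.0, 0.5, -2.0])
y = (X @ w_true + 0.1 * rng.standard_normal(200) > 0).astype(int)

# p(y|x) = softmax(x W); gradient of log-likelihood is X^T (onehot - p).
W = np.zeros((6, 2))
onehot = np.eye(2)[y]
for _ in range(500):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    W += 0.1 / len(X) * X.T @ (onehot - p)

acc = ((X @ W).argmax(axis=1) == y).mean()
print(acc)  # high accuracy on this (nearly separable) training data
```

    In the scoring setting, the class labels would be score points and the features would encode evidence of entailment between the response and the reference answer.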

  14. Progress in hyperspectral imaging of vegetation

    NASA Astrophysics Data System (ADS)

    Goetz, Alexander F. H.

    2001-03-01

    Computer-related technologies, such as the Internet, have posed new challenges for intellectual property law. Legislation and court decisions impacting patents, copyrights, trade secrets and trademarks have adapted intellectual property law to address new issues brought about by such emerging technologies. As the pace of technological change continues to increase, intellectual property law will need to keep up. Accordingly, the balance struck by intellectual property laws today will likely be set askew by technological changes in the future. Engineers need to consider not only the law as it exists today, but also how it might change in the future. Likewise, lawyers and judges need to consider legal issues not only in view of the current state of the art in technology, but also with an eye to technologies yet to come.

  15. Examination of Observation Impacts derived from OSEs and Adjoint Models

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald

    2008-01-01

    With the adjoint of a data assimilation system, the impact of any or all assimilated observations on measures of forecast skill can be estimated accurately and efficiently. The approach allows aggregation of results in terms of individual data types, channels or locations, all computed simultaneously. In this study, adjoint-based estimates of observation impact are compared with results from standard observing system experiments (OSEs) in the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) system. The two approaches are shown to provide unique, but complementary, information. Used together, they reveal both redundancies and dependencies between observing system impacts as observations are added or removed. Understanding these dependencies poses a major challenge for optimizing the use of the current observational network and defining requirements for future observing systems.

  16. LES of Swirling Reacting Flows via the Unstructured scalar-FDF Solver

    NASA Astrophysics Data System (ADS)

    Ansari, Naseem; Pisciuneri, Patrick; Strakey, Peter; Givi, Peyman

    2011-11-01

    Swirling flames pose a significant challenge for computational modeling due to the presence of recirculation regions and vortex shedding. In this work, results are presented of LES of two swirl stabilized non-premixed flames (SM1 and SM2) via the FDF methodology. These flames are part of the database for validation of turbulent-combustion models. The scalar-FDF is simulated on a domain discretized by unstructured meshes, and is coupled with a finite volume flow solver. In the SM1 flame (with a low swirl number) chemistry is described by the flamelet model based on the full GRI 2.11 mechanism. The SM2 flame (with a high swirl number) is simulated via a 46-step 17-species mechanism. The simulated results are assessed via comparison with experimental data.

  17. From damage to discovery via virtual unwrapping: Reading the scroll from En-Gedi

    PubMed Central

    Seales, William Brent; Parker, Clifford Seth; Segal, Michael; Tov, Emanuel; Shor, Pnina; Porath, Yosef

    2016-01-01

    Computer imaging techniques are commonly used to preserve and share readable manuscripts, but capturing writing locked away in ancient, deteriorated documents poses an entirely different challenge. This software pipeline—referred to as “virtual unwrapping”—allows textual artifacts to be read completely and noninvasively. The systematic digital analysis of the extremely fragile En-Gedi scroll (the oldest Pentateuchal scroll in Hebrew outside of the Dead Sea Scrolls) reveals the writing hidden on its untouchable, disintegrating sheets. Our approach for recovering substantial ink-based text from a damaged object results in readable columns at such high quality that serious critical textual analysis can occur. Hence, this work creates a new pathway for subsequent textual discoveries buried within the confines of damaged materials. PMID:27679821

  18. Measurement-induced long-distance entanglement of superconducting qubits using optomechanical transducers

    NASA Astrophysics Data System (ADS)

    Černotík, Ondřej; Hammerer, Klemens

    2016-07-01

    Although superconducting systems provide a promising platform for quantum computing, their networking poses a challenge because they cannot be interfaced to light, the medium used to send quantum signals through channels at room temperature. We show that mechanical oscillators can mediate such coupling and light can be used to measure the joint state of two distant qubits. The measurement provides information on the total spin of the two qubits such that entangled qubit states can be postselected. Entanglement generation is possible without ground-state cooling of the mechanical oscillators for systems with optomechanical cooperativity moderately larger than unity; in addition, our setup tolerates a substantial transmission loss. The approach is scalable to the generation of multipartite entanglement and represents a crucial step towards quantum networks with superconducting circuits.

  19. “You’re dealing with an emotionally charged individual…”: an industry perspective on the challenges posed by medical tourists’ informal caregiver-companions

    PubMed Central

    2013-01-01

    Background Patients engage in medical tourism when they privately obtain medical care abroad. Previous research shows that many medical tourists travel abroad with friends and family members who provide support and assistance. Meanwhile, very little is known about this important stakeholder group, referred to here as caregiver-companions. In this article we examine the challenges that can be posed by caregiver-companions and the overall practice of informal caregiving in medical tourism from an industry perspective. Specifically, we report on the findings of interviews conducted with international patient coordinators (IPCs) who work at destination facilities. IPCs come into regular contact with caregiver-companions in their professional positions and thus are ideally suited to comment on trends they have observed among this stakeholder group as well as the challenges they can pose to medical tourists, health workers, and facilities. Methods We conducted 20 semi-structured interviews with 21 IPCs from 16 different facilities across nine countries. Topics probed in the interviews included caregiver-companion roles, IPCs’ and others’ interaction with caregiver-companions, and potential health and safety risks posed to medical tourists and caregiver-companions. Thematic analysis of the verbatim transcripts was employed. Results Although most participants encouraged medical tourists to travel with a caregiver-companion, many challenges associated with caregiver-companions were identified. Three themes best characterize the challenges that emerged: (1) caregiver-companions require time, attention and resources; (2) caregiver-companions can disrupt the provision of quality care; and (3) caregiver-companions can be exposed to risks. IPCs pointed out that caregiver-companions may, for example, have a negative impact on the patient through the cost of accompaniment or inadequate care provision. 
Caregiver-companions may also create unanticipated or extra work for IPCs, both as additional clients and by ignoring established organizational rules, routines, and expectations. Furthermore, caregiver-companions may be susceptible to stresses and health and safety risks, which would further deteriorate their own abilities to offer the patient quality care. Conclusions Although caregiver-companions can pose challenges to medical tourists, health workers, and medical tourism facilities, they can also assist in enhancing best care and offering meaningful support to medical tourists. If caregiver-companions are open to collaboration with IPCs, particularly in the form of information sharing, then their experience abroad can be safer and less stressful for themselves and, by extension, for the accompanied patients and facility staff. PMID:23889860

  20. "You're dealing with an emotionally charged individual...": an industry perspective on the challenges posed by medical tourists' informal caregiver-companions.

    PubMed

    Casey, Victoria; Crooks, Valorie A; Snyder, Jeremy; Turner, Leigh

    2013-07-26

    Patients engage in medical tourism when they privately obtain medical care abroad. Previous research shows that many medical tourists travel abroad with friends and family members who provide support and assistance. Meanwhile, very little is known about this important stakeholder group, referred to here as caregiver-companions. In this article we examine the challenges that can be posed by caregiver-companions and the overall practice of informal caregiving in medical tourism from an industry perspective. Specifically, we report on the findings of interviews conducted with international patient coordinators (IPCs) who work at destination facilities. IPCs come into regular contact with caregiver-companions in their professional positions and thus are ideally suited to comment on trends they have observed among this stakeholder group as well as the challenges they can pose to medical tourists, health workers, and facilities. We conducted 20 semi-structured interviews with 21 IPCs from 16 different facilities across nine countries. Topics probed in the interviews included caregiver-companion roles, IPCs' and others' interaction with caregiver-companions, and potential health and safety risks posed to medical tourists and caregiver-companions. Thematic analysis of the verbatim transcripts was employed. Although most participants encouraged medical tourists to travel with a caregiver-companion, many challenges associated with caregiver-companions were identified. Three themes best characterize the challenges that emerged: (1) caregiver-companions require time, attention and resources; (2) caregiver-companions can disrupt the provision of quality care; and (3) caregiver-companions can be exposed to risks. IPCs pointed out that caregiver-companions may, for example, have a negative impact on the patient through the cost of accompaniment or inadequate care provision. 
Caregiver-companions may also create unanticipated or extra work for IPCs, both as additional clients and by ignoring established organizational rules, routines, and expectations. Furthermore, caregiver-companions may be susceptible to stresses and health and safety risks, which would further deteriorate their own abilities to offer the patient quality care. Although caregiver-companions can pose challenges to medical tourists, health workers, and medical tourism facilities, they can also assist in enhancing best care and offering meaningful support to medical tourists. If caregiver-companions are open to collaboration with IPCs, particularly in the form of information sharing, then their experience abroad can be safer and less stressful for themselves and, by extension, for the accompanied patients and facility staff.

  1. Curriculum Integration: Helping Career and Technical Education Students Truly Develop College and Career Readiness

    ERIC Educational Resources Information Center

    Park, Travis; Pearson, Donna; Richardson, George B.

    2017-01-01

    All students need to learn how to read, write, solve mathematics problems, and understand and apply scientific principles to succeed in college and/or careers. The challenges posed by entry-level career fields are no less daunting than those posed by college-level study. Thus, career and technical education students must learn effective math,…

  2. National Security and Global Climate Change

    DTIC Science & Technology

    2008-01-01

    The uncertainty, confusion, and speculation about the causes, effects, and implications of global climate change (GCC) often paralyze serious...against scientific indications of global climate change, but to consider how it would pose challenges to national security, explore options for facing...generals and admirals, released a report concluding that projected climate change poses a serious threat to America's national security. This article

  3. Applying Pose Clustering and MD Simulations To Eliminate False Positives in Molecular Docking.

    PubMed

    Makeneni, Spandana; Thieker, David F; Woods, Robert J

    2018-03-26

    In this work, we developed a computational protocol that employs multiple molecular docking experiments, followed by pose clustering, molecular dynamics simulations (10 ns), and energy rescoring to produce reliable 3D models of antibody-carbohydrate complexes. The protocol was applied to 10 antibody-carbohydrate co-complexes and three unliganded (apo) antibodies. Pose clustering significantly reduced the number of potential poses. For each system, 15 or fewer clusters out of 100 initial poses were generated and chosen for further analysis. Molecular dynamics (MD) simulations allowed the docked poses to either converge or disperse, and rescoring increased the likelihood that the best-ranked pose was an acceptable pose. This approach is amenable to automation and can be a valuable aid in determining the structure of antibody-carbohydrate complexes provided there is no major side chain rearrangement or backbone conformational change in the H3 loop of the CDR regions. Further, the basic protocol of docking a small ligand to a known binding site, clustering the results, and performing MD with a suitable force field is applicable to any protein ligand system.
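    The pose-clustering step can be sketched with a simple greedy RMSD scheme (the cutoff, greedy strategy and toy coordinates below are assumptions for illustration, not the authors' exact protocol):

```python
import numpy as np

# Group docked poses so that poses within an RMSD cutoff of a cluster
# representative share one cluster; each new outlier seeds a new cluster.
def rmsd(a, b):
    return np.sqrt(((a - b) ** 2).sum(axis=1).mean())

def greedy_cluster(poses, cutoff=2.0):
    reps, labels = [], []
    for p in poses:
        for i, r in enumerate(reps):
            if rmsd(p, r) < cutoff:
                labels.append(i)
                break
        else:                      # no representative within cutoff
            labels.append(len(reps))
            reps.append(p)
    return labels

rng = np.random.default_rng(5)
center_a = rng.standard_normal((10, 3))          # 10-atom toy ligand, pose family A
center_b = center_a + 8.0                        # well-separated pose family B
poses = [center_a + 0.1 * rng.standard_normal((10, 3)) for _ in range(5)] \
      + [center_b + 0.1 * rng.standard_normal((10, 3)) for _ in range(5)]
labels = greedy_cluster(poses)
print(labels)  # [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
```

    Reducing 100 docked poses to a handful of cluster representatives is what makes the subsequent 10 ns MD refinement of each candidate affordable.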

  4. The Development of Questioning as a Means of Framing Problems and Posing Challenges.

    ERIC Educational Resources Information Center

    Feigenbaum, Peter

    When a person encounters a problem, the character, form, and content of his or her response provides psychologists with useful and interesting information about processes of challenge and their relationship to intellectual development. In essence, challenge is a developing relationship that is defined on the one hand by objective factors (a person…

  5. Social Software: A Powerful Paradigm for Building Technology for Global Learning

    ERIC Educational Resources Information Center

    Wooding, Amy; Wooding, Kjell

    2018-01-01

    It is not difficult to imagine a world where internet-connected mobile devices are accessible to everyone. Can these technologies be used to help solve the challenges of global education? This was the challenge posed by the Global Learning XPRIZE--a $15 million grand challenge competition aimed at addressing this global teaching shortfall. In…

  6. When Bible and Science Interact: Teachers' Pedagogic and Value Challenges in Teaching Religious Minority Students in Higher Education Settings

    ERIC Educational Resources Information Center

    Novis-Deutsch, Nurit; Lifshitz, Chen

    2016-01-01

    The integration of highly religious minority students into institutions of higher education poses significant pedagogical and value challenges for students and teachers alike. We offer a framework for analyzing such challenges, distinguishing between practical concerns, identity issues and value conflicts. By contrasting a deficit perspective to…

  7. Technology Teaching or Mediated Learning, Part I: Are Computers Skinnerian or Vygotskian?

    ERIC Educational Resources Information Center

    Coufal, Kathy L.

    2002-01-01

    This article highlights the theoretical framework that dominated speech-language pathology prior to the widespread introduction of microcomputers and poses questions regarding the application of computers in assessment and intervention for children with language-learning impairments. It discusses implications of computer use in the context of…

  8. The Real Challenge of ESD

    ERIC Educational Resources Information Center

    Jackson, M. G.

    2011-01-01

    Between the Inter-governmental Conference on Environmental Education at Tbilisi in 1977 and the Fourth International Conference on Environmental Education at Ahmedabad in 2007, our conception of the challenge posed by the global crises of climate change, environmental destruction, social disintegration, poverty, natural resources exhaustion and…

  9. DEVELOPMENT, DESIGN AND CONSUMER TESTING OF MARKETABLE RESIDENTIAL LED LIGHT LUMINAIRES

    EPA Science Inventory

    Developing marketable LED luminaires poses challenges, even though LEDs are energy-efficient and an ecological alternative to conventional lighting. Challenges include: perceptions that the color rendition of LEDs is unacceptable to the public; numbers of LEDs must be grou...

  10. Variation in transpiration efficiency in sorghum

    USDA-ARS?s Scientific Manuscript database

    Declining freshwater resources, increasing population, and growing demand for biofuels pose new challenges for agriculture research. To meet these challenges, the concept “Blue Revolution” was proposed to improve water productivity in agriculture--“More Crop per Drop”. Sorghum is the fifth most imp...

  11. Variation of Transpiration Efficiency in Sorghum

    USDA-ARS?s Scientific Manuscript database

    Declining freshwater resources, increasing population, and growing demand for biofuels pose new challenges for agriculture research. To meet these challenges, the concept “Blue Revolution” was proposed to improve water productivity in agriculture--“More Crop per Drop”. Sorghum is the fifth most imp...

  12. Neural system prediction and identification challenge.

    PubMed

    Vlachos, Ioannis; Zaytsev, Yury V; Spreizer, Sebastian; Aertsen, Ad; Kumar, Arvind

    2013-01-01

    Can we infer the function of a biological neural network (BNN) if we know the connectivity and activity of all its constituent neurons? This question is at the core of neuroscience and, accordingly, various methods have been developed to record the activity and connectivity of as many neurons as possible. Surprisingly, there is no theoretical or computational demonstration that neuronal activity and connectivity are indeed sufficient to infer the function of a BNN. Therefore, we pose the Neural Systems Identification and Prediction Challenge (nuSPIC). We provide the connectivity and activity of all neurons and invite participants (1) to infer the functions implemented (hard-wired) in spiking neural networks (SNNs) by stimulating and recording the activity of neurons and, (2) to implement predefined mathematical/biological functions using SNNs. The nuSPICs can be accessed via a web-interface to the NEST simulator and the user is not required to know any specific programming language. Furthermore, the nuSPICs can be used as a teaching tool. Finally, nuSPICs use the crowd-sourcing model to address scientific issues. With this computational approach we aim to identify which functions can be inferred by systematic recordings of neuronal activity and connectivity. In addition, nuSPICs will help the design and application of new experimental paradigms based on the structure of the SNN and the presumed function which is to be discovered.

  13. Binding Free Energy Calculations for Lead Optimization: Assessment of Their Accuracy in an Industrial Drug Design Context.

    PubMed

    Homeyer, Nadine; Stoll, Friederike; Hillisch, Alexander; Gohlke, Holger

    2014-08-12

    Correctly ranking compounds according to their computed relative binding affinities will be of great value for decision making in the lead optimization phase of industrial drug discovery. However, the performance of existing computationally demanding binding free energy calculation methods in this context is largely unknown. We analyzed the performance of the molecular mechanics continuum solvent, the linear interaction energy (LIE), and the thermodynamic integration (TI) approach for three sets of compounds from industrial lead optimization projects. The data sets pose challenges typical for this early stage of drug discovery. None of the methods was sufficiently predictive when applied out of the box without considering these challenges. Detailed investigations of failures revealed critical points that are essential for good binding free energy predictions. When data set-specific features were considered accordingly, predictions valuable for lead optimization could be obtained for all approaches but LIE. Our findings lead to clear recommendations for when to use which of the above approaches. Our findings also stress the important role of expert knowledge in this process, not least for estimating the accuracy of prediction results by TI, using indicators such as the size and chemical structure of exchanged groups and the statistical error in the predictions. Such knowledge will be invaluable when it comes to the question of which TI results can be trusted for decision making.
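
    The thermodynamic integration (TI) approach evaluated above estimates a free energy difference by integrating the ensemble average of dH/dlambda over a coupling parameter, Delta G = integral from 0 to 1 of <dH/dlambda> dlambda. A minimal numeric sketch of that quadrature step, using a synthetic profile in place of real per-window MD averages:

    ```python
    import numpy as np

    # Lambda windows at which a TI simulation would collect <dU/dlambda>.
    lambdas = np.linspace(0.0, 1.0, 11)

    def mean_dU_dlambda(lam):
        # Hypothetical smooth profile of the ensemble average per window
        # (kcal/mol); a real study computes this from MD sampling, not a formula.
        return -8.0 + 12.0 * lam ** 2

    profile = np.array([mean_dU_dlambda(l) for l in lambdas])

    # Composite trapezoidal quadrature over lambda gives the TI estimate of Delta G.
    delta_G = float(np.sum((profile[:-1] + profile[1:]) / 2.0 * np.diff(lambdas)))
    print(f"Estimated Delta G = {delta_G:.2f} kcal/mol")
    ```

    The statistical error in the per-window averages propagates directly into the quadrature, which is one reason the abstract stresses error indicators when judging whether a TI result can be trusted.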

  14. Neural system prediction and identification challenge

    PubMed Central

    Vlachos, Ioannis; Zaytsev, Yury V.; Spreizer, Sebastian; Aertsen, Ad; Kumar, Arvind

    2013-01-01

    Can we infer the function of a biological neural network (BNN) if we know the connectivity and activity of all its constituent neurons?This question is at the core of neuroscience and, accordingly, various methods have been developed to record the activity and connectivity of as many neurons as possible. Surprisingly, there is no theoretical or computational demonstration that neuronal activity and connectivity are indeed sufficient to infer the function of a BNN. Therefore, we pose the Neural Systems Identification and Prediction Challenge (nuSPIC). We provide the connectivity and activity of all neurons and invite participants (1) to infer the functions implemented (hard-wired) in spiking neural networks (SNNs) by stimulating and recording the activity of neurons and, (2) to implement predefined mathematical/biological functions using SNNs. The nuSPICs can be accessed via a web-interface to the NEST simulator and the user is not required to know any specific programming language. Furthermore, the nuSPICs can be used as a teaching tool. Finally, nuSPICs use the crowd-sourcing model to address scientific issues. With this computational approach we aim to identify which functions can be inferred by systematic recordings of neuronal activity and connectivity. In addition, nuSPICs will help the design and application of new experimental paradigms based on the structure of the SNN and the presumed function which is to be discovered. PMID:24399966

  15. Visualization of decision processes using a cognitive architecture

    NASA Astrophysics Data System (ADS)

    Livingston, Mark A.; Murugesan, Arthi; Brock, Derek; Frost, Wende K.; Perzanowski, Dennis

    2013-01-01

    Cognitive architectures are computational theories of reasoning the human mind engages in as it processes facts and experiences. A cognitive architecture uses declarative and procedural knowledge to represent mental constructs that are involved in decision making. Employing a model of behavioral and perceptual constraints derived from a set of one or more scenarios, the architecture reasons about the most likely consequence(s) of a sequence of events. Reasoning of any complexity and depth involving computational processes, however, is often opaque and challenging to comprehend. Arguably, for decision makers who may need to evaluate or question the results of autonomous reasoning, it would be useful to be able to inspect the steps involved in an interactive, graphical format. When a chain of evidence and constraint-based decision points can be visualized, it becomes easier to explore both how and why a scenario of interest will likely unfold in a particular way. In initial work on a scheme for visualizing cognitively-based decision processes, we focus on generating graphical representations of models run in the Polyscheme cognitive architecture. Our visualization algorithm operates on a modified version of Polyscheme's output, which is accomplished by augmenting models with a simple set of tags. We provide example visualizations and discuss properties of our technique that pose challenges for our representation goals. We conclude with a summary of feedback solicited from domain experts and practitioners in the field of cognitive modeling.
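
    The general idea of rendering a tagged chain of decision points as a graph can be sketched by emitting Graphviz DOT text from a list of inference steps. The input format and example steps below are invented for illustration; the actual Polyscheme output and tag set are not reproduced here:

    ```python
    def decision_chain_to_dot(steps):
        """steps: list of (evidence, decision) pairs; returns Graphviz DOT source
        drawing each step as an edge from its evidence to its conclusion."""
        lines = ["digraph decisions {", "  rankdir=LR;"]
        for evidence, decision in steps:
            lines.append(f'  "{evidence}" -> "{decision}";')
        lines.append("}")
        return "\n".join(lines)

    # A hypothetical three-step chain of constraint-based decision points.
    dot = decision_chain_to_dot([
        ("object sighted", "classify as vessel"),
        ("classify as vessel", "predict heading"),
        ("predict heading", "plot intercept course"),
    ])
    print(dot)
    ```

    Rendering the resulting DOT source with any Graphviz tool yields the kind of interactive, inspectable chain of evidence and decision points the abstract argues for.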

  16. Clinical decision-making and secondary findings in systems medicine.

    PubMed

    Fischer, T; Brothers, K B; Erdmann, P; Langanke, M

    2016-05-21

    Systems medicine is the name for an assemblage of scientific strategies and practices that include bioinformatics approaches to human biology (especially systems biology); "big data" statistical analysis; and medical informatics tools. Whereas personalized and precision medicine involve similar analytical methods applied to genomic and medical record data, systems medicine draws on these as well as other sources of data. Given this distinction, the clinical translation of systems medicine poses a number of important ethical and epistemological challenges for researchers working to generate systems medicine knowledge and clinicians working to apply it. This article focuses on three key challenges: First, we will discuss the conflicts in decision-making that can arise when healthcare providers committed to principles of experimental medicine or evidence-based medicine encounter individualized recommendations derived from computer algorithms. We will explore in particular whether controlled experiments, such as comparative effectiveness trials, should mediate the translation of systems medicine, or if instead individualized findings generated through "big data" approaches can be applied directly in clinical decision-making. Second, we will examine the case of the Riyadh Intensive Care Program Mortality Prediction Algorithm, pejoratively referred to as the "death computer," to demonstrate the ethical challenges that can arise when big-data-driven scoring systems are applied in clinical contexts. We argue that the uncritical use of predictive clinical algorithms, including those envisioned for systems medicine, challenge basic understandings of the doctor-patient relationship. 
Third, we will build on the recent discourse on secondary findings in genomics and imaging to draw attention to the important implications of secondary findings derived from the joint analysis of data from diverse sources, including data recorded by patients in an attempt to realize their "quantified self." This paper examines possible ethical challenges that are likely to be raised as systems medicine is translated into clinical medicine. These include the epistemological challenges for clinical decision-making, the use of scoring systems optimized by big data techniques and the risk that incidental and secondary findings will significantly increase. While some of these ethical implications remain hypothetical, we should use the opportunity to prospectively identify challenges and avoid making foreseeable mistakes when systems medicine inevitably arrives in routine care.

  17. Energy Efficient Digital Logic Using Nanoscale Magnetic Devices

    NASA Astrophysics Data System (ADS)

    Lambson, Brian James

    Increasing demand for information processing in the last 50 years has been largely satisfied by the steadily declining price and improving performance of microelectronic devices. Much of this progress has been made by aggressively scaling the size of semiconductor transistors and metal interconnects that microprocessors are built from. As devices shrink to the size regime in which quantum effects pose significant challenges, new physics may be required in order to continue historical scaling trends. A variety of new devices and physics are currently under investigation throughout the scientific and engineering community to meet these challenges. One of the more drastic proposals on the table is to replace the electronic components of information processors with magnetic components. Magnetic components are already commonplace in computers for their information storage capability. Unlike most electronic devices, magnetic materials can store data in the absence of a power supply. Today's magnetic hard disk drives can routinely hold billions of bits of information and are in widespread commercial use. Their ability to function without a constant power source hints at an intrinsic energy efficiency. The question we investigate in this dissertation is whether or not this advantage can be extended from information storage to the notoriously energy intensive task of information processing. Several proof-of-concept magnetic logic devices were proposed and tested in the past decade. In this dissertation, we build on the prior work by answering fundamental questions about how magnetic devices achieve such high energy efficiency and how they can best function in digital logic applications. The results of this analysis are used to suggest and test improvements to nanomagnetic computing devices. 
Two of our results are seen as especially important to the field of nanomagnetic computing: (1) we show that it is possible to operate nanomagnetic computers at the fundamental thermodynamic limits of computation, and (2) we develop a nanomagnet with a unique shape that is engineered to significantly improve the reliability of nanomagnetic logic.

  18. Ethics and Research with Undergraduates

    ERIC Educational Resources Information Center

    Richman, Kenneth A.; Alexander, Leslie B.

    2006-01-01

    Ethicists, researchers and policy makers have paid increasing attention to the ethical conduct of research, especially research involving human beings. Research performed with and by undergraduates poses a specific set of ethical challenges. These challenges are often overlooked by the research community because it is assumed that undergraduate…

  19. Jesuit "Eloquentia Perfecta" and Theotropic Logology

    ERIC Educational Resources Information Center

    Mailloux, Steven

    2015-01-01

    This essay takes a rhetorical pragmatist perspective on current questions concerning educational goals and pedagogical practices. It begins by considering some challenges to rhetorical approaches to education, placing those challenges in the theoretical context of their posing. The essay then describes one current rhetorical approach--based on…

  20. Challenges for PISA

    ERIC Educational Resources Information Center

    Schleicher, Andreas

    2016-01-01

    The OECD Programme for International Student Assessment (PISA) provides a framework in which over 80 countries collaborate to build advanced global metrics to assess the knowledge, skills and character attributes of the students. The design of assessments poses major conceptual and technical challenges, as successful learning. Beyond a sound…

  1. Challenges of Conducting Research With Young Offenders With Traumatic Brain Injury.

    PubMed

    O'Rourke, Conall; Templeton, Michelle; Cohen, Miriam H; Linden, Mark A

    2018-05-31

    The purpose of this commentary is to highlight the challenges encountered when conducting research with young offenders. This is drawn from the first-hand experience of 3 researchers working on separate projects within this environment. Young offenders present as a complex clinical population with high levels of illiteracy, substance abuse, and mental health issues. Significant planning is therefore required before working with this group. Consideration must be given to the heterogeneity of prison populations alongside the potential limitations of data collection methods, in particular, reliance on self-report. The capacity of young offenders to comprehend and effectively engage with research is also of concern, posing issues of both a practical and ethical nature. The absence of a consistent "research culture" within prison environments poses further practical challenges, potentially also placing significant burden on both researchers and prison resources. The challenges discussed in this article may help inform future studies in the area and emphasize the need for greater critical reflection among researchers conducting work of this type.

  2. Bisphenol A and Risk Management Ethics

    PubMed Central

    Resnik, David B.; Elliott, Kevin C.

    2013-01-01

    It is widely recognized that endocrine disrupting compounds, such as Bisphenol A, pose challenges for traditional paradigms in toxicology, insofar as these substances appear to have a wider range of low-dose effects than previously recognized. These compounds also pose challenges for ethics and policymaking. When a chemical does not have significant low-dose effects, regulators can allow it to be introduced into commerce or the environment, provided that procedures and rules are in place to keep exposures below an acceptable level. This option allows society to maximize the benefits from the use of the chemical while minimizing risks to human health or the environment, and it represents a compromise between competing values. When it is not possible to establish acceptable exposure levels for chemicals that pose significant health or environmental risks, the most reasonable options for risk management may be to enact either partial or complete bans on their use. These options create greater moral conflict than other risk management strategies, leaving policymakers difficult choices between competing values. PMID:24471646

  3. Bisphenol A and risk management ethics.

    PubMed

    Resnik, David B; Elliott, Kevin C

    2015-03-01

    It is widely recognized that endocrine disrupting compounds, such as Bisphenol A, pose challenges for traditional paradigms in toxicology, insofar as these substances appear to have a wider range of low-dose effects than previously recognized. These compounds also pose challenges for ethics and policymaking. When a chemical does not have significant low-dose effects, regulators can allow it to be introduced into commerce or the environment, provided that procedures and rules are in place to keep exposures below an acceptable level. This option allows society to maximize the benefits from the use of the chemical while minimizing risks to human health or the environment, and it represents a compromise between competing values. When it is not possible to establish acceptable exposure levels for chemicals that pose significant health or environmental risks, the most reasonable options for risk management may be to enact either partial or complete bans on their use. These options create greater moral conflict than other risk management strategies, leaving policymakers difficult choices between competing values. © 2014 John Wiley & Sons Ltd.

  4. Random Walk Particle Tracking For Multiphase Heat Transfer

    NASA Astrophysics Data System (ADS)

    Lattanzi, Aaron; Yin, Xiaolong; Hrenya, Christine

    2017-11-01

    As computing capabilities have advanced, direct numerical simulation (DNS) has become a highly effective tool for quantitatively predicting the heat transfer within multiphase flows. Here we utilize a hybrid DNS framework that couples the lattice Boltzmann method (LBM) to the random walk particle tracking (RWPT) algorithm. The main difficulty of such a hybrid is that discontinuous fields pose a significant challenge to the RWPT framework, so special attention must be given to the handling of interfaces. We derive a method for addressing discontinuities in the diffusivity field arising at the interface between two phases. Analytical means are utilized to develop an interfacial tracer balance and modify the RWPT algorithm. By expanding the modulus of the stochastic (diffusive) step and only allowing a subset of the tracers within the high-diffusivity medium to undergo a diffusive step, the correct equilibrium state can be restored (globally homogeneous tracer distribution). The new RWPT algorithm is implemented within the SUSP3D code and verified against a variety of systems: effective diffusivity of a static gas-solids mixture, a hot sphere in unbounded diffusion, a cooling sphere in unbounded diffusion, and uniform flow past a hot sphere.
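
    A toy 1D sketch of the interfacial correction idea described above (not the authors' exact SUSP3D scheme): in the high-diffusivity half of the domain, the diffusive step is expanded by the diffusivity ratio R while only a fraction 1/R of tracers step each iteration. With fixed-size steps, choosing p * f = 1 balances the tracer flux across the interface while p * f**2 = R preserves the mean-square-displacement ratio, so an initially uniform tracer field stays uniform. All parameters below are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, steps = 100_000, 1000
    R = 4.0                       # D_high / D_low (assumed ratio)
    s = 0.005                     # base step size in the low-diffusivity region
    x = rng.uniform(0.0, 1.0, n)  # tracers, uniform at t = 0; interface at x = 0.5

    for _ in range(steps):
        high = x < 0.5                                  # left half: high diffusivity
        move = np.where(high, rng.random(n) < 1.0 / R, True)  # subset steps on high side
        step = np.where(high, R * s, s) * rng.choice([-1.0, 1.0], n)
        x = np.where(move, x + step, x)
        x = np.abs(x)                                   # reflecting wall at x = 0
        x = np.where(x > 1.0, 2.0 - x, x)               # reflecting wall at x = 1

    left_fraction = float(np.mean(x < 0.5))
    print(f"fraction of tracers in high-D half after {steps} steps: {left_fraction:.3f}")
    ```

    A naive random walk that simply scales the step with the local diffusivity would spuriously accumulate tracers in the low-diffusivity region; the subset-stepping correction keeps the equilibrium distribution globally homogeneous, which is the verification criterion the abstract describes.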

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, Sean A.; Ueltschi, Tyler W.; El-Khoury, Patrick Z.

    Carbon-hydrogen (C-H) vibration modes serve as key probes in the chemical identification of hydrocarbons and in vibrational sum-frequency generation (SFG) spectroscopy of hydrocarbons at the liquid/gas interface. Their assignments pose a challenge from a theoretical viewpoint. In this work, we present a detailed study of the C-H stretching region of dimethyl sulfoxide (DMSO) using a new Gaussian basis set-based ab initio molecular dynamics (AIMD) module that we have implemented in the NWChem computational chemistry program. By combining AIMD simulations and static normal mode analysis, we interpret experimental infrared and Raman spectra and explore the role of anharmonic effects in this system. Our anharmonic normal mode analysis of the in-phase and out-of-phase symmetric C-H stretching modes challenges the previous experimental assignment of the shoulder in the symmetric C-H stretching peak as an overtone or Fermi resonance. In addition, our AIMD simulations also show significant broadening of the in-phase symmetric C-H stretching resonance, which suggests that the experimentally observed shoulder is due to thermal broadening of the symmetric stretching resonance.

  6. Visualization of metabolic interaction networks in microbial communities using VisANT 5.0

    DOE PAGES

    Granger, Brian R.; Chang, Yi -Chien; Wang, Yan; ...

    2016-04-15

    Here, the complexity of metabolic networks in microbial communities poses an unresolved visualization and interpretation challenge. We address this challenge in the newly expanded version of a software tool for the analysis of biological networks, VisANT 5.0. We focus in particular on facilitating the visual exploration of metabolic interaction between microbes in a community, e.g. as predicted by COMETS (Computation of Microbial Ecosystems in Time and Space), a dynamic stoichiometric modeling framework. Using VisANT's unique meta-graph implementation, we show how one can use VisANT 5.0 to explore different time-dependent ecosystem-level metabolic networks. In particular, we analyze the metabolic interaction network between two bacteria previously shown to display an obligate cross-feeding interdependency. In addition, we illustrate how a putative minimal gut microbiome community could be represented in our framework, making it possible to highlight interactions across multiple coexisting species. We envisage that the "symbiotic layout" of VisANT can be employed as a general tool for the analysis of metabolism in complex microbial communities as well as heterogeneous human tissues.

  7. Visual navigation using edge curve matching for pinpoint planetary landing

    NASA Astrophysics Data System (ADS)

    Cui, Pingyuan; Gao, Xizhen; Zhu, Shengying; Shao, Wei

    2018-05-01

    Pinpoint landing is challenging for future Mars and asteroid exploration missions. Vision-based navigation based on feature detection and matching is practical and can achieve the required precision. However, existing algorithms are computationally prohibitive and rely on poor-performance measurements, which poses great challenges for the application of visual navigation. This paper proposes an innovative visual navigation scheme using crater edge curves during the descent and landing phase. In the algorithm, the edge curves of craters tracked across two sequential images are used to determine the relative attitude and position of the lander through a normalized method. Then, to address the error accumulation of relative navigation, a method is developed that integrates the crater-based relative navigation method with a crater-based absolute navigation method, which identifies craters using a georeferenced database for continuous estimation of absolute states. In addition, expressions for the relative state estimate bias are derived. Novel necessary and sufficient observability criteria based on error analysis are provided to improve the navigation performance, and these hold true for similar navigation systems. Simulation results demonstrate the effectiveness and high accuracy of the proposed navigation method.

  8. Modern drug design: the implication of using artificial neuronal networks and multiple molecular dynamic simulations

    NASA Astrophysics Data System (ADS)

    Yakovenko, Oleksandr; Jones, Steven J. M.

    2018-01-01

    We report the implementation of molecular modeling approaches developed as a part of the 2016 Grand Challenge 2, the blinded competition of computer aided drug design technologies held by the D3R Drug Design Data Resource (https://drugdesigndata.org/). The challenge was focused on the ligands of the farnesoid X receptor (FXR), a highly flexible nuclear receptor of the cholesterol derivative chenodeoxycholic acid. FXR is considered an important therapeutic target for metabolic, inflammatory, bowel and obesity related diseases (Expert Opin Drug Metab Toxicol 4:523-532, 2015), but in the context of this competition it is also interesting due to the significant ligand-induced conformational changes displayed by the protein. To deal with these conformational changes we employed multiple simulations of molecular dynamics (MD). Our MD-based protocols were top-ranked in estimating the free energy of binding of the ligands and FXR protein. Our approach was ranked second in the prediction of the binding poses where we also combined MD with molecular docking and artificial neural networks. Our approach showed mediocre results for high-throughput scoring of interactions.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granger, Brian R.; Chang, Yi -Chien; Wang, Yan

    Here, the complexity of metabolic networks in microbial communities poses an unresolved visualization and interpretation challenge. We address this challenge in the newly expanded version of a software tool for the analysis of biological networks, VisANT 5.0. We focus in particular on facilitating the visual exploration of metabolic interaction between microbes in a community, e.g. as predicted by COMETS (Computation of Microbial Ecosystems in Time and Space), a dynamic stoichiometric modeling framework. Using VisANT's unique meta-graph implementation, we show how one can use VisANT 5.0 to explore different time-dependent ecosystem-level metabolic networks. In particular, we analyze the metabolic interaction network between two bacteria previously shown to display an obligate cross-feeding interdependency. In addition, we illustrate how a putative minimal gut microbiome community could be represented in our framework, making it possible to highlight interactions across multiple coexisting species. We envisage that the "symbiotic layout" of VisANT can be employed as a general tool for the analysis of metabolism in complex microbial communities as well as heterogeneous human tissues.

  10. Educating the next generation of explorers at an historically Black University

    NASA Astrophysics Data System (ADS)

    Chaudhury, S.; Rodriguez, W. J.

    2003-04-01

    This paper describes the development of an innovative undergraduate research training model based at an Historically Black University in the USA that involves students with majors in diverse scientific disciplines in authentic Earth Systems Science research. Educating those who will be the next generation of explorers of earth and space poses several challenges at smaller academic institutions that might lack dedicated resources for this area of study. Over a 5-year span, Norfolk State University has been developing a program that has afforded the opportunity for students majoring in biology, chemistry, mathematics, computer science, physics, engineering and science education to work collaboratively in teams on research projects that emphasize the use of scientific visualization in studying the environment. Recently, a hands-on component has been added through partnerships with local K-12 school teachers in data collection and reporting for the GLOBE Program (GLobal Observations to Benefit the Environment). The successes and challenges of this program along with some innovative uses of technology to promote inquiry learning will be presented in this paper.

  11. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  12. Quantifying airborne dispersal routes of pathogens over continents to safeguard global wheat supply.

    PubMed

    Meyer, M; Cox, J A; Hitchings, M D T; Burgin, L; Hort, M C; Hodson, D P; Gilligan, C A

    2017-10-01

    Infectious crop diseases spreading over large agricultural areas pose a threat to food security. Aggressive strains of the obligate pathogenic fungus Puccinia graminis f.sp. tritici (Pgt), causing the crop disease wheat stem rust, have been detected in East Africa and the Middle East, where they lead to substantial economic losses and threaten livelihoods of farmers. The majority of commercially grown wheat cultivars worldwide are susceptible to these emerging strains, which pose a risk to global wheat production, because the fungal spores transmitting the disease can be wind-dispersed over regions and even continents [1-11]. Targeted surveillance and control requires knowledge about airborne dispersal of pathogens, but the complex nature of long-distance dispersal poses significant challenges for quantitative research [12-14]. We combine international field surveys, global meteorological data, a Lagrangian dispersion model and high-performance computational resources to simulate a set of disease outbreak scenarios, tracing billions of stochastic trajectories of fungal spores over dynamically changing host and environmental landscapes for more than a decade. This provides the first quantitative assessment of spore transmission frequencies and amounts amongst all wheat producing countries in Southern/East Africa, the Middle East and Central/South Asia. We identify zones of high airborne connectivity that geographically correspond with previously postulated wheat rust epidemiological zones (characterized by endemic disease and free movement of inoculum) [10,15], and regions with genetic similarities in related pathogen populations [16,17]. We quantify the circumstances (routes, timing, outbreak sizes) under which virulent pathogen strains such as 'Ug99' [5,6] pose a threat from long-distance dispersal out of East Africa to the large wheat producing areas in Pakistan and India.
    Long-term mean spore dispersal trends (predominant direction, frequencies, amounts) are summarized for all countries in the domain (Supplementary Data). Our mechanistic modelling framework can be applied to other geographic areas, adapted for other pathogens and used to provide risk assessments in real-time [3].
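    The Lagrangian dispersion step described above can be sketched as a toy stochastic trajectory model: each spore is advected by a mean wind and perturbed by Gaussian turbulent diffusion at every time step. The wind vector, diffusion strength and step counts below are invented for illustration, not values from the study.

```python
import random

def simulate_trajectories(n_spores, n_steps, wind=(1.0, 0.5), sigma=0.8, seed=42):
    """Trace stochastic spore trajectories: deterministic wind advection
    plus Gaussian turbulent perturbations (a toy Lagrangian dispersion step)."""
    rng = random.Random(seed)
    endpoints = []
    for _ in range(n_spores):
        x, y = 0.0, 0.0  # release point of this spore
        for _ in range(n_steps):
            x += wind[0] + rng.gauss(0.0, sigma)
            y += wind[1] + rng.gauss(0.0, sigma)
        endpoints.append((x, y))
    return endpoints

ends = simulate_trajectories(n_spores=500, n_steps=100)
# The cloud of endpoints spreads around the mean wind displacement;
# counting endpoints per region would give transmission frequencies.
```

    In the real framework each trajectory is driven by reanalysis wind fields and billions of such trajectories are traced on HPC resources; the sketch only shows the advection-diffusion structure of a single Lagrangian step.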

  13. An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training.

    PubMed

    Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos

    2013-12-01

    Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.
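    The perspective-geometry step, recovering a 3-D tip position from measurements in the endoscopic image, can be sketched with a plain pinhole back-projection. The intrinsic parameters and working distance below are illustrative assumptions, not values from the paper, which estimates depth from the tracked colour strip and shaft rather than taking it as given.

```python
import numpy as np

# Hypothetical intrinsics for an endoscopic camera (fx, fy, cx, cy are
# illustrative values only).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def backproject_tip(u, v, depth, K):
    """Recover a 3-D tip position from its pixel coordinates and an
    estimated working distance, using the pinhole (perspective) model."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray direction
    return depth * ray  # point on the ray at the given depth

tip = backproject_tip(400.0, 300.0, 80.0, K)  # e.g. 80 mm working distance
```

    Millimetre-level errors in `depth` translate directly into tip-position error, which is consistent with the ~2.5 mm accuracy the authors report at typical laparoscopic working distances.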

  14. The challenges associated with developing science-based landscape scale management plans.

    Treesearch

    Robert C. Szaro; Douglas A. Jr. Boyce; Thomas Puchlerz

    2005-01-01

    Planning activities over large landscapes poses a complex set of challenges when trying to balance the implementation of a conservation strategy while still allowing for a variety of consumptive and nonconsumptive uses. We examine a case in southeast Alaska to illustrate the breadth of these challenges and an approach to developing a science-based resource plan. Not only...

  15. A New Framework for Effective and Efficient Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin

    2015-04-01

    Earth and Environmental Systems (EES) models are essential components of research, development, and decision-making in science and engineering disciplines. With continuous advances in understanding and computing power, such models are becoming more complex with increasingly more factors to be specified (model parameters, forcings, boundary conditions, etc.). To facilitate better understanding of the role and importance of different factors in producing the model responses, the procedure known as 'Sensitivity Analysis' (SA) can be very helpful. Despite the availability of a large body of literature on the development and application of various SA approaches, two issues continue to pose major challenges: (1) Ambiguous Definition of Sensitivity - Different SA methods are based in different philosophies and theoretical definitions of sensitivity, and can result in different, even conflicting, assessments of the underlying sensitivities for a given problem, (2) Computational Cost - The cost of carrying out SA can be large, even excessive, for high-dimensional problems and/or computationally intensive models. In this presentation, we propose a new approach to sensitivity analysis that addresses the dual aspects of 'effectiveness' and 'efficiency'. By effective, we mean achieving an assessment that is both meaningful and clearly reflective of the objective of the analysis (the first challenge above), while by efficiency we mean achieving statistically robust results with minimal computational cost (the second challenge above). Based on this approach, we develop a 'global' sensitivity analysis framework that efficiently generates a newly-defined set of sensitivity indices that characterize a range of important properties of metric 'response surfaces' encountered when performing SA on EES models. 
Further, we show how this framework embraces, and is consistent with, a spectrum of different concepts regarding 'sensitivity', and that commonly-used SA approaches (e.g., Sobol, Morris, etc.) are actually limiting cases of our approach under specific conditions. Multiple case studies are used to demonstrate the value of the new framework. The results show that the new framework provides a fundamental understanding of the underlying sensitivities for any given problem, while requiring orders of magnitude fewer model runs.
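    One of the limiting cases the abstract names, Morris-style screening, illustrates the efficiency/effectiveness trade-off concretely: it estimates factor importance from cheap one-at-a-time perturbations instead of a full variance decomposition. The toy model and all parameters below are invented for illustration and are not the authors' framework.

```python
import numpy as np

def model(x):
    # Toy response: strongly driven by x0, weakly by x1, independent of x2.
    return 10.0 * np.sin(x[0]) + 0.1 * x[1] ** 2

def elementary_effects(model, n_params, delta=0.1, n_samples=200, seed=0):
    """Morris-style screening: mean absolute one-at-a-time effect per factor,
    averaged over random base points in the unit hypercube."""
    rng = np.random.default_rng(seed)
    mu_star = np.zeros(n_params)
    for _ in range(n_samples):
        x = rng.uniform(0.0, 1.0, n_params)
        fx = model(x)
        for i in range(n_params):
            xp = x.copy()
            xp[i] += delta  # perturb one factor at a time
            mu_star[i] += abs(model(xp) - fx) / delta
    return mu_star / n_samples

mu = elementary_effects(model, 3)
# mu ranks the factors: x0 dominates, x1 is weak, x2 is inert.
```

    The cost is `n_samples * (n_params + 1)` model runs, which is why screening methods remain attractive for computationally intensive EES models even though they yield a coarser notion of sensitivity than variance-based indices.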

  16. Probabilistic visual and electromagnetic data fusion for robust drift-free sequential mosaicking: application to fetoscopy

    PubMed Central

    Tella-Amo, Marcel; Peter, Loic; Shakir, Dzhoshkun I.; Deprest, Jan; Iglesias, Juan Eugenio; Ourselin, Sebastien

    2018-01-01

    Abstract. The most effective treatment for twin-to-twin transfusion syndrome is laser photocoagulation of the shared vascular anastomoses in the placenta. Vascular connections are extremely challenging to locate due to their caliber and the reduced field-of-view of the fetoscope. Therefore, mosaicking techniques are beneficial to expand the scene, facilitate navigation, and allow vessel photocoagulation decision-making. Local vision-based mosaicking algorithms inherently drift over time due to the use of pairwise transformations. We propose the use of an electromagnetic tracker (EMT) sensor mounted at the tip of the fetoscope to obtain camera pose measurements, which we incorporate into a probabilistic framework with frame-to-frame visual information to achieve globally consistent sequential mosaics. We parametrize the problem in terms of plane and camera poses constrained by EMT measurements to enforce global consistency while leveraging pairwise image relationships in a sequential fashion through the use of local bundle adjustment. We show that our approach is drift-free and performs similarly to state-of-the-art global alignment techniques like bundle adjustment albeit with much less computational burden. Additionally, we propose a version of bundle adjustment that uses EMT information. We demonstrate the robustness to EMT noise and loss of visual information and evaluate mosaics for synthetic, phantom-based and ex vivo datasets. PMID:29487889

  17. Intelligent Tutoring Systems for Literacy: Existing Technologies and Continuing Challenges

    ERIC Educational Resources Information Center

    Jacovina, Matthew E.; McNamara, Danielle S.

    2017-01-01

    In this chapter, we describe several intelligent tutoring systems (ITSs) designed to support student literacy through reading comprehension and writing instruction and practice. Although adaptive instruction can be a powerful tool in the literacy domain, developing these technologies poses significant challenges. For example, evaluating the…

  18. THE DNAPL REMEDIATION CHALLENGE: IS THERE A CASE FOR SOURCE DEPLETION?

    EPA Science Inventory

    Releases of Dense Non-Aqueous Phase Liquids (DNAPLs) at a large number of public and private sector sites in the United States pose significant challenges in site remediation and long-term site management. Extensive contamination of groundwater occurs as a result of significant ...

  19. Social Risk Takers: Understanding Bilingualism in Mathematical Discussions

    ERIC Educational Resources Information Center

    Dominguez, Higinio

    2017-01-01

    The teaching and research communities in mathematics education agree that mathematical discussions pose challenges in elementary classrooms. These challenges continue to motivate research on mathematical discussions, with a focus on how students use talk in discussions. This study addresses the question, "What can teachers and researchers…

  20. LIFE CYCLE ASSESSMENT (LCA) AS A FRAMEWORK FOR ADDRESSING THE SUSTAINABILITY OF CONCENTRATED ANIMAL FEEDING OPERATIONS (CAFOS)

    EPA Science Inventory

    The challenges Concentrated Animal Feeding Operations (CAFOs) directly pose to sustainability include their impact on human health, receiving water bodies, groundwater, and air quality. These challenges result from the large quantities of macronutrients (carbon, nitrogen, and pho...

  1. The Threat or Challenge of Accountability

    ERIC Educational Resources Information Center

    Rosenberg, Marvin L.; Brody, Ralph

    1974-01-01

    Social service agencies can improve accountability to their clients in specific ways. These techniques borrow some of the language and principles of management science and can be applied successfully only if social workers accept the challenge posed by accountability and view these concepts as compatible with professional values. (Author)

  2. An interactive framework for acquiring vision models of 3-D objects from 2-D images.

    PubMed

    Motai, Yuichi; Kak, Avinash

    2004-02-01

    This paper presents a human-computer interaction (HCI) framework for building vision models of three-dimensional (3-D) objects from their two-dimensional (2-D) images. Our framework is based on two guiding principles of HCI: 1) provide the human with as much visual assistance as possible to help the human make a correct input; and 2) verify each input provided by the human for its consistency with the inputs previously provided. For example, when stereo correspondence information is elicited from a human, his/her job is facilitated by superimposing epipolar lines on the images. Although that reduces the possibility of error in the human-marked correspondences, such errors are not entirely eliminated because there can be multiple candidate points close together for complex objects. For another example, when pose-to-pose correspondence is sought from a human, his/her job is made easier by allowing the human to rotate the partial model constructed in the previous pose in relation to the partial model for the current pose. While this facility reduces the incidence of human-supplied pose-to-pose correspondence errors, such errors cannot be eliminated entirely because of confusion created when multiple candidate features exist close together. Each input provided by the human is therefore checked against the previous inputs by invoking situation-specific constraints. Different types of constraints (and different human-computer interaction protocols) are needed for the extraction of polygonal features and for the extraction of curved features. We will show results on both polygonal objects and objects containing curved features.
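    The epipolar-line assistance mentioned above reduces to two standard operations: drawing the line l' = F·x in the second image for a point x clicked in the first, and measuring how far the human's candidate correspondence lies from that line (a consistency check on the input). The fundamental matrix below is the textbook one for a rectified stereo pair, chosen only to make the example self-contained.

```python
import numpy as np

def epipolar_line(F, x):
    """Epipolar line l' = F @ x in the second image for a point x = (u, v)
    in the first image, in homogeneous line coordinates (a, b, c)."""
    return F @ np.array([x[0], x[1], 1.0])

def distance_to_line(point, line):
    """Perpendicular pixel distance of a candidate correspondence from the
    epipolar line; large values flag an inconsistent human input."""
    a, b, c = line
    u, v = point
    return abs(a * u + b * v + c) / np.hypot(a, b)

# Illustrative F for a rectified stereo pair, where epipolar lines are the
# horizontal scanlines (x'^T F x = 0 reduces to v' = v).
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])

line = epipolar_line(F, (150.0, 200.0))
```

    In the interactive tool the line would be superimposed on the image; here `distance_to_line` plays the role of the automatic verification step.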

  3. An eye model for uncalibrated eye gaze estimation under variable head pose

    NASA Astrophysics Data System (ADS)

    Hnatow, Justin; Savakis, Andreas

    2007-04-01

    Gaze estimation is an important component of computer vision systems that monitor human activity for surveillance, human-computer interaction, and various other applications including iris recognition. Gaze estimation methods are particularly valuable when they are non-intrusive, do not require calibration, and generalize well across users. This paper presents a novel eye model that is employed for efficiently performing uncalibrated eye gaze estimation. The proposed eye model was constructed from a geometric simplification of the eye and anthropometric data about eye feature sizes in order to circumvent the requirement of calibration procedures for each individual user. The positions of the two eye corners and the midpupil, the distance between the two eye corners, and the radius of the eye sphere are required for gaze angle calculation. The locations of the eye corners and midpupil are estimated via image processing following eye detection, and the remaining parameters are obtained from anthropometric data. This eye model is easily extended to estimating eye gaze under variable head pose. The eye model was tested on still images of subjects at frontal pose (0°) and side pose (34°). An upper bound of the model's performance was obtained by manually selecting the eye feature locations. The resulting average absolute error was 2.98° for frontal pose and 2.87° for side pose. The error was consistent across subjects, which indicates that good generalization was obtained. This level of performance compares well with other gaze estimation systems that utilize a calibration procedure to measure eye features.
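    The geometric idea, treating the eye as a sphere whose radius is fixed by anthropometric data and reading the gaze angle off the midpupil's offset from the eye centre, can be sketched as follows. The radius-to-fissure-width ratio is an illustrative assumption; the paper's exact anthropometric constants may differ.

```python
import math

# Illustrative anthropometric assumption: eyeball radius as a fixed fraction
# of the corner-to-corner (palpebral fissure) width. Not the paper's value.
RADIUS_RATIO = 0.39

def gaze_angle_deg(corner_left, corner_right, midpupil):
    """Horizontal gaze angle from the two eye corners and the midpupil,
    using the eye-sphere simplification: sin(theta) = offset / radius."""
    width = corner_right[0] - corner_left[0]
    radius = RADIUS_RATIO * width          # from anthropometric data
    eye_center_x = (corner_left[0] + corner_right[0]) / 2.0
    offset = midpupil[0] - eye_center_x    # midpupil displacement
    s = max(-1.0, min(1.0, offset / radius))  # clamp against feature noise
    return math.degrees(math.asin(s))

# Pupil centred between the corners => frontal gaze (0 degrees).
frontal = gaze_angle_deg((100.0, 50.0), (140.0, 50.0), (120.0, 50.0))
```

    Because the radius is derived from the measured corner distance rather than a per-user calibration, the same formula generalizes across users, which is the property the abstract emphasizes.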

  4. Implementation of Ada protocols on Mil-STD-1553 B data bus

    NASA Technical Reports Server (NTRS)

    Ruhman, Smil; Rosemberg, Flavia

    1986-01-01

    Standardization activity of data communication in avionic systems started in 1968 for the purpose of total system integration and the elimination of heavy wire bundles carrying signals between various subassemblies. The growing complexity of avionic systems is straining the capabilities of MIL-STD-1553 B (first issued in 1973), but a much greater challenge to it is posed by Ada, the standard language adopted for real-time embedded computer systems. Hardware implementation of Ada communication protocols in a contention/token bus or token ring network is proposed. However, during the transition period when the current command/response multiplex data bus is still flourishing and the development environment for distributed multi-computer Ada systems is as yet lacking, a temporary accommodation of the standard language with the standard bus could be very useful and even highly desirable. By concentrating all status information and decisions at the bus controller, it was found to be possible to construct an elegant and efficient hardware implementation of the Ada protocols at the bus interface. This solution is discussed.

  5. Distributed Parallel Processing and Dynamic Load Balancing Techniques for Multidisciplinary High Speed Aircraft Design

    NASA Technical Reports Server (NTRS)

    Krasteva, Denitza T.

    1998-01-01

    Multidisciplinary design optimization (MDO) for large-scale engineering problems poses many challenges (e.g., the design of an efficient concurrent paradigm for global optimization based on disciplinary analyses, expensive computations over vast data sets, etc.) This work focuses on the application of distributed schemes for massively parallel architectures to MDO problems, as a tool for reducing computation time and solving larger problems. The specific problem considered here is configuration optimization of a high speed civil transport (HSCT), and the efficient parallelization of the embedded paradigm for reasonable design space identification. Two distributed dynamic load balancing techniques (random polling and global round robin with message combining) and two necessary termination detection schemes (global task count and token passing) were implemented and evaluated in terms of effectiveness and scalability to large problem sizes and a thousand processors. The effect of certain parameters on execution time was also inspected. Empirical results demonstrated stable performance and effectiveness for all schemes, and the parametric study showed that the selected algorithmic parameters have a negligible effect on performance.
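    The two load-balancing schemes named above are easiest to see in a toy sequential simulation: an idle worker polls a randomly chosen victim and steals half of its remaining tasks, and the run terminates when a global task count shows all work is done (one of the two termination-detection schemes evaluated). The steal-half policy and single shared counter are simplifying assumptions; the real implementation is distributed across processors.

```python
import random

def random_polling(task_counts, seed=1):
    """Sequential toy of receiver-initiated random polling with
    global-task-count termination detection."""
    rng = random.Random(seed)
    queues = list(task_counts)
    total = sum(queues)
    done = 0    # global task count (termination detection)
    polls = 0   # number of steal attempts issued by idle workers
    while done < total:
        for w in range(len(queues)):
            if queues[w] > 0:
                queues[w] -= 1  # process one local task
                done += 1
            else:
                victim = rng.randrange(len(queues))  # random polling
                polls += 1
                if queues[victim] > 1:
                    stolen = queues[victim] // 2     # steal half
                    queues[victim] -= stolen
                    queues[w] += stolen
    return done, polls

done, polls = random_polling([40, 0, 0, 0])  # one hot spot, three idle workers
```

    Global round robin with message combining differs only in how the victim is chosen (a shared, combined counter instead of a random draw); both need a termination detector because no single worker can tell locally that all queues are empty.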

  6. Real-time probabilistic covariance tracking with efficient model update.

    PubMed

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile for a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but with a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address the appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in a real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as the temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
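    The two ingredients the abstract describes, a covariance region descriptor and a similarity measured on the Riemannian manifold of SPD matrices, can be sketched directly. The feature layout and the synthetic data below are illustrative; the affine-invariant metric shown is a standard choice for covariance descriptors, not necessarily the exact variant used in the paper.

```python
import numpy as np

def region_covariance(features):
    """Covariance descriptor of an image region: `features` is (n_pixels, d),
    e.g. columns (x, y, intensity, |Ix|, |Iy|). The d x d matrix fuses the
    feature types and their correlations into one compact descriptor."""
    return np.cov(features, rowvar=False)

def riemannian_distance(C1, C2):
    """Affine-invariant metric on the SPD manifold:
    d(C1, C2) = sqrt(sum_i log^2 lambda_i), with lambda_i the generalized
    eigenvalues of the pencil (C2, C1)."""
    eigvals = np.linalg.eigvals(np.linalg.solve(C1, C2))
    return np.sqrt(np.sum(np.log(eigvals.real) ** 2))

# Synthetic 5-dimensional "pixel features" standing in for a target region.
rng = np.random.default_rng(0)
target = rng.normal(size=(500, 5))
C = region_covariance(target)
same = region_covariance(target[:250])                    # same appearance
other = region_covariance(rng.normal(size=(500, 5)) * 3.0)  # different stats
```

    A tracker compares candidate regions to the target model with `riemannian_distance`; the ICTL contribution is updating the model online at O(1) cost as the target's appearance changes.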

  7. Profiling and Improving I/O Performance of a Large-Scale Climate Scientific Application

    NASA Technical Reports Server (NTRS)

    Liu, Zhuo; Wang, Bin; Wang, Teng; Tian, Yuan; Xu, Cong; Wang, Yandong; Yu, Weikuan; Cruz, Carlos A.; Zhou, Shujia; Clune, Tom; hide

    2013-01-01

    Exascale computing systems are soon to emerge and will pose great challenges due to the huge gap between computing and I/O performance. Many large-scale scientific applications play an important role in our daily life. The huge amounts of data generated by such applications require highly parallel and efficient I/O management policies. In this paper, we adopt a mission-critical scientific application, GEOS-5, as a case to profile and analyze the communication and I/O issues that are preventing applications from fully utilizing the underlying parallel storage systems. Through detailed architectural and experimental characterization, we observe that current legacy I/O schemes incur significant network communication overheads and are unable to fully parallelize the data access, thus degrading applications' I/O performance and scalability. To address these inefficiencies, we redesign its I/O framework along with a set of parallel I/O techniques to achieve high scalability and performance. Evaluation results on the NASA Discover cluster show that our optimization of GEOS-5 with ADIOS has led to significant performance improvements compared to the original GEOS-5 implementation.

  8. Strategies to use tablet computers for collection of electronic patient-reported outcomes.

    PubMed

    Schick-Makaroff, Kara; Molzahn, Anita

    2015-01-22

    Mobile devices are increasingly being used for data collection in research. However, many researchers do not have experience in collecting data electronically. Hence, the purpose of this short report was to identify issues that emerged in a study that incorporated electronic capture of patient-reported outcomes in clinical settings, and strategies used to address the issues. The issues pertaining to electronic patient-reported outcome data collection were captured qualitatively during a study on use of electronic patient-reported outcomes in two home dialysis units. Fifty-six patients completed three surveys on tablet computers, including the Kidney Disease Quality of Life-36, the Edmonton Symptom Assessment Scale, and a satisfaction measure. Issues that arose throughout the research process were recorded during ethics reviews, implementation process, and data collection. Four core issues emerged including logistics of technology, security, institutional and financial support, and electronic design. Although use of mobile devices for data collection has many benefits, it also poses new challenges for researchers. Advance consideration of possible issues that emerge in the process, and strategies that can help address these issues, may prevent disruption and enhance validity of findings.

  9. Architectural Techniques For Managing Non-volatile Caches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Sparsh

    As chip power dissipation becomes a critical challenge in scaling processor performance, computer architects are forced to fundamentally rethink the design of modern processors and hence, the chip-design industry is now at a major inflection point in its hardware roadmap. The high leakage power and low density of SRAM pose serious obstacles to its use for designing large on-chip caches, and for this reason researchers are exploring non-volatile memory (NVM) devices, such as spin torque transfer RAM, phase change RAM and resistive RAM. However, since NVMs are not strictly superior to SRAM, effective architectural techniques are required for making them a universal memory solution. This book discusses techniques for designing processor caches using NVM devices. It presents algorithms and architectures for improving their energy efficiency, performance and lifetime. It also provides both qualitative and quantitative evaluation to help the reader gain insights and motivate them to explore further. This book will be highly useful for beginners as well as veterans in computer architecture, chip designers, product managers and technical marketing professionals.

  10. Fast reconstruction of optical properties for complex segmentations in near infrared imaging

    NASA Astrophysics Data System (ADS)

    Jiang, Jingjing; Wolf, Martin; Sánchez Majos, Salvador

    2017-04-01

    The intrinsic ill-posed nature of the inverse problem in near infrared imaging makes the reconstruction of fine details of objects deeply embedded in turbid media challenging even for the large amounts of data provided by time-resolved cameras. In addition, most reconstruction algorithms for this type of measurements are only suitable for highly symmetric geometries and rely on a linear approximation to the diffusion equation since a numerical solution of the fully non-linear problem is computationally too expensive. In this paper, we will show that a problem of practical interest can be successfully addressed making efficient use of the totality of the information supplied by time-resolved cameras. We set aside the goal of achieving high spatial resolution for deep structures and focus on the reconstruction of complex arrangements of large regions. We show numerical results based on a combined approach of wavelength-normalized data and prior geometrical information, defining a fully parallelizable problem in arbitrary geometries for time-resolved measurements. Fast reconstructions are obtained using a diffusion approximation and Monte-Carlo simulations, parallelized in a multicore computer and a GPU respectively.
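    The Monte-Carlo side of the combined approach can be illustrated with a minimal photon random walk in a turbid slab: exponential free paths, isotropic scattering, and absorption handled by weight attenuation. Geometry, optical coefficients and the 1-D simplification are all invented for illustration; the paper's simulations are full 3-D and GPU-parallelized.

```python
import math
import random

def photon_transmission(slab_thickness, mu_s, mu_a, n_photons=20000, seed=7):
    """Toy Monte-Carlo photon transport through a turbid slab; returns the
    fraction of launched photon weight transmitted through the far side."""
    rng = random.Random(seed)
    mu_t = mu_s + mu_a          # total interaction coefficient
    albedo = mu_s / mu_t        # survival probability per interaction
    transmitted = 0.0
    for _ in range(n_photons):
        z, cz, w = 0.0, 1.0, 1.0  # depth, direction cosine, photon weight
        while True:
            z += cz * (-math.log(rng.random()) / mu_t)  # exponential free path
            if z >= slab_thickness:
                transmitted += w
                break
            if z < 0.0:
                break                       # back-scattered out of the slab
            w *= albedo                     # absorption via weight attenuation
            if w < 1e-4:
                break                       # terminate negligible photons
            cz = rng.uniform(-1.0, 1.0)     # isotropic scattering (cosine)
    return transmitted / n_photons

t_thin = photon_transmission(0.1, mu_s=10.0, mu_a=0.1)
t_thick = photon_transmission(2.0, mu_s=10.0, mu_a=0.1)
```

    Because each photon is independent, the loop parallelizes trivially, which is why the authors can run the Monte-Carlo forward model on a GPU alongside the diffusion approximation on a multicore CPU.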

  11. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology

    PubMed Central

    Salazar, Brittany M.; Balczewski, Emily A.; Ung, Choong Yong; Zhu, Shizhen

    2016-01-01

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring “big data” applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which “big data” and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases. PMID:28035989

  12. Neuroblastoma, a Paradigm for Big Data Science in Pediatric Oncology.

    PubMed

    Salazar, Brittany M; Balczewski, Emily A; Ung, Choong Yong; Zhu, Shizhen

    2016-12-27

    Pediatric cancers rarely exhibit recurrent mutational events when compared to most adult cancers. This poses a challenge in understanding how cancers initiate, progress, and metastasize in early childhood. Also, due to limited detected driver mutations, it is difficult to benchmark key genes for drug development. In this review, we use neuroblastoma, a pediatric solid tumor of neural crest origin, as a paradigm for exploring "big data" applications in pediatric oncology. Computational strategies derived from big data science, namely network- and machine learning-based modeling and drug repositioning, hold the promise of shedding new light on the molecular mechanisms driving neuroblastoma pathogenesis and identifying potential therapeutics to combat this devastating disease. These strategies integrate robust data input, from genomic and transcriptomic studies, clinical data, and in vivo and in vitro experimental models specific to neuroblastoma and other types of cancers that closely mimic its biological characteristics. We discuss contexts in which "big data" and computational approaches, especially network-based modeling, may advance neuroblastoma research, describe currently available data and resources, and propose future models of strategic data collection and analyses for neuroblastoma and other related diseases.

  13. Numerical Simulations of Reacting Flows Using Asynchrony-Tolerant Schemes for Exascale Computing

    NASA Astrophysics Data System (ADS)

    Cleary, Emmet; Konduri, Aditya; Chen, Jacqueline

    2017-11-01

    Communication and data synchronization between processing elements (PEs) are likely to pose a major challenge in scalability of solvers at the exascale. Recently developed asynchrony-tolerant (AT) finite difference schemes address this issue by relaxing communication and synchronization between PEs at a mathematical level while preserving accuracy, resulting in improved scalability. The performance of these schemes has been validated for simple linear and nonlinear homogeneous PDEs. However, many problems of practical interest are governed by highly nonlinear PDEs with source terms, whose solution may be sensitive to perturbations caused by communication asynchrony. The current work applies the AT schemes to combustion problems with chemical source terms, yielding a stiff system of PDEs with nonlinear source terms highly sensitive to temperature. Examples shown will use single-step and multi-step CH4 mechanisms for 1D premixed and nonpremixed flames. Error analysis will be discussed both in physical and spectral space. Results show that additional errors introduced by the AT schemes are negligible and the schemes preserve their accuracy. We acknowledge funding from the DOE Computational Science Graduate Fellowship administered by the Krell Institute.

  14. Structure prediction of the second extracellular loop in G-protein-coupled receptors.

    PubMed

    Kmiecik, Sebastian; Jamroz, Michal; Kolinski, Michal

    2014-06-03

    G-protein-coupled receptors (GPCRs) play key roles in living organisms. Therefore, it is important to determine their functional structures. The second extracellular loop (ECL2) is a functionally important region of GPCRs, which poses significant challenge for computational structure prediction methods. In this work, we evaluated CABS, a well-established protein modeling tool for predicting ECL2 structure in 13 GPCRs. The ECL2s (with between 13 and 34 residues) are predicted in an environment of other extracellular loops being fully flexible and the transmembrane domain fixed in its x-ray conformation. The modeling procedure used theoretical predictions of ECL2 secondary structure and experimental constraints on disulfide bridges. Our approach yielded ensembles of low-energy conformers and the most populated conformers that contained models close to the available x-ray structures. The level of similarity between the predicted models and x-ray structures is comparable to that of other state-of-the-art computational methods. Our results extend other studies by including newly crystallized GPCRs. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Three-dimensional deformable-model-based localization and recognition of road vehicles.

    PubMed

    Zhang, Zhaoxiang; Tan, Tieniu; Huang, Kaiqi; Wang, Yunhong

    2012-01-01

    We address the problem of model-based object recognition. Our aim is to localize and recognize road vehicles from monocular images or videos in calibrated traffic scenes. A 3-D deformable vehicle model with 12 shape parameters is set up as prior information, and its pose is determined by three parameters, which are its position on the ground plane and its orientation about the vertical axis under ground-plane constraints. An efficient local gradient-based method is proposed to evaluate the fitness between the projection of the vehicle model and image data, which is combined into a novel evolutionary computing framework to estimate the 12 shape parameters and three pose parameters by iterative evolution. The recovery of pose parameters achieves vehicle localization, whereas the shape parameters are used for vehicle recognition. Numerous experiments are conducted in this paper to demonstrate the performance of our approach. It is shown that the local gradient-based method can evaluate accurately and efficiently the fitness between the projection of the vehicle model and the image data. The evolutionary computing framework is effective for vehicles of different types and poses, and is robust to all kinds of occlusion.

  16. Exploration Design Challenge 2014

    NASA Image and Video Library

    2014-04-25

    Team Lore poses with NASA Administrator Charles Bolden and Lockheed Martin CEO, Marillyn Hewson. Team Lore was one of the semi-finalists in the Exploration Design Challenge. The goal of the Exploration Design Challenge is for students to research and design ways to protect astronauts from space radiation. The winner of the challenge was announced on April 25, 2014 at the USA Science and Engineering Festival at the Washington Convention Center in Washington, DC. Photo Credit: (NASA/Aubrey Gemignani)

  17. Exploration Design Challenge 2014

    NASA Image and Video Library

    2014-04-25

    Team Aegis poses with NASA Administrator Charles Bolden and Lockheed Martin CEO, Marillyn Hewson. Team Aegis was one of the semi-finalists in the Exploration Design Challenge. The goal of the Exploration Design Challenge is for students to research and design ways to protect astronauts from space radiation. The winner of the challenge was announced on April 25, 2014 at the USA Science and Engineering Festival at the Washington Convention Center in Washington, DC. Photo Credit: (NASA/Aubrey Gemignani)

  18. Time-Domain Impedance Boundary Conditions for Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.; Auriault, Laurent

    1996-01-01

    It is an accepted practice in aeroacoustics to characterize the properties of an acoustically treated surface by a quantity known as impedance. Impedance is a complex quantity. As such, it is designed primarily for frequency-domain analysis. Time-domain boundary conditions that are the equivalent of the frequency-domain impedance boundary condition are proposed. Both single-frequency and model broadband time-domain impedance boundary conditions are provided. It is shown that the proposed boundary conditions, together with the linearized Euler equations, form well-posed initial boundary value problems. Unlike ill-posed problems, they are free from spurious instabilities that would render time-marching computational solutions impossible.
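For context, the frequency-domain impedance condition that the proposed time-domain conditions generalize relates the Fourier transforms of acoustic pressure and normal velocity at the treated surface (standard definition; the notation here is assumed, not taken from the paper):

```latex
Z(\omega) \;=\; \frac{\hat{p}(\omega)}{\hat{v}_n(\omega)} \;=\; R(\omega) + i\,X(\omega)
```

where R is the resistance and X the reactance. An equivalent time-domain boundary condition must reproduce this relation across frequencies while keeping the initial boundary value problem well posed.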

  19. Camera-pose estimation via projective Newton optimization on the manifold.

    PubMed

    Sarkis, Michel; Diepold, Klaus

    2012-04-01

    Determining the pose of a moving camera is an important task in computer vision. In this paper, we derive a projective Newton algorithm on the manifold to refine the pose estimate of a camera. The main idea is to benefit from the fact that the 3-D rigid motion is described by the special Euclidean group, which is a Riemannian manifold. The latter is equipped with a tangent space defined by the corresponding Lie algebra. This enables us to compute the optimization direction, i.e., the gradient and the Hessian, at each iteration of the projective Newton scheme on the tangent space of the manifold. Then, the motion is updated by projecting back the variables on the manifold itself. We also derive another version of the algorithm that employs homeomorphic parameterization to the special Euclidean group. We test the algorithm on several simulated and real image data sets. Compared with the standard Newton minimization scheme, we are now able to obtain the full numerical formula of the Hessian with a 60% decrease in computational complexity. Compared with Levenberg-Marquardt, the results obtained are more accurate while having a rather similar complexity.
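The "compute the step on the tangent space, then project back to the manifold" idea can be illustrated for the rotation part of the special Euclidean group using Rodrigues' formula. This is a generic plain-Python sketch, not the authors' implementation; the tangent-space step `delta` would normally come from solving the projective Newton system rather than being fixed by hand.

```python
import math

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def hat(w):
    """so(3) hat operator: 3-vector -> skew-symmetric matrix."""
    wx, wy, wz = w
    return [[0.0, -wz, wy],
            [wz, 0.0, -wx],
            [-wy, wx, 0.0]]

def exp_so3(w):
    """Rodrigues' formula: exponential map from so(3) to SO(3)."""
    theta = math.sqrt(sum(x * x for x in w))
    I = [[float(i == j) for j in range(3)] for i in range(3)]
    if theta < 1e-12:
        return I
    K = hat(w)
    K2 = matmul(K, K)
    a = math.sin(theta) / theta
    b = (1.0 - math.cos(theta)) / theta ** 2
    return [[I[i][j] + a * K[i][j] + b * K2[i][j] for j in range(3)]
            for i in range(3)]

# A Newton-style iteration solves for a small rotation step `delta` on the
# tangent space, then updates the estimate by projecting back to SO(3):
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
delta = [0.0, 0.0, math.pi / 2]   # hypothetical tangent-space step
R = matmul(R, exp_so3(delta))     # 90-degree rotation about the z-axis
```

The translation part of the motion lives in ordinary Euclidean space, so only the rotational block of the update needs this projection step.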

  20. Exemplar-based human action pose correction.

    PubMed

    Shen, Wei; Deng, Ke; Bai, Xiang; Leyvand, Tommer; Guo, Baining; Tu, Zhuowen

    2014-07-01

    The launch of the Xbox Kinect built a very successful computer vision product and made a big impact on the gaming industry. This sheds light on a wide variety of potential applications related to action recognition, in which accurate estimation of human poses from the depth image is a critical step. However, existing pose estimation systems exhibit failures when facing severe occlusion. In this paper, we propose an exemplar-based method that learns to correct the initially estimated poses. We learn an inhomogeneous systematic bias by leveraging the exemplar information within a specific human action domain. Furthermore, as an extension, we learn a conditional model by incorporating pose tags to further increase the accuracy of pose correction. In the experiments, significant improvements on both joint-based skeleton correction and tag prediction are observed over contemporary approaches, including what is delivered by the current Kinect system. Our experiments on facial landmark correction also illustrate that our algorithm can improve the accuracy of other detection/estimation systems.

  1. Attribute And-Or Grammar for Joint Parsing of Human Pose, Parts and Attributes.

    PubMed

    Park, Seyoung; Nie, Xiaohan; Zhu, Song-Chun

    2017-07-25

    This paper presents an attribute and-or grammar (A-AOG) model for jointly inferring human body pose and human attributes in a parse graph, with attributes augmented to nodes in the hierarchical representation. In contrast to other popular methods in the current literature that train separate classifiers for poses and individual attributes, our method explicitly represents the decomposition and articulation of body parts and accounts for the correlations between poses and attributes. The A-AOG model is an amalgamation of three traditional grammar formulations: (i) a phrase structure grammar representing the hierarchical decomposition of the human body from whole to parts; (ii) a dependency grammar modeling the geometric articulation by a kinematic graph of the body pose; and (iii) an attribute grammar accounting for the compatibility relations between different parts in the hierarchy so that their appearances follow a consistent style. The parse graph outputs human detection, pose estimation, and attribute prediction simultaneously, which are intuitive and interpretable. We conduct experiments on two tasks on two datasets, and the results demonstrate the advantage of joint modeling over computing poses and attributes independently. Furthermore, our model outperforms existing methods on both pose estimation and attribute prediction tasks.

  2. Food Aversions and Cravings during Pregnancy on Yasawa Island, Fiji.

    PubMed

    McKerracher, Luseadra; Collard, Mark; Henrich, Joseph

    2016-09-01

    Women often experience novel food aversions and cravings during pregnancy. These appetite changes have been hypothesized to work alongside cultural strategies as adaptive responses to the challenges posed by pregnancy (e.g., maternal immune suppression). Here, we report a study that assessed whether data from an indigenous population in Fiji are consistent with the predictions of this hypothesis. We found that aversions focus predominantly on foods expected to exacerbate the challenges of pregnancy. Cravings focus on foods that provide calories and micronutrients while posing few threats to mothers and fetuses. We also found that women who experience aversions to specific foods are more likely to crave foods that meet nutritional needs similar to those provided by the aversive foods. These findings are in line with the predictions of the hypothesis. This adds further weight to the argument that appetite changes may function in parallel with cultural mechanisms to solve pregnancy challenges.

  3. Climate Change and Health in the Urban Context: The Experience of Barcelona.

    PubMed

    Villalbí, Joan R; Ventayol, Irma

    2016-07-01

    Climate change poses huge challenges for public health, and cities are at the forefront of this process. The purpose of this paper is to present the issues climate change poses for public health in the city of Barcelona, how they are being addressed, and what the current major challenges are, seeking to contribute to a baseline understanding of the status of adaptation in cities from a public health perspective. The major climate-related issues faced by the city are common to other urban centers in a Mediterranean climate: heat waves, water availability and quality, air quality, and vector-borne diseases; all are reviewed in detail with empirical data. They are not merely a potential future threat but have actually challenged the city's services and infrastructure over recent years, requiring sustainable responses and rigorous planning. © The Author(s) 2016.

  4. An improved silhouette for human pose estimation

    NASA Astrophysics Data System (ADS)

    Hawes, Anthony H.; Iftekharuddin, Khan M.

    2017-08-01

    We propose a novel method for analyzing images that exploits the natural lines of a human pose to find areas where self-occlusion could be present. Self-occlusion causes several modern human pose estimation methods to misidentify body parts, which reduces the performance of most action recognition algorithms. Our method is motivated by the observation that, in several cases, occlusion can be reasoned about using only the boundary lines of limbs. An intelligent edge detection algorithm based on this principle could be used to augment the silhouette with information useful for pose estimation algorithms and push forward progress on occlusion handling for human action recognition. The algorithm described is applicable to computer vision scenarios involving 2D images and (appropriately flattened) 3D images.

  5. The future water environment--using scenarios to explore the significant water management challenges in England and Wales to 2050.

    PubMed

    Henriques, C; Garnett, K; Weatherhead, E K; Lickorish, F A; Forrow, D; Delgado, J

    2015-04-15

    Society gets numerous benefits from the water environment. It is crucial to ensure that water management practices deliver these benefits over the long-term in a sustainable and cost-effective way. Currently, hydromorphological alterations and nutrient enrichment pose the greatest challenges in European water bodies. The rapidly changing climatic and socio-economic boundary conditions pose further challenges to water management decisions and the achievement of policy goals. Scenarios are a strategic tool useful in conducting systematic investigations of future uncertainties pertaining to water management. In this study, the use of scenarios revealed water management challenges for England and Wales to 2050. A set of existing scenarios relevant to river basin management were elaborated through stakeholder workshops and interviews, relying on expert knowledge to identify drivers of change, their interdependencies, and influence on system dynamics. In a set of four plausible alternative futures, the causal chain from driving forces through pressures to states, impacts and responses (DPSIR framework) was explored. The findings suggest that scenarios driven by short-term economic growth and competitiveness undermine current environmental legislative requirements and exacerbate the negative impacts of climate change, producing a general deterioration of water quality and physical habitats, as well as reduced water availability with adverse implications for the environment, society and economy. Conversely, there are substantial environmental improvements under the scenarios characterised by long-term sustainability, though achieving currently desired environmental outcomes still poses challenges. The impacts vary across contrasting generic catchment types that exhibit distinct future water management challenges. 
The findings suggest the need to address hydromorphological alterations, nutrient enrichment and nitrates in drinking water, which are all likely to be exacerbated in the future. Future-proofing river basin management measures that deal with these challenges is crucial moving forward. The use of scenarios to future-proof strategy, policy and delivery mechanisms is discussed to inform next steps. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Epidemic forecasting is messier than weather forecasting: The role of human behavior and internet data streams in epidemic forecast

    DOE PAGES

    Moran, Kelly Renee; Fairchild, Geoffrey; Generous, Nicholas; ...

    2016-11-14

    Mathematical models, such as those that forecast the spread of epidemics or predict the weather, must overcome the challenges of integrating incomplete and inaccurate data in computer simulations, estimating the probability of multiple possible scenarios, incorporating changes in human behavior and/or the pathogen, and environmental factors. In the past 3 decades, the weather forecasting community has made significant advances in data collection, assimilating heterogeneous data streams into models and communicating the uncertainty of their predictions to the general public. Epidemic modelers are struggling with these same issues in forecasting the spread of emerging diseases, such as Zika virus infection and Ebola virus disease. While weather models rely on physical systems, data from satellites, and weather stations, epidemic models rely on human interactions, multiple data sources such as clinical surveillance and Internet data, and environmental or biological factors that can change the pathogen dynamics. We describe some of the similarities and differences between these 2 fields and how the epidemic modeling community is rising to the challenges posed by forecasting to help anticipate and guide the mitigation of epidemics. Here, we conclude that some of the fundamental differences between these 2 fields, such as human behavior, make disease forecasting more challenging than weather forecasting.

  7. Epidemic forecasting is messier than weather forecasting: The role of human behavior and internet data streams in epidemic forecast

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moran, Kelly Renee; Fairchild, Geoffrey; Generous, Nicholas

    Mathematical models, such as those that forecast the spread of epidemics or predict the weather, must overcome the challenges of integrating incomplete and inaccurate data in computer simulations, estimating the probability of multiple possible scenarios, incorporating changes in human behavior and/or the pathogen, and environmental factors. In the past 3 decades, the weather forecasting community has made significant advances in data collection, assimilating heterogeneous data streams into models and communicating the uncertainty of their predictions to the general public. Epidemic modelers are struggling with these same issues in forecasting the spread of emerging diseases, such as Zika virus infection and Ebola virus disease. While weather models rely on physical systems, data from satellites, and weather stations, epidemic models rely on human interactions, multiple data sources such as clinical surveillance and Internet data, and environmental or biological factors that can change the pathogen dynamics. We describe some of the similarities and differences between these 2 fields and how the epidemic modeling community is rising to the challenges posed by forecasting to help anticipate and guide the mitigation of epidemics. Here, we conclude that some of the fundamental differences between these 2 fields, such as human behavior, make disease forecasting more challenging than weather forecasting.

  8. Epidemic Forecasting is Messier Than Weather Forecasting: The Role of Human Behavior and Internet Data Streams in Epidemic Forecast

    PubMed Central

    Moran, Kelly R.; Fairchild, Geoffrey; Generous, Nicholas; Hickmann, Kyle; Osthus, Dave; Priedhorsky, Reid; Hyman, James; Del Valle, Sara Y.

    2016-01-01

    Mathematical models, such as those that forecast the spread of epidemics or predict the weather, must overcome the challenges of integrating incomplete and inaccurate data in computer simulations, estimating the probability of multiple possible scenarios, incorporating changes in human behavior and/or the pathogen, and environmental factors. In the past 3 decades, the weather forecasting community has made significant advances in data collection, assimilating heterogeneous data streams into models and communicating the uncertainty of their predictions to the general public. Epidemic modelers are struggling with these same issues in forecasting the spread of emerging diseases, such as Zika virus infection and Ebola virus disease. While weather models rely on physical systems, data from satellites, and weather stations, epidemic models rely on human interactions, multiple data sources such as clinical surveillance and Internet data, and environmental or biological factors that can change the pathogen dynamics. We describe some of the similarities and differences between these 2 fields and how the epidemic modeling community is rising to the challenges posed by forecasting to help anticipate and guide the mitigation of epidemics. We conclude that some of the fundamental differences between these 2 fields, such as human behavior, make disease forecasting more challenging than weather forecasting. PMID:28830111

  9. Computed tomography and magnetic resonance imaging findings of intraorbital granular cell tumor (Abrikossoff's tumor): a case report.

    PubMed

    Yuan, Wei-Hsin; Lin, Tai-Chi; Lirng, Jiing-Feng; Guo, Wan-You; Chang, Fu-Pang; Ho, Donald Ming-Tak

    2016-05-13

    Granular cell tumors are rare neoplasms that can occur in any part of the body. Granular cell tumors of the orbit account for only 3 % of all granular cell tumor cases. Computed tomography and magnetic resonance imaging of the orbit have proven useful for diagnosing orbital tumors. However, the rarity of intraorbital granular cell tumors poses a significant diagnostic challenge for both clinicians and radiologists. We report a case of a 37-year-old Chinese woman with a rare intraorbital granular cell tumor of her right eye presenting with diplopia, proptosis, and restriction of ocular movement. Preoperative orbital computed tomography and magnetic resonance imaging with contrast enhancement revealed an enhancing solid, ovoid, well-demarcated, retrobulbar nodule. In addition, magnetic resonance imaging features included an intraorbital tumor which was isointense relative to gray matter on T1-weighted imaging and hypointense on T2-weighted imaging. No diffusion restriction of water was noted on either axial diffusion-weighted images or apparent diffusion coefficient maps. Both computed tomography and magnetic resonance imaging features suggested an intraorbital hemangioma. However, postoperative pathology (together with immunohistochemistry) identified an intraorbital granular cell tumor. When intraorbital T2 hypointensity and free diffusion of water are observed on magnetic resonance imaging, a granular cell tumor should be included in the differential diagnosis of an intraorbital tumor.

  10. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observational data of various spatial resolutions and domain coverages. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose grand challenges to the storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data has to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Scientific Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.

  11. Distribution of man-machine controls in space teleoperation

    NASA Technical Reports Server (NTRS)

    Bejczy, A. K.

    1982-01-01

    The distribution of control between man and machine is dependent on the tasks, available technology, human performance characteristics and control goals. This dependency has very specific projections on systems designed for teleoperation in space. This paper gives a brief outline of the space-related issues and presents the results of advanced teleoperator research and development at the Jet Propulsion Laboratory (JPL). The research and development work includes smart sensors, flexible computer controls and intelligent man-machine interface devices in the area of visual displays and kinesthetic man-machine coupling in remote control of manipulators. Some of the development results have been tested at the Johnson Space Center (JSC) using the simulated full-scale Shuttle Remote Manipulator System (RMS). The research and development work for advanced space teleoperation is far from complete and poses many interdisciplinary challenges.

  12. A comparison of the primal and semi-dual variational formats of gradient-extended crystal inelasticity

    NASA Astrophysics Data System (ADS)

    Carlsson, Kristoffer; Runesson, Kenneth; Larsson, Fredrik; Ekh, Magnus

    2017-10-01

    In this paper we discuss issues related to the theoretical as well as the computational format of gradient-extended crystal viscoplasticity. The so-called primal format uses the displacements, the slip of each slip system and the dissipative stresses as the primary unknown fields. An alternative format is coined the semi-dual format, which in addition includes energetic microstresses among the primary unknown fields. We compare the primal and semi-dual variational formats in terms of advantages and disadvantages from modeling as well as numerical viewpoints. Finally, we perform a series of representative numerical tests to investigate the rate of convergence with finite element mesh refinement. In particular, it is shown that the commonly adopted microhard boundary condition poses a challenge in the special case that the slip direction is parallel to a grain boundary.

  13. Semantics driven approach for knowledge acquisition from EMRs.

    PubMed

    Perera, Sujan; Henson, Cory; Thirunarayan, Krishnaprasad; Sheth, Amit; Nair, Suhas

    2014-03-01

    Semantic computing technologies have matured to be applicable to many critical domains such as national security, life sciences, and health care. However, the key to their success is the availability of a rich domain knowledge base. The creation and refinement of domain knowledge bases pose difficult challenges. The existing knowledge bases in the health care domain are rich in taxonomic relationships, but they lack nontaxonomic (domain) relationships. In this paper, we describe a semiautomatic technique for enriching existing domain knowledge bases with causal relationships gleaned from Electronic Medical Records (EMR) data. We determine missing causal relationships between domain concepts by validating domain knowledge against EMR data sources and leveraging semantic-based techniques to derive plausible relationships that can rectify knowledge gaps. Our evaluation demonstrates that semantic techniques can be employed to improve the efficiency of knowledge acquisition.

  14. White House Science Fair

    NASA Image and Video Library

    2014-05-27

    Girl Scout troop 2612 members from Tulsa, OK take photos of one another with Google Glass at the White House Science Fair Tuesday, May 27, 2014. Avery Dodson, 6; Natalie Hurley, 8; Miriam Schaffer, 8; Claire Winton, 8; and Lucy Claire Sharp, 8 participated in the Junior FIRST Lego League's Disaster Blaster Challenge, which invites elementary-school-aged students from across the country to explore how simple machines, engineering, and math can help solve problems posed by natural disasters. The girls invented the "Flood Proof Bridge," building a model that mechanizes the bridge with motors and developing a computer program that automatically retracts the bridge when flood conditions are detected. The fourth White House Science Fair was held at the White House and included 100 students from more than 30 different states who competed in science, technology, engineering, and math (STEM) competitions. (Photo Credit: NASA/Aubrey Gemignani)

  15. Dragonfly: an implementation of the expand-maximize-compress algorithm for single-particle imaging.

    PubMed

    Ayyer, Kartik; Lan, Ti-Yen; Elser, Veit; Loh, N Duane

    2016-08-01

    Single-particle imaging (SPI) with X-ray free-electron lasers has the potential to change fundamentally how biomacromolecules are imaged. The structure would be derived from millions of diffraction patterns, each from a different copy of the macromolecule before it is torn apart by radiation damage. The challenges posed by the resultant data stream are staggering: millions of incomplete, noisy and un-oriented patterns have to be computationally assembled into a three-dimensional intensity map and then phase reconstructed. In this paper, the Dragonfly software package is described, based on a parallel implementation of the expand-maximize-compress reconstruction algorithm that is well suited for this task. Auxiliary modules to simulate SPI data streams are also included to assess the feasibility of proposed SPI experiments at the Linac Coherent Light Source, Stanford, California, USA.

  16. Performance of InGaAs short wave infrared avalanche photodetector for low flux imaging

    NASA Astrophysics Data System (ADS)

    Singh, Anand; Pal, Ravinder

    2017-11-01

    The opto-electronic performance of an InGaAs/i-InGaAs/InP short-wavelength infrared focal plane array suitable for high-resolution imaging under low flux conditions and for ranging is presented. More than 85% quantum efficiency is achieved in the optimized detector structure. The isotropic nature of the wet etching process poses a challenge in maintaining the required control in the small-pitch, high-density detector array. An etching process is developed to achieve a low dark current density of 1 nA/cm2 in the detector array with 25 µm pitch at 298 K. Noise-equivalent photon performance of less than one is achievable, demonstrating single-photon detection capability. The reported photodiode is suitable for low-photon-flux active and passive imaging, optical information processing, and quantum computing applications.

  17. A fragment-based approach to the SAMPL3 Challenge

    NASA Astrophysics Data System (ADS)

    Kulp, John L.; Blumenthal, Seth N.; Wang, Qiang; Bryan, Richard L.; Guarnieri, Frank

    2012-05-01

    The success of molecular fragment-based design depends critically on the ability to make predictions of binding poses and of affinity ranking for compounds assembled by linking fragments. The SAMPL3 Challenge provides a unique opportunity to evaluate the performance of a state-of-the-art fragment-based design methodology with respect to these requirements. In this article, we present results derived from linking fragments to predict affinity and pose in the SAMPL3 Challenge. The goal is to demonstrate how incorporating different aspects of modeling protein-ligand interactions impacts the accuracy of the predictions, including protein dielectric models, charged versus neutral ligands, ΔΔG solvation energies, and induced conformational stress. The core method is based on annealing of chemical potential in a Grand Canonical Monte Carlo (GC/MC) simulation. By imposing an initially very high chemical potential and then automatically running a sequence of simulations at successively decreasing chemical potentials, the GC/MC simulation efficiently discovers statistical distributions of bound fragment locations and orientations not found reliably without the annealing. This method accounts for configurational entropy and the role of bound water molecules, and results in a prediction of all the locations on the protein that have any affinity for the fragment. Disregarding any of these factors in affinity-rank prediction leads to significantly worse correlation with experimentally-determined free energies of binding. 
We relate three important conclusions from this challenge as applied to GC/MC: (1) modeling neutral ligands—regardless of the charged state in the active site—produced better affinity ranking than using charged ligands, although, in both cases, the poses were almost exactly overlaid; (2) simulating explicit water molecules in the GC/MC gave better affinity and pose predictions; and (3) applying a ΔΔG solvation correction further improved the ranking of the neutral ligands. Using the GC/MC method under a variety of parameters in the blinded SAMPL3 Challenge provided important insights into the relevant parameters and boundaries in predicting binding affinities using simulated annealing of chemical potential calculations.
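The annealing-of-chemical-potential idea can be sketched as a toy grand canonical Monte Carlo loop: particles (standing in for fragments) are inserted or deleted with acceptance probabilities controlled by a chemical potential μ that is stepped downward. This is a minimal illustration using textbook ideal-gas-style acceptance rules with the thermal wavelength set to 1 and a non-interacting toy energy function; it is not the GC/MC implementation used in the paper.

```python
import math
import random

random.seed(1)  # deterministic run for this sketch

def gcmc_step(particles, mu, beta, volume, energy):
    """One grand canonical MC step: attempt an insertion or a deletion.
    Textbook GCMC acceptance rules, thermal wavelength set to 1."""
    N = len(particles)
    side = volume ** (1.0 / 3.0)
    if random.random() < 0.5:
        # Trial insertion at a uniformly random position in the box.
        p = [random.uniform(0.0, side) for _ in range(3)]
        dU = energy(particles + [p]) - energy(particles)
        acc = min(1.0, (volume / (N + 1)) * math.exp(beta * (mu - dU)))
        if random.random() < acc:
            particles.append(p)
    elif N > 0:
        # Trial deletion of a uniformly random particle.
        i = random.randrange(N)
        trial = particles[:i] + particles[i + 1:]
        dU = energy(trial) - energy(particles)
        acc = min(1.0, (N / volume) * math.exp(-beta * (mu + dU)))
        if random.random() < acc:
            particles[:] = trial
    return particles

# Anneal the chemical potential downward, as in the abstract's protocol;
# with a realistic interacting energy, low-affinity sites empty out first
# and only the strongest "binding sites" retain particles at the end.
ideal = lambda ps: 0.0  # non-interacting toy energy for this sketch
particles = []
for mu in [2.0, 1.0, 0.0, -1.0]:
    for _ in range(100):
        gcmc_step(particles, mu, beta=1.0, volume=1000.0, energy=ideal)
```

With the toy energy the population simply equilibrates toward the ideal-gas value at each μ; the annealing schedule becomes informative only once `energy` encodes real fragment-protein interactions.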

  18. Comparison of Student and Instructor Perceptions of Social Presence

    ERIC Educational Resources Information Center

    Mathieson, Kathleen; Leafman, Joan S.

    2014-01-01

    As enrollment in online courses continues to grow and online education is increasingly recognized as an established instructional mode, the unique challenges posed by this learning environment should be addressed. A primary challenge for virtual educators is developing social presence such that participants feel a sense of human connection with…

  19. Entrepreneurship Education Can Business Schools Meet the Challenge

    ERIC Educational Resources Information Center

    Kirby, David A.

    2004-01-01

    Examines the characteristics and role of the entrepreneur and the challenges for business schools posed by the need to develop more enterprising individuals. Argues that the traditional education system stultifies rather than develops the requisite attributes and skills to produce entrepreneurs, and proposes that if entrepreneurs are to be…

  20. Quantitative Approaches to Group Research: Suggestions for Best Practices

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  1. Adoption of Internet2 in a Southwestern University: Human Resources Concerns

    ERIC Educational Resources Information Center

    Mendoza-Diaz, Noemi V.; Dooley, Larry M.; Dooley, Kim E.

    2007-01-01

    Human Resources are often times challenged by the integration of new technologies (Benson, Johnson, & Kichinke, 2002). Universities pose a unique challenge since they reluctantly adapt to changes (Torraco & Hoover, 2005; Watkins 2005). This is a dissertation study of the human resource concerns about adopting Internet2 in a…

  2. A Perkins Challenge: Assessing Technical Skills in CTE

    ERIC Educational Resources Information Center

    Stone, James R., III

    2009-01-01

    Federal law requires states to develop performance measures and data-collection systems for secondary and postsecondary technical-skill attainment. This poses many challenges, such as defining technical skills, determining how to measure them, and deciding when to assess students. In this article, the author outlines various assessment models and looks at the challenges…

  3. Military Children and Families: Strengths and Challenges during Peace and War

    ERIC Educational Resources Information Center

    Park, Nansook

    2011-01-01

    Throughout history, military children and families have shown great capacity for adaptation and resilience. However, in recent years, unprecedented lengthy and multiple combat deployments of service members have posed multiple challenges for U.S. military children and families. Despite needs to better understand the impact of deployment on…

  4. Intergenerational Challenges in Australian Jewish School Education

    ERIC Educational Resources Information Center

    Gross, Zehavit; Rutland, Suzanne D.

    2014-01-01

    The aim of this research is to investigate the intergenerational changes that have occurred in Australian Jewish day schools and the challenges these pose for religious and Jewish education. Using a grounded theory approach according to the constant comparative method (Strauss 1987), data from three sources (interviews [296], observations [27],…

  5. Insights into the polerovirus-plant interactome revealed by co-immunoprecipitation and mass spectrometry

    USDA-ARS?s Scientific Manuscript database

    The identification of host proteins that interact with virus proteins is a major challenge for the field of virology. Phloem-limited viruses pose extraordinary challenges for in vivo protein interaction experiments because these viruses are localized in very few and highly specialized host cells. ...

  6. Evaluation Options for Family Resource Centers.

    ERIC Educational Resources Information Center

    Horsch, Karen, Ed.; Weiss, Heather B., Ed.

Family resource centers (FRCs) are emerging as a promising program approach to solving urgent social problems. Evaluation plays an important role in learning how these programs work, what their impact is, and whether they should be expanded. However, FRCs pose unique challenges to evaluation. This report considers the challenges to evaluating FRCs,…

  7. Many Faces Have I.

    ERIC Educational Resources Information Center

    Zilliox, Joseph T.; Lowery, Shannon G.

    1997-01-01

    Describes an extended investigation of polygons and polyhedra which was conducted in response to a challenge posed in Focus, a newsletter from the Mathematical Association of America (MAA). Students were challenged to construct a polyhedron with faces that measure more than 13 inches to a side. Outlines the process, including the questions posed…

  8. Relative Clauses in Cantonese-English Bilingual Children: Typological Challenges and Processing Motivations

    ERIC Educational Resources Information Center

    Yip, Virginia; Matthews, Stephen

    2007-01-01

    Findings from a longitudinal study of bilingual children acquiring Cantonese and English pose a challenge to the noun phrase accessibility hierarchy (NPAH; Keenan & Comrie, 1977), which predicts that object relatives should not be acquired before subject relatives. In the children's Cantonese, object relatives emerged earlier than or…

  9. A Self-Adaptive Multi-Agent System Approach for Collaborative Mobile Learning

    ERIC Educational Resources Information Center

    de la Iglesia, Didac Gil; Calderon, Juan Felipe; Weyns, Danny; Milrad, Marcelo; Nussbaum, Miguel

    2015-01-01

Mobile technologies have emerged as facilitators in the learning process, extending traditional classroom activities. However, engineering mobile learning applications for outdoor usage poses severe challenges. The requirements of these applications are challenging, as many different aspects need to be catered for, such as resource access and sharing,…

  10. Big Data Goes Personal: Privacy and Social Challenges

    ERIC Educational Resources Information Center

    Bonomi, Luca

    2015-01-01

    The Big Data phenomenon is posing new challenges in our modern society. In addition to requiring information systems to effectively manage high-dimensional and complex data, the privacy and social implications associated with the data collection, data analytics, and service requirements create new important research problems. First, the high…

  11. Science, Sport and Technology--A Contribution to Educational Challenges

    ERIC Educational Resources Information Center

    O'Hara, Kelly; Reis, Paula; Esteves, Dulce; Bras, Rui; Branco, Luisa

    2011-01-01

Improving students' ability to link knowledge with real-life practice is an educational challenge; it involves enhancing children's and teenagers' ability to think critically by making observations, posing questions, drawing up hypotheses, planning and carrying out investigations, and analysing data, thereby improving their decision making. Learning…

  12. Challenging Anti-Immigration Discourses in School and Community Contexts

    ERIC Educational Resources Information Center

    Allexsaht-Snider, Martha; Buxton, Cory A.; Harman, Ruth

    2012-01-01

    Rapid migration shifts, anti-immigrant discourses in the public sphere, and harsh immigration policies have posed daunting challenges for immigrant students, their families, their teachers, and their communities in the 21st century. Trends in public discourse and law enforcement in the United States mirror developments in European countries with…

  13. An investigation into multi-dimensional prediction models to estimate the pose error of a quadcopter in a CSP plant setting

    NASA Astrophysics Data System (ADS)

    Lock, Jacobus C.; Smit, Willie J.; Treurnicht, Johann

    2016-05-01

    The Solar Thermal Energy Research Group (STERG) is investigating ways to make heliostats cheaper to reduce the total cost of a concentrating solar power (CSP) plant. One avenue of research is to use unmanned aerial vehicles (UAVs) to automate and assist with the heliostat calibration process. To do this, the pose estimation error of each UAV must be determined and integrated into a calibration procedure. A computer vision (CV) system is used to measure the pose of a quadcopter UAV. However, this CV system contains considerable measurement errors. Since this is a high-dimensional problem, a sophisticated prediction model must be used to estimate the measurement error of the CV system for any given pose measurement vector. This paper attempts to train and validate such a model with the aim of using it to determine the pose error of a quadcopter in a CSP plant setting.
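The high-dimensional error-prediction task described above can be illustrated with a small supervised-regression sketch. The synthetic data, the quadratic error model, and the feature choice are all assumptions for illustration, not STERG's actual model or dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
poses = rng.uniform(-1.0, 1.0, size=(2000, 6))  # x, y, z, roll, pitch, yaw
# Assumed error model (illustrative only): CV measurement error grows
# quadratically with distance from the camera's optical axis.
true_err = 0.05 + 0.2 * np.sum(poses[:, :3] ** 2, axis=1)
noisy_err = true_err + rng.normal(0.0, 0.01, size=len(poses))

def features(p):
    # Quadratic feature expansion of the pose vector: [1, p_i, p_i * p_j].
    quad = (p[:, :, None] * p[:, None, :]).reshape(len(p), -1)
    return np.hstack([np.ones((len(p), 1)), p, quad])

# Fit on 1500 samples, validate on the held-out 500.
tr, te = slice(0, 1500), slice(1500, None)
w, *_ = np.linalg.lstsq(features(poses[tr]), noisy_err[tr], rcond=None)
pred = features(poses[te]) @ w
rmse = np.sqrt(np.mean((pred - noisy_err[te]) ** 2))
print(f"held-out RMSE: {rmse:.4f}")  # close to the 0.01 noise floor
```

A validated model of this kind gives an error estimate for any pose measurement vector, which can then be folded into the heliostat calibration procedure.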

  14. Conduction-driven cooling of LED-based automotive LED lighting systems for abating local hot spots

    NASA Astrophysics Data System (ADS)

    Saati, Ferina; Arik, Mehmet

    2018-02-01

    Light-emitting diode (LED)-based automotive lighting systems pose unique challenges, such as dual-side packaging (front side for LEDs and back side for driver electronics circuit), size, harsh ambient, and cooling. Packaging for automotive lighting applications combining the advanced printed circuit board (PCB) technology with a multifunctional LED-based board is investigated with a focus on the effect of thermal conduction-based cooling for hot spot abatement. A baseline study with a flame retardant 4 technology, commonly known as FR4 PCB, is first compared with a metal-core PCB technology, both experimentally and computationally. The double-sided advanced PCB that houses both electronics and LEDs is then investigated computationally and experimentally compared with the baseline FR4 PCB. Computational models are first developed with a commercial computational fluid dynamics software and are followed by an advanced PCB technology based on embedded heat pipes, which is computationally and experimentally studied. Then, attention is turned to studying different heat pipe orientations and heat pipe placements on the board. Results show that conventional FR4-based light engines experience local hot spots (ΔT>50°C) while advanced PCB technology based on heat pipes and thermal spreaders eliminates these local hot spots (ΔT<10°C), leading to a higher lumen extraction with improved reliability. Finally, possible design options are presented with embedded heat pipe structures that further improve the PCB performance.
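A back-of-envelope 1-D conduction estimate shows why board material matters for hot spots. All values (dissipated power, footprint, conductivities) are assumed for illustration, and the model ignores lateral spreading, so the FR4 figure is a pessimistic bound rather than a measured result:

```python
# Through-board temperature rise for a 1-D conduction path under an LED:
#   dT = q * t / (k * A)
def delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature rise across one board layer (1-D conduction only)."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

power = 1.0          # W dissipated under one LED (assumed)
area = 4e-6          # m^2, a 2 mm x 2 mm footprint (assumed)
thickness = 1.6e-3   # m, a standard board thickness

dt_fr4 = delta_t(power, thickness, 0.3, area)      # FR4, k ~ 0.3 W/m-K
dt_mcpcb = delta_t(power, thickness, 200.0, area)  # aluminum core (assumed k)
print(f"FR4: {dt_fr4:.0f} K, metal-core: {dt_mcpcb:.1f} K")
```

Even as a crude bound, the orders-of-magnitude gap between the two conductivities explains why FR4 light engines develop local hot spots that metal-core or heat-pipe boards avoid.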

  15. The architecture challenge: Future artificial-intelligence systems will require sophisticated architectures, and knowledge of the brain might guide their construction.

    PubMed

    Baldassarre, Gianluca; Santucci, Vieri Giuliano; Cartoni, Emilio; Caligiore, Daniele

    2017-01-01

    In this commentary, we highlight a crucial challenge posed by the proposal of Lake et al. to introduce key elements of human cognition into deep neural networks and future artificial-intelligence systems: the need to design effective sophisticated architectures. We propose that looking at the brain is an important means of facing this great challenge.

  16. Challenges and climate of business environment and resources to support pediatric device development.

    PubMed

    Iqbal, Corey W; Wall, James; Harrison, Michael R

    2015-06-01

    The incidence of pediatric disease conditions pales in comparison to adult disease. Consequently, many pediatric disorders are considered orphan diseases. Resources for the development of devices targeting orphan diseases are scarce and this poses a unique challenge to the development of pediatric devices. This article outlines these challenges and offers solutions. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Thesis Writing Challenges for Non-Native MA Students

    ERIC Educational Resources Information Center

    Sadeghi, Karim; Shirzad Khajepasha, Arash

    2015-01-01

    Writing in a second (L2)/foreign language is generally a challenging activity, and writing an MA thesis, as an example of academic enterprise, can be daunting when done in a language in which the writer is not fully competent. The challenge such a genre of writing poses for L2 writers has not been properly addressed. To fill in the gap in this…

  18. A direct method for nonlinear ill-posed problems

    NASA Astrophysics Data System (ADS)

    Lakhal, A.

    2018-02-01

    We propose a direct method for solving nonlinear ill-posed problems in Banach-spaces. The method is based on a stable inversion formula we explicitly compute by applying techniques for analytic functions. Furthermore, we investigate the convergence and stability of the method and prove that the derived noniterative algorithm is a regularization. The inversion formula provides a systematic sensitivity analysis. The approach is applicable to a wide range of nonlinear ill-posed problems. We test the algorithm on a nonlinear problem of travel-time inversion in seismic tomography. Numerical results illustrate the robustness and efficiency of the algorithm.
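The paper's direct inversion formula is specific to its Banach-space setting, but the basic difficulty of ill-posed problems, and the role of regularization, can be illustrated with classical Tikhonov regularization of a linear smoothing problem. This is not the paper's algorithm; the Gaussian kernel and noise level are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
grid = np.linspace(0.0, 1.0, n)
# Forward operator: a Gaussian smoothing kernel - severely ill-conditioned.
A = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / (2 * 0.05 ** 2))
A /= A.sum(axis=1, keepdims=True)

x_true = np.sin(2 * np.pi * grid)            # unknown model
y = A @ x_true + rng.normal(0.0, 1e-3, n)    # noisy data

# Naive inversion amplifies the noise; Tikhonov regularization damps it:
#   x_alpha = argmin ||A x - y||^2 + alpha * ||x||^2
alpha = 1e-4
x_naive = np.linalg.solve(A, y)
x_tik = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

err_naive = np.linalg.norm(x_naive - x_true)
err_tik = np.linalg.norm(x_tik - x_true)
print(f"naive error: {err_naive:.2e}, Tikhonov error: {err_tik:.2e}")
```

The contrast between the two errors is the sense in which an algorithm must "be a regularization": a stable method's reconstruction error stays controlled as noise enters the data, whereas naive inversion does not.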

  19. Transactions of the Army Conference on Applied Mathematics and Computing (7Th) Held in West Point, New York on 6-9 June 1989

    DTIC Science & Technology

    1990-02-01

Arch. Rat. Mech. Anal. 94, 1986, pp. 1-14. [15] J. L. Ericksen, "Twinning of crystals I," in Metastability and Incompletely Posed Problems, S. Antman et al...metastability of quartz," in Metastability and Incompletely Posed Problems, S. Antman et al. eds., Springer-Verlag, 1987, pp. 147-176. [30] R. D. James and D...and Incompletely Posed Problems, S. Antman et al. eds., Springer-Verlag, 1987, pp. 135-146. [35] D. Kinderlehrer and G. Vergara-Caffarelli, "The

  20. Reflections from a Computer Simulations Program on Cell Division in Selected Kenyan Secondary Schools

    ERIC Educational Resources Information Center

    Ndirangu, Mwangi; Kiboss, Joel K.; Wekesa, Eric W.

    2005-01-01

    The application of computer technology in education is a relatively new approach that is trying to justify inclusion in the Kenyan school curriculum. Being abstract, with a dynamic nature that does not manifest itself visibly, the process of cell division has posed difficulties for teachers. Consequently, a computer simulation program, using…

  1. Lock It Up! Computer Security.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    1997-01-01

    The data contained on desktop computer systems and networks pose security issues for virtually every district. Sensitive information can be protected by educating users, altering the physical layout, using password protection, designating access levels, backing up data, reformatting floppy disks, using antivirus software, and installing encryption…

  2. Tai chi/yoga effects on anxiety, heartrate, EEG and math computations.

    PubMed

    Field, Tiffany; Diego, Miguel; Hernandez-Reif, Maria

    2010-11-01

    To determine the immediate effects of a combined form of Tai chi/yoga. 38 adults participated in a 20-min Tai chi/yoga class. The session was comprised of standing Tai chi movements, balancing poses and a short Tai chi form and 10 min of standing, sitting and lying down yoga poses. The pre- and post- Tai chi/yoga effects were assessed using the State Anxiety Inventory (STAI), EKG, EEG and math computations. Heartrate increased during the session, as would be expected for this moderate-intensity exercise. Changes from pre to post-session assessments suggested increased relaxation including decreased anxiety and a trend for increased EEG theta activity. The increased relaxation may have contributed to the increased speed and accuracy noted on math computations following the Tai chi/yoga class. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Tai Chi/ Yoga Effects on Anxiety, Heartrate, EEG and Math Computations

    PubMed Central

    Field, Tiffany; Diego, Miguel; Hernandez-Reif, Maria

    2010-01-01

    Objective To determine the immediate effects of a combined form of tai chi/yoga. Design 38 adults participated in a 20-minute tai chi/yoga class. The session was comprised of standing tai chi movements, balancing poses and a short tai chi form and 10 minutes of standing, sitting and lying down yoga poses. Main outcome measures The pre- and post- tai chi/ yoga effects were assessed using the State Anxiety Inventory (STAI), EKG, EEG and math computations. Results Heartrate increased during the session, as would be expected for this moderate intensity exercise. Changes from pre to post session assessments suggested increased relaxation including decreased anxiety and a trend for increased EEG theta activity. Conclusions The increased relaxation may have contributed to the increased speed and accuracy noted on math computations following the tai chi/yoga class. PMID:20920810

  4. Exploration Design Challenge 2014

    NASA Image and Video Library

    2014-04-25

    Team Titan Shielding Systems poses with NASA Administrator Charles Bolden and Lockheed Martin CEO, Marillyn Hewson. Team Titan Shielding Systems was one of the semi-finalists in the Exploration Design Challenge. The goal of the Exploration Design Challenge is for students to research and design ways to protect astronauts from space radiation. The winner of the challenge was announced on April 25, 2014 at the USA Science and Engineering Festival at the Washington Convention Center in Washington, DC. Photo Credit: (NASA/Aubrey Gemignani)

  5. Vision-Based Navigation and Parallel Computing

    DTIC Science & Technology

    1990-08-01

5.8. Behzad Kamgar-Parsi and Behrooz Kamgar-Parsi, "On Problem Solving with Hopfield Neural Networks", CAR-TR-462, CS-TR...Second, the hypercube connections support logarithmic implementations of fundamental parallel algorithms, such as grid permutations and scan...the pose space. It also uses a set of virtual processors to represent an orthogonal projection grid, and projections of the six-dimensional pose space

  6. Decoding the Role of Water Dynamics in Ligand-Protein Unbinding: CRF1R as a Test Case.

    PubMed

    Bortolato, Andrea; Deflorian, Francesca; Weiss, Dahlia R; Mason, Jonathan S

    2015-09-28

    The residence time of a ligand-protein complex is a crucial aspect in determining biological effect in vivo. Despite its importance, the prediction of ligand koff still remains challenging for modern computational chemistry. We have developed aMetaD, a fast and generally applicable computational protocol to predict ligand-protein unbinding events using a molecular dynamics (MD) method based on adiabatic-bias MD and metadynamics. This physics-based, fully flexible, and pose-dependent ligand scoring function evaluates the maximum energy (RTscore) required to move the ligand from the bound-state energy basin to the next. Unbinding trajectories are automatically analyzed and translated into atomic solvation factor (SF) values representing the water dynamics during the unbinding event. This novel computational protocol was initially tested on two M3 muscarinic receptor and two adenosine A2A receptor antagonists and then evaluated on a test set of 12 CRF1R ligands. The resulting RTscores were used successfully to classify ligands with different residence times. Additionally, the SF analysis was used to detect key differences in the degree of accessibility to water molecules during the predicted ligand unbinding events. The protocol provides actionable working hypotheses that are applicable in a drug discovery program for the rational optimization of ligand binding kinetics.

  7. Invasion emerges from cancer cell adaptation to competitive microenvironments: Quantitative predictions from multiscale mathematical models

    PubMed Central

    Rejniak, Katarzyna A.; Gerlee, Philip

    2013-01-01

    Summary In this review we summarize our recent efforts using mathematical modeling and computation to simulate cancer invasion, with a special emphasis on the tumor microenvironment. We consider cancer progression as a complex multiscale process and approach it with three single-cell based mathematical models that examine the interactions between tumor microenvironment and cancer cells at several scales. The models exploit distinct mathematical and computational techniques, yet they share core elements and can be compared and/or related to each other. The overall aim of using mathematical models is to uncover the fundamental mechanisms that lend cancer progression its direction towards invasion and metastasis. The models effectively simulate various modes of cancer cell adaptation to the microenvironment in a growing tumor. All three point to a general mechanism underlying cancer invasion: competition for adaptation between distinct cancer cell phenotypes, driven by a tumor microenvironment with scarce resources. These theoretical predictions pose an intriguing experimental challenge: test the hypothesis that invasion is an emergent property of cancer cell populations adapting to selective microenvironment pressure, rather than culmination of cancer progression producing cells with the “invasive phenotype”. In broader terms, we propose that fundamental insights into cancer can be achieved by experimentation interacting with theoretical frameworks provided by computational and mathematical modeling. PMID:18524624

  8. Accelerating large-scale protein structure alignments with graphics processing units

    PubMed Central

    2012-01-01

    Background Large-scale protein structure alignment, an indispensable tool to structural bioinformatics, poses a tremendous challenge on computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings We present ppsAlign, a parallel protein structure Alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign could take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues from protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using massive parallel computing power of GPU. PMID:22357132

  9. Software Carpentry and the Hydrological Sciences

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.

    2013-12-01

Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: 1. Write programs for people, not computers. 2. Automate repetitive tasks 3. Use the computer to record history 4. Make incremental changes 5. Use version control 6. Don't repeat yourself (or others) 7. Plan for mistakes 8. Optimize software only after it works 9. Document design and purpose, not mechanics 10. Collaborate We discuss how these best practices, arising from solid foundations in research and experience, have been shown to help improve scientists' productivity and the reliability of their software.
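Principles 2 and 6 above ("automate repetitive tasks", "don't repeat yourself") can be sketched with a minimal example in which one parameterized function replaces copy-pasted per-dataset analysis code; the station names and flow values are invented for illustration:

```python
import statistics

def summarize(name, flows):
    """Summarize one gauging station's daily flows (m^3/s)."""
    return {"station": name,
            "mean": statistics.fmean(flows),
            "peak": max(flows)}

stations = {
    "upstream": [3.2, 4.1, 5.0, 4.4],
    "downstream": [6.8, 7.5, 9.1, 8.0],
}
# The loop is the automation: adding a station is one new dict entry,
# not another pasted copy of the analysis code.
reports = [summarize(name, flows) for name, flows in stations.items()]
for r in reports:
    print(f"{r['station']}: mean={r['mean']:.2f}, peak={r['peak']:.1f}")
```

The same structure also makes the analysis testable and versionable (principles 5 and 7): the logic lives in one function that can be checked once, instead of being scattered across near-duplicate scripts.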

  10. New technologies in radiation therapy: ensuring patient safety, radiation safety and regulatory issues in radiation oncology.

    PubMed

    Amols, Howard I

    2008-11-01

New technologies such as intensity modulated and image guided radiation therapy, computer controlled linear accelerators, record and verify systems, electronic charts, and digital imaging have revolutionized radiation therapy over the past 10-15 y. Quality assurance (QA) as historically practiced and as recommended in reports such as American Association of Physicists in Medicine Task Groups 40 and 53 needs to be updated to address the increasing complexity and computerization of radiotherapy equipment, and the increased quantity of data defining a treatment plan and treatment delivery. While new technology has reduced the probability of many types of medical events, new types of errors are now being seen, caused by improper use of new technology, communication failures between computers, corrupted or erroneous computer data files, and "software bugs". The increased use of computed tomography, magnetic resonance, and positron emission tomography imaging has become routine for many types of radiotherapy treatment planning, and QA for imaging modalities is beyond the expertise of most radiotherapy physicists. Errors in radiotherapy rarely result solely from hardware failures. More commonly they are a combination of computer and human errors. The increased use of radiosurgery, hypofractionation, more complex intensity modulated treatment plans, image guided radiation therapy, and increasing financial pressures to treat more patients in less time will continue to fuel this reliance on high technology and complex computer software. Clinical practitioners and regulatory agencies are beginning to realize that QA for new technologies is a major challenge and poses dangers different in nature from those that are historically familiar.

  11. DHS S&T First Responders Group and NATO Counter UAS Proposal Interest Response.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salton, Jonathan R.

The capability, speed, size, and widespread availability of small unmanned aerial systems (sUAS) makes them a serious security concern. The enabling technologies for sUAS are rapidly evolving and so too are the threats they pose to national security. Potential threat vehicles have a small cross-section, and are difficult to reliably detect using purely ground-based systems (e.g. radar or electro-optical) and challenging to target using conventional anti-aircraft defenses. Ground-based sensors are static and suffer from interference with the earth, vegetation and other man-made structures which obscure objects at low altitudes. Because of these challenges, sUAS pose a unique and rapidly evolving threat to national security.

  12. An infant with chronic severe neutropenia

    PubMed Central

    Bhat, Ramesh Y; Varma, Chaitanya P V; Bhatt, Sonia

    2014-01-01

Neutropenia in infancy and childhood poses a diagnostic challenge, as the aetiology ranges from acute life-threatening conditions to chronic benign diseases. Chronic benign neutropenia of infancy is a rare disorder occurring in 1:100 000. The neutrophil count remains low for a prolonged period until spontaneous resolution by the age of 3–4 years. Such infants have a higher incidence of minor infections requiring treatment with antibiotics and, rarely, meningitis and sepsis. The authors describe an infant presenting with fever and cervical lymphadenitis who was found to have isolated severe neutropenia, whose persistence posed a diagnostic challenge. The prolonged course with only minor infections and the absence of serious underlying conditions finally confirmed chronic benign neutropenia of infancy. PMID:24711472

  13. The Graphical User Interface: Crisis, Danger, and Opportunity.

    ERIC Educational Resources Information Center

    Boyd, L. H.; And Others

    1990-01-01

    This article describes differences between the graphical user interface and traditional character-based interface systems, identifies potential problems posed by graphic computing environments for blind computer users, and describes some programs and strategies that are being developed to provide access to those environments. (Author/JDD)

  14. Free Software and Multivariable Calculus

    ERIC Educational Resources Information Center

    Nord, Gail M.

    2011-01-01

    Calculators and computers make new modes of instruction possible; yet, at the same time they pose hardships for school districts and mathematics educators trying to incorporate technology with limited monetary resources. In the "Standards," a recommended classroom is one in which calculators, computers, courseware, and manipulative materials are…

  15. A qualitative exploration of district nurses' care of patients with advanced cancer: the challenges of supporting families within the home.

    PubMed

    Wilson, Charlotte; Griffiths, Jane; Ewing, Gail; Connolly, Michael; Grande, Gunn

    2014-01-01

    In the United Kingdom, district nurses (DNs) support patients with advanced cancer in their homes. Although evidence suggests that DNs emphasize the distinctiveness of home rather than hospital settings, little is known about the specific challenges of delivering care in family-home settings. The objective of this study was to explore DNs' experiences of supporting patients within families. Focus groups were conducted with 40 DNs from 4 areas in the United Kingdom. The groups were digitally recorded and facilitated by researchers using a flexible topic guide. Verbatim transcripts were analyzed using thematic content analysis. Case-load complexity (household volatility) and family dynamics posed distinct challenges for nurses supporting patients. Many family members struggled with accepting the patients' prognosis and were complicit in withholding information. At times, this foreclosed a consideration of palliative options. Carers provide a great deal of positive supportive care within the home. However, for some, the home is characterized by conflict rather than consensus. Complexities surrounding family relationships pose a distinctive and challenging environment for DNs. Education and training of DNs should be designed to address the challenges of supporting patients within the family-home setting.

  16. Estimating the gaze of a virtuality human.

    PubMed

    Roberts, David J; Rae, John; Duckworth, Tobias W; Moore, Carl M; Aspin, Rob

    2013-04-01

The aim of our experiment is to determine if eye-gaze can be estimated from a virtuality human: to within the accuracies that underpin social interaction; and reliably across gaze poses and camera arrangements likely in everyday settings. The scene is set by explaining why Immersive Virtuality Telepresence has the potential to meet the grand challenge of faithfully communicating both the appearance and the focus of attention of a remote human participant within a shared 3D computer-supported context. Within the experiment n=22 participants rotated static 3D virtuality humans, reconstructed from surround images, until they felt most looked at. The dependent variable was absolute angular error, which was compared to that underpinning social gaze behaviour in the natural world. Independent variables were 1) relative orientations of eye, head and body of captured subject; and 2) subset of cameras used to texture the form. Analysis looked for statistical and practical significance and qualitative corroborating evidence. The analysed results tell us much about the importance and detail of the relationship between gaze pose, method of video based reconstruction, and camera arrangement. They tell us that virtuality can reproduce gaze to an accuracy useful in social interaction, but with the adopted method of Video Based Reconstruction, this is highly dependent on combination of gaze pose and camera arrangement. This suggests changes in the VBR approach in order to allow more flexible camera arrangements. The work is of interest to those wanting to support expressive meetings that are both socially and spatially situated, and particularly those using or building Immersive Virtuality Telepresence to accomplish this. It is also of relevance to the use of virtuality humans in applications ranging from the study of human interactions to gaming and the crossing of the stage line in films and TV.
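The study's dependent variable, absolute angular error, is the unsigned angle between an estimated and a true gaze direction. A minimal sketch of that computation, with invented example vectors, is:

```python
import numpy as np

def angular_error_deg(v_est, v_true):
    """Unsigned angle in degrees between two 3-D gaze direction vectors."""
    cos = np.dot(v_est, v_true) / (np.linalg.norm(v_est) * np.linalg.norm(v_true))
    # Clip guards against rounding just outside [-1, 1] before arccos.
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Example: the estimate is off from the true gaze by a 10-degree yaw rotation.
true_gaze = np.array([0.0, 0.0, 1.0])
yaw = np.deg2rad(10.0)
est_gaze = np.array([np.sin(yaw), 0.0, np.cos(yaw)])
print(angular_error_deg(est_gaze, true_gaze))  # 10.0
```

Comparing such per-trial errors against the angular tolerances of natural social gaze is what lets the study say whether a virtuality human's gaze is "accurate enough" for social interaction.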

  17. A validated non-linear Kelvin-Helmholtz benchmark for numerical hydrodynamics

    NASA Astrophysics Data System (ADS)

    Lecoanet, D.; McCourt, M.; Quataert, E.; Burns, K. J.; Vasil, G. M.; Oishi, J. S.; Brown, B. P.; Stone, J. M.; O'Leary, R. M.

    2016-02-01

    The non-linear evolution of the Kelvin-Helmholtz instability is a popular test for code verification. To date, most Kelvin-Helmholtz problems discussed in the literature are ill-posed: they do not converge to any single solution with increasing resolution. This precludes comparisons among different codes and severely limits the utility of the Kelvin-Helmholtz instability as a test problem. The lack of a reference solution has led various authors to assert the accuracy of their simulations based on ad hoc proxies, e.g. the existence of small-scale structures. This paper proposes well-posed two-dimensional Kelvin-Helmholtz problems with smooth initial conditions and explicit diffusion. We show that in many cases numerical errors/noise can seed spurious small-scale structure in Kelvin-Helmholtz problems. We demonstrate convergence to a reference solution using both ATHENA, a Godunov code, and DEDALUS, a pseudo-spectral code. Problems with constant initial density throughout the domain are relatively straightforward for both codes. However, problems with an initial density jump (which are the norm in astrophysical systems) exhibit rich behaviour and are more computationally challenging. In the latter case, ATHENA simulations are prone to an instability of the inner rolled-up vortex; this instability is seeded by grid-scale errors introduced by the algorithm, and disappears as resolution increases. Both ATHENA and DEDALUS exhibit late-time chaos. Inviscid simulations are riddled with extremely vigorous secondary instabilities which induce more mixing than simulations with explicit diffusion. Our results highlight the importance of running well-posed test problems with demonstrated convergence to a reference solution. To facilitate future comparisons, we include as supplementary material the resolved, converged solutions to the Kelvin-Helmholtz problems in this paper in machine-readable form.
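A minimal sketch of what such a well-posed setup looks like, assuming a tanh shear layer with a small sinusoidal seed perturbation and a smooth density jump (illustrative grid and parameters, not the paper's exact configuration):

```python
import numpy as np

nx, nz = 128, 256
x = np.linspace(0.0, 1.0, nx, endpoint=False)   # periodic streamwise direction
z = np.linspace(-1.0, 1.0, nz, endpoint=False)  # cross-stream direction
X, Z = np.meshgrid(x, z, indexing="ij")

a = 0.05    # shear-layer half-width (assumed)
amp = 0.01  # perturbation amplitude (assumed)

u = np.tanh(Z / a)                                         # smooth streamwise shear
w = amp * np.sin(2 * np.pi * X) * np.exp(-((Z / a) ** 2))  # seed perturbation
rho = 1.0 + 0.5 * (1.0 + np.tanh(Z / a))                   # smooth density "jump"
# Explicit viscosity and diffusion would be added in the evolution equations,
# so the solution converges with resolution instead of being seeded by
# grid-scale numerical noise.
print(u.shape, float(np.abs(w).max()), float(rho.min()), float(rho.max()))
```

The key contrast with the usual discontinuous setups is that every field above is smooth, so different codes evolving these initial conditions with explicit diffusion can be compared against a single converged reference solution.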

  18. A multi-camera system for real-time pose estimation

    NASA Astrophysics Data System (ADS)

    Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin

    2007-04-01

    This paper presents a multi-camera system that performs face detection and pose estimation in real-time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model, where the head is modeled as a spherical object that rotates about the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations are obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
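The geometric step described above can be sketched in a simplified form. The function below is a hypothetical reduction of the spherical-head model: it assumes the facial symmetry axis (eye midpoint and mouth) is displaced horizontally from the head-sphere centre by radius * sin(yaw) in the image, which captures the spirit, though not the exact projected-triangle equations, of the paper.

```python
import numpy as np

def estimate_yaw(face_center_x, left_eye, right_eye, mouth, radius):
    """Estimate head yaw from the projected eyes-mouth triangle.

    Simplified spherical-head model (illustrative, not the paper's exact
    derivation): the facial symmetry axis lies on a sphere of projected
    radius `radius` (pixels) and is displaced from the sphere centre by
    radius * sin(yaw). Landmarks are (x, y) image coordinates."""
    eye_mid_x = (left_eye[0] + right_eye[0]) / 2.0
    axis_x = (eye_mid_x + mouth[0]) / 2.0        # x of the facial symmetry axis
    offset = axis_x - face_center_x
    return np.arcsin(np.clip(offset / radius, -1.0, 1.0))

# Frontal face: symmetry axis passes through the sphere centre -> yaw ~ 0.
yaw_frontal = estimate_yaw(100.0, (80, 90), (120, 90), (100, 130), radius=50.0)

# Head turned: axis displaced by radius * sin(30 deg) = 25 px -> yaw ~ pi/6.
yaw_turned = estimate_yaw(100.0, (105, 90), (145, 90), (125, 130), radius=50.0)
```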

  19. Pose estimation of industrial objects towards robot operation

    NASA Astrophysics Data System (ADS)

    Niu, Jie; Zhou, Fuqiang; Tan, Haishu; Cao, Yu

    2017-10-01

    With the advantages of wide range, non-contact operation and high flexibility, visual estimation of target pose has been widely applied in modern industry, robot guidance and other engineering practice. However, owing to complicated industrial environments, outside interference factors, a lack of distinctive object characteristics, camera restrictions and other limitations, visual pose estimation still faces many challenges. Focusing on these problems, a pose estimation method for industrial objects is developed based on 3D models of the targets. By matching the extracted shape characteristics of objects against a prior 3D model database of targets, the method recognizes the target; the pose of the object is then determined using a monocular vision measurement model. The experimental results show that this method can estimate the pose of rigid objects from poor image information, and provides a guiding basis for the operation of industrial robots.
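The final stage of such a pipeline is recovering a rigid transform between the matched 3D model and the scene. As an illustration only (the paper works from monocular measurements, whereas this sketch assumes 3D-3D point correspondences are already available), the Kabsch algorithm gives the least-squares rotation and translation:

```python
import numpy as np

def rigid_align(model_pts, scene_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto scene_pts
    via the Kabsch algorithm; both inputs are (N, 3) arrays of corresponding
    points."""
    mc = model_pts.mean(axis=0)
    sc = scene_pts.mean(axis=0)
    H = (model_pts - mc).T @ (scene_pts - sc)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t

# Synthetic check: recover a known rotation about z and a translation.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
model = np.random.default_rng(0).normal(size=(10, 3))
scene = model @ R_true.T + t_true
R_est, t_est = rigid_align(model, scene)
```

For noiseless correspondences the recovery is exact up to floating-point error; with monocular input a perspective-n-point solver would replace this step.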

  20. 3D ocular ultrasound using gaze tracking on the contralateral eye: a feasibility study.

    PubMed

    Afsham, Narges; Najafi, Mohammad; Abolmaesumi, Purang; Rohling, Robert

    2011-01-01

    A gaze-deviated examination of the eye with a 2D ultrasound transducer is a common and informative ophthalmic test; however, the complex task of estimating the pose of the ultrasound images relative to the eye affects 3D interpretation. To tackle this challenge, a novel system for 3D image reconstruction based on gaze tracking of the contralateral eye has been proposed. The gaze fixates on several target points and, for each fixation, the pose of the examined eye is inferred from the gaze tracking. A single-camera system has been developed for pose estimation, combined with subject-specific parameter identification. The ultrasound images are then transformed to the coordinate system of the examined eye to create a 3D volume. The accuracy of the proposed gaze tracking system and of the eye pose estimation has been validated in a set of experiments. The overall system errors, including pose estimation and calibration, are 3.12 mm and 4.68 degrees.
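The reconstruction step, transforming ultrasound image points into the coordinate system of the examined eye, amounts to chaining homogeneous transforms. A minimal sketch follows, with illustrative transform names and values that are not drawn from the paper:

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and
    translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def image_to_eye(point, T_probe_from_image, T_world_from_probe, T_eye_from_world):
    """Map a 3D point in ultrasound-image coordinates into the coordinate
    system of the examined eye by chaining calibration and pose transforms
    (the frame names here are illustrative)."""
    p = np.append(point, 1.0)                     # homogeneous coordinates
    return (T_eye_from_world @ T_world_from_probe @ T_probe_from_image @ p)[:3]

# Toy check: pure translations compose additively.
T1 = hom(np.eye(3), np.array([1.0, 0.0, 0.0]))   # image -> probe (calibration)
T2 = hom(np.eye(3), np.array([0.0, 2.0, 0.0]))   # probe -> world (tracked)
T3 = hom(np.eye(3), np.array([0.0, 0.0, 3.0]))   # world -> eye (from gaze)
p_eye = image_to_eye(np.zeros(3), T1, T2, T3)
```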

  1. The IS-GEO RCN: Fostering Collaborations for Intelligent Systems Research to Support Geosciences

    NASA Astrophysics Data System (ADS)

    Gil, Y.; Pierce, S. A.

    2016-12-01

    Geoscience problems are complex and often involve data that change across space and time. Frequently, geoscience knowledge and understanding provide valuable information and insight for problems related to energy, water, climate, mineral resources, and our understanding of how the Earth evolves through time. Simultaneously, many grand challenges in the geosciences cannot be addressed without the aid of computational support and innovations. Intelligent and Information Systems (IS) research includes a broad range of computational methods and topics such as knowledge representation, information integration, machine learning, robotics, adaptive sensors, and intelligent interfaces. IS research has a very important role to play in accelerating scientific discovery in the geosciences and thus in addressing geoscience challenges. Many aspects of geosciences (GEO) research pose novel open problems for intelligent systems researchers. To develop intelligent systems with sound knowledge of theory and practice, it is important that GEO and IS experts collaborate. The EarthCube Research Coordination Network for Intelligent Systems for Geosciences (IS-GEO RCN) represents an emerging community of interdisciplinary researchers producing fundamental new capabilities for understanding Earth systems. Furthermore, the educational component aims to identify new approaches to teaching students in this new interdisciplinary area, seeking to raise a new generation of scientists who are better able to apply IS methods and tools to the geoscience challenges of the future. By providing avenues for IS and GEO researchers to work together, the IS-GEO RCN will serve both as a point of contact and as an avenue for educational outreach across the disciplines for the nascent community of research and practice. 
The initial efforts are focused on connecting the communities in ways that help researchers understand opportunities and challenges that can benefit from IS-GEO collaborations. The IS-GEO RCN will jumpstart interdisciplinary research collaborations in this emerging new area so that progress across both disciplines can be accelerated.

  2. Predicting binding poses and affinities for protein - ligand complexes in the 2015 D3R Grand Challenge using a physical model with a statistical parameter estimation

    NASA Astrophysics Data System (ADS)

    Grudinin, Sergei; Kadukova, Maria; Eisenbarth, Andreas; Marillet, Simon; Cazals, Frédéric

    2016-09-01

    The 2015 D3R Grand Challenge provided an opportunity to test our new model for the binding free energy of small molecules, as well as to assess our protocol to predict binding poses for protein-ligand complexes. Our pose predictions were ranked 3-9 for the HSP90 dataset, depending on the assessment metric. For the MAP4K dataset the ranks are widely dispersed, ranging from 2 to 35 depending on the assessment metric, which does not provide much insight into the accuracy of the method. The main success of our pose prediction protocol was the re-scoring stage using the recently developed Convex-PL potential. We make a thorough analysis of our docking predictions made with AutoDock Vina and discuss the effect of the choice of rigid receptor templates, the number of flexible residues in the binding pocket, the binding pocket size, and the benefits of re-scoring. However, the main challenge was to predict experimentally determined binding affinities for two blind test sets. Our affinity prediction model consisted of two terms, a pairwise-additive enthalpy and a non-pairwise-additive entropy. We trained the free parameters of the model with a regularized regression using affinity and structural data from the PDBBind database. Our model performed very well on the training set but failed on the two test sets. We explain the drawbacks and pitfalls of our model, in particular in terms of the relative coverage of the test set by the training set and the dynamical properties missing from crystal structures, and discuss different routes to improve it.
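The training stage described above, fitting the free parameters of a scoring model by regularized regression against known affinities, can be sketched in toy form. The descriptors and data below are synthetic stand-ins, not the authors' enthalpy and entropy terms or the PDBBind data:

```python
import numpy as np

def fit_ridge(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam * I)^(-1) X^T y.
    Rows of X are per-complex descriptors; y holds measured affinities."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                 # synthetic descriptor matrix
w_true = np.array([1.5, -0.7, 0.0, 2.0, 0.3]) # "true" model parameters
y = X @ w_true + 0.01 * rng.normal(size=200)  # noisy synthetic affinities
w = fit_ridge(X, y, lam=1e-3)
```

The regularization term (`lam`) trades training-set fit for stability, which matters precisely when, as the abstract notes, the test set is poorly covered by the training set.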

  3. Hierarchical graphical-based human pose estimation via local multi-resolution convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhu, Aichun; Wang, Tian; Snoussi, Hichem

    2018-03-01

    This paper addresses the problems of graphical-model-based human pose estimation in still images, including the diversity of appearances and confounding background clutter. We present a new architecture for estimating human pose using a Convolutional Neural Network (CNN). Firstly, a Relative Mixture Deformable Model (RMDM) is defined by each pair of connected parts to compute the relative spatial information in the graphical model. Secondly, a Local Multi-Resolution Convolutional Neural Network (LMR-CNN) is proposed to train and learn the multi-scale representation of each body part by combining different levels of part context. Thirdly, an LMR-CNN-based hierarchical model is defined to explore the context information of limb parts. Finally, the experimental results demonstrate the effectiveness of the proposed deep learning approach for human pose estimation.

  4. Rigorous Numerics for ill-posed PDEs: Periodic Orbits in the Boussinesq Equation

    NASA Astrophysics Data System (ADS)

    Castelli, Roberto; Gameiro, Marcio; Lessard, Jean-Philippe

    2018-04-01

    In this paper, we develop computer-assisted techniques for the analysis of periodic orbits of ill-posed partial differential equations. As a case study, our proposed method is applied to the Boussinesq equation, which has been investigated extensively because of its role in the theory of shallow water waves. The idea is to use the symmetry of the solutions and a Newton-Kantorovich type argument (the radii polynomial approach) to obtain rigorous proofs of existence of the periodic orbits in a weighted ℓ1 Banach space of space-time Fourier coefficients with exponential decay. We present several computer-assisted proofs of the existence of periodic orbits at different parameter values.
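In outline (paraphrased and schematic, not the paper's exact bounds), the radii polynomial approach verifies a zero of a map F near a numerical approximation by showing a Newton-like operator is a contraction on a small ball:

```latex
% Newton-Kantorovich via radii polynomials (schematic).
% Given F(x) = 0, an approximate solution \bar{x}, and an approximate
% inverse A of DF(\bar{x}), define T(x) = x - A\,F(x) and compute bounds
%   \| T(\bar{x}) - \bar{x} \| \le Y, \qquad
%   \sup_{b \in \overline{B_r(\bar{x})}} \| DT(b) \| \le Z(r).
% The radii polynomial is
%   p(r) = Z(r)\, r - r + Y .
% If p(r_0) < 0 for some r_0 > 0, then T maps \overline{B_{r_0}(\bar{x})}
% into itself and is a contraction there, so by the Banach fixed-point
% theorem F has a unique zero within distance r_0 of \bar{x}.
```

In the paper's setting the norm is taken in a weighted ℓ¹ space of Fourier coefficients, and the bounds Y and Z(r) are evaluated rigorously with interval arithmetic.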

  5. Gravitational Wave Science: Challenges for Numerical Relativistic Astrophysics

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2005-01-01

    Gravitational wave detectors on earth and in space will open up a new observational window on the universe. The new information about astrophysics and fundamental physics these observations will bring is expected to pose exciting challenges. This talk will provide an overview of this emerging area of gravitational wave science, with a focus on the challenges it will bring for numerical relativistic astrophysics and a look at some recent results.

  6. Exploration Design Challenge 2014

    NASA Image and Video Library

    2014-04-25

    Team ARES poses with NASA Administrator Charles Bolden and Lockheed Martin CEO Marillyn Hewson. Team ARES was the winner of the Exploration Design Challenge. The goal of the Exploration Design Challenge is for students to research and design ways to protect astronauts from space radiation. The winning team was announced on April 25, 2014 at the USA Science and Engineering Festival at the Washington Convention Center in Washington, DC. Photo Credit: (NASA/Aubrey Gemignani)

  7. Web-Scale Search-Based Data Extraction and Integration

    DTIC Science & Technology

    2011-10-17

    differently, posing challenges for aggregating this information. For example, for the task of finding population for cities in Benin, we were faced with ... merged record. Our GeoMerging algorithm attempts to address various ambiguity challenges: • For name: The name of a hospital is not a unique ... departments in the same building. For agent-extractor results from structured sources, our GeoMerging algorithm overcomes these challenges using a two

  8. Green Flight Challenge

    NASA Image and Video Library

    2011-09-28

    e-Genius Aircraft Pilot Klaus Ohlmann poses for a photograph during the 2011 Green Flight Challenge, sponsored by Google, held at the Charles M. Schulz Sonoma County Airport in Santa Rosa, Calif. on Thursday, Sept. 29, 2011. NASA and the Comparative Aircraft Flight Efficiency (CAFE) Foundation are holding the challenge with the goal of advancing fuel-efficiency and emissions-reduction technologies through cleaner renewable fuels and electric aircraft. Photo Credit: (NASA/Bill Ingalls)

  9. Green Flight Challenge

    NASA Image and Video Library

    2011-09-28

    e-Genius Aircraft Pilot Eric Raymond poses for a photograph during the 2011 Green Flight Challenge, sponsored by Google, held at the Charles M. Schulz Sonoma County Airport in Santa Rosa, Calif. on Thursday, Sept. 29, 2011. NASA and the Comparative Aircraft Flight Efficiency (CAFE) Foundation are holding the challenge with the goal of advancing fuel-efficiency and emissions-reduction technologies through cleaner renewable fuels and electric aircraft. Photo Credit: (NASA/Bill Ingalls)

  10. Green Flight Challenge

    NASA Image and Video Library

    2011-09-28

    PhoEnix Aircraft Co-Pilot Jeff Shingleton poses for a photograph during the 2011 Green Flight Challenge, sponsored by Google, held at the Charles M. Schulz Sonoma County Airport in Santa Rosa, Calif. on Thursday, Sept. 29, 2011. NASA and the Comparative Aircraft Flight Efficiency (CAFE) Foundation are holding the challenge with the goal of advancing fuel-efficiency and emissions-reduction technologies through cleaner renewable fuels and electric aircraft. Photo Credit: (NASA/Bill Ingalls)

  11. Green Flight Challenge

    NASA Image and Video Library

    2011-09-28

    PhoEnix Aircraft Pilot Jim Lee poses for a photograph during the 2011 Green Flight Challenge, sponsored by Google, held at the Charles M. Schulz Sonoma County Airport in Santa Rosa, Calif. on Thursday, Sept. 29, 2011. NASA and the Comparative Aircraft Flight Efficiency (CAFE) Foundation are holding the challenge with the goal of advancing fuel-efficiency and emissions-reduction technologies through cleaner renewable fuels and electric aircraft. Photo Credit: (NASA/Bill Ingalls)

  12. Green Flight Challenge

    NASA Image and Video Library

    2011-09-28

    EcoEagle Aircraft Pilot Mikhael Ponso poses for a photograph during the 2011 Green Flight Challenge, sponsored by Google, held at the Charles M. Schulz Sonoma County Airport in Santa Rosa, Calif. on Thursday, Sept. 29, 2011. NASA and the Comparative Aircraft Flight Efficiency (CAFE) Foundation are holding the challenge with the goal of advancing fuel-efficiency and emissions-reduction technologies through cleaner renewable fuels and electric aircraft. Photo Credit: (NASA/Bill Ingalls)

  13. A Donor-Focused Fundraising Model: An Essential Tool in Community College Foundations' Toolkit

    ERIC Educational Resources Information Center

    Carter, Linnie S.

    2011-01-01

    The increased focus on private fundraising poses challenges for community colleges (Jackson & Glass, 2000). A challenge is a lack of fundraising experience within community colleges and their foundations. There now exists a donor-focused fundraising model for community colleges to use to enhance their fundraising initiatives and increase the…

  14. "How Can I Help?": Practicing Familial Support through Simulation

    ERIC Educational Resources Information Center

    Coughlin, April B.; Dotger, Benjamin H.

    2016-01-01

    Teachers face numerous challenges in daily practice, including situations that involve the health, safety, and well-being of students and families. When harassment and physical abuse impact K-12 students, these situations pose unexpected challenges to novice teachers working to support their students. In this article, the authors report on a study…

  15. A Strategy to Support Educational Leaders in Developing Countries to Manage Contextual Challenges

    ERIC Educational Resources Information Center

    Wolhuter, Charl; van der Walt, Hannes; Steyn, Hennie

    2016-01-01

    The central theoretical argument of this paper is that educational leadership and organisational development and change in educational institutions in developing countries will not be effective unless school leaders are aware of the challenges posed by contextual factors that might have an impact on their professional activities. The article…

  16. Experimental Evidence for Diagramming Benefits in Science Writing

    ERIC Educational Resources Information Center

    Barstow, Brendan; Fazio, Lisa; Schunn, Christian; Ashley, Kevin

    2017-01-01

    Arguing for the need for a scientific research study (i.e. writing an introduction to a research paper) poses significant challenges for students. When faced with these challenges, students often generate overly safe replications (i.e. fail to find and include opposition to their hypothesis) or in contrast include no strong support for their…

  17. Places to Go: Challenges to Multicultural Art Education in a Global Economy

    ERIC Educational Resources Information Center

    Desai, Dipti

    2005-01-01

    This article examines the relationship between globalization and postmodern multicultural art education. The questions that drive my investigation are: What is the role of postmodern multiculturalism in this current phase of globalization and what challenges does globalization pose for multiculturalism? I explore the shifts in the field of art…

  18. A Systematic Approach to Faculty Development--Capability Improvement for Blended Learning

    ERIC Educational Resources Information Center

    Badawood, Ashraf; Steenkamp, Annette Lerine; Al-Werfalli, Daw

    2013-01-01

    Blended learning (BL) provides an efficient and effective instructional experience. However, adopting a BL approach poses some challenges to faculty; the most important obstacle found in this research is faculty's lack of knowledge regarding the use of technology in their teaching. This challenge prompted the research project focused on improving…

  19. Governing during an Institutional Crisis: 10 Fundamental Principles

    ERIC Educational Resources Information Center

    White, Lawrence

    2012-01-01

    In today's world, managing a campus crisis poses special challenges for an institution's governing board, which may operate some distance removed from the immediate events giving rise to the crisis. In its most challenging form, a campus crisis--a shooting, a natural disaster, a fraternity hazing death, the arrest of a prominent campus…

  20. Teaching Caribbean Students: Research on Social Issues in the Caribbean and Abroad.

    ERIC Educational Resources Information Center

    Bastick, Tony, Ed.; Ezenne, Austin, Ed.

    The issues and findings in the research essays in this collection focus on two main themes: the identification of challenges in preparing Caribbean students for the new global network and the isolation of the challenges posed in developing these global relations. Part 1, "Socially Sensitive Pedagogies," contains: (1) "Domain-Specific Modern…
